Cleantech is Alive and Well

Last month, CBS ran a piece on 60 Minutes called “The Cleantech Crash” about the wasteful decline of so-called Cleantech companies (Cleantech being a generic term for industries in alternative energy). In 2011, the Obama administration had directed approximately $100 billion toward developing Cleantech industries, hoping such an investment would spark innovation, development, and ultimately, jobs. According to 60 Minutes, this expensive project turned out to be a large mistake funded by taxpayer dollars. Several companies like Abound Solar, Beacon Power, Range Fuels, ECOtality and a host of others went under, which raises the question of whether clean, renewable technologies will ever become a viable alternative given what was seemingly a massive failure.

After CBS aired the segment, it received a great deal of harsh criticism from experts within the green industry; even the US Department of Energy called it “flat wrong”. Although a number of Cleantech companies did go under, the segment failed to mention the resounding success stories that emerged from Cleantech funding. The American solar industry (which benefited from the $100 billion in US government funding) has grown dramatically since 2008. According to Slate Magazine, there were more solar power installations in 2013 alone than in the previous 20 years combined. Employment in the solar industry grew 10 times faster than the US average; it now employs more people than the natural gas and coal industries combined.

Solar generation plant

Ivanpah Solar Electric Generating System in California, the world’s largest solar thermal facility, is now operational.

Clearly, such growth is not an indication of decline and decay but rather a sign of strength and stability, and it isn’t stopping there. According to Mercom Capital Group, an Austin-based clean energy firm, US solar energy generation is projected to increase by another 6,000 MW in 2014 alone!

It isn’t just the solar industry that has benefited from Cleantech funding; wind power has also grown substantially in recent years. According to the American Wind Energy Association (AWEA), the cost of generating wind power has decreased by more than 40% in only 4 years, and at the end of 2013, there was an additional 12,000 MW of planned generation under construction. Furthermore, the vast majority of the additional wind power capacity will be coming from onshore wind turbines; the US has barely tapped the potential from offshore wind generation, where wind speeds are greater and more consistent.

Flat wind farm

Shepherds Flat Wind Farm in Oregon, the second largest of its kind in the US, began operations in 2012.

During the 60 Minutes segment, reporter Lesley Stahl listed a whole group of companies that failed even though they received funding from the federal government. It is understandable that some would oppose the US government’s handling of Cleantech given that so many government subsidies were wasted on projects that never got off the ground, but if that’s all it takes for collective outrage, the fossil fuel industries should never hear the end of it.

According to Businessweek, when it comes to government subsidies worldwide, coal, oil and gas received more than $400 billion in 2010 alone. That same year, renewable energy industries received a comparatively small $60 billion. Additionally, every year the US government subsidises coal, oil and gas by as much as $50 billion if the cost of securing oil reserves in the politically volatile Middle East is included. And for what? A 2009 study conducted by the National Academy of Sciences reported that burning fossil fuels costs the US around $120 billion a year in health-related costs and causes thousands of premature deaths. In essence, the American taxpayer is giving $50 billion to some of the wealthiest corporations only to have them contaminate the environment, making people sick and slowly killing them. Additionally, the costs resulting from the effects of global warming have yet to be fully quantified, with no realistic estimate available. Even so, I can’t imagine that it would be a small figure.

Coal Generation Plant

Fossil fuel industries receive far more in government subsidies than Cleantech.

It seems rather blind and contradictory to condemn the US government for subsidising Cleantech while remaining silent over the subsidies given to oil, coal and natural gas. In the worst-case scenario, subsidising a renewable energy project could result in wasted money; subsidising oil, coal or natural gas could result in an environmental disaster and wasted money. Since there is only a finite amount of fuel we can burn, it seems rather obvious that some kind of investment in Cleantech industries will be necessary now and in the future.

It is very clear, then, that calling Cleantech a failure is far too premature, and from the projections and potential for growth, it seems as though Cleantech has a great future ahead. Although there were some bumps along the way, the journey is far from over, and the experiment continues. Just because the first attempt wasn’t a resounding success doesn’t mean that any future attempt is doomed to fail. Like many industries, those behind Cleantech are still learning and adapting, with the best yet to come. The present has already proven Lesley Stahl a little premature in her judgments; I’m certain time will prove her completely wrong.



The necessity of electricity is never more evident than when you don’t have it. Exactly 10 years ago, that thought was on the minds of over 50 million people across eight Northeastern US states and Ontario as they experienced what was, at the time, the second largest blackout in history.

The epic event was caused by sagging high-voltage power lines in Ohio coming into contact with overgrown trees that should have been trimmed and maintained properly. The line overloaded and shut down, which should have triggered an alarm for the grid operators; it never did. Within an hour, three more lines overloaded and shut down while grid operators remained in the dark (figuratively), completely unaware of what was happening. By 4:10 pm, millions were left in the dark (literally) as a massive blackout spread across the grid, lasting as long as two days in some places. It is estimated that the 2003 Northeast Blackout cost around $6 billion and resulted in 11 deaths.

New York skyline

New York on August 14, 2003

Since that fateful day in August, which remains a watershed moment for those who lived through it, the power grid has changed. On this 10th anniversary, it is important to highlight the problems the grid had then, look at the progress that has been made, and consider what still needs to be done to ensure such events are a thing of the past.


The blackout spurred both the American and Canadian governments to respond with sweeping changes and regulations to prevent such events from happening again. Before the 2003 blackout, the North American Electric Reliability Council (NERC) compiled a list of voluntary standards for energy distributors, and at the time, many power distributors could opt out of such recommendations. After the blackout, however, these standards became mandatory and are now enforced by oversight agencies with the authority to issue penalties to violating parties. One of these new regulations requires foliage to be adequately cleared from power lines (the cause of the blackout). Failure to comply could result in a fine upwards of $1 million per day, depending on risk and severity.

In 2003, electricity data sharing was well behind today’s standards. In North America’s interconnected grid, a power outage in one area can cause a measurable change in demand within seconds in another area of the grid thousands of miles away. Back then, it would take a distributor 30 seconds or more to receive data measuring such a change in demand. Today, that delay has been reduced, largely as a result of the deployment of phasor measurement units (PMUs for short). These devices are connected to transmission lines, measuring any changes in voltage. If the fluctuations are severe, they can warn distributors of an imminent power failure in less than 10 seconds. In 2003, there were exactly zero PMUs deployed in North America; by the end of this year, there will be more than 1,000 in operation.
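The kind of detection PMUs enable can be illustrated with a minimal sketch. This is not a real PMU interface or protocol; the nominal voltage, the 10% alarm threshold, and the sample format below are all assumptions made for illustration. The basic idea is simply to compare a stream of time-stamped voltage readings against nominal and flag severe deviations as soon as they appear:

```python
# Illustrative sketch only -- not a real PMU interface. The nominal
# voltage, 10% alarm threshold, and (seconds, volts) sample format
# are assumptions made for this example.

NOMINAL_VOLTAGE = 345_000   # volts; a common high-voltage transmission level
DEVIATION_LIMIT = 0.10      # hypothetical alarm threshold: 10% off nominal

def flag_severe_deviations(samples):
    """Return the timestamps of samples whose voltage deviates from
    nominal by more than the alarm threshold."""
    alarms = []
    for timestamp, voltage in samples:
        deviation = abs(voltage - NOMINAL_VOLTAGE) / NOMINAL_VOLTAGE
        if deviation > DEVIATION_LIMIT:
            alarms.append(timestamp)
    return alarms

# Real PMUs report dozens of GPS-time-stamped measurements per second;
# here, a short synthetic stream in which the line voltage suddenly sags.
stream = [(0.0, 345_200), (0.5, 344_900), (1.0, 301_000), (1.5, 298_500)]
print(flag_severe_deviations(stream))  # [1.0, 1.5] -- the sagging samples
```

The point is the turnaround: because each sample carries a synchronized timestamp, an operator can see exactly when and where a line began to sag, rather than waiting half a minute for aggregated data.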

Phasor Unit

A phasor measurement unit. Devices like these are helping us understand what’s happening on the grid much faster.

Although no electrical grid system is immune to blackouts (no matter how advanced or well maintained), as a result of these preventative changes, power distributors know far more about what is happening in their grid than they once did, and they will know about a problem much sooner. Additionally, increased oversight ensures the grid is no longer as vulnerable to an outage as it was in the past. Experts like former NERC vice president David Hilt agree that blackouts of the magnitude experienced in 2003 are less likely to happen today.


Although the grid has come a long way since the blackout, there are still daunting challenges ahead; among them is generating capacity. According to the US Department of Energy, electricity usage is expected to increase by 28% from 2011 levels by 2040, and according to the Ontario Power Authority, demand in Ontario will be 15% higher in 2030 than it was in 2010. This scenario poses real complications and no guaranteed solution. Decreasing the likelihood of power outages will likely require an increase in generating capacity, more transmission lines, or both, and all of these options are very expensive.
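To put those percentages in perspective, here is a back-of-envelope projection. Only the 28% and 15% growth rates come from the reports above; the baseline consumption figures in the sketch are hypothetical placeholders, not sourced data:

```python
# Back-of-envelope projection using the reported growth percentages.
# The baseline consumption figures are hypothetical placeholders.

def projected_demand(baseline_twh, growth_pct):
    """Apply a simple percentage increase to a baseline demand figure."""
    return baseline_twh * (1 + growth_pct / 100)

us_2011 = 4_000        # hypothetical US baseline, in terawatt-hours
ontario_2010 = 140     # hypothetical Ontario baseline, in terawatt-hours

print(projected_demand(us_2011, 28))       # 28% higher by 2040: 5120.0 TWh
print(projected_demand(ontario_2010, 15))  # 15% higher by 2030: ~161 TWh
```

Even as a rough sketch, the scale is clear: a quarter more demand on the same aging infrastructure is not a rounding error.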


New transmission lines are needed in the future

The 2003 blackout proved that a centralized grid system is very vulnerable to a large-scale outage, and 10 years after the fact, the grid is still a largely centralized operation. Reversing that trend, another viable option would be a more decentralized grid in which power is generated in a number of different locations and distributed over shorter distances. Although such a grid would not be immune to blackouts either, an outage would be unlikely to affect all regions at once, as each region would be able to generate power independently of the others.
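The difference can be sketched with a toy model (the region names and topology here are invented for illustration, not a model of any real grid): in a centralized layout, every region depends on one source, so a single failure darkens everything; in a decentralized layout, each region keeps its own generation and a failure stays local.

```python
# Toy model with invented regions: a region loses power only when
# every source feeding it has failed.

def regions_without_power(grid, failed_sources):
    """Return the regions whose power sources have all failed."""
    return [region for region, sources in grid.items()
            if not (sources - failed_sources)]

# Centralized: every region depends on one big plant.
centralized = {"north": {"central"}, "south": {"central"},
               "east": {"central"}}

# Decentralized: each region generates its own power locally.
decentralized = {"north": {"north_plant"}, "south": {"south_plant"},
                 "east": {"east_plant"}}

print(regions_without_power(centralized, {"central"}))
# -> ['north', 'south', 'east']: one failure blacks out the whole grid

print(regions_without_power(decentralized, {"north_plant"}))
# -> ['north']: the failure stays contained to one region
```

A real grid sits somewhere between these extremes, of course; the sketch only shows why fewer shared dependencies mean smaller blast radii.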

As was mentioned in a previous blog post, decreasing energy consumption is a viable option, as it requires no additional infrastructure and would result in a less strained grid. However, even if we managed to decrease per capita energy consumption drastically, the infrastructure itself would still pose a problem. Simply put, the grid’s infrastructure is old and needs to be replaced; much of the current grid remains unchanged since the 1960s. According to Massoud Amin, a professor of electrical and computer engineering at the University of Minnesota, upgrading transmission lines throughout North America alone would cost about $80 billion over the next 10 years.

Grid Updates

Updating and refurbishing the grid will be an expensive but necessary project

The end goal that consumers and distributors alike want is a smart grid: one capable of monitoring itself and automatically adapting to and compensating for any conflict involving generation, consumption and distribution. This dream, however, is a good 20 years away and is largely dependent upon investments in better data collection tools (such as PMUs). Amin estimates that a smart grid will cost, at the very least, $340 billion over the next 20 years. Such a figure seems like a crippling blow to any smart grid proponent; however, such an investment may pay for itself over time as inefficiencies are eliminated and blackouts, which are already expensive, economy-crippling events, are prevented.

In any case, the worst thing that can be done is nothing at all, and the steps we are taking now seem to be bearing fruit, as there hasn’t been a blackout of that magnitude in the 10 years since. Given the daunting (and expensive) task of recreating and refurbishing our grid, the future seems somewhat bleak, but making the right decisions today will, at the very least, guarantee that the future won’t be dark.