Renewable energy technology has been split into two camps since it became a reality around the turn of the century.
On one side are the passionate environmental believers, for whom inflated subsidies were an irrelevance in the face of saving the planet; on the other are the naysayers, for whom the arguments about global warming were a plot by the far left to raise taxes or pursue a tree-hugging environmental agenda at the expense of business and consumers.
Neither polarized position was fair, of course. The quiet majority in the middle has watched the technologies become progressively more efficient and costs fall dramatically. The most extreme global warming horror stories have been discredited, but the hard science of gradually rising carbon levels has been widely accepted.
Who Cares Why The Temperature is Rising?
In the process, a wider acceptance has gained ground that global temperatures really are rising, and that betting on it being a natural cycle rather than man-made is not a risk we can afford to take. Action to reduce carbon emissions will ultimately be cheaper than many of the possible downside scenarios if emissions are left unchecked, and most people would accept that we are making a mess of our environment and really should behave more responsibly.
Meanwhile, politicians have been plowing taxpayer money into supporting wind, solar and a number of other "renewable" technologies, with some degree of success. Costs for the major renewable energy sources, solar and wind, have fallen, partly as a result of technology improvements and partly due to economies of scale, to the point where private firms are signing up to invest in major wind projects for a tariff of just $100 per megawatt-hour (€90/MWh). Indeed, in Europe all the extra power capacity added since the mid ’90s has been renewable.
The biggest hurdle renewables now have to overcome is not the cost of production, but the curse of intermittency. Where does the power come from when the wind doesn’t blow or the sun doesn’t shine?