Thoughts on Xcel’s 2030 Resource Plan

Xcel Energy, the state’s largest electric utility, has filed their 2016-2030 Resource Plan with the Public Utilities Commission. This begins a long process of commenting and modification until their plan is approved by that body (which can take years). The Resource Plan details what trends in usage Xcel expects, and what resources (like new power plants, etc) are needed to meet that demand. The plan is important because it identifies the infrastructure investments the utility will need to make, and also the resulting environmental performance, among many other details.

I’m slowly making my way through it, both for professional and personal interest, and hope to highlight some thoughts for you, my dozens of readers.

There are a lot of things to like in the plan, the first being that Xcel is planning to meet State greenhouse gas emissions reduction goals within their own system. This is unlike the previous plan, which showed emissions increasing between 2015 and 2030. The chart below, from Appendix D, compares the two plans. (State goals include reductions of 15 percent by 2015, 30 percent by 2025, and 80 percent by 2050.)

2030 CO2 Emissions Xcel

Most of the planned reductions in carbon pollution come from the addition of renewable energy resources to their system, as the chart below shows. By 2030, Xcel plans for 35 percent of their energy portfolio to be renewables.

Sources of CO2 reductions

However, I think the plan’s assumption about the future cost of the solar portion of those renewables is probably too high.

Xcel plans to add over 1,800 MW of utility-scale solar to their system by 2030 (up from basically zero in 2015). This is a significant increase from the “reference case”, a ten-fold increase in fact. However, this slide was presented at a public meeting at the Public Utilities Commission:

Renewable Price Forecast

Xcel says this in Appendix J about their assumption:

As solar technology is still not fully mature, and costs are expected to decline and conversion efficiency to improve, it was assumed that the $95/MWh price holds throughout the study period. In effect, the assumption is that fundamental cost driver improvements will offset inflation.

So the rate of decrease in solar prices will exactly match the inflation rate? Many sources have documented the dramatic decline in solar PV prices over recent years. Lazard is an oft-cited source, and their 2014 Levelized Cost of Energy Analysis shows the price of energy from solar has dropped 78% since 2009. For comparison, the cumulative rate of inflation between 2009 and 2014 was about 10%. So, at least looking historically, this assumption seems way off.

Of course, the current precipitous declines probably won’t continue forever (most of the cost now comes from things other than the modules themselves). NREL says costs have been dropping on average 6 to 8 percent per year since 1998. If we assume just half of that rate (4 percent per year) from 2014 onward, solar energy would be around $49 per MWh in 2030. Using some very back-of-envelope calculations, a price difference of $46 per MWh in 2030 means costs for new solar energy shown in the Plan’s “Preferred Plan” scenario could be over-estimated by $97 million.
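The compound-decline arithmetic can be sketched in a few lines of Python. The $95/MWh starting point is Xcel’s; the 4 percent annual decline and the 2014 start year are my assumptions, not anything in the Plan:

```python
# Back-of-envelope solar price projection.
# Assumed: a 4% annual cost decline (half of NREL's 6-8% historical
# range), applied from 2014 through 2030 to Xcel's flat $95/MWh figure.
start_price = 95.0   # $/MWh, Xcel's assumption, held flat in the Plan
decline = 0.04       # assumed annual rate of cost decline
years = 2030 - 2014  # study horizon

price_2030 = start_price * (1 - decline) ** years
gap = start_price - price_2030
print(f"Projected 2030 price: ${price_2030:.0f}/MWh")
print(f"Gap versus the flat $95 assumption: ${gap:.0f}/MWh")
```

Run as written, this reproduces the roughly $46/MWh gap discussed above; a steeper decline assumption would widen it further.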

This is significant not just because the price estimates of the Preferred Plan may be too high. In preparing the plan, Xcel also ran dozens of other scenarios, some including CO2 reductions of over 50% in 2030 (compared with 2005). The price difference, according to Xcel, between the Preferred Plan scenario and the scenario with the largest CO2 benefit is $172 million (from Appendix J). Once falling technology costs are factored in, those seemingly too-costly scenarios may actually be more in line with what Xcel is already asking to spend.

Xcel Energy: social cost of carbon is $21 per ton

Old news, but still worth posting. In October, Xcel Energy filed a report with the Public Utilities Commission defending the cost overruns of upgrading the nuclear power plant in Monticello. Via the Star Tribune:

Xcel filed the report in response to the state Public Utilities Commission’s pledge in August to investigate the Monticello investment. The company said that even with the cost overruns, the project benefits customers — saving an estimated $174 million through the remaining 16 years of its license.

Yet that cost-benefit number relies on a “social cost” comparison between keeping the nuclear plant, which emits no greenhouse gases, vs. generating electricity from a plant that does emit them. State law says utility regulators should consider the cost of greenhouse gas emissions, though they’re not currently regulated. Without carbon-emissions savings, the Monticello upgrade would be a losing proposition, costing customers $303 million extra over its life, according to Xcel’s filing.

In interviews, Xcel executives defended the investment, saying they would make the same decision today, even though the utility world has changed since 2008, when the project began. Natural gas, now a favored fuel for power plants, is low-priced thanks to the fracking boom. And electricity demand has lagged since the recession, dampening the need for new plants.

“If we didn’t have our nuclear plants, we would be taking a big step backward in terms of our CO2 accomplishments,” said Laura McCarten, an Xcel regional vice president.

If you dig into the dockets (CI-13-754), you can find that Xcel’s modeling assumptions include a price on carbon of $21.50 per metric ton starting in 2017.

Regardless of your feelings about nuclear power, a utility stating that the externalities of carbon should be priced when making energy planning and financing decisions is significant.  The use of a ‘social cost of carbon’ (SCC) metric at the federal level has (not shockingly) been a point of some contention.  For reference, the Office of Management and Budget’s SCC is $35/mt in 2015, versus Xcel’s $21.50 in 2017.

Theoretically, we should start to see this figure (or something similar) used in all future energy planning decisions in Minnesota (Sherco, cough, cough).  Unless, of course, Xcel was only being selective in order to justify recovering this very large expense (and spare the shareholders).

It would be an interesting exercise to apply this Minnesota SCC to land use and transportation infrastructure and planning decisions.

All the ways we subsidize growth

Elsewhere, I reflect on a recent event I attended, “Kicking the Habit: Unsustainable Economic Growth,” which featured Chuck Marohn delivering his Strong Towns message.  I focus on one issue featured prominently in the Strong Towns narrative: intergovernmental transfer payments, which subsidize growth and potentially hide the true cost of development.

In Minnesota, we build roads really well. If you look at the metro area, we’ve created a system where, despite wide differences in job and housing density, commute times are virtually the same whether you live in Dahlgren Township or Loring Park in downtown Minneapolis. We also have a semi-famous regional government that makes connection to the same wastewater system easy, no matter where you are in a 7-county region that includes both farms and skyscrapers. All these things (and more) are made possible by shared resources, often collected from one area or community type and sent to another with a different character. Somehow we’ve determined that this is a good thing (for ease of access, equity, environmental protection, political will, etc.). As I listened to Chuck I thought, “you’d really have to remake how local governments interact if you wanted to promote (or even test) the idea that our ‘most productive places’ should be differentiated from our least productive.”

I won’t attempt to figure out how this can be done. But I think it’s valuable to think about all these “transfer payments”. There are more than most people ever think about. So, here goes:

Read the rest.

How much would a 10% solar standard reduce Minnesota’s greenhouse gas emissions?

Presenting Curt Tosh's farm-based solar project - Solar Works in Central Minnesota!

This week a bill was introduced in the Minnesota Legislature to establish a 10 percent solar energy standard by 2030.  This would be on top of the existing requirements for utility renewable energy, bringing the total amount of energy coming from renewables in the state to at least 35 percent in 2030.

This bill is being promoted for its job creation aspects, but clearly a key benefit is the reduction in greenhouse gas emissions from the electricity generation sector (which currently produces 32% of the state’s greenhouse gas emissions).  So, by how much would a 10% solar standard reduce Minnesota’s emissions?  Would it allow us to meet our greenhouse gas reduction goals?

The first (and easier) part of trying to put some numbers to this is estimating how much electricity Minnesota will use in 2030.  EIA summaries tell us that Minnesota consumed a little under 68 million megawatt hours in 2010.  Power projections produced for the Annual Energy Outlook tell us that in MRO West (our electricity grid region), the annual growth in electricity consumption will be very modest through 2030, typically under 1% annually.  If you assume these growth rates apply to Minnesota, we may consume over 73 million MWh in 2030, or 8 percent growth over 20 years.
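That projection is simple compounding. Here it is as a Python sketch, assuming a 0.4 percent average annual growth rate (my reading of the “typically under 1%” MRO West projections):

```python
# Projecting Minnesota electricity consumption, 2010 -> 2030.
base_mwh = 68_000_000  # approximate 2010 consumption (EIA)
growth = 0.004         # assumed average annual growth rate
years = 20             # 2010 through 2030

mwh_2030 = base_mwh * (1 + growth) ** years
pct_growth = (mwh_2030 / base_mwh - 1) * 100
print(f"Projected 2030 consumption: {mwh_2030 / 1e6:.1f} million MWh")
print(f"Cumulative growth: {pct_growth:.0f}%")
```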


10 percent of that is 7.3 million MWh in 2030.  Figuring out exactly how much greenhouse gas this would save is trickier.  In 2010, electricity generation accounted for 32% of the total 155.6 million metric tons of CO2 equivalent emissions.  Rough math using EIA consumption figures provides a greenhouse gas coefficient for Minnesota electricity of 0.73 metric tons of CO2e per MWh.  However, this figure will surely go down over the next 20 years as utilities work to meet the existing renewable energy mandates already on the books.  Xcel Energy, which has to meet a more aggressive renewable energy standard than the rest of the state, already has a coefficient closer to 0.5 mt/MWh, which will be declining (see slide 17) to something like 0.42 mt/MWh by 2025.
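The “rough math” behind that 0.73 coefficient is just the electricity sector’s share of 2010 emissions divided by 2010 consumption:

```python
# Statewide average emissions coefficient for electricity, 2010.
total_co2e_mt = 155_600_000   # 2010 MN emissions, metric tons CO2e
electricity_share = 0.32      # electricity generation's share
consumption_mwh = 68_000_000  # 2010 MN consumption (EIA)

coefficient = total_co2e_mt * electricity_share / consumption_mwh
print(f"Coefficient: {coefficient:.2f} mt CO2e per MWh")
```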

So, assume the state’s net greenhouse gas coefficient for electricity is somewhere around 0.5 mt/MWh in 2030 (assuming other utilities and imported electricity are both dirtier than Xcel).  If 10% of our electricity demand is met by solar energy, this would be a savings of 3.6 million metric tons of CO2e.  3.6 million metric tons is about 2.3% of our 2010 emissions total, or about 7% of emissions from the electricity sector in 2010.
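Putting those pieces together (the 0.5 mt/MWh figure for 2030 is, again, my assumption), the savings estimate works out like this:

```python
# Emissions savings from a 10% solar standard in 2030.
demand_2030_mwh = 73_000_000                # projected 2030 consumption
coefficient_2030 = 0.5                      # assumed mt CO2e/MWh in 2030
total_2010_mt = 155_600_000                 # 2010 statewide emissions
electricity_2010_mt = 0.32 * total_2010_mt  # electricity sector, 2010

solar_mwh = 0.10 * demand_2030_mwh
savings_mt = solar_mwh * coefficient_2030
print(f"Savings: {savings_mt / 1e6:.2f} million mt CO2e")
print(f"Share of 2010 total emissions: {savings_mt / total_2010_mt * 100:.1f}%")
print(f"Share of 2010 electricity emissions: {savings_mt / electricity_2010_mt * 100:.1f}%")
```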

Using a net average emissions factor for this calculation may be too simplistic, but it’s the best I’ve got right now.  Those more in the know say that renewable energy like solar will most frequently replace natural gas production, rather than coal or nuclear, as gas is easier to cycle on and off.  I’m not sure whether this would increase or decrease the benefit of this level of installed solar (but I’m working on it).

Update: I was pointed to this journal article by Carbon Counter, which attempts to calculate “marginal emissions factors”, rather than average factors.  It turns out, since the Midwest is coal-heavy, an “intervention” (adding solar, for example) would usually displace coal power first, rather than gas.  The marginal emissions factor they calculate for the Midwest is about 13% higher than the 0.73 mt/MWh I mention above.  The Midwest is somewhat unique in this regard, as most regions show gas as the most common “marginal fuel source”.  It also has the highest marginal emissions factor of all the regional electricity generators examined in the study.  A 13% increase over 3.6 million mt is about 4.1 million mt.
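The marginal-factor adjustment is a single multiplication; as a sketch (the 13 percent uplift is the study’s Midwest figure as I read it, applied here to the average-factor savings estimate):

```python
# Adjusting the savings estimate with a marginal emissions factor.
avg_based_savings_mt = 3_600_000  # savings computed with the average factor
marginal_uplift = 0.13            # Midwest marginal factor, ~13% higher

marginal_savings_mt = avg_based_savings_mt * (1 + marginal_uplift)
print(f"Marginal-factor savings: {marginal_savings_mt / 1e6:.2f} million mt CO2e")
```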

At something near 3 or 4 million metric tons of emissions saved, would a 10% solar standard help us meet our state emissions reduction goals?  Nowhere near on its own, but it would be a significant step in the right direction, especially when combined with strong action in other sectors like transportation and agriculture.  Think of it as part of the Minnesota version of the wedge game.

What is a carbon tax worth?

California has begun a historic cap and trade market in carbon, completing the first auction, with permits going for $10.09 per metric ton.  I’m not sure cap and trade and the offsets it allows are the right way to go. But when I read this, I wanted to understand what such a program might mean for an average Minnesota energy consumer (after all, California is a distant and foreign land).

Xcel Energy, the electricity provider for most of the Twin Cities metro, produced 0.5266 metric tons of CO2e per MWh in 2011.  At the $10.09 auction price, that’s about $5.31 per MWh, or roughly half a cent per kWh.  The EIA says the average Minnesota residential consumption is 813 kWh per month.  This seems awfully high, but we’ll go with it.  At that rate, the average residential customer would pay an extra $4 per month on their electricity bill.

Natural gas is trickier to estimate an average for, although some 2005 data says perhaps 650 therms per year, per household, using metro assumptions about people per household.  That seems low.  We used over 1,000 therms the last two years, but our house is old.  At 0.005 mt of CO2e per therm, the tax would increase the price of natural gas 5 cents per therm.  If you use 1,000 therms per year, that’s about $4.50 more per month.
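Here are both halves of that household math in one Python sketch. The $10.09 price is California’s auction result; the usage figures and the rounded 0.005 mt/therm gas factor are the assumptions discussed above:

```python
# Monthly household cost of a ~$10/mt carbon price in Minnesota.
carbon_price = 10.09        # $/mt CO2e, first CA auction price
elec_factor = 0.5266        # mt CO2e per MWh, Xcel 2011
elec_use_kwh = 813          # average MN residential kWh per month (EIA)
gas_factor = 0.005          # mt CO2e per therm (rounded, as above)
gas_use_therms = 1000 / 12  # therms per month, assuming 1,000 per year

elec_cost = carbon_price * elec_factor * elec_use_kwh / 1000
gas_cost = carbon_price * gas_factor * gas_use_therms
total_yearly = (elec_cost + gas_cost) * 12
print(f"Electricity: ${elec_cost:.2f}/month, natural gas: ${gas_cost:.2f}/month")
print(f"Total: about ${total_yearly:.0f}/year")
```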

So if something like $10 per metric ton were imposed in Minnesota, residential customers might see a utility bill increase of around $8.50 per month, or roughly $100 per year.  The California Public Utilities Commission has proposed a means to offset that cost: residential customers would actually be paid a dividend from the revenue generated by the auctions, which they say would more than offset the cost of the carbon tax.  Commercial and industrial users are a whole other ball of wax I haven’t touched here, and higher energy prices probably mean higher product prices.

All this is not to say that a carbon tax or cap and trade system is appropriate for Minnesota (or the US).  $10 per ton is likely too low, there could be serious equity issues with offsets and increasing energy prices, and other tricky stuff.  But at $10/ton, direct energy costs to residents probably wouldn’t break the bank.

$20 billion to protect NYC from climate change

From WNYC:

City Council Speaker Christine Quinn laid out a massive $20 billion proposal Tuesday to combat the effects of climate change on New York City’s infrastructure as the region continues to assess damage and plan clean-up after Hurricane Sandy…

The plan was framed around two key issues: how to prevent flooding and how to safeguard infrastructure. It includes studies to assess what solutions – from manmade sea walls to natural defenses like sand dunes – could best protect the city’s most vulnerable neighborhoods.


A reader shares an infographic decrying “America’s crumbling infrastructure”.

Ryan O’Connor shares this infographic from Strong Towns on the challenges facing Memphis (and the potential solutions).  This may be one of the more straightforward explanations of Strong Towns solutions I’ve seen to date.

Incidentally, the very first comment on the infographic is from Chuck Marohn, Strong Towns founder.

The lack of context in this bit of propaganda is disappointing. It is formatted to insinuate that there is this huge problem with maintenance (there is) and that the problem is not enough money (it isn’t). If you start to break out these numbers you see that every American family of four has the responsibility to pay to maintain 176 feet of pipe ($26,400), 5 feet of highway ($5,700), 0.6% of a bridge ($20,000). $2.2 trillion is $29,000 for a family of four OVER THE NEXT FIVE YEARS. Maybe….just maybe….we’re not making very productive use out of everything that has been built up to this point and, if so, maybe….just maybe….a more viable economic solution would be to start. Come on – you can do much better than simply repeating ASCE’s worn out propaganda.

A challenge to the market-oriented urbanists


Josh Barro, over at City Journal, makes some good points about the real contribution of subsidies to the auto/transit war.  However, I’m disappointed that this is yet another example of “market-oriented urbanists” (MOUs) admiring the problem without proposing a solution.  Barro, like others before, posits that local political decisions about planning and zoning laws are standing in the way of market operations which would achieve beneficial results for us all.  If only we would just change the dang zoning, dense housing would rise, rents would fall and transit would become a more attractive travel mode (in this world there are few, if any, externalities of dense development, a position we’ll take as given for the rest of this post).

Understanding the impacts of restrictive zoning on rents is important. But every time I read one of these change-the-zoning posts, I can’t help feeling that I’m watching the discovery of a concept (densifying urban areas) that smart growth advocates and planning students have known and been advocating for a very long time.  Clarence Perry dreamed up the “Neighborhood Unit” in 1929 in an attempt to address the nation’s rising automobility and associated externalities (the Neighborhood Unit called for at least ten units per acre). There may be more market demand now for dense, transit- (or stuff)-oriented development, but the issues are the same.

More calls for density based on market forces, fine.  But what almost every single one of these articles seems to lack is any robust exploration of how zoning rules are adopted, enforced, and changed and what exactly the author proposes as an alternative.  Barro, after spending eight paragraphs detailing auto vs. transit subsidies, says “Cities should allow dense development…but locals tend to oppose greater housing density.”  Solution? None given.  Barro states, “A much smarter approach…allow looser urban zoning”.  Got your fairy wand ready for waving?  Me neither.  I haven’t yet read Matt Yglesias’s “The Rent is Too Damn High”, probably the pinnacle of pundit-driven, change-the-zoning rallying cries, but every reference to it I read talks about “regulatory framework” not public process or neighborhood preferences.  Do away with parking minimums, unrestrict maximum heights, reduce setbacks.  All fine. What’s the roadmap? How will Yglesias, Barro or Lee move these changes through the court of local landowner public opinion?  The public process piece is mostly overlooked.

Zoning laws are made by men and women and enforced by the same, oftentimes by existing landowners who are risk- and change-averse.  How often has this scene played out across America: 1) Developer buys property. 2) Developer decides that in order to make a profit, he/she must build something larger than zoning allows. 3) He/she goes to the neighborhood board/zoning board to ask for a rezoning or variance. 4) Neighbors howl that the building is too tall/will generate too much traffic. 5) Zoning board caves or developer backs out. 6) Project doesn’t get built or is downsized.  The other process to change local zoning happens like this: 1) City decides to update their comprehensive plan (and zoning to implement it). 2) A public process occurs. 3) Density is usually restricted in some or many places to less than the market might bear (especially in existing single-family neighborhoods). This is how zoning law gets made in America.  There is no single authority, no dusty bureaucrat simply refusing to pull the magic zoning lever that will unleash the benevolent market forces.  It’s individual homeowners and developers showing up at public meetings, testifying, and sitting on advisory boards.  It’s elected city councils voting based on the feelings of their (loudest) constituents.  Future residents don’t typically have much of a voice.  This is local democracy in action.

Do the MOUs suggest making land use authority more regional and less local as Yglesias hints at in his NYT interview?  They will encounter some strong resistance from some other “libertarians”.  Do they suggest some changes to the local public process used to adopt/change zoning rules?  If so, I haven’t read any detailed proposals yet.  Andres Duany, famous architect and urban planner (and not someone I would classify as a MOU), has proposed citizen juries, but I’m not aware of many other proposals.  Do they propose abolishing some or all local land use authority and process?  They will likely meet strong resistance from all sides, conservative and liberal alike.

So my challenge to the MOUs is this: stop writing about rent-spiraling zoning.  We get it, in some places developers can’t build as tall as they want/rents are too high.  This is the easy part.  Start writing about public process.  What changes do you propose to local government decision-making processes that would speed development and/or make the costs and benefits of planning and zoning decisions (especially the long-term ones) more plain?  This is the hard part.  Public sector planners have been working on it for quite some time, and haven’t really come up with a great solution yet.  We could use your help.

Update: Josh Barro has in fact proposed some solutions, to which he pointed me.  I don’t find any of these particularly realistic, except perhaps moving towards more rental (which is still a long shot).  None of these are process solutions either, more like total structural shifts.  He actually mentions abolishing local land use authority which I mention above, but ultimately talks himself out of it.

America’s first carbon tax

Via Terrapass:

It’s finally here. The first overt economic deterrent aimed at US consumers for their emissions of greenhouse gases has arrived on our shores. Figuratively, at least.

This past week, most major US airlines levied a $3 ticket surcharge on all flights to and from European Union (EU) nations after a European court determined that the “EU Aviation Directive” can and should apply to them. This means that US-based airlines will need to acquire and submit carbon emission permits in line with their emissions, consistent with the EU emissions trading scheme.

Boulder has actually had a carbon tax since 2007, but the airline fee is the first with a national impact.

Taking local action

Minneapolis Skyline

Over at Grist, David Roberts lays down the brutal logic of climate change:

With immediate, concerted action at global scale, we have a slim chance to halt climate change at the extremely dangerous level of 2 degrees C. If we delay even a decade — waiting for better technology or a more amenable political situation or whatever — we will have no chance.

And what’s so special about 2 degrees C?  Well, that may be something like a point of no return.

The thing is, if 2 degrees C is extremely dangerous, 4 degrees C is absolutely catastrophic. In fact, according to the latest science, says Anderson, “a 4 degrees C future is incompatible with an organized global community, is likely to be beyond ‘adaptation’, is devastating to the majority of ecosystems, and has a high probability of not being stable.”

Roberts is citing the work of Kevin Anderson, former head of the UK’s leading climate research institution.  Other scientists are making similar predictions.  James Hansen, director of NASA’s Goddard Institute for Space Studies, says, “The target of 2C… is a prescription for long-term disaster”.  Increasingly, you don’t have to look far to find words like “apocalyptic” being used to describe the path we’re on.

So we need to reverse course on emissions by 2015, and in dramatic fashion.  But the latest round of international talks seem to be on shaky ground.  All US climate bills have so far failed.  So what’s a local planner or public official to do?  Decry the problem as global in scope and thus unsolvable? Shrug shoulders and pour a stiff drink?  While I have a healthy amount of skepticism about the ability of one jurisdiction or even one state to have a measurable impact on the global trendline, I think we absolutely must be making our best efforts now, for a number of reasons:

Continue reading