Originally published on Medium
In the second half of the 20th century, the personal computer brought on a series of advancements in the spread of information, productivity, and collective intelligence. No one, not even Alan Turing, who designed the machine that broke the Enigma code in WWII, could have anticipated that simple ones and zeros would have such an impact on society, possibly greater than any technology ever invented. Now, more than a century after the industrial revolution, another opportunity lies before us. This time, it is the rising need for global energy security and environmental sustainability that offers the greatest socioeconomic opportunity for new technology. So far, however, only incremental improvements have been made. The prospect of large-scale wind and solar adoption, for example, is limited by the fundamental intermittency of those resources. Can the next wave of innovation really offer transformations in energy as monumental as what computers did for information, productivity, and interconnectivity?
To offer an alternative approach, we need to examine our current energy infrastructure for what it is worth. The power grid in the United States has been in place since the early-to-mid 1900s. It's old. Yet according to the National Academy of Engineering, it is also the most "significant engineering achievement of the 20th century." It was born of an industrial era in which power was generated within city bounds and close to populated areas, where centralization reaped greater efficiency. As it currently stands, the grid is the largest interconnected machine on the planet, with 9,200 electric generating units, over one million megawatts of generating capacity, and over 300,000 miles of transmission lines. The whole system is valued at $876 billion by the industry group Edison Electric Institute. Impressively, despite its age, the Department of Energy states that the country's power grid is 99.97% reliable.
After all, the central power grid is based on a serial architecture: generation precedes transmission, which is followed by distribution. To achieve such a tremendous level of reliability, it requires backup reserves that can be made available at a moment's notice. Load-following and peaking plants are two types of reserve capacity that can be dispatched if the main source of power goes offline, or for demand response. At any one time, there is twice as much reserve capacity available as is utilized. This means that not only does the reserve capacity sit idle most of the time, but the downstream transmission and distribution (T&D) infrastructure must also be designed to handle the total capacity regardless of usage; to recoup the cost, utility companies charge larger industrial ratepayers for peaking capacity even when it is not used. Such serial redundancy is how our centralized system maintains continuity and prevents supply breaches. The associated costs, which include both fixed (grid optimization) and variable costs (peaking/load-following capacity), are passed down to ratepayers to hedge against the 0.03% of the time when the grid is unavailable (and even that does not prevent weather-related outages!). For those 2.6 hours every year, according to the DOE, the US economy suffers $70–$200 billion in losses from mostly weather-related outages, such as lost wages and economic output. And, according to estimates, the cost of maintaining the status quo over the next two decades is roughly $14 trillion.
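The DOE figures above are easy to sanity-check with a few lines of arithmetic (the 99.97% availability and the $70–$200 billion outage-cost range are the quoted estimates; nothing else is assumed):

```python
# Sanity-check the DOE reliability figures quoted above.
HOURS_PER_YEAR = 365 * 24  # 8,760 hours

availability = 0.9997  # grid uptime per the DOE
downtime_hours = (1 - availability) * HOURS_PER_YEAR
print(f"Expected downtime: {downtime_hours:.1f} hours/year")  # ~2.6 hours

# Implied cost per outage hour, using the DOE's $70-$200 billion range
low, high = 70e9, 200e9
print(f"Cost per outage hour: ${low / downtime_hours / 1e9:.1f}B "
      f"to ${high / downtime_hours / 1e9:.1f}B")
```

So the "three nines and change" of grid reliability still leaves an outage bill measured in tens of billions of dollars per hour of downtime.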
Overhauling the grid is akin to prolonging the life of a decades-old car that breaks down every time the weather gets cold. Worse, the reluctance to take on this stop-gap investment goes beyond its astronomical cost; it also has to do with public policy. The great impediment to infrastructure investment is the question: who should make the investment?
One factor in the public policy discussion is the increased penetration of intermittent solar and wind power. Utilities will have to make significant investments to modernize the grid even as their revenues under the current cost-recovery model decline due to rising levels of residential PV. The problem is that more than half of the cost of power delivery is fixed, and only a tiny fraction of it, perhaps 10%, is recovered through variable charges. The mismatch is visible in solar installations, where a 50% share of variable PV capacity can represent only 5% of the total energy supplied!
Another issue is the geographically uneven nature of solar and wind energy. The sunbelt states, for instance, account for 85% of total solar radiation. The same applies to the windbelt states (ND, SD, TX, KS, MT, OK, CO, NM), which account for 93% (EIA, 2009c) of total land-based wind power capacity. The greatest energy users, however, are in California and the Northeast. And since utility companies and grid operators are also divided geographically, determining who should fund the T&D expenses is a real challenge; those expenses build up quickly when every mile of above-ground transmission line costs as much as a new Ferrari to build.
Lack of infrastructure investment is already disrupting Hawaii's residential solar market. As of April 2015, roughly 12% of electricity customers have rooftop solar, and installed capacity has already exceeded 120% of the utility's daytime load. This overloading is causing a backlog of residents waiting to install solar panels. One of them is Allan Akamine, 61, a manager for a cable company, who spent $12,000 on rooftop solar panels but has been waiting 18 months to negotiate a tie-in with Hawaiian Electric Company, the state's largest utility. As a result of the customer backlog, the growth of PV installations in Hawaii slowed in 2014 for the first time. Given that PV homes still represent only a small percentage of Hawaii's residences, grid optimization is badly needed to address solar installations at the most granular level: behind every meter.
- No new infrastructure
- No distribution
- No transmission
- No intermittent storage
- No grid feed-in
- No permit requirements
- No peaking costs
Alternatively, imagine another approach to modernizing our energy infrastructure, one that does not require revamping thousands of generator units and nearly a million miles of T&D lines. Imagine, instead, an energy infrastructure that relies on billions of localized, out-of-sight micro-generators to power our modern economy. Like personal computers, this network of micro-generators would yield unprecedented levels of user data, efficiency, and interconnectivity. In many places across the country today, the only way a utility company learns of a blackout is from customers calling in! By that time, millions of residences and businesses are affected and costly damage is done. Such a scenario would not happen in a world of interconnected micro-generators, since system-level reliability improves rapidly with the number of redundant units.
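The redundancy argument can be sketched in a few lines. If each micro-generator is independently available some fraction of the time, the chance that every unit in a local cluster fails at once shrinks geometrically with the number of units. The 90% per-unit availability below is an illustrative assumption, not a measured figure:

```python
# Availability of a cluster of N independent micro-generators,
# assuming the cluster stays up if at least one unit is running.
def cluster_availability(unit_availability: float, n_units: int) -> float:
    """Probability that at least one of n independent units is up."""
    return 1 - (1 - unit_availability) ** n_units

UNIT_AVAILABILITY = 0.90  # illustrative assumption for a single unit

for n in (1, 2, 4, 8):
    print(f"{n} units -> {cluster_availability(UNIT_AVAILABILITY, n):.6f}")
```

Under this assumption, even modestly reliable units in parallel can exceed the central grid's 99.97% availability: four 90%-available units together reach 99.99%.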
Such an off-grid approach to securing electricity supply has already been implemented in part, in the form of microgrids. Some call it a smart grid, and the concept is this: individual islands consisting of multiple generation sources, including solar panels, wind turbines, and even advanced storage, serve the needs of local energy users. The University of Texas at Austin has one of the largest microgrids in the world, with a peak load of 62 MW, serving 150 buildings on its campus.
To push the envelope further, we propose a network of micro-generators that can operate independently of the power grid, create redundancy through a diversity of resources and scales, and incorporate a host of state-of-the-art energy efficiency measures. The advantages include eliminating transmission losses from centralized plants, as well as the on-site use of heat, a byproduct of electricity generation, for space and water heating. Localized, massively interconnected power generators also provide exciting opportunities to use sensors for data analytics, management, and feedback controls to make the supply of power more responsive and reliable.
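One way to see the efficiency argument is to compare delivered useful energy per unit of fuel. The numbers below (33% central-plant electric efficiency, 5% T&D losses, and an on-site unit that turns 30% of its fuel into electricity while recovering another 50% as usable heat) are illustrative assumptions, not measured data:

```python
# Illustrative comparison of delivered useful energy per unit of fuel.
FUEL = 100.0  # arbitrary units of fuel energy

# Central plant: electricity only, with line losses on the way to the user.
CENTRAL_ELECTRIC_EFF = 0.33  # assumed plant efficiency
TD_LOSSES = 0.05             # assumed transmission & distribution losses
central_delivered = FUEL * CENTRAL_ELECTRIC_EFF * (1 - TD_LOSSES)

# On-site micro-generator (combined heat and power): no line losses,
# and the waste heat is captured for space and water heating.
CHP_ELECTRIC_EFF = 0.30      # assumed electric efficiency
CHP_HEAT_RECOVERY = 0.50     # assumed fraction of fuel recovered as heat
chp_delivered = FUEL * (CHP_ELECTRIC_EFF + CHP_HEAT_RECOVERY)

print(f"Central plant delivers ~{central_delivered:.0f} units of useful energy")
print(f"On-site CHP delivers  ~{chp_delivered:.0f} units of useful energy")
```

With these assumed figures, on-site generation that puts its waste heat to work delivers well over twice the useful energy of a remote plant that throws that heat away and then loses a further slice in the wires.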
There is no greater challenge, and no better time than now, to transform our energy infrastructure. The impetus for such a change through micro-generation can be summed up as the vital need to increase the visibility, flexibility, and reliability of our energy supply. As Dr. Cheryl Martin of ARPA-E put it, "Looking back 50 years from now, I'm hopeful that those who judge it [energy innovations] will see the same level of impact [as the internet]." I agree. Having an internet of energy would be exciting.
Pyro-E is a provider of a last-mile energy solution that empowers local, sustainable electricity, water and cooling.