Modeling the Grid — Breakthrough

To start the new year off with a bang, I may be going out on a limb here, but I don’t think so. I hope you’ll take a close look at this….

DOE, EPRI, and the entire power industry are abuzz with talk about how the grid can be operated better. The grand vision comes up hard against the incredibly difficult problem of modeling. For many decades, the best mathematicians, operations researchers, utility engineers, and others have struggled to come up with computerized representations of the grid that can guide planners and operators.

Since the beginning, despite ever faster and cheaper computers, and tremendous innovations in algorithms and computational methods, the state of the art has been forced into many bad compromises among such factors as speed, accuracy, detail, breadth, time domain, treatment of boundary effects, and applications. Unless corners are cut, a solution might not be found at all (i.e., the computation may fail to converge). Areas of study and tools are stove-piped into many separate categories by time-scale and function:

– Real time (sec. to minutes)
optimal power flow, voltage and frequency control, contingency analysis

– Short term (hours to a week)
unit commitment, thermal-hydro coordination

– Annual (1-3 years)
maintenance scheduling, rate-design, production costing, hydro scheduling…

– Long term (3-40 years)
generation expansion, transmission planning, etc.

(see "A Primer on Electric Power Flow for Economists and Utility Planners" EPRI TR-104604, Feb 1995.)
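To give a flavor of what even the simplest of these computations looks like, here is a toy DC power flow in Python. This is my own illustrative sketch, not any vendor's method: real tools solve the full nonlinear AC equations over thousands of buses, which is where the convergence troubles described above come from.

```python
import numpy as np

# Toy 3-bus DC power flow. Bus 0 is the slack (reference) bus.
# Line susceptances (per-unit) keyed by bus pair -- made-up numbers.
lines = {(0, 1): 10.0, (1, 2): 8.0, (0, 2): 5.0}

# Net injections at the non-slack buses (generation minus load, per-unit):
# bus 1 injects 0.5 pu, bus 2 draws 0.9 pu; the slack bus absorbs the rest.
P = np.array([0.5, -0.9])

# Build the bus susceptance matrix B, then drop the slack row/column.
n = 3
B_full = np.zeros((n, n))
for (i, j), b in lines.items():
    B_full[i, i] += b
    B_full[j, j] += b
    B_full[i, j] -= b
    B_full[j, i] -= b
B = B_full[1:, 1:]

# Solve B * theta = P for the bus voltage angles (radians, slack = 0).
theta = np.concatenate(([0.0], np.linalg.solve(B, P)))

# Line flows follow directly from the angle differences.
for (i, j), b in lines.items():
    print(f"flow {i}->{j}: {b * (theta[i] - theta[j]):+.3f} pu")
```

The DC approximation is linear, so it always "converges"; the hard cases in practice are the nonlinear AC versions, with voltage magnitudes, reactive power, and operating limits layered on top.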

To make things worse, the industry is highly fragmented and way behind the curve. Utilities don’t have the same cadre of experts in-house that they used to. Vendors sell "black-box" solutions that don’t live up to their promises. Obsolete tools continue to be used because "everybody else uses them" and "regulators accept them." (Never mind that the results may be worthless.) A guru of power flow analysis, now retired, told me that much of the industry isn’t even using more powerful real-time analysis tools that have been available for over 25 years.

So there are major institutional problems as well as technical ones, and the two are intertwined. Not only is the problem fiendishly hard, but a lot of people also have vested interests in the status quo (e.g., experts have devoted entire careers to current methods, and don’t look kindly on upstart claims of a breakthrough, just as in every field of human endeavor).


This is a long prologue to a story of just such a claimed breakthrough. Optimal Technologies appeared on the scene late in 2001, announcing they had analyzed the June 14, 2000 California blackout, and stating they could have prevented it by fine-tuning the grid according to results from their analysis tool, AEMPFAST.

Needless to say, the world was not especially open to the idea that a newcomer had succeeded in coming up with a methodology that did what so many had sought for so long:

"AEMPFAST is based on a new near-real-time (solves a several thousand bus system in milliseconds) mathematical approach to network analysis, optimization, ranking, and prediction called QuixFlow … a proprietary N-Dimensional (non-linear) analysis, optimization, and ranking engine that also has defendable predictive capabilities and is applicable to any problem that can be modeled as a network. … QuixFlow uses no approximations; it handles multiple objectives; and is able to enforce multi-objective inequality constraints." [from factsheet – see link below]

I have been closely following the company’s progress since then. Their revolutionary claims are finally beginning to overcome the natural skepticism and resistance. At least one major ISO/RTO is signing up, and DOE and a number of large utilities are taking it very seriously. The implications are, as Donald Trump would say, "huge".
Here is an introduction in the company’s own words:

Optimal Technologies is a private company focused on making power-grid systems more efficient, more reliable, and more cost-effective to plan and operate. In other words, "smarter". Think of Optimal as the Internet for power grids [or SONET for telecommunications]: self-healing, self-enabling, lowest-cost operation with highest reliability.

Problem: Power system infrastructures and the grid networks that support them are breaking down faster than solutions can be developed to address the underlying problems.

Because of inadequate core technologies, and especially slow and limited mathematical tools, the utility industry is plagued with many tools based on algorithms that no longer work well for their intended tasks and that do not work well together. Last year’s blackout, which affected more than 50 million people, should help provide some context. Despite new advances in materials and hardware, blackouts and brownouts are becoming larger and more common because utility system planning and control methods are still in the horse-and-buggy era: done much as they were 50 years ago, fragmented and piecemeal. In other words, even though system peripherals (such as wind energy, distributed gas generation, fuel cell generators, meters, and demand-side management) are improving, the core grid Operating System that makes them all work well together doesn’t exist.

New Technology: Our software and hardware solutions are based on a revolutionary new mathematical approach to network analysis, optimization, and management. Our technology is far better than current approaches to understanding and managing networks, and allows for both local and integrated, end-to-end views of Generation, Transmission, Distribution and Load. Unlike competing products, our technology can view the complete energy delivery supply chain as an integrated asset, which allows for entirely new levels of risk review and risk management — previously not possible. Optimal’s new technology should be viewed as "Foundational" in that it has pervasive application within the power industry and provides a common framework for many new tools.

Optimal’s Solution: Think of us as the much-needed underlying "operating system engine" that integrates, defragments, and prioritizes utility planning, operations, and business processes in the best controllable and defendable way. Our technologies have the ability to simultaneously analyze, optimize, and manage generation, transmission, distribution, and customer load, down to the individual power line and building. Instead of viewing customer load as a problem, our technology has the ability to turn all aspects of the system, including customer load, into potential risk-reducing resources [i.e. reliability enhancers], something not otherwise possible.

Products: Applications include: Congestion Management, Locational Marginal Pricing, Simultaneous Transfer Limits, Multi-Dimensional Reliability, Automated Network Planning, Emergency Control, System Restoration, and Smart Asset Management.

Beyond the scope of this note, Optimal also has a suite of software and hardware for the demand side, which enables measurement and control — and optimization — down to individual loads.
There is a great deal of information on the company’s website:

Roland Schoettle, CEO
Optimal Technologies International Inc. 707 557-1788

AEMPFAST FACTSHEET (good starting point)

DG Update

Has DG (distributed generation) gone quiet, or mainstream, or both? Meanwhile, the DOE program has not fared well in the proposed budget. Congressional earmarks are taking up so much money that DOE is forced to cancel some ongoing DG applications projects.

 Here are some developments and updates.

 – DUIT Facility Up and Running 
 – CADER Meeting  Jan. 2004
 – IEEE 1547 Interconnection Standards
 – PG&E DG Interconnection program


Distributed Utility Integration Test Facility

The Distributed Utility Integration Test (DUIT) is the first full-scale, integration test of commercial-grade, utility grid interactive Distributed Energy Resources (DER) in the U.S.  DUIT addresses a key technical issue: electrical implications of operating multiple, diverse DERs at high penetration levels within a utility distribution system.  DUIT’s test plan is intended to focus on grid interaction, integration and aggregation issues, not on DER technology itself. 

After an exhaustive study of program goals and alternative sites, DOE selected the facilities at PG&E’s Modular Generation Test Facility in San Ramon, CA as the home of the new DUIT Facility. Pre-existing buildings, labs, and professional staff helped make the choice, along with the adjacent test substation and high-current yard. The site held an official opening ceremony in August 2003.

The facility offers a realistic yet controlled laboratory environment, enabling testing of normal and abnormal operational conditions without interfering with a customer’s electric service. DG equipment at the site is commercially available and all on loan to the project from the vendors: inverters, rotating machinery, and generation and storage devices. DUIT provides a full-scale, multi-megawatt implementation, testing, and demonstration of distributed generation technologies in a realistic utility installation.

Utilities may want to take note that DUIT will be confirming and testing to the newly passed IEEE 1547 Interconnection standard, which is expected to be adopted by a large number of state regulators and legislators. Similarly, for California, DUIT will  be testing to the Rule 21 document.

To inquire about prospective DUIT project participation, technical specifications, test plans, project plans or the DUIT white paper, contact the DUIT Project Team.  Reports will be issued by CEC and other sponsors beginning this Summer, and information will be available on the DUIT website:

Susan Horgan, DUIT Project Leader
    Distributed Utility Associates

For the complete history:
"DUIT: Distributed Utility Integration Test", NREL/SR-560-34389, August 2003 (250 pages)


CADER (California Alliance for Distributed Energy Resources)

The 2004 DG conference in San Diego on January 26-28, 2004 had 202 attendees.

Presentations are posted on CADER’s website:

The draft DG-DER Cost and Benefit Primer was developed as a first step to support the discussions at the "Costs and Benefits of DER" session at the conference. Comments about the document can be provided via the CADER member list-server to reach all members.


IEEE 1547 Update

As you know, "IEEE 1547 Standard for Interconnecting Distributed Resources with Electric Power Systems" was approved by the IEEE Standards Board in June 2003. It was approved as an American National Standard in October 2003. (available for purchase from IEEE:

SCC21 develops and coordinates new IEEE standards and maintains existing standards developed under past SCC21 projects. These include the original 1547, along with the four spinoff efforts.

> P1547.1 Conformance Test Procedures for Equipment Interconnecting Distributed Resources with Electric Power Systems (EPS)  (draft standard)

> P1547.2 Draft Application Guide for the IEEE 1547 Standard

> P1547.3 Monitoring, Information Exchange, and Control of Distributed Resources Interconnected with EPS (draft guide)

> P1547.4 Design, Operation, and Integration of Distributed Resource Island Systems with EPS  (draft guide)

P1547.1 and P1547.2 have drafts out to their working groups for review; P1547.1 expects to be ready for ballot early in 2005.
P1547.3 has just completed a draft.
P1547.4 has just been approved as a new initiative, and will be organized over the coming summer.

Complete information is available at:

The next meeting of the IEEE 1547 series working groups will be April 20-22, 2004 in San Francisco. The P1547.1, P1547.2, and P1547.3 working groups will meet concurrently 8 a.m. to 5 p.m. each day. Working groups will be meeting separately – no plenary session is planned.  Details at:


PG&E DG Interconnection program

PG&E held a Distributed Generation (DG) Workshop last December 10. The free event provided PG&E customers and the DG community with practical information on how to navigate the various Electric Rule 21 application and interconnection review processes – from initial application through to permission to parallel with PG&E’s electric distribution system. The focus of the workshop was to communicate PG&E’s internal DG processes and interconnection technical requirements to the DG community. (For details on California’s Rule 21, see:

PG&E has set up an entire cross-company team to deal with all aspects of DG interconnection in a coordinated way. They appear to be very committed to low-hassle, low-cost, minimum-time DG projects. A great deal of information about PG&E’s program (including the 117-page PowerPoint from the workshop) is available at:

Jerry Jackson, Team Leader

PS- Jerry’s office generously offers to send, on request, a hard copy of the nearly 2-inch-thick binder that was handed out at the workshop.

       ———CALIFORNIA RULE 21 ——-

After passing Rule 21 in Dec 2000, the California PUC established, and the CEC coordinated, a working group of all DG stakeholders. Electric Rule 21 Working Group meetings have been held about once a month since mid-2001. The purpose is to establish procedures and work through issues to simplify and expedite interconnection projects. (Agenda and minutes are at:

  California Interconnection Guidebook
  Publication # 500-03-083F
  (PDF file, 94 pages, 1.1 MB), posted online November 13, 2003.

The Guidebook is intended to help a person or project team interconnect one or more electricity generators to the local electric utility grid in California under California Rule 21. Rule 21 applies only to the three electric utilities in California that are under jurisdiction of the California PUC: PG&E, SCE, and SDG&E. The Guidebook is written as an aid to interconnection in these utility areas. It may also be useful for interconnection in some municipal utility areas with interconnection rules resembling Rule 21, principally Riverside, SMUD, and the LADWP.


Recommended:   DG Monitor, a free email newsletter from Resource Dynamics Corp. Archive and subscription at:

DOE Office of Electric Transmission & Distribution (OETD)

Back in March, we thought announcements were imminent. (See UFTO Note – T&D R&D Gaining Attention, 21 Mar 2003.) Little did we realize the kinds of struggles that would ensue internally in DOE over which people, programs and budgets would be won or lost by which office. The new office started its work nonetheless, judging from numerous appearances by its chief, Jimmy Glotfelty, and several planning and roadmapping meetings over the spring and summer. And the dust has settled internally.

OETD officially “stood up” on August 10, but the big August 14th blackout made for awkward timing for a press release–none has been issued. (In fact, until an appropriations bill passes, I’m told they aren’t actually officially “up”.)

A new website quietly appeared on August 21. It offers a first cut at describing the Office and its scope of responsibilities, and gives links to planning documents:

[This site has a good compendium of information on the blackout; however, for the 12 Sept announcement of the release of a report on the events sequence, go to the DOE home page.]

**National Electric Delivery Technologies Vision and Roadmap**
There’ve been two major meetings this year, one in April and one in July. In chronological order:

April 2003 Vision Meeting Proceedings (PDF 1.1 MB)
[65 people attended, of whom only 8 represented utilities]

Results of the April meeting are given in this vision document**. [The results of the July meeting will be reported in a few more weeks.]:

“Grid 2030: A National Vision for Electricity’s Second 100 Years”

**DOE’s National Electric Vision Document
(Final version, July 31, 2003) (PDF 1.2 MB)

Proceedings for National Electric Delivery Technologies Roadmap,
July 8-9, 2003 (PDF 1.0 MB)
[About 20 utilities were represented, by fewer than 40 people out of 200 participants.]

Glotfelty’s kickoff presentation July 8:
“Transforming the Grid to Revolutionize Electric Power in North America” roadmap opening 07 08 03.pdf


No personnel are identified on the new website (other than Glotfelty and Bill Parks, Assistant Director), and no org charts are shown. The most complete descriptions of the programs appear in a series of factsheets:

The work of OETD follows these earlier developments: (see reliability program materials at

— The National Energy Policy (May 2001) calls for the Department of Energy to address constraints in electric transmission and relieve bottlenecks.

— The National Transmission Grid Study (May 2002) contains 51 recommendations for accomplishing the President’s National Energy Policy and speeding the pace of the transition to competitive regional electricity markets.

— The Transmission Grid Solutions Report (September 2002) provides guidance for priority actions to address congestion on “national interest” transmission corridors.

OETD conducts research in several areas:
–High-Temperature Superconductivity
–Electric Distribution Transformation
–Energy Storage
–Transmission Reliability

One participant at the July meeting told me he thought that DOE seems to be in the thrall of superconductors and other mega-technology solutions, and giving short shrift to distributed generation, microgrids, and other common sense approaches.

As for budget, through the end of September (FY03), OETD is operating on funds already committed to the programs that were brought in. Of roughly $85 million in FY03, high-temperature superconductors have $40 million, and $27 million was subject to Congressional earmarks. The FY04 budget request has a new line item for electric power infrastructure, explicitly for transmission reliability, and hopefully FY05 will provide more resources. Another observer said that the future program will be more balanced as a result.

The R&D plan is based on a 3-level architecture:
1. “Supergrid”, or coast to coast backbone for power exchange. (superconducting)
2. RegionGrid
3. CityGrid, ultimately involving fully integrated 2-way power flow, microgrids, etc.

Planning and analysis tools are needed at all 3 levels. The Supergrid is a longer term goal, operational perhaps in 10-15 years. Other near term elements include sensors, storage, and DC systems.

Update on Alchemix HydroMax

The HydroMax technology uses any carbon source including low sulfur and high sulfur coal to produce electricity, hydrogen and syngases which can be used as fuel for gas-fired power plants or converted into diesel, jet fuel, gasoline or ammonia. Alternate carbon sources include petroleum coke, municipal waste, biomass and shredded tires.

The company continues to make excellent progress as the U.S. Patent Office has now allowed 206 claims contained within a handful of patent applications. There is an opportunity to participate in an independent engineering evaluation of HydroMax vs. other hydrogen production technologies (such as gasification), to participate in a demonstration program, and to make a direct investment in Alchemix.


See: UFTO Note – H2 Production Adapts Smelting Technology, 15 Nov 2002:
(password required)

HydroMax adapts existing metal smelting technology to convert dirty solid fuels to clean gases. In iron making, carbon (coke) is mixed into molten iron oxide, and the result is elemental iron (Fe) and CO2. Alchemix’s new process, HydroMax, injects steam into a molten iron bath which makes H2 and iron oxide (FeO). HydroMax then makes use of iron making technology to return the iron oxide to pure iron for re-use. These two steps are done one after the other, and the fixed inventory of iron/iron oxide remains in place. (To produce a steady output stream, two reactors alternate, one in each mode.)

2 FeO + C –> 2 Fe + CO2
Fe + H2O –> FeO + H2
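The steam-iron stoichiometry above gives a quick feel for the scale involved. A back-of-envelope sketch (my own arithmetic, assuming idealized 100% per-pass conversion, which a real reactor won't achieve):

```python
# Rough stoichiometry of the HydroMax steam-iron step (idealized;
# actual per-pass conversion and cycle losses are not accounted for):
#   Fe + H2O -> FeO + H2
M_Fe = 55.845   # g/mol, molar mass of iron
M_H2 = 2.016    # g/mol, molar mass of hydrogen

# Hydrogen produced per tonne of iron oxidized in one pass:
kg_h2_per_tonne_fe = (1_000_000 / M_Fe) * M_H2 / 1000
print(f"{kg_h2_per_tonne_fe:.1f} kg H2 per tonne Fe per cycle")  # ~36 kg
```

Since the iron inventory is fixed and cycled back and forth between the two reactors, each tonne of iron can in principle deliver this amount over and over; the carbon feed pays for the reduction step.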


A great deal of information is available at the company’s website:

Look under “News” and “Shareholders” for several powerpoint presentations and other items. Also a white paper under “Technology”. These emphasize the point that Alchemix provides a bridge strategy between hydrogen now, and the hydrogen economy of the future.

Alchemix says they have the lowest cost zero-emission coal/hydrogen technology, noteworthy in light of the somewhat controversial and problematic DOE FutureGen plan* to spend over $1 billion on a gasification approach. See Alchemix’s comments on how HydroMax will meet the FutureGen goals far more effectively.



Latest developments include specific plans for a commercial demonstration plant to be built in cooperation with members of the Canadian Oil Sands Network for Research and Development (CONRAD). Several members of CONRAD decided on July 15 to proceed with an engineering study to evaluate the HydroMax technology, economics, and environmental impact in comparison with the alternate methods of producing hydrogen (i.e., steam methane reforming, gasification of solids, and partial oxidation of heavy liquids). If the results of the study are positive for HydroMax, as expected, then this group is likely to proceed with funding the first HydroMax plant, to be built in northern Alberta where the oil sands are located.

The plant will use petroleum coke to make 20 million scf/day of hydrogen and 10 MW of electricity. The company projects the plant will be profitable. An executive summary available on the Alchemix website (under “Introduction”) includes pro formas for the plant.

The group in Canada would welcome participation in the study (and the demo plant) by additional companies including US utilities. Alchemix will make introductions for anyone who is interested.

The group includes governmental organizations and private companies that will provide funding for the plant but may not require an equity position, since they are interested in accelerated access to the technology. Alchemix, anticipating a capital requirement on its part for a substantial portion of the project (estimated at US$120 million), has drafted an investment opportunity. The proposal is for a sale of stock in Alchemix, with a call option for another tranche as the project proceeds.

A detailed memo on the rationale for this investment is available (password required) at:

Contact Robert Horton, Chairman

Bicarb Cleans Up Stack Gas Emissions

The same baking soda (sodium bicarbonate) sold in grocery stores and used for 101 things around the home is also one of the best solutions for scrubbing emissions from coal-fired power plants. Purification of flue gas emissions using sodium bicarbonate has long been recognized as a highly effective process for removing SO2, SO3, NOx, and heavy metal compounds. However, sodium bicarbonate scrubbing has three serious drawbacks:

1. The cost of sodium bicarbonate is excessive;
2. The resulting byproduct of the sodium bicarbonate SOx reaction (sodium sulfate) has limited economic value;
3. Sodium sulfate disposal is expensive and poses a significant environmental problem.

Because of these prohibitive operating issues, flue gas scrubbing with sodium bicarbonate has never achieved significant market share, despite its recognition as a superior scrubbing technology.
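For context on the first drawback, the stoichiometry alone dictates a hefty reagent demand. A rough sketch (my own arithmetic for conventional once-through sodium bicarbonate injection; the regeneration step described below is exactly what changes these economics):

```python
# Idealized sodium bicarbonate demand for SO2 capture, once-through.
# Injected NaHCO3 calcines to Na2CO3 in the hot flue gas, which then
# captures SO2 as sodium sulfate:
#   2 NaHCO3 -> Na2CO3 + H2O + CO2
#   Na2CO3 + SO2 + 1/2 O2 -> Na2SO4 + CO2
M_NaHCO3 = 84.007  # g/mol
M_SO2 = 64.066     # g/mol

# kg of bicarbonate per kg of SO2 captured, at a stoichiometric ratio of 1:
ratio = 2 * M_NaHCO3 / M_SO2
print(f"{ratio:.2f} kg NaHCO3 per kg SO2")  # ~2.6
```

In other words, every ton of SO2 removed consumes well over two and a half tons of purchased reagent and produces a comparable tonnage of sulfate byproduct, which is why drawbacks 1 through 3 compound each other.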

Airborne Pollution Control Inc., a Calgary based company, has developed a solution to the challenges of sodium scrubbing. The Airborne process begins with the injection of bicarbonate into the flue, where it reacts with and captures the pollutants. The key to Airborne’s patented process is its ability to regenerate the “residue” (it is converted back into sodium bicarbonate that can be reused for flue gas scrubbing), and at the same time, to make a high-grade fertilizer byproduct.

The Airborne process eliminates the disposal problem, improves the economics, and, most importantly, does a superior job of addressing the multiple pollutants inherent in flue gas emissions. Additionally, Airborne has a proprietary process to granulate its fertilizer. Airborne’s thin-film pan granulation technology makes the fertilizer more stable, shippable, blendable, customizable, and ultimately more valuable.

Together with Babcock & Wilcox, US Filter HPD Systems, and Icon Construction, Airborne is operating an integrated 5 MW demonstration facility to showcase the Airborne Process. The plant is located in Kentucky at LG&E Energy Corp’s Ghent generating facility.

Last year DOE received 36 proposals for projects valued at more than US$5 billion in the first round of President Bush’s Clean Coal Power Initiative. The Airborne Process was 1 of only 8 successful proposals, and was selected for US$31 million in funding for the implementation of Airborne’s multi-pollutant control process.

| Clean Coal Power Initiative Round One
| “Commercial Demonstration of the Airborne Process” [PDF-495KB]

In short, this means that high sulfur coal can be burned in an environmentally friendly and economically efficient manner. The Airborne process removes multiple pollutants and it meets or exceeds all current and pending environmental requirements for SO2, SO3, NOx and mercury. For the first time pollution abatement becomes an economically rewarding investment for the power producer.

Over the next 5 years, Airborne has conservatively targeted applying its technology to 10 new and existing coal-fired electrical generation plants. This conservative target represents less than 1% of the available global market and translates to a total installed capacity of approximately 7,500 MW, out of approximately 800,000 MW of coal-fired power generation worldwide.

One concern with the production of fertilizer byproducts is maintaining a balance between the supply and demand for sulfur-based fertilizers, a demand which is predicted to grow as sulfur emissions are reduced at the source. Airborne has a worldwide marketing agreement with the Potash Corp of Saskatchewan Inc. (PCS), the world’s largest manufacturer and distributor of fertilizer products, whereby PCS will market the various fertilizer outputs, providing Airborne with access to worldwide markets and providing PCS with a unique addition to its portfolio of fertilizer products.

Airborne has made a major investment in the development and demonstration of this patented process and is seeking equity investment partners to take it to the next level.

Contact: Leonard Seidman
T: 403.253.7887 Ext: 310

“Multi Pollutant Control with the Airborne Process” [ 1.1 MB PDF] (… details the experimental and analytical results of a lab and pilot scale 0.3 MW coal fired combustion test facility and the progression to an integrated 5 MW facility)

DOE H2&FC Reviews’03

DOE Hydrogen and Fuel Cells Merit Review Meeting
May 19-22, 2003, Berkeley, CA

(See UFTO Note 10 June 2002 for last year’s meeting.)

“Annual Review Proceedings” are (will be) available:

DOE’s new organization for hydrogen and fuel cells is in place. Steve Chalk heads the program, and has about 20 direct reports for the many sub-areas. The org chart and key contacts list are available here:

Of course, the program got a huge boost when the President announced the $1.2 billion Hydrogen Fuel Initiative and “FreedomCAR” program in the State of the Union address this past January.

In a plenary opening session, Steve Chalk gave an overview of DOE’s response, based on a major planning effort involving many stakeholders. (This is all heavily documented on the website.) He showed budgets steadily growing over the next several years.
H2: $47, $55, $77 million (FY 02, 03, 04)
FC: $29, $40, $88 million

The Plan involves a decade of R&D, with commercialization decisions towards the end, and subsequent “transition” and “expansion” in the marketplace. Meanwhile, “technology validation” projects will attempt semi-real world demonstrations of complete integrated infrastructure elements, e.g. refueling stations (major RFP was announced May 6 for a 5 year “learning demo” of hydrogen vehicle infrastructure.)

The DOE Secretary will have a new Hydrogen Policy Group (heads of EE, FE, Nuclear, etc.) and the Hydrogen Technical Advisory Committee. Lower down, Steve Chalk will work with the Hydrogen Matrix Group and an Interagency Task Force. Of particular note, a new Systems Integration and Analysis office will be set up at NREL, and several “virtual centers” at national labs focused on specific technical areas.

In each area, goals have been established for the various cost and performance parameters. (E.g., by 2005, electrolytic hydrogen at 5000 psi should be produced at 65% efficiency for under $3.75/kg; by 2010, moving hydrogen from central production sites to distribution facilities should cost under $0.70/kg.) [One kg of H2 is roughly equivalent in energy content to one gallon of gasoline, which makes comparisons easier.]
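As a sanity check, the 2005 electrolysis numbers roughly hang together. This is my own arithmetic, assuming an HHV energy content of about 39.4 kWh/kg and ignoring capital and other non-electricity costs (DOE may define the efficiency on a different basis), so treat it as illustrative only:

```python
# Back-of-envelope check on the 2005 electrolysis target.
HHV_H2 = 39.4       # kWh per kg H2, higher heating value (assumed basis)
efficiency = 0.65   # stated 2005 target
target_cost = 3.75  # $/kg H2, stated 2005 target

kwh_per_kg = HHV_H2 / efficiency
# $/kWh ceiling if electricity were the only cost input:
max_power_price = target_cost / kwh_per_kg
print(f"{kwh_per_kg:.0f} kWh/kg -> electricity must cost <= ${max_power_price:.3f}/kWh")
```

Roughly 61 kWh/kg at 65% efficiency means the electricity alone must come in at around 6 cents/kWh or less to hit $3.75/kg, which shows how tightly the target is coupled to cheap power.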

When Chalk’s powerpoint becomes available, it will be worth reviewing if you’re interested in how all of this is going.

This year’s annual review meetings drew a large crowd again. A subset of projects was chosen from each technical area for 20-30 minute presentations, while other investigators were asked to do poster papers instead. Hydrogen and fuel cell sessions were held in parallel (last year they were on separate days), making it impossible to cover everything. A two-inch-thick binder had all the vugraphs, however, and all of it will be posted on the website.

Here are the session headings:


– Production – Biological & Biomass Based
– Production – Fossil Based
– Production – Electrolytic
– Production – Photolytic and Photoelectrochemical
– Storage – High Pressure Tanks
– Storage – Hydrides
– Storage – Carbon & Other Storage
– Infrastructure Development – H2 Fueling Systems & Infrastructure
– Codes & Standards

Fuel Cells

– High Temp Membranes/ Cathodes/ Manufacturing
– High Temp Membranes/ Cathodes/ Electrocatalysts
– Fuel Cell Power Systems Analysis
– Fuel Processing
– Direct Methanol Fuel Cells
– Fuel Cell Power System Development
– Fuels Effects
– Sensors for Safety & Performance
– Air Management Subsystems

A few highlights:

– Codes and standards were compared to the “iceberg below the surface” (i.e., the part that sank the Titanic). The voluntary standards-making process in this country, along with the 40,000 independent local jurisdictions, represents a huge educational and process challenge in making society ready for hydrogen. The recently announced fueling station in Las Vegas needed 16 separate permits, and the local fire marshal was the toughest to deal with.

– Carbon nanotube storage is living on borrowed time. It has the distinction of a stern “Go/No-Go” decision placed in its path (2005), and the science seems not to be making great progress.

– Another Go-No Go decision is set for late 2004, for onboard fuel processing.

– Photolytic H2 production is making slow progress, but researchers close to it acknowledge its practical application can only happen if the right materials are found. The search continues using “combinatorial” methods. (See UFTO Note, 2 April 2003.)

– The fuel cell work seems mostly to be tough slogging with materials and costs, finding formulations and configurations that gradually improve the situation. A fair amount of attention is going toward higher-temperature PEM cell membranes, where hydrogen purity is less of an issue; however, no breakthroughs seem imminent.

– Quite a bit of attention is going to fueling systems. Several projects involve the building of equipment and actual demonstration fueling stations and “power parks”. DTE and Pinnacle West are the only utilities that seem to have really pursued this; each has a major demonstration project in development.

In view of the volume and technical nature of this material, let me suggest that I can dig deeper into any particular area of interest to you, but that otherwise the DOE website has all the documentation on the programs and specific projects.

Other Hydrogen news:

You may have seen Wired 11.4 (April). The cover story is by Peter Schwartz, the famous futurist, who proclaims that a full-blown hydrogen economy is urgent and inevitable. I saw him present the argument at a seminar at Stanford recently, and found it very short on practical specifics and less than compelling. For one thing, he asserts that nuclear will be the major source of energy to make hydrogen a decade or two from now.

Along the same lines, the June issue of Business 2.0 came last week, with a feature story about the head of Accenture’s Resource Group, Mary Tolan, and her blunt challenge to the energy industry to invest like crazy to make the hydrogen economy happen quickly. She says it’s the only way the oil majors in particular will be able to continue to make big profits in the future. She apparently let loose with this at CERA Week, back in February. Business 2.0’s website won’t have it online for a few weeks, but I was able to locate a reference to an Accenture utility industry event that outlines the argument.

Curious to know what you think. In my own opinion, both sound over the top. We’ve got a ways to go before the technology, or the society, will be ready for hydrogen on a massive scale. I’ve written to Ms. Tolan to see if I can get more details as to their reasoning.

New New Solar PV

There are a number of fascinating new developments in the world of solar photovoltaic cells, representing major shifts from the usual crystalline silicon cell based on semiconductor technology (wafers sliced from large single-crystal or polycrystalline ingots), which supplies as much as 80% of the market today. Here is a quick overview; much more information exists on most of these topics.

Evergreen Solar
Evergreen has one of the most mature of the new approaches, and is now a growing public company (symbol ESLR), ramping up production of its unique String Ribbon silicon cell. The Evergreen cell is fully equivalent on a functional basis, but is considerably less expensive to produce than the ingot-slice method. Evergreen anticipates sales of $6-9 million in 2003. The website does a good job explaining the whole story.

Solar Grade Silicon
In March, Solar Grade Silicon LLC announced full production of polycrystalline silicon at its new plant in Washington, the first ever plant dedicated wholly to producing feedstock for the solar industry. They supply the purified silicon that is then melted and made into single crystals, i.e. in large ingots, or Evergreen’s ribbon. In the past, solar cell makers relied on scraps from the semiconductor industry, which won’t be sufficient to handle the growth in the PV industry.

Spheral (ATS Automation)
In one of the stranger sagas of solar, you may recall that in 1995, Texas Instruments finally gave up on a major program to develop “Spheral” solar cells, an effort they’d devoted many years and many dollars to (with considerable support from DOE). Spheral technology comprises thousands of tiny silicon spheres, bonded between thin flexible aluminum foil substrates to form solar cells, which are then assembled into lightweight flexible modules. TI’s goal was to develop a manufacturing process that would drive PV costs to $2/watt. Ontario Hydro Technologies acquired the technology, set up manufacturing in Toronto, and sold some systems, but in 1997, reorganizations and a return to basics led them to sell it off. The technology was apparently dormant until July 2002, when ATS Automation announced it had acquired it, set up a subsidiary, and was scaling up production with plans to be in commercial production this year. The Canadian government put in nearly $30 Million. The jury is out on this one. For the story, go to:

Thin Film-CIGS
Commercially produced thin film PV falls into 3 general categories: Cadmium Telluride, Amorphous Silicon, and CIGS (Cu(In,Ga)Se2). The first two technologies are struggling, with BP’s notable exit from both last November. CIGS is showing some apparent successes and continuing development efforts, and enjoys strong support at NREL, a true believer. There are production facilities doing CIGS, as well as innumerable development efforts around the world to make it cheaper and more efficient. CIGS has the unique feature of becoming more efficient as it ages.

Global Solar**
Global, partly owned by Unisource, the parent of Tucson Electric, is selling thin film CIGS modules to the military, commercial and recreational markets. One product is a blanket a soldier can unfold on the ground. Current production capacity is 2.3 MW per year, and they’re fundraising to expand to 7.5 MW.

Among the new entrants, Raycom is a startup in Silicon Valley, led by veterans of thin film coating for disk drives and optical filters. They believe their experience (and existing equipment) will enable them to avoid the long and painful development cycles that have traditionally characterized the solar PV industry, and be in production in less than 2 years. Their secret is “dual-rotary magnetron sputtering,” a patented process that has already proven effective in high volume manufacturing. Cost targets are under $1 per watt. They have also brought a fresh eye to the formulation of CIGS, and see ways to make it without cadmium, which is highly toxic. Raycom produced their first working cells in a matter of months. They are in the midst of fundraising. One might observe that this is a rare instance where someone comes to PV from manufacturing instead of science. Normally, people develop PV technology in the lab and then endeavor to become manufacturers. This time it’s the other way around. [To see the magnetron sputtering technology, go to:]
Contact David Pearce 408-456-5706,

Konarka has attracted a great deal of attention and sizable VC participation (funding round Oct 02) with promises of a way to commercialize the “Gratzel” cell, which Dr. Michael Grätzel developed and subsequently patented in the 1990’s. The core of the technology consists of nanometer-scale crystals of TiO2 semiconductor coated with light-absorbing dye and embedded in an electrolyte between the front and back electrical contacts. Photons are absorbed by the dye, liberating electrons which escape via the TiO2 to the external circuit. Each electron returns on the other side of the cell and restores a dye molecule. The jury is out on this one: whether it’ll happen quickly, as the company and its investors hope, or whether there is a long road ahead. One of the biggest issues since this idea was first tried has been the stability of the organic dyes.

For a good discussion of dye-sensitized cells, see this pdf:

Nanosys
This Palo Alto-based company has a long list of goals for its nanotechnology, ranging from chemical/biological sensors, to electronics and photovoltaics, based on formulations of nanowires, nanotubes, and nanoparticles. Their idea for PV is reportedly to embed nanorods of photosensitive material in a polymer electrolyte, on a principle not unlike Konarka’s. On April 24, they announced an amazing $30 Million VC funding. You have to wonder about this one, i.e. whether the nano-hype has taken over, and how successful they’ll be with solar as compared with the other areas.

The technology was originally developed at Lawrence Berkeley Lab:

NanoSolar
Also Palo Alto-based, this one is in stealth mode. The basic idea is similar to Nanosys, but they are focused only on solar. They also incorporate technology licensed from Sandia for nano-self-assembly to align the nanorods perpendicular to the surface, which is supposed to make a big difference in efficiency. (Nanosys’s nanorods are said to be randomly oriented in clumps.) NanoSolar has some very famous investors, who are maintaining an extremely low profile.

Solaicx is a new spinout from SRI International, and has a way to make polycrystalline silicon cell material in a continuous process atmospheric-pressure furnace. Their presentations and materials tell very little about what they have, making it pretty hard to judge.

Solaria
This is a very unusual concentrator story involving the use of variable “graded” index glass optics. The work started in the mid 80’s. Solaria Corporation was formed in 1998 by the founders and former management of LightPath Technologies, Inc., Albuquerque, New Mexico. Solaria holds the exclusive license from LightPath to use its proprietary GRADIUM® optics in the field of solar energy.

** These companies presented at the Cleantech Venture Forum in San Francisco, April 30.

Photolytic Hydrogen from Sunlight

Researchers have been working on a process that uses sunlight to produce hydrogen by splitting water directly. To understand photoelectrolysis, think of a PV cell underwater, where the electrochemical energy produced is immediately used to electrolyze water, instead of creating an external current. The light hits the cell, and hydrogen bubbles appear on one side of the cell, while oxygen appears on the other side, just as in electrolysis. (Of course one could use a PV cell to power an electrolyzer, but the idea here is to make a simpler and more economical system.)

The interface between the water (electrolyte) and certain semiconductor materials forms a diode junction that generates power, and thus does the electrolysis. The presence of catalysts at the surface can also help with the energetics and kinetics of the reactions that form the hydrogen and oxygen, respectively.

One of the problems is that the minimum voltage for splitting water (1.23 volts) is higher than a photocell can easily produce, and high-bandgap materials capable of generating enough voltage can utilize only ultraviolet light, which is a small fraction of the solar spectrum.
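
As a rough sanity check on the bandgap argument, here is a back-of-envelope calculation (a sketch, not from the article; the ~1.6-2.2 eV "practical" range is my own assumption to account for overpotential and other losses): photon energy and wavelength are related by lambda(nm) ≈ 1240 / E(eV), so a high-gap material like TiO2 (~3.2 eV) only absorbs below roughly 390 nm, i.e., in the ultraviolet.

```python
# Back-of-envelope: absorption cutoff wavelength for a given band gap.
# lambda(nm) = h*c / E = 1239.84 / E(eV)

PLANCK_EV_NM = 1239.84  # h*c expressed in eV*nm

def cutoff_wavelength_nm(bandgap_ev: float) -> float:
    """Longest wavelength a material of the given band gap can absorb."""
    return PLANCK_EV_NM / bandgap_ev

# Water splitting needs at least 1.23 eV per electron, but overpotential and
# other losses push practical single-junction electrodes toward ~1.6-2.2 eV.
for gap in (1.23, 2.0, 3.2):  # thermodynamic minimum, practical target, TiO2
    print(f"{gap:.2f} eV -> absorbs below ~{cutoff_wavelength_nm(gap):.0f} nm")
```

A 3.2 eV gap cuts off near 387 nm, which is why a plain TiO2 electrode sees only the ultraviolet tail of sunlight.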

Work at NREL and the University of Hawaii has focused on developing multijunction cells which use more of the solar spectrum. These additional layers are sandwiched inside the basic cell that does the photolysis, and provide a boost to the electric potential available to do the water splitting. The electrochemistry and solid state physics of these devices are very complex. One of the main challenges has been to come up with materials and configurations that will be less susceptible to corrosion from the electrolyte and which will last long enough to be practical. Efficiencies above 12% have been seen (i.e., the energy value of the hydrogen produced vs. the amount of incident sunlight). (See the 2002 DOE Hydrogen Program Reviews, ref. below. Also, the 2003 meeting in May will have new updates.)
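
The 12% figure can be reproduced from the standard definition of solar-to-hydrogen efficiency (a sketch with illustrative numbers, not data from the program reviews):

```python
# Solar-to-hydrogen (STH) efficiency: energy value of the H2 produced
# divided by the incident solar energy. At 1.23 V per electron,
# STH = 1.23 V * photocurrent density / incident solar power density.

def sth_efficiency(photocurrent_ma_cm2: float,
                   solar_mw_cm2: float = 100.0) -> float:
    """STH efficiency, assuming every photogenerated electron makes hydrogen."""
    # (V * mA/cm^2) gives mW/cm^2, so the ratio is dimensionless.
    return 1.23 * photocurrent_ma_cm2 / solar_mw_cm2

# A multijunction cell sustaining ~10 mA/cm^2 under standard 1-sun
# illumination (100 mW/cm^2) lands right around the 12% mark.
print(f"{sth_efficiency(10.0):.1%}")
```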

Researchers at Duquesne University published an important development in Science magazine last September. Titanium dioxide is known to be a cheap and stable photocatalyst for splitting water, but hydrogen yields were always less than 1% (due to the high band gap of the material). The new development involved preparing the material in a flame, introducing carbon into its structure. Cells using this new material saw a factor of 10 increase in hydrogen production. The University is actively seeking licensees or partners to pursue this technology. (Contact me for details.)

The design goal at NREL and Hawaii is to come up with a monolithic device that needs no external electrical connections. The simple version of the Duquesne cell requires an external bias power source (which could be powered by a fuel cell using some of the hydrogen produced), but the system would still be a net producer of power. Net yields are already at 8.5%, and are expected to improve.
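
The "net producer" claim is easy to sanity-check with an energy balance (a sketch; the bias voltage and photocurrent below are hypothetical illustrations, not figures from the NREL or Duquesne work):

```python
# Net efficiency of a bias-assisted photoelectrolysis cell: hydrogen energy
# out (1.23 V per electron) minus the electrical energy spent on the bias.

def net_sth(photocurrent_ma_cm2: float, bias_v: float,
            solar_mw_cm2: float = 100.0) -> float:
    """Net solar-to-hydrogen efficiency after subtracting bias power."""
    return photocurrent_ma_cm2 * (1.23 - bias_v) / solar_mw_cm2

# Hypothetical example: 10 mA/cm^2 with a 0.38 V external bias under 1 sun
# still nets out positive, in the neighborhood of the 8.5% quoted above.
print(f"{net_sth(10.0, 0.38):.1%}")
```

As long as the bias stays well below 1.23 V, the cell remains a net energy producer.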

Though commercial devices are a ways off, photosplitting of water is another process that could supply hydrogen by purely renewable means.


2002 Hydrogen Program Review Meeting – Renewable Production Electrolytic Processes

Science, 27 Sept 2002
“Efficient Photochemical Water Splitting by a Chemically Modified n-TiO2”

Science, 17 April 1998
“A Monolithic Photovoltaic-Photoelectrochemical Device for Hydrogen Production via Water Splitting”

( I can provide pdf copies of the Science articles).