DOE Office of Electric Transmission & Distribution (OETD)

Back in March, we thought announcements were imminent. (See UFTO Note – T&D R&D Gaining Attention, 21 Mar 2003.) Little did we realize the kinds of struggles that would ensue internally in DOE over which people, programs and budgets would be won or lost by which office. The new office started its work nonetheless, judging from numerous appearances by its chief, Jimmy Glotfelty, and several planning and roadmapping meetings over the spring and summer. And the dust has settled internally.

OETD officially “stood up” on August 10, but the big August 14th blackout made for awkward timing for a press release–none has been issued. (In fact, until an appropriations bill passes, I’m told they aren’t actually officially “up”.)

A new website quietly appeared on August 21. It offers a first cut at describing the Office and its scope of responsibilities, and gives links to planning documents:

[This site has a good compendium of information on the blackout; however, for the 12 Sept announcement of the release of a report on the sequence of events, go to the DOE home page.]

**National Electric Delivery Technologies Vision and Roadmap**
There’ve been two major meetings this year, one in April and one in July. In chronological order:

April 2003 Vision Meeting Proceedings (PDF 1.1 MB)
[65 people attended, of whom only 8 represented utilities]

Results of the April meeting are given in the vision document below. [The results of the July meeting will be reported in a few more weeks.]

“Grid 2030” — “A National Vision for Electricity’s Second 100 Years”

DOE’s National Electric Vision Document
(Final version, July 31, 2003) (PDF 1.2 MB)

Proceedings for National Electric Delivery Technologies Roadmap,
July 8-9, 2003 (PDF 1.0 MB)
[About 20 utilities were represented, with fewer than 40 people out of 200 participants.]

Glotfelty’s kickoff presentation July 8:
“Transforming the Grid to Revolutionize Electric Power in North America” roadmap opening 07 08 03.pdf


No personnel are identified on the new website (other than Glotfelty and Bill Parks, Assistant Director), and no org charts are shown. The most complete descriptions of the programs appear in a series of factsheets:

The work of OETD follows these earlier developments: (see reliability program materials at

— The National Energy Policy (May 2001) calls for the Department of Energy to address constraints in electric transmission and relieve bottlenecks.

— The National Transmission Grid Study (May 2002) contains 51 recommendations for accomplishing the President’s National Energy Policy and speeding the pace of the transition to competitive regional electricity markets.

— The Transmission Grid Solutions Report (September 2002) provides guidance for priority actions to address congestion on “national interest” transmission corridors.

OETD conducts research in several areas:
–High-Temperature Superconductivity
–Electric Distribution Transformation
–Energy Storage
–Transmission Reliability

One participant at the July meeting told me he thought that DOE seems to be in the thrall of superconductors and other mega-technology solutions, and giving short shrift to distributed generation, microgrids, and other common sense approaches.

As for budget, through the end of Sept (FY03), OETD is operating on funds already committed to the programs that were brought in. Of roughly $85 million in FY03, high temperature superconductors have $40M, and $27M was subject to Congressional earmarks. The FY04 budget request has a new line item for electric power infrastructure, and hopefully will provide more resources explicitly for transmission reliability in FY05. Another observer said that the future program will be more balanced as a result.

The R&D plan is based on a 3-level architecture:
1. “Supergrid”, or coast to coast backbone for power exchange. (superconducting)
2. RegionGrid
3. CityGrid, ultimately involving fully integrated 2-way power flow, microgrids, etc.

Planning and analysis tools are needed at all 3 levels. The Supergrid is a longer term goal, operational perhaps in 10-15 years. Other near term elements include sensors, storage, and DC systems.

Humid Air Injection Boosts CT Output

Additional megawatt-hours (MWh) can be obtained at low cost during peak demand periods from gas turbines and combined cycle power plants by injecting externally compressed, humidified, and heated air into a combustion turbine (CT) upstream of the combustors. This novel approach is denoted CT-HAI (HAI stands for Humid Air Injection) for simple cycles and CC-HAI for combined cycles. It results in significant power augmentation over the whole range of ambient temperatures, but it is most effective at high ambient temperatures, when the reduction in power output is most severe.

The simplified explanation for reduced power production by CT and CC plants is that lower inlet air density, a result of the high ambient temperature, reduces mass flow through a CT with a corresponding reduction in power.
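The ideal gas law puts rough numbers on this effect. A back-of-the-envelope sketch (my own arithmetic, for illustration, not figures from the developer):

```python
# Back-of-the-envelope: at constant pressure, air density (and hence CT
# mass flow) scales inversely with absolute inlet temperature.

def rankine(temp_f: float) -> float:
    """Convert Fahrenheit to absolute temperature (Rankine)."""
    return temp_f + 459.67

# Inlet density at 95 F relative to the 59 F ISO rating point:
ratio = rankine(59.0) / rankine(95.0)
loss_pct = (1.0 - ratio) * 100.0

print(f"relative inlet density at 95 F: {ratio:.3f}")
print(f"approximate mass-flow loss:     {loss_pct:.1f}%")
```

That is about a 6.5% loss in mass flow at 95 F versus the 59 F rating point; the actual output penalty of a real machine is typically larger once compressor and turbine effects are included.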

With HAI, power output can be maintained essentially constant over the range of 0 F to 95 F at about 20% above the nominal 59 F rating. The overall heat rate for the total output of the power-augmented CT also drops by about 8%-12% over that temperature range, saving fuel as the temperature rises. The heat rate for the incremental power is approximately 6000-6400 Btu/kWh, i.e. in the range of CC plants. Engineering and mechanical aspects of air injection for the CT-HAI concept are similar to those of steam injection for power augmentation, which has accumulated significant commercial operating experience.

This system can be operated to produce additional MW for sale whenever market conditions are attractive. The value to individual utilities will vary with the number of hours that the additional megawatts can be sold at attractive prices. Specific capital costs of the additional kW (i.e. for installing HAI) are less than $200/kW. With lower net heat rates, electricity obtained with this technology can be produced at lower cost in peak power markets.
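To see how the quoted figures might translate into peaker economics, here is a simple screening calculation; the incremental heat rate and capital cost come from the text above, but the gas price, peak hours, and market price are my own illustrative assumptions:

```python
# Rough screening economics for HAI incremental power. The heat rate and
# capital cost are from the article; gas price, hours, and peak price are
# illustrative assumptions only.

INCREMENTAL_HEAT_RATE = 6200.0   # Btu/kWh (midpoint of the 6000-6400 range)
GAS_PRICE = 5.00                 # $/MMBtu (assumed)
CAPITAL_COST = 200.0             # $/kW (upper bound quoted)
PEAK_HOURS_PER_YEAR = 500.0      # hours sold at peak (assumed)
PEAK_PRICE = 100.0               # $/MWh (assumed)

fuel_cost = INCREMENTAL_HEAT_RATE * GAS_PRICE / 1000.0        # $/MWh
margin = PEAK_PRICE - fuel_cost                               # $/MWh
annual_margin_per_kw = margin * PEAK_HOURS_PER_YEAR / 1000.0  # $/kW-yr
simple_payback = CAPITAL_COST / annual_margin_per_kw          # years

print(f"incremental fuel cost: ${fuel_cost:.0f}/MWh")
print(f"simple payback:        {simple_payback:.1f} years")
```

At these assumed prices the incremental fuel cost lands around $31/MWh, comfortably below typical peak prices, which is the essence of the value proposition.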

The process is an interesting coming together of two separate ideas for getting more out of CTs: (1) adding humidity, and (2) (externally) compressing the air:

Just Add Water —
The output of a CT can be increased by adding water in various ways, such as evaporative cooling, wet compression, and inlet chilling. Unfortunately, these technologies, while they may have low initial capital costs, introduce the water into the compression process and can create significant operational problems. For example, GE has told users to cease inlet fogging and evaporative cooler operation until compressor blade erosion inspections can be performed. Technologies that introduce condensation or carryover of water into the compressor section can cause blade erosion, as well as ductwork corrosion, pitting and thermal stress.

Steam injection technology also bypasses the compressor, but with HAI the humidity is introduced in the form of humidified air, which, compared with steam injection, provides a safer and more stable combustion process and allows higher injection rates with correspondingly greater power augmentation. Steam injection flow is limited by a number of combustion-related and other considerations.

Compressed Air —
The other development behind HAI is compressed air energy storage (CAES), a diurnal peak shifting method where air is compressed off-peak and stored in underground formations or piping systems. On-peak, the compressed air is fed to the CT, relieving it of the need to do its own compression and thus increasing output. From there it was a short step to realizing that an external compressor could be beneficial under certain operating conditions. Adding humidity to this external air supply enhances the performance even more.

Dr. Michael Nakhamkin, President, Energy Storage and Power Consultants (ESPC), has fourteen patents, including five on CAES technology and another five on power augmentation technologies with humid and dry air injection into CTs.

– Combustion Turbine with Humid Air Injection (CTHAI) – pat. 6038849
– Combustion Turbine with Dry Air Injection (CTDAI) – pat. pending

Both methods can increase power output by 15%-25% or more; use proven equipment; and are simple to implement and operate. The humid version also reduces NOx by 15%. Developers have also come up with a clever means to avoid entraining impurities in the water, simplifying water treatment. A once-through boiler with partial steam generation requires only demineralized water.

Several HAI/DAI concepts as applied to simple-cycle (CT) and combined-cycle (CC) plants are available for commercial implementation. Successful validations have been done at Calpine on a GE 7241 FA. HAI can be practical for any CT of 5 MW and larger.

Hill Energy System, a subsidiary of Hill International, is a licensee of the HAI technology, and is actively marketing systems. The website has contact information and a number of helpful documents.

Also see a full discussion in the July 2003 issue of Power Engineering Magazine:
“Humid Air Injection Turns to Out-Of-Shelf Equipment to Enhance Viability for Combustion Turbine Power Augmentation”

“Air Injected Power Augmentation Validated by Fr7FA Peaker Tests”, Gas Turbine World, March/April 2002.


Ron Wolk, a prominent power technology expert, has been involved in this program for years, and can provide additional insights. Contact him at:

Update on Alchemix HydroMax

The HydroMax technology uses any carbon source, including low sulfur and high sulfur coal, to produce electricity, hydrogen and syngas, which can be used as fuel for gas-fired power plants or converted into diesel, jet fuel, gasoline or ammonia. Alternate carbon sources include petroleum coke, municipal waste, biomass and shredded tires.

The company continues to make excellent progress as the U.S. Patent Office has now allowed 206 claims contained within a handful of patent applications. There is an opportunity to participate in an independent engineering evaluation of HydroMax vs. other hydrogen production technologies (such as gasification), to participate in a demonstration program, and to make a direct investment in Alchemix.


See: UFTO Note – H2 Production Adapts Smelting Technology, 15 Nov 2002:
(password required)

HydroMax adapts existing metal smelting technology to convert dirty solid fuels to clean gases. In iron making, carbon (coke) is mixed into molten iron oxide, and the result is elemental iron (Fe) and CO2. Alchemix’s new process, HydroMax, injects steam into a molten iron bath which makes H2 and iron oxide (FeO). HydroMax then makes use of iron making technology to return the iron oxide to pure iron for re-use. These two steps are done one after the other, and the fixed inventory of iron/iron oxide remains in place. (To produce a steady output stream, two reactors alternate, one in each mode.)

2 FeO + C –> 2 Fe + CO2
Fe + H2O –> FeO + H2
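For a sense of scale, the stoichiometry of the steam step can be worked out directly (my arithmetic, not company data):

```python
# Yield per pass of the steam step, Fe + H2O -> FeO + H2:
# one mole of Fe liberates one mole of H2.

M_FE = 55.845   # g/mol, iron
M_H2 = 2.016    # g/mol, molecular hydrogen

kg_h2_per_tonne_fe = 1000.0 * M_H2 / M_FE   # kg H2 per tonne of Fe oxidized

print(f"H2 yield: {kg_h2_per_tonne_fe:.1f} kg per tonne of Fe oxidized")
```

Since the iron inventory is fixed and cycled rather than consumed, this is a yield per pass, not a material cost.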


A great deal of information is available at the company’s website:

Look under “News” and “Shareholders” for several powerpoint presentations and other items. Also a white paper under “Technology”. These emphasize the point that Alchemix provides a bridge strategy between hydrogen now, and the hydrogen economy of the future.

Alchemix says they have the lowest cost zero-emission coal/hydrogen technology, noteworthy in light of the somewhat controversial and problematic DOE FutureGen plan to spend over $1 billion on a gasification approach. See Alchemix’s comments on how HydroMax will meet the FutureGen goals far more effectively.



Latest developments include specific plans for a commercial demonstration plant to be built in cooperation with members of the Canadian Oil Sands Network for Research and Development (CONRAD). Several members of CONRAD decided on July 15 to proceed with an engineering study to evaluate the HydroMax technology, economics and environmental impact in comparison with alternate methods of producing hydrogen (i.e. steam methane reforming, gasification of solids and partial oxidation of heavy liquids). If the results of the study are positive for HydroMax, as expected, then this group is likely to proceed with funding the first HydroMax plant, to be built in northern Alberta where the oil sands are located.

The plant will use petroleum coke to make 20 million scf/day of hydrogen and 10 MW of electricity. The plant will be profitable. An executive summary available on the Alchemix website (under “Introduction”) includes pro formas for the plant.
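As a rough cross-check on scale (conversion assumptions mine, not from the company), 20 million scf/day works out to roughly 48 tonnes of hydrogen per day:

```python
# Ideal-gas conversion of 20 million scf/day of H2 to mass terms,
# taking "standard" as 14.696 psia and 60 F (definitions vary slightly).

R = 8.314           # J/(mol K), gas constant
M_H2 = 0.002016     # kg/mol, molecular hydrogen
P = 101325.0        # Pa (14.696 psia)
T = 288.71          # K (60 F)
SCF_TO_M3 = 0.0283168

density = P * M_H2 / (R * T)            # kg/m^3 at standard conditions
kg_per_day = 20e6 * SCF_TO_M3 * density

print(f"H2 density: {density:.4f} kg/m^3")
print(f"~{kg_per_day / 1000:.0f} tonnes of H2 per day")
```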

The group in Canada would welcome participation in the study (and the demo plant) by additional companies including US utilities. Alchemix will make introductions for anyone who is interested.

The group includes governmental organizations and private companies who will provide funding for the plant but may not require an equity position, since they are interested in accelerated access to the technology. Alchemix, anticipating a capital requirement on its part for a substantial portion of the project (estimated at $120 million US), has drafted an investment opportunity. The proposal is for sale of stock in Alchemix, with a call option for another tranche as the project proceeds.

A detailed memo on the rationale for this investment is available (password required) at:

Contact Robert Horton, Chairman

Bicarb Cleans Up Stack Gas Emissions

The same baking soda (sodium bicarbonate) sold in grocery stores and used for 101 things around the home is also one of the best solutions for scrubbing emissions from coal-fired power plants. Purification of flue gas emissions using sodium bicarbonate has long been recognized as a highly effective process for removing SO2, SO3, NOx and heavy metal compounds from flue gas. However, sodium bicarbonate scrubbing has three serious drawbacks:

1. The cost of sodium bicarbonate is excessive;
2. The resulting byproduct of the sodium bicarbonate SOx reaction (sodium sulfate) has limited economic value;
3. Sodium sulfate disposal is expensive and poses a significant environmental problem.

Despite its recognition as a superior scrubbing technology, flue gas scrubbing with sodium bicarbonate has been kept from any significant market share by these prohibitive operating issues.

Airborne Pollution Control Inc., a Calgary based company, has developed a solution to the challenges of sodium scrubbing. The Airborne process begins with the injection of bicarbonate into the flue, where it reacts with and captures the pollutants. The key to Airborne’s patented process is its ability to regenerate the “residue” (it is converted back into sodium bicarbonate that can be reused for flue gas scrubbing), and at the same time, to make a high-grade fertilizer byproduct.
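The sorbent demand that makes regeneration so valuable can be estimated from the overall reaction; this is textbook stoichiometry, not Airborne’s numbers:

```python
# Overall dry-sorbent reaction:
#   2 NaHCO3 + SO2 + 1/2 O2 -> Na2SO4 + 2 CO2 + H2O
# Two moles of bicarbonate are consumed per mole of SO2 captured.

M_NAHCO3 = 84.007   # g/mol, sodium bicarbonate
M_SO2 = 64.066      # g/mol, sulfur dioxide

sorbent_ratio = 2.0 * M_NAHCO3 / M_SO2   # kg NaHCO3 per kg SO2

print(f"~{sorbent_ratio:.2f} kg of sodium bicarbonate per kg of SO2 captured")
```

At roughly 2.6 kg of bicarbonate per kg of SO2, once-through sorbent use is expensive, which is exactly why drawback #1 above has been fatal and why closing the loop changes the economics.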

The Airborne process eliminates the disposal problem, improves the economics and most importantly it does a superior job of addressing the multiple pollutants inherent in flue gas emissions. Additionally, Airborne has a proprietary process to granulate their fertilizer. Airborne’s thin-film pan granulation technology makes the fertilizer more stable, shippable, blendable, customizable and ultimately more valuable.

Together with Babcock & Wilcox, US Filter HPD Systems, and Icon Construction, Airborne is operating an integrated 5 MW demonstration facility to showcase the Airborne Process. The plant is located in Kentucky at LG&E Energy Corp’s Ghent generating facility.

Last year DOE received 36 proposals for projects valued at more than US$5 billion in the first round of President Bush’s Clean Coal Power Initiative. The Airborne Process was 1 of only 8 successful proposals, and was selected for US$31 million in funding for the implementation of Airborne’s multi-pollutant control process.

| Clean Coal Power Initiative Round One
| “Commercial Demonstration of the Airborne Process” [PDF-495KB]

In short, this means that high sulfur coal can be burned in an environmentally friendly and economically efficient manner. The Airborne process removes multiple pollutants and it meets or exceeds all current and pending environmental requirements for SO2, SO3, NOx and mercury. For the first time pollution abatement becomes an economically rewarding investment for the power producer.

Over the next 5 years, Airborne has conservatively targeted the application of its technology to 10 new and existing coal-fired electrical generation plants. This conservative target represents less than 1% of the globally available market and translates to a total installed capacity of approximately 7,500 MW, out of approximately 800,000 MW of coal-fired power generated worldwide.

One concern with the production of fertilizer byproducts is maintaining a balance between the supply and demand for sulfur based fertilizers, a demand which is predicted to grow as sulfur emissions are reduced at the source. Airborne has a worldwide marketing agreement with the Potash Corp of Saskatchewan Inc. (PCS), the world’s largest manufacturer and distributor of fertilizer products, whereby PCS will market the various fertilizer outputs, providing Airborne with access to worldwide markets and providing PCS with a unique addition to its portfolio of fertilizer products.

Airborne has made a major investment in the development and demonstration of this patented process and is seeking equity investment partners to take it to the next level.

Contact: Leonard Seidman
T: 403.253.7887 Ext: 310

“Multi Pollutant Control with the Airborne Process” [ 1.1 MB PDF] (… details the experimental and analytical results of a lab and pilot scale 0.3 MW coal fired combustion test facility and the progression to an integrated 5 MW facility)

Non-Thermal Plasma H2, no CO2

Precision H2, a Canadian company, is developing a non-thermal plasma process which disassembles methane (CH4) into hydrogen and carbon black. Note, no CO2!

There are dozens of plasma companies, often focused on medical waste, and some on power (with coal or some waste stream as the feedstock). (See footnote) Usually these are hot plasmas, and tend to be expensive due to the materials problems at high temperature. In a plasma, sometimes called the 4th state of matter, material is very highly ionized by an electrical arc discharge. Lightning is a good example, and many plasma systems are brute force, require a lot of energy, and get very hot.

A so-called “non-thermal” plasma is one in which the electric discharge is controlled and confined. Locally it is extremely hot, but each spark doesn’t last long enough to heat up the surrounding materials. Precision H2 has created a “plasma dissociation reactor”, where the electrical discharge is carefully shaped and especially tailored to the specific job of dismantling methane. The electrical energy goes straight to the molecule, and doesn’t have to get there as heat. (It’s a little bit like cooking with microwaves instead of a conventional oven.)

The methane streaming through the reactor is partly converted to H2, with the carbon dropping out as a nanopowder. The output is then a blend of methane enriched with hydrogen (hythane). In an intriguing twist, this blend can be sent to a fuel cell which will consume the hydrogen, leaving the methane to be cycled back to the reactor. In effect, the fuel cell itself is used to separate out the hydrogen–for its own use. This configuration would produce electricity directly, rather than hydrogen. Pure hydrogen is obtained by using PSA (pressure swing adsorption) or membranes to do the separation. Potential partners are already in discussions on both fronts (i.e. fuel cells and purification). Also, hythane can be used directly in engines, to good advantage.

The key is electronics (pulse shaping, and analysis and control of the discharge), and costs for electronics are well understood. Because temperatures remain modest, the reaction chamber can be made inexpensively, and is readily scalable.

There is an energy penalty–not all the “fuel value” of the methane is used, because the carbon itself isn’t oxidized. Instead, since no oxygen is present, no CO2 is produced–think of it as “presequestration”, with resulting GHG and carbon-trading benefits. Also, the carbon is in a valuable form which can be sold, enhancing overall economics. Detailed thermodynamic and financial models have been developed, and the company believes that even today, with “one-off” systems, they can produce hydrogen cost competitively.
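Standard heating values put a number on that penalty (arithmetic mine; the figures are textbook values, not the company’s):

```python
# Splitting CH4 -> C + 2 H2 keeps only part of the methane's heating
# value in the hydrogen, since the carbon is deliberately left unburned.

HHV_CH4 = 890.4         # kJ/mol, methane higher heating value
HHV_H2 = 285.8          # kJ/mol, hydrogen higher heating value
DH_DISSOCIATION = 74.9  # kJ/mol, minimum energy to split CH4 (-dHf of methane)

fraction_retained = 2.0 * HHV_H2 / HHV_CH4
overhead_pct = DH_DISSOCIATION / HHV_CH4 * 100.0

print(f"heating value retained in H2: {fraction_retained:.0%}")
print(f"minimum dissociation energy:  {overhead_pct:.1f}% of CH4 HHV")
```

Roughly a third of the methane’s heating value stays with the carbon; whether that counts as a loss or a co-product depends on the market for carbon black.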

The company is raising a round of equity financing.

Contact Dan Fletcher
Precision H2
Montreal, Quebec, Canada

An amazing find is available at:

“Non-Incineration Medical Waste Treatment Technologies”, an August 2001 report … explores the environmental and economic impacts, among other considerations, of about 50 specific technologies.

Chapter 4 in particular is an exhaustive review of every technology and nearly every company with a means to destroy hazardous materials. While the focus is on medical waste, most of the technologies also apply to hazardous materials, municipal waste and sludge, biomass, and fossil fuels. Gasification, pyrolysis, plasmas, and many different chemical and electrochemical oxidation and reduction methods are out there, and are being used today at industrial scale. When they can be made to work, the issues are cost, reliability, system longevity, emissions (creation of new hazards, e.g. dioxins), materials handling, feedstock variability, etc. etc. The key is to inject sufficient energy into the material to break the chemical bonds, for example, to get it hot enough for long enough (dwell time).

Firefly Re-invents the Lead Acid Battery

In early May, Caterpillar announced the formation of a new spin-off company called Firefly Energy Inc., whose purpose is to complete the development and commercialization of a dramatically improved lead acid battery technology. The entire research program, people and technology have been transferred out of CAT into the new startup after several years of in-house research. CAT will remain only a partial investor once there is new financing.

Attempts have been made before to re-invent the lead acid battery, without much success. Prominent among them were Electrosource/Horizon and Bolder Technologies, both of whom ran into obstacles in cost, performance and manufacturability that couldn’t be overcome. (In Dec 2001, Bolder was acquired out of bankruptcy by Singapore-based GP Battery. In Feb 2003, Eagle-Picher announced a new joint venture to produce the Horizon battery.)

Firefly has high expectations that they’ve got it right. In fact, key personnel from those previous efforts are involved, along with an all star cast of battery industry veterans.

Firefly’s claims include: 1/4 the weight (eliminating 80% of the lead), double life expectancy, 7x charge rate, and manufacturing that is compatible with existing lead acid battery production facilities. It should cost no more than current lead-acid batteries, making it a small fraction of the cost of nickel metal hydride and lithium technologies. Cycle life, even at 80% depth of discharge, is several thousand cycles, one or two orders of magnitude better than conventional lead acid, on a par with the advanced technologies. Two main problems of lead acid, sulfation and corrosion, are all but eliminated. Heat dissipation is excellent, even at the greatly increased charge and discharge rates.

One of the keys to these improvements is a substrate material for the plates that no one thought to try before. Highly porous, it provides thousands of times more “cells”, or locations where the reaction can take place. Fourteen patents are already in process, with more to come.

The company plans to license the technology, and to manufacture with partners that already have production lines, co-branding new products that will be priced at or below leading batteries on the market.

They are raising an initial seed round now, with a $2 million “A” round to follow immediately.

Ed Williams, CEO

Energy Efficiency as a Resource

ACEEE National Conference on Energy Efficiency as a Resource
Berkeley, CA, June 9-10

A number of the papers are already posted online (when the author’s name is a link):

This event was a real eye opener. The energy efficiency crowd is on a roll, very much back from near death. These are the champions of Energy Efficiency (EE) and Demand Response (DR, not to be confused with distributed resources) who push for equal treatment of the demand side “resource” alongside generation and supply. In California especially, they feel vindicated by the failure of deregulation, and gleefully describe the end of a “dark age” with the return of rate-base regulation and integrated resource planning (IRP). In this view, reliance on the “market” to deliver the right mix of supply and conservation has been completely discredited.

The emphasis was on California, with two PUC commissioners giving major speeches supporting the basic premise. We heard about recent 3-2 votes to push efficiency as an integral part of a state “action plan”. Various state agencies are pledging to coordinate their efforts. The state’s investor-owned utilities have submitted major plans that go well beyond using the public benefits charge to “procure” energy and capacity from the demand side. Since the utilities are the default/only provider, but don’t have their own generation anymore (they are pipes and wires companies!), they now need to submit detailed resource plans–thus the rebirth of “IRP”.

This all felt like a jump back in time–apparently I hadn’t realized how little deregulation has progressed. Clearly, “prices” haven’t replaced “rates”; “revenue requirement” still has meaning; utilities are still utilities, and a key issue is how to put efficiency investments into the rate base and assure they get a rate of return comparable to generation facilities.

There was, however, a recognition that things would be different — that the intervening experience and lessons learned could be built on. One speaker compared it to a second marriage, where you’re wiser and may have a better chance to get it right. In particular, there’s a lot of support for “decoupling”. This refers to the idea that distribution utilities should not have their cost recovery/revenues tied to throughput of kWh, but to performance-based measures like reliability of service.

The California Action Plan includes goals for 5% of peak demand from efficiency along with renewables, distributed generation, transmission upgrades, and “reliable affordable energy”. California led the nation over the last 20 years in conservation and efficiency, and will again. Cities like San Diego and San Francisco are undertaking their own resource planning efforts as well.

Other areas are proceeding vigorously. In New York, the governor’s office is running a multi-agency Coordinated Electric Demand Reduction Initiative (CEDRI), with a goal of making 600 MW available on short notice. The state’s goal is to create a vigorous market for efficiency; 92 ESCOs are operating there currently. The Northwest has a multistate program; Montana has come up with an ambitious approach; the Northeast is active as well (NEDRI). In the Midwest, the situation was described as being “several years behind”, since energy is cheap and plentiful there. In Texas, there doesn’t seem to be a problem incorporating demand aspects alongside restructuring. Their markets are set up so that “DR” can compete directly, and as much as 500 MW is in the game.

Another issue receiving a lot of attention is the relationship between “efficiency”, i.e. energy, and “demand response”, i.e. capacity. In many regions, it seems problematic to work these two pieces together, but there was a strong recognition that they are really two sides of the same coin. Chuck Goldman of LBL has been studying the lay of the land in states all across the country, and noted a marked drop in traditional load control and interruptible rate programs–these were practically ‘stranded assets’, ignored until price spikes appeared. Now there are wildly varying arrangements for retail competition, and for EE and DR, which are rarely coordinated.

For the rulemaking in Calif for demand response, go to:

There was a lot of support for real time pricing, which must eventually become a reality as the only real mechanism that can send the proper economic signals to consumers. In fact, the Calif plan has it starting in 2004.

Monica Rudman of the Calif Energy Commission reported on how they managed to rush a set of programs together to try to alleviate the demand crunch during the California crisis. The state legislature urgently approved $50 million in August of 2000 and then an additional $327 million in April 2001. The CEC launched a wide array of over a dozen measures with astonishing speed, and almost in time to help. (Efficiency may not take as long to “construct” as generators, but it still has a lead time.)

Art Rosenfeld, former head of energy programs at LBL, and now commissioner on the Calif Energy Commission, is widely viewed as the father of the conservation movement, in California in particular. See

He tells a convincing story about the scope of the efficiency resource, citing the example of how refrigerators now each consume 1/4 of the energy (and they’re larger) compared with 20 years ago, when “market transformation” efforts and appliance efficiency standards began. This “resource” is now comparable to the entire US hydro or nuclear power contribution to the nation’s energy mix.

There’s a great deal more detail to talk about from this conference, and about this whole subject, than can fit in one UFTO Note. If there’s interest in pursuing any of this in greater detail, please let me know.

By coincidence, this morning’s UtiliPoint IssueAlert was on this very subject! “Energy Conservation is Now In Vogue”. Go to:
(I hope you are on the list to get this daily commentary. It’s almost always interesting, timely and useful.)


Several months ago I put together a set of references on demand studies. You can download it here (UFTO client password required):


A personal view…. I’m struggling with one aspect of efficiency as a resource– just what kind of a “resource” is it? And why does it exist in the first place? The refrigerator example makes sense as public policy–not too different from needing government to overcome the inability of the “market” to put smog controls in cars. When it comes to “bidding” negawatts into the power market, however, one might reflect that there’s no other instance where a product or service is “unsold” (except maybe in agriculture, and look what a mess that is). If that negawatt is available, then maybe it should have already been taken up. Its existence is purely a result of an existing market imperfection. The question of what demand “would have been” is fundamentally messy, and despite all the brave talk, “Measurement and Verification” (another huge topic of interest at the conference) is never going to feel entirely satisfactory as an answer. Efficiency advocates don’t seem to understand, and aren’t addressing, what critics are uncomfortable with, and they need to.

DOE H2&FC Reviews’03

DOE Hydrogen and Fuel Cells Merit Review Meeting
May 19-22, 2003, Berkeley, CA

(See UFTO Note 10 June 2002 for last year’s meeting.)

“Annual Review Proceedings” are (will be) available:

DOE’s new organization for hydrogen and fuel cells is in place. Steve Chalk heads the program, and has about 20 direct reports for the many sub-areas. The org chart and key contacts list are available here:

Of course, the program got a huge boost when the president announced the $1.2 billion Hydrogen Fuel Initiative and “FreedomCar” program in the state-of-the-union address this past January.

In a plenary opening session, Steve Chalk gave an overview of DOE’s response, based on a major planning effort involving many stakeholders. (This is all heavily documented on the website.) He showed budgets steadily growing over the next several years.
H2: $47M, $55M, $77M (FY02, 03, 04)
FC: $29M, $40M, $88M

The Plan involves a decade of R&D, with commercialization decisions towards the end, followed by “transition” and “expansion” in the marketplace. Meanwhile, “technology validation” projects will attempt semi-real-world demonstrations of complete integrated infrastructure elements, e.g., refueling stations (a major RFP was announced May 6 for a five-year “learning demo” of hydrogen vehicle infrastructure).

The DOE Secretary will have a new Hydrogen Policy Group (heads of EE, FE, Nuclear, etc.) and the Hydrogen Technical Advisory Committee. Lower down, Steve Chalk will work with the Hydrogen Matrix Group and an Interagency Task Force. Of particular note, a new Systems Integration and Analysis office will be set up at NREL, and several “virtual centers” at national labs focused on specific technical areas.

In each area, goals have been established for the various cost and performance parameters. (e.g., by 2005 electrolytic hydrogen at 5000 psi should be produced at 65% efficiency, for under $3.75/kg. By 2010, moving hydrogen from central production sites to distribution facilities should be under $0.70/kg.) [One kg of H2 is about equivalent in energy content to one gallon of gasoline, making comparisons easier.]
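Those targets can be combined in a quick back-of-the-envelope check. The sketch below uses only the figures quoted above (the $3.75/kg electrolytic production target and $0.70/kg delivery target); the gasoline price is an assumed illustrative number, not from DOE:

```python
# Back-of-the-envelope comparison of DOE hydrogen cost targets vs. gasoline,
# using the ~1 kg H2 == 1 gallon gasoline energy equivalence noted above.

H2_PRODUCTION_TARGET = 3.75  # $/kg, electrolytic production target (2005 goal)
H2_DELIVERY_TARGET = 0.70    # $/kg, central-site-to-distribution target (2010 goal)
GASOLINE_PRICE = 1.60        # $/gallon -- assumed placeholder for illustration only

# Cost of a kg of hydrogen at the distribution facility, if both targets are met.
delivered_h2 = H2_PRODUCTION_TARGET + H2_DELIVERY_TARGET

# Because 1 kg of H2 carries roughly the energy of 1 gallon of gasoline,
# $/kg and $/gallon can be compared directly.
premium = delivered_h2 / GASOLINE_PRICE

print(f"Delivered H2 at target: ${delivered_h2:.2f}/kg")
print(f"Energy-equivalent multiple of gasoline: {premium:.1f}x")
```

Even with both targets met, delivered hydrogen would still cost a multiple of the assumed gasoline price per unit of energy, which is why the later “transition” and “expansion” phases matter.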

When Chalk’s PowerPoint becomes available, it will be worth reviewing if you’re interested in how all of this is going.

This year’s annual review meetings drew a large crowd again. A subset of projects was chosen from each technical area for 20-30 minute presentations, while the other investigators were asked to present posters instead. Hydrogen and Fuel Cell sessions were held in parallel (last year they were on separate days), making it impossible to cover everything. A two-inch-thick binder held all the vugraphs, however, and all of it will be posted on the website.

Here are the session headings:


Hydrogen

– Production – Biological & Biomass Based
– Production – Fossil Based
– Production – Electrolytic
– Production – Photolytic and Photoelectrochemical
– Storage – High Pressure Tanks
– Storage – Hydrides
– Storage – Carbon & Other Storage
– Infrastructure Development – H2 Fueling Systems & Infrastructure
– Codes & Standards

Fuel Cells

– High Temp Membranes / Cathodes / Manufacturing
– High Temp Membranes / Cathodes / Electrocatalysts
– Fuel Cell Power Systems Analysis
– Fuel Processing
– Direct Methanol Fuel Cells
– Fuel Cell Power System Development
– Fuels Effects
– Sensors for Safety & Performance
– Air Management Subsystems

A few highlights:

– Codes and standards were compared to the “iceberg below the surface” (i.e., the part that sank the Titanic). The voluntary standards-making process in this country, along with the 40,000 independent local jurisdictions, represents a huge educational and process challenge in making society ready for hydrogen. The recently announced fueling station in Las Vegas needed 16 separate permits, and the local fire marshal was the toughest to deal with.

– Carbon nanotube storage is living on borrowed time. It has the distinction of a stern “Go-No go” decision that’s been put in its path (2005), and the science seems not to be making the greatest progress.

– Another Go-No Go decision is set for late 2004, for onboard fuel processing.

– Photolytic H2 production is making slow progress, and researchers close to it acknowledge its practical application can only happen if the right materials are found. The search continues using “combinatorial” methods. (See UFTO Note 2 April 2003.)

– The fuel cell work seems mostly to be the tough slog of materials and costs: finding formulations and configurations that gradually improve the situation. A fair amount of attention is going to higher-temperature PEM cell membranes, where hydrogen purity is less of an issue; however, no breakthroughs seem imminent.

– Quite a bit of attention is going to fueling systems. Several projects involve the building of equipment and actual demonstration fueling stations and “power parks”. DTE and Pinnacle West are the only utilities that seem to have really pursued this; each has a major demonstration project in development.

In view of the volume and technical nature of this material, let me suggest that I can dig deeper into any particular area of interest to you, but that otherwise the DOE website has all the documentation on the programs and specific projects.

Other Hydrogen news:

You may have seen Wired 11.4 (April). The cover story is by Peter Schwartz, the famous futurist, who proclaims that a full-blown hydrogen economy is urgent and inevitable. I saw him present the argument at a seminar at Stanford recently, and found it very short on practical specifics and less than compelling. For one thing, he asserts that nuclear will be the major source of energy to make hydrogen a decade or two from now.

Along the same lines, the June issue of Business 2.0 came last week, with a feature story about the head of Accenture’s Resource Group, Mary Tolan, and her blunt challenge to the energy industry to invest like crazy to make the hydrogen economy happen quickly. She says it’s the only way the oil majors in particular will be able to continue to make big profits in the future. She apparently let loose with this at CERA Week, back in February. Business 2.0’s website won’t have it online for a few weeks, but I was able to locate a reference to an Accenture utility industry event that outlines the argument.

Curious to know what you think. In my own opinion, both sound over the top. We’ve got a ways to go before the technology, or the society, will be ready for hydrogen on a massive scale. I’ve written to Ms. Tolan to see if I can get more details as to their reasoning.