Readablewiki

Faraday-efficiency effect

Content sourced from Wikipedia, licensed under CC BY-SA 3.0.

Faraday efficiency is the share of electrical current that actually goes into producing hydrogen and oxygen gas during electrolysis. If 100% of the current makes gas, the calculation of energy balance is straightforward. But in real cells, some current can go into other processes, or hydrogen and oxygen can recombine inside the cell. This lowers the Faraday efficiency.
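To make the definition concrete, Faraday efficiency can be computed by comparing the gas actually collected against the amount Faraday's law predicts from the charge passed. The sketch below is a minimal illustration, not from the study itself; the example current, duration, and measured gas amount are made-up values chosen only to show the arithmetic.

```python
# Faraday efficiency: measured gas divided by the theoretical maximum
# predicted by Faraday's law. Illustrative numbers only.

F = 96485.332  # Faraday constant, C/mol

def faraday_efficiency(current_A, time_s, measured_H2_mol):
    """Fraction of charge that actually produced hydrogen gas."""
    charge = current_A * time_s          # total charge passed, in coulombs
    # Each H2 molecule needs 2 electrons, so theoretical moles = Q / (2F)
    theoretical_H2 = charge / (2 * F)
    return measured_H2_mol / theoretical_H2

# Hypothetical run: 1 A for 1 hour, but only 0.01455 mol H2 collected
eta = faraday_efficiency(1.0, 3600.0, 0.01455)
print(f"Faraday efficiency: {eta:.0%}")
```

With these made-up inputs the efficiency comes out near 78%, the average the study reported; any current that did not end up as collected gas went into side reactions or in-cell recombination.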

Why this matters: In the past, researchers who claimed to see “excess heat” in electrochemical cells often assumed the Faraday efficiency was 100%. Because they did not measure it, their energy calculations could falsely show extra heat. Some of these claims were linked to ideas like cold fusion, even though conventional electrochemistry had not been ruled out.

Key study and findings: Between 1991 and 1993, Zvi Shkedi and colleagues built well-insulated light-water electrolysis cells instrumented to measure Faraday efficiency in real time. Their setup used a fine-wire nickel cathode, a platinum anode, and a potassium carbonate (K2CO3) electrolyte, together with highly accurate calorimeters.

- They did 64 experiments and measured the actual Faraday efficiency each time.
- On average, the efficiency was about 78%.
- If they analyzed the data assuming 100% efficiency, they saw about 21% apparent excess heat.
- When they used the measured efficiency, the actual excess heat was essentially zero: 0.13% ± 0.48%.
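The link between the measured ~78% efficiency and the ~21% apparent excess heat can be sketched with a simple energy balance. In the model below, gas leaving the cell carries away energy at the thermoneutral voltage (1.48 V for water electrolysis, a standard value); if an analyst assumes all current makes gas when only a fraction f does, the recombined fraction shows up as unexplained heat. The cell voltage of 1.55 V is an illustrative assumption, not a figure from the paper.

```python
# How assuming 100% Faraday efficiency creates "apparent excess heat".
# E_tn = 1.48 V is the thermoneutral voltage of water electrolysis;
# V_cell = 1.55 V is an assumed, illustrative operating voltage.

def apparent_excess_fraction(f, V_cell=1.55, E_tn=1.48):
    """Apparent excess heat as a fraction of electrical input power.

    Real heat released:     I*V_cell - f*I*E_tn   (only fraction f leaves as gas)
    Heat expected at f=1:   I*V_cell -   I*E_tn
    Apparent "excess":      (1 - f) * I * E_tn
    Divided by input power I*V_cell, the current I cancels.
    """
    return (1 - f) * E_tn / V_cell

excess = apparent_excess_fraction(0.78)
print(f"Apparent excess heat: {excess:.0%} of input power")
```

Under these assumptions a 78% Faraday efficiency misread as 100% yields roughly 21% apparent excess heat, matching the size of the effect the study reported, and the excess vanishes when f = 1.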

Conclusion: The apparent excess heat could be explained by the real, less-than-perfect Faraday efficiency, not new physics. The researchers advised that any reports of excess heat should include simultaneous measurements of Faraday efficiency. Subsequent work by Jones and colleagues confirmed that less-than-100% Faraday efficiency can account for claims of excess heat in these cells.


This page was last edited on 2 February 2026, at 20:50 (CET).