Training more resilient artificial neural networks via innovative chemical reaction networks

JUL 11, 2025
Design ensures greater computational reliability under stochastic fluctuations

Biological computation is a growing area of focus in artificial neural network (NN) development. In this field, DNA strands and protein enzymes stand in for the digital inputs typically used in classical, silicon chip-based computation.

Recent research has employed chemical reaction networks (CRNs), which represent interactions among biochemical species in graphical form and harness those biochemical processes to perform computation. Often, however, CRNs are plagued by random fluctuations known as noise, in which small disturbances can lead to large output deviations.
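
As a rough illustration of how a CRN can compute (a hypothetical toy example, not the authors' construction), consider a catalytic production reaction paired with a degradation reaction. Under mass-action kinetics, the output species settles at a steady state that equals the product of the two inputs; the species names and rate constants below are invented for the sketch:

```python
# Hypothetical toy CRN (illustrative; not from the paper):
# the catalytic reaction X + Y -> X + Y + Z (rate k1) produces Z,
# and the degradation Z -> 0 (rate k2) removes it. Under mass-action
# kinetics, Z settles at (k1 / k2) * x * y, so the steady state of
# the network "computes" the product of its two inputs.
k1, k2 = 1.0, 1.0
x, y = 2.0, 3.0      # input concentrations, held constant
z = 0.0              # output species concentration
dt, steps = 0.01, 5000

for _ in range(steps):
    dz = k1 * x * y - k2 * z   # mass-action rate equation for Z
    z += dt * dz               # simple Euler integration step

print(round(z, 3))   # ~6.0: the steady state equals x * y
```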

Sunghwa Kang and Jinsu Kim designed a CRN that carries out both NN computations and the training processes that support them, while suppressing noise to ensure more reliable behavior under stochastic fluctuations.

“In contrast to prior CRN-based implementations that often rely on piecewise or discontinuous functions, a key distinguishing feature of our approach is the use of smooth activation functions, whose derivatives are continuous,” said Kim. “This is crucial for NN training since gradients determine how parameters are updated, and, because chemical reaction systems inherently exhibit noise, it is essential that computations be robust to such fluctuations.”
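To see why smoothness matters, compare a sigmoid-type activation, the kind of saturating response that biochemical kinetics can naturally produce, with a hard threshold. The specific functions below are illustrative choices, not taken from the paper; the point is that the smooth function has a continuous derivative everywhere, so gradient-based updates remain well defined:

```python
import numpy as np

def sigmoid(u):
    # Smooth activation: differentiable at every operating point.
    return 1.0 / (1.0 + np.exp(-u))

def sigmoid_grad(u):
    # Continuous derivative, so gradient descent sees no jumps.
    s = sigmoid(u)
    return s * (1.0 - s)

def hard_threshold(u):
    # Piecewise alternative: derivative is 0 almost everywhere and
    # undefined at u = 0, which breaks gradient-based training there.
    return np.where(u >= 0, 1.0, 0.0)

u = np.linspace(-0.1, 0.1, 5)
print(sigmoid_grad(u))     # varies smoothly through u = 0
print(hard_threshold(u))   # jumps discontinuously at u = 0
```
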

The researchers achieved “one-pot computing,” in which computation and training occur simultaneously. In previous studies, realizing full NN functionality typically required oscillatory control that alternated between computation and training phases.

“In contrast, our CRN’s reactions proceed concurrently, and the division between computation and training is governed by timescale separation,” said Kim. “Specifically, we slow down certain reactions associated with training dynamics to prevent interference with faster forward computations. Interestingly, we found this naturally mimics classical gradient descent-based optimization, which updates NN parameters with small step sizes.”
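
A minimal sketch of the timescale-separation idea, using a generic two-timescale ODE rather than the paper's reaction system (all variables and rates here are hypothetical): a fast variable relaxes quickly to the forward computation for the current weight, while the weight itself drifts slowly down the loss gradient at a small rate, so the slow dynamics track gradient descent with a small step size:

```python
# Hypothetical two-timescale toy system (not the paper's CRN):
# fast variable a tracks the forward computation w * x, while the
# weight w drifts slowly down the gradient of the loss
# (a - target)^2 / 2, scaled by a small rate constant eps.
x, target = 1.5, 3.0
w, a = 0.0, 0.0
eps = 0.01            # slowed "training" reactions: small rate constant
dt, steps = 0.01, 50000

for _ in range(steps):
    da = w * x - a                 # fast: forward computation relaxes quickly
    dw = -eps * (a - target) * x   # slow: mimics a small gradient-descent step
    a += dt * da
    w += dt * dw

print(round(w * x, 3))   # approaches the target 3.0, as gradient descent would
```

Because the fast variable stays close to its equilibrium, the slow weight effectively follows the same update rule as classical gradient descent with a small step size, which is the correspondence Kim describes above.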

Source: “Noise-robust training of artificial neural networks using chemical reaction networks,” by Sunghwa Kang and Jinsu Kim, APL Machine Learning (2025). The article can be accessed at https://doi.org/10.1063/5.0271766.
