
Training more resilient artificial neural networks via innovative chemical reaction networks

JUL 11, 2025
Design ensures greater computational reliability under stochastic fluctuations

Biological computation is a growing area of focus in artificial neural network (NN) development. In this field, DNA strands and protein enzymes stand in for the digital inputs used in classical, silicon chip-based computation.

Recent research has employed chemical reaction networks (CRNs), graphical representations of interactions among biochemical species that can be harnessed to carry out computation. Often, however, CRNs are plagued by random fluctuations, known as noise, in which small disturbances lead to large output deviations.
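To make the noise problem concrete, the sketch below simulates a toy birth-death CRN with the standard Gillespie stochastic simulation algorithm. The network and all rate constants are illustrative stand-ins, not the authors' system; the point is only that even a CRN with a clean deterministic steady state fluctuates around it when reactions fire stochastically.

```python
import numpy as np

# Toy birth-death CRN (hypothetical, not the network from the paper):
#   0 --k--> X   (production, constant propensity k)
#   X --g--> 0   (degradation, propensity g * X)
# Deterministically X settles at k/g; a Gillespie simulation shows the
# intrinsic fluctuations that any CRN-based computation must tolerate.

rng = np.random.default_rng(0)

def gillespie(k=20.0, g=1.0, x0=0, t_end=50.0):
    t, x = 0.0, x0
    times, counts = [t], [x]
    while t < t_end:
        a1, a2 = k, g * x                 # reaction propensities
        a0 = a1 + a2
        t += rng.exponential(1.0 / a0)    # waiting time to next reaction
        x += 1 if rng.random() < a1 / a0 else -1  # which reaction fired
        times.append(t)
        counts.append(x)
    return np.array(times), np.array(counts)

times, counts = gillespie()
print(f"deterministic steady state: {20.0 / 1.0:.1f}")
print(f"stochastic trajectory mean: {counts.mean():.1f} (std {counts.std():.1f})")
```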

Sunghwa Kang and Jinsu Kim designed a CRN that carries out both NN computations and the training processes that support them, while suppressing noise to ensure more reliable behavior under stochastic fluctuations.

“In contrast to prior CRN-based implementations that often rely on piecewise or discontinuous functions, a key distinguishing feature of our approach is the use of smooth activation functions, whose derivatives are continuous,” said Kim. “This is crucial for NN training since gradients determine how parameters are updated, and, because chemical reaction systems inherently exhibit noise, it is essential that computations be robust to such fluctuations.”
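The toy comparison below illustrates why smoothness matters for gradient-based training. The Hill-type sigmoid used here is an illustrative stand-in for the kind of smooth, saturating response that mass-action-style kinetics can produce; the paper's exact activation may differ. Its derivative varies continuously through zero, whereas a piecewise ReLU's derivative jumps there, so small noise in the pre-activation can flip the gradient abruptly.

```python
import numpy as np

def hill(u, K=1.0, n=2):
    """Smooth Hill-type sigmoid u^n / (K^n + u^n); derivative is continuous."""
    return u**n / (K**n + u**n)

def hill_grad(u, K=1.0, n=2):
    """Closed-form derivative of the Hill function (continuous everywhere)."""
    return n * K**n * u**(n - 1) / (K**n + u**n) ** 2

def relu_grad(u):
    """Derivative of ReLU: jumps discontinuously from 0 to 1 at u = 0."""
    return (u > 0).astype(float)

# Probe points straddling zero to compare gradient behavior.
u = np.array([-0.01, -1e-6, 0.0, 1e-6, 0.01])
print("smooth grad:", hill_grad(u))   # passes continuously through 0
print("ReLU  grad:", relu_grad(u))    # abrupt 0 -> 1 jump at u = 0
```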

The researchers achieved “one-pot computing,” in which computation and training occur simultaneously. In previous studies, realizing full NN functionality typically required oscillatory control that alternated between computation and training phases.

“In contrast, our CRN’s reactions proceed concurrently, and the division between computation and training is governed by timescale separation,” said Kim. “Specifically, we slow down certain reactions associated with training dynamics to prevent interference with faster forward computations. Interestingly, we found this naturally mimics classical gradient descent-based optimization, which updates NN parameters with small step sizes.”
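The following minimal sketch shows how such timescale separation can mimic gradient descent; it is our illustration of the general idea, not the authors' reaction scheme. A fast "forward" variable y relaxes quickly to the prediction w*x, while a slow weight variable w follows a gradient flow on the squared error; the small ratio eps plays the role of a small gradient-descent step size.

```python
import numpy as np

# One-pot training sketch (illustrative, not the paper's exact CRN):
# fast dynamics compute the output, slow dynamics update the parameter,
# and both run concurrently, separated only by their rates.

x, y_target = 2.0, 3.0    # single training example and desired output
w, y = 0.0, 0.0           # weight (slow) and output (fast) variables
dt, eps = 1e-3, 1e-2      # Euler step; slow/fast timescale ratio

for _ in range(200_000):
    y += dt * (w * x - y)                   # fast: output tracks w * x
    w += dt * (-eps * (y - y_target) * x)   # slow: gradient flow on
                                            # 0.5 * (y - y_target)^2

print(f"learned w = {w:.4f} (target w = y_target/x = {y_target / x:.4f})")
print(f"output  y = {y:.4f} (target y = {y_target:.4f})")
```

Because the weight update runs much slower than the forward relaxation, w effectively sees a converged output at every instant, which is what makes the concurrent dynamics behave like small-step gradient descent.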

Source: “Noise-robust training of artificial neural networks using chemical reaction networks,” by Sunghwa Kang and Jinsu Kim, APL Machine Learning (2025). The article can be accessed at: https://doi.org/10.1063/5.0271766.
