Photonics Research, Volume 9, Issue 3, B71 (2021)
Backpropagation through nonlinear units for the all-optical training of neural networks
Fig. 1. ONN with all-optical forward- and backward-propagation. (a) A single ONN layer consisting of weighted interconnections and an SA nonlinear activation function; the forward- (red) and backward-propagating (orange) optical signals have amplitudes proportional to the neuron activations.
Fig. 2. Saturable absorber response. (a) The transmission and (b) the transmission derivative of an SA unit with optical depths of 1 (left) and 30 (right).
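The transmission curves of Fig. 2 can be reproduced qualitatively under the standard two-level saturable-absorber model, in which the intensity obeys dI/dz = -a0 I / (1 + I/I_sat); integrating over the absorber length L (optical depth OD = a0 L) gives the implicit relation ln T + x(T - 1) = -OD for the transmission T = I_out/I_in at normalized input intensity x = I_in/I_sat. The Python sketch below solves this relation numerically for the optical depths of 1 and 30 shown in the figure; the model and all parameter choices are illustrative assumptions, not the paper's own equations or code.

    # Minimal sketch (not from the paper): saturable-absorber transmission
    # under the standard two-level model, where the transmission T at
    # normalized input intensity x = I_in / I_sat and optical depth OD
    # satisfies the implicit relation  ln(T) + x*(T - 1) = -OD.
    import numpy as np
    from scipy.optimize import brentq

    def sa_transmission(x, od):
        """Solve ln(T) + x*(T - 1) = -OD for T; the root lies in [exp(-OD), 1]."""
        f = lambda t: np.log(t) + x * (t - 1.0) + od
        return brentq(f, np.exp(-od), 1.0, xtol=1e-15)

    x = np.logspace(-2, 3, 200)          # input intensity in units of I_sat
    for od in (1.0, 30.0):               # optical depths shown in Fig. 2
        T = np.array([sa_transmission(xi, od) for xi in x])
        dT_dx = np.gradient(T, x)        # numerical transmission derivative
        print(f"OD={od:g}: transmission rises from {T[0]:.2e} to {T[-1]:.3f} as the input saturates")

The low-intensity limit recovers the unsaturated Beer-Lambert transmission exp(-OD), while strong inputs bleach the absorber toward unit transmission; the derivative dT/dx computed here corresponds to the quantity plotted in panel (b).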
Fig. 3. Effects of imperfect approximation of the activation function derivative. (a) Feed-forward neural network architecture using a single hidden layer of 128 neurons. (b) Distribution of neuron inputs.
Fig. 4. Performance on image classification. (a) (i) The fully connected network architecture. (ii) Learning curves for the SA (with either the exact activation derivative or its approximation).
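Figures 3 and 4 compare training with the exact derivative of the activation function against training with an approximated one. A minimal sketch of that comparison, assuming a 784-128-10 fully connected network, a tanh placeholder activation, and a constant-derivative surrogate (none of which are the paper's actual SA nonlinearity, data, or training setup), could look as follows:

    # Illustrative sketch: a 784-128-10 fully connected network trained by
    # backpropagation, where the hidden-activation derivative used in the
    # backward pass is either exact or replaced by a crude approximation.
    # The tanh activation and constant-derivative surrogate are assumptions
    # standing in for the paper's saturable-absorber nonlinearity.
    import numpy as np

    rng = np.random.default_rng(0)
    W1 = rng.normal(0, 0.05, (784, 128))
    W2 = rng.normal(0, 0.05, (128, 10))

    f = np.tanh                                  # placeholder activation
    df_exact = lambda z: 1.0 - np.tanh(z) ** 2   # exact derivative
    df_approx = lambda z: np.ones_like(z)        # crude surrogate derivative

    def train_step(x, y_onehot, df, lr=0.05):
        """One SGD step; df selects the derivative used in backpropagation."""
        global W1, W2
        z1 = x @ W1
        h = f(z1)
        logits = h @ W2
        p = np.exp(logits - logits.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        # Backward pass: softmax cross-entropy gradient, then the chain rule.
        d_logits = (p - y_onehot) / len(x)
        dW2 = h.T @ d_logits
        d_h = d_logits @ W2.T
        d_z1 = d_h * df(z1)                      # exact or approximated derivative
        dW1 = x.T @ d_z1
        W1 -= lr * dW1
        W2 -= lr * dW2
        return -np.mean(np.log(p[np.arange(len(x)), y_onehot.argmax(1)] + 1e-12))

    # Usage with random stand-in data (replace with real images and labels):
    x = rng.random((32, 784))
    y = np.eye(10)[rng.integers(0, 10, 32)]
    print(train_step(x, y, df_exact), train_step(x, y, df_approx))

Swapping df_exact for df_approx leaves the forward pass untouched and only perturbs the gradients, which is the kind of imperfect-derivative training whose effect the learning curves in Fig. 4 quantify.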
Fig. 5. Optical backpropagation through saturable gain (SG) nonlinearity. (a) Fully connected network architecture, the same as in Fig. 4.
Xianxin Guo, Thomas D. Barrett, Zhiming M. Wang, A. I. Lvovsky, "Backpropagation through nonlinear units for the all-optical training of neural networks," Photonics Res. 9, B71 (2021)
Special Issue: DEEP LEARNING IN PHOTONICS
Received: Oct. 8, 2020
Accepted: Jan. 11, 2021
Published Online: Mar. 2, 2021
Author emails: Xianxin Guo (xianxin.guo@physics.ox.ac.uk), Thomas D. Barrett (thomas.barrett@physics.ox.ac.uk), Zhiming M. Wang (zhmwang@uestc.edu.cn), A. I. Lvovsky (alex.lvovsky@physics.ox.ac.uk)