Advanced Photonics, Volume 6, Issue 1, 016002 (2024)
Programming nonlinear propagation for efficient optical learning machines
Ilker Oguz, Jih-Liang Hsieh, Niyazi Ulas Dinc, Uğur Teğin, Mustafa Yildirim, Carlo Gigli, Christophe Moser, Demetri Psaltis, "Programming nonlinear propagation for efficient optical learning machines," Adv. Photon. 6, 016002 (2024)
Category: Research Articles
Received: --
Accepted: --
Posted: Dec. 6, 2023
Published Online: Jan. 26, 2024
Author email: Ilker Oguz (ilker.oguz@epfl.ch)