THE EFFECT OF ADAPTIVE ACTIVATION FUNCTIONS ON THE GENERALIZATION ABILITY OF DEEP NEURAL NETWORKS

Authors

  • Obidjonov Boburjon Rustamjon o’g’li, Kimyo International University in Tashkent

Keywords:

adaptive activation functions, generalization ability, generalization gap, deep neural networks, Mish, gradient flow.
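For reference, the Mish activation named in the keywords (Misra, 2019) is defined as f(x) = x · tanh(softplus(x)). A minimal sketch in Python, assuming NumPy is available:

```python
import numpy as np

def softplus(x):
    # Numerically stable softplus: log(1 + exp(x)), avoids overflow for large x.
    return np.logaddexp(0.0, x)

def mish(x):
    # Mish: smooth, non-monotonic activation f(x) = x * tanh(softplus(x)).
    # Behaves like the identity for large positive x and allows small
    # negative outputs (minimum near x ≈ -1.19), unlike ReLU.
    return x * np.tanh(softplus(x))
```

The function name and module layout here are illustrative only; deep learning frameworks typically ship their own implementations.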

Abstract

  


Published

2026-04-07