Comparison Study and Analysis of Implementing Activation Function of Machine Learning in MATLAB and FPGA


Authors : Mallika Roy; Jishnu Nath Paul; Josita Sengupta; Swagata Bhattacharya

Volume/Issue : Volume 9 - 2024, Issue 6 - June


Google Scholar : https://tinyurl.com/3vp5bh2j

Scribd : https://rb.gy/vcdncp

DOI : https://doi.org/10.38124/ijisrt/IJISRT24JUN1109



Abstract : This study presents the implementation and comparative analysis of the sigmoid, approximate sigmoid, and hard sigmoid activation functions on an FPGA using Verilog HDL and the Xilinx ISE simulator, and investigates key performance parameters including device utilization, clock loading, and timing characteristics. The findings suggest that the exact sigmoid provides the greatest accuracy at the expense of higher resource usage, the approximate sigmoid strikes a balance between accuracy and efficiency, and the hard sigmoid is the most resource-efficient but the least precise. Comparison with MATLAB results showed the effect of fixed-point computation and reduced word length, where finer quantization resulted in improved accuracy. The study highlights the trade-offs involved in FPGA-based neural network implementations using fixed-point arithmetic, and suggests future research on reduced-precision representations and more efficient activation-function algorithms.
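For illustration, a hard sigmoid of the form y = clamp(0.25x + 0.5, 0, 1) maps naturally onto shift-and-add hardware. The following Verilog sketch is a minimal example, not the paper's actual RTL; the Q4.4 input and Q0.8 output fixed-point formats are assumptions chosen for this example.

// Minimal hard sigmoid sketch: y = clamp(0.25*x + 0.5, 0, 1).
// Assumed formats: x is signed Q4.4 (value = x/16), y is unsigned Q0.8
// (value = y/256, with 255 standing in for 1.0).
module hard_sigmoid (
    input  wire signed [7:0] x,
    output reg         [7:0] y
);
    // (0.25*(x/16) + 0.5) * 256 = 4*x + 128, computed in 11 bits
    // so the intermediate value cannot overflow.
    wire signed [10:0] acc = (x <<< 2) + 11'sd128;

    always @* begin
        if (acc <= 0)             y = 8'd0;     // saturate low:  x <= -2.0
        else if (acc >= 11'sd255) y = 8'd255;   // saturate high: x >= +2.0
        else                      y = acc[7:0]; // linear region
    end
endmodule

Likewise, the approximate sigmoid the abstract compares could be realized with a piecewise-linear scheme such as the well-known PLAN approximation, which replaces all multiplications with shifts. This is one of several schemes in the literature and not necessarily the one used in the paper; the sketch below keeps the same assumed fixed-point formats.

// Minimal PLAN-style piecewise-linear sigmoid sketch (shift-and-add only).
// Same assumed formats: x signed Q4.4, y unsigned Q0.8.
module plan_sigmoid (
    input  wire signed [7:0] x,
    output wire        [7:0] y
);
    // |x| in Q4.4 (two's-complement negate when x is negative).
    wire [7:0] ax = x[7] ? (~x + 1'b1) : x;

    // Positive half, scaled by 256; breakpoints 1.0, 2.375, 5.0 become
    // 16, 38, 80 in Q4.4:
    //   |x| >= 5.0           : y = 1                -> 255 (saturated)
    //   2.375 <= |x| < 5.0   : y = |x|/32 + 0.84375 -> (ax>>1) + 216
    //   1.0   <= |x| < 2.375 : y = |x|/8  + 0.625   -> (ax<<1) + 160
    //   0.0   <= |x| < 1.0   : y = |x|/4  + 0.5     -> (ax<<2) + 128
    reg [8:0] ypos;
    always @* begin
        if      (ax >= 8'd80) ypos = 9'd255;
        else if (ax >= 8'd38) ypos = (ax >> 1) + 9'd216;
        else if (ax >= 8'd16) ypos = (ax << 1) + 9'd160;
        else                  ypos = (ax << 2) + 9'd128;
    end

    // Symmetry: sigmoid(-x) = 1 - sigmoid(x), i.e. 256 - ypos in Q0.8.
    wire [8:0] yneg = 9'd256 - ypos;
    assign y = x[7] ? yneg[7:0] : ypos[7:0];
endmodule

Both sketches are purely combinational and use only comparators, shifts, and adders, mirroring the trade-off the study reports: the single-segment hard sigmoid is cheapest, while the four-segment piecewise-linear version tracks the true sigmoid more closely at the cost of extra logic.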

Keywords : Activation Function; FPGA; Verilog HDL; Xilinx; MATLAB; Machine Learning.


