Fractional-Order Activation Functions for Neural Networks

Case Studies on Forecasting Wind Turbines' Generated Power · Studies in Systems, Decision and Control

By Kishore Bingi

This book presents the development of single- and multi-layer fractional-order neural networks that incorporate fractional-order activation functions derived using fractional-order derivatives. Activation functions are essential in neural networks because they introduce nonlinearity, enabling models to learn complex patterns in data. However, traditional activation functions suffer from limitations such as non-differentiability, vanishing gradients, and inactive neurons at negative inputs, which can degrade network performance, especially on tasks involving intricate nonlinear dynamics. To address these issues, fractional-order derivatives from fractional calculus are proposed; such derivatives can model complex systems with non-local or non-Markovian behavior.

The aim is to improve wind power prediction accuracy using datasets from the Texas wind turbine and the Jeju Island wind farm under various scenarios. The book explores the advantages of fractional-order activation functions in terms of robustness, faster convergence, and greater flexibility in hyperparameter tuning. It includes a comparative analysis of single- and multi-layer fractional-order neural networks against conventional neural networks, assessing performance with metrics such as mean square error and coefficient of determination. The impact of machine-learning-based imputation of missing data on network performance is also discussed.

The book demonstrates the potential of fractional-order activation functions to enhance neural network models, particularly for predicting chaotic time series. The findings suggest that fractional-order activation functions can significantly improve accuracy and performance, underscoring the importance of advancing activation function design in neural network analysis. The book also serves as a valuable teaching and learning resource for undergraduate and postgraduate students conducting research in this field.
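To give a concrete sense of the idea (a minimal sketch, not the book's own formulation), the example below applies a Grünwald–Letnikov approximation of an order-alpha derivative to the sigmoid and uses the result as a pointwise activation. The function name and the values of alpha, the step size h, and the truncation length are illustrative assumptions.

```python
import numpy as np

def gl_fractional_sigmoid(x, alpha=0.5, h=0.1, n_terms=50):
    """Illustrative fractional-order activation: Grünwald–Letnikov
    approximation of the alpha-order derivative of the sigmoid.
    alpha, h, and n_terms are assumed, illustrative values."""
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    # GL weights w_k = (-1)^k * binom(alpha, k), built recursively:
    # w_0 = 1, w_k = w_{k-1} * (1 - (alpha + 1) / k)
    w = np.empty(n_terms)
    w[0] = 1.0
    for k in range(1, n_terms):
        w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)
    # Weighted sum of backward-shifted sigmoid evaluations, scaled by h**-alpha
    shifted = np.stack([sigmoid(x - k * h) for k in range(n_terms)], axis=0)
    return np.tensordot(w, shifted, axes=1) / h ** alpha

# Example: evaluate the fractional activation on a few inputs
z = np.linspace(-5.0, 5.0, 5)
print(gl_fractional_sigmoid(z, alpha=0.9))
```

As alpha approaches 1 this approximation tends toward the ordinary derivative of the sigmoid, while intermediate orders interpolate between the sigmoid and its slope; the fractional order thus acts as an extra tunable hyperparameter, which is the kind of flexibility the book attributes to fractional-order activation functions.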