Chaos, Solitons and Fractals 177 (2023) 114302
Available online 24 November 2023
0960-0779/© 2023 Elsevier Ltd. All rights reserved.
Numerical investigation and deep learning approach for fractal–fractional
order dynamics of Hopfield neural network model
İbrahim Avcı∗, Hüseyin Lort, Buğce E. Tatlıcıoğlu
Department of Computer Engineering, Faculty of Engineering, Final International University, Kyrenia, 99300, Northern Cyprus, via Mersin 10, Turkey
ARTICLE INFO
MSC:
26A33
34A08
34K40
Keywords:
Deep learning
Fractional differential equations
Fractal–fractional derivative
Numerical analysis
Neural network
ABSTRACT
This paper investigates the dynamics of Hopfield neural networks involving fractal–fractional derivatives. The
incorporation of fractal–fractional derivatives in the neural network framework brings forth novel modeling
capabilities, capturing nonlocal dependencies, complex scaling behaviors, and memory effects. The aim of
this study is to provide a comprehensive analysis of the dynamics of Hopfield neural networks with fractal–
fractional derivatives, including the existence and uniqueness of solutions, stability properties, and numerical
simulation. In particular, the Adams–Bashforth method is employed to accurately simulate the fractal–fractional
Hopfield neural network system. Moreover, the numerical data obtained serve to validate predictions produced
by Multilayer Perceptron (MLP) and Long Short-Term Memory (LSTM) neural network models. The findings
contribute to the advancement of both fractional
calculus and neural network theory, providing valuable insights for theoretical investigations and practical
applications in complex systems analysis.
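As a minimal illustration of the numerical approach mentioned above, the following sketch implements the explicit fractional Adams–Bashforth predictor for a scalar Caputo initial-value problem (function names are our own; the scheme used for the fractal–fractional model in the paper is more involved):

```python
import numpy as np
from math import gamma

def frac_ab_predictor(f, y0, alpha, t_end, n_steps):
    """Explicit fractional Adams-Bashforth (predictor) scheme for the
    Caputo problem  D^alpha y(t) = f(t, y),  y(0) = y0,  0 < alpha <= 1."""
    h = t_end / n_steps
    t = np.linspace(0.0, t_end, n_steps + 1)
    y = np.empty(n_steps + 1)
    y[0] = y0
    c = h**alpha / gamma(alpha + 1.0)
    for n in range(n_steps):
        # quadrature weights b_{j,n+1} = h^a/Gamma(a+1) * ((n+1-j)^a - (n-j)^a)
        j = np.arange(n + 1)
        b = c * ((n + 1 - j)**alpha - (n - j)**alpha)
        # history sum over all previous nodes captures the memory effect
        y[n + 1] = y0 + np.sum(b * f(t[:n + 1], y[:n + 1]))
    return t, y
```

For α = 1 the weights collapse to b_j = h and the scheme reduces to the classical forward Euler method, which gives a quick sanity check.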
1. Introduction
Investigating dynamical systems involving fractal–fractional deriva-
tive operators has emerged as a captivating and promising research
direction in recent years [1–4]. The incorporation of fractal–fractional
derivative operators introduces a new level of complexity and richness
to the dynamics of the systems. Unlike traditional derivatives, fractal–
fractional derivatives provide a powerful tool to describe and analyze
phenomena with memory effects, nonlocality, and complex scaling
behaviors [5]. This novel approach offers several advantages, such
as capturing long-range dependencies, modeling systems with non-
Markovian dynamics, and incorporating fractional order dynamics in
a fractal framework. The exploration of dynamical systems involv-
ing fractal–fractional derivative operators has led to groundbreaking
insights and opened up new avenues for understanding complex behav-
iors in various fields, including physics, engineering, biology, finance,
and beyond [6–8]. By delving into the intricate interplay between
fractal geometry and fractional calculus, researchers are uncovering
the hidden dynamics, stability properties, chaos, and synchronization
phenomena within these systems. The investigation of such systems
holds immense potential to advance our understanding of complex
phenomena and foster the development of innovative modeling ap-
proaches, analysis techniques, and applications in diverse scientific and
technological domains.
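For reference, one widely used operator of this type is the fractal–fractional derivative of order $(\alpha,\beta)$ in the Caputo sense with a power-law kernel (a standard definition due to Atangana; the precise operator analyzed later in the paper may differ in kernel or sense):
$${}^{\mathrm{FFP}}_{\ \ \ 0}D_t^{\alpha,\beta} f(t) \;=\; \frac{1}{\Gamma(1-\alpha)} \int_0^t (t-s)^{-\alpha}\,\frac{df(s)}{ds^{\beta}}\,\mathrm{d}s, \qquad 0<\alpha,\beta\le 1,$$
where the fractal derivative is $\frac{df(s)}{ds^{\beta}} = \lim_{t\to s}\frac{f(t)-f(s)}{t^{\beta}-s^{\beta}}$. Setting $\beta=1$ recovers the classical Caputo derivative, and formally $\alpha=\beta=1$ recovers the ordinary derivative.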
∗ Corresponding author. E-mail address: ibrahim.avci@final.edu.tr (İ. Avcı).
Although artificial neural networks (ANNs) have been known in
the scientific community since the 1940s [9], they long remained of
limited practical use owing to the restricted capacity of early computing machines. ANNs
have emerged as a powerful computational paradigm inspired by the
intricate workings of the human brain. With improvements in data
storage and computer processing [10], artificial neural networks have
evolved into information-processing systems that share certain perfor-
mance characteristics with biological neural networks [11]. Among
the various types of ANNs, Hopfield neural networks (HNNs) hold a
significant place. Developed by John Hopfield in the early 1980s, HNNs
were among the pioneering models of recurrent neural networks. They
gained prominence for their ability to store and retrieve patterns, ex-
hibit associative memory capabilities, and solve optimization problems.
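The associative-memory behavior just described can be illustrated with a minimal discrete Hopfield network using the Hebbian outer-product storage rule (an illustrative sketch with our own function names, not the continuous model analyzed in this paper):

```python
import numpy as np

def hebbian_weights(patterns):
    """Store binary (+/-1) patterns via the Hebbian outer-product rule.
    The diagonal is zeroed so no neuron reinforces itself."""
    P = np.asarray(patterns, dtype=float)
    n = P.shape[1]
    W = P.T @ P / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, probe, n_iters=20):
    """Synchronous updates s <- sign(W s) until a fixed point is reached."""
    s = np.asarray(probe, dtype=float).copy()
    for _ in range(n_iters):
        s_new = np.where(W @ s >= 0, 1.0, -1.0)
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s
```

Feeding the network a corrupted version of a stored pattern drives the state back to the stored pattern, which is exactly the pattern-retrieval property noted above.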
Over the years, extensive research has been conducted to enhance
the understanding of HNNs, explore their theoretical properties, and
develop efficient learning algorithms [12–15]. These efforts have led
to a deeper comprehension of the dynamics, stability, and convergence
properties of HNNs. Moreover, HNNs have found diverse applications
in areas such as pattern recognition, image processing, optimization,
data compression, and combinatorial optimization [16–18]. In [19],
the application of fractional-order Hopfield neural networks for solving
optimization problems was explored. This work employed a semi-
analytical method based on Adomian decomposition, and it showcased
https://doi.org/10.1016/j.chaos.2023.114302
Received 23 August 2023; Received in revised form 15 November 2023; Accepted 21 November 2023