Beyond Whole Numbers: Neural Networks Tackle Fractional Growth

Author: Denis Avetisyan


A novel approach combines artificial neural networks with fractional calculus to more accurately model complex growth patterns.

Fractional exponential growth describes a quantity that increases at a rate proportional to a non-integer power of itself, a nuanced alternative to traditional exponential models: instead of growth governed by $e^x$, the dynamics follow functions such as $e^{x^\alpha}$ with $0 < \alpha < 1$.

This review details a method for approximating solutions to initial value problems in fractional growth models using a discretized Caputo derivative within a neural network architecture.

While traditional methods struggle with the complexities of fractional differential equations, this paper, ‘Fractional Artificial Neural Networks for Growth Models’, introduces a novel approach to solving initial value problems in fractional growth models. By discretizing the Caputo derivative and implementing a fractional artificial neural network in R, we demonstrate accurate approximations of analytical solutions for generalized exponential and logistic growth with harvesting. This method offers a promising avenue for numerical solutions in fractional calculus, but could this framework be extended to address more complex, high-dimensional fractional systems?


Beyond Integer-Order Reality: The Necessity of Fractional Calculus

Traditional growth models, such as the Exponential and Logistic models, frequently struggle to represent the nuanced behavior observed in many real-world systems. These established frameworks assume instantaneous responses to stimuli, effectively neglecting the impact of past states on present dynamics. However, numerous phenomena – ranging from viscoelastic material behavior to biological population dynamics and even financial markets – exhibit what’s known as ‘memory effects’ or ‘hereditary properties’. This means the current state isn’t solely determined by present conditions, but also by a history of prior states. Consequently, these classical models, built on the assumption of immediate change, often fail to accurately predict or describe the long-term evolution of systems where past experiences demonstrably influence current and future behavior, creating a need for more sophisticated mathematical tools capable of capturing these complex dependencies.

Traditional mathematical models frequently struggle to capture the nuances of real-world phenomena because they depend on integer-order derivatives – calculations that assess rates of change as whole numbers. However, many processes, such as the movement of particles in complex fluids or the fading of signals in viscoelastic materials, exhibit anomalous diffusion or relaxation – behaviors that deviate from standard predictions. This occurs because these systems possess “memory” – their current state is influenced by their entire past history, something integer-order derivatives cannot fully account for. Consider a particle undergoing Brownian motion: its path is seemingly random, but a traditional model might oversimplify the process by assuming each step is independent. In reality, the particle’s previous movements subtly influence its trajectory, creating a more complex pattern. This inherent dependency necessitates a mathematical tool capable of quantifying rates of change that are not limited to whole-number orders, highlighting a fundamental limitation of classical approaches and paving the way for more sophisticated techniques.

Fractional calculus extends the concepts of differentiation and integration to non-integer orders, providing a more nuanced mathematical toolkit for describing systems exhibiting memory effects. Unlike traditional calculus which relies on derivatives like the first ($D^1$) or second ($D^2$) to characterize change, fractional derivatives-such as $D^{0.5}$-capture intermediate behaviors. This capability is crucial for modeling phenomena where the past state significantly influences the present, a characteristic often observed in viscoelastic materials, diffusion in porous media, and even certain financial markets. By incorporating these non-integer orders, the framework allows for a more accurate representation of anomalous diffusion, where particles don’t spread at a constant rate, and relaxation processes, where systems don’t immediately return to equilibrium. Essentially, fractional calculus offers a way to move beyond the limitations of integer-order models and capture the complex, often subtle, dynamics inherent in many real-world processes.
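
A short worked example makes the idea concrete. For a power function $f(t) = t^k$ with $k > 0$, the Caputo derivative of order $0 < \alpha < 1$ has the closed form

$$D^{\alpha} t^{k} = \frac{\Gamma(k+1)}{\Gamma(k+1-\alpha)}\, t^{k-\alpha}.$$

Setting $\alpha = 1$ recovers the familiar $k\,t^{k-1}$, and the Caputo derivative of any constant is zero, just as in classical calculus; intermediate orders interpolate smoothly between the function and its first derivative.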

Fractional logistic growth demonstrates a decelerating growth rate as the population approaches its carrying capacity.

Fractional Artificial Neural Networks: A Logical Progression in Modeling

Fractional Artificial Neural Networks (FANNs) represent an advancement over traditional Artificial Neural Networks by enabling the solution of Initial Value Problems formulated as fractional differential equations. These networks move beyond integer-order derivatives to incorporate fractional derivatives, allowing for more accurate modeling of systems exhibiting anomalous diffusion or those with memory effects. This capability extends the applicability of neural networks to a broader range of physical and engineering problems, including viscoelasticity, anomalous transport phenomena, and control systems where historical states influence current behavior. The ability to directly approximate solutions to $ \frac{d^{\alpha}y}{dt^{\alpha}} = f(t, y) $ – where $\alpha$ is a non-integer order – is the fundamental distinction of FANNs and provides a potential advantage over standard neural network approaches when dealing with these types of equations.
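
The canonical instance of such a problem is the fractional analogue of exponential growth. For the Caputo initial value problem $D^{\alpha}u(t) = a\,u(t)$ with $u(0) = u_0$ and $0 < \alpha \le 1$, the standard analytical solution is expressed through the Mittag-Leffler function:

$$u(t) = u_0\, E_{\alpha}(a t^{\alpha}), \qquad E_{\alpha}(z) = \sum_{k=0}^{\infty} \frac{z^{k}}{\Gamma(\alpha k + 1)}.$$

At $\alpha = 1$ the series collapses to $e^{at}$, recovering classical exponential growth; it is against analytical solutions of this kind that the network's approximations are benchmarked.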

The incorporation of the Caputo derivative is central to Fractional Artificial Neural Networks’ ability to model complex dynamical systems. Unlike standard derivatives which represent instantaneous rates of change, the Caputo derivative accounts for the history of the system, effectively introducing a ‘memory’ component. This is achieved through an integral transform that considers past states, allowing the network to accurately represent systems exhibiting non-local interactions – where the current state depends not only on immediate inputs but also on values from prior times. Mathematically, the Caputo derivative of order $\alpha$ is defined as $D^{\alpha}f(t) = \frac{1}{\Gamma(n-\alpha)} \int_{0}^{t} \frac{f^{(n)}(\tau)}{(t-\tau)^{\alpha-n+1}} d\tau$, where $n = \lceil \alpha \rceil$ and $\Gamma$ is the Gamma function. This formulation is particularly useful for modeling phenomena in fields such as viscoelasticity, anomalous diffusion, and control systems where memory and hereditary effects are significant.

Evaluations of the Fractional Artificial Neural Network utilized diverse architectural configurations to assess performance. The tested networks consistently featured a single input layer and a single output layer, with variations introduced through the number of hidden layers and neuron counts per layer. The shallowest network implemented two hidden layers, each containing 42 neurons. More complex configurations increased the number of hidden layers up to a maximum of six, with varying neuron counts per layer – specifically, 8, 42, 64, 64, 42, and 8 neurons respectively. This range of architectures allowed for investigation into the relationship between network depth, layer size, and the accuracy of solutions to fractional differential equations.
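
Written out as a sketch in R (names here are illustrative, not taken from the paper's code), the two extremes of the tested configurations are:

```r
# hidden-layer widths for the shallowest and deepest tested architectures
shallow_net <- c(42, 42)                 # two hidden layers, 42 neurons each
deepest_net <- c(8, 42, 64, 64, 42, 8)   # six hidden layers
```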

The implementation of Fractional Artificial Neural Networks necessitates the discretization of the fractional derivative operator due to its non-local nature and incompatibility with direct computation. The Backward Difference method was employed to approximate the Caputo fractional derivative within the network structure. This technique involves representing the fractional derivative as a weighted sum of past values of the function, effectively approximating the integral in the definition of the fractional derivative. Specifically, the $ \alpha $-order Caputo fractional derivative is approximated using a finite difference scheme based on past function values, allowing for its inclusion within the network’s forward pass. The accuracy of this approximation is dependent on the chosen time step and the order of the fractional derivative, $\alpha$, and influences the overall performance of the network in solving the Initial Value Problem.
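
Below is a minimal sketch of one such scheme in R, assuming the Grünwald-Letnikov form of the backward difference for $0 < \alpha < 1$; the paper's exact discretization may differ in details such as the weight definitions or the treatment of the initial value.

```r
# Backward-difference (Grunwald-Letnikov) approximation of the Caputo
# derivative of order alpha, for samples f_vals on a uniform grid of step h.
caputo_bd <- function(f_vals, alpha, h) {
  n <- length(f_vals)
  # GL weights via the recurrence w_0 = 1, w_j = w_{j-1} * (1 - (alpha+1)/j)
  w <- numeric(n)
  w[1] <- 1
  if (n > 1) for (j in 2:n) w[j] <- w[j - 1] * (1 - (alpha + 1) / (j - 1))
  # subtracting f(0) turns the Riemann-Liouville-type sum into a Caputo one
  g <- f_vals - f_vals[1]
  d <- numeric(n)
  for (k in 1:n) d[k] <- sum(w[1:k] * rev(g[1:k])) / h^alpha
  d
}

# sanity check against the known Caputo derivative of t^2:
# D^alpha t^2 = Gamma(3) / Gamma(3 - alpha) * t^(2 - alpha)
h <- 0.01; t <- seq(0, 1, by = h); alpha <- 0.5
max(abs(caputo_bd(t^2, alpha, h) -
        gamma(3) / gamma(3 - alpha) * t^(2 - alpha)))  # shrinks roughly with h
```

The weighted sum over all past grid points is what injects the memory effect into the forward pass; because the sum runs over the whole history, the chosen time step directly affects both accuracy and runtime.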

A fractional logistic model demonstrates the impact of periodic harvesting on population dynamics.

Optimizing Network Performance: A Rigorous Validation Protocol

Network performance evaluation relies on quantifying the difference between the model’s predictions and the known, correct solutions. This is achieved using a Loss Function, with the Root Mean Square Error (RMSE) being the metric of choice in this implementation. RMSE calculates the square root of the average squared differences between predicted values and actual values; mathematically, $RMSE = \sqrt{\frac{1}{n}\sum_{i=1}^{n}(y_i - \hat{y}_i)^2}$, where $n$ is the number of data points, $y_i$ represents the actual value for data point $i$, and $\hat{y}_i$ is the predicted value. A lower RMSE indicates a better fit of the network to the data and, consequently, improved performance.
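
In R the loss reduces to a one-liner (a sketch; the function name is illustrative):

```r
# root mean square error between reference values y and predictions y_hat
rmse <- function(y, y_hat) sqrt(mean((y - y_hat)^2))

rmse(c(1, 2, 3), c(1.1, 1.9, 3.2))  # 0.1414214
```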

Network training utilizes Gradient Descent to iteratively adjust network weights and biases, minimizing the Root Mean Square Error (RMSE) loss function. The Adam Optimizer is integrated to adaptively modify the learning rate for each parameter, accelerating convergence and improving training efficiency. Adam computes adaptive learning rates from estimates of the first and second moments of the gradients, effectively combining the advantages of AdaGrad and RMSProp. This allows for larger updates where gradients are sparse and smaller updates where they are frequent, yielding faster and more stable training than standard Gradient Descent with a fixed learning rate. Together, Gradient Descent and the Adam Optimizer provide stable, efficient minimization of the loss function.
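
A minimal sketch of a single Adam update in base R, following the standard moment estimates (the hyperparameter defaults are the commonly published ones, not values confirmed by the paper):

```r
# one Adam step for a parameter vector theta given its gradient
adam_step <- function(theta, grad, state,
                      lr = 1e-3, beta1 = 0.9, beta2 = 0.999, eps = 1e-8) {
  state$t <- state$t + 1
  state$m <- beta1 * state$m + (1 - beta1) * grad      # first-moment estimate
  state$v <- beta2 * state$v + (1 - beta2) * grad^2    # second-moment estimate
  m_hat <- state$m / (1 - beta1^state$t)               # bias correction
  v_hat <- state$v / (1 - beta2^state$t)
  list(theta = theta - lr * m_hat / (sqrt(v_hat) + eps), state = state)
}

state <- list(t = 0, m = 0, v = 0)
step  <- adam_step(c(0.5, -0.3), grad = c(0.2, -0.1), state)
```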

The R statistical software environment serves as the primary platform for implementing and validating the Fractional Artificial Neural Networks described. R provides a comprehensive suite of tools for numerical computation, data manipulation, and statistical analysis, facilitating the construction of network architectures and the execution of training algorithms. Specifically, R packages are utilized for defining network layers, implementing the Adam optimizer for gradient descent, and calculating the Root Mean Square Error ($RMSE$) as the loss function. The software’s capabilities extend to data visualization and statistical testing, enabling thorough validation of network performance across diverse parameter settings and model types, including Exponential, Logistic, and Harvesting models. This robust environment supports repeatable experimentation and rigorous assessment of the network’s predictive accuracy and generalization capability.

Network performance was evaluated using three distinct growth models with specific parameterizations. Exponential growth was simulated with a growth rate ($a$) of 1 and an initial condition ($u_0$) of 1. Logistic growth utilized a carrying capacity ($N$) of 1, an initial condition ($u_0$) of 0.01, and a growth rate ($a$) of 10. Finally, a harvesting model was implemented with $N=1$, $u_0=0.4$, $a=5$, and a harvesting parameter ($b$) of 0.8. These parameter settings allowed for a comparative assessment of the network’s ability to approximate solutions across different dynamic systems.
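
A sketch of the three right-hand sides under these parameter settings follows. The harvesting term is written here as simple proportional removal $b\,u$; since the paper’s figures indicate periodic harvesting, the actual term presumably carries a time dependence not reproduced in this placeholder:

```r
# right-hand sides f(u) of the three test models, with the stated parameters
f_exponential <- function(u, a = 1) a * u                        # u0 = 1
f_logistic    <- function(u, a = 10, N = 1) a * u * (1 - u / N)  # u0 = 0.01
f_harvesting  <- function(u, a = 5, N = 1, b = 0.8) {            # u0 = 0.4
  a * u * (1 - u / N) - b * u   # placeholder harvesting; actual form may differ
}
```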

Loss decreases consistently with training epochs across all values of α (1.0, 0.9, 0.8, and 0.7), indicating effective model convergence.

Beyond Standard Growth: Ecological Implications and Predictive Power

Ecological systems are notoriously complex, often defying accurate representation by traditional mathematical models which rely on integer-order derivatives and assume instantaneous change. Fractional Artificial Neural Networks (FANNs) present a significant advancement by incorporating fractional calculus – derivatives of non-integer order – into their architecture. This allows FANNs to capture the ‘memory’ and non-local dependencies inherent in many ecological processes, such as delayed effects of resource availability or the lingering impact of environmental stressors. Unlike models constrained by immediate responses, FANNs can effectively model systems exhibiting power-law behaviors and long-range correlations, leading to more realistic simulations of population dynamics, species interactions, and ecosystem resilience. The ability to accurately represent these nuanced relationships positions FANNs as a powerful tool for ecological forecasting and informed conservation strategies, exceeding the limitations of conventional modeling approaches.

Fractional Artificial Neural Networks demonstrate a compelling ability to model population growth while incorporating ecologically relevant constraints. Traditional models, such as the Logistic Growth Model, described by the equation $dN/dt = rN(1 - N/K)$ where $r$ is the growth rate and $K$ is the carrying capacity, often rely on simplifying assumptions. These networks, however, can accurately simulate population dynamics even when faced with limitations like resource scarcity or environmental fluctuations, because of their ability to capture memory and hereditary characteristics. Recent studies show these networks not only replicate observed growth curves, but also predict population responses to changing conditions with greater precision than conventional methods, offering a valuable tool for understanding and forecasting ecological changes.
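
For reference, the integer-order logistic equation admits the familiar closed-form solution

$$N(t) = \frac{K}{1 + \left(\frac{K}{N_0} - 1\right) e^{-rt}},$$

whereas its fractional counterpart has no comparably simple closed form, which is precisely where a numerical approximator such as a fractional network becomes useful.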

The adaptability of fractional artificial neural networks extends beyond simple population dynamics to address the intricacies of resource management, specifically periodic harvesting. These networks can simulate the impact of regular resource extraction – mimicking real-world scenarios like timber harvesting or fisheries – and predict long-term sustainability. By adjusting parameters representing harvest rates and regeneration times, researchers can explore various management strategies and identify optimal harvesting schedules that balance economic yield with ecological preservation. The models account for the time delays inherent in biological systems, offering a more realistic assessment than traditional models which often assume instantaneous growth. This capability is crucial for informing policy decisions, allowing for the development of strategies that maximize resource availability while minimizing the risk of depletion, and ensuring the long-term health of ecosystems and the communities that depend on them.

The pursuit of accurate growth model solutions, as demonstrated by this work utilizing fractional artificial neural networks, echoes a fundamental principle of computational rigor. The discretization of the Caputo derivative, while a numerical approximation, strives for provable convergence, a hallmark of sound mathematical practice. As Marvin Minsky once stated, “If it feels like magic, you haven’t revealed the invariant.” This sentiment perfectly encapsulates the paper’s underlying aim: to move beyond simply ‘fitting’ data with a neural network and instead expose the inherent mathematical structure governing fractional differential equations, making the solution transparent and verifiable, not reliant on opaque, emergent behavior. The power lies not in the network’s predictive ability alone, but in its capacity to illuminate the underlying invariant properties of the system.

Beyond Approximation: Charting a Course for Rigor

The presented work, while demonstrating a functional approximation of solutions to fractional growth models, merely skirts the fundamental question. The discretization of the Caputo derivative, a necessary evil for implementation, introduces a degree of arbitrariness that true mathematical elegance resists. Future efforts must grapple with the error introduced by such discretizations, not simply by minimizing loss functions on test cases, but by establishing rigorous bounds on the approximation error itself. To claim a solution is ‘good’ because it fits observed data is, frankly, insufficient; the goal is not mimicry, but demonstrable correctness.

A critical next step lies in extending this framework beyond the relatively well-behaved domain of growth models. Fractional calculus is not a mere tool for curve fitting; it describes inherent non-locality in many physical systems. The true test of this fractional artificial neural network approach will be its application to problems where the underlying fractional order is not known a priori, requiring the network to simultaneously learn both the solution and the order of the derivative. This demands a fundamental re-evaluation of the loss function, moving beyond simple residual minimization to incorporate principles of parsimony and physical consistency.

Ultimately, the pursuit of ‘fractional intelligence’ should not be driven by a desire to achieve incremental improvements in predictive accuracy. It should be motivated by a deeper understanding of the mathematical structure of fractional systems, and a commitment to building models that are not merely effective, but provably correct. Simplicity, it must be remembered, is not about brevity; it is about non-contradiction and logical completeness.


Original article: https://arxiv.org/pdf/2511.16676.pdf

Contact the author: https://www.linkedin.com/in/avetisyan/
