Dynamic analysis of stochastic bidirectional associative memory neural networks with delays

https://doi.org/10.1016/j.chaos.2005.12.010

Abstract

In this paper, a stochastic bidirectional associative memory neural network model with delays is considered. By constructing Lyapunov functionals and using stochastic analysis methods and inequality techniques, we give some sufficient criteria ensuring almost sure exponential stability, pth moment exponential stability and mean value exponential stability. The obtained criteria can serve as theoretical guidance for stabilizing neural networks in practical applications when stochastic noise is taken into consideration.

Introduction

Recently, the dynamics, such as stability and periodicity, of bidirectional associative memory (BAM) neural networks have received much attention due to their potential applications in associative memory, parallel computation and optimization problems, and some important results have been obtained in Refs. [1], [2], [3], [4], [5], [6], [7], [8], [9], [10], [11]. Most neural network models proposed and discussed in the literature are deterministic. As is well known, however, a real system is usually affected by external perturbations which in many cases are highly uncertain and hence may be treated as random; as Haykin [12] points out, in real nervous systems synaptic transmission is a noisy process brought on by random fluctuations in the release of neurotransmitters and other probabilistic causes. Therefore, it is of prime importance and great interest to study the effect of stochastic perturbations on the stability of neural networks. To date, some results on the stability of stochastic cellular neural networks and stochastic Cohen–Grossberg neural networks have been reported (see [13], [14], [15], [16], [17]). However, to the best of our knowledge, few authors have studied the stability of stochastic BAM neural networks with delays.

Motivated by the above discussion, in this paper we analyze a stochastic BAM neural network model with delays. By constructing Lyapunov functionals and using stochastic analysis methods and inequality techniques, we give a set of sufficient conditions ensuring almost sure exponential stability, pth moment exponential stability and mean value exponential stability of an equilibrium point. The obtained criteria can serve as theoretical guidance for stabilizing neural networks in practical applications when stochastic noise is taken into consideration.

Section snippets

Preliminary

Consider the stochastic BAM neural network model with delays
$$
\begin{cases}
\mathrm{d}u_i(t)=\Big[-c_i u_i(t)+\sum_{j=1}^{n}p_{ji}f_j(v_j(t-\tau_j))+I_i\Big]\mathrm{d}t+\sum_{j=1}^{n}\sigma_{ji}(v_j(t))\,\mathrm{d}\omega_j(t),\\[4pt]
\mathrm{d}v_j(t)=\Big[-d_j v_j(t)+\sum_{i=1}^{n}q_{ij}g_i(u_i(t-\eta_i))+J_j\Big]\mathrm{d}t+\sum_{i=1}^{n}\bar{\sigma}_{ij}(u_i(t))\,\mathrm{d}\bar{\omega}_i(t),\quad t\geq 0,\\[4pt]
u_i(t)=\xi_i(t),\quad v_j(t)=\zeta_j(t),\quad -\tau\leq t\leq 0,
\end{cases}
$$
in which $i,j=1,\ldots,n$; $0\leq\tau_j,\eta_i\leq\tau$; $u_i(t)$, $v_j(t)$ denote the potentials of cells $i$ and $j$ at time $t$, respectively; $c_i$, $d_j$ are positive constants denoting the rates with which cells $i$ and $j$ reset their potential to the resting state when isolated from the other cells and …
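To make the model concrete, the following is a minimal Euler–Maruyama simulation sketch of the system above with n = 1; the parameter values echo Example 1 below, while the step size, horizon and constant initial history are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Minimal Euler-Maruyama sketch of the delayed stochastic BAM system above,
# with n = 1 and parameters echoing Example 1; step size, horizon and the
# constant initial history are illustrative assumptions, not from the paper.
rng = np.random.default_rng(0)

c1, d1 = 5.0, 10.0               # decay rates c_i, d_j
p11, q11 = 1.0, 1.0              # connection weights p_ji, q_ij
I1, J1 = 0.0, 0.0                # external inputs
tau = eta = 0.2                  # transmission delays
f = lambda x: x                  # f_1(v) = v
g = np.tanh                      # g_1(u) = tanh(u)
sigma = sigma_bar = lambda x: x  # noise intensities sigma_11, sigma_bar_11

dt, T = 1e-3, 3.0
steps = int(T / dt)
lag = int(tau / dt)

u = np.empty(steps + 1)
v = np.empty(steps + 1)
u[0], v[0] = 0.5, -0.5           # xi(0), zeta(0); history taken constant

for k in range(steps):
    u_del = u[k - lag] if k >= lag else u[0]   # u_1(t - eta_1)
    v_del = v[k - lag] if k >= lag else v[0]   # v_1(t - tau_1)
    dw1, dw2 = rng.normal(0.0, np.sqrt(dt), size=2)  # Brownian increments
    u[k + 1] = u[k] + (-c1 * u[k] + p11 * f(v_del) + I1) * dt + sigma(v[k]) * dw1
    v[k + 1] = v[k] + (-d1 * v[k] + q11 * g(u_del) + J1) * dt + sigma_bar(u[k]) * dw2

print("final |u_1|, |v_1|:", abs(u[-1]), abs(v[-1]))   # both should be near 0
```

With the decay rates dominating the coupling as in Example 1, a sample path produced by this sketch decays toward the zero equilibrium, in line with the exponential stability asserted by the theorems below.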

Almost sure exponential stability

 

Theorem 1

If (A1) and (A2) hold, then system (3) is almost surely exponentially stable.

Proof

It follows from (A2) that there exists a sufficiently small constant $c$ with $0<c<\min_{1\le i,j\le n}\{c_i,d_j\}$ such that
$$
-r_i(c_i-c)+\sum_{j=1}^{n}s_j|q_{ij}|^{r_{ij}}\beta_i^{s_{ij}}e^{c\tau}+\frac{(q-1)(q-2)}{2}\,r_i\sum_{j=1}^{n}L_{ji}^{2}b_{ij}^{q-2}<0
$$
and
$$
-s_j(d_j-c)+\sum_{i=1}^{n}r_i|p_{ji}|^{r_{ij}}\alpha_j^{s_{ij}}e^{c\tau}+\frac{(q-1)(q-2)}{2}\sum_{i=1}^{n}s_jL_{ij}^{2}b_{ij}^{q-2}<0.
$$
Indeed, the left-hand sides are continuous and increasing in $c$, so the strict inequalities that hold at $c=0$ by (A2) persist for all sufficiently small $c>0$.

Take
$$
V(t)=\frac{1}{q}\,e^{ct}\Big[\sum_{i=1}^{n}r_i|y_i(t)|^{q}+\sum_{j=1}^{n}s_j|z_j(t)|^{q}\Big].
$$
Applying Itô's formula to $V(t)$, we have
$$
V(t)=V(0)+\int_0^t \frac{1}{q}\,c\,e^{cs}\sum_{i=1}^{n}r_i|y_i(s)|^{q}\,\mathrm{d}s+\int_0^t \frac{1}{q}\,c\,e^{cs}\sum_{j=1}^{n}s_j|z_j(s)|^{q}\,\mathrm{d}s+\int_0^t e^{cs}\sum_{i=1}^{n}r_i|y_i(s)|^{q-1}\operatorname{sgn}(y_i(s))\Big[-c_i(u_i\ldots
$$
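As a reading aid (my own sketch, not text from the paper), the expansion above follows from applying Itô's formula to each summand $e^{ct}|y_i(t)|^{q}$ of $qV(t)$; for $q\geq 2$,
$$
\mathrm{d}\bigl(e^{ct}|y_i(t)|^{q}\bigr)=c\,e^{ct}|y_i(t)|^{q}\,\mathrm{d}t+q\,e^{ct}|y_i(t)|^{q-1}\operatorname{sgn}(y_i(t))\,\mathrm{d}y_i(t)+\frac{q(q-1)}{2}\,e^{ct}|y_i(t)|^{q-2}\,\mathrm{d}\langle y_i\rangle(t),
$$
where $\mathrm{d}\langle y_i\rangle(t)$ is the quadratic variation contributed by the diffusion term. Dividing by $q$, weighting by $r_i$ (and analogously for the $z_j$ terms with $s_j$), and integrating from $0$ to $t$ gives the integral identity quoted above.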

Moment exponential stability

 

Theorem 2

If (A1) and (A2) hold, then system (3) is pth moment exponentially stable.

Proof

Let $\bar{c}=qc/p$. By Theorem 1 we have
$$
\begin{aligned}
\min_{1\le i,j\le n}\{r_i,s_j\}\,e^{ct}\Big[\sum_{i=1}^{n}|y_i(t)|^{q}+\sum_{j=1}^{n}|z_j(t)|^{q}\Big]
&\le q\sum_{i=1}^{n}r_i|y_i(0)|^{q}+q\sum_{j=1}^{n}s_j|z_j(0)|^{q}\\
&\quad+\int_{-\tau}^{0}e^{cs}\sum_{i=1}^{n}\sum_{j=1}^{n}r_i|p_{ji}|^{r_{ij}}\alpha_j^{s_{ij}}e^{c\tau}|z_j(s)|^{q}\,\mathrm{d}s
+\int_{-\tau}^{0}e^{cs}\sum_{i=1}^{n}\sum_{j=1}^{n}s_j|q_{ij}|^{r_{ij}}\beta_i^{s_{ij}}e^{c\tau}|y_i(s)|^{q}\,\mathrm{d}s\\
&\quad+q\int_{0}^{t}e^{cs}\sum_{i=1}^{n}\sum_{j=1}^{n}r_i|y_i(s)|^{q-1}\sigma_{ji}(v_j(s))\,\mathrm{d}\omega_j(s)
+q\int_{0}^{t}e^{cs}\sum_{i=1}^{n}\sum_{j=1}^{n}s_j|z_j(s)|^{q-1}\bar{\sigma}_{ij}(u_i(s))\,\mathrm{d}\bar{\omega}_i(s).
\end{aligned}
$$
From (A2), we have
$$
\min_{1\le i,j\le n}\{r_i,s_j\}\,e^{ct}\,\|(y(t),z(t))^{T}\|_q^{q}
\le q\max_{1\le i,j\le n}\{r_i,s_j\}\,\|(\phi,\psi)^{T}\|_q^{q}
+q\int_{0}^{t}e^{cs}\sum_{i=1}^{n}\sum_{j=1}^{n}r_i|y_i(s)|^{q}\ldots
$$
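A note on the step that typically follows (a sketch of the standard argument, not text from this excerpt): since an Itô integral of a suitably integrable integrand has zero expectation, taking expectations in the last inequality removes the two stochastic integrals and leaves
$$
\mathbb{E}\,\|(y(t),z(t))^{T}\|_q^{q}\le \frac{q\max_{1\le i,j\le n}\{r_i,s_j\}}{\min_{1\le i,j\le n}\{r_i,s_j\}}\;\mathbb{E}\,\|(\phi,\psi)^{T}\|_q^{q}\,e^{-ct},
$$
from which the moment exponential stability asserted in Theorem 2 follows.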

Examples

 

Example 1

Consider the stochastic BAM neural network with delays
$$
\begin{cases}
\mathrm{d}u_1(t)=-c_1u_1(t)\,\mathrm{d}t+p_{11}f_1(v_1(t-\tau_1))\,\mathrm{d}t+\sigma_{11}(v_1(t))\,\mathrm{d}\omega_1(t),\\[2pt]
\mathrm{d}v_1(t)=-d_1v_1(t)\,\mathrm{d}t+q_{11}g_1(u_1(t-\eta_1))\,\mathrm{d}t+\bar{\sigma}_{11}(u_1(t))\,\mathrm{d}\bar{\omega}_1(t),
\end{cases}
$$
where $f_1(v_1)=v_1$, $\sigma_{11}(v_1)=v_1$, $g_1(u_1)=\dfrac{e^{u_1}-e^{-u_1}}{e^{u_1}+e^{-u_1}}$, $\bar{\sigma}_{11}(u_1)=u_1$. Obviously, $f_1$, $g_1$, $\sigma_{11}$ and $\bar{\sigma}_{11}$ satisfy condition (A1) with $\alpha_1=\beta_1=L_{11}=1$. Taking $p_{11}=q_{11}=1$, $c_1=5$, $d_1=10$, $q=3$, $r_1=3$, $s_1=2$, $r_{11}=\bar{r}_{11}=1$, $s_{11}=\bar{s}_{11}=2$, $b_{11}=\bar{b}_{11}=1$, it is easy to see that condition (A2) holds. It then follows from Theorem 1 and Theorem 2 that system (7) is almost surely …
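As a quick numerical sanity check of these parameter values, the following sketch evaluates, at $c=0$, the two inequalities reconstructed in the proof of Theorem 1 (used here as a stand-in for condition (A2), whose exact statement is not reproduced in this excerpt); the variable names are illustrative.

```python
# Check, at c = 0, the two inequalities reconstructed in the proof of Theorem 1
# for the parameters of Example 1 (n = 1). Condition (A2) itself is not shown
# in this excerpt, so treat this as a sketch rather than a verbatim verification.
c1, d1 = 5.0, 10.0
p11, q11 = 1.0, 1.0
alpha1 = beta1 = L11 = 1.0        # constants from (A1)
q = 3.0                           # the exponent q
r1, s1 = 3.0, 2.0
r11, s11, b11 = 1.0, 2.0, 1.0     # barred counterparts are equal here

lhs1 = -r1 * c1 + s1 * abs(q11) ** r11 * beta1 ** s11 \
       + (q - 1) * (q - 2) / 2 * r1 * L11 ** 2 * b11 ** (q - 2)
lhs2 = -s1 * d1 + r1 * abs(p11) ** r11 * alpha1 ** s11 \
       + (q - 1) * (q - 2) / 2 * s1 * L11 ** 2 * b11 ** (q - 2)

print(lhs1, lhs2)                 # -10.0 -15.0: both strictly negative
assert lhs1 < 0 and lhs2 < 0
```

Both left-hand sides are strictly negative with a comfortable margin, which is consistent with the claim that (A2) holds for this choice of parameters.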

Conclusions

In this paper, a stochastic bidirectional associative memory neural network with delays is studied. By constructing Lyapunov functionals and using stochastic analysis methods and inequality techniques, we obtain some sufficient conditions for almost sure exponential stability, pth moment exponential stability and mean value exponential stability. The obtained conditions can serve as theoretical guidance for stabilizing neural networks in practical applications when stochastic noise is taken into consideration.

Cited by (50)

  • Exponential p-convergence analysis for stochastic BAM neural networks with time-varying and infinite distributed delays

    2015, Applied Mathematics and Computation
    Citation Excerpt :

    From the theoretical and application view, it is also essential to study convergence for stochastic systems and stochastic BAM neural networks. In [5,13,41–44], the authors studied global asymptotic stability in Lyapunov sense for stochastic BAM networks with time-varying delays based on the Lyapunov–Krasovskii functional and stochastic analysis approach. The authors [21] discussed the convergence behavior of delayed BAM cellular neural networks with asymptotically periodic coefficients by applying mathematical analysis techniques.

  • Mean square stabilization and mean square exponential stabilization of stochastic BAM neural networks with Markovian jumping parameters

    2015, Chaos, Solitons and Fractals
    Citation Excerpt :

    In the last few years, the stability of bidirectional associative memory (BAM) neural networks has gained a mass of research attention [1,2,6,9,13–22] because of their applications and potential applications in many fields such as pattern recognition, artificial intelligence, parallel computation, associative memory and optimization problems, etc.

  • Stochastic stability of Markovian jump BAM neural networks with leakage delays and impulse control

    2014, Neurocomputing
    Citation Excerpt :

    These feasible LMIs are sufficient conditions for the proposed stability analysis of Markovian jump BAM neural networks. As discussed in Remarks 3.4–3.6, the criteria existing in [8,9,19,24,25,39–44] fail in Example 4.1. In this paper, we have investigated the stability analysis problem for a class of Markovian jump BAM neural networks with impulse control and leakage time-varying delays.


This research was supported by the National Natural Science Foundation of Nanjing University of Aeronautics and Astronautics.
