1.3. Leaky Integrator Rate Neuron Model

The leaky integrator (LI) is a standard model of neuronal dynamics that has been used for both spiking and rate neuron models. Here, we introduce a rate-coupled LI neuron model. It has a single state variable \(v_i\), the dynamics of which are governed by

\[\begin{aligned}\dot v_i &= -\frac{v_i}{\tau} + I_i(t) + k r_i^{in},\\r_i &= f(v_i).\end{aligned}\]

The two constants governing the dynamics of \(v_i\) are the global decay time constant \(\tau\) and the global coupling constant \(k\). The variable \(r_i\) represents a potentially non-linear transform of the state variable \(v_i\) and serves as the output rate of the neuron; it is the variable that should be used to connect a neuron to other neurons in an RNN. The variable \(r_i^{in}\), in turn, serves as the target variable for such connections. RectiPy provides two versions of the LI neuron, which differ in their choice of \(f\). In the examples below, we examine an RNN of randomly coupled LI neurons for each of these two choices.
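To make these equations concrete, here is a minimal standalone NumPy sketch that integrates the model for a single neuron via the forward Euler method, with \(r_i^{in} = 0\) (no recurrent input) and arbitrarily chosen parameter values that are purely illustrative, not RectiPy defaults:

import numpy as np

# illustrative parameters (arbitrary choices, not RectiPy defaults)
tau, k, dt = 1.0, 1.0, 1e-3  # decay time constant, coupling constant, step size
steps = 10000
I_ext = 0.5                  # constant extrinsic input I(t)
v, r_in = 0.0, 0.0           # initial state; no recurrent input in this sketch

v_rec = np.zeros(steps)
for step in range(steps):
    # forward Euler step of dv/dt = -v/tau + I(t) + k*r_in
    v += dt * (-v / tau + I_ext + k * r_in)
    v_rec[step] = v

r = np.tanh(v_rec)           # output rates for the choice f = tanh

With \(r_i^{in} = 0\) and constant input \(I\), the state \(v_i\) simply relaxes towards the fixed point \(\tau I\) with time constant \(\tau\).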

1.3.1. LI neuron with a hyperbolic tangent transform

In this example, we will examine an RNN of LI neurons with \(r_i = \tanh(v_i)\), i.e. the hyperbolic tangent function, which provides a non-linear mapping of \(v_i\) to the open interval \((-1, 1)\).

1.3.1.1. Step 1: Network initialization

We will start out by creating a random coupling matrix and implementing an RNN model of rate-coupled LI neurons.

from rectipy import Network
import numpy as np

# define network parameters
node = "neuron_model_templates.rate_neurons.leaky_integrator.tanh"
N = 5
J = np.random.randn(N, N)*1.5

# initialize network
net = Network(dt=1e-3)

# add leaky integrator node
net.add_diffeq_node("leaky_integrator", node, weights=J, source_var="tanh_op/r", target_var="li_op/r_in",
                    input_var="li_op/I_ext", output_var="li_op/v")

The above code implements a rectipy.Network with \(N = 5\) LI neurons, with random coupling weights drawn from a zero-mean Gaussian distribution (\(\mu = 0\)) with standard deviation \(\sigma = 1.5\). This particular choice of source and target variables for the coupling implements \(r_i^{in} = \sum_{j=1}^N J_{ij} r_j\).
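In other words, the recurrent input to the network is the matrix-vector product of the coupling matrix with the vector of output rates. The following NumPy illustration (using a made-up rate vector, purely for demonstration) expresses the same operation:

r = np.tanh(np.random.randn(N))  # made-up output rates, for illustration only
r_in = J @ r                     # r_in[i] = sum_j J[i, j] * r[j]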

1.3.1.2. Step 2: Simulation of the network dynamics

Now, let's examine the network dynamics by integrating the evolution equations over 10000 steps, using the integration step size of dt = 1e-3 defined above (i.e., 10 units of simulation time).

# define network input
steps = 10000
inp = np.zeros((steps, N)) + 0.5

# perform numerical simulation
obs = net.run(inputs=inp, sampling_steps=10)

We created a time series of constant input and fed it to the input variable of the RNN at each integration step. Due to sampling_steps=10, the network state is recorded only at every 10th integration step. Let's have a look at the resulting network dynamics.

from matplotlib.pyplot import show, legend

obs.plot("out")
legend([f"v_{i}" for i in range(N)])
show()

As can be seen, the different LI neurons generated different output rates even though they received the same extrinsic input, owing to the randomly sampled coupling weights and the RNN dynamics emerging from them.

1.3.2. LI neuron with a logistic transform

In this example, we will examine an RNN of LI neurons with \(r_i = \frac{1}{1 + \exp(-v_i)}\), i.e. the logistic function, which provides a non-linear mapping of \(v_i\) to the open interval \((0, 1)\).
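Note that the logistic function is just a shifted and rescaled hyperbolic tangent, since \(\frac{1}{1 + \exp(-v)} = \frac{1}{2}\left(1 + \tanh\left(\frac{v}{2}\right)\right)\). A quick NumPy sanity check of this identity:

v = np.linspace(-5.0, 5.0, 11)
sigmoid = 1.0 / (1.0 + np.exp(-v))
assert np.allclose(sigmoid, 0.5 * (1.0 + np.tanh(v / 2.0)))

The two transforms thus differ only by affine rescalings of their input and output, yet, as shown below, this suffices to alter the recurrent network dynamics.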

# remove leaky integrator node from network
net.pop_node("leaky_integrator")

# add LI neuron node with the logistic activation function
node = "neuron_model_templates.rate_neurons.leaky_integrator.sigmoid"
net.add_diffeq_node("li_sigmoid", node, weights=J, source_var="sigmoid_op/r", target_var="li_op/r_in",
                    input_var="li_op/I_ext", output_var="li_op/v")

# perform numerical simulation
obs = net.run(inputs=inp, sampling_steps=10)

# visualize the network dynamics
obs.plot("out")
legend([f"v_{i}" for i in range(N)])
show()

As can be seen, using the logistic function instead of the hyperbolic tangent as the output function of the LI neurons leads to different network dynamics, even though both networks were initialized with the same coupling weights and received the same extrinsic input.
