1.1. Leaky Integrate-and-Fire Spiking Neuron Model

The leaky integrate-and-fire (LIF) neuron model is the most commonly used spiking neuron model. It has a single state-variable \(v_i\), the dynamics of which are governed by

\[\dot v_i = -\frac{v_i}{\tau} + I_i(t) + k s_i^{in}.\]

The two constants governing the dynamics of \(v_i\) are the global decay time constant \(\tau\) and the global coupling constant \(k\). Whenever \(v_i \geq v_{peak}\), a spike is counted and the reset condition \(v_i \rightarrow v_{reset}\) is applied. This introduces a discontinuity into the dynamics of \(v_i\) around each spike event and allows the state variable to be interpreted as the membrane potential of a neuron near the soma. Spikes, or spike-driven synapses, serve as input to other neurons in an RNN. Here, we use a spike-driven synaptic activation \(s_i\) with the following dynamics:

\[\dot s_i = -\frac{s_i}{\tau_s} + \delta(v_i - v_{peak}),\]

where \(\tau_s\) is the synaptic time constant and \(\delta\) is the Dirac delta function, which contributes a unit impulse whenever the \(i^{th}\) neuron spikes and is zero otherwise. The variable \(s_i^{in}\) serves as the target variable for such synaptic activation variables.
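To make these dynamics concrete, the two equations above can be integrated with a simple forward-Euler scheme. The following stand-alone sketch simulates a single, uncoupled LIF neuron under constant input; all parameter values are illustrative choices, not values prescribed by the model template used below.

```python
import numpy as np

# Forward-Euler sketch of a single LIF neuron (illustrative parameters).
tau, k = 10.0, 1.0           # membrane decay time constant and coupling constant
v_peak, v_reset = 1.0, 0.0   # spike threshold and reset value
tau_s = 5.0                  # synaptic time constant
dt, steps = 1e-3, 10000      # integration step size and number of steps
I = 200.0                    # constant extrinsic input

v, s = 0.0, 0.0
spikes = 0
s_trace = np.zeros(steps)
for t in range(steps):
    v += dt * (-v / tau + I + k * 0.0)  # no recurrent input for a single neuron
    s += dt * (-s / tau_s)              # synaptic activation decays between spikes
    if v >= v_peak:
        v = v_reset  # reset condition
        s += 1.0     # the Dirac delta integrates to a unit jump in s
        spikes += 1
    s_trace[t] = s

print(spikes)
```

With a strong constant input, the neuron spikes regularly and the synaptic activation \(s\) builds up to a steady level where spike-driven jumps balance the exponential decay.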

In this example, we will examine an RNN of LIF neurons with spike-coupling.

1.1.1. Step 1: Network initialization

We will start out by creating a random coupling matrix and implementing an RNN model of spike-coupled LIF neurons.

import numpy as np

from rectipy import Network

# define network parameters
node = "neuron_model_templates.spiking_neurons.lif.lif"
N = 5
J = np.random.randn(N, N)*20.0

# initialize network
net = Network(dt=1e-3)

# add LIF population to network
net.add_diffeq_node("lif", node, weights=J, source_var="s", target_var="s_in", input_var="I_ext",
                    output_var="s", spike_var="spike", spike_def="v", op="lif_op")

The above code implements a rectipy.Network with \(N = 5\) LIF neurons with random coupling weights drawn from a Gaussian distribution with mean \(\mu = 0\) and standard deviation \(\sigma = 20\). This particular choice of source and target variables for the coupling implements \(s_i^{in} = \sum_{j=1}^N J_{ij} s_j\).
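The coupling defined via source_var="s" and target_var="s_in" amounts to a matrix-vector product. The following stand-alone NumPy sketch (not rectipy code) illustrates the operation \(s_i^{in} = \sum_{j=1}^N J_{ij} s_j\) for a hypothetical vector of synaptic activations:

```python
import numpy as np

# Stand-alone illustration of the coupling operation (not rectipy code).
np.random.seed(0)  # hypothetical seed, for reproducibility only
N = 5
J = np.random.randn(N, N) * 20.0          # same sampling scheme as above
s = np.array([0.0, 1.0, 0.0, 1.0, 0.0])  # hypothetical synaptic activations

s_in = J @ s  # s_in[i] = sum_j J[i, j] * s[j]
print(s_in.shape)
```

Here, only neurons 1 and 3 have non-zero synaptic activation, so each neuron's input is simply the sum of its coupling weights to those two neurons.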

1.1.2. Step 2: Simulation of the network dynamics

Now, let's examine the network dynamics by integrating the evolution equations over 10000 steps, using the integration step size of dt = 1e-3 as defined above.

# define network input
steps = 10000
inp = np.zeros((steps, N)) + 200.0

# perform numerical simulation
obs = net.run(inputs=inp, sampling_steps=10)
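The run call above samples the output variable every 10 integration steps, so the recording holds steps // sampling_steps = 1000 samples per neuron. How the observation object exposes its raw data depends on the rectipy version, so the sketch below operates on a plain NumPy placeholder of the expected shape to show a typical post-processing step:

```python
import numpy as np

# Placeholder for the sampled recording of the output variable s
# (shape: n_samples x N); replace with the actual data from `obs`.
steps, sampling_steps, N = 10000, 10, 5
n_samples = steps // sampling_steps
rng = np.random.default_rng(42)  # hypothetical data, for illustration only
out = np.abs(rng.standard_normal((n_samples, N)))

mean_activation = out.mean(axis=0)  # time-averaged activation per neuron
print(mean_activation.shape)
```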

We created a time series of constant input and fed it to the input variable of the RNN at each integration step. Let's have a look at the resulting network dynamics.

from matplotlib.pyplot import show, legend

obs.plot("out")
legend([f"s_{i}" for i in range(N)])
show()

As can be seen, the different LIF neurons spiked at different times, even though they received the same extrinsic input, owing to the randomly sampled coupling weights and the recurrent network dynamics they give rise to.
