
## 1. LIF Neuron Model [9 marks total]

In this exercise, you will finish the implementation of a leaky integrate-and-fire (LIF) neuron. The notebook includes an incomplete version of the class LIFNeuron. The class models the dynamics of the LIF neuron, including its membrane potential (v), post-synaptic (input) current (s), and spike times (spikes). The class includes the functions Slope and Step, which will be called by another function that implements Euler's method (Simulate, in question 2).

The class also has a function called SpikesBetween, which counts the number of times the neuron spiked between a specified start and end time.

(a) [2 marks] Complete the function Slope as specified by its documentation. It evaluates the right-hand side of the differential equations that govern the dynamics of the LIF neuron, and saves those slopes in its member variables dvdt and dsdt.
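As a rough illustration (not the marked solution), the slope computation might look like the sketch below. It assumes the common first-order LIF form dv/dt = (s − v)/τm and ds/dt = −s/τs; the documentation in the notebook is authoritative.

```python
# Hypothetical sketch of the slope computation; the actual LIFNeuron.Slope
# stores the results in self.dvdt and self.dsdt rather than returning them.
def lif_slope(v, s, tau_m=0.02, tau_s=0.1):
    dvdt = (s - v) / tau_m   # membrane potential is driven by the input current
    dsdt = -s / tau_s        # input current decays exponentially
    return dvdt, dsdt
```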

(b) [7 marks] Complete the function Step, which advances the neuron's state one time step using dvdt and dsdt. You should use Euler's method to update the membrane potential (v) and the input current (s).

When you detect a spike (the membrane potential reaches or exceeds the threshold value of 1), you should record the spike time and reset the membrane potential to zero.[^1] Most likely, the step will produce a v-value greater than 1. You should use linear interpolation to estimate the time at which v was exactly 1, and use that interpolated time as the spike time. Don't forget about the refractory period, during which the membrane potential is fixed at zero (but the input current continues to integrate the incoming spikes).

Record the updated values of v and s in the class's lists v_history and s_history.

You may assume that the time-step is shorter than Tau_ref, Tau_m, and Tau_s.
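To make the interpolation concrete, here is a hedged sketch of a single Euler step with threshold crossing. The function name and return convention are illustrative only, and the refractory clamping of v is deliberately omitted; your Step must handle it.

```python
# Hypothetical sketch of one Euler step with spike detection; the real Step
# must also clamp v to zero during the refractory period.
def euler_step(v, s, t, dt, dvdt, dsdt, v_th=1.0):
    v_new = v + dt * dvdt
    s_new = s + dt * dsdt
    spike_time = None
    if v_new >= v_th:
        # Linear interpolation: fraction of the step at which v crossed v_th.
        frac = (v_th - v) / (v_new - v)
        spike_time = t + frac * dt   # interpolated spike time
        v_new = 0.0                  # reset; the refractory clock starts here
    return v_new, s_new, spike_time
```

For example, a step that carries v from 0.9 to 1.1 crossed the threshold halfway through, so the spike time lands in the middle of the step.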

## 2. Spiking Network Model [5 marks total]

Now that you've implemented the LIF neuron model, you can build a network out of them. In this task, you will complete the class SpikingNetwork.

[^1]: Or you can set the membrane potential to 1, since it will be set to zero in the next time step.

*Figure 1: Network Models. (a) One LIF: a 30 Hz input drives a single LIF neuron with weight 0.05 (τm = 0.02, τs = 0.1, τref = 0.002). (b) Two LIF: an input firing at 30 Hz for 0–0.3 s and 0 Hz for 0.3–1 s drives LIF A with weight 0.05; A and B are mutually connected with weight 0.05 (both: τm = 0.02, τs = 0.05, τref = 0.002). (c) Three LIF: Inhibition: as in (b), plus LIF C (τm = 0.02, τs = 0.05, τref = 0.002) inhibits one of the original neurons with weight -0.2; a second input, silent for 0–0.7 s and firing at 50 Hz for 0.7–1 s, drives C with weight 0.05. (d) Ring Oscillator: eight LIF neurons A–H connected in a ring, with an input driving one of them.*

Complete the function Simulate. As its documentation explains, this function simulates the operation of the network, including all of the neurons in the network, for a prescribed amount of time. The supplied code includes a loop over time steps. For each time step, each neuron has to be updated, and spikes have to be tabulated for the next time step.
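The update-then-tabulate pattern can be sketched as follows. This is a self-contained toy, not the SpikingNetwork solution: the stub neuron only mimics the assignment's interface, and the connection representation (a list of (pre, post, weight) index triples) is an assumption.

```python
# Toy sketch of the simulation loop. ToyLIF mimics the assignment's Slope/Step
# interface; treat all names here as assumptions about the real classes.
class ToyLIF:
    def __init__(self, tau_m=0.02, tau_s=0.1):
        self.tau_m, self.tau_s = tau_m, tau_s
        self.v, self.s = 0.0, 0.0
        self.incoming = 0.0   # weighted spikes queued for the next time step
        self.spikes = []

    def Slope(self):
        self.dvdt = (self.s - self.v) / self.tau_m
        self.dsdt = -self.s / self.tau_s

    def Step(self, t, dt):
        self.s += dt * self.dsdt + self.incoming
        self.incoming = 0.0
        self.v += dt * self.dvdt
        if self.v >= 1.0:
            self.spikes.append(t)
            self.v = 0.0
            return True
        return False

def simulate(neurons, connections, T, dt=0.001):
    """connections: list of (pre_index, post_index, weight) tuples."""
    for step in range(int(round(T / dt))):
        t = step * dt
        spiked = []
        for i, nrn in enumerate(neurons):   # 1) update every neuron
            nrn.Slope()
            if nrn.Step(t, dt):
                spiked.append(i)
        for pre, post, w in connections:    # 2) tabulate spikes for next step
            if pre in spiked:
                neurons[post].incoming += w
```

The key design point is that spikes produced during a time step are delivered at the *next* step, so the update order of the neurons within a step does not matter.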

## 3. Experiments with Spiking Networks [19 marks total]

For these exercises, you will build and simulate various neural networks. To help you, I have supplied you with another type of neuron class called InputNeuron. On the outside, it looks like a LIF neuron. However, it's not a real neuron; its only job is to deliver spikes, which is useful for activating the other (real) neurons in the network.

For all these simulations, you should use a time step of 1 ms (0.001 s).

(a) LIF Firing Rate Curve

i. [3 marks] Implement the network shown in Fig. 1(a). However, alter the input neuron so that it incrementally ramps up its firing rate, starting with firing at 5 Hz for 2 s, then 10 Hz for the next 2 s, then 15 Hz for the next 2 s, etc. Continue until you reach a firing rate of 100 Hz. You will find GenerateSpikeTrain very useful for this. Connect this input neuron to the LIF neuron with a weight of 0.03. Simulate this network through the entire sequence of input firing rates. If you've set it up properly, you can do this with a single call to Simulate.
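One way to build the ramp is sketched below. Based on the example calls later in this handout, it assumes GenerateSpikeTrain takes a list of firing rates and a matching list of interval end-times; verify that against its documentation before relying on it.

```python
# Hypothetical construction of the ramped input; GenerateSpikeTrain's exact
# signature is an assumption -- check the supplied code.
rates = list(range(5, 105, 5))                       # 5, 10, ..., 100 Hz
ends = [2.0 * (k + 1) for k in range(len(rates))]    # 2 s per rate: 2, 4, ..., 40
# input_nrn = InputNeuron(GenerateSpikeTrain(rates, ends))
# then one call, e.g. net.Simulate(40.0), covers the whole 40 s sequence
```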

ii. [2 marks] For each 2-second interval, compute how many times the LIF neuron spikes. You can use LIFNeuron.SpikesBetween to help you with this.
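If you want to sanity-check the counting logic, a stand-in for SpikesBetween might behave like the following (the real method lives on LIFNeuron and uses its recorded spike times; whether it includes the end point is up to its documentation):

```python
def spikes_between(spike_times, t_start, t_end):
    """Stand-in for LIFNeuron.SpikesBetween: count spikes with
    t_start <= t < t_end."""
    return sum(t_start <= t < t_end for t in spike_times)

# Firing rate over each 2 s interval is then count / 2:
# lif_rates = [lif.SpikesBetween(2 * k, 2 * (k + 1)) / 2.0 for k in range(20)]
```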

iii. [1 mark] Make a plot of input firing rate (on the horizontal axis) vs. LIF firing rate (on the vertical axis). Just plot one dot per input firing rate, so your plot should have 20 dots on it. Don't forget to label the axes.

(b) Two LIF Neurons

i. [2 marks] Build the network with two LIFNeurons as shown in Fig. 1(b), each projecting its input to the other. That is, neuron A is connected to neuron B with a weight of 0.05, and B is connected to A with a weight of 0.05. Add an input neuron and connect it to one of the two LIF neurons with a weight of 0.05. Have the input neuron fire at 30 Hz for 0.3 seconds at the start, and then go silent. You can create that input neuron using

InputNeuron( GenerateSpikeTrain([30], [0.3]) )

This simple network is depicted in Fig. 1(b).

ii. [1 mark] Simulate the network for at least 1 second, and produce a spike-raster plot for the three neurons in the network.
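If you have not made a spike raster before, matplotlib's eventplot gives a quick one. The spike trains below are dummy data standing in for each neuron's recorded spike times:

```python
import matplotlib
matplotlib.use("Agg")          # headless backend, safe in scripts
import matplotlib.pyplot as plt

# Dummy spike trains standing in for [input, A, B]; substitute the spike
# times actually recorded by your simulation.
spike_trains = [[0.03, 0.07, 0.12], [0.05, 0.10, 0.16], [0.08, 0.14]]
plt.eventplot(spike_trains, colors="k")   # one row of tick marks per neuron
plt.yticks(range(3), ["Input", "A", "B"])
plt.xlabel("Time (s)")
plt.ylabel("Neuron")
```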

(c) Three LIF Neurons: Inhibition

i. [3 marks] Copy your network from question 3b (which included neurons A and B). Now add a third LIF neuron (C), and connect it to one of the original LIF neurons with a connection weight of -0.2; the negative weight makes this an inhibitory connection. Also add another input neuron and connect it to C with a weight of 0.05. This input neuron should be dormant for the first 0.7 s, and then fire at 50 Hz from 0.7 s until 1 s. You can create that input neuron using

InputNeuron( GenerateSpikeTrain([0, 50], [0.7, 1.]) )

This network is shown in Fig. 1(c).

ii. Simulate the network for at least 1.5 seconds, and produce a spike-raster plot of all 5 neurons in the network.

iii. [1 mark] Comparing your results to those in question 3b, what effect does the addition of neuron C and its negative connection have on the activity of neurons A and B? Put your answer in a markdown cell in the notebook.

(d) Ring Oscillator

i. [2 marks] Create a network with 8 LIF neurons; for the sake of this question, let's label them A through H. For all 8 neurons, use a Tau_m value of 50 ms and a Tau_s value of 100 ms. Connect A to B with a weight of 0.2. Likewise, connect B to C with a weight of 0.2, etc., until you finally close the loop by connecting H to A with a weight of 0.2. This is the excitatory ring.

ii. [1 mark] Now, add inhibitory connections between the same pairs of neurons, but running in the opposite direction around the ring. That is, connect B to A with a weight of -0.4, and A to H with a weight of -0.4, etc.
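Building both rings in a loop avoids 16 hand-written connection calls. The (pre, post, weight) triples below are just a convenient intermediate representation, not the SpikingNetwork API:

```python
# Build the excitatory (forward, +0.2) and inhibitory (backward, -0.4) rings.
names = list("ABCDEFGH")
connections = []
for i in range(8):
    j = (i + 1) % 8                                  # next neuron around the ring
    connections.append((names[i], names[j], 0.2))    # A -> B, ..., H -> A
    connections.append((names[j], names[i], -0.4))   # B -> A, ..., A -> H
```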

iii. [1 mark] Add an input neuron to stimulate any one of the ring neurons (with a weight of 0.2). This input neuron should have a firing rate of 25 Hz for the first 0.3 seconds, and then go dormant. This full network is shown in Fig. 1(d).

iv. [1 mark] Simulate the network for at least 4 seconds, and produce a spike-raster plot of all 9 neurons in the network. If all is working, you should see each neuron in the ring exhibit a "burst" of spikes in sequence. This excitation travels like a wave around the ring. If this is not what you are seeing, play with Tau_m, Tau_s, and the connection weights until you see this wave of activation travel around the ring.

v. [1 mark] How long does it take the wave of activity to go around the ring? Answer in a markdown cell in the notebook.
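You can read the period off the raster plot, or estimate it programmatically: detect each burst onset of one neuron (a spike following a long silent gap) and difference successive onsets. The gap threshold below is an arbitrary choice; tune it to your burst spacing.

```python
def burst_onsets(spike_times, gap=0.1):
    """Return spike times that start a new burst, i.e. spikes separated from
    the previous spike by more than `gap` seconds."""
    onsets, prev = [], None
    for t in spike_times:
        if prev is None or t - prev > gap:
            onsets.append(t)
        prev = t
    return onsets

# Period of the travelling wave = spacing between successive onsets of, say,
# neuron A:  periods = [b - a for a, b in zip(onsets, onsets[1:])]
```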

## 4. Neural Activation Functions [14 marks total]

For the plots in this question, you may use functions from Python libraries, or you can implement them yourself.

(a) [2 marks] Calculate the derivative of ReLU(z) with respect to z.

(b) [6 marks] For each of the functions below, create a plot of the function over the domain z ∈ [−6, 6]. State the values of the domain for which the derivative approaches or equals zero, the maximum value of the derivative, and the location of that maximum.

i. Standard logistic function, σ(z)

ii. Hyperbolic tangent, tanh(z)

iii. Rectified Linear Unit, ReLU(z)

Note: You may use NumPy's linspace function to generate regularly-spaced samples in the domain.
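A minimal version of these plots, assuming NumPy and matplotlib are available, might look like this:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")   # headless backend
import matplotlib.pyplot as plt

z = np.linspace(-6.0, 6.0, 241)       # regularly-spaced samples of the domain
curves = {
    "logistic": 1.0 / (1.0 + np.exp(-z)),
    "tanh": np.tanh(z),
    "ReLU": np.maximum(0.0, z),
}
for name, y in curves.items():
    plt.figure()
    plt.plot(z, y)
    plt.xlabel("z")
    plt.ylabel(name + "(z)")
```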

(c) [6 marks] Consider the expression f(wx + b), where w, x, b ∈ R, and f : R → R is a differentiable activation function. Use the chain rule to derive an expression for each of the following:

i. d/dx f(wx + b)

ii. d/dw f(wx + b)

iii. d/db f(wx + b)
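As a sanity check on your derivations, you can compare a chain-rule expression against a central finite difference; tanh here is just an example choice of f:

```python
import math

def f(z):
    return math.tanh(z)          # example differentiable activation

def fprime(z):
    return 1.0 - math.tanh(z) ** 2

w, x, b, h = 0.7, 1.3, -0.2, 1e-6
# Chain rule gives d/dx f(wx + b) = f'(wx + b) * w; check it numerically:
analytic = fprime(w * x + b) * w
numeric = (f(w * (x + h) + b) - f(w * (x - h) + b)) / (2 * h)
```

The analogous checks for d/dw and d/db perturb w and b instead of x.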

#### What to submit

Your assignment submission should be a single Jupyter notebook file, named ⟨WatIAM⟩_a1.ipynb, where ⟨WatIAM⟩ is your UW WatIAM login ID (not your student number). The notebook must include solutions to all the questions. Submit this file to Desire2Learn. You do not need to submit any of the additional files supplied for the assignment.