bionics-engineering:computational-neuroscience:lab4
Implement a discrete time version of the Hebb correlation rule by
  * Starting from a weight vector $w$ randomly initialized in $[-1,1]$
  * For each data sample in the data matrix, update the synaptic weights using $w(t+1) = w(t) + \epsilon \frac{dw}{dt}$, where $\epsilon$ is the learning rate and $\frac{dw}{dt} = v\,u$ is the Hebb correlation rule (with neuron output $v = w^\top u$)
To implement this process, at each time step, feed the neuron with an input $u$ from the data matrix. Once you have reached the last data point in matrix ''data'', shuffle (i.e. randomly reorder) the samples in ''data'' (e.g. consider using a built-in random-permutation function) and repeat the procedure until the weights converge.
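A minimal sketch of this training loop in Python/NumPy follows; the placeholder ''data'' matrix, the learning rate ''epsilon'' and the number of epochs ''n_epochs'' are illustrative assumptions, not values prescribed by the assignment.

<code python>
import numpy as np

# Sketch of the discrete-time Hebbian update. Assumed names:
# `data` is an N x 2 matrix (one sample per row), `epsilon` is the
# learning rate, `n_epochs` the number of passes over the data.
rng = np.random.default_rng(0)

data = rng.standard_normal((100, 2))    # placeholder data matrix
epsilon = 0.01                          # assumed learning rate
n_epochs = 50                           # assumed number of passes

w = rng.uniform(-1.0, 1.0, size=2)      # random init in [-1, 1]
w_history = [w.copy()]                  # track w(t) for the later plots

for epoch in range(n_epochs):
    # shuffle (randomly reorder) the samples at the start of each pass
    order = rng.permutation(len(data))
    for u in data[order]:
        v = w @ u                       # neuron output v = w^T u
        dw_dt = v * u                   # Hebb correlation rule
        w = w + epsilon * dw_dt         # w(t+1) = w(t) + eps * dw/dt
        w_history.append(w.copy())

w_history = np.array(w_history)
</code>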
After training has converged, plot a figure displaying (on the same graph) the training data points, the final weight vector $w$ resulting from the learning process and the first eigenvector of the input correlation matrix.
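One possible way to produce this plot, assuming ''data'' and the final ''w'' from the sketch above; here the first eigenvector is computed from the input correlation matrix $Q = \langle u u^\top \rangle$ estimated from the data.

<code python>
import numpy as np
import matplotlib.pyplot as plt

# Assumes `data` and the trained `w` from the training-loop sketch above.
Q = data.T @ data / len(data)                 # empirical input correlation matrix
eigvals, eigvecs = np.linalg.eigh(Q)
e1 = eigvecs[:, np.argmax(eigvals)]           # eigenvector of the largest eigenvalue

plt.figure()
plt.scatter(data[:, 0], data[:, 1], s=10, label="training data")
plt.plot([0, w[0]], [0, w[1]], "r-", linewidth=2, label="final weight vector w")
plt.plot([0, e1[0]], [0, e1[1]], "g-", linewidth=2, label="first eigenvector of Q")
plt.axis("equal")
plt.legend()
plt.show()
</code>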
Generate two figures plotting the evolution in time of the two components of the weight vector $w$ (for this you will need to keep track of the $w(t)$ evolution during training). The plot will have time on the $x$ axis and the weight value on the $y$ axis (provide a separate plot for each component of the weight vector). Also provide another plot of the evolution in time of the norm of the weight vector during learning.
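A sketch of these evolution plots, assuming the ''w_history'' array recorded in the training-loop sketch above.

<code python>
import numpy as np
import matplotlib.pyplot as plt

# Assumes `w_history` (array of shape T x 2) from the training-loop sketch.
t = np.arange(len(w_history))

for i in range(2):                            # one figure per weight component
    plt.figure()
    plt.plot(t, w_history[:, i])
    plt.xlabel("time step")
    plt.ylabel(f"w[{i}]")

plt.figure()                                  # norm of the weight vector over time
plt.plot(t, np.linalg.norm(w_history, axis=1))
plt.xlabel("time step")
plt.ylabel("||w||")
plt.show()
</code>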