Help with .record - Too much data!

Grado

Help with .record - Too much data!

Post by Grado »

I am trying to record the weight of every synapse in my model. Currently I use the code below, then write the recorded vectors to a file. The problem is that it stores the weight of every synapse at every time step, regardless of whether the weight changes or not. This causes issues when you run a simulation with 10000 synapses for 10000 ms: you get a 30 GB file of weights. Is there a way to record a weight only when it actually changes?

Some more information about the model: it is the Hahn & McIntyre model of the basal ganglia. I have added spike-timing-dependent plasticity (STDP) to each type of synapse, and would like to record the weights accordingly. Maybe the recording should be moved into the .mod file, because the weights are really only updated after a neuron spikes. Or maybe it would be best to .record only once per ms or so, but I am unsure how to modify .record to achieve that.

Thanks

// one Vector per NetCon, plus element 0 for the time base
objref weightArray[pnm.nclist.count+1]

for i = 0, pnm.nclist.count {
  weightArray[i] = new Vector()  // was: weightArray = new Vector() (missing [i])
}

proc RECweights() {
  weightArray[0].record(&t)  // element 0 records time
  for i = 1, pnm.nclist.count {
    // was: weightArray.record(...) -- the [i] index was missing
    weightArray[i].record(&pnm.nclist.o(i-1).weight)
  }
}
ted
Site Admin
Location: Yale University School of Medicine

Re: Help with .record - Too much data!

Post by ted »

I don't know the details of your STDP mechanism, but if the learning rule depends only on spike times (or on the intervals between pre- and postsynaptic spikes), all you need to record are the spike times. Given the network's connectivity, the initial weights, and the spike times, the weight trajectories can be calculated after the simulation completes. The time required to do so is proportional to the total number of spikes generated in the simulation: not the simulation duration, not the time step size, not the number of cells, synapses, or synaptic connections. Storage requirements are likewise proportional to the number of spikes. There is no need to tinker with your model cells, your synaptic mechanisms, or the standard run system, and this post-run analysis would execute much faster than the simulation that generated the recorded spike times.
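The post-run reconstruction described above can be sketched as follows. Note that the pair-based exponential STDP rule and its parameters (A_PLUS, A_MINUS, TAU) below are placeholders for illustration, not the actual learning rules in the Hahn & McIntyre model; the point is only that the replay cost scales with the number of spike events, not with simulation duration or time step.

```python
import math

# Assumed illustrative STDP parameters, not taken from the model under discussion
A_PLUS, A_MINUS, TAU = 0.01, 0.012, 20.0  # amplitudes and time constant (ms)

def replay_weight(w0, pre_spikes, post_spikes):
    """Return the weight trajectory [(t, w), ...] for one synapse,
    updating w only at spike times (additive pair-based STDP)."""
    w = w0
    trajectory = [(0.0, w)]
    # merge the two sorted spike trains, tagging each event by source
    events = sorted([(t, "pre") for t in pre_spikes] +
                    [(t, "post") for t in post_spikes])
    last_pre = last_post = None
    for t, kind in events:
        if kind == "pre":
            if last_post is not None:   # post fired before this pre: depress
                w -= A_MINUS * math.exp(-(t - last_post) / TAU)
            last_pre = t
        else:
            if last_pre is not None:    # pre fired before this post: potentiate
                w += A_PLUS * math.exp(-(t - last_pre) / TAU)
            last_post = t
        trajectory.append((t, w))
    return trajectory

# one loop iteration per spike, regardless of dt or tstop
traj = replay_weight(0.5, pre_spikes=[10.0, 30.0], post_spikes=[12.0, 28.0])
```

Each synapse's trajectory is rebuilt from its own pre/post spike trains, so the total work is proportional to the total number of recorded spikes.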
Grado

Re: Help with .record - Too much data!

Post by Grado »

I did consider that approach, but it is clumsy because the STDP rules differ for each synapse type and change from run to run. I solved my problem by using .record(&v, 1), which records at 1 ms intervals instead of at every time step (0.01 ms here). This works well for me because I don't need the exact times at which the weights change; I just need to see how they evolve over time.
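A quick back-of-envelope check of the savings, using the numbers from the thread (10000 synapses, 10000 ms, dt = 0.01 ms). The actual file size on disk also depends on the output format, but the reduction factor in stored samples is format-independent:

```python
# Sample counts for full-rate recording vs. 1 ms sampled recording.
# Numbers are taken from the posts above; dt = 0.01 ms means 100 steps per ms.
n_synapses = 10_000
t_stop_ms = 10_000
steps_per_ms = 100  # dt = 0.01 ms

samples_full = n_synapses * t_stop_ms * steps_per_ms  # record every time step
samples_1ms = n_synapses * t_stop_ms * 1              # .record(&var, 1)

reduction = samples_full // samples_1ms
print(reduction)  # -> 100
```

So sampling once per ms cuts the stored data by a factor of 100 relative to recording at every 0.01 ms step.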

Thanks for your reply