I have spent the last hour or so trying to find an old forum post relevant to my question, but can't seem to locate it.
I need to run my model for a certain amount of time before it reaches stable outputs. My simulations also generate a lot of data, so I am "chunking" them: for example, if I run a simulation for an hour, I break it into 5-minute chunks, write the data at the end of each chunk, and then resize my recording vectors using h.frecord_init(). Since I have no interest in any of the data from before the model behavior stabilizes, I have set up my code in the following way:
Code:
h.finitialize()
...
# start at a negative time so t reaches 0 when the model has stabilized
h.t = -stabilize_time
num_time_chunks = int((stabilize_time + simulation_time) / chunk_time)
for i in range(num_time_chunks):
    h.continuerun(h.t + chunk_time)
    if h.t >= 0:
        # write data
        # resize recording vectors
        h.frecord_init()
This obviously isn't my actual code; I just wanted to give you an idea of how it's structured. When I use a fixed time step this works perfectly. However, when I use cvode it breaks. For example, I was running a simple test where
Code:
stabilize_time = 1000
simulation_time = 1000
chunk_time = 250
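To make the chunking arithmetic concrete, here is a small plain-Python sketch (no NEURON required; the variable names mirror the snippet above, and chunk_boundaries is a hypothetical helper of mine, not part of my code) showing the chunk intervals this scheme produces with those test values:

```python
def chunk_boundaries(stabilize_time, simulation_time, chunk_time):
    """Return (start, stop) times for each chunk, starting at -stabilize_time."""
    num_time_chunks = int((stabilize_time + simulation_time) / chunk_time)
    t = -stabilize_time
    bounds = []
    for _ in range(num_time_chunks):
        bounds.append((t, t + chunk_time))
        t += chunk_time
    return bounds

# With stabilize_time=1000, simulation_time=1000, chunk_time=250 there are
# 8 chunks from -1000 to 1000; data is only written for chunks ending at t >= 0.
print(chunk_boundaries(1000, 1000, 250))
```

So the first four chunks cover the stabilization interval [-1000, 0] and are discarded, and the last four cover the recorded interval [0, 1000].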