Segmentation fault on set_maxstep()
Posted: Mon Mar 21, 2011 12:20 pm
Hi all.
We are trying to convert an older model to run in parallel. The network consists of 4 neurons connected to each other. The connections from the source cells to the target cells are made by
1) assigning a gid to each cell via ParallelContext.set_gid2node(gid, ParallelContext.id)
2) connecting the source cell to a nil-connected netcon
3) calling ParallelContext.cell(source-cell-gid, netcon, 1)
4) on the machine containing the target neuron, calling ParallelContext.gid_connect(source-cell-gid, ampa), where ampa is a POINT_PROCESS
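In hoc, the four steps above look roughly like the following sketch. The gid values, section name (soma), and the synapse object name (ampa) are placeholders, not taken from our actual code:

```hoc
objref pc, nc, nil, ampa
pc = new ParallelContext()

// 1) register the gid on this rank
pc.set_gid2node(gid, pc.id)

// 2-3) associate the source cell's spike output with its gid,
//      via a NetCon with a nil target
if (pc.gid_exists(gid)) {
    soma nc = new NetCon(&v(0.5), nil)
    pc.cell(gid, nc, 1)
}

// 4) on the rank that owns the target point process,
//    connect the source gid to it
if (target_exists_here) {
    objref conn
    conn = pc.gid_connect(src_gid, ampa)
    conn.weight = w
    conn.delay = d
}
```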
I would like to ask a few questions:
1) stdinit() does not exist in NEURON version 7.1, which we use; what is the correct function to initialize the simulation? (We are currently calling finitialize().)
2) The model crashes with a segmentation fault at ParallelContext[0].set_maxstep(10) when run in parallel. I have found that it works if I remove all the gid_connect() calls, but nothing beyond that. Also, if I don't initialize the model with finitialize(), the call to set_maxstep() hangs forever. Could this be related to the fact that the gid_connect() targets are POINT_PROCESSes? I have checked that the point processes actually exist as targets on the machine at the time they are gid_connect()ed.
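For reference, the sequence of calls we are using after network setup is roughly the following (tstop is a placeholder):

```hoc
// order of calls as described above
finitialize(-65)     // without this call, set_maxstep() hangs
pc.set_maxstep(10)   // segmentation fault occurs here
pc.psolve(tstop)
```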
I would appreciate any insight about why this is happening.
Thanks