Limit to network size?

A Python package that facilitates development and use of models of biological neural networks

Moderator: tom_morse


Limit to network size?

Post by bll5z6 »

Hey all,

I've been having a great experience with NetPyNE! Thanks again for developing it. I was wondering, is there a limit to the size of the network one can create in this framework? We want to scale up our models to tens of thousands of cells with complex connectivity... do you see any bottleneck with doing that using NetPyNE, as long as we have the computing resources? Thanks!


Re: Limit to network size?

Post by salvadord »

Hi Ben,

Thanks for the encouragement, great to hear NetPyNE is useful. I am currently running simulations of a model with ~10k cells, where ~2k have 700+ compartments and the rest ~2-6 compartments; 1 sec of simulation runs in ~1 h using 48 cores. There is no limitation per se on network size in NEURON or NetPyNE. Note that NEURON is used for large-scale models such as the HBP models, or hippocampus models with >200k neurons.

In NetPyNE there are a couple of tricks that can help save disk space (and time) when saving the output of large networks (depending on your requirements):

cfg.saveCellSecs = False - do not save section information for each instantiated cell
cfg.saveCellConns = False - do not save connections
cfg.includeParamsLabel = False - don't include the labels of high-level params
cfg.gatherOnlySimData = True - this excludes saving netParams, simConfig, and any info on the network itself; just saves the simulation output

Also, for large simulations on many cores, make sure you set cfg.cache_efficient = True
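To make this concrete, here is a minimal sketch of how these settings might look in a simulation config. The output filename and save format are hypothetical placeholders; the option names are the ones discussed above:

```python
from netpyne import specs

cfg = specs.SimConfig()

# Trim what gets gathered and written to disk for large networks
cfg.saveCellSecs = False        # don't save per-cell section info
cfg.saveCellConns = False       # don't save the connection lists
cfg.includeParamsLabel = False  # drop labels of high-level params from output
cfg.gatherOnlySimData = True    # save only simulation output, not the network itself

# Speed up large multi-core runs
cfg.cache_efficient = True

# Hypothetical output settings for illustration
cfg.filename = 'large_net'
cfg.savePickle = True
```

With these options the saved file contains essentially just the recorded simulation data, which is what keeps file size (and gather time) manageable for networks with many cells and connections.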

Hope this helps and good luck with your larger simulations, let us know if you need any further advice.