Big models and big data workshop at SFN 2019

News about new releases of NEURON, bug fixes, development threads, courses/conferences/workshops, meetings of the NEURON Users' Group etc.

Post by ted

NSG and HPAC – Large Scale Simulations and Data Processing

A workshop on big models and big data in neuroscience will be held Saturday, Oct. 19, 2019, from 8:30 AM to 12:30 PM at a downtown Chicago location.
This workshop is intended for neuroscientists who need to use high performance computing (HPC) resources for computationally intensive tasks such as large modeling projects or analysis of neuroimaging data. It will examine resources offered by the Neuroscience Gateway Portal (NSG) and the Human Brain Project's High Performance Analytics and Computing Platform (HPAC). The workshop will combine hands-on instruction on how to use NSG, didactic presentations by NSG and HPAC developers, and discussions with experienced users of these resources.

The registration deadline for this workshop is Friday, October 10, but you should sign up early because space is limited. See ... g2019.html for more information and a link to the registration form.

Overview of NSG and HPAC

NSG eliminates most administrative and technical barriers, providing users with free CPU time and easy access to widely used software that currently includes BluePyOpt, the Brain Modeling Toolkit (BMTK), Brian, CARLsim, DynaSim, EEGLAB, FreeSurfer, Human Neocortical Neurosolver (HNN), Large Scale Neural Modeling Simulator (LSNM), MATLAB, MOOSE, NEST, NetPyNE, NEURON, Octave, PGENESIS, PyNN, Python, R, TensorFlow, and the Virtual Personalized Multimodal Connection Pipeline. NSG's web-based interface simplifies the tasks of uploading models or data, specifying job parameters, monitoring job status, and storing and retrieving output data.

HPAC provides extensive HPC support for the HBP community. This includes developing and providing the hardware and software infrastructure required for large-scale simulations, data management and analysis, and visualization, as well as managing the complex workflows involved in performing these tasks.