
Error in Neuron Book Example?

Posted: Wed Nov 18, 2009 4:38 pm
by witchcraft
Dear Neuroners,

It is very unlikely that there is an error on pp. 139-141 of The NEURON Book, but here is what I am seeing. I am following my "paranoia setting" of trying all the examples in the book (okay, nearly all). When I try the code below (pages 139-141):

Code:


/////////////////////////////////////
/////////////////////////////////////
		/* Model Specs */
/////////////////////////////////////
/////////////////////////////////////


/////////// Topology ////////////////



create soma, apical, basilar, axon
connect apical(0), soma(1)
connect basilar(0), soma(0)
connect axon(0), soma(0)

/////////// Geometry ////////////////

soma		{

	L = 30
	diam = 30
	nseg = 1
	
	}
	
apical		{

	L = 600
	diam = 1
	nseg = 23
	
	}
	
basilar		{

	L = 200
	diam = 2
	nseg = 5
	
	}
	
axon		{

	L = 1000
	diam = 1
	nseg = 37
	
	}
	

/////////// Biophysics ////////////////


forall		{

	Ra = 100
	cm = 1
	
	}
	
soma		{

	insert hh
	
	}
	
apical		{

	insert pas
	g_pas = 0.0002
	e_pas = -65
	
	}
	
basilar		{

	insert pas
	g_pas = 0.0002
	e_pas = -65
	
	}
	
axon		{

	insert hh
	
	}
	
	
	
/////////////////////////////////////
/////////////////////////////////////
		/*Instrumentation*/
/////////////////////////////////////
/////////////////////////////////////

/////////// Synaptic Input /////////


objref syn
soma syn = new AlphaSynapse(0.5)
syn. onset = 0.5
syn. tau = 0.1
syn. gmax = 0.05
syn. e = 0


/////////// Graphical Disp ////////////////


objref g
g = new Graph()
g.size(0.5, -80, 40)
g.addvar("soma.v (0.5)", 1, 1, 0.6, 0.9, 2)





/////////////////////////////////////
/////////////////////////////////////
	  /*Simulation Control*/
/////////////////////////////////////
/////////////////////////////////////


dt = 0.025
tstop = 5
v_init = -65

proc initialize()	{

	finitialize (v_init)
	fcurrent()
	
	}
	
proc integrate()	{

	g.begin()
	while (t<stop)	{
	fadvance ()
	g.plot (t)
	
	}

	g.flush
	}
	
proc go()	{

	initialize ()
	integrate ()
	
	}
	
I get this response:

Code:

-e 
NEURON -- Release 7.1 (359:7f113b76a94b) 2009-10-26
Duke, Yale, and the BlueBrain Project -- Copyright 1984-2008
See http://www.neuron.yale.edu/credits.html

/Applications/Neuron-7.1/nrn/i686/bin/nrniv.app/Contents/MacOS/nrniv: size not enough arguments
 in neuromus.hoc near line 114
 g.size(0.5, -80, 40)
                     ^
        Graph[0].size(0.5, -80, 40)
oc>go()
/Applications/Neuron-7.1/nrn/i686/bin/nrniv.app/Contents/MacOS/nrniv: go undefined function
 near line 1
 go()
     ^
        go()
oc>
I have checked the code as much as I can, and I am quite certain I am missing something. Any pointers are appreciated.

With Kind Regards,

Witchcraft.

Re: Error in Neuron Book Example?

Posted: Wed Nov 18, 2009 10:25 pm
by ted

Code:

nrniv: size not enough arguments
in neuromus.hoc near line 114
g.size(0.5, -80, 40)
                     ^
        Graph[0].size(0.5, -80, 40)
means the interpreter found a parse error. Parse errors stop the parser immediately. Nothing past this point is examined, so proc go() doesn't yet exist as far as NEURON is concerned.

The good news about parse errors is that the interpreter tells you where the error is. This would be a good time to read section 12.3.2, Error handling, in The NEURON Book, review the documentation of the Graph class's size() method, and take a fresh look at the source code in the book, comparing it with the statement that gagged the interpreter.
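The "go undefined function" message follows from the same mechanism. Here is a minimal sketch (a hypothetical file, not from the book) of how an error partway through a hoc file keeps everything after it from ever being defined:

```hoc
// bad.hoc -- hypothetical file, for illustration only
x = sqrt(-1, 2)   // error here: sqrt() takes one argument, so loading stops
proc later() {    // never read, so later() does not exist
	print "this line is never parsed"
}
```

Loading this file reports the error at the first line; typing later() at the oc> prompt then produces "later undefined function", exactly analogous to calling go() in the session above.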

And marvel at the brain's ability to perceive something that it "knows is there" even when that something really isn't: one aspect of being human that is a common hindrance to debugging. If Alexander Pope had been a programmer, might he have written "to err is a feature, not a bug"?

Re: Error in Neuron Book Example?

Posted: Sun Nov 22, 2009 5:03 pm
by Bill Connelly
Just in case you still can't find the problem (because I think we've all been there), compare what you have

Code:

g.size(0.5, -80, 40)
to what the book has, very closely.

Hint: . and , are different things.
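For anyone still stuck after comparing: Graph's size() method takes four arguments, size(xstart, xstop, ystart, ystop), so, assuming the book's line spans 0 to 5 ms on the x axis, the period in the transcription above should be a comma:

```hoc
// as transcribed:  g.size(0.5, -80, 40)   // three args -> "size not enough arguments"
// as in the book:
g.size(0, 5, -80, 40)   // x axis 0 to 5 ms, y axis -80 to 40 mV
```

While comparing against the book, note that the posted integrate() also contains `while (t<stop)`, which presumably should be `while (t < tstop)`, and `g.flush`, which presumably should be `g.flush()`.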