
How to run simple test model with neuron? #2

Open
pramodk opened this issue Jun 27, 2020 · 15 comments

Comments

@pramodk
Contributor

pramodk commented Jun 27, 2020

Hello @iraikov, @pmoolchand,

It would be great if you could provide instructions to run a small test with NEURON. I can then quickly test it with CoreNEURON as well.

In the past we used the reduced_dentate repository, which had a run.hoc. I see that jobscripts/ has a number of examples, but I'm not sure which one to use. It would be great if you could tell me how to run this new version with a small input (i.e. on a few cores on a laptop, for a quick functional test).

Thanks!

@pmoolchand
Contributor

Hi @pramodk,
@iraikov will upload some jobscripts shortly.

@pramodk Is the extracellular mechanism now supported with CoreNEURON? And do we have a new method for computing LFP?

Thanks!

@pramodk
Contributor Author

pramodk commented Jun 27, 2020

@iraikov will upload some jobscripts shortly.

Ok great!

@pramodk Is the extracellular mechanism now supported with CoreNEURON? And do we have a new method for computing LFP?

Not directly today. We also need to calculate LFP, but we do this externally, i.e. we record the necessary variables in CoreNEURON and use another tool to calculate the LFP. Recently we also started a discussion about calculating LFP online in CoreNEURON. If you could open an issue in the CoreNEURON repo with these questions, we can discuss with Michael and see how it can be done.
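The external workflow described here (record membrane currents during the run, compute the LFP afterwards with a separate tool) can be sketched with a generic point-source approximation. This is only an illustration, not the tool the authors actually use; the function name, units handling, and default conductivity are assumptions.

```python
import math

def lfp_point_source(currents, positions, electrode, sigma=0.3):
    """Approximate the LFP at an electrode from recorded membrane currents.

    currents  : list of per-compartment current time series (one list per compartment)
    positions : list of (x, y, z) compartment centers
    electrode : (x, y, z) electrode position
    sigma     : extracellular conductivity; 0.3 S/m is a commonly used value

    Returns one LFP sample per time step. Each compartment contributes
    I / (4 * pi * sigma * r), the point-source term, where r is the
    compartment-to-electrode distance.
    """
    n_steps = len(currents[0])
    lfp = [0.0] * n_steps
    ex, ey, ez = electrode
    for series, (x, y, z) in zip(currents, positions):
        r = math.sqrt((x - ex) ** 2 + (y - ey) ** 2 + (z - ez) ** 2)
        r = max(r, 1.0)  # clip tiny distances to avoid the 1/r singularity
        scale = 1.0 / (4.0 * math.pi * sigma * r)
        for t, i_m in enumerate(series):
            lfp[t] += scale * i_m
    return lfp
```

In practice the recorded variables would come out of the simulator's reports and the unit conversions would need care; the structure of the calculation is the same.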

@iraikov
Member

iraikov commented Jun 29, 2020

Hi @pramodk

Thanks a lot for your help. I have created a test configuration for a reduced-size network model (config/Test_Slice_10um.yaml) and corresponding job script ( jobscripts/comet_nrnpy_Test_Slice_10um.sh ) that I used for testing on the SDSC Comet cluster, which should be adaptable for use on other systems. The model network's connectivity graph, morphological data, and input spike patterns are loaded from an HDF5 file. What is the best way to provide you with the file?

@pramodk
Contributor Author

pramodk commented Jul 5, 2020

Thank you @iraikov for the script. I will test these as soon as I have a bit of free time from the deadlines that I am currently chasing.

The model network's connectivity graph, morphological data, and input spike patterns are loaded from an HDF5 file. What is the best way to provide you with the file?

How large are they? What do you typically use? Anything is fine for me (gdrive/dropbox). You have my EPFL email address; just send me the link and I will download the files.

@pramodk
Contributor Author

pramodk commented Jul 10, 2020

@iraikov : I would like to run the above-mentioned tests this weekend if possible. Let me know how I can get the input files.

@iraikov
Member

iraikov commented Jul 10, 2020 via email

@pramodk
Contributor Author

pramodk commented Jul 11, 2020

No problem! Thanks for the access!

I am able to compile neuroh5 v0.0.4. I assume I have to follow the instructions in the Installation section to build everything and run: https://github.com/soltesz-lab/dentate#installation

I will continue on this tomorrow and will let you know if I see any issues.

@iraikov
Member

iraikov commented Jul 13, 2020

Hi @pramodk, did you get a chance to try the reduced model? Thanks

@pramodk
Contributor Author

pramodk commented Jul 15, 2020

Hello @iraikov !

I started working on it but couldn't make much progress, as I got distracted by CNS workshop preparation. Most likely I will be busy until the end of CNS, i.e. 22nd July.

Once I get time to run this, I will update you here.

Thanks!

Edit: I meant to say 22nd July

@pramodk
Contributor Author

pramodk commented Jul 16, 2020

@iraikov : I will only get time to look at this next week. But if you already have everything set up and have time, you could also test the model locally by following the examples.

Typical steps are:

  • Install NEURON with CoreNEURON enabled, i.e.

```
# preferably with the Intel compiler
cmake .. -DCMAKE_INSTALL_PREFIX=$HOME/install -DNRN_ENABLE_CORENEURON=ON -DPYTHON_EXECUTABLE=`which python3`  # other options, e.g. -DNRN_ENABLE_RX3D=OFF -DNRN_ENABLE_INTERVIEWS=OFF
make -j8
make install
```

  • Compile the mod files for NEURON as well as for CoreNEURON

```
export PYTHONPATH=$HOME/install/lib/python:$PYTHONPATH
export PATH=$HOME/install/bin:$PATH

nrnivmodl mechanisms
nrnivmodl-core mechanisms
```

I believe all mod files are already compatible with CoreNEURON (basically Random123 and POINTER variables were updated).

We can tune performance aspects once everything is verified.

If there are any questions, I am happy to have a quick call.

@iraikov
Member

iraikov commented Jul 16, 2020

Hi @pramodk, thanks. I did see the CoreNEURON direct-memory example, and I tried it in a single-rank job with our network code. It seemed to work, but it is a bit hard to verify whether the spike output is correct, as our entire workflow is predicated on the ability to save model output to HDF5; hence the issue I submitted for CoreNEURON. Once we are able to obtain the output spike trains from CoreNEURON via a memory interface, we should be able to fully incorporate it into our workflow.
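Whatever interface eventually delivers the spikes, the verification step itself is simple: once both simulators produce (time, gid) pairs, the outputs can be compared order-independently within a tolerance. A minimal sketch, independent of how the spikes were obtained (function name and tolerance are illustrative):

```python
def spikes_match(spikes_a, spikes_b, dt_tol=1e-6):
    """Compare two spike outputs given as iterables of (time, gid) pairs.

    Spikes are grouped by gid and compared in chronological order, so the
    two simulators may deliver them in different global orders.
    """
    def by_gid(spikes):
        out = {}
        for t, gid in spikes:
            out.setdefault(gid, []).append(t)
        for times in out.values():
            times.sort()
        return out

    a, b = by_gid(spikes_a), by_gid(spikes_b)
    if a.keys() != b.keys():
        return False  # a cell spiked in one run but not the other
    for gid in a:
        if len(a[gid]) != len(b[gid]):
            return False  # different spike counts for this cell
        if any(abs(ta - tb) > dt_tol for ta, tb in zip(a[gid], b[gid])):
            return False  # a spike time drifted beyond tolerance
    return True
```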

@pramodk
Contributor Author

pramodk commented Jul 16, 2020

Ok great!

Once we are able to obtain the output spike trains from CoreNEURON via a memory interface, we should be able to fully incorporate it in our workflow.

Ok, understood. I was thinking of just reading the generated out.dat file. But if you are doing parameter instantiations, then I agree that is not convenient.
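For the single-run validation case, reading out.dat is straightforward. A small sketch, assuming each non-empty line holds a spike time followed by the gid of the spiking cell (the layout out.dat typically uses; the function name is illustrative):

```python
def parse_out_dat(lines):
    """Parse spike-output lines of the form "<time> <gid>" into a
    chronologically sorted list of (time, gid) tuples."""
    spikes = []
    for line in lines:
        fields = line.split()
        if len(fields) < 2:
            continue  # skip blank or malformed lines
        spikes.append((float(fields[0]), int(fields[1])))
    spikes.sort()  # merge per-rank output into chronological order
    return spikes

# usage:
#   with open("out.dat") as f:
#       spikes = parse_out_dat(f)
```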

@iraikov
Member

iraikov commented Jul 16, 2020

Hi @pramodk, our routines for reading and visualizing the spike trains rely on metadata about ranges of population gids that is stored as tables in HDF5, as well as particular layouts of the spike time and gid vectors, so it would take some effort to adapt them to reading out.dat. This is definitely doable for the purpose of validating a single simulation run, but not so great for more complex workflows.
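To make the gid-range metadata concrete: the HDF5 tables described above can be modeled as a mapping from population name to a (start_gid, count) pair, and resolving a gid to its population is then a range lookup. This is a hypothetical stand-in layout for illustration only, not the actual neuroh5 schema:

```python
def population_of(gid, ranges):
    """Map a gid to its population name.

    ranges: dict mapping population name -> (start_gid, count),
    a hypothetical in-memory stand-in for the HDF5 gid-range tables.
    """
    for name, (start, count) in ranges.items():
        if start <= gid < start + count:
            return name
    raise KeyError(f"gid {gid} not in any population range")
```

With such a lookup, raw (time, gid) spike pairs from any source can be grouped by population before visualization.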

@pramodk
Contributor Author

pramodk commented Aug 13, 2020

@iraikov : are you able to run the model using CoreNEURON with the same results as NEURON?

@iraikov
Member

iraikov commented Aug 13, 2020

Hi @pramodk : I was able to verify for a simulation with a single biophysical cell that receives input from multiple vecstims. I should be able to test the reduced network model today.

However, I have another use case that would be very useful for us: our optimization code is structured in such a way that a biophysical cell being optimized is instantiated once but simulated multiple times, each time with different parameter values for the synaptic or ion channel mechanisms. Before each invocation of psolve, the code invokes finitialize, which is sufficient for standard NEURON. But in the case of CoreNEURON, after the first time psolve is called, subsequent calls in the same context cause either an immediate return or a segmentation fault. I have tried invoking psolve twice in the CoreNEURON test_spikes.py script, and the second invocation causes a segmentation fault.

So I'm wondering if you have encountered a similar use case in your work, and if you have any suggestions on how to reinitialize the simulation before each call to psolve, preferably without deleting all cells and netcons and instantiating them again, which might be too expensive in our case. Thank you very much for your help!
