
Using a GPU environment (conda) for TensorFlow on the DEVTOP server


Instructions for using TF on the DEVTOP server.

Log in to the DEVTOP server

Accessing/sharing your files

To access or share your files inside the environment, place them in the /data directory of any server in the manycore cluster.
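For example, assuming a per-user subdirectory under /data (the exact layout may differ on your cluster):

$ mkdir -p /data/$USER/my_project
$ cp -r ~/my_project/* /data/$USER/my_project/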

Export environment variables:

Add the following snippet to your bash config file (~/.bashrc):

# PATH: CUDA, Miniconda, and user binaries
export PATH="$PATH:/usr/local/cuda/bin"
export PATH="$PATH:/usr/local/cuda/lib64"
export PATH="$PATH:/opt/tools/miniconda3/bin"
export PATH="$PATH:$HOME/bin"

# LD_LIBRARY_PATH: CUDA and system libraries
export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:/usr/local/cuda/lib64"
export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:/usr/local/cuda/extras/CUPTI/lib64"
export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:/usr/lib/x86_64-linux-gnu"

After that, reload your shell configuration:

source ~/.bashrc
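To verify the new paths are picked up, you can check that the CUDA toolchain and conda resolve correctly (the versions shown will depend on what is installed on DEVTOP):

$ which nvcc conda
$ nvcc --version
$ conda --version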

Creating a conda environment:

$ conda create -n your_env_name
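Since the TensorFlow wheel used below is built for CPython 3.6 (cp36), it is safer to pin the Python version when creating the environment; the environment name here is just an example:

$ conda create -n tf_env python=3.6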

Activating your environment:

$ source activate your_env_name

Installing the optimized TensorFlow build inside your environment:

$ pip install /opt/tools/tensorflow_pkg/tensorflow-1.11.0-cp36-cp36m-linux_x86_64.whl --user
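As a quick sanity check that the build imports and sees the GPU, you can run the following inside the activated environment (tf.test.is_gpu_available() is the TensorFlow 1.x API):

$ python -c "import tensorflow as tf; print(tf.__version__); print(tf.test.is_gpu_available())"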

Installing other packages inside your environment:

$ conda install package_name

Second option, using pip:

$ pip install package_name

For example:

$ conda install keras

$ conda install jupyter notebook
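If you want to use Jupyter on DEVTOP from your own machine, one common pattern is to start the notebook server without a browser and tunnel its port over SSH; the port number and hostname below are only examples.

On DEVTOP:

$ jupyter notebook --no-browser --port=8888

On your local machine:

$ ssh -N -L 8888:localhost:8888 your_user@devtop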

Testing TensorFlow and Keras with a sample ML application:

$ curl -sSL https://github.com/SPRACE/calo-simulation/raw/master/python/mnist_mlp.py | python3

IMPORTANT: Don't forget to install Keras in your environment before running this test.

To exit the environment:

$ source deactivate
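If you no longer need an environment, you can list and remove environments with conda (removal permanently deletes the environment and its packages):

$ conda env list
$ conda env remove -n your_env_name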