Getting started with Jupyter Notebooks

From DeepSense Docs

Latest revision as of 11:36, 18 December 2020



1. Request an interactive session on a GPU compute node

bsub -Is -gpu - bash
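If you need to be explicit about GPU resources, LSF's -gpu option also accepts a resource-requirement string instead of the bare "-". A minimal sketch, assuming standard LSF 10.x syntax; which modes this cluster actually permits is an assumption:

```shell
# Request an interactive bash session with one GPU in exclusive-process mode.
# "num=1:mode=exclusive_process" is standard LSF -gpu syntax; whether this
# cluster's queues allow it is an assumption -- check with your administrators.
bsub -Is -gpu "num=1:mode=exclusive_process" bash
```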

2. Start a python2 Jupyter notebook

Start the notebook

jupyter notebook --no-browser --ip=0.0.0.0
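Jupyter picks the first free port starting at 8888, so the port in the printed URL can vary. You can pin it explicitly so the later port-forwarding step is predictable; a sketch using standard Jupyter flags:

```shell
# --port pins the listening port (8888 is Jupyter's default); --no-browser
# and --ip=0.0.0.0 are the same flags used above. If the port is busy,
# Jupyter reports an error instead of silently moving to another port.
jupyter notebook --no-browser --ip=0.0.0.0 --port=8888
```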

Sample output

[I 13:32:23.937 NotebookApp] Use Control-C to stop this server and shut down all kernels (twice to skip confirmation).
[C 13:32:23.937 NotebookApp] 
    
    Copy/paste this URL into your browser when you connect for the first time,
    to login with a token:
        http://ds-cmgpu-04:8888/?token=68042f40a10b500f3747ae0a232ee209fa4bf1aa384d29ba

Copy the URL, host, and port

Copy the URL but don’t paste it in your browser yet.

Make a note of which compute host and port the notebook is running on (e.g. host ds-cmgpu-04 and port 8888 in this case).

3. Port Forwarding

In a separate terminal window on your local computer, forward your local port to the remote host.

ssh command port forwarding

ssh -l <username> login1.deepsense.ca -L <local_port>:<remote_host>:<remote_port>

For example: ssh -l user1 login1.deepsense.ca -L 8888:ds-cmgpu-04:8888

Note that you may need to use a different <local_port> than 8888 if you have other web services running on your local computer. In particular, if you run a Jupyter notebook locally, it will use port 8888 and you will end up connecting to the local notebook instead of the cluster notebook. In this case, close your port forwarding and try again with 8889 or another unused port.
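Putting that together, a hedged example of forwarding an alternate local port (user1 and ds-cmgpu-04 are the placeholder username and compute host from the example above; substitute your own):

```shell
# Local port 8889 avoids a clash with a locally running Jupyter on 8888;
# the remote side stays 8888, where the cluster notebook is listening.
ssh -l user1 login1.deepsense.ca -L 8889:ds-cmgpu-04:8888
```

With this forwarding in place, you would browse to http://localhost:8889/ (not 8888) in step 4.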

PuTTY port forwarding on Windows

If you are using a PuTTY terminal from a Windows computer to access DeepSense, you can still forward ports.

Before starting your session, scroll down to the option Connection->SSH->Tunnels in the Category pane.

Enter the <local_port> in the Source port field. For example, 8888.

Enter <remote_host>:<remote_port> in the Destination field. For example, ds-cmgpu-04:8888.

Press the Add button to add the port forwarding rule to your PuTTY session.

Finally, open the session as usual.

4. Open the desired sample notebook

Enter the copied URL in your web browser but change the remote host name to “localhost” before pressing enter.

e.g. http://localhost:8888/?token=68042f40a10b500f3747ae0a232ee209fa4bf1aa384d29ba
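The host substitution can also be scripted; a small sketch that rewrites the copied URL with sed (the token is the sample one from above, and a local port of 8888 is assumed):

```shell
# Swap the compute-host:port part of the copied URL for localhost:8888.
# Pure string manipulation, so it runs anywhere.
URL='http://ds-cmgpu-04:8888/?token=68042f40a10b500f3747ae0a232ee209fa4bf1aa384d29ba'
echo "$URL" | sed 's|^http://[^/]*|http://localhost:8888|'
# prints http://localhost:8888/?token=68042f40a10b500f3747ae0a232ee209fa4bf1aa384d29ba
```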

Note: On our Macs, this worked in Chrome but not in Safari. Unfortunately, no error was reported; the browser simply could not connect.

5. How to submit Jupyter Notebook jobs in batch mode

Users usually submit Jupyter Notebook jobs in interactive mode using LSF: the scheduler allocates an interactive session in which the user opens Jupyter Notebook to edit, compile, and execute scripts. Jupyter Notebook makes it convenient to monitor a script's execution for further debugging. However, the user must manually exit the interactive session once the job on the compute node is done. If a user forgets to exit, the session stays open indefinitely and the computing resources remain held by that user.

Therefore, we introduce a way to submit Jupyter Notebook jobs in batch mode. First, understand when batch mode is appropriate: if you are still editing, debugging, or testing your notebook, use interactive mode. Once editing, debugging, and testing are finished, submit the job in batch mode.

For example, suppose a job needs to run for a few days after the user changes some hyperparameters, e.g. the learning rate. The user may not want to submit an interactive job and keep the Jupyter Notebook open the whole time the job is running on a compute node; they just want the compute node to do everything without having to worry about an open Jupyter Notebook session.

The user can submit the job using the following command:

(py36-torch-env-install) [luy@ds-lg-01 ~]$ bsub -gpu - jupyter nbconvert --to notebook --execute ./gpu.conda3.py36.ipynb --output gpu.conda3.py36.output.2.ipynb
Job <5743> is submitted to queue <gpu>.

You will not be dropped into a session on a compute node as you would be with an interactive job. Instead, you get a message confirming that the job was submitted successfully, along with its job ID (5743 in the example above).

In the above command, the script is "gpu.conda3.py36.ipynb", a Jupyter Notebook file. The parameter "--output gpu.conda3.py36.output.2.ipynb" directs the output of the executed notebook to a new notebook file named "gpu.conda3.py36.output.2.ipynb". After the job finishes, that output file contains all the code from your original notebook together with the corresponding output of each cell.

Remember to activate your Anaconda environment before submitting the job.
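The full batch workflow above can be sketched as follows (the environment name py36-torch-env-install comes from the example prompt shown earlier; substitute your own environment and notebook names):

```shell
# 1. Activate the conda environment that provides jupyter/nbconvert.
conda activate py36-torch-env-install

# 2. Submit the notebook for batch execution on a GPU node; the executed
#    copy, cell outputs included, is written to the --output notebook.
bsub -gpu - jupyter nbconvert --to notebook --execute ./gpu.conda3.py36.ipynb \
    --output gpu.conda3.py36.output.2.ipynb
```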

After the job is submitted, you can check its status using:

bjobs -l -gpu 5743

Enjoy Deep Learning on DeepSense!