Hosting TensorBoard in Notebooks (AI Development Platform)
Hosting TensorBoard from JupyterLab Notebook Instance
TensorBoard is a visualization tool for TensorFlow and PyTorch experiments. When using AWS SageMaker or the AI Development Platform, TensorBoard can be hosted directly from your JupyterLab notebook instance and accessed securely via the built-in proxy server.
Prerequisites
- Access to a running JupyterLab environment in SageMaker Studio or the AI Development Platform.
- TensorFlow or PyTorch training logs saved to a local directory (e.g., ./logs/).
- AWS CLI setup completed (see Accessing CLI in Notebooks AI Development Platform).
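Before starting the server, it can help to confirm the log directory actually contains event files. The helper below is a minimal sketch using only the Python standard library; the function name `has_event_files` is illustrative, not part of the platform.

```python
import glob
import os

# Hypothetical helper (not part of the platform): confirm a log
# directory contains TensorBoard event files before starting the server.
def has_event_files(logdir: str) -> bool:
    """Return True if logdir holds at least one tfevents file, searching recursively."""
    pattern = os.path.join(logdir, "**", "*tfevents*")
    return len(glob.glob(pattern, recursive=True)) > 0

# Example usage in a notebook cell:
# has_event_files("./logs/")
```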
Step 1: Start TensorBoard Server
In a notebook cell or terminal, navigate to your log directory and run:
tensorboard --logdir=./logs/ --port=6006 --host=0.0.0.0
This command starts TensorBoard on port 6006 and binds it to all network interfaces so it can be accessed via the Jupyter proxy.
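If you prefer to launch TensorBoard from a notebook cell rather than a terminal, the same command can be assembled and started with the standard library. This is a sketch; the log directory and port values are the same illustrative ones used above.

```python
import shlex

# Sketch: assemble the TensorBoard command programmatically so the
# log directory and port are easy to change (values are illustrative).
logdir = "./logs/"
port = 6006

cmd = f"tensorboard --logdir={logdir} --port={port} --host=0.0.0.0"
args = shlex.split(cmd)

# To launch in the background from a notebook cell:
#   import subprocess
#   proc = subprocess.Popen(args)   # keep `proc` to terminate it later
```

Binding to 0.0.0.0 is what lets the Jupyter proxy reach the server, as described above.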
Step 2: Access TensorBoard via Proxy
Once the TensorBoard server is running, open the following URL pattern in your browser (from within the same JupyterLab environment):
https://<your-notebook-domain>/proxy/6006/
Example:
https://studiolab.sagemaker.aws/proxy/6006/
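The proxy URL above can also be built programmatically, which avoids typos in the path. A minimal sketch, assuming your notebook domain is passed in as `domain`; the function name is hypothetical.

```python
from urllib.parse import urlunsplit

# Sketch: build the proxy URL for a given notebook domain and port.
# `domain` is whatever host your JupyterLab environment is served from.
def tensorboard_proxy_url(domain: str, port: int = 6006) -> str:
    # The trailing slash matters: without it, TensorBoard's relative
    # asset paths may not resolve correctly behind the proxy.
    return urlunsplit(("https", domain, f"/proxy/{port}/", "", ""))

# Example:
# tensorboard_proxy_url("studiolab.sagemaker.aws")
```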
You should now see the TensorBoard interface displaying your training metrics and graphs.
Step 3: Troubleshooting
- If TensorBoard doesn’t load, ensure:
- The port (6006) is available and not used by another process.
- The logs directory exists and contains valid TensorFlow/PyTorch event files.
- You’re using the correct proxy path (/proxy/6006/).
- To stop TensorBoard:
pkill tensorboard
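The first troubleshooting check, whether port 6006 is free, can be done from a notebook cell with the standard library. A minimal sketch; the helper name is illustrative.

```python
import socket

# Sketch: check whether a local port is already in use before starting
# TensorBoard. connect_ex returns 0 when something is listening on the
# port, and a nonzero errno (e.g., connection refused) otherwise.
def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1.0)
        return s.connect_ex((host, port)) == 0

# Example: port_in_use(6006) -> pick another --port value if True
```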
Notes
- TensorBoard will only remain active while the notebook instance is running.
- Avoid exposing TensorBoard publicly; always access it via the secure notebook proxy.