Using Coiled with JupyterLab#

Coiled integrates with the tools you already use. In this guide, you’ll learn about a few useful open source JupyterLab extensions we recommend.

You can download this Jupyter notebook to follow along in your own JupyterLab session.

Before you start#

You’ll first need to install the necessary packages. For the purposes of this example, we’ll do this in a new virtual environment, but you could also install them in whatever environment you’re already using for your project.

$ conda create -n jupyterlab-example -c conda-forge python=3.9 jupyterlab dask-labextension coiled dask
$ conda activate jupyterlab-example
(jupyterlab-example) $ jupyter lab

You could also use pip, or any other package manager you prefer; conda isn’t required.

When you create a cluster, Coiled will automatically replicate your local jupyterlab-example environment in your cluster.

Coiled cluster widget#

When you create a cluster from JupyterLab or a Jupyter notebook, you’ll see a widget showing an overview of the cluster state and progress bars as resources are provisioned. This widget relies on the ipywidgets extension.

import coiled
from dask.distributed import Client

# create a Coiled cluster with 5 workers
cluster = coiled.Cluster(
    name="jupyter-example", n_workers=5
)

# connect a Dask client to the cluster
client = Client(cluster)


Dask JupyterLab extension#

The Dask community maintains a JupyterLab extension which allows Dask dashboard plots to be embedded directly into a JupyterLab session. Viewing diagnostic plots in JupyterLab instead of in a separate browser tab or window is often a better user experience.


If you are using dask-labextension versions 2.x, see the installation section of the Dask JupyterLab Extension README file.

Open the Dask JupyterLab extension by clicking the Dask logo in the JupyterLab left sidebar. Then select the magnifying glass icon in the upper right-hand corner, copy the address from the Dashboard Address field in the cluster widget, and paste it in to connect the extension to your cluster.

(Screenshot: the Dask JupyterLab extension connected to a cluster)
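If you’d rather grab the dashboard address programmatically than copy it from the widget, Dask cluster objects expose a dashboard_link attribute. Here’s a minimal sketch using a LocalCluster stand-in so it runs without a Coiled account; a coiled.Cluster exposes the same attribute:

```python
from dask.distributed import LocalCluster

# LocalCluster stands in for coiled.Cluster here; both inherit
# dashboard_link from Dask's cluster base class.
# dashboard_address=":0" picks a free port for the dashboard.
cluster = LocalCluster(n_workers=1, threads_per_worker=1, dashboard_address=":0")

# this is the address you paste into the Dask JupyterLab extension
link = cluster.dashboard_link
print(link)

cluster.close()
```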

Each orange button corresponds to a different diagnostic plot. Try clicking one of the buttons and then arranging the plots. See this screencast for a live demo.

Once you’ve arranged the plots, you can see the Dashboard at work while running a Dask computation:

import dask

# generate random timeseries of data
df = dask.datasets.timeseries("2000", "2005", partition_freq="2w").persist()

# perform a groupby with an aggregation
df.groupby("name").aggregate({"x": "sum", "y": "max"}).compute()

See the Dask documentation for more details on interpreting the Dask dashboard. Once you’ve completed your work, you can shut down your cluster:

# close the connection to the client
client.close()

# close the cluster
cluster.close()