Getting Started#

In this guide you will:

  1. Sign up for Coiled

  2. Install the Coiled Python library

  3. Log in to your Coiled account

  4. Configure your cloud provider

  5. Run your Dask computation in your cloud account

1. Sign up#

Sign up for Coiled using GitHub, Google, or your email address.

2. Install#

Coiled can be installed from conda-forge using conda, or from PyPI using pip:

conda install -c conda-forge coiled-runtime python=3.9

pip install coiled

3. Log in#

You can log in using the coiled login command line tool:

$ coiled login

You’ll then be taken to the Coiled web app, where you can create and manage API tokens.

Please login to get your token

Your token will be saved to Coiled’s local configuration file.


For Windows users

Unless you are using WSL, you will need to go to a command prompt or PowerShell window within an environment that includes coiled (see the next step) to log in via coiled login.

Additionally, Windows users should provide the token as an argument, i.e. coiled login --token <your-token> from the command line or !coiled login --token <your-token> from a Jupyter notebook, since the Windows clipboard will not be active at the “Token” prompt.

4. Configure your cloud provider#

Use our CLI tool to quickly configure your GCP or AWS account:

coiled setup wizard

Or, if you prefer a browser-based setup, follow our step-by-step guide to configure your Google Cloud or AWS account. Don’t have a cloud provider account? You can sign up for the Google Cloud Free Tier or AWS Free Tier.
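If you already know which provider you are targeting, you may be able to skip the interactive wizard and run the provider-specific subcommands directly (assuming these subcommands are available in your installed Coiled CLI version; run coiled setup --help to check):

```shell
# Configure an AWS account directly
coiled setup aws

# Or configure a Google Cloud account directly
coiled setup gcp
```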

5. Run your Dask computation in your cloud account#


If you haven’t already, use our CLI tool to configure your cloud provider account:

coiled setup wizard

Next, spin up a Dask cluster in your cloud by creating a coiled.Cluster instance and connecting it to the Dask Client. You’ll use software="coiled/default-py39" for the default Python 3.9 environment that Coiled maintains; change the software argument to match the Python version you’re using locally.

from coiled import Cluster
from dask.distributed import Client

# create a remote Dask cluster with Coiled
cluster = Cluster(name="my-cluster", software="coiled/default-py39")

# interact with Coiled using the Dask distributed client
client = Client(cluster)

# link to Dask Dashboard
print("Dask Dashboard:", client.dashboard_link)


If you’re using a Team account, be sure to specify the account= option when creating a cluster:

cluster = Cluster(account="<my-team-account-name>")

Otherwise, the cluster will be created in your personal Coiled account.

You will then see a widget showing the cluster state overview and progress bars as resources are provisioned (this may take a minute or two). You can use the cluster details page (link at the top of the widget) for detailed information on cluster state and worker logs (see Logging).
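If you would rather block in code until workers are available than watch the widget, the Dask client exposes wait_for_workers (standard dask.distributed API); a short sketch, where the worker count is illustrative rather than a Coiled default:

```python
# `client` is the dask.distributed Client created above.
# Block until at least 4 workers have joined the cluster;
# 4 is an arbitrary example count, not a Coiled default.
client.wait_for_workers(n_workers=4)
```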

Terminal dashboard displaying the Coiled cluster status overview, configuration, and Dask worker states.

Once the cluster is ready, you can submit a Dask DataFrame computation for execution. Navigate to the Dask dashboard (see Dashboard Address in the widget) for real-time diagnostics on your Dask computations.

import dask

# generate random timeseries of data
df = dask.datasets.timeseries("2000", "2005", partition_freq="2w").persist()

# perform a groupby with an aggregation
df.groupby("name").aggregate({"x": "sum", "y": "max"}).compute()

Lastly, you can stop the running cluster and close the client using the following commands. By default, clusters shut down after 20 minutes of inactivity.

# Close the cluster
cluster.close()

# Close the client
client.close()
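The idle-shutdown window can also be adjusted when the cluster is created; a sketch assuming the idle_timeout keyword accepted by coiled.Cluster (verify against the API reference for your installed Coiled version):

```python
from coiled import Cluster

# Shut the cluster down after 40 minutes of inactivity
# instead of the default 20 (idle_timeout is assumed here;
# check your Coiled version's Cluster documentation)
cluster = Cluster(name="my-cluster", idle_timeout="40 minutes")
```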