In addition to managing and deploying Dask clusters, Coiled also supports running other Python applications. This allows you to run a Python script, nightly batch job, or some other custom process on the cloud.


Coiled jobs are currently experimental, with new features under active development.

Job configurations

Much like a cluster configuration is a template for a Dask cluster you can launch with Coiled, a job configuration is a template for some other process, or job, you can launch with Coiled.

Job configurations are created with the coiled.create_job_configuration() function, which takes several keyword arguments allowing you to specify details about your application, as well as any software or hardware resources required by your application:

  • name: Name used to identify the job configuration.

  • command: Command to run as part of the job configuration.

  • software: Name of a software environment needed to run the command.

  • cpu: Number of CPUs to allocate.

  • memory: Amount of memory to allocate.

  • files: Local files to upload for use in the job configuration.

  • ports: List of any ports that the application exposes.


Currently, any directory structure for uploaded files will be removed and the files will be placed in the working directory of the Jupyter session. For example, /path/to/ will appear as in the running job configuration.
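In other words, only the base name of each uploaded file is kept. A minimal sketch of the mapping, using a hypothetical local path (the filename here is illustrative, not from the Coiled API):

```python
import os

# Hypothetical local path passed via the files keyword argument;
# any leading directories are stripped on upload
local_path = "/path/to/analysis.py"

# Name the file will have in the job's working directory
uploaded_name = os.path.basename(local_path)
print(uploaded_name)  # analysis.py
```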

For example, below is a job configuration for running a custom Python script:

import coiled

# Create a software environment with the libraries needed
# for this application
coiled.create_software_environment(
    name="my-software-env",  # assumed name for this example
    pip=["dask", "xarray", "numba"],
)

# Create a job configuration for a custom application.
# Here the application consists of a "" Python script.
coiled.create_job_configuration(
    name="my-application",
    software="my-software-env",
    command=["python", ""],
    memory="16 GiB",
    ports=[8888, 8889],
)

Managing jobs

Once you’ve created a job configuration, you can then launch a job, which is a running instance of a job configuration. To launch a job, use the coiled.start_job() function:

import coiled

# Launch a job specified by the "my-application" job configuration
coiled.start_job(configuration="my-application")
You can add environment variables to jobs through the environ keyword argument. The input of environ should be a dictionary.

import coiled

coiled.start_job(
    configuration="my-application",
    environ={"DASK_COILED__ACCOUNT": "alice", "HTTP_REQ_TIMEOUT": 200},
)


Environment variables are not encrypted and will be available as plain text. For security reasons, you should not use environment variables to add secrets to your jobs.
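Inside the running job, these variables show up as ordinary process environment variables, readable with os.environ. A minimal sketch (the fallback defaults are illustrative, not part of the Coiled API):

```python
import os

# Environment variable values always arrive as strings, so convert
# numeric settings explicitly; the fallbacks are only for illustration
account = os.environ.get("DASK_COILED__ACCOUNT", "alice")
timeout = int(os.environ.get("HTTP_REQ_TIMEOUT", "200"))
```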

Additionally, you can get information about each of your running jobs with the coiled.list_jobs() function:

import coiled

# List all running jobs
coiled.list_jobs()

This will output a dictionary whose keys are unique names for each running job and whose values contain metadata related to the job (e.g. what job configuration it’s using):

{"job-27151b85-a": {"id": 195,
                    "account": ...,
                    "status": "running",
                    "configuration": "my-application"}}
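Since the output is a plain dictionary, it can be filtered with ordinary Python. For example, a sketch that collects the names of running jobs launched from a given configuration, using a hard-coded copy of the sample output above in place of a live coiled.list_jobs() call:

```python
# Sample metadata in the shape returned by coiled.list_jobs()
jobs = {
    "job-27151b85-a": {
        "id": 195,
        "status": "running",
        "configuration": "my-application",
    },
}

# Names of running jobs that use the "my-application" configuration
matching = [
    name
    for name, info in jobs.items()
    if info["status"] == "running"
    and info["configuration"] == "my-application"
]
print(matching)  # ['job-27151b85-a']
```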

If you need to terminate a running job you can use the coiled.stop_job() function:

# Stop a running job, referencing the unique name from coiled.list_jobs()
coiled.stop_job("job-27151b85-a")