Coiled stores instance, scheduler, and worker logs in your cloud provider account using Amazon CloudWatch and Google Cloud Logging (see the sections on AWS and GCP). While you can use any of your existing log management systems to access your logs, Coiled also offers a few ways to make this easier.
Regardless of whether you are launching a Coiled cluster interactively or from a Python script, you can see your logs from the cluster dashboard page of your Coiled account at
When you click on the name of a given cluster, you’ll be redirected to the cluster details page at
Here you can see the current cluster state and download instance-specific logs for the scheduler or workers by clicking “download logs”.
You can also pull the logs for the scheduler and each worker using
As you scroll down, you can see the logs for the cluster state history:
Within an interactive session, e.g. IPython or a Jupyter notebook, a dynamic widget is displayed when you first create the cluster:
The widget has three panels showing an overview of the Coiled cluster, the configuration, and Dask worker states with progress bars for how many workers have reached a given state. You can also use the link at the top to view the cluster details page mentioned above.
Coiled uses the Python standard logging module for logging changes in cluster, scheduler, and worker state. The default level is WARNING, but you can control the logging verbosity by setting a different level, with lower levels such as INFO being more verbose. See the Python logging docs for more on logging levels. Here is an example of how this can be configured from within a Python script:
import logging

from coiled import Cluster

logging.basicConfig(level=logging.INFO)
logging.getLogger("coiled").setLevel(logging.INFO)

cluster = Cluster()
cluster.close()
The above snippet prints the logs to the console, but you can also save logs to a file by changing the parameters passed to basicConfig() (see this tutorial on logging to a file).
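For instance, here is a minimal sketch of logging to a file. The filename coiled.log is an arbitrary choice for this example, and force=True (Python 3.8+) replaces any handlers set up by an earlier basicConfig() call in the same session:

```python
import logging

# Write log records to a file instead of the console.
# "coiled.log" is a hypothetical filename; any writable path works.
logging.basicConfig(
    filename="coiled.log",
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(name)s: %(message)s",
    force=True,
)
logging.getLogger("coiled").setLevel(logging.INFO)

# Any records emitted by the "coiled" logger now land in coiled.log.
logging.getLogger("coiled").info("example record")
```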
For more advanced options in debugging your Dask computations, see the Dask documentation on logging.
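As one example, Dask's scheduler and workers log under hierarchical logger names such as "distributed.scheduler" and "distributed.worker", so you can raise the verbosity of a single Dask subsystem without affecting the rest (a sketch using the standard logging module; Dask also supports configuring these levels through its own configuration files):

```python
import logging

# Adjusting one of Dask's hierarchical loggers changes verbosity
# for that component only, leaving the others at their defaults.
logging.getLogger("distributed.worker").setLevel(logging.DEBUG)
logging.getLogger("distributed.scheduler").setLevel(logging.WARNING)
```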