Coiled’s analytics and performance reports can help you understand cluster usage and the performance of Dask computations once a cluster is successfully up and running. Sometimes, though, there are errors during cluster creation, when launching instance types, or from worker failures. In these more challenging debugging situations, looking at the cluster or instance logs can provide insight into cluster activity. Cluster logs provide an overview of the status of the scheduler and workers, while instance logs show all events that occurred on a given instance.
Coiled uses the Python standard logging module to record various events from the scheduler, worker, and client. There are a few options for viewing this information.
Regardless of whether you launch a Coiled cluster interactively or from a Python script, you can see your logs from the cluster dashboard page of your Coiled account.
When you click on the name of a given cluster, you’ll be redirected to that cluster’s details page.
Here you can see the current cluster state and download instance-specific logs for the scheduler or workers by clicking “download logs”.
You can also pull the logs for the scheduler and each worker programmatically.
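As a sketch of what this might look like: Coiled’s Cluster builds on Dask’s distributed Cluster class, which exposes a get_logs() method returning recent log lines keyed by component name. Treat the exact method name and return shape as assumptions that may vary across versions.

```python
# Hedged sketch: pulling scheduler and worker logs programmatically.
# Assumes the cluster object exposes distributed's `Cluster.get_logs()`,
# which returns a mapping of component name -> log text.

def print_cluster_logs(cluster):
    """Fetch and print recent logs for the scheduler and each worker."""
    logs = cluster.get_logs()  # assumed API; check your installed version
    for name, text in logs.items():
        print(f"=== {name} ===")
        print(text)

# Usage (requires a running Coiled cluster, so not executed here):
# from coiled import Cluster
# cluster = Cluster()
# print_cluster_logs(cluster)
# cluster.close()
```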
As you scroll down, you can see the logs for the cluster state history:
Within an interactive session, e.g. IPython or Jupyter Notebook, there is a dynamic widget loaded when you first create the cluster:
The widget has three panels showing an overview of the Coiled cluster, the configuration, and Dask worker states with progress bars for how many workers have reached a given state. You can also use the link at the top to view the cluster details page mentioned above.
Coiled uses the Python standard logging module for logging changes in cluster, scheduler, and worker state (i.e., the cluster logs). The default level is WARNING, but you can control the verbosity by setting the logging level, with DEBUG and INFO being the most verbose levels. See the Python logging docs for more on logging levels. Here is an example of how this can be configured from within a Python script:
```python
import logging

from coiled import Cluster

# Emit INFO-level (and above) log records to the console
logging.basicConfig(level=logging.INFO)
logging.getLogger("coiled").setLevel(logging.INFO)

cluster = Cluster()
cluster.close()
```
The above snippet will print the logs to the console, but you can also choose to save logs to a file by changing the parameters passed to basicConfig() (see this tutorial on logging to a file).
For more advanced options in debugging your Dask computations, see the Dask documentation on logging.
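As one simple option covered there, Dask’s scheduler and workers log under the "distributed" logger namespace, so you can raise their verbosity directly. A brief sketch (logger names per the distributed library):

```python
import logging

# Increase verbosity for Dask's own loggers. "distributed",
# "distributed.scheduler", and "distributed.worker" are the logger
# names used by the distributed library.
logging.getLogger("distributed").setLevel(logging.INFO)
logging.getLogger("distributed.worker").setLevel(logging.DEBUG)
```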