uv and Coiled for easy Python scripts on the cloud#

James Bourbeau

2025-06-30

3 min read

The Python packaging world is (being generous) … vast. One of the more delightful recent developments is uv, a fast Python package and environment manager. uv has the features you’d expect from a Python package manager (project-specific environments, lock files, etc.), but the core reason folks like uv so much is that it’s easy to use and really, really fast.

This post shows how to run Python scripts on the cloud using uv and Coiled, an easy-to-use, UX-focused cloud platform that provides a similarly straightforward approach to running Python scripts on any cloud hardware.

uv for managing software#

Here’s a script that uses pandas to load a Parquet data file hosted in a public bucket on S3 and print the first few rows:

process.py#
import pandas as pd

df = pd.read_parquet(
    "s3://coiled-data/uber/part.0.parquet",         
    storage_options={"anon": True},
)
print(df.head())

We’ll use this concrete example throughout the rest of the post, but know that the script contents could be any Python code.

Running this script requires pandas, pyarrow, and s3fs. uv has a pretty slick way of embedding these dependencies directly in the script using PEP 723 script metadata with the uv add --script command

$ uv add --script process.py pandas pyarrow s3fs

which adds these comments with the specified dependencies to the script:

process.py#
# /// script
# requires-python = ">=3.12"
# dependencies = [
#   "pandas",
#   "pyarrow",
#   "s3fs",
# ]
# ///

import pandas as pd

df = pd.read_parquet(
    "s3://coiled-data/uber/part.0.parquet",         
    storage_options={"anon": True},
)
print(df.head())

We then use uv run to run the script:

$ uv run process.py

uv automatically creates a virtual environment, installs the dependencies, and then runs process.py in that environment. Here’s the output we get:

hvfhs_license_num dispatching_base_num originating_base_num    request_datetime   on_scene_datetime  ... shared_request_flag shared_match_flag  access_a_ride_flag  wav_request_flag  wav_match_flag
__null_dask_index__                                                                                                      ...
18979859                       HV0003               B02875               B02875 2019-05-26 23:29:35 2019-05-26 23:30:33  ...                   N                 N                 NaN                 N             NaN
18979860                       HV0003               B02875               B02875 2019-05-26 23:56:48 2019-05-26 23:57:03  ...                   N                 N                 NaN                 N             NaN
18979861                       HV0003               B02765               B02765 2019-05-26 23:56:35 2019-05-26 23:56:41  ...                   N                 N                 NaN                 N             NaN
18979862                       HV0003               B02682               B02682 2019-05-26 22:52:34 2019-05-26 23:10:38  ...                   Y                 N                 NaN                 N             NaN
18979863                       HV0002               B03035               B03035 2019-05-26 23:16:34 1970-01-01 00:00:00  ...                   N                 N                   N                 N             NaN

[5 rows x 24 columns]

What’s nice about this is that we didn’t have to manage local virtual environments ourselves, and the dependencies are declared directly in the script, which makes things nicely self-contained.
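As an aside, since everything the script needs is declared inline, you can also make the script directly executable by putting a uv shebang on the very first line, above the metadata block. This is a small sketch based on the uv docs; double-check the exact shebang form against your uv version:

process.py#
#!/usr/bin/env -S uv run --script
# /// script
# ... same metadata and code as above ...
# ///

Then mark it executable and run it like any other program:

$ chmod +x process.py
$ ./process.py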

That’s really all we need for running scripts locally. However, there are times when we need resources beyond what’s available on our local workstation, like:

  • Processing large amounts of cloud-hosted data

  • Needing accelerated hardware like GPUs or a big machine with more memory

  • Running the same script with hundreds or thousands of different inputs, in parallel

In these situations running scripts directly on cloud hardware is often a good solution.

Coiled for managing hardware#

Similar to how uv makes it super straightforward to handle Python dependency management, Coiled makes it easy to run code on cloud hardware (if you’re not already familiar with how unergonomic cloud APIs can be, see this example on using uv with AWS Lambda).

Let’s start by authenticating whatever machine you’re running on with Coiled using the coiled login CLI. We run it through uvx here, which fetches the coiled package into a temporary environment, so there’s nothing to install into our project (you’ll be prompted to create a Coiled account if you don’t already have one – and it’s totally free to start using Coiled):

$ uvx coiled login

To have Coiled run our script on a VM on AWS, we’ll add these two comments:

process.py#
# COILED container ghcr.io/astral-sh/uv:debian-slim
# COILED region us-east-2

# /// script
# requires-python = ">=3.12"
# dependencies = [
#   "pandas",
#   "pyarrow",
#   "s3fs",
# ]
# ///

import pandas as pd

df = pd.read_parquet(
    "s3://coiled-data/uber/part.0.parquet",         
    storage_options={"anon": True},
)
print(df.head())

They tell Coiled to use the official uv Docker image when running our script (this makes sure uv is installed) and to run the script in the us-east-2 region on AWS (where this data file happens to live) to avoid any data egress. There are many other options we could have specified here, like the VM instance type (the default is an m6i.xlarge instance on AWS), whether to use spot instances, etc. See the Coiled Batch docs for more details.
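For example, a larger instance could be requested with one more directive along these lines (a hedged sketch: the vm-type directive name is my assumption, not something shown in this post, so check the Coiled Batch docs for the exact spelling and the full list of options):

process.py#
# COILED container ghcr.io/astral-sh/uv:debian-slim
# COILED region us-east-2
# COILED vm-type m6i.2xlarge  # assumed directive name: ask for a bigger VM than the default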

Finally, we use the coiled batch run CLI to run our existing uv run command on a cloud VM:

$ uvx coiled batch run \
    uv run process.py

The exact same thing that happened locally before now happens on a cloud VM on AWS, only this time the script runs faster because we didn’t have to transfer any data from S3 to a local laptop.

[Screenshot: view of the Coiled UI with our script running on AWS.]
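And if you eventually hit the “running the same script with hundreds or thousands of inputs” case from earlier, the same pattern scales out. Below is a rough sketch of how I understand task arrays work with Coiled Batch; the n-tasks directive, the COILED_BATCH_TASK_ID environment variable, and the existence of files beyond part.0.parquet in the bucket are all assumptions here, so verify them against the Coiled Batch docs before relying on this:

process.py#
# COILED container ghcr.io/astral-sh/uv:debian-slim
# COILED region us-east-2
# COILED n-tasks 10

# /// script
# requires-python = ">=3.12"
# dependencies = [
#   "pandas",
#   "pyarrow",
#   "s3fs",
# ]
# ///

import os

import pandas as pd

# Each task gets its own integer ID, so each one reads a different file (assumed env var name)
task_id = int(os.environ.get("COILED_BATCH_TASK_ID", 0))
df = pd.read_parquet(
    f"s3://coiled-data/uber/part.{task_id}.parquet",
    storage_options={"anon": True},
)
print(df.head())

The coiled batch run command stays the same; each task runs the script with its own ID.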

Conclusion#

uv is great. It makes managing dependencies when running Python scripts a really pleasant experience. It also works well with Coiled when you want to run those scripts on cloud hardware.

Together, uv and Coiled make for a powerful, ergonomic cloud Python experience.