NEW Chunked File Uploads in the Frame.io Python SDK
The Frame.io Python SDK v1.2.1 (`frameio`) now ships with a built-in uploader that handles chunked multi-part uploads to S3 end to end. No more hand-rolling pre-signed URL chunking, retry logic, or parallel PUT orchestration — the SDK takes care of it.
What’s New
The new `frameio.upload` module introduces `FrameioUploader`:
| Capability | What it does |
|---|---|
| Chunked uploads | Splits your file into chunks matching the pre-signed URLs returned by the API |
| Parallel PUTs | Uploads chunks concurrently via a thread pool (5 workers by default) |
| Automatic retries | Retries failed chunks with exponential backoff (1s, 2s, 4s, …) |
| Progress callbacks | on_progress(bytes_uploaded, total_bytes) fires after each chunk |
| Remote upload | One-call helper for URL-based uploads under 50 GB |
Uploads go directly from your application to S3 — they don’t pass through Frame.io’s API servers.
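To make the chunking behavior concrete, here is a minimal sketch of how a file might be split evenly across a set of pre-signed URLs. Note that `chunk_sizes` is a hypothetical helper for illustration only; the SDK's actual chunk boundaries are dictated by the pre-signed URLs the API returns.

```python
def chunk_sizes(file_size: int, num_urls: int) -> list[int]:
    # Illustrative only: not part of the frameio SDK. Splits file_size
    # bytes evenly across num_urls chunks, with the last chunk
    # absorbing any remainder.
    base = file_size // num_urls
    sizes = [base] * num_urls
    sizes[-1] += file_size - base * num_urls
    return sizes
```

Each chunk then gets its own PUT to the matching pre-signed URL, which is what makes the parallel workers and per-chunk retries possible.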
How to Try It
1. Install or upgrade the SDK
```bash
pip install frameio --upgrade
```
2. Upload a file
```python
import os

from frameio import Frameio
from frameio.files import FileCreateLocalUploadParamsData
from frameio.upload import FrameioUploader

client = Frameio(token="YOUR_TOKEN")

file_path = "/path/to/video.mp4"
file_size = os.path.getsize(file_path)

# 1. Create the file resource and get pre-signed upload URLs
response = client.files.create_local_upload(
    account_id="YOUR_ACCOUNT_ID",
    folder_id="YOUR_FOLDER_ID",
    data=FileCreateLocalUploadParamsData(
        name="video.mp4",
        file_size=file_size,
    ),
)

# 2. Upload the file to S3
with open(file_path, "rb") as f:
    FrameioUploader(response.data, f).upload()
```
That’s it — the SDK chunks the file, uploads in parallel, and retries on failure.
3. Track progress
```python
def on_progress(uploaded: int, total: int) -> None:
    pct = uploaded / total * 100
    print(f"\r{pct:.1f}% ({uploaded:,} / {total:,} bytes)", end="", flush=True)

with open(file_path, "rb") as f:
    FrameioUploader(response.data, f, on_progress=on_progress).upload()
```
For a polished terminal experience, plug the callback into a Rich progress bar:
```python
from rich.progress import Progress, BarColumn, DownloadColumn, TransferSpeedColumn, TimeRemainingColumn

with Progress(
    "[progress.description]{task.description}",
    BarColumn(), DownloadColumn(), TransferSpeedColumn(), TimeRemainingColumn(),
) as progress:
    task = progress.add_task("Uploading...", total=file_size)
    with open(file_path, "rb") as f:
        FrameioUploader(
            response.data, f,
            on_progress=lambda done, total: progress.update(task, completed=done),
        ).upload()
```
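Since the callback fires after every chunk, a very chatty consumer (logging, UI updates over a socket) may want to thin it out. Here is a small illustrative wrapper, not part of the SDK, that forwards only meaningful progress jumps:

```python
def throttled(callback, min_step: float = 0.01):
    # Hypothetical helper (not part of the frameio SDK): wrap an
    # on_progress callback so it fires only when progress advances by
    # at least min_step (1% by default), plus once at completion.
    last_frac = -1.0

    def wrapper(uploaded: int, total: int) -> None:
        nonlocal last_frac
        frac = uploaded / total
        if frac - last_frac >= min_step or uploaded == total:
            last_frac = frac
            callback(uploaded, total)

    return wrapper
```

You would then pass `throttled(on_progress)` wherever a plain `on_progress` callback goes.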
Key Features
- Tunable parallelism — bump `max_workers` for high-bandwidth connections
- Configurable retries — raise `max_retries` on flaky networks (exponential backoff)
- Remote upload helper — skip chunking entirely when the file is already at a public URL:
```python
from frameio.files import FileCreateRemoteUploadParamsData

response = client.files.create_remote_upload(
    account_id="YOUR_ACCOUNT_ID",
    folder_id="YOUR_FOLDER_ID",
    data=FileCreateRemoteUploadParamsData(
        name="video.mp4",
        source_url="https://example.com/video.mp4",
    ),
)
```
Remote upload currently has a 50 GB file size limit — use local upload for anything larger.
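The exponential backoff described above (1s, 2s, 4s, …) can be sketched in a few lines. This is an illustration of the schedule, not the SDK's internal code; `put_chunk` is a placeholder standing in for a single chunk PUT:

```python
import time

def put_with_retries(put_chunk, max_retries: int = 3, base_delay: float = 1.0):
    # Illustrative sketch of the retry schedule described above:
    # delays of base_delay * 2**attempt (1s, 2s, 4s, ...) between
    # attempts, re-raising once max_retries is exhausted.
    for attempt in range(max_retries + 1):
        try:
            return put_chunk()
        except Exception:
            if attempt == max_retries:
                raise
            time.sleep(base_delay * 2 ** attempt)
```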
Checking Upload Status
After uploading, verify completion:
```python
status = client.files.show_file_upload_status(
    account_id="YOUR_ACCOUNT_ID",
    file_id=response.data.id,
)
print(f"Upload complete: {status.data.upload_complete}")
```
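For remote uploads in particular, completion is asynchronous, so you may want to poll. Here is a generic polling sketch; `wait_until_complete` is a hypothetical helper, and `check` is any zero-argument callable that returns True once the upload has finished (for example, a lambda wrapping the status call above):

```python
import time

def wait_until_complete(check, timeout: float = 300.0, interval: float = 5.0) -> bool:
    # Hypothetical helper, not part of the frameio SDK: call check()
    # every `interval` seconds until it returns True or `timeout`
    # seconds have elapsed.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if check():
            return True
        time.sleep(interval)
    return False
```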
Resources
- Python Upload Guide — full walkthrough with configuration, progress tracking, and a complete script
- How Local & Remote Uploads Work — the underlying API flow if you want full control
- SDK Reference — API reference
Feedback
We’d love to hear how it works for you. If you run into issues or have feature requests, reply to this post!