🌍 Deploy Anywhere¶
Here's the ultimate superpower: Same code runs everywhere. Just change the configuration.
Good news: Your runnable pipeline is already production-ready. You've been building with production in mind this whole time.
Your code never changes¶
```python
from runnable import Pipeline, PythonTask


def train_model():
    # Your model training logic
    print("Training model...")
    return "model_v1.pkl"


def main():
    pipeline = Pipeline(steps=[
        PythonTask(function=train_model, name="training")
    ])
    pipeline.execute()
    return pipeline


if __name__ == "__main__":
    main()
```
Change where it runs¶
💻 Local development¶
🐳 Container execution¶
☁️ Cloud platforms¶
Configuration files define the environment¶
The same pipeline code runs in any environment - just change the configuration:
Fast local development (default configuration):
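With the default configuration, no file is needed at all; assuming your pipeline lives in `my_pipeline.py` (the filename used elsewhere on this page), you simply run it:

```shell
# No RUNNABLE_CONFIGURATION_FILE set: the default local executor is used
uv run my_pipeline.py
```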
Pipeline validation without execution:
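One way to achieve this, assuming runnable's mocked executor (which walks the pipeline definition without running the tasks), is a configuration like:

```yaml
pipeline-executor:
  type: mocked  # assumed: validates the DAG without executing any task
```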
Isolated execution in containers:
```yaml
pipeline-executor:
  type: local-container
  config:
    # Build from project root to include runnable + dependencies
    docker_image: "my-pipeline:latest"  # or use existing image with runnable installed
run-log-store:
  type: file-system
catalog:
  type: file-system
secrets:
  type: env-secrets
```
Container Image Requirements

The Docker image must have runnable installed. Either:

- Build from your project root: `docker build -t my-pipeline:latest .` (includes your code + runnable)
- Use a base image with runnable: `FROM python:3.11` then `RUN pip install runnable`
- Never use bare `python:3.11` - it doesn't include runnable
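The second option above, sketched as a minimal Dockerfile (the copy path and working directory are assumptions about your project layout):

```dockerfile
FROM python:3.11

# runnable must be present inside the image
RUN pip install runnable

# Include your pipeline code
WORKDIR /app
COPY . /app
```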
Production orchestration on Kubernetes:
```yaml
pipeline-executor:
  type: argo
  config:
    pvc_for_runnable: runnable
    defaults:
      image: "my-pipeline:v1.0"
      resources:
        limits:
          cpu: "2"
          memory: 4Gi
        requests:
          cpu: "1"
          memory: 2Gi
    argoWorkflow:
      metadata:
        generateName: "pipeline-"
        namespace: production
      spec:
        serviceAccountName: "pipeline-executor"
run-log-store:
  type: chunked-fs
  config:
    log_folder: /mnt/run_log_store
catalog:
  type: s3
  config:
    bucket: production-data
secrets:
  type: env-secrets
```
Same code, different environments - just change the RUNNABLE_CONFIGURATION_FILE.
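The mechanism is just an environment variable lookup; a hypothetical helper (`active_config` is not part of runnable's API) illustrates the pattern:

```python
import os


def active_config(default="local.yaml"):
    # Hypothetical helper: the environment variable, not the code,
    # decides which configuration (and hence which environment) is used.
    return os.environ.get("RUNNABLE_CONFIGURATION_FILE", default)


os.environ.pop("RUNNABLE_CONFIGURATION_FILE", None)
print(active_config())  # local.yaml

os.environ["RUNNABLE_CONFIGURATION_FILE"] = "production-argo.yaml"
print(active_config())  # production-argo.yaml
```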
The power of environment-agnostic code¶
Development:
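The development half of the comparison is presumably the default local run, with no configuration file exported:

```shell
# Same code, default local environment
uv run my_pipeline.py
```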
Production:
```shell
# Same code, production environment
export RUNNABLE_CONFIGURATION_FILE=production-argo.yaml
uv run my_pipeline.py
```
Real-world example¶
From development to production:
```python
from runnable import Pipeline, PythonTask
from examples.common.functions import hello


def main():
    # Same code, different environments
    task = PythonTask(function=hello, name="say_hello")
    pipeline = Pipeline(steps=[task])

    # This execute() call works for both development and production
    # Environment determined by RUNNABLE_CONFIGURATION_FILE
    pipeline.execute()
    return pipeline


if __name__ == "__main__":
    main()
```
Development (uses default local config):
Production (same code, argo config):
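The two invocations presumably differ only in the environment variable (the argo config filename here is an assumption):

```shell
# Development: default local config
python examples/01-tasks/python_tasks.py

# Production: same code, argo config
export RUNNABLE_CONFIGURATION_FILE=argo-config.yaml
python examples/01-tasks/python_tasks.py
```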
See complete runnable code
"""
You can execute this pipeline by:
python examples/01-tasks/python_tasks.py
The stdout of "Hello World!" would be captured as execution
log and stored in the catalog.
"""
from examples.common.functions import hello
from runnable import Pipeline, PythonTask
def main():
# Create a tasks which calls the function "hello"
# If this step executes successfully,
# the pipeline will terminate with success
hello_task = PythonTask( # [concept:task]
name="hello",
function=hello,
)
# The pipeline has only one step.
pipeline = Pipeline(steps=[hello_task]) # [concept:pipeline]
pipeline.execute() # [concept:execution]
return pipeline
if __name__ == "__main__":
main()
Try it now:
- Dev environment: runs in 2 seconds on your laptop
- Prod environment: runs on Kubernetes with monitoring, logging, and auto-scaling
Zero code changes.
Configuration Reference¶
Ready to customize your deployment? Check the configuration documentation:
- Pipeline Executors - Choose where pipelines run (local, argo, etc.)
- Job Executors - Configure task execution (local, containers, kubernetes)
- Storage - Set up data persistence (local, S3, MinIO)
- Logging - Configure execution logs
- Secrets - Manage sensitive configuration
You now know the core concepts! Start with the examples above, then dive into the configuration reference for advanced setups.