⚡ Parameters From Outside¶
The coolest part: Change your function's behavior without changing your code.
Your function accepts parameters¶
def read_initial_params_as_pydantic(
    integer: int,
    floater: float,
    stringer: str,
    pydantic_param: ComplexParams,
    envvar: str,
):
    # Your function uses these parameters
    print(f"Processing {integer} items with {stringer} config")
    return f"Processed {integer} records"
Method 1: YAML files¶
Create a parameters file:
parameters.yaml
integer: 1
floater: 3.14
stringer: "hello"
pydantic_param:
  x: 10
  foo: "bar"
envvar: "not set"  # Will be overridden by environment variable
Run with parameters:
from runnable import Pipeline, PythonTask


def main():
    task = PythonTask(
        function=read_initial_params_as_pydantic,
        name="read_params",
    )
    pipeline = Pipeline(steps=[task])
    pipeline.execute(parameters_file="parameters.yaml")
    return pipeline


if __name__ == "__main__":
    main()
See complete runnable code
examples/03-parameters/static_parameters_python.py
"""
The below example showcases setting up known initial parameters for a pipeline
of only python tasks
The initial parameters as defined in the yaml file are:
simple: 1
complex_param:
x: 10
y: "hello world!!"
runnable allows using pydantic models for deeply nested parameters and
casts appropriately based on annotation. eg: read_initial_params_as_pydantic
If no annotation is provided, the parameter is assumed to be a dictionary.
eg: read_initial_params_as_json
You can set the initial parameters from environment variables as well.
eg: Any environment variable prefixed by "RUNNABLE_PRM_" will be picked up by runnable
Run this pipeline as:
python examples/03-parameters/static_parameters_python.py
"""
import os
from examples.common.functions import (
function_using_argparse,
function_using_kwargs,
read_initial_params_as_json,
read_initial_params_as_pydantic,
)
from runnable import Pipeline, PythonTask
def main():
"""
Signature of read_initial_params_as_pydantic
def read_initial_params_as_pydantic(
integer: int,
floater: float,
stringer: str,
pydantic_param: ComplexParams,
envvar: str,
):
"""
read_params_as_pydantic = PythonTask(
function=read_initial_params_as_pydantic,
name="read_params_as_pydantic",
)
read_params_as_json = PythonTask(
function=read_initial_params_as_json,
name="read_params_as_json",
)
using_argparse = PythonTask(
function=function_using_argparse,
name="function_using_argparse",
)
using_kwargs = PythonTask(
function=function_using_kwargs,
name="function_using_kwargs",
)
pipeline = Pipeline(
steps=[
read_params_as_pydantic,
read_params_as_json,
using_argparse,
using_kwargs,
],
)
_ = pipeline.execute(parameters_file="examples/common/initial_parameters.yaml")
return pipeline
if __name__ == "__main__":
# Any parameter prefixed by "RUNNABLE_PRM_" will be picked up by runnable
os.environ["RUNNABLE_PRM_envvar"] = "from env"
main()
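The docstring above references companion functions from examples/common/functions.py. Their exact bodies are not reproduced here; a rough sketch of the idea, assuming the same parameter names as the YAML file, might look like:

    def read_initial_params_as_json(
        integer: int,
        stringer: str,
        pydantic_param: dict,  # no pydantic annotation, so runnable passes a plain dict
    ):
        print(pydantic_param["x"], pydantic_param["foo"])


    def function_using_kwargs(**kwargs):
        # presumably receives the available parameters as keyword arguments
        print(kwargs["integer"], kwargs["stringer"])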
Try it now: python examples/03-parameters/static_parameters_python.py
Method 2: Environment variables¶
Set variables with RUNNABLE_PRM_ prefix:
export RUNNABLE_PRM_integer=42
export RUNNABLE_PRM_stringer="production data"
export RUNNABLE_PRM_envvar="from env" # This overrides YAML!
import os

from runnable import Pipeline, PythonTask


def main():
    task = PythonTask(
        function=read_initial_params_as_pydantic,
        name="read_params",
    )
    pipeline = Pipeline(steps=[task])

    # Same pipeline execution as before
    pipeline.execute(parameters_file="parameters.yaml")
    return pipeline


if __name__ == "__main__":
    main()
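Exports persist for the whole shell session; you can also scope the overrides to a single run (the script name here is hypothetical):

    RUNNABLE_PRM_integer=42 RUNNABLE_PRM_envvar="from env" python my_pipeline.py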
🏆 Environment variables win¶
If you have both YAML and environment variables, environment variables take priority:
- YAML file says: `envvar: "not set"`
- Environment variable: `RUNNABLE_PRM_envvar="from env"`
- Result: Your function gets `"from env"` ✅
Why this is powerful¶
Same code, different behavior:
# Option 1: Individual parameter overrides
export RUNNABLE_PRM_dataset="small_test_data.csv"
uv run my_pipeline.py
export RUNNABLE_PRM_dataset="full_production_data.csv"
uv run my_pipeline.py
# Option 2: Complete parameter file switching
export RUNNABLE_PARAMETERS_FILE="configs/dev.yaml"
uv run my_pipeline.py
export RUNNABLE_PARAMETERS_FILE="configs/prod.yaml"
uv run my_pipeline.py
No code changes needed!
Method 3: Dynamic parameter files¶
Switch parameter files without changing code using RUNNABLE_PARAMETERS_FILE:
# Use different parameter files for different environments
export RUNNABLE_PARAMETERS_FILE="dev-parameters.yaml"
export RUNNABLE_PARAMETERS_FILE="prod-parameters.yaml"
export RUNNABLE_PARAMETERS_FILE="staging-parameters.yaml"
from runnable import Pipeline, PythonTask


def main():
    task = PythonTask(
        function=read_initial_params_as_pydantic,
        name="read_params",
    )
    pipeline = Pipeline(steps=[task])

    # No need to specify parameters_file - uses RUNNABLE_PARAMETERS_FILE
    pipeline.execute()
    return pipeline


if __name__ == "__main__":
    main()
Powerful deployment pattern:
# Development
export RUNNABLE_PARAMETERS_FILE="configs/dev.yaml"
uv run my_pipeline.py
# Production
export RUNNABLE_PARAMETERS_FILE="configs/prod.yaml"
uv run my_pipeline.py # Same code, different parameters!
Complex parameters work too¶
# Nested objects
export RUNNABLE_PRM_model_config='{"learning_rate": 0.01, "epochs": 100}'
# Lists
export RUNNABLE_PRM_features='["age", "income", "location"]'
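On the function side these values still cast according to your annotations. A hypothetical receiving function, assuming a ModelConfig pydantic model with learning_rate and epochs fields:

    from pydantic import BaseModel


    class ModelConfig(BaseModel):
        learning_rate: float
        epochs: int


    def train(model_config: ModelConfig, features: list[str]):
        # model_config is parsed from the JSON in RUNNABLE_PRM_model_config,
        # features from the JSON list in RUNNABLE_PRM_features
        print(model_config.learning_rate, len(features))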
Pro tip
Three-layer flexibility:
- Code-specified: `pipeline.execute(parameters_file="base.yaml")`
- Environment override: `RUNNABLE_PARAMETERS_FILE="prod.yaml"` (overrides code)
- Individual parameters: `RUNNABLE_PRM_key="value"` (overrides both)
Perfect for different environments (dev/staging/prod) without code changes!
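Putting the three layers together (file names are illustrative):

    # the code asks for base.yaml ...
    export RUNNABLE_PARAMETERS_FILE="configs/prod.yaml"   # ... this swaps the whole parameters file
    export RUNNABLE_PRM_integer=42                        # ... and this overrides one value from prod.yaml
    python my_pipeline.py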
Next: Learn about automatic file management between tasks.