Workflows Overview

Kubiya workflows are the foundation of intelligent automation. They combine the power of AI generation with the reliability of deterministic execution, all running in containerized environments.

The DAG Revolution

What is a Kubiya Workflow?

A workflow in Kubiya is:

  • Directed Acyclic Graph (DAG): Steps with dependencies, no circular references
  • Container-Based: Each step runs in its own Docker container
  • Language Agnostic: Use Python, Go, Node.js, or any language
  • AI-Generated: Created from natural language or defined programmatically
  • Deterministic: Same input → Same execution → Same output

Core Components

1. Workflow Definition

from kubiya import workflow, step

@workflow
def my_data_pipeline():
    # Your workflow logic here
    pass

2. Steps

Each step is an atomic unit of work:

data = step.extract(
    name="extract-data",
    image="python:3.11",
    script="extract.py",
    inputs={"source": "database"}
)

3. Dependencies

Steps can depend on outputs from other steps; a step that consumes another step's output does not start until that step completes.
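
The sketch below builds on the extract step above; the transform step name and the output attribute illustrate the pattern rather than a specific API:

transformed = step.transform(
    name="transform-data",
    image="python:3.11",
    script="transform.py",
    inputs={"data": data.output}  # depends on extract-data and runs only after it completes
)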

4. Containers

Every step runs in isolation:

  • 🐳 Any Docker Image: public or private registries
  • 🔒 Complete Isolation: no shared state between steps
  • 📦 Dependency Freedom: each step has its own environment
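
As a rough sketch of what that isolation enables (step names, images, and the output attribute are illustrative), two steps in the same workflow can run on entirely different runtimes and exchange only their declared outputs:

binary = step.compile_service(
    name="compile-service",
    image="golang:1.22",
    script="go build -o service ./..."
)

report = step.render_report(
    name="render-report",
    image="node:18",
    script="node report.js",
    inputs={"artifact": binary.output}  # only the declared output crosses the container boundary
)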

Workflow Execution Model

Key Features

Parallel Execution

Run independent steps simultaneously:
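
For example, in the sketch below (step names and the output attribute are illustrative, not a fixed API), lint and unit-tests share no dependency and are scheduled concurrently, while publish consumes both outputs and therefore waits for both:

lint = step.lint(
    name="lint",
    image="python:3.11",
    script="ruff check ."
)

tests = step.run_tests(
    name="unit-tests",
    image="python:3.11",
    script="pytest"
)

# Runs only after both lint and unit-tests have finished
step.publish(
    name="publish",
    image="python:3.11",
    script="publish.py",
    inputs={"lint": lint.output, "tests": tests.output}
)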

Conditional Logic

Dynamic paths based on results:

if data.quality_score > 0.8:
    step.proceed_to_production()
else:
    step.alert_data_team()

Error Handling

Built-in retry mechanisms:

step.critical_operation(
    retry=3,
    backoff="exponential",
    on_failure="alert"
)

Inline AI Agents

Embed intelligent decision-making:

analysis = step.inline_agent(
    message="Analyze these logs for anomalies",
    runners=["kubiya-hosted"],
    tools=[log_parser_tool]
)

Workflow vs Other Approaches

Real-World Example

The workflow below ties these pieces together, combining containerized test runs, an inline AI agent, and conditional deployment:

from kubiya import workflow, step

@workflow(
    name="intelligent-deployment",
    description="AI-assisted deployment with safety checks"
)
def deploy_with_intelligence():
    # Run tests in Node.js container
    test_results = step.run_tests(
        image="node:18",
        script="npm test",
        timeout="5m"
    )
    
    # Analyze results with AI
    analysis = step.inline_agent(
        message=f"Analyze these test results: {test_results}",
        runners=["kubiya-hosted"],
        tools=[{
            "name": "decide-deployment",
            "type": "function",
            "description": "Decide if safe to deploy"
        }]
    )
    
    # Conditional deployment
    if analysis.should_deploy:
        # Build the container image
        image = step.build_docker_image(
            dockerfile="./Dockerfile",
            tag="myapp:latest"
        )
        
        # Deploy to Kubernetes
        step.deploy_to_k8s(
            manifest="k8s/deployment.yaml",
            image=image.tag,
            namespace="production"
        )
        
        # Notify team
        step.send_notification(
            channel="deployments",
            message="🚀 Deployed successfully!"
        )
    else:
        # Alert on issues
        step.create_incident(
            severity="high",
            details=analysis.concerns
        )

Visual Workflow Builder

Benefits of Kubiya Workflows

1. Predictability

  • Deterministic execution paths
  • No agent wandering or infinite loops
  • Clear audit trails

2. Flexibility

  • Use any programming language
  • Integrate any tool or service
  • Mix AI and traditional logic

3. Scalability

  • Parallel execution by default
  • Kubernetes-native scaling
  • Efficient resource usage

4. Maintainability

  • Version control friendly
  • Easy to test and debug
  • Clear dependencies

Common Patterns

ETL Pipeline
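
A typical extract, transform, load shape, sketched in the same illustrative step style used above (names, images, and inputs are placeholders):

raw = step.extract(
    name="extract",
    image="python:3.11",
    script="extract.py",
    inputs={"source": "warehouse"}
)

clean = step.transform(
    name="transform",
    image="python:3.11",
    script="transform.py",
    inputs={"data": raw.output}
)

step.load(
    name="load",
    image="python:3.11",
    script="load.py",
    inputs={"data": clean.output}
)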

CI/CD Pipeline

Data Science Pipeline
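
A common prepare, train, evaluate shape, again as an illustrative sketch rather than a fixed API:

features = step.prepare_features(
    name="prepare-features",
    image="python:3.11",
    script="features.py",
    inputs={"dataset": "raw-events"}
)

model = step.train_model(
    name="train-model",
    image="python:3.11",
    script="train.py",
    inputs={"features": features.output}
)

step.evaluate_model(
    name="evaluate-model",
    image="python:3.11",
    script="evaluate.py",
    inputs={"model": model.output}
)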

Next Steps

Ready to build your first workflow?