What are Jobs?
Jobs handle workloads that:
- Execute once and terminate when complete
- Run on a schedule (hourly, daily, weekly, etc.)
- Process data in batches
- Require orchestration of multiple steps
Common use cases include:
- ETL Pipelines - Extract, transform, and load data between systems
- Data Processing - Batch processing of files, reports, or analytics
- Database Operations - Migrations, backups, cleanup tasks
- Scheduled Maintenance - Cache warming, data archival, log rotation
- Workflow Orchestration - Complex multi-step processes with dependencies
- Report Generation - Periodic business reports or exports
Job Types
Skyhook supports four job types to fit different execution patterns:

Kubernetes Job
One-time execution. Single-container tasks that run once and exit. Best for standalone operations like migrations, data imports, and one-off scripts.
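As a sketch, a minimal Kubernetes Job manifest for a one-off task might look like this (the name, image, and command are illustrative, not Skyhook defaults):

```yaml
apiVersion: batch/v1
kind: Job
metadata:
  name: db-migration              # illustrative name
spec:
  backoffLimit: 3                 # retry up to 3 times on failure
  ttlSecondsAfterFinished: 3600   # clean up the Job an hour after it finishes
  template:
    spec:
      restartPolicy: Never
      containers:
        - name: migrate
          image: registry.example.com/migrate:1.0  # illustrative image
          command: ["./run-migrations.sh"]
```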
Kubernetes CronJob
Scheduled tasks. Recurring single-container jobs on a cron schedule with timezone support. Best for backups, cleanups, and regular maintenance.
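A minimal CronJob with a schedule and timezone might look like this (the name, image, and schedule are illustrative; the stable `timeZone` field requires Kubernetes 1.27+):

```yaml
apiVersion: batch/v1
kind: CronJob
metadata:
  name: nightly-backup
spec:
  schedule: "0 2 * * *"          # every day at 02:00
  timeZone: "America/New_York"   # interpret the schedule in this timezone
  concurrencyPolicy: Forbid      # skip a run if the previous one is still active
  jobTemplate:
    spec:
      template:
        spec:
          restartPolicy: OnFailure
          containers:
            - name: backup
              image: registry.example.com/backup:1.0  # illustrative image
              command: ["./backup.sh"]
```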
Argo Workflow
Multi-step orchestration. Workflows with multiple steps, dependencies, and data flow between steps. Best for ETL pipelines, ML training, and processes requiring step coordination.
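A multi-step pipeline with dependencies can be sketched as an Argo Workflow DAG (the image and script names are illustrative):

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: etl-
spec:
  entrypoint: pipeline
  templates:
    - name: pipeline
      dag:
        tasks:
          - name: extract
            template: step
            arguments:
              parameters: [{name: cmd, value: "extract.sh"}]
          - name: transform
            dependencies: [extract]   # runs only after extract succeeds
            template: step
            arguments:
              parameters: [{name: cmd, value: "transform.sh"}]
          - name: load
            dependencies: [transform]
            template: step
            arguments:
              parameters: [{name: cmd, value: "load.sh"}]
    - name: step
      inputs:
        parameters:
          - name: cmd
      container:
        image: registry.example.com/etl:1.0  # illustrative image
        command: ["sh", "-c", "{{inputs.parameters.cmd}}"]
```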
Argo CronWorkflow
Scheduled orchestration. Recurring multi-step workflows with cron scheduling. Best for periodic data pipelines and processes with multiple coordinated stages.
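A scheduled workflow can be sketched as a CronWorkflow, which wraps a workflow spec in a cron schedule (names and the image are illustrative):

```yaml
apiVersion: argoproj.io/v1alpha1
kind: CronWorkflow
metadata:
  name: hourly-pipeline
spec:
  schedule: "0 * * * *"        # at the top of every hour
  concurrencyPolicy: Replace   # a new run supersedes one still in progress
  workflowSpec:
    entrypoint: main
    templates:
      - name: main
        container:
          image: registry.example.com/pipeline:1.0  # illustrative image
          command: ["./run.sh"]
```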
Kubernetes Jobs vs Argo Workflows
Choose Kubernetes Jobs when:
- You have a standalone task with no step dependencies
- A single container can do all the work
- You don’t need to pass data between multiple steps
- You want immediate execution on deployment (for Jobs) or simple cron scheduling (for CronJobs)
Choose Argo Workflows when:
- You need to coordinate multiple steps with dependencies
- You want to pass artifacts or data between steps
- You need conditional logic, branching, or loops
- You want to reuse a workflow template with different parameters
- You need better observability with step-by-step tracking and DAG visualization
- You want suspend/resume capabilities for manual approvals
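The template-reuse point above can be illustrated with a parameterized Argo WorkflowTemplate (the name, default value, and image are assumptions):

```yaml
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
metadata:
  name: report-template
spec:
  entrypoint: report
  arguments:
    parameters:
      - name: region
        value: us-east   # default; override per submission
  templates:
    - name: report
      container:
        image: registry.example.com/report:1.0  # illustrative image
        command: ["./report.sh", "{{workflow.parameters.region}}"]
```

A run could then be submitted with different parameters, e.g. `argo submit --from workflowtemplate/report-template -p region=eu-west`.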
Choose Your Path
Select the guide that matches your job type:

Kubernetes Jobs
Single-container execution. Complete guide to Kubernetes Job and CronJob: creating, executing, monitoring, and configuring standalone batch tasks.
Argo Workflows
Multi-step orchestration. Complete guide to Argo Workflow and CronWorkflow: template architecture, parameters, step dependencies, artifact passing, and advanced configuration.
Key Features
GitOps-Based Configuration
- All job definitions stored in Git with Kustomize structure
- Version control, code review, and audit trail for all changes
- Environment-specific overrides (schedules, resources, parameters)
Execution Control
- Configure retry strategies, execution limits, and TTL cleanup
- Set concurrency policies to prevent overlapping runs
- Customize resource allocation (CPU, memory, GPU) per environment
Unified Job Management
- Single UI for all job types (Kubernetes Jobs, CronJobs, Argo Workflows)
- Deploy templates, execute with parameters, monitor status
- Integrated logging and execution history tracking
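As an illustration of environment-specific overrides, a Kustomize overlay might patch a CronJob's schedule and memory limit per environment (paths and names are assumptions, not Skyhook's actual repository layout):

```yaml
# overlays/production/kustomization.yaml (illustrative path)
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
resources:
  - ../../base
patches:
  - target:
      kind: CronJob
      name: nightly-backup   # illustrative name
    patch: |-
      - op: replace
        path: /spec/schedule
        value: "0 1 * * *"
      - op: replace
        path: /spec/jobTemplate/spec/template/spec/containers/0/resources/limits/memory
        value: 2Gi
```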
Getting Started
To create your first job:
1. Navigate to Jobs. In the Skyhook UI, go to Jobs and click Create New Job.
2. Fill in basic details. Provide a job name and an optional description.
3. Select your job type. Choose based on your needs:
   - Kubernetes Job - Single-container one-time task
   - Kubernetes CronJob - Single-container scheduled task
   - Argo Workflow - Multi-step workflow with orchestration
   - Argo CronWorkflow - Scheduled multi-step workflow
4. Configure job parameters. Set execution settings, retry strategies, and resource limits.
5. Add environments. Configure environments (dev, staging, production) with cluster and namespace assignments.
6. Create. Click Create, and Skyhook generates the job files, including CI/CD configuration and Kubernetes manifests.
Next Steps
- Kubernetes Jobs - Complete guide to Kubernetes Job and CronJob
- Argo Workflows - Complete guide to Argo Workflow and CronWorkflow
