Jobs in Skyhook are batch processing workloads that run to completion, unlike Services which run continuously. Jobs are perfect for one-time operations, scheduled tasks, and complex multi-step workflows.

What are Jobs?

Jobs handle workloads that:
  • Execute once and terminate when complete
  • Run on a schedule (hourly, daily, weekly, etc.)
  • Process data in batches
  • Require orchestration of multiple steps
Common use cases:
  • ETL Pipelines - Extract, transform, and load data between systems
  • Data Processing - Batch processing of files, reports, or analytics
  • Database Operations - Migrations, backups, cleanup tasks
  • Scheduled Maintenance - Cache warming, data archival, log rotation
  • Workflow Orchestration - Complex multi-step processes with dependencies
  • Report Generation - Periodic business reports or exports

Job Types

Skyhook supports four job types to fit different execution patterns:

Kubernetes Job

One-time execution: Single-container tasks that run once and exit. Best for standalone operations like migrations, data imports, and one-off scripts.
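In plain Kubernetes terms, a minimal Job manifest for this pattern looks like the following sketch (the name, image, and command are placeholders, not values Skyhook generates):

```yaml
apiVersion: batch/v1
kind: Job
metadata:
  name: db-migrate
spec:
  backoffLimit: 3                # retry failed Pods up to three times
  ttlSecondsAfterFinished: 3600  # delete the finished Job after one hour
  template:
    spec:
      restartPolicy: Never       # Jobs allow only Never or OnFailure
      containers:
        - name: migrate
          image: ghcr.io/example/app:latest   # placeholder image
          command: ["./migrate"]              # placeholder command
```

The container runs to completion once; on failure, Kubernetes creates replacement Pods up to `backoffLimit` times before marking the Job failed.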

Kubernetes CronJob

Scheduled tasks: Recurring single-container jobs on a cron schedule with timezone support. Best for backups, cleanups, and regular maintenance.
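A CronJob wraps a Job template in a schedule; the `timeZone` field (stable since Kubernetes 1.27) pins the cron expression to a specific zone. Names and image below are placeholders:

```yaml
apiVersion: batch/v1
kind: CronJob
metadata:
  name: nightly-backup
spec:
  schedule: "0 2 * * *"        # every day at 02:00
  timeZone: "Etc/UTC"          # interpret the schedule in UTC
  concurrencyPolicy: Forbid    # skip a run if the previous one is still active
  jobTemplate:
    spec:
      template:
        spec:
          restartPolicy: Never
          containers:
            - name: backup
              image: ghcr.io/example/backup:latest  # placeholder image
```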

Argo Workflow

Multi-step orchestration: Workflows with multiple steps, dependencies, and data flow between steps. Best for ETL pipelines, ML training, and processes requiring step coordination.
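Under the hood, an Argo Workflow expresses its steps as a DAG. A trimmed-down sketch of an ETL-style workflow (template names, image, and parameters are illustrative):

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: etl-pipeline-
spec:
  entrypoint: main
  templates:
    - name: main
      dag:
        tasks:
          - name: extract
            template: stage
            arguments:
              parameters: [{name: step, value: extract}]
          - name: transform
            template: stage
            dependencies: [extract]      # runs only after extract succeeds
            arguments:
              parameters: [{name: step, value: transform}]
          - name: load
            template: stage
            dependencies: [transform]
            arguments:
              parameters: [{name: step, value: load}]
    - name: stage
      inputs:
        parameters:
          - name: step
      container:
        image: ghcr.io/example/etl:latest              # placeholder image
        command: [python, run.py, "{{inputs.parameters.step}}"]
```

The `dependencies` lists define the execution order, and parameters (or artifacts) carry data from one step to the next.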

Argo CronWorkflow

Scheduled orchestration: Recurring multi-step workflows with cron scheduling. Best for periodic data pipelines and processes with multiple coordinated stages.
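A CronWorkflow wraps a full Workflow spec in Argo's own scheduling fields (note the lowercase `timezone`, unlike the Kubernetes CronJob `timeZone`). This sketch omits the templates for brevity; the name and schedule are illustrative:

```yaml
apiVersion: argoproj.io/v1alpha1
kind: CronWorkflow
metadata:
  name: nightly-pipeline
spec:
  schedule: "0 3 * * *"          # every day at 03:00
  timezone: "America/New_York"
  concurrencyPolicy: Forbid      # do not start a run while one is active
  workflowSpec:
    entrypoint: main
    # templates: ... same structure as a standalone Workflow
```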

Kubernetes Jobs vs Argo Workflows

Choose Kubernetes Jobs when:
  • You have a standalone task with no step dependencies
  • A single container can do all the work
  • You don’t need to pass data between multiple steps
  • You want immediate execution on deployment (for Jobs) or simple cron scheduling (for CronJobs)
Choose Argo Workflows when:
  • You need to coordinate multiple steps with dependencies
  • You want to pass artifacts or data between steps
  • You need conditional logic, branching, or loops
  • You want to reuse a workflow template with different parameters
  • You need better observability with step-by-step tracking and DAG visualization
  • You want suspend/resume capabilities for manual approvals

Choose Your Path

Select the guide that matches your job type:

Kubernetes Jobs

Single-container execution: Complete guide to Kubernetes Job and CronJob - creating, executing, monitoring, and configuring standalone batch tasks.

Argo Workflows

Multi-step orchestration: Complete guide to Argo Workflow and CronWorkflow - template architecture, parameters, step dependencies, artifact passing, and advanced configuration.

Key Features

GitOps-Based Configuration
  • All job definitions stored in Git with Kustomize structure
  • Version control, code review, and audit trail for all changes
  • Environment-specific overrides (schedules, resources, parameters)
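As a sketch of how an environment-specific override can look in a Kustomize layout (the paths and names here are illustrative, not Skyhook's actual repository structure):

```yaml
# overlays/prod/kustomization.yaml
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
resources:
  - ../../base                  # shared job definition
patches:
  - path: schedule-patch.yaml   # e.g. a different cron schedule for prod
    target:
      kind: CronJob
      name: nightly-backup      # placeholder job name
```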
Flexible Execution Control
  • Configure retry strategies, execution limits, and TTL cleanup
  • Set concurrency policies to prevent overlapping runs
  • Customize resource allocation (CPU, memory, GPU) per environment
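For Kubernetes CronJobs, these controls map onto standard API fields. An illustrative fragment (container name and values are placeholders):

```yaml
spec:
  concurrencyPolicy: Forbid           # prevent overlapping runs
  jobTemplate:
    spec:
      backoffLimit: 2                 # retry a failed run up to twice
      activeDeadlineSeconds: 1800     # fail the run after 30 minutes
      ttlSecondsAfterFinished: 86400  # clean up finished Jobs after a day
      template:
        spec:
          containers:
            - name: worker            # placeholder name
              resources:
                requests: {cpu: 500m, memory: 512Mi}
                limits: {cpu: "1", memory: 1Gi}
```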
Unified Management
  • Single UI for all job types (Kubernetes Jobs, CronJobs, Argo Workflows)
  • Deploy templates, execute with parameters, monitor status
  • Integrated logging and execution history tracking

Getting Started

Jobs use the same create/import wizard as services, with one extra step where you pick the job type. See Create and Import Services for the full wizard walkthrough — everything there applies to jobs too.
1. Open the job wizard

From the Home dashboard or the Jobs list, click New Job.
2. Job Info

Pick an existing repository (import an existing codebase) or create a new one from a template. Skyhook auto-detects the Dockerfile, container port, and framework just like the service wizard.
3. Build Config

Confirm the Dockerfile path, container registry, and deployment repository (same repo as job, or a separate GitOps repo).
4. Job Type

Pick one of:
  • Kubernetes Job — single-container one-time task
  • Kubernetes CronJob — single-container scheduled task, with cron expression and timezone
  • Argo Workflow — multi-step workflow with orchestration
  • Argo CronWorkflow — scheduled multi-step workflow
5. Environments

Pick which environments the job should be deployed to. Multi-step workflows can have different cluster assignments per environment.
6. Review and create

Skyhook opens the required pull requests and walks you through merging them, the same way as for services.

Managing existing jobs

Once a job exists, you can manage it from the Jobs list:
  • View recent runs — the Jobs list shows the latest execution status for each job
  • Trigger manually — kick off a run outside the cron schedule from the job detail page
  • Edit configuration — use the job’s settings tabs for cron schedule, resource limits, environment variables, and secrets
  • Delete — remove a job from Skyhook; you’ll be asked to confirm before the deletion pull request is opened

Notifications

Job execution notifications are configured per organization in Settings → Notifications. You can send notifications to Slack, email, or webhooks on events like:
  • Job failure
  • Job success (if you want to know about every run)
  • Schedule drift (a scheduled run was skipped or delayed)
  • Long-running jobs that exceed expected duration
Notifications apply across all jobs in the org; per-job overrides aren’t supported today.

Deleting a job

Click Delete on the job’s settings page. Skyhook opens a pull request that removes the job’s manifests from your GitOps repository. After merge, ArgoCD prunes the Job / CronJob / Workflow resources from your clusters. Deletion does not remove historical execution data or logs retained elsewhere.

Next Steps