Automate workflow execution with triggers.

Trigger Types

| Type | Description | Use Case |
|------|-------------|----------|
| cron | Time-based scheduling | Daily/weekly scans |
| event | Event-driven execution | React to discoveries |
| watch | File change detection | Process new data |
| manual | On-demand only | Placeholder trigger |

Cron Triggers

Execute workflows on a schedule using cron expressions.

Workflow Definition

kind: module
name: scheduled-scan

triggers:
  - name: daily-scan
    on: cron
    schedule: "0 2 * * *"    # Daily at 2 AM
    enabled: true

params:
  - name: target
    required: true

steps:
  - name: scan
    type: bash
    command: nuclei -u {{target}}

Cron Expression Format

┌───────────── minute (0 - 59)
│ ┌───────────── hour (0 - 23)
│ │ ┌───────────── day of month (1 - 31)
│ │ │ ┌───────────── month (1 - 12)
│ │ │ │ ┌───────────── day of week (0 - 6) (Sunday = 0)
│ │ │ │ │
* * * * *

Common Schedules

| Expression | Description |
|------------|-------------|
| 0 * * * * | Every hour |
| 0 2 * * * | Daily at 2 AM |
| 0 0 * * 0 | Weekly on Sunday |
| 0 0 1 * * | Monthly on the 1st |
| */15 * * * * | Every 15 minutes |
| 0 9-17 * * 1-5 | Hourly, 9 AM-5 PM, weekdays |
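To illustrate how the five fields are evaluated, here is a minimal Python sketch of cron matching. It is an assumption-laden illustration only: it supports `*`, steps (`*/n`), ranges (`a-b`), comma lists, and single values, and omits shortcuts like `@daily`; the actual scheduler's implementation may differ.

```python
def field_matches(expr: str, value: int) -> bool:
    """Check one cron field ('*', '*/n', 'a-b', 'a,b,c', or 'n') against a value."""
    for part in expr.split(","):
        if part == "*":
            return True
        if part.startswith("*/"):            # step values, e.g. */15
            if value % int(part[2:]) == 0:
                return True
        elif "-" in part:                    # ranges, e.g. 9-17
            lo, hi = map(int, part.split("-"))
            if lo <= value <= hi:
                return True
        elif int(part) == value:             # single value, e.g. 0
            return True
    return False

def cron_matches(expression: str, minute: int, hour: int,
                 dom: int, month: int, dow: int) -> bool:
    """True if the given time components satisfy a five-field cron expression."""
    fields = expression.split()
    return all(field_matches(f, v)
               for f, v in zip(fields, (minute, hour, dom, month, dow)))
```

For example, `cron_matches("0 9-17 * * 1-5", 0, 10, 3, 4, 2)` is true (10 AM on a Tuesday), while the same expression rejects 8 PM.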

Event Triggers

Execute workflows in response to events.

Workflow Definition

kind: module
name: on-asset-discovered

triggers:
  - name: new-asset
    on: event
    event:
      topic: "assets.new"
      filters:
        - "event.source == 'subfinder'"
        - "event.data.type == 'subdomain'"
    input:
      type: event_data
      field: "url"
      name: target
    enabled: true

steps:
  - name: probe
    type: bash
    command: httpx -u {{target}}

Event Topics

Topics support glob patterns for flexible matching. Each trigger specifies a single topic; for example:
event:
  topic: "assets.*"         # Matches assets.new, assets.updated
Other patterns: "*.new" matches assets.new and vulns.new, and "*" matches all topics.
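Under common glob semantics, this matching can be sketched with Python's stdlib fnmatch. This is an illustration under the assumption that `*` may cross `.` separators; the engine's exact glob rules may differ.

```python
from fnmatch import fnmatch

def topic_matches(pattern: str, topic: str) -> bool:
    """Glob-style topic match: '*' matches any run of characters."""
    return fnmatch(topic, pattern)
```

For example, `topic_matches("assets.*", "assets.new")` is true, while `topic_matches("assets.*", "vulns.new")` is false.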

| Topic | Description |
|-------|-------------|
| run.started | Workflow run started |
| run.completed | Workflow run completed |
| run.failed | Workflow run failed |
| step.completed | Step completed |
| assets.new | New asset discovered |
| vulnerabilities.new | New vulnerability found |

Event Filters

Filters are JavaScript expressions evaluated against the incoming event:
filters:
  - "event.source == 'nuclei'"
  - "event.data.severity == 'critical'"
  - "event.workspace == 'example.com'"
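To make the semantics of these equality filters concrete, here is a hedged Python sketch that resolves a dotted path into the event and compares it to a quoted literal. The real engine evaluates full JavaScript expressions; this sketch only handles the simple `path == 'literal'` form shown above.

```python
def eval_filter(expr: str, event: dict) -> bool:
    """Evaluate a filter like "event.data.severity == 'critical'" against an event dict."""
    path, literal = (s.strip() for s in expr.split("=="))
    value = event
    for key in path.split(".")[1:]:      # skip the leading 'event' root
        value = value[key]
    return value == literal.strip("'\"")  # compare against the unquoted literal
```

For example, with `event = {"source": "nuclei", "data": {"severity": "critical"}}`, the filter `"event.data.severity == 'critical'"` evaluates to true.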
For advanced filtering with utility functions, use filter_functions:
event:
  topic: "assets.new"
  filters:
    - "event.source == 'httpx'"
  filter_functions:
    - "contains(event.data.url, '/api/')"
    - "!ends_with(event.data.url, '.js')"
See Event-Driven Triggers for available filter functions.

Event Input Mapping

Two syntaxes are supported.

New exports-style syntax (recommended):
input:
  target: event_data.url
  source: event.source
  description: trim(event_data.desc)
Legacy syntax:
input:
  type: event_data
  field: "target"      # Field in event data
  name: target         # Workflow parameter name
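The exports-style mapping above resolves each workflow parameter from a dotted path into the event. The following Python sketch shows that resolution under the assumption that `event_data` is shorthand for the event's data payload; function calls such as trim(...) are not handled here.

```python
def map_event_input(event: dict, mapping: dict) -> dict:
    """Resolve exports-style input mappings (param name -> dotted event path)."""
    roots = {"event": event, "event_data": event.get("data", {})}
    params = {}
    for name, path in mapping.items():
        root, *keys = path.split(".")
        value = roots[root]               # pick the path's root object
        for key in keys:                  # walk the remaining dotted segments
            value = value[key]
        params[name] = value
    return params
```

For example, mapping `{"target": "event_data.url", "source": "event.source"}` against a subfinder discovery event yields the `target` and `source` workflow parameters.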

Event Template Variables

Event-triggered workflows have access to these template variables:
| Variable | Description |
|----------|-------------|
| {{EventEnvelope}} | Full JSON event envelope |
| {{EventTopic}} | Event topic |
| {{EventSource}} | Event source |
| {{EventDataType}} | Data type |
| {{EventTimestamp}} | Event timestamp |

Watch Triggers

Execute workflows when files change. Watch triggers use fsnotify for near-instant file system notifications (inotify on Linux).

Workflow Definition

kind: module
name: process-new-targets

triggers:
  - name: new-targets
    on: watch
    path: "/data/targets/"
    debounce: "500ms"    # Optional: debounce rapid changes
    input:
      type: file_path
      name: target_file
    enabled: true

steps:
  - name: process
    type: foreach
    input: "{{target_file}}"
    variable: target
    step:
      type: bash
      command: scan {{target}}

Watch Configuration

| Field | Type | Description |
|-------|------|-------------|
| path | string | Directory or file path to watch |
| debounce | string | Optional debounce duration (e.g., "500ms", "1s", "2s") |

Debounce

When files are modified rapidly (e.g., during a write operation), multiple events may fire. Use debounce to consolidate rapid changes into a single trigger:
triggers:
  - name: watch-logs
    on: watch
    path: "/var/log/scan-results/"
    debounce: "1s"    # Wait 1 second after last change before triggering
    enabled: true
The debounce timer resets on each new file event. The workflow only triggers after the specified duration has passed without new events.

Watch Events

| Event | Description |
|-------|-------------|
| create | New file created |
| modify | File modified |
| delete | File deleted |
| rename | File renamed |

API Management

List Schedules

curl http://localhost:8002/osm/api/schedules \
  -H "Authorization: Bearer $TOKEN"

Create Schedule

curl -X POST http://localhost:8002/osm/api/schedules \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "daily-scan",
    "workflow_name": "full-recon",
    "trigger_type": "cron",
    "schedule": "0 2 * * *",
    "input_config": {
      "target": "example.com"
    },
    "is_enabled": true
  }'

Enable/Disable

# Enable
curl -X POST http://localhost:8002/osm/api/schedules/sched-123/enable \
  -H "Authorization: Bearer $TOKEN"

# Disable
curl -X POST http://localhost:8002/osm/api/schedules/sched-123/disable \
  -H "Authorization: Bearer $TOKEN"

Trigger Manually

curl -X POST http://localhost:8002/osm/api/schedules/sched-123/trigger \
  -H "Authorization: Bearer $TOKEN"

Delete Schedule

curl -X DELETE http://localhost:8002/osm/api/schedules/sched-123 \
  -H "Authorization: Bearer $TOKEN"

Combined Triggers

A workflow can have multiple triggers:
kind: module
name: flexible-scan

triggers:
  # Run daily
  - name: daily
    on: cron
    schedule: "0 2 * * *"
    enabled: true

  # Run on new assets
  - name: on-asset
    on: event
    event:
      topic: "assets.new"
    input:
      type: event_data
      field: "url"
      name: target
    enabled: true

  # Manual trigger placeholder
  - name: manual
    on: manual
    enabled: true

params:
  - name: target
    required: true

steps:
  - name: scan
    type: bash
    command: scan {{target}}

Event Emission

Emit events from workflows:
- name: scan
  type: bash
  command: nuclei -u {{target}} -o {{Output}}/vulns.json
  on_success:
    - action: emit_event
      topic: "scan.completed"
      data:
        target: "{{target}}"
        results: "{{Output}}/vulns.json"

Database Storage

Schedules are stored in the database:
-- schedules table
id, name, workflow_name, workflow_path, trigger_type,
schedule, event_topic, watch_path, input_config,
is_enabled, last_run, next_run, run_count,
created_at, updated_at
Query schedules:
osmedeus db query "SELECT * FROM schedules WHERE is_enabled = true"

Best Practices

  1. Use descriptive trigger names
  2. Start with disabled triggers during testing
  3. Set reasonable intervals to avoid overload
  4. Use event filters to reduce noise
  5. Monitor trigger execution via event logs
  6. Combine with distributed mode for scale

Troubleshooting

Schedule not running

# Check if enabled
curl http://localhost:8002/osm/api/schedules/sched-123

# Check next_run time
# Ensure server is running

Events not triggering

# Check event logs
curl "http://localhost:8002/osm/api/event-logs?topic=assets.new"

# Verify filter expressions

Watch not detecting changes

# Verify path exists
# Check file permissions
# Ensure pattern matches
