Appendix

Implementation Status

The workflow engine schema defines the full vision for the system. Some features are fully implemented, while others are schema-defined but not yet enforced by the execution engine. This matrix gives you an honest view of what works today.

| Feature | Status | Notes |
| --- | --- | --- |
| functionBlock steps | Implemented | Full support |
| workflow (embedded) steps | Implemented | Full support |
| workflowReference steps | Not implemented | Schema defined; throws runtime error |
| Conditions (skip) | Implemented | JMESPath-based |
| Assertions (fail) | Implemented | JMESPath-based |
| continueOnError | Not enforced | Schema defined; ignored by engine |
| retryConfig (step-level) | Not implemented | Schema defined |
| repeatConfig | Not implemented | Schema defined |
| Auto-retry (pure/idempotent) | Not implemented | isIdempotentExecution tracked but unused |
| Rollback | Framework placeholder | Handler exists; no execution logic |
| Abort | Limited | Endpoint exists; may not cleanly stop all scenarios |
| Scheduling (cron) | Implemented | Full support |
| Entity locking | Implemented | Exclusive locks via CMS |
| Acquire (all types) | Implemented | elastic, expansion, context, reference |
| Failure classification | Partial | Pure → FAILED_SAFE, everything else → FAILED_UNSAFE |
| Worker registration | Implemented | Full support with schema compatibility |
| Worker heartbeat/lifecycle | Implemented | Full support with cascade thresholds |
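To show where the schema-defined options above might appear in practice, here is a hypothetical workflow step. The exact field names, nesting, and condition semantics are illustrative assumptions, not the authoritative schema:

```yaml
steps:
  - name: read-version
    functionBlock: read_version      # functionBlock steps: implemented
    condition: "device != null"      # skip condition (JMESPath): implemented
    continueOnError: true            # schema-defined, currently ignored by the engine
    retryConfig:                     # schema-defined, not yet implemented
      maxAttempts: 3
```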

API Reference

  • API Overview -- Endpoint groups, common curl examples, and pointers to Swagger UI
  • Swagger UI -- Interactive, auto-generated API documentation (requires running engine)

Schema Reference

Glossary

Acquire : The phase where the engine gathers entity data from the CMS and makes it available to function blocks. Can be triggered at workflow level (acquire clause) or function block level (acquire() method). → Acquire

Blackboard : The shared job queue within the workflow engine. The engine writes jobs to it; workers poll and push results back. Implemented as PostgreSQL tables with REST API access. → Blackboard
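The blackboard pattern can be sketched with a toy in-memory queue. The real blackboard is PostgreSQL tables behind a REST API; the job shape and function names here are illustrative only:

```python
from collections import deque

# Toy stand-in for the blackboard job queue (illustrative, not the engine's schema).
blackboard = deque()

def submit(job_type, entity_id):
    """Engine side: write a job to the queue."""
    job = {"type": job_type, "entity": entity_id, "status": "PENDING"}
    blackboard.append(job)
    return job

def poll():
    """Worker side: take the next pending job, if any."""
    return blackboard.popleft() if blackboard else None

def push_result(job, result):
    """Worker side: report the outcome back."""
    job.update(status="DONE", result=result)

submit("EXECUTE", "device-1")
job = poll()
push_result(job, {"ok": True})
print(job["status"])  # DONE
```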

CMS (Content Management System) : The neops-core component that manages network entity data: devices, interfaces, device groups, facts, checks, credentials, and user permissions.

Context : The runtime data available during workflow execution. Contains entity data (devices, interfaces, groups), workflow parameters, and results from previous steps. JMESPath expressions evaluate against this context. → Context

Entity : A managed network object: device, interface, device group, or client. Entities have facts (structured data), checks (compliance state), and are stored in the CMS.

Facts : Versioned key-value data associated with entities. Function blocks can read and write facts (e.g., device.facts.version, interface.facts.counters).

Function Block : A versioned, reusable execution unit that encapsulates a single automation task. Written in Python using the Worker SDK. Registered with the engine by workers. → Function Blocks

Idempotent : A function block whose execution with the same inputs always produces the same result. Safe to retry on failure. The engine tracks isIdempotentExecution but does not yet use it for automatic retry decisions (planned). → Types & Safety
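The distinction is easiest to see with two small functions. Setting a value to a fixed target is idempotent; incrementing is not, because each retry compounds the effect:

```python
# Idempotent: same inputs always yield the same result -- safe to retry.
def set_vlan(config: dict, interface: str, vlan: int) -> dict:
    new = dict(config)
    new[interface] = vlan
    return new

# Not idempotent: each retry changes the outcome further.
def bump_counter(config: dict, interface: str) -> dict:
    new = dict(config)
    new[interface] = new.get(interface, 0) + 1
    return new

once = set_vlan({}, "eth0", 10)
twice = set_vlan(once, "eth0", 10)
print(once == twice)  # True: retrying is harmless
```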

JMESPath : A query language for JSON used in workflow parameter interpolation ({{ expression }}), conditions, and assertions. See jmespath.org. → Parameters
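A simplified stand-in for the {{ expression }} interpolation described above. It resolves dotted paths only, whereas the real engine evaluates full JMESPath; the function and context names are illustrative:

```python
import re
from functools import reduce

def interpolate(template: str, context: dict) -> str:
    """Replace each {{ path }} with the value at that dotted path in context."""
    def resolve(match):
        path = match.group(1).strip()
        value = reduce(lambda obj, key: obj[key], path.split("."), context)
        return str(value)
    return re.sub(r"\{\{\s*(.*?)\s*\}\}", resolve, template)

context = {"device": {"name": "sw-01", "facts": {"version": "17.3.4"}}}
print(interpolate("Upgrading {{ device.name }} from {{ device.facts.version }}", context))
# Upgrading sw-01 from 17.3.4
```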

Job : A discrete unit of work on the blackboard. Each job represents one function block execution for one entity. Jobs have types: ACQUIRE, EXECUTE, ROLLBACK. → Blackboard

Pure : A function block with no side effects (read-only). If a workflow only executes pure steps before failing, the failure is classified as FAILED_SAFE. → Types & Safety
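The classification rule stated above can be sketched directly: a failure after only pure (read-only) steps is FAILED_SAFE, while any side-effecting step before the failure makes it FAILED_UNSAFE. The step representation here is illustrative:

```python
def classify_failure(executed_steps: list) -> str:
    """Classify a failed execution by whether all executed steps were pure."""
    if all(step["pure"] for step in executed_steps):
        return "FAILED_SAFE"
    return "FAILED_UNSAFE"

print(classify_failure([{"name": "read_version", "pure": True}]))  # FAILED_SAFE
print(classify_failure([{"name": "read_version", "pure": True},
                        {"name": "push_config", "pure": False}]))  # FAILED_UNSAFE
```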

Seed Entity : The entity type a workflow is designed to operate on. When executing, you provide IDs or queries for entities of this type. The engine creates one execution path per seed entity. → Workflow Definition
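The one-path-per-seed-entity fan-out can be sketched minimally (names and path shape are illustrative, not the engine's internals):

```python
def fan_out(workflow_name: str, seed_entities: list) -> list:
    """Create one execution path per seed entity, each starting at NEW."""
    return [{"workflow": workflow_name, "seed": entity, "state": "NEW"}
            for entity in seed_entities]

paths = fan_out("upgrade-os", ["sw-01", "sw-02", "sw-03"])
print(len(paths))        # 3
print(paths[0]["seed"])  # sw-01
```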

Worker : A running instance of the Worker SDK that registers function blocks with the engine, polls the blackboard for jobs, executes them, and pushes results back. → Worker Management

Workflow : A versioned, declarative YAML definition of an automation pipeline consisting of steps (function blocks, embedded workflows, workflow references). → Workflows

Workflow Execution : A specific run of a workflow definition with concrete parameters and entity scope. Transitions through a state machine from NEW to COMPLETED (or FAILED_SAFE/FAILED_UNSAFE). → Lifecycle
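A minimal sketch of such a state machine. Only NEW, COMPLETED, FAILED_SAFE, and FAILED_UNSAFE are named in this appendix; the intermediate RUNNING state here is an illustrative assumption:

```python
# Allowed transitions (RUNNING is assumed for illustration).
TRANSITIONS = {
    "NEW": {"RUNNING"},
    "RUNNING": {"COMPLETED", "FAILED_SAFE", "FAILED_UNSAFE"},
}
TERMINAL = {"COMPLETED", "FAILED_SAFE", "FAILED_UNSAFE"}

def transition(state: str, new_state: str) -> str:
    if new_state not in TRANSITIONS.get(state, set()):
        raise ValueError(f"illegal transition {state} -> {new_state}")
    return new_state

s = transition("NEW", "RUNNING")
s = transition(s, "COMPLETED")
print(s)  # COMPLETED
```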

FAQ

Can I run multiple engines? : The engine is designed for single-instance deployment. Multiple instances would create conflicting state management. Scale horizontally by adding workers, not engine instances.

What happens if the engine restarts during execution? : Execution state is persisted in PostgreSQL. On restart, the engine recovers active executions and continues processing. Jobs that were in-flight may need to be re-polled by workers.

How do I cancel a running execution? : Send DELETE /workflow-execution/<execution-id>. The engine will abort the execution and release entity locks.

Why is my job not being picked up? : Check that a worker is online (GET /workers/online) and has the required function block registered. Verify the function block version in the workflow matches a registered version.