# Remote Lab Testing
The remote lab is a FastAPI service that provisions Netlab network topologies on demand using Containerlab as the orchestration provider. It lets you test function blocks against real virtual devices (FRR, Cisco IOL, Nokia SR Linux, etc.) without managing lab infrastructure locally. The canonical reference for the lab itself — architecture, queue semantics, deployment paths — lives in the neops-remote-lab project docs; this page is the SDK-consumer view.
## How It Works
```mermaid
sequenceDiagram
    participant Test as pytest
    participant Client as Lab Client
    participant Server as Remote Lab Server
    participant Netlab as Netlab / Containerlab
    Test->>Client: remote_lab_fixture("topology.yml")
    Client->>Server: POST /session
    Server-->>Client: session_id
    Client->>Server: POST /lab (topology file)
    Server->>Netlab: Deploy topology
    Netlab-->>Server: Devices ready
    Server-->>Client: Device list (IPs, credentials)
    Client-->>Test: DeviceInfoDto objects
    Test->>Test: Run function block
    Test->>Client: Teardown
    Client->>Server: POST /lab/release
    Client->>Server: DELETE /session
```
## Setup
1. Install test dependencies
2. Set the remote lab URL
Or add it to `.env` — the test suite auto-loads it via `python-dotenv`.
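For example (the URL below is a placeholder — substitute the address of your own Remote Lab Manager deployment):

```shell
# Point the test suite at a running Remote Lab Manager.
# http://localhost:8000 is a placeholder, not a guaranteed default.
export REMOTE_LAB_URL="http://localhost:8000"

# Or persist it in .env, which the test suite auto-loads via python-dotenv:
echo 'REMOTE_LAB_URL=http://localhost:8000' >> .env
```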
When `REMOTE_LAB_URL` is unset, fixtures fall back to local execution (requires Netlab and Containerlab installed on the host).
The full set of timeout overrides (`REMOTE_LAB_REQUEST_TIMEOUT`, `REMOTE_LAB_SESSION_TIMEOUT`, `REMOTE_LAB_ACQUISITION_TIMEOUT`) lives in the Remote Lab client config reference.
**Don’t have a Remote Lab Manager to point at?** Two entry points on the Remote Lab side:

- **Run locally** — install the lab service on `localhost:8000` for offline dev (~10 min).
- **Pick a deployment** — decision tree for shared VM / multi-tenant lab / CI runner pool.
## Creating Lab Fixtures
### 1. Write a topology file
Topology files use the Netlab topology format and are stored in `tests/topologies/`. Here is a minimal two-router lab using FRRouting containers:
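A sketch of such a file (saved as `tests/topologies/simple_frr.yml`; node and link names are illustrative — consult the Netlab topology reference for the full schema):

```yaml
# tests/topologies/simple_frr.yml -- minimal two-router FRR lab
provider: clab        # Containerlab as the orchestration provider

defaults:
  device: frr         # all nodes default to FRRouting containers

nodes:
  - r1
  - r2

links:
  - r1-r2             # single point-to-point link between the routers
```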
Change `defaults.device` to `iol` for Cisco IOL or `srlinux` for Nokia SR Linux. Per-vendor install walkthroughs (image sources, EULA, the IOL build path) live on the Remote Lab side at Vendor setup. For Netlab’s full kind catalogue, see Netlab platforms and the topology reference.
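For instance, switching a topology to Cisco IOL is a one-line change in its `defaults` block (sketch only; image installation follows the vendor guide):

```yaml
defaults:
  device: iol   # was: frr; use srlinux for Nokia SR Linux
```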
### 2. Register fixtures in `conftest.py`
```python
from neops_remote_lab.testing.fixture import remote_lab_fixture

simple_frr = remote_lab_fixture("tests/topologies/simple_frr.yml")

simple_iol = remote_lab_fixture(
    "tests/topologies/simple_iol.yml",
    reuse_lab=True,
)
```
| Parameter | Purpose |
|---|---|
| Path | YAML topology file in Netlab format |
| `reuse_lab=True` | Share the same topology instance across multiple tests (faster) |
Fixtures yield `DeviceInfoDto` objects representing lab nodes, including their management IPs and credentials assigned by Containerlab.
**One lab per test; reuse keys on file content**

Only one `remote_lab_fixture` can be active per test — the ordering plugin rejects extras at collection time and groups tests by fixture so the lab is provisioned once per group. `reuse_lab=True` keys on the topology’s SHA-256, not its filename, so byte-identical files share the same running lab. Full reference-counting and teardown semantics in Lab lifecycle.
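The content-keyed reuse can be illustrated with a small sketch (`lab_reuse_key` is a hypothetical helper, not the SDK's internal name; the real implementation lives in the lab client):

```python
import hashlib


def lab_reuse_key(topology_bytes: bytes) -> str:
    """Hypothetical reuse key: hex SHA-256 of the topology file's bytes."""
    return hashlib.sha256(topology_bytes).hexdigest()


a = lab_reuse_key(b"defaults:\n  device: frr\n")
b = lab_reuse_key(b"defaults:\n  device: frr\n")  # byte-identical copy
c = lab_reuse_key(b"defaults:\n  device: iol\n")  # content differs

assert a == b  # identical content -> both tests share one running lab
assert a != c  # any content change -> a separate lab is provisioned
```

Renaming a topology file therefore never forces a redeploy; editing a single byte does.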
## `@fb_test_case_with_lab`
The decorator generates a test that provisions a lab topology, converts lab devices to neops `DeviceTypeDto` objects, builds a `WorkflowContext`, and executes your function block.
### Signature
```python
def fb_test_case_with_lab(
    test_description: str,
    params: P,
    run_on: str = "device",
    base_devices: list[DeviceTypeDto] | None = None,
    base_device_groups: list[DeviceGroupTypeDto] | None = None,
    *,
    remote_lab_fixture: str | None = None,
    succeeds: bool = True,
    expected_result_data: R | None = None,
    assertions: list[Callable[[FunctionBlockResult[R]], bool]] | None = None,
) -> Callable[[type[FunctionBlock[P, R]]], type[FunctionBlock[P, R]]]: ...
```
| Parameter | Purpose |
|---|---|
| `test_description` | Readable name for the generated test |
| `params` | Parameter instance for the function block |
| `run_on` | Entity type: `"device"` or `"group"` |
| `base_devices` | Static devices always present in the context |
| `base_device_groups` | Static groups always present in the context |
| `remote_lab_fixture` | Name of a `remote_lab_fixture` fixture from `conftest.py` |
| `succeeds` | Expected `result.success` value |
| `expected_result_data` | Strict equality check on `result.data` |
| `assertions` | Additional callables; must return `True` |
**Important:** At least one of `expected_result_data` or `assertions` must be provided.
### Example: testing against a lab
```python
@fb_test_case_with_lab(
    "Show version on FRR device",
    ShowCmdParameters(cmd="show version", password="admin"),
    remote_lab_fixture="simple_frr",
    assertions=[
        lambda r: r.data is not None,
        lambda r: r.data.output is not None,
        lambda r: r.success is True,
    ],
)
class GetShowCmdBlock(FunctionBlock[ShowCmdParameters, ShowCmdResult]):
    ...
```
### Example: testing with a Cisco IOL topology
```python
@fb_test_case_with_lab(
    "Show version with IOL fixture",
    ShowCmdParameters(cmd="show version", password="admin"),
    remote_lab_fixture="simple_iol",
    assertions=[
        lambda r: r.success is True,
        lambda r: "Cisco" in r.data.output,
    ],
)
class GetShowCmdBlock(FunctionBlock[ShowCmdParameters, ShowCmdResult]):
    ...
```
## Session Lifecycle
The remote lab uses an exclusive session model:

1. **Create session** — your test joins a FIFO queue.
2. **Wait for promotion** — when it’s your turn, the session becomes active.
3. **Acquire lab** — the topology is uploaded and deployed (or reused).
4. **Run tests** — devices are reachable via their lab IPs.
5. **Release lab** — decrements the reference count; the topology is torn down when the count reaches zero.
6. **End session** — the session is deleted; the next in queue is promoted.
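A toy model of these semantics may help fix the intuition (illustration only — this is not the server's implementation, and all names here are invented):

```python
from collections import deque


class SessionQueueModel:
    """Toy model: one active session, FIFO waiters, refcounted lab."""

    def __init__(self) -> None:
        self.waiting: deque = deque()   # sessions queued FIFO
        self.active = None              # at most one active session
        self.lab_refcount = 0           # acquisitions of the shared lab
        self.lab_deployed = False

    def create_session(self, sid: str) -> None:
        self.waiting.append(sid)
        self._promote()

    def _promote(self) -> None:
        if self.active is None and self.waiting:
            self.active = self.waiting.popleft()  # next in queue goes active

    def acquire_lab(self, sid: str) -> None:
        if sid != self.active:
            raise PermissionError("423 Locked: session is not active")
        self.lab_deployed = True        # deploy on first acquire, reuse after
        self.lab_refcount += 1

    def release_lab(self, sid: str) -> None:
        if sid != self.active:
            raise PermissionError("423 Locked: session is not active")
        self.lab_refcount -= 1
        if self.lab_refcount == 0:
            self.lab_deployed = False   # torn down when count reaches zero

    def end_session(self, sid: str) -> None:
        if sid == self.active:
            self.active = None
            self._promote()             # promote the next waiting session


mgr = SessionQueueModel()
mgr.create_session("s1")
mgr.create_session("s2")          # s2 waits in the FIFO queue
assert mgr.active == "s1"
mgr.acquire_lab("s1")
mgr.release_lab("s1")
assert not mgr.lab_deployed       # refcount hit zero -> teardown
mgr.end_session("s1")
assert mgr.active == "s2"         # s2 promoted
```

The fixture drives this whole flow for you; the model only shows why two tests never hold the server concurrently and why a reused lab survives until its last holder releases it.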
Timeouts:
- Waiting sessions are dropped after 600 seconds without a heartbeat.
- Active sessions are dropped after 300 seconds of silence.
The fixture handles all of this for you. Deeper view — state machine, heartbeat cadence, 423-Locked semantics — on Session queue.
## Running Lab Tests Selectively
Remote lab tests are marked with `remote_lab` and excluded from the default pytest run. To run them explicitly:
```shell
uv run pytest -m "remote_lab"                    # only remote lab tests
uv run pytest -m "function_block or remote_lab"  # all integration tests
make test-function-blocks                        # same via Make
```
Remote lab tests also carry the `function_block` marker, so `-m "function_block"` includes them as well.
## When to Use Remote Lab vs Mocks
| Scenario | Recommended approach |
|---|---|
| Rapid iteration on logic | Standalone tests with mocks |
| Verifying parameter/result schemas | `@fb_test_case` with mock context |
| Validating device interaction | `@fb_test_case_with_lab` |
| CI/CD pipeline acceptance tests | `@fb_test_case_with_lab` |
| Testing error handling for unreachable devices | Mocks (faster, deterministic) |
## See also
- Plug into Worker SDK — the Remote Lab side’s two-step quickstart for SDK consumers.
- With Worker SDK — multi-vendor patterns and the worker-test-layout notes the SDK consumes.
- Pytest fixtures — the full `remote_lab_fixture` API the SDK imports as a stable contract.
- Cookbook — runnable end-to-end recipes (FRR, SR Linux, multi-vendor, cURL flows, smoke scripts).
- Architecture — the three cooperating components (server, `LabManager`, client) and the runtime walk-through.