# Testing Function Blocks

Three approaches, from simplest to most integrated.
## Level 1: Standalone Tests
Instantiate the function block, build a context, and call `execute_function_block()`. This gives you full control and works with standard pytest debugging.
```python
import pytest
from neops_workflow_engine_client import DeviceTypeDto
from neops_worker_sdk.testing.factories.context_factory import create_workflow_context

# Replace with the import path to your function block module
from your_package.echo import Echo, EchoParameters


@pytest.mark.asyncio
async def test_echo_returns_input():
    params = EchoParameters(text="hello")
    device = DeviceTypeDto(id=1, hostname="test-device", ip="10.0.0.1")
    context = create_workflow_context(
        run_on="device", entity_id=1, devices=[device]
    )

    block = Echo()
    result = await block.execute_function_block(
        params=params, context=context, propagate_exceptions=True
    )

    assert result.success is True
    assert result.data.output == "hello"
```
### Testing `acquire()` independently
```python
@pytest.mark.asyncio
async def test_acquire_succeeds():
    block = Echo()
    result = await block.acquire(EchoParameters(text="hello"))
    assert result.success is True
```
### Testing parameter validation
Pydantic validates parameters at construction time:
```python
import pytest
from pydantic import ValidationError


def test_missing_required_param():
    with pytest.raises(ValidationError):
        EchoParameters()  # 'text' is required
```
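Beyond the bare `pytest.raises`, you can also assert *which* field failed. A minimal self-contained sketch — the `EchoParameters` stub below is hypothetical, standing in for your real parameter model:

```python
import pytest
from pydantic import BaseModel, ValidationError


# Hypothetical stand-in for EchoParameters so the example is self-contained.
class EchoParameters(BaseModel):
    text: str


def test_missing_required_param_reports_field():
    with pytest.raises(ValidationError) as exc_info:
        EchoParameters()
    # The first reported error should point at the missing 'text' field.
    assert exc_info.value.errors()[0]["loc"] == ("text",)
```

Checking the error location guards against the test passing for the wrong reason (e.g. a typo elsewhere also raising `ValidationError`).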
## Level 2: The `@fb_test_case` Decorator
The `@fb_test_case` decorator generates a pytest test function that runs the function block through the full execution lifecycle, including blocking detection. Generated tests are auto-discovered by pytest.
### Signature
```python
def fb_test_case(
    test_description: str,
    params: P,
    context: WorkflowContext,
    *,
    succeeds: bool = True,
    expected_result_data: R | None = None,
    assertions: list[Callable[[FunctionBlockResult[R]], bool]] | None = None,
) -> Callable[[type[FunctionBlock[P, R]]], type[FunctionBlock[P, R]]]:
```
| Parameter | Purpose |
|---|---|
| `test_description` | Readable name; becomes part of the generated test function name |
| `params` | Parameter instance to pass to `run()` |
| `context` | `WorkflowContext` to pass to `run()` |
| `succeeds` | Expected value of `result.success` (default `True`) |
| `expected_result_data` | If provided, asserts `result.data == expected_result_data` |
| `assertions` | Additional callables; each receives the `FunctionBlockResult` and must return `True` |
### `@fb_test_case` vs `@fb_test_case_with_lab`
`@fb_test_case` does not require `assertions` or `expected_result_data`; it only checks `result.success == succeeds`. In contrast, `@fb_test_case_with_lab` raises `ValueError` if you omit both. This asymmetry is intentional: local tests often start as simple "does it succeed?" smoke tests, while lab tests should always verify specific output.
### Example: testing success
```python
from neops_worker_sdk.testing.base.function_block_test_case import fb_test_case
from neops_worker_sdk.testing.factories.context_factory import create_workflow_context
from neops_workflow_engine_client import DeviceTypeDto

ctx = create_workflow_context(
    run_on="device",
    entity_id=1,
    devices=[DeviceTypeDto(id=1, hostname="test", ip="10.0.0.1")],
)


@fb_test_case("Echo returns input text", EchoParameters(text="hello"), ctx)
class Echo(FunctionBlock[EchoParameters, EchoResult]):
    ...
```
This generates `test_echo_echo_returns_input_text` in the same module.
### Example: testing failure
```python
@fb_test_case(
    "Fails gracefully without a device",
    CollectParams(command="show version"),
    create_workflow_context(run_on="global"),
    succeeds=False,
)
class CollectBlock(FunctionBlock[CollectParams, CollectResult]):
    ...
```
### Example: data assertions
```python
@fb_test_case(
    "Returns version info",
    VersionParams(),
    ctx,
    assertions=[
        lambda r: r.data is not None,
        lambda r: "software_release" in (r.data.version_info or {}),
    ],
)
class ShowVersionBlock(FunctionBlock[VersionParams, VersionResult]):
    ...
```
### When to use standalone tests vs `@fb_test_case`
Use standalone tests when you need:

- Breakpoints and step-through debugging
- Multiple assertions with descriptive messages
- Parameterized tests with `@pytest.mark.parametrize`

Use `@fb_test_case` when you want:

- Concise, declarative test definitions
- Built-in blocking detection
- Consistent execution lifecycle (mirrors production)
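For illustration, the parametrize bullet above can be sketched as follows. The `EchoStub` class is a hypothetical stand-in so the sketch runs anywhere; in real tests you would drive the `Echo` block from Level 1 the same way, with one context and one set of assertions shared across parameter values:

```python
import asyncio

import pytest


# Hypothetical stand-in for the Echo function block, used so this
# sketch is self-contained and runnable without the SDK.
class EchoStub:
    async def run(self, text: str) -> str:
        return text


@pytest.mark.parametrize("text", ["hello", "", "ünïcode"])
def test_echo_roundtrip(text):
    # pytest generates one test per parameter value.
    block = EchoStub()
    assert asyncio.run(block.run(text)) == text
```

This style keeps the debuggability of standalone tests while covering many inputs with one test body.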
## Level 3: Mocking Device Connections
For function blocks that connect to devices, mock the proxy to avoid real connections:
```python
import pytest
from unittest.mock import MagicMock, patch

from neops_workflow_engine_client import DeviceTypeDto
from neops_worker_sdk.testing.factories.context_factory import create_workflow_context

device_context = create_workflow_context(
    run_on="device",
    entity_id=1,
    devices=[DeviceTypeDto(id=1, hostname="test", ip="10.0.0.1")],
)


@pytest.mark.asyncio
async def test_show_version_with_mock():
    mock_proxy = MagicMock()
    mock_proxy.get_version.return_value = {
        "vendor": "Cisco",
        "model": "CSR1000V",
        "serial": "ABC123",
        "software_release": "17.3.4",
    }

    # Patch connect() so entering the context manager yields the mock proxy.
    with patch.object(VersionProxy, "connect") as mock_connect:
        mock_connect.return_value.__enter__ = MagicMock(return_value=mock_proxy)
        mock_connect.return_value.__exit__ = MagicMock(return_value=False)

        block = ShowVersionBlock()
        result = await block.execute_function_block(
            params=VersionParams(),
            context=device_context,
            propagate_exceptions=True,
        )

    assert result.success
    assert result.data.version_info["vendor"] == "Cisco"
```
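If the proxy exposes coroutine methods rather than synchronous ones, a plain `MagicMock` return value is not awaitable; use `unittest.mock.AsyncMock` instead. A minimal sketch — the `get_version` name mirrors the example above and is illustrative:

```python
import asyncio
from unittest.mock import AsyncMock


async def main() -> dict:
    mock_proxy = AsyncMock()
    # AsyncMock makes get_version() return an awaitable that
    # resolves to this dict.
    mock_proxy.get_version.return_value = {"vendor": "Cisco"}

    version = await mock_proxy.get_version()
    mock_proxy.get_version.assert_awaited_once()
    return version


result = asyncio.run(main())
print(result["vendor"])  # Cisco
```

`AsyncMock` also records await counts, so `assert_awaited_once()` lets you verify the block actually awaited the call rather than merely constructing the coroutine.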
For integration tests against real devices, see Remote Lab Testing.