Problem Description
Context and Motivation
The model2owl project currently includes only unit tests, which verify internal components but do not demonstrate functional conformance to the transformation rules from the stakeholders' perspective. This gap has been explicitly identified in the past and reveals a shortcoming in the project's verification and validation strategy.
Current Issues
Absence of automated high-level (functional) tests.
Existing tests operate only at the unit level, offering little value for acceptance or validation.
Stakeholders cannot easily verify whether a feature is supported and functions as expected.
No documentation exists for feature-level test cases.
Expectations
The new functional tests should:
Validate that the system behavior aligns with the transformation specifications, as defined in model2owl documentation.
Be suitable for use by clients and stakeholders during acceptance procedures.
Be maintainable and extensible for future features.
Be triggered automatically whenever a new change is committed to the repository, and the results should be made available to the user (through GitHub Actions).
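To illustrate the last expectation, a CI trigger could look like the following GitHub Actions workflow. This is only a sketch: the job name, the tests/functional path, and the pytest invocation are assumptions, not existing model2owl configuration.

```yaml
# Hypothetical workflow sketch; names and paths are illustrative.
name: functional-tests
on:
  push:
  pull_request:
jobs:
  functional:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: pip install pytest
      - run: pytest tests/functional  # assumed location of functional tests
```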
The implementer is expected to:
Establish the testing framework and conventions to support functional tests for future development.
Implement a test for a selected, working model2owl feature (as a proof of concept).
Integrate this functionality into the current CLI and CI workflow.
Note
Relation to new features
Any new features developed alongside this testing feature should be delivered together with the tests that follow the newly designed approach.
Description of the proposed solution
Test cases will be devised for both transformation and operational features within the defined scope.
An inventory of test cases will be established in the form of a Markdown document that will be included in the repository.
UML diagrams for the identified test cases will be prepared.
Tests for the prepared diagrams will be defined declaratively using SHACL.
SHACL shapes will be designed to target model terms, that is, the TBox rather than the ABox.
Suitable conventions for organizing test cases and corresponding test data will be introduced and documented.
A generic testing script using a test runner (e.g., pytest) will be developed to execute the model2owl transformation and run the SHACL processor.
The CLI will be adjusted to support the execution of functional tests.
The CI setup will be extended to ensure functional tests are executed on every commit alongside unit tests.
Functional testing will be documented within the repository, including:
Instructions for end-users to understand test execution.
Guidance for developers on how new test cases can be created and integrated.