
[FEATURE] Support correctness validation in OSB workloads #367

Open
jainankitk opened this issue Aug 13, 2024 · 3 comments
Labels: enhancement (New feature or request), untriaged

@jainankitk
Is your feature request related to a problem?

The OSB workloads do not validate the responses of search requests; any 200 OK response counts as a success from the benchmarking perspective.

What solution would you like?

Given that these workloads run nightly, adding correctness validation would also prevent regressions like opensearch-project/OpenSearch#15169. I believe we should be able to validate correctness with minimal effort and without affecting the performance numbers. We could either keep a static file containing the expected response, or validate correctness between baseline and candidate in addition to comparing performance numbers.
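The static-file approach could be as simple as normalizing a live response and comparing it to a stored expected response. The sketch below is purely illustrative and not part of OSB: `VOLATILE_FIELDS`, `normalize`, and `matches_expected` are hypothetical names, and the set of fields treated as volatile is an assumption about which parts of a typical OpenSearch response vary run to run.

```python
import json

# Fields that legitimately vary between runs and should not fail a
# correctness check (illustrative; e.g. query latency, shard accounting).
VOLATILE_FIELDS = {"took", "_shards"}

def normalize(obj):
    """Recursively drop volatile fields so only correctness-relevant data remains."""
    if isinstance(obj, dict):
        return {k: normalize(v) for k, v in obj.items() if k not in VOLATILE_FIELDS}
    if isinstance(obj, list):
        return [normalize(v) for v in obj]
    return obj

def matches_expected(response: dict, expected_path: str) -> bool:
    """Compare a live response against a stored expected-response JSON file."""
    with open(expected_path) as f:
        expected = json.load(f)
    return normalize(response) == normalize(expected)
```

Normalizing both sides means the expected-response files can be captured from a real run without hand-editing out the timing fields.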

What alternatives have you considered?

Ideally, integration and unit tests would cover most of these scenarios, but it is impractical to include such large amounts of data in those tests.

@IanHoang (Collaborator) commented Jan 7, 2025

@jainankitk OSB comes with an --enable-assertions flag that tells OSB to validate that responses from an operation meet given criteria. Ensure you run the test with --enable-assertions and define assertions in the workload for each operation you want to validate:

{
  "name": "term",
  "operation-type": "search",
  "assertions": [
    {
      "property": "hits",
      "condition": ">",
      "value": 0
    }
  ]
}
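The semantics of such a property/condition/value triple can be sketched as follows. This is a simplified illustration, not OSB's actual implementation: `check_assertion` is a hypothetical name, and it assumes the assertable properties (such as `hits`) have already been extracted into a flat dict.

```python
import operator

# Maps condition strings to comparison functions (illustrative subset).
CONDITIONS = {
    ">": operator.gt,
    ">=": operator.ge,
    "<": operator.lt,
    "<=": operator.le,
    "==": operator.eq,
}

def check_assertion(properties: dict, assertion: dict) -> bool:
    """Evaluate one assertion like {"property": "hits", "condition": ">", "value": 0}
    against a dict of properties extracted from the operation's response."""
    actual = properties[assertion["property"]]
    compare = CONDITIONS[assertion["condition"]]
    return compare(actual, assertion["value"])
```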

We will add documentation for this to the documentation website.

@jainankitk (Author) commented

@IanHoang - This approach involves adding assertions for the whole response, for every operation, which is very complex IMO. Could we have an expected-response file for each operation, so that the operation's response can be compared against the expected response?
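The baseline-versus-candidate variant of this idea could be a recursive diff that reports exactly where two responses disagree, which is more actionable than a single pass/fail. The helper below is a hypothetical sketch (`diff_responses` is not an OSB function) and assumes volatile fields have already been stripped from both responses.

```python
def diff_responses(baseline, candidate, path=""):
    """Recursively collect human-readable paths where two JSON-like
    search responses differ. Returns [] when they match."""
    diffs = []
    if isinstance(baseline, dict) and isinstance(candidate, dict):
        for key in sorted(set(baseline) | set(candidate)):
            if key not in baseline or key not in candidate:
                diffs.append(f"{path}/{key}: present in only one response")
            else:
                diffs.extend(diff_responses(baseline[key], candidate[key], f"{path}/{key}"))
    elif isinstance(baseline, list) and isinstance(candidate, list):
        if len(baseline) != len(candidate):
            diffs.append(f"{path}: list length {len(baseline)} != {len(candidate)}")
        else:
            for i, (b, c) in enumerate(zip(baseline, candidate)):
                diffs.extend(diff_responses(b, c, f"{path}[{i}]"))
    elif baseline != candidate:
        diffs.append(f"{path}: {baseline!r} != {candidate!r}")
    return diffs
```

Reporting paths rather than a boolean would let a nightly run log the first few divergences directly into the benchmark report.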

@jainankitk jainankitk reopened this Jan 28, 2025
@github-project-automation github-project-automation bot moved this from Done to In Progress in Performance Roadmap Jan 28, 2025
@jainankitk (Author) commented Jan 28, 2025

Also, that feature should be enabled by default, with an option to disable it if needed.

Projects: Performance Roadmap (Status: In Progress)
Development: No branches or pull requests
3 participants