Problem Statement
The objective is to empower users to independently benchmark the performance of Dapr within their own environments, so they can obtain the specific performance numbers they require.
Design Proposal
The purpose of this proposal is to outline the design and functionality of the performance test pipeline for Dapr (Distributed Application Runtime). The pipeline enables users to run performance tests and obtain performance metrics based on their specified inputs, such as throughput and requests per second (RPS), against different components. Users are expected to provide the test and component configuration through YAML files or environment variables.
The idea is to provide a GitHub Action that runs the tests written for performance testing. Tests should be generic enough to run with different Dapr components: there will be one test per Dapr building block, and each building block test should be configurable enough to run with all supported components.
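As a sketch, such an action could expose the test inputs via a workflow_dispatch trigger. The input names, environment variables, and make target below are illustrative assumptions, not the actual pipeline schema:

```yaml
# Hypothetical workflow: input names, env vars, and the runner step are assumptions.
name: perf-test
on:
  workflow_dispatch:
    inputs:
      building_block:
        description: "Building block to test (e.g. pubsub, state)"
        required: true
      component:
        description: "Dapr component to test against"
        required: true
      qps:
        description: "Target requests per second"
        default: "100"
      duration:
        description: "Test duration (e.g. 1m)"
        default: "1m"
jobs:
  perf:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run perf test
        run: make test-perf-${{ github.event.inputs.building_block }}
        env:
          DAPR_PERF_QPS: ${{ github.event.inputs.qps }}
          DAPR_PERF_DURATION: ${{ github.event.inputs.duration }}
```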
All configuration related to the apps and Dapr components will be provided by the user as a YAML file.
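For the component side, the user-provided file could follow the standard Dapr Component schema; the store name, component type, and Redis address below are placeholders:

```yaml
# Example component definition (standard Dapr Component schema);
# name, type, and connection details are placeholders.
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: statestore
spec:
  type: state.redis
  version: v1
  metadata:
    - name: redisHost
      value: redis-master.default.svc.cluster.local:6379
    - name: redisPassword
      secretKeyRef:
        name: redis
        key: redis-password
```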
A default app will be deployed if the user does not provide app details. Users who want to test something specific can supply their own app; if no app is provided, the test falls back to the default app that is created alongside the test in the Dapr test suite.
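The fallback behavior described above can be sketched as a small resolution step in the test harness. This is a minimal illustration; the config struct and default app name are assumptions, not the actual Dapr test schema:

```go
package main

import "fmt"

// perfTestConfig mirrors the hypothetical user-provided YAML;
// the field names are illustrative, not the real Dapr test schema.
type perfTestConfig struct {
	AppImage string // user-supplied test app image; empty if omitted
}

// defaultTestApp stands in for the app shipped alongside the Dapr perf tests.
const defaultTestApp = "dapr-perf-default-app"

// resolveTestApp falls back to the default app when the user provides none.
func resolveTestApp(cfg perfTestConfig) string {
	if cfg.AppImage == "" {
		return defaultTestApp
	}
	return cfg.AppImage
}

func main() {
	fmt.Println(resolveTestApp(perfTestConfig{}))                          // default app
	fmt.Println(resolveTestApp(perfTestConfig{AppImage: "my-custom-app"})) // user app
}
```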
Different ways for users to run performance tests
There are three approaches for a user to run the performance tests.
1. In the customer's fork
The user forks the dapr repo, sets up the GitHub Actions along with the required infrastructure, and runs the performance test pipeline.
Advantages
There is no cost overhead for the Dapr project.
Disadvantages
Customers have to set up the pipeline themselves, so running the perf tests takes them more time.
Customers may not want to do the setup or use their own account for cloud-related expenses.
2. In Dapr Repo
Users can run the performance test in the Dapr repo itself by providing the specific configuration for their performance test.
Advantages
Users will find it easy to test, as they will not have the overhead of setting up the environment.
Disadvantages
The Dapr project might incur more cost if there are no limitations on trying different combinations.
It will be difficult to identify bad actors.
3. Providing binaries to users
We can provide perf test binaries that can be executed through a CLI. We write the perf tests as usual and compile them into binaries. With these binaries in place, users can run individual perf tests from the CLI in their AKS cluster after setting up Dapr and the components required for the perf tests.
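One way this could work is with Go's standard test-binary compilation (`go test -c`); the package path and test name below assume a hypothetical repo layout and are illustrative only:

```shell
# Compile a perf test package into a standalone binary
# (go test -c is standard Go; the package path is an assumption).
go test -c ./tests/perf/service_invocation -o service_invocation.perf

# On a cluster with Dapr and the required components already installed,
# run an individual test from the binary; -test.run and -test.v are
# standard flags of compiled Go test binaries. The test name is a placeholder.
./service_invocation.perf -test.run TestServiceInvocationPerformance -test.v
```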
Advantages
Users will find it easy to test, as they will not have the overhead of setting up GitHub Actions.
Disadvantages
Running the perf tests from a local environment could be flaky.
Conclusion
The performance test pipeline for Dapr allows users to obtain performance metrics by specifying inputs such as throughput, RPS, and test duration. It provides reports, analysis, and recommendations to help optimize the performance of Dapr applications. By leveraging this pipeline, users can evaluate and enhance the scalability and reliability of their Dapr-based distributed applications.