K6 load tests — Part IV
Test scenarios, sequence, thresholds
Below are the sample test script and configuration file added in Part II.
import http from 'k6/http';
import { check } from 'k6';

export default function () {
  const res = http.get('http://test.k6.io');
  check(res, {
    'success response': (result) => result.status === 200,
    'body contains text': (result) => result.body.includes('Collection of simple web-pages suitable for load testing.'),
  });
}
{
  "vus": 20,
  "iterations": 50,
  "duration": "30s"
}
In the configuration above, no test scenario is specified, so the test runs against the default scenario. Command to run the test:
k6 run test.js --out json=test.json --config load-test-config.json
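As a side note, the same limits can also be declared inside the script through the options export instead of a separate config file. Below is a minimal sketch using the values from above; keep in mind that k6 resolves conflicts between script options, --config files, and CLI flags according to its own precedence rules.

import http from 'k6/http';
import { check } from 'k6';

// Same limits as load-test-config.json, declared in the script itself.
export const options = {
  vus: 20,
  iterations: 50,
  duration: '30s',
};

export default function () {
  const res = http.get('http://test.k6.io');
  check(res, {
    'success response': (result) => result.status === 200,
  });
}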
To run tests multiple times with the same or different configurations, you would need to run this command repeatedly. For example, you may want to make just a few calls to warm up the database cache and only then start the actual load test. The same can be achieved with a single command by using the scenarios attribute. Update the test configuration file as below (it defines two scenarios, warmup and loadTest):
{
  "scenarios": {
    "warmup": {
      "executor": "shared-iterations",
      "maxDuration": "30s",
      "iterations": 50,
      "vus": 20
    },
    "loadTest": {
      "executor": "shared-iterations",
      "maxDuration": "30s",
      "iterations": 500,
      "vus": 200
    }
  }
}
With this configuration, the test runs both scenarios in parallel, each for at most 30 seconds. The executor is set to shared-iterations. To learn about executors, refer to this document.
The first screenshot was captured during test execution; both scenarios are running in parallel. After all scenarios have finished, a combined execution summary is generated.
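For completeness, scenarios do not have to share the default function: each scenario can point to its own exported function through the exec property. The sketch below assumes hypothetical warmup() and loadTest() functions that are not part of the original script.

import http from 'k6/http';

export const options = {
  scenarios: {
    warmup: {
      executor: 'shared-iterations',
      maxDuration: '30s',
      iterations: 50,
      vus: 20,
      exec: 'warmup', // run the exported warmup() function
    },
    loadTest: {
      executor: 'shared-iterations',
      maxDuration: '30s',
      iterations: 500,
      vus: 200,
      exec: 'loadTest', // run the exported loadTest() function
    },
  },
};

export function warmup() {
  // Lighter call whose only job is to prime the cache.
  http.get('http://test.k6.io');
}

export function loadTest() {
  http.get('http://test.k6.io');
}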
To run the scenarios sequentially, update the configuration to set a startTime for each scenario, as shown below. When choosing a startTime, allow enough time for all previous scenarios to complete: a scenario starts at its startTime regardless of whether earlier scenarios are still running, and it will not start any earlier even if they finish ahead of schedule. Here warmup can run for at most 30 seconds, so giving loadTest a startTime of 60 seconds leaves a 30-second buffer (see the annotated sketch after the configuration).
{
  "scenarios": {
    "warmup": {
      "executor": "shared-iterations",
      "maxDuration": "30s",
      "iterations": 50,
      "vus": 20,
      "startTime": "0s"
    },
    "loadTest": {
      "executor": "shared-iterations",
      "maxDuration": "30s",
      "iterations": 500,
      "vus": 200,
      "startTime": "60s"
    }
  }
}
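For reference, the same schedule can also live in the script's options export; the comments spell out the timing assumptions, with the values copied from the configuration above.

export const options = {
  scenarios: {
    warmup: {
      executor: 'shared-iterations',
      maxDuration: '30s', // warmup starts at 0s and finishes by 30s at the latest
      iterations: 50,
      vus: 20,
      startTime: '0s',
    },
    loadTest: {
      executor: 'shared-iterations',
      maxDuration: '30s',
      iterations: 500,
      vus: 200,
      startTime: '60s', // 60s minus warmup's 30s maxDuration = 30s safety buffer
    },
  },
};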
The next step is to set failure thresholds for the scenarios. When a large volume of requests is sent, not every request will meet the SLA, and some may be lost to network issues, so you define a threshold for an acceptable outcome. In the example below, I am using two thresholds per scenario.
warmup
1. http_req_duration: 50% of requests should respond within 1 second
2. http_req_failed: the failure rate should stay below 50% (at least half of the requests succeed)
loadTest
1. http_req_duration: 95% of requests should respond within 500 ms
2. http_req_failed: the failure rate should stay below 1% (at least 99% of requests succeed)
The limits for the warmup scenario are loose, as its only purpose is to build the cache. For the loadTest scenario, the limits are set as per the SLA.
{
  "scenarios": {
    "warmup": {
      "executor": "shared-iterations",
      "maxDuration": "30s",
      "iterations": 50,
      "vus": 20,
      "startTime": "0s"
    },
    "loadTest": {
      "executor": "shared-iterations",
      "maxDuration": "30s",
      "iterations": 500,
      "vus": 200,
      "startTime": "60s"
    }
  },
  "thresholds": {
    "http_req_duration{scenario:warmup}": [
      {
        "threshold": "p(50)<1000",
        "abortOnFail": false
      }
    ],
    "http_req_failed{scenario:warmup}": [
      "rate<0.5"
    ],
    "http_req_duration{scenario:loadTest}": [
      {
        "threshold": "p(95)<500",
        "abortOnFail": false
      }
    ],
    "http_req_failed{scenario:loadTest}": [
      "rate<0.01"
    ]
  }
}
Earlier, a single combined execution summary was generated. Now that the thresholds are defined per scenario, a few sections of the summary are displayed separately for each scenario.
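If a breached threshold should also stop the test early, the object form of a threshold additionally accepts abortOnFail set to true, optionally combined with delayAbortEval so the metric has time to stabilise before it is evaluated. Below is a hedged sketch (not part of the configuration above), expressed as script options.

export const options = {
  thresholds: {
    'http_req_duration{scenario:loadTest}': [
      {
        threshold: 'p(95)<500',
        abortOnFail: true,     // abort the whole test run when this threshold fails
        delayAbortEval: '10s', // wait 10s before the first evaluation to gather enough samples
      },
    ],
  },
};

When any threshold fails, k6 finishes with a non-zero exit code, which makes the result easy to pick up in a CI pipeline.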
To learn more about thresholds and experiment with them, refer to this document.
Part V will cover tags configuration.
Note: Subsequent parts of the series are available here.
Happy Testing!