Published on 09 May 2018 by Justine Mignot


Check your API performance while developing with k6

Problem: slow API routes

One day, on a mobile application project, we discovered that one of the API routes used by the application took 15 seconds to return a result, which was far too slow for that route. We prioritized time to solve this problem and brought the response time down to 500ms! But a few days later, another team working on the same route broke its response time again, back up to 10 seconds…

Solution: automate performance testing

We had to find a solution to prevent this from happening again, and to raise awareness about performance within our teams.
That’s how we discovered k6. It allows us to:

  • detect back-end performance problems while developing, before going to production,
  • automate performance testing to avoid regressions,
  • follow back-end performance regularly with the team (developers and business) on a graph.

This article gives you tips to monitor your back-end performance with k6.
It took us about an hour to write a first k6 test by following their documentation, and we have refined the way we use k6 since. We want to share what we learned so that it is easy and quick for you to make good use of k6.

Usage

Problem: we don’t have any checks or tests that make sure our routes are fast and stay fast.

Methodology:

  • Define a standard response time: the maximum time you accept for a route to take to return a response; above it, the route is considered KO. Discuss it with your whole team, keeping in mind that according to Google, 53% of mobile users abandon sites that take more than 3 seconds to load. Then change STANDARD_RESPONSE_TIME in globalChecks (more about global checks, which define whether a route’s performance is OK or KO, just below); a sketch of what this could look like follows this list.

     For example, we chose 500ms.

  • Process:
    • if the concerned route is not yet tested: [flowchart: existing-performance-control]
    • if a test for the concerned route already exists: [flowchart: new-performance-control]
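
To make the standard response time concrete, here is a rough sketch of what the globalChecks helper mentioned above could look like. It is only a sketch: the names follow the article, but the actual code in the repository may differ.

// Sketch of a shared globalChecks helper (hypothetical code, not the repository's exact implementation).
export const STANDARD_RESPONSE_TIME = 500; // ms, the standard response time agreed on with the team

export const globalChecks = {
  // OK if the HTTP status is a success (2xx)
  'ok status': (response) => response.status >= 200 && response.status < 300,
  // OK if the route answers within the standard response time
  'global performance': (response) => response.timings.duration < STANDARD_RESPONSE_TIME,
};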

Installation

We decided to share a sample k6 repository, organized like ours: https://github.com/bamlab/performance-monitoring

  • Fork this repository and name it {YOUR_ORGANIZATION_NAME}-performance-monitoring
  • Clone your new repo locally
  • Install k6
  • You’re good to test

The folder is structured by API:

src
├── apiName1 # one folder per API
│   ├── tests # all tests are located in a tests folder
│   │   ├── fetchComments.js # one route tested per file
│   │   ...
│   └── index.js # a main index file grouping all API tests
├── apiName2
│   ├── tests
│   │   ├── fetchArticles.js
│   │   ...
│   └── index.js
...
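
As an illustration, a per-API index.js could simply import each test and run it inside a k6 group. This is only a sketch, assuming each test file exports a default function; the repository’s actual index files may differ.

// Hypothetical src/apiName1/index.js: groups every test of this API into one run.
import { group } from 'k6';
import fetchComments from './tests/fetchComments.js';

export default function () {
  group('fetchComments', fetchComments);
}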

You can run tests:

  • one by one:
  k6 run src/{replace_by_your_API}/tests/{replace_by_your_test}.js
  • by API:
  k6 run src/{replace_by_your_API}/index.js
  • by project:
  export PROJECT={replace_by_your_project_name} && k6 run index.js
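
The by-project command works because k6 exposes environment variables through __ENV. A root index.js could use the PROJECT variable to select which APIs to run, along these lines (hypothetical code, with a made-up project name):

// Hypothetical root index.js: runs the APIs that belong to the project given in the PROJECT env variable.
import { group } from 'k6';
import apiName1 from './src/apiName1/index.js';
import apiName2 from './src/apiName2/index.js';

const projects = {
  myMobileApp: { apiName1, apiName2 }, // hypothetical project name
};

export default function () {
  const apis = projects[__ENV.PROJECT] || {};
  for (const name of Object.keys(apis)) {
    group(name, apis[name]);
  }
}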

Write a test

Here is what a test looks like:

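In plain k6, without the BaseTest helper described below, such a test could look roughly like this (the route and URL are hypothetical):

// Sketch of a k6 test for one route (hypothetical URL).
import http from 'k6/http';
import { check } from 'k6';

const STANDARD_RESPONSE_TIME = 500; // ms

export default function () {
  const response = http.get('https://api.example.com/comments'); // hypothetical route

  check(response, {
    'ok status': (r) => r.status >= 200 && r.status < 300,
    'global performance': (r) => r.timings.duration < STANDARD_RESPONSE_TIME,
  });
}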

What we put in place to simplify test writing:

  • Global options:

    When you run a test, you can choose the number of virtual users and how many times you want to run the test.

    k6 run src/{replace_by_your_API}/tests/{replace_by_your_test}.js --vus 2 --iterations 10
    

       Here: 2 virtual users, 10 iterations.

      Trying different values for the number of virtual users made us aware that our servers were not able to handle more than 10 simultaneous connections.

    We defined global options, used by default when you run tests by API or by project.
    You can adapt them to your project requirements (see the sketch after this list).

     

  • All your tests will extend BaseTest; this way, all your tests will:

        - have the same checks (globalChecks):

    • ok status: check that the response status is successful (2xx)
    • global performance: check that the route respects your standard response time
    • latest performance: check that your latest code changes didn’t break the route’s performance


       - have the same threshold: green if the ‘latest performance’ check passes in more than 70% of the runs (for example, here, at least 7 out of 10 iterations pass the ‘latest performance’ check), red otherwise.

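As an illustration, both the default options and the pass/fail threshold can be expressed through k6’s options object, roughly like this (the values are a sketch; adapt them to your project):

// Sketch of shared default options (hypothetical values).
export const options = {
  vus: 2,          // 2 simultaneous virtual users by default
  iterations: 10,  // run the test 10 times in total
  thresholds: {
    // the run is red if fewer than 70% of the checks pass
    checks: ['rate>0.7'],
  },
};

One design note: k6 only reads the options exported from the script it runs, so a shared options module like this would need to be re-exported from each entry file.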

     

To write your first test, follow the “contribute by adding new tests” guide in the repository (~10 min).

Next

Based on this sample k6 repository, we also put in place a script that creates a graph, so we can visually check our back-end performance every week.
A second article will come soon to help you do the same.

Documentation

  • Official k6 documentation: https://docs.k6.io/
  • Our sample k6 repository: https://github.com/bamlab/performance-monitoring

If you have any questions, or if you want to share your experience with k6 or any other tool, feel free to reach out!

