Introduction to Locust: An Open-source Performance Testing Tool

Dilusha Rasangi Kumarage
5 min read · May 27, 2021
Image source: https://coursemarks.com/course/performance-testing-using-locust-1-0/

Open-source performance testing tools are nothing new for those of us on the Quality Engineering track.
We have the privilege of choosing any of these tools according to our preferences and scope. Among them,

Locust is a modern load testing tool that allows us to specify load scenarios in Python code.

Let’s go through the following to get a basic idea of the Locust performance tool.

High-level picture of Performance Testing vs. Load Testing vs. Stress Testing

An overview of Locust

Locust is an easy-to-use, scriptable, and scalable performance testing tool.

According to the Locust website, the tool gets its name from the insect because it mimics the insect's swarming behavior. Some of the terminology used by Locust, like attacking, hatching, and swarming, is likewise inspired by and borrowed from nature.

  • Each simulated user is like a locust and applies significant pressure on websites and applications.
  • This behavior is then monitored in real time on a web-based UI.
  • Each locust in the swarm runs in its own lightweight process (a gevent greenlet), and the user can write highly expressive Python scenarios without cluttering the code with callbacks.
  • Because the tool is entirely event-based, it can support thousands of concurrent users from a single machine.

Pros and Cons of Locust

Locust vs. JMeter

Let’s take a look at this comparison table of JMeter and Locust features and abilities:

Set up Locust on Windows

Step 1 : Install Python 3.6 or later

Step 2 : Make sure that Python is added to the PATH variable under Environment Variables > System Variables.

Step 3 : Validate that Python is installed successfully.

$ python --version

Step 4: Install Locust using pip

$ pip3 install locust

Step 5 : Validate that Locust is installed successfully

$ locust --version

First code with Locust

Assume that your API is as below:

API: <baseURL>/api/v1/token
JSON body: {"applicationCode": "abc", "applicationSecret": "xyz"}
Method: POST
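
For reference, the same request expressed as a one-off curl call (with <baseURL> still a placeholder from the spec above) would look like this:

$ curl -X POST "<baseURL>/api/v1/token" -H "Content-Type: application/json" -d '{"applicationCode": "abc", "applicationSecret": "xyz"}'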

A simple Locust script for the above will then be as follows:

from locust import HttpUser, task, between

class QuickstartUser(HttpUser):
    # Each simulated user pauses between 1 and 2.5 seconds between tasks
    wait_time = between(1, 2.5)

    @task
    def get_token(self):
        # POST the token request against the host supplied when Locust starts
        self.client.post("/api/v1/token", json={"applicationCode": "abc", "applicationSecret": "xyz"})
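
If the token call is really just a login step before the traffic you want to measure, Locust's on_start hook (which runs once per simulated user) is the usual place for it, with a @task reusing the token afterwards. A minimal sketch, where the /api/v1/orders endpoint, the "token" response field, and the Bearer header are assumptions rather than part of the original example:

from locust import HttpUser, task, between

class ApiUser(HttpUser):
    wait_time = between(1, 2.5)

    def on_start(self):
        # Runs once when each simulated user starts: log in and keep the token
        response = self.client.post("/api/v1/token",
                                    json={"applicationCode": "abc", "applicationSecret": "xyz"})
        # Assumption: the response body contains a "token" field
        self.token = response.json().get("token")

    @task
    def list_orders(self):
        # Hypothetical endpoint, shown only to illustrate reusing the token
        self.client.get("/api/v1/orders",
                        headers={"Authorization": f"Bearer {self.token}"})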

Step 1: Save the script to a file named 'locustfile.py' in your local directory, using a text editor such as Notepad++.

Step 2: Start Locust

$ locust
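
Since the script only uses a relative path (/api/v1/token), Locust still needs the base host; you can either fill in the Host field in the web UI or pass it at start-up. A sketch, assuming locustfile.py is in the current directory and <base URL> is your actual host:

$ locust -f locustfile.py --host <base URL>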

Step 3: Validate the output below.

Running Locust via Web UI

As described in the previous section, once you have added the above code and started Locust:
Step 1: Open a browser and enter the following URL:

http://localhost:8089/

Step 2: You will get the window below. Fill in the fields as required and click the ‘Start swarming’ button once you’re ready.

The process will start and you will get a window containing the Statistics, Charts, Failures, Exceptions, and Download Data tabs.

Navigate to the Statistics tab and you should see the following.

Navigate to the Charts tab and you should see three graphs: Total Requests per Second, Response Times (ms), and Number of Users.
  • You may notice that the RPS is quite low here. This is mainly because we have set wait_time to be between 1 and 2.5 seconds.
  • If you need to run an actual load test, feel free to change both of these values to 0 (see the snippet after this list). Restart your test and you should be able to see the maximum RPS that your server can handle.
  • Depending on your preferences and requirements, you can experiment with a different number of users and hatch rate.
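
For reference, removing the think time is a one-line change to the class attribute in locustfile.py (shown on its own here, not as a full file):

wait_time = between(0, 0)  # no think time: users fire requests back to back

Locust also provides a constant() wait time, so wait_time = constant(0) (imported from locust) reads a little more clearly for this case.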

Running Locust via Command Line Interface

Locust gives us the benefit of running the load test via the command line interface as well.

Add the --headless parameter along with the host, the number of users, and the spawn (hatch) rate; the flags below can be combined as shown in the example after the list.

$ locust -f locustfile.py --headless --host <base URL> -u 10 -r 10
  • -u → number of users to spawn
  • -r → spawn/hatch rate (users started per second)
  • -t → stop after the specified amount of time (e.g. 1m, 300s)
  • --csv → save the results to files in CSV format, using the given prefix
  • locust --help → list all the available parameters
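
Putting these together, a sample headless run (where <base URL> and the CSV prefix 'results' are placeholders) could look like this:

$ locust -f locustfile.py --headless --host <base URL> -u 10 -r 10 -t 1m --csv results

This spawns 10 users at a rate of 10 users per second, stops after one minute, and writes CSV result files prefixed with 'results' to the current directory.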

You will get output as shown below:

Conclusion

  • Nowadays there is an industry trend of moving away from traditional load testing approaches towards a code-to-test approach, by switching to tools like Locust.
  • Locust was created to solve specific issues with existing performance tools, but it still has both pros and cons.
  • In Locust, client behavior can be specified entirely by the user in regular Python, so it is a highly flexible tool, and it also supports running load tests distributed across multiple machines.
  • We need basic Python knowledge to work with Locust, so the required technical skill set is a bit higher than for no-code tools.

Based on the pros and cons of the tools and on our purpose and scope of performance/load/stress testing, we can choose whichever tool we prefer. But given the direction the industry is heading, it is an advantage to learn a modern performance testing tool like Locust.

