Distributed Load Testing using Python, JMeter and Docker
Vishnu Murty (~vishnu79)
Knowing how an enterprise server performs under load (% CPU, % memory, network, % disk time) is extremely valuable and critical. Load testing can reveal performance limits and lead to enhancements or fixes before you go to production. However, the load-testing tools available today come with the following problems:
- Preparing the environment/infrastructure (installing various software dependencies) on multiple host systems to perform a load test is often a tedious task.
- Scaling a load test up or down requires maintenance and manual intervention.
- You have to write your own test code, which takes development effort.
- Most stress tools come with their own reporting format, which is difficult to customize when needed. It is also difficult to view, analyze and compare test results in real time across systems.
In this talk we will demonstrate how JAAS (JMeter As A Service) can be a one-stop solution for all these problems, and how Python plays a crucial role in delivering the JAAS solution.
Tech Stack behind JAAS
- Python: Python is at the core of JAAS. It handles communication across all the individual components via REST APIs, and it slices and dices the data for processing.
- Docker: For auto-deploying JMeter apps, we use Docker containers (pre-packaged with all dependencies). This reduces the manual intervention needed to maintain the load-test environment/infrastructure.
- ELKB Stack: ELKB is the backend for JAAS. We store all logs, Beats data and JMeter results in Elasticsearch; Logstash provides the data-processing pipeline and Kibana the visualization.
- JMeter: JMeter is the tool used to generate load. It is open source, easy to use, platform independent and has robust reporting.
JAAS comes with a single-window user interface where the user provides the load-test details, such as:
- System details
- Load Generation type
- Number of concurrent users
- Number of threads
Using a RESTful API implemented in Python, this information (including a dynamically generated test-plan .jmx file for JMeter) is stored in the ES backend, and a new Docker service is created.
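As a minimal sketch of this step, the snippet below assembles a load-test document of the kind the REST layer could store in the ES backend. The field names, index name and endpoint are illustrative assumptions, not the actual JAAS schema:

```python
import json

def build_test_request(host, load_type, users, threads, jmx_path):
    """Assemble a load-test document for the ES backend.

    All field names here are illustrative, not the real JAAS schema.
    """
    return {
        "system": host,               # target host to load
        "load_type": load_type,       # e.g. CPU, memory, network
        "concurrent_users": users,
        "threads": threads,
        "test_plan": jmx_path,        # dynamically generated .jmx file
    }

doc = build_test_request("web01.example.com", "cpu", 50, 10,
                         "plans/cpu_stress.jmx")
body = json.dumps(doc)
# The REST layer would then POST `body` to the Elasticsearch backend,
# e.g. POST http://<es-host>:9200/jaas-tests/_doc (index name assumed).
print(body)
```

In the real service an HTTP client (such as the `requests` module mentioned in the prerequisites) would send this document to Elasticsearch; here the call is left as a comment so the sketch stays self-contained.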
We use a Docker container (pre-packaged with all dependencies as a single app) for generating load on a system. Each container ships with the JMeter software and Beats (data shippers for Elasticsearch). For every new load-test request, we deploy a new instance of our app (a new Docker container) on the Docker Swarm cluster. We maintain the Docker Swarm cluster (a group of machines running Docker) for scalability and load balancing while performing load tests. Each machine in the cluster (both managers and workers) communicates with the others and executes Docker commands via Python REST calls only. Swarm managers can use several strategies to run Docker containers, such as "emptiest node", which fills the least utilized machines with containers, or "global", which ensures that each machine gets exactly one instance of the specified container.
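To make the deployment step concrete, here is a hedged sketch that builds the `docker service create` command a Swarm manager could issue for one load test. The image name and the `jaas-` service-name prefix are assumptions for illustration:

```python
def swarm_service_cmd(test_id, image="jaas/jmeter-beats:latest",
                      replicas=1, global_mode=False):
    """Build a `docker service create` invocation for one load test.

    The image name and `jaas-` prefix are illustrative, not the
    actual JAAS values.
    """
    cmd = ["docker", "service", "create", "--name", f"jaas-{test_id}"]
    if global_mode:
        # "global" mode: exactly one container on every Swarm node
        cmd += ["--mode", "global"]
    else:
        # replicated mode: the scheduler places tasks on the least
        # utilized nodes (the "emptiest node" strategy)
        cmd += ["--replicas", str(replicas)]
    cmd.append(image)
    return cmd

# On a manager node this command would be run, e.g. via subprocess.run(cmd)
print(" ".join(swarm_service_cmd("t42", replicas=3)))
```

The command is only constructed, not executed, so the sketch runs anywhere; in JAAS the equivalent call is triggered on the manager through the Python REST layer.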
Swarm managers authorize workers to run the Docker containers. Each Docker service takes the user's specific input (stored in the ES backend) for generating a particular type of load on a specific host system. The user can also scale the load up or down (number of users or threads) on the fly using the same UI form. This is the biggest advantage of JAAS over any other load-test tool available: normally there is no option but to stop and restart the tool if you want to scale up or down. Each Docker container, with its JMeter instance, keeps generating load on its target host system, and Beats pushes the data/results back into Elasticsearch. This entire implementation of reading from and writing to Elasticsearch happens through Python. Once the test-specific data is pushed to Elasticsearch, Kibana prepares the visualization for you. It is real time, aggregated (when concurrent users are generating the load) and available in a single dashboard, which makes it very easy to compare and analyze.
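The on-the-fly scaling described above maps naturally onto Docker Swarm's service scaling. A minimal sketch, again assuming a hypothetical `jaas-` service-name prefix:

```python
def scale_cmd(test_id, replicas):
    """Build `docker service scale` for an in-flight load test.

    Swarm adds or removes JMeter containers without stopping the
    service, which is what lets JAAS scale a running test from the UI.
    The `jaas-` prefix is illustrative.
    """
    return ["docker", "service", "scale", f"jaas-{test_id}={replicas}"]

# Grow test t42 to 8 JMeter containers while it is still running
print(" ".join(scale_cmd("t42", 8)))
# → docker service scale jaas-t42=8
```

Because the service keeps running while Swarm reconciles the replica count, the load ramps up or down smoothly instead of being interrupted, which is the behaviour the UI form exposes.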
Familiarity with Python, Docker, JMeter and the Elastic Stack. Experience with Python and its modules (xml, json and requests).
Vishnu Murty K
A Senior Principal Engineer at the DellEMC Infrastructure Solutions Group, Vishnu holds an MS in Software Systems and has 13 years of experience leading product-qualification and automation-development efforts. The domains he has worked on include storage and system-management software. He is responsible for delivering Zeno (a continuous test-automation framework), JAAS, ICEMAN and other automation tools. He has presented automation papers at PyCon (Python developer forum) and the STeP-IN Forum.
A professional with over 7 years of experience in Core Java, PHP, Node.js and Python, currently working as a Senior Engineer at Dell R&D Bangalore. He has designed and developed web applications for various MNCs across multiple domains, and loves to keep up to date with the latest tech trends and cutting-edge technologies.