How To Achieve Continuous Product Performance Analysis using JMeter and Stytch

Over my many years working in performance testing and performance engineering, I’ve had the chance to use a broad range of load testing tools. Whether it’s LoadRunner, NeoLoad, JMeter, or even good old Microsoft Homer, they all have one thing in common: they’re great at generating load but mediocre at results reporting and build-over-build product performance analysis.

Sure, you can get some summary statistics on response times and throughput. You may even be able to get graphs that span the duration of a single test. If you’re lucky, the load tool may allow you to compare two performance reports to find regressions. But when it comes to doing advanced reporting and analysis, such as graphing performance results across dozens of iterative builds or drilling down to the performance of every individual HTTP request, these tools fall flat. As a result, many software performance engineers (including myself) find themselves stuck spending hours in Excel trying to summarize data for management reports or faced with the daunting task of writing a custom reporting solution.

Little did I know when I joined Stytch last year that the generalized business intelligence (BI) and data analytics platform we were building would be the perfect solution for these performance reporting and analysis challenges. As I became familiar with the product, I quickly realized that Stytch’s ability to ingest data from practically any source, model that data, and then build advanced queries against the data would allow me to easily build a performance reporting platform that could support analysis of fully automated CI performance tests—something I’d always wanted to run.

With Stytch as my new tool for product performance reporting, I went about building a system that could run automated performance tests against every new build of the Stytch platform. The performance results of those tests were available within seconds and could be viewed by anyone in the company at the click of a mouse.

Here’s what the architecture looked like:

Diagram of Continuous Product Performance Testing, Reporting & Analysis Platform Built with JMeter and Stytch

The first step in this process was to write automated performance tests using JMeter (our preferred load test tool). Once these tests had been created, I then automated the execution of the tests in our CI pipeline using Jenkins. This allowed the tests to run every time a new build of Stytch was pushed through our CI pipeline. Additional, longer-running performance tests were run overnight and on the weekends.
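For anyone looking to set up something similar, here is a rough sketch of how a CI step can kick off a JMeter test in non-GUI mode and capture the per-request CSV. The test plan name, property names, and output path are placeholders; only the standard JMeter command-line flags are assumed.

```python
import subprocess
from datetime import datetime

# Hypothetical test plan and output names; -n (non-GUI), -t (test plan),
# -l (results file), and -J (user-defined property) are standard JMeter flags.
test_plan = "ci_smoke_perf.jmx"
results_csv = f"results_{datetime.now():%Y%m%d_%H%M%S}.csv"

subprocess.run(
    [
        "jmeter",
        "-n",               # non-GUI mode, required for unattended CI runs
        "-t", test_plan,    # JMeter test plan to execute
        "-l", results_csv,  # CSV/JTL output: one row per HTTP request
        "-Jthreads=50",     # example properties the test plan can read
        "-Jduration=600",
    ],
    check=True,             # fail the CI step if JMeter exits non-zero
)
print(f"JMeter run complete; results written to {results_csv}")
```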

With 10+ CI runs per day, plus the overnight and weekly JMeter tests, each producing a CSV containing every single HTTP request with its associated response time, response code, and so on, the automated tests produce a ton of data. This is where our Stytch reporting platform works its magic.

Using the Stytch API, I wrote a simple script that uploaded the JMeter CSV results to our Stytch platform. The script tagged each file (or “contribution” in Stytch parlance) with key attributes about the test not captured by JMeter. For instance, each of our test results was tagged with the name of the test environment on which the test was run, the dataset used to run the test, the name of the JMeter test that was run, the date and time the test was run, and the run number of the test as supplied by Jenkins. Adding these key attributes and associating them with the test results allowed us to query against them in the Stytch platform.
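Below is a minimal sketch of what such an upload script can look like. The endpoint URL, authentication scheme, and field names are illustrative placeholders rather than the actual Stytch API; the point is simply that the raw JMeter CSV travels together with the attributes Jenkins knows about the run.

```python
import os
import requests

# Hypothetical endpoint and token; not the real Stytch API.
STYTCH_UPLOAD_URL = "https://stytch.example.com/api/contributions"
API_TOKEN = os.environ["STYTCH_API_TOKEN"]


def upload_results(csv_path, test_name, environment, dataset, run_number):
    """Upload one JMeter results CSV as a tagged contribution."""
    metadata = {
        "test_name": test_name,      # which JMeter test plan was run
        "environment": environment,  # test environment the run targeted
        "dataset": dataset,          # dataset used to drive the test
        "run_number": run_number,    # Jenkins build number
    }
    with open(csv_path, "rb") as f:
        resp = requests.post(
            STYTCH_UPLOAD_URL,
            headers={"Authorization": f"Bearer {API_TOKEN}"},
            data=metadata,
            files={"file": (os.path.basename(csv_path), f, "text/csv")},
        )
    resp.raise_for_status()


if __name__ == "__main__":
    # In the Jenkins job this would be driven by build parameters / env vars.
    upload_results(
        csv_path="results.csv",
        test_name="ci_smoke_perf",
        environment=os.environ.get("TEST_ENV", "perf-lab-1"),
        dataset=os.environ.get("TEST_DATASET", "standard"),
        run_number=os.environ.get("BUILD_NUMBER", "0"),
    )
```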

The Jenkins job running the performance tests was then configured to run this upload script every time a JMeter test finished. Just like that, we now had all of our performance test results available for reporting and analysis in Stytch.

Data Modeling in Stytch for Product Performance Analysis

Data modeling in Stytch opens up countless ways to query the data.

Using the Stytch platform, it took a matter of minutes to define a data model based on the JMeter performance results CSV files. This process basically involved tagging columns in the CSV to help Stytch understand which columns were measures we’d want to graph (such as response times, bytes received, or errors) and which were subjects (run number, test name, test environment, etc.) that we’d use to query the measures. This only needs to be done once per data source and, as we had just one data source (the JMeter CSV results), it was fast and painless.
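The tagging itself was done in the Stytch UI, but conceptually the model reduces to a mapping like the one below over the JMeter CSV columns plus the tags added by the upload script. The column names are JMeter’s defaults; the classification shown is purely illustrative.

```python
# Illustrative only: the actual measure/subject tagging was done in the Stytch UI.
# The first six keys are standard JMeter CSV columns; the last three are the
# hypothetical tags added by the upload script.
JMETER_COLUMN_MODEL = {
    # Measures: numeric columns we aggregate and graph
    "elapsed": "measure",      # response time in milliseconds
    "bytes": "measure",        # bytes received
    "Latency": "measure",      # time to first byte
    # Subjects: columns we slice and query the measures by
    "label": "subject",        # name of the HTTP request / operation
    "responseCode": "subject",
    "success": "subject",
    "test_name": "subject",    # tag added at upload time
    "environment": "subject",  # tag added at upload time
    "run_number": "subject",   # tag added at upload time
}
```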

With Stytch now understanding our performance test data, we could ask almost any question and it could build a report containing a chart or table with the answer. To get answers to our most pressing performance questions, I set about building several kinds of reports that I thought would be useful in spotting performance trends.

The first were high-level performance trend reports that charted the performance of iterative builds over days or weeks.

Product Performance Response Time Trend Chart - Stytch Example: A report showing the min, median, and 90th percentile response time trends for a key operation over 7 days’ worth of CI runs.

Product Performance Throughput Trend Chart - Stytch Example: A report showing the throughput trend for our CI performance test over 7 days’ worth of test runs.

To build these reports, I simply graphed min, median, and 90th percentile response times against the test run number attribute we’d added to each test result as they were uploaded to Stytch. This allowed me to summarize each load test, containing thousands of HTTP requests, into a handful of data points on a graph and then show how those trended run over run. Similar reports were created for throughput.
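If you wanted to reproduce this kind of summarization outside of Stytch, the aggregation behind these trend charts is straightforward. The sketch below, using pandas, assumes the per-run JMeter CSVs have been combined into one file with the run_number tag attached to every row; the file name is hypothetical, while "elapsed" and "timeStamp" (epoch milliseconds) are standard JMeter CSV columns.

```python
import pandas as pd

# Hypothetical combined export of all CI runs, one row per HTTP request.
results = pd.read_csv("all_ci_runs.csv")

# Min / median / 90th percentile response time per CI run.
trend = results.groupby("run_number")["elapsed"].agg(
    ["min", "median", lambda s: s.quantile(0.90)]
)
trend.columns = ["min_ms", "median_ms", "p90_ms"]

# Throughput per run: total requests divided by the run's wall-clock duration.
duration_s = results.groupby("run_number")["timeStamp"].agg(
    lambda t: (t.max() - t.min()) / 1000.0
)
throughput = results.groupby("run_number").size() / duration_s

print(trend)
print(throughput)
```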

I also created reports that drilled down into individual test runs to show response time and throughput graphs vs. time. This allowed users to investigate why the median response time of a particular performance test pass may have regressed. Was it a momentary slowdown in the system that skewed the overall number, or did the entire test run slowly?

Product Performance Response Drilldown Chart - Stytch Example: A report showing the response times of every HTTP call from a key test vs. the duration of the test run. This type of report is great for drilling down into the details of a particular performance test run.

Product Performance Throughput Drilldown - Stytch Example: A report showing the throughput of a single CI performance test pass. Like the previous image, this type of report is great for drilling down into the details of a particular performance test run.
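Outside of Stytch, the same drill-down view amounts to plotting the raw per-request data from a single run against elapsed test time. Here is a sketch using pandas and matplotlib; the file name and the 10-second throughput bucket size are arbitrary choices for illustration, while "timeStamp", "elapsed", and "label" are standard JMeter CSV columns.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Single-run drill-down: every request's response time plotted against how far
# into the test it occurred.
run = pd.read_csv("results_run_1234.csv")  # hypothetical single-run export
run["offset_s"] = (run["timeStamp"] - run["timeStamp"].min()) / 1000.0

fig, ax = plt.subplots()
for label, grp in run.groupby("label"):  # one series per HTTP request name
    ax.scatter(grp["offset_s"], grp["elapsed"], s=4, label=label)
ax.set_xlabel("Seconds into test run")
ax.set_ylabel("Response time (ms)")
ax.legend(loc="upper right", fontsize="small")
fig.savefig("response_time_drilldown.png")

# Throughput over the same run: bucket requests into 10-second windows.
run["ts"] = pd.to_datetime(run["timeStamp"], unit="ms")
throughput = run.set_index("ts").resample("10s").size() / 10.0  # requests/sec
print(throughput)
```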

I created other reports that drilled into errors, answering questions such as “Which tests threw which errors on which test environments, and how frequently?” The list of possible reports you can create once the data is in Stytch is nearly endless. If you’ve uploaded the data, you can ask Stytch anything about it and get an answer.

Product Performance Error Drilldown Table - Stytch Example: A report showing the errors occurring in our nightly performance tests over the last week. Errors are categorized by test, operation, and test run, and include error counts and the response codes returned by the server.
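The same error breakdown can be sketched outside of Stytch with a simple group-by over the raw results. As before, the file name and the test_name/run_number tags are the hypothetical attributes added at upload time; "success", "label", and "responseCode" come straight from the JMeter CSV.

```python
import pandas as pd

# Error breakdown across the nightly runs: which tests failed, on which
# operations and runs, and with what response codes.
results = pd.read_csv("nightly_runs_last_week.csv")  # hypothetical export

# JMeter writes "true"/"false" strings, so normalise before filtering.
errors = results[results["success"].astype(str).str.lower() == "false"]

error_summary = (
    errors.groupby(["test_name", "label", "run_number", "responseCode"])
    .size()
    .reset_index(name="error_count")
    .sort_values("error_count", ascending=False)
)
print(error_summary.to_string(index=False))
```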

To make it easy for other teams and management to consume all these product performance reports, I then built dashboards that grouped a series of reports onto a single screen. This allows our development teams to easily see the performance impact of their newly compiled code. All reports and dashboards are updated in real time, so the moment a performance test has finished running and its results have been uploaded, they are available to view in Stytch. This makes for a very fast feedback loop for developers and quality assurance.

Product Performance Dashboard - Stytch Example: A Stytch dashboard that groups together a number of performance reports, making it easy to analyze product performance trends over time.

For those not inclined to log into the reporting platform, reports and dashboards can be scheduled for delivery by email. This means the latest results of the overnight performance tests are waiting in developers’ inboxes first thing in the morning.

In the end, I was able to use Stytch to build a performance testing framework and reporting solution to support fully automated CI performance tests and continuous product performance analysis, something that had previously seemed out of reach. Many of the reporting features and capabilities we implemented were things I’d only heard about at conferences from leading tech companies with the resources to create their own custom reporting solutions. Having a reporting and analytics service such as Stytch that provides these advanced capabilities right out of the box makes it much easier for smaller companies like ours to build a product performance reporting and analysis solution.

If you’re interested in learning more about how we’re using Stytch with JMeter for our product performance analysis, leave a comment, or connect with me on LinkedIn. You can also sign up for a demo of Stytch with our Customer Success team.

August 18, 2016
