Evolution, not revolution
Since performance testing and the tools supporting it emerged in the 1990s, performance testing tools have evolved, but one could say there has been no revolution.
The basic functionality of performance testing tools has always been the same:
- Scripting: Creating scripts/programs simulating real users or API usage
- Execution: Running multiple virtual users simultaneously with these scripts to see how the system under test performs under load
- Reporting: Collecting and reporting the test results
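The three phases above can be sketched in a few lines. This is a minimal illustration, not any particular tool's implementation; the system under test is stood in for by a local function that sleeps, so no real network calls are made.

```python
# Minimal sketch of a load test: scripting, execution, reporting.
# The "system under test" is simulated locally (assumption for the sketch).
import concurrent.futures
import random
import statistics
import time

def user_script():
    """Scripting: one virtual user's transaction, returning its response time."""
    start = time.perf_counter()
    time.sleep(random.uniform(0.01, 0.05))  # stand-in for a real request
    return time.perf_counter() - start

def run_load_test(virtual_users=20, iterations=5):
    """Execution: run many virtual users concurrently, then summarize."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=virtual_users) as pool:
        timings = list(pool.map(lambda _: user_script(),
                                range(virtual_users * iterations)))
    # Reporting: aggregate the collected response times
    return {
        "samples": len(timings),
        "avg_s": statistics.mean(timings),
        "p95_s": statistics.quantiles(timings, n=20)[18],  # 95th percentile
    }

report = run_load_test()
print(report)
```

Real tools add protocol support, ramp-up schedules, and richer reporting on top of this same skeleton.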
The positioning of free and commercial tools has also remained the same since the early 2000s.
Free tools are popular.
Free tools are very popular for obvious reasons: money and ideology ("I do not pay for testing tools").
Commercial tools are needed when the going gets tough.
Commercial tools have their place mostly in more demanding environments, such as when a protocol is not supported by free tools (SAP, Citrix, etc.) or when big, distributed test loads of over 1,000 concurrent users are needed.
Two important tools, the free JMeter and the commercial LoadRunner, are still the same tools and remain valid options for most performance testing projects after over 20 years on the market.
Both tools have evolved somewhat, but they have not changed or improved much.
I think this is mainly for three reasons:
- Both are solid and good tools for the functionality provided.
- The pressure and need to develop these tools further have not been great enough: new advanced features are not absolutely needed in most basic performance testing cases.
- The lack of advanced competition has not forced these tools to evolve. There are lots of performance testing tools on the market, but only a few have really made a mark as a LoadRunner/JMeter killer.
Old knowledge is still good.
The positive side here is that since the same tools remain valid, there is no need to learn new tools for scripting and running tests.
New additional tools and knowledge are needed.
However, there are new and additional tools every performance tester should know.
From a performance testing consultant's perspective, the biggest changes have been these three:
BlazeMeter – Cloud-based performance testing tools and applications under test.
Traditionally, performance testing tools were on-premise software installed locally, but cloud-based performance testing tools have grown in popularity. Clearly the biggest success story has been the SaaS service BlazeMeter, which basically scales up JMeter tests to bigger loads. Scaling has been a problem in local JMeter tests, as distributed testing does not work very well.
New Relic etc. – Application Performance Management (APM) tools making analysis much better.
Finding the root cause of performance problems was sometimes easy, but often hard, before APM tools came to market. Currently, APM tools are a very important part of performance testing analysis, as most issues lie inside the application and its integrations: relatively easy to spot with APM and very difficult without it.
NosyMouse JAA – Requirement for continuous performance testing and analysis
Continuous performance testing has been something to consider once functional tests are in place. However, implementations have been far from good, as current performance testing tools integrated with CI tools support automated analysis poorly. The new SaaS service NosyMouse JAA fills this gap with automated verification of performance requirements and comparisons to baselines.
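The baseline comparison described above can be sketched as a simple CI gate. This is an assumed illustration of the general technique, not NosyMouse JAA's actual implementation; the transaction names and thresholds are hypothetical.

```python
# Hedged sketch: fail a CI build if any transaction's 95th-percentile
# response time regresses more than a tolerance versus a stored baseline.
def check_against_baseline(current, baseline, max_regression=0.20):
    """Return a list of failure messages; an empty list means the gate passes."""
    failures = []
    for name, base_p95 in baseline.items():
        cur_p95 = current.get(name)
        if cur_p95 is None:
            failures.append(f"{name}: missing from current run")
        elif cur_p95 > base_p95 * (1 + max_regression):
            failures.append(f"{name}: {cur_p95:.3f}s vs baseline {base_p95:.3f}s")
    return failures

# Hypothetical p95 response times in seconds
baseline = {"login": 0.80, "search": 1.20}
current = {"login": 0.85, "search": 1.60}

problems = check_against_baseline(current, baseline)
print(problems)
```

Here "login" stays within the 20% tolerance while "search" regresses beyond it, so the gate would fail the build for that transaction only.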
The evolution of performance testing tools has been slow, but new tools complement the old dinosaurs and have made performance testing more effective.
Ilkka Myllylä is the most experienced performance tester in Finland with over 200 applications performance tested.