The future of technology, and specifically of load and performance testing, is a broad topic. In the spirit of keeping a narrow focus, let’s look at a single area for a moment: the Internet of Things (IoT). What specific challenges can performance testers and developers see coming?
The combination of the broad range of new IoT device types being introduced and the astounding growth in overall device volume creates challenges, including device heterogeneity (e.g., proprietary Android implementations) and the sheer scope of load simulation needed for real-world testing of large numbers of these devices.
In addition to new device types, disruptive IoT technologies are being introduced by large and small players in many markets. Take healthcare as one example. IoT devices, including wearables, will bring new healthcare devices, protocols, security requirements, and technologies, such as 24/7 diabetic monitoring and data-driven dosing from a smartwatch. This isn’t much of a stretch from where we are today: Samsung Health applications already monitor heart rate and exercise levels constantly, and wearable tech is just one sensor innovation away from continuous blood glucose monitoring. Complement this with a Bluetooth link to an insulin pump, and you effectively have life-changing benefits for diabetics. But how do you design tests for this?
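To make the question concrete, here is a minimal sketch of how you might generate wearable telemetry from a fleet of virtual devices to replay against a test endpoint. The device model, field names, and glucose values are all invented for illustration; a real continuous glucose monitor would have its own protocol and data format.

```python
import random
import time

def glucose_reading(device_id: str) -> dict:
    """Simulate one telemetry sample from a hypothetical CGM wearable."""
    return {
        "device_id": device_id,
        "timestamp": time.time(),
        "glucose_mg_dl": round(random.gauss(110, 25), 1),  # synthetic value
    }

def simulate_devices(count: int, samples: int) -> list[dict]:
    """Generate a batch of readings from `count` virtual devices."""
    return [
        glucose_reading(f"cgm-{n:04d}")
        for _ in range(samples)
        for n in range(count)
    ]

# Three virtual devices, two samples each: six readings to replay.
batch = simulate_devices(count=3, samples=2)
print(len(batch))  # → 6
```

Scaling `count` into the tens of thousands is where the "sheer scope of load simulation" challenge shows up in practice.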
This is where having great testing tools enters the conversation. DevOps teams will have to find new ways of coping with an ever-changing array of tool options and evolving testing goals, while trying to take advantage of pre-packaged, “snap-fit” integrations within their existing development and testing environments.
How Performance Testing Will Need to Evolve
As the performance testing space takes aim at these challenges, the testing process will evolve to include higher levels of automation across more of the testing cycle and to incorporate advances in areas like Artificial Intelligence (AI).
For example, you may one day be able to explain to the toolchain exactly what you want to test, how each element within the test environment should relate to and work with the other elements, and what you want to do with the data. Today, performance test optimization is done manually. With an automated, AI-driven approach, testing coverage and reliability might improve automatically over time as test results are fed back into the front end of the process and historical datasets accumulate in a tuning and optimization engine.
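As a toy illustration of that feedback loop, the sketch below replaces a real load-test run with a synthetic latency model (the `run_load_test` function and its numbers are invented for illustration) and raises concurrency until a latency target is breached, feeding each result back into the next decision:

```python
import random

def run_load_test(concurrency: int) -> float:
    """Stand-in for a real load-test run; returns p95 latency in ms.
    Synthetic model: latency grows with concurrency, plus noise."""
    return 50 + concurrency * 0.8 + random.uniform(-5, 5)

def tune(target_p95_ms: float, start: int = 10, step: int = 10) -> int:
    """Naive feedback loop: raise concurrency until the latency
    target is breached, then report the last level that met it."""
    concurrency = start
    while True:
        p95 = run_load_test(concurrency)
        if p95 > target_p95_ms:
            return concurrency - step  # last level under the SLA
        concurrency += step

safe_level = tune(target_p95_ms=120.0)
print(safe_level)
```

A real tuning engine would of course explore more dimensions than concurrency and learn across runs, but the loop shape (run, measure, adjust, repeat) is the same.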
As a tester, it’s conceivable that the future will allow you to describe to the AI, in straightforward language rather than technical descriptions like API calls, exactly what the application should accept as valid user input, how to interact with services the app may call, and what the resulting performance response should look like.
Given the variety of elements in the testing environment, we may eventually see the emergence of standards (go figure!) that make it easier and faster to integrate new performance testing tools supporting new app architectures, IoT devices, and emerging protocols.
The Rise of AI-Driven Performance Testing
Since we are trying to deliver engaging applications that perform and please end users, in the future we can perhaps “scrape” social media input from our customers to generate feedback on app performance in the real world, and feed that information back into the development and testing process. Where DevOps teams do this at all today, it is a highly manual black art. Under an automated, AI-driven approach, we might be able to use this social media feedback as a design goal for performance, based on the actual real-world-perceived app performance of, say, an autonomous automobile.
These days, the power of AI in a fairly advanced form is available to all of us, anytime, anywhere: think Cortana, Siri, and Alexa. One thing these AI applications have in common is a limited domain; that is, the user has to define the specific AI goal. This is true whether you are talking about home automation with Alexa or a future AI designed for creating test cases automatically, generating test code, performing codeless tests, or accelerating the DevOps cycle in general. The first reasonable use case for this type of AI testing could be test management and the automatic creation of test cases. A set of built-in processes and standards could go a long way toward keeping all of the DevOps teams consistent and aligned.
Perhaps another reasonable use case is codeless test automation: you define, in plain language, what should be tested, the conditions that need to be met, the success criteria/SLAs, and so on, and the AI creates and runs tests against your application automatically, without you writing code.
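A minimal sketch of what such a declarative spec and its evaluation might look like. Every metric name, threshold, and endpoint here is hypothetical; the point is that the tester supplies the intent and the thresholds, and the tooling supplies the execution:

```python
# A hypothetical declarative test spec an AI (or a tester)
# might produce instead of code.
spec = {
    "target": "https://example.test/api/checkout",  # hypothetical endpoint
    "virtual_users": 50,
    "duration_seconds": 60,
    "success_criteria": {
        "p95_latency_ms": {"max": 250},
        "error_rate": {"max": 0.01},
    },
}

def evaluate(results: dict, criteria: dict) -> list[str]:
    """Check measured results against the spec's SLA thresholds."""
    failures = []
    for metric, rule in criteria.items():
        value = results.get(metric)
        if value is None:
            failures.append(f"{metric}: no measurement")
        elif "max" in rule and value > rule["max"]:
            failures.append(f"{metric}: {value} exceeds {rule['max']}")
    return failures

measured = {"p95_latency_ms": 310.0, "error_rate": 0.002}
print(evaluate(measured, spec["success_criteria"]))
# → ['p95_latency_ms: 310.0 exceeds 250']
```

The "codeless" promise is that the `spec` half would be written (or spoken) in plain language and the `evaluate` half would live inside the tool.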
Since one of AI’s core strengths is pattern recognition, it is logical to think that AI could extract relevant patterns usable for load testing, helping with the test modeling process.
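As a trivial example of pattern extraction, even a simple hour-of-day histogram over access-log timestamps exposes a daily traffic shape that a load model could replay. The log data below is synthetic; a real pipeline would mine production logs and likely use far richer features than hour-of-day:

```python
from collections import Counter

def hourly_profile(timestamps: list[float]) -> Counter:
    """Bucket request epoch timestamps by hour of day to expose
    the daily traffic pattern a load model could replay."""
    return Counter(int(ts // 3600) % 24 for ts in timestamps)

# Synthetic log: bursts at hours 9 and 20, one stray request at hour 2.
log = (
    [9 * 3600 + i for i in range(5)]
    + [20 * 3600 + i for i in range(3)]
    + [2 * 3600]
)
profile = hourly_profile(log)
print(profile.most_common(2))  # → [(9, 5), (20, 3)]
```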
The AI, having learned the app’s performance “fingerprint” during earlier component testing in development, could act as a partner, suggesting best practices for how to carry out app load testing. It’d be like having a chess grandmaster at your side as you attempt your first play of the match.
This AI-driven approach is machine learning in action. Carry out a load test against the application and check its performance with your monitoring tool; if the simulation is off, re-run the test with a new set of parameters. It’s that easy, right? Naturally, this is a “rinse and repeat” cycle that lets you run hundreds of testing combinations per minute while building a robust set of app performance data.
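That “rinse and repeat” cycle can be sketched as a parameter sweep. Here the load-test run is again replaced by a synthetic latency model (the parameters and formula are invented), since the point is the shape of the loop and the dataset it builds, not the harness:

```python
import itertools

def run_test(users: int, ramp_s: int, payload_kb: int) -> dict:
    """Stand-in for one load-test run; a real harness would drive
    actual traffic and read metrics from monitoring."""
    latency = 40 + users * 0.5 + payload_kb * 0.2  # synthetic model
    return {"users": users, "ramp_s": ramp_s,
            "payload_kb": payload_kb, "p95_ms": latency}

# Sweep every combination of parameters to build a performance dataset.
grid = itertools.product([10, 50, 100], [5, 30], [1, 64])
dataset = [run_test(u, r, p) for u, r, p in grid]
print(len(dataset))  # → 12
```

Each row of `dataset` is one test combination; accumulated over many runs, this is exactly the kind of historical data a tuning and optimization engine would learn from.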
Summary
As organizations navigate the digital transformation (there’s that phrase again), we’ve already seen how this transformation drives changes across the business, including changing requirements for products (what is made and sold), processes (how the team is organized to make the products and bring them to market), and people (the makeup, levels, and skills of the staff).
One thing that remains unchanged is the need for developers and testers to showcase their ability to adapt and apply critical thinking to new challenges, highlighting their value to the business as the central force implementing the coming digital transformation.
Achieving both speed and quality for performance testing in a DevOps environment will be the end result of:
- Tester/developer empowerment to own app performance testing
- The continued delivery of more engaging applications for end users
- Performance testing process optimization which includes automation wherever possible
Beating the competition to market (and keeping it behind you) will depend on developers and testers working in unison. This DevOps team will create faster, improved testing practices that drive business transformation and fuel new opportunities.
Miss the previous parts of this three-post series?