Cybersecurity products have to be tested thoroughly before they reach customers, but traditional testing methods have often struggled to keep pace with fast-moving software development. In many organisations, testing large-scale network appliances and embedded systems could take weeks, especially when each firmware update required lengthy manual regression tests. These delays not only slowed product releases but also risked leaving security flaws undetected.
This challenge sparked a shift in thinking for one engineer, whose work on test automation frameworks is now seen as a model across the industry. John Komarthi’s approach proved that faster testing does not have to mean less thorough testing, and in some cases, it can even be more effective.
Discussing his journey, he shared that during his time at SonicWALL, he noticed how security validation was often pushed to the end of development, creating last-minute roadblocks. Testing mobile VPN platforms and APIs meant working with fragmented scripts that were hard to reuse and difficult to scale. To solve this, he built a modular testing framework that treated each test step as a reusable block. Actions like logging in, injecting a threat payload, or validating responses could be combined in different ways to cover multiple scenarios without starting from scratch each time.
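The reusable-block idea can be sketched in a few lines of Python. This is a minimal illustration, not the actual framework: the step names (`login`, `inject_payload`, `validate_blocked`) and the shared-context design are assumptions made for the example. Each step is a small callable acting on a shared context, and a scenario is just an ordered list of steps.

```python
# Illustrative sketch of composable test steps; names and logic are
# invented for the example, not taken from the real framework.

def login(ctx):
    # Simulate authenticating and storing a session in the shared context.
    ctx["session"] = f"session-for-{ctx['user']}"

def inject_payload(ctx):
    # Simulate the device's verdict on a submitted payload.
    ctx["response"] = "blocked" if ctx["payload"].startswith("<script>") else "allowed"

def validate_blocked(ctx):
    assert ctx["response"] == "blocked", f"payload was not blocked: {ctx['payload']}"

def run_scenario(steps, **initial):
    # Run each step in order against one shared context dict.
    ctx = dict(initial)
    for step in steps:
        step(ctx)
    return ctx

# The same blocks recombine into different scenarios without rewriting them.
xss_check = [login, inject_payload, validate_blocked]
result = run_scenario(xss_check, user="qa", payload="<script>alert(1)</script>")
print(result["response"])  # "blocked"
```

Because scenarios are plain lists, covering a new case is a matter of reordering or swapping blocks rather than copying a script.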
Additionally, he introduced a configuration system that allowed tests to run across multiple devices and firmware versions automatically. With the help of continuous integration tools like Jenkins and containerization with Docker, entire test cycles could now run in parallel. What once took five days could be completed in less than one, an 80% reduction in testing time that became the new standard for the team.
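The configuration-driven fan-out can be sketched as follows. The device names, firmware versions, and the thread-based runner are illustrative assumptions; the real pipeline dispatches to Jenkins agents and Docker containers rather than local threads. The point is that one config describes the device/firmware matrix, and each combination becomes an independent job that can run in parallel.

```python
# Sketch of a config-driven test matrix run in parallel.
# Device/firmware identifiers are made up for illustration.
from concurrent.futures import ThreadPoolExecutor
from itertools import product

config = {
    "devices": ["appliance-a", "appliance-b"],
    "firmware": ["7.0.1", "7.1.0"],
}

def run_suite(device, firmware):
    # Placeholder for dispatching the real suite against one target.
    return (device, firmware, "pass")

# Every device x firmware pair becomes one independent job.
jobs = list(product(config["devices"], config["firmware"]))
with ThreadPoolExecutor() as pool:
    results = list(pool.map(lambda job: run_suite(*job), jobs))

print(len(results))  # 4 combinations, executed concurrently
```

Adding a new firmware version to the config grows the matrix automatically, with no new scripts to write.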
At Intel Security (McAfee), he applied the same principles to firmware testing for secure wireless connections and policy enforcement. By reorganizing large, rigid test scripts into smaller, stateful components, his team could run flexible test sequences that uncovered issues faster. He also automated log analysis, feeding data directly into performance dashboards to cut down on manual review work.
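Automated log triage of this kind might look like the sketch below. The log format and severity levels are invented for the example: raw lines are reduced to counts per severity, numbers that can be pushed to a dashboard instead of being reviewed by hand.

```python
# Hedged sketch of automated log analysis; the log format is assumed.
import re
from collections import Counter

LOG_LINE = re.compile(r"^\[(?P<level>\w+)\]\s+(?P<msg>.*)$")

def summarize(lines):
    # Tally log entries by severity so dashboards get numbers, not text.
    counts = Counter()
    for line in lines:
        match = LOG_LINE.match(line)
        if match:
            counts[match.group("level")] += 1
    return counts

logs = [
    "[ERROR] policy enforcement failed on wlan0",
    "[WARN] retrying handshake",
    "[ERROR] firmware checksum mismatch",
    "[INFO] connection established",
]
print(summarize(logs))  # Counter({'ERROR': 2, 'WARN': 1, 'INFO': 1})
```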
Now at Fortinet, he continues to refine these ideas, creating a cloud-based QA pipeline for FortiWeb that uses adaptive testing. This means that if a test run shows gaps in coverage, the system automatically expands the scenarios and payloads for the next cycle. The pipeline integrates tools like Pytest, Prometheus, and Grafana, combining performance monitoring with real-time testing insights.
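The adaptive idea can be sketched simply. The coverage threshold, scenario names, and variant-derivation rule below are all illustrative assumptions, not FortiWeb internals: after each cycle, any scenario whose coverage falls short of a target gets additional payload variants queued for the next run.

```python
# Simplified sketch of adaptive test expansion; threshold and
# payload data are invented for illustration.
def next_cycle(coverage, payloads, target=0.9):
    # Expand payload sets only for scenarios with a coverage gap.
    expanded = dict(payloads)
    for scenario, cov in coverage.items():
        if cov < target:
            expanded[scenario] = payloads[scenario] + [
                p + "#variant" for p in payloads[scenario]
            ]
    return expanded

coverage = {"sql-injection": 0.95, "xss": 0.72}
payloads = {
    "sql-injection": ["' OR 1=1 --"],
    "xss": ["<script>alert(1)</script>"],
}
plan = next_cycle(coverage, payloads)
print(len(plan["xss"]), len(plan["sql-injection"]))  # 2 1
```

The under-covered scenario ("xss") grows its payload set for the next cycle, while the well-covered one is left alone.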
The key to these improvements, he believes, lies in how tests are architected. “The most overlooked part of security testing is test architecture itself,” he added. “Most teams build automation scripts reactively, focused on specific bugs or scenarios. That’s why scaling becomes a nightmare.” A modular, data-driven framework, he argues, is easier to maintain and can absorb new challenges without slowing down development.
For the cybersecurity industry, the lesson is straightforward. With products becoming more complex and cyber threats growing, testing can no longer be treated as an afterthought or a time-consuming barrier. It needs to be built into the development cycle from the start, with tools and frameworks that can move as fast as the products they support.
Reducing testing time by 80% is not just about speeding up releases. It’s about improving accuracy, catching more security flaws early, and giving teams the confidence that their products are ready for real-world threats. As organisations push to deliver secure and reliable software, smarter testing frameworks like this one are setting the pace for how the industry measures both speed and quality.


