The evolution of software tools to manage the software development process has been one of reverse progression: not in sophistication or time, but in terms of the stage of the development life cycle supported. Some of the earliest tools helped with configuration management: managing versions of software components, defining baselines from combinations of those versions, comparing baselines, and, of course, helping to build the program from the defined components.
But from that point on, the introduction of new software tools tended to follow a reverse project flow. Test management tools came next, and ultimately test automation tools, both of which are used at the end of the traditional development life cycle. These tools supported the testing process by allowing tests to be formally recorded and the results saved, so that software engineers could see where they needed to focus their effort, and of course by reporting test result statistics. Then, as software tools matured, support for the development process slowly began to move backwards through the life cycle.
Some groups dabbled with context-sensitive editors to help the software engineer directly, and most projects were introduced to debuggers. Moving back further still, we saw the advent of design tools, made possible by the increasing sophistication of graphical displays. Once the design process could be supported with software tools, the goal became to move further back in the process and provide analysis support, usually combined with existing modeling and design capabilities.
Lastly, and rather oddly, we moved back far enough in the life cycle to see the start of support for, and automation of, requirements management. Initially, people tried to use generalized tools such as spreadsheets and word processors for this, but no satisfactory solution presented itself until requirements and traceability management products came on the scene.
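To make the traceability idea concrete, here is a minimal sketch of the kind of requirements-to-test bookkeeping teams once attempted in spreadsheets, and that dedicated tools later automated. All requirement and test-case identifiers here are hypothetical, invented purely for illustration.

```python
# Hypothetical requirements register (ID -> description).
requirements = {
    "REQ-001": "User can log in with a valid password",
    "REQ-002": "Account locks after three failed attempts",
    "REQ-003": "Password reset email is sent within one minute",
}

# Each test case records which requirement(s) it verifies.
test_cases = {
    "TC-101": ["REQ-001"],
    "TC-102": ["REQ-001", "REQ-002"],
}

# The traceability report: requirements with no covering test are
# exactly the gaps a traceability tool surfaces automatically, and
# that a hand-maintained spreadsheet tends to let slip.
covered = {req for reqs in test_cases.values() for req in reqs}
uncovered = sorted(set(requirements) - covered)
print("Uncovered requirements:", uncovered)
# → Uncovered requirements: ['REQ-003']
```

Even this toy version shows why the spreadsheet approach broke down: every rename, split, or deletion of a requirement had to be propagated by hand across every row that referenced it.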
There are exceptions to this reverse progression, especially tools that support cross-lifecycle functions such as process management and project management, but generally this was the order in which support tools matured.
Because, broadly speaking, tools developed in this reverse-lifecycle manner, it was only natural that tool integrations followed the same path: test tools with configuration management, design tools with code generation, then requirements tools with testing tools, and ultimately requirements tools with analysis and design tools (an integration whose maturity is still ongoing).
In some cases, the order of tool maturation is understandable: software engineers, especially while doing everything else manually, needed a way to organize the proliferation of code components they were creating, hence CM tools. But one sequence of events seems, in retrospect, rather puzzling. Why was it deemed necessary to automate the test management process before there were any decent tools for recording exactly what we were testing for? How meaningful can tests really be if there is no clear statement of what the software should actually do? This seems to be very much a case of putting the cart before the horse. As has been said many times, introduce a tool into a bad process and you simply do the wrong things faster. So yes, you could test code much faster and record the results, but the test criteria were still based on badly managed descriptions of user needs. I seem to recall (yes, I am old enough to recall) that much testing was actually to see whether the program did what the software engineer said it should, without crashing! At one time the software engineer had most of the tools for the job, and was often the flag bearer for the end user. Then along came formal requirements tools, and suddenly less technical people were able to define requirements, prioritize them, and specify the content of releases.
So, for a considerable period of time, we did well-organized testing, but not against formally defined requirements. The chicken came very much before the egg.
Footnote: Interestingly, Agile processes have given much of the end-user representation back to the software and test engineers, perhaps trying to reverse the results of this evolution. A tall order indeed.