Wednesday, September 16, 2009

Why Test Scripts Suck

I’ll trade five IT testing professionals for one motivated business user when it comes to making sure an application works as designed. Why? Because that business user has all of her day-to-day requirements, pain points, and frustrations invested in the new application. The testing professional, on the other hand, just has to make sure the test script is executed flawlessly — that’s it, on to the next project.

Below is the typical best-practice software development mantra that project managers will promote. I’ve added some notes under each item and a few new mantras of my own.

The software implements the required functions.

Have the requirements been allowed to change during the project? Flexibility is key to the perceived success of any project. If concessions cannot be made without huge pushback, and changing requirements is a pain from the business’s perspective, the project should be stopped and re-evaluated for its purpose (and its management). This is especially pronounced on large projects.

Added to normal Project Manager’s software development lifecycle list:

Prototyping

Prototype functionality so business users can experience (see) what’s been talked about and promised. If a third party or internal team cannot commit to showing their application during development for fear of “giving a bad impression” or “scaring” the business, then there are trust and communication issues that need to be dealt with now rather than during full User Acceptance Testing.

The software has passed unit testing.

Make sure the developers know the architecture of the application as a whole, its requirements, and the importance of unit testing and integration into the larger application or service. If one developer is slacking, the project is at risk of failure. By this point in the timeline, project managers should have a good idea of who can perform and who needs help.
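For illustration, here is what unit-level coverage of one small piece of an application might look like. The `apply_discount` function is a made-up stand-in, but the pattern is the point: a typical case, a boundary case, and a bad input.

```python
import unittest

# Hypothetical function under test: applies a percentage discount to a price.
def apply_discount(price, percent):
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100.0), 2)

class ApplyDiscountTest(unittest.TestCase):
    def test_typical_discount(self):
        self.assertEqual(apply_discount(100.00, 25), 75.00)

    def test_zero_discount_leaves_price_unchanged(self):
        self.assertEqual(apply_discount(59.99, 0), 59.99)

    def test_out_of_range_percent_is_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.00, 150)

# Run with: python -m unittest <module name>
```

A developer who understands the larger application writes the third kind of test without being asked; one who only sees their own component usually stops at the first.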

Added to normal Project Manager’s software development lifecycle list:

Code Reviews

A junior developer cannot possibly know all of the ins and outs of the application if they are focused on coding specific components or services. All code, at least initially, should be reviewed by senior developers and architects to ensure efficiency and scalability.

The software source code has been checked into the repository.

This can be a pain in the ass if the project is small, but it’s necessary if you are developing with others and integrating into a larger repository. It’s also a good checkpoint for senior developers to do quick code reviews.

The software has been compiled into the current build (for compiled systems) and deployed into the appropriate test environment.

Without proper safeguards, one developer’s code could break a whole series of other test scripts. Smoke testing is highly advised before fully committing the code.
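A smoke test doesn’t have to be elaborate — a short script that hits the critical paths before a commit lands in the shared build is enough. A minimal sketch (the host and endpoints here are invented for illustration):

```python
import urllib.request

# Hypothetical critical paths for the application under test;
# the host and endpoints are made up for illustration.
SMOKE_CHECKS = [
    ("login page loads", "http://test-env.example.com/login"),
    ("search responds", "http://test-env.example.com/search?q=smoke"),
]

def fetch_status(url):
    """Return the HTTP status code for url (raises on connection failure)."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.status

def run_smoke_tests(checks, fetch=fetch_status):
    """Run each named check and return a list of human-readable failures."""
    failures = []
    for name, url in checks:
        try:
            status = fetch(url)
            if status != 200:
                failures.append("%s: HTTP %d" % (name, status))
        except Exception as exc:
            failures.append("%s: %s" % (name, exc))
    return failures

# In practice: run run_smoke_tests(SMOKE_CHECKS) before committing and
# refuse the commit if it returns any failures.
```

The point is the gate, not the tooling: if the smoke check fails, the code doesn’t go into the shared build.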

The team has developed appropriate test scripts that exercise the software's capabilities.

These scripts are usually end-to-end tests run by a few clueless testers, not by irrational business users — the ones who change their minds, cancel mid-transaction, go to lunch, upload their whole hard drive, etc.


The software has passed integration and system testing, and been deployed into the user acceptance test environment.

Many large projects are desperate for true testing environments but usually skimp on resources for them. This poses an issue when the new build is supposed to be deployed and fully functional in a test environment that is held together with kludges.

Added to normal Project Manager’s software development lifecycle list:

Performance and Scalability Testing

Third-party developers often comment during the development phase that they wonder whether the application will scale or how it will perform under load. They mention it to the project manager, and the discussion usually ends there. If it’s only brought up this late in the project, with no time allotted to it, forget about it. Also, why would the third party be motivated to address it, when poor performance is a typical reason they get called back in for more business?
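Even a crude load check is better than none. A minimal sketch, assuming the operation under test can be wrapped in a function — `handle_request` here is a stand-in that simulates 10 ms of work, where a real run would hit the application:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def handle_request(payload):
    """Stand-in for the operation under test; simulates 10 ms of work."""
    time.sleep(0.01)
    return len(payload)

def load_test(func, payload, concurrency=20, requests=200):
    """Fire `requests` calls across `concurrency` workers; report latency stats."""
    latencies = []

    def timed_call(_):
        start = time.perf_counter()
        func(payload)
        latencies.append(time.perf_counter() - start)

    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        list(pool.map(timed_call, range(requests)))

    latencies.sort()
    return {
        "requests": len(latencies),
        "avg_ms": 1000 * sum(latencies) / len(latencies),
        "p95_ms": 1000 * latencies[int(0.95 * len(latencies)) - 1],
    }
```

An afternoon spent running something like this against the test environment answers the “will it scale?” question far earlier than User Acceptance Testing will.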

The users have had an opportunity to use and respond to the software, and their change requests have been acknowledged and implemented where appropriate.

Again, this is important, but there should be no surprises at this point. The users should have had their requirements refined, prototyped, and tested well before now.

The software has been documented in accordance with whatever standards your project follows.

Have you ever seen a test script written for documentation accuracy?

If documentation is not kept up throughout the whole project, this document will be worthless. I have never worked on a project where the design document was still perfect after being signed off. During development and bug fixing, the design doc needs to be corrected, changed, or expanded.

Also, the deployment and knowledge transfer documentation should be complete and tested.
