Submitted by superjac

How can companies speed up the turnaround in software releases with the monkey of regression testing on their back?

My company has been applying more ideas from an Agile framework to our software development. We have shortened our release schedule from 18 months to 2 months, but my manager is pushing for us to reach our goal of a release every month. 

Most of the testing is done in tandem with the development, but the holdup in reaching the one-month goal is the regression testing. I inherited the list of scripted regression tests, and running them takes 4 full weeks. If at any point one fails, we are forced to consider whether we should start over after the fix is made. If we do, we will certainly miss the one-month goal. Worse, the regression test list grows monthly as new features are added.

I've considered a number of possible solutions, including:

  • releasing the software with a known issue warning
  • redesigning all the regression tests to try to test more items at once, thus shortening the testing time. Surprisingly, no one in testing or development supported this idea.

Is there some fundamental concept that I am missing?

jrosenau

My suggestion is based on the assumption that the redesign of the regression scripts would be on top of the current project load.

Is it possible to have the redesign of the regression scripts declared a project in itself?  It seems to me that you need to ensure that the new scripts can still catch regression issues, and the only way to do that is to run the old and new scripts side by side and compare results.  That will obviously take a lot of time, given the testing times you've laid out above.  A project would provide the ability to focus and ensure nothing is overlooked.
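
If it helps to make that comparison concrete, here is a minimal sketch of one way the side-by-side check could work, assuming each suite can dump its results as a simple test-name-to-outcome mapping (the file names and format here are hypothetical):

```python
# Compare old and new regression suite results side by side.
# Assumes each suite emits results as JSON of the form
# {"test name": "pass" | "fail"}; file names and format are hypothetical.
import json


def load_results(path: str) -> dict[str, str]:
    with open(path) as f:
        return json.load(f)


def compare(old_path: str, new_path: str) -> None:
    old, new = load_results(old_path), load_results(new_path)
    # Failures the old suite caught but the new suite did not are the ones
    # that matter most: they suggest the redesign lost coverage.
    missed = [name for name, outcome in old.items()
              if outcome == "fail" and new.get(name) != "fail"]
    extra = [name for name in new if name not in old]
    print(f"Old-suite failures not reproduced by the new suite: {missed}")
    print(f"Tests only present in the new suite: {extra}")


if __name__ == "__main__":
    compare("old_suite_results.json", "new_suite_results.json")
```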

My other suggestion is to have a senior technical person look at the scripts and see whether some could be tweaked to run faster or be consolidated.  Maybe there is some low-hanging fruit that could help show the organization that you are serious about getting the time down.  The organization may then be more willing to either create a project or resource the redesign in a different way.

My last suggestion is that as new features and functions are rolled in and you need to add regression tests, the person writing them needs to review the current regression tests and consolidate where possible.

John

mtietel

If you truly want to be "Agile", then you can't "pipeline" your regression testing. It has to happen in parallel with the development so that you're always running regression (and you know immediately when a developer has broken something!). And if you find a problem and have to restart all regression testing, then it's not a testing problem; it's a design/architecture problem with the software under test (more about that below).

Here are some things we did on a recent non-Agile project that had been in existence for several years:

- automate; four weeks to run a regression suite?! You have to change the cost ratio of test creation vs. execution from 10/90 to 90/10. That is, testers don't spend their time running tests; the tests run unattended, at night, so the testers spend their time writing tests and analyzing results.

- eliminate duplicate or near-duplicate test cases; combine similar scenarios so the setup (or the bulk of the scenario) is done only once (we went from 20 hours to 12 hours using this technique)

- break the regression into smaller suites and figure out how to run them in parallel (we went from 12 hours to 2 hours using this technique; there's a sketch of one way to do this right after this list)

- do constant, ruthless regression-suite maintenance; if the suite is constantly growing, you're doomed (and your test team isn't doing its job). You have to constantly evaluate which tests should be part of your regression suite and eliminate the low-value ones (do you know when a given regression test last caught a bug? If you can't remember, then maybe it shouldn't be part of the suite!)
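
As a hedged illustration of the "run unattended, in parallel" points above, here is a minimal sketch of a script a nightly scheduled job could run, assuming the regression has already been split into independent, pytest-runnable sub-suites (the directory names below are hypothetical):

```python
# Run regression sub-suites unattended and in parallel, e.g. from a nightly
# scheduled job. Assumes pytest is installed and the suites are independent;
# the directory names below are hypothetical.
import subprocess
from concurrent.futures import ThreadPoolExecutor

SUITES = ["regression/billing", "regression/reporting", "regression/search"]


def run_suite(path: str) -> tuple[str, int]:
    # Each suite writes a JUnit-style XML report so testers analyze results
    # in the morning instead of watching the tests run.
    report = f"reports/{path.replace('/', '_')}.xml"
    result = subprocess.run(["pytest", path, f"--junitxml={report}"],
                            capture_output=True)
    return path, result.returncode


if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=len(SUITES)) as pool:
        for path, code in pool.map(run_suite, SUITES):
            print(f"{path}: {'PASSED' if code == 0 else 'FAILED'}")
```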

We're on a 2-week cycle on my current project. It has been Agile from the start, so there's a definite "everyone is responsible for quality" mentality when it comes to testing. There are suites of developer-maintained unit tests that run as part of the build for each code check-in (which may happen tens to hundreds of times a day), so they must run in seconds. Those tests are also part of a larger "coverage" build - we're requiring 85% code coverage. We also have developer-maintained integration tests and system tests. Taken together, these give the developers near-instant feedback, keep defect detection early in the SDLC where the cost to fix is low, force good design practices (high cohesion, loose coupling, etc.), and place less reliance on the testers' regression suites. The testers on this project have a number of different suites for different purposes - the regression suite takes about an hour to run and runs every day, not just during a specified testing period.
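
For what it's worth, here is a minimal sketch of what a per-check-in gate like that can look like, assuming pytest with the pytest-cov plugin; the package name "myapp" and the "unit" marker are placeholder names, and the 85% figure mirrors the coverage requirement mentioned above:

```python
# Per-check-in build gate: run only the fast unit tests with a coverage
# threshold, and fail the build if either the tests or the 85% bar fails.
# Assumes pytest and pytest-cov are installed; "myapp" and the "unit"
# marker are hypothetical names.
import sys

import pytest

if __name__ == "__main__":
    exit_code = pytest.main([
        "-m", "unit",            # only the fast, developer-maintained unit tests
        "--cov=myapp",           # measure coverage of the application package
        "--cov-fail-under=85",   # enforce the 85% code-coverage requirement
        "-q",
    ])
    sys.exit(int(exit_code))
```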

suedavis

If you fail a test, you absolutely must re-test after you commit the fix. Any time you make a change to your code base, all previous test results, period, are no longer valid.

High-performing teams have their tests set up so that they can be executed in less than a minute, to allow very frequent commits after having run every single test. The state of the code base is known to be good at all times, and the only changes to the code base that are permitted are those that are proven to be correct. It's a lot cheaper to stay in this state than it is to chase after quality after the fact.

That's a hard state to get to if you didn't start out that way and stay there. If you're chasing that goal, having a set of tests that executes in memory in less than a minute, with an automated farm running the longer end-to-end tests after each commit, will give reasonably rapid feedback; in the event of a failure, all commits since the last successful full test run are backed out.
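
One way to express that split, as a hedged sketch using pytest markers (the marker and test names are made up for illustration): the commit-time run excludes the marked tests, and the farm runs everything.

```python
# Split fast in-memory tests from slow end-to-end tests with a marker.
# The commit-time run uses `pytest -m "not end_to_end"`; the test farm runs
# the full suite after each commit. Names here are hypothetical; the marker
# should also be registered in pytest.ini to avoid warnings.
import pytest


def apply_discount(price: float, percent: float) -> float:
    # Stand-in for real application code under test.
    return price * (1 - percent / 100)


def test_discount_calculation():
    # Fast, in-memory: no network or database, runs in milliseconds.
    assert apply_discount(price=100, percent=10) == 90


@pytest.mark.end_to_end
def test_checkout_flow_against_staging():
    # Slow: exercises the deployed stack end to end, so it is excluded from
    # the sub-minute commit-time run and executed by the farm instead.
    pass
```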

All of that assumes continuous integration and deployment mechanisms, good automated tests, good test coverage, and so forth. I've been on a project that started out doing all its "unit" testing in end-to-end integration tests. We followed the strategy in the previous paragraph, and went from nine hours on one machine to one minute in Eclipse plus feedback from a farm of half a dozen test running machines in less than an hour. 

Now, if you're missing tests over a big chunk of code, that's a whole other problem. (I'm only considering fully automated tests here; manual ones are "missing," even if they exist.) Getting good automated coverage over the part of your code that's changing most rapidly, measuring the coverage, and tracking it as it monotonically increases, eventually gets you out of your testing debt.
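
If it's useful, here is a minimal sketch of a coverage "ratchet" that enforces that monotonic increase, assuming a coverage percentage is produced elsewhere (for example by a coverage tool's report); the ratchet file name is hypothetical:

```python
# Coverage "ratchet": fail the build if coverage drops below the best value
# seen so far, so measured coverage can only move upward over time.
# The ratchet file name is hypothetical; the percentage is supplied by
# whatever produces the coverage report.
import json
import sys
from pathlib import Path

RATCHET_FILE = Path("coverage_ratchet.json")


def check(current: float) -> int:
    best = json.loads(RATCHET_FILE.read_text())["best"] if RATCHET_FILE.exists() else 0.0
    if current < best:
        print(f"Coverage fell from {best:.1f}% to {current:.1f}%")
        return 1
    RATCHET_FILE.write_text(json.dumps({"best": current}))
    print(f"Coverage ratchet is now at {current:.1f}%")
    return 0


if __name__ == "__main__":
    # Usage: python coverage_ratchet.py 86.4  (value read from the coverage report)
    sys.exit(check(float(sys.argv[1])))
```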

Last point: if you get your project from being so badly challenged that running the tests (manually, I presume?) takes more than a month, to any of the states that I described above, you'll want to highlight that achievement very prominently on your resume. (And send me a copy so we can poach you. :-) )

KTatley

There's some really good advice on how to fix the technical problem already in the responses.

I'll add to the discussion that if you want to convince people to change, you need an emotional argument. In Manager Tools parlance, the question is "What's your visual?" - and there's a cast for that!

To me it sounds like you're fighting other people for change - if you can first get them bought into the need to make a change, they will be more receptive to solutions. If you do a good job of selling the need for change, they will not only be more receptive to your proposals but could also help you brainstorm, develop, and implement the change. It's much more effective to have staff fighting with you instead of against you.

I can see that your manager is bought into the change, but perhaps your team is not. Sharing this story with the company may help (and it's an entertaining read):

http://www.joelonsoftware.com/items/2012/07/09.html

 

I would also recommend the hotwash cast as another method to get ideas and momentum behind change.