Monday 10 May 2010

It’s NOT Rocket Science

This blog post goes into a bit more detail about the planning and execution of European Weekend Testing mission 17.

If you don’t know what the mission was, please have a look and read it first.

In summary, the group was asked to test an audio application: a software compressor plugin that needs its own host. The group was testing the plugin through the host, which was an additional complication.

The mission for EWT17 was several weeks in the planning. I first came up with a rough outline, then bounced it off Anna and Markus, who made it better without really knowing the application. We then pre-tested the whole mission with Anna and Markus on Thursday to make sure it was not too hard to do. I got interesting results from what they were looking at, and we then decided that, yes, this mission was a go.

For me, one of the main goals of the mission was to bring a bit more realism into it. So I introduced a bit of role playing by saying that all EWT testers are in one company, in one group, reporting to one test manager who has a problem he wants them to solve: does this software compressor work, and is it stable enough for me to use tonight?

His problem is time-limited and he wants an answer in 1.5 hours. I should rephrase that: he wants one answer. From his point of view it’s one problem, one question, one answer.
Of course, in EWT people come from all over the world, so this was always going to be difficult to achieve, but some people picked up on it, as Jeroen also pointed out in his blog.

I wasn't actually expecting to get one answer, but I wasn't sure what people would do. Some testers refused to give an answer, as per Michael Bolton's article.

Anna, Markus and I discussed whether we should add a team lead role for this mission, but decided against it as it wasn’t in line with the spirit of EWT to have a hierarchy. I was interested to see if one developed naturally or not, which didn’t happen – no problem there.

The other big difference in this mission compared to previous ones is that I picked a domain that not many people would have knowledge of – audio recording and engineering. There are actually a lot of parallels with software testing, but that’s another story or blog post.
When the mission was released, people realised within a couple of minutes that they didn’t know enough about the domain to be able to test the application. Some went into exploratory testing mode, some Googled and found the Wiki page that explains very well what it’s all about. No one thought of asking whether there were domain experts in the group – of which we had one, Zeger. He actually couldn’t get the software to start on his laptop, so he was available to answer questions. He picked up on some assumptions people made and threw in some very valuable comments. Some were picked up, others weren’t – for example the important one about asking whether the test manager was using the same plug-in host or not, as a different host could invalidate the test effort. (Yes, he was using the same host, and a different one actually would have invalidated it.)

The Test Manager gave the group a pretty much impossible task: find out if the application is stable. What does stable mean without context? Without knowing what hardware is used, the operating system (as someone pointed out), or what other software is installed, “stable” is pretty much meaningless.

The other big area that I wanted to explore with this mission is testability, which has several possible meanings. One issue with testability is that the testers didn’t have enough domain knowledge to know how they would actually test the application. The Rocket reduces the dynamic range, but how many testers know how to test that?

The other issue is about measuring whether the dynamic range actually has been reduced, and by the right amount. Since we’re talking about audio here, we can’t rely on our senses as they aren’t up to the task. Sure, we can hear if something gets quieter or louder, but only if the difference is quite large. How large depends on the tester’s ears, so it is really subjective.
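To give a feel for what an objective measurement could look like (a minimal sketch, not something anyone did in the session): treat the recorded input and output as sample arrays, compute the RMS level of short frames, and compare the loudest-to-quietest frame ratio in dB before and after compression. The frame size, the toy signal and the crude stand-in “compressor” below are all invented for the illustration.

```python
import math

def frame_rms(samples, frame_size):
    """RMS level of each fixed-size frame of the sample list."""
    frames = []
    for i in range(0, len(samples) - frame_size + 1, frame_size):
        chunk = samples[i:i + frame_size]
        frames.append(math.sqrt(sum(s * s for s in chunk) / frame_size))
    return frames

def dynamic_range_db(samples, frame_size=256):
    """Dynamic range as the ratio of loudest to quietest frame RMS, in dB."""
    levels = [r for r in frame_rms(samples, frame_size) if r > 0]
    return 20 * math.log10(max(levels) / min(levels))

# Toy test signal: a quiet sine passage followed by a loud one (440 Hz at 8 kHz).
quiet = [0.1 * math.sin(2 * math.pi * 440 * t / 8000) for t in range(4096)]
loud  = [0.8 * math.sin(2 * math.pi * 440 * t / 8000) for t in range(4096)]
original = quiet + loud

# Crude stand-in "compressor": halve everything above a 0.2 threshold.
# Purely illustrative -- not how The Rocket works.
compressed = [s if abs(s) <= 0.2
              else math.copysign(0.2 + (abs(s) - 0.2) * 0.5, s)
              for s in original]

print(dynamic_range_db(original))    # ~18 dB between the loud and quiet halves
print(dynamic_range_db(compressed))  # noticeably smaller after compression
```

A real tester would of course record the plugin’s actual output rather than simulate it, but the comparison of before/after numbers is the point: it replaces the subjective “does it sound quieter?” with a figure two people can agree on.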

There are some freeware tools that would help with this task, for example a freeware audio editor and recorder. A tester could then record the output of the plugin and compare waveforms, or rely on the measuring tools that come with the audio editor. All in all a pretty difficult task. I only mention it here because the secondary objective was to find out if the attack of the plugin was fast enough – something that could only be tested with additional tools and was largely ignored by the group, which was good. If you have your hands full with the main objective, don’t get distracted with nice-to-haves – thumbs up for the EWT testers.
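For completeness, here is roughly how the attack question could have been approached with such tools. One simple proxy for attack time is how long the recorded output level takes to settle after a sudden loud input. Everything below is hypothetical: the 5 ms attack, the sample rate and the one-pole gain curve are invented for the sketch, not taken from The Rocket.

```python
import math

def settle_time_ms(envelope, sample_rate, tolerance=0.05):
    """First time (in ms) after which the level envelope stays within
    `tolerance` (relative) of its final, settled value."""
    final = envelope[-1]
    for i in range(len(envelope)):
        if all(abs(x - final) <= tolerance * final for x in envelope[i:]):
            return 1000.0 * i / sample_rate
    return None

# Hypothetical recording: a compressor with a ~5 ms attack reacting to a
# step of loud signal. The gain falls from 1.0 towards 0.5 along a one-pole
# (exponential) curve -- a common textbook simplification.
sr = 48000
tau = 0.005 * sr                   # 5 ms time constant, in samples
env = [0.5 + 0.5 * math.exp(-n / tau) for n in range(2400)]  # 50 ms recorded

print(settle_time_ms(env, sr))     # ~15 ms, i.e. about 3 time constants
```

Even this toy version shows why the group was right to leave the attack objective alone: without a recording tool and some agreed definition of “settled”, the question can’t be answered at all.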

The group dynamics were interesting to see, as was the extent to which testers picked up on the information others typed into Skype. I tried to facilitate more group working, which didn’t work so well. This is clearly something we need to think about again for future missions.

It was an intense session, but one that I think everyone got something out of. To the new people who joined us for EWT17: it’s not always like that!

I’d like to thank everyone who participated and made this session a success and gave me lots to think about – and that’s a good thing.

Thomas
