Our team “The Four Musketeers” recently participated in the STWC 2014 testing competition in the North America region and ranked 5th out of 36 teams. To start with, I should say our team had a blast, and it was so much fun right from the way the competition was advertised, presented and conducted.
My team's experience of the competition can be divided into the following phases:
About 3-4 days before the competition, my team members went through the information provided on the website and noted down important details that would be valuable for getting more points. We also researched various free tools that might help us with functional and performance testing.
It is interesting to note that my team was so into preparing for this competition that one of my team members cut short his vacation in Canada just to join the competition in person and help our team out. Now, that is true dedication!!!
The next thing our team did was discuss the testing strategies and approaches we could use for the competition. We started looking into various books and testing courses we had taken (mainly BBST and RST) and got some ideas from them.
Next came resources: what operating systems and devices would we need to test the product? After a lot of discussion, we ended up with 2 versions of Windows (XP and 7), Mac OS X and a bunch of mobile devices and tablets, along with a multi-plug to connect a number of devices all at once for charging and power.
Finally, we came to the collaboration piece: how was our team going to collaborate and discuss things during the competition? A team member came up with the idea of using Google Docs, so that everyone could see each other's updates in real time.
Once we saw the e-mail hinting that we would be testing an image and video capture tool, my team started brainstorming various possibilities and combinations: would the application be a desktop app, a mobile app or a web app? And for each type of app, what should our test strategy be?
It is funny that, 3 hours before the competition, we thought about various image and video capture tools, and my teammate said Snagit was a possibility. But my team dismissed that idea, thinking the product would be something in beta, not yet available to end users. (Unfortunately, we were wrong 🙁 )
2 hours before the competition, we started looking at different tools like Jing and Camtasia and compared their features to get an idea of how these kinds of tools work.
Based on the previous phases, my team had decided on the resources and testing techniques we would use during the competition.
At 5:30 PM EST, we uploaded a doc to Google Docs containing the important factors that would help in getting better scores, along with some rules of the competition, and shared it with our team. We made sure everyone could access it.
At 6:00 PM EST, I joined the YouTube channel and saw Matt and the Snagit representative (the stakeholder). I increased the volume on my side so that my entire team could hear the conversation. I was in charge of monitoring the online session and typing in the questions that my team members and I wanted to ask. We also simultaneously started downloading the builds to our respective devices.
While doing this, the other 3 people started noting down the important points the stakeholder talked about and cared about.
At 6:30 PM EST, we started doing a risk analysis. We identified the different areas the stakeholder cared about and started thinking about potential risks for each of those areas. Then we considered the impact and the likelihood of a problem occurring in each module and came up with risk scores.
At 6:50 PM, we had quickly completed the risk analysis, assigned the test areas among ourselves and prioritized our testing based on the scores.
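The risk-analysis step above can be sketched in a few lines of Python. Note this is a minimal illustration, not the team's actual worksheet: the 1–5 rating scales, the example test areas and the `score = likelihood × impact` formula are all assumptions, since the post does not state the exact scoring scheme used.

```python
# Hypothetical test areas with assumed (likelihood, impact) ratings on a 1-5 scale.
test_areas = {
    "image capture": (4, 5),
    "video capture": (4, 4),
    "sharing/output": (3, 4),
    "editor tools": (3, 3),
    "settings/preferences": (2, 2),
}

def risk_score(likelihood, impact):
    """A common risk-based-testing heuristic: score = likelihood * impact."""
    return likelihood * impact

# Prioritize: highest-risk areas get tested first.
prioritized = sorted(
    test_areas.items(),
    key=lambda item: risk_score(*item[1]),
    reverse=True,
)

for area, (likelihood, impact) in prioritized:
    print(f"{area}: {risk_score(likelihood, impact)}")
```

Sorting by score descending gives the order in which the areas would be picked up, which matches the idea of spending the first time boxes on the highest-risk modules.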
At 7:00 PM, we started testing based on the risk scores, continuously discussing and comparing results. Our approach was risk-based exploratory testing: we decided to spend about 20 minutes in each of the identified test areas and then move on to the next one.
At 9:00 PM, another team member and I started working on the “Test Report” while the other 2 continued testing and posting defects.
At 9:30 PM, we submitted everything and were really happy with our effort.
As I said at the beginning, my team had a blast. The competition was pretty intense, with confusion and tension for the entire 3 hours, but in the end we pulled through successfully.
Some interesting things were the different testing approaches our team came up with: using tools to check application performance, testing on mobile devices in different ways, and trying to interface with external applications through the app. I also tried connecting the laptop to multiple screens, like my TV, and played back recorded Snagit videos to check video quality and performance. And of course, the beer and pizza we had throughout the competition helped too 🙂
One thing our team did not expect was that the name of the product we were testing was released at about 5:45 PM, although testing was supposed to start only at 6:30 PM. My team was unaware that we could start testing as soon as we got the e-mail with the product download link; also, we thought we needed to ask the business customer some questions first and test based on the high-risk areas (we still maintain that testing based on risks was worth it, rather than randomly testing modules). So by the time we posted our 1st defect, there were already 60 entries in HP Agile Manager. It would probably be a good idea next time to state clearly when a team can start testing.
Overall, this testing competition was worth it; my team will be competing next year and will try to improve on the limitations we had this time.