Last week was the first full week of my internship, and the project I was assigned was nearly ideal for getting started. I was given an initial set of test cases for one of the products I will be testing (called Content Translation) and asked to review, edit, and expand it, while at the same time using it to actually test the product as it passed through various test environments on its way to production.
The reason this project was (and continues to be) so ideal is that it teaches many important things simultaneously. First, it is teaching me how to write test cases — how to organize them on a spreadsheet, group them logically, think out the prerequisites, and plan the flow of the tests in the most efficient way possible. Second, it is teaching me the product itself — after all, you can’t test something if you don’t know how it works! You have to dive in and use it, trying every bell and whistle until you come to understand everything the product is supposed to do and all the ways it can behave.

Finally, the project is helping me to understand the process of testing. This particular product has a set weekly cycle. Changes first happen in the developers’ local environments, and on a certain day those changes are merged into a non-production master, and the whole thing is tested. The next day, the release is moved to another test site that is a perfect replica of the largest of the Wikimedia wikis (Wikipedia, of course), and there it is fully tested again. The day after that, the release is deployed to a limited number of (smaller) production wikis — and it’s tested AGAIN. Finally, if everything is good to go, the next day the release is deployed to ALL production sites, including Wikipedia.
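To make the weekly cycle concrete, here’s a minimal sketch of the promotion logic as I understand it. The stage names and the gating rule are my own illustration, not Wikimedia’s actual deployment configuration — the real point is simply that a release only moves forward once it passes testing at its current stage.

```python
# Hypothetical model of the weekly release cycle described above.
# Stage names are illustrative, not Wikimedia's real environments.

STAGES = [
    "developer local environments",
    "non-production master (integration testing)",
    "Wikipedia-replica test site (full regression)",
    "limited production wikis (tested again)",
    "all production wikis, including Wikipedia",
]

def promote(stage_index: int, tests_pass: bool) -> int:
    """Advance to the next stage only if tests pass; otherwise stay put."""
    if not tests_pass:
        return stage_index  # fix the problem and re-test at this stage
    return min(stage_index + 1, len(STAGES) - 1)
```

So a release that fails testing on the Wikipedia-replica site stays there until it passes, and only then moves on toward full production.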
In the week ahead, I will continue working on Content Translation while taking on a new product as well (not sure which one yet).
I’ve been doing some reading this week in preparation for the internship kick-off on January 4. Elena (one of my mentors) suggested some articles, including Four Schools of Software Testing by Bret Pettichord, which compares and contrasts these approaches:
- Analytical—Focuses on the internal structure of the software and aims for comprehensive code coverage. Requires a detailed specification. Tests are technical (usually requiring programming skills) and have a binary pass/fail result.
- Factory—Testing for which there’s a predictable template that can be applied to different projects. In other words, something that is standardized and can be easily managed. Additional goal of cost-effectiveness. Focuses on requirements testing.
- Quality Assurance—Focuses on enforcing and improving development processes. Notion of having to “protect users from bad software.”
- Context-Driven—Takes into account the context of a software project (stakeholders, resource & schedule constraints, etc.) when determining the right testing strategy for this particular project at this particular time. In other words: flexible and pragmatic. Focuses on “exploratory testing,” in which each tester is highly engaged in understanding the stakeholder interests and the specs (both explicit and implicit) and aims to design tests that will advance the whole team’s understanding of the software.
Pettichord says right up front that his purpose in comparing these schools is to highlight how his own school (Context-Driven) differs from the others, so there’s extra attention devoted to that approach in the final third of the presentation. It interested me enough to follow some links and do some extra reading. Here’s what I learned:
Context-Driven or Exploratory testing starts with the acknowledgment that it is impossible to test everything about the software. (If you don’t believe that, check out You Are Not Done Yet, a testing checklist by Michael Hunter. I just skimmed it but was still overwhelmed!)
Given the almost limitless potential size of the task, the most effective and efficient way to proceed is to focus on, as Cem Kaner (one of the school’s founders) puts it, “risks or issues that are of critical interest TODAY.”
That’s not all there is to it, of course. Another tenet of Context-Driven testing concerns gradually expanding test coverage to more and more avenues of exploration rather than simply repeating the same tests over and over again.
But the emphasis on immediate risks and issues, along with the team focus and the idea of having ALL testers fully engaged in the problem-solving, made me think of a scene from Apollo 13.
I had the great good fortune lately of being awarded an Outreachy internship with Wikimedia (the group that creates the software behind Wikipedia and other wikis). I’ll be working with the Wikimedia QA group, helping them to increase test coverage for various product features – and this blog will chart my progress!
A technical writer for most of my career, I was looking to try new things and expand my skill set. I was really drawn to this project — but hesitant to apply, since I had no formal background whatsoever in testing. It’s not that I thought I had nothing to offer! It’s just that, in my experience, most people in charge of staffing a position — even an internship position — are looking for very specific credentials; interest, enthusiasm, and potential don’t cut it!
Well, that sure isn’t the mind-set of the mentors for this Wikimedia project. Their goal is to give people a learning experience, and they aim to be inclusive. I reached out to one of them (Elena) to see if they’d consider someone with my background, and she was very welcoming and encouraging. She was confident that the skills I’d developed as a technical writer could be useful in a testing context. And so I dove in!
The way Outreachy internship applications work, you spend some time actually working on tasks within your chosen project before submitting your final application. This gives you a chance to “test drive” the project and see if it’s a good fit. It’s hard to describe how rewarding this period was for me. My previous experience exploring software, reading specs, and writing procedures certainly DID come in handy as I wrote my first-ever test cases – and I was ecstatic when I ran them and found my first bug! Then came learning about how to write a good bug report. In fact, it’s surprising how much you can learn just by exploring one of these projects for a few weeks – whether or not you go on to complete the full internship. If you’re considering applying, I say go for it!