Thoughts on Tech

STLC (that Software Testing Life Cycle thing)

According to me...

Requirements Analysis: this is the bit where you, as a tester, are in at the beginning. The requirements have been gathered, more or less understood, and written up in a document, and you get to point out issues, ask questions, and clarify. This is the kind of question that comes up in interviews: "When should testing start on a project?" And the correct answer is: as early as possible. If you can point out an inconsistency or mistake before a line of code is written, then you're saving the company money and time. In a lot of companies this takes the form of a meeting, such as the "three amigos", in which the requirement is discussed. Generally speaking, you're usually dealing with a feature being added to pre-existing software, nearly always at the behest of a client.

Test Planning: this is where a Test Lead does a lot of work and then is disappointed that it's mostly ignored. This is a project-level document identifying scope, who's going to do what, timings, risks, and test cases for the new features. It's supposed to be a living document, but it often gets ignored; hardly anyone's interested in it unless a manager's ticking their boxes, and they instantly lose interest when you send it to them. If you have an automated test suite, you'll mention that it will be run multiple times by way of regression. If you don't have automation, you'll have a list of tests that need to be done by way of regression. Some applications, technical debt having been ignored for years, are super sensitive to any changes: something obscure and seemingly unrelated breaks because you added another, apparently trivial, feature.

Now on to something that irritates me. You'll often be told that the test strategy is part of the test plan. This is blurred terminology, and it makes no sense: the plan is a project-level document; the strategy is a company-wide one. You can get round this by filling in the Test Strategy heading thus: "In accordance with the company test strategy, document ABC123." Sometimes it's not worth arguing, even if you're fairly certain your manager doesn't understand the question they're asking.

Test Case Development/Design: this is where you actually write the test cases and create the test data. "Scenario: user wants to access dashboard. Given a user account exists, when the user enters credentials into the browser login page, then the user is given access to their dashboard."

Something else that irritates me. The absurd avoidance of "I" as above. Some people get really anal about this. But I crave leave to differ. Sometimes avoiding "I" is simply tortuous. Compare:

Scenario: Successful webmail login
  Given I am on the login page
  When I enter valid credentials
  Then I should see my inbox

With:

Scenario: Successful webmail login
  Given the user is on the login page
  When the user enters valid credentials
  Then the user should see their inbox

It's supposed to be about communication across disciplines and departments. Torturing the language to accord with some silly arbitrary rule doesn't help with that.
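For what it's worth, a scenario like the one above often ends up automated sooner or later. Here's a minimal sketch of it as a plain Python test; the application under test is faked, and names like login() are invented for illustration, not taken from any real framework:

```python
# Hypothetical sketch: the webmail scenario above as a plain assert-based test.
# login() and VALID_USERS are stand-ins for the real application and test data.

VALID_USERS = {"alice": "s3cret"}  # invented test data

def login(username, password):
    """Fake login: returns the user's inbox page on success, None otherwise."""
    if VALID_USERS.get(username) == password:
        return f"{username}'s inbox"
    return None

def test_successful_webmail_login():
    # Given a user account exists and the user is on the login page,
    # when the user enters valid credentials,
    result = login("alice", "s3cret")
    # then the user should see their inbox.
    assert result == "alice's inbox"

def test_failed_webmail_login():
    # A negative case: wrong password should not grant access.
    assert login("alice", "wrong") is None
```

Whichever pronoun convention you land on in the Gherkin, the automation underneath looks much the same.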

Environment Setup: fairly obvious, this one, I would have thought. Usually involves deploying the environments with the appropriate databases. Sometimes you might have to consider the spec of the hardware (some customers go as low as they can get away with in terms of servers, due to cost) and the dependencies: updated frameworks and so on. Again, companies tend to put this off as long as they can get away with. I've known devs tell senior managers porkies in order to push framework upgrades through; devs don't like being stuck working on old tech, so they usually sell it on security or imagined extra features. Anyway, you obviously need a working test environment before moving to the part where you start testing. And it obviously helps just a smidgeon if the environment it's being tested on bears some relationship to what it'll be deployed on in anger.
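It's worth scripting a quick readiness check so that a broken environment fails loudly before anyone burns a morning on it. A rough sketch, assuming made-up environment variable names (adjust to whatever your own stack actually needs):

```python
import os

# Hypothetical pre-test environment check.
# TEST_DB_URL and TEST_APP_URL are invented names for illustration.
REQUIRED_VARS = ["TEST_DB_URL", "TEST_APP_URL"]

def check_environment(env=os.environ):
    """Return a list of problems; an empty list means the environment looks usable."""
    problems = []
    for var in REQUIRED_VARS:
        if not env.get(var):
            problems.append(f"missing or empty: {var}")
    return problems
```

In practice you'd extend this to ping the application's health endpoint and confirm the database schema version, but even the trivial version catches the "deployed to the wrong box" morning-killers.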

Test Execution: this is the bit you, as testers, know about and understand. It's existential: you against the application, clicking away creatively to break it. Of course, you'll assiduously do the tests you prepared for the feature earlier (tick, tick, done) in multiple environments with different databases and in different browsers (don't forget the back button; that can be really messy, and is often overlooked). You'll do this over and over again. Why? Because you'll break something, it will be "fixed", and you'll have to do it all again. Some other glitch will come up when retesting (remember that technical debt no one ever wanted to address) that you need to first reproduce, and then again have fixed. And then you go again... and again. And then later you get asked why this wasn't noticed earlier. Ah, the life of a tester. Finally, the feature works as agreed, and then you have to test everything. This is called regression testing, and you'll usually have an automated test suite to do it, one that only works with vanilla, in other words the minty fresh, out-of-the-box version of your software. Pair it with a real client database, and the automation won't cut it any more. So you do the manual regression tests that have been flagged as high priority, with a little exploration along the way, flushing out dark obscure beasts that have been lurking in mirrored shadows since year zero.

Test Closure: this is the part that never really happens properly, mostly because devs don't like it. Devs don't like statistics on how many commits it took to fix a single defect; they'd prefer it recategorised as a different defect. They'd prefer not to have stats at all on how many defects there were and how long they took to fix. So would testers, frankly, because we're supposed to be part of a team, pulling on the same rope and getting the job done. But senior management love stuff like this, so they can weaponise it against the devs, who they often think are too big for their boots. Sure, there's a test summary report produced, but no one really reads it or is particularly interested. What matters is that the software has hit the streets and someone's arbitrary delivery date has been met. Oh, and the retrospective: if it happens at all, it'll play a very poor second fiddle to "proper work", and will probably die of neglect.