What are your users doing? (or, interview techniques for project analysis)

For the first time, I’m helping run planning sessions for an agile project. Planning has been a bit of a bugbear for me on many of my recent projects, so I’m excited to have a chance to try some things out. So far, it seems to be going well. It’s a short project, so I shouldn’t have to wait too long for feedback on how things went, which is a bonus.

One of the things I’ve noticed is that in planning or requirements sessions, there’s a tendency to fall into an interview format. That is, we (the development team) ask questions, write things down, ask clarifying questions, and repeat until we think we have enough information to move forward. That can be fine sometimes, but one of the simplest things (though not necessarily the quickest) we can do to improve our understanding is to ask our customers to show us how they work. That is, the interview style becomes ‘fly on the wall’ rather than ‘question time’.

As a tester in more traditional projects, I’ve also had occasions where despite the documentation around the project, deep understanding of the domain was difficult to come by, and generally absent from the test team.

On a recent project I had some success combating this by just having the gumption to ask a passing front-line operator “Can I sit with you and watch you work, or do you have time to walk me through the systems you use?”

“Sure! Your desk or mine?” was the answer.

It didn’t hurt me to ask (I could fit the time in around my other responsibilities), and he was happy to know that some of his concerns might be represented in future. But I don’t always think to do this, or feel confident that my request will be welcomed. I learned a lesson, though.

So on my current project, it’s time for us to ask the same question of our client. We’re talking through the problem, but the moment the development team is on its own, we keep hitting decisions that can’t be made because our thinking is purely functional. That is, the client is suggesting features, but we’re not digging deeper to understand the precise intent of those feature requests, or the problem that those features will solve. This understanding is critical not just for our testing, but for the whole team to be successful.

Now, we don’t have to work this way. We could just start building the product and let unsatisfying features fall out of the process with our on-site customer, but I believe the amount of up-front thinking and analysis we do is an uncertainty reduction activity. We almost guarantee ourselves a slightly longer project, but I think we reduce the likelihood of having nothing useful to show to the customer at the end of our budgeted time. Critically, I think we have to inform our customers of the tradeoffs we are making around up-front analysis, and how that will impact them.

Excuse me now, I have to go ask someone a question!
