
User focused development

Lessons learned from a government project

OVERVIEW
When you let people try a recipe you've prepared for them, there's an established protocol for how everyone tells you what they think of your creation.

Regardless of how everything tastes, anyone sampling the food will politely tell the chef that everything is great.

Now when you watch people use the software products you've built, it's not quite the same.

The IT equivalent of a cookery show might look more like this.

A product that makes sense to your tech and business teams on a whiteboard in a conference room can quickly come unstuck when faced with real-life customers.

This is one of the reasons why it's particularly valuable to get your products in front of users as quickly as possible. It might not always be pretty, but it saves you a long-term headache.

USER TESTING 101

There are some great tools for logging and analysing customer behaviour – Google Analytics is a free starting point, or maybe something like Totango or Gainsight if you want to go more in depth. In addition to automated logging and analytics, it's amazing what you learn by watching real people use your software in real life.

This is because tracking analytics tells you what your users are doing, while watching your users in real life often tells you why they do things.
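As a purely illustrative sketch (the event name and parameters below are my own assumptions, not taken from any particular product), logging a custom event with Google Analytics' gtag.js might look something like this:

```typescript
// A minimal event-logging sketch using Google Analytics (gtag.js), assuming the
// standard gtag snippet is already loaded on the page. The event name and
// parameters are illustrative only.
declare function gtag(
  command: 'event',
  eventName: string,
  params?: Record<string, unknown>
): void;

// Records *what* the user did; watching them in person still tells you *why*.
function trackReportCreated(reportType: string): void {
  gtag('event', 'create_report', { report_type: reportType });
}
```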

If you don't currently do any user testing outside of your tech team, then even simple corridor/hallway testing might be a good start.

This is where you pick out a few random people [as the name suggests, you can pick the first 5 people you pass in the corridor]. Because these 'random' people haven't built the product, and don't know how it's meant to be used, they'll at least give you an outside perspective on what you're building.

WATCH AND LEARN
The next step is to invite users to sit down in front of your software (whether it's on a desktop, laptop, tablet and so on), and accomplish some tasks (with no input/help from your team).

Start with some of the key use cases of your product:

'Can you please log on, create a new account, find some data, and create a report?'

You then watch what happens and take notes. Look out for whether people use your product the way you had envisaged. You might reflect on:

- Is everything as simple as you thought?
- Does everyone 'complete' the set task, or do some people get stuck, or even give up?
- Do people take a long time to do something you thought would be quick?
- Do people click in the 'wrong' place? (It might be that you've made the navigation too complicated, or simply that you've given one button a confusing name.)
- And so on.

If you're working with existing customers, you can assume a better level of knowledge of your application, so you might get more advanced with what you do. You should still ask open-ended questions (at least at first), and watch how clients perform a particular task.

'Can you show me how you use the product to find information relevant to you?'

'What do you tend to do once you've found that information?'

If you ask leading questions ('Isn't this new feature we built really cool?'), or guide the process too much ('Look how easy using this tool is!'), then you lose that objectivity.

USER FOCUSED DEVELOPMENT - EMBRACE THE SURPRISES
In the past I've been surprised to find that verbal customer feedback can be very different from reality.

Take something like:

'I use your product for making all my reports'

There's me as the product manager thinking that this customer must use the wonderful reporting feature we built, with all our lovely graphing tools. What this actually meant, when I sat down and watched the user 'make a report', was more like this:

'I'm going to completely ignore your reporting feature, and use your product to get all the data I want, dump it into Excel, and create my own charts in there'.

In this case it's better the devil you know. I had discovered that you can be just one use case ('I need to merge your data with my own data'), or one missing piece of customisation ('I can't use your charts, as we need to add our logo to any report we do'), away from your users not using your product the way you envisaged.

LESSONS FROM A CURRENT PROJECT: THE OFFICE FOR NATIONAL STATISTICS
On Day Digital's most recent project, I'm amazed by how user-focused the development is, right down to having the development team watch users from behind one-way mirrors. The development team described it as like being 'in a police interview room' (I decided not to ask why they knew so much about police interview rooms!).

Day Digital teamed up with a great company called Methods Digital, who were awarded the contract to build the ONS (Office for National Statistics) a new prototype website from scratch.

Both Methods and Day have a similar ethos of user-focused pragmatism, via open-source technology and rapid development. The ONS are really embracing this methodology and putting us all to the test. For example, the Alpha project started on 8th September 2014 and went live on 8th December 2014.

So it is possible to rapidly iterate, and do user testing, even in a more 'traditional' organisation. The team built a new 'digital service' prototype to replace a legacy 'website', using real data, from scratch, in 12 weeks - whilst incorporating user testing and feedback all the way.

The ONS homepage: project start date 8th September 2014.
The ONS Alpha website: launch date 8th December 2014.
PUTTING THE PRODUCT OUT THERE: TO USERS, CRITICS, AND THE PUBLIC

Before the project started 

An amazing amount of user research had taken place before the development team even arrived in September. Around 800 users gave feedback on the existing website, there were 200 in-depth user studies, the ONS met 150 people in person, and a customer experience firm (CXPartners) was brought in to provide outside expertise.

This meant that whilst the project started from scratch, we had a really well-defined list of problems and aims, and also a clear set of user stories categorised by user personas.
The ONS identified three user types - information foragers, experts and analysts, and inquiring citizens. This meant the team had a spectrum of user needs in mind throughout the project.

The brief was clear – we needed to get from a 'homepage from hell' to 'a website so good people forget they are using it' [The ONS' words, not mine! Taken from the project blog].

During development

- Studies with 200 users, interviews, observations, analytics

- 3 days at the government labs in London

- Lots of 'show and tells'

This involved the project owner (Matt Jukes from the ONS) taking the Alpha not only in front of ordinary users, but also in front of current website critics and the Government Digital Service (who set the bar very high for government projects these days). It's a great, and sometimes brave, example of getting your product out to users whilst you still have time to act on their feedback.

In a lot of sessions the whole development team were able to watch ordinary users test the Alpha as they were building it.

Lots to think about: notes taken during user testing sessions (photo source: ONS Alpha blog).

On the ONS project, one interesting piece of feedback was that some power users (perhaps unsurprisingly) had built their own tools for analysing and visualising data from the ONS (whether data relating to crime, house prices, employment, health and so on). So even if the team created some lovely pie charts using HTML5, so that users could see them on their laptops, mobiles or tablets, these power users wouldn't care - they just wanted the raw data.

As such, the team put in the option to grab data directly (the tech-savvy users wanted the data via JSON, for example).
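As a purely illustrative sketch (the URL and the response shape below are made up, not the real ONS API), the 'raw data' route for power users might look something like this:

```typescript
// A minimal sketch of fetching a dataset as raw JSON instead of a rendered chart.
// The endpoint and the DataPoint shape are hypothetical examples.
interface DataPoint {
  period: string; // e.g. "2014-Q3"
  value: number;
}

async function fetchRawDataset(datasetId: string): Promise<DataPoint[]> {
  const response = await fetch(`https://stats.example.gov.uk/datasets/${datasetId}.json`);
  if (!response.ok) {
    throw new Error(`Request failed with status ${response.status}`);
  }
  return (await response.json()) as DataPoint[];
}
```

Serving the same data both as rendered charts and as plain JSON keeps the casual visitors and the power users happy.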

With user testing you might not get the answer you expect, but by putting your ideas to the test sooner, and inviting real feedback and criticism, you will make a better product.

If you launch a product, and say 'we've spent 6 months working on this, do you like this new feature we built?' - you're not really inviting feedback in the same way.

In addition to controlled user testing, the real acid test is putting your product out to market. The Alpha went live to the public for several weeks in December. After passing public scrutiny and a GDS (Government Digital Service) assessment, the Alpha is now moving to a Beta phase.

The team is now working on a 9 month project to build out the product in a way that's built to last. All of the testing feedback from the Alpha (mostly positive comments, along with some constructive criticism) means the whole team is in a really good position to know what works well with real users. The user driven development approach will also carry through to the Beta phase.

Watching users use your product on the big screen! The GDS research lab (still less than 12 months old) was used to help the ONS Alpha (seen here during another project; photo taken from the GDS blog).

SUMMARY

The ONS Alpha project was extremely open: everything from the product being put in front of users during development, to going fully public after 3 months, to a project blog written by the project owner Matt Jukes at the ONS (who is very honest about what's working well and what still needs to improve), and all the source code being on GitHub.

This refreshingly honest approach is something that a lot of private companies could learn from (of course, for reasons of confidentiality, a lot of updates might be internal-facing only). Do get your products in front of selected customers, have show and tells at lunchtime, and generally make sure people know what you're doing before it's too late to change anything.

Having gone into this project with a good knowledge of user-focused development, Day Digital were quite humbled to have our knowledge taken to a whole new level, and we'll freely admit that we didn't expect this to happen on a government project.

When reflecting on how to build software products quickly and effectively, it's clear that putting your work out to real users as soon as possible might not always be pretty, but it's a sure-fire way of knowing if you're on the right track. Even if you're not lucky enough to have a full-on laboratory complete with a one-way mirror, it's important to try and work rigorous user testing into your development cycle.

Also just a quick note of thanks to Burhan Eser (Senior Developer, Day Digital on the ONS Alpha project) for his helpful comments on writing this post.