Using Feedback Techniques for the gov.uk Website
Jake Benilov will give a talk on September 27 in Brussels about feedback techniques used for making gov.uk, based upon his blog post 7 types of feedback that helped make gov.uk awesome:
With software, feedback can be cheap and frequent, although bizarrely, most product teams in the industry are content with just whatever came out of the few functional and non-functional tests run by their dedicated QA team.
In his blog post, Jake describes the types of feedback that he has seen in use by the 250+ person gov.uk team.
At Agile Tour Brussels 2013, Jake will present the innovative feedback techniques that the GDS team uses to improve their daily work. InfoQ interviewed Jake about these feedback techniques and about how the team applies lean startup with minimum viable products to do user research.
InfoQ: What is the ultimate goal that gov.uk wants to reach? And what has the product team done differently to make gov.uk reach this goal?
Jake: gov.uk was conceived in 2010-2011 in order to "fix publishing"; at that time, the UK central government was publishing lots of material online, but the existing model was plagued by usability issues (such as heavily duplicated information and difficulty finding what you were looking for) and extreme fragmentation (most departments and agencies had their own publishing portal, each with its own idiosyncratic way of presenting information and interacting with the site).
To address these shortcomings, the gov.uk team has set out to build a site that acts as the single entry point to any interaction you might want to have with the government (from getting a tax disc for your car to finding out when the next bank holiday is), one which comes with its own strong brand. The brand aspect is quite important, because it permeates everything, from the visual identity and the interaction design, down to the language and voice in which the content is written.
In order to avoid repeating the mistakes of the past, the gov.uk product team has tried to build something only if it serves a concrete and well-understood user (as opposed to government) need. By sticking to their guns on this matter during the migration of the legacy platforms to gov.uk, the team was able to eliminate vast amounts of content and tools for which there was insufficient evidence of usefulness to end users.
Another factor contributing to the success of the project has been the team's willingness to release a minimum viable product, even if it's not quite perfect yet, and then iterate based on real user feedback. This also means that gov.uk will probably never stop changing until it's turned off - the first 8 months saw 1000 releases, which meant an average of 7 updates per working day.
InfoQ: Why is it important for teams to get feedback? Which benefits can it give them?
Jake: When building something new, the team makes assumptions about the existence of particular user needs, and whether their implementation fulfils those needs satisfactorily. The bigger the project, the larger the scope, the bigger and more varied the user base - the more assumptions there are. Feedback allows for the validation (or invalidation) of assumptions. An experienced project team makes sounder assumptions, but everyone will make mistakes - getting feedback early and often will limit how much time the team spends running in the wrong direction. I can't imagine a project of gov.uk's size and complexity being successful without short feedback loops.
InfoQ: Why did the team use so many different types of feedback? Would it be easier to use just a few types?
Jake: The third principle in the Government Digital Service Design Principles is Design with data. Different types of feedback give you different data points at various stages of the development cycle - lab testing or remote user testing is invaluable when you are still building the feature, whereas analytics and direct user comments help to understand whether the feature is working or not after launch.
The various feedback mechanisms also complement and amplify each other: for instance, it's hard to release early and often without a DevOps culture in place - things would be constantly breaking, and the operations team would be tearing their hair out in frustration. Another example: product analytics can tell you what is happening, but without user research you probably won't be able to quickly understand why it's happening. Without doing both, you're only seeing half of the picture.
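The "analytics tell you what, user research tells you why" point can be illustrated with a minimal sketch. This is not gov.uk's actual analytics stack; the event format, page names, and function are all invented for illustration - a funnel report that shows how many users reached one page but never reached the next, which is exactly the kind of "what" that then prompts a "why" question for user research.

```python
def funnel_dropoff(events, step_a, step_b):
    """Given (user_id, page) click-stream events, report how many users
    who reached step_a never went on to step_b. Analytics like this can
    surface the drop-off, but not the reason behind it."""
    reached_a = {user for user, page in events if page == step_a}
    reached_b = {user for user, page in events if page == step_b}
    dropped = reached_a - reached_b
    return len(reached_a), len(dropped)

# Hypothetical click-stream: three users hit the search page,
# but only one continues to a results page.
events = [
    ("u1", "search"), ("u1", "results"),
    ("u2", "search"),
    ("u3", "search"),
]
total, dropped = funnel_dropoff(events, "search", "results")
print(f"{dropped} of {total} users dropped off after 'search'")
```

A report like this tells the team where to point their lab or guerrilla testing next, which is the complementarity Jake describes.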
InfoQ: You mentioned that the team did user research, using minimum viable products. Can you give some examples how they did it?
Jake: Well before gov.uk was taken live, the team released public Alpha and Beta versions of the site, which ran alongside gov.uk's predecessors. The Alpha was a true minimum viable product, as Eric Ries defines it - a product built primarily for learning, not for production use. The Alpha was built and launched in only 12 weeks (which also included the time to recruit and ramp up the team), so the team started getting real feedback within weeks of starting, instead of months or years. The Beta was more complete than a prototype, but was still missing critical parts when it was launched. Many features were subsequently added, and existing features were tested, redesigned and redone several times until they were fit for purpose.
As an example, the homepage went through 4 complete redesigns on the back of user testing and feedback from public users. The User Research team ran guerrilla tests (quick, short user tests outside lab conditions) and repeated rounds of more formal lab-based and remote user testing with representatives of different user groups (such as professional users, jobseekers, business owners, retirees, users with disabilities, and users of various ages and levels of internet proficiency). The testing exposed problems with early design decisions that would not have been caught until it was too late. For example, the early versions of the homepage were very sparse and relied upon using the site search to navigate to the content, taking inspiration from Google's homepage. However, user testing showed that the homepage wasn't useful to those who were comfortable using search (they would just come in from a search engine anyway), and failed for users who needed to browse through categories in order to find what they wanted. This reality was taken into account in later redesigns, and as a result gov.uk launched with a homepage that wasn't broken for those users.
InfoQ: The team also adopted DevOps, having developers and operations people collaborate. Can you tell us about the benefits that they got out of doing this?
Jake: One very obvious benefit is that there isn't a lot of ceremony about releasing new software. If I have developed a new feature that's been code-reviewed and accepted by the product owner, I'm free to go and deploy it pretty much as soon as I want - there isn't a burdensome operations handover process to go through (but obviously I am on the hook if the release goes wrong). This allows an incredibly fast turnaround time between idea inception and the feature running on production, and also means that each individual release is pretty safe, since it doesn't contain a large number of changes.
This arrangement also allows the web operations and infrastructure folks to focus on platform-wide improvements, where they can put their specialist skills to best use, rather than chase after application defects; conversely, developers end up building more robust services which are easier to debug, on account of being familiar with the production infrastructure.
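The release flow Jake describes - ship once the change is reviewed and accepted, with no separate operations handover - can be sketched as a simple pre-deploy gate. This is purely illustrative: the check names and the `change` structure are invented, not part of GDS's actual tooling.

```python
def may_deploy(change):
    """Minimal sketch of a lightweight release gate: a change that has
    passed its checks can be shipped by the developer directly, with no
    operations handover step. Returns (ok, list_of_missing_checks)."""
    required = ("code_reviewed", "accepted_by_product_owner", "tests_green")
    missing = [check for check in required if not change.get(check)]
    return (len(missing) == 0, missing)

change = {
    "code_reviewed": True,
    "accepted_by_product_owner": True,
    "tests_green": True,
}
ok, missing = may_deploy(change)
print("deploy" if ok else f"blocked: {missing}")
```

Because each release only has to clear these few checks, changes ship individually and stay small, which is what makes each one relatively safe to deploy.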