
UX Bites #7 ‘Designing for users by designing with users’ – Video and Q+A


2020

In UX Bites Webinar #7, we discuss how best to involve users in design projects to discover needs, innovate on potential solutions and test designs.

Designing for users by designing with users

Video: UX Bites by Fathom webinar 7 ‘Designing for users by designing with users’. Find this webinar and more on Fathom’s YouTube channel.

Q+A

Q1  Isn’t it dangerous to get opinions from users though? As they want to make you happy :)

A1 You are correct to assert that if you ask the wrong questions to the wrong users in the wrong way, you will end up with a myriad of answers offering multiple directions in which to take your product, many or all of which may be wild goose chases! We addressed this specific challenge in our first webinar, so I would encourage you to watch it when you get a chance. TL;DR: i) adhere to best practice in the bottom-left research quadrant, ii) never carry out a programme of research which exists only in this quadrant – give yourself the chance to calibrate your data and findings against the rest of the research landscape, and iii) invest the majority of your time trying to understand user problems, not asking users what they think of your solution, or worse still, asking them to come up with the solution themselves.

Q2  How did you find people willing to talk to you about their experiences?  Did you incentivise?

A2 We decide on the best approach to incentives on a project-by-project basis, depending on what we are trying to learn and who our users are. Within the parameters of this project as briefed, it wasn’t appropriate to incentivise users.

Belfast City Council helped to arrange a number of in-person, telephone and survey-based research sessions with service users, and we were able to make use of regular Council-facilitated group meetings with people with disabilities, a youth group and a monthly councillors’ meeting.

Some do’s and don’ts for encouraging participants to be open in interviews:

  • Do share information about the study in advance.
  • Don’t act like you already know all the answers – let them talk uninterrupted and try not to use too many affirmations while they’re speaking (“uh-huh”, “yes”, “that’s right”).
  • Do treat each participant as if they are the leading expert on their own experience (they are).
  • Don’t use leading questions and don’t finish their sentences.
  • Do show interest in their responses and do give them time to finish.
  • Do be polite and friendly.
  • Do make it clear at the start that they don’t have to answer anything they feel uncomfortable discussing.
  • Do ask follow-up questions – you won’t get another opportunity to ask them.

Q3  How many personas are too many in your experience?!

A3 Don’t think about the number, think about usefulness. A persona is only useful if it tells you something about historic user behaviour that you can rely on to predict future user behaviour. You may need to make a distinction between marketing personas and user personas. Often, marketing personas go deep into demographics, interests and spheres of influence. User personas should focus on innate needs, motivations, pain points and tasks to be completed. Sometimes these can be blended, sometimes you need to make a different set of user personas.

We worked on a project last year where the marketing team shared over 20 personas with us. These were useful from a marketing perspective when it came to targeted campaigns and copywriting, but there was a lot of overlap in the way these personas were using the website. In the end we condensed them into 4 user personas. In our experience, the direction of design can start to get hazy when you get above 6 personas – but we’d love to know what you think!

Q4  How would you user test a product during current restrictions and limitations (such as for an older audience, 60+)?

A4 The technology to allow remote moderated and unmoderated usability testing is excellent, on mobile and desktop devices.  Therefore, testing users with an adequate level of IT literacy has been relatively unaffected during the current restrictions.  However, there are two important considerations which shouldn’t be underestimated:

  • A large proportion of communication is non-verbal (estimates range from 70% to 90%), and picking up non-verbal cues just isn’t as straightforward in a remote context. Some of them are too subtle to pick up over a camera, and others occur off-camera.
  • Remote studies are by their nature limited to users with a certain degree of computer literacy. Thus, any usability study which needs to consider users across the spectrum of IT capability, or which involves testing users with specific accessibility needs, can be biased because the test can’t encompass the full breadth of users.

Q5  Any good resources on Problem Space?

A5 We plan to tackle the problem space in more depth in a future webinar, but its power lies in the focus it gives to the solution. It is the part of the project where the design team can home in with precision on exactly what needs to be solved. “If I had 60 minutes to solve a problem, I would spend 55 minutes thinking about the problem and 5 minutes thinking about the solution.” If that mantra (widely attributed to Albert Einstein) is good enough for him, it’s good enough for us!

Q6  Do you recommend the Double Diamond process in all cases?

A6 We’re going to have to give you a politician’s answer here by saying “it depends”. We like Jared Spool’s approach: UX can be seen as a box of tools (like a joiner would bring to a job) or a set of plays (that a football coach would prepare their team with before the big match). Depending on what unfolds and what needs to be done, the right play is selected. Sometimes the Double Diamond is ideal, sometimes an innovation framework, sometimes a straight waterfall research-optimise-design approach.

Q7  How are you dealing with the challenges of research and testing with users during the pandemic?

A7 This question was partly answered in Q4 above but I’ll share a couple of the challenges we’ve come up against with all types of users, and how we’ve been dealing with them:

  • Challenge: No in-person interaction with participants. Whether you’re doing an interview or a usability test, non-verbal cues can tell you a lot about what a person is experiencing. A furrowed brow or a deep sigh can paint a thousand words. You can miss these over Zoom, especially if participants choose not to share their camera.
  • Solution: Continually ask users to comment on how things make them feel. Vary your questions to stop them from going into auto-pilot with their responses; for example, “how do you feel about this content?”, “how would you describe this content to a friend?” and “are you confident this content gives you what you need?” all force the user to think critically about their response.
  • Challenge: Tech issues. We’re sometimes asking people to join a Zoom call on their mobile device and share their screen so we can see a website open in their browser. There are quite a few steps involved, including subtleties like ensuring their notifications are off for their privacy. We also can’t account for their Wi-Fi speed or interruptions.
  • Solution: Allow at least 10 minutes of tech tinkering time at the start of every session. Also, ask the user to think out loud, stating their intentions, expectations and feelings throughout a task. That way, if the screen share freezes, at least you have an audio description of what they’re doing.

Q8  What are some recommendations when testing and validating prototypes? From my (limited) experience, it’s hard for some users to understand the concept of a prototype vs. final solution.

A8 We define a prototype as a testable representation of the final product, so as long as it is discernible and testable, it is a prototype. It is important to match the type of testing to the fidelity of the prototype. In loose terms, a prototype can be paper-based or digital, static or interactive, and can have varying degrees of fidelity, from a sketch to a detailed wireframe or even a creatively designed artefact. Tests can include perception tests, click tests and fuller usability tests. You are right that users don’t intuitively understand a prototype in the same way they might understand a live digital product, but with good research execution, a clear explanation to the user and a defined purpose to the test, this can be managed.

Q9  How did you run the workshops, Andy? Did you show them early lo-fi prototypes?

A9 The workshops that we ran during the engagement with Belfast City Council varied in format and purpose based on our stage in the project. In discovery (exploring the ‘problem space’), we held separate workshops with internal teams and dedicated user groups, focused on understanding process, building service-user empathy and identifying problems. Ideation and prioritisation workshops were run with a wider collective group containing internal council teams and service users. These workshops involved collaboration on synthesised discovery artefacts, including journey maps, personas and service blueprints, to generate design ideas. This often involved sketching and lo-fi design, and subsequent prioritisation of any potential innovations and solutions.

The website portion of the project which followed, while not delivered in workshop format, used design-sprint methods. This approach enabled us to usability test lo-fi prototypes on a weekly basis with a range of relevant user types.

Q10  Any suggestions on how to engage and learn about users during COVID? How did you decide who to conduct research with, when Belfast City Council undertakes work in so many areas?

A10 The research groups and individuals who took part in this project were selected based on personas and service-user data for the cleansing and waste services provided by Belfast City Council. The Council’s data on enquiry frequency and enquiry types, together with the associated demographic, business and partner data, helped shape our research panel needs. Belfast City Council then provided access to, and consent from, a number of individuals and groups for Fathom to engage with via various forms of research.

Q11  Do you have larger zoomable images of your journey maps?

A11 Unfortunately we can’t share any of our research artefacts due to IP restrictions. However, you can’t go far wrong with the journey map templates and example articles available on the NN/g website.

Q12  How do you define what research methods to use project to project?

A12 For some projects the process is straightforward, as clients ask us to carry out a specific programme of work such as a usability test, accessibility audit or analytics review. For larger or full end-to-end design projects, however, research must be focused on important unknowns, which are most commonly identified through a series of interactive workshops that encourage design teams to think about the problem in fresh and innovative ways. Important unknowns are typically knowledge gaps related to identified project success factors, and from them we draw up a learning plan. The learning plan is the document which details all of the things the design team wishes to learn and the specific research activities they will undertake to learn them.

Q13  Favourite research methods?

A13 We find interviews and usability tests particularly powerful, within the context of the parameters we outlined in our answer to question 12.

Q14  What do you do when your design brief is set and you find that the problem for users is many different parts? How do you keep from getting lost off the top of the project?

A14 It’s really important that the design brief is shaped by solid, clear user research findings, so that the problems for users are well defined in advance. Critical to this is selecting appropriate synthesis artefacts that succinctly communicate what has been learned in research. Tools such as journey maps, personas, flow diagrams and affinity maps help build the bridge from insight to interface. A key part of writing an excellent design brief is producing artefacts which truly help the designer empathise and step into the users’ shoes by being readable, understandable and actionable.

Q15  What kind of questions do you usually ask during customer interviews? Do you have tips on how to structure these interviews?

A15 Before you plan an interview you need to be really clear on what your learning objectives are. Only ask questions that will actually build your understanding of the problem(s) to be solved – anything else may just be adding noise to your findings. I recommend writing a separate discussion guide for each user type you’ll be interviewing on each project. However, there are a few common structural elements in most of our interviews:

  • At the start we go over the details of the study and set them at ease.
  • The first few questions should be general, relating to their job role and context of use. This warms them up and acts as a screener, so we know how relevant each subsequent question will be to them.
  • The bulk of the discussion is based around carefully worded, pre-planned questions to learn exactly what we need to know, but we typically ask a few spontaneous follow-up questions along the way if we need more clarity on something.
  • Wrap the interview up with ‘overall’ questions. After talking about lots of specifics, people tend to be in a good place to summarise what they like and don’t like.

Here are some do’s and don’ts to keep you right:

  • Do write the questions in a logical order so that the conversation flows.
  • Don’t leave the most important questions until the second half of the interview – you might run out of time.
  • Do get someone to sense-check your discussion guide (preferably someone from outside of the project team).
  • Don’t use interviews to ask about user behaviour – leave this for usability testing.
  • Do use interviews to learn about attitudes, perceptions, motivations and needs.

Q16  Do you use a specific software for creating user journeys and personas?

A16 We use design tools such as Sketch and Figma, and presentation tools such as Keynote.

Q17  I just wanted to ask as a follow up if there are any tools or resources that you would particularly recommend to help successfully complete distance research and testing?

A17 When it comes to distance work, collaboration is key. Of the tools mentioned in Q16 above, we recommend Figma and Keynote because both have solid real-time collaboration features. Other tools we’ve found useful include the usual Google Suite solutions, Mural for workshops and synthesis, and Whimsical for user flows and lo-fi sketching.

Q18  Generally, do UX designers undertake multiple projects simultaneously? Very informative session btw! Hope there’ll be more webinars like this in the future :) Thank you!

A18 Thank you for your kind feedback on the webinar; we’re really pleased you found it useful. In an agency environment such as Fathom, the team often work on a number of projects simultaneously, although in any given period we typically have major projects flowing through the studio, for which we put together a dedicated team.

Q19  What kind of accessibility solutions were applied in this project?

A19 No solutions have been applied to the service design project as yet, as it is still in the implementation phase. Highlights among the recommendations were increased iconography across all physical and digital tools, and more sign interpreters across physical locations. The website includes specific accessibility interventions, including a ‘Read aloud’ tool, strict WCAG 2.1 AA+ accessibility criteria being met, and clarity on accessibility for all users via a dedicated section of the site.

Q20  Any tips or advice as to what to do if you hit mental / creative walls when trying to redesign or improve users’ UX?

A20 We find A Technique for Producing Ideas by James Webb Young absolutely brilliant.  It is as relevant today as it was in 1940 when it was first published!

Q21  What do you answer when your client asks you:

  • What do you think?
  • How should we do it?
  • What would be the best solution to solve this issue?

A21 I tell them it doesn’t matter what I think; it matters what their users think.

Find us on YouTube

You can watch more of our webinars and talks about UX and service design on our YouTube channel.

By Andy Robinson

Andy was a UX Researcher with Fathom until July 2021, when he moved on to take up a role at Rapid7.
