Sunday, 30 August 2015

On rigour, numbers and discovery


Recently, I was asked the following fantastically thought-provoking question:

In talking to my psychology seminar group about their qualitative lab I ended up looking at Helene Joffe’s book chapter on thematic analysis. She suggests including diagrammatic representations of the themes, together with quantitative data about how many participants mentioned each theme and its subparts. This appealed to the psychology students because it gave them quantitative data and helped them see how prevalent each theme was within the sample.

And then today I saw another paper, “Supporting thinking on sample sizes for thematic analyses: a quantitative tool”. It argues that one should consider the power of the study when deciding on sample size – another concept I’d only seen in quantitative research.

Both of these sources seem to be conducting qualitative analysis with at least a nod towards some of the benefits of quantitative data, which appears to give qualitative analysis more rigour. Of course, simply adding numbers doesn’t necessarily make something more rigorous, but it does add more information to the results of an analysis, and this could influence the reader’s perception of the quality of the research. However, I don’t recall seeing this in any HCI papers. Why isn’t it used more often?

The answer (or at least, my answer) hinges on nuances of research tradition that are not often discussed explicitly, at least in HCI:

Joffe, Fugard and Potts are all thinking and working in a positivist tradition: one that assumes an independent reality ‘out there’, and that doesn’t take into account the role of the individual researcher in making sense of the data. Numbers are great when they are meaningful, but they can hide a lot of important complexity. For example, in our study of people’s experience of home haemodialysis, we could report how many of the participants had a carer and how many had a helper. That’s a couple of numbers. But the really interesting understanding comes in how those people (whether trained as a carer or just acting as a helper) work with the patient to manage home haemodialysis, and how that impacts on their sense of being in control, how they stay safe, their experience of being on dialysis, and the implications for the design of both the technology and the broader system of care.

Similarly, we could report how many of our participants reported feeling scared in the first weeks of dialysis, but that didn’t get at why they felt scared or how they got through that stage. We could now run a different kind of study to tease out the factors that contribute to people being scared (having established the phenomenon) and put numbers on them, but recruiting the larger sample (60–80 participants) needed for this kind of analysis would involve scouring the entire country for willing HHD participants and getting permission to conduct the study from every NHS Trust separately; I’d say that’s a very high cost for a low return.

Numbers don’t give you explanatory power and they don’t give you insights into the design of future technology. You need an exploratory study to identify issues; then a quantitative analysis can give the scale of the problem, but it doesn’t give you insight into how to solve the problem. For HCI studies, most people are more interested in understanding the problem for design than in doing the basic science that’s closer to hypothesis testing. Neither is right or wrong, but they have different motivations and philosophical bases.

Wolcott (p.36) quotes a biologist, Paul Weiss, as claiming, “Nobody who followed the scientific method ever discovered anything interesting.” The quantitative approach to thematic analysis doesn’t allow me to answer many of the questions I find interesting, so I’m not going to shift in that direction just to do studies that others consider more rigorous. Understanding the prevalence of phenomena is important, but so is understanding the phenomena, and the techniques you need for understanding aren’t always compatible with those you need for measuring prevalence. Unfortunately!
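As an aside, the counting itself is mechanically trivial; the hard part is everything argued above. Here is a minimal sketch, with entirely invented participants and themes, of the kind of theme-prevalence tally Joffe suggests reporting alongside a thematic analysis:

```python
from collections import Counter

# Hypothetical coded interview data: for each participant, the set of
# themes identified during analysis. Participants and themes are
# invented purely for illustration.
coded_data = {
    "P1": {"feeling in control", "fear in early weeks"},
    "P2": {"feeling in control", "role of the helper"},
    "P3": {"fear in early weeks", "role of the helper"},
    "P4": {"feeling in control"},
}

# Count how many participants mentioned each theme.
prevalence = Counter(
    theme for themes in coded_data.values() for theme in themes
)

for theme, n in prevalence.most_common():
    print(f"{theme}: {n}/{len(coded_data)} participants")
```

The numbers this produces say how widespread each theme was in the sample; they say nothing about what the theme meant to the people concerned, which is precisely the point above.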

Saturday, 22 August 2015

Innovation for innovation's sake?

As Director of the UCL Institute of Digital Health, my job is to envision the future. The future is fuelled by innovation and vision. And there's plenty of that around. But the reality is much more challenging: as summarised in a recent blog post, most people aren't that interested in engaging with their health data (those most likely to be tracking their data are young, fit and wealthy), and most clinicians are struggling even to do their basic (reactive) jobs, without much chance to think about the preventative (proactive) steps they might take to help people manage their health.

Why might this be? Innovation is creative and fun. It's also essential (without it, we'd still be wallowing around in the primordial soup). But there's a tendency for innovation to assume a world that is simpler than the real world: people who are engaged and compliant and have time to take up the innovation. Innovation tends not to engage with the inconvenient truths of real life, or to tackle the difficult and complex challenges that get in the way of simple visions.

We need a new approach to innovation: one that takes the really difficult challenges seriously, that accepts that the rate of progress may be slow, that recognises that it's much harder to change people and cultural practices than it is to change technology, but that these all need to be aligned for innovation to really work.

We need innovation that works with and for people. And we need to recognise that an important part of innovation is dealing with the inconvenient and difficult problems that seem to beset healthcare delivery, in all its forms.

Sunday, 24 May 2015

Digital Health: tensions between hype and reality

There are many articles predicting an amazing future for digital technologies for healthcare: wearables, implantables, wirelessly enabled to gather vital signs information and collate it for future sensemaking, by the individual, clinicians and population health researchers. Two examples that have recently come to my attention are a video by the American Psychiatry Association and a report by Deloitte. The possibilities are truly transformational.

Meanwhile, I recently visited a friend who has Type II diabetes. On his floor, half hidden by a table, I spotted what I thought was a pen lid. It turned out to be the top of his new lancing kit. Although he had been doing blood glucose checks daily for well over a decade, he hadn't done one for over two weeks. Not just because he'd lost an essential part of the equipment, but because he'd been prescribed the new tool and hadn't been able to work out how to use it. So losing part of it wasn't a big deal: it was useless to him anyway. He told me that when he'd reported his difficulties to his clinician, he'd... been prescribed a second issue of exactly the same equipment. So now he has three sets of equipment: the original (AccuChek) lancing device and blood glucose meter, which he has used successfully for many years, but which he can't use now because he doesn't have spare consumables; and two lancing devices and meters (from a different manufacturer), with plenty of spare consumables, which he can't use because he finds the lancing device too difficult to use. And in trying to work out with him what the problem was, we managed to break one of them. Good thing he's got a spare!

And it's not just isolated individuals who struggle: a recent report from Forbes describes similar issues at scale, with poor usability of electronic health records and patient portals making work less efficient and effective rather than more.

So on the one hand we have the excitement of future possibilities that are gradually becoming reality for many people, and on the other hand we have the lived experiences of individuals. And for some people, change is not necessarily good. The real challenge is to design a future that can benefit all, not just the most technology-savvy in society.

Saturday, 7 February 2015

Designing: the details and the big picture

I was at a meeting this week discussing developments to the NHS Choices site. This site is an amazing resource, and the developers want to make it better, more user-centred. But it is huge, and has huge ambitions: to address a wide variety of health-related needs, and to be accessible by all.

But of course, we are not all the same: we have different levels of knowledge, different values, needs, and ways of engaging with our own health. Some love to measure and track performance (food intake, weight, blood pressure, exercise, sleep, mood: with wearable devices, the possibilities are growing all the time). Others prefer to just get on with life and react if necessary.

We don't all choose to consume news in the same way (we read different papers, track news through the internet, TV or radio, or maybe not at all); similarly, we don't all want health information in the same form or "voice". And it is almost impossible to consider all the nuanced details of the design of a site that is intended to address the health needs of "everyone" while also maintaining a consistent "big picture". Indeed, if one imagines considering every detail, the task would become overwhelmingly large. So some "good enough" decisions have to be made.

I am very struck by the contrast between this, as an example of interaction design where there is little resource available to look at details, and the course that my daughter is doing at the moment, which has included a focus on typographical design. In the course, they are reviewing fine details of the composition and layout of every character. Typography is a more mature discipline than interaction design, and arguably more tractable (it's about the graphics and the reading and the emotional response). I hope that one day interaction design will achieve this maturity, and that it will be possible to have the kind of mature discourse about both the big picture and the details of users, usability and fitness for purpose.


Tuesday, 20 January 2015

Designing, documenting, buying, using: the mind-boggling hob

I have complained before about how difficult some taps are to use. These should be simple interactive objects whose design requirements are well understood by now, and yet designers keep generating new designs that work less well than previous models. Why is there so much emphasis on unnecessary innovation, as if innovation were inherently a good thing?

Ursula Martin has just introduced me to the unusable hob:
"This bizarre thing requires you to select a ring with the rotating arrow before applying plus/minus.  Now here's a thing. Suppose you have switched on ring 1 (bottom right), and no others, set it to 4 (a red 4 appears due South of the Ring 1) and a few minutes later you decide you want to turn it down to 3. How do you do that? Press the minus sign, as that is the only ring that is on? Oh no, nothing happens if you do that. it appears that you HAVE TO CYCLE THROUGH ALL THE OTHER RINGS AND BACK TO 1, then red 4 will start to flash, and then the minus/plus signs will change it. Just imagine the hoopla of doing that when you have four rings going at once."
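The interaction Ursula describes amounts to a small "selected ring" state machine: only one ring at a time responds to plus/minus, and the arrow cycles the selection round. Here is a toy model in Python; it is a guess at the apparent logic, with the class, method names and ring numbering all invented, not the actual firmware:

```python
# A toy model of the hob's apparent control logic (illustrative only).
class Hob:
    def __init__(self, n_rings=4):
        self.levels = [0] * n_rings   # heat level 0-9 for each ring
        self.selected = 0             # the only ring that +/- acts on

    def select_next(self):
        """The rotating-arrow button: move selection to the next ring."""
        self.selected = (self.selected + 1) % len(self.levels)

    def minus(self):
        """Turn down the *selected* ring -- not necessarily the one you mean."""
        self.levels[self.selected] = max(0, self.levels[self.selected] - 1)

hob = Hob()
hob.levels = [4, 0, 0, 0]  # only ring 0 is on, set to 4
hob.selected = 1           # but the selection has already moved on

# Pressing minus now targets ring 1, which is off, so nothing useful
# happens. To turn ring 0 down from 4 to 3, the user must cycle the
# selection all the way round (1 -> 2 -> 3 -> 0) before pressing minus.
for _ in range(3):
    hob.select_next()
hob.minus()

print(hob.levels)  # [3, 0, 0, 0]
```

One obvious fix: when only one ring is on, plus/minus could act on it directly.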

The instruction manual is full of information like:
"Each cooking zone is equipped with an auto-
matic warm-up function. When this is activa-
ted, then the given cooking zone is switched
on at full power for a time dpending on the heat
setting selected, and is then switched back to
the heat setting set.

Activate the automatic warm-up function by
setting the required heating power by touching
the (+) sensor (5) first. Then the heating level
„9” is displayed intermittently on the cooking
zone indicator (3) with the letter “A” for around
10 seconds."

And so on, for many pages (spelling mistakes an added bonus). This is a manual that opens with the (only slightly patronising):
"DEAR USER,
The plate is exceptionally easy to use and extremely efficient. After reading the instruction manual, operating the cooker will be easy."

Ursula notes that: "The designer seems to have a mythical cook in mind who doesn’t want to change the temperature very often". Alternatively, maybe it's from the Dilbert school of design. All one can be sure about is that the design team apparently never use a hob, and that the technical authors who have written the 28-page manual on how to operate this hob were happy to write out inscrutable instructions without ever seriously considering their comprehensibility. And had apparently also never used a hob.

Finally, Ursula reported that "the flat owner is very embarrassed about it - he has just had the kitchen redone and I am the first tenant since, and he hadn’t used the thing himself". If you've ever bought a new appliance and tried to assess its usability before purchase you will probably sympathise with the landlord. It's usually impossible to test these things out before buying; to even read the manual; or to get any reliable information from the sales team about usability. In fact, ease of use, usability and fitness for purpose don't feature prominently in our discourse.

We really do need a cultural shift such that fitness for purpose trumps innovation. Don't we?

Friday, 9 January 2015

Compliance, adherence, and quality of life

My father-in-law used to refuse presents on the principle that all he wanted was a "bucket full of good health". And that was something that no one was really in a position to give. Fortunately for him (and us!) he remained pretty healthy and active until his last few weeks. And this is true for many of us: we have mercifully little experience of chronic ill health. But not everyone is so lucky.

My team has been privileged to work with people suffering from chronic kidney disease, and with their families, to better understand their experiences and their needs when managing their own care. Some people with serious kidney disease have a kidney transplant. Others have dialysis (which involves having the blood 'cleansed' every couple of days). There is widespread agreement amongst clinicians that it's best for people if they can do this at home. And the people we worked with (who are all successful users of dialysis technology at home) clearly agreed. They were less concerned, certainly in the way they talked with us, about their life expectancy than about the quality of their lives: their ability to go out (for meals, on holiday, etc.), to work, to be with their families, to feel well. Sometimes, that demanded compromise: some people reported adopting short-cuts, mainly to reduce the time that dialysis takes. And one had her dialysis machine set up on her verandah, so that she could dialyse in a pleasant place. Quality of life matters too.

The health literature often talks about "compliance" or "adherence", particularly in relation to people taking medication. There's the same concern with dialysis: that people should be dialysing according to an agreed schedule. And mostly, that seemed to be what people were doing. But sometimes they didn't because other values dominated. And sometimes they didn't because the technology didn't work as intended and they had to find ways to get things going again. Many of them had turned troubleshooting into an art! As more and more health management happens at home, which means that people are immediately and directly responsible for their own welfare, it seems likely that terms like "compliance" and "adherence" need to be re-thought to allow us all to talk about living as enjoyably and well as we can – with the conditions we have and the available means for managing those conditions. And (of course) the technology should be as easy to use and safe as possible. Our study is hopefully of interest not just to those directly affected by kidney disease, or caring for people who are, or designing technology for managing it, but also to those thinking more broadly about policy on home care and how responsibility is shared between clinicians, patients and family.


Wednesday, 7 January 2015

Strategies for doing fieldwork for health technology design


The cartoons in this blog post are from Fieldwork for Healthcare: Guidance for Investigating Human Factors in Computing Systems, © 2015 Morgan and Claypool Publishers, www.morganclaypool.com. Used with permission.
One of the themes within CHI+MED has been better understanding how interactive medical devices are used in practice, recognising that there are often important differences between work as imagined and work as done.  This has meant working with many people directly involved in healthcare (clinicians, patients, relatives) to understand their work when interacting with medical devices: observing their interactions and interviewing them about their experiences. But doing fieldwork in hospitals and in people’s homes is challenging:
  • You need to get formal ethical clearance to conduct any study involving clinicians or patients. As I’ve noted previously, this can be time-consuming and frustrating. It also means that it can be difficult to change the study design once you discover that things aren’t quite the way you’d imagined, however much preparatory work you’d tried to do. 
  • Hospitals are populated by people from all walks of life, old and young, from many cultures and often in very vulnerable situations. They, their privacy and their confidentiality need to be respected at all times.
  • Staff are working under high pressure. Their work is part-planned, part-reactive, and the environment is complex: organisationally, physically, and professionally. The work is safety-critical, and there is a widespread culture of accountability and blame that can make people wary of being observed by outsiders.
  • Healthcare is a caring profession and, for the vast majority of staff, technology use is a means to an end; the design of that technology is not of interest (beyond being a source of frustration in their work).
  • You’re always an ‘outsider’: not staff, not patient, not visitor, and that’s a role that it can be difficult to make sense of (both for yourself and for the people you’re working with).
  • Given the safety-critical nature of most technologies in healthcare, you can’t just prototype and test ‘in the wild’, so it can be difficult to work out how to improve practices through design.

When CHI+MED started, we couldn’t find many useful resources to guide us in designing and conducting studies, so we found ourselves ‘learning on the job’. And through discussions with others we realised that we were not alone: that other researchers had very similar experiences to ours, and that we could learn a lot from each other.

So we pooled expertise to develop resources to give future researchers a ‘leg up’ for planning and conducting studies. And we hope that the results are useful resources for future researchers:

  • We’ve recently published a journal paper that focuses on themes of gaining access; developing good relations with clinicians and patients; being outsiders in healthcare settings; and managing the cultural divide between technology human factors and clinical practice.
  • We’ve published two books on doing fieldwork in healthcare. The first volume reported the experiences of researchers through 12 case studies, covering experiences in hospitals and in people’s homes, in both developed and developing countries. The second volume presents guidance and advice on doing fieldwork in healthcare. The chapters cover ethical issues, preparing for the context and networking, developing a data collection plan, implementing a technology or practice, and thinking about impact.
  • Most of our work is neither pure ethnography nor pure Grounded Theory, but somewhere between the two in terms of both data gathering and analysis techniques: semi-structured, interpretivist, pragmatic. There isn’t an agreed name for this, but we’re calling them semi-structured qualitative studies, and have written about them in these terms.

If you know of other useful resources, do please let us know!