Healthy Data - Originally Posted March 23, 2016

Our third HSCollab luncheon focused on "Healthy Data: health, data and healthy practices in the age of the quantified-self," and we spent some time playing with our growing collection of pedometers, including some of the first mechanical devices. We also laughed and commiserated a little. It feels important to me to report out on this quality of our conversation too: we're developing a community that cares with and for one another, and that is an important feature of the kind of work HSCollab wants to do.

Shared care is also an important part of what people are doing when they are engaging in community efforts to quantify their activities/bodies in pursuit of improved health. As one participant noted, it's no fun if you're the only one with a Fitbit.

We talked a bit about very analogue pace counters, or step counters. The one I have is a lot like this, although the cord on mine is in desert camo. As one participant noted, we also know of these as borderland technologies for diabetes care - such pace counters are often handed out to members of communities and become part of a collective effort to increase walking as a way to combat diabetes and diabetes-related illnesses. This includes friends peeking at one another's beads, with competition and uplift becoming part of the behavior-change practice.

We also talked about the Healthy Active Natives FB groups, which encourage greater activity within the communities they serve. Important in our discussion was a recognition that efforts such as this include Native-specific practices and advertisements. This is a decolonizing quantifying effort, which has really interesting implications for how we can think about the politics of technologies and techno-social networks.

We also discussed at length the ways that quantifying and health-surveillance technologies are often invested in developing a set of consumers with life-long dependencies. We talked about what knowledge is lost and gained when knowledge production is displaced onto a device, and how agency is an important part of interfacing with any tech.

In the case of the pace beads, there was a general sense that they allowed a certain kind of agency - that they were a memory aid rather than an automation device - and that the aesthetics of beads on a string mean differently than numbers on a digital device. This leaves us attuned to the differences between receiving and creating the information, or receiving and creating meaning. This also has really important implications for what we mean when we say someone is performing with data devices - the what and the how are different depending on the media affordances/restrictions. One of our participants noted that the agency of working on the world with a tool (as in the pace counter) is entirely different from that enacted by putting on an electronic device and being worked upon yourself.

In terms of agency, one of our participants shared her work on data monitoring in health care settings and echoed that agency matters. She noted that people tend to slow their usage if they are not empowered to do something with the data, and that monitoring alone doesn't change patients' quality-of-life outcomes. More needs to be done to empower people who are using tracking devices for acute health situations.

This led us to talk at length about affect and wearable technologies, particularly around health. As several people observed, routine monitoring is important, life-saving even for some, but it is also a reminder that the person doing the testing is ill. In the case of mental health or non-critical health conditions, this rubs up against different understandings of what is "healthy" and may put a person in the position of regularly being reminded that others think she's sick when she disagrees. Shaming, negative affect, and the domination of daily activities are problems for those who are being asked or compelled to use health monitoring tools.

This brought us again to the question of whether there are ways for health tracking devices/tools to NOT displace human knowledge/expertise. We also returned to a favorite theme, ghosts - although this time it was expressed as being aware of the presence of others - designers, manufacturers, etc. - in the technology: an acute sense that "this was not made for me," often without "me" ever in view.

We also talked briefly about psychotherapeutic "mood logs" and the ways in which tracking technologies have existed for a very long time, many of them textual and highly contextual. Based on our conversations, that contextuality matters a great deal to a full understanding of what the data means.

Our next and last luncheon is scheduled for April 18 on the theme of Algorithmic Bias: subjectivity and implicit biases in algorithm and tech design.

We also want to make time to scope out our future work as a group. Among the ideas we discussed are:

  • Bringing in speakers for public events, possibly even curating a series of related events across the ASU campuses and in our communities.
  • Writing together and sharing our work with one another - including possibly writing something in the vein of "As We Now Think" and/or a manifesto or statement of research objectives ("Urgent Questions") for the future of HSCollab.

To make room for all of this, we'll schedule a two-hour session for our last luncheon. If you can only make one hour, we'll do Algorithmic Bias in the first hour and then transition to the Next Steps conversation in the second. We hope you'll join us!