The Center for Democracy and Technology and Fitbit recently released a report entitled Toward Privacy-Aware Research and Development in Wearable Health. From the presser:
Following a research methodology that included direct observation, surveys, and interviews with Fitbit engineers, CDT and Fitbit jointly developed recommendations that offer wearable companies specific guidance on privacy practices around user data, internal company operations, and the potential broader societal impact of applying good data practices during the R&D process. The recommendations focus on user expectations guiding consent, non-coercive rewards being offered to research participants, respecting the needs of vulnerable populations, and upholding trust through robust security and de-identification protocols. The report recommends that wearable companies invest in employees with privacy and ethics backgrounds, empower researchers with embedded tools for data stewardship, set clear security protocols for use of user data, and establish formal accountability measures.
I spoke with Michelle De Mooy who is the CDT’s Deputy Director of the Privacy and Data Project and co-author of the report. She worked with Fitbit for about a year. She commented:
A lot of advocates will look at data that’s flowing through advertising or marketing and we think that’s really important, but we decided to try to look at it from a different perspective and see if we could create positive change going into the internal data flows of a company in the wearable space.
The first step was to document the existing research processes in place at Fitbit; they are described in the report and set forth in this diagram.
For De Mooy, the lodestar is “individual digital dignity,”
but really what that means is that people’s expectations are what should guide your use and your practices and that also means communicating with them about consent, making sure that, regardless of whether it’s a pain or you don’t want to do it, every time that you’re using data in a way that they might not expect, you’re getting consent, and getting consent means not just a popup, it means communicating with people and communicating with them well and listening in return to what they’re telling you . . . .
In that context, over the course of the engagement with Fitbit (funded by RWJF, by the way), De Mooy came to support access to data by industry so long as appropriate principles of data stewardship are applied:
Now, typically you don’t hear privacy advocates talking about access and access to data and making sure that that’s available, but in this case we thought it made sense to do that, because . . . we knew that there were good privacy and security protocols that they applied to the data, so you could do it in a privacy protected way, [and] understanding that if we’re asking them to be stewards and to play a role in social good that they needed to commit to socially-conscious research projects and that means putting money into it and also publishing.
The key takeaways identified in the report are as follows:
- Internal research and development offers a unique window into data practices and policies at companies, such as insight into how data is categorized in projects, the way teams are structured, and the privacy and security methods that are deployed. Internal R&D also offers a flexible environment for piloting privacy-protective and ethical data policies and practices.
- Building a culture of privacy, security, and ethics involves embedding practices and policies that place value on individual dignity and corporate data stewardship, and that also prioritize contributions to the social good.
- Technology companies are managing several dimensions of trust through internal research and development – the relationship between the company and its users, the integrity of internal policies and practices for employees, and the relationship between the company and society. Successfully navigating this trust through practical measures must be at the core of any policy or practice recommendation.
- Research departments at wearable companies face ongoing ethical questions related to the data they process. Policies and procedures around the uses of internal data, such as employee information, should be developed first.
An industry leader like Fitbit has the opportunity to model use of a reasonable privacy framework to apply around research uses of data, even where not required by law. It will be interesting to see whether other companies pursue a similar approach, and whether Fitbit will begin to extend this framework beyond research conducted using data generated by Fitbit employees.
Please listen to my conversation with Michelle De Mooy to learn more about the process and the outcome.
You may read the transcript here, or below.
David Harlow
The Harlow Group LLC
Health Care Law and Consulting
Michelle De Mooy on the Fitbit-CDT Collaboration on Wearables Data Privacy
A Conversation with David Harlow at HealthBlawg
June 2016
David Harlow: This is David Harlow at HealthBlawg and I am speaking today with Michelle De Mooy who is the Deputy Director of the Privacy and Data Project at the Center for Democracy & Technology. Welcome, Michelle, and thank you for joining us today on HealthBlawg.
Michelle De Mooy: Thanks, David. I’m very happy to be here.
David Harlow: Great. So I was very pleased to meet you in person recently and also to learn of the report that you’ve been working on for quite some time now and that was released last month by the Center in collaboration with Fitbit, and this is a report — or a manifesto, I would say, in a way — and it is titled Toward Privacy-Aware Research and Development in Wearable Health and I would be interested to hear — just to start off — how you came to be working with Fitbit on something like this and what the process was that led to the development of this document.
Michelle De Mooy: Sure. And as a long-time advocate I really like the word manifesto, although I’m not sure if it completely applies here, but I can tell you that – I think that’s a really good question because to me it’s one of the more interesting parts of this. So we were funded to do this project through the Robert Wood Johnson Foundation and for me that was really crucial, it sort of took the funding aspect out of the relationship between us and as an advocacy organization that was really important, we wanted to have credibility in the work that we were doing and make sure that it would resonate and so they funded this for about a year and the idea that the Robert Wood Johnson Foundation had was: let’s try something kind of innovative and we’ll do it quickly and we’ll see what happens.
And so to their credit, and to Fitbit’s great credit, they agreed to do this. What we decided to do is try to take a different approach because this was sort of an innovative kind of grant. A lot of advocates will look at data that’s flowing through advertising or marketing and we think that’s really important, but we decided to try to look at it from a different perspective and see if we could create positive change going into the internal data flows of a company in the wearable space. Of course, the wearable industry is interesting to us because a lot of the information, as I’m sure you and your listeners know, flows outside of regulatory protection, mostly HIPAA, and so we’ve been sort of looking at that anyway, kind of scoping the space and figuring out what made sense to advocate for.
And so this project was: let’s look inside a company, let’s work with them, instead of sort of yelling at them or throwing best practices at them, let’s try to learn from them and have them learn from us. And so we worked with them to basically map their internal research and development flows and that was just truly interesting and, again, I still am kind of surprised that Fitbit agreed to do this, but they were very, very open and very interested in learning from us. You know, one of the bizarre parts was having all these sort of PhD people saying to me, what should we do, and I would say, okay, let’s talk, because truthfully a lot of companies in this space, as innovative and as smart and as technical as they are, don’t really have this kind of expertise, they don’t have the frameworks that we have on privacy and ethical decisions and security, even. So it was a really fruitful collaboration between us.
David Harlow: I understand that Fitbit has a long history, I mean long in the context of the company, of doing research on members of its own workforce, tracking different things, looking at different things. I wonder if you could speak to the different sorts of projects that they undertook or undertake internally, the different sorts of frameworks for privacy that existed when you first walked in, and what this evolved to in terms of a privacy framework over the course of the project.
Michelle De Mooy: Sure. You know, I think you brought up one of the most surprising findings of this project, at least for me, which was the fact that a lot of the studies center around Fitbit employee data, and that makes a lot of sense, especially if you’re a smaller company: you’re working with a team of maybe five or six other researchers and you say, I want to test a sensor, so can you hop on the treadmill so I can make sure this is working.
And that made a lot of sense, and then from there they created these studies, which basically means they have a hypothesis, the researchers want to figure something out, and they’ll ask for volunteers from the Fitbit employee base; they have offices in San Francisco, they have lots of different offices. But one of the things that we noticed when we went in there, besides being surprised that this was the case, was that if you’re a company that grows the way Fitbit did — from 10 researchers to 60 researchers, and from a private to a public company, in the time that we worked with them — so, enormous changes, that kind of ad hoc volunteer system is not sustainable. And part of the goal of this work was to create recommendations that would resonate across the wearable industry, so they had to be specific enough to be responsive to what’s going on in internal research and development, but also broad enough to speak to the larger issues in the wearable community.
And so we thought this is probably something that’s going on, it makes a lot of sense; people who work there actually care a lot about health and wellness pretty much across the board, at least the people that we spoke to, and so it was something that made sense, but wouldn’t be sustainable. So that was a great example of the way we approached things, which was to say, okay, let’s come up with some recommendations: we have this data, we can ensure that this employee information is private and that access by the employer is controlled, so there is no kind of negative effect, and also making sure that some of the ethical frameworks that are used — the Belmont Report is one, with its principles of justice and beneficence — were applied, and that there was remuneration: making sure that people who volunteer to be a part of this were given a small reward, so it was outside of their job performance and outside of their salary. And so that’s a good example of sort of how we found something. They were surprised — it wasn’t something that they had ever really thought of as potentially coercive, and not that it necessarily was — but this was a good dynamic that kind of illustrates how we came in and saw things that were there, that needed to be addressed in sort of a sustainable way.
David Harlow: Right. And I think that sort of addresses the issue of scaling over time: if it’s a small group at the outset there is sort of an understanding that this is what we’re all about, but as the group grows, as the company grows, as this trends across an industry, it becomes necessary to start looking at things like the Belmont Report, like the Common Rule, and thinking about how they apply to these uses.
Michelle De Mooy: Yes, and this is a pretty new space, so there are existing frameworks, but a lot of what we heard from the researchers at Fitbit was that there really is no guidance out there, no really comprehensive guidance on how to do this; you can draw from different frameworks. For example, one of the things we pointed out in the report was that there was a lot of knowledge already on staff about ethics, and this was another kind of surprise to Fitbit. You have a lot of degreed people, and what that means is they’ve been to grad school and they’ve probably gone through an IRB, so that framework of ethics and applying it to data practices was something that they could draw on, a crucial value, and something that they could actually look for in the researchers that they were hiring.
So I think those sorts of seeds were a part of what we looked for throughout the project as a way to be sustainable and one of the ways that we frame it is we’re trying to help these companies see themselves as data stewards and to build a culture of stewardship, to build a culture of privacy and ethics that resonates out, that isn’t just about being a company data silo but about, you know, you’re in the health and wellness business, this means you’re part of this conversation about health with the public and so what does that mean and how do we make sure that we’re being valuable to that.
David Harlow: So do you see this as a voluntary framework or do you see this as tied in any way, I mean it’s not a HIPAA issue, but is it an FTC issue, an FDA issue, anything else?
Michelle De Mooy: I would love for that to be the case, but I don’t think so. I do feel like this is not a code of conduct, it’s not even necessarily a best practices document. It really is very new in that sense. As far as I know this kind of partnership has never been done. I haven’t seen it anywhere else. I haven’t seen this sort of advocacy-company relationship, and so as far as I know this would not be something that would be enforceable by a government agency. That being said, though, part of our job at CDT and part of our role as advocates is to try to innovate on these things and to try to push the envelope a little bit to new places, and so my goal is to make sure that this report gets out to as many relevant agencies as possible, whether it’s government or whoever is really interested, but also making sure that companies are aware of it and asking us questions, so that if companies see the connection — and I think many do — between their business interest and their interest in building privacy and ethics within their products, but maybe don’t have a real idea of where to start or how to do it, this might be something that could help them.
David Harlow: So the fact that the study subjects are employees sort of confounds a number of different issues here. Do you see the framework that’s articulated in your report as being applicable to studies that would involve non-employee users?
Michelle De Mooy: Yeah, and actually we did address that in some of the different study types, as they’re called at Fitbit — they’re looking at user data, but, again, one of the more surprising findings is that that’s not a huge part of the work. I mean, that’s an important part, but really what they’re looking at is how the sensors are working, how the hardware and software are functioning, and what they can learn from patterns of user behavior to figure out what’s kind of the next iteration, what’s coming down the pike in two to three years, and so that doesn’t require identification, it doesn’t require a lot of violations of privacy or of the expectations that people might have. This is actually data that they can look at in an anonymized way, that can be rendered unidentifiable. One of the reasons we wanted to work with Fitbit is because I knew that they did some things really well, I knew that they did some things right. And so part of what I wanted to do was not kind of go in and say you did all this wrong, but to try to see what works, how is this working, and so they had some real foundations of good practices and we just built on some of those to say, for example: escalate the protections on user data based on the sensitivity of it, so if they’re just looking at patterns — maybe of steps — it may not be as sensitive; that doesn’t mean it shouldn’t be anonymized in some way, but there are different kinds of levels of protection. And we think that makes sense, because one of the buckets that we put all of this into was the idea of what is the business reality; that’s another thing that I think it’s fair to say advocates have maybe not been as good on, and that is being responsive to the practical needs of a business.
So we didn’t want to create recommendations about user data that would just float around, that they couldn’t act on, where they’d say the market pressures are too strong, we’ve got to get this out, we’re not going to do this; it had to be something that made sense and something that could be sort of built into the work that they were doing.
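To make the sensitivity-based “escalation” De Mooy describes a bit more concrete, here is a minimal sketch of how a research pipeline might apply stronger protections to more sensitive fields before analysts ever see a record. This is my illustration, not drawn from the report or from Fitbit’s actual systems; the field names, tiers, and salt are hypothetical.

```python
# Illustrative sketch only -- not from the CDT/Fitbit report. It shows one way a
# research pipeline might escalate protections by field sensitivity before analysis.
import hashlib
from typing import Any, Dict

# Hypothetical sensitivity tiers: higher tiers get stronger protection.
SENSITIVITY = {
    "step_count": "low",     # aggregate activity pattern, used as-is
    "heart_rate": "medium",  # physiological signal, coarsened before use
    "user_id": "high",       # direct identifier, never kept raw
}

def pseudonymize(value: str, salt: str = "study-salt") -> str:
    """Replace a direct identifier with a salted hash so records can be linked
    within a study without exposing who the person is."""
    return hashlib.sha256((salt + str(value)).encode()).hexdigest()[:12]

def prepare_record(record: Dict[str, Any]) -> Dict[str, Any]:
    """Apply protection proportional to each field's sensitivity tier."""
    cleaned = {}
    for field, value in record.items():
        tier = SENSITIVITY.get(field, "high")   # unknown fields treated as high
        if tier == "low":
            cleaned[field] = value              # keep as-is
        elif tier == "medium":
            cleaned[field] = round(value, -1)   # coarsen the value
        else:
            cleaned[field] = pseudonymize(value)
    return cleaned

if __name__ == "__main__":
    print(prepare_record({"user_id": "emp-0042", "step_count": 9312, "heart_rate": 72}))
```

The point of the sketch is simply that the protection applied is a function of the field’s sensitivity, which mirrors the report’s idea that not all user data needs the same treatment so long as identifiers are never exposed.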
David Harlow: Now, Fitbits and other activity trackers are often used in medical care or medical research, and obviously in those contexts you buy into another, more established, set of privacy rules, right, so that’s sort of a whole other situation. And, again, as I asked before, given that you’re talking about employee data to a great extent, and some anonymized individual user data beyond the employee data, to what extent do these studies raise the issue we sometimes hear about, user ownership of data? Does this implicate that sort of discussion: it’s my data, don’t you do anything with it without my say-so?
Michelle De Mooy: I think it does. The traditional fair information practice principles, the ones that surround individual control and the related principles, are, I think, where that would come in in this kind of context. But what we did here was we kind of used that, we looked at that framework, but when you’re talking about internal research and development, I think it’s fair to say that the sort of general research, that is, looking at your data to make sure that it’s working, to make sure that the device that you’re using is working, that’s expected use, that’s something we expect.
So rather than looking at ownership as much in this, because I think, again, we’re talking about data that the company is using to sort of grow its internal work and to grow the company itself, what we looked at was this idea of individual digital dignity. But really what that means is that people’s expectations are what should guide your use and your practices, and that also means communicating with them about consent, making sure that, regardless of whether it’s a pain or you don’t want to do it, every time that you’re using data in a way that they might not expect, you’re getting consent, and getting consent means not just a popup, it means communicating with people and communicating with them well and listening in return to what they’re telling you, and so we think that conversation made more sense in this context.
David Harlow: In the course of this project did you come across, or did you come to realize, that there were potential uses of this data, even secondary uses of this data, that you hadn’t thought about when you went into it?
Michelle De Mooy: You know, I don’t know the answer to that. Again, our focus was pretty narrow, but I will say this. One of the ways that that did come into play was in the idea that we had about the company contributing to the social good. Now, typically you don’t hear privacy advocates talking about access and access to data and making sure that that’s available, but in this case we thought it made sense to do that, because, well, one, we knew that there were good privacy and security protocols that they applied to the data, so you could do it in a privacy protected way, and understanding that if we’re asking them to be stewards and to play a role in social good, they needed to commit to socially-conscious research projects, and that means putting money into it and also publishing.
And so I think all of that is governed, of course, by what is okay with people and what is okay with their users, and they were absolutely clear about that too. I mean, this wasn’t something that they have done a lot of at all, but we felt like there was a role to play. You know, Fitbit has a unique data set. They can answer, or at least research, questions about obesity, heart disease, things that really are vital issues. So I think in that sense it made sense, and so we kind of emerged with a different kind of thinking about the role of access in this kind of health world and health tech world.
David Harlow: So do you see a pathway to expanding what was done with respect to identifiable employee data to a broader population set? You said earlier that the broader user data was used by the company to date really only on an anonymized basis. Would it make sense that future research might be done in a way that would require different kinds of consents from users of wearable sensors?
Michelle De Mooy: Yeah, I think depending on the size of the company, aggregated data can be really helpful or completely useless, right? I mean there are lots of different ways to apply these protections and some of them make sense and some of them don’t, but I think for employee data what we said, and what I think makes sense, is creating mechanisms, right? So one of the ways we said this was: use innovations to help your employees be a part of this in a privacy-protected way, creating anonymous ways to volunteer, making sure that they can withdraw any time they want and that none of this is identifiable.
Using identifiable data, to me, especially employee data, doesn’t make sense. There may be times when for whatever reason it does, but I would, as a privacy advocate, say that’s not something that is really necessary or needed. Looking at the patterns of information is fine, and when the researchers need to learn more, for example if they have outliers in the data and need to understand the demographics or other kinds of more sensitive, more identifiable information, they can do that in a way that doesn’t reveal the identity of the person, and I think certainly when we’re talking about employees that’s crucial, and I do think it can resonate out.
Again, one of the goals was, okay, if we can look at this sort of internal world where data lives, where decisions about privacy and ethics are made all the time, second by second, maybe what we can do is have a situation where those recommendations resonate out within a company and change the culture of that company and I can tell you that I think that has happened at Fitbit — they’re hiring their first privacy director, which is exciting. They’ve committed to their wellness products being HIPAA-compliant, which I think is also in their business interest. I think that’s smart for them. That’s probably where they’re headed, but at the same time it’s good for privacy and security.
They also have instituted some controls, similar to what we recommended, on how they sign up employees, making sure that they are giving them either gift cards or other kinds of small, meaningful rewards, so they’re following many of these recommendations.
David Harlow: You said earlier that you hoped that publication of something like this would lead to adoption or at least a recognition of the issues by other folks in the community meaning other manufacturers, other folks in industry. Do you see an opportunity here to broaden the discussion in other ways? I mean, how does this tie in with other work that you’re doing?
Michelle De Mooy: You know, it definitely ties into work that we’re doing. I think what I’m looking at now is, well, a couple of things. One is, yeah, we’ve gotten some inquiries from different companies in the space, just curious — I think it’s great to work with a market leader like Fitbit, because other companies say, okay, well, if Fitbit is doing this then we’ve got to think about doing this, we need to at least look at it. And so there has been interest in that sense, and so maybe out of that there will be workshops or something where we could get this information out.
But I think part of what I’m looking at now is a couple of the ways in which wearables are used. Employee wellness programs are a great example of a space where these are being used quite a bit, but I feel like there are a lot of concerns, a lot of privacy concerns and ethical concerns that I have about their use, about employee wellness programs in general, and about the intersection of wearables and those programs. And then, from that, how do laws interact with some of the newer kinds of information that are becoming really mainstream, like genetic information? This marketplace, to me, is rapidly growing bigger and bigger, and there are a lot of interested entities that want this information, including the government, including tons of commercial entities, and yet there are very loose frameworks of laws that surround this stuff and they’re very contextual. So what does it mean when we have a genetic marketplace that explodes, and then we have the government asking for data for the Precision Medicine Initiative, or we have 23andMe starting to work with a giant healthcare insurance company, so all of a sudden genetic information is a part of our medical record? How do we deal with technologies that are growing so quickly that researchers can identify the face of a person based on their genetic information, such that they can actually produce a 3D image of a face? What does that mean in terms of law enforcement? Those are the questions that we’re asking, to sort of bounce off of some of the work we did here.
David Harlow: It’s just the tip of the iceberg, I suppose. There are so many potential implications, and I think here we’re getting further down the path to the question of secondary use of data: how data can be collected at one point in time, for one purpose, in one sort of dataset, and then be cross-referenced and contextualized with other information and put to other sorts of uses entirely.
Michelle De Mooy: That’s right. And I think asking a company to view itself as sort of a steward and be a part of the conversation about health, about social health, was a really important part of this. I think many advocates sort of maybe scoff at that or are more cynical about it, but when you see these changes at Fitbit, for example, those impact a lot of people. When you see that they are aware of the ethical questions that are coming in and are seeing them in that framework, that really can make a huge difference in the role of the company and how they treat user data, and making decisions about secondary uses I think hopefully becomes a different equation for them.
David Harlow: Any last thoughts, Michelle?
Michelle De Mooy: No, thank you. I’m really pleased to be here talking with you about this and I appreciate you recognizing the work, so thanks for having me on.
David Harlow: Well, thank you very much for taking the time. I’ve been speaking with Michelle De Mooy, Deputy Director of the Privacy and Data Project at the Center for Democracy & Technology. This is David Harlow at HealthBlawg.