The Federal Trade Commission convened a public workshop in late 2014 to discuss the use of big data analytics, their benefits for consumers, and their potential for exclusionary and discriminatory outcomes, particularly with respect to low-income and underserved populations. See the final report, just issued: Big Data: A Tool for Inclusion or Exclusion? Understanding the Issues, and the accompanying presser. An earlier FTC report on data brokers focused on data collection, compilation, and analytics; the latest report focuses on the use of data.
The general regulatory framework governing the use of data includes a number of laws that should be considered afresh, given the current range of data available for analysis and the ways in which data types are mashed up for specific analyses.
The Fair Credit Reporting Act (FCRA) governs traditional credit decisions and may also apply to employment, insurance, and housing screening. Using predictive analytics, big data companies may now take non-traditional characteristics in a person's data (zip code, social media usage, and others) and mash them up with income and employment history to make credit decisions. Use of these data must be carried out with the limits of the law in mind: the need for reasonable procedures to ensure accuracy, as well as consumers' opportunity to review and correct their own data.
The FTC also enforces the Equal Credit Opportunity Act (ECOA), which prohibits discrimination based on protected characteristics (e.g., race, religion, national origin, gender, marital status, age, genetic information). Disparate treatment (e.g., charging higher interest rates to single borrowers) or disparate impact flowing from facially neutral policies (e.g., making credit decisions based on applicants' zip codes, with neighborhood serving as a proxy for race) could constitute a violation of the ECOA when driven by big data analytics; a rough numerical screen of the sort sketched below can help surface such patterns.
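As a purely illustrative aside, here is a minimal sketch (in Python) of the kind of rough disparate-impact screen a lender might run on its own approval data. The figures, group labels, and the 0.8 threshold, which is borrowed from the familiar four-fifths rule of thumb used in the employment context, are assumptions for the example, not the legal test under the ECOA.

```python
# Illustrative sketch only: a rough disparate-impact screen run on hypothetical
# approval data. The figures, group labels, and the 0.8 threshold (borrowed from
# the "four-fifths" rule of thumb used in the employment context) are
# assumptions, not the legal standard under the ECOA.

approvals = {
    # group label: (number approved, number of applicants)
    "group_a": (480, 600),
    "group_b": (210, 400),
}

# Approval rate for each group
rates = {group: approved / applied
         for group, (approved, applied) in approvals.items()}

# Compare each group's approval rate to the most favored group's rate
best_rate = max(rates.values())
for group, rate in rates.items():
    ratio = rate / best_rate
    flag = "review for disparate impact" if ratio < 0.8 else "ok"
    print(f"{group}: approval rate {rate:.2f}, ratio to best {ratio:.2f} -> {flag}")
```

A screen like this neither establishes nor rules out a violation; it simply flags where a facially neutral variable (a zip code, say) may be producing skewed outcomes that warrant closer review.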
Finally, the Federal Trade Commission Act (FTC Act) prohibits unfair or deceptive acts or practices. Businesses using big data should consider whether their big data analytics violate limitations on the sharing of data and whether those analytics are conducted in a secure manner.
The FTC encourages companies to consider the following questions when using big data (a brief illustrative sketch follows the list):
- How representative is your data set?
- Does your data model account for biases?
- How accurate are your predictions based on big data?
- Does your reliance on big data raise ethical or fairness concerns?
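To make the first of those questions concrete, here is a similarly minimal sketch of how an analyst might compare the mix of a modeling data set against a benchmark population before relying on it; the category names and benchmark shares are assumed for illustration.

```python
# Illustrative sketch only: comparing the mix of a modeling data set against a
# benchmark population (e.g., census figures for the market served). The
# category names and benchmark shares are assumptions for the example.

from collections import Counter

sample_records = ["urban", "urban", "suburban", "urban",
                  "rural", "suburban", "urban", "suburban"]
benchmark_shares = {"urban": 0.40, "suburban": 0.40, "rural": 0.20}

counts = Counter(sample_records)
total = sum(counts.values())

for category, expected in benchmark_shares.items():
    observed = counts.get(category, 0) / total
    gap = observed - expected
    print(f"{category}: observed {observed:.2f}, "
          f"benchmark {expected:.2f}, gap {gap:+.2f}")
```

Gaps surfaced this way do not answer the FTC's questions on their own, but they are the sort of basic check that responds to them.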
While this report focuses on consumer protection in the context of credit reporting and decisions that may be made based on credit data, alone or together with other data, it is of course relevant to the FTC's stance on the use of big data in health care.
The FTC was knocked back on its heels recently by the decision in the LabMD case, which found that the FTC had not proved that the practices in question (the grounds for its enforcement action) caused, or were likely to cause, substantial injury to consumers. (That is just the first prong of the three-prong unfairness test under Sec. 5 of the FTC Act, all three of which must be satisfied in order to find that an act or practice is unfair. The other prongs require the FTC to prove that the act or practice is not reasonably avoidable by consumers and that its harm is not outweighed by countervailing benefits to consumers or to competition.)
Given the right set of facts, however, the FTC is likely to continue to take action against health care companies with poor data security hygiene where an adequate showing of harm can be made.
One important point raised in the report (in the context of consumer protection) is that companies could be held liable where they sold or otherwise provided data to other companies that they knew, or had reason to know, were going to use the data to perpetrate fraud on consumers. Similarly, the FTC is likely to seek to hold companies responsible for sharing health data with other companies that they knew, or had reason to know, had inadequate security protections in place. This is often a weak link in the chain of HIPAA compliance: covered entities and business associates may work to ensure that their own houses are in order, but many are satisfied with perfunctory statements from their downstream business associates, or even a simple signature on a business associate agreement. Not only could that approach lead to liability in an OCR enforcement action; the FTC report is a reminder that it could leave companies open to FTC enforcement actions as well.
(A related issue is the concern that health data may find its way into big data analyses unrelated to the data subject's health care, thanks to our current national obsession with security, as evidenced by the rise of big data policing, and the recent enactment of the deeply flawed Cybersecurity Information Sharing Act. This is an issue that demands a broad national conversation, in the wake of Congress' decision last spring not to re-up the broad surveillance provisions of the Patriot Act, rather than the budget-rider treatment we got.)
Bottom line: the FTC is seeking to raise awareness and improve security practices (among other things), both through the public dialogue that preceded the report and through the report itself. While the FTC and OCR are ultimately more interested in compliance than in hanging individual violators out to dry, the regulated community needs to recognize that enforcement actions will continue, and compliance should be a key business objective.
Nobody wants to be made an example of by the federales.
David Harlow
The Harlow Group LLC
Health Care Law and Consulting
Image credit: vintagedept via Flickr CC