3 Infrastructure and Meaningful Use, or Excessive Regulation and Boondoggle?

little to connect information at a national level; moreover, small clinics and many post-acute facilities, including rehabilitation and long-term care facilities, are not eligible for incentives and do not participate in HIEs.

The vision for health IT failed to take into account such realities, as well as the lack of interoperability of IT platforms (Adler-Milstein and Jha 2013). A commissioned report highlighted the interoperability issues, noting that different federal agencies have authority over different aspects of EHR implementation and exchange, but there does not appear to be any single interagency group charged with coordinating and harmonizing these efforts (Agency for Healthcare Research and Quality 2014, p. 15).11 This state of affairs makes the importance of infrastructure more visible; it is when things do not function as intended that the frictions between old and new practices become evident.

11. The JASON Report cited ineffectiveness of the Federal Advisory Committees created to assist with coordination (the Health IT Policy Committee and the Health IT Standards Committee, which report to the ONC) (Agency for Healthcare Research and Quality 2014). Notably, a proposed new law would replace existing committees with a new entity comprised of industry representatives (see Sect. 7).



5 Considerations of Privacy and Protections

Concerns about privacy and data security are a consistent theme in documents directing the building of new information infrastructures. Yet provisions in new laws create procedural and definitional moves to get around the problem without directly addressing what privacy is in an era when scientists are urged toward open-source data sharing, when algorithms make it relatively easy to re-identify anonymized data, and when individuals volunteer considerable information about themselves in social media and other venues. Policy makers are issuing directives about how to manage information so that it flows in desired directions, yet these directives may conflict with existing guidelines and understandings of what constitutes appropriate protection.



5.1 Provisions in HITECH and Existing HIPAA Rules

The HITECH Act recognized potential problems with data privacy and security, making provisions to protect against commercial use of EHRs and adding criminal penalties for privacy breaches under the Health Insurance Portability and Accountability Act (HIPAA) (Blumenthal 2010). HIPAA, enacted in 1996, authorizes the Department of Health and Human Services (HHS) to create regulatory rules related to the privacy of individually identifiable health information, particularly with regard to unauthorized or inadvertent disclosures.12 Much has been written about HIPAA, including its weaknesses, but for the purposes of this chapter a few main points bear highlighting: (1) while it covers information transmitted between health care providers and relevant “covered entities” and business associates, these are ambiguously defined, and (2) it covers the transmission of information between care providers and business associates, but not individuals, and not the manner of collection.13 Importantly, any information shared by individuals, whether patient experience narratives uploaded in a survey or medical record data voluntarily posted in social media or elsewhere, is not HIPAA protected. Furthermore, as other chapters in this volume attest, the ability to re-identify individuals using diagnostic codes or other relatively easily accessible, publicly available information and re-identification algorithms means that there can be no guarantee of anonymity (Loukides et al. 2010; Gymrek et al. 2013).
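To make concrete how little it can take to undo de-identification, the sketch below is a minimal, hypothetical linkage attack in Python (pandas; the file and column names are invented for illustration): records with names removed are joined to a public, name-bearing dataset on quasi-identifiers such as birth date, sex, and ZIP code, and any record whose quasi-identifier combination is unique in the public data is effectively re-identified.

    import pandas as pd

    # "De-identified" clinical extract: names removed, but quasi-identifiers
    # (birth date, sex, ZIP code) and diagnosis codes retained.
    clinical = pd.read_csv("deidentified_claims.csv")    # hypothetical file
    # Public, name-bearing dataset (e.g., a voter-roll-style list) sharing
    # the same quasi-identifiers.
    public = pd.read_csv("public_registry.csv")          # hypothetical file

    quasi_identifiers = ["birth_date", "sex", "zip_code"]

    # Link the two datasets on the quasi-identifiers.
    linked = clinical.merge(public, on=quasi_identifiers, how="inner")

    # Count how many people in the public data share each quasi-identifier
    # combination; a count of 1 means the match is unambiguous.
    counts = public.groupby(quasi_identifiers).size().rename("n_matches")
    linked = linked.join(counts, on=quasi_identifiers)
    reidentified = linked[linked["n_matches"] == 1]

    print(f"{len(reidentified)} records matched to a unique named individual")

No protected health information leaves the clinical file in this sketch; the exposure comes entirely from combining it with data that was already public.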

Nicholas Terry argues that these conditions become especially problematic with the introduction of specific policy initiatives to promote broader collection and sharing of information and the employment of big data (Terry 2012). He points out that our “medical selves” exist outside of HIPAA-protected spaces, both in terms of what individuals make available themselves and in terms of what data aggregators and brokers are able to infer through data mining of social media entries, purchase patterns, prescription use, sensor data and much more. The surrogate selves produced by aggregating and triangulating such data mean that no protected health information need be accessed in order to infer behavior, illness or other characteristics of individuals. Faced with such end runs around privacy protections, the policy response has been to rely on informed consent to attempt to preserve autonomy. As Barocas and Nissenbaum (2014) demonstrate, this is not only insufficient; the use of big data, particularly with predictive analytics and the ability to infer properties across groups, creates complex problems. Even if only a few people volunteer information, it implicates many others.
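The point that a few volunteers can implicate many others can be pictured with a small, hypothetical sketch in Python (scikit-learn; the files, feature names, and “condition” label are invented): a model trained only on people who chose to share both lifestyle data and a health status can then assign inferred health statuses to people who shared nothing beyond routinely collected lifestyle data.

    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    # Volunteers shared lifestyle features AND a self-reported condition;
    # everyone else appears only in a broker's feature data, with no health
    # information at all.
    volunteers = pd.read_csv("volunteer_profiles.csv")      # hypothetical file
    everyone_else = pd.read_csv("broker_feature_data.csv")  # hypothetical file

    features = ["daily_steps", "supplement_purchases", "late_night_activity"]

    # Train on the small volunteered sample.
    model = LogisticRegression().fit(volunteers[features], volunteers["condition"])

    # Apply the model to people who never consented to share health data:
    # the volunteered information now speaks for the whole group.
    everyone_else["inferred_condition"] = model.predict(everyone_else[features])
    print(everyone_else["inferred_condition"].value_counts())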



12. Health Insurance Portability and Accountability Act of 1996, Public Law 104–191, Sub. F, Sec. 264.

13. Health information is defined as “any information, whether oral or recorded in any form or medium, that (A) is created or received by a health care provider, health plan, public health authority, employer, life insurer, school or university, or health care clearinghouse; and (B) relates to the past, present, or future physical or mental health or condition of an individual, the provision of health care to an individual, or the past, present, or future payment for the provision of health care to an individual.” Personally identifiable information is that which can be directly tied to an individual, including name, geographic information smaller than a state, social security number, birth and death dates, phone and fax numbers, device serial numbers, and biometric identifiers (including voice print and photos) (45 CFR 160.103). “Business Associate” is defined in 45 CFR 160, subpart A and includes an entity which “… claims processing or administration, data analysis, processing or administration, utilization review, quality assurance, patient safety activities, billing, benefit management, practice management and repricing, or provides legal, actuarial, accounting, consulting, data aggregation, management, administrative, accreditation, or financial services to a covered entity or for an organized health care arrangement …” A summary of the privacy rule can be found at http://www.hhs.gov/ocr/privacy/hipaa/understanding/summary/index.html






However, it is also the case that investigators increasingly want to use explicitly identifiable information in order to link ‘omics’ data with visual or audio data. Researchers building in vitro disease models, for example, may seek to link what they see in the dish with in vivo behavioral phenotypes (Saha and Hogle 2014). This sets up a growing ethical tension between the need to specifically identify individuals and the need to protect their identities (Kaye 2015).

There is also little to prevent non-covered entities (or covered entities acting for a purpose other than those outlined in HIPAA) from sharing data. For example, private HIEs, investigators and other holders of data sets may release data under restrictive or generous data-use terms, or may repackage and sell the data to third parties. If data holders such as an NIH-funded project or a private or public biobank dissolve, they can transfer or sell entire data sets, uses that would most likely not have been covered in the original consent agreements. In sum, HITECH prioritizes privacy and data security, but neither HITECH nor HIPAA solves some of the actual problems, including the tension between the need to access data and the need to protect human subjects (see also Kaye 2015).

There are some prohibitions on the sale of protected information under HIPAA.14 However, the Centers for Medicare and Medicaid Services (CMS), which also holds data on claims for services, personal health information, and identified providers, recently reversed its policy prohibiting access to CMS data for commercial use (e.g., to develop products or tools to sell on the market), allowing such uses in the interest of providing data for the ‘public good.’ Individual records will be de-identified, but the identity of care providers will be available. As Niall Brennan, CMS chief data officer and director of the Office of Enterprise and Data Analytics, put it, “as the [health care] delivery system transforms from rewarding volume to value, data will play a key role” (emphasis added) (CMS Press release, June 2, 2015).15 This underscores the argument of economists, management consultants and entrepreneurs that health data has value as a product in its own right.

Finally, the President’s Council of Advisors on Science and Technology (PCAST), an authoritative scientific advisory body, published a report on big data privacy issues, identifying interception, stalking, false or spurious facts constituting misleading profiles of individuals, and loss of autonomy as serious threats to privacy (President’s Council 2014). However, the report focused on downstream effects (after a harm might have been incurred) and advocated the use of tools such as consumer “preference profiles” in place of, or as an adjunct to, formal consent processes. Recognizing the increasingly invasive practices of data brokers and aggregators, the White House also issued a report advancing a previously prepared Consumer Privacy Bill of Rights, but it did little more than call for data brokers to be more transparent about their practices of gathering and using data on individuals (Executive Office 2014).
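To make the “preference profile” idea more tangible, here is an entirely hypothetical sketch in Python; the PCAST report does not specify any format or mechanism, and all names and categories below are invented. The idea is a machine-readable record of a consumer’s standing preferences that a data holder could consult before each proposed use, rather than relying on a one-time consent form.

    from dataclasses import dataclass, field

    @dataclass
    class PreferenceProfile:
        """Hypothetical standing privacy preferences for one consumer."""
        allowed_uses: set = field(default_factory=set)        # e.g., {"clinical_care"}
        blocked_recipients: set = field(default_factory=set)  # e.g., {"marketing_broker"}

        def permits(self, use: str, recipient: str) -> bool:
            # A proposed use is allowed only if the purpose is on the consumer's
            # allow-list and the recipient is not explicitly blocked.
            return use in self.allowed_uses and recipient not in self.blocked_recipients

    # Example: a consumer allows clinical and approved research uses but blocks ad brokers.
    profile = PreferenceProfile(
        allowed_uses={"clinical_care", "approved_research"},
        blocked_recipients={"ad_network", "marketing_broker"},
    )

    print(profile.permits("approved_research", "university_lab"))    # True
    print(profile.permits("approved_research", "marketing_broker"))  # False
    print(profile.permits("product_marketing", "university_lab"))    # False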



14. Id. at 164.508, ‘uses and disclosures for which an authorization is required.’

15. https://www.cms.gov/Newsroom/MediaReleaseDatabase/Press-releases/2015-Press-releases-items/2015-06-02.html






5.2 Changing the Common Rule: What Is a Human Subject?

At the same time that modifications to privacy protections are taking place, the U.S. is in the process of making major changes to the Common Rule, the principal federal regulation for the protection of human subjects, which applies to all federally funded research.16 An Advance Notice of Proposed Rule-Making (ANPRM) was published in 2011, its title giving a clue about the reasons for change: “Human Subjects Research Protections: Enhancing Protections for Research Subjects and Reducing Burden, Delay, and Ambiguity for Investigators.”17 The proposed rule explicitly names new technologies, including rapid genome sequencing, imaging, and data analytics, as a chief reason the update is needed, indicating an awareness of the growing use of both big data techniques and re-identification algorithms.

After almost four years of deliberations and many thousands of comments from supporters and detractors, the revised Notice was published on September 8, 2015.18 The 519-page document proposes to strengthen data security measures, centralize IRB oversight for multi-institution studies, set stricter requirements for what must be included in consent forms, and make recommendations for standardizing informed consent procedures. Concern about genetic privacy prompted a change requiring patients to consent to each use of a biospecimen, even if the specimen had been de-identified, had previously been used in studies and stored, or was left over after clinical use (e.g., a blood sample taken for diagnostic testing).19 However, in contradiction, the new rule states that broad consent should be used; that is, participants should agree to allow secondary uses of their material and information beyond the original study without having to be re-consented. This was in response to investigators’ concern about the perceived burden of having to obtain consent for each use.



16. The Common Rule (45 CFR 46) was created to implement uniform regulations across the major federal agencies, including the Department of Veterans Affairs, Environmental Protection Agency, National Science Foundation, Agency for International Development, Department of Defense, Department of Commerce, and Department of Education, among others. Researchers funded by these agencies are subject to the Rule. The proposed rule extends the scope to non-federally funded studies as well.

17. The ANPRM and public comments can be found at http://www.hhs.gov/ohrp/humansubjects/anprm2011page.html and in the Federal Register at 76 FR 44512–44531. The NPRM will be available online at http://federalregister.gov/a/2015-21756

18. In the process of conducting research on this topic, I attended a number of medical information technology and precision medicine symposia and workshops, and in each one the desire to change the Common Rule was raised in presentations and audience comments. In informal discussions with participants, as well as in public presentations sponsored by the White House, it was consistently asserted that this change was a top priority and that it would happen by September.

19. Specimens that have been stripped of identifiers (“de-identified”) are currently not counted as human subjects. The current definition of a “human subject” includes living subjects about whom a researcher obtains data through intervention or interaction with the individual, but also any biospecimen or data derived from a human for which any individual personal information can be identified.






More notable for big data researchers is the creation of new categories of research exempted from IRB review. These include studies posing “informational risk that is no more than minimal,” now defined as: use of data containing personal health information originally collected as part of a non-research activity (assuming notice of the possibility of such use was given); public behaviors or activities common in everyday life in which sensitive information may be collected (provided that data security protection procedures are followed); or focus groups and surveys.20 In so doing, the new rules remove oversight for a great many of the kinds of studies emerging with big data that might involve informational risks, stating: “IRB review or oversight of research posing informational risks may not be the best way to minimize the informational risks associated with data on human subjects. Instead, informational risks may be best mitigated through compliance with stringent standards for data security and information protection that are effectively enforced through mechanisms such as periodic random audits.” Jurisdiction for such risks is pushed back under HIPAA, which, as stated previously, only covers certain kinds of conditions and, while protecting against unauthorized access to personal health information, does nothing about the collection of data.
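As a way of reading the new exemption categories side by side, the following sketch encodes them as a simple decision function in Python. It is purely illustrative: the field names and the notion of a “study” record are invented here, and the actual rule’s language is far more detailed than these three boolean checks.

    from dataclasses import dataclass

    @dataclass
    class Study:
        """Hypothetical description of a proposed study, for illustration only."""
        uses_phi_from_nonresearch_activity: bool = False
        notice_of_possible_use_given: bool = False
        observes_public_everyday_behavior: bool = False
        data_security_procedures_followed: bool = False
        is_focus_group_or_survey: bool = False

    def exempt_as_minimal_informational_risk(study: Study) -> bool:
        # Category 1: secondary use of personal health information collected in a
        # non-research activity, provided notice of possible research use was given.
        if study.uses_phi_from_nonresearch_activity and study.notice_of_possible_use_given:
            return True
        # Category 2: observation of public, everyday behaviors in which sensitive
        # information may be collected, provided data security procedures are followed.
        if study.observes_public_everyday_behavior and study.data_security_procedures_followed:
            return True
        # Category 3: focus groups and surveys.
        return study.is_focus_group_or_survey

    # Example: a social media study of public posts with data security safeguards.
    print(exempt_as_minimal_informational_risk(
        Study(observes_public_everyday_behavior=True,
              data_security_procedures_followed=True)))  # True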

This represents a major change to the current scheme, intended to concentrate oversight on higher-risk research, but it also opens the door for many of the sources of data that will be used in big data research on individuals. Furthermore, with the broad use of algorithms to create shadow or surrogate identities from digital exhaust, the question of what counts as a human ‘subject’ may be up for grabs (Terry 2012).

The deliberation over and ultimate establishment of new human subjects rules clearly demonstrate the co-production of science and society in the context of increasing tensions between how societies have thought about protecting human subjects and imperatives for unfettered access to medical and scientific data.



6 Regulating Products Using Big Data: Changes in the Food and Drug Administration

Data-driven science is affecting regulatory oversight as well as discovery research and clinical medicine. There are noteworthy ethical implications for the production of evidence used to support product reviews, the scope of what may be regulated, and a statutory change that allows the FDA direct access to patient information without patients’ knowledge or consent. While most of these changes are in response to calls to streamline regulatory processes and reduce barriers to new



20. This is based on the assumption that people engaging in activities occurring in a public context would have no reasonable expectation of privacy.


