2015 “Privacy In Action” - PowerPoint Presentation
Presentation Transcript

Slide 1

2015 “Privacy In Action” Speaker Series

January 28, 2015

Innovative Technologies to Increase Privacy

Dial-in: 1-855-767-1051, Conference ID: 34619328

Slide 2

Administrative Items

Do not use your computer microphone to participate in this meeting. Lync will be used only as a display. Please dial in using the following information:
- Phone number: 1-855-767-1051
- Conference ID: 34619328

Please mute your computer microphone and speakers. This will eliminate feedback on the line and make it easier for you and your colleagues to hear the presentation.

The presenters will address questions during the Q&A session at the end of each presentation. For those online, please feel free to type your questions into the Lync Instant Messenger. Send technical issues to VACOPrivacySpeakers@va.gov.

Slide 3

Welcome and Introduction of Speakers

LaShaunne’ David
Director, VA Privacy Service
U.S. Department of Veterans Affairs

Slide 4

Report to the President
Big Data and Privacy: A Technological Perspective
May 2014

Slide 5

Charge from President Obama
- January 17, 2014 speech requesting analysis of big-data implications for policy
- Scoping study, focusing on the wider economy and society
- PCAST report to inform and accompany the White House report

Objectives of the PCAST report
- Assess current technologies for managing and analyzing big data and preserving privacy
- Consider how such technologies are evolving
- Explain what technological capabilities and trends imply for design and enforcement of public policy to protect privacy in big-data contexts

Slide 6

President’s Council of Advisors on Science and Technology (PCAST)

PCAST Working Group Members & Staff
- Susan Graham, Co-Chair, UC Berkeley
- William Press, Co-Chair, University of Texas
- S. James Gates, Jr., University of Maryland
- Mark Gorenberg, Zetta Venture Partners
- John P. Holdren, OSTP Director
- Eric Lander, Broad Institute of Harvard and MIT
- Craig Mundie, Microsoft Corp.
- Maxine Savitz, National Academy of Engineering
- Eric Schmidt, Google, Inc.
- Marjory S. Blumenthal, PCAST Executive Director
- Michael Johnson, OSTP (NSIA Assistant Director)

Other PCAST Members
- Rosina Bierbaum, University of Michigan
- Christine Cassel, National Quality Forum
- Christopher Chyba, Princeton University
- Shirley Ann Jackson, RPI
- Chad Mirkin, Northwestern University
- Mario Molina, UC San Diego
- Ed Penhoet, Alta Partners
- Barbara Schaal, Washington University
- Daniel Schrag, Harvard University

Slide 7

Changing Technological Contexts

Privacy history conditioned on “small data”
- Collection of data and development of data sets used with conventional statistics
- Context of a personal relationship (e.g., personal physician, local shop)

Big data attributes
- Quantity and variety of data available to be processed (the “3 Vs”)
- Scale of analysis that can be applied to those data (“analytics”)
- Expansion of metadata

Laws have not always kept pace with technological realities

Slide 8

People Emit Data Continuously . . .

Born digital
- Generated for computer(s)
- Clicks and taps, GPS, cookies

Born analog
- Byproduct of the physical world
- Sensors collect (often invisibly)

Over-collection? Digital convergence?

Big-data analytics create new information
- Data mining and machine learning
- Data fusion and integration (data from different sources)
- Image/speech recognition
- Social-network analysis (self-censorship won’t help…)

Slide 9

The Cloud as Dominant Infrastructure
- Easy ingestion, access, and use of data
- Replication and distribution
- Infrastructure for mobility (e.g., smartphone apps)
- Potential security benefits from automation, procedures, oversight
- Democratization of analytics

Slide 10

Cybersecurity and Privacy: Distinctions and Dependency
- Cybersecurity: technologies enforce policies for computer use and communication
- Systems to protect identity and to authenticate (are you who you say you are?)
- It is harder to codify privacy policy for technological implementation
- Poor cybersecurity is a threat to privacy, but violations of privacy are possible with no failure in computer security (misuse of data, fusion of data)
- Cybersecurity: necessary but not sufficient

Slide 11

Technologies and Strategies for Privacy Protection
- Cryptography and encryption
- Anonymization and de-identification (see the sketch below)
- Data deletion and ephemerality
- Notice and consent
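As a minimal illustration of the “anonymization and de-identification” item above (a sketch under assumptions, not a technique prescribed by the PCAST report), the snippet below replaces a direct identifier with a keyed hash and drops other direct identifiers. The record fields and key handling are hypothetical, and, as the report cautions, de-identification of this kind does not by itself guarantee that re-identification is impossible.

    import hashlib
    import hmac

    # Hypothetical record; field names are illustrative only.
    record = {
        "patient_id": "123-45-6789",
        "name": "Jane Q. Veteran",
        "diagnosis_code": "F10.20",
    }

    # Secret key held by the data steward (assumption: managed outside the dataset).
    PSEUDONYM_KEY = b"replace-with-a-securely-stored-secret"

    def pseudonymize(value: str) -> str:
        """Replace a direct identifier with a keyed hash (HMAC-SHA-256)."""
        return hmac.new(PSEUDONYM_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

    deidentified = {
        "patient_pseudonym": pseudonymize(record["patient_id"]),
        # Other direct identifiers such as the name are dropped rather than transformed.
        "diagnosis_code": record["diagnosis_code"],
    }
    print(deidentified)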

Slide 12

Areas of Concern: Examples
- Healthcare: personalized medicine (including genetic information); mobile devices that monitor
- Education: new online platforms collect masses of data and enable longitudinal datasets
- Home: more ways of collecting, storing, and communicating

Slide 13

What Might the Future Look Like?
- Taylor Rodriguez packs for a trip and leaves her suitcase outside her home for pick-up
- A camera on a streetlight watches the suitcase, which has an RFID tag (anti-theft)
- The suitcase is picked up at night by a delivery company; the shipper knows Taylor’s itinerary and plans
- A self-driving car arrives, its instructions for her itinerary delivered by the cloud
- No boarding passes or queues at the airport; everyone is tracked by phone, facial recognition, gait, emotional state, and RFID tags
- In this world, the cloud and robotic aides are trustworthy with respect to personal privacy
- Improvements in convenience and security of everyday life become possible
- Not an endorsement, just food for thought!

Slide 14

PCAST Perspectives and Conclusions
- New sources of big data are abundant; new analytics tools will emerge
- New data aggregation and processing can bring enormous economic and social benefits
- Unintentional leaking of data and deliberate systemic attacks on privacy are potential risks
- Privacy-sensitive data cannot always be recognized when collected; sensitivity may emerge with analytics, which may be able to home in on the moment of particularization to an individual
- Government has a role in preventing breaches of privacy that can harm individuals and groups
- Technology plus law/regulation are needed to generate incentives and contend with the measure-countermeasure cycle
- Data collectors, data analyzers, and users of analyzed data are different actors; policy can intervene at various stages of this value chain
- Attention to collection practices may reduce risk, but use is the most technically feasible place to apply regulation
- Technological feasibility matters

Slide 15

Recommendation 1: Policy attention should focus more on the actual uses of big data and less on its collection and analysis

- Any adverse consequences of big data arise from a program or app interacting with raw data or with information refined via analytics
- Policies focused on regulating data collection, storage, retention, a priori limitations on applications, and analysis (absent identifiable actual uses of the data or products of analysis) are unlikely to yield effective strategies for improving privacy
- It is not the data themselves that cause the harm, nor the program itself (absent any data), but the confluence of the two

Slide 16

Recommendation 2: Policies and regulation should not embed particular technological solutions, but rather should be stated in terms of intended outcomes

- Technology alone is not sufficient to protect privacy
- To avoid lagging too far behind the technology, policy concerning privacy protection should address the purpose (the “what”) rather than prescribe the mechanism (the “how”)
- Controlling the use of personal data is more effective than regulating technologies of data collection, storage, and retention, which may evolve rapidly

Slide 17

Recommendation 3: With support from OSTP, the NITRD agencies should strengthen U.S. research in privacy-related technologies and in the relevant areas of social science that inform the successful application of those technologies

- Some of the technology for controlling uses already exists
- Research and research funding are needed for (1) technologies that help to protect privacy, (2) social mechanisms that influence privacy-preserving behavior, and (3) legal options that are robust to changes in technology and create an appropriate balance among economic opportunity, national priorities, and privacy protection

Slide 18

Recommendation 4: OSTP, together with the appropriate educational institutions and professional societies, should encourage increased education and training opportunities concerning privacy protection

- Career paths for professionals (e.g., digital-privacy experts on both the software-development side and the technical-management side)
- Programs that provide education leading to privacy expertise are essential and need encouragement

Slide 19

Recommendation 5: The United States should adopt policies that stimulate the use of practical privacy-protecting technologies that exist today. It can exhibit global leadership both by its convening power and by its own procurement practices

- Nurture the commercial potential of privacy-enhancing technologies through U.S. government procurement and through the larger policy framework
- Promote the creation and adoption of standards
- Cloud computing offers positive new opportunities for privacy (privacy-preserving cloud services?)
- PCAST is not aware of more effective innovation or strategies being developed abroad

Slide 20

White House (“Podesta”) Policy Recommendations
- Advance the Consumer Privacy Bill of Rights
- Pass national data breach legislation
- Extend privacy protections to non-U.S. persons
- Ensure data collected on students in school are used for educational purposes
- Expand technical expertise to stop discrimination
- Amend the Electronic Communications Privacy Act

Slide 21

Questions?

Marjory S. Blumenthal, Executive Director, PCAST
mblumenthal@ostp.eop.gov

Slide 22

Lucia Savage
Chief Privacy Officer

ONC Update and Data Segmentation for Privacy (DS4P) Update

Veterans Administration Data Privacy Day
January 28, 2015

Slide 23

10-Year Interoperability Vision (fall 2014)
- Leverage health IT to increase health care quality, lower health care costs, and improve population health
- Focus on supporting health broadly, including but not limited to health care delivery
- Build incrementally over time from current technology; multiple methods of exchange are required
- Establish the best minimum possible interoperability for all; create opportunities for innovation
- Empower and maintain focus on individuals

http://healthit.gov/sites/default/files/ONC10yearInteroperabilityConceptPaper.pdf

Slide 24

Vision for the Decade Ahead: Improvements Due to the Sharing of Interoperable Data
- Core technical standards and functions
- Certification to support adoption and optimization of health IT products and services
- Privacy and security protections for health information
- Supportive business, clinical, cultural, and regulatory environments
- Rules of engagement and governance

Slide 25

Data Segmentation for Privacy (DS4P)

Goal: Develop technical standards, develop a use case, and pilot use-case testing of whether a patient’s choice to disclose, or not disclose, information regulated by 42 CFR Part 2 (substance abuse treatment at a federally regulated facility) can be captured, documented, and persisted electronically.

Office of the National Coordinator for Health Information Technology

Slide 26

Types of Privacy Metadata Used by DS4P
- Purpose of Use: defines the allowed purposes for the disclosure (e.g., Treatment, Emergency Treatment).
- Obligations / Refrain Codes: specific obligations placed on the receiving system (e.g., do not re-disclose without consent).
- Confidentiality Codes: used by systems to help convey or enforce rules regarding access to data requiring enhanced protection. Uses a “highest watermark” approach (see the sketch below).
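As a minimal sketch of the “highest watermark” idea (an illustration under assumed names and ordering, not code from the DS4P specification), the snippet below selects the most restrictive HL7 confidentiality code found across the sections of a disclosed document, and labels the whole document with it.

    # Illustrative only: HL7 confidentiality codes ordered from least to most restrictive.
    # (U = unrestricted, L = low, M = moderate, N = normal, R = restricted, V = very restricted)
    CONFIDENTIALITY_ORDER = ["U", "L", "M", "N", "R", "V"]

    def highest_watermark(section_codes):
        """Return the most restrictive confidentiality code seen across document sections."""
        return max(section_codes, key=CONFIDENTIALITY_ORDER.index)

    # Hypothetical document: most sections are "normal", one 42 CFR Part 2 section is "restricted".
    sections = ["N", "N", "R"]
    print(highest_watermark(sections))  # -> "R": the document as a whole is labeled restricted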

Slide 27

Selected Standards

STANDARD: HL7 Implementation Guide: Data Segmentation for Privacy (DS4P), Release 1 (includes Content Profile, Profile for Direct, and Profile for Exchange)

Capability: Metadata Vocabularies (for Transport and/or Document Metadata)

Standards/Profiles used by the HL7 DS4P R1 Standard and their specific usage (see the sketch after this list):
- HL7 RefrainPolicy: conveys specific prohibitions on the use of disclosed health information (e.g., prohibition of redisclosure without consent)
- HL7 PurposeOfUse: conveys the purpose of the disclosure of health information (e.g., treatment, research, emergency)
- HL7 BasicConfidentialityCodeKind: used to represent confidentiality codes associated with disclosed health information (e.g., restricted), as specified in the HL7 Healthcare Security Classification standard (HCS)
- HL7 ObligationCode: used to convey specific obligations associated with disclosed health information (e.g., encryption)
- HL7 ActPolicyType: used to convey a type of policy
- HL7 SensitivityPrivacyPolicy: used to convey the sensitivity level of a specific policy
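To make the vocabularies above concrete, here is a small, hypothetical sketch of the kind of privacy metadata a sending system might attach to a disclosed document. The dictionary structure and field names are assumptions for illustration only (they are not the DS4P Implementation Guide schema); the code values R, TREAT, NORDSCLCD, and ENCRYPT are drawn from the HL7 vocabularies listed above.

    # Hypothetical example only: DS4P-style privacy metadata for one disclosed document,
    # using codes from the HL7 vocabularies listed above. Field names are illustrative,
    # not the actual DS4P Implementation Guide schema.
    document_privacy_metadata = {
        "confidentiality_code": "R",      # HL7 BasicConfidentialityCodeKind: restricted
        "purpose_of_use": "TREAT",        # HL7 PurposeOfUse: treatment
        "refrain_policy": ["NORDSCLCD"],  # HL7 RefrainPolicy: no redisclosure without consent directive
        "obligation_code": ["ENCRYPT"],   # HL7 ObligationCode: encrypt the information
    }

    def receiver_may_redisclose(metadata, has_patient_consent: bool) -> bool:
        """Illustrative check a receiving system might perform before redisclosing."""
        if "NORDSCLCD" in metadata.get("refrain_policy", []):
            return has_patient_consent
        return True

    print(receiver_may_redisclose(document_privacy_metadata, has_patient_consent=False))  # False

In practice the receiving system, not the sender, enforces these obligations, which is why DS4P focuses on conveying the metadata unambiguously alongside the disclosed content.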

Slide 28

DS4P Standards
- HL7 normative standard, approved by ANSI in May 2014
- The standards facilitate tagging at both the document and section level
- ONC pilots tested tagging at the document level

Slide 29

DS4P Pilot Accomplishments

Data Segmentation for Privacy Initiative

Slide 30

Pilot Accomplishments

NETSMART Pilot: The Netsmart DS4P Part 2 solution has been implemented with the community services referral network in Tampa Bay (the 2-1-1 system), helping them manage restricted data associated with programs regulated by 42 CFR Part 2.

Slide 31

Pilot Accomplishments

VA/SAMHSA Pilot: The pilot was successfully tested and demonstrated in multiple venues, including the Interoperability Showcase at HIMSS 2013 and the HL7 Plenary meeting in Baltimore in September 2013. VA has extended the DS4P capabilities to demonstrate the use of FHIR for DS4P (demonstrated at HL7 in January 2014, in real time, using resources from Australia, Canada, and the USA).

Slide 32

HITPC Recommendations re: Incorporating Standards into EHRs

Context
- ONC is contemplating expanding the certification program to “voluntary” EHRs for behavioral health and long-term and acute care
- No MU (Meaningful Use) incentives
- Aim of promoting exchange of data with primary care providers

Office of the National Coordinator for Health Information Technology

Slide 33

Questions?

Slide 34

Thanks for Attending!

Thank you for attending the first of four 2015 VA Privacy Service “Speaker Series” sessions. We value your feedback, opinions, and comments! After this session, you will receive a short questionnaire via email. Please take a moment to complete it upon receipt.

To self-certify Lync meeting attendance in the Talent Management System (TMS), search:
- Item Title: Privacy In Action - Speaker Series 2015: Innovative Technologies to Increase Privacy
- TMS ID: 3901065

Visit the new VA Privacy Service website at http://vaww.oprm.va.gov/privacy/ to learn more about privacy within VA.