I was re-reading my notes today and realized I've done Dr Sophia Frentz a bit of a disservice in my write-up. This was a really good talk on the risks and problems involved in gene sequencing and electronic health records.
I didn't take a lot of notes, since this is something I've been living for the last few years, but it was a great primer on the topic, so I thought I'd try to do a better write-up from memory and the photos I took of the slides.
I'm probably going to struggle to separate my memory of the talk from my own thoughts, so any mistakes are definitely mine.
(Not) hacking your biology
Sophia Frentz @SophiaFrentz
There's a lot of noise at the moment about getting your genome sequenced. This can be for health reasons, research reasons, or even family history.
But there are risks involved, particularly with private gene sequencing companies such as 23andMe or Ancestry. These companies are definitely selling your data; it's part of the business model.
DNA is forever. Right now there's only so much we can interpret from it, but as we learn more, there will be more that a DNA sequence can tell about you. Once you've given it up you can't take it back, so it's very difficult to make a calculated risk decision.
The risk I took was calculated, but man, am I bad at math.
Also, when you get your DNA sequenced you're not the only one taking on the risk: you share half your DNA with each of your parents, your children, and your siblings. So you're also putting the risk on the rest of your family. Is it even possible to provide meaningful consent for this?
Of course there are legitimate health reasons to get your DNA sequenced. Do this through a health provider; that way it should at least be treated as a clinical record with the appropriate Personal Health Information protections.
There is also a need to sequence DNA to progress scientific research, but that brings its own set of challenges. DNA can itself be identifying information, so it's impossible to reliably anonymise DNA records. Even if you de-identify a record by removing the associated name, date of birth, etc., the DNA alone can remain personally identifiable. Similar risks apply as future advances make it more identifiable over time.
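To make that re-identification risk concrete, here's a toy sketch (all names, record IDs, and genotypes are invented, and real linkage attacks use far more genetic markers) of how a "de-identified" record can be re-linked by matching the genetic data itself against a second, identified dataset:

```python
# Hypothetical illustration: stripping name and date of birth does not
# anonymise a record if the genetic profile itself can be matched
# against another dataset that does carry identities.

# "De-identified" research records: genotype only.
research_records = [
    {"id": "R1", "snps": ("AA", "CT", "GG", "TT")},
    {"id": "R2", "snps": ("AG", "CC", "GT", "AT")},
]

# A separate, identified dataset, e.g. a public genealogy database.
genealogy_db = [
    {"name": "Alex Example", "snps": ("AA", "CT", "GG", "TT")},
    {"name": "Sam Sample",   "snps": ("GG", "CC", "GT", "AA")},
]

# Linkage attack: join the two datasets on the genetic profile itself.
lookup = {person["snps"]: person["name"] for person in genealogy_db}
for record in research_records:
    name = lookup.get(record["snps"])
    if name:
        print(f"{record['id']} re-identified as {name}")
```

The point is that the genetic profile acts as its own join key, so no amount of removing conventional identifiers closes off this route.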
So we're presented with a dilemma between open science and closed science. Open science would require the DNA data set to be published, bringing in all the risks above. Is it possible to get informed consent in that context? There are real risks of harm if the information people get about their own genome is not entirely correct, and we know the caveats are often lost. Having this data out there creates all kinds of risks of discrimination.
Closed science, though, presents its own problems. It limits the research that can be done, and limits who does that research to networks of people who can get the data. It takes power away from the individual, who can't get the results of research into their own genome. Consent, and the scope of consent, is often very confusing to participants. And of course this means treating the source data as seriously sensitive PHI, which brings cybersecurity requirements that research institutes are not well placed to manage.
So yeah, make an informed decision.
Which provides a bit of a transition to a discussion of Electronic Medical Records. Particularly given all the noise in Australia, it's worth remembering that these do provide a lot of benefits. Clinicians cite examples of how access to a health record through an EHR has quite literally saved lives.
And paper health records have security breaches too. Paper records are left lying around hospitals where any of us could walk off with them. We've all sat in the ED and overheard the details of someone else's care, including their identifying information.
But there definitely are issues. The Australian MyEHR had no purposeful or malicious attacks, but still had 42 data breaches: people being shown records they shouldn't see, records attached to the wrong patient, and so on. This happens on paper as well; there are cases of people being sent other people's confidential medical records by accident because they were put in the wrong envelope.
If you ask anyone involved in the health sector, they will say they take patient privacy very seriously. But when you dig, by asking questions like "do you go into a room to have conversations about patient care?", you find that patients are regularly discussed in corridors where they can be overheard.
There is no information security culture, and that is what we need to work on. Technology doesn't solve social problems; it just magnifies them. These are social problems.
So what can we do:
- Make responsible individual decisions
- Educate people outside of security on needs, requirements, and values, so we can help spread a culture of information security
- Be the squeaky wheel: complain about breaches, because this is the only way the culture will eventually change
Two corrections that Dr Frentz was kind enough to point out:
- The second action point is about educating people outside of the security sector, and optionally outside of health, whereas my initial write-up had it as outside of the health sector
- DNA can be, but is not necessarily, personally identifying, whereas my initial write-up had it as always identifying