Digital Health, Transparency, Consent and Respect

Transparency and consent are necessary components of ethical data handling, but what about respect?

On 23 July 2018, Australia’s Digital Health Agency began the day defending the security of the Government’s MyHealth Record system, but by midday the agency was dealing with a more immediate matter: users inundating its online and telephone systems.

In an effort to bolster enrolments in a system that citizens had met with apathy, the government switched in 2017 from an opt-in to an opt-out system, under which records would be created automatically for everyone. People wanting to opt out of the new system could do so during a three-month window that opened on Monday, 16 July.

By 11am, Twitter users were reporting that the telephone hold time had blown out to more than 90 minutes, and website users were reporting intermittent outages.

There were other issues with the online opt-out system. As well as a Medicare number, the system demanded a second form of identification, with just three options: a driver’s licence, a passport, or an ImmiCard (issued as an ID card for permanent residents).

Citizens who don’t drive or travel overseas could only opt out through the telephone system.

The gap between the Digital Health Agency’s expectations and the number of people trying to opt out reveals a serious trust deficit.

I think this is more than a simple fear of being hacked, or apathy. It’s more active. People don’t trust the Government or the experts promoting the MyHealth Record.

This mistrust has been gathering in a world battered by international misuse of data: Facebook and Cambridge Analytica come to mind (there’s a good backgrounder at The Guardian about the misuse of that specific data set), as does HealthEngine’s misuse of Australian clients’ personal information, as reported by the ABC.

Such scandals haven’t helped people learn to trust the secondary use of data, and I’d argue that the Government’s impulse to force us to share data with law enforcement has made things worse. But it’s there, in the MyHealth Record law: this data can be obtained by the police, without a warrant, to investigate crimes or to “protect public revenue”.

Secondary use is one of the matters at the core of the ethical use of data, something discussed by Trent Yarwood (a doctor writing for Future Wise) and Justin Warren (a consultant writing for Electronic Frontiers Australia) in this article at Healthcare IT.

Writing in a world infested by the data capitalism of Google (whose DeepMind partnership with the NHS broke data protection laws), Yarwood and Warren respond to the notion that MyHealth Record inevitably leads to better healthcare:

“If patients cannot trust the systems their doctors use, they will be less inclined to share information their doctors need to be able to provide them with healthcare. Patient health will suffer if patients can no longer trust that their healthcare providers will put all their patients’ needs first.”

Transparency and consent—an explanation of exactly what our data will be used for and a legitimate opportunity to say ‘no’—are necessary to help overcome public mistrust. But we need more.

We need the voices of authority to show respect for those they’re explaining the system to.

What does that mean?

In the face of the backlash to MyHealth Record, its advocates and supporters too often decide it’s not worth speaking honestly and intelligibly, in a way that respects those whose fears are reasonable. Too often, advocates don’t engage; they lecture. They don’t respect their audience.

The Digital Health Agency’s MyHealth Record website also shows a lack of respect for users. Finding the privacy controls is difficult. The system puts the burden on the individual, especially someone with a chronic illness, who might need to work out privacy settings across multiple providers for each document their case generates. And that’s before the privacy policy (which directs readers to first consult the My Health Record Act, the Privacy Act, and the Healthcare Identifiers Act) and the data collection notice.

Sure, if the information is there, somewhere, it satisfies the letter of transparency. But making it hard to find or understand leaves people feeling disrespected. As does being told “your health is more important than your privacy concerns”.

You won’t recover people’s trust if you don’t respect them.

About the author: Shara Evans is recognized as one of the world’s top female futurists. She’s a media commentator, strategy adviser, keynote speaker and thought leader, as well as the Founder and CEO of Market Clarity.
