DeepMind’s initial data-sharing agreement with the UK’s National Health Service looks to be coming thoroughly unstuck.
The partnership attracted controversy last year, when the scope of a behind-the-scenes data-sharing arrangement was revealed by a New Scientist investigation. The personally identifiable health information of some 1.6 million NHS patients was used to develop an app for the early detection of a kidney condition.
Now the patient data privacy advisory body to the UK government, the National Data Guardian (NDG), has weighed in with an opinion that there was, contrary to DeepMind’s repeated insistence, no legal basis for the transfer of 1.6 million patients’ medical records to the Google-owned company during the development/piloting phase of the app.
The arrangement remains under investigation by the UK’s data protection watchdog, the ICO.
Sky News has now obtained and published (embedded in the tweet below) a letter sent by the NDG to the Royal Free NHS Trust and to DeepMind co-founder Mustafa Suleyman. A spokeswoman for the NDG confirmed to TechCrunch that the letter is authentic.
Full copies of National Data Guardian’s letter sent to @SkyNews warning of legal basis re: @DeepMindAI + 1.6 millio… twitter.com/i/web/status/8…—
Alexander J. Martin (@lexanderjmartin) May 15, 2017
In the letter, which was sent in February 2017, Dame Fiona Caldicott, the NDG, writes:
It is my view and that of my panel that the purpose for the transfer of 1.6 million identifiable patient records to Google DeepMind was for the testing of the Streams application, and not for the provision of direct care to patients. Given that Streams was going through testing and therefore could not be relied upon for patient care, any role the application might have played in supporting the provision of direct care would have been limited and secondary to the purpose of the data transfer. My considered opinion therefore remains that it would not have been within the reasonable expectation of patients that their records would have been shared for this purpose.
She goes on to state that she is writing to the ICO to communicate her advice, which will feed into the watchdog’s ongoing investigation of the data-sharing arrangement.
The NDG also provided us with the following statement regarding the letter:
The National Data Guardian for Health and Care, Dame Fiona Caldicott, and her panel of advisors have been considering how patient data was shared by the Royal Free London NHS Foundation Trust with DeepMind under the ‘Streams’ project to improve the detection and management of acute kidney injury. In discussions with the ICO about this, the NDG agreed to provide advice on the use of implied consent for direct care as the legal basis for the sharing of data by the Royal Free with DeepMind. While the ICO investigation is ongoing the NDG will provide any further assistance to the ICO as required, but will not be commenting further on the matter at this point.
During the development of the Streams app last year, DeepMind and the Royal Free consistently claimed patient consent was not needed to share medical records for the Streams app project because the app was being used for so-called ‘direct patient care’. Yet during 2016 the app was never actually being used for direct care; it was merely being tested alongside normal clinical practices.
Critics have also questioned the ‘direct care’ argument for the app itself, pointing out, for example, that not all the patients whose data is being shared with DeepMind for Streams will go on to develop acute kidney injury, the condition the app is intended to detect, and that some will therefore never be in a direct care relationship for this condition.
The NDG’s comment here does not weigh in on that wider point. But the letter does make it clear that, in Caldicott’s view, DeepMind and the Royal Free were not justified in their use of live patient data during the testing/development phase of the Streams app.
The spokeswoman confirmed to us that the scope of the NDG’s review was limited to the appropriateness of implied consent for direct care “as the legal basis when the data was shared”.
“As the letter states, this was the legal basis that the Royal Free had confirmed that they had used. The NDG has not considered or been asked to consider the appropriateness of any other legal bases,” she added.
At the time of writing neither DeepMind nor the Royal Free NHS Trust had responded to our questions or request for comment.
But responding to the NDG’s letter in a statement, health data privacy advocacy group medConfidential’s coordinator, Phil Booth, told us: “This letter shows that Google DeepMind must know it has to delete the 1.6 million patient medical records it should never have had in the first place. There were legitimate ways for DeepMind to develop the app they wanted to sell. Instead they broke the law, and then lied to the public about it.”
Despite the controversy around the data-sharing arrangement underpinning Streams, DeepMind and its partner NHS Trust for this deal, the Royal Free, pushed ahead with deploying the app in several London hospitals earlier this year, though only after inking a new data-sharing agreement late last year.
That second data-sharing arrangement is not currently being reviewed by the NDG, according to the spokeswoman.
At the time of writing the ICO could not be reached for comment.