Google on Tuesday announced new artificial intelligence tools aimed at allowing healthcare organizations to use the search giant’s software and servers to read, store and label X-rays, MRIs and other medical imaging.
Tools from Google’s cloud unit allow hospitals and medical companies to search through imaging metadata or develop software to quickly analyze images for diagnosis. Called the Medical Imaging Suite, these tools help healthcare professionals automatically annotate medical images and create machine learning models for research.
“With advances in medical imaging technology, there’s been an increase in the size and complexity of these images,” Alyssa Hsu Lynch, Google Cloud’s global lead for health technology strategy and solutions, said in an interview. “We know that AI can enable faster, more accurate diagnosis and therefore help improve productivity for healthcare workers.”
Given Google’s past efforts in healthcare, privacy advocates may worry that the tech giant, which generates the majority of its $257 billion in annual revenue from personalized advertising based on user data, will use patient information to feed its vast advertising machine.
Lynch said Google has no access to patients’ protected health information, and none of the data from the service is used for the company’s advertising efforts. Google claims the service complies with the Health Insurance Portability and Accountability Act, or HIPAA, a federal law governing the use of patient data.
The tech giant is working with some medical institutions as initial partners for the imaging software. One partner, a company called Hologic, is using the suite for cloud storage and is developing technology to help improve cervical cancer diagnosis. Another partner, Hackensack Meridian Health, a network of healthcare providers in New Jersey, is using the tools to scrub identifying information from millions of gigabytes of X-rays. The network also uses the software to help build an algorithm that predicts the metastasis of prostate cancer.
The new tools come as Google and its parent company Alphabet invest heavily in health-related initiatives. In the early days of the pandemic, Alphabet’s Verily unit, which focuses on life sciences and medical technology, partnered with the Trump administration to offer online screening for Covid tests. Google has also partnered with Apple to create a system for contact tracing on smartphones. Last year the company disbanded its Google Health unit, restructuring its health efforts so they weren’t housed under one central division.
Google has courted controversy in the past for its healthcare efforts. In 2019, Google came under fire for an initiative called Project Nightingale, in which the company partnered with Ascension, the nation’s second-largest healthcare system, to collect the personal health information of millions of people. The data included lab results, diagnoses and hospital admission records, including names and birthdays, according to The Wall Street Journal, but Google said at the time that the project was subject to federal law. Google was using some of the data to build new software.
Two years ago, the tech giant partnered with the National Institutes of Health to publicly post more than 100,000 human chest X-ray images. The goal was to demonstrate the company’s cloud storage capabilities and make the data available to researchers. But two days before the images were posted, the NIH told Google that its software had not properly removed data from the X-rays that could identify patients, The Washington Post reported, which could have violated federal law. In response, Google canceled its project with the NIH.
When asked about Google’s past fumbles with removing identifying data, Sameer Sethi, SVP and chief data and analytics officer at Hackensack Meridian Health, said the organization has safeguards in place to prevent such risks.
“You don’t really trust the tool,” he said. He added that Hackensack Meridian Health works with a third-party company to verify that the images contain no identifying information even after Google’s tools have been applied. “We don’t use anything without expert judgment.”