How is Artificial Intelligence used in Healthcare?

Original article was published by Felipe Tambasco on Artificial Intelligence on Medium


Technology-led industries are taking advantage of the current ‘Data-Boom’ that is occurring at explosive speed. As human-computer interaction achieves an unprecedented level of connectivity, it is without question that developing technologies will begin to embed themselves into various facets of our communities.

This convergence is especially obvious in the healthcare industry. Hospitals and clinics alike depend on the advancement of future technologies. It is for this reason that it should be no surprise that Artificial Intelligence is gaining traction in the medical field.

This article will provide a high-level summary of artificial intelligence solutions in the medical field. It will discuss the benefits of using artificial intelligence and provide examples of how advances in deep learning are revolutionizing medical diagnostics and treatment.

Why Artificial Intelligence for Healthcare?

Improved Decision-Making Support

In general, conventional medical diagnostics require that doctors and specialists interpret the results of analyses run on patients. While medical technology has advanced to the point that screening can detect breast cancer in roughly 87% of affected patients, or provide doctors with real-time profiles of internal organs, the margin of error in detecting negative health patterns and abnormalities can still claim human lives. Medical examinations often produce large volumes of data that may contain vital information about a patient’s health. Artificial intelligence can surface insights from patients’ data that are not inherently obvious when evaluating diagnostic test results.

Improved Access to Healthcare

A.I-based applications have never been more accessible to the general public than in this age of smartphones. Many tech companies have already begun offering portable A.I models as a service to the healthcare industry. The handling of healthcare records — which can be quite expansive in data — is a process easily optimized using machine learning. The large quantity of data generated by the healthcare industry has created vast opportunities for A.I integration.

Computer Vision for Medical Imaging Diagnostics

Medical professionals use a variety of equipment (MRIs, CT scans, ultrasounds) in order to determine if there are indicators of disease not visible from the outside. Reading the images produced by these technologies can be hindered by reader variability and imperfect image rendering. With computer vision, these uncertainties can be reduced.

The Lymph Node Assistant by Google

A notable example of this research is Google’s Lymph Node Assistant (LYNA). It has been found that doctors spot secondary cells — cells which break away from tumors and spread to other parts of the body — with only about 81% accuracy. Identifying these cells early on can determine the type of treatment a cancer patient should receive.

Algorithms like the one used by LYNA are trained on a series of ground-truth images: each image is labeled as a cancerous formation or as regular healthy tissue. The Lymph Node Assistant examines pathology slides of lymph nodes pixel by pixel, producing for each pixel a number that indicates the likelihood of a tumor in that tissue. It was trained on slides taken from real-life breast cancer patients, and when tested on never-before-seen slides, it distinguished healthy lymph nodes from cancerous lymph nodes 99% of the time. The implications of this advancement are huge: LYNA was able to detect minute occurrences of cancerous cells in pathology slides that the human eye was unable to see.
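The pixel-by-pixel scoring described above can be illustrated with a minimal sketch. Everything here is a toy: the logistic weight and bias stand in for parameters a real model would learn from labeled ground-truth slides, and a production system like LYNA uses a deep convolutional network over image patches, not raw pixel intensities.

```python
import math

def tumor_likelihood_map(slide, weight=4.0, bias=-2.0):
    """Toy pixel-wise classifier: maps each pixel intensity (0.0-1.0)
    to a tumor likelihood via a logistic function. The weight and bias
    are illustrative stand-ins for learned model parameters."""
    return [[1.0 / (1.0 + math.exp(-(weight * px + bias))) for px in row]
            for row in slide]

# A 3x3 patch of normalized pixel intensities (hypothetical data).
patch = [[0.1, 0.2, 0.9],
         [0.1, 0.8, 0.9],
         [0.0, 0.1, 0.2]]

heatmap = tumor_likelihood_map(patch)

# Flag pixels whose tumor likelihood exceeds 0.5, as a reviewer aid.
suspicious = [(r, c) for r, row in enumerate(heatmap)
              for c, p in enumerate(row) if p > 0.5]
```

The output of a real system is exactly this kind of heatmap: a likelihood per pixel that a pathologist can overlay on the slide to focus their review.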

Left: a slide containing lymph nodes; black necrotic tissue is visible in the lower portion of the slide. Right: LYNA identifies a tumor’s development (red) and classifies surrounding regions as non-cancerous. Source: Google AI Blog, October 2018

LungNet by the National Institute of Biomedical Imaging and Bioengineering (NIBIB)

Similar to Google’s LYNA, LungNet was developed to help doctors interpret the images produced by radiological scans when searching for cancer cells in the lungs. LungNet was trained on images produced by CT scans; it was able to distinguish benign from malignant formations on the lungs and to classify patients as low, medium, or high risk. The ability to rate the severity of a patient’s condition streamlines decision-making about administering treatment. Research on this deep learning application is ongoing, but it could transform the treatment of lung cancer patients: the work has already shown that radiological images contain minute details missed even by trained professionals.

LungNet predictions are visualized in a 2-dimensional plane. Lesions are distinguished between High and Low risk. Source: Mukherjee, et al., Nature Machine Intelligence, May 2020

Detecting Melanoma Skin Cancer by Stanford University

Researchers at Stanford University recently developed a deep learning-based solution for classifying skin lesions as cancerous or non-cancerous. The neural network was trained on ~130,000 images of skin lesions and was able to determine whether a lesion was benign or malignant. Skin cancer is one of the most widespread cancers in the world today, and early detection of this deadly disease, before it spreads beneath the surface of the skin, drastically affects survivability. Lesions are typically examined by the naked eye before a more in-depth biopsy is performed. Introducing A.I-driven tools into this examination process can improve detection and allow doctors to allocate resources toward treatment sooner.

A notable difference between this computer vision application and those mentioned above is that the images analyzed and used for prediction were obtained through regular photography; no specialized radiological equipment was needed to image the lesion. Stanford computer scientists see this as a potential smartphone app, placing the power of self-examination in the palm of your hand.

Non-Image Based Machine Learning Applications in Medicine

Data Mining Patient Records to Predict Patient Risk by Massachusetts General Hospital

In a digitized age, patient information is stored in electronic health records (EHRs), which contain a plethora of data points. Researchers at Massachusetts General have taken advantage of machine learning and their patients’ data to develop a process for determining a patient’s risk of disease. A patient’s EHR is updated over time with the medications they are prescribed, the therapies they undergo, and their diagnostic history. Algorithms that detect sequences of patient events have allowed the staff at MGH to rethink how they approach emerging health issues in a patient. Patient health records are often cluttered by administrative entries, adding complexity to a patient database; machine learning algorithms that pinpoint predictive features of a patient’s health allow for a more streamlined discussion between patient and physician.
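The core idea of detecting sequences of patient events can be sketched in a few lines. This is a deliberately simple illustration, not MGH’s method: the event names and the risky pattern are hypothetical, and a real system would learn predictive sequences from data rather than hard-code them.

```python
def sequence_risk_flag(events, risky_pattern):
    """Return True if risky_pattern occurs as an ordered (not necessarily
    contiguous) subsequence of the patient's event history."""
    it = iter(events)
    # Each membership test consumes the iterator up to the match,
    # so the pattern steps must appear in order.
    return all(step in it for step in risky_pattern)

# Hypothetical EHR event history for one patient, in chronological order.
history = ["metformin_prescribed", "a1c_elevated",
           "insulin_started", "er_visit_hypoglycemia"]

# Hypothetical pattern a clinician might want flagged.
pattern = ["a1c_elevated", "insulin_started"]

flagged = sequence_risk_flag(history, pattern)
```

Because the check respects ordering, the same two events occurring in reverse order would not raise the flag, which is what makes sequence detection more informative than simply checking which codes appear in a record.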

Detecting Heart Disease through Biosensors by Independent Researchers

Researchers had accessible healthcare in mind with this collaborative study involving MyoKardia Inc, Wavelet Health, and Oregon Health Sciences University. The proof-of-concept study combines wearable biotechnology with machine learning analysis. Photoplethysmography (PPG), a technique in which blood volume changes are measured from the skin’s surface, is built into many fitness products on the market today.

In the study, test subjects were provided with wearable biosensors, and the data picked up by the sensors was passed to a machine learning classifier, which determined whether an individual was at risk of Hypertrophic Cardiomyopathy (HCM). This condition affects the cardiac muscle and can lead to complications later in life, including stroke, heart failure, and cardiac arrest. The classifier correctly identified 94% of individuals with HCM and 98% of healthy individuals.

Looking Ahead: Artificial Intelligence in Healthcare

The adoption of A.I in the healthcare industry is encouraged by the expansion of cloud computing services and by advances in computer vision applications. Collaboration between industry professionals, data scientists, and patients themselves is an integral part of this new frontier of medical practice. According to the “Artificial Intelligence in Medicine Market — Growth, Trends, and Forecast (2019–2024)” report, A.I in the medicine industry is expected to reach a value of USD 17.02 billion by 2024. Artificial Intelligence is not only being used in medical diagnostics, as in the examples presented in this article, but is permeating all facets of healthcare services, including clinical trial testing and personal healthcare assistants.