
Impact of Artificial Intelligence in the field of Health Care

The time is right for significant changes in the healthcare sector. From chronic disease and cancer to radiology and risk assessment, there are countless opportunities to use technology to deliver more precise, effective, and impactful interventions at exactly the right moment in a patient’s care.

Artificial intelligence (AI) is positioned to be the engine that drives changes across the care continuum as payment mechanisms change, consumers expect more from their providers, and the volume of available data continues to expand at a startling rate.

AI offers a number of advantages over conventional analytics and clinical decision-making techniques. Learning algorithms can become more precise and accurate as they interact with training data, giving clinicians previously unattainable insights into diagnostics, care processes, treatment variability, and patient outcomes.

At the 2018 World Medical Innovation Forum (WMIF) on artificial intelligence, organised by Partners Healthcare, leading researchers and clinical faculty members presented the twelve healthcare technologies and sectors most likely to see a significant impact from artificial intelligence within the next ten years.

The top 12 ways artificial intelligence will revolutionise healthcare delivery and science were compiled by moderators Keith Dreyer, DO, PhD, Chief Data Science Officer at Partners, and Katherine Andriole, PhD, Director of Research Strategy and Operations at Massachusetts General Hospital (MGH), with assistance from experts from across the Partners Healthcare System, including faculty from Harvard Medical School (HMS).


1. UNIFYING MIND AND MACHINE THROUGH BRAIN-COMPUTER INTERFACES

Computer communication is by no means a novel concept, but cutting-edge research is being done to directly connect technology and the human mind without using keyboards, mice, or monitors. This work has important implications for some patients.

Neurological conditions and trauma to the nervous system can take away some patients’ ability to speak, move, and interact meaningfully with the people and environments around them. Artificial intelligence-powered brain-computer interfaces (BCIs) could restore those fundamental experiences to people who feared they were lost forever.

Leigh Hochberg, MD, PhD, Director of the Center for Neurotechnology and Neurorecovery at MGH, said, “If I’m in the neurology ICU on a Monday and I see someone who has suddenly lost the ability to move or to speak, we want to restore that ability to communicate by Tuesday.”

“By utilising a BCI and artificial intelligence, we can decode the neural activity associated with the intended movement of one’s hand, and we should be able to allow that person to communicate the same way many people in this room have communicated at least five times over the course of the morning, using a ubiquitous communication technology like a tablet computer or phone,” he said.

Brain-computer interfaces could significantly enhance the quality of life for those suffering from ALS, strokes, locked-in syndrome, and the 500,000 people worldwide who suffer spinal cord injuries each year.
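As a purely illustrative sketch of the decoding step Hochberg describes, the snippet below trains a linear classifier to map synthetic neural feature vectors to intended movement directions. The channel count, labels, and data are hypothetical stand-ins rather than a real BCI pipeline.

```python
# Minimal sketch: decoding intended movement direction from neural features.
# All data here is synthetic; a real BCI would use spike counts or band power
# recorded from an implanted or surface electrode array.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_trials, n_channels = 400, 96                 # hypothetical 96-channel array
intent = rng.integers(0, 4, n_trials)          # 0-3: up, down, left, right

# Synthetic "neural activity": each intended direction shifts the mean pattern.
templates = rng.normal(size=(4, n_channels))
X = templates[intent] + rng.normal(scale=2.0, size=(n_trials, n_channels))

X_train, X_test, y_train, y_test = train_test_split(X, intent, random_state=0)
decoder = LinearDiscriminantAnalysis().fit(X_train, y_train)
print(f"decoded-intention accuracy: {decoder.score(X_test, y_test):.2f}")
```

In a working system, the decoded intention would drive a cursor, speller, or robotic effector rather than a print statement.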


2. DEVELOPING THE NEXT GENERATION OF RADIOLOGY TOOLS

The human body’s inner workings can be seen in the non-invasive radiology images produced by MRI machines, CT scanners, and x-ray devices. However, many diagnostic processes still rely on physical tissue samples obtained through biopsies, which carry risks including the potential for infection.

According to experts, artificial intelligence will enable the next generation of radiology tools to be accurate and detailed enough to replace the need for tissue samples in some cases.

According to Alexandra Golby, MD, Director of Image-Guided Neurosurgery at Brigham & Women’s Hospital, “We want to bring together the diagnostic imaging team with the surgeon or interventional radiologist and the pathologist. Aligning goals and bringing together disparate teams is a major problem.”

“If we want the imaging to give us information that we currently get from tissue samples, we will need to achieve very close registration so that the ground truth for any given pixel is known.”

If this effort is successful, practitioners may be able to make treatment decisions based on the behaviour of tumours as a whole rather than just a limited subset of their characteristics.

Additionally, healthcare providers might be better equipped to categorise tumours’ levels of aggression and better target their therapies.

By enabling “virtual biopsies”, artificial intelligence is helping to advance the innovative field of radiomics, which focuses on harnessing image-based algorithms to characterise the phenotypes and genetic properties of tumours.
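To make the “virtual biopsy” idea more concrete, here is a minimal sketch that computes a few first-order radiomic features (simple intensity statistics) from a hypothetical segmented tumour region. Real radiomics pipelines use far richer shape and texture descriptors, and both the image and the mask below are synthetic placeholders.

```python
# Minimal sketch: first-order radiomic features from a segmented tumour region.
# The image slice and segmentation mask are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(1)
image = rng.normal(loc=100, scale=20, size=(128, 128))  # stand-in CT/MRI slice
mask = np.zeros((128, 128), dtype=bool)
mask[40:80, 50:90] = True                               # stand-in tumour mask

roi = image[mask]
counts, _ = np.histogram(roi, bins=32)
p = counts / counts.sum()

features = {
    "mean_intensity": float(roi.mean()),
    "intensity_std": float(roi.std()),
    "skewness": float(((roi - roi.mean()) ** 3).mean() / roi.std() ** 3),
    "histogram_entropy": float(-(p[p > 0] * np.log2(p[p > 0])).sum()),
}
print(features)  # downstream models would relate such features to tumour phenotype
```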


3. EXPANDING ACCESS TO CARE IN UNDERSERVED OR DEVELOPING REGIONS

In underdeveloped countries all over the world, a lack of qualified medical professionals, such as radiologists and ultrasound technologists, can severely restrict access to life-saving care.

By taking over some of the diagnostic tasks traditionally assigned to humans, artificial intelligence may be able to lessen the effects of this significant shortage of skilled clinical professionals.

AI imaging tools, for instance, can often screen chest x-rays for signs of tuberculosis with a level of accuracy comparable to that of human readers. If this capability could be deployed through an app made available to providers in low-resource regions, it could reduce the need for a diagnostic radiologist on-site.
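As a hedged illustration of the kind of screening tool described here, the sketch below scores a chest x-ray for tuberculosis likelihood with a small convolutional network. The weights file, image path, and two-class setup are hypothetical; no specific product or dataset is named in the article.

```python
# Minimal sketch: scoring a chest x-ray for TB likelihood with a CNN.
# The weights file and image path are hypothetical; a deployable screening app
# would use a model trained and validated on the local population.
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

preprocess = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),   # x-rays are single channel
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

model = models.resnet18(weights=None)
model.fc = nn.Linear(model.fc.in_features, 2)      # classes: [normal, tb-suspect]
model.load_state_dict(torch.load("tb_screen_weights.pt"))  # hypothetical weights
model.eval()

image = preprocess(Image.open("chest_xray.png")).unsqueeze(0)  # hypothetical file
with torch.no_grad():
    prob_tb = torch.softmax(model(image), dim=1)[0, 1].item()
print(f"TB-suspect probability: {prob_tb:.2f} -> refer for confirmatory testing")
```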

According to Jayashree Kalpathy-Cramer, PhD, Assistant in Neuroscience at MGH and Associate Professor of Radiology at HMS, “the potential for this technology to enhance access to healthcare is immense.”

Developers of algorithms must be careful to take into account the possibility that various ethnic groups or residents of various places may have distinctive physiologies and environmental factors that will affect how diseases exhibit themselves.

“In India compared to the US, for instance, the progression of a disease and the people it affects may be significantly different,” she said.

“We can’t just design an algorithm based on a particular population and expect it to operate as well on other populations, so it’s crucial to make sure the data covers a diversity of illness presentations and demographics as we develop these algorithms.”


4. REDUCING THE BURDEN OF ELECTRONIC HEALTH RECORD USE

EHRs have played an instrumental role in the healthcare industry’s journey toward digitisation, but the transition has brought with it a host of problems associated with cognitive overload, endless documentation, and user burnout.

EHR developers are now using artificial intelligence to build more user-friendly interfaces and automate some of the repetitive tasks that take up so much of a user’s time.

According to Adam Landman, MD, Vice President and CIO of Brigham Health, users spend the majority of their time on three tasks: clinical documentation, order entry, and sorting through the in-basket.

The process of clinical documentation is being made better by voice recognition and dictation, but natural language processing (NLP) techniques might not go far enough.

“I think we may need to be even more daring and contemplate innovations like videotaping a clinical session, almost like cops wear body cams,” Landman said. “Then, for later information retrieval, you can utilise AI and machine learning to index those videos. The future will bring virtual assistants with embedded intelligence to the bedside for clinicians to use, just like the Siri and Alexa we use in the home.”

Routine inbox requests, such as prescription refills and result notifications, may also be processed with help from artificial intelligence. Landman added that it could be helpful to prioritise the tasks that genuinely require the clinician’s attention, making it simpler for users to work through their to-do lists.
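A minimal sketch of the inbox triage idea Landman describes: a simple text classifier sorts incoming in-basket messages into those that need clinician review and those that can be routed or automated. The messages and labels below are invented for illustration.

```python
# Minimal sketch: triaging EHR in-basket messages by whether they need a
# clinician's immediate attention. Messages and labels are invented examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

messages = [
    "Patient requests refill of lisinopril 10mg",
    "Routine lab results within normal limits",
    "Patient reports chest pain and shortness of breath since last night",
    "Prior authorization form needed for MRI",
    "Critical potassium value flagged by lab",
    "Appointment reschedule request for next week",
]
needs_clinician = [0, 0, 1, 0, 1, 0]   # 1 = requires clinician review now

triage = make_pipeline(TfidfVectorizer(), LogisticRegression())
triage.fit(messages, needs_clinician)

new_message = ["Patient asking about refill of metformin"]
label = triage.predict(new_message)[0]
print("route to clinician" if label else "route to automated refill queue")
```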


5. CONTAINING THE RISKS OF ANTIBIOTIC RESISTANCE

Antibiotic resistance is a growing threat to populations around the world as overuse of these critical medications fosters the evolution of superbugs that no longer respond to treatment. Multi-drug resistant organisms can wreak havoc in hospitals and claim thousands of lives each year.

Data from electronic health records can be used to spot infection patterns and flag individuals who are at risk even before they start exhibiting symptoms. Leveraging machine learning and AI tools to drive these analytics can enhance their accuracy and create faster, more accurate alerts for healthcare providers.
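As a hedged sketch of the kind of EHR-driven surveillance described above, the snippet below fits a gradient-boosted classifier to synthetic patient features and flags high-risk individuals for infection-control review. The features, outcome, and alert threshold are all placeholders, not a validated model.

```python
# Minimal sketch: flagging patients at risk of a resistant infection from
# EHR-style features. All data is synthetic and the features are illustrative.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(2)
n = 1000
ehr = pd.DataFrame({
    "days_in_hospital": rng.integers(0, 30, n),
    "recent_antibiotic_courses": rng.integers(0, 5, n),
    "indwelling_catheter": rng.integers(0, 2, n),
    "icu_stay": rng.integers(0, 2, n),
})
# Synthetic outcome loosely tied to the risk factors above.
risk = (0.05 * ehr["days_in_hospital"] + 0.4 * ehr["recent_antibiotic_courses"]
        + 0.8 * ehr["indwelling_catheter"] + 0.6 * ehr["icu_stay"])
infected = (risk + rng.normal(scale=1.0, size=n)) > 2.5

model = GradientBoostingClassifier().fit(ehr, infected)
ehr["risk_score"] = model.predict_proba(ehr)[:, 1]
flagged = ehr[ehr["risk_score"] > 0.8]        # hypothetical alert threshold
print(f"{len(flagged)} patients flagged for infection-control review")
```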

According to Erica Shenoy, MD, PhD, Associate Chief of the Infection Control Unit at MGH, “AI tools can live up to the expectation for infection control and antibiotic resistance.”

“If they don’t, it really is a failure on all of our parts. It would be a failure for hospitals to sit on mountains of EHR data and not use them to their fullest potential, for industry to not develop smarter, faster clinical trial designs, and for EHRs to house all of this data and not make full use of it.”


6. CREATING MORE PRECISE ANALYTICS FOR PATHOLOGY IMAGES

According to Jeffrey Golden, MD, Chair of the Department of Pathology at BWH and Professor of Pathology at HMS, pathologists are one of the most important sources of diagnostic information for physicians across the spectrum of care delivery.

According to him, 70 per cent of all healthcare decisions are informed by a pathology result, and somewhere between 70 and 75 per cent of all the data in an EHR comes from a pathology result. The more accurate those results are, and the sooner the correct diagnosis is reached, the better off patients will be, and that is exactly what digital pathology and AI have the opportunity to deliver.

Analytics that can drill down to the pixel level on extremely large digital images allow providers to identify subtle differences that might escape the human eye.

According to Golden, “We’re now at a stage where we can better assess whether a cancer is likely to advance swiftly or slowly and how that would impact how patients will be treated based on an algorithm rather than clinical staging or the histopathologic grade.” That represents a significant advancement.

He added that artificial intelligence can also improve productivity by identifying features of interest in slides before a human clinician reviews the material.

“AI can screen through the slides and point us to the right things to look at, so we can assess what’s important and what’s not,” he said. “That makes the use of the pathologist more efficient and increases the value of the time they spend on each case.”
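A minimal sketch of the slide-screening workflow Golden describes: a digitised slide is split into tiles, each tile is scored, and the highest-scoring regions are surfaced for the pathologist to review first. The slide array and the scoring function below are stand-ins for a scanned image and a trained model.

```python
# Minimal sketch: tiling a digital pathology slide and ranking regions for
# pathologist review. The slide array and scoring function are placeholders.
import numpy as np

rng = np.random.default_rng(3)
slide = rng.random((4096, 4096))   # stand-in for a scanned whole-slide image
tile = 512

def tile_score(patch: np.ndarray) -> float:
    """Placeholder for a trained model's 'suspicious tissue' probability."""
    return float(patch.mean())

scores = []
for y in range(0, slide.shape[0], tile):
    for x in range(0, slide.shape[1], tile):
        scores.append(((y, x), tile_score(slide[y:y + tile, x:x + tile])))

# Surface the top-scoring tiles so the pathologist looks there first.
for (y, x), score in sorted(scores, key=lambda s: s[1], reverse=True)[:5]:
    print(f"review tile at ({y}, {x}) first, score={score:.3f}")
```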


7. BRINGING INTELLIGENCE TO MEDICAL DEVICES AND MACHINES

The consumer world is being overtaken by smart technologies, which offer everything from real-time video from inside a refrigerator to cars that can recognise when a driver is distracted.

Smart devices are also critical in the medical environment for monitoring patients in the ICU and elsewhere. Using artificial intelligence to enhance the ability to recognise deterioration, suggest that sepsis is taking hold, or sense the development of complications can significantly improve outcomes and may reduce costs related to hospital-acquired condition penalties.

According to Mark Michalski, MD, Executive Director of the MGH & BWH Center for Clinical Data Science, “When we’re talking about integrating disparate data from across the healthcare system, integrating it, and generating an alert that would alert an ICU doctor to intervene early on – the aggregation of that data is not something that a human can do very well.”

Embedding intelligent algorithms into these devices can reduce the cognitive burden on clinicians while ensuring that patients receive care as quickly as possible.
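As an illustration of the data aggregation Michalski describes, the sketch below combines a few synthetic vital-sign streams into a single rolling early-warning score and raises an alert when it crosses a threshold. Real deterioration and sepsis models are far more sophisticated and clinically validated; the streams and threshold here are invented.

```python
# Minimal sketch: combining ICU vital-sign streams into a rolling early-warning
# score. Streams and the alert threshold are synthetic.
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
minutes = pd.date_range("2024-01-01", periods=240, freq="min")
vitals = pd.DataFrame({
    "heart_rate": 80 + np.cumsum(rng.normal(0, 0.5, 240)),
    "resp_rate": 16 + np.cumsum(rng.normal(0, 0.1, 240)),
    "systolic_bp": 120 - np.cumsum(rng.normal(0, 0.3, 240)),
}, index=minutes)

# Score each stream's deviation from the patient's own recent baseline,
# then average across streams into one warning score.
baseline = vitals.rolling("60min").mean()
spread = vitals.rolling("60min").std().replace(0, np.nan)
warning_score = ((vitals - baseline) / spread).abs().mean(axis=1)

alerts = warning_score[warning_score > 2.0]    # hypothetical alert threshold
if not alerts.empty:
    print(f"early-warning alert for review at {alerts.index[0]}")
```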


8. ADVANCING THE USE OF IMMUNOTHERAPY FOR CANCER TREATMENT

One of the most promising methods for treating cancer is immunotherapy. Patients may be able to defeat tenacious tumours by employing the body’s immune system to combat malignancies. Oncologists still lack a precise and accurate mechanism for determining which patients may benefit from this approach because only a tiny percentage of patients respond to current immunotherapy methods.

Machine learning algorithms and their ability to synthesise highly complex datasets may be able to illuminate new options for targeting therapies to an individual’s unique genetic make-up.

According to Long Le, MD, PhD, Director of Computational Pathology and Technology Development at the MGH Center for Integrated Diagnostics, “Recently, the most intriguing development has been checkpoint inhibitors, which block some of the proteins made by certain types of immune cells. But we still don’t understand all of the biology of every disease. This is a highly challenging issue.”

“We definitely need more patient data. These therapies are relatively new, so not many patients have actually received them. So the question of whether we need to pool data across institutions will be a key consideration in growing the patient population that feeds the modelling process.”


9. TURNING THE ELECTRONIC HEALTH RECORD INTO A RELIABLE RISK PREDICTOR

EHRs contain a plethora of patient data, but it has been difficult for developers and physicians to extract and analyse that data in a precise, timely, and reliable way.

Understanding precisely how to participate in effective risk stratification, predictive analytics, and clinical decision support has proven to be exceedingly challenging due to data quality and integrity difficulties, a jumble of data formats, structured and unstructured inputs, and incomplete records.

Ziad Obermeyer, MD, Assistant Professor of Emergency Medicine at BWH and Assistant Professor at HMS, said that bringing the data together in one place is only part of the hard work. Understanding exactly what you are getting when you predict a disease from an EHR is another challenge entirely.

“You might hear that an algorithm can predict depression or stroke, but when you dig deeper, you discover that what the algorithm is actually forecasting is a billing code for stroke, which is very different from the stroke itself. Relying on MRI results might seem to offer a more specific dataset,” he said.

“But now you have to consider who can afford an MRI and who cannot, so your final prediction differs from what you originally intended. Instead of capturing everyone who experienced cerebral ischemia, you end up predicting strokes only in the patients who could afford to have the diagnosis made.”

EHR analytics has given rise to a number of effective risk scoring and stratification tools, particularly when researchers apply deep learning techniques to identify novel connections between seemingly unrelated datasets.

However, Obermeyer said, ensuring that those algorithms do not simply reinforce hidden biases in the data is essential for deploying tools that will genuinely improve clinical care.

The biggest challenge, he said, will be making sure we know exactly what we are predicting before we even start opening up the black box and examining how the prediction is made.
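To make Obermeyer’s point concrete, the sketch below builds a simple risk model from synthetic EHR features and makes explicit, in the label definition, that what the model actually learns to predict is the presence of a stroke billing code rather than the clinical event itself. Everything here, from the features to the access-to-care variable, is an invented illustration.

```python
# Minimal sketch: an EHR-based risk model whose target is, in practice, a
# billing code. All data is synthetic; the point is the label definition.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n = 2000
ehr = pd.DataFrame({
    "age": rng.integers(30, 90, n),
    "hypertension": rng.integers(0, 2, n),
    "atrial_fibrillation": rng.integers(0, 2, n),
    "prior_tia": rng.integers(0, 2, n),
})

# The label is "a stroke billing code was recorded", not "a stroke occurred".
# Patients without access to imaging or care may never receive the code,
# which is exactly the hidden bias Obermeyer warns about.
true_stroke = (0.02 * ehr["age"] + 2.0 * ehr["atrial_fibrillation"]
               + 1.5 * ehr["prior_tia"] + rng.normal(0, 1, n)) > 4
has_access_to_care = rng.random(n) < 0.7
stroke_billing_code = true_stroke & has_access_to_care

X_tr, X_te, y_tr, y_te = train_test_split(ehr, stroke_billing_code, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print(f"accuracy on the billing-code label, not on strokes: {model.score(X_te, y_te):.2f}")
```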


10. MONITORING HEALTH THROUGH WEARABLES AND PERSONAL DEVICES

Nowadays, nearly all consumers have access to devices with sensors that can collect valuable data about their health. From smartphones with step trackers to wearables that can track a heartbeat around the clock, a growing proportion of health-related data is generated on the go.

This data can be gathered and analysed to provide new insights into both individual and population health, especially when combined with patient-provided data from apps and other home monitoring devices.

Finding useful insights from this vast and diversified treasure trove of data will mostly depend on artificial intelligence. 
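As a small illustration of how continuous wearable data might be mined for insight, the sketch below watches a synthetic resting heart-rate stream for sustained elevation relative to a personal baseline. The data, margin, and duration rule are all invented for illustration.

```python
# Minimal sketch: flagging sustained resting heart-rate elevation from a
# wearable data stream. The stream and thresholds are synthetic.
import numpy as np
import pandas as pd

rng = np.random.default_rng(6)
days = pd.date_range("2024-01-01", periods=90, freq="D")
resting_hr = pd.Series(60 + rng.normal(0, 2, 90), index=days)
resting_hr.iloc[70:] += 8            # simulate a sustained late elevation

baseline = resting_hr.rolling(30, min_periods=14).median().shift(1)
elevated = resting_hr > baseline + 5          # hypothetical "worth a look" margin
sustained = elevated.rolling(5).sum() >= 5    # elevated five days in a row

if sustained.any():
    print(f"sustained elevation starting around {sustained.idxmax().date()}")
```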

However, according to Omar Arnaout, MD, co-director of the Computation Neuroscience Outcomes Center and an attending neurosurgeon at BWH, getting patients to feel comfortable sharing data from this personal, ongoing monitoring may call for a little more effort.

“As a culture, we have been quite free with our digital data,” he said. “But as episodes like Cambridge Analytica and Facebook become more widely known, people will become more careful about which types of data they share, and with whom.”

The fact that people are more likely to trust their doctors than they are a giant corporation like Facebook, he continued, may assist in allaying any concerns about providing data to extensive research projects.

“There is a very good chance that wearable data will have a significant impact, because the care we deliver is episodic and the data we collect is coarse,” Arnaout said. “If granular data is collected continuously, there is a much greater chance it will help us take better care of patients.”


11. MAKING SMARTPHONE SELFIES INTO POWERFUL DIAGNOSTIC TOOLS

Continuing the theme of harnessing mobile technology, experts predict that images taken on smartphones and other consumer-grade devices will become an important supplement to clinical-quality imaging, especially in underserved areas or developing countries.

Every year, cell phone cameras improve, and they can now take pictures that are good enough for algorithms using artificial intelligence to analyse. Early beneficiaries of this movement include dermatology and ophthalmology.

Researchers in the UK have even developed a tool that analyses photographs of a child’s face to identify developmental disorders. The software can detect discrete features such as the child’s jawline, the placement of the eyes and nose, and other attributes that could indicate a craniofacial abnormality. By matching ordinary photographs against more than 90 disorders, the tool can provide clinical decision support.
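The matching step described above can be sketched, very loosely, as nearest-neighbour search over facial-measurement vectors: a new photograph is reduced to a feature vector and compared against reference profiles for known conditions. The feature layout, reference data, and condition names below are entirely hypothetical and are not the UK researchers’ actual method.

```python
# Minimal, loosely analogous sketch: matching a facial-measurement vector
# against reference profiles for known conditions. All values are hypothetical.
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(7)
conditions = [f"condition_{i}" for i in range(90)]   # stands in for 90+ disorders
# Each reference profile: e.g. normalised distances between facial landmarks.
reference_profiles = rng.normal(size=(90, 12))

matcher = NearestNeighbors(n_neighbors=3).fit(reference_profiles)

new_face_features = rng.normal(size=(1, 12))         # from a hypothetical photo
distances, indices = matcher.kneighbors(new_face_features)
for rank, (d, i) in enumerate(zip(distances[0], indices[0]), start=1):
    print(f"{rank}. {conditions[i]} (distance {d:.2f}) - decision support only")
```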

Hadi Shafiee, PhD, Director of the Laboratory of Micro/Nanomedicine and Digital Health at BWH, remarked, “The majority of the population is equipped with pocket-sized, potent devices that have a lot of different sensors built-in.”

“This is a fantastic chance for us. Almost all of the industry’s big players have begun integrating AI hardware and software into their products. That is not an accident. We produce more than 2.5 million gigabytes of data each day in our digital environment. The cell phone makers think that combining that data with AI can offer much more individualised, quick, and intelligent services.”

Using smartphones to collect images of eyes, skin lesions, wounds, infections, medications, or other subjects may help underserved areas cope with a shortage of specialists while shortening the time to diagnosis for certain complaints.

“Something significant is occurring,” Shafiee said. “We can take advantage of that opportunity to address some of the major problems we face in disease management at the point of care.”


12. REVOLUTIONISING CLINICAL DECISION-MAKING WITH ARTIFICIAL INTELLIGENCE AT THE BEDSIDE

Reactive care is becoming less and less prevalent in the healthcare sector as it moves away from fee-for-service payment models. Every clinician wants to be ahead of chronic illnesses, expensive acute events, and unexpected deterioration, and reimbursement mechanisms finally enable them to create the procedures that will allow for proactive, predictive interventions.

By powering clinical decision support systems and predictive analytics, which alert doctors to issues long before they may otherwise realise the need for action, artificial intelligence will serve as a significant portion of the foundation for that evolution.

AI can provide earlier warnings for conditions like seizures or sepsis, which often demand intensive analysis of highly complex datasets.

According to Brandon Westover, MD, PhD, Director of the MGH Clinical Data Animation Center, machine learning can inform decisions on whether or not to continue caring for critically ill patients, such as those who have gone into a coma following a cardiac arrest.

Normally, he said, clinicians must visually inspect these patients’ EEG data. The process is time-consuming and subjective, and the results can vary with the skill and experience of the individual doctor.

“Trends in these patients may be evolving slowly,” he said. “Sometimes, when we are deciding whether someone is recovering, we look at the data from ten seconds of monitoring at a time. Trying to see whether it has changed compared with ten seconds of data taken 24 hours earlier is like trying to tell whether your hair is getting longer by staring at it.”

“But if you have an AI system and a tonne of data from many patients, it becomes much easier to match what you are seeing against long-term patterns and perhaps detect subtle improvements that would change your decisions about care.”
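A hedged sketch of the trend analysis Westover alludes to: rather than eyeballing ten-second snapshots, the code below summarises 48 hours of a synthetic EEG-derived value into hourly averages and fits a slope to ask whether the signal is slowly improving. The feature and its values are stand-ins for real quantitative EEG measures.

```python
# Minimal sketch: estimating a slow recovery trend from hours of EEG-derived
# values instead of isolated ten-second snapshots. Data is synthetic.
import numpy as np
import pandas as pd

rng = np.random.default_rng(8)
seconds = pd.date_range("2024-01-01", periods=48 * 3600, freq="s")
# Stand-in for a quantitative EEG feature (e.g. a background-continuity index)
# with a very slow improvement buried in noise.
values = 0.4 + np.linspace(0, 0.1, len(seconds)) + rng.normal(0, 0.05, len(seconds))
qeeg = pd.Series(values, index=seconds)

hourly = qeeg.resample("1h").mean()
slope, _ = np.polyfit(np.arange(len(hourly)), hourly.to_numpy(), 1)
print(f"estimated trend: {slope:+.4f} per hour "
      f"({'improving' if slope > 0 else 'not clearly improving'} over 48 hours)")
```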

One of the most interesting future directions for this ground-breaking data analysis method is using AI for clinical decision support, risk rating, and early alerting.

AI will usher in a new era of clinical excellence and exciting advancements in patient care by powering a new generation of tools and systems that make doctors more aware of nuances, more efficient when providing care, and more likely to get ahead of developing problems.

 

Article proofread & published by Gauri Malhotra.
