With Latest Tech, Can Doctors Treat Us Before We Fall Sick?
Even now I can see Dr. Hingorani telling my mother that her boy has chronic hypertension, and feel the embarrassment of seeing tears well up in her eyes. More than 16 years have passed since then, but I can’t help feeling oddly satisfied with that episode. Getting diagnosed early has helped me manage my condition well and do the things I really want to do — even trek up to Everest Base Camp. Most people are not this lucky. We visit a doctor when something feels wrong. We routinely receive care too late; millions die because of delayed diagnoses. The future of healthcare delivery flips this equation.
Can we be treated before we fall sick?
Symptoms are the body’s beepers, letting us know that it’s not happy. But before our skin shows a rash or we feel dizzy, there are signs brewing under the surface. Awareness of the microbes that make up roughly 90% of the cells in our body can alert us before our abdomen hurts. Larry Smarr (read The Patient of the Future) identified the presence of Crohn’s disease from his microbiome data while doctors missed the diagnosis. Decoding messages from our genes can reveal camouflaged diseases. At age 4, Nic Volker became the first human whose life was saved by DNA sequencing.
Imaging studies, physical activity, sleep and other indicators reveal more and more clues. To translate this varied data into a meaningful diagnosis, we need a mechanism that tracks, stores, and continuously monitors the body’s information. Today’s electronic health records (EHRs) are systems for storing medical data at a gross level — vital signs, medications, lab results and so on. They can be extended to help doctors manage a patient’s entire condition by connecting the dots and deciphering what the body is trying to tell us.
When we think about medicine evolving into a data science, three shifts in healthcare become apparent. First, monitoring ailments becomes longitudinal rather than based on staggered episodes of care. Second, a doctor may not need to physically examine a patient to comprehend her condition. Third, comparisons among patients can scale across time and geography. These shifts require us to rethink how we access and deliver care.
When we think of EHRs as hubs interconnecting patients and doctors, we can visualize a system where medical information actively flows back and forth. Aided by artificial intelligence tools such as IBM’s Watson, a health record can synthesize data into a contextual guide at the point of care. Viewing a patient’s record, a physician will then be able to ask targeted questions — for example, how did we treat young female patients from southern India with a cystic fibrosis gene mutation and a family history of pancreatitis during the last 10 years?
But it is not going to be easy to visualize complex EHR data without newer interfaces. With heads-up displays such as Google Glass, physicians can simplify what they review during patient encounters. Such wearable computers overlay physical reality with augmented video, text or other data to provide a distinctly interactive experience. To examine a patient’s heart, the doctor sees imaging of blood flow superimposed on the body. She zooms in, asking for data from a recent lipid profile. Zooming out, she views a wall onto which historical graphs of cholesterol and body mass index are projected. She may even meet her patients virtually.
I worry whenever I visit doctors. Knowing medicine’s transitional state, I’ll never know whether my doctors know what they don’t know. I wonder if I would eventually be comfortable being examined by medical cyborgs. I don’t know that yet. But whatever the evolution, the same few suggestions continue to do the rounds: eat right, exercise more, shed stress and allow the body to do its thing.
Praveen Suthrum is an alumnus of Singularity University and co-founder of NextServices.