
It was Isaac Asimov meets Marcus Welby.

As 45 medical researchers, academics and VC executives in town this week for the Digital Health Summer Summit moved slowly through the UCSF Medical Center in San Francisco, the resident trailblazers of digital medicine unveiled science fiction without the fiction.

An iPad app that combines big data with 3-D MRIs of the brains of MS patients. A Fitbit-like device with photosensors that track the pulse of blood. A virtual-reality headset that lets patients fly inside their own brains in a real-time magical mystery tour.

“We have people here on the front lines doing tremendously innovative things,” Dr. Aenor Sawyer, one of the center’s leaders in digital health innovation, told the group as they prepared for their demo-filled walk-through this week. “We hope to be part of the wave disrupting health care.”

Disruption was in full swing at every turn. As Sawyer pointed out, “Our team includes researchers, data scientists, physicians and informaticists,” dropping another term in a place that claims to have coined “hospitalist” for the uber-physician coordinating care and “netwalking,” which was precisely what the visitors were now doing.

The medical center is clearly much more than a meetup of doctors, nurses and sick people. UCSF has carved out an international reputation as the gold standard for the mash-up of digital technology, cloud-supported big data and cutting-edge medical research. And all of it’s being done on a brand-spanking-new campus where the paint on the walls still seems wet.

First stop, the Human Performance Center. Sports-medicine pioneer Dr. Anthony Luke introduced his guests to his brain trust and talked about their ongoing work in injury prevention, along with experiments to test and improve the wearable fitness monitors hitting the market these days with a firehose blast.

“We focus here on function versus disease,” Luke said. “The idea is that instead of looking just at injuries, we need to look at wellness and prevention and at an individual’s physiology.”

One by one, researchers around the room talked about their projects and how their work with partners like Samsung is stepping up the performance of digital tools that help athletes run smarter, breathe better and avoid career-crushing injuries.

One team uses state-of-the-art EKG machines along with stationary exercise equipment to study how much oxygen athletes are consuming and how much carbon dioxide they’re producing. At the same time, this new technology can pull in a mountain of metabolic data from the cloud, monitoring in real time a patient’s activity, right down to the millisecond.

Another group has installed Hollywood-standard, infrared motion-capture cameras, which combine with strike plates under the floor to give researchers a dynamic 3-D skeletal view of an athlete in action. The technology is so precise that the image zooms right down to what biomechanical researcher Aaron Sparks calls “a calculation of internal joint torques” that can show and measure a foot touching the ground from “heel strike to toe off.”

A big part of this team’s effort is to test and validate some of the wearable-tech tools that have become de rigueur for sports enthusiasts, even though some of the readings produced by these commercial products may be wildly off the mark.

“There are lots of digital-health products on the market, but the question not yet answered is ‘Do they work on humans as intended?’ ” Luke said. “Caregivers and insurers have a difficult time knowing how well these products really work and what they’re really worth.”

That sort of academic skepticism is particularly striking as shares of Fitbit soared as much as 60 percent on their first day of trading Thursday, giving the company a value of at least $6 billion.

Neil Sehgal, director of the center’s validation programs, said that another team’s work with Samsung on a sensor-loaded wrist device could one day lead to an entirely new chapter in fitness-monitoring technology.

“This has never been possible before,” he said of the device that will accommodate a growing number of as-yet-undesigned sensors that could measure not just a wearer’s physical activity but also psychological conditions like stress. “It’s better for consumers,” he said, “if they know these devices are doing what they say they’re doing and giving a valid measurement.”

Sehgal said that in the future, “We can take these devices to the inner city and to developing countries around the world, where, by using big data and algorithms, we can help people live better lives. I want to make the world better, and this technology will help people make better decisions about their health.”

After leaving the Human Performance Center, and after watching a demonstration of another team’s Bioscreen app for the iPad, which will help physicians treating MS patients make complex treatment decisions, the group moved to the most futuristic lab of all: the Neuroscape Lab, created by cognitive neuroscientist Dr. Adam Gazzaley and his team as a testing ground for studying the neural mechanisms of memory, attention and perception.

Buck Rogers would be green with envy to see Gazzaley’s machines in action. Inside the darkened room, two large screens offer patients a chance to play interactive video games designed to strengthen the brain’s circuitry, Gazzaley told the visitors. Sensors and other devices track the user’s eye, reach and other body movements as the patient moves about in front of the screen, essentially watching their own brain in action inside the game on the monitor.

It’s demo time, and Gazzaley calls up a 3-D image of a patient’s MRI-scanned brain. Suddenly the image becomes transparent, the neural pathways inside coming to life in a fireworks display of shooting colors.

With one of the electrode-laden headsets strapped on, he says, “In real time you can navigate your own brain’s activity inside what we call ‘the glass brain.’ No one has ever been able to look at data in real time this way, so we don’t really know what’s coming next.”

Contact Patrick May at 408-920-5689; follow him at Twitter.com/patmaymerc.