Imagine being able to see where your blood sugar currently stands and where it’s going, just by tilting your head and whispering a few words — instead of drawing blood or even fumbling with a transmitter picking up signals from a sensor stuck to your skin.
Using the futuristic Google Glass technology that’s generating a lot of buzz about how it could be used in the healthcare field, UCSD researchers are putting a diabetes spin on it in what they’re loosely calling ‘Glucose Glass.’
The new glasses would allow you to view any D-data from a meter to a continuous glucose monitor (CGM) right in the lenses, and all you’d have to say to see it would be “OK, Glass: How’s my diabetes?” Or “OK, Glucose Glass, show me my CGM data.”
And voilà! You'd be looking at whatever stream of D-data you asked for, completely hands-free.
Sounds beyond cool, huh?!
This concept was presented by researchers at the big Stanford MedicineX conference in late September. Leading the team is Dr. Nate Heintzman, who directs the Diabetes Informatics and Analytics Lab (DIAL) at the UCSD School of Medicine. While Nate's not a PWD himself, he's a longtime friend to the D-Community as co-founder of the Insulindependence non-profit, which provides recreation and fitness programs for those of us with diabetes. And he's passionate about diabetes innovation (he'll be participating in the DiabetesMine Innovation Summit this year!).
Others on Nate’s team include two students, Justin Tantiongloc and Subrai Pai; two faculty members, Dr. Todd Coleman from UCSD and Dr. Thad Starner from Georgia Institute of Technology; and practicing endo Dr. Steve Edelman from UCSD, who’s also well-known as founder of the Taking Care of Your Diabetes (TCOYD) organization.
We chatted with Nate recently about his team’s work on this fascinating Glucose Glass concept:
DM) Can you please describe Google Glass technology and what exactly you’re doing with it?
NH) Google Glass is a lightweight wearable computing device, featuring a heads-up display, camera, microphone, sound, memory, various sensors, WiFi and Bluetooth. Users interact with Glass via a built-in touchpad or with hands-free voice commands. The design philosophy of Glass, in my own words, is to provide the right information to the user, at the right time, and the right place, without getting in the way of what the user is looking at or listening to, and I find that to be the case in my own experiences as a regular user of the device.
We are working to integrate various kinds of diabetes data into one simplified visualization that people with diabetes can view on Google Glass, as a “Glucose Glass” app that displays “diabetes timecards” to the user, in chronological order. Diabetes timecards are high-resolution images, each of which includes CGM readings/trends, insulin on board, meal photos, and other physiologic/activity measures. Users may view their timecards on-demand or according to configurable notifications, and they may also share their timecards (via messaging or email) with family, friends, and caretakers. Glucose Glass is designed as a new resource for patients, and may also serve as a platform for research. The project is a collaboration involving scientists, engineers, clinicians, and patients at UCSD and Georgia Institute of Technology.
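To make the "diabetes timecard" idea a little more concrete, here's a minimal sketch of what one such record might look like in code. The team hasn't published an API, so every name and field below is a hypothetical illustration; the point is simply that a timecard bundles CGM readings, insulin on board, a meal photo, and activity into one snapshot that can be summarized at a glance on a heads-up display.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

@dataclass
class CGMReading:
    """One continuous glucose monitor sample (hypothetical schema)."""
    time: datetime
    mg_dl: int   # glucose value in mg/dL
    trend: str   # e.g. "rising", "steady", "falling"

@dataclass
class Timecard:
    """One 'diabetes timecard': a snapshot of recent diabetes data."""
    created: datetime
    cgm: List[CGMReading] = field(default_factory=list)
    insulin_on_board: float = 0.0     # units of active insulin
    meal_photo: Optional[str] = None  # path or URL to a meal photo
    activity_steps: int = 0

    def summary(self) -> str:
        """Compact one-line view, the kind of thing a heads-up display could show."""
        latest = self.cgm[-1] if self.cgm else None
        bg = f"{latest.mg_dl} mg/dL ({latest.trend})" if latest else "no CGM data"
        return f"{bg} | IOB {self.insulin_on_board:.1f} u | {self.activity_steps} steps"

# Example: a card with one CGM reading
card = Timecard(
    created=datetime(2013, 10, 1, 12, 30),
    cgm=[CGMReading(datetime(2013, 10, 1, 12, 28), 142, "steady")],
    insulin_on_board=1.5,
    activity_steps=4200,
)
print(card.summary())  # 142 mg/dL (steady) | IOB 1.5 u | 4200 steps
```

Sharing a timecard with family or a caretaker would then amount to sending this same snapshot over messaging or email, as Nate describes.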
How did you get involved in this D-data-streaming line of research?
Being a diabetes researcher is a dream come true – not many scientists have the privilege of doing research that is so personally meaningful to themselves and their friends. Our lab at UCSD strives to answer the basic question, “What caused that glycemia?” Or put more relevantly in the self-management context, “Why am I high or low, and what can I do to avoid this in the future?” Our team includes numerous patient-scientists whose perspectives directly inform the research we do. We employ diverse technologies to collect and analyze all kinds of relevant diabetes data, from medical devices to clinical lab values to genetics, toward gaining new insights into how blood glucose levels change in each individual as a result of behavior, physiology, environment, and more.
Launching a project using Google Glass was a logical extension of our other ongoing research efforts, and we're very fortunate to have early access to this technology as participants in the Google Glass Explorer program. Beyond the benefits that users with diabetes may experience, the Glucose Glass system is also, as noted, a new tool for research that can help us provide better decision support to individuals and hopefully even contribute to the development of closed-loop medical device technologies that can account for things like physical activity, stress, and other circumstances of day-to-day living with diabetes.
Sounds like a super-gadget, but is there a “mainstream” need for this?
Many of my friends with diabetes lament the difficulty of making decisions about their diabetes management based on multiple devices that live in different pockets, don't talk to each other, don't passively export data in useful formats, and so on. Further, there are times when it's inconvenient or nearly impossible to sneak a peek at one's CGM or insulin pump, like while hiking the Grand Canyon, snowboarding down a slope, or holding a child – people with diabetes know better than I do how cumbersome it can be to check and check and check their devices! In short, there are multiple times, any given day, when having hands-free access to one's diabetes data could be incredibly helpful. Taking that a step further, what if ALL of a person's relevant diabetes data could be displayed in one neat snapshot, whenever they'd like to see it? And what if that snapshot could be shared with family, friends, and caretakers to get some help in real-time? That's what we aim to enable with Glucose Glass.
How do you envision people using Glucose Glass on a daily basis?
When I first learned about Google Glass, I immediately thought that it might help address some of the needs that have been expressed in my diabetes social circles for years, namely the integrated display of data from different diabetes devices and other sources like meal photos from a camera, as well as the ability to get support from peers who can offer real-time feedback about things like carbs in a meal or insulin strategies during physical activity. Currently, this requires interaction with multiple devices, taking a photo with your phone’s camera, and more, depending on the circumstances.
The Glucose Glass system we’re developing enables all of this to happen without the user lifting the proverbial finger. Data are collected passively from medical devices and other sensors, and meal photos can be snapped using voice commands. This information is then used to build the “diabetes timecard” I mentioned that displays CGM, insulin, activity, and nutrition data at a glance, whenever the user wants to see it. The timecard can also be shared in real-time by email or text message, or called up later, for example during a visit with a healthcare professional or when the user is about to have a slice of pizza and is curious about how the pizza they had last week affected their blood sugar – just flick back through your timecards and take a look. Different people with diabetes have different needs, but a central theme involves user-friendly, meaningful access to one’s personal diabetes data, and that is what we are focusing on first.
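The "flick back through your timecards" scenario (finding how last week's pizza affected your blood sugar) boils down to a reverse-chronological lookup over saved snapshots. Here's a minimal sketch of that idea; the field names `meal_tag` and `peak_mg_dl` are invented for illustration and aren't from the actual project.

```python
from datetime import datetime
from typing import List, NamedTuple, Optional

class Timecard(NamedTuple):
    """A pared-down timecard: when it was made, what the meal was, and the glucose peak after it."""
    created: datetime
    meal_tag: str    # hypothetical user-supplied label, e.g. "pizza"
    peak_mg_dl: int  # highest glucose in the hours after the meal

def last_card_for_meal(cards: List[Timecard], tag: str) -> Optional[Timecard]:
    """Flick back through timecards, newest first, for the most recent matching meal."""
    for card in sorted(cards, key=lambda c: c.created, reverse=True):
        if card.meal_tag == tag:
            return card
    return None

# Example history: two pizza nights and a salad lunch
history = [
    Timecard(datetime(2013, 9, 20, 18, 0), "pizza", 210),
    Timecard(datetime(2013, 9, 24, 12, 0), "salad", 130),
    Timecard(datetime(2013, 9, 27, 19, 0), "pizza", 185),
]
match = last_card_for_meal(history, "pizza")
print(match.peak_mg_dl)  # 185 -- last week's pizza topped out at 185 mg/dL
```

On the real device this lookup would presumably be triggered by a swipe on the touchpad or a voice command rather than a function call, but the retrieval logic is the same.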
Any specific diabetes devices you’re looking to integrate with Glass?
If a device can interface with a smartphone, its data can be shown on Glucose Glass. We have a handful of devices that we’re actively working to integrate as part of our prototype, and we’d be happy to expand our interoperability as quickly and broadly as possible.
Where does the research stand at this point?
We are still in the prototyping stage, as the project is quite new, but we anticipate involvement of early adopters with type 1, type 2, MODY, LADA, gestational diabetes, and perhaps even family members and caretakers of people with diabetes. If anyone seeks a new, user-friendly way of interacting with an individual’s diabetes data ecosystem, they can find that with Glucose Glass. So we involve diverse stakeholders in our conversations about what Glucose Glass should do, and how. We haven’t yet deployed our prototype in the field, as we have more development work to do. Real-time data collection from actual devices is one of our team’s imminent next achievements – we expect to reach this milestone by the end of the year. It’s challenging, however, to pull off a project like this in the resource-constrained environment currently faced by many researchers, and a lot of our work is currently unfunded and being performed on a volunteer basis.
Resources permitting, we’ll start a field study of Glucose Glass beta as early as next spring. We aim to evaluate both user experience and the practicality of using Glucose Glass for next-generation diabetes informatics research. Stay tuned for updates via @diabetesdata on Twitter.
* * *
At the MedX conference, fellow type 1 Chris Snider actually got to wear Google Glass around for a bit, connected to event organizer Dr. Larry Chu's Twitter account. Although his first-hand experience wasn't specific to diabetes data, Chris says he could definitely envision (pun!) the advantages that Glucose Glass could have for PWDs. Chris wrote about that experience on his blog, A Consequence of Hypoglycemia.
In sharing with us here at the ‘Mine, Chris says wearing Glass felt like, well… the future.
“With all of the explicit and implicit data that is required to manage this disease, genuine data integration is my holy grail. That’s what I want. And hypothetically, that’s what Glass could provide,” Chris tells us.
Sure, right now our cell phones can perform all of the functions of Google Glass faster and more accurately, but certainly not right in front of our eyes, hands-free. Chris says he wouldn’t necessarily want to wear the glasses continuously, and he has data security and integrity concerns about streaming all that medical data from the cloud, but he does think this innovation feels like the first step “to something bigger and better.”
“Google Glass may not appeal to me on a daily basis, but that doesn’t mean that … Google Glass couldn’t be the next big thing for me or other people with diabetes,” he said. “Despite my reservations, I still like the idea of saying ‘OK Glass, How’s my diabetes?’”
As technophiles ourselves, we're kind of enthralled by what Nate and his team are working on! I know that personally, before I started using an integrated insulin pump and CGM, I wasn't that concerned about needing to carry an extra device around to monitor my blood sugars. But once I started using an integrated system, it was a whole new world that I much preferred. Now I don't think I could ever go back.
The idea of being able to call up your relevant glucose data with just a voice command and a blink is very appealing indeed. Who wouldn’t love to take Glucose Glass for a spin?