Digital Tattoo Gets Under Your Skin to Monitor Blood
- By Alexander George
- July 26, 2011
Maybe tattoos aren’t just for Harley riders or rebellious teens after all. In a few years, diabetics might get inked up with digital tats that communicate with an iPhone to monitor their blood.
Instead of the dye used for tribal arm bands and Chinese characters, these tattoos will contain nanosensors that read the wearer’s blood levels of sodium, glucose and even alcohol with the help of an iPhone 4 camera.
Dr. Heather Clark, associate professor of pharmaceutical sciences at Northeastern University, is leading the research on the subdermal sensors. She said she was reminded of the benefits of real-time, wearable health monitoring when she entered a marathon in Vermont: If they become mass-produced and affordable for the consumer market, wireless devices worn on the body could tell you exactly what medication you need whenever you need it.
“I had no idea how much to drink, or when,” said Clark, reflecting on her marathon run. “Or if I should have Gatorade instead.”
Clark’s technology could spell out the eventual demise of the painful finger pricks required for blood tests — assuming users have an iPhone, which Northeastern bioengineering grad student Matt Dubach has customized to read light from the tiny sensors to collect and output data.
Here’s how it works: A set of 100-nanometer-wide sensors goes under the skin, like tattoo ink. As for visibility, “You can spot it if you’re looking for it,” Clark says. The sensors are encased in an oily agent to ensure the whole contraption stays together.
Within the implant, certain nanoparticles bind exclusively to specific blood contents, like sodium or glucose. Thanks to an additive that makes the particles charge-neutral, the presence of a target triggers an ion release, which manifests as a fluorescence change. The process is detailed in an article published in the journal Integrative Biology.
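In software terms, the chemistry above amounts to a mapping from analyte concentration to fluorescence intensity, which a reader must invert. Here is a minimal sketch of that inversion using a calibration curve; the function name and all numbers are invented for illustration and are not from the Northeastern work.

```python
import numpy as np

# Hypothetical calibration data: fluorescence readings recorded at
# known sodium concentrations (values invented for illustration).
calib_conc = np.array([0.0, 50.0, 100.0, 150.0])   # mmol/L
calib_signal = np.array([1.00, 1.35, 1.62, 1.80])  # relative fluorescence

def concentration_from_signal(signal: float) -> float:
    """Invert the calibration curve: interpolate signal -> concentration."""
    return float(np.interp(signal, calib_signal, calib_conc))
```

In practice a real device would fit a response model to many calibration points rather than linearly interpolating, but the shape of the problem is the same: measure light, look up concentration.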
Dubach designed the iPhone 4 attachment to use the phone’s camera to read the color shift and translate the results into quantifiable data. A plastic ring surrounding the lens blocks out ambient light while a battery-powered blue LED contrasts with the sensors. The software uses the iPhone camera’s built-in RGB filters to process the light reflected off the sensors.
Why blue? Initial trials with lights of other colors were hindered by Apple’s built-in optical filter, but blue light works with the iPhone’s built-in RGB setup to process the data accurately. That blue light, powered by a 9-volt battery attached to the phone, pairs with the sensors’ red-shifted fluorescence because red shines well through skin.
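The readout described above boils down to image arithmetic: illuminate with blue, measure the red emission. As a rough sketch (not the actual Northeastern pipeline; the function name and the red-to-blue ratio metric are assumptions), the per-frame analysis might look like:

```python
import numpy as np

def fluorescence_signal(frame: np.ndarray) -> float:
    """Estimate a fluorescence signal from an H x W x 3 RGB frame
    captured under blue LED illumination.

    Averages the red channel (the sensors' red-shifted emission) and
    normalizes by the blue channel (residual excitation light), so the
    result is less sensitive to overall brightness drift.
    """
    rgb = frame.astype(float)
    red = rgb[..., 0].mean()   # sensor emission
    blue = rgb[..., 2].mean()  # excitation leakage
    return red / (blue + 1e-9)  # epsilon avoids division by zero
```

A ratio like this is one common way to make an optical measurement robust to exposure changes; the actual software presumably applies calibration and filtering on top of something similar.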
As of now, the data collected with the iPhone still requires processing through a secondary machine, but Dubach says using the iPhone to do all the work is not far off, and that an app is likely on the way.
Clark hopes to see the work of an entire clinical analyzer done by nanoparticles interacting with smartphones, which would mean a major step forward for personalized medicine. Diabetics and athletes alike could adapt and measure their own statistics without dependence on big, pricey, exclusive medical equipment.
The testing is still in early stages, and hasn’t been tried on humans yet. Research on mice, which have thinner skin than humans, has shown promising results.
When Apple’s next iPhone comes out, the project will benefit, said Dubach, citing rumors that the iPhone 5 will include a more powerful camera sensor.
“I’m holding out for the iPhone 5,” Dubach said. “More megapixels gives you more for the average,” meaning the higher-resolution camera provides more data for analysis. Even bioengineers are waiting for Steve Jobs’ next move.
The technology is still years off, but Clark and Dubach’s developments are bringing medicine closer to a time when diagnostics are minimally invasive. Real-time feedback through subdermal circuits and smartphone cameras means you could know exactly when to slug that water.