News Feature | September 28, 2022

Bright Ideas — Wearable Sensors Find Success, Prove Problematic

By Abby Proch, former editor

Astronauts enduring long periods in space are susceptible to ailments like blood clots, kidney stones, radiation exposure, and more. Now, a new miniature flow cytometer is helping to keep the explorers healthy. The ongoing demonstration is a part of NASA’s Human Research Program’s Exploration Medical Capability (ExMC) priorities of “advancing medical system design for exploration beyond low-Earth orbit and promoting human health and performance in space.” Debuting at the International Space Station (ISS) in February, the Reusable Handheld Electrolyte and Laboratory Technology for Humans (rHEALTH) ONE biomedical analyzer first uses a body sensor adhered to the chest to record vital signs. Then, an astronaut collects a biological sample (blood, urine, or saliva) on a nanostrip and runs it through the laser-based device where microfluidic technology collects more than 100 million data points on additional biomarkers. Data is then displayed to the astronaut and transmitted back to Earth for analysis. Prior to launch, the makers of the rHEALTH analyzer had to alter the device’s design to withstand microgravity.

Here on Earth, researchers hope to rely on a far more ubiquitous piece of tech to help monitor human health: our smartphones. Clinical settings typically rely on pulse oximeters to determine blood-oxygen saturation levels, but a proof-of-principle study has now shown that, outside the doctor’s office, a patient’s cell phone camera can identify blood-oxygen saturation levels down to 70%. A patient simply places his or her finger across both the camera lens and the flash, and the camera performs reflectance photoplethysmography (PPG), measuring “how much light from the flash the blood absorbs in each of the red, green and blue channels.” The cell phone approach measured as well as pulse oximeters, but a larger study is needed to explore its effectiveness across a wider diversity of skin tones.
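For readers curious what that camera-based measurement looks like in practice, here is a minimal, hypothetical sketch of the underlying idea: average each video frame’s red, green, and blue intensities into a pulsatile (PPG) waveform, then compare channels with the classic “ratio of ratios” used in pulse oximetry. This is not the study’s actual pipeline; the function names, the choice of channels, and the lack of any calibration curve are illustrative assumptions only.

```python
# Illustrative sketch (not the study's method): turn smartphone video of a
# flash-lit fingertip into per-channel PPG signals and a ratio-of-ratios
# metric. A real SpO2 estimate would need a device-specific calibration.
import numpy as np

def channel_means(frames):
    """Average R, G, B over each frame; returns an (n_frames, 3) array."""
    return np.array([f.reshape(-1, 3).mean(axis=0) for f in frames])

def ratio_of_ratios(ppg):
    """Compare the pulsatile (AC) and baseline (DC) parts of two channels.

    Classic pulse oximetry does this with red and infrared light; a camera
    only sees red, green, and blue, so red and blue are used here purely
    for illustration.
    """
    red, blue = ppg[:, 0], ppg[:, 2]
    ac_r, dc_r = np.ptp(red), red.mean()
    ac_b, dc_b = np.ptp(blue), blue.mean()
    return (ac_r / dc_r) / (ac_b / dc_b)

# Hypothetical usage: frames is an iterable of HxWx3 arrays captured with
# the fingertip covering both the camera lens and the flash.
# r = ratio_of_ratios(channel_means(frames))
```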

While some wearable technology has found success from our homes clear into outer space, the pursuit of non-invasive blood glucose monitors has been less fruitful. More than 20 years ago, the FDA approved a wearable medical device called the GlucoWatch to monitor patients’ blood glucose levels without them having to endure a needle prick. Expectations were high, but so were the side effects, such as burns and blisters from a device that ran hot because of the current it needed to operate. Manufacturing of the device ended in 2005, and researchers are still searching for a viable replacement. In an open-access article published by Optica, we learn about the quest for a non-invasive blood glucose monitor and what optical coherence tomography, photoacoustic spectroscopy, and even far-infrared spectroscopy might mean for its future success.

Though light has enabled technologies that have changed the course of human life for the better, its effects are not without consequence. A recent comparison of digital images taken by astronauts aboard the ISS in 2012 and 2020 shows a whiter and brighter depiction of London. The culprit? Blue-light pollution caused by the proliferation of LED lights, which in recent years have replaced yellowish sodium lamps with crisper, brighter light. Astronauts documented the change with a digital camera; until now, satellites had only produced images that combine red, green, and blue light and thus could not differentiate their sources. While the transition to LEDs has proved beneficial for nighttime visibility and energy usage, it is also disrupting human sleep patterns, altering animal behavior, and interfering with both public and professional astronomy activities, say researchers.

Over the past week, Jupiter and Neptune got their time in the sun, so to speak, with stunning new 3D renderings and images captured by JunoCam and the James Webb Space Telescope, respectively. JunoCam, a visible-light camera orbiting aboard the Juno spacecraft since 2016, has recorded images of Jupiter’s cloud cover that were then rendered into a 3D model replete with soft, swirly peaks resembling cupcake frosting. Although NASA launched JunoCam as a way to engage the public with Jupiter, its images are now helping scientists better understand cloud composition and what elements clouds may contain based on their height in the upper and lower atmospheres. Neptune got a similar treatment when the JWST captured its best portrait — rings, dust bands, and all — in more than 30 years. Voyager 2 gave us a good look in 1989, but the latest images show an “unusual brightness” around the ice planet’s north pole and seven of the planet’s 14 moons, including its largest moon, Triton.

Although tech layoffs have seemingly skirted much of the optics and photonics industry, Ouster has reported it will reduce its workforce by 10%, or roughly 30 employees, as a means of reducing cash spending. Ouster said it is cutting its gross cash spending by about 15% for 2023, down to $107 million for the fiscal year. The lidar firm, which went public through a special-purpose acquisition company (SPAC), acquired Sense Photonics last year to deepen its play in automotive lidar applications. That deal was expected to bring a sales increase of between $65 million and $85 million, but the company has revised its forecast to $40 million to $55 million to account for “ongoing macroeconomic pressures.”