News Feature | October 19, 2021

Bright Ideas: 10/20/21 — NIR/MWIR Advance Medical Imaging Of Tumors, Deep Tissue


By Abby Proch, former editor


Sunny days are now A-OK for technicians to identify defects on solar panels. What was once practical only in the darkest conditions can now be done during daylight hours. According to an article published by OSA, a new defect detection system “electrifies solar panels through a modulated current source, uses high frame rate InGaAs area array detectors for image data acquisition, and transmits images via CameraLink.” Data from the images is then fed into an algorithm that determines the modulated phase difference between the defective and nondefective areas of the panels. Next steps in the development include enhancing the algorithm to reduce background interference and improve contrast for clearer imagery, according to the report.
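
For a rough sense of how a phase difference can be pulled out of a modulated image sequence, here is a minimal lock-in style sketch in Python. The function name, frame rate, and modulation frequency are illustrative assumptions, not details from the OSA article:

```python
import numpy as np

def pixelwise_phase(frames, frame_rate_hz, f_mod_hz):
    """Return a phase map (radians) for a (T, H, W) stack of frames."""
    t = np.arange(frames.shape[0]) / frame_rate_hz
    ref_cos = np.cos(2 * np.pi * f_mod_hz * t)[:, None, None]
    ref_sin = np.sin(2 * np.pi * f_mod_hz * t)[:, None, None]
    # Project each pixel's time series onto in-phase and quadrature references
    i_comp = (frames * ref_cos).mean(axis=0)
    q_comp = (frames * ref_sin).mean(axis=0)
    return np.arctan2(q_comp, i_comp)

# Defective regions would show up as areas whose phase differs from the
# surrounding, nondefective panel material. The stack below is random noise,
# standing in for real InGaAs frames.
rng = np.random.default_rng(0)
demo_stack = rng.normal(size=(200, 64, 64))
phase_map = pixelwise_phase(demo_stack, frame_rate_hz=1000.0, f_mod_hz=50.0)
```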

In other imaging news, NIL Technology claims its meta-optical element (MOE) lenses, which achieve 94% efficiency, are the first flat lenses to match, and even outperform, refractive optics. According to a press release, the company tested a 940 nm near-infrared (NIR) imaging lens built from a single metasurface. It found the lens more compact, more cost-effective, and more rugged than traditional refractive lenses, with better image quality across a wide field of view. The silicon-on-glass substrate design is strong and thermally stable, making this lens type viable for 3D sensing and face recognition in smartphones and driver monitoring in automobiles.

Speaking of seeing more clearly with NIR: using a revamped version of two-photon microscopy, researchers at MIT and Harvard can now see twice as deep and up to 1,000 times more clearly when imaging muscle, kidney, and brain cells in mice. According to Science Daily, the researchers use patterned excitation and computational reconstruction to improve upon existing imaging methods that cannot reach nearly as deep or produce images as clear. The scientists achieved the better look by shining a beam of long-wavelength, low-energy NIR light to induce absorption of two photons at the focal point, which produces a fluorescent signature. As that fluorescence travels back through the tissue sample, the light scatters and the image becomes blurry. But now, by modulating the light to turn each excitation pixel on and off at different times, the team can effectively predict the scattering pattern throughout the tissue and, using a computer algorithm, reconstruct each pixel. This advancement means a quicker, clearer picture not only of blood flow and blood vessels but potentially also of the edges of tumors.
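
To illustrate the underlying idea of tagging each excitation spot with its own on/off timing and then unmixing the blurred signal, here is a toy sketch of coded excitation and linear reconstruction. It is a generic illustration under simplified assumptions, not the MIT/Harvard algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)
n_pixels, n_frames = 16, 64

# Each excitation spot gets its own on/off pattern over time
codes = rng.integers(0, 2, size=(n_frames, n_pixels)).astype(float)
true_signal = rng.uniform(size=n_pixels)  # fluorescence strength at each spot

# Scattering blurs the emitted light together, so in each frame the detector
# only sees the summed signal from whichever spots were switched on
measured = codes @ true_signal + 0.01 * rng.normal(size=n_frames)

# Because the on/off timing of every spot is known, the per-spot signal can be
# recovered by solving the resulting linear system
recovered, *_ = np.linalg.lstsq(codes, measured, rcond=None)
print(np.abs(recovered - true_signal).max())  # small reconstruction error
```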

Other developments in tumor detection and imaging involve using a high-speed camera to capture photons and create a 3D image. École Polytechnique Fédérale de Lausanne (EPFL) professor Edoardo Charbon captured tumor images by pointing a red laser at diseased tissue treated with a fluorescent contrast agent while simultaneously photographing it, according to an article by EPFL. Charbon then measured the nanoseconds-long delay between the laser pulse and the light returning from the tissue, which effectively maps the tumor’s shape. While existing MRI technology lets surgeons locate tumors, this development could be used in the operating room to help guide surgeons in removing all of the diseased tissue and ensuring no traces remain. Charbon noted the imaging advancement could also be used in other medical imaging, microscopy, and metrology applications.
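
As a back-of-the-envelope illustration of why a nanoseconds-long delay encodes shape: light travels roughly 30 cm per nanosecond, so half the round-trip delay gives the distance to each point on the tissue surface. The short Python snippet below works through that arithmetic with purely illustrative numbers:

```python
C = 299_792_458.0  # speed of light in m/s

def depth_from_delay(round_trip_delay_s: float) -> float:
    """Distance to the reflecting surface for a measured round-trip delay."""
    return C * round_trip_delay_s / 2.0

print(depth_from_delay(1e-9))    # ~0.15 m: a 1 ns delay spans about 15 cm
print(depth_from_delay(10e-12))  # ~1.5 mm: picosecond timing resolves fine contours
```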

Also in the medical arena, researchers are using mid-wavelength IR (MWIR) cavity-enhanced direct-frequency comb spectroscopy (CE-DFCS) to advance disease and medical condition detection through breath analysis. With it, researchers have successfully identified four breath biomarkers and have the capability to detect six more, according to a research article published by PNAS. In current practice, breath analysis can be cumbersome, but this team from the University of Colorado Boulder and JILA aims to make the quick, noninvasive testing method more accessible while offering greater spectral coverage, specificity, and sensitivity. Advances in frequency comb measurement, optical coating, and photodetector technologies enabled the team to identify the biomarkers, with the goal of identifying more.
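
At its core, this kind of measurement relies on how strongly a breath sample absorbs light at each biomarker’s characteristic wavelengths, with the optical cavity multiplying the effective path length through the sample. The sketch below shows the standard Beer-Lambert bookkeeping with illustrative numbers; it is not the team’s actual fitting procedure:

```python
import math

def concentration(i_in: float, i_out: float, cross_section_cm2: float, path_cm: float) -> float:
    """Molecules per cm^3 from transmitted intensity, via I = I0 * exp(-sigma * N * L)."""
    absorbance = math.log(i_in / i_out)
    return absorbance / (cross_section_cm2 * path_cm)

# Example: 1% attenuation over an effective 1 km cavity-enhanced path,
# with an assumed absorption cross section of 1e-19 cm^2
n_per_cm3 = concentration(i_in=1.00, i_out=0.99, cross_section_cm2=1e-19, path_cm=1e5)
print(f"{n_per_cm3:.3e} molecules/cm^3")
```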

And finally, in machine vision, a robotic arm developed by researchers at MIT can search for, identify, and retrieve lost items, according to a university press release. The team’s RFusion robot combines signals from the RF antenna on its gripper with visual input from an attached camera to locate and pick up items marked with an RFID (radio frequency identification) tag.

To conduct the experiment, the team at MIT attached an RFID tag to a keychain and buried the keychain in a small pile of household items. The robot’s antenna then sent a signal that was reflected by the tag, indicating a small spherical area of interest. Reinforced with visual data from the onboard camera, the robotic arm sifted through the hodgepodge of materials to grab the tagged keychain. It wasn’t quick, but it has proven to be 96 percent accurate, the team says.
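
The fusion step can be pictured as follows: the RF measurement constrains the tag to lie near a sphere of a given radius around the antenna, and camera detections close to that sphere survive as candidates for grasping. The Python sketch below is an illustrative reading of that idea, not MIT’s RFusion code; the positions and tolerance are made up:

```python
import numpy as np

def candidates_on_sphere(detections_xyz, antenna_xyz, rf_range_m, tol_m=0.05):
    """Keep visual detections whose distance to the antenna matches the RF range."""
    d = np.linalg.norm(detections_xyz - antenna_xyz, axis=1)
    return detections_xyz[np.abs(d - rf_range_m) < tol_m]

detections = np.array([[0.40, 0.10, 0.05],   # hypothetical camera detections (m)
                       [0.75, 0.30, 0.10],
                       [0.20, 0.60, 0.00]])
antenna = np.array([0.0, 0.0, 0.2])
print(candidates_on_sphere(detections, antenna, rf_range_m=0.43))  # keeps the first detection
```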

To speed up the process, the researchers are using reinforcement learning, in which an algorithm improves through trial and error, to train a neural network. When sped up, RFusion could have broader applications in fulfilling warehouse orders, identifying and installing components in an automotive plant, and even assisting elderly people with household tasks, says the team.
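
For readers unfamiliar with the term, the trial-and-error loop of reinforcement learning can be shown in a few lines. The snippet below is a textbook epsilon-greedy bandit, not the RFusion training setup, and the payoff probabilities are invented:

```python
import random

n_actions = 4
q_values = [0.0] * n_actions          # running estimate of each action's value
counts = [0] * n_actions
true_rewards = [0.2, 0.5, 0.8, 0.3]   # hidden payoff probabilities (illustrative)

for step in range(5000):
    # Explore occasionally, otherwise exploit the best-known action
    if random.random() < 0.1:
        action = random.randrange(n_actions)
    else:
        action = q_values.index(max(q_values))
    reward = 1.0 if random.random() < true_rewards[action] else 0.0
    counts[action] += 1
    q_values[action] += (reward - q_values[action]) / counts[action]  # incremental mean

print(q_values)  # the estimate for the best action (index 2) should approach 0.8
```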