6 Microscopy And Nanopositioning Innovations To Be Aware Of

By John Oncea, Editor

Microscopy and nanopositioning go hand in hand, allowing researchers to observe objects far too small for the naked eye. Let’s take a look at three innovations for each of these tiny technologies.
Microscopy is, in the simplest of terms, the use of a microscope to view samples and objects that are too small to see with the naked eye. Nanopositioning is the technology of moving, measuring, and positioning a device or instrument with sub-micron accuracy. Nanopositioning techniques are used in microscopy to ensure stable images throughout a microscope’s range of motion.
Microscopy has led to many discoveries, including cells, organelles, disease-causing microorganisms, viruses, and atoms. Nanopositioning is used in optics and photonics for applications like adaptive optics, laser cavity stabilization, and optical/atomic trapping. Nanopositioning stages – devices that can position and measure with sub-nanometer or nanometer resolution – have radically improved the performance of scanning probe microscopes and optical lithography tools for semiconductor manufacturing.
But that’s all in the past. Let’s take a look at some of the new applications of microscopy and nanopositioning, as well as what we might expect in the future.
3D Microscopy, Self-Driving Microscopy, And More Accessible Super Resolution Microscopy
Researchers in Purdue University’s College of Engineering are developing patented and patent-pending innovations for 3D microscopes, which traditionally suffer from slow image capture and high costs because they rely on precision translation stages; the new designs are faster to operate and less expensive to manufacture. “Such drawbacks in a microscope slow the measurement process, making it difficult to use for applications that require high speeds, such as in situ quality control,” said Song Zhang, a professor at Purdue’s School of Mechanical Engineering.
Zhang went on to explain that the Purdue 3D microscope automatically completes three steps: focusing on an object, determining the optimal capture process, and creating a high-quality 3D image for the end user. “In contrast, a traditional microscope requires users to carefully follow instructions provided by the manufacturer to perform a high-quality capture,” Zhang said.
Zhang and his colleagues use an electronically tunable lens, or ETL, that changes the focal plane of the imaging system without moving parts. He said using the lens makes the 3D microscope easier to use and less expensive to build.
“Our suite of patents covers methods on how to calibrate the ETL, how to create all-in-focus 3D images quickly, and how to speed up the data acquisition process by leveraging the ETL hardware information,” Zhang said. “The result is the same as a traditional microscope: 3D surface images of a scene. Ours is different because of its high speed and relatively low cost.” The next step involves translating this research into a commercial product with the help of an industrial partner.
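The patents themselves aren’t reproduced in the article, but the “all-in-focus” step Zhang describes has a familiar shape: given a stack of images captured at different ETL focal settings, a fused image can be built by keeping, for every pixel, the slice that is locally sharpest. The sketch below is a generic focus-stacking example written under that assumption; the function name and the variance-of-Laplacian sharpness measure are placeholders, not Purdue’s method.

```python
# Minimal focus-stacking sketch (illustrative only, not the Purdue method).
# Assumes a focal stack captured at different ETL focal settings; fuses it into
# a single all-in-focus image by picking, per pixel, the slice with the highest
# local sharpness (squared Laplacian, averaged over a small window).
import numpy as np
from scipy.ndimage import laplace, uniform_filter

def all_in_focus(stack: np.ndarray, window: int = 9) -> np.ndarray:
    """stack: (num_slices, height, width) grayscale focal stack."""
    sharpness = np.empty_like(stack, dtype=float)
    for i, img in enumerate(stack):
        lap = laplace(img.astype(float))               # edge response per slice
        sharpness[i] = uniform_filter(lap ** 2, window)  # local focus measure
    best = np.argmax(sharpness, axis=0)                # sharpest slice per pixel
    rows, cols = np.indices(best.shape)
    return stack[best, rows, cols]

# Usage with a synthetic 5-slice stack:
stack = np.random.rand(5, 256, 256)
fused = all_in_focus(stack)
print(fused.shape)  # (256, 256)
```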
Next up: researchers at the U.S. Department of Energy's (DOE) Argonne National Laboratory have developed an autonomous microscopy technique that uses AI to selectively target points of interest for scanning, according to Phys.org. This could revolutionize the way researchers acquire data and allow them to preserve the integrity of precious samples.
“Unlike the traditional point-by-point raster scan, which methodically covers every inch like the sequential reading of words on a page, this innovative approach identifies clusters of intriguing features, bypassing humdrum regions of monotonous uniformity,” writes Phys.org.
According to Charudatta (C.D.) Phatak, a group leader and materials scientist at Argonne and one of the authors of the study, “Many regions of a sample can be safely disregarded or at least not sampled heavily, but regions where there are discontinuities and boundaries can instead contribute the vast majority of information about the sample.” By homing in on these areas, the technique dramatically speeds up the experimental process.
The experiment employs an AI algorithm that starts the scanning process by choosing a group of random points on the sample. It collects data from these points while simultaneously predicting the next points of interest. This real-time prediction capability allows researchers to speed up data acquisition without human intervention, saving time and expediting the experiment.
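The article doesn’t include Argonne’s code, but the loop it describes (seed the scan with random points, model what has been measured, and let the model choose the next position) follows a standard active-learning pattern. The sketch below illustrates that pattern with a Gaussian-process surrogate and an uncertainty-based pick; the measure() function, kernel, and budgets are assumptions for illustration only, not the team’s algorithm.

```python
# Sketch of an adaptive scan loop: start from random positions, fit a surrogate
# model to the measurements, and let model uncertainty pick the next point.
# Generic active-learning pattern; measure() stands in for the real instrument.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def measure(points):
    """Placeholder for reading the detector at the requested (x, y) positions."""
    return np.sin(points[:, 0] * 10) * np.cos(points[:, 1] * 10)

rng = np.random.default_rng(0)
candidates = rng.uniform(0, 1, size=(2000, 2))                     # possible scan positions
visited = rng.choice(len(candidates), 20, replace=False).tolist()  # random seed points

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.1), alpha=1e-3)

for _ in range(50):                                   # measurement budget
    X = candidates[visited]
    gp.fit(X, measure(X))                             # update the surrogate model
    _, std = gp.predict(candidates, return_std=True)
    std[visited] = -np.inf                            # never revisit a point
    visited.append(int(np.argmax(std)))               # most uncertain point next

print(f"Scanned {len(visited)} of {len(candidates)} candidate positions.")
```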
“Taking the human component out of the prediction process saves a great deal of time and speeds up the experiment,” said Saugat Kandel, a postdoctoral researcher at Argonne and lead author of the study. “There’s also only a small number of scientists who can perform these experiments effectively as they are done now.”
By reducing the time needed to gather data, scientists can conduct additional experiments within the beam time they have reserved, creating a streamlined approach to data acquisition that is invaluable in facilities like Argonne’s Advanced Photon Source (APS).
“The ability to automate experiments with AI will significantly accelerate scientific progress in the coming years,” added Argonne group leader and computational scientist Mathew Cherukara, another author of the study. “This is a demonstration of our ability to do autonomous research with a very complex instrument.”
Lastly, scientists at EPFL have published a guide to building an add-on that turns a standard optical microscope into an instrument capable of producing super-resolution 3D images of cells, organoids, and embryos.
For centuries, the movement of cells, bacteria, and yeast could only be studied with an optical microscope, but the diffraction of light made it impossible to resolve features smaller than roughly 200 nm; the resulting images were simply too blurry to be of any use.
This limit, known as the diffraction barrier, was overcome with the development of super-resolution microscopy about 15 years ago. The technique allows scientists to study the behavior of organelles – as well as observe how cells interact with viruses, proteins, and drug molecules – by letting them peer deep inside living specimens.
“One of these new methods, known as structured illumination microscopy (SIM), is highly prized by researchers because it produces high-resolution and high-contrast images with low photon exposure,” writes EPFL. “Despite the advent of nanometer-resolution electron microscopes, optical imaging continues to play a key role in life-science research: it offers greater flexibility in terms of equipment and lets scientists observe live samples in normal developmental conditions.”
Many researchers are unable to perform SIM imaging due to cost and availability limitations. However, scientists from EPFL's Laboratory for Bio- and Nano-Instrumentation (LBNI), part of the Interfaculty Institute for Bioengineering (IBI) at EPFL's School of Engineering (STI), have found a solution. They have developed a technique that allows a standard optical microscope to be converted into a high-resolution device using affordable, commercially available components.
“SIM overcomes the diffraction barrier by reconstructing the areas of high spatial frequencies that normally appear blurred when viewed through a conventional optical microscope,” EPFL writes. “This method offers a twofold increase in resolution, enabling scientists to observe details as small as 100 nm across. SIM works by projecting a standard illumination pattern, such as a grid, onto a sample. Images, captured with different illumination patterns, are then processed by an algorithm to produce a higher-resolution reconstruction, harnessing the moiré effect.”
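That moiré description can be made concrete with a few lines of arithmetic: multiplying fine detail by a sinusoidal illumination pattern produces a low-frequency beat that carries the detail into a band the optics can pass. The one-dimensional snippet below, with made-up frequencies, only illustrates that frequency shift; it is not the EPFL reconstruction algorithm.

```python
# Illustration of the frequency shift that SIM exploits: multiplying a sample
# by a sinusoidal illumination pattern moves high spatial frequencies toward
# lower frequencies (the moiré effect). Not a SIM reconstruction.
import numpy as np

n = 512
x = np.arange(n) / n
sample = np.cos(2 * np.pi * 60 * x)             # fine detail at 60 cycles/FOV
illumination = 1 + np.cos(2 * np.pi * 50 * x)   # structured pattern at 50 cycles/FOV

observed = sample * illumination
spectrum = np.abs(np.fft.rfft(observed))
peaks = np.flatnonzero(spectrum > 0.2 * spectrum.max())
print(peaks)  # frequencies present: 10, 60, 110 -> the 10-cycle moiré beat
              # carries the 60-cycle detail into a lower-frequency band
```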
The EPFL add-on, dubbed OpenSIM, has a lower modulation contrast than commercially available equivalents, constraining the resolution gain to a factor of about 1.7x rather than the theoretical 2x. But that doesn’t keep it from accomplishing what it was designed for: making SIM technology available to labs that need it only occasionally or that simply can’t afford a commercial-grade model.
“The LBNI team is pressing ahead with efforts to bring its work to a wider group of scientists and build a community of users to share their experiences,” writes EPFL. “’Since the paper was shared on BioRxiv.org, I’ve been contacted by several people who are interested in the idea and want to know more about how to build their own OpenSIM,’ says Prof. Fantner.”
Nanopositioning: AI, Diverse Applications, And The Future
The integration of artificial intelligence (AI) and automation is enhancing nanopositioning systems, streamlining processes like sample preparation, imaging, and data analysis to increase productivity.
AI algorithms enable more precise control and optimization of nanoscale manufacturing processes, writes C Abor Jr. By analyzing data from sensors and feedback systems, AI can make real-time adjustments to improve the accuracy and efficiency of nanoscale positioning and assembly.
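The article doesn’t specify how these AI-driven adjustments are implemented, but they all rest on a closed sensor-feedback loop. The sketch below shows that loop with a plain proportional controller standing in for the learned correction; the stage response, gain, and noise level are illustrative assumptions rather than any vendor’s figures.

```python
# Minimal sensor-feedback correction loop for a nanopositioning stage.
# A plain proportional controller stands in here for the AI-driven adjustment
# described above; gain, noise, and stage response are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
target_nm = 500.0            # commanded stage position, in nanometers
position_nm = 0.0            # actual position reported by the sensor
kp = 0.5                     # proportional gain (assumed)

for step in range(40):
    reading = position_nm + rng.normal(0.0, 0.3)   # sensor noise, ~0.3 nm rms
    error = target_nm - reading                    # deviation from the target
    position_nm += kp * error                      # corrective move each cycle
    if step % 10 == 0:
        print(f"cycle {step:2d}: position = {position_nm:7.2f} nm")

# A production controller would add integral/derivative terms or a learned
# model to reject drift and hysteresis; this loop only shows the feedback idea.
```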
AI also enables the development of autonomous nanobots that can navigate and interact with nanoscale components to assemble complex nanostructures and devices with high precision. This breakthrough in nanoscale assembly opens up new possibilities for advanced nanodevices, sensors, and materials.
The combination of AI and nanotechnology is transforming drug discovery and delivery as well. AI algorithms can analyze vast amounts of data to predict drug efficacy and optimize drug design at the nanoscale, enabling the development of more targeted and effective nanomedicine therapies.
Finally, AI is helping to optimize energy storage and distribution at the nanoscale. Smart nanoscale energy management systems leveraging AI can efficiently utilize stored energy, enabling advancements in areas like smart grids and wearable devices.
Nanopositioning – AI-aided or not – enables many applications across various industries. Here’s a snapshot of how it’s being used in four of them:
- Materials Science: Nanopositioning is used extensively in materials research, enabling techniques like atomic force microscopy to study surface properties, deformation, and other nanoscale phenomena.
- Semiconductor Manufacturing: Nanopositioning stages are used in semiconductor fabrication for tasks like wafer inspection, overlay alignment, and packaging of microchips and MEMS devices.
- Life Sciences: Nanopositioning enables high-precision manipulation and analysis in fields like genomics, single-molecule biophysics, and live-cell imaging.
- Defense and Infrastructure: Nanopositioning technology finds use in diverse applications like beam steering, satellite positioning, and the control of devices under extreme conditions like high vacuum, cryogenic temperatures, and high magnetic fields.
So, what does the future hold for nanopositioning? According to Transparency Market Research, nanopositioning systems are poised for significant growth in the coming years, with the global market projected to grow at a CAGR of 8.6% from 2023 to 2031 and to reach an estimated $214.08 million by 2028.
This growth will be driven by several factors, including rising demand for the miniaturization of devices and components. As devices become smaller and more intricate, the need for precise manufacturing processes and component alignment increases. Nanopositioning systems enable the precise positioning and manipulation of objects at the nanometer scale, which is crucial for the development of new materials, devices, and technologies.
Nanopositioning systems will have significant applications in the medical and biotechnology fields, where precision and accuracy are paramount for research, diagnostic, and therapeutic purposes. These systems enable researchers, clinicians, and scientists to precisely position and manipulate biological samples, cells, and molecules at the nanometer scale.
Technological advancements, such as the emergence of quantum computing and nanoscale electronics, are expected to further bolster the global nanopositioning systems market. These new technologies require precise positioning and alignment, which can be achieved through the use of nanopositioning systems.