A set of 3D-printed lenses that are smaller than a grain of sand but can mimic eagle-eye vision came about thanks to a chance discussion with a colleague and the freedom to pursue scientific creativity, according to their inventor.
Optics was always one of Professor Harald Giessen’s favourite scientific fields, a fact he attributes to the ability to create visual wonders. ‘It is always nice when you can see what you are doing with lasers and light – it’s just beautiful. Seeing the light beams and working with mirrors and lenses is something very hands-on and practical. The visual aspect is very pleasing.’
Prof. Giessen, a specialist in nano optics at the University of Stuttgart, Germany, hit the scientific headlines in February when, together with colleagues, he revealed a set of incredibly tiny 3D-printed camera lenses that mimic how an eagle sees the world. Companies eager to exploit the ability to capture detailed images from a distance are lining up at his door, from firms working on robots and automation to medical tech businesses.
But, as Prof. Giessen well knows, scientists can’t always control how their work is used. ‘Of course it gives the spies incredible abilities to spy even more on us,’ he said. ‘In the end it will probably end up in the hands of the bad guys. This is what I fear, but I think it will make a real difference in medical technology.’
What they’re all interested in is a 2 millimetre by 3 millimetre chip that contains four lenses, each of which is 100 micrometres wide, or about the size of a speck of dust. Combining the data from these lenses produces a picture that is high-resolution in the centre and less focused towards the edges, known as a foveated image.
‘We called it the eagle-eye vision because eagles have a very sharp, crisp fovea, which is the part at the centre of the eye where you have very sharp vision,’ said Prof. Giessen. ‘On the outside you have much less sharp vision, but the eagle can see whether there is some enemy on the left or right, and in the middle the image is so sharp that he can see a mouse from three kilometres.’
Inspiration for the system came after a chance conversation with a colleague who wanted to put a lens on the end of a fibre.
Normally, this would be done by manufacturing lenses individually and then assembling them in a device that attaches to the end of the fibre. However, Prof. Giessen had a different idea. He decided to use a special 3D printing machine known as Nanoscribe, which a former PhD student of his helped develop to carry out 3D printing on the nanoscale, in order to build a lens directly onto the tip of the fibre.
As luck would have it, he had just purchased one of these machines as part of a project funded by the EU’s European Research Council (ERC), in which he was investigating direct laser writing for nanofabrication. This is a type of light-activated 3D printing that starts with an unformed blob of the building material and sculpts it into an object by using a very precise laser beam to harden the substance one 3D pixel, or voxel, at a time.
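The voxel-by-voxel hardening described above can be pictured with a toy model. This is purely an illustrative sketch, not Nanoscribe's actual control software: a virtual laser focus visits a list of voxel coordinates and solidifies the resin at each one, building the structure point by point. The function and example path below are hypothetical.

```python
# Toy model of direct laser writing: the laser focus dwells at one
# voxel at a time and hardens the resin there.
import numpy as np

def write_structure(grid_shape, path):
    """Harden one voxel per laser dwell position; return the voxel grid."""
    resin = np.zeros(grid_shape, dtype=bool)  # False = liquid, True = solid
    for x, y, z in path:                      # laser scans the design path
        resin[x, y, z] = True
    return resin

# Hypothetical example: print a tiny 3x3 solid layer at height z = 0
layer = [(x, y, 0) for x in range(3) for y in range(3)]
structure = write_structure((5, 5, 5), layer)
print(int(structure.sum()))  # 9 voxels hardened
```

A real system additionally has to account for laser exposure dose and focal-spot shape, but the principle – building a shape by scanning a hardening point through the material – is the same.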
Using this process, Prof. Giessen’s group was able to directly build lenses layer by layer onto the tip of a fibre 125 micrometres in diameter – which is three times the width of a human hair.
‘It turned out that the lenses and optics were really good and worked excellently,’ he said. ‘Then we thought, ok let’s go to more complicated optics.’
His ambition was to create a tiny microscope lens system, known as an objective, to overcome the problems associated with using a single lens, namely that the image can become distorted near the rim.
‘A single lens does not give a good image. For example, cameras with simple plastic lenses, which you can put on your helmet when you ride your bike or go skiing, give an image that is distorted. The horizon looks curved – depending on how you turn your head, the horizon could bend downwards or upwards.’
The solution was to print a double or triple lens to correct for the imaging errors and then combine four of these tiny objectives with different focal lengths and fields of view. Each objective captures a different type of image – one a very zoomed-in shot like those captured with a telephoto lens, another that mimics the type of image your eye would see, and another that captures a wide angle. The resulting pictures are stored in the chip and then combined into one single eagle-eye image.
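The compositing step described above can be sketched in a few lines. This is a minimal illustration, not the group's actual image-processing pipeline: it simply pastes a sharp, zoomed-in centre frame into the middle of a wide-angle frame, so the result is highest-resolution in the centre and coarser towards the edges, like a foveated image. The function names and toy data are assumptions for the sketch.

```python
# Illustrative foveated-image composite: sharp centre, coarse periphery.
import numpy as np

def foveate(wide, tele):
    """Paste the high-resolution tele frame into the centre of wide."""
    H, W = wide.shape
    h, w = tele.shape
    top, left = (H - h) // 2, (W - w) // 2
    out = wide.copy()
    out[top:top + h, left:left + w] = tele  # sharp 'fovea' in the centre
    return out

# Hypothetical data: blurry wide field (zeros) plus sharp centre (ones)
wide = np.zeros((8, 8))
tele = np.ones((4, 4))
img = foveate(wide, tele)
```

In the real device there are four objectives rather than two, and the frames would need registration and blending at the seams, but the principle of stacking fields of view around a sharp centre is the same.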
Of all the applications that these tiny cameras could have, Prof. Giessen is most excited about their potential to revolutionise medical technology such as endoscopes, medical cameras that are inserted into the body to diagnose disease, or assist during surgery.
‘Imagine you had endoscopes that were just EUR 50, that could be sterilised by cooking them for a little while in hot steam, and you could do this in hospitals in the third world. For example, colon cancer could be detected early. (At the moment) it is ugly and painful with this big tube being inserted into your intestines, so something there could be nice and cheap and small.’
Other potential applications include more discreet and numerous cameras on self-driving vehicles, or surveillance cameras that could be affixed to drones the size of a bee.
Prof. Giessen says the freedom to pursue ideas sparked by chance conversations is vital for creative scientists like himself, and he praises the ERC for allowing him space within his grant – for a project called COMPLEXPLAS – to explore where his ideas could lead.
‘The ERC is not too strict about limiting what you can do with the money. If that kind of money is married with a creative mind who is willing to wander off a little bit to explore things, then this is a lucky strike. I wish there were more such funding schemes that would allow this.’
The next steps on Prof. Giessen’s agenda are to refine the process so the whole device can be produced faster, reduce the size of the chip to 1 square millimetre and incorporate new materials such as metals for mirrors. He also plans to combine his work on the eagle-eye lens system with his work on COMPLEXPLAS, which could allow him to introduce a zoom function.
The ultimate aim is to build a system that would enable someone to print customised lenses on demand.
‘You go to your computer program, you design the optics, you go to the 3D printer, you feed the file into the printer, you wait a few hours and out comes the optics – that’s magic.’