Metasurface-driven full-space structured light for three-dimensional imaging

Gyeongtae Kim, Yeseul Kim, Jooyeong Yun, Seong Won Moon, Seokwoo Kim, Jaekyung Kim, Junkyeong Park, Trevon Badloe, Inki Kim, Junsuk Rho

Research output: Contribution to journal › Article › peer-review

200 Scopus citations

Abstract

Structured light (SL)-based depth-sensing technology illuminates objects with an array of dots, and the backscattered light is monitored to extract three-dimensional information. Conventionally, diffractive optical elements have been used to form the laser dot array; however, their micron-scale pixel size limits both the field-of-view (FOV) and the diffraction efficiency. Here, we propose a metasurface-enhanced SL-based depth-sensing platform that scatters a high-density dot array of ~10,000 points over a 180° FOV by manipulating light at the subwavelength scale. As a proof of concept, we place two face masks, one on the beam axis and the other 50° off axis, within a distance of 1 m, and estimate their depth information using a stereo matching algorithm. Furthermore, we demonstrate replication of the metasurface using the nanoparticle-embedded-resin (nano-PER) imprinting method, which enables high-throughput manufacturing of metasurfaces on arbitrary substrates. Such a full-space diffractive metasurface may afford an ultra-compact depth-perception platform for face recognition and automotive and robot vision applications.
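The stereo matching step mentioned in the abstract ultimately reduces to triangulation: once a projected dot is matched between two views, its depth follows from the disparity. Below is a minimal sketch of that final triangulation step under a pinhole-camera model; the function name and parameters (`focal_px`, `baseline_m`) are illustrative assumptions, not quantities taken from the paper.

```python
# Hypothetical sketch: depth recovery from dot disparities via the
# pinhole stereo relation Z = f * b / d. This is NOT the authors'
# algorithm, only the standard triangulation it relies on.
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Return depth (m) per matched dot; non-positive disparity -> inf."""
    d = np.asarray(disparity_px, dtype=float)
    z = np.full_like(d, np.inf)       # unmatched/degenerate dots -> infinite depth
    valid = d > 0
    z[valid] = focal_px * baseline_m / d[valid]
    return z

# Example: dots with larger disparity lie closer to the sensor.
depths = depth_from_disparity([40.0, 20.0], focal_px=800.0, baseline_m=0.05)
# → array([1., 2.])  (depths in metres)
```

In a real SL pipeline the dominant cost is the matching itself (associating each detected dot with its counterpart in the reference pattern); the triangulation above is the cheap closing step.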

Original language: English
Article number: 5920
Journal: Nature Communications
Volume: 13
Issue number: 1
DOIs
State: Published - Dec 2022

