
A universal 3D imaging sensor on a silicon photonics platform

Abstract

Accurate three-dimensional (3D) imaging is essential for machines to map and interact with the physical world1,2. Although numerous 3D imaging technologies exist, each addressing niche applications with varying degrees of success, none has achieved the breadth of applicability and impact that digital image sensors have in the two-dimensional imaging world3,4,5,6,7,8,9,10. A large-scale two-dimensional array of coherent detector pixels operating as a light detection and ranging system could serve as a universal 3D imaging platform. Such a system would offer high depth accuracy and immunity to interference from sunlight, as well as the ability to measure the velocity of moving objects directly11. Owing to difficulties in providing electrical and photonic connections to every pixel, previous systems have been restricted to fewer than 20 pixels12,13,14,15. Here we demonstrate the operation of a large-scale coherent detector array, consisting of 512 pixels, in a 3D imaging system. Leveraging recent advances in the monolithic integration of photonic and electronic circuits, a dense array of optical heterodyne detectors is combined with an integrated electronic readout architecture, enabling straightforward scaling to arbitrarily large arrays. Two-axis solid-state beam steering eliminates any trade-off between field of view and range. Operating at the quantum noise limit16,17, our system achieves an accuracy of 3.1 millimetres at a distance of 75 metres when using only 4 milliwatts of light, an order of magnitude more accurate than existing solid-state systems at such ranges. Future reductions of pixel size using state-of-the-art components could yield resolutions in excess of 20 megapixels for arrays the size of a consumer camera sensor. This result paves the way for the development and proliferation of low-cost, compact and high-performance 3D imaging cameras that could be used in applications from robotics and autonomous navigation to augmented reality and healthcare.
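As background to the coherent (FMCW-style) ranging described in the abstract, a short simulation can illustrate the principle (this is not the authors' code, and the chirp parameters are arbitrary): mixing a linear frequency chirp with its echo from a target at distance d produces a beat note at f_beat = 2·B·d/(c·T), so the distance is recovered from the peak of the beat spectrum.

```python
import numpy as np

# Illustrative FMCW parameters (arbitrary, not taken from the paper)
c = 3e8       # speed of light, m/s
B = 1e9       # chirp bandwidth, Hz
T = 10e-6     # chirp duration, s
fs = 2e9      # sample rate, Hz
d = 75.0      # target distance, m

t = np.arange(0, T, 1 / fs)
tau = 2 * d / c            # round-trip delay
slope = B / T              # chirp rate, Hz/s

# Heterodyne mixing of the transmitted chirp with its delayed echo
# yields a constant beat tone at slope * tau
beat = np.cos(2 * np.pi * slope * tau * t)

# Estimate the beat frequency from the FFT peak, then invert to distance
spectrum = np.abs(np.fft.rfft(beat))
f_beat = np.fft.rfftfreq(len(t), 1 / fs)[np.argmax(spectrum)]
d_est = c * f_beat * T / (2 * B)
```

With these parameters the beat tone sits at 50 MHz and the FFT-bin spacing limits the raw distance resolution to c/(2B) ≈ 15 cm; a real system refines this with peak interpolation and averaging.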


Fig. 1: Solid-state 3D imaging architecture.
Fig. 2: Receiver FPA design.
Fig. 3: Receiver electro-optic performance.
Fig. 4: Characterization of 3D imaging system.

Data availability

The data used to produce the plots in the main paper and the extended data figures are available at https://doi.org/10.5258/SOTON/D1619.

Code availability

The code used to analyse the data and produce the plots in the main paper and the extended data figures is available at https://doi.org/10.5258/SOTON/D1619.

References

1. Urmson, C. et al. Autonomous driving in urban environments: Boss and the Urban Challenge. J. Field Robot. 25, 425–466 (2008).
2. Wang, Q. & Kim, M.-K. Applications of 3D point cloud data in the construction industry: a fifteen-year review from 2004 to 2018. Adv. Eng. Inform. 39, 306–319 (2019).
3. Lichti, D. D. Error modelling, calibration and analysis of an AM-CW terrestrial laser scanner system. ISPRS J. Photogramm. Remote Sens. 61, 307–324 (2007).
4. Kidd, J. Performance Evaluation of the Velodyne VLP-16 System for Surface Feature Surveying. MSc thesis, Univ. New Hampshire (2017); https://scholars.unh.edu/thesis/1116
5. Salvi, J., Pagès, J. & Batlle, J. Pattern codification strategies in structured light systems. Pattern Recognit. 37, 827–849 (2004).
6. Corti, A., Giancola, S., Mainetti, G. & Sala, R. A metrological characterization of the Kinect V2 time-of-flight camera. Robot. Auton. Syst. 75, 584–594 (2016).
7. McManamon, P. Review of LADAR: a historic, yet emerging, sensor technology with rich phenomenology. Opt. Eng. 51, 060901 (2012).
8. McManamon, P. F. et al. Comparison of flash LiDAR detector options. Opt. Eng. 56, 031223 (2017).
9. Hutchings, S. W. et al. A reconfigurable 3-D-stacked SPAD imager with in-pixel histogramming for flash LIDAR or high-speed time-of-flight imaging. IEEE J. Solid-State Circuits 54, 2947–2956 (2019).
10. Ronchini Ximenes, A. et al. A modular, direct time-of-flight depth sensor in 45/65-nm 3-D-stacked CMOS technology. IEEE J. Solid-State Circuits 54, 3203–3214 (2019).
11. Behroozpour, B., Sandborn, P. A. M., Wu, M. C. & Boser, B. E. Lidar system architectures and circuits. IEEE Commun. Mag. 55, 135–142 (2017).
12. Aflatouni, F., Abiri, B., Rekhi, A. & Hajimiri, A. Nanophotonic coherent imager. Opt. Express 23, 5117–5125 (2015).
13. Martin, A. et al. Photonic integrated circuit-based FMCW coherent LiDAR. J. Lightwave Technol. 36, 4640–4645 (2018).
14. Inoue, D., Ichikawa, T., Kawasaki, A. & Yamashita, T. Demonstration of a new optical scanner using silicon photonics integrated circuit. Opt. Express 27, 2499–2508 (2019).
15. Li, C., Cao, X., Wu, K., Li, X. & Chen, J. Lens-based integrated 2D beam-steering device with defocusing approach and broadband pulse operation for lidar application. Opt. Express 27, 32970–32983 (2019).
16. Collett, M., Loudon, R. & Gardiner, C. Quantum theory of optical homodyne and heterodyne detection. J. Mod. Opt. 34, 881–902 (1987).
17. Rubin, M. A. & Kaushik, S. Squeezing the local oscillator does not improve signal-to-noise ratio in heterodyne laser radar. Opt. Lett. 32, 1369–1371 (2007).
18. El Gamal, A. & Eltoukhy, H. CMOS image sensors. IEEE Circuits Devices Mag. 21, 6–20 (2005).
19. Stann, B. L. et al. A 32 × 32 pixel focal plane array ladar system using chirped amplitude modulation. In Proc. SPIE 5412 (eds Kamerman, G. W. & Kamerman, G. W.) 264–272 (SPIE, 2004).
20. Hu, K. et al. Design of a CMOS ROIC for InGaAs self-mixing detectors used in FM/cw LADAR. IEEE Sens. J. 17, 5547–5557 (2017).
21. Poulton, C. V. et al. Coherent solid-state LIDAR with silicon photonic optical phased arrays. Opt. Lett. 42, 4091–4094 (2017).
22. Miller, S. A. et al. 512-element actively steered silicon phased array for low-power LIDAR. In Conference on Lasers and Electro-Optics JTh5C.2 (Optical Society of America, 2018).
23. Poulton, C. V. et al. Long-range LiDAR and free-space data communication with high-performance optical phased arrays. IEEE J. Sel. Top. Quantum Electron. 25, 1–8 (2019).
24. Wang, J. et al. Integrating BIM and LiDAR for real-time construction quality control. J. Intell. Robot. Syst. 79, 417–432 (2015).
25. Kasturi, A., Milanovic, V., Atwood, B. H. & Yang, J. UAV-borne LIDAR with MEMS mirror-based scanning capability. In Proc. SPIE 9832 (eds Turner, M. D. & Kamerman, G. W.) 206–215 (SPIE, 2016).
26. Griffiths, H. D. New ideas in FM radar. Electronics Commun. Eng. J. 2, 185–194 (1990).
27. Riemensberger, J. et al. Massively parallel coherent laser ranging using a soliton microcomb. Nature 581, 164–170 (2020).
28. Thurn, K., Ebelt, R. & Vossiek, M. Noise in homodyne FMCW radar systems and its effects on ranging precision. In 2013 IEEE MTT-S International Microwave Symposium Digest https://doi.org/10.1109/MWSYM.2013.6697654 (IEEE, 2013).
29. Tsang, H. K. et al. Optical dispersion, two-photon absorption and self-phase modulation in silicon waveguides at 1.5 μm wavelength. Appl. Phys. Lett. 80, 416–418 (2002).
30. Rong, H. et al. An all-silicon Raman laser. Nature 433, 292–294 (2005).
31. Giewont, K. et al. 300-mm monolithic silicon photonics foundry technology. IEEE J. Sel. Top. Quantum Electron. 25, 8200611 (2019).
32. Shen, Y. et al. Deep learning with coherent nanophotonic circuits. Nat. Photon. 11, 441–446 (2017).
33. Razavi, B. The transimpedance amplifier. IEEE Solid-State Circuits Mag. 11, 10–97 (2019).
34. Wang, J. Y. Heterodyne laser radar-SNR from a diffuse target containing multiple glints. Appl. Opt. 21, 464–476 (1982).
35. Rebolj, D., Pučko, Z., Čuš Babič, N., Bizjak, M. & Mongus, D. Point cloud quality requirements for Scan-vs-BIM based automated construction progress monitoring. Autom. Construct. 84, 323–334 (2017).
36. Samani, A. et al. Silicon photonic Mach–Zehnder modulator architectures for on chip PAM-4 signal generation. J. Lightwave Technol. 37, 2989–2999 (2019).
37. Stove, A. Linear FMCW radar techniques. IEE Proc. F 139, 343–350 (1992).
38. Winkler, V. Range Doppler detection for automotive FMCW radars. In 2007 European Radar Conference, 166–169 (Horizon House, 2007).
39. Chen, H.-S. & Rao, C. R. N. Polarization of light on reflection by some natural surfaces. J. Phys. D 1, 1191–1200 (1968).
40. Sheng, Z. et al. A compact and low-loss MMI coupler fabricated with CMOS technology. IEEE Photonics J. 4, 2272–2277 (2012).
41. Harris, N. C. et al. Efficient, compact and low loss thermo-optic phase shifter in silicon. Opt. Express 22, 10487–10493 (2014).
42. Mendez-Astudillo, M., Okamoto, M., Ito, Y. & Kita, T. Compact thermo-optic MZI switch in silicon-on-insulator using direct carrier injection. Opt. Express 27, 899–906 (2019).
43. Ahmed, M. G. et al. 34-GBd linear transimpedance amplifier for 200-Gb/s DP-16-QAM optical coherent receivers. IEEE J. Solid-State Circuits 54, 834–844 (2019).
44. Shahdoost, S., Medi, A. & Saniei, N. Design of low-noise transimpedance amplifiers with capacitive feedback. Analog Integr. Circuits Signal Process. 86, 233–240 (2016).
45. Mohan, S. S., Hershenson, M. D. M., Boyd, S. P. & Lee, T. H. Bandwidth extension in CMOS with optimized on-chip inductors. IEEE J. Solid-State Circuits 35, 346–355 (2000).
46. Razavi, B. A 622 Mb/s 4.5 pA/√Hz CMOS transimpedance amplifier [for optical receiver front-end]. In 2000 IEEE International Solid-State Circuits Conference Digest of Technical Papers, 162–163 (IEEE, 2000).
47. Romanova, A. & Barzdenas, V. A review of modern CMOS transimpedance amplifiers for OTDR applications. Electronics (Basel) 8, 1073–1105 (2019).
48. Sun, J., Timurdogan, E., Yaacobi, A., Hosseini, E. S. & Watts, M. R. Large-scale nanophotonic phased array. Nature 493, 195–199 (2013).
49. Piggott, A. Y. et al. Inverse-designed photonics for semiconductor foundries. ACS Photonics 7, 569–575 (2020).
50. Sorianello, V. et al. Near-infrared absorption of germanium thin films on silicon. Appl. Phys. Lett. 93, 111115 (2008).


Acknowledgements

This paper is dedicated to the memory of S. Sandhu, who performed all component-level simulations for the optical components used on the chip, as well as validation of the existing commercial foundry designs. D.J.T. acknowledges funding from the Royal Society for his University Research Fellowship. We thank A. Stricker, A. Watts, M. Djavid and the rest of the Global Foundries team for assistance in device fabrication.

Author information

Contributions

C.R. conceived, built and tested the free-space portion of the LiDAR system, performed the final LiDAR measurements and calibrated the optical switching trees. A.Y.P. and C.R. performed the electro-optic characterization. A.Y.P. conceived, built and tested the fibre-optic portion of the LiDAR system, wrote the LiDAR system software, tested the coherent receiver array, and contributed to the architecture and layout of the photonic chip. D.J.T. designed and laid out the silicon photonics modulator. R.F.W. contributed to design verification, including circuit simulations, and to the embedded software control systems. I.E.O. designed the electronic circuits on the photonic chip, and performed their layout and verification. S.A.F. developed the signal acquisition and control systems, and contributed to the system architecture. A.J.C. designed and verified the circuit boards used to interface with the photonic chip. A.G. contributed to the architecture and performed the layout of the photonic chip. F.M. and X.C. contributed to the fabrication and testing of the modulator. R.N. conceived the receiver, switching and photonic system architecture. R.N. and G.T.R. supervised the project.

Corresponding author

Correspondence to Remus Nicolaescu.

Ethics declarations

Competing interests

All authors, with the exception of F.M. and X.C., are shareholders of Pointcloud Inc., a start-up company that makes laser ranging devices based on coherent receiver arrays.

Additional information

Peer review information Nature thanks Johann Riemensberger and the other, anonymous, reviewer(s) for their contribution to the peer review of this work. Peer reviewer reports are available.

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Extended data figures and tables

Extended Data Fig. 1 Thermo-optic switching tree demonstration.

a, The thermo-optic switches consist of a Mach–Zehnder interferometer with an electrical heater on each arm. b, Tuning curve for a single thermo-optic switch, showing optical power in the two outputs as a function of applied heater power. The use of two heaters allows the average electrical power consumption per switch to be halved. c, Output power distribution of the 1 × 16 transmitter switching tree for all switch settings, demonstrating clean switching. Output power was monitored using a set of monitor photodiodes at the output of the switching tree. d, On and off transients for a representative thermo-optic switch, demonstrating 90–10% switching times of 9.1 μs and 12.1 μs, respectively. Owing to minor thermal crosstalk between switches, the switching transients are not perfect decaying exponentials.
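The tuning curve in panel b follows the standard Mach–Zehnder transfer function: the heater-induced phase shift divides optical power between the two output ports as cos²/sin² of half the phase. A minimal model (illustrative only; `p_pi`, the heater power producing a π phase shift, is a hypothetical value, not a measured device parameter):

```python
import numpy as np

def mzi_outputs(p_heater, p_pi=25e-3):
    """Idealized thermo-optic MZI switch: fraction of optical power in each
    output port versus applied heater power, assuming phase shift is
    proportional to heater power. p_pi (hypothetical) is the heater power
    giving a pi phase shift."""
    phi = np.pi * p_heater / p_pi
    bar = np.cos(phi / 2) ** 2    # power remaining in the 'bar' port
    cross = np.sin(phi / 2) ** 2  # power switched to the 'cross' port
    return bar, cross
```

Sweeping `p_heater` from 0 to `p_pi` moves the switch smoothly from fully bar to fully cross, which is the sinusoidal shape seen in the measured tuning curve; driving the second heater instead shifts the phase in the opposite direction, which is why two heaters halve the average power needed to reach either state.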

Extended Data Fig. 2 Transmitter (TX) and receiver (RX) synchronization and readout architecture.

a, The transmitter steers light through a four-level tree of 1 × 2 switches to feed the FPA of 16 output grating couplers. Each leaf contains a fractional tap-and-monitor photodiode enabling electronic calibration of the tree. b, The receiver array is divided into eight blocks of 64 pixels. Imaging an eight-pixel column requires both steering the LO light to the block and enabling the associated electronics (pixel column and row buffer amplifiers). Signals from the active pixel column are driven by eight output amplifiers for parallel readout. c, Several levels of multiplexing are used to map 512 pixels down to eight output channels. An active receiver block has one active pixel per row, with the other, disabled, pixels within the row presenting high output impedance (no drive strength). The row buffers are similarly passively multiplexed between the blocks. The eight drivers are always activated during readout. d, Timing diagram showing synchronization between the optical switching trees (transmitter and receiver) and the electrical readout circuitry.
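The multiplexing described above (512 pixels organized as 8 blocks × 8 columns × 8 rows, read out over 8 parallel channels) can be expressed as a simple addressing scheme. The sketch below is illustrative; the names and the flat-index ordering are assumptions, not taken from the paper.

```python
from dataclasses import dataclass

# 8 blocks x 8 columns x 8 rows = 512 pixels
N_BLOCKS, N_COLS, N_ROWS = 8, 8, 8

@dataclass
class ReadoutSelect:
    block: int   # which 64-pixel block receives the LO light
    column: int  # active pixel column within that block
    # all 8 rows of the selected column are read out in parallel,
    # one row per output channel

def pixel_address(pixel_index):
    """Map a flat pixel index (0..511) to its readout selection and output
    channel (illustrative block-major, then column-major ordering)."""
    block, rem = divmod(pixel_index, N_COLS * N_ROWS)
    column, row = divmod(rem, N_ROWS)
    return ReadoutSelect(block, column), row  # row number == output channel
```

One frame then consists of stepping through all 64 (block, column) pairs while the disabled pixels and row buffers present high output impedance, so the 8 drivers are passively shared across the whole array.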

Extended Data Fig. 3 Free-space optics schematic of the demonstration system.

Much of the complexity in the optical system is to match the receiver and transmitter FPAs, which can be corrected in the future by adjusting the chip layouts. For inexpensive consumer versions of the system, the Faraday rotator and polarizing beamsplitter could be replaced by a 50–50 beamsplitter, at the cost of a fourfold reduction in signal strength. Although it is possible to implement this experiment using a single chip for both transmit and receive functions, we have used two identical chips acting as the transmitter and receiver, respectively, to simplify the experimental set-up.

Extended Data Fig. 4 Far-field infrared camera images of transmitter steering.

a, Images of several representative steering positions. The receiver fields of view corresponding to the 16 steering positions are indicated by the dashed lines, with the currently active block indicated by a solid outline. The light from the active transmit grating is first slightly defocused to completely illuminate the active block, and then structured by the microlens array. Owing to this defocusing and the soft edges of the beam, a small fraction of the transmitted light falls outside the active block. b, A zoomed-in image showing the structured illumination pattern produced by the microlens array. The locations of the bright spots coincide with the receiver pixel grating couplers.

Extended Data Fig. 5 Additional characterization of system performance.

a, Imaging contrast measured using retroreflective sheeting. Our system achieves >25 dB contrast for a one-pixel displacement, >50 dB contrast for a four-pixel displacement, and reaches the system noise floor thereafter, illustrating the excellent pixel-to-pixel isolation in our system. Here, the error bars represent the standard error. b, Depth precision and detection probability as a function of distance for a 44% reflectance target. The error bars on the depth precision represent the 95% confidence intervals. c, d, Single-chirp (c) and multi-chirp (d) FMCW waveforms, used for measuring slow and fast objects, respectively. e, f, Measured velocity as a function of rotation rate for a 30-cm-diameter styrofoam cylinder at a distance of 17 m using single-chirp (e) and multi-chirp (f) waveforms, with error bars indicating the standard deviation.
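The velocity measurements in panels c–f rest on standard FMCW up/down-chirp arithmetic: a moving target Doppler-shifts the beat note by 2v/λ in opposite directions on the rising and falling chirp segments, so range and radial velocity can be separated by combining the two beat frequencies. A sketch with illustrative parameters (the 1.55 μm wavelength and chirp settings are assumptions, not values from the paper):

```python
c = 3e8                # speed of light, m/s
wavelength = 1.55e-6   # assumed telecom-band operating wavelength, m
B, T = 1e9, 10e-6      # illustrative chirp bandwidth (Hz) and duration (s)

def range_and_velocity(f_up, f_down):
    """Separate range and radial velocity from the beat frequencies of an
    up-chirp and a down-chirp, using f_beat = f_range -/+ f_doppler
    (positive velocity = approaching target)."""
    f_range = (f_up + f_down) / 2
    f_doppler = (f_down - f_up) / 2
    distance = c * f_range * T / (2 * B)
    velocity = f_doppler * wavelength / 2  # from f_doppler = 2 v / wavelength
    return distance, velocity
```

The multi-chirp waveforms in panel d extend this idea: shorter chirps keep fast-moving targets from smearing the beat note across frequency bins, at the cost of coarser range resolution per chirp.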

About this article

Cite this article

Rogers, C., Piggott, A.Y., Thomson, D.J. et al. A universal 3D imaging sensor on a silicon photonics platform. Nature 590, 256–261 (2021). https://doi.org/10.1038/s41586-021-03259-y
