LiDAR Scanner

Tags: lidar, depth imaging, time-of-flight, ray

LiDAR (Light Detection and Ranging) measures distances by emitting laser pulses and timing the round trip to the reflecting surface. Automotive LiDAR systems use rotating multi-beam scanners (e.g., the Velodyne HDL-64E) or solid-state flash LiDAR to acquire 3-D point clouds at 10–20 Hz. The forward model is simple time-of-flight: d = c·t/2. The resulting sparse point cloud requires densification, ground segmentation, and object detection. Primary challenges include sparse sampling, intensity variation with surface reflectivity, and rain/fog attenuation.

Forward model: pulsed time-of-flight
Noise model: Gaussian
Default solver: TV-FISTA
Sensor: SPAD or APD

Forward-Model Signal Chain

Each primitive represents a physical operation in the measurement process. Arrows show signal flow left to right.

[Signal-chain diagram: Pulsed Laser P(pulsed) → Return Signal Integration Σ(return) → SPAD / APD detector D(g, η₁)]
Spec Notation

P(pulsed) → Σ(return) → D(g, η₁)
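The spec notation can be read as a toy simulation. The sketch below walks a unit pulse through emission, return integration with a 1/r² radiometric falloff, and a detector with additive Gaussian noise (the benchmark's stated noise model); the gain, efficiency, albedo, and noise values are illustrative assumptions, not the benchmark's.

```python
import numpy as np

C = 299_792_458.0  # speed of light (m/s)

def pulsed_emission(n_beams, energy=1.0):
    """P(pulsed): emit one unit-energy pulse per beam."""
    return np.full(n_beams, energy)

def return_integration(pulses, ranges_m, albedo=0.3):
    """Σ(return): toy radiometry, returned energy ∝ albedo / range²."""
    return pulses * albedo / ranges_m**2

def detect(signal, g=10.0, eta1=0.25, sigma=1e-4, seed=0):
    """D(g, η₁): detector gain g and efficiency η₁, plus additive
    Gaussian read noise (the benchmark's noise model)."""
    rng = np.random.default_rng(seed)
    return g * eta1 * signal + rng.normal(0.0, sigma, signal.shape)

ranges = np.array([10.0, 50.0, 120.0])           # target ranges (m)
meas = detect(return_integration(pulsed_emission(3), ranges))
tof = 2.0 * ranges / C                           # round-trip times, d = c·t/2
```

The 1/r² term is why distant, low-reflectivity targets drop below the noise floor, one of the mismatch modes listed under G1 below.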

Benchmark Variants & Leaderboards


Standard Leaderboard (Top 10)

#  Method             Score  PSNR (dB)  SSIM   Trust        Source
1  Point Transformer  0.779  33.13      0.954  ✓ Certified  Zhao et al., ICCV 2021
2  RandLA-Net         0.753  31.91      0.942  ✓ Certified  Hu et al., CVPR 2020
3  PnP-ADMM           0.655  29.10      0.840  ✓ Certified  ADMM + denoiser prior
4  Bilateral Filter   0.641  27.41      0.868  ✓ Certified  Tomasi & Manduchi, ICCV 1998
Mismatch Parameters (3)

Name             Symbol  Description                   Nominal  Perturbed
timing_jitter    Δt      Timing jitter (ps)            0        50
beam_divergence  Δθ      Beam divergence error (mrad)  0        0.1
range_walk       ΔR      Range walk error (cm)         0        1.0
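The perturbed values translate directly into range error through the time-of-flight relation d = c·t/2: a timing error Δt maps to a range error c·Δt/2. For the table's 50 ps jitter:

```python
# Back-of-envelope: range error induced by the perturbed timing jitter.
C = 299_792_458.0            # speed of light (m/s)
dt = 50e-12                  # perturbed timing jitter from the table (50 ps)
range_err_mm = C * dt / 2.0 * 1e3
# roughly 7.5 mm of range uncertainty per 50 ps of jitter
```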

Reconstruction Triad Diagnostics

The three diagnostic gates (G1, G2, G3) characterize how reconstruction quality degrades under different error sources. Each bar shows the relative attribution.

G1 — Forward Model Accuracy: How well does the mathematical model match reality?

Model: pulsed time-of-flight — Mismatch modes: rain/fog attenuation, crosstalk, motion distortion, low-reflectivity dropout

G2 — Noise Characterization: Is the noise model correctly specified?

Noise: Gaussian — Typical SNR: 15.0–40.0 dB

G3 — Calibration Quality: Are the instrument parameters accurately measured?

Requires: extrinsics to the camera, beam angles, range calibration, intensity calibration

Modality Deep Dive

Principle

Light Detection and Ranging (LiDAR) measures distances by emitting laser pulses (905 nm or 1550 nm) and timing their return after reflection from the scene (time-of-flight: d = c·t/2). A scanning mechanism (rotating mirror, MEMS, or optical phased array) sweeps the beam to build a 3-D point cloud of the environment. Resolution depends on the beam divergence, scanning density, and pulse timing precision.
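The principle condenses to a few lines: d = c·t/2 gives each beam's range, and the scan angles give its direction. A minimal sketch (single assumed beam and target):

```python
import numpy as np

C = 299_792_458.0  # speed of light (m/s)

def tof_to_points(t_round, az, el):
    """Convert round-trip times and beam angles to 3-D points.

    d = c·t/2 gives the range; azimuth az and elevation el (radians)
    give each beam's unit direction.
    """
    d = C * t_round / 2.0
    x = d * np.cos(el) * np.cos(az)
    y = d * np.cos(el) * np.sin(az)
    z = d * np.sin(el)
    return np.stack([x, y, z], axis=-1)

# One beam pointing straight ahead at a target 30 m away:
t = np.array([2.0 * 30.0 / C])
pts = tof_to_points(t, az=np.array([0.0]), el=np.array([0.0]))
```

Sweeping az and el over the scan pattern produces the full point cloud.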

How to Build the System

Select a LiDAR sensor appropriate for the application: mechanical spinning (e.g., Velodyne VLP-16/128 for autonomous vehicles), solid-state (Livox, Ouster), or airborne (Leica ALS80 for terrain mapping). Mount it rigidly and combine it with an IMU and GNSS for georeferencing. Calibrate the intrinsic parameters (beam angles, timing offsets, intensity response) and the extrinsics (relative to the vehicle coordinate frame). Process the returns as first, last, or full waveform depending on the application.
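Applying the calibrated extrinsics amounts to a rigid transform from the sensor frame to the vehicle frame. A minimal sketch, assuming a yaw-only mounting rotation and a made-up 1.5 m mast height:

```python
import numpy as np

def make_extrinsic(yaw_rad, t_xyz):
    """4×4 homogeneous rigid transform with a yaw-only rotation."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
    T[:3, 3] = t_xyz
    return T

def to_vehicle_frame(points_sensor, T):
    """Apply the extrinsic T to an (N, 3) array of sensor-frame points."""
    homo = np.c_[points_sensor, np.ones(len(points_sensor))]
    return (homo @ T.T)[:, :3]

# Made-up mounting: 90° yaw, sensor 1.5 m above the vehicle origin.
T = make_extrinsic(np.pi / 2.0, [0.0, 0.0, 1.5])
p = to_vehicle_frame(np.array([[10.0, 0.0, 0.0]]), T)
# the sensor's x-axis maps to the vehicle's y-axis, lifted by the mount
```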

Common Reconstruction Algorithms

  • Point cloud registration (ICP, NDT for multi-scan alignment)
  • Ground filtering and classification (progressive morphological filter)
  • SLAM (Simultaneous Localization and Mapping) with LiDAR
  • Object detection and segmentation (PointNet, PointPillars)
  • Surface reconstruction from point clouds (Poisson, ball-pivoting)
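As a concrete instance of the first bullet, here is a minimal point-to-point ICP sketch: brute-force nearest neighbours plus an SVD (Kabsch) rigid fit. Real pipelines add kd-trees, outlier rejection, and point-to-plane residuals; the synthetic displacement below is a made-up test case.

```python
import numpy as np

def best_fit_transform(src, dst):
    """Least-squares rigid (R, t) such that dst ≈ src @ R.T + t (Kabsch)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def icp(src, dst, iters=20):
    """Point-to-point ICP with brute-force nearest neighbours."""
    cur = src.copy()
    for _ in range(iters):
        idx = np.argmin(((cur[:, None] - dst[None]) ** 2).sum(-1), axis=1)
        R, t = best_fit_transform(cur, dst[idx])
        cur = cur @ R.T + t
    return cur

# Synthetic test: the same cloud displaced by a small yaw and translation.
rng = np.random.default_rng(0)
dst = rng.uniform(-5.0, 5.0, size=(40, 3))
a = 0.02                                # yaw perturbation (rad)
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0,        0.0,       1.0]])
src = dst @ R_true.T + np.array([0.3, 0.1, -0.2])
aligned = icp(src, dst)                 # registration error shrinks sharply
```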

Common Mistakes

  • Multi-echo / multi-path reflections causing ghost points
  • Motion distortion in the point cloud from vehicle movement during one scan rotation
  • Incorrect calibration causing misalignment between LiDAR and camera data
  • Rain, fog, or dust causing false returns and reduced range
  • Near-range blind zone where the receiver is not sensitive to returns

How to Avoid Mistakes

  • Filter ghost points using intensity thresholds and multi-return analysis
  • Apply ego-motion compensation using IMU data to deskew each scan
  • Perform target-based or targetless calibration between LiDAR and other sensors
  • Use 1550 nm wavelength (eye-safe and less affected by rain) for outdoor applications
  • Account for minimum range specification; fuse with short-range sensors if needed
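The ego-motion compensation in the second bullet can be sketched under a constant-velocity assumption (in practice the velocity comes from the IMU/odometry, and rotation is compensated too): each return is shifted back by the motion accumulated since the scan started.

```python
import numpy as np

def deskew(points, timestamps, velocity):
    """Shift each point back by the ego-motion since scan start.

    points     : (N, 3) raw points in the sensor frame
    timestamps : (N,) per-point time since scan start (s)
    velocity   : (3,) constant ego-velocity during the scan (m/s)
    """
    return points - timestamps[:, None] * velocity[None, :]

# Vehicle driving forward at 20 m/s during a 0.1 s (10 Hz) rotation:
v = np.array([20.0, 0.0, 0.0])
ts = np.array([0.0, 0.05, 0.1])
raw = np.array([[10.0, 0.0, 0.0],
                [11.0, 0.0, 0.0],   # apparent drift of a static target
                [12.0, 0.0, 0.0]])
fixed = deskew(raw, ts, v)
# the static target collapses back to x = 10 m in all three returns
```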

Forward-Model Mismatch Cases

  • The widefield fallback produces a 2D (64,64) image, but LiDAR produces a 1D or 3D point cloud of range measurements (r_i = c*t_i/2) — the output is a set of (x,y,z) points, not a blurred image
  • LiDAR measures distance by timing laser pulse round-trips, with angular scanning determining direction — the widefield spatial blur has no connection to time-of-flight distance measurement or angular scanning geometry

How to Correct the Mismatch

  • Use the LiDAR operator that models pulsed laser emission, scene reflection (surface albedo and geometry), and time-of-flight detection: range = c*delta_t/2 for each beam direction
  • Process the point cloud using registration (ICP), ground classification, or object detection algorithms that operate on the correct 3D range measurement format
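A minimal sketch of the corrected operator pair described above: the forward model maps per-beam surface ranges to round-trip times (a vector of times, not a 2-D image), and the inverse recovers r = c·Δt/2. The noise level and range values are illustrative.

```python
import numpy as np

C = 299_792_458.0  # speed of light (m/s)

def lidar_forward(ranges_m, noise_std_s=0.0, seed=0):
    """Forward operator: per-beam surface range -> round-trip time.
    The measurement is one time per beam direction, not an image."""
    rng = np.random.default_rng(seed)
    dt = 2.0 * ranges_m / C
    return dt + rng.normal(0.0, noise_std_s, dt.shape)

def lidar_inverse(delta_t):
    """Per-beam range estimate r = c·Δt/2."""
    return C * delta_t / 2.0

true_r = np.array([5.0, 40.0, 119.0])           # within the 120 m spec range
est_r = lidar_inverse(lidar_forward(true_r))    # noiseless round trip
```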

Experimental Setup

Instrument: Velodyne HDL-64E / Ouster OS1-128 / Livox Avia
Channels: 64
Range (m): 120
Horizontal FOV (deg): 360
Vertical FOV (deg): 27
Horizontal resolution (deg): 0.08
Rotation rate (Hz): 10
Wavelength (nm): 905
Points per second: 2,200,000
Dataset: KITTI, nuScenes, Waymo Open

Signal Chain Diagram

[Experimental setup diagram for LiDAR Scanner]

Key References

  • Geiger et al., 'Are we ready for autonomous driving? The KITTI vision benchmark suite', CVPR 2012

Canonical Datasets

  • KITTI 3D object detection
  • nuScenes (1000 driving scenes)
  • Waymo Open Dataset
