Compressed Ultrafast Photography (CUP)
Standard reconstruction benchmark — forward model perfectly known, no calibration needed. Score = 0.5 × clip((PSNR−15)/30, 0, 1) + 0.5 × SSIM
| # | Method | Score | PSNR (dB) | SSIM | Trust | Source |
|---|---|---|---|---|---|---|
| 🥇 | DiffusionCUP | 0.898 | 40.2 | 0.956 | ✓ Certified | Qiao 2020 |
| 🥈 | DAUHST-CUP | 0.864 | 38.6 | 0.941 | ✓ Certified | Cai 2022 |
| 🥉 | STFormer-CUP | 0.848 | 37.9 | 0.933 | ✓ Certified | Wang 2022 |
| 4 | PnP-FastDVDnet | 0.795 | 35.4 | 0.911 | ✓ Certified | Tassano 2020 |
| 5 | E2E-CNN-CUP | 0.755 | 33.7 | 0.886 | ✓ Certified | Liang 2019 |
| 6 | DeSCI-CUP | 0.697 | 31.2 | 0.854 | ✓ Certified | Liu 2018 |
| 7 | GAP-TV | 0.631 | 28.5 | 0.812 | ✓ Certified | Yuan 2016 |
| 8 | TwIST-CUP | 0.584 | 26.8 | 0.774 | ✓ Certified | Bioucas-Dias 2007 |
| 9 | TV-CUP | 0.521 | 24.3 | 0.732 | ✓ Certified | Gao 2014 |

Checkpoints are unavailable for DiffusionCUP, DAUHST-CUP, STFormer-CUP, and E2E-CNN-CUP.
Dataset: PWM Benchmark (9 algorithms)
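The score column follows directly from the stated formula; a minimal Python check, reproducing the leading entry's score from its PSNR and SSIM:

```python
import numpy as np

def standard_score(psnr_db: float, ssim: float) -> float:
    """Standard benchmark score: 0.5 * clip((PSNR - 15) / 30, 0, 1) + 0.5 * SSIM."""
    psnr_norm = np.clip((psnr_db - 15.0) / 30.0, 0.0, 1.0)
    return 0.5 * psnr_norm + 0.5 * ssim

# Leading entry (DiffusionCUP): 40.2 dB, SSIM 0.956
print(round(standard_score(40.2, 0.956), 3))  # → 0.898
```

Any PSNR at or above 45 dB saturates the PSNR half of the score, so further gains must come from SSIM.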
Blind Reconstruction Challenge — forward model has unknown mismatch, must calibrate from data. Score = 0.4 × PSNR_norm + 0.4 × SSIM + 0.2 × (1 − ‖y − Ĥx̂‖/‖y‖)
| # | Method | Overall Score | Public Score (PSNR / SSIM) | Dev Score (PSNR / SSIM) | Hidden Score (PSNR / SSIM) | Trust | Source |
|---|---|---|---|---|---|---|---|
| 🥇 | DiffusionCUP + gradient | 0.800 | 0.868 (38.8 dB / 0.985) | 0.779 (33.19 dB / 0.954) | 0.754 (31.64 dB / 0.939) | ✓ Certified | Qiao et al., Nat. Photonics 2020 (updated 2024) |
| 🥈 | DAUHST-CUP + gradient | 0.799 | 0.828 (35.91 dB / 0.973) | 0.806 (35.44 dB / 0.970) | 0.764 (32.77 dB / 0.950) | ✓ Certified | Cai et al., NeurIPS 2022 (CUP) |
| 🥉 | STFormer-CUP + gradient | 0.792 | 0.841 (36.15 dB / 0.974) | 0.792 (33.34 dB / 0.956) | 0.742 (30.6 dB / 0.926) | ✓ Certified | Wang et al., CVPR 2022 (CUP) |
| 4 | PnP-FastDVDnet + gradient | 0.742 | 0.787 (32.57 dB / 0.949) | 0.748 (30.71 dB / 0.927) | 0.690 (27.79 dB / 0.876) | ✓ Certified | Tassano et al., CVPR 2020 (CUP) |
| 5 | DeSCI-CUP + gradient | 0.702 | 0.751 (30.02 dB / 0.917) | 0.705 (28.33 dB / 0.888) | 0.650 (25.5 dB / 0.818) | ✓ Certified | Liu et al., IEEE TPAMI 2018 (CUP adapt.) |
| 6 | E2E-CNN-CUP + gradient | 0.643 | 0.767 (31.72 dB / 0.940) | 0.635 (24.89 dB / 0.799) | 0.526 (20.55 dB / 0.625) | ✓ Certified | Liang et al., CVPR 2019 |
| 7 | GAP-TV + gradient | 0.517 | 0.673 (26.2 dB / 0.838) | 0.483 (19.51 dB / 0.575) | 0.395 (16.09 dB / 0.406) | ✓ Certified | Yuan, ICSIP 2016 |
| 8 | TV-CUP + gradient | 0.482 | 0.577 (22.26 dB / 0.701) | 0.447 (18.17 dB / 0.508) | 0.422 (17.34 dB / 0.467) | ✓ Certified | Gao et al., Nature 2014 |
| 9 | TwIST-CUP + gradient | 0.471 | 0.632 (24.19 dB / 0.775) | 0.425 (17.43 dB / 0.472) | 0.357 (15.19 dB / 0.363) | ✓ Certified | Bioucas-Dias & Figueiredo, IEEE TIP 2007 (CUP) |
Complete score requires all 3 tiers (Public + Dev + Hidden).
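The blind composite metric can be sketched the same way. The PSNR normalization is assumed here to match the standard benchmark's clip((PSNR − 15)/30, 0, 1), and `y_pred` stands for Ĥx̂:

```python
import numpy as np

def blind_score(psnr_db, ssim, y, y_pred):
    """Composite: 0.4 * PSNR_norm + 0.4 * SSIM + 0.2 * (1 - ||y - H x_hat|| / ||y||)."""
    psnr_norm = np.clip((psnr_db - 15.0) / 30.0, 0.0, 1.0)  # normalization assumed
    consistency = 1.0 - np.linalg.norm(y - y_pred) / np.linalg.norm(y)
    return 0.4 * psnr_norm + 0.4 * ssim + 0.2 * consistency
```

Note the 0.2 consistency term rewards agreement with the measurements under the corrected operator, so a reconstruction can score well on it even when ground truth is hidden.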
Public tier — full-access development tier with all data visible.
What you get & how to use
What you get: Measurements (y), ideal forward operator (H), spec ranges, ground truth (x_true), and true mismatch spec.
How to use: Load HDF5 → compare reconstruction vs x_true → check consistency → iterate.
What to submit: Reconstructed signals (x_hat) and corrected spec as HDF5.
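The compare-and-check step above can be sketched with plain NumPy, assuming `y`, `H`, and `x_true` have already been loaded from the HDF5 (e.g. with h5py; the actual dataset key names are not specified here):

```python
import numpy as np

def evaluate(x_hat, x_true, y, H):
    """PSNR against ground truth and relative measurement residual."""
    mse = np.mean((x_hat - x_true) ** 2)
    psnr = 10 * np.log10(x_true.max() ** 2 / mse)
    residual = np.linalg.norm(y - H @ x_hat) / np.linalg.norm(y)
    return psnr, residual

# Toy check: a near-perfect reconstruction gives high PSNR and tiny residual.
rng = np.random.default_rng(0)
H = rng.standard_normal((8, 16))
x_true = rng.random(16)
y = H @ x_true
psnr, residual = evaluate(x_true + 1e-6, x_true, y, H)
```

On the Public tier you can iterate until both numbers stop improving; on later tiers only the residual remains available as a self-check.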
Public Leaderboard
| # | Method | Score | PSNR (dB) | SSIM |
|---|---|---|---|---|
| 1 | DiffusionCUP + gradient | 0.868 | 38.8 | 0.985 |
| 2 | STFormer-CUP + gradient | 0.841 | 36.15 | 0.974 |
| 3 | DAUHST-CUP + gradient | 0.828 | 35.91 | 0.973 |
| 4 | PnP-FastDVDnet + gradient | 0.787 | 32.57 | 0.949 |
| 5 | E2E-CNN-CUP + gradient | 0.767 | 31.72 | 0.94 |
| 6 | DeSCI-CUP + gradient | 0.751 | 30.02 | 0.917 |
| 7 | GAP-TV + gradient | 0.673 | 26.2 | 0.838 |
| 8 | TwIST-CUP + gradient | 0.632 | 24.19 | 0.775 |
| 9 | TV-CUP + gradient | 0.577 | 22.26 | 0.701 |
Spec Ranges (3 parameters)
| Parameter | Min | Max | Unit |
|---|---|---|---|
| dmd_encoding_error | -0.4 | 0.8 | - |
| streak_sweep_calibration | -1.0 | 2.0 | - |
| temporal_spatial_coupling | -2.0 | 4.0 | - |
Blind evaluation tier — no ground truth available.
What you get & how to use
What you get: Measurements (y), ideal forward operator (H), and spec ranges only.
How to use: Apply your pipeline from the Public tier. Use consistency as self-check.
What to submit: Reconstructed signals and corrected spec. Scored server-side.
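One way to calibrate from data, as the "+ gradient" variants do, is to fit the mismatch parameters within the published spec ranges by minimizing the measurement residual. A SciPy sketch, where `forward` stands for whatever parametric model H(θ) your pipeline uses (an assumption, not the benchmark's API):

```python
import numpy as np
from scipy.optimize import minimize

def correct_spec(y, forward, x_hat, bounds):
    """Minimize ||y - H(theta) x_hat||^2 over theta, constrained to the spec ranges."""
    theta0 = np.array([(lo + hi) / 2 for lo, hi in bounds])  # start mid-range
    obj = lambda theta: np.sum((y - forward(theta, x_hat)) ** 2)  # squared for smoothness
    res = minimize(obj, theta0, bounds=bounds, method="L-BFGS-B")
    return res.x
```

Alternating this correction with re-reconstruction of `x_hat` is the usual loop; the bounds keep the solution inside the advertised ranges.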
Dev Leaderboard
| # | Method | Score | PSNR (dB) | SSIM |
|---|---|---|---|---|
| 1 | DAUHST-CUP + gradient | 0.806 | 35.44 | 0.97 |
| 2 | STFormer-CUP + gradient | 0.792 | 33.34 | 0.956 |
| 3 | DiffusionCUP + gradient | 0.779 | 33.19 | 0.954 |
| 4 | PnP-FastDVDnet + gradient | 0.748 | 30.71 | 0.927 |
| 5 | DeSCI-CUP + gradient | 0.705 | 28.33 | 0.888 |
| 6 | E2E-CNN-CUP + gradient | 0.635 | 24.89 | 0.799 |
| 7 | GAP-TV + gradient | 0.483 | 19.51 | 0.575 |
| 8 | TV-CUP + gradient | 0.447 | 18.17 | 0.508 |
| 9 | TwIST-CUP + gradient | 0.425 | 17.43 | 0.472 |
Spec Ranges (3 parameters)
| Parameter | Min | Max | Unit |
|---|---|---|---|
| dmd_encoding_error | -0.48 | 0.72 | - |
| streak_sweep_calibration | -1.2 | 1.8 | - |
| temporal_spatial_coupling | -2.4 | 3.6 | - |
Fully blind server-side evaluation — no data download.
What you get & how to use
What you get: No data downloadable. Algorithm runs server-side on hidden measurements.
How to use: Package algorithm as Docker container / Python script. Submit via link.
What to submit: Containerized algorithm accepting y + H, outputting x_hat + corrected spec.
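The submission contract can be illustrated as a single Python entrypoint. The actual harness interface (names, formats, container layout) is set by the organizers; the ridge-regularized least-squares reconstruction here is only a placeholder:

```python
import numpy as np

def run(y: np.ndarray, H: np.ndarray) -> dict:
    """Hypothetical entrypoint: accept y + H, return x_hat + corrected spec."""
    # Placeholder reconstruction: ridge-regularized normal equations.
    lam = 1e-3
    x_hat = np.linalg.solve(H.T @ H + lam * np.eye(H.shape[1]), H.T @ y)
    # Corrected spec for the three mismatch parameters (zeros = no correction).
    corrected_spec = {
        "dmd_encoding_error": 0.0,
        "streak_sweep_calibration": 0.0,
        "temporal_spatial_coupling": 0.0,
    }
    return {"x_hat": x_hat, "corrected_spec": corrected_spec}
```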
Hidden Leaderboard
| # | Method | Score | PSNR (dB) | SSIM |
|---|---|---|---|---|
| 1 | DAUHST-CUP + gradient | 0.764 | 32.77 | 0.95 |
| 2 | DiffusionCUP + gradient | 0.754 | 31.64 | 0.939 |
| 3 | STFormer-CUP + gradient | 0.742 | 30.6 | 0.926 |
| 4 | PnP-FastDVDnet + gradient | 0.690 | 27.79 | 0.876 |
| 5 | DeSCI-CUP + gradient | 0.650 | 25.5 | 0.818 |
| 6 | E2E-CNN-CUP + gradient | 0.526 | 20.55 | 0.625 |
| 7 | TV-CUP + gradient | 0.422 | 17.34 | 0.467 |
| 8 | GAP-TV + gradient | 0.395 | 16.09 | 0.406 |
| 9 | TwIST-CUP + gradient | 0.357 | 15.19 | 0.363 |
Spec Ranges (3 parameters)
| Parameter | Min | Max | Unit |
|---|---|---|---|
| dmd_encoding_error | -0.28 | 0.92 | - |
| streak_sweep_calibration | -0.7 | 2.3 | - |
| temporal_spatial_coupling | -1.4 | 4.6 | - |
Blind Reconstruction Challenge
Given measurements with unknown mismatch and spec ranges (not exact parameters), reconstruct the original signal. A method must be evaluated on all three tiers for a complete score. Scored on a composite metric: 0.4 × PSNR_norm + 0.4 × SSIM + 0.2 × (1 − ‖y − Ĥx̂‖/‖y‖).
Input: measurements y, ideal forward model H, spec ranges.
Output: reconstructed signal x̂.
Spec DAG — Forward Model Pipeline
M → Σ → D
Mismatch Parameters
| Symbol | Parameter | Description | Nominal | Perturbed |
|---|---|---|---|---|
| d_e | dmd_encoding_error | DMD encoding error (-) | 0.0 | 0.4 |
| s_s | streak_sweep_calibration | Streak sweep calibration (-) | 0.0 | 1.0 |
| t_c | temporal_spatial_coupling | Temporal-spatial coupling (-) | 0.0 | 2.0 |
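How these parameters enter the M → Σ → D pipeline can be illustrated with a toy mask–shear–integrate forward model (DMD mask, streak-camera temporal shear, detector integration). The exact way each parameter perturbs the operator is an assumption here, not the benchmark's definition:

```python
import numpy as np

def cup_forward(x, mask, d_e=0.0, s_s=0.0):
    """Toy CUP forward model. x: (T, H, W) scene; mask: (H, W) DMD pattern."""
    T, Hh, W = x.shape
    m = np.clip(mask + d_e, 0.0, 1.0)        # d_e perturbs the encoding pattern
    y = np.zeros((Hh + T - 1, W))            # detector is tall enough for the shear
    for t in range(T):
        shift = int(round(t * (1.0 + s_s)))  # s_s rescales the streak sweep
        shift = min(max(shift, 0), T - 1)    # keep the sheared frame on the detector
        y[shift:shift + Hh, :] += m * x[t]   # shear, then integrate in time
    return y
```

Temporal-spatial coupling (t_c) would add a cross-term between the sweep and the spatial coordinates; it is omitted from this sketch for brevity.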
Spec Primitives Reference (11 primitives)
- Free-space or medium propagation kernel (Fresnel, Rayleigh-Sommerfeld).
- Spatial or spatio-temporal amplitude modulation (coded aperture, SLM pattern).
- Geometric projection operator (Radon transform, fan-beam, cone-beam).
- Sampling in the Fourier / k-space domain (MRI, ptychography).
- Shift-invariant convolution with a point-spread function (PSF).
- Summation along a physical dimension (spectral, temporal, angular).
- Sensor readout with gain g and noise model η (Gaussian, Poisson, mixed).
- Patterned illumination (block, Hadamard, random) applied to the scene.
- Spectral dispersion element (prism, grating) with shift α and aperture a.
- Sample or gantry rotation (CT, electron tomography).
- Spectral filter or monochromator selecting a wavelength band.
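These primitives compose into a forward model as a chain of operators. A toy sketch of that composition pattern (the names are illustrative, not the benchmark's actual primitive API):

```python
import numpy as np

def modulation(mask):        # spatial amplitude modulation primitive
    return lambda x: mask * x

def integration(axis=0):     # summation along a physical dimension
    return lambda x: x.sum(axis=axis)

def readout(gain=1.0):       # sensor readout with gain (noise omitted)
    return lambda x: gain * x

def compose(*ops):
    """Chain primitives left to right into a single forward operator H."""
    def H(x):
        for op in ops:
            x = op(x)
        return x
    return H

# Mask a (T, H, W) scene, integrate over time, apply detector gain.
H = compose(modulation(np.ones((2, 4, 4)) * 0.5), integration(axis=0), readout(2.0))
```

Representing the spec DAG as such a chain makes it easy to swap in perturbed primitives when fitting a mismatch.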