Physically Inspired Gaussian Splatting for HDR Novel View Synthesis


CVPR 2026


Huimin Zeng      Yue Bai      Hailing Wang      Yun Fu
Northeastern University    

TL;DR: We present PhysHDR-GS, a physically inspired HDR novel view synthesis framework that models scene appearance via intrinsic reflectance and adjustable ambient illumination, thereby capturing illumination-dependent dynamic details.

Abstract

High dynamic range novel view synthesis (HDR-NVS) reconstructs scenes with dynamic details by fusing multi-exposure low dynamic range (LDR) views, yet it struggles to capture ambient illumination-dependent appearance. Implicitly supervising HDR content by constraining only the tone-mapped results fails to correct abnormal HDR values and yields limited gradients for Gaussians in under- and over-exposed regions. To address this, we introduce PhysHDR-GS, a physically inspired HDR-NVS framework that models scene appearance via intrinsic reflectance and adjustable ambient illumination. PhysHDR-GS employs a complementary image-exposure (IE) branch and Gaussian-illumination (GI) branch to faithfully reproduce standard camera observations and to capture illumination-dependent appearance changes, respectively. During training, the proposed cross-branch HDR consistency loss provides explicit supervision for HDR content, while an illumination-guided gradient scaling strategy mitigates exposure-biased gradient starvation and reduces under-densified representations. Experimental results across realistic and synthetic datasets demonstrate the superiority of our method in reconstructing HDR details (e.g., a PSNR gain of 2.04 dB over HDR-GS) while maintaining real-time rendering speed (up to 76 FPS).


Method

The proposed PhysHDR-GS is driven by three key components:

  • Physical Radiance Composition. The IE branch globally scales the projected HDR image by the camera exposure t; the GI branch relights the scene via a virtual illumination modulator, locally rescaling radiance intensity to avoid saturation.
  • Self-Consistent HDR Fusion. A lightweight tone mapper fuses the complementary LDR outputs of the two branches, while a cross-branch HDR consistency loss between the IE and GI branches provides explicit HDR self-supervision without HDR ground truth.
  • Illumination-Guided Gradient Scaling. Gaussians in over- and under-exposed regions receive small gradients due to the flat slope of the tone-mapping curve; illumination-guided scaling amplifies their per-Gaussian gradients to prevent under-densification.
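The three components above can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the mu-law-style tone mapper, the per-pixel illumination modulator `illum`, and the scaling function `scaled_gradient` are hypothetical stand-ins for the learned counterparts, chosen only so that the cross-branch consistency idea (inverting both branches back to HDR and comparing) can be shown end to end.

```python
import numpy as np

MU = 5000.0  # compression strength of the stand-in tone mapper

def tone_map(x):
    # Mu-law-style curve standing in for the paper's lightweight
    # learned tone mapper (hypothetical).
    return np.log1p(MU * x) / np.log1p(MU)

def ie_branch(hdr, exposure_t):
    # IE branch: globally scale HDR radiance by camera exposure t,
    # then tone-map to reproduce the standard camera observation.
    return tone_map(hdr * exposure_t)

def gi_branch(hdr, illum):
    # GI branch: locally rescale radiance with a per-pixel virtual
    # illumination modulator to avoid saturated regions.
    return tone_map(hdr * illum)

def cross_branch_consistency(ldr_ie, exposure_t, ldr_gi, illum):
    # Invert each branch back to HDR and penalize their difference:
    # explicit HDR self-supervision without HDR ground truth.
    inv = lambda y: np.expm1(y * np.log1p(MU)) / MU
    hdr_ie = inv(ldr_ie) / exposure_t
    hdr_gi = inv(ldr_gi) / illum
    return np.mean(np.abs(hdr_ie - hdr_gi))

def scaled_gradient(grad, illum, k=1.0):
    # Illumination-guided gradient scaling: amplify gradients where
    # the modulator deviates strongly from 1 (under/over-exposed
    # regions) to counter gradient starvation during densification.
    return grad * (1.0 + k * np.abs(np.log(illum)))
```

Because the stand-in tone mapper is exactly invertible, the consistency loss is zero here whenever both branches render the same underlying HDR radiance; in the actual framework the two branches are trained separately, and the loss pulls their implied HDR estimates together.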

Video Comparison

Drag the slider to compare the LDR and HDR novel views of each scene.

Bathroom
Bear
Chair
Desk
Diningroom
Dog
Sofa
Sponza

Drag to Compare

Select methods for the left and right panels. Drag the slider to compare visual results.

LDR

HDR

Qualitative Results

LDR Results

HDR Results

BibTeX

@misc{zeng2026physicallyinspiredgaussiansplatting,
      title={Physically Inspired Gaussian Splatting for HDR Novel View Synthesis}, 
      author={Huimin Zeng and Yue Bai and Hailing Wang and Yun Fu},
      year={2026},
      eprint={2603.28020},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2603.28020}, 
}