Dual Band Thermal Videography:
Separating Time-Varying Reflection and Emission Near Ambient Conditions

Sriram Narayanan · Mani Ramanagopal · Srinivasa Narasimhan

CVPR 2026

Abstract

Long-wave infrared radiation captured by a thermal camera includes (a) emission from an object governed by its temperature and emissivity, and (b) reflected radiation from the surrounding environment. Separating these components is a long-standing challenge in thermography. Even when using multiple bands, the problem is under-determined without priors on emissivity. This difficulty is amplified in near-ambient conditions, where emitted and reflected signals are of comparable magnitude. We present a dual-band video thermography framework that reduces this ambiguity by combining two complementary ideas at a per-pixel level: (i) spectral cues (the ratio of emissivity between bands is unknown but fixed), and (ii) temporal cues (object radiation changes smoothly while background radiation changes rapidly). We derive an image formation model and an algorithm to jointly estimate the object's emissivity at each band and the time-varying object and background temperatures. Experiments with calibrated and uncalibrated emissivities in everyday scenes (e.g., a coffee pot heating up, palm prints on mirrors, reflections of moving people) demonstrate robust separation and recovery of temperature fields.
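The per-pixel image formation described above — measured band radiance as an emissivity-weighted mix of object emission and reflected background radiation — can be sketched numerically. The toy code below is an illustration, not the paper's algorithm: it assumes the calibrated case (known per-band emissivities) and models the background as a blackbody at a single temperature, so each frame's two band measurements suffice to recover the two unknown temperatures by a brute-force grid search. The actual method additionally exploits temporal cues and handles unknown emissivities; all function names here are illustrative.

```python
import numpy as np

# Physical constants (SI units)
H = 6.62607015e-34   # Planck constant [J s]
C = 2.99792458e8     # speed of light [m/s]
KB = 1.380649e-23    # Boltzmann constant [J/K]

def planck(lam, T):
    """Blackbody spectral radiance at wavelength lam [m] and temperature T [K]."""
    return (2.0 * H * C**2 / lam**5) / np.expm1(H * C / (lam * KB * T))

def band_radiance(T, lam_lo, lam_hi, n=200):
    """Blackbody radiance integrated over a wavelength band (trapezoid rule)."""
    lam = np.linspace(lam_lo, lam_hi, n)
    vals = planck(lam, T)
    return float(np.sum(0.5 * (vals[:-1] + vals[1:]) * np.diff(lam)))

# The two bands used on this page: broadband 8-14 um and narrowband 9.25-9.75 um.
BANDS = [(8e-6, 14e-6), (9.25e-6, 9.75e-6)]

def measure(T_obj, T_bg, eps):
    """Per-pixel dual-band measurement: emission plus reflected background.
    I_b = eps_b * B_b(T_obj) + (1 - eps_b) * B_b(T_bg)."""
    return np.array([
        e * band_radiance(T_obj, lo, hi) + (1.0 - e) * band_radiance(T_bg, lo, hi)
        for e, (lo, hi) in zip(eps, BANDS)
    ])

def invert(meas, eps, T_grid):
    """Grid search for the (T_obj, T_bg) pair best explaining both bands."""
    # Precompute band-integrated radiance on the candidate temperature grid.
    B = np.array([[band_radiance(T, lo, hi) for T in T_grid] for lo, hi in BANDS])
    resid = 0.0
    for b, e in enumerate(eps):
        # pred[i, j] = measurement if T_obj = T_grid[i] and T_bg = T_grid[j]
        pred = e * B[b][:, None] + (1.0 - e) * B[b][None, :]
        resid = resid + (pred - meas[b]) ** 2
    i, j = np.unravel_index(np.argmin(resid), resid.shape)
    return T_grid[i], T_grid[j]
```

With distinct emissivities in the two bands (e.g., 0.9 and 0.8), the two mixing equations are independent enough that `invert` recovers both a warm object temperature and a cooler background temperature from a single synthetic dual-band measurement; with equal emissivities or emitted and reflected signals of comparable magnitude, the problem becomes ill-conditioned, which is exactly the near-ambient ambiguity the paper's temporal cues address.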

Video

Results

Select a scene thumbnail to view the corresponding decomposition videos and explanation.

Scene Thumbnails

Coffee-Pot

2.5x

A borosilicate coffee pot containing hot liquid shows heat propagation on its surface, while a person moves in the background, waving their hands and uncovering a heat source.

Thermal Band Videos (Inputs)

8-14 microns

9.25-9.75 microns

Calibrated Thermography

Object Temperature

Background Reflections

Uncalibrated Thermography

Object Temperature

Background Reflections

Iso-Contour Lines of Temperature

Calibrated Approach

Uncalibrated Approach

Incandescent Bulb

2.5x

An example that showcases our method operating in an extremely low signal-to-noise-ratio regime. A person transfers heat to an incandescent bulb through hand contact, leaving a thermal fingerprint that gradually dissipates. Our uncalibrated method successfully separates the emission due to the palm print from the reflection of a person walking around and sipping a hot beverage.

Thermal Band Videos (Inputs)

8-14 microns

9.25-9.75 microns

Calibrated Thermography

Object Temperature

Background Reflections

Uncalibrated Thermography

Object Temperature

Background Reflections

Iso-Contour Lines of Temperature

Calibrated Approach

Uncalibrated Approach

Glass Plate

2.5x

In this example, a hot air gun is placed behind a glass plate to raise its temperature. The background reflection captures hand movements, light from a lighter, and hand-inscribed "CV" text on a nearby board. These generate faint thermal patterns on the glass, successfully separated by our method.

Thermal Band Videos (Inputs)

8-14 microns

9.25-9.75 microns

Calibrated Thermography

Object Temperature

Background Reflections

Uncalibrated Thermography

Object Temperature

Background Reflections

Iso-Contour Lines of Temperature

Calibrated Approach

Uncalibrated Approach

Wineglass

2.5x

Hot liquid is poured into a wineglass, with heat gradually spreading across its surface. In the background, a person moves and waves their hands.

Thermal Band Videos (Inputs)

8-14 microns

9.25-9.75 microns

Calibrated Thermography

Object Temperature

Background Reflections

Uncalibrated Thermography

Object Temperature

Background Reflections

Iso-Contour Lines of Temperature

Calibrated Approach

Uncalibrated Approach

BibTeX

@inproceedings{narayanan2026dual,
  title     = {Dual Band Thermal Videography: Separating Time-Varying Reflection and Emission Near Ambient Conditions},
  author    = {Narayanan, Sriram and Ramanagopal, Mani and Narasimhan, Srinivasa},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year      = {2026}
}

Acknowledgements

This work was partly supported by NSF grant IIS-2107236 and the NSF-NIFA AI Institute for Resilient Agriculture.