ON-THE-FLY FEEDBACK SfM

On-the-fly Feedback SfM: Online Explore-and-Exploit UAV Photogrammetry with Incremental Mesh Quality-Aware Indicator and Predictive Path Planning

Liyuan Lou*, Wanyun Li*, Wentian Gan*, Yifei Yu, Tengfei Wang, Xin Wang, Zongqian Zhan
School of Geodesy and Geomatics, Wuhan University
* Indicates equal contribution
Near Real-Time · Incremental Reconstruction · Quality-Aware Planning · Explore-and-Exploit
The overall workflow of our On-the-fly Feedback SfM: each incoming UAV image batch is processed by SfM on-the-fly, followed by incremental surface reconstruction, online quality assessment, and predictive path planning.

Abstract

Compared with conventional offline UAV photogrammetry, real-time UAV photogrammetry is essential for time-critical geospatial applications such as disaster response and active digital-twin maintenance. Most existing online methods process captured images or sequential frames in real time, but do not explicitly evaluate the quality of the on-the-go 3D reconstruction or provide guided feedback for improving image acquisition. This work presents On-the-fly Feedback SfM, an explore-and-exploit framework for real-time UAV photogrammetry.

Built upon SfM on-the-fly, our pipeline integrates: (1) online incremental coarse-mesh generation from dynamically expanding sparse points; (2) online mesh quality assessment with actionable indicators; and (3) predictive path planning for on-the-fly trajectory refinement. Comprehensive experiments demonstrate in-situ reconstruction and evaluation in near real time while providing actionable feedback that reduces coverage gaps and re-flight costs.

Key Contributions

The project moves UAV photogrammetry from a passive, process-after-capture paradigm to an active, quality-aware data-collection process.

Online Explore-and-Exploit Framework

Image acquisition, incremental reconstruction, quality assessment, and path planning are coupled into a continuously updated workflow.

Incremental Mesh Quality Indicator

Dynamic graph cuts generate an evolving mesh, and per-face GSD, redundancy, and reprojection error are fused into an ensemble quality score.

Predictive Path Planning

Low-quality regions drive DBSCAN grouping, multi-constraint viewpoint generation, sparsification, and altitude-aware trajectory optimization.

Method

The framework synchronizes acquisition, evaluation, and planning at short temporal intervals. Each incoming batch of 5-20 images triggers a complete feedback cycle: an online SfM update, followed by incremental surface reconstruction, quality assessment, and trajectory optimization.
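The per-batch cycle can be sketched as a simple loop. This is a minimal illustration, not the authors' code: all four step functions are simplified placeholders standing in for SfM on-the-fly, incremental meshing, quality scoring, and planning.

```python
# Minimal sketch of the per-batch feedback cycle. The four step
# functions are illustrative placeholders, not the authors' API.

def update_sfm(state, batch):
    state["images"] += len(batch)          # register new poses and points
    return state

def update_mesh(state):
    return {"faces": state["images"] * 2}  # stand-in for the graph-cut mesh

def assess_quality(mesh):
    return [0.5] * mesh["faces"]           # stand-in per-face quality scores

def plan_trajectory(state, scores):
    state["waypoints"] = sum(s < 0.6 for s in scores)  # target weak faces
    return state

def feedback_cycle(batches):
    """Run one complete feedback cycle per incoming image batch."""
    state = {"images": 0, "waypoints": 0}
    for batch in batches:                  # each batch holds 5-20 images
        state = update_sfm(state, batch)   # 1. SfM on-the-fly update
        mesh = update_mesh(state)          # 2. incremental surface mesh
        scores = assess_quality(mesh)      # 3. online quality assessment
        state = plan_trajectory(state, scores)  # 4. predictive planning
    return state
```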

SfM on-the-fly Update

Each incoming image batch is processed by SfM on-the-fly to update camera poses and the evolving sparse point cloud without interrupting UAV motion.

Online Incremental Surface Reconstruction

The evolving sparse cloud is converted into a triangular surface mesh through dynamic Delaunay updates, ray-based energy accumulation, and dynamic graph-cut optimization.
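The dynamic Delaunay part of this step can be illustrated with SciPy's incremental triangulation, which inserts new batch points without rebuilding the structure. This is an illustrative stand-in only; the ray-based energy accumulation and dynamic graph-cut surface extraction are not reproduced here.

```python
# Sketch of the dynamic Delaunay step: as the sparse cloud grows, new
# points are inserted into an incremental 3-D triangulation instead of
# retriangulating from scratch.
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(0)
initial = rng.random((50, 3))              # first sparse-point batch
tri = Delaunay(initial, incremental=True)  # keep triangulation updatable

new_points = rng.random((20, 3))           # points from the next batch
tri.add_points(new_points)                 # incremental insertion
tri.close()                                # finalize when done

tetrahedra = tri.simplices                 # candidate cells for graph cut
print(tri.points.shape[0], tetrahedra.shape[1])  # 70 vertices, 4 ids/cell
```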

Online Mesh Quality Assessment

Ground sampling distance, observation redundancy, and reprojection error are computed per triangle face and fused into an ensemble quality score.
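A fusion of this kind can be sketched as a weighted average of normalized per-face metrics. The equal weights and min-max normalization below are illustrative assumptions; the paper's exact fusion rule may differ.

```python
# Sketch of fusing three per-face metrics into one ensemble score.
# Weights and normalization are assumptions, not the paper's formula.
import numpy as np

def ensemble_score(gsd, redundancy, reproj_err, weights=(1/3, 1/3, 1/3)):
    """Per-face quality in [0, 1]; higher is better."""
    def norm(x, invert=False):
        x = np.asarray(x, dtype=float)
        span = x.max() - x.min()
        x = (x - x.min()) / span if span > 0 else np.zeros_like(x)
        return 1.0 - x if invert else x

    q_gsd = norm(gsd, invert=True)         # smaller GSD = finer detail
    q_red = norm(redundancy)               # more observations = better
    q_err = norm(reproj_err, invert=True)  # lower error = better
    w = np.asarray(weights)
    return w[0] * q_gsd + w[1] * q_red + w[2] * q_err

# Three hypothetical faces: well-observed, poorly observed, in between.
scores = ensemble_score(gsd=[0.02, 0.10, 0.05],
                        redundancy=[12, 3, 8],
                        reproj_err=[0.4, 1.6, 0.9])
```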

Mesh-Quality-Aware Predictive Path Planning

Detected low-quality regions guide multi-constraint viewpoint generation, viewpoint sparsification, and lightweight trajectory optimization to produce an executable adaptive flight segment.
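The grouping step alone can be sketched with scikit-learn's DBSCAN: centroids of low-quality faces are clustered, and each cluster yields one candidate viewpoint above its center. The quality threshold, `eps`, `min_samples`, and the fixed standoff height are illustrative assumptions, and the multi-constraint generation, sparsification, and trajectory optimization are omitted.

```python
# Sketch of DBSCAN grouping of low-quality face centroids into
# candidate viewpoints. All parameter values are assumptions.
import numpy as np
from sklearn.cluster import DBSCAN

def candidate_viewpoints(centroids, scores, thresh=0.6,
                         eps=2.0, min_samples=3, standoff=15.0):
    """One nadir-style viewpoint per cluster of low-quality faces."""
    weak = centroids[scores < thresh]      # keep only low-quality faces
    if len(weak) == 0:
        return np.empty((0, 3))
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(weak)
    views = []
    for lab in set(labels) - {-1}:         # -1 marks DBSCAN noise points
        center = weak[labels == lab].mean(axis=0)
        views.append(center + np.array([0.0, 0.0, standoff]))  # fly above
    if not views:
        return np.empty((0, 3))
    return np.array(views)
```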

Experiments and Results

Experiments evaluate four aspects of the framework: incremental surface reconstruction, online mesh-quality assessment, adaptive path planning, and the full explore-and-exploit feedback pipeline.

Surface Reconstruction

The SHHY, GYM, and YS datasets are used to evaluate mesh quality and online reconstruction efficiency against COLMAP and OpenMVG.

Quality Assessment

The PHANTOM and US3D datasets are used to validate whether the ensemble quality score tracks global and local reconstruction evolution.

Adaptive Planning

The US3D and XingHu datasets test whether quality feedback reallocates viewpoints and improves acquisition in realistic online loops.

Datasets

Dataset  | Images | Platform        | Resolution  | Source
SHHY     | 770    | DJI Mavic 2 Pro | 1920 x 1080 | Self-captured
PHANTOM  | 467    | DJI Mavic 2 Pro | 1920 x 1080 | Bu et al., 2016
US3D     | 990    | -               | 5472 x 3648 | Lin et al., 2022
GYM      | 580    | DJI Matrice 4T  | 4032 x 3024 | Self-captured
YS       | 320    | DJI Matrice 4T  | 4032 x 3024 | Self-captured
XingHu   | -      | DJI Matrice 4T  | 4032 x 3024 | Self-captured
Sample images from the evaluated UAV datasets.

Surface Reconstruction Quality

Dataset | Method  | Accuracy | Completeness | F1 Score
SHHY    | COLMAP  | 0.4970   | 0.4771       | 0.4868
SHHY    | OpenMVG | 0.7270   | 0.3633       | 0.4845
SHHY    | Ours    | 0.6509   | 0.4527       | 0.5340
GYM     | COLMAP  | 0.7793   | 0.6092       | 0.6838
GYM     | OpenMVG | 0.7363   | 0.4937       | 0.5911
GYM     | Ours    | 0.8001   | 0.5958       | 0.6830
YS      | COLMAP  | 0.7042   | 0.5755       | 0.6334
YS      | OpenMVG | 0.7054   | 0.4672       | 0.5621
YS      | Ours    | 0.7271   | 0.5533       | 0.6284

Bold indicates the best performance within each dataset group.

Online Quality Assessment

The ensemble mesh-quality indicator tracks both global reconstruction evolution and localized geometric consistency as new images are integrated.

Online quality assessment during incremental image acquisition.
Localized consistency verification of the quality indicator within a region of interest.

Online Explore-and-Exploit Pipeline

Stage       | Accuracy | Completeness | F1 Score | Avg. Time / Image | Trajectory Generation
Iteration 1 | 0.6764   | 0.4143       | 0.5138   | 1.465 s           | 602.311 ms
Iteration 2 | 0.8333   | 0.5232       | 0.6428   | 1.470 s           | 183.801 ms
Iteration 3 | 0.8141   | 0.4830       | 0.6063   | 1.415 s           | 497.513 ms
Final       | 0.8291   | 0.5099       | 0.6315   | 1.475 s           | -

The XingHu experiment keeps image processing around 1.4-1.5 seconds per image, while online trajectory generation stays below one second.

Visualization of the explore-and-exploit online feedback pipeline on the XingHu dataset.

BibTeX

@misc{lou2025ontheflyfeedbacksfm,
  title={On-the-fly Feedback SfM: Online Explore-and-Exploit UAV Photogrammetry with Incremental Mesh Quality-Aware Indicator and Predictive Path Planning},
  author={Liyuan Lou and Wanyun Li and Wentian Gan and Yifei Yu and Tengfei Wang and Xin Wang and Zongqian Zhan},
  year={2025},
  eprint={2512.02375},
  archivePrefix={arXiv},
  primaryClass={cs.CV},
  doi={10.48550/arXiv.2512.02375},
  url={https://arxiv.org/abs/2512.02375}
}

Contact

School of Geodesy and Geomatics, Wuhan University, Wuhan 430079, China

Corresponding authors: xwang@sgg.whu.edu.cn, zqzhan@sgg.whu.edu.cn