MonoPrior-Fusion: Monocular-Prior-Guided Multi-Frame Depth Estimation with Multi-Scale Geometric Fusion
Journal Article, 2026
by Lin, Zhiwei; Sun, Bohan; Zhang, Zhan; Qian, Linrui; Yi, Nianyu
in Architecture / Augmented Reality / Cameras / geometric consistency / Geometry / Hypotheses / monocular depth prior / multi-frame depth estimation / multi-scale fusion / multi-view stereo (MVS) / Paradigms / Robotics / Supervision / Taxonomy
Overview
Precise 3D perception is critical for indoor robotics, augmented reality, and autonomous navigation. However, existing multi-frame depth estimation methods often suffer from significant performance degradation in challenging indoor scenarios characterized by weak textures, non-Lambertian surfaces, and complex layouts. To address these limitations, we propose MonoPrior-Fusion (MPF), a novel framework that integrates pixel-wise monocular priors directly into the multi-view matching process. Specifically, MPF modulates cost-volume hypotheses to disambiguate matches and employs a hierarchical fusion architecture across multiple scales to propagate global and local geometric information. Additionally, a geometric consistency loss based on virtual planes is introduced to enhance global 3D coherence. Extensive experiments on ScanNetV2, 7Scenes, TUM RGB-D, and GMU Kitchens demonstrate that MPF achieves significant improvements over state-of-the-art multi-frame baselines and generalizes well across unseen domains. Furthermore, MPF yields more accurate and complete 3D reconstructions when integrated into a volumetric fusion pipeline, proving its effectiveness for dense mapping tasks. The source code will be made publicly available to support reproducibility and future research.
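
The central idea in the abstract, using a pixel-wise monocular prior to modulate cost-volume hypotheses, can be pictured with a short sketch. The following PyTorch snippet is a minimal illustration, not the authors' released implementation: the tensor shapes, the Gaussian weighting scheme, and the bandwidth parameter `sigma` are all assumptions introduced here to show how a monocular depth estimate could suppress matching hypotheses that disagree with it.

```python
import torch

def modulate_cost_volume(cost_volume, depth_hypotheses, mono_prior, sigma=0.2):
    """Down-weight cost-volume hypotheses that disagree with a monocular prior.

    Illustrative sketch only (not the paper's implementation). Assumes:
      cost_volume:      (B, D, H, W) matching scores, higher = better match
      depth_hypotheses: (D,) candidate depths shared by all pixels
      mono_prior:       (B, 1, H, W) pixel-wise monocular depth estimate
      sigma:            assumed relative bandwidth of the prior's trust region
    """
    # Relative disagreement between each hypothesis and the prior, per pixel.
    hyp = depth_hypotheses.view(1, -1, 1, 1)              # (1, D, 1, 1)
    rel_err = (hyp - mono_prior) / mono_prior.clamp(min=1e-6)
    # Gaussian weight: hypotheses near the prior keep their score,
    # distant (likely ambiguous) ones are suppressed.
    weight = torch.exp(-0.5 * (rel_err / sigma) ** 2)     # (B, D, H, W)
    return cost_volume * weight

# Toy usage with random tensors.
B, D, H, W = 1, 32, 48, 64
cost = torch.rand(B, D, H, W)
hyps = torch.linspace(0.5, 5.0, D)        # candidate depths in metres
prior = torch.full((B, 1, H, W), 2.0)     # stand-in monocular prediction
modulated = modulate_cost_volume(cost, hyps, prior)
depth = (hyps.view(1, -1, 1, 1) * modulated.softmax(dim=1)).sum(dim=1)
print(depth.shape)  # torch.Size([1, 48, 64])
```

The final soft-regression step (a softmax over the depth dimension followed by a weighted sum of hypotheses) is a common way to read a depth map out of a cost volume; whether MPF uses this readout, and how its multi-scale fusion and virtual-plane consistency loss are realised, is not specified in the abstract.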
Publisher
MDPI AG (Multidisciplinary Digital Publishing Institute)