Bayesian Gabor Network with Uncertainty Estimation for Pedestrian Lane Detection in Assistive Navigation

Research output: Contribution to journal › Article › peer-review

17 Citations (Scopus)

Abstract

Automatic pedestrian lane detection is a challenging problem of great interest in assistive navigation and autonomous driving. Such a detection system must cope well with variations in lane surfaces and illumination conditions so that a vision-impaired user can navigate safely in unknown environments. This paper proposes a new lightweight Bayesian Gabor Network (BGN) for camera-based detection of pedestrian lanes in unstructured scenes. In our approach, each Gabor parameter is represented as a learnable Gaussian distribution using variational Bayesian inference. For the safety of vision-impaired users, in addition to an output segmentation map, the network provides two full-resolution maps of aleatoric uncertainty and epistemic uncertainty as well-calibrated confidence measures. Our Gabor-based method has fewer weights than standard CNNs and is therefore less prone to overfitting and cheaper to compute. Compared to state-of-the-art semantic segmentation methods, the BGN maintains competitive segmentation performance while achieving a significantly more compact model size (from 1.8x to 237.6x reduction), a faster prediction time (from 1.2x to 67.5x faster), and a well-calibrated uncertainty measure. We also introduce a new lane dataset of 10,000 images for objective evaluation in pedestrian lane detection research.
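The abstract's core idea, placing a learnable Gaussian distribution over each Gabor filter parameter and estimating epistemic uncertainty by Monte Carlo sampling of those parameters, can be sketched as follows. This is a minimal illustration under assumed parameter names and values, not the paper's actual BGN architecture; `gabor_kernel`, `sample_params`, and the toy patch are all hypothetical:

```python
import numpy as np

def gabor_kernel(size, wavelength, orientation, sigma, phase):
    # Standard real-valued 2-D Gabor filter: a Gaussian envelope
    # modulating an oriented cosine carrier.
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(orientation) + y * np.sin(orientation)
    return (np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
            * np.cos(2.0 * np.pi * xr / wavelength + phase))

def sample_params(mu, rho, rng):
    # Reparameterization trick: theta = mu + softplus(rho) * eps,
    # so the variational posterior stays differentiable in (mu, rho).
    eps = rng.standard_normal(mu.shape)
    return mu + np.log1p(np.exp(rho)) * eps

# Variational posterior over (wavelength, orientation, sigma, phase):
# means and pre-softplus scales (hypothetical initial values).
mu = np.array([4.0, 0.0, 2.0, 0.0])
rho = np.array([-3.0, -3.0, -3.0, -3.0])  # small initial std devs

rng = np.random.default_rng(0)
patch = rng.standard_normal((9, 9))  # toy image patch

# Monte Carlo over the parameter posterior: each draw yields a
# slightly different Gabor filter and hence a different response.
responses = []
for _ in range(50):
    w, o, s, p = sample_params(mu, rho, rng)
    k = gabor_kernel(9, w, o, s, p)
    responses.append(np.sum(k * patch))
responses = np.array(responses)

pred_mean = responses.mean()  # predictive mean response
epistemic = responses.var()   # spread across parameter samples
                              # (a proxy for epistemic uncertainty)
```

In the paper, this per-filter sampling is embedded in a full segmentation network, and aleatoric uncertainty is estimated separately from the predictive distribution; the sketch above only shows why sampling the Gabor parameters yields an epistemic spread.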
Original language: English
Pages (from-to): 5331-5345
Number of pages: 15
Journal: IEEE Transactions on Circuits and Systems for Video Technology
Volume: 32
Issue number: 8
DOIs
Publication status: Published - 1 Aug 2022

Keywords

  • Assistive and autonomous navigation
  • Bayesian Gabor Network
  • Pedestrian lane detection
  • Uncertainty estimation
  • Variational inference

