Millimeter-wave radar dataset for multi-modal fusion and keypoint detection in under-canopy soybean and corn row navigation
Mihigo, Aganze
Permalink
https://hdl.handle.net/2142/130233
Description
Title
Millimeter-wave radar dataset for multi-modal fusion and keypoint detection in under-canopy soybean and corn row navigation
Author(s)
Mihigo, Aganze
Issue Date
2025-07-25
Director of Research (if dissertation) or Advisor (if thesis)
Amato, Nancy
Committee Member(s)
Chowdhary, Girish
Department of Study
Siebel School of Computing and Data Science
Discipline
Computer Science
Degree Granting Institution
University of Illinois Urbana-Champaign
Degree Name
M.S.
Degree Level
Thesis
Keyword(s)
Robotics
Abstract
This thesis addresses the persistent challenge of autonomous navigation beneath dense crop canopies, where common sensors (RGB cameras, LiDAR, and GNSS) often fail due to occlusion and degraded visibility. We present a novel under-canopy dataset, collected over two growing seasons in corn and soybean fields, that integrates 77 GHz millimeter-wave (mmWave) radar with RGB-D stereo imagery and inertial measurements under a range of environmental conditions (low light, dust, and foliage cover). To our knowledge, this is the first multi-modal dataset of its kind in agricultural settings. Building on this resource, we investigate an end-to-end deep learning framework for radar-based keypoint prediction that leverages unprocessed range–azimuth radar data. The model employs knowledge distillation from a vision-based teacher network and fuses radar and camera features within an encoder–decoder architecture supervised by soft heatmap targets. Experimental evaluation highlights both the potential and limitations of using raw mmWave radar for under-canopy navigation, particularly with respect to computational demands. Our contributions include the new multi-modal dataset, an agriculture-tailored fusion architecture, and a training methodology for radar–vision integration. These findings suggest that the incorporation of radar can enhance perception robustness in precision farming, paving the way for future work on real-time closed-loop control, cross-crop generalization, and multitask radar–vision applications for plant monitoring and yield prediction.
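The abstract mentions supervision by soft heatmap targets for keypoint prediction. A common way to build such targets (the thesis's exact formulation is not given here, so the function below is an illustrative sketch with assumed parameter names) is to render each keypoint as a 2D Gaussian on the output grid:

```python
import numpy as np

def soft_heatmap(keypoints, height, width, sigma=2.0):
    """Render keypoints as Gaussian "soft" heatmap targets.

    keypoints: list of (x, y) pixel coordinates, one output channel per point.
    Returns an array of shape (len(keypoints), height, width) with a
    value of 1.0 at each keypoint, decaying smoothly away from it.
    """
    ys, xs = np.mgrid[0:height, 0:width]
    maps = np.empty((len(keypoints), height, width), dtype=np.float32)
    for i, (kx, ky) in enumerate(keypoints):
        d2 = (xs - kx) ** 2 + (ys - ky) ** 2
        maps[i] = np.exp(-d2 / (2.0 * sigma ** 2))
    return maps

# Example: one keypoint at (x=8, y=4) on a 16x16 grid.
targets = soft_heatmap([(8, 4)], 16, 16, sigma=2.0)
print(targets.shape)     # (1, 16, 16)
print(targets[0, 4, 8])  # 1.0 at the keypoint itself
```

Training against such smooth targets (e.g., with a pixel-wise MSE loss) is more forgiving than hard one-hot labels, which is why heatmap regression is the standard approach for keypoint detection networks.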