Advances in automated plant phenotyping and target-driven visual navigation for breeding candidates
Wu, Junzhe
Permalink
https://hdl.handle.net/2142/130037
Description
Title
Advances in automated plant phenotyping and target-driven visual navigation for breeding candidates
Author(s)
Wu, Junzhe
Issue Date
2025-07-16
Director of Research (if dissertation) or Advisor (if thesis)
Tran, Huy
Doctoral Committee Chair(s)
Chowdhary, Girish
Committee Member(s)
Eveland, Andrea
Lipka, Alexander
Department of Study
Engineering Administration
Discipline
Agricultural & Biological Engr
Degree Granting Institution
University of Illinois Urbana-Champaign
Degree Name
Ph.D.
Degree Level
Dissertation
Keyword(s)
Leaf Angle Prediction
Automated Plant Phenotyping
Target-Driven Navigation
Abstract
Advances in agricultural robotics are transforming field operations by enhancing perception and planning capabilities, particularly in automated plant phenotyping and visual navigation. Leaf angle, a critical factor in solar light absorption and photosynthetic efficiency, directly impacts the growth and yield of row crops such as maize. Plant breeders are therefore interested in selecting for specific leaf angle traits, which requires measuring leaf angle. However, traditional manual measurement of leaf angle in field settings is nearly impossible because of the labor involved. Furthermore, manual measurement methods, such as using protractors, do not scale to field settings, and the results have been found to vary widely between individuals. In this thesis, we present a field robotic system equipped with low-cost sensors to automatically measure leaf angle. Using an RGB-D camera, the proposed method consistently produces quantitative and qualitative leaf angle estimates for large numbers of maize leaves. Our method provides a high-throughput, low-cost tool for leaf angle measurement and is expected to enable new agricultural research.
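For readers unfamiliar with RGB-D-based trait extraction, the following minimal sketch (an illustration, not the thesis pipeline) shows one common way a leaf angle can be computed once stalk and leaf points have been segmented from a depth image; the arrays stem_points and leaf_points and the helper names are hypothetical.

# Minimal sketch (not the thesis pipeline): estimate a leaf angle from 3D points
# segmented out of an RGB-D frame. Assumes hypothetical (N, 3) arrays
# `stem_points` and `leaf_points` holding points on the stalk and on the leaf
# blade near its collar, e.g. from a segmentation model plus the camera's
# depth-to-point-cloud projection.
import numpy as np

def principal_direction(points: np.ndarray) -> np.ndarray:
    """Unit vector along the dominant axis of a point set (via SVD/PCA)."""
    centered = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[0] / np.linalg.norm(vt[0])

def leaf_angle_deg(stem_points: np.ndarray, leaf_points: np.ndarray) -> float:
    """Angle (degrees) between the stalk axis and the leaf axis."""
    stem_dir = principal_direction(stem_points)
    leaf_dir = principal_direction(leaf_points)
    cos_angle = np.clip(abs(np.dot(stem_dir, leaf_dir)), 0.0, 1.0)
    return float(np.degrees(np.arccos(cos_angle)))

Taking the absolute value of the dot product keeps the reported angle in [0°, 90°] regardless of the sign convention of the fitted axes.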
Additionally, autonomous navigation in unstructured and dynamic outdoor environments is critical for field robotics applications. For example, breeding candidates identified during phenotyping must be relocated so that their seeds can be collected for further breeding. Traditional deep reinforcement learning-based visual navigation techniques face challenges in outdoor settings, particularly in the absence of high-resolution maps and GPS signals. This thesis presents a deep reinforcement learning-based approach for target-driven visual navigation in outdoor settings, using the successor feature (SF) framework to enhance the model's generalization and transfer learning capabilities. Our method constructs a grid-world environment for navigation tasks and employs a goal-conditioned reinforcement learning (GCRL) strategy that leverages SFs to capture environmental dynamics. This approach enables the model to transfer knowledge across tasks, making it adaptable to new environments with zero-shot or few-shot tuning. The experimental results demonstrate the adaptability of our method to new outdoor environments within the same domain. Furthermore, although the model is trained in a discrete grid-world environment, it is successfully deployed in real time across different seasons within the same area, highlighting its robust transferability across domains and continuous state spaces.
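To make the successor feature idea concrete, the sketch below (again an illustrative assumption, not the thesis implementation) shows goal-conditioned action selection in a tabular grid world, where the Q-value factors as the dot product of successor features psi(s, a) with a goal-specific reward-weight vector w_goal; the variable names and toy dimensions are hypothetical.

# Minimal sketch (not the thesis implementation): goal-conditioned action
# selection with successor features in a tabular grid world. Assumes a learned
# table `psi` of successor features with shape (num_states, num_actions, d),
# where d is the dimensionality of the state features phi(s), and a goal-
# specific weight vector `w_goal` such that r(s) is approximately phi(s) @ w_goal.
# Transferring to a new goal only requires a new w_goal; psi is reused.
import numpy as np

def greedy_action(psi: np.ndarray, w_goal: np.ndarray, state: int) -> int:
    """Pick the action maximizing Q(s, a; g) = psi(s, a) . w_goal."""
    q_values = psi[state] @ w_goal          # shape: (num_actions,)
    return int(np.argmax(q_values))

# Toy usage: 25-cell grid world, 4 actions, 25-dim one-hot state features.
num_states, num_actions, d = 25, 4, 25
rng = np.random.default_rng(0)
psi = rng.random((num_states, num_actions, d))   # stand-in for learned SFs
goal_state = 24
w_goal = np.zeros(d)
w_goal[goal_state] = 1.0                          # reward only at the goal cell
print(greedy_action(psi, w_goal, state=0))

Because the environment dynamics live in psi and the task lives in w_goal, swapping in a new goal vector supports the zero-shot and few-shot transfer described above.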