
Towards Closing the Loop in Robotic Pollination for Indoor Farming via Autonomous Microscopic Inspection (2409.12311v1)

Published 18 Sep 2024 in cs.RO, cs.SY, and eess.SY

Abstract: Effective pollination is a key challenge for indoor farming, since bees struggle to navigate without the sun. While a variety of robotic system solutions have been proposed, it remains difficult to autonomously check that a flower has been sufficiently pollinated to produce high-quality fruit, which is especially critical for self-pollinating crops such as strawberries. To this end, this work proposes a novel robotic system for indoor farming. The proposed hardware combines a 7-degree-of-freedom (DOF) manipulator arm with a custom end-effector, comprised of an endoscope camera, a 2-DOF microscope subsystem, and a custom vibrating pollination tool; this is paired with algorithms to detect and estimate the pose of strawberry flowers, navigate to each flower, pollinate using the tool, and inspect with the microscope. The key novelty is vibrating the flower from below while simultaneously inspecting with a microscope from above. Each subsystem is validated via extensive experiments.

Summary

  • The paper introduces a multi-scale robotic system that unifies global detection, local pose estimation, and microscopic inspection for pollination.
  • The system utilizes a 7-DOF arm with YOLOv8 and RAFT-based algorithms, achieving up to 98% accuracy in microscopic pollen inspection.
  • Experimental results show improved pollination efficiency and a 30% yield boost, emphasizing the potential for automation in indoor farming.

Autonomous Robotic Pollination via Microscopic Inspection for Indoor Farming

This paper proposes a robotic system aimed at addressing critical challenges in the pollination of indoor-grown, self-pollinating crops like strawberries. The authors introduce an autonomous setup, combining a robotic arm with a customized end effector, to execute pollination and in-situ microscopic inspection, thus attempting to close the loop in robotic pollination. The system operates across three scales: global (multi-flower detection and localization), local (single-flower pose estimation and alignment), and microscopic (visual inspection of pollen deposition).

System Overview and Contributions

The proposed system incorporates several novel hardware and software elements. The hardware comprises a 7-DOF robotic arm (Kinova Gen3) equipped with an Intel RealSense depth camera for global flower detection, and a custom end effector that combines an endoscope camera, a 2-DOF microscope subsystem, and a vibrating pollination tool. The end effector is designed so that the tool vibrates the flower from below while the microscope simultaneously inspects it from above.

The software backbone utilizes state-of-the-art deep learning models, including YOLOv8 for flower detection and RAFT for optical flow-based 3D point cloud generation, paired with classical algorithms like RANSAC and ICP for pose estimation. The perceptual system is bolstered by image-based visual servoing to achieve precise alignment and contact, while a custom focus-scoring algorithm ensures efficient autofocusing for the microscopic subsystem.
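The paper's custom focus-scoring metric is not spelled out in this summary; a common choice for such a sharpness score is the variance of the image Laplacian, which peaks when the microscope is in focus. The sketch below illustrates that idea under this assumption; `focus_score` and `autofocus` are hypothetical helpers, not the authors' implementation.

```python
import numpy as np

def focus_score(gray):
    """Variance-of-Laplacian sharpness score: higher means sharper.

    gray: 2-D float array (grayscale microscope frame).
    """
    # Discrete Laplacian via 4-neighbour finite differences (interior pixels).
    lap = (
        -4.0 * gray[1:-1, 1:-1]
        + gray[:-2, 1:-1] + gray[2:, 1:-1]
        + gray[1:-1, :-2] + gray[1:-1, 2:]
    )
    return float(lap.var())

def autofocus(capture, positions):
    """Sweep candidate stage positions and return the sharpest one.

    capture: callable mapping a stage position to a grayscale image.
    positions: iterable of candidate microscope stage positions.
    """
    return max(positions, key=lambda z: focus_score(capture(z)))
```

A uniform (out-of-focus) frame scores zero, so sweeping the 2-DOF stage and keeping the position with the highest score implements a simple contrast-based autofocus.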

Experimental Validation

The paper presents extensive experimental validation of each subsystem. At the global scale, YOLOv8, in conjunction with the RealSense camera, detected and localized flowers with an 80% success rate. The local-scale subsystem, which further refines flower pose estimation and alignment, achieved a 100% success rate for pose estimation and 95% for alignment.
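The alignment stage relies on image-based visual servoing, which typically drives a pixel-space error toward zero with a proportional control law. The sketch below is a generic illustration of that law, not the paper's controller; the gain and pixel coordinates are illustrative assumptions.

```python
import numpy as np

def ibvs_step(feature_px, target_px, gain=0.5):
    """One proportional image-based visual-servoing update.

    feature_px: current flower-centre pixel (x, y) in the endoscope image.
    target_px: desired pixel location (typically the image centre).
    Returns an image-plane velocity command proportional to the error.
    """
    error = np.asarray(target_px, float) - np.asarray(feature_px, float)
    return gain * error
```

Iterating this update halves the pixel error at each step (for gain 0.5), so the detected flower centre converges to the target pixel, at which point the end effector is aligned for contact.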

A critical aspect of the system is the microscopic inspection of pollen deposition. This subsystem demonstrated a 98% accuracy in distinguishing between pollinated and non-pollinated flowers using HSV filtering and custom focus-scoring algorithms. Furthermore, the pollination tool was validated independently in a USDA facility, showing a 30% improvement in fruit yield when compared to traditional pollination methods.
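The HSV-filtering step can be illustrated with a generic threshold-and-count classifier: mask pixels whose hue, saturation, and value fall in a pollen-coloured band, then declare the flower pollinated if enough of the frame is covered. The hue band, saturation/value floors, and pixel-fraction threshold below are illustrative assumptions, not the paper's calibrated values.

```python
import numpy as np

def hsv_mask(rgb, h_lo, h_hi, s_min, v_min):
    """Boolean mask of pixels whose HSV values fall in the given band.

    rgb: H x W x 3 float array in [0, 1]. Hue bounds are in [0, 1]
    (1/6 of a turn, about 0.167, corresponds to pure yellow).
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    v = rgb.max(axis=-1)
    c = v - rgb.min(axis=-1)                      # chroma
    s = np.where(v > 0, c / np.maximum(v, 1e-12), 0.0)
    # Piecewise hue in sixths of a turn, matching colorsys conventions.
    h = np.zeros_like(v)
    nz = c > 0
    rmax = nz & (v == r)
    gmax = nz & (v == g) & ~rmax
    bmax = nz & ~rmax & ~gmax
    h[rmax] = ((g - b)[rmax] / c[rmax]) % 6
    h[gmax] = (b - r)[gmax] / c[gmax] + 2
    h[bmax] = (r - g)[bmax] / c[bmax] + 4
    h /= 6.0
    return (h >= h_lo) & (h <= h_hi) & (s >= s_min) & (v >= v_min)

def pollinated(rgb, frac_thresh=0.02):
    """Classify a microscope frame as pollinated if enough yellow
    pollen-coloured pixels are visible (thresholds are illustrative)."""
    mask = hsv_mask(rgb, h_lo=0.10, h_hi=0.20, s_min=0.4, v_min=0.4)
    return bool(mask.mean() >= frac_thresh)
```

In practice such a classifier is only reliable on well-focused frames, which is why it is paired with the autofocus stage before inspection.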

Implications and Future Directions

The implications of this research are significant for the field of agricultural robotics, particularly concerning the automation of labor-intensive tasks such as pollination in controlled environments. The proposed system not only enhances the reliability and efficiency of pollination but also introduces a novel method for real-time verification of its success.

From a practical perspective, the system can substantially reduce labor costs and increase the yield and quality of indoor crops. Theoretically, the integration of robotic manipulation with in-situ microscopic inspection paves the way for future research into more sophisticated perception and control strategies in dynamic, unstructured environments like indoor farms.

One of the most promising future directions is the extension of this system to mobile platforms, enabling the robot to service larger areas autonomously. Moreover, the adaptation of the system to other self-pollinating crops could generalize the benefits observed in strawberry farming. Further refinements in motion planning, obstacle avoidance, and real-time interaction models are also anticipated to enhance system robustness and reliability.

Conclusion

The paper presents a comprehensive and well-validated approach to automated pollination for indoor farming, leveraging advanced robotic and computer vision technologies. Its contributions encompass hardware innovations, algorithmic integrations, and rigorous experimental validations, positioning it as a meaningful advancement in the field of agricultural robotics. Future work promises to expand upon these foundations, enhancing the autonomy, versatility, and efficacy of robotic systems in agriculture.
