Robotic Harvesting of Fruiting Vegetables

Author: Redmond R. Shamshiri


Traditional harvesting of fruiting vegetables for the fresh market is a labor-intensive task that demands a shift from tedious manual operation to continuous automated harvesting. In spite of the advances in agricultural robotics, millions of tons of fruits and vegetables are still hand-picked every year in open fields and greenhouses (Figure 1). Beyond the high labor cost, the limited availability of a skilled workforce willing to accept repetitive tasks under harsh field conditions imposes uncertainties and timeliness costs. For robotic harvesting to be cost-effective, fruit yield needs to be maximized to compensate for the additional automation costs. This leads to growing plants at higher densities, which makes it even harder for an autonomous robot to simultaneously detect, localize, and harvest the fruit. In the case of sweet pepper, with an estimated yield of 1.9 million tons/year in Europe, reports indicate that while an average time of 6 seconds per fruit is required for automated harvesting, the available technology has only achieved a success rate of 33% with an average picking time of 94 seconds per fruit [1]. For cucumber harvesting, a cycle time of 10 seconds was shown to be economically feasible [2]. In Washington State alone, 15-18 billion apples are harvested manually every year. An estimated 3 million tons of apples were reported to have been produced in Poland in 2015 [3], one-third of which are delicate fruits that are less resistant to bruising from mass-harvesting machines. Likewise in Florida, where the current marketable yield of sweet pepper in open-field cultivation is 1.6 to 3.0 lb/ft2, with a potential yield of 4 lb/ft2 in passively ventilated greenhouses [4], manual harvesting is still the only solution. Therefore, the development of robotic harvesting should be considered as an alternative method to address the associated labor shortages, costs, and timeliness.


 Figure 1. Manual harvesting of fruits


Research and development in agricultural robotics dates back to the 1980s, with Japan, The Netherlands, and the USA as the pioneering countries. The first studies used simple monochrome cameras for apple detection inside the canopy [5]. Advances in sensor technology and imaging devices have led to the employment of more sophisticated devices such as infrared [6], thermal [7], and hyperspectral cameras [8], or combinations of multiple sensors [9], adopted with novel vision-based techniques for extracting spatial information from images for fruit recognition, localization, and tracking. Examples of recent achievements include automatic fruit recognition based on the fusion of color and 3D features [10], a multi-template matching algorithm [11], and automatic fruit recognition from multiple images [12]. Unlike its industrial counterpart, an agricultural robot has to deal with varying planting arrangements, sizes and shapes, stems, branches, leaves, fruit color and texture, and varying positions of fruits and plants relative to each other. Significant contributions have been made by different research groups to address these challenges; however, there is currently no report of a commercial harvesting robot for the fresh fruit market [13], mainly due to the extremely variable, heterogeneous working conditions and the complex and unpredictable tasks involved in each scenario. Some of the questions to be addressed in the design of a complete harvesting robot are the simultaneous localization of fruit and environment mapping, path-planning algorithms, and the number of detectable and harvestable fruits under different plant density conditions. The function of a robot can be separated into three main parts: sensing (i.e., fruit recognition), planning (i.e., hand-eye coordination), and acting (i.e., the end-effector mechanism for fruit grasping) [14].
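As an illustration only, the sensing-planning-acting decomposition can be sketched as three stages of a single harvesting cycle. All function names and values below are hypothetical and are not taken from any of the cited systems:

```python
def sense(frame):
    """Fruit recognition: return a detected fruit position, if any."""
    return frame.get("fruit")  # e.g. (x, y, z) in camera coordinates, metres


def plan(fruit_pose, standoff=0.05):
    """Hand-eye coordination: a pre-grasp waypoint followed by the grasp point."""
    x, y, z = fruit_pose
    return [(x, y, z - standoff), (x, y, z)]  # approach point, then fruit


def act(trajectory):
    """End-effector action: follow the waypoints, then grasp (stubbed as a log)."""
    return ["move to %s" % (wp,) for wp in trajectory] + ["grasp"]


def harvest_cycle(frame):
    """One sensing-planning-acting pass; returns None if no fruit is seen."""
    pose = sense(frame)
    if pose is None:
        return None
    return act(plan(pose))


print(harvest_cycle({"fruit": (0.1, 0.2, 0.6)}))
```

In a real system each stage is far more involved (segmentation, inverse kinematics, grasp control), but the interfaces between the three stages remain the useful abstraction.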
A common approach to fruit detection is to use a single viewpoint, as in the case of a cucumber harvesting robot [15], or multiple viewpoints with additional sensing from one or a few external vision sensors not mounted on the robot [16]. Apart from the issues with frame transformation, this solution is not promising when the fruit is heavily occluded by dense plant leaves [17]. Obviously, the final robot prototype needs to be fast enough for mass harvesting, at a cost affordable for greenhouse growers. Swarms of simple robots with multiple low-cost cameras and grippers, as well as human-robot collaboration, are research topics aimed at the challenges in robotic harvesting that current technology cannot overcome. These approaches can significantly improve the processing time for detecting multiple fruits in high-density plantings, and can provide ground-truth results over time for machine learning algorithms based on human operators' experience. The body of research on agricultural robotics focused on the automated harvesting of fruits and vegetables is extensive; see, for example, the works on sweet pepper [1][18][19][20], oil palm [21], cucumber [15][22][23][24], apple [25], strawberry [26][27], cherry [6], citrus [28], and tomato [29]. Most of these works have used an eye-in-hand, look-and-move configuration in their visual servo control (Figure 2). Other researchers have concentrated on end-effector design [30], analysis of robot performance in dense obstacle environments using stability tests [31], motion-planning algorithms [32], and orchard architecture design for an optimal harvesting robot [33]. In addition, several software frameworks have been developed for agricultural robotics; one example is the work of [34], in which generic high-level functionality is provided for easier and faster development of agricultural robots.
Some of the most recent advances in sensing for robotic harvesting include the works of [29] and [35], which address the problem of detecting fruits and obstacles in dense foliage. Moreover, [20] and [25] have extensively explored the use of combined colour and distance (RGB-D) data on sweet peppers and on apples, respectively, while [36] present a study devoted to symmetry analysis of three-dimensional shapes for detecting produce on the plant.
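The idea behind fusing colour and distance cues can be illustrated with a minimal sketch. This is not the classifier from any of the cited works; the thresholds, the flattened-frame representation, and the function name are assumptions chosen only to show why combining the two cues rejects pixels that either cue alone would accept:

```python
def detect_fruit_pixels(rgb, depth, max_range=1.0):
    """Return indices of pixels that are both 'red enough' and within reach.

    rgb       -- list of (r, g, b) tuples, values 0-255
    depth     -- list of distances in metres, aligned with rgb
    max_range -- ignore anything beyond the arm's assumed reach
    """
    hits = []
    for i, ((r, g, b), d) in enumerate(zip(rgb, depth)):
        red_dominant = r > 120 and r > 1.5 * g and r > 1.5 * b  # colour cue
        in_reach = 0.0 < d <= max_range                          # depth cue
        if red_dominant and in_reach:
            hits.append(i)
    return hits


frame = [(200, 40, 30), (90, 140, 60), (210, 60, 50), (220, 50, 40)]
depths = [0.6, 0.5, 2.4, 0.7]
print(detect_fruit_pixels(frame, depths))  # -> [0, 3]
```

Pixel 2 is as red as the true detections but lies beyond reach, so the depth cue discards it; a colour-only detector would have passed it, which is precisely the failure mode RGB-D fusion addresses.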


Figure 2. Research and development in the robotic harvesting of fruits with different manipulators and gripper mechanisms for: (A) Citrus, (B, C) sweet pepper, (D, E) tomato, (F) cucumber, (G, H) strawberry, (I, J, K) apple


Improving robotic harvesting requires experimenting with different sensors and algorithms for fruit detection and localization, together with a strategy for finding collision-free paths to grasp the fruits with minimum control effort. Experiments with an actual hardware setup for this purpose are not always feasible due to time constraints, unavailability of equipment (e.g., sensors, cameras, and the robot manipulator), and operation costs. On the other hand, some hardware setups may result in actuator saturation, or create an unsafe situation for operators and/or the plant system. Simulation offers a reliable approach to bridge the gap between innovative ideas and laboratory trials, and can therefore accelerate the design of a robust robotic fruit harvesting platform for efficient, cost-effective, and bruise-free fruit picking. This research was motivated by the sensing task in robotic harvesting, which requires a robust, pragmatic computer vision package to localize mature pepper fruits and their surrounding obstacles. The main objective was to create a completely simulated environment for improving the plant/fruit scanning and visual servoing tasks through easy testing and debugging of control algorithms with zero risk of damage to the real robot and equipment. The research was carried out in two main phases: (i) creation of the simulated workspace in the Virtual Robot Experimentation Platform (V-REP), and (ii) development of the communication and control architecture using the Robot Operating System (ROS) and MATLAB (The MathWorks Inc., Natick, MA, USA). The simulated workspace, created in V-REP, included an exact replica of the six-degree-of-freedom Fanuc LR Mate 200iD robot manipulator (Fanuc America Corporation, Rochester Hills, MI), models of the sweet pepper fruit and plant system, and different vision sensors.
A simulated colour camera attached to the end-effector of the robot was used as the fruit localization sensor. ROS was used to exchange data between the simulated environment and the real workspace via its publish-and-subscribe architecture, providing a tool for validating the simulated results against those from experiments with a real robot. V-REP and MATLAB were also interfaced to create a two-way communication architecture for exchanging sensor data and robot control messages. Data from the simulated manipulator and sensors in V-REP were used as inputs to a visual servo control algorithm in MATLAB. The result is a flexible platform that saves cost and time when experimenting with different control strategies, sensing instrumentation, and algorithms for the automated harvesting of sweet pepper.
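The core of an eye-in-hand visual servo loop of this kind can be sketched as a proportional image-based step that drives the detected fruit centroid toward the image centre. This is a minimal illustration, not the article's MATLAB controller; the gain, image resolution, and convergence tolerance are assumed values:

```python
def ibvs_step(centroid, image_center=(320, 240), gain=0.002):
    """One proportional image-based visual servoing (IBVS) step.

    Returns a camera-frame velocity command (vx, vy) that reduces the
    pixel error between the fruit centroid and the image centre.
    """
    ex = centroid[0] - image_center[0]  # horizontal pixel error
    ey = centroid[1] - image_center[1]  # vertical pixel error
    return (-gain * ex, -gain * ey)     # move against the error direction


def servo_converged(centroid, image_center=(320, 240), tol=5):
    """Stop condition: centroid within tol pixels of the image centre."""
    ex = centroid[0] - image_center[0]
    ey = centroid[1] - image_center[1]
    return ex * ex + ey * ey <= tol * tol


# Fruit detected to the right of and below the image centre:
vx, vy = ibvs_step((420, 300))  # command moves the camera to cancel the error
```

In each control cycle the camera image is re-processed, the centroid re-detected, and the step repeated until `servo_converged` holds, at which point the end-effector is close enough to switch to the grasping phase.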




[1]               J. Hemming, W. Bac, B. van Tuijl, R. Barth, J. Bontsema, E. Pekkeriet, and E. van Henten, “A robot for harvesting sweet-pepper in greenhouses,” Proc. Int. Conf. Agric. Eng., pp. 6–10, 2014.

[2]               E. J. Van Henten, D. A. Van’t Slot, C. W. J. Hol, and L. G. Van Willigenburg, “Optimal manipulator design for a cucumber harvesting robot,” Comput. Electron. Agric., vol. 65, no. 2, pp. 247–257, 2009.

[3]               M. Młotek, Ł. Kuta, R. Stopa, and P. Komarnicki, “The effect of manual harvesting of fruit on the health of workers and the quality of the obtained produce,” Procedia Manuf., vol. 3, pp. 1712–1719, 2015.

[4]               G. E. Vallad, H. A. Smith, P. J. Dittmar, and J. H. Freeman, Vegetable Production Handbook of Florida. Gainesville, FL, USA: University of Florida, IFAS Extension, 2017. Available at: http://edis.ifas.ufl.edu/pdffiles/CV/CV29200.pdf. Last access: February 10th, 2018.

[5]               P. Li, S. Lee, and H.-Y. Hsu, “Review on fruit harvesting method for potential use of automatic fruit harvesting systems,” Procedia Eng., vol. 23, pp. 351–366, 2011.

[6]               K. Tanigaki, T. Fujiura, A. Akase, and J. Imagawa, “Cherry-harvesting robot,” Comput. Electron. Agric., vol. 63, no. 1, pp. 65–72, 2008.

[7]               D. M. Bulanon, T. F. Burks, and V. Alchanatis, “Study on temporal variation in citrus canopy using thermal imaging for citrus fruit detection,” Biosyst. Eng., vol. 101, no. 2, pp. 161–171, 2008.

[8]               H. Okamoto and W. S. Lee, “Green citrus detection using hyperspectral imaging,” Comput. Electron. Agric., vol. 66, no. 2, pp. 201–208, 2009.

[9]               D. M. Bulanon, T. F. Burks, and V. Alchanatis, “Image fusion of visible and thermal images for fruit detection,” Biosyst. Eng., vol. 103, no. 1, pp. 12–22, 2009.

[10]            Y. Tao and J. Zhou, “Automatic apple recognition based on the fusion of color and 3D feature for robotic fruit picking,” Comput. Electron. Agric., vol. 142, no. Part A, pp. 388–396, 2017.

[11]            G. Bao, S. Cai, L. Qi, Y. Xun, L. Zhang, and Q. Yang, “Multi-template matching algorithm for cucumber recognition in natural environment,” Comput. Electron. Agric., vol. 127, no. Supplement C, pp. 754–762, 2016.

[12]            Y. Song, C. A. Glasbey, G. W. Horgan, G. Polder, J. A. Dieleman, and G. W. A. M. van der Heijden, “Automatic fruit recognition and counting from multiple images,” Biosyst. Eng., vol. 118, no. Supplement C, pp. 203–215, 2014.

[13]            C. W. Bac, E. J. van Henten, J. Hemming, and Y. Edan, “Harvesting Robots for High-value Crops: State-of-the-art Review and Challenges Ahead,” J. Field Robot., vol. 31, no. 6, pp. 888–911, 2014.

[14]            R. Murphy, Introduction to AI Robotics. Cambridge, MA, USA: MIT Press, 2000.

[15]            E. J. van Henten, B. A. J. van Tuijl, G. J. Hoogakker, M. J. van der Weerd, J. Hemming, J. G. Kornet, and J. Bontsema, “An autonomous robot for de-leafing cucumber plants grown in a high-wire cultivation system,” Biosyst. Eng., vol. 94, no. 3, pp. 317–323, 2006.

[16]            J. Hemming, J. Ruizendaal, J. W. Hofstee, and E. J. van Henten, “Fruit detectability analysis for different camera positions in sweet-pepper,” Sensors (Basel), vol. 14, no. 4, pp. 6032–6044, 2014.

[17]            C. W. Bac, “Improving obstacle awareness for robotic harvesting of sweet-pepper,” Ph.D. thesis, Wageningen University, Wageningen, The Netherlands, 2015.

[18]            J. Hemming, J. Bontsema, W. Bac, Y. Edan, B. van Tuijl, R. Barth, and E. Pekkeriet, “Final Report Sweet-Pepper Harvesting Robot,” no. December, p. 22, 2014.

[19]            C. W. Bac, J. Hemming, and E. J. Van Henten, “Robust pixel-based classification of obstacles for robotic harvesting of sweet-pepper,” Comput. Electron. Agric., vol. 96, pp. 148–162, 2013.

[20]            E. Vitzrabin and Y. Edan, “Adaptive thresholding with fusion using a RGBD sensor for red sweet-pepper detection,” Biosyst. Eng., vol. 146, pp. 45–56, 2016.

[21]            R. Shamshiri, W. Ishak, and W. Ismail, “Nonlinear tracking control of a two link oil palm harvesting robot manipulator,” vol. 5, no. 2, pp. 1–11, 2012.

[22]            E. J. van Henten, J. Hemming, B. A. J. van Tuijl, J. G. Kornet, J. Bontsema, and E. A. van Os, “An Autonomous Robot for Harvesting Cucumbers in Greenhouses,” Auton. Robots, vol. 13, pp. 241–258, 2002.

[23]            X. Tang, T. Zhang, L. Liu, D. Xiao, and Y. Chen, “A new robot system for harvesting cucumber,” in American Society of Agricultural and Biological Engineers Annual International Meeting, 2009, pp. 3873–3885.

[24]            E. J. van Henten, B. A. J. van Tuijl, J. Hemming, J. G. Kornet, J. Bontsema, and E. A. van Os, “Field test of an autonomous cucumber picking robot,” Biosyst. Eng., vol. 86, no. 3, pp. 305–313, 2003.

[25]            T. Thanh Nguyen, K. Vandevoorde, N. Wouters, E. Kayacan, J. G. De Baerdemaeker, and W. Saeys, “Detection of red and bicoloured apples on tree with an RGB-D camera,” Biosyst. Eng., vol. 146, pp. 33–44, 2016.

[26]            K.-S. Han, S.-C. Kim, Y. B. Lee, S. C. Kim, D. H. Im, H. K. Choi, and H. Hwang, “Strawberry harvesting robot for bench-type cultivation,” Biosyst. Eng., vol. 37, no. 1, pp. 65–74, 2012.

[27]            S. Hayashi, K. Shigematsu, S. Yamamoto, K. Kobayashi, Y. Kohno, J. Kamata, and M. Kurita, “Evaluation of a strawberry-harvesting robot in a field test,” Biosyst. Eng., vol. 105, no. 2, pp. 160–171, 2010.

[28]            S. S. Mehta and T. F. Burks, “Vision-based control of robotic manipulator for citrus harvesting,” Comput. Electron. Agric., vol. 102, pp. 146–158, 2014.

[29]            J. Senthilnath, A. Dokania, M. Kandukuri, R. K.N., G. Anand, and S. N. Omkar, “Detection of tomatoes using spectral-spatial methods in remotely sensed RGB images captured by UAV,” Biosyst. Eng., vol. 146, pp. 16–32, 2016.

[30]            Z. De-An, L. Jidong, J. Wei, Z. Ying, and C. Yu, “Design and control of an apple harvesting robot,” Biosyst. Eng., vol. 110, no. 2, pp. 112–122, 2011.

[31]            Z. Li, P. Li, H. Yang, and Y. Wang, “Stability tests of two-finger tomato grasping for harvesting robots,” Biosyst. Eng., vol. 116, no. 2, pp. 163–170, 2013.

[32]            C. W. Bac, T. Roorda, R. Reshef, S. Berman, J. Hemming, and E. J. van Henten, “Analysis of a motion planning problem for sweet-pepper harvesting in a dense obstacle environment,” Biosyst. Eng., vol. 146, pp. 85–97, 2016.

[33]            V. Bloch, A. Degani, and A. Bechar, “A methodology of orchard architecture design for an optimal harvesting robot,” Biosyst. Eng., vol. 166, pp. 126–137, Feb. 2018.

[34]            T. Hellström and O. Ringdahl, “A software framework for agricultural and forestry robots,” Ind. Robot: An Int. J., vol. 40, no. 1, pp. 20–26, 2013.

[35]            S. Amatya, M. Karkee, A. Gongal, Q. Zhang, and M. D. Whiting, “Detection of cherry tree branches with full foliage in planar architecture for automated sweet-cherry harvesting,” Biosyst. Eng., vol. 146, pp. 3–15, 2015.

[36]            E. Barnea, R. Mairon, and O. Ben-Shahar, “Colour-agnostic shape-based 3D fruit detection for crop harvesting robots,” Biosyst. Eng., vol. 146, pp. 57–70, 2016.