Abstract
The selection of the grasping location is the most important task for robots that handle randomly shaped objects. In previous studies, grasp quality was evaluated accurately, but the speed was too low for high-throughput applications, and the focus was mainly on industrial products. In this study, a large-scale dataset of randomly deformed plastics is constructed. We propose a contact-area estimation model and a difficulty function for the quantitative analysis of surface conditions. Synthetic labels were calculated using the tuned difficulty function for donut-shaped contact areas. We trained a network containing a pre-trained encoder and a decoder with skip connections for grasp-difficulty map estimation. Grasp-difficulty estimation for multiple objects required at most 30.9 ms with an average error rate of 1.65 %. The algorithm achieved a 94.4 % grasping success rate, and its computational efficiency was compared with that of previous studies. The algorithm enables the rapid sorting of continuously conveyed objects with higher throughput.
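The abstract describes a network with a pre-trained encoder and a decoder with skip connections that outputs a per-pixel grasp-difficulty map. A minimal sketch of that layout in PyTorch, assuming a single skip connection and arbitrary layer sizes (the paper's actual architecture, channel counts, and pre-trained backbone are not reproduced here):

```python
import torch
import torch.nn as nn

class TinyDifficultyNet(nn.Module):
    """Illustrative encoder-decoder with one skip connection that maps an
    RGB image to a single-channel grasp-difficulty map. All layer sizes
    are assumptions for illustration only."""
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU())
        self.down = nn.Sequential(nn.Conv2d(8, 16, 3, stride=2, padding=1), nn.ReLU())
        self.up = nn.ConvTranspose2d(16, 8, 2, stride=2)
        self.head = nn.Conv2d(16, 1, 3, padding=1)  # 16 = 8 upsampled + 8 skipped

    def forward(self, x):
        s = self.enc(x)                # full-resolution features (skip source)
        h = self.up(self.down(s))      # encode to half size, decode back
        h = torch.cat([h, s], dim=1)   # skip connection preserves fine detail
        return self.head(h)            # one-channel difficulty map

x = torch.randn(1, 3, 64, 64)
d = TinyDifficultyNet()(x)             # same spatial size as the input
```

The skip connection lets the decoder reuse full-resolution features, which matters here because the difficulty map must be sharp enough to localize suction points on small deformed objects.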
Abbreviations
- D : Suction grasping difficulty
- W_cup : Suction cup stiffness weight
- W_end : End effector orientation weight
- W_obj : Object gravitational torque weight
- V : Normal directional variance
- θ : Slope of mean plane
- L : Distance from the nearest contour
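The abbreviations suggest a difficulty function D built from weighted surface terms. A minimal sketch of one plausible form, assuming a weighted-sum combination (the `grasp_difficulty` helper, the functional form, and all default weights are assumptions, not the paper's tuned function):

```python
import math

def grasp_difficulty(V, theta, L, w_cup=1.0, w_end=1.0, w_obj=1.0):
    """Hypothetical suction-grasp difficulty D from the listed terms.

    V     : normal directional variance (rough surface resists cup sealing)
    theta : slope of the mean plane in radians (tilt strains the end effector)
    L     : distance from the nearest contour (grasping near an edge
            increases the object's gravitational torque on the cup)
    The weighted-sum form is an assumption for illustration only.
    """
    surface_term = w_cup * V                  # cup-stiffness-weighted roughness
    tilt_term = w_end * abs(math.sin(theta))  # orientation-weighted tilt
    torque_term = w_obj / (L + 1e-6)          # torque grows near contours
    return surface_term + tilt_term + torque_term
```

Under this form, a flat, smooth region far from any contour yields a low D, while a rough, tilted region near an edge yields a high D, matching the intuition behind each listed weight.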
References
PlasticsEurope, Plastics—The Facts 2019, An Analysis of European Plastics Production, Demand and Waste Data (2019).
T. P. Tho and N. T. Thinh, Design and development of the sorting system based on robot, 2015 15th International Conference on Control, Automation and Systems (ICCAS), IEEE (2015) 1639–1644.
S. Wang, H. Lin, R. Gai and Y. Sun, An application of vision technology on intelligent sorting system by delta robot, 2017 IEEE 19th International Conference on e-Health Networking, Applications and Services (Healthcom), IEEE (2017) 1–6.
W. Xiao, J. Yang, H. Fang, J. Zhuang, Y. Ku and X. Zhang, Development of an automatic sorting robot for construction and demolition waste, Clean Technologies and Environmental Policy, 22 (9) (2020) 1829–1841.
H. Jin, W. Fan, H. Chen and Y. Wang, Anti-corrosion wood automatic sorting robot system based on near-infrared imaging technology, Journal of Mechanical Science and Technology, 34 (7) (2020) 3049–3055, doi: https://doi.org/10.1007/s12206-020-0636-z.
Y. Jeon et al., Development of real-time automatic sorting system for color PET recycling process, 2020 20th International Conference on Control, Automation and Systems (ICCAS), IEEE (2020) 995–998.
F. Gabriel, M. Römer, P. Bobka and K. Dröder, Model-based grasp planning for energy-efficient vacuum-based handling, CIRP Annals, 70 (1) (2021).
C. Lehnert, I. Sa, C. McCool, B. Upcroft and T. Perez, Sweet pepper pose detection and gras** for automated crop harvesting, 2016 IEEE International Conference on Robotics and Automation (ICRA), IEEE (2016) 2428–2434.
H. Zhang, J. Peeters, E. Demeester and K. Kellens, A CNN-based grasp planning method for random picking of unknown objects with a vacuum gripper, Journal of Intelligent and Robotic Systems, 103 (4) (2021) 1–19.
J. Mahler, M. Matl, X. Liu, A. Li, D. Gealy and K. Goldberg, Dex-net 3.0: Computing robust vacuum suction grasp targets in point clouds using a new analytic model and deep learning, 2018 IEEE International Conference on Robotics and Automation (ICRA), IEEE (2018) 5620–5627.
J. Ibarz, J. Tan, C. Finn, M. Kalakrishnan, P. Pastor and S. Levine, How to train your robot with deep reinforcement learning: lessons we have learned, The International Journal of Robotics Research, 40 (4–5) (2021) 698–721.
M. U. Khalid et al., Automatic grasp generation for vacuum grippers for random bin picking, Berlin, Advances in Automotive Production Technology — Theory and Application, Springer (2021) 247–255.
T. W. Utomo, A. I. Cahyadi and I. Ardiyanto, Suction-based grasp point estimation in cluttered environment for robotic manipulator using deep learning-based affordance map, International Journal of Automation and Computing, 18 (2) (2021) 277–287.
W. Wan, K. Harada and F. Kanehiro, Planning grasps with suction cups and parallel grippers using superimposed segmentation of object meshes, IEEE Transactions on Robotics, 37 (1) (2020) 166–184.
A. Bernardin, C. Duriez and M. Marchal, An interactive physically-based model for active suction phenomenon simulation, 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), IEEE (2019) 1466–1471.
H. Cao, H.-S. Fang, W. Liu and C. Lu, SuctionNet-1billion: a large-scale benchmark for suction grasping, IEEE Robotics and Automation Letters, 6 (4) (2021) 8718–8725.
C. Borst, M. Fischer and G. Hirzinger, Grasp planning: How to choose a suitable task wrench space, IEEE International Conference on Robotics and Automation, 2004. Proceedings. ICRA’04., 1 (2004) 319–325.
C. Ferrari and J. F. Canny, Planning optimal grasps, Proceedings of the 1992 IEEE International Conference on Robotics and Automation (ICRA), IEEE (1992) 2290–2295.
J. Mahler et al., Dex-net 2.0: Deep learning to plan robust grasps with synthetic point clouds and analytic grasp metrics, The Robotics: Science and Systems (2017).
D. Morrison, N. Kelly-Boxall, S. Wade-McCue, P. Corke and J. Leitner, Hierarchical grasp detection for visually challenging environments, Workshop at the IEEE RAS International Conference on Humanoid Robots (2017).
D. Morrison et al., Cartman: the low-cost cartesian manipulator that won the amazon robotics challenge, 2018 IEEE International Conference on Robotics and Automation (ICRA), IEEE (2018) 7757–7764.
T. B. Jørgensen, S. H. N. Jensen, H. Aanæs, N. W. Hansen and N. Krüger, An adaptive robotic system for doing pick and place operations with deformable objects, Journal of Intelligent and Robotic Systems, 94 (1) (2019) 81–100.
C. Lehnert, A. English, C. McCool, A. W. Tow and T. Perez, Autonomous sweet pepper harvesting for protected crop** systems, IEEE Robotics and Automation Letters, 2 (2) (2017) 872–879.
Y. Choi et al., Hierarchical 6-dof gras** with approaching direction selection, 2020 IEEE International Conference on Robotics and Automation (ICRA), IEEE (2020) 1553–1559.
A. ten Pas, M. Gualtieri, K. Saenko and R. Platt, Grasp pose detection in point clouds, The International Journal of Robotics Research, 36 (13–14) (2017) 1455–1473.
K. He, X. Zhang, S. Ren and J. Sun, Deep residual learning for image recognition, Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (2016) 770–778.
S. Hasegawa, K. Wada, S. Kitagawa, Y. Uchimi, K. Okada and M. Inaba, Graspfusion: realizing complex motion by learning and fusing grasp modalities with instance segmentation, 2019 International Conference on Robotics and Automation (ICRA), IEEE (2019) 7235–7241.
K. Simonyan and A. Zisserman, Very deep convolutional networks for large-scale image recognition, 3rd International Conference on Learning Representations (ICLR 2015) (2015).
O. Russakovsky et al., Imagenet large scale visual recognition challenge, International Journal of Computer Vision, 115 (3) (2015) 211–252.
L. Pinto and A. Gupta, Supersizing self-supervision: Learning to grasp from 50k tries and 700 robot hours, 2016 IEEE International Conference on Robotics and Automation (ICRA), IEEE (2016) 3406–3413.
L.-C. Chen, Y. Zhu, G. Papandreou, F. Schroff and H. Adam, Encoder-decoder with atrous separable convolution for semantic image segmentation, Proceedings of the European Conference on Computer Vision (ECCV) (2018) 801–818.
D. Ge, T. Matsuno, Y. Sun, C. Ren, Y. Tang and S. Ma, Quantitative study on the attachment and detachment of a passive suction cup, Vacuum, 116 (2015) 13–20.
I. Lee, J. Oh, I. Kim and J.-H. Oh, Camera-laser fusion sensor system and environmental recognition for humanoids in disaster scenarios, Journal of Mechanical Science and Technology, 31 (6) (2017) 2997–3003, doi: https://doi.org/10.1007/s12206-017-0543-0.
S. Um, K.-S. Kim and S. Kim, Suction point selection algorithm based on point cloud for plastic waste sorting, 2021 IEEE 17th International Conference on Automation Science and Engineering (CASE), IEEE (2021) 60–65.
A. G. Howard et al., Mobilenets: efficient convolutional neural networks for mobile vision applications, arXiv preprint arXiv:1704.04861 (2017).
A. Zeng et al., Robotic pick-and-place of novel objects in clutter with multi-affordance grasping and cross-domain image matching, 2018 IEEE International Conference on Robotics and Automation (ICRA), IEEE (2018) 3750–3757.
Acknowledgments
This work was supported by the Korea Environment Industry & Technology Institute (KEITI) through the R&D Project for Recycling of Municipal Waste, funded by the Korea Ministry of Environment (MOE) (Project Number: 1485017686, 2019002720001).
Author information
Additional information
Sangwoo Um received his B.S. and M.S. degrees in Mechanical Engineering from KAIST in 2015 and 2017, respectively. He is currently pursuing his Ph.D. at KAIST. His research interests include sensor design, robot control, and robotic vision.
Kyung-Soo Kim received his B.S., M.S., and Ph.D. degrees in Mechanical Engineering from KAIST in 1993, 1995, and 1999, respectively. In 2007, he joined the Faculty of the Department of Mechanical Engineering at KAIST. His research interests include control theory, sensor design, actuator design, robots, and autonomous vehicles.
Soohyun Kim received his B.S. degree from Seoul National University in 1978, and his M.S. degree from KAIST in 1980 in Mechanical Engineering. He received his Ph.D. degree in Mechanical Engineering from the Imperial College of Science, Technology and Medicine, University of London, UK, in 1991. He joined the Faculty of the Department of Mechanical Engineering at KAIST in 1991. His research interests include robots, path planning, spectroscopy, actuators and sensors.
About this article
Cite this article
Um, S., Kim, KS. & Kim, S. Fast suction-grasp-difficulty estimation for high throughput plastic-waste sorting. J Mech Sci Technol 37, 955–964 (2023). https://doi.org/10.1007/s12206-023-0135-0