Computer Vision-based Robotic Arm for Object Color, Shape, and Size Detection
DOI: https://doi.org/10.18196/jrc.v3i2.13906

Keywords: Arduino Mega, Color Sorting, OpenCV, PixyCMU, Robotic Arm, Servo Motor, Shape Detection

Abstract
Various aspects of the human workplace have been influenced by robotics due to its precision and accessibility. Industrial activities have become increasingly automated, improving efficiency while reducing production time, human labor, and risk. Electronic technology continues to advance, and the ultimate goal of these advances is to make robotic systems as human-like as possible. As a result, robots can perform jobs far more efficiently than humans in challenging situations. In this paper, an automatic computer vision-based robotic gripper has been built that can select and arrange objects to complete various tasks. The study utilizes the image processing capability of the PixyCMU camera sensor to distinguish multiple objects according to their distinct colors (red, yellow, and green). Next, a preprogrammed command drives the robotic arm to pick up the item using an Arduino Mega and four MG996R servo motors. Finally, the device releases the object, according to its color, at a fixed position behind the robotic arm. The proposed system can also detect objects' geometrical shapes (circle, triangle, square, rectangle, pentagon, and star) and sizes (large, medium, and small) by utilizing OpenCV image processing libraries in Python. Empirical results demonstrate that the designed robotic arm detects colored objects with 80% accuracy and recognizes object size and shape in real time with 100% accuracy.
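As an illustration of the shape-and-size pipeline summarized above, the following minimal sketch uses OpenCV contour approximation in Python. It is not the authors' released code; the vertex counts, area thresholds, input file name, and noise cutoff are assumptions chosen only for demonstration.

```python
# Illustrative sketch of contour-based shape and size classification with OpenCV.
# Thresholds, file names, and vertex heuristics are assumptions, not the paper's code.
import cv2


def classify_shape(contour):
    """Label a contour as one of the shapes listed in the paper."""
    peri = cv2.arcLength(contour, True)
    approx = cv2.approxPolyDP(contour, 0.04 * peri, True)
    vertices = len(approx)
    if vertices == 3:
        return "triangle"
    if vertices == 4:
        x, y, w, h = cv2.boundingRect(approx)
        return "square" if 0.95 <= w / float(h) <= 1.05 else "rectangle"
    if vertices == 5:
        return "pentagon"
    if vertices == 10:
        return "star"  # a five-pointed star approximates to about ten vertices
    return "circle"


def classify_size(contour):
    """Bucket the contour area into large/medium/small (thresholds assumed)."""
    area = cv2.contourArea(contour)
    if area > 20000:
        return "large"
    if area > 8000:
        return "medium"
    return "small"


if __name__ == "__main__":
    frame = cv2.imread("object.jpg")  # or a frame captured from the camera
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    _, thresh = cv2.threshold(blurred, 60, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(thresh, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) < 500:  # ignore small noise blobs
            continue
        print(classify_shape(c), classify_size(c))
```

In practice, the classified shape and size would be mapped to a preprogrammed pick-and-place command sent to the Arduino Mega, which the paper handles separately from the color sorting performed on the PixyCMU itself.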
License
Authors who publish with this journal agree to the following terms:
- Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgment of the work's authorship and initial publication in this journal.
- Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgment of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See The Effect of Open Access).
This journal is based on the work at https://journal.umy.ac.id/index.php/jrc and is licensed under the Creative Commons Attribution-ShareAlike 4.0 International License. You are free to:
- Share – copy and redistribute the material in any medium or format.
- Adapt – remix, transform, and build upon the material for any purpose, even commercially.
The licensor cannot revoke these freedoms as long as you follow the license terms, which include the following:
- Attribution. You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.
- ShareAlike. If you remix, transform, or build upon the material, you must distribute your contributions under the same license as the original.
- No additional restrictions. You may not apply legal terms or technological measures that legally restrict others from doing anything the license permits.
JRC is licensed under a Creative Commons Attribution-ShareAlike (CC BY-SA) 4.0 International License.