Guided and collaborative kinematic control system for perception of the trajectories of the upper extremities

Authors

  • Mauro Leandro Ibarra-Peñaranda, Universidad de Pamplona
  • Oscar Manuel Duque-Suárez, Servicio Nacional de Aprendizaje SENA
  • Maria Carolina Duque-Suarez, Servicio Nacional de Aprendizaje SENA

DOI:

https://doi.org/10.15649/2346030X.2394

Keywords:

robot, guided, morphology, control, learning

Abstract

This project studies the morphology of the human arm in order to build a robot capable of imitating its movements, learning them, and repeating them under a kinematic control routine. To do so, trajectories are learned through artificial vision using the Kinect sensor: the spatial coordinates of each joint are extracted and then processed with a mathematical model to obtain the joint positions, compute the kinematic model of the robot, and derive a kinematic control routine that relates the joint velocities. The system lets the user record their movements, simulate the learned motion on a virtual robot, and then command the physical robot to reproduce it. Comparing the results, the standard deviation of the trajectories with and without control does not change substantially, but the points that fall within the deviation are distributed more evenly when control is applied, because kinematic control improves the stability of the trajectories.
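
For illustration, the sketch below shows one way the two processing stages described above could look in code: recovering a joint angle from three tracked joint coordinates, and a resolved-rate (Jacobian-pseudoinverse) rule that relates a Cartesian velocity command to joint speeds. It is a minimal, assumption-laden example rather than the authors' implementation: the 2-DOF planar geometry, link lengths, gain, and the sample coordinates are all hypothetical.

# Illustrative sketch only (not the paper's implementation): assumes a
# simplified 2-link planar arm and joint coordinates in the (x, y, z)
# per-joint format returned by a Kinect-style skeleton tracker.
import numpy as np

def joint_angle(p_prox, p_mid, p_dist):
    """Angle at the middle joint (e.g., elbow) from three 3D joint positions."""
    v1 = np.asarray(p_prox) - np.asarray(p_mid)
    v2 = np.asarray(p_dist) - np.asarray(p_mid)
    cos_a = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.arccos(np.clip(cos_a, -1.0, 1.0))

def planar_jacobian(q, l1, l2):
    """Jacobian of an assumed 2-DOF planar arm with link lengths l1, l2."""
    q1, q2 = q
    return np.array([
        [-l1*np.sin(q1) - l2*np.sin(q1+q2), -l2*np.sin(q1+q2)],
        [ l1*np.cos(q1) + l2*np.cos(q1+q2),  l2*np.cos(q1+q2)],
    ])

def kinematic_control_step(q, x_desired, x_current, l1, l2, k=1.0):
    """Resolved-rate step: map the Cartesian error to joint velocities via J^+."""
    J = planar_jacobian(q, l1, l2)
    x_dot = k * (np.asarray(x_desired) - np.asarray(x_current))  # proportional Cartesian velocity
    q_dot = np.linalg.pinv(J) @ x_dot                            # joint speeds from the Jacobian pseudoinverse
    return q_dot

# Example: elbow angle from hypothetical shoulder/elbow/wrist coordinates
theta = joint_angle([0.0, 0.4, 2.0], [0.0, 0.1, 2.1], [0.2, 0.0, 2.0])

In the system described above, the joint angles would come from the full arm chain tracked by the Kinect and the Jacobian from the robot's own kinematic model.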

Published

2020-12-14

How to Cite

[1]
M. L. Ibarra-Peñaranda, O. M. Duque-Suárez, and M. C. Duque-Suarez, “Guided and collaborative kinematic control system for perception of the trajectories of the upper extremities”, AiBi Revista de Investigación, Administración e Ingeniería, vol. 8, no. S1, pp. 124–151, Dec. 2020.

Issue

Vol. 8 No. S1 (2020)

Section

Research Articles
