The use of machine vision to control the basic functions of a CNC machine tool using gestures

Publication date: 27.12.2017

Technical Transactions, 2017, Volume 12 Year 2017 (114), pp. 213–229

https://doi.org/10.4467/2353737XCT.17.221.7764

Authors

Karol Miądlicki
West Pomeranian University of Technology
Mateusz Saków
West Pomeranian University of Technology

Abstract

The article presents the concept of a vision system that makes it possible to control and program a CNC machine tool using gestures. The developed solution simplifies the operation of a CNC machine tool by recognizing gestures performed by the operator. The system was implemented using the Microsoft Kinect for Windows v2 motion controller. The gesture recognition system was applied in the open control system (O.C.E.A.N.) of the VC 760 machine tool. As part of the research, sets of gestures allowing a CNC machine tool to be controlled were developed. The article discusses the concept and structure of the system as well as the results of the tests that were carried out. The summary points out the advantages and potential problems related to the structure and application of the system, and outlines plans for its further development.
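The core idea described in the abstract — translating recognized operator gestures into basic machine-tool functions — can be illustrated with a minimal dispatcher sketch. All gesture names and commands below are illustrative assumptions; the paper's actual gesture sets and the O.C.E.A.N. control interface are not reproduced here.

```python
# Hypothetical sketch: mapping recognized gestures to basic CNC commands.
# Gesture labels and command strings are assumed, not taken from the paper.
GESTURE_COMMANDS = {
    "swipe_right":   "JOG X+",        # jog the table along +X
    "swipe_left":    "JOG X-",        # jog the table along -X
    "open_palm":     "FEED HOLD",     # pause the running program
    "closed_fist":   "CYCLE START",   # resume / start the program
    "both_hands_up": "EMERGENCY STOP",
}

def dispatch_gesture(gesture: str) -> str:
    """Translate a recognized gesture label into a machine command.

    Unknown gestures map to a no-op, so that a misclassified gesture
    never triggers an unintended machine action.
    """
    return GESTURE_COMMANDS.get(gesture, "NO-OP")

if __name__ == "__main__":
    for g in ("open_palm", "wave"):  # "wave" is deliberately unmapped
        print(g, "->", dispatch_gesture(g))
```

Defaulting unrecognized gestures to a no-op rather than raising an error reflects a common safety-first choice in gesture-controlled machinery: false positives are more dangerous than missed commands.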

Information

Article type: Original article

Published at: 27.12.2017

Article status: Open

Licence: None

Percentage share of authors:

Karol Miądlicki (Author) - 50%
Mateusz Saków (Author) - 50%

Article corrections:

-

Publication languages:

English