
Mapping Surgeons' Hand/Finger Movements to Surgical Tool Motion During Conventional Microsurgery Using Machine Learning

    https://doi.org/10.1142/S2424905X21500045

    Purpose: Recent developments in robotics and artificial intelligence (AI) have led to significant advances in healthcare technology, enhancing robot-assisted minimally invasive surgery (RAMIS) in several surgical specialties. However, current human–robot interfaces lack intuitive teleoperation and cannot mimic the surgeon's hand/finger sensing required for fine-motion microsurgery. These limitations make teleoperated robotic surgery less suitable for, e.g., cardiac surgery, and difficult for established surgeons to learn. We report a pilot study demonstrating an intuitive way of recording and mapping a surgeon's gross hand motion and fine synergic motion during cardiac microsurgery, as a step towards more intuitive future teleoperation.

    Methods: We set out to develop a prototype system able to train a Deep Neural Network (DNN) by mapping wrist, hand and surgical tool real-time data acquisition (RTDA) inputs during mock-up heart microsurgery procedures. The trained network was used to estimate the tool's pose from refined hand joint angles. The network's outputs, surgical tool orientation and jaw angle, were acquired as ground truth by an optical motion capture system.

    Results: Based on the surgeon's feedback during mock microsurgery, the developed wearable system with lightweight motion-tracking sensors did not interfere with the surgery or instrument handling. The wearable motion tracking system used 12 finger/thumb/wrist joint angle sensors to generate meaningful datasets serving as DNN inputs, with new hand joint angles added as necessary based on comparing the estimated tool pose against the measured tool pose. The DNN architecture was optimized for the highest estimation accuracy, i.e. the ability to determine the tool pose with the least mean squared error. This novel approach showed that the surgical instrument's pose, an essential requirement for teleoperation, can be accurately estimated from the recorded surgeon's hand/finger movements, with a mean squared error (MSE) of less than 0.3%.
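    The regression described above — 12 joint-angle inputs mapped onto tool orientation and jaw angle under an MSE loss — can be sketched as a small feed-forward network. This is an illustrative reconstruction, not the authors' implementation: the layer sizes, the synthetic linear stand-in for the motion-capture data, and the training schedule are all assumptions.

```python
# Sketch of a DNN regressing 12 hand/finger joint angles onto 4 pose outputs
# (3 tool-orientation angles + jaw angle), trained with MSE loss.
# All sizes and data here are illustrative, not from the paper.
import math
import random

random.seed(0)

N_IN, N_HID, N_OUT = 12, 16, 4  # 12 joint-angle inputs, 4 pose outputs

# Single hidden layer with tanh activation; linear output layer.
W1 = [[random.uniform(-0.5, 0.5) for _ in range(N_IN)] for _ in range(N_HID)]
b1 = [0.0] * N_HID
W2 = [[random.uniform(-0.5, 0.5) for _ in range(N_HID)] for _ in range(N_OUT)]
b2 = [0.0] * N_OUT

def forward(x):
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W1, b1)]
    y = [sum(w * hi for w, hi in zip(row, h)) + b for row, b in zip(W2, b2)]
    return h, y

# Synthetic stand-in for the recorded dataset: tool pose as a fixed linear
# combination of joint angles (the real mapping is learned from motion capture).
TRUE = [[random.uniform(-0.3, 0.3) for _ in range(N_IN)] for _ in range(N_OUT)]
def target(x):
    return [sum(w * xi for w, xi in zip(row, x)) for row in TRUE]

data = [[random.uniform(-1, 1) for _ in range(N_IN)] for _ in range(200)]

lr = 0.05
for epoch in range(300):
    for x in data:
        t = target(x)
        h, y = forward(x)
        err = [yi - ti for yi, ti in zip(y, t)]  # dMSE/dy (factor 2 folded into lr)
        # Hidden-layer gradients computed before W2 is updated.
        gs = [sum(err[o] * W2[o][j] for o in range(N_OUT)) * (1 - h[j] ** 2)
              for j in range(N_HID)]
        for o in range(N_OUT):
            for j in range(N_HID):
                W2[o][j] -= lr * err[o] * h[j]
            b2[o] -= lr * err[o]
        for j in range(N_HID):
            for i in range(N_IN):
                W1[j][i] -= lr * gs[j] * x[i]
            b1[j] -= lr * gs[j]

# Per-element mean squared error over the training set.
mse = sum(sum((yi - ti) ** 2 for yi, ti in zip(forward(x)[1], target(x)))
          for x in data) / (len(data) * N_OUT)
print(f"training MSE: {mse:.5f}")
```

    In the paper's setting the targets would come from the optical motion capture system rather than a synthetic linear map, and the architecture would be tuned (as the authors describe) for the lowest tool-pose estimation error.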

    Conclusion: We have developed a system to capture the fine movements of a surgeon's hand during microsurgery that could enhance future remote teleoperation of similar surgical tools. More work is needed to refine this approach and confirm its potential role in teleoperation.

    This paper was recommended for publication in its revised form by editorial board member Dan Stoianovici.

    NOTICE: Prior to using any material contained in this paper, the users are advised to consult with the individual paper author(s) regarding the material contained in this paper, including but not limited to, their specific design(s) and recommendation(s).
