Optimising Image-guided Needle Biopsy Procedures
Keywords:
Image-guided needle biopsy, computer vision, surgical robotic arm, medical imaging

Abstract
Image-guided needle biopsy is a minimally invasive technique for obtaining tissue samples for diagnosis or treatment. Surgical robotic arms can perform needle biopsy procedures with greater precision and accuracy than manual techniques; controlling them is difficult, however, because patient movement and obstructions along the needle's path must be taken into account. Computer vision technologies allow surgical robotic arms to be controlled more effectively during image-guided needle biopsy procedures: they enable real-time tracking of the needle tip, detection and avoidance of obstructions in the needle's path, compensation for patient motion, and immediate feedback to the surgeon on the progress of the procedure. In this work, a cancer recognition and tracking system was developed that uses computer vision to extract cancer center coordinates from pixel images. The significance of the proposed approach lies in its ability to improve the effectiveness and precision of surgical robotic arm control, allowing surgeons to perform a wider range of procedures with greater ease and accuracy. The study also investigates multi-modal imaging, including CT, MRI, and ultrasound, leveraging computer vision to localize tissue and guide the needle precisely. The goal of creating a system that localizes and segments brain tumors was achieved: the center coordinates of brain tumors were extracted from CT scan images by applying computer vision techniques. This resulted in increased procedural accuracy and precision, a lower risk of damaging blood vessels and nerves, better visualization of the procedure for the surgeon, and fewer required needle insertions. Overall, using computer vision to optimize the control of surgical robotic arms during image-guided needle biopsy procedures could improve the safety and effectiveness of this crucial medical process.
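As a minimal illustration of the kind of pixel-level processing the abstract describes, extracting a lesion's center coordinates from a scan, the sketch below thresholds a grayscale CT slice and computes the centroid of the largest connected region with OpenCV. The file name, the use of Otsu thresholding, and the assumption that the lesion appears as the largest bright region are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch (not the authors' implementation): locate a candidate lesion
# region in a grayscale CT slice and report its center in pixel coordinates.
import cv2
import numpy as np

# Hypothetical input file; in practice the slice would come from the imaging pipeline.
slice_gray = cv2.imread("ct_slice.png", cv2.IMREAD_GRAYSCALE)

# Assumed preprocessing: blur to suppress noise, then Otsu thresholding,
# under the assumption that the lesion is brighter than surrounding tissue.
blurred = cv2.GaussianBlur(slice_gray, (5, 5), 0)
_, mask = cv2.threshold(blurred, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Take the largest external contour as the candidate lesion region.
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
largest = max(contours, key=cv2.contourArea)

# Image moments give the centroid (center coordinates) of that region.
m = cv2.moments(largest)
cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
print(f"Estimated lesion center (pixel coordinates): ({cx}, {cy})")
```

In a complete system these pixel coordinates would still have to be mapped into the robot's coordinate frame via the imaging modality's calibration before commanding the needle; that registration step is outside the scope of this sketch.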