Reimagining Accessibility: Leveraging Deep Learning in Smartphone Applications to Assist Visually Impaired People Indoor Object Distance Estimation


  • Talal Saleem, Asia Pacific University of Technology & Innovation
  • V Sivakumar, Asia Pacific University of Technology & Innovation



Artificial Intelligence, Distance Estimation, Mobile Deep Learning, Object Detection, Visual Impairment, Indoor Navigation


Nearly every aspect of daily life is organized around sight, so a person with vision impairment suffers severe limits on independent mobility and quality of life. The proposed framework combines mobile deep learning with distance estimation algorithms to detect and classify indoor objects, with estimated distances, in real time during indoor navigation. The user wears the device on a lanyard or holds it so that the camera faces forward; the system then identifies surrounding indoor objects in real time and announces them, with estimated distances, through voice commentary. The framework also reports the estimated distance to obstacles and suggests a safe navigational path through voice-guided feedback. By harnessing the power of deep learning in a mobile setting, the framework aims to give visually impaired individuals a higher degree of independence in indoor navigation. The proposed mobile object detection and distance estimation framework achieved 99.75% accuracy. This research contributes a state-of-the-art approach that leverages mobile deep learning for real-time object identification, classification, and distance estimation, applying the latest technologies to the indoor mobility challenges faced by visually impaired people.
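The distance-estimation step described in the abstract can be illustrated with a common monocular technique: given a detected object's bounding-box height in pixels and a known real-world height for its class, the pinhole-camera model yields an approximate distance. This is only a minimal sketch under stated assumptions — the class-height priors, focal length, and function names below are illustrative, not the authors' actual algorithm, which is detailed in the full paper.

```python
# Hypothetical monocular distance estimation via the pinhole-camera model.
# Assumptions (not from the paper): per-class real-world heights and a
# camera focal length expressed in pixels.

KNOWN_HEIGHTS_M = {"chair": 0.9, "door": 2.0, "table": 0.75}  # assumed priors
FOCAL_LENGTH_PX = 1500.0  # assumed focal length of the phone camera, in pixels

def estimate_distance_m(label: str, bbox_height_px: float) -> float:
    """Distance ~ focal_length_px * real_height_m / bbox_height_px."""
    real_height_m = KNOWN_HEIGHTS_M[label]
    return FOCAL_LENGTH_PX * real_height_m / bbox_height_px

# Example: a detected chair whose bounding box is 300 px tall
# sits roughly 1500 * 0.9 / 300 = 4.5 m away.
```

In a pipeline like the one the abstract describes, an estimate of this kind would be computed per detection and then passed to the voice-feedback layer ("chair, about four and a half meters ahead").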




M. E. Gilbert, “Digitally Engaging Older Populations in Healthcare Requires New Practices to Be Effective,” Gartner, pp. 1–21, 2020.

G. Qiao, H. Song, B. Prideaux, and S. (Sam) Huang, “The ‘unseen’ tourism: Travel experience of people with visual impairment,” Ann. Tour. Res., vol. 99, p. 103542, 2023. DOI:

L. Kay, “A sonar aid to enhance spatial perception of the blind: engineering design and evaluation,” Radio Electron. Eng., pp. 605–627, 1974. DOI:

B. Durette, N. Louveton, D. Alleysson, and J. Herault, “Visuo-auditory sensory substitution for mobility assistance: testing TheVIBE,” Work. Comput. Vis. Appl. Vis. Impair. - Marseille, Fr., 2008.

B. D. C. Martinez, O. O. V. Villegas, V. G. C. Sanchez, H. de J. O. Domínguez, and L. O. Maynez, “Visual perception substitution by the auditory sense,” Int. Conf. Comput. Sci. Its Appl. Springer, Berlin, Heidelb., pp. 522–533, 2011. DOI:

L. Tepelea, V. Tiponut, P. Szolgay, and A. Gacsadi, “Multicore portable system for assisting visually impaired people,” Int. Work. Cell. Nanoscale Networks their Appl., pp. 3–4, 2014. DOI:

H. Jabnoun, F. Benzarti, and H. Amiri, “Object detection and identification for blind people in video scene,” 15th Int. Conf. Intell. Syst. Des. Appl. - IEEE - 2015, 2015. DOI:

A. N. Zereen and S. Corraya, “Detecting real time object along with the moving direction for visually impaired people,” ICECTE 2016 - 2nd Int. Conf. Electr. Comput. Telecommun. Eng., no. December, pp. 8–10, 2016. DOI:

C. T. Patel, V. J. Mistry, L. S. Desai, and M. Y. K., “Multisensor-based object detection in indoor environment for visually impaired people,” 2018 Second Int. Conf. Intell. Comput. Control Syst. (ICICCS)-IEEE, no. Iciccs, pp. 2018–2021, 2018. DOI:

L. Árvai, “Mobile phone based indoor navigation system for blind and visually impaired people: VUK - Visionless supporting frameworK,” Proc. 2018 19th Int. Carpathian Control Conf. ICCC 2018, pp. 383–388, 2018. DOI:

Y. Lin, K. Wang, W. Yi, and S. Lian, “Deep Learning based Wearable Assistive System for Visually Impaired People,” Proc. IEEE/CVF Int. Conf. Comput. Vis. Work., 2019. DOI:

S. Ooi, T. Okita, and M. Sano, “Study on A Navigation System for Visually Impaired Persons based on Egocentric Vision Using Deep Learning,” ACM Int. Conf. Proceeding Ser., pp. 68–72, 2020. DOI:

S. Mahmud, R. Haque Sourave, M. Islam, X. Lin, and J. H. Kim, “A vision based voice controlled indoor assistant robot for visually impaired people,” IEMTRONICS 2020 - Int. IOT, Electron. Mechatronics Conf. Proc., 2020. DOI:

B. Strbac, M. Gostovic, Z. Lukac, and D. Samardzija, “YOLO Multi-Camera Object Detection and Distance Estimation,” 2020 Zooming Innov. Consum. Technol. Conf. ZINC 2020, pp. 26–30, 2020. DOI:

K. Karthika, S. Adarsh, and K. I. Ramachandran, “Distance Estimation of Preceding Vehicle Based on Mono Vision Camera and Artificial Neural Networks,” 2020 11th Int. Conf. Comput. Commun. Netw. Technol. ICCCNT 2020, 2020. DOI:

N. Sakic, M. Krunic, S. Stevic, and M. Dragojevic, “Camera-LIDAR Object Detection and Distance Estimation with Application in Collision Avoidance System,” IEEE Int. Conf. Consum. Electron. - Berlin, ICCE-Berlin, vol. 2020-Novem, 2020. DOI:

M. M. Rahman, M. M. Islam, S. Ahmmed, and S. A. Khan, “Obstacle and Fall Detection to Guide the Visually Impaired People with Real Time Monitoring,” SN Comput. Sci. - Springer, vol. 1, no. 4, pp. 1–10, 2020. DOI:

Microsoft, “Seeing AI,” Microsoft Corporation, 2021. [Online]. Available: [Accessed: 01-Feb-2021].

Envision, “Envision AI,” Envision Technologies B.V., 2021. [Online]. Available: [Accessed: 01-Feb-2021].

Aipoly, “Aipoly Vision,” Aipoly - V7 Ltd, 2021. [Online]. Available: [Accessed: 01-Feb-2021].

TapTapSee, “TapTapSee,” Cloudsight, Inc, 2021. [Online]. Available: [Accessed: 02-Feb-2021].

BeMyEyes, “Be My Eyes,” Be My Eyes, 2021. [Online]. Available: [Accessed: 02-Feb-2021].




How to Cite

T. Saleem and V. Sivakumar, “Reimagining Accessibility: Leveraging Deep Learning in Smartphone Applications to Assist Visually Impaired People Indoor Object Distance Estimation”, EAI Endorsed Trans IoT, vol. 10, Jul. 2024.