Smart Approach for Finding Indoor Navigation Using BLE for Visually Impaired Person
DOI: https://doi.org/10.26438/ijcse/v7i8.8893

Keywords: Indoor navigation, BLE beacon triangulation, blind navigation, wayfinding, robotic navigation aid, pose estimation

Abstract
The problems faced by visually impaired persons are increasing with the rapid growth of urbanization; even a sighted person can become confused when encountering an unfamiliar location. To address this problem, this paper proposes a robust system that helps users navigate large industrial buildings. The system uses BLE (Bluetooth Low Energy) beacons that communicate with hardware carried by the user, which then guides the user along the route. The user speaks the desired destination as voice input; the system then localizes the user by connecting the carried hardware to nearby BLE beacons and navigates based on the signal strength received from each beacon. If the signal strength from a beacon decreases, the user is moving away from it; if it increases, the user is moving toward it. To improve accuracy, we implement a three-dimensional triangulation technique in which the user's hardware connects to multiple BLE beacons simultaneously and computes the required navigation route. In addition, IR (infrared) and sonar sensors detect any obstacle that appears in the user's path, and a buzzer and LED lights alert others nearby to the obstacle.
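The abstract describes two computational steps, ranging from signal strength and triangulation over multiple beacons, without giving formulas. A minimal sketch in Python, assuming the standard log-distance path-loss model for RSSI ranging and a planar (2-D) special case of the triangulation; the function names, the 1 m calibration value `tx_power_dbm`, and the path-loss exponent are illustrative assumptions, not the paper's implementation:

```python
import math

def rssi_to_distance(rssi_dbm, tx_power_dbm=-59.0, path_loss_exp=2.0):
    """Estimate beacon distance (metres) from a received RSSI reading.

    Assumed log-distance path-loss model: tx_power_dbm is the calibrated
    RSSI at 1 m; path_loss_exp is ~2 in free space, higher indoors.
    """
    return 10.0 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exp))

def trilaterate_2d(b1, b2, b3):
    """Locate the user from three (x, y, distance) beacon tuples.

    Subtracting the first circle equation from the other two yields a
    2x2 linear system, solved here with Cramer's rule.
    """
    (x1, y1, d1), (x2, y2, d2), (x3, y3, d3) = b1, b2, b3
    a11, a12 = 2.0 * (x2 - x1), 2.0 * (y2 - y1)
    a21, a22 = 2.0 * (x3 - x1), 2.0 * (y3 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21  # zero if the beacons are collinear
    return ((c1 * a22 - c2 * a12) / det, (a11 * c2 - a21 * c1) / det)

# A reading of -79 dBm under the assumed calibration gives ~10 m.
print(rssi_to_distance(-79.0))  # -> 10.0

# Three beacons at known positions; the true position is (1, 1).
pos = trilaterate_2d((0, 0, math.sqrt(2)),
                     (4, 0, math.sqrt(10)),
                     (0, 4, math.sqrt(10)))
print(pos)  # -> (1.0, 1.0)
```

In practice raw RSSI is noisy, so readings are typically smoothed (e.g. with a moving average or Kalman filter) before ranging, and the full 3-D case adds a fourth beacon and a third linear equation in the same way.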
References
[1] H. Zhang and C. Ye, "An Indoor Wayfinding System Based on Geometric Features Aided Graph SLAM for the Visually Impaired," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 25, no. 9, Sept. 2017.
[2] D. Yuan and R. Manduchi, “A Tool for Range Sensing and Environment Discovery for the Blind,” in Proc. IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, 2004.
[3] K. Tsukada and M. Yasumura, “Activebelt: Belt-type wearable tactile display for directional navigation,” in Proc. Ubiquitous Comput., 2004, pp. 384–399.
[4] F. Endres, J. Hess, N. Engelhard, J. Sturm, D. Cremers, and W. Burgard, “An evaluation of the RGB-D SLAM system,” in Proc. IEEE Int. Conf. Robotics and Automation, 2012, pp. 1691-1696.
[5] A. Tamjidi, C. Ye, and S. Hong, “6-DOF pose estimation of a portable navigation aid for the visually impaired,” in Proc. IEEE international symposium on robotic and sensors environments, 2013, pp. 178-183.
[6] C. Ye, S. Hong, and A. Tamjidi, “6-DOF pose estimation of a robotic navigation aid by tracking visual and geometric features,” IEEE Trans. Autom. Sci. Eng., vol. 12, no. 4, pp. 1169-1180, Oct. 2015.
[7] V. Kulyukin, C. Gharpure, J. Nicholson, and G. Osborne, “Robot-assisted wayfinding for the visually impaired in structured indoor environments,” Auton. Robot., vol. 21, no. 1, pp. 29-41, 2006.
[8] J. A. Hesch and S. I. Roumeliotis, “Design and analysis of a portable indoor localization aid for the visually impaired,” Int. J. Robot. Res., vol. 29, no. 11, pp. 1400-1415, 2010.
[9] T. Bailey and H. Durrant-Whyte, “Simultaneous Localization and Mapping (SLAM): Part II,” IEEE Robotics Automation Magazine, vol. 13, no. 3, pp. 108-117, 2006.
[10] M. Kaess, A. Ranganathan, and F. Dellaert, “iSAM: Incremental smoothing and mapping,” IEEE Transactions on Robotics, vol. 24, no. 6, pp. 1365-1378, 2008.
[11] G. Klein and D. Murray, “Parallel tracking and mapping for small AR workspaces,” in Proc. IEEE and ACM International Symposium on Mixed and Augmented Reality, 2007, pp. 225-234.
[12] R. A. Newcombe, S. J. Lovegrove and A.J. Davison, “DTAM: Dense tracking and mapping in real-time,” in Int. Conf. Computer Vision, 2011, pp. 2320-2327.
[13] J. Engel, T. Schöps, and D. Cremers, “LSD-SLAM: Large-scale direct monocular SLAM,” in Proc. European Conference on Computer Vision, 2014, pp. 834–849.
License

This work is licensed under a Creative Commons Attribution 4.0 International License.
Authors contributing to this journal agree to publish their articles under the Creative Commons Attribution 4.0 International License, allowing third parties to share their work (copy, distribute, transmit) and to adapt it, under the condition that the authors are given credit and that in the event of reuse or distribution, the terms of this license are made clear.
