A Review on Data Fusion and Integration

Authors

  • Roja T, MCA, Mother Theresa Institute of Computer Applications, Palamaner; Sri Venkateswara University, Tirupathi, Andhra Pradesh, India
  • Kumar AM, MCA, Mother Theresa Institute of Computer Applications, Palamaner; Sri Venkateswara University, Tirupathi, Andhra Pradesh, India

Keywords:

Autonomous Mobile Robots, Multi-sensor Data Fusion, Multi-sensor Integration

Abstract

One of the most important and valuable features of autonomous mobile robots is their ability to adapt themselves to operate in unstructured environments. Today, robots perform autonomously in industrial environments as well as in crowded public places. The basic requirement of an intelligent mobile robot is to build and maintain localization and mapping estimates in order to complete complex missions. In such situations, several challenges arise because of errors and uncertainties in sensor measurements. Various techniques exist to handle such noise, among which multi-sensor data fusion is a prominent one. During the last two decades, multi-sensor data fusion in mobile robots has become a dominant paradigm because of its potential advantages, such as reduction in uncertainty, increase in accuracy, and reduction of cost. This paper presents a detailed survey of multi-sensor data fusion and its applications for autonomous mobile robots.
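To make the uncertainty-reduction claim concrete, the following is a minimal illustrative sketch (in Python, not taken from the paper) of the simplest static form of multi-sensor data fusion: combining two independent Gaussian range readings of the same distance by inverse-variance weighting. The sensor names and numerical values are hypothetical.

```python
def fuse(mean_a, var_a, mean_b, var_b):
    """Fuse two independent Gaussian estimates by inverse-variance (precision) weighting."""
    var_f = 1.0 / (1.0 / var_a + 1.0 / var_b)           # fused variance is smaller than either input
    mean_f = var_f * (mean_a / var_a + mean_b / var_b)   # precision-weighted mean
    return mean_f, var_f

# Hypothetical readings of the same distance from a sonar and a laser rangefinder
sonar_mean, sonar_var = 2.10, 0.09   # metres, variance in m^2
laser_mean, laser_var = 2.02, 0.01

fused_mean, fused_var = fuse(sonar_mean, sonar_var, laser_mean, laser_var)
print(f"fused estimate: {fused_mean:.3f} m, variance: {fused_var:.4f} m^2")
```

Under these assumed values the fused variance comes out to 0.009 m^2, below both the sonar's 0.09 m^2 and the laser's 0.01 m^2, which is the reduction in uncertainty that the abstract refers to.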

Published

2025-11-25

How to Cite

[1] T. Roja and A. M. M. Kumar, “A Review on Data Fusion and Integration”, Int. J. Comp. Sci. Eng., vol. 7, no. 6, pp. 77–81, Nov. 2025.