Enabling Immersive Indoor Navigation and Control Through Augmented Reality With Computer Vision
M. Pasan Sanjaya Fernando, A. G. Tharindu Gimras, J. M. Budshan P. Jayasundara, D. P. Chandima
Abstract
This project integrates computer vision, Augmented Reality (AR), and cloud-based communication to create a real-time 3D map of the robot's surroundings. The outcome is an engaging and immersive control experience for users in indoor robot monitoring and control. Furthermore, combining an image semantic segmentation model with the Microsoft Kinect V2 depth camera introduces a novel method for estimating the distance to a particular object with 1 cm accuracy. This precise depth estimation is a significant benefit for tasks that require accurate object identification, segmentation, and depth information to generate a virtual representation of the robot's real-world environment. We have enabled remote communication with the robot's environment by incorporating AWS cloud services, ensuring minimal latency. The commercial viability of this project lies in the wide range of applications that find the system useful due to its easy-to-use interface, remote access capability, and precise object rendering. Examples of such applications include surgical robotics, instructional robotics platforms in classrooms, and automated guided vehicles in warehouses.
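The core idea of combining a segmentation mask with the Kinect V2 depth frame can be illustrated with a minimal sketch. The function below is an assumption about the approach, not the paper's actual implementation: it takes a per-pixel depth map (Kinect V2 reports depth in millimetres, with 0 for invalid pixels) and a boolean object mask produced by a segmentation model, and returns a robust distance estimate by taking the median depth over the valid masked pixels.

```python
import numpy as np

def object_distance_cm(depth_mm: np.ndarray, mask: np.ndarray) -> float:
    """Estimate the distance to a segmented object from a depth map.

    depth_mm: HxW depth image in millimetres (Kinect V2 convention).
    mask:     HxW boolean segmentation mask for the target object.
    Returns the median depth over valid masked pixels, in centimetres.
    """
    valid = mask & (depth_mm > 0)  # Kinect V2 reports 0 for invalid pixels
    if not valid.any():
        raise ValueError("no valid depth readings inside the mask")
    return float(np.median(depth_mm[valid])) / 10.0  # mm -> cm

# Toy example: a 4x4 depth frame with one segmented object
depth = np.array([[0, 1500, 1510, 0],
                  [0, 1490, 1505, 0],
                  [0,    0,    0, 0],
                  [0,    0,    0, 0]], dtype=np.uint16)
mask = np.zeros_like(depth, dtype=bool)
mask[0:2, 1:3] = True  # mark the object's pixels
print(object_distance_cm(depth, mask))  # median of the 4 readings: 150.25 cm
```

Using the median rather than the mean makes the estimate resilient to segmentation spill-over onto background pixels and to depth-sensor noise at object edges.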
Cite this paper
Fernando, M. Pasan Sanjaya, Gimras, A. G. Tharindu, Jayasundara, J. M. Budshan P., & Chandima, D. P. (2024). Enabling Immersive Indoor Navigation and Control Through Augmented Reality With Computer Vision. In MERCon 2024 - IEEE. https://doi.org/10.1109/MERCon63886.2024.10688929
M. Pasan Sanjaya Fernando, A. G. Tharindu Gimras, J. M. Budshan P. Jayasundara, D. P. Chandima, "Enabling Immersive Indoor Navigation and Control Through Augmented Reality With Computer Vision," in MERCon 2024 - IEEE, 2024. doi: 10.1109/MERCon63886.2024.10688929.
@inproceedings{fernando2024enabling,
title = {Enabling Immersive Indoor Navigation and Control Through Augmented Reality With Computer Vision},
author = {Fernando, M. Pasan Sanjaya and Gimras, A. G. Tharindu and Jayasundara, J. M. Budshan P. and Chandima, D. P.},
booktitle = {MERCon 2024 - IEEE},
year = {2024},
publisher = {IEEE},
doi = {10.1109/MERCon63886.2024.10688929},
url = {https://ieeexplore.ieee.org/document/10688929/},
}

Related Publications
CasteriX: A Wheel Configurable Caster Wheel-Based Prototype Design for Electric Wheelchair Motion Dynamics Research
A wheel-configurable modular research platform for electric wheelchair caster wheel motion dynamics analysis, supporting 212 wheel configurations across three major wheelchair types, recognized with the Best Poster Award at ICIPRoB 2026.
Adaptive Navigation of a Transformer Robot in Warehouse Environments
An adaptive navigation framework for transformer robots in warehouse environments, integrating overhead camera perception, binary segmentation, and a shape-aware A* algorithm to determine optimal robot configurations through narrow passages.