A Collaborative Augmented Reality Framework Based on Distributed Visual SLAM

Date
2017-09
Language
English
Found At
IEEE
Abstract

Visual Simultaneous Localization and Mapping (SLAM) has been used for markerless tracking in augmented reality applications. Distributed SLAM enables multiple agents to collaboratively explore and build a global map of the environment while estimating their own locations in it. One of the main challenges in distributed SLAM is identifying overlaps between the local maps of these agents, especially when their initial relative positions are not known. We developed a collaborative AR framework in which freely moving agents have no knowledge of their initial relative positions. Each agent in our framework uses a camera as the only input device for its SLAM process. Furthermore, the framework identifies map overlaps between agents using an appearance-based method.
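
The abstract does not describe the appearance-based overlap check in detail here. As a rough sketch of what such a check might look like, the Python snippet below scores keyframe pairs from two agents' local maps by matching ORB descriptors and reports the best-scoring pair above a similarity threshold. The function names, the threshold value, and the choice of ORB with brute-force matching are illustrative assumptions, not the authors' method, which could instead rely on a bag-of-visual-words place-recognition model.

# Illustrative sketch only: detect a candidate map overlap between two agents
# by comparing keyframe appearance with ORB descriptors. All names and the
# threshold are hypothetical, not taken from the paper.
import cv2

ORB = cv2.ORB_create(nfeatures=1000)
MATCHER = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
OVERLAP_THRESHOLD = 0.25  # fraction of matched descriptors; needs per-application tuning

def describe(keyframe_image):
    """Extract ORB binary descriptors from a grayscale keyframe image."""
    _, descriptors = ORB.detectAndCompute(keyframe_image, None)
    return descriptors

def similarity(desc_a, desc_b):
    """Appearance similarity: fraction of A's descriptors with a cross-checked match in B."""
    if desc_a is None or desc_b is None or len(desc_a) == 0:
        return 0.0
    matches = MATCHER.match(desc_a, desc_b)
    return len(matches) / float(len(desc_a))

def detect_overlap(keyframes_a, keyframes_b):
    """Return (i, j, score) for the most similar keyframe pair across the two
    agents' local maps, or None if no pair clears the threshold."""
    descriptors_b = [describe(img) for img in keyframes_b]
    best = None
    for i, img_a in enumerate(keyframes_a):
        desc_a = describe(img_a)
        for j, desc_b in enumerate(descriptors_b):
            score = similarity(desc_a, desc_b)
            if score >= OVERLAP_THRESHOLD and (best is None or score > best[2]):
                best = (i, j, score)
    return best

In a full system, a detected keyframe pair like this would typically seed estimation of the relative transformation between the two agents' maps so that they can be merged into a single global map, consistent with the framework described above.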

Cite As
Egodagamage, R., & Tuceryan, M. (2017). A Collaborative Augmented Reality Framework Based on Distributed Visual SLAM. In 2017 International Conference on Cyberworlds (CW) (pp. 25–32). https://doi.org/10.1109/CW.2017.47
Journal
2017 International Conference on Cyberworlds
Type
Conference proceedings
Version
Author's manuscript