A distributed framework for monocular visual SLAM

dc.contributor.author: Egoda Gamage, Ruwan
dc.contributor.author: Tuceryan, Mihran
dc.contributor.department: Computer and Information Science, School of Science
dc.date.accessioned: 2018-04-26T17:43:39Z
dc.date.available: 2018-04-26T17:43:39Z
dc.date.issued: 2017-04
dc.description.abstract: In Distributed Simultaneous Localization and Mapping (SLAM), multiple agents generate a global map of the environment while each performs its own local SLAM operation. One of the main challenges is identifying overlapping maps, especially when agents do not know their relative starting positions. In this paper we introduce a distributed framework that uses an appearance-based method to identify map overlaps. Our framework generates a global semi-dense map using multiple monocular visual SLAM agents, each localizing itself in this map.
dc.eprint.version: Author's manuscript
dc.identifier.citation: Egodagamage, R., & Tuceryan, M. (2017). A distributed framework for monocular visual SLAM (pp. 55–61). Presented at the 28th Modern Artificial Intelligence and Cognitive Science Conference, MAICS 2017.
dc.identifier.uri: https://hdl.handle.net/1805/15917
dc.language.iso: en
dc.relation.journal: Proceedings of the 28th Modern Artificial Intelligence and Cognitive Science Conference 2017
dc.rights: Publisher Policy
dc.source: Author
dc.subject: Simultaneous Localization and Mapping
dc.subject: distributed framework
dc.subject: monocular visual SLAM
dc.title: A distributed framework for monocular visual SLAM
dc.type: Conference proceedings
Files
Original bundle
Name: Egodagamage_2017_distributed.pdf
Size: 3.47 MB
Format: Adobe Portable Document Format
License bundle
Name: license.txt
Size: 1.99 KB
Description: Item-specific license agreed upon to submission