Real-Time 3-D Segmentation on An Autonomous Embedded System: Using Point Cloud and Camera

dc.contributor.authorKatare, Dewant
dc.contributor.authorEl-Sharkawy, Mohamed
dc.contributor.departmentElectrical and Computer Engineering, School of Engineering and Technologyen_US
dc.date.accessioned2021-02-12T18:19:18Z
dc.date.available2021-02-12T18:19:18Z
dc.date.issued2019-07
dc.description.abstractPresent-day autonomous vehicles rely on several sensor technologies for their autonomous functionality. Based on their type and mounting location on the vehicle, these sensors can be categorized as line-of-sight and non-line-of-sight sensors, and they are responsible for the different levels of autonomy. The line-of-sight sensors are used for tasks such as localization, object detection, and complete environment understanding. For an autonomous vehicle, understanding of the surroundings or environment can be achieved through segmentation. Several traditional and deep-learning techniques that provide semantic segmentation from camera input are already available; however, with advances in computing processors, the trend is toward deep-learning applications replacing the traditional methods. This paper presents an approach that combines camera and lidar input for semantic segmentation. The proposed model for outdoor scene segmentation is based on Frustum PointNet and ResNet; it uses the 3-D point cloud and camera input to predict 3-D bounding boxes for moving and non-moving objects, and thus recognizes and understands the scene at the point-cloud or pixel level. For real-time application, the model is deployed on the RTMaps framework with BlueBox (an embedded platform for autonomous vehicles). The proposed architecture is trained on the CityScapes and KITTI datasets.en_US
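The frustum step the abstract alludes to — using a camera detection to select the relevant subset of the lidar point cloud before 3-D box estimation — can be sketched as follows. This is a minimal illustration, not the paper's implementation; the intrinsic values, box coordinates, and function names are all assumed for the example:

```python
# Hypothetical sketch of the frustum-extraction stage used by
# Frustum PointNet-style pipelines: a 2-D detection box from the
# camera image is lifted into a 3-D viewing frustum, and only the
# lidar points that project inside the 2-D box are kept for the
# subsequent 3-D bounding-box estimation.
# Intrinsics and box values below are illustrative only.

def project_to_image(pt, fx, fy, cx, cy):
    """Project a 3-D point (camera coordinates, z forward) to pixel (u, v)."""
    x, y, z = pt
    return (fx * x / z + cx, fy * y / z + cy)

def points_in_frustum(points, box2d, fx, fy, cx, cy):
    """Keep the lidar points whose image projection lies inside the 2-D box."""
    u_min, v_min, u_max, v_max = box2d
    kept = []
    for pt in points:
        if pt[2] <= 0:  # behind the camera plane; cannot project
            continue
        u, v = project_to_image(pt, fx, fy, cx, cy)
        if u_min <= u <= u_max and v_min <= v <= v_max:
            kept.append(pt)
    return kept

if __name__ == "__main__":
    # Toy intrinsics and a single detection box (illustrative values).
    fx = fy = 700.0
    cx, cy = 620.0, 190.0
    box2d = (600.0, 170.0, 700.0, 260.0)
    cloud = [(0.5, 0.2, 10.0),   # projects inside the box -> kept
             (-5.0, 0.2, 10.0),  # projects left of the box -> dropped
             (0.5, 0.2, -3.0)]   # behind the camera       -> dropped
    print(points_in_frustum(cloud, box2d, fx, fy, cx, cy))
    # -> [(0.5, 0.2, 10.0)]
```

In the full pipeline, the points surviving this filter form the frustum point cloud that a PointNet-style network consumes to regress the 3-D box; lidar points would first be transformed into camera coordinates using the sensor calibration.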
dc.eprint.versionAuthor's manuscripten_US
dc.identifier.citationKatare, D., & El-Sharkawy, M. (2019). Real-Time 3-D Segmentation on An Autonomous Embedded System: Using Point Cloud and Camera. 2019 IEEE National Aerospace and Electronics Conference (NAECON), 356–361. https://doi.org/10.1109/NAECON46414.2019.9057988en_US
dc.identifier.urihttps://hdl.handle.net/1805/25210
dc.language.isoenen_US
dc.publisherIEEEen_US
dc.relation.isversionof10.1109/NAECON46414.2019.9057988en_US
dc.relation.journal2019 IEEE National Aerospace and Electronics Conference (NAECON)en_US
dc.rightsPublisher Policyen_US
dc.sourceAuthoren_US
dc.subjectautonomous embedded platformen_US
dc.subjectBLBX2en_US
dc.subjectcameraen_US
dc.titleReal-Time 3-D Segmentation on An Autonomous Embedded System: Using Point Cloud and Cameraen_US
dc.typeConference proceedingsen_US
Files
Original bundle
Name: Katare2019Real-Time.pdf
Size: 1.46 MB
Format: Adobe Portable Document Format
License bundle
Name: license.txt
Size: 1.99 KB
Description: Item-specific license agreed upon at submission