
dc.contributor.author: Alonso, Ignacio Parra [en_US]
dc.contributor.author: Llorca, David Fernandez [en_US]
dc.contributor.author: Gavilan, Miguel [en_US]
dc.contributor.author: Pardo, Sergio Alvarez [en_US]
dc.contributor.author: Garcia-Garrido, Miguel Angel [en_US]
dc.contributor.author: Vlacic, Ljubo [en_US]
dc.contributor.author: Sotelo, Miguel Angel [en_US]
dc.date.accessioned: 2017-05-03T12:07:44Z
dc.date.available: 2017-05-03T12:07:44Z
dc.date.issued: 2012 [en_US]
dc.date.modified: 2013-06-25T00:42:15Z
dc.identifier.issn: 15249050 [en_US]
dc.identifier.doi: 10.1109/TITS.2012.2193569 [en_US]
dc.identifier.uri: http://hdl.handle.net/10072/51880
dc.description.abstract: Over the past few years, advanced driver-assistance systems (ADASs) have become a key element in the research and development of intelligent transportation systems (ITSs), and particularly of intelligent vehicles. Many of these systems require accurate global localization, which has traditionally been provided by the Global Positioning System (GPS) despite its well-known failings, particularly in urban environments. Different solutions have been attempted to bridge GPS positioning errors, but they usually require additional expensive sensors. Vision-based algorithms have proved capable of tracking the position of a vehicle over long distances using only a sequence of images as input and no prior knowledge of the environment. This paper describes a full solution to estimating the global position of a vehicle on a digital road map by means of visual information alone. Our solution is based on a stereo platform used to estimate the motion trajectory of the ego vehicle and a map-matching algorithm that corrects the cumulative errors of the vision-based motion estimate and locates the vehicle globally on the digital road map. We demonstrate the system in large-scale urban experiments, reaching high accuracy in the estimated global position and tolerating longer GPS blackouts thanks to both the high accuracy of the visual odometry and the map-matching correction of its cumulative error. Typical challenging urban situations, such as nonstatic objects or illumination exceeding the dynamic range of the cameras, are shown and discussed. [en_US]
dc.description.peerreviewed: Yes [en_US]
dc.description.publicationstatus: Yes [en_US]
dc.language: English [en_US]
dc.publisher: Institute of Electrical and Electronics Engineers [en_US]
dc.publisher.place: United States [en_US]
dc.relation.ispartofstudentpublication: N [en_US]
dc.relation.ispartofpagefrom: 1535 [en_US]
dc.relation.ispartofpageto: 1545 [en_US]
dc.relation.ispartofissue: 4 [en_US]
dc.relation.ispartofjournal: IEEE Transactions on Intelligent Transportation Systems [en_US]
dc.relation.ispartofvolume: 13 [en_US]
dc.rights.retention: Y [en_US]
dc.subject.fieldofresearch: Control Systems, Robotics and Automation [en_US]
dc.subject.fieldofresearchcode: 090602 [en_US]
dc.title: Accurate global localization using visual odometry and digital maps on urban environments [en_US]
dc.type: Journal article [en_US]
dc.type.description: C1 - Peer Reviewed (HERDC) [en_US]
dc.type.code: C - Journal Articles [en_US]
gro.date.issued: 2012
gro.hasfulltext: No Full Text
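
The abstract describes a two-stage pipeline: stereo visual odometry dead-reckons the ego-vehicle pose, and a map-matching step against a digital road map corrects the accumulated drift. The Python sketch below is only a rough illustration of that idea under simplifying assumptions, not the authors' algorithm; the 2-D pose representation, the polyline road model, and the snap threshold are all hypothetical choices made for the example.

# Minimal sketch (not the paper's implementation): dead-reckoning from
# visual-odometry increments, plus a naive map-matching step that snaps the
# accumulated position to the nearest road segment to bound drift.
# All data structures and thresholds here are illustrative assumptions.
import math

def integrate_vo(pose, delta):
    """Compose a 2-D pose (x, y, heading) with a relative VO increment
    (dx, dy, dtheta) expressed in the vehicle frame."""
    x, y, th = pose
    dx, dy, dth = delta
    return (x + dx * math.cos(th) - dy * math.sin(th),
            y + dx * math.sin(th) + dy * math.cos(th),
            th + dth)

def project_onto_segment(p, a, b):
    """Closest point to p on the segment a-b (all 2-D tuples)."""
    ax, ay = a; bx, by = b; px, py = p
    vx, vy = bx - ax, by - ay
    seg_len2 = vx * vx + vy * vy
    if seg_len2 == 0.0:
        return a
    t = max(0.0, min(1.0, ((px - ax) * vx + (py - ay) * vy) / seg_len2))
    return (ax + t * vx, ay + t * vy)

def map_match(pose, road_segments, max_snap_dist=5.0):
    """Snap the (x, y) part of the pose to the nearest road segment if it is
    within max_snap_dist metres; otherwise keep the raw VO estimate."""
    x, y, th = pose
    best, best_d = None, float("inf")
    for a, b in road_segments:
        q = project_onto_segment((x, y), a, b)
        d = math.hypot(q[0] - x, q[1] - y)
        if d < best_d:
            best, best_d = q, d
    if best is not None and best_d <= max_snap_dist:
        return (best[0], best[1], th)
    return pose

# Toy usage: a straight road along the x-axis and slightly drifting VO steps.
road = [((0.0, 0.0), (100.0, 0.0))]
pose = (0.0, 0.0, 0.0)
for step in [(1.0, 0.02, 0.0)] * 50:   # forward motion with lateral drift
    pose = integrate_vo(pose, step)
    pose = map_match(pose, road)
print(pose)  # y stays near 0 because map matching cancels the lateral drift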


Files in this item


There are no files associated with this item.

This item appears in the following Collection(s)

  • Journal articles
    Contains articles published by Griffith authors in scholarly journals.
