Article overview
Argoverse: 3D Tracking and Forecasting with Rich Maps

Authors: Ming-Fang Chang; John Lambert; Patsorn Sangkloy; Jagjeet Singh; Slawomir Bak; Andrew Hartnett; De Wang; Peter Carr; Simon Lucey; Deva Ramanan; James Hays

Date: 6 Nov 2019

Abstract: We present Argoverse -- two datasets designed to support autonomous vehicle
machine learning tasks such as 3D tracking and motion forecasting. Argoverse
was collected by a fleet of autonomous vehicles in Pittsburgh and Miami. The
Argoverse 3D Tracking dataset includes 360 degree images from 7 cameras with
overlapping fields of view, 3D point clouds from long range LiDAR, 6-DOF pose,
and 3D track annotations. Notably, it is the only modern AV dataset that
provides forward-facing stereo imagery. The Argoverse Motion Forecasting
dataset includes more than 300,000 5-second tracked scenarios with a particular
vehicle identified for trajectory forecasting. Argoverse is the first
autonomous vehicle dataset to include "HD maps" with 290 km of mapped lanes
with geometric and semantic metadata. All data is released under a Creative
Commons license at www.argoverse.org. In our baseline experiments, we
illustrate how detailed map information such as lane direction, driveable area,
and ground height improves the accuracy of 3D object tracking and motion
forecasting. Our tracking and forecasting experiments represent only an initial
exploration of the use of rich maps in robotic perception. We hope that
Argoverse will enable the research community to explore these problems in
greater depth.

Source: arXiv, 1911.02620
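The baseline experiments lean on exactly the map attributes named above (lane direction, driveable area, ground height). As a rough illustration, the sketch below queries them through the argoverse-api released alongside the dataset; the class and method names (ArgoverseMap, get_ground_height_at_xy, remove_non_driveable_area_points, get_lane_direction) follow that API, but the coordinates are invented and exact signatures may differ between versions, so treat this as an assumption-laden sketch rather than definitive usage.

# Sketch: querying Argoverse HD-map attributes via the public argoverse-api.
# Coordinates are made up; signatures may vary across API versions.
import numpy as np
from argoverse.map_representation.map_api import ArgoverseMap

avm = ArgoverseMap()
city = "PIT"  # data was collected in Pittsburgh ("PIT") and Miami ("MIA")

# A toy LiDAR sweep: N x 3 points assumed to be in the city coordinate frame.
lidar_pts = np.array([[2599.0, 1200.0, 58.0], [2610.0, 1205.0, 58.3]])

# Ground height under each (x, y), e.g. to strip ground returns before tracking.
ground_z = avm.get_ground_height_at_xy(lidar_pts, city)

# Keep only points that fall on the mapped driveable area.
driveable_pts = avm.remove_non_driveable_area_points(lidar_pts, city)

# Lane direction (unit vector) and a confidence score at a query point, the
# kind of prior the forecasting baselines use to orient predicted trajectories.
lane_dir, conf = avm.get_lane_direction(np.array([2599.0, 1200.0]), city)

print(ground_z, len(driveable_pts), lane_dir, conf)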