Challenge Tracks

Detailed participant instructions can be accessed here.

Participants can compete in one or more of the following three challenges. All teams considered for a prize will be required to submit their code for independent verification of winning results.

*Demo from Team 48

Track 1: Traffic Flow Analysis

Participating teams will submit results for individual vehicle speeds for a test set containing 27 one-minute videos. Performance will be evaluated against ground truth generated by a fleet of control vehicles that were driven during the recording. Evaluation for Challenge Track 1 will be based on the detection rate of the control vehicles and the root mean square error (RMSE) of the predicted control-vehicle speeds.
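As a rough illustration of these two metrics, the sketch below computes a detection rate and a speed RMSE over the detected control vehicles. It is not the official scoring code; the matching of predictions to control vehicles by ID, and the use of `None` to mark a missed detection, are assumptions for the example.

```python
import math

def track1_score(gt_speeds, pred_speeds):
    """Illustrative Track 1 metrics (not the official evaluation script).

    gt_speeds:   dict mapping control-vehicle ID -> ground-truth speed
    pred_speeds: dict mapping control-vehicle ID -> predicted speed,
                 with None (or a missing key) marking a missed detection
    Returns (detection_rate, rmse over the detected vehicles).
    """
    detected = [v for v in gt_speeds if pred_speeds.get(v) is not None]
    detection_rate = len(detected) / len(gt_speeds)
    # RMSE of predicted speeds, computed only over detected control vehicles
    sq_err = [(pred_speeds[v] - gt_speeds[v]) ** 2 for v in detected]
    rmse = math.sqrt(sum(sq_err) / len(sq_err)) if sq_err else float("nan")
    return detection_rate, rmse
```

For example, detecting one of two control vehicles with a 2 mph speed error would yield a detection rate of 0.5 and an RMSE of 2.0.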

*Demo from Team 15

Track 2: Anomaly Detection

Participating teams will submit the top 100 detected anomalies, which can be due to car crashes or stalled vehicles. Regular congestion not caused by any traffic incident does not count as an anomaly. Evaluation for Challenge Track 2 will be based on model anomaly detection performance, measured by the F1-score, and detection time error, measured by RMSE.
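The two Track 2 metrics can be sketched as follows. This is an illustrative computation under assumed inputs, not the official evaluation: it takes as given a count of predicted anomalies matched to true anomalies and the absolute start-time errors of those matches.

```python
import math

def track2_score(num_matched, time_errors, num_pred, num_gt):
    """Illustrative Track 2 metrics (not the official scoring code).

    num_matched: predicted anomalies matched to true anomalies (true positives)
    time_errors: per-match |predicted start time - true start time|, in seconds
    num_pred:    total number of predicted anomalies
    num_gt:      total number of true anomalies
    Returns (f1, detection-time RMSE over matched anomalies).
    """
    precision = num_matched / num_pred if num_pred else 0.0
    recall = num_matched / num_gt if num_gt else 0.0
    # F1 is the harmonic mean of precision and recall
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    rmse = (math.sqrt(sum(e * e for e in time_errors) / len(time_errors))
            if time_errors else 0.0)
    return f1, rmse
```

A lower detection-time RMSE rewards teams that flag an anomaly close to its true onset, while F1 balances missed anomalies against false alarms.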

*Demo from Team 37

Track 3: Multi-sensor Vehicle Detection and Reidentification

Participating teams will identify all vehicles that pass through each of 4 different checkpoint locations at least once in a set of 15 videos. Evaluation for Challenge Track 3 will be based on detection accuracy and localization sensitivity for a set of ground-truth vehicles that were driven through all checkpoints at least once.