Challenge Winners 2025

Track 1:

Winner: Team65 ZV

Multi-Camera 3D Object Tracking via 3D Point Clouds and Re-Identification

Runner-up: Team15 SKKU-AutoLab

DepthTrack: Cluster Meets BEV for Multi-Camera Multi-Target 3D Tracking

The following table shows the top teams from the public leaderboard of Track 1 with paper submissions by the challenge submission deadline.

Rank | Team ID | Team Name    | Score (HOTA) | Online
1    | 65      | ZV           | 69.9118      | No
2    | 15      | SKKU-AutoLab | 63.1396      | No
3    | 133     | TeamQDT      | 28.7515      | Yes
4    | 116     | UTE AI Lab   | 25.3983      | Yes

Track 2:

Winner: Team145 CHTTLIOT

TrafficInternVL: Understanding Traffic Scenarios with Vision–Language Models

Runner-up: Team1 SCU_Anastasiu

Multi-Agent Cooperation for Traffic Safety Description and Analysis

The following table shows the top teams from the public leaderboard of Track 2 with paper submissions by the challenge submission deadline.

Rank | Team ID | Team Name                     | Score
1    | 145     | CHTTLIOT                      | 60.0393
2    | 1       | SCU_Anastasiu                 | 59.1184
3    | 52      | Metropolis_Video_Intelligence | 58.8483
4    | 137     | ARV                           | 57.9138
5    | 121     | Rutgers ECE MM                | 57.4658
6    | 68      | VNPT_AI                       | 57.1133
7    | 60      | BAO_team                      | 55.6550
10   | 49      | MIZSU                         | 45.7572


Track 3:

Winner: Team16 UWIPL_ETRI

Warehouse Spatial Question Answering with LLM Agent: 1st Place Solution of the 9th AI City Challenge Track 3

Runner-up: Team57 HCMUT.VNU

Multimodal and Multi-task Fusion for Spatial Reasoning

The following table shows the top teams from the public leaderboard of Track 3 with paper submissions by the challenge submission deadline.

Final Rank | Team ID | Team Name  | Score
1          | 16      | UWIPL_ETRI | 95.8638
2          | 57      | HCMUT.VNU  | 91.9735
3          | 140     | Embia      | 90.6772
4          | 49      | MIZSU      | 73.0606
5          | 99      | HCMUS_HTH  | 66.8861

Track 4:

Winner: Team5 SmartVision

Augmentation, distillation and optimization: A practical pipeline for fisheye object detection on edge devices

Runner-up: Team15 SKKU-AutoLab

Data Augmentation Is All You Need For Robust Fisheye Object Detection

The following tables show the final ranking based on the multi-step Docker evaluation by Dr. Gochoo and his team.

  • All Docker submissions were evaluated on the Jetson AGX Orin 64GB platform, configured with the 30 W power mode and maximum frequency settings. The ranking in the 1st table is based on the harmonic mean of the normalized frames per second (FPS) and the F1-score on the FishEye1Keval dataset; all teams met the FPS > 10 requirement. The top two teams from the 1st table were further evaluated by Dr. Gochoo's team, who retrained and re-evaluated their models on an in-house dataset; those results are shown in the 2nd table.
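The harmonic-mean scoring described above can be sketched as follows. This is a minimal illustration, not the organizers' implementation: the reference value used to normalize FPS (here `fps_ref`, capped at 1.0) is an assumption, since the exact normalization scheme is not stated on this page.

```python
def track4_score(fps: float, f1: float, fps_ref: float = 25.0) -> float:
    """Harmonic mean of normalized FPS and F1-score (both in [0, 1]).

    The normalization by `fps_ref` with a cap at 1.0 is an assumed
    scheme for illustration only.
    """
    norm_fps = min(fps / fps_ref, 1.0)
    if norm_fps == 0.0 or f1 == 0.0:
        return 0.0
    return 2.0 * norm_fps * f1 / (norm_fps + f1)

# Example: 20 FPS normalizes to 0.8; with an F1-score of 0.70,
# the harmonic mean is 2 * 0.8 * 0.7 / (0.8 + 0.7) ≈ 0.7467.
print(round(track4_score(fps=20.0, f1=0.70), 4))  # → 0.7467
```

The harmonic mean rewards balance: a team with very high FPS but a weak F1-score (or vice versa) scores lower than one that is strong on both.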

1st table:

Rank | Team ID | Team Name     | Score
1    | 5       | SmartVision   | 0.7365
2    | 15      | SKKU-AutoLab  | 0.7235
3    | 53      | UT_T1         | 0.6169
4    | 86      | Xiilab        | 0.5827
5    | 33      | Tyche         | 0.5690
6    | 43      | UIT-OpenCubee | 0.5098

2nd table:

Rank | Team ID | Team Name    | Score
1    | 5       | SmartVision  | 0.7393
2    | 15      | SKKU-AutoLab | 0.7295