SkyData Challenge

Welcome to the SkyData Challenge! The SkyData Challenge provides a state-of-the-art benchmark dataset for aerial vision tasks.

Drone and UAV based vision applications have attracted growing interest in recent years. SkyData aims to complement existing aerial datasets by focusing on features they lack. It provides an annotated video set supporting several challenges designed to improve vision-based aerial algorithms, including tiny object detection, tiny object segmentation and tiny object tracking.

Aerial vision is an important part of many Unmanned Aerial Vehicle (UAV) applications, and the performance of such vision-based applications relies heavily on the availability of relevant, annotated aerial datasets. Unfortunately, public annotated aerial datasets are scarce, and for particular tasks (such as video segmentation) there is no large public dataset that includes segmentation annotations for tiny objects, as the majority of the relevant aerial datasets focus on other tasks such as detection. Real-world UAV and drone based vision applications usually use high-resolution images to recognize and process tiny objects, while the existing aerial datasets mostly provide what would today be considered low-resolution imagery containing larger objects. Real-world videos taken in dense areas containing many small-scale objects are needed for such applications. SkyData aims to help aerial vision researchers from these perspectives, including segmentation, tracking and crowd counting of densely populated tiny objects.

SkyData comes in two parts: a training set and a test set. Of the 10360 total frames, 8085 are in the training set and the remaining 2275 are in the test set.
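The split above can be sanity-checked with a few lines of Python; the frame counts are the ones stated here, and the variable names are only illustrative:

```python
# Frame counts as published for the SkyData split (illustrative names).
TOTAL_FRAMES = 10360
TRAIN_FRAMES = 8085
TEST_FRAMES = 2275

# The two parts should account for every frame in the dataset.
assert TRAIN_FRAMES + TEST_FRAMES == TOTAL_FRAMES

# Roughly a 78% / 22% train/test split.
train_ratio = TRAIN_FRAMES / TOTAL_FRAMES  # ≈ 0.78
test_ratio = TEST_FRAMES / TOTAL_FRAMES    # ≈ 0.22
```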

Citing SkyData
@article{SkyDataChallenge,
  author  = {Xxxx Xxxx et al.},
  journal = {TBA},
  title   = {SkyData: A New Benchmarking Dataset to Detect, Segment and Track Densely Populated Small Objects in Aerial Videos},
  year    = {2022},
  month   = {Jul},
  volume  = {38},
  number  = {11},
  pages   = {xx-xx},
  doi     = {10.xx/},
  issn    = {xx-xx}
}