Keypoint estimation for human poses is performed on the whole image. The evaluation follows the COCO keypoint evaluation protocol. We report the average precision averaged over the object keypoint similarity (OKS) thresholds 0.50:0.05:0.95 (the primary COCO challenge metric) and denote it by AP. We also report the AP at an OKS threshold of 0.50 (loose metric) and 0.75 (strict metric), denoted AP50 and AP75, respectively. We use the official pycocotools functions to calculate the performance.
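As a reference, the evaluation can be reproduced with a few lines of pycocotools; this is a minimal sketch, and the file names "gt_keypoints.json" and "predictions.json" are placeholders for the ground-truth annotations and the submitted predictions in the COCO result format.

```python
from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

coco_gt = COCO("gt_keypoints.json")            # ground-truth keypoint annotations (placeholder name)
coco_dt = coco_gt.loadRes("predictions.json")  # predictions in COCO result format (placeholder name)

coco_eval = COCOeval(coco_gt, coco_dt, iouType="keypoints")
coco_eval.evaluate()
coco_eval.accumulate()
coco_eval.summarize()

# stats[0] = AP averaged over OKS 0.50:0.05:0.95 (primary metric)
# stats[1] = AP at OKS 0.50, stats[2] = AP at OKS 0.75
ap, ap50, ap75 = coco_eval.stats[0], coco_eval.stats[1], coco_eval.stats[2]
```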

Keypoints are only excluded from the evaluation when they are outside the image or occluded by another object; keypoints occluded by the person themselves are included. This means that only the ground-truth keypoints with visibility flag 2 are included in the evaluation.
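A minimal sketch of such a filter, assuming the annotations store keypoints as flat COCO-style triplets [x1, y1, v1, x2, y2, v2, ...]; the function name is only for illustration.

```python
import numpy as np

def visible_keypoints(keypoints):
    """Keep only keypoints with visibility flag 2, i.e. annotated and not
    occluded by another object. Input is a flat COCO-style list
    [x1, y1, v1, x2, y2, v2, ...]; returns an (N, 3) array."""
    kps = np.asarray(keypoints, dtype=float).reshape(-1, 3)
    return kps[kps[:, 2] == 2]
```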

The model should learn to estimate the keypoints for adults, children and even babies. At the moment, the bones with the following names are used for this task:

head, clavicle_r, clavicle_l, upperarm_r, upperarm_l, lowerarm_r, lowerarm_l, hand_r, hand_l, thigh_r, thigh_l, calf_r, calf_l, pelvis, neck_01, spine_02, spine_03
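For convenience, the same 17 bone names as a Python list; the ordering simply follows the enumeration above and is an assumption, not an official submission format.

```python
# Bone names used for keypoint estimation; order follows the list above.
KEYPOINT_NAMES = [
    "head",
    "clavicle_r", "clavicle_l",
    "upperarm_r", "upperarm_l",
    "lowerarm_r", "lowerarm_l",
    "hand_r", "hand_l",
    "thigh_r", "thigh_l",
    "calf_r", "calf_l",
    "pelvis", "neck_01", "spine_02", "spine_03",
]
```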

Below is the public leaderboard for keypoint estimation for different training data and vehicles.

Train car all means that a separate model was trained for each vehicle. Each of these models is evaluated on the test set of every vehicle except the one it was trained on. Consequently, we calculate the mean of the per-model means across all vehicles as the overall performance of the method.

If a single car is mentioned as the train car, then a single model was trained only on that car and its performance is evaluated on the test images of all unseen/unknown vehicles. Consequently, we calculate the mean of the means of the performances across all vehicles, excluding the test performance of the vehicle the model was trained on.
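To make the aggregation explicit, here is a sketch of the leave-one-vehicle-out mean of means; the dictionary layout and function name are assumptions for illustration, not part of the official evaluation code.

```python
import numpy as np

def mean_of_means(per_train_car_ap):
    """per_train_car_ap maps each training vehicle to a dict
    {test vehicle: AP}. Scores on the training vehicle itself are
    excluded before averaging; the overall score is the mean of the
    per-model means."""
    per_model_means = []
    for train_car, test_scores in per_train_car_ap.items():
        unseen = [ap for test_car, ap in test_scores.items() if test_car != train_car]
        per_model_means.append(np.mean(unseen))
    return float(np.mean(per_model_means))
```

For a model trained on a single car, the outer mean reduces to the plain mean of that model's scores on the unseen vehicles.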

| Name | Train Car | AP | AP50 | AP75 | Paper | Code | RGB | Grayscale | Depth | Additional | Team | Title | Conference |
|------|-----------|------|------|------|-------|------|-----|-----------|-------|------------|------|-------|------------|
| SVIRO-Team | X5 | 0.150 | 0.617 | 0.006 | | | No | Yes | No | Yes | | | |
| MDSP | X5 | 0.437 | 0.638 | 0.473 | | | No | Yes | No | Yes | Hochschule Mannheim | | Under review |