Beyond Triplet Loss: Person Re-Identification With Fine-Grained Difference-Aware Pairwise Loss
Author(s): Yan, C (Yan, Cheng); Pang, GS (Pang, Guansong); Bai, X (Bai, Xiao); Liu, CH (Liu, Changhong); Ning, X (Ning, Xin); Gu, L (Gu, Lin); Zhou, J (Zhou, Jun)
Source: IEEE TRANSACTIONS ON MULTIMEDIA Volume: 24 Pages: 1665-1677 DOI: 10.1109/TMM.2021.3069562 Published: 2022
Abstract: Person Re-IDentification (ReID) aims to re-identify persons from different viewpoints across multiple cameras. Capturing fine-grained appearance differences is often key to accurate person ReID, because many identities can be distinguished only by examining these fine-grained differences. However, most state-of-the-art person ReID approaches, typically driven by a triplet loss, fail to learn such fine-grained features effectively because they focus on differentiating large appearance differences. To address this issue, we introduce a novel pairwise loss function that enables ReID models to learn fine-grained features by adaptively enforcing an exponential penalization on image pairs with small differences and a bounded penalization on image pairs with large differences. The proposed loss is generic and can be used as a plug-in replacement for the triplet loss to significantly enhance different types of state-of-the-art approaches. Experimental results on four benchmark datasets show that the proposed loss outperforms a number of popular loss functions by large margins, and that it also substantially improves data efficiency.
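Note: The abstract characterizes the loss only qualitatively. The PyTorch sketch below is a rough, hypothetical illustration of that behavior, not the paper's actual formulation; the function name fine_grained_pairwise_loss and the hyperparameters alpha and margin are assumptions made here for illustration only. It applies an exponentially growing penalty to negative pairs with small feature distances (the fine-grained, hard-to-separate cases) and a bounded penalty elsewhere.

    # Illustrative sketch only; consult the paper (DOI above) for the exact loss.
    import torch
    import torch.nn.functional as F

    def fine_grained_pairwise_loss(feats, labels, alpha=2.0, margin=0.5):
        # feats: (N, D) embeddings; labels: (N,) integer identity labels.
        feats = F.normalize(feats, dim=1)                   # L2-normalize embeddings
        dist = torch.cdist(feats, feats, p=2)               # N x N pairwise distances
        same = labels.unsqueeze(0).eq(labels.unsqueeze(1))  # True where identities match
        eye = torch.eye(len(labels), dtype=torch.bool, device=feats.device)

        # Positive pairs (same identity, excluding self-pairs): bounded penalty
        # that pulls matching images together without dominating the loss.
        pos = dist[same & ~eye]
        pos_term = torch.tanh(pos).mean() if pos.numel() else dist.new_zeros(())

        # Negative pairs (different identities): the penalty grows exponentially
        # as the appearance difference shrinks, while large-distance (easy)
        # pairs contribute a bounded, near-zero penalty.
        neg = dist[~same]
        neg_term = torch.exp(alpha * (margin - neg)).mean() if neg.numel() else dist.new_zeros(())

        return pos_term + neg_term

    # Example usage with random features: a batch of 8 images, 4 identities.
    feats = torch.randn(8, 128)
    labels = torch.tensor([0, 0, 1, 1, 2, 2, 3, 3])
    loss = fine_grained_pairwise_loss(feats, labels)

Under this sketch, hard negative pairs receive the largest gradients, which is the behavior the abstract attributes to the proposed loss; the paper's adaptive weighting may differ in form.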
Accession Number: WOS:000776227200032
Author Identifiers:
Zhou, Jun (ORCID: 0000-0001-5822-8233)
ISSN: 1520-9210
eISSN: 1941-0077
Full Text: https://ieeexplore.ieee.org/document/9392276