Cross Domain Transfer for Sketch-based Clothing Retrieval

Simin Chen, Haopeng Lei, Mingwen Wang, Fan Yang, Xiangjian He, Guoliang Luo

Research output: Chapter in Book/Conference proceeding › Conference contribution › peer-review

Abstract

With the rise of e-commerce platforms, online shopping has become a trend. However, mainstream retrieval methods are still limited to using text or exemplar images as input, and in huge commodity databases it remains a long-standing unsolved problem for users to find products of interest quickly. A sketch conveys more content than text and is more convenient to obtain than an exemplar image. In this work, we propose a sketch-based clothing retrieval model that performs cross-domain retrieval and is used to search for specific clothing. Because most existing clothing datasets consist only of photos, it is difficult to obtain a dataset composed of sketch-photo pairs. We therefore contribute a clothing dataset containing 34,142 sketch-photo pairs. Our model achieves compelling performance on both our dataset and other datasets.

Original language: English
Title of host publication: Proceedings - 8th International Conference on Digital Home, ICDH 2020
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 22-27
Number of pages: 6
ISBN (Electronic): 9781728192345
DOIs
Publication status: Published - Sep 2020
Externally published: Yes
Event: 8th International Conference on Digital Home, ICDH 2020 - Dalian, China
Duration: 20 Sep 2020 - 22 Sep 2020

Publication series

Name: Proceedings - 8th International Conference on Digital Home, ICDH 2020

Conference

Conference: 8th International Conference on Digital Home, ICDH 2020
Country/Territory: China
City: Dalian
Period: 20/09/20 - 22/09/20

Keywords

  • clothing dataset
  • cross-domain retrieval
  • sketch-based clothing retrieval

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Science Applications
  • Computer Vision and Pattern Recognition
  • Human-Computer Interaction
  • Signal Processing
