This paper presents DeepLink, a novel learning-based framework for video content-based advertising that links sitcom stars to online shops via clothing retrieval, using state-of-the-art deep convolutional neural networks (CNNs). Specifically, several deep CNN models compose the sub-modules of DeepLink: human-body detection, human pose selection, face verification, and clothing detection and retrieval from an advertisement (ad) pool constructed from clothing images crawled from real-world online shops. For clothing detection and retrieval from ad images, we first transfer state-of-the-art deep CNN models to our data domain and then train the corresponding models on our constructed large-scale clothing datasets. Extensive experimental results demonstrate the feasibility and efficacy of our proposed clothing-based video advertising system.

Keywords: Video advertising · Deep learning · Object detection · Face verification · Image retrieval · Clothing detection

1 Introduction

Recent years have witnessed dramatic development of the Internet economy. Given the increasing online video traffic and its growing revenue, video advertising offers […]

[…] advertisers, undifferentiated advertising for all of the users will increase the advertising cost and waste resources, which, to some extent, reduces advertising efficiency. Thus, a tradeoff between reducing the impact on users' viewing experience and keeping the advertising […]
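The abstract names a fixed sequence of CNN-backed stages (body detection → pose selection → face verification → clothing detection → retrieval against the ad pool). A minimal structural sketch of that pipeline is given below; every function, data shape, and threshold here is an illustrative assumption rather than the authors' actual interface, and each stand-in function would wrap a separate pre-trained deep CNN in the real system.

```python
# Hypothetical sketch of the DeepLink pipeline stages described in the
# abstract. All names, data shapes, and scores are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str     # identity or category label (stand-in for a CNN output)
    box: tuple     # (x, y, w, h) bounding box in the frame
    score: float   # detector confidence in [0, 1]

def detect_bodies(frame):
    # Stand-in for the CNN human-body detector: assume detections are
    # precomputed and stored on the frame.
    return frame["bodies"]

def select_frontal_pose(bodies):
    # Pose selection: keep only high-confidence (assumed near-frontal)
    # detections; 0.8 is an arbitrary illustrative threshold.
    return [b for b in bodies if b.score >= 0.8]

def verify_star(body, star_id):
    # Stand-in for CNN face verification against the target sitcom star.
    return body.label == star_id

def detect_clothing(body):
    # Stand-in for the clothing detector: reuse the body box as the
    # clothing region.
    return Detection("clothing", body.box, body.score)

def retrieve_ads(clothing, ads_pool, top_k=2):
    # Rank shop images by a similarity score; a real system would compare
    # CNN feature embeddings rather than this toy scalar distance.
    ranked = sorted(ads_pool, key=lambda ad: abs(ad["feat"] - clothing.score))
    return [ad["url"] for ad in ranked[:top_k]]

def deeplink(frame, star_id, ads_pool):
    # Chain the stages: detect -> select pose -> verify -> detect clothing
    # -> retrieve matching shop images.
    results = []
    for body in select_frontal_pose(detect_bodies(frame)):
        if verify_star(body, star_id):
            clothing = detect_clothing(body)
            results.extend(retrieve_ads(clothing, ads_pool))
    return results
```

The point of the sketch is the staged dataflow: only bodies that survive pose selection and face verification ever reach the retrieval stage, which keeps the expensive per-ad comparison confined to confirmed star appearances.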
Neural Computing and Applications – Springer Journals
Published: Jun 7, 2018