OneSketch: learning high-level shape features from simple sketches

Eyal Reisfeld, Andrei Sharf

The Visual Computer (2022)

Abstract
Humans use simple sketches to convey complex concepts and abstract ideas in a concise way. Just a few abstract pencil strokes can carry a large amount of semantic information that can serve as a meaningful representation for many applications. In this work, we explore the power of simple human strokes to capture high-level 2D shape semantics. For this purpose, we introduce OneSketch, a crowd-sourced dataset of abstract one-line sketches depicting high-level 2D object features. To construct the dataset, we formulate a human sketching task whose goal is to differentiate between objects with a single minimal stroke. While humans are rather successful at depicting high-level shape semantics and abstraction, we investigate the ability of deep neural networks to convey such traits. We introduce a neural network that learns meaningful shape features from our OneSketch dataset. Essentially, the model learns sketch-to-shape relations and encodes them in an embedding space that reveals distinctive shape features. We show that our network can differentiate and retrieve 2D objects from very simple one-stroke sketches with good accuracy.
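The retrieval step described above — matching a one-stroke sketch against 2D shapes in a joint embedding space — can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: `shape_embs` and `sketch_emb` stand in for vectors produced by learned sketch and shape encoders, which are not reproduced here, and retrieval is done by cosine similarity.

```python
import numpy as np

def cosine_sim(a, b):
    """Cosine similarity between rows of a and rows of b."""
    a = a / np.linalg.norm(a, axis=-1, keepdims=True)
    b = b / np.linalg.norm(b, axis=-1, keepdims=True)
    return a @ b.T

def retrieve(sketch_emb, shape_embs, k=3):
    """Return indices of the k shapes whose embeddings are closest
    to the sketch embedding, most similar first."""
    sims = cosine_sim(sketch_emb[None, :], shape_embs)[0]
    return np.argsort(-sims)[:k]

# Toy example: random vectors stand in for learned embeddings.
# The sketch embedding is placed near shape 4, so a well-trained
# encoder pair would rank shape 4 first.
rng = np.random.default_rng(0)
shape_embs = rng.normal(size=(10, 64))
sketch_emb = shape_embs[4] + 0.05 * rng.normal(size=64)
print(retrieve(sketch_emb, shape_embs))
```

In practice the encoders would be trained (e.g., with a contrastive or triplet objective) so that a sketch lands near its corresponding shape in the shared space; the nearest-neighbor lookup itself is this simple.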
Keywords
Sketch-based retrieval, Partial representations, High-level shape features