
Aesthetics-Guided Graph Clustering with Absent Modalities Imputation.

Accurately clustering Internet-scale user populations into communities according to their aesthetic styles is a useful technique in image modeling and data mining. In this work, we present a novel partially-supervised model that seeks a sparse representation to capture photo aesthetics. It optimally fuses multi-channel features, i.e., human gaze behavior, quality scores, and semantic tags, any of which may be absent. Afterward, by leveraging the KL-divergence to distinguish the aesthetic distributions between photo sets, a large-scale graph is constructed to describe the aesthetic correlations among users. Finally, a dense subgraph mining algorithm that intrinsically supports outliers (i.e., unique users not belonging to any community) is adopted to detect aesthetic communities. Comprehensive experiments on a million-scale image set crawled from Flickr demonstrate the superiority of our method. As a byproduct, the discovered aesthetic communities substantially enhance photo retargeting and video summarization.
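The graph-construction step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes each user's aesthetic style is summarized as a discrete probability distribution, uses a symmetrized KL divergence (the abstract does not specify the exact divergence form), and converts divergence to an edge weight with a hypothetical exponential kernel with bandwidth `sigma`.

```python
import numpy as np

def sym_kl(p, q, eps=1e-12):
    """Symmetrized KL divergence between two discrete distributions.

    A small epsilon avoids log(0) / division by zero when a bin is empty.
    """
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return 0.5 * (np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

def build_aesthetic_graph(user_dists, sigma=1.0):
    """Build a weighted user graph: edge weight decays with the KL
    divergence between two users' aesthetic distributions.

    `user_dists` is a list of per-user distributions over aesthetic
    styles; the exponential kernel is an illustrative choice.
    """
    n = len(user_dists)
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            d = sym_kl(user_dists[i], user_dists[j])
            W[i, j] = W[j, i] = np.exp(-d / sigma)
    return W
```

Users with similar aesthetic distributions receive edge weights near 1, dissimilar users near 0; a dense subgraph miner run on `W` would then recover tightly connected communities while leaving weakly connected outlier users unassigned.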
