Ranking Images Based on User-Specific Aesthetics

September 5, 2018

When deciding whether an image is of good quality, it is very hard to take personal taste into account. Existing aesthetic models can assess image quality, but they favor features embedded in high-quality images rather than accounting for human factors and interactions. To address this, Pei Lv et al. [1] propose a user-friendly aesthetic ranking system that combines a deep neural network with user interactions. The system automatically ranks images by their aesthetic qualities according to the user's preferences.

The framework takes a set of images selected by the user as input and outputs an aesthetic ranking of images that matches the user's preferences. To account for personal preference and the uncertainty of a user's single selection, a unique dataset describing one individual's taste is constructed by retrieving the images most similar to those the user specified. Based on this user-specific dataset and well-designed aesthetic attributes, a customized aesthetic distribution model is learned that captures both personalized preference and general aesthetic rules.
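The retrieval step above can be sketched as a nearest-neighbor search over image feature vectors. This is a minimal illustration, not the paper's implementation: the feature dimensions, similarity measure (cosine similarity), and data here are all assumptions.

```python
import numpy as np

def retrieve_similar(query_features, database_features, k=5):
    """Return indices of the k database images most similar to the query,
    ranked by cosine similarity (hypothetical stand-in for the paper's
    retrieval of images similar to the user's selections)."""
    q = query_features / np.linalg.norm(query_features)
    db = database_features / np.linalg.norm(database_features, axis=1, keepdims=True)
    sims = db @ q                      # cosine similarity to each database image
    return np.argsort(sims)[::-1][:k]  # indices of the k best matches

# Toy data: 100 images with 256-d features; the query is a slightly
# perturbed copy of image 42, so image 42 should be retrieved first.
rng = np.random.default_rng(0)
database = rng.normal(size=(100, 256))
query = database[42] + 0.01 * rng.normal(size=256)
print(retrieve_similar(query, database, k=3))
```

In practice the feature vectors would come from a network such as AlexNet rather than random data, and the top-k images would form the user-specific retrieval set.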

In the paper, Pei Lv et al. build their user-specific aesthetic ranking framework on a massive image dataset using AlexNet [2]. It consists of three modules: 1) primary personalized ranking, 2) an interaction stage, and 3) user-specific aesthetic ranking. Given a set of preferred images, the authors first extract content features and use them to retrieve similar images from the whole aesthetic database, forming a retrieval set. The primary personalized ranking module then generates an initial ranking. To overcome the instability of relying directly on a small number of samples, they refine this ranking by asking the user to interact with the primary ranking images; these become the basic sample images, which are sent to a style-specific classifier to produce a user-specific aesthetic distribution. During testing, the learned ranking module outputs a distribution for each test image, and the final user-specific aesthetic ranking is obtained by computing correlation coefficients between the user-specific aesthetic distribution and each testing distribution.
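The final step, ranking test images by the correlation between their predicted aesthetic distribution and the user-specific distribution, can be sketched as follows. The five-bin score distributions and image names here are made up for illustration; the paper does not specify these values or which correlation coefficient is used, so Pearson correlation is assumed.

```python
import numpy as np

def pearson(p, q):
    """Pearson correlation coefficient between two score distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    pc, qc = p - p.mean(), q - q.mean()
    return (pc @ qc) / (np.linalg.norm(pc) * np.linalg.norm(qc))

# Hypothetical user-specific aesthetic distribution over 5 score bins.
user_dist = np.array([0.05, 0.10, 0.20, 0.40, 0.25])

# Hypothetical testing distributions output by the ranking module.
test_dists = {
    "img_a": np.array([0.30, 0.30, 0.20, 0.15, 0.05]),  # opposite taste
    "img_b": np.array([0.05, 0.15, 0.20, 0.35, 0.25]),  # close match
    "img_c": np.array([0.15, 0.25, 0.25, 0.20, 0.15]),  # neutral
}

# Rank test images by correlation with the user-specific distribution.
ranking = sorted(test_dists,
                 key=lambda name: pearson(user_dist, test_dists[name]),
                 reverse=True)
print(ranking)  # img_b, whose distribution best matches the user's, ranks first
```

Images whose predicted distributions correlate most strongly with the user's own distribution end up at the top of the personalized ranking.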

Pei Lv et al. deploy a deep neural network to understand images. The extracted features are based on AlexNet, proposed and designed by Krizhevsky et al. [3]. Extracting aesthetic attributes does not require a very deep network, and AlexNet offers a good balance between aesthetic feature quality and computing time, so they fine-tune the whole network to generate the user-specific aesthetic model.

In summary, they propose a user-specific aesthetic ranking model that combines the output of a deep neural network with the user's preferences and the aesthetic attributes of the images.


1. arXiv:1805.01091 [cs.CV]

2. https://arxiv.org/abs/1803.01164

3. Alex Krizhevsky, Ilya Sutskever, and Geoffrey E. Hinton. 2012. ImageNet classification with deep convolutional neural networks. In International Conference on Neural Information Processing Systems, Lake Tahoe, NV, United States, December, 1097–1105.