Pictures on social networks such as Instagram or Facebook are often edited by applying filters. Visual understanding models based on convolutional neural networks (CNNs) could be used for filter removal. However, current research focuses on classifying which specific filter was applied to an image, or on learning the parameters of the applied transformations, and cannot recover the original image.
A recent study proposes a novel approach to this task: treat the visual effects of a filter as style information and remove them with a style transfer approach. The architecture has an encoder-decoder structure that normalizes the style information in the encoder, and unfiltered images are generated with the help of adversarial learning.
In addition, a dataset of 600 images and their filtered versions is introduced. Experiments show that the model eliminates the external visual effects to a great extent.
Social media images are generally transformed by filtering to obtain aesthetically more pleasing appearances. However, CNNs generally fail to interpret an image and its filtered version as the same in the visual analysis of social media images. We introduce Instagram Filter Removal Network (IFRNet) to mitigate the effects of image filters for social media analysis applications. To achieve this, we assume any filter applied to an image substantially injects additional style information into it, and we consider this problem as a reverse style transfer problem. The visual effects of filtering can be directly removed by adaptively normalizing external style information at each level of the encoder. Experiments demonstrate that IFRNet outperforms all compared methods in quantitative and qualitative comparisons, and has the ability to remove the visual effects to a great extent. Additionally, we present the filter classification performance of our proposed model, and analyze the dominant color estimation on the images unfiltered by all compared methods.
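The core idea of removing style information by normalizing feature statistics in the encoder can be illustrated with a minimal instance-normalization sketch. This is not the paper's actual adaptive normalization module, only a simplified numpy illustration of the underlying mechanism: per-channel mean and variance of a feature map carry style (filter) information, and stripping them leaves a style-agnostic representation.

```python
import numpy as np

def strip_channel_statistics(feat, eps=1e-5):
    """Normalize each channel of a (C, H, W) feature map to zero mean
    and unit variance. Channel-wise statistics are commonly treated as
    style information in style transfer; removing them is a simplified
    stand-in for the style normalization done at each encoder level."""
    mean = feat.mean(axis=(1, 2), keepdims=True)   # per-channel mean
    std = feat.std(axis=(1, 2), keepdims=True)     # per-channel std
    return (feat - mean) / (std + eps)

# A "filtered" feature map: shifted mean and scaled variance per channel,
# mimicking the statistics a filter might inject.
rng = np.random.default_rng(0)
filtered = rng.normal(loc=2.0, scale=3.0, size=(8, 16, 16))
normalized = strip_channel_statistics(filtered)
```

After normalization, every channel has approximately zero mean and unit variance, regardless of the statistics the filter injected; in the full model, a decoder then reconstructs the unfiltered image from this representation under adversarial supervision.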
Research paper: Kınlı, F., Özcan, B., and Kıraç, F., “Instagram Filter Removal on Fashionable Images”, 2021. Link: https://arxiv.org/abs/2104.05072