Please use this identifier to cite or link to this item: http://hdl.handle.net/20.500.11861/10464
Title: Saliency detection using deep features and affinity-based robust background subtraction
Authors: Nawaz, Mehmood
Yan, Hong
Issue Date: 2020
Source: IEEE Transactions on Multimedia, 2020, vol. 23, pp. 2902-2916.
Journal: IEEE Transactions on Multimedia 
Abstract: Most existing saliency methods measure foreground saliency by using the contrast of a foreground region to its local context, or boundary priors and spatial compactness. These methods are not powerful enough to extract a precise salient region from noisy and cluttered backgrounds. To evaluate the contrast of salient and background regions effectively, we consider high-level features from both supervised and unsupervised methods. We propose an affinity-based robust background subtraction technique and a maximum attention map derived from a pre-trained convolutional neural network. The affinity-based technique uses pixel similarities to propagate the values of salient pixels among foreground and background regions and their union. The salient pixel value controls the foreground and background information by using multiple pixel affinities. The maximum attention map is derived from the convolutional neural network using features of the pooling and ReLU layers. This method can detect salient regions in images that have noisy and cluttered backgrounds. Our experimental results on six saliency data sets and benchmarks demonstrate the effectiveness of the proposed approach and show that it improves detection quality beyond current saliency detection methods.
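The affinity-based propagation described in the abstract can be illustrated with a generic sketch. This is not the paper's exact algorithm; it shows the common pattern of spreading an initial saliency seed through a row-normalized pixel-affinity matrix (Gaussian kernel over feature distances, with a hypothetical bandwidth and damping factor chosen for illustration):

```python
import numpy as np

def propagate_saliency(features, seed, alpha=0.9, sigma=0.1, n_iter=50):
    """Generic affinity-based saliency propagation (illustrative sketch).

    features : (n_pixels, d) feature vectors per pixel/superpixel
    seed     : (n_pixels,) initial saliency values to propagate
    """
    # Pairwise affinities from squared feature distances (Gaussian kernel).
    d2 = ((features[:, None, :] - features[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)          # no self-affinity
    P = W / W.sum(axis=1, keepdims=True)  # row-stochastic transition matrix

    # Iteratively mix propagated mass with the original seed.
    S = seed.astype(float)
    for _ in range(n_iter):
        S = alpha * (P @ S) + (1 - alpha) * seed
    # Normalize to [0, 1] for use as a saliency map.
    return (S - S.min()) / (S.max() - S.min() + 1e-12)

# Toy example: six "pixels"; the first three form a tight feature cluster
# and only the first one is seeded as salient.
feats = np.array([[0.0], [0.05], [0.1], [0.9], [0.95], [1.0]])
seed = np.array([1.0, 0.0, 0.0, 0.0, 0.0, 0.0])
sal = propagate_saliency(feats, seed)
```

Because affinities are large only within the seeded cluster, the propagated saliency stays high for the first three pixels and near zero for the rest, mimicking how pixel similarities separate foreground from background.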
Type: Peer Reviewed Journal Article
URI: http://hdl.handle.net/20.500.11861/10464
ISSN: 1520-9210 (print); 1941-0077 (electronic)
DOI: 10.1109/TMM.2020.3019688
Appears in Collections: Applied Data Science - Publication


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.