Boosting-Based Relevance Feedback for CBIR
Pardede J. (a,b), Sitohang B. (b), Akbar S. (b), Khodra M.L. (b)
a Department of Informatics Engineering, Institut Teknologi Nasional, Bandung, Indonesia
b Institut Teknologi Bandung, School of Electrical and Informatics Engineering, Bandung, Indonesia
Abstract

© 2018 IEEE. In this research, we implemented the Boosting-based Relevance Feedback (BRF) technique for a Content-Based Image Retrieval (CBIR) system. The BRF technique proceeds in two stages. In the first stage, the system returns retrieval results based on a dissimilarity measure, Jeffrey Divergence, with a threshold of 0.15. In the second stage, the system returns retrieval results based on the predictions of a BRF model trained on the user's feedback images. Following the same procedure, each feedback round generates a new BRF model corresponding to that round's feedback images. In this study, we compare three existing Boosting algorithms: AdaBoost, Gradient Boosting, and XGBoost. We evaluate the application's performance in terms of precision, recall, F-measure, and accuracy. Based on experiments conducted on the Wang dataset, the best BRF technique is XGBoost at the fourth feedback round. The BRF technique using XGBoost improves the average precision by 18.82%, the average recall by 173.32%, the average F-measure by 94.97%, and the average accuracy by 4.15% compared with the baseline. It also achieves the best average recall and F-measure compared with the most recent methods.

Author keywords

Boosting algorithm, CBIR, Content-based image retrieval (CBIR) system, Dissimilarity measures, Gradient boosting, Relevance feedback, User feedback, XGBoost

Indexed keywords

AdaBoost, CBIR, Gradient Boosting, Relevance Feedback, XGBoost

DOI

https://doi.org/10.1109/ICODSE.2018.8705854
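
To make the two-stage procedure concrete, the sketch below shows one possible Python implementation. It assumes normalized histogram features, the open-source xgboost package, and illustrative hyperparameters; the function names jeffrey_divergence, initial_retrieval, and feedback_round are hypothetical, and the paper's exact feature extraction and model settings may differ.

import numpy as np
from xgboost import XGBClassifier


def jeffrey_divergence(p, q, eps=1e-10):
    # Jeffrey Divergence between two normalized histograms, in the form
    # commonly used in CBIR: sum of KL terms against the average histogram.
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    m = (p + q) / 2.0
    return float(np.sum(p * np.log(p / m) + q * np.log(q / m)))


def initial_retrieval(query_hist, database_hists, threshold=0.15):
    # Stage 1: keep images whose dissimilarity to the query falls below
    # the threshold (0.15, as reported in the abstract).
    dists = [jeffrey_divergence(query_hist, h) for h in database_hists]
    return [i for i, d in enumerate(dists) if d <= threshold]


def feedback_round(database_feats, relevant_idx, irrelevant_idx):
    # Stage 2: fit a boosting model (XGBoost here; AdaBoost or Gradient
    # Boosting could be substituted) on the user's labelled feedback
    # images, then rank the whole database by predicted relevance.
    # Hyperparameters below are illustrative assumptions.
    X = np.vstack([database_feats[i] for i in relevant_idx + irrelevant_idx])
    y = np.array([1] * len(relevant_idx) + [0] * len(irrelevant_idx))
    model = XGBClassifier(n_estimators=100, max_depth=3, eval_metric="logloss")
    model.fit(X, y)
    scores = model.predict_proba(np.asarray(database_feats))[:, 1]
    return np.argsort(-scores)  # indices, most relevant first

In use, the indices returned by initial_retrieval would be shown to the user, whose relevant/irrelevant judgments feed feedback_round; repeating this per round mirrors the per-feedback model generation described in the abstract.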