TRANSFER LEARNING: INCEPTION-V3 BASED CUSTOM CLASSIFICATION APPROACH FOR FOOD IMAGES
Abstract
Deep-learning approaches have become popular in the field of image processing, and where health is concerned, deep-learning methods have brought many improvements to applications of food image classification. Transfer learning, the reuse of a pre-trained model on a new task, has become a popular technique with Inception-V3 for image classification: it requires only a small dataset, reduces training time, and increases performance. In this paper, the Google Inception-V3 model is taken as a base, and a fully connected layer is built on top of it to optimize the classification process. During model building, the convolution layers are capable of learning their own convolution kernels to produce the tensor outputs. In addition, separately obtained segmented features are concatenated with the features of our custom model before the classification phase; this strengthens important features and aids the food classification process. A dataset of 16 food classes containing thousands of images is considered. A classification accuracy of 96.27% is obtained in the testing phase, which is compared with different state-of-the-art techniques.
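The fusion step described above (frozen backbone features concatenated with separately obtained segmentation features, then classified by a fully connected softmax layer) can be sketched in plain NumPy. This is a minimal illustration, not the paper's implementation: the feature dimensions below are assumptions (2048 is Inception-V3's global-pooled feature size; the segmentation-feature size is hypothetical), and the weights are random rather than trained.

```python
import numpy as np

# Illustrative dimensions; only N_CLASSES = 16 comes from the paper.
N_CLASSES = 16          # 16-class food dataset
CNN_FEAT_DIM = 2048     # Inception-V3 global-average-pooled feature size
SEG_FEAT_DIM = 64       # assumed size of the segmented-feature vector

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # subtract row max for stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def classify(cnn_features, seg_features, W, b):
    """Concatenate backbone CNN features with segmentation features,
    then apply a single fully connected softmax layer."""
    fused = np.concatenate([cnn_features, seg_features], axis=1)
    return softmax(fused @ W + b)

# Fake batch of 4 images (random stand-ins for real extracted features).
cnn_feats = rng.standard_normal((4, CNN_FEAT_DIM))
seg_feats = rng.standard_normal((4, SEG_FEAT_DIM))
W = rng.standard_normal((CNN_FEAT_DIM + SEG_FEAT_DIM, N_CLASSES)) * 0.01
b = np.zeros(N_CLASSES)

probs = classify(cnn_feats, seg_feats, W, b)
print(probs.shape)   # one probability row per image, one column per class
```

In a full Keras implementation, the same idea would use the pre-trained `InceptionV3` application with its classification head removed, a second input for the segmentation features, and a `Concatenate` layer feeding the final `Dense(16, activation="softmax")` layer.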

Authors
Vishwanath C Burkapalli, Priyadarshini C Patil
Poojya Doddappa Appa College of Engineering, India

Keywords
Deep Learning, Transfer Learning, Convolutional Neural Networks (CNNs), Food Classification, Calories Estimation, South Indian Dataset, Inception Model
Yearly Full Views
Jan: 3, Feb: 1, Mar: 3, Apr: 1, May: 0, Jun: 0, Jul: 0, Aug: 0, Sep: 2, Oct: 0, Nov: 0, Dec: 0
Published By:
ICTACT
Published In:
ICTACT Journal on Image and Video Processing
(Volume: 11, Issue: 1, Pages: 2261-2267)
Date of Publication:
August 2020
Page Views:
206
Full Text Views:
10

Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.