Fine-Tuning InceptionV3 for Thai Cuisine Image Classification: A Mobile Deployment Perspective

Authors

  • Somsak Saksat, Nathaphon Boonnam, Siriwan Kajornkasirat

Keywords

Thai Food Classification, InceptionV3, Transfer Learning, Fine-Tuning, TensorFlow Lite, Mobile Application

Abstract

This work presents the development of a smartphone application that uses deep learning for the automatic classification of Thai food images. Transfer learning and fine-tuning approaches were compared using the InceptionV3 model, initially trained on the ImageNet dataset and subsequently refined on a dataset comprising 49 varieties of Thai cuisine. Experimental results indicate that the fine-tuned model achieved superior performance, attaining an accuracy of 95.22% on the validation set and surpassing the transfer learning model, which reached 85.43%. The fine-tuned model also exhibited a stable, consistent decrease in loss without significant overfitting, making it the preferred choice for application development. We converted this model to TensorFlow Lite to enable offline inference in the smartphone application, which was developed with Flutter. However, retrieving detailed nutritional information still requires an online database connection to provide comprehensive nutrient data, including calories, protein, fat, and carbohydrates. This research demonstrates the potential of combining fine-tuning methods with mobile application development to promote mindful food consumption, reduce the risk of non-communicable diseases, and enhance quality of life in the digital era. Moreover, the application supports the United Nations Sustainable Development Goal 3: Good Health and Well-being by encouraging healthier lifestyle choices and contributing to improved health outcomes. Furthermore, it provides a valuable framework for the sustainable promotion of Thai food culture.
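The two-stage pipeline described in the abstract (a transfer-learning pass with the ImageNet-pretrained base frozen, then fine-tuning at a lower learning rate, then TensorFlow Lite export for on-device inference) can be sketched with the Keras API roughly as follows. The classification head, the layer-freeze cutoff, the dropout rate, and the learning rates are illustrative assumptions, not values reported in the paper:

```python
import tensorflow as tf

NUM_CLASSES = 49  # number of Thai dish categories reported in the abstract

# Stage 1: transfer learning -- freeze the ImageNet-pretrained base and
# train only a newly added classification head.
base = tf.keras.applications.InceptionV3(
    weights="imagenet", include_top=False, input_shape=(299, 299, 3)
)
base.trainable = False

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),                               # assumed rate
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="categorical_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=...)

# Stage 2: fine-tuning -- unfreeze the upper inception blocks and continue
# training at a much smaller learning rate so the pretrained features are
# adjusted gently. The cutoff index 249 is an assumption, not the paper's.
base.trainable = True
for layer in base.layers[:249]:
    layer.trainable = False
model.compile(optimizer=tf.keras.optimizers.Adam(1e-5),
              loss="categorical_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=...)

# Export to TensorFlow Lite for offline inference on the device.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # e.g. weight quantization
tflite_model = converter.convert()
with open("thai_food_classifier.tflite", "wb") as f:
    f.write(tflite_model)
```

Recompiling after changing `trainable` is required for the freeze/unfreeze change to take effect; the lower learning rate in the second stage is what keeps fine-tuning stable, matching the loss behavior the abstract reports.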




Published

19.04.2025

How to Cite

Saksat, S., Boonnam, N., & Kajornkasirat, S. (2025). Fine-Tuning InceptionV3 for Thai Cuisine Image Classification: A Mobile Deployment Perspective. International Journal of Intelligent Systems and Applications in Engineering, 13(1), 451–457. Retrieved from https://www.ijisae.org/index.php/IJISAE/article/view/7834

Section

Research Article