CPW Optimization: A Novel Optimization Algorithm to Improve Classification of Objects in Video Content Detection
Keywords: coyote optimization, cat swarm optimization, chaser priori wolf optimization, deep learning, object detection

Abstract
Optimizers are essential for video object detection because they guide training and improve model performance. During training, the optimizer is responsible for minimizing the loss function: the gradients of the loss with respect to the model parameters are used to update those parameters iteratively. By repeatedly moving the parameters in the direction of steepest descent, the optimizer lowers the loss and steers the model towards convergence, which in turn improves object detection performance. This paper proposes a hybrid optimizer, the chaser priori wolf (CPW) optimizer, whose foundation is the hybridization of coyote optimization and cat swarm optimization. The CPW optimizer is introduced in the proposed work to enhance feature selection and convergence in classification. Comparative results show that the CNN-based YOLO model performed better; the results are compared in terms of sensitivity, specificity, and accuracy. The findings demonstrate improvements in every performance metric, with an average improvement of 10.3% over state-of-the-art architectures.
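The steepest-descent parameter update described above can be sketched in a few lines. This is a minimal illustration of plain gradient descent, not the paper's CPW algorithm (whose metaheuristic update rules are not given in the abstract); the function name and learning rate are illustrative assumptions.

```python
# Minimal sketch of the steepest-descent update an optimizer performs during
# training. Plain gradient descent is shown for illustration; the paper's CPW
# optimizer is a coyote/cat-swarm metaheuristic hybrid and is not reproduced here.
def gradient_descent_step(params, grads, lr=0.1):
    """Move each parameter against its gradient to reduce the loss."""
    return [p - lr * g for p, g in zip(params, grads)]

# Toy example: minimize f(w) = w^2, whose gradient is 2w.
w = [4.0]
for _ in range(100):
    grads = [2.0 * wi for wi in w]
    w = gradient_descent_step(w, grads)
# w shrinks toward the minimizer at 0 as the loss decreases
```

Each step moves the parameter opposite its gradient, which is exactly the "direction of steepest descent" that lowers the loss during training.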