Incremental Focal ENsemble for multi-class Imbalanced Learning (FENIL)

Article Type:
Research/Original Article (accredited journal)
Abstract:

Convolutional neural networks (CNNs) are among the most popular machine learning models for data classification. Despite their significant success, they do not produce acceptable results when working with imbalanced data. Imbalanced learning is one of the most challenging issues in machine learning: in these problems, the samples of one or more classes usually far outnumber those of the others, or misclassification costs differ across classes, while CNNs assume that class distributions and misclassification costs are equal. Ensemble methods are a popular way to deal with imbalanced data sets; by combining several base estimators they can achieve high accuracy and, compared with a single estimator, improve the reliability of the model. In this research, we introduce an ensemble-learning method for convolutional neural networks that uses a cascade of CNNs to handle imbalanced data. We train the CNNs with the focal loss function, whose gamma parameter controls the relative importance of hard and easy samples. Compared with CNNi, CNNi+1 gives less importance to easy samples than to hard ones; this is achieved by increasing gamma step by step (γi+1 > γi). In our proposed FENIL ensemble network (Incremental Focal Ensemble method for multi-class Imbalanced Learning), the weights of the training data for CNNi+1 are determined by the classification results of the previous network, CNNi. The combination of all CNNs is used to classify new data. We applied the proposed FENIL network to several benchmark data sets.
The results showed that the FENIL network not only achieves much higher accuracy and F1-score (18.63 and 19.61 points higher, respectively) than non-deep methods such as AdaBoost with decision trees, but also obtains better results than other common deep methods for imbalanced learning.
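The core mechanism the abstract describes is the focal loss with a stage-wise increasing gamma: with γ = 0 it reduces to ordinary cross-entropy, and larger γ down-weights easy (well-classified) samples so later ensemble members concentrate on hard ones. A minimal NumPy sketch of the loss and an illustrative gamma schedule (the specific γ values and function names are assumptions, not taken from the paper):

```python
import numpy as np

def focal_loss(probs, labels, gamma):
    """Multi-class focal loss: mean of -(1 - p_t)^gamma * log(p_t),
    where p_t is the predicted probability of the true class.
    gamma = 0 recovers plain cross-entropy; larger gamma shrinks the
    contribution of easy samples (p_t close to 1)."""
    p_t = probs[np.arange(len(labels)), labels]
    return float(np.mean(-((1.0 - p_t) ** gamma) * np.log(p_t)))

# Illustrative incremental schedule: each cascade stage i trains its CNN
# with gamma_i, increased step by step (gamma_{i+1} > gamma_i) so that
# CNN_{i+1} focuses more on hard samples than CNN_i did.
gammas = [0.0, 1.0, 2.0, 3.0]  # assumed values; the paper's schedule is not given here
```

For example, for a batch where the true classes are predicted with probabilities 0.9 and 0.8, the loss at γ = 2 is far smaller than at γ = 0, because both samples are easy and get down-weighted; a hard sample (low p_t) would keep most of its cross-entropy contribution.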

Language:
Persian
Published:
Journal of Command and Control Communications Computer Intelligence, Volume:6 Issue: 2, 2022
Pages:
60 to 77
magiran.com/p2579558  