Journal of the Korean Institute of Illuminating and Electrical Installation Engineers (JIEIE)
Publisher: Korean Institute of Illuminating and Electrical Installation Engineers (KIIEE)
Open Access, Monthly
ISSN: 1229-4691 (Print), 2287-5034 (Online)
http://journal.auric.kr/jieie
ISO Journal Title: J. Korean Inst. Illum. Electr. Install. Eng.
Vol. 39, No. 1, February 2025
DOI: 10.5207/JIEIE.2025.39.1.44
References
[1] J. Hwang, “A study on fault diagnosis for photovoltaic systems with application to digital O&M,” M.S. thesis, Dept. Electrical Eng., Dong-A University, Korea, 2024.
[2] M. Kim, “Operation and maintenance (O&M) market analysis for photovoltaic systems,” ASTI Market Insight, no. 49, pp. 1-6, 2022.
[3] L. Breiman, “Random forests,” Machine Learning, vol. 45, no. 1, pp. 5-32, 2001.
[4] S. J. Rigatti, “Random forest,” J. Insur. Med., vol. 47, no. 1, pp. 31-39, 2017.
[5] Y. Liu, Y. Wang, and J. Zhang, “New machine learning algorithm: Random forest,” in Proc. ICICA 2012, LNCS, vol. 7473, pp. 246-252, 2012.
[6] P. Probst and A.-L. Boulesteix, “To tune or not to tune the number of trees in random forest,” J. Mach. Learn. Res., vol. 18, pp. 1-18, 2018.
[7] A. Paul, et al., “Improved random forest for classification,” IEEE Trans. Image Process., vol. 27, no. 8, pp. 4012-4023, 2018.
[8] R. K. Halder, et al., “Enhancing k-nearest neighbor algorithm: A comprehensive review and performance analysis of modifications,” Journal of Big Data, vol. 11, no. 113, pp. 1-55, 2024.
[9] S. Zhang, et al., “Efficient kNN classification with different numbers of nearest neighbors,” IEEE Trans. Neural Netw. Learn. Syst., vol. 29, no. 5, pp. 1774-1785, 2018.
[10] J. Kim, H. Kim, and S. Choe, “KNN algorithm-based driver face recognition model available for car-sharing service,” in Proceedings of the Korea Institute of Information and Communication Engineering, vol. 27, no. 1, pp. 658-660, 2023.
[11] S. Zhang, et al., “Learning k for kNN classification,” ACM Trans. Intell. Syst. Technol., vol. 8, no. 3, pp. 1-19, 2017.
[12] S. Zhang, “Challenges in kNN classification,” IEEE Trans. Knowl. Data Eng., vol. 34, no. 10, pp. 4663-4668, 2022.
[13] S. Taheri and M. Mammadov, “Learning the naive Bayes classifier with optimization models,” Int. J. Appl. Math. Comput. Sci., vol. 23, no. 4, pp. 787-795, 2013.
[14] J. Doe, et al., “An implementation of naive Bayes classifier,” Int. J. Comput. Sci. Appl., vol. 10, no. 2, pp. 45-55, 2015.
[15] I. Rish, “An empirical study of the naive Bayes classifier,” J. Mach. Learn. Res., vol. 3, no. 2, pp. 41-52, 2001.
[16] M. Hasnain, et al., “Evaluating trust prediction and confusion matrix measures for web services ranking,” IEEE Access, vol. 8, pp. 90847-90858, 2020.
[17] B. P. Salmon, W. Kleynhans, C. P. Schwegmann, and J. C. Olivier, “Proper comparison among methods using a confusion matrix,” in Proc. 2015 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), pp. 3057-3060, 2015.
[18] J. Miao and W. Zhu, “Precision–recall curve (PRC) classification trees,” Evol. Intell., vol. 14, no. 1, pp. 23-35, 2021.
[19] P. Liang, et al., “Machine learning of single-cell transcriptome highly identifies mRNA signature by comparing F-score selection with DGE analysis,” Mol. Ther. Nucleic Acids, vol. 20, pp. 155-160, 2020.
[20] A. Tharwat, “Classification assessment methods,” Appl. Comput. Inform., vol. 17, no. 1, pp. 168-192, 2021.
[21] M. Heydarian, et al., “MLCM: Multi-label confusion matrix,” IEEE Access, vol. 10, pp. 19083-19086, 2022.