
Journal of the Korean Institute of Illuminating and Electrical Installation Engineers

ISO Journal Title: J. Korean Inst. Illum. Electr. Install. Eng.

References

[1] J. Hwang, “A study on fault diagnosis for photovoltaic systems with application to digital O&M,” M.S. thesis, Dept. of Electrical Engineering, Dong-A University, Korea, 2024.
[2] M. Kim, “Operation and maintenance (O&M) market analysis for photovoltaic systems,” ASTI Market Insight, no. 49, pp. 1-6, 2022.
[3] L. Breiman, “Random forests,” Mach. Learn., vol. 45, no. 1, pp. 5-32, 2001.
[4] S. J. Rigatti, “Random forest,” J. Insur. Med., vol. 47, no. 1, pp. 31-39, 2017.
[5] Y. Liu, Y. Wang, and J. Zhang, “New machine learning algorithm: Random forest,” in Proc. ICICA 2012, LNCS, vol. 7473, pp. 246-252, 2012.
[6] P. Probst and A.-L. Boulesteix, “To tune or not to tune the number of trees in random forest,” J. Mach. Learn. Res., vol. 18, pp. 1-18, 2018.
[7] A. Paul et al., “Improved random forest for classification,” IEEE Trans. Image Process., vol. 27, no. 8, pp. 4012-4023, 2018.
[8] R. K. Halder et al., “Enhancing k-nearest neighbor algorithm: A comprehensive review and performance analysis of modifications,” J. Big Data, vol. 11, no. 113, pp. 1-55, 2024.
[9] S. Zhang et al., “Efficient kNN classification with different numbers of nearest neighbors,” IEEE Trans. Neural Netw. Learn. Syst., vol. 29, no. 5, pp. 1774-1785, 2018.
[10] J. Kim, H. Kim, and S. Choe, “KNN algorithm-based driver face recognition model available for car-sharing service,” in Proceedings of the Korea Institute of Information and Communication Engineering, vol. 27, no. 1, pp. 658-660, 2023.
[11] S. Zhang et al., “Learning k for kNN classification,” ACM Trans. Intell. Syst. Technol., vol. 8, no. 3, pp. 1-19, 2017.
[12] S. Zhang, “Challenges in kNN classification,” IEEE Trans. Knowl. Data Eng., vol. 34, no. 10, pp. 4663-4668, 2022.
[13] S. Taheri and M. Mammadov, “Learning the naive Bayes classifier with optimization models,” Int. J. Appl. Math. Comput. Sci., vol. 23, no. 4, pp. 787-795, 2013.
[14] J. Doe et al., “An implementation of naive Bayes classifier,” Int. J. Comput. Sci. Appl., vol. 10, no. 2, pp. 45-55, 2015.
[15] I. Rish, “An empirical study of the naive Bayes classifier,” J. Mach. Learn. Res., vol. 3, no. 2, pp. 41-52, 2001.
[16] M. Hasnain et al., “Evaluating trust prediction and confusion matrix measures for web services ranking,” IEEE Access, vol. 8, pp. 90847-90858, 2020.
[17] B. P. Salmon, W. Kleynhans, C. P. Schwegmann, and J. C. Olivier, “Proper comparison among methods using a confusion matrix,” in Proc. 2015 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), pp. 3057-3060, 2015.
[18] J. Miao and W. Zhu, “Precision–recall curve (PRC) classification trees,” Evol. Intell., vol. 14, no. 1, pp. 23-35, 2021.
[19] P. Liang et al., “Machine learning of single-cell transcriptome highly identifies mRNA signature by comparing F-score selection with DGE analysis,” Mol. Ther. Nucleic Acids, vol. 20, pp. 155-160, 2020.
[20] A. Tharwat, “Classification assessment methods,” Appl. Comput. Inform., vol. 17, no. 1, pp. 168-192, 2021.
[21] M. Heydarian et al., “MLCM: Multi-label confusion matrix,” IEEE Access, vol. 10, pp. 19083-19086, 2022.