Korean Institute of Electrical Engineers, The Transactions of the Korean Institute of Electrical Engineers

References

1 
L. A. Gottlieb, A. Kontorovich, R. Krauthgamer, 2016, Adaptive metric dimensionality reduction, Theoretical Computer Science, Vol. 620, pp. 105-118.
2 
National Cancer Center. Available online: https://ncc.re.kr/index (accessed 17 August 2023).
3 
H. Chi, I. H. Chang, 2018, The Overdiagnosis of Kidney Cancer in Koreans and the Active Surveillance on Small Renal Mass, Korean J Urol Oncol, Vol. 16, No. 1, pp. 15-24.
4 
A. M. Ali, H. Zhuang, A. Ibrahim, O. Rehman, M. Huang, A. Wu, Nov. 2018, A machine learning approach for the classification of kidney cancer subtypes using miRNA genome data, Appl Sci, Vol. 8, No. 2422, pp. 1-14.
5 
H. M. Kim, S. J. Lee, S. J. Park, I. Y. Choi, S. Hong, 2021, Machine Learning Approach to Predict the Probability of Recurrence of Renal Cell Carcinoma After Surgery: Prediction Model Development Study, JMIR Med Inform, Vol. 9, No. 3.
6 
A. J. Peired, R. Campi, M. L. Angelotti, G. Antonelli, C. Conte, E. Lazzeri, F. Becherucci, L. Calistri, S. Serni, P. Romagnani, 2021, Sex and Gender Differences in Kidney Cancer: Clinical and Experimental Evidence, Cancers, Vol. 13, No. 18, p. 4588.
7 
Genomic Data Commons. Available online: https://portal.gdc.cancer.gov (accessed 17 August 2023).
8 
B. J. Kim, S. H. Kim, 2018, Prediction of inherited genomic susceptibility to 20 common cancer types by a supervised machine-learning method, Proc Natl Acad Sci USA, Vol. 115, No. 6, pp. 1322-1327.
9 
O. G. Troyanskaya, K. Dolinski, A. B. Owen, R. B. Altman, D. Botstein, 2003, A Bayesian framework for combining heterogeneous data sources for gene function prediction (in S. cerevisiae), Proc Natl Acad Sci USA, Vol. 100, No. 14, pp. 8348-8353.
10 
N. E. M. Khalifa, M. H. N. Taha, D. E. Ali, A. Slowik, A. E. Hassanien, 2020, Artificial intelligence technique for gene expression by tumor RNA-Seq data: a novel optimized deep learning approach, IEEE Access, Vol. 8, pp. 22874-22883.
11 
H. S. Shon, K. O. Kim, E. J. Cha, K. A. Kim, 2020, Classification of Kidney Cancer Data based on Feature Extraction Methods, The Transactions of the Korean Institute of Electrical Engineers, Vol. 69, No. 7, pp. 1061-1066.
12 
H. S. Shon, E. Batbaatar, E. J. Cha, T. G. Kang, S. G. Choi, K. A. Kim, 2022, Deep Autoencoder based Classification for Clinical Prediction of Kidney Cancer, The Transactions of the Korean Institute of Electrical Engineers, Vol. 71, No. 10, pp. 1393-1404.
13 
H. S. Shon, E. Batbaatar, K. O. Kim, E. J. Cha, K. A. Kim, 2020, Classification of kidney cancer data using cost-sensitive hybrid deep learning approach, Symmetry, Vol. 12, No. 1, pp. 1-21.
14 
Y. Bengio, E. Laufer, G. Alain, J. Yosinski, 2014, Deep generative stochastic networks trainable by backprop, Proceedings of the 31st International Conference on Machine Learning, Vol. 32, pp. 226-234.
15 
B. Kalaiselvi, M. Thangamani, 2020, An efficient Pearson correlation based improved random forest classification for protein structure prediction techniques, Measurement, Vol. 162.
16 
I. Jain, V. K. Jain, R. Jain, 2018, Correlation feature selection based improved-binary particle swarm optimization for gene selection and cancer classification, Applied Soft Computing, Vol. 62, pp. 203-215.
17 
Z. M. Hira, D. F. Gillies, 2015, A review of feature selection and feature extraction methods applied on microarray data, Adv Bioinformatics, Vol. 2015.
18 
R. Tibshirani, 1996, Regression shrinkage and selection via the lasso, Journal of the Royal Statistical Society: Series B, Vol. 58, No. 1, pp. 267-288.
19 
E. R. Girden, 1992, ANOVA: Repeated Measures, Sage.
20 
I. Guyon, M. Nikravesh, S. Gunn, L. A. Zadeh, 2008, Feature Extraction: Foundations and Applications, Springer.
21 
A. Nakra, M. Duhan, 2020, Feature Extraction and Dimensionality Reduction Techniques with Their Advantages and Disadvantages for EEG-Based BCI System: A Review, IUP Journal of Computer Sciences, Vol. 14.
22 
X. Zhang, W. Yang, X. Tang, J. Liu, 2018, A fast learning method for accurate and robust lane detection using two-stage feature extraction with YOLO v3, Sensors, Vol. 18, No. 12, p. 4308.
23 
M. Oravec, 2014, Feature extraction and classification by machine learning methods for biometric recognition of face and iris, Proceedings of ELMAR-2014.
24 
P. Baldi, 2011, Autoencoders, unsupervised learning, and deep architectures, Proceedings of Machine Learning Research, Vol. 27.
25 
M. Sewak, S. K. Sahay, H. Rathore, 2020, An overview of deep learning architecture of deep neural networks and autoencoders, Journal of Computational and Theoretical Nanoscience, Vol. 17, No. 1, pp. 182-188.
26 
L. Weng, 2018, From Autoencoder to Beta-VAE. Available online: https://lilianweng.github.io/posts/2018-08-12-vae/
27 
D. P. Kingma, M. Welling, 2013, Auto-encoding variational bayes, arXiv preprint arXiv:1312.6114.
28 
S. M. A. Elrahman, A. Abraham, 2013, A review of class imbalance problem, Journal of Network and Innovative Computing, Vol. 1, pp. 332-340.
29 
K. M. Hasib, M. Iqbal, F. M. Shah, J. A. Mahmud, M. H. Popel, M. Showrov, S. Ahmed, O. Rahman, 2020, A survey of methods for managing the classification and solution of data imbalance problem, arXiv preprint arXiv:2012.11870.
30 
D. Li, C. Liu, S. C. Hu, 2010, A learning method for the class imbalance problem with medical data sets, Computers in Biology and Medicine, Vol. 40, No. 5, pp. 509-518.
31 
N. V. Chawla, K. W. Bowyer, L. O. Hall, W. P. Kegelmeyer, 2002, SMOTE: synthetic minority over-sampling technique, Journal of Artificial Intelligence Research, Vol. 16, pp. 321-357.
32 
D. G. Kleinbaum, K. Dietz, M. Gail, M. Klein, 2002, Logistic Regression, Springer.
33 
W. S. Noble, 2006, What is a support vector machine?, Nature Biotechnology, Vol. 24, No. 12, pp. 1565-1567.
34 
B. Charbuty, A. Abdulazeez, 2021, Classification based on decision tree algorithm for machine learning, Journal of Applied Science and Technology Trends, Vol. 2, No. 1, pp. 20-28.
35 
Y. Qi, 2012, Random forest for bioinformatics, in Ensemble Machine Learning: Methods and Applications, Springer, pp. 307-323.
36 
N. S. Altman, 1992, An introduction to kernel and nearest-neighbor nonparametric regression, The American Statistician, Vol. 46, No. 3, pp. 175-185.
37 
K. P. Murphy, 2006, Naive Bayes classifiers, University of British Columbia, Vol. 18, No. 60, pp. 1-8.
38 
T. Hastie, S. Rosset, J. Zhu, H. Zou, 2009, Multi-class AdaBoost, Statistics and Its Interface, Vol. 2, No. 3, pp. 349-360.
39 
T. Chen, C. Guestrin, 2016, XGBoost: A scalable tree boosting system, Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785-794.
40 
S. Ruder, 2016, An overview of gradient descent optimization algorithms, arXiv preprint arXiv:1609.04747.
41 
2020, Performance evaluation of supervised machine learning algorithms in prediction of heart disease, 2020 IEEE International Conference for Innovation in Technology, IEEE.
42 
M. Viscaino, J. T. Bustos, P. Muñoz, C. A. Cheein, F. A. Cheein, 2021, Artificial intelligence for the early detection of colorectal cancer: A comprehensive review of its advantages and misconceptions, World Journal of Gastroenterology, Vol. 27, No. 38, p. 6399.
43 
A. N. Richter, T. M. Khoshgoftaar, 2018, A review of statistical and machine learning methods for modeling cancer risk using structured clinical data, Artificial Intelligence in Medicine, Vol. 90, pp. 1-14.
44 
S. R. Stahlschmidt, B. Ulfenborg, J. Synnergren, 2022, Multimodal deep learning for biomedical data fusion: a review, Briefings in Bioinformatics, Vol. 23, No. 2, bbab569.