References
- World Steel Association. (2022). 2022 world steel in figures. World Steel Association. https://worldsteel.org/steel-topics/statistics/world-steel-in-figures-2022/.
- Chakraborty S, Sahai Y. Effect of slag cover on heat loss and liquid steel flow in ladles before and during teeming to a continuous casting tundish. Metall Trans B. 1992;23:135–151.
- Garcia A. Solidificação: fundamentos e aplicações [Solidification: fundamentals and applications]. Campinas: Editora da UNICAMP; 2007.
- Brimacombe JK, Sorimachi K. Crack formation in the continuous casting of steel. Metall Trans B. 1977;8:489–505.
- Huang X, Thomas BG, Najjar FM. Modeling superheat removal during continuous casting of steel slabs. Metall Mater Trans B. 1992;23:339–356.
- Gupta N, Chandra S. Temperature prediction model for controlling casting superheat temperature. ISIJ Int. 2004;44:1517–1526.
- Sonoda S, Murata N, Hino H, et al. A statistical model for predicting the liquid steel temperature in ladle and tundish by bootstrap filter. ISIJ Int. 2012;52:1086–1091.
- Jormalainen T, Louhenkilpi S. A model for predicting the melt temperature in the ladle and in the tundish as a function of operating parameters during continuous casting. Steel Res Int. 2006;77:472–484.
- Tian H, Mao Z, Wang A. A new incremental learning modeling method based on multiple models for temperature prediction of molten steel in LF. ISIJ Int. 2009;49:58–63.
- Wang YN, Bao YP, Cui H, et al. Final temperature prediction model of molten steel in RH-TOP refining process for IF steel production. J Iron Steel Res Int. 2012;19(3):1–5.
- He F, He DF, Xu AJ, et al. Hybrid model of molten steel temperature prediction based on ladle heat status and artificial neural network. J Iron Steel Res Int. 2014;21(2):181–190.
- Sousa SIV, Martins FG, Alvim-Ferraz MCM, et al. Multiple linear regression and artificial neural networks based on principal components to predict ozone concentrations. Environ Model Softw. 2007;22:97–103.
- Al-Alawi SM, Abdul-Wahab SA, Bakheit CS. Combining principal component regression and artificial neural networks for more accurate predictions of ground-level ozone. Environ Model Softw. 2008;23:396–403.
- Adusumilli S, Bhatt D, Wang H, et al. A novel hybrid approach utilizing principal component regression and random forest regression to bridge the period of GPS outages. Neurocomputing. 2015;166:185–192.
- Shearer C. The CRISP-DM model: the new blueprint for data mining. J Data Warehous. 2000;5(4):13–22.
- Breiman L. Random forests. Mach Learn. 2001;45:5–32.
- Hancock JT, Khoshgoftaar TM. Survey on categorical data for neural networks. J Big Data. 2020;7(1):1–41.
- Lantz B. Machine learning with R: expert techniques for predictive modeling. Birmingham: Packt Publishing Ltd; 2019.
- Potdar K, Pardawala TS, Pai CD. A comparative study of categorical variable encoding techniques for neural network classifiers. Int J Comput Appl. 2017;175:7–9.
- Pedregosa F, Varoquaux G, Gramfort A, et al. Scikit-learn: machine learning in Python. J Mach Learn Res. 2011;12:2825–2830.
- Abraham A, Pedregosa F, Eickenberg M, et al. Machine learning for neuroimaging with scikit-learn. Front Neuroinform. 2014;8:14–14.
- Montgomery DC. Introduction to statistical quality control. Hoboken (NJ): John Wiley & Sons; 2007.
- James G, Witten D, Hastie T, et al. An introduction to statistical learning. New York: Springer; 2013.
- Kingsford C, Salzberg SL. What are decision trees? Nat Biotechnol. 2008;26:1011–1013.
- Hastie T, Tibshirani R, Friedman J. The elements of statistical learning: data mining, inference, and prediction. New York: Springer; 2009.
- Chen X, Ishwaran H. Random forests for genomic data analysis. Genomics. 2012;99:323–329.
- Bishop CM, Nasrabadi NM. Pattern recognition and machine learning. New York: Springer; 2006.
- Liashchynskyi P, Liashchynskyi P. Grid search, random search, genetic algorithm: a big comparison for NAS. arXiv preprint arXiv:1912.06059; 2019.