Abstract
A child's early years strongly affect educational potential, and consequently the educational achievements attainable in adulthood. The brain develops fastest at an early age, and cognitive opportunities missed in that period are difficult to make up for later. In this research, we developed a machine learning model based on data points obtained from an educational game, with the aim of predicting how many attempts an individual child needs to complete a task or assessment within the game. In-game assessments build on the skills the child already possesses and those developed while playing the game. The machine learning model is trained on collected and processed data points (features), while the model's interconnections reflect factors of the child's cognitive development. Model performance benchmarks, as quality measures of the forecast indicators, are presented in the results and conclusion sections of the paper.
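The prediction task described in the abstract can be illustrated with a minimal sketch. All names, feature choices, and the baseline rule below are assumptions for illustration only, not the paper's actual model or data: each child's past attempt counts are turned into simple numeric features, and a naive baseline forecasts the attempts needed on the next assessment.

```python
# Hypothetical per-child gameplay records: each entry is the number of
# attempts the child needed on a previously completed in-game task.
history = {
    "child_a": [1, 1, 2, 1],
    "child_b": [3, 4, 2, 5],
    "child_c": [2, 2, 3, 2],
}

def extract_features(attempts):
    """Turn a raw attempt log into simple numeric features."""
    return {
        "mean_attempts": sum(attempts) / len(attempts),
        "max_attempts": max(attempts),
        "first_try_rate": sum(1 for a in attempts if a == 1) / len(attempts),
    }

def predict_attempts(features):
    """Naive baseline: round the historical mean attempt count as the
    forecast for the next assessment. A trained model (e.g. gradient
    boosting) would replace this rule in practice."""
    return round(features["mean_attempts"])

for child, attempts in history.items():
    feats = extract_features(attempts)
    print(child, "->", predict_attempts(feats))
```

In practice such features would feed a supervised learner trained on many children, with an agreement measure such as weighted kappa used to benchmark the forecasts.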
Copyright information
© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Tolic, A., Mrsic, L., Jerkovic, H. (2021). Learning Success Prediction Model for Early Age Children Using Educational Games and Advanced Data Analytics. In: Vasant, P., Zelinka, I., Weber, GW. (eds) Intelligent Computing and Optimization. ICO 2020. Advances in Intelligent Systems and Computing, vol 1324. Springer, Cham. https://doi.org/10.1007/978-3-030-68154-8_61
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-68153-1
Online ISBN: 978-3-030-68154-8
eBook Packages: Intelligent Technologies and Robotics (R0)