Smart Environment for Adaptive Learning of Cybersecurity Skills (2307.05281v1)

Published 11 Jul 2023 in cs.CR and cs.CY

Abstract: Hands-on computing education requires a realistic learning environment that enables students to gain and deepen their skills. Available learning environments, including virtual and physical labs, provide students with real-world computer systems but rarely adapt to individual students of varying proficiency and background. We designed a unique and novel smart environment for adaptive training of cybersecurity skills. The environment collects a variety of student data to assign a suitable learning path through the training. To enable this adaptiveness, we proposed, developed, and deployed a new tutor model and training format. We evaluated the learning environment using two different adaptive trainings attended by 114 students of varying proficiency. The results show that students were assigned tasks of more appropriate difficulty, which enabled them to complete the training successfully. Students reported that they enjoyed the training, felt its difficulty was appropriately designed, and would attend more sessions like it. Instructors can use the environment to teach any topic involving real-world computer networks and systems because it is not tailored to a particular training. We freely released the software along with exemplary training sessions so that other instructors can adopt these innovations in their teaching practice.
