Computer Engineering, Vol. 51, Issue 8, 120 (2025)
Personalized Forgetting Modeling for Knowledge Tracing via Transformers
Traditional knowledge tracing (KT) models struggle to capture how learners' knowledge states evolve over long interaction sequences. Attention-based models, represented by the Transformer, have been introduced to capture latent information in learners' long interaction sequences and have shown good performance. However, when modeling the learning process, existing models often ignore differences in learners' abilities, focus mainly on the accumulation of knowledge mastery, and fail to fully model learners' forgetting behavior. This study proposes a Knowledge Tracing method based on Personalized Forgetting modeling (PFKT), which models learners' answering ability by introducing additional feature information and further explores learners' differentiated memory and forgetting abilities. Specifically, the method starts from learners' historical interaction sequences and jointly considers the acquisition and forgetting of knowledge points to capture learners' real knowledge mastery states; combined with the additional feature information, it models the personalized forgetting phenomenon more accurately. Experimental results demonstrate that the proposed PFKT model outperforms existing models on the ASSISTments2017 and Algebra 2005-2006 datasets.
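To make the idea concrete, the following is a minimal sketch, assuming a PyTorch setting, of how a Transformer-style attention layer for knowledge tracing could down-weight past interactions with a learner-specific forgetting decay. The module and parameter names (ForgettingAttention, forget_rate) are hypothetical illustrations and do not reflect the authors' actual PFKT implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class ForgettingAttention(nn.Module):
    """Hypothetical attention layer with a per-learner forgetting decay."""
    def __init__(self, d_model: int):
        super().__init__()
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        # Maps an extra learner-ability feature to a forgetting rate (assumption).
        self.forget_rate = nn.Linear(d_model, 1)

    def forward(self, x, ability, time_gap):
        # x:        (B, T, d) embedded exercise-response interactions
        # ability:  (B, d)    additional learner feature, e.g. answering ability
        # time_gap: (B, T, T) elapsed time between interactions i and j
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        scores = q @ k.transpose(-2, -1) / (x.size(-1) ** 0.5)      # (B, T, T)
        # Personalized decay: older interactions are penalized at a rate
        # that differs per learner, mimicking individual forgetting.
        rate = F.softplus(self.forget_rate(ability)).unsqueeze(-1)  # (B, 1, 1)
        scores = scores - rate * time_gap
        # Causal mask: position t attends only to interactions up to t.
        mask = torch.triu(torch.ones_like(scores, dtype=torch.bool), diagonal=1)
        scores = scores.masked_fill(mask, float("-inf"))
        return F.softmax(scores, dim=-1) @ v

# Toy usage with random tensors
layer = ForgettingAttention(16)
out = layer(torch.randn(2, 5, 16), torch.randn(2, 16), torch.rand(2, 5, 5))
print(out.shape)  # torch.Size([2, 5, 16])

In this sketch the decay enters the attention scores before the softmax, so a learner with a larger forgetting rate attends less to distant interactions; this is one plausible way to realize personalized forgetting, not necessarily the one used in the paper.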
ZHANG Zhaoli, LI Jiahao, LIU Hai, SHI Fobo, HE Jiawen. Personalized Forgetting Modeling for Knowledge Tracing via Transformers[J]. Computer Engineering, 2025, 51(8): 120
Received: Apr. 15, 2024
Accepted: Aug. 26, 2025
Published Online: Aug. 26, 2025
Author Email: LIU Hai (hailiu0204@ccnu.edu.cn)