Downloads | Citations | Reads
2,947 | 23 | 8
Abstract: With the rapid development of artificial intelligence technology, complex AI models, typified by deep learning, have gradually been applied to various intelligent teaching systems and platforms. However, such models usually learn implicit features and patterns from massive data, which renders their decision-making processes opaque and makes it difficult to provide users with clear, easily understandable explanations. This, in turn, can breed user distrust and introduce hard-to-detect errors. This study first introduces current explainable AI (XAI) techniques and their basic methods, using the explanation of learner models as a typical case in the education field. On this basis, it organizes and proposes three application modes of explainable AI in education at the micro, meso, and macro levels: validating educational models, aiding the understanding of systems, and supporting educational decision-making. Finally, the study offers specific suggestions and an outlook on the application of explainable AI in education.
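As an illustrative sketch of the kind of basic XAI method the abstract refers to (this example is not from the paper; the toy learner model, its feature names, and its weights are all hypothetical), an occlusion-style attribution over a simple learner-performance predictor might look like:

```python
import math

# Hypothetical toy learner model: predicts the probability that a student
# answers the next question correctly from three made-up features.
WEIGHTS = {"prior_accuracy": 2.0, "attempts": -0.3, "hints_used": -0.8}
BIAS = -0.5

def predict_correct(features):
    """Logistic model over a dict of feature values."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

def explain(features, baseline=0.0):
    """Occlusion-style attribution: replace one feature at a time with a
    baseline value and record how much the prediction changes as a result."""
    full = predict_correct(features)
    return {name: full - predict_correct(dict(features, **{name: baseline}))
            for name in features}

student = {"prior_accuracy": 0.9, "attempts": 2.0, "hints_used": 1.0}
attributions = explain(student)
# prior_accuracy contributes positively; attempts and hints_used pull the
# predicted probability of a correct answer down.
```

Such post-hoc, model-agnostic attributions are one of the simplest ways to surface which learner features drove an otherwise opaque prediction; real XAI toolkits refine the same idea with principled baselines and sampling.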
Basic information:
DOI:
CLC number: G434
Citation:
[1] Lu Yu, Zhang Zhi, Wang Deliang, et al. Research on the Application Modes of Explainable Artificial Intelligence in Education [J]. China Educational Technology (中国电化教育), 2022, No.427(08): 9-15+23.
Funding:
Partial result of the National Natural Science Foundation of China General Program "Knowledge Tracing Model Construction Based on Interdisciplinary Concept Maps and Its Interpretability" (Grant No. 62077006).