Explanations play a fundamental role in many artificial intelligence tasks, but generating useful explanations remains a difficult problem. This project studies goal-driven interactive explanation, an approach that combines aspects of goal-driven learning and case-based reasoning to guide explanation in real-world settings. The goal is to develop a pragmatic process for generating useful explanations, one that combines strategic decision-making about when and what to learn with opportunistic interaction within a dynamic environment.
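As a rough illustration of the kind of process this model describes, the Python sketch below shows one goal-driven, case-based explanation cycle: retrieve stored explanation cases, evaluate candidates against the current explanation goal, and fall back on interaction with the environment (here, a user) when no stored case fits, learning the result as a new case. All of the names here (ExplanationCase, retrieve, satisfies_goal, explain) are hypothetical and are not drawn from the project itself.

from dataclasses import dataclass, field

@dataclass
class ExplanationCase:
    anomaly: str        # the surprising event to be explained
    explanation: str    # a stored candidate explanation
    # Explanation goals this stored explanation has served before.
    goals_served: set = field(default_factory=set)

def retrieve(library, anomaly):
    """Retrieve stored cases whose anomaly matches the new one."""
    return [case for case in library if case.anomaly == anomaly]

def satisfies_goal(case, goal):
    """Check whether a candidate explanation serves the current goal."""
    return goal in case.goals_served

def explain(library, anomaly, goal, ask_user):
    # 1. Retrieve candidate explanations from prior cases.
    for case in retrieve(library, anomaly):
        # 2. Evaluate each candidate against the explainer's goal.
        if satisfies_goal(case, goal):
            return case.explanation
    # 3. No stored case fits: interact opportunistically with the
    #    environment to gather a new explanation ...
    new_explanation = ask_user(anomaly, goal)
    # 4. ... and learn it, storing a new case for future reuse.
    library.append(ExplanationCase(anomaly, new_explanation, {goal}))
    return new_explanation

if __name__ == "__main__":
    library = [ExplanationCase("late delivery", "truck broke down", {"predict"})]
    answer = explain(library, "late delivery", "predict",
                     ask_user=lambda a, g: input(f"Why did '{a}' happen? "))
    print(answer)

In this sketch the "strategic" element is the goal test that decides whether learning is needed at all, while the "opportunistic" element is deferring to whatever information the environment can supply at the moment a gap is found.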
Associated Faculty: David B. Leake
Associated Graduate Students: Raja Sooriamurthi
Support: This research has been supported by start-up funds from the Indiana University Computer Science Department.
For more information, see the CBR research pages at Indiana.