Dynamic Decision Making (DDM) is the process of assessing and choosing among alternatives in a system that changes over time. Humans make decisions in dynamic tasks from experience: they recognize past situations similar to the current problem, retrieve the solutions applied then, and evaluate how well those solutions worked. This process is formalized in Instance-Based Learning (IBL) theory, which lets us build computational models that emulate human decision making across many contexts, including cybersecurity, dynamic resource allocation, dynamic control systems, search and rescue scenarios, exploration-exploitation problems, and more.
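As a rough illustration of the IBL cycle described above (recognize, retrieve, evaluate), the sketch below stores experienced instances and chooses options by blending past outcomes weighted by memory activation, so frequent and recent experiences matter more. The class name, parameter values (decay, temperature, default utility), and the simplified handling of similarity are illustrative assumptions for this sketch, not the lab's published models or software.

```python
import math
import random
from collections import defaultdict

class SimpleIBLAgent:
    """Minimal Instance-Based Learning sketch (illustrative, not the lab's code):
    store (option, outcome) instances and choose by blending past outcomes
    weighted by an ACT-R-style activation."""

    def __init__(self, decay=0.5, temperature=0.25, default_utility=10.0):
        self.decay = decay              # d: memory decay (recency/frequency)
        self.temperature = temperature  # tau: softness of retrieval weighting
        self.default_utility = default_utility  # optimistic prior -> exploration
        self.t = 0                      # trial counter
        # option -> outcome -> trials at which that instance was experienced
        self.memory = defaultdict(lambda: defaultdict(list))

    def _activation(self, occurrences):
        # Base-level activation: more frequent and more recent instances
        # are more active and thus weigh more heavily in the blend.
        return math.log(sum((self.t - ti + 1) ** -self.decay for ti in occurrences))

    def _blended_value(self, option):
        outcomes = self.memory[option]
        if not outcomes:
            return self.default_utility   # unexplored options look attractive
        acts = {o: self._activation(ts) for o, ts in outcomes.items()}
        # Boltzmann weighting over stored outcomes of this option.
        weights = {o: math.exp(a / self.temperature) for o, a in acts.items()}
        total = sum(weights.values())
        return sum(o * w / total for o, w in weights.items())

    def choose(self, options):
        self.t += 1
        values = {opt: self._blended_value(opt) for opt in options}
        return max(values, key=values.get)

    def learn(self, option, outcome):
        # Store the experienced instance for future retrieval.
        self.memory[option][outcome].append(self.t)


# Usage: a repeated binary-choice task where option "b" is better on average.
agent = SimpleIBLAgent()
for _ in range(100):
    choice = agent.choose(["a", "b"])
    payoff = random.gauss(1.0, 1.0) if choice == "a" else random.gauss(1.5, 1.0)
    agent.learn(choice, payoff)
```

Full IBL models also add activation noise and similarity-based partial matching of situation features; those are omitted here to keep the sketch short.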
Our research aims to answer three questions:
Discover how we address these questions by exploring our research, publications, resources, and games.