Improving Memory for Optimization and Learning in Dynamic Environments - Robotics Institute Carnegie Mellon University

PhD Thesis Defense

Gregory John Barlow, Carnegie Mellon University
Thursday, July 7
10:30 am to 12:00 pm
Improving Memory for Optimization and Learning in Dynamic Environments

Event Location: NSH 1507

Abstract: Many problems considered in optimization and artificial intelligence research are static: information about the problem is known a priori, and little to no uncertainty about this information is presumed to exist. Most real problems, however, are dynamic: information about the problem is released over time, uncertain events may occur, or the requirements of the problem may change as time passes. One technique for improving optimization and learning in dynamic environments is to use information from the past. By using solutions from previous environments, it is often easier to find promising solutions in a new environment. A common way to maintain and exploit information from the past is the use of memory, where solutions are stored periodically and can be retrieved and refined when the environment changes. Memory can help search respond quickly and efficiently to changes in a dynamic problem. Despite their strengths, standard memories have many weaknesses that limit their effectiveness. This thesis explores ways to improve memory for optimization and learning in dynamic environments. The techniques presented in this thesis improve memories by incorporating probabilistic models of previous solutions into memory, storing many previous solutions in memory while keeping overhead low, building long-term models of the dynamic search space over time, allowing easy refinement of memory entries, and mapping previous solutions to the current environment for problems where solutions may become obsolete over time. To address the weaknesses and limitations of standard memory, two novel classes of memory are introduced: density-estimate memory and classifier-based memory. Density-estimate memory builds and maintains probabilistic models within memory to create density estimations of promising areas of the search space over time. Density-estimate memory allows many solutions to be stored in memory, builds long-term models of the dynamic search space, and allows memory entries to be easily refined while keeping overhead low. Density-estimate memory is applied to three dynamic problems: factory coordination, the Moving Peaks benchmark problem, and adaptive traffic signal control. For all three problems, density-estimate memory improves performance over a baseline learning or optimization algorithm as well as over state-of-the-art algorithms. Classifier-based memory allows dynamic problems with shifting feasible regions to capture solutions in memory and then map these memory entries to feasible solutions in the future. By storing abstractions of solutions in memory, information about previous solutions can be used to create solutions in a new environment, even when the old solutions are now completely obsolete or infeasible. Classifier-based memory is applied to a dynamic job shop scheduling problem with sequence-dependent setup times and machine breakdowns and repairs. Classifier-based memory improves the quality of schedules and reduces the amount of search necessary to find good schedules. The techniques presented in this thesis improve the ability of memories to guide search quickly and efficiently to good solutions as the environment changes.
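To make the density-estimate memory idea in the abstract concrete, the sketch below is a minimal illustration, not the thesis implementation: the class name DensityEstimateMemory, the capacity and merge_radius parameters, and the Gaussian model per entry are all illustrative assumptions. It shows how an entry can summarize many good past solutions with a running mean and scatter matrix (low overhead), and how the stored densities can be sampled to seed search in previously promising regions after the environment changes.

```python
# Illustrative sketch of a density-estimate memory for dynamic optimization.
# Names, thresholds, and the Gaussian model choice are assumptions, not the
# thesis's actual algorithm.
import numpy as np


class DensityEstimateMemory:
    """Each entry summarizes many good past solutions as a running Gaussian
    (mean + scatter matrix), so information from many solutions is retained
    while per-entry overhead stays constant."""

    def __init__(self, capacity=5, merge_radius=1.0):
        self.capacity = capacity          # maximum number of density models
        self.merge_radius = merge_radius  # illustrative merge threshold
        self.entries = []                 # each: {'mean', 'scatter', 'count'}

    def store(self, solution):
        """Fold a good solution into the nearest entry, or start a new one."""
        x = np.asarray(solution, dtype=float)
        if self.entries:
            dists = [np.linalg.norm(x - e["mean"]) for e in self.entries]
            i = int(np.argmin(dists))
            if dists[i] < self.merge_radius or len(self.entries) >= self.capacity:
                e = self.entries[i]
                n = e["count"]
                delta = x - e["mean"]
                e["mean"] = e["mean"] + delta / (n + 1)         # incremental mean
                e["scatter"] += np.outer(delta, x - e["mean"])  # Welford update
                e["count"] = n + 1
                return
        self.entries.append({"mean": x.copy(),
                             "scatter": np.zeros((len(x), len(x))),
                             "count": 1})

    def retrieve(self, n_samples=10, rng=None):
        """After a detected environment change, sample candidate solutions
        from each stored density model to seed the search."""
        rng = np.random.default_rng() if rng is None else rng
        samples = []
        for e in self.entries:
            cov = e["scatter"] / max(e["count"] - 1, 1)
            cov += 1e-6 * np.eye(len(e["mean"]))  # jitter for single-sample entries
            samples.append(rng.multivariate_normal(e["mean"], cov, size=n_samples))
        return np.vstack(samples) if samples else np.empty((0, 0))
```

In a typical memory-based optimizer, store would be called periodically with the best current solution, and retrieve would reseed part of the population or search state when a change in the environment is detected.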

Committee: Stephen Smith, Chair

Katia Sycara

Laura Barbulescu

Juergen Branke, University of Warwick