My research focuses on designing systems of representation for autonomous mobile robots. I have created a task-based design methodology that allows the designer to develop the agent's internal representation of its environment based on the agent's goals and capabilities.
Primarily, I am interested in representation that is useful for the "action-oriented" part of an agent's architecture, i.e., not the planner. My work begins with the assumption that the world is dynamic, so robots must have a fairly direct coupling between perception and action if they are to be effective. This means they cannot afford to do a lot of high-level inferencing (planning) as part of their low-level control, and this affects the kinds of representation structures they can use efficiently and effectively. By analyzing the tasks that the robot must perform, the designer can answer three basic questions: What to represent? How to structure that representation? How to keep that representation consistent with the changing state of a dynamic environment?
I advocate the use of representation in all layers of an agent architecture. Although some (like Brooks) have argued that intelligent behavior can be achieved without representation, I believe this to be something of a strawman. The real question is how to balance the informational needs of any particular component of the agent architecture (what knowledge needs to be captured in the representation) against the cost of keeping the representation consistent with the state of a dynamic world.
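As a purely illustrative sketch (not the architecture from my papers), the loop below shows one way the three design questions can surface in code: the representation is a small, short-lived store of recently sensed obstacles (what to represent), kept as timestamped points in the robot's local frame (how to structure it), and entries are discarded after a fixed age so stale information cannot drive action in a world that has changed (how to keep it consistent). All names, parameters, and the decay policy are hypothetical.

```python
# Illustrative only: a minimal perception/action loop with a small,
# time-bounded internal representation. Names and parameters are
# hypothetical, not taken from the published systems.
from dataclasses import dataclass

@dataclass
class Obstacle:
    x: float          # position in the robot's local frame (meters)
    y: float
    seen_at: float    # time of the observation (seconds)

class ShortTermMap:
    """What to represent: only recently observed obstacles.
    How to structure it: a flat list of timestamped points.
    How to keep it consistent: drop entries older than max_age seconds."""
    def __init__(self, max_age=2.0):
        self.max_age = max_age
        self.obstacles = []

    def update(self, detections, now):
        self.obstacles.extend(Obstacle(x, y, now) for x, y in detections)
        self.obstacles = [o for o in self.obstacles
                          if now - o.seen_at <= self.max_age]

    def nearest_distance(self):
        if not self.obstacles:
            return float("inf")
        return min((o.x ** 2 + o.y ** 2) ** 0.5 for o in self.obstacles)

def control_step(sensed_points, short_term_map, now):
    """Tightly coupled perception/action: no planning, just a reflexive
    speed command derived from the bounded representation."""
    short_term_map.update(sensed_points, now)
    d = short_term_map.nearest_distance()
    forward_speed = min(0.5, max(0.0, 0.25 * (d - 0.3)))  # slow near obstacles
    return forward_speed
```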
F. Brill, G. Wasson, G. Ferrer, and W. Martin. 1998. The Effective Field of View Paradigm: Adding Representation to a Reactive System. Journal of Engineering Applications of Artificial Intelligence 11 (Special Issue on Machine Vision for Intelligent Vehicles and Autonomous Robots): 189-201.
G. Wasson, D. Kortenkamp, and E. Huber. 1998. Integrating Active Perception with an Autonomous Robot Architecture. Autonomous Agents 98: 325-331.
G. Wasson, G. Ferrer, and W. Martin. 1997. Perception, Action and Effective Representation in Multi-Layered Systems. GI/VI-97: 73-80.
G. Wasson, G. Ferrer, and W. Martin. 1997. Systems for Perception, Action and Effective Representation. FLAIRS-97 - Special Track on Real-Time Planning and Reacting: 352-356.
G. Wasson, G. Ferrer, and W. Martin. 1997. Hide and Seek: Effective Use of Memory in Perception/Action Systems. Autonomous Agents 97: 492-493.
G. Wasson, E. Huber, and D. Kortenkamp. 1998. A Behavior Based, Visual Architecture for Autonomous Robots. CVPR 98 Workshop on Perception for Mobile Agents: 89-94.
G. Wasson and W. Martin. 1996. Integration and Action in Perception/Action Systems with Access to Non-local Space Information. AAAI-96 Workshop on Planning, Action and Control: Bridging the Gap.
Sound source direction is estimated from the time difference between the arrival of a source's signal at each of a pair of microphones. This time difference, together with the microphone geometry and the speed of sound, determines the source's angular position. The time delay itself is estimated using phase correlation. My thesis on the subject can be accessed here.
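As a rough illustration of this approach (a sketch of standard phase-correlation time-delay estimation, sometimes called GCC-PHAT, not the exact method from the thesis), the snippet below whitens the cross-spectrum of two microphone signals so that only phase information remains, picks the correlation peak to get the delay, and converts the delay to a bearing under a far-field model. The microphone spacing, sample rate, and test signals are assumed values for illustration.

```python
# Sketch of time-delay estimation by phase correlation and conversion of
# the delay to a source bearing. Spacing, sample rate, and the test
# signals below are assumptions, not values from the thesis.
import numpy as np

def phase_correlation_delay(sig_a, sig_b, sample_rate):
    """Return the estimated delay (seconds) of sig_b relative to sig_a."""
    n = len(sig_a) + len(sig_b)                # zero-pad to avoid circular wrap-around
    A = np.fft.rfft(sig_a, n=n)
    B = np.fft.rfft(sig_b, n=n)
    cross = np.conj(A) * B
    cross /= np.abs(cross) + 1e-12             # phase transform: keep phase, discard magnitude
    corr = np.fft.irfft(cross, n=n)
    lag = int(np.argmax(np.abs(corr)))
    if lag > n // 2:                           # lags in the upper half are negative delays
        lag -= n
    return lag / sample_rate

def delay_to_bearing(delay, mic_spacing=0.2, speed_of_sound=343.0):
    """Far-field model: angle (radians) from the array broadside."""
    sin_theta = np.clip(delay * speed_of_sound / mic_spacing, -1.0, 1.0)
    return np.arcsin(sin_theta)

if __name__ == "__main__":
    fs = 16000                                 # assumed sample rate (Hz)
    source = np.random.randn(fs)               # broadband test source
    true_delay = 5                             # samples
    mic_a = source
    mic_b = np.concatenate([np.zeros(true_delay), source[:-true_delay]])
    d = phase_correlation_delay(mic_a, mic_b, fs)
    print("estimated delay (s):", d,
          "bearing (deg):", np.degrees(delay_to_bearing(d)))
```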
wasson@virginia.edu