Autonomous Coordinated Motion
Through collaboration with several other research programs, we are studying the notion of autonomous coordinated motion. Dramatic examples of such behavior are exhibited in nature by flocking birds, herding mammals, schooling fish, and swarming insects. Research at Yale is aimed at understanding this behavior, simulating it on computers, and programming robots to exhibit similar behaviors on the ground, in the air, and under water. We are particularly interested in tasks such as congregation, navigation, obstacle avoidance, target capture, and target fleeing. Yale is also involved in RoboCup Soccer, in which teams of small robots compete each year on artificial soccer fields against teams from other research groups around the world. Yale's experimental testbeds include a custom-made team of "soccerbots," a small fleet of Probotic Cye "Luxury Wagons," and a small number of model helicopters.
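
As a generic illustration of the kind of group behavior described above, the sketch below implements a classic boids-style update rule (cohesion, separation, alignment) in Haskell. It is not the algorithm used in the Yale testbeds; the agent representation, weights, and radii are all made up for the example.

    -- A generic, boids-style flocking update, included only to illustrate the
    -- kind of coordinated group behavior described above; it is not the
    -- algorithm used in the Yale testbeds, and the weights and radii below
    -- are arbitrary.

    type Vec = (Double, Double)

    data Agent = Agent { pos :: Vec, vel :: Vec } deriving Show

    add, sub :: Vec -> Vec -> Vec
    add (x1, y1) (x2, y2) = (x1 + x2, y1 + y2)
    sub (x1, y1) (x2, y2) = (x1 - x2, y1 - y2)

    scale :: Double -> Vec -> Vec
    scale k (x, y) = (k * x, k * y)

    dist :: Vec -> Vec -> Double
    dist a b = let (dx, dy) = sub a b in sqrt (dx * dx + dy * dy)

    -- Componentwise average of a non-empty list of vectors.
    average :: [Vec] -> Vec
    average vs = scale (1 / fromIntegral (length vs)) (foldr add (0, 0) vs)

    -- One simulation step: each agent steers toward the local centroid
    -- (cohesion), away from very close neighbors (separation), and toward
    -- the average velocity of its neighbors (alignment).
    step :: Double -> [Agent] -> [Agent]
    step dt agents = map update agents
      where
        radius = 5.0
        update a
          | null nbrs = a { pos = add (pos a) (scale dt (vel a)) }
          | otherwise = a { pos = add (pos a) (scale dt vel'), vel = vel' }
          where
            -- Neighbors within the sensing radius; the agent itself is
            -- excluded by comparing positions (a simplification).
            nbrs = [ b | b <- agents
                       , dist (pos a) (pos b) < radius
                       , pos b /= pos a ]
            cohesion   = sub (average (map pos nbrs)) (pos a)
            separation = foldr add (0, 0)
                           [ sub (pos a) (pos b)
                           | b <- nbrs, dist (pos a) (pos b) < 1.0 ]
            alignment  = sub (average (map vel nbrs)) (vel a)
            vel'       = foldr add (vel a)
                           [ scale 0.01 cohesion
                           , scale 0.05 separation
                           , scale 0.02 alignment ]

    -- Run ten steps from a line of agents all heading the same way.
    main :: IO ()
    main = mapM_ print (iterate (step 0.1) initial !! 10)
      where initial = [ Agent (x, 0) (0, 1) | x <- [0, 2 .. 8] ]

Even a simple rule set like this produces the congregation and collision-avoidance behaviors mentioned above, which is why flocking is a common starting point for simulation work.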
Research in autonomous coordinated motion involves a cross-disciplinary blend of ideas drawn from automatic control, high-level programming languages, and planning. Steve Morse's research in multi-level logic-based switching control is central to the algorithmic supervision, coordination, and control of autonomous vehicles. Central to the software effort is Paul Hudak's notion of Functional Reactive Programming (FRP). With FRP, it is possible to develop software for control applications in a high-level, declarative manner, and to use the same infrastructure in a simulation or prototyping environment, including the use of FRP animation to display control signals and related data (a minimal sketch of the style appears below). Finally, Drew McDermott's work on hierarchical planning contributes to the high-level strategies used in intelligent coordinated motion. The overall goal of the research is to develop a comprehensive methodology for the integration of sensing, control, planning, and learning in autonomous, multi-robot systems.
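
To make the FRP idea concrete, the following minimal sketch models a "behavior" as a time-varying value and derives a control signal from sensor behaviors declaratively. It is an illustration of the style only, not the API of the Yale FRP libraries; the sensor, set-point, and gain are hypothetical.

    -- A minimal, self-contained sketch of the FRP idea: a behavior is a
    -- time-varying value, and a control law is written by composing
    -- behaviors declaratively.  Not the Yale FRP API; all names below
    -- are invented for the example.

    type Time = Double

    -- Conceptually, a behavior is a function from time to a value.
    newtype Behavior a = Behavior { at :: Time -> a }

    -- Lift a constant into a behavior.
    constB :: a -> Behavior a
    constB x = Behavior (const x)

    -- Combine two behaviors pointwise.
    lift2 :: (a -> b -> c) -> Behavior a -> Behavior b -> Behavior c
    lift2 f (Behavior b1) (Behavior b2) = Behavior (\t -> f (b1 t) (b2 t))

    -- A hypothetical sensed heading (a stand-in for a real sensor stream)
    -- and the desired heading.
    sensedHeading, desiredHeading :: Behavior Double
    sensedHeading  = Behavior (\t -> 0.3 * sin t)
    desiredHeading = constB 0.0

    -- A proportional steering law: the control signal is itself a behavior
    -- derived from the sensor behaviors.
    steer :: Behavior Double
    steer = lift2 (\d s -> kP * (d - s)) desiredHeading sensedHeading
      where kP = 0.8

    -- Sample the control signal at a few times, e.g. for plotting.
    main :: IO ()
    main = mapM_ (\t -> print (t, steer `at` t)) [0.0, 0.5 .. 5.0]

The point of the declarative style is that a control law like steer is written once, independently of whether its inputs come from simulated or real sensors, which is what allows the same code to be reused across simulation, animation, and deployment.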
Faculty members involved in the Center for Computational Vision and Control include Drew McDermott, Brian Scassellati, Steve Zucker, and Paul Hudak in the Computer Science Department, Steve Morse in Electrical Engineering, and Jim Duncan, Hemant Tagare, and Larry Staib in the Radiology Department of the Medical School.
