Definitions
Agent = entity that perceives its environment through sensors and acts upon it through effectors.
Example: the vacuum-cleaner agent
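The vacuum-cleaner world can be sketched as follows. This is a minimal illustration, not code from the notes: the class name, the square labels A/B, and the action names Left/Right/Suck are assumptions chosen to match the standard two-square version of the example.

```python
LOCATIONS = ("A", "B")  # assumed: the classic two-square world

class VacuumEnvironment:
    def __init__(self, dirt=None, agent_location="A"):
        # dirt maps each square to True (dirty) or False (clean)
        self.dirt = dict(dirt) if dirt else {loc: True for loc in LOCATIONS}
        self.agent_location = agent_location

    def percept(self):
        """What the agent's sensors report: its square and that square's dirt status."""
        return (self.agent_location, self.dirt[self.agent_location])

    def execute(self, action):
        """Apply the agent's chosen effector action to the world."""
        if action == "Suck":
            self.dirt[self.agent_location] = False
        elif action == "Left":
            self.agent_location = "A"
        elif action == "Right":
            self.agent_location = "B"

env = VacuumEnvironment()
print(env.percept())  # ('A', True): starts at A, all squares dirty by default
```

The percept/action split mirrors the sensor/effector definition above: the agent only ever sees what `percept()` returns and only ever changes the world through `execute()`.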
Deterministic vs. stochastic
Deterministic
The next state of the environment is completely determined by the current state and the agent's action.
Example: our vacuum-cleaner agent.
Stochastic
Not deterministic: the next state cannot be fully predicted from the current state and the agent's action.
Example: the real world (e.g., driving a car).
Adding uncertainty (stochasticity) to the vacuum agent:
Random appearance of dirt.
Imperfect reliability of the Suck action.
Simple reflex agents
Select their action according to the current percept only (no memory).
Example: the vacuum-cleaner agent:
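A simple reflex agent for the vacuum world is just a set of condition-action rules over the current percept, with no stored state. A minimal sketch, assuming the percept is a `(location, dirty)` pair as in the two-square world:

```python
def simple_reflex_vacuum_agent(percept):
    """Condition-action rules on the current percept only (no memory)."""
    location, dirty = percept
    if dirty:
        return "Suck"           # rule 1: if the current square is dirty, clean it
    return "Right" if location == "A" else "Left"  # rule 2: otherwise move on

print(simple_reflex_vacuum_agent(("A", True)))   # Suck
print(simple_reflex_vacuum_agent(("A", False)))  # Right
```

Because the agent has no memory, it cannot know whether the other square is already clean, so it keeps shuttling between A and B forever; this limitation is what motivates the model-based agents below.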
Simple reflex agents: global architecture
Model-based reflex agents (internal state)
Goal-based agents (internal state)
Utility-based agents (internal state)
Learning agents (internal state)
Project: Implementing the vacuum-cleaner
Part 1: Building a flexible environment
Implement a simple reflex agent for the vacuum environment. Run the
environment with this agent for all possible initial dirt configurations and
agent locations. Record the performance score for each configuration and the
overall average score.
The vacuum environments in the preceding parts have all been deterministic.
Develop and discuss agent programs for each of the following stochastic
versions:
Murphy's law: 25% of the time, the Suck action fails to clean the floor if it is dirty and
deposits dirt onto the floor if the floor is clean. How is your program affected if the dirt
sensor gives the wrong answer 10% of the time?
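The stated failure rates (25% for Suck, 10% for the sensor) can be modeled directly. The mitigation shown, repeating Suck while a majority vote of sensor readings still reports dirt, is one possible design, not the only answer to the exercise; the function names are illustrative.

```python
import random

rng = random.Random(0)  # seeded so runs are reproducible

def murphy_suck(dirt, location):
    """Suck under Murphy's law: 25% of the time it fails, leaving the square
    dirty (or depositing dirt on a clean square)."""
    if rng.random() < 0.25:
        dirt[location] = True
    else:
        dirt[location] = False

def noisy_dirt_sensor(dirt, location, error_rate=0.10):
    """Dirt sensor that gives the wrong answer 10% of the time."""
    reading = dirt[location]
    return (not reading) if rng.random() < error_rate else reading

def majority_dirty(dirt, location, samples=3):
    """One mitigation for the noisy sensor: take several readings and vote."""
    votes = sum(noisy_dirt_sensor(dirt, location) for _ in range(samples))
    return 2 * votes > samples

# Because Suck can fail, the agent should keep sucking while the (voted)
# sensor still reports dirt, rather than assume a single Suck succeeded.
dirt = {"A": True, "B": False}
while majority_dirty(dirt, "A"):
    murphy_suck(dirt, "A")
```

The key effect on the agent program: a simple reflex agent that trusts one percept will sometimes leave dirt behind or "clean" squares that were already clean, so repeated sensing and repeated action become worthwhile.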
Small children: At each time step, each clean square has a 10% chance of becoming dirty.
Can you come up with a rational agent design for this case?
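The dirt-regeneration process for this case can be sketched as follows. The design comment it illustrates is one reasonable answer, under the assumption that the performance measure does not charge for movement; if moves cost points, an agent that idles until dirt is likely can do better.

```python
import random

rng = random.Random(42)  # seeded for reproducibility

def children_step(dirt, p=0.10):
    """At each time step, each clean square has a 10% chance of becoming dirty."""
    for square, is_dirty in dirt.items():
        if not is_dirty and rng.random() < p:
            dirt[square] = True

# Because dirt keeps reappearing, a rational agent never "finishes": it should
# keep patrolling and sucking forever. A simple reflex agent already does this
# (assuming moves are free under the performance measure).
def patrol_agent(percept):
    location, dirty = percept
    if dirty:
        return "Suck"
    return "Right" if location == "A" else "Left"

dirt = {"A": False, "B": False}
for _ in range(10):
    children_step(dirt)  # dirt accumulates on squares the agent isn't cleaning
```

The contrast with Part 1 is the point of the exercise: in a deterministic world the agent could stop once everything is clean, but here stopping is never rational because the expected future dirt is always positive.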