problems.multi_object_search.env package

Submodules

problems.multi_object_search.env.env module

The Environment

class problems.multi_object_search.env.env.MosEnvironment(dim, init_state, sensors, obstacles={})[source]

Bases: Environment

property robot_ids
state_transition(self, action, execute=True, **kwargs)[source]

Overrides the parent class function. Simulates a state transition given action. If execute is True, the resulting state becomes the new current state of the environment.

Parameters:
  • action (Action) – action that triggers the state transition

  • execute (bool) – If True, the resulting state of the transition will become the current state.

Returns:

The reward resulting from action and the state transition, if execute is True; (next_state, reward) if execute is False.

Return type:

float or tuple
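
A minimal usage sketch, assuming env is an existing MosEnvironment (e.g. built with interpret, documented below) and action is an Action instance from the problem's domain module, which is not part of this package:

# env: MosEnvironment, action: Action -- both assumed to exist already
next_state, reward = env.state_transition(action, execute=False)  # simulate only; env is unchanged
reward = env.state_transition(action, execute=True)               # commit: env advances to the new state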

problems.multi_object_search.env.env.interpret(worldstr)[source]

Interprets a problem instance description in worldstr and returns the corresponding MosEnvironment.

For example: This string

rx...
.x.xT
.....
***
r: laser fov=90 min_range=1 max_range=10

describes a 3 by 5 world where x indicates obstacles and T indicates the “target object”. T can be replaced by any upper-case letter A-Z, which serves as the object’s id. Lower-case letters a-z (except for x) serve as ids for robots.

After the world, the *** line signals the description of the sensor for each robot. For example, “r: laser fov=90 min_range=1 max_range=10” means that robot r has a Laser2Dsensor with fov 90, min_range 1.0, and max_range 10.0.

Parameters:

worldstr (str) – a string that describes the initial state of the world.

Returns:

the corresponding environment for the world description.

Return type:

MosEnvironment
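
A usage sketch based on the example above (the import path is taken from this module's name and may differ depending on how the package is installed):

from problems.multi_object_search.env.env import interpret

worldstr = ("rx...\n"
            ".x.xT\n"
            ".....\n"
            "***\n"
            "r: laser fov=90 min_range=1 max_range=10\n")
env = interpret(worldstr)
print(env.robot_ids)  # ids of the robots parsed from the map (here, just robot 'r')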

problems.multi_object_search.env.env.interpret_robot_id(robot_name)[source]
problems.multi_object_search.env.env.equip_sensors(worldmap, sensors)[source]
Parameters:
  • worldmap (str) – a string that describes the initial state of the world.

  • sensors (dict) – a map from a robot's character representation (e.g. 'r') to a string that describes its sensor (e.g. 'laser fov=90 min_range=1 max_range=5 angle_increment=5')

Returns:

A string that can be used as input to the interpret function

Return type:

str
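
A sketch of combining equip_sensors with interpret (same assumed import path as above); the worldmap contains only the grid, and equip_sensors appends the sensor section:

from problems.multi_object_search.env.env import equip_sensors, interpret

worldmap = ("rx...\n"
            ".x.xT\n"
            ".....\n")
worldstr = equip_sensors(worldmap, {"r": "laser fov=90 min_range=1 max_range=5 angle_increment=5"})
env = interpret(worldstr)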

problems.multi_object_search.env.env.make_laser_sensor(fov, dist_range, angle_increment, occlusion)[source]

Returns string representation of the laser scanner configuration. For example: “laser fov=90 min_range=1 max_range=10”

Parameters:
  • fov (int or float) – angle between the start and end beams of one scan (degrees).

  • dist_range (tuple) – (min_range, max_range)

  • angle_increment (int or float) – angular distance between measurements (radians).

  • occlusion (bool) – True if occlusion should be considered

Returns:

String representation of the laser scanner configuration.

Return type:

str
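
A sketch of building the sensor string programmatically instead of writing it by hand; the angle_increment value here is an arbitrary example:

from problems.multi_object_search.env.env import make_laser_sensor

laser_str = make_laser_sensor(90, (1, 10), 0.5, False)
# laser_str can then be passed to equip_sensors, e.g. equip_sensors(worldmap, {"r": laser_str})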

problems.multi_object_search.env.env.make_proximity_sensor(radius, occlusion)[source]

Returns string representation of the proximity sensor configuration. For example: “proximity radius=5 occlusion_enabled=False”

Parameters:
  • radius (int or float) – sensing radius of the proximity sensor.

  • occlusion (bool) – True if occlusion should be considered

Returns:

String representation of the proximity sensor configuration.

Return type:

str
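
A corresponding sketch for the proximity sensor (same assumed import path as above):

from problems.multi_object_search.env.env import make_proximity_sensor

proximity_str = make_proximity_sensor(5, False)
# e.g. "proximity radius=5 occlusion_enabled=False", usable with equip_sensors as above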

problems.multi_object_search.env.visual module

problems.multi_object_search.env.visual.object_color(objid, count)[source]
class problems.multi_object_search.env.visual.MosViz(env, res=30, fps=30, controllable=False)[source]

Bases: object

property img_width
property img_height
property last_observation
update(robot_id, action, observation, viz_observation, belief)[source]

Update the visualization after a new real action and observation have been received and the belief has been updated.

static draw_robot(img, x, y, th, size, color=(255, 12, 12))[source]
static draw_observation(img, z, rx, ry, rth, r, size, color=(12, 12, 255))[source]
static draw_belief(img, belief, r, size, target_colors)[source]

Parameters:

belief (OOBelief)

on_init()[source]

pygame init

on_event(event)[source]
on_loop()[source]
on_render()[source]
on_cleanup()[source]
on_execute()[source]
render_env(display_surf)[source]
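
A minimal sketch, assuming env is a MosEnvironment (e.g. from interpret()) and that on_execute() runs the pygame loop implied by the on_init/on_event/on_loop/on_render/on_cleanup methods above:

from problems.multi_object_search.env.visual import MosViz

viz = MosViz(env, res=30, fps=30, controllable=False)
viz.on_execute()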
problems.multi_object_search.env.visual.unittest()[source]

Module contents