On Fri, March 16, 2007 2:12 am, belinda thom said:
> Hi again,
>
> I'm on a roll :-).
>
> The range info re: distance() vs value was quite enlightening.
>
> Now I'm wondering more about the light sensors.
>
> I discovered
>
>    dir(robot.light[0])
>
> which led me to try out things like:
>
> robot.light[0].angles()
>
> (both lights are angle 0---what does this mean? how can two light
> sensors both have the same angle?)

.angles() should return the angle that the sensor is pointing with respect
to the robot. If two sensors are pointing in the same direction, they will
both report the same angle. Zero should be straight ahead.

> and
>
> robot.light[0].hits,
>
> which returned a pair of 3-tuples. What does it mean for a light to
> have hits? I must also confess that I'm confused about what hits
> means wrt range as well. The Sensor page says:

.hits is only meaningful for range sensors, so lights shouldn't have this
attribute, or it should return data that indicates that it is meaningless.

> --------
> One could ask the robot to tell you where it hit something. For
> example, you could write robot.range[2].hit() which will return the
> (x,y,z) of the hit:
>
> robot.range[3].hit # returns x, y, z
> and the geometry of the originating ray of the sensor:
>
> robot.range[3].geometry  # returns x, y, z, theta in radians, arc width
>
> ---------
>
> but I'm unsure what this means (the robots I'm using are ones I
> inherited from Lisa's asst, there doesn't appear to be any bumper
> sensors on these robots). I would have thought hits referred to
> locations that the robot is currently touching something. (My lack of
> familiarity w/real robots here, in addition to conflation in terms,
> is likely the problem; nonetheless I hope others might benefit from
> these basic questions, which is why I'm sending them to the list.)

Yes, that text is more confusing than helpful. Here, "hit" refers to the
ray that extends from the range sensor until it intersects with an object
in the world. So, the (x,y,z) tuple is the location (in local coordinates)
where the sensor detected an object.
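To make that concrete, here is a toy sketch of how such a local hit point could be computed from a sensor's mounting geometry and a range reading. The function name and signature are made up for illustration; this mirrors the idea, not pyrobot's actual implementation:

```python
import math

def local_hit(sensor_x, sensor_y, sensor_theta, distance):
    """Point where a ray from a sensor at (sensor_x, sensor_y), pointing
    at sensor_theta radians in the robot's frame, intersects an object
    `distance` units away -- all in robot-local coordinates."""
    return (sensor_x + distance * math.cos(sensor_theta),
            sensor_y + distance * math.sin(sensor_theta))

# A sensor on the robot's nose, pointing straight ahead, reading 2.0:
print(local_hit(0.2, 0.0, 0.0, 2.0))  # -> (2.2, 0.0)
```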

There is an LPSWander brain (for "local perceptual space") that graphically
displays the local "hits", and also a "global perceptual space" brain that
translates the local coordinates into global coordinates to build a map
via local occupancy "hits". (For this brain, I think you have to drive the
robot with the joystick. This global map works too well, because there is
no noise in the dead reckoning. Adding this kind of noise would make one
appreciate algorithms like SLAM and Monte Carlo localization (MCL) much
more.)
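The local-to-global translation that such a brain performs can be sketched as follows. This is a toy version assuming a 2-D dead-reckoned pose (x, y, theta); the optional noise parameter shows why a perfect global map is unrealistic with real odometry:

```python
import math
import random

def local_to_global(pose, local_point, noise=0.0):
    """Transform a hit point from robot-local coordinates into world
    coordinates, given the robot's dead-reckoned pose (x, y, theta).
    `noise` optionally perturbs the pose, as real odometry drift would."""
    x, y, theta = pose
    if noise:
        x += random.gauss(0, noise)
        y += random.gauss(0, noise)
        theta += random.gauss(0, noise)
    lx, ly = local_point
    # Standard 2-D rotation by theta, then translation by (x, y):
    gx = x + lx * math.cos(theta) - ly * math.sin(theta)
    gy = y + lx * math.sin(theta) + ly * math.cos(theta)
    return (gx, gy)

# Robot at (5, 3) facing +y (theta = pi/2); a hit 2 units straight ahead
# lands at roughly (5, 5) in world coordinates:
print(local_to_global((5.0, 3.0, math.pi / 2), (2.0, 0.0)))
```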

> There was not much discussion on the sensor pages about lights, so
> I'm hoping you can provide me some pointers to additional information.

There isn't much more to light sensors. They just detect how much light is
coming into the sensor. You can also click "View" on the device line when
"light[0]" is selected and see values in real time. You'll notice a
light[0][0].rgb tuple. This acts like a 1-pixel camera indicating the
amount of red,green,blue light falling on the sensor. Try the Fireflies
world in the Pyrobot simulator.
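As an illustration of what one might do with that rgb tuple, here is a simple brightness calculation. It assumes each component is normalized to 0.0-1.0, which may differ from the simulator's actual value range:

```python
def brightness(rgb):
    """Average the red, green, and blue components of a 1-pixel
    'camera' reading (each assumed to be in the range 0.0-1.0)."""
    r, g, b = rgb
    return (r + g + b) / 3.0

print(brightness((1.0, 1.0, 1.0)))  # full white light -> 1.0
print(brightness((0.6, 0.3, 0.0)))  # a dimmer, orange-ish glow
```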

You can also see what methods and properties objects have by clicking
Robot -> View from the menu. It lists the methods and the first line of
each docstring from the Python file.

(These questions are great; we just need to take them and their answers
and put them into the documentation.)

-Doug

> Many thanks,
>
> --b
> _______________________________________________
> Pyro-users mailing list
> [email protected]
> http://emergent.brynmawr.edu/mailman/listinfo/pyro-users
>


-- 
Douglas S. Blank
Associate Professor, Bryn Mawr College
http://cs.brynmawr.edu/~dblank/
Office: 610 526 6501
