Lecture 2 and lab visit after lecture 2

Computer vision requires that we fundamentally understand what a camera measures and how it responds to light:

Above, we see the effects of a linear array of lights on the human face.

Above, we see the photographic exposure of a circular array of lights.

Above, we see the photographic exposure of a Nanoleaf node (a triangular pixel).

Above, we see the photographic exposure of a white LED whose light is actually a sequence of red, green, and blue illuminations.

Above, we see a linear array of LEDs. We can understand imaging and computer vision as slices of the spacetime continuum; a small slit-scan sketch of this idea appears after the reading note below.
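
The exposures above all concern how a camera responds to light: the pixel value a camera records is typically not the quantity of light itself but some response function of it. The sketch below is a minimal illustration of that point from the opening sentence of this section; it is not taken from the course materials, and the gamma-like response f(q) = q**(1/2.2), the exponent, and the function names are assumptions chosen purely for illustration.

import numpy as np

def f(q, gamma=2.2):
    """Assumed camera response: quantity of light q in [0, 1] -> pixel value in [0, 255]."""
    return 255.0 * np.clip(q, 0.0, 1.0) ** (1.0 / gamma)

def f_inverse(pixel, gamma=2.2):
    """Recover the relative quantity of light from a pixel value, given the response."""
    return (pixel / 255.0) ** gamma

q = 0.2                                        # some quantity of light reaching a pixel
print(f(2 * q) / f(q))                         # about 1.37: doubling the light does not double the pixel value
print(f_inverse(f(2 * q)) / f_inverse(f(q)))   # about 2.0: inverting the response recovers the linear ratio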

Please read Chapter 1 before lab 1 and try to formulate some questions about anything you don't understand.
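
Returning to the spacetime-continuum remark above: a video can be treated as a three-dimensional block with axes (time, row, column). An ordinary photograph is a slice of that block at a single instant, and slicing instead along a fixed row gives a slit-scan image showing how that one row evolves over time, much like the streaks a moving linear array of LEDs traces out. The sketch below uses a synthetic moving bright spot; the array sizes and variable names are mine, for illustration only.

import numpy as np

T, H, W = 100, 64, 64                    # frames, rows, columns
video = np.zeros((T, H, W))              # the spacetime block, initially all dark

for t in range(T):                       # a bright spot drifting to the right over time
    video[t, H // 2, int(W * t / T)] = 1.0

frame = video[T // 2]                    # constant-time slice: an ordinary image, shape (H, W)
slit_scan = video[:, H // 2]             # constant-row slice: row versus time, shape (T, W)

print(frame.shape, slit_scan.shape)      # (64, 64) (100, 64)
print(np.argmax(slit_scan, axis=1)[:5])  # [0 0 1 1 2]: the spot's column advances with time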

Seeing and photographing radio waves and sound waves

Seeing and photographing radio waves

Seeing and photographing sound waves


Augmented Reality robotics (ARbotics) allows us to see the effects of hand movement on the sound waves from a robotically actuated violin. Moving the hands near the sound waves has a slight but noticeable effect on them. Sound waves also bounce off nearby objects, and we can see and understand the patterns they form when they do.
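
To make those reflection patterns concrete, here is a small idealized sketch, my own simplification rather than the ARbotics setup: a single-frequency sound wave meets an idealized reflector at x = 0, superposes with its own reflection, and the result is a stationary pattern whose nodes repeat every half wavelength. The 1 kHz frequency, the boundary condition, and the sampling are all assumptions for illustration.

import numpy as np

c = 343.0                                   # speed of sound in air, m/s
freq = 1000.0                               # a 1 kHz tone (assumed)
lam = c / freq                              # wavelength, about 0.34 m
k = 2 * np.pi / lam                         # wavenumber
w = 2 * np.pi * freq                        # angular frequency

x = np.linspace(0.0, 2 * lam, 9)[:, None]   # positions in front of the reflector at x = 0
t = np.linspace(0.0, 2.0 / freq, 200)       # two periods of time samples

incident = np.sin(k * x + w * t)            # unit-amplitude wave travelling toward x = 0
reflected = np.sin(k * x - w * t)           # idealized reflection travelling back out
total = incident + reflected                # equals 2 sin(kx) cos(wt): a standing wave

envelope = np.abs(total).max(axis=1)        # peak amplitude observed at each position
print(np.round(envelope, 2))                # roughly [0 2 0 2 ...]: nodes every half wavelength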

Interactive Augmented and Virtual Reality robotics for AI and for HI (HuMachine Learning).

Reading assignment: start with equation 1 on page 1411 of this paper.