Big Idea 1: Perception
Think About It...
How can a computer see, hear, smell, taste or touch?
Those are the five human senses and in this lesson we'll be talking about perception as it relates to AI.
Now that we are aware of historical pioneers and the impact of modern-day innovators, let’s dive into aspects of how AI works to see how it is all connected. In the first module for this unit we introduced the Five Big Ideas of AI. In this lesson we will focus on Big Idea 1: Perception.
Video Lesson - Five Big Ideas of AI Review
Re-watch the following video to review the Five Big Ideas of AI.
Video Credit: Georgia Public Broadcasting
Perception with AI
Artificial Intelligence for Georgia (AI4GA.org) defines perception as the extraction of meaning from sensory signals. Remember the basis of the Turing Test: “if a machine could display intelligent behavior just like a human, then the machine could be considered intelligent.”
Let’s think for a moment: how do humans perceive? What senses do we use in our perceptions? As humans, we see with our eyes and hear with our ears. Computers use input devices, such as cameras to see and microphones to hear. One example of a computer’s ability to perceive is our everyday use of face filters in apps and facial recognition.
Autonomous Vehicles and Robots
Autonomous vehicles and robots are equipped with impressive computers, and as AI continues to advance, we should expect to see more of both. Remember that autonomous means that the device is:
- self-determined
- aware of the world
- self-governed
- able to set goals and choose actions to achieve those goals
- able to operate independently without being remotely controlled (AI4GA)
How do you think autonomous vehicles and robots sense the world? Today, they do so using the following technologies:
- Cameras - devices that can detect visible light only
- LIDAR - Light Detection and Ranging. Similar to radar, LIDAR detects objects by bouncing a spot of laser light off them and measuring the time it takes the light to be reflected back. LIDAR cannot see color.
- Radar - Radio Detection and Ranging. A technique for detecting objects by bouncing radio signals off them and measuring the time it takes for the signal to return. This gives the distance to the object. Knowing where the antenna was pointed when the signal was sent gives the bearing to the object. Radar was originally used to detect aircraft, but today it is also used by self-driving cars to detect obstacles. Radio waves aren’t affected by fog, so radar can "see" in low-visibility conditions where regular cameras cannot.
- SONAR - Sound Navigation and Ranging. Sonar works similarly to radar except it uses sound waves instead of radio waves. By sending out a pulse of high-frequency sound and measuring the time it takes for the sound to be reflected back, sonar can measure the distance to an object. Sonar is used by submarines to detect other vessels under water, but it also works in air. Self-driving cars may use sonar to measure the distance to other cars when parallel parking. Dolphins and bats also use sonar.
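All three ranging sensors above share the same basic math: send a signal, time the echo, and halve the round trip because the signal travels out and back. The short sketch below illustrates this idea; the function name and the example echo times are our own illustration, not taken from any particular sensor's specification.

```python
# Sketch: estimating distance from a time-of-flight echo, as used by
# radar, LIDAR (speed of light), and sonar (speed of sound).
SPEED_OF_LIGHT = 299_792_458.0   # meters per second (radar and LIDAR)
SPEED_OF_SOUND = 343.0           # meters per second in air at ~20 °C (sonar)

def distance_from_echo(round_trip_seconds: float, wave_speed: float) -> float:
    """Distance to the object: the signal goes out and comes back,
    so we divide the round-trip time by two."""
    return wave_speed * round_trip_seconds / 2

# A LIDAR pulse that returns after 200 nanoseconds:
print(round(distance_from_echo(200e-9, SPEED_OF_LIGHT), 1))  # about 30 m
# A sonar ping that returns after 10 milliseconds:
print(round(distance_from_echo(10e-3, SPEED_OF_SOUND), 2))   # about 1.72 m
```

Notice why radar and LIDAR need extremely precise clocks: light covers 30 meters in just 200 nanoseconds, while sound takes a comparatively leisurely 10 milliseconds to cover a couple of meters.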
Take a Ride in an Autonomous Vehicle
The following is a video that shows what it is like to ride in an autonomous vehicle. As you watch, pay attention to the various sensors that are highlighted.
Video Source and Copyright: This video was produced by Waymo (waymo.com). We are sharing this video under Fair Use laws. Waymo, the original creator of the video, retains all rights and trademarks presented in this video. This video is in no way an endorsement of Waymo, but is being used for educational purposes only.
CC BY-NC-SA 4.0 UNLESS OTHERWISE NOTED | IMAGES: LICENSED AND USED ACCORDING TO TERMS OF SUBSCRIPTION - INTENDED ONLY FOR USE WITHIN LESSON.
elenabsl/Shutterstock.com. Image used under license from Shutterstock.com and may not be repurposed.