Unit: Robotics Peripheral Vision

Figure: A computer screen with a human eye pictured on it. Students study human vision and computer programming simulation. (Image copyright © 2004 Microsoft Corporation. All rights reserved.)

Summary

This unit is designed for advanced programming classes. It leads students through a study of human vision and computer programming simulation. Students apply their previous knowledge of arrays and looping structures to implement a new concept of linked lists and RGB decomposition in order to solve the unit's Grand Challenge: writing a program to simulate peripheral vision by merging two images. This unit connects computer science to engineering by incorporating several science topics (eye anatomy, physics of light and color, mathematics, and science of computers) and guides students through the design process in order to create final simulations.
This engineering curriculum aligns to Next Generation Science Standards (NGSS).

Engineering Connection

Computer programming is becoming an essential part of engineering; the ability to program can greatly impact many different fields. Robotics is one such field, and it is examined in this unit. This unit ties together the concepts of robotics, general optics and computer programming. After finishing this unit, students should have gained a broad understanding of peripheral vision as well as an appreciation for the field of computer vision.

Unit Overview

This three-lesson "legacy cycle" unit is structured with a contextually-based Grand Challenge followed by a sequence of instruction in which students first offer initial predictions (Generate Ideas) and then gather information from multiple sources (Multiple Perspectives). This is followed by the Research and Revise phase, as students integrate and extend their knowledge through a variety of learning activities. The cycle concludes with formative (Test Your Mettle) and summative (Go Public) assessments that lead students towards answering the challenge question. Research and concepts behind this way of learning may be found in How People Learn (Bransford, Brown & Cocking, National Academy Press, 2000); see the entire text at https://www.nap.edu/read/9853/chapter/1

The legacy cycle is similar to the engineering design process in that they both involve identifying an existing societal need, applying science and math concepts and knowledge to develop solutions, and using the research conclusions to design a clearly conceived solution to the original challenge. The aim of both the engineering design process and the legacy cycle is to generate correct and accurate solutions, although the approaches vary somewhat in how a solution is devised and presented. See an overview of the engineering design process at https://www.teachengineering.org/populartopics/designprocess

In Lesson 1, The Grand Challenge: Simulating Human Vision, students are prompted to brainstorm answers to the following Grand Challenge: "The Wall-e robotics firm thinks it has a unique and novel solution for getting a broader spectrum of usable data. Instead of using a single camera, they have mounted two cameras at different focal lengths on top of the robot. The first camera provides an up-close and detailed image (however, this image lacks surrounding data) and the second camera provides a broader view with less detail. However, they need this data to be usable by humans. Right now a human must look at two separate pictures. Could you somehow combine those images to simulate how a human's vision would focus in and out of the two focal lengths? How would you accomplish this task?" Then, students enter the Research and Revise step, focusing on how human vision is different from that of a camera. Students complete the Peripheral Vision Lab activity, which helps them see the limitations of peripheral vision on robots using camera lenses. Students also see how the focal length of lenses on cameras affects the field of view for robots.
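
As an illustrative aside (not part of the published lesson materials), the trade-off students observe between focal length and field of view can be sketched with the thin-lens approximation below; the sensor width and focal lengths are hypothetical example values.

// A minimal sketch (not from the unit): approximate horizontal field of view of a
// camera from the thin-lens relation FOV = 2 * atan(w / (2 * f)), where w is the
// sensor width and f is the focal length.
#include <cmath>
#include <cstdio>

const double kPi = 3.14159265358979323846;

double fieldOfViewDegrees(double sensorWidthMm, double focalLengthMm) {
    return 2.0 * std::atan(sensorWidthMm / (2.0 * focalLengthMm)) * 180.0 / kPi;
}

int main() {
    // Hypothetical values: a 36 mm-wide sensor with a short and a long lens.
    std::printf("Wide lens  (18 mm): %5.1f degrees\n", fieldOfViewDegrees(36.0, 18.0));   // broad view, less detail
    std::printf("Long lens (100 mm): %5.1f degrees\n", fieldOfViewDegrees(36.0, 100.0));  // narrow view, more detail
    return 0;
}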

In Lesson 2, What Makes Up a Color?, and its associated activity, RGB to Hex Conversions, students return to the Research and Revise step for further learning. As part of the lesson and activity, teacher instruction and example problems on computer image composition and RGB conversions are provided. Students gain the skills necessary to make the required calculations and get ample practice with them in the activity.
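
To give a flavor of the calculations practiced in the RGB to Hex Conversions activity, here is a minimal sketch (an illustration, not the unit's provided code) that packs red, green and blue channel values into a single number, prints it in hexadecimal, and decomposes it again.

// Sketch (not from the unit): decompose and recombine RGB color components.
#include <cstdio>

int main() {
    int red = 255, green = 128, blue = 64;          // each channel is 0-255

    // Pack the three channels into one 24-bit value: 0xRRGGBB.
    int packed = (red << 16) | (green << 8) | blue;
    std::printf("RGB(%d, %d, %d) = #%06X\n", red, green, blue, packed);

    // Decompose the packed value back into its channels.
    int r = (packed >> 16) & 0xFF;
    int g = (packed >> 8) & 0xFF;
    int b = packed & 0xFF;
    std::printf("Recovered channels: %d %d %d\n", r, g, b);
    return 0;
}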

In Lesson 3, How Do You Store All This Data?, the Research and Revise step comes to a close as students are instructed on how two-dimensional arrays work and how the vector class allows programmers to use the same concept but with a dynamic container.
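
The contrast the lesson draws between a fixed two-dimensional array and a dynamic container can be illustrated roughly as follows; this sketch assumes C++ and std::vector, which may differ from the exact language and class used in the course.

// Sketch (not from the unit): a fixed 2-D array versus a resizable vector of vectors.
#include <vector>
#include <cstdio>

int main() {
    // Fixed-size 2-D array: dimensions must be known at compile time.
    int fixedImage[2][3] = { {1, 2, 3}, {4, 5, 6} };

    // Vector of vectors: dimensions can be chosen (and changed) at run time.
    int rows = 2, cols = 3;
    std::vector<std::vector<int>> dynamicImage(rows, std::vector<int>(cols, 0));
    dynamicImage[1][2] = fixedImage[1][2];   // elements are indexed the same way

    std::printf("dynamicImage[1][2] = %d\n", dynamicImage[1][2]);
    dynamicImage.push_back(std::vector<int>(cols, 0));  // grow by one row at run time
    std::printf("rows now: %zu\n", dynamicImage.size());
    return 0;
}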

In the final activity, Putting It All Together, students are given six days in the computer lab to write code in order to answer the Grand Challenge question that was posed in Lesson 1. The teacher spends time instructing and guiding them through this process. Students must use their knowledge of how the human eye sees to determine which digital image models human vision at each location in the viewing area. They then apply their knowledge of data storage and pixel combination or averaging to store the data from those images appropriately before combining them into a final simulation.
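
To give a rough sense of the pixel averaging involved in merging the two camera images, the sketch below blends two tiny made-up "images" by averaging corresponding RGB values; the Pixel struct and average function are illustrative names only, and the actual student programs operate on full images captured by the robot's cameras.

// Sketch (not from the unit): blend two same-sized images by averaging the
// RGB channels of corresponding pixels.
#include <vector>
#include <cstdio>

struct Pixel { int r, g, b; };

Pixel average(const Pixel& a, const Pixel& b) {
    return { (a.r + b.r) / 2, (a.g + b.g) / 2, (a.b + b.b) / 2 };
}

int main() {
    // Hypothetical 2x2 "images": one detailed (close-up), one broad (wide view).
    std::vector<std::vector<Pixel>> closeUp  = { { {255, 0, 0}, {0, 255, 0} },
                                                 { {0, 0, 255}, {255, 255, 0} } };
    std::vector<std::vector<Pixel>> wideView = { { {10, 10, 10}, {20, 20, 20} },
                                                 { {30, 30, 30}, {40, 40, 40} } };

    std::vector<std::vector<Pixel>> merged(2, std::vector<Pixel>(2));
    for (int row = 0; row < 2; ++row)
        for (int col = 0; col < 2; ++col)
            merged[row][col] = average(closeUp[row][col], wideView[row][col]);

    std::printf("merged[0][0] = (%d, %d, %d)\n",
                merged[0][0].r, merged[0][0].g, merged[0][0].b);
    return 0;
}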

In sum, this unit connects computer science to engineering by incorporating several science topics (eye anatomy, physics of light and color, mathematics, and science of computers) and guides students through the design process to create final simulations.

Educational Standards

Each TeachEngineering lesson or activity is correlated to one or more K-12 science, technology, engineering or math (STEM) educational standards.

All 100,000+ K-12 STEM standards covered in TeachEngineering are collected, maintained and packaged by the Achievement Standards Network (ASN), a project of D2L (www.achievementstandards.org).

In the ASN, standards are hierarchically structured: first by source (e.g., by state); within source by type (e.g., science or mathematics); within type by subtype; then by grade, etc.

See individual lessons and activities for standards alignment.

Unit Schedule

Worksheets and Attachments

Visit [www.teachengineering.org/curricularunits/view/van_robotic_vision_curricularunit] to print or download.

More Curriculum Like This

High School Lesson
The Grand Challenge: Simulating Human Vision

Students are introduced to the Robotics Peripheral Vision Grand Challenge question. They are asked to write journal responses to the question and brainstorm what information they require in order to answer the question. Students draw a basis for the average peripheral vision of humans and then compa...

High School Lesson
What Makes Up a Color?

As a part of the research and revise step of the Legacy Cycle, this lesson provides students with information they will need later on to be able to average pixels to simulate blurring in the peripheral plane of vision. Students learn why image color becomes important as we distort the outer boundari...

High School Activity
Putting It All Together: Programming Robotics Peripheral Vision

In this culminating activity of the unit, students bring together everything they've learned in order to write the code to solve the Grand Challenge. The code solution takes two images captured by robots and combines them to create an image that can be focused at different distances, similar to the ...

High School Lesson
How Do You Store All This Data?

During this lesson, students start to see the data structure they will use to store their images, towards finding a solution to this unit's Grand Challenge. Students are introduced to two-dimensional arrays and vector classes.

Assessment

The final activity, Putting It All Together, includes the final Go Public phase of the legacy cycle in which students are prompted to apply the concepts they have learned in order to answer the Grand Challenge. This gives students the opportunity to present their code and demonstrate the capabilities of their programs.

Copyright

© 2013 by Regents of the University of Colorado; original © 2010 Vanderbilt University

Contributors

Mark Gonyea; Anna Goncharova

Supporting Program

VU Bioengineering RET Program, School of Engineering, Vanderbilt University

Acknowledgements

The contents of this digital library curriculum were developed under National Science Foundation RET grants no. 0338092 and 0742871. However, these contents do not necessarily represent the policies of the NSF, and you should not assume endorsement by the federal government.

Last modified: February 13, 2024
