One of the most popular video games on the market today is called “Guitar Hero.” Players strum a guitar and try to correctly respond to each note as it flashes across a TV screen.
Pete Nikrin, an engineering student at Minnesota West Community and Technical College in Worthington, MN, recently developed a robot that plays the game using advanced vision technology from Banner Engineering Corp. Bill Manor, a robotics instructor at Minnesota West, urged Nikrin to use a PresencePLUS P4 OMNI vision sensor with a right-angle lens.
Manor had the vision system on hand because the school had acquired it for educational purposes. “Students have used Banner vision sensors in many projects over the years,” says Manor. “For instance, they use them to inspect containers as they come down a conveyor.”
To develop his Guitar Hero robot, Nikrin used a mannequin and installed the camera lens as the robot’s left eye, positioned toward the TV screen. The robot, named “Roxanne,” identified the notes to be played by using an Edge vision tool, which detects, counts and locates the transition between bright and dark pixels in an image area.
“We set up five Edge tools that ran horizontally across the screen, one for every fret, and positioned the tools to focus on the notes at the bottom of each,” says Nikrin. “The Edge tools sent a constant signal as the five vertical fret lines progressed, and when a bright white dot appeared in the middle of a dark-colored circle, the tool allowed the sensor to detect it.”
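Conceptually, an edge tool of this kind scans a line of pixels and reports where brightness crosses from dark to bright. A minimal sketch of that idea, in Python with invented pixel values and an assumed brightness threshold (Banner’s actual Edge tool is configured in its own software, not written this way):

```python
THRESHOLD = 128  # assumed cutoff between "dark" and "bright" pixel values

def find_edges(row):
    """Return indices where a row of grayscale pixels goes dark-to-bright."""
    edges = []
    for i in range(1, len(row)):
        if row[i - 1] < THRESHOLD <= row[i]:
            edges.append(i)
    return edges

# A dark fret lane with one bright note dot in the middle:
lane = [20] * 10 + [250] * 4 + [20] * 10
print(find_edges(lane))  # a single dark-to-bright edge at index 10
```

With five such tools, one per fret lane, a bright note dot entering a lane produces exactly the kind of transition the sensor is watching for.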
Jeff Curtis, senior applications engineer at Banner, worked with Nikrin and Manor to ensure the robot’s processing time was fast enough to keep up with the video game. Once a note was identified, communicating that signal efficiently depended on substantial programming, as well as Ethernet communication through a Modbus register.
A PLC was programmed to constantly monitor the vision sensor’s register. When an Edge tool sensed a note, the PLC detected the change in the register, and its logic fired a solenoid that activated the robot’s finger. Just as a human player would react, the finger then pressed down on the appropriate fret button on the guitar. This setup resulted in a total processing time of 9 milliseconds.
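The polling pattern described above can be sketched as follows. This is a hedged illustration only: `read_register` and `fire_solenoid` are hypothetical stand-ins for the real Modbus read and the PLC output, and the bit layout (one bit per fret) is an assumption, not Banner’s documented register map.

```python
def poll_loop(read_register, fire_solenoid, polls):
    """Repeatedly read the sensor's register and fire a solenoid for
    each fret whose bit has just turned on (a rising edge)."""
    previous = 0
    for _ in range(polls):
        current = read_register()      # assumed: one bit per fret, 5 bits
        changed = current & ~previous  # bits that just switched on
        for fret in range(5):
            if changed & (1 << fret):
                fire_solenoid(fret)    # press the matching finger
        previous = current

# Simulated register values over five polls: fret 2 lights up, stays lit,
# then frets 0 and 4 light up together.
readings = iter([0, 0b00100, 0b00100, 0b10001, 0])
fired = []
poll_loop(lambda: next(readings), fired.append, 5)
print(fired)  # → [2, 0, 4]
```

Detecting only rising edges, rather than the level of the bit, keeps a held note from re-firing the solenoid on every poll cycle.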
To ensure consistent, accurate operation, the team had to confirm two things: that Roxanne could play under a range of lighting conditions, since she would be moved from classrooms to gymnasiums for demonstrations, and that the robot stayed correctly oriented with the monitor displaying the video game. The engineers solved this problem with a Locate tool, an edge-based vision tool that finds the absolute or relative position of a target in an image by finding its first edge.
“We honed a Locate tool and gave it a fixed point, a piece of reflective tape on the PC monitor, to focus on,” explains Curtis. “This ensures the Edge tools are in the correct location to detect each note as it comes along, and it allows for any slight vibration in the application environment that could cause some deviation. If the robot starts to sag a bit, for example, it can still play.”
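The compensation Curtis describes amounts to measuring how far the reference mark has moved and shifting every Edge tool by the same amount. A small sketch, with invented coordinates and a made-up calibrated reference position (the real Locate tool does this internally within the sensor):

```python
CALIBRATED_REF = (40, 12)  # assumed mark position when the tools were taught

def relocate_tools(found_ref, tool_positions):
    """Shift every tool position by the reference mark's displacement."""
    dx = found_ref[0] - CALIBRATED_REF[0]
    dy = found_ref[1] - CALIBRATED_REF[1]
    return [(x + dx, y + dy) for x, y in tool_positions]

# The camera has sagged: the reflective tape is found 3 pixels lower
# than when calibrated, so every Edge tool moves down 3 pixels too.
tools = [(100, 200), (140, 200), (180, 200)]
print(relocate_tools((40, 15), tools))  # → [(100, 203), (140, 203), (180, 203)]
```

Because the tape and the on-screen frets move together when the camera shifts, anchoring the Edge tools to the tape keeps them over the notes without recalibration.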
Using this technique, Roxanne at times hit 100 percent accuracy on Medium mode, and she averaged 98 percent for the remainder of Nikrin’s time at Minnesota West. She could achieve up to 95 percent accuracy on Hard mode and 80 percent on Expert mode, where the increased mechanical demands on the robot’s fingers limited her performance.
Today, Roxanne still engages current and prospective Minnesota West engineering students. Nikrin looks back on the project with a sense of accomplishment and a dose of gratitude. “Throughout the process, I was impressed with Banner’s dedication to their products and customers,” he points out. “They went above and beyond to help with a school project, which might seem trivial to some companies.”
For more on vision sensors, visit www.bannerengineering.com or call 888-373-6767.