Smart Object Recognition Algorithm Doesn't Need Humans

First Posted: Jan 17, 2014 04:59 PM EST

If we've learned anything from post-apocalyptic movies, it's that computers eventually become self-aware and try to eliminate humans.

BYU engineer Dah-Jye Lee isn't interested in that development, but he has managed to eliminate the need for humans in the field of object recognition. Lee has created an algorithm that can accurately identify objects in images or video sequences without human calibration.

"In most cases, people are in charge of deciding what features to focus on and they then write the algorithm based off that," said Lee, a professor of electrical and computer engineering. "With our algorithm, we give it a set of images and let the computer decide which features are important."

Not only is Lee's genetic algorithm able to set its own parameters, but it also doesn't need to be reset each time a new object is to be recognized; it learns new objects on its own.

Lee likens the idea to teaching a child the difference between dogs and cats. Instead of trying to explain the difference, we show children images of the animals and they learn on their own to distinguish the two. Lee's object recognition algorithm works the same way: instead of telling the computer which features to look at to distinguish between two objects, the researchers simply feed it a set of images and it learns on its own.

In a study published in the December issue of the academic journal Pattern Recognition, Lee and his students demonstrate both the autonomy and the accuracy of their "ECO features" genetic algorithm.
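The article doesn't spell out the implementation, but the general idea of a genetic algorithm that constructs its own image features can be sketched briefly. The following toy Python example is an illustrative assumption, not the published ECO features method: it evolves short chains of simple image transforms and scores each chain by how well a nearest-centroid classifier separates two synthetic image classes.

```python
# Illustrative sketch of genetic-algorithm feature construction
# (toy transforms, toy data, and classifier are assumptions for demonstration).
import numpy as np

rng = np.random.default_rng(0)

# A small pool of candidate transforms the GA can chain together.
def box_blur(img):
    out = img.copy()
    out[1:-1, 1:-1] = (img[:-2, 1:-1] + img[2:, 1:-1] +
                       img[1:-1, :-2] + img[1:-1, 2:] + img[1:-1, 1:-1]) / 5.0
    return out

def grad_x(img):
    return np.abs(np.diff(img, axis=1, prepend=img[:, :1]))

def grad_y(img):
    return np.abs(np.diff(img, axis=0, prepend=img[:1, :]))

def threshold(img):
    return (img > img.mean()).astype(float)

TRANSFORMS = [box_blur, grad_x, grad_y, threshold]

def apply_chain(chain, img):
    for idx in chain:
        img = TRANSFORMS[idx](img)
    return img

# Fitness: accuracy of a nearest-centroid classifier on the transformed images.
def fitness(chain, images, labels):
    feats = np.array([apply_chain(chain, im).ravel() for im in images])
    centroids = {c: feats[labels == c].mean(axis=0) for c in (0, 1)}
    correct = 0
    for f, y in zip(feats, labels):
        pred = min(centroids, key=lambda c: np.linalg.norm(f - centroids[c]))
        correct += pred == y
    return correct / len(labels)

# Toy data: class 0 = smooth blob images, class 1 = striped images, plus noise.
def make_dataset(n=40, size=16):
    images, labels = [], []
    for i in range(n):
        if i % 2 == 0:
            img = np.zeros((size, size))
            img[4:12, 4:12] = 1.0
        else:
            img = np.tile([0.0, 1.0], (size, size // 2))
        images.append(img + 0.1 * rng.standard_normal((size, size)))
        labels.append(i % 2)
    return images, np.array(labels)

images, labels = make_dataset()

# Minimal GA loop: keep the fitter half, then fill up with crossover + mutation.
POP, GENS, LEN = 12, 10, 3
population = [list(rng.integers(0, len(TRANSFORMS), LEN)) for _ in range(POP)]

for gen in range(GENS):
    scored = sorted(population, key=lambda c: fitness(c, images, labels), reverse=True)
    parents = scored[:POP // 2]
    children = []
    while len(children) < POP - len(parents):
        a, b = rng.choice(len(parents), 2, replace=False)
        cut = rng.integers(1, LEN)
        child = parents[a][:cut] + parents[b][cut:]        # one-point crossover
        if rng.random() < 0.3:                             # random mutation
            child[rng.integers(0, LEN)] = int(rng.integers(0, len(TRANSFORMS)))
        children.append(child)
    population = parents + children

best = max(population, key=lambda c: fitness(c, images, labels))
print("best transform chain:", [TRANSFORMS[i].__name__ for i in best],
      "accuracy:", fitness(best, images, labels))
```

In this sketch, no one tells the program which transforms matter; the evolutionary search discovers a feature chain that separates the classes, which is the behavior the researchers describe.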

The BYU algorithm performed as well as or better than other top published object recognition algorithms, including those developed by NYU's Rob Fergus and Brown University's Thomas Serre.

Lee and his students fed their object recognition program four image datasets from Caltech (motorbikes, faces, airplanes and cars) and achieved 100 percent accurate recognition on every dataset. Other well-performing published object recognition systems scored in the 95-98 percent range.

The team has also tested their algorithm on a dataset of fish images from BYU's biology department that included photos of four species: Yellowstone cutthroat, cottid, speckled dace and whitefish. The algorithm was able to distinguish between the species with 99.4 percent accuracy.

Lee said the results show the algorithm could be used for a number of applications, from detecting invasive fish species (think of the carp in Utah Lake) to identifying flaws in produce such as apples on a production line.

"It's very comparable to other object recognition algorithms for accuracy, but, we don't need humans to be involved," Lee said. "You don't have to reinvent the wheel each time. You just run it."

Fellow BYU electrical and computer engineering professor James Archibald, along with graduate students Kirt Lillywhite and Beau Tippetts, co-authored the research.


Source: News release.
