Using your smartphone and its camera to identify what is in front of your eyes
I never carry a point-and-shoot camera. Chances are you don’t either. In the last few years, cellphone optics have improved substantially. That means more megapixels, better image sensors, brighter flashes and longer zooms on the one device most of us carry all the time.
Now comes the next phase: using your smartphone and its camera to identify what is in front of your eyes.
Although image-recognition software is still in its infancy, a number of mobile apps are already translating signs, naming landmarks and providing a running commentary on your world.
Google Goggles, which appeared on Android phones in late 2009 and on the iPhone last year, is best at deciphering landmarks, text, book and DVD covers, artwork, logos, bar codes and wine labels. You start the app — it’s part of Google’s search app for the iPhone — and peer at the object through the camera lens. It takes a stab at identifying it.
I’ve found the app especially useful for comparison shopping. If you’re browsing through a bookstore, for instance, one quick snapshot of a book’s cover allows you to check the price on Amazon. It’s much faster than typing the title into a search bar. Same goes for photographing paintings or craft beer bottles.
Perhaps its most promising use, for tourists especially, is language translation. Goggles can scan English, French, German, Italian, Spanish and, a recent addition, Russian text.
In practice, I’ll admit I’ve had only modest success translating phrases from restaurant menus or street signs. Part of the challenge is capturing an image clear enough for the software to recognize; unless the text sits on a white background, accuracy drops noticeably. But when it does work, wow. Optical character recognition is only going to get better and broader.
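For the curious, here’s a minimal sketch of why the background matters so much, assuming the Pillow and pytesseract libraries. It illustrates OCR preprocessing in general, not Goggles’ actual pipeline; “menu.jpg” is a placeholder photo.

```python
# Illustrative OCR preprocessing, not Goggles' pipeline. Assumes the
# Pillow and pytesseract packages; "menu.jpg" is a placeholder photo.
from PIL import Image
import pytesseract

img = Image.open("menu.jpg").convert("L")        # grayscale
bw = img.point(lambda p: 255 if p > 160 else 0)  # force text onto "white"
print(pytesseract.image_to_string(bw, lang="fra"))  # read a French menu
```

Binarizing the photo approximates the high-contrast “dark text on white” case that recognition software handles best, which is exactly the condition I kept failing to capture in the wild.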
Asian languages pose different challenges, says Hartwig Adam, a Goggles engineer. Their writing systems use thousands of characters, which tend to run together with few obvious word boundaries. Be wary when buying apps that say they translate Japanese or Chinese. The ones I’ve tried are not fully baked. For now, the handwritten specials posted on the walls of no-frills Chinese restaurants will remain a mystery to me.
Even Google admits Goggles is “not so good” at identifying plants. For that you want Leafsnap, a free iPhone app that supplements a traditional field guide. You photograph a leaf on a white background within the app, which scans the silhouette and cross-references it against a built-in database. For each potential match, you’re shown high-resolution images of the plant’s leaves, flowers, fruit and bark. Your location is also recorded on a map, so you can build a database of your urban forest.
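To make the idea concrete, here’s a hedged sketch of silhouette matching in that spirit, assuming OpenCV’s Python bindings. This is not Leafsnap’s actual algorithm; the image files and species names are placeholders.

```python
# Not Leafsnap's algorithm; a generic shape-matching sketch using OpenCV.
# "leaf.jpg" and the database entries are placeholders.
import cv2

def leaf_contour(path):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    # The leaf is darker than the white background, so invert the threshold.
    _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return max(contours, key=cv2.contourArea)  # largest blob is the leaf

query = leaf_contour("leaf.jpg")
species = {"sugar maple": "maple.jpg", "white oak": "oak.jpg"}
ranked = sorted(species,
                key=lambda name: cv2.matchShapes(
                    query, leaf_contour(species[name]),
                    cv2.CONTOURS_MATCH_I1, 0.0))
print(ranked)  # species ordered from closest silhouette to farthest
```

The white background is doing real work here: it lets a crude threshold isolate the outline cleanly, which is why the app insists on it.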
Developed by researchers at Columbia University, the University of Maryland and the Smithsonian Institution, Leafsnap has been downloaded 400,000 times since May. It’s easy to see why. I had a blast trying it out in San Francisco, even though the plants in the app’s database are mostly specific to the New York and Washington areas. In the next 18 months, it will expand from 250 species to the 750 species found in the continental United States (excluding South Florida), says Peter Belhumeur, a Leafsnap co-founder and computer science professor at Columbia. Eventually, the app will also use your location to refine its search and improve accuracy, he says.
Goggles has other limitations. It’s not good with faces — deliberately, for privacy reasons. And when I photographed an apple using Goggles on both iPhone and Android handsets, no close matches were found. Minutes later, I tried Meal Snap, a $2.99 iPhone app meant as a tool for dieters. Not only did it correctly identify the fruit, but Meal Snap also provided an accurate caloric range (60 to 90 calories).
Even more impressive was what happened when I used the app to deconstruct a bowl of homemade chopped salad. The app correctly identified diced beets and sliced cherry tomatoes alongside broccoli in the tossed mess. That said, the app failed to spot my spinach, deli turkey, cauliflower and bits of pepperoncini. Still, the calorie estimate was off by only about 100 calories, and it erred on the high side, which is O.K. if your goal is weight loss.
Another app I tried applies this kind of analysis to skin health. Start the app, and an outline of a human body appears. Once you touch a spot on the body, the camera opens. After taking a snapshot, you trace a mole using a tiny on-screen pencil. From there, the mole’s symmetry, border and color are analyzed and assigned numerical values. By cross-checking those numbers against a database of a few hundred images culled from dermatologists, the app estimates whether your mole might be consistent with melanoma.
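As a rough illustration of that kind of scoring, here’s a sketch of the “ABC” measurements (asymmetry, border, color), assuming OpenCV and NumPy. The thresholds and formulas are illustrative placeholders, not the app’s method or clinical values; “mole.jpg” stands in for the traced snapshot.

```python
# Illustrative "ABC" measurements (asymmetry, border, color) for a traced
# mole, not the app's method; thresholds are placeholders, not clinical
# values. Assumes OpenCV and NumPy; "mole.jpg" is a placeholder photo.
import cv2
import numpy as np

img = cv2.imread("mole.jpg")
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
# Otsu's threshold separates the darker lesion from surrounding skin.
_, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)
lesion = max(contours, key=cv2.contourArea)

# Asymmetry: fraction of pixels that disagree with the mirror image
# (assumes the lesion is roughly centered in the frame).
asymmetry = (np.count_nonzero(mask != cv2.flip(mask, 1))
             / np.count_nonzero(mask))

# Border irregularity: perimeter^2 / (4 * pi * area) is 1.0 for a circle.
border = (cv2.arcLength(lesion, True) ** 2
          / (4 * np.pi * cv2.contourArea(lesion)))

# Color variation: spread of hues inside the lesion.
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
color = float(np.std(hsv[..., 0][mask > 0]))

print(asymmetry, border, color)  # numbers a database lookup could rank
```

Reducing a mole to a handful of numbers is what makes the database comparison tractable: the app isn’t matching your photo pixel for pixel, it’s ranking how your scores sit among a few hundred dermatologist-sourced examples.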