Saturday, April 09, 2016

Facebook artificial intelligence technology allows the blind to see photos



Facebook is now using artificial intelligence to automatically generate captions for photos in the News Feed of people who can’t see them.

The tool is called Automatic Alternative Text, and it dovetails with text-to-speech engines that allow blind people to use Facebook in other ways. Using deep neural networks, the system can identify particular objects in a photo, from cars and boats to ice cream and pizza. It can pick out particular characteristics of the people in the photo, including smiles and beards and eyeglasses. And it can analyze a photo in a more general sense, determining that a photo depicts sun or ocean waves or snow. The text-to-speech engine will then “read” these things aloud.
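The last step the article describes, turning the network's detected concepts into a phrase a screen reader can speak, is straightforward to sketch. Here is a minimal, hypothetical illustration (the function name and the "Image may contain" phrasing are assumptions, not Facebook's actual code):

```python
def build_alt_text(concepts):
    """Join a list of detected photo concepts into a short
    alt-text string for a screen reader to speak aloud."""
    if not concepts:
        return "Image may contain: no description available."
    return "Image may contain: " + ", ".join(concepts) + "."

# A photo tagged by the classifier as outdoor scenery:
print(build_alt_text(["outdoor", "grass", "tree", "cloud", "water"]))
```

The hard part, of course, is the deep neural network that produces the concept list in the first place; the caption assembly is the easy final mile.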

A Facebook employee named Matt King showed me a prototype of the service last fall. King, 49, is blind, and though he acknowledged that the service was far from perfect, he said it was a notable improvement over the status quo. He wasn’t wrong. King showed the system a photo of a friend and his bike as he traveled through Europe. Facebook’s AI described the scene as outdoors. It said the photo included grass and trees and clouds, and that the scene was near water. If the photo had turned up in his News Feed in the past, King would have known only that his friend had posted a photo.

“My dream is that it would also tell me that it includes Christoph with his bike,” King told me. “But from my perspective as a blind user, going from essentially zero percent satisfaction from a photo to somewhere in the neighborhood of half … is a huge jump.”

As King told me, the system doesn’t always get things right. And it hasn’t yet reached a point where it generates captions in full and complete sentences. But this will come. Others have already used deep neural nets to do much the same thing. As King pointed out, a service that only gets part of the way there is still important now—Facebook says that more than 50,000 people are already using the service with text-to-speech engines.

Ever wished you could order Taco Bell for everyone in the office using a simple chat bot? No? Well, you’ll soon be able to do it anyway.

Taco Bell has blessed the world with TacoBot, a chatbot for the popular workplace chat app Slack. Tell TacoBot what you want, and it keeps a running tally of your order, just like that screen at the drive-thru. When you’re done, pay through TacoBot and pick up your order at the nearest participating Taco Bell.
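The running-tally behavior TacoBot offers is a classic stateful-bot pattern. A minimal sketch, with an entirely hypothetical menu and class name (not TacoBot's actual implementation), might look like this:

```python
class OrderBot:
    """Toy order bot that keeps a running tally, like the
    drive-thru screen the article compares TacoBot to."""

    MENU = {"crunchy taco": 1.49, "burrito": 2.29, "nachos": 3.39}

    def __init__(self):
        self.items = []

    def add(self, item):
        """Add one menu item and report the updated total."""
        item = item.lower()
        if item not in self.MENU:
            return f"Sorry, I don't know '{item}'."
        self.items.append(item)
        return f"Added {item}. Running total: ${self.total():.2f}"

    def total(self):
        return sum(self.MENU[i] for i in self.items)
```

In a real Slack bot, each chat message would be routed to a method like `add`, and the bot's reply would be posted back into the channel.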

If that sounds like it would be handy for your team—or, you know, your “team”—you can join the waiting list. Be warned, though—it might be a few months before you revel in the miracle of this advance in productivity tech, says Martin Legowiecki. He’s the creative technology director at Deutsch, the agency that built TacoBot. And even after you’re in, you’ll still have to send someone with the Taco Bell app on their phone to fetch the order. So, much like the Tacocopter before it, the real bot-ified future of meat, cheese, and tortilla still hasn’t arrived.

But fear not: Legowiecki says delivery is something Taco Bell is working on, too.

Really Slacking

TacoBot is one of the most gluttonous manifestations yet of the bot-powered future of work that many companies predict. In China, people already use the popular instant message app WeChat for all sorts of daily tasks, from checking their bank balances to ordering cabs to buying sneakers. Silicon Valley entrepreneurs are betting that chatbots will replace apps here in the US, too.

Slack and competitors like HipChat hope this chat revolution won’t just change consumer behavior but also the way work gets done at corporations with lots of money to spend on office supplies, company lunches, and whatever else big companies spend their budgets on. Slack is taking this so seriously that last year it said it would invest $80 million in companies that build Slack bots.

That vision of the future aside, TacoBot came about almost by accident. Deutsch was contracted by Taco Bell to redesign the company’s website, Legowiecki says. During the course of that project, the Taco Bell and Deutsch teams worked together using Slack. “They fell in love with it, we fell in love with it,” Legowiecki says. “So we thought, wouldn’t it be great to make Taco Bell available through Slack?”

The Deutsch team built TacoBot using Wit.ai, an online service for building software that’s able to understand and respond to human language. (Facebook bought Wit.ai last year to help build its virtual assistant M.) Legowiecki says this means the team will be able to bring TacoBot to other apps in the future, such as HipChat, Facebook Messenger, Amazon Echo, even Apple TV.

In other words, food bots are coming, whether you want them or not. As if you need another excuse to hit the drive-thru.

Over the past few years, the world’s biggest chipmaker has been buying up companies to help make its chips smarter.

Through acquisitions of companies like Indisys, Xtremeinsights, and, perhaps most importantly, fellow chipmaker Altera (a $16.7 billion deal), Intel has devoted much of its artificial intelligence effort to baking AI into its chips, as well as into the software that powers its 3-D video cameras.

Today, Intel has added yet more AI to its portfolio with the purchase of Saffron Technology. Like many other AI startups, Saffron attempts to extract useful information from huge datasets via algorithms inspired in part by the way the human brain works. But instead of focusing on deep learning, the trendy branch of AI in which Google and Facebook are heavily investing, Saffron has focused on its own technique called associative memory. The company was founded in 1999 by former IBM Knowledge Management and Intelligent Agent Center chief scientist Manuel Aparicio and is led by former PeopleSoft executive Gayle Sheppard. It has deep roots in the enterprise software industry and cut its teeth selling software to the Department of Defense, such as a system for predicting the likely location of roadside bombs in Iraq.
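The general idea behind associative memory, linking attributes by how often they are observed together rather than training a deep network, can be illustrated with a toy co-occurrence counter. This sketch is purely illustrative and is not Saffron's proprietary technique; all names here are assumptions:

```python
from collections import Counter
from itertools import combinations

class AssociativeMemory:
    """Toy associative memory: counts pairwise co-occurrences of
    attributes across observations, then ranks associations."""

    def __init__(self):
        self.counts = Counter()

    def observe(self, attributes):
        """Record one observation (a set of attributes seen together)."""
        for a, b in combinations(sorted(set(attributes)), 2):
            self.counts[(a, b)] += 1

    def associated(self, attribute):
        """Return attributes ranked by co-occurrence with `attribute`."""
        scores = Counter()
        for (a, b), n in self.counts.items():
            if a == attribute:
                scores[b] += n
            elif b == attribute:
                scores[a] += n
        return [x for x, _ in scores.most_common()]
```

Querying such a memory for the strongest associations of an attribute is what lets this style of system surface patterns, such as which conditions tend to accompany a threat, without a separate training phase.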

“We see an opportunity to apply cognitive computing not only to high-powered servers crunching enterprise data, but also to new consumer devices that need to see, sense, and interpret complex information in real time,” Intel New Technology Group Senior Vice President Josh Walden says. “Big data can happen on small devices, as long as they’re smart enough and connected.”



Source: Wired



