AI, like any technology, can be used for good or for ill. When it comes to developing such sophisticated software, it's up to us to put procedures, policies and safeguards in place so that AI aids our lives, not runs them!
Recent innovations may make it a bit safer to return to office buildings, shopping centres and other places where large numbers of people gather.
Artificial intelligence, combined with huge data sets, analytics, CT scans and chatbots, can assist the healthcare sector in many different ways.
Ask a search engine what the world has to say about AI ethics, and you're guaranteed a plethora of scholarly articles, blog posts, magazine features and images (some of them animated) illustrating the 'trolley problem'.
The most familiar current examples of adversarial AI are attacks on vision systems. These systems are becoming more pervasive: facial recognition unlocks computers and phones, opens doors, and is a key tool of the surveillance state. It's no surprise that a lot of work is going into attacking the underlying models.
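To make the idea concrete, here is a minimal sketch of one classic attack of this kind, the fast gradient sign method (FGSM). It is illustrative rather than drawn from any specific incident mentioned here, and it assumes a PyTorch image classifier whose inputs are batches of images scaled to the range [0, 1].

```python
import torch
import torch.nn.functional as F

def fgsm_attack(model: torch.nn.Module,
                image: torch.Tensor,
                label: torch.Tensor,
                epsilon: float = 0.03) -> torch.Tensor:
    """Fast Gradient Sign Method: nudge every pixel a small step in the
    direction that increases the model's loss, producing an image that
    looks unchanged to a human but can flip the classifier's prediction.

    Assumes `image` has shape (N, C, H, W) with values in [0, 1],
    `label` has shape (N,), and `model` is in eval mode.
    """
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()  # populates image.grad with d(loss)/d(pixel)
    # One signed-gradient step per pixel, clamped back to the valid range.
    perturbed = image + epsilon * image.grad.sign()
    return perturbed.clamp(0.0, 1.0).detach()
```

Here `epsilon` controls how visible the perturbation is; small values (roughly 0.01 to 0.1 for pixels in [0, 1]) are typically imperceptible to people yet often enough to fool an undefended model, which is precisely what makes attacks like this so unsettling.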
"In a perfect future, our AI virtual assistant will know what we're doing, where we're going and, most importantly, what we're saying," wrote Computerworld's Mike Elgan in his article "Wanted: World where virtual assistants help without being asked". Thankfully, this dystopia is still a long way off, but Elgan's words perfectly articulate the industry vision. His article goes on to discuss the obvious issues around privacy, acknowledging that the "public isn't ready to be spied on all day by the companies that make virtual assistants".
Stories of the rise of AI are all around us. A machine can beat a master at Go, and Facebook's shopping smarts are so good that people believe the company is listening through their phones' microphones. And AI will soon be getting much closer to you. Qualcomm, to name just one company, has designs to make its chipsets powerful enough to put machine learning on phones. But as AI capabilities race ahead, we're seeing more and more stories about what can go wrong with it.