Google Researcher: Don’t work for Facebook in AI if you have a conscience


Google researcher François Chollet believes anyone with a conscience should not work for Facebook in AI.
Keeping in mind that Google is very much a competitor to Facebook, and no stranger to criticism itself, Chollet raises some interesting points about ethics regarding AI and its use to influence — or rather, exploit — people.
In a tirade of tweets, Chollet said:
"The human mind is highly vulnerable to simple patterns of social manipulation. While thinking about these issues, I have compiled a short list of psychological attack patterns that would be devastatingly effective. Some of them have been used for a long time in advertising (e.g. positive/negative social reinforcement) but in a very weak, un-targeted form.
From an information security perspective, you would call these ‘vulnerabilities’: known exploits that can be used to take over a system. In the case of the human mind, these vulnerabilities never get patched, they are just the way we work. They’re in our DNA. They're our psychology.
On a personal level, we have no practical way to defend ourselves against them. The human mind is a static, vulnerable system that will come increasingly under attack from ever-smarter AI algorithms that will simultaneously have a complete view of everything we do and believe, and complete control of the information we consume.
Importantly, mass population control — in particular, political control — arising from placing AI algorithms in charge of our information diet does not necessarily require very advanced AI. You don’t need self-aware, superintelligent AI for this to be a dire threat.
So, if mass population control is already possible today -- in theory -- why hasn’t the world ended yet? In short, I think it’s because we’re really bad at AI. But that may be about to change. You see, our technical capabilities are the bottleneck here."
Facebook has come under increased scrutiny in recent weeks. The most high-profile reason is the revelation that Cambridge Analytica spent $1 million harvesting Facebook profiles, while its head, Alexander Nix, boasted of using dirty data to swing elections. Executives even boasted of the firm's role in getting President Donald Trump elected.
Former Facebook employees have since spoken out about the firm's data collection and how other companies accessed detailed information about its more than two billion users.

Extensive data collection

Dylan McKay, a Rails and Rust developer, used the Facebook tool that allows users to download their data. He was shocked to find the archive contained his call logs, the names of current and previous contacts, and the metadata of his text messages — despite having the SMS feature in Messenger switched off.
One user, responding to McKay's tweet, claims Facebook logged the exact coordinates of everywhere he had been. Another posted a list of advertisers holding his contact info. One person even claims to have deleted everything from their activity log, only to find all their posts still present in the dump.
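What these users did can be sketched in miniature: Facebook's data-export tool produces an archive of JSON files that can be inspected programmatically. The snippet below is a hypothetical illustration only — the field names (`name`, `timestamp`, `duration`) and the inline sample data are assumptions for the sake of the example, not the real export's schema, which may differ.

```python
import json

# Hypothetical excerpt of a call-log file from a Facebook data export.
# Field names and values are invented for illustration; the real
# archive's schema may differ.
sample_export = """
[
  {"name": "Alice", "timestamp": 1521504000, "duration": 124},
  {"name": "Bob",   "timestamp": 1521590400, "duration": 31},
  {"name": "Alice", "timestamp": 1521676800, "duration": 0}
]
"""

def summarize_call_log(raw_json):
    """Count how many logged calls the dump contains per contact."""
    counts = {}
    for entry in json.loads(raw_json):
        counts[entry["name"]] = counts.get(entry["name"], 0) + 1
    return counts

print(summarize_call_log(sample_export))  # {'Alice': 2, 'Bob': 1}
```

Even a few lines like this make plain how much structure the dump exposes: every logged call is a machine-readable record tied to a named contact.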
These users were likely joining the #DeleteFacebook movement. Even WhatsApp co-founder Brian Acton, who sold the app to Facebook for $19 billion in 2014, called on people to delete their accounts in a tweet.
For its part, Facebook says that in response it will change the way it shares data with third-party applications. That may not do much to persuade users spooked by the sheer extent of the social media giant's data collection to stay.
In the UK, MPs have sent a letter to Facebook's chief executive, Mark Zuckerberg, summoning him to give evidence to a select committee investigating fake news. In the U.S., a White House spokesman said President Trump would welcome investigations into the breach.
Meanwhile, Facebook’s stock has been taking a big hit. The company has lost $59 billion in value since the report broke over the weekend.
"There's only one company where the product is an opaque algorithmic newsfeed, that has been running large-scale mood/opinion manipulation experiments, that is neck-deep in an election manipulation scandal, that has shown time and time again to have morally bankrupt leadership," wrote Chollet. "For me, working at Google is a deliberate choice. If I start feeling uncomfortable, I'll definitely leave. If I had been working at FB, I would have left in 2017."

Source: https://www.developer-tech.com/
