Monday 19 February 2018

Maastricht University and The Post-Truth Society | Manipulating Perception

For Hristo Berger, a DKE graduate of Maastricht University with engineering stints at tech companies like Quora, the shock and ongoing anxiety over Russian Facebook ads and Twitter bots pale in comparison to a greater threat: technologies that can enhance and distort what is real are evolving faster than our ability to understand, control, or mitigate them.

The stakes are high and the possible consequences more disastrous than foreign meddling in an election: an undermining or upending of core civilizational institutions, an “infocalypse.” And Hristo says that this threat is just as plausible as the last one, and worse. Worse because of our ever-expanding computational prowess; worse because of ongoing advances in artificial intelligence and machine learning that can blur the line between fact and fiction; worse because those advances could usher in a future where, as Hristo observes, anyone could make it “appear as if anything has happened, regardless of whether or not it did.”

And much in the way that foreign-sponsored, targeted misinformation campaigns didn't feel like a plausible near-term threat until we realized they were already underway, Hristo cautions that fast-developing tools powered by artificial intelligence, machine learning, and augmented reality could be hijacked by bad actors to imitate humans and wage an information war. We are closer than one might think to that potential “infocalypse”: already-available tools for audio and video manipulation have begun to look like a fake-news Manhattan Project.

In the murky corners of the internet, people have begun using machine learning algorithms and open-source software to easily create pornographic videos that realistically superimpose the faces of celebrities, or anyone for that matter, onto adult actors' bodies. At institutions like Stanford, technologists have built programs that combine recorded video footage with real-time face tracking to manipulate video. Similarly, computer scientists at the University of Washington built a program capable of “turning audio clips into a realistic, lip-synced video of the person speaking those words.” As proof of concept, both teams manipulated broadcast video to make world leaders appear to say things they never actually said. Continue reading at BuzzFeed
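The open-source face-swap tools mentioned above commonly rest on one architectural trick: a single shared encoder learns a pose-and-expression representation of any face, while a separate decoder is trained per identity; swapping is then just decoding person A's encoding with person B's decoder. The snippet below is a toy, untrained sketch of that structure only (the class, dimensions, and weights are all hypothetical and random), not the implementation of any specific tool:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)


class SharedEncoderSwapper:
    """Toy shared-encoder / per-identity-decoder structure
    (hypothetical, untrained random weights)."""

    def __init__(self, dim=64 * 64, latent=128):
        # One encoder shared by both identities ...
        self.W_enc = rng.normal(0.0, 0.01, (latent, dim))
        # ... and one decoder per identity.
        self.W_dec_a = rng.normal(0.0, 0.01, (dim, latent))
        self.W_dec_b = rng.normal(0.0, 0.01, (dim, latent))

    def encode(self, face):
        # Shared latent code: pose and expression, ideally identity-free.
        return relu(self.W_enc @ face)

    def reconstruct(self, face, identity):
        W = self.W_dec_a if identity == "a" else self.W_dec_b
        return W @ self.encode(face)

    def swap(self, face_of_a):
        # The swap: encode A's face, decode with B's decoder,
        # yielding B's identity with A's pose and expression.
        return self.W_dec_b @ self.encode(face_of_a)


model = SharedEncoderSwapper()
face = rng.normal(size=64 * 64)   # stand-in for a flattened face crop
swapped = model.swap(face)        # same shape as the input face
```

In real tools the linear maps are deep convolutional networks trained on many crops of each person, but the swap step is exactly this cross-decoding move.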
