
‘Student’ who attacked activist couple in press appears to be a deepfake

WASHINGTON – Oliver Taylor, a student at England’s University of Birmingham, is a twenty-something with brown eyes, light stubble and a slightly stiff smile.

Online profiles describe him as a coffee lover and politics junkie who was raised in a traditional Jewish home. His half dozen freelance editorials and blog posts reveal an active interest in anti-Semitism and Jewish affairs, with bylines in the Jerusalem Post and the Times of Israel.

The catch? Oliver Taylor seems to be an elaborate fiction.

His university says it has no record of him. He has no obvious online footprint beyond an account on the question-and-answer site Quora, where he was active for two days in March. Two newspapers that published his work say they have tried and failed to confirm his identity. And experts in deceptive imagery used state-of-the-art forensic analysis programs to determine that Taylor’s profile photo is a hyper-realistic forgery – a “deepfake.”

Reuters was unable to determine who is behind Taylor. Calls to the UK phone number he supplied to editors drew an automated error message, and he didn’t respond to messages left at the Gmail address he used for correspondence.

Reuters was alerted to Taylor by London academic Mazen Masri, who drew international attention in late 2018 when he helped launch an Israeli lawsuit against the surveillance company NSO on behalf of alleged Mexican victims of the company’s phone hacking technology.

In an article in US Jewish newspaper The Algemeiner, Taylor had accused Masri and his wife, Palestinian rights campaigner Ryvka Barnard, of being “known terrorist sympathizers.”

Human rights lawyer Mazen Masri in London. Reuters

Masri and Barnard were taken aback by the allegation, which they deny. But they were also baffled as to why a university student would single them out. Masri said he pulled up Taylor’s profile photo. He couldn’t put his finger on it, he said, but something about the young man’s face “seemed off.”

Six experts interviewed by Reuters say the image has the characteristics of a deepfake.

“The distortion and inconsistencies in the background are a tell-tale sign of a synthesized image, as are a few glitches around his neck and collar,” said digital image forensics pioneer Hany Farid, who teaches at the University of California, Berkeley.

Artist Mario Klingemann, who regularly uses deepfakes in his work, said the photo “has all the hallmarks.”

“I’m 100 percent sure,” he said.

“A ventriloquist’s dummy”

The Taylor persona is a rare in-the-wild example of a phenomenon that has emerged as a key anxiety of the digital age: the marriage of deepfakes and disinformation.

The threat is drawing increasing concern in Washington and Silicon Valley. Last year House Intelligence Committee chairman Adam Schiff warned that computer-generated video could “turn a world leader into a ventriloquist’s dummy.” Last month Facebook announced the conclusion of its Deepfake Detection Challenge – a competition intended to help researchers automatically identify falsified footage. Last week online publication The Daily Beast revealed a network of deepfake journalists – part of a larger group of bogus personas seeding propaganda online.

Deepfakes like Taylor are dangerous because they can help build “a totally untraceable identity,” said Dan Brahmy, whose Israel-based startup Cyabra specializes in detecting such images.

Brahmy said investigators chasing the origin of such photos are left “searching for a needle in a haystack – except the needle doesn’t exist.”

Taylor appears to have had no online presence until he started writing articles in late December. The University of Birmingham said in a statement it could not find “any record of this individual using these details.” Editors at the Jerusalem Post and The Algemeiner say they published Taylor after he pitched them stories cold over email. He didn’t ask for payment, they said, and they didn’t take aggressive steps to vet his identity.

“We’re not a counterintelligence operation,” Algemeiner Editor-in-chief Dovid Efune said, although he noted that the paper had introduced new safeguards since.

After Reuters began asking about Taylor, The Algemeiner and the Times of Israel deleted his work. Taylor emailed both papers protesting the removal, but Times of Israel Opinion Editor Miriam Herschlag said she rebuffed him after he failed to prove his identity. Efune said he didn’t respond to Taylor’s messages.

The Jerusalem Post and Arutz Sheva have kept Taylor’s articles online, although the latter removed the “terrorist sympathizers” reference following a complaint from Masri and Barnard. The Post’s editor-in-chief, Yaakov Katz, didn’t respond when asked whether Taylor’s work would stay up. Arutz Sheva editor Yoni Kempinski said only that “in many cases” news outlets “use pseudonyms to byline opinion articles.” Kempinski declined to elaborate or say whether he considered Taylor a pseudonym.

Oliver Taylor’s articles drew minimal engagement on social media, but the Times of Israel’s Herschlag said they were still dangerous – not only because they could distort the public discourse but also because they risked making people in her position less willing to take chances on unknown writers.

“Absolutely we need to screen out impostors and up our defenses,” she said. “But I don’t want to set up these barriers that prevent new voices from being heard.”