[Fakelinks 8.8.2019] This Video Does Not Exist; YouPorn's first virtual porn performer; stock-price manipulation via deepfakes

• Israeli software firm Lightricks raises $135 million at $1 billion valuation. Lightricks is the developer of Facetune, a photo-editing app used mainly by influencers on Instagram, and one that in my view bears a share of the responsibility for the mental-health harm the platform causes.

• New paper on neural-network-generated videos: the same technique familiar from GANbreeder and the like, only for video. My guess is that in roughly a year we'll see usable versions of "This Video Does Not Exist": DeepMind DVD-GAN: Impressive Step Toward Realistic Video Synthesis.

• YouPorn launches the first ever CGI porn star. YouPorn is entering the CGI-influencer business. One can debate whether CGI porn stars are anything new, given hentai and the fact that CGI porn has existed for decades. What is new is mainly YouPorn's market power and its ability to establish a character/fake celebrity.

Created in partnership with porn tech company Camasutra VR, Vales will interact with fans and create content across Twitter, Instagram and Modelhub, an online marketplace where users exchange explicit videos.

YouPorn claims its new virtual influencer is the first of its kind to be able to interact with fans in NSFW contexts—a distinction that probably falls in the category of “weird flex, but OK”—though because of rules about explicit content on most social platforms, these risqué communications will be relegated to Modelhub. The character will also be rendered in real-time through the video game software Unreal Engine to film explicit videos, according to a YouPorn spokesperson.

“It’s fun to share information about my favorite porn site and little peeks into my life with the YouPorn community,” said a statement attributed to Vales. “I am looking forward to becoming their go-to source for providing fun and entertaining updates while being an active part of the future of porn!”

• Basically, this is a combination of automatic segmentation and video inpainting — essentially Photoshop's content-aware fill, but for video: Magically Remove Moving Objects from Video Github
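To make the idea concrete: this is not the linked repo's actual method (which uses learned segmentation and inpainting networks), but a minimal, purely illustrative sketch of the temporal core of video inpainting — given per-frame object masks, each occluded pixel borrows its value from the nearest frame in which that pixel is visible.

```python
def remove_object(frames, masks):
    """Naive temporal video-inpainting sketch (illustrative only).

    frames: list of 2-D pixel grids, frames[t][y][x]
    masks:  same shape, truthy where the unwanted object covers the pixel

    For every masked pixel, borrow the value from the temporally nearest
    frame where that pixel is unoccluded; pixels never revealed in any
    frame are left unchanged.
    """
    T = len(frames)
    H, W = len(frames[0]), len(frames[0][0])
    out = [[row[:] for row in f] for f in frames]  # deep copy
    for t in range(T):
        for y in range(H):
            for x in range(W):
                if not masks[t][y][x]:
                    continue  # pixel already visible, keep it
                # search outward in time for an unoccluded view
                for dt in range(1, T):
                    for s in (t - dt, t + dt):
                        if 0 <= s < T and not masks[s][y][x]:
                            out[t][y][x] = frames[s][y][x]
                            break
                    else:
                        continue
                    break
    return out
```

Real systems replace both halves of this: a segmentation network produces the masks automatically, and a learned inpainting model hallucinates background for pixels that are never revealed, instead of leaving them untouched.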

• Facebook, Google, Twitter Detail How to Address Deepfake Videos: A ton of corpspeak to say they have no clue how to handle the reality-shifting democratization of editing tools.

• A few days before the conspiracy-theory-motivated terror attack in El Paso, the FBI warned about exactly this kind of conspiracy-driven terrorism.

“The FBI assesses these conspiracy theories very likely will emerge, spread, and evolve in the modern information marketplace, occasionally driving both groups and individual extremists to carry out criminal or violent acts,” the document states. It also goes on to say the FBI believes conspiracy theory-driven extremists are likely to increase during the 2020 presidential election cycle. […]

President Trump is mentioned by name briefly in the latest FBI document, which notes that QAnon originates in the conspiratorial belief that “Q,” allegedly a government official, “posts classified information online to reveal a covert effort, led by President Trump, to dismantle a conspiracy involving ‘deep state’ actors and global elites allegedly engaged in an international child sex trafficking ring.”

• The IRA (Internet Research Agency, aka Putin's troll army) is spreading fake news about the IRA (Irish Republican Army).

• A detailed study from Greece on Tweets on the Dark Side of the Web: Humans and Robots in the Industry of Political Propaganda.

• The British Army's 6th Division is merging its hackers and propagandists into a single unit: Army fights fake news with propagandists and hackers in one unit.

• The rating agency Moody's warns of stock-price manipulation via deepfakes: Deepfakes can threaten companies’ financial health.

Companies’ businesses and credit quality are threatened as advancing technology makes it easier to create deepfake videos and images designed to damage their reputations, Moody’s Investors Service said in a report published today.

“Imagine a fake but realistic-looking video of a CEO making racist comments or bragging about corrupt acts,” said Leroy Terrelonge, AVP-Cyber Risk Analyst at Moody’s. “Advances in artificial intelligence will make it easier to create such deepfakes and harder to debunk them. Disinformation attacks could be a severe credit negative for victim companies.”

——————————————

Follow Nerdcore.de on Twitter (https://twitter.com/NerdcoreDe and https://twitter.com/renewalta) and Facebook (https://www.facebook.com/Crackajackz). If you like my work, you can support me financially here (https://nerdcore.de/jetzt-nerdcore-unterstuetzen/).
