Today's cybersecurity headlines are brought to you by ThreatPerspective


Ethical Hacking News

The Ongoing Problem of Deepfake Porn on GitHub: A Web of Open Source Software Used for Intimate Image Abuse


Despite efforts by GitHub to crack down on deepfake porn, a web of open source software used to create non-consensual explicit images continues to exist on the platform. WIRED has found over a dozen GitHub projects linked to deepfake "porn" videos evading detection, highlighting blind spots in the company's moderation efforts.

  • GitHub has banned projects that create or distribute non-consensual explicit images using synthetic media.
  • Despite the ban, over a dozen GitHub projects linked to deepfake "porn" videos have evaded detection.
  • The code used to create deepfakes can be accessed freely on GitHub, making it easy for developers to build upon existing models.
  • The proliferation of deepfake porn on GitHub has fostered a web of open source software that can be used for intimate image abuse.
  • Experts warn that the issue of deepfakes requires a multifaceted approach involving technology, policy, and social changes.
  • Experts warn that the issue of deepfakes requires a multifaceted approach involving technology, policy, and social changes.



  • In recent months, the issue of deepfake porn on social media platforms has gained significant attention, with many experts warning about the potential harm caused by these manipulated images. However, a more insidious problem has emerged on GitHub, one that is just as concerning: the widespread use of open source software to create and distribute non-consensual explicit images.

    GitHub, the world's largest platform for developers to host their code, has taken steps to address this issue, implementing a policy in June 2024 to ban projects that are "designed for, encourage, promote, support, or suggest in any way the use of synthetic or manipulated media for the creation of nonconsensual intimate imagery." However, despite these efforts, over a dozen GitHub projects linked to deepfake "porn" videos have evaded detection on the platform.

    WIRED has identified at least 14 repositories that were built on code linked to videos on a deepfake porn streaming site. These repositories form part of a broader web of open source software that can be used to make deepfake porn and, by its open nature, cannot easily be gate-kept. GitHub repositories can be copied, a process known as "forking," and from there tailored freely by developers.

    One such project identified by WIRED is a repository branded almost identically to a major project self-described as the "leading software for creating deepfakes." This repository was disabled by GitHub several months ago for violating its terms of service. However, an archived version of the major repository is still available on the platform, and at least six other repositories based on this model were present on GitHub as of January 10.

    The code used to create these deepfake images can be accessed freely on GitHub, making it easy for developers to build upon existing models and create new ones. This has led to a proliferation of deepfake porn on the platform, with some creators even marketing their projects as "NSFW" or "unlocked" versions of other models.

    The impact of this issue goes beyond the creation and distribution of non-consensual explicit images. The harm caused by deepfakes is not limited to psychological distress; they can also be used to intimidate and manipulate women, minorities, and politicians.

    In some cases, perpetrators have even been known to congregate in online communities on Discord and Reddit to share their creations and discuss ways to evade detection. Torrents of the main repository banned by GitHub in August are still available on other corners of the web, demonstrating the difficulty in policing open-source deepfake software across the board.

    Elizabeth Seger, director of digital policy at cross-party UK think tank Demos, notes that "once a model is made open source, publicly available for download, there's no way to do a public rollback of that." This highlights the challenges faced by platforms like GitHub in keeping up with the ever-evolving landscape of deepfake creation and distribution.

    Henry Ajder, an AI adviser to tech companies like Meta and Adobe, agrees. "It's not easy to always remove something the moment it comes online," he says. "At the same time, there were red flags that were pretty clear." Even with these warnings, however, GitHub's efforts to crack down on deepfake porn have been incomplete, leaving a web of open source software used for intimate image abuse on the platform.

    Despite this, experts say that it's not too late to take action. Platforms like GitHub can intervene at the point of upload, and policymakers, tech companies, developers, and creators of abusive content themselves can all play a role in reining in deepfake porn.

    In fact, at least 30 US states have already taken steps to address this issue with laws targeting deepfake porn, though some of those laws cover only minors. Deepfakes are also set to be criminalized in the UK under new legislation announced on January 7.

    Ultimately, the battle against deepfake porn requires a multifaceted approach that involves not just technological solutions but also policy and social changes. As we move forward, it's essential that we continue to monitor this issue closely and work together to address the problems posed by these manipulated images.

    Related Information:

  • https://www.wired.com/story/githubs-deepfake-porn-crackdown-still-isnt-working/


  • Published: Thu Jan 16 08:58:55 2025 by llama3.2 3B Q4_K_M
    © Ethical Hacking News . All rights reserved.