Google Quietly Bans Deepfake Training Projects on Colab
Google has quietly banned deepfake projects on its Colaboratory (Colab) service, putting an end to the large-scale use of the platform's resources for this purpose.
Colab is an online computing resource that allows researchers to run Python code directly through the browser while using free computing resources, including GPUs, to power their projects.
Thanks to the massively parallel nature of GPUs, Colab is well suited to training machine learning models, such as deepfake generators, and to running data analysis workloads.
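In a Colab notebook, users typically verify that a GPU runtime is attached before starting a training run. A minimal sketch of such a check, assuming the NVIDIA `nvidia-smi` utility is present on GPU runtimes (as it is on Colab's), might look like this:

```python
import shutil
import subprocess

def gpu_available() -> bool:
    """Return True if an NVIDIA GPU is visible to this runtime.

    Uses `nvidia-smi -L`, which lists attached GPUs; on a CPU-only
    runtime the binary is absent and we return False.
    """
    if shutil.which("nvidia-smi") is None:
        return False
    try:
        result = subprocess.run(
            ["nvidia-smi", "-L"],
            capture_output=True,
            text=True,
            timeout=10,
        )
    except OSError:
        return False
    return result.returncode == 0 and "GPU" in result.stdout

print("GPU runtime attached:", gpu_available())
```

On a Colab instance with the runtime type set to GPU, the function reports `True`; on the default CPU runtime it reports `False`, prompting the user to switch runtimes before training.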
Deepfake models can be trained to swap faces in video clips, adding realistic facial expressions so that the result appears genuine even though it is fabricated.
They have been used to spread fake news, create revenge porn, or simply for fun. However, the lack of ethical limits on their use has been a source of controversy and concern.
Placing a ban on deepfakes
Users who attempt to run such code on Colab now reportedly see the following warning:

“You may be executing code that is disallowed, and this may restrict your ability to use Colab in the future. Please note the prohibited actions specified in our FAQ.”
The impact of this new restriction is expected to be far-reaching in the deepfake world, as many users utilize pre-trained models with Colab to jump-start their high-resolution projects.
Colab was making this process very easy even for those with no coding background, which is why so many tutorials suggest Google’s “free resource” platform to launch deepfake projects.
It is not known whether Google imposed the policy out of ethical concern or because of rampant abuse of the free computing resources by these projects.
Colab is meant for researchers who need computing power that would otherwise cost thousands of dollars to pursue their scientific goals, a need that is especially acute during GPU shortages.
However, some users reportedly exploited the platform's free tier to create deepfake models at scale, tying up a significant share of Colab's available resources for extended periods.
The complete list of disallowed activities is as follows:
- file hosting, media serving, or other web service offerings not related to interactive computing with Colab
- downloading torrents or engaging in peer-to-peer file-sharing
- using a remote desktop or SSH
- connecting to remote proxies
- mining cryptocurrency
- running denial-of-service attacks
- password cracking
- using multiple accounts to work around access or resource usage restrictions
- creating deepfakes
None of these disallowed activities qualifies as regular scientific research. While a few could arguably fit that context, Google appears to have detected far more abusive use than legitimate research.