Photo-sharing community EyeEm will license users’ photos to train AI if they don’t delete them
EyeEm, the Berlin-based photo-sharing community that exited last year to Spanish company Freepik after going bankrupt, is now licensing its users’ photos to train AI models. Earlier this month, the company informed users via email that it was adding a new clause to its Terms & Conditions that would grant it the rights to upload users’ content to “train, develop, and improve software, algorithms, and machine-learning models.” Users were given 30 days to opt out by removing all their content from EyeEm’s platform. Otherwise, they were consenting to this use case for their work.
At the time of its 2023 acquisition, EyeEm’s photo library included 160 million images and nearly 150,000 users. The company said it would merge its community with Freepik’s over time. Despite its decline, almost 30,000 people are still downloading it each month, according to data from Appfigures.
Once thought of as a possible challenger to Instagram — or at least “Europe’s Instagram” — EyeEm had dwindled to a staff of three before selling to Freepik, TechCrunch’s Ingrid Lunden previously reported. Joaquin Cuenca Abela, CEO of Freepik, hinted at the company’s possible plans for EyeEm, saying it would explore how to bring more AI into the equation for creators on the platform.
As it turns out, that meant selling their work to train AI models.
Now, EyeEm’s updated Terms & Conditions reads as follows:
8.1 Grant of Rights – EyeEm Community
By uploading Content to EyeEm Community, you grant us regarding your Content the non-exclusive, worldwide, transferable and sublicensable right to reproduce, distribute, publicly display, transform, adapt, make derivative works of, communicate to the public and/or promote such Content.
This specifically includes the sublicensable and transferable right to use your Content for the training, development and improvement of software, algorithms and machine learning models. In case you do not agree to this, you should not add your Content to EyeEm Community.
The rights granted in this section 8.1 regarding your Content remains valid until complete deletion from EyeEm Community and partner platforms according to section 13. You can request the deletion of your Content at any time. The conditions for this can be found in section 13.
Section 13 details a complicated deletion process that begins with deleting photos directly — which, the company notes, would not affect content previously shared to EyeEm Magazine or social media. To delete content from EyeEm Market (where photographers sold their photos) or other content platforms, users would have to email a request to [email protected], providing the Content ID numbers of the photos they wanted deleted and specifying whether each should be removed from their account as well, or from EyeEm Market only.
Of note, the notice says that these deletions from EyeEm Market and partner platforms could take up to 180 days. Yes, that’s right: requested deletions can take up to 180 days, but users were given only 30 days to opt out. That means the only way to meet the deadline was to manually delete photos one by one.
Worse still, the company adds that:
You hereby acknowledge and agree that your authorization for EyeEm to market and license your Content according to sections 8 and 10 will remain valid until the Content is deleted from EyeEm and all partner platforms within the time frame indicated above. All license agreements entered into before complete deletion and the rights of use granted thereby remain unaffected by the request for deletion or the deletion.
Section 8 is where the licensing rights to train AI are detailed. In Section 10, EyeEm informs users that they forfeit their right to any payouts for their work if they delete their account — something users might do precisely to keep their data from being fed to AI models. Gotcha!
EyeEm’s move is an example of how AI models are being trained on the back of users’ content, sometimes without their explicit consent. Though EyeEm did offer an opt-out procedure of sorts, any photographer who missed the announcement lost the right to dictate how their photos would be used going forward. Given how far EyeEm’s standing as a popular Instagram alternative had declined over the years, many photographers may have forgotten they ever used it in the first place. Others may simply have ignored the email — if it didn’t land in a spam folder somewhere.
Those who did notice the changes were upset that they were given only 30 days’ notice and no option to bulk-delete their contributions, making it all the more painful to opt out.
EyeEm did not immediately respond to requests for comment, but given the 30-day deadline, we opted to publish before hearing back.
This sort of behavior is why many users today are considering a move to the open social web. The federated photo-sharing platform Pixelfed, which runs on the same ActivityPub protocol that powers Mastodon, is capitalizing on the EyeEm situation to attract users.
In a post from its official account, Pixelfed announced: “We will never use your images to help train AI models. Privacy First, Pixels Forever.”