This tech executive quit his job to fight the original sin of generative AI


Ed Newton-Rex says there is an ethical problem with generative AI. He should know, because he used to be part of the rapidly growing industry: Newton-Rex was TikTok's lead AI designer and then an executive at Stability AI, until he left the job in November after becoming frustrated with the company's stance on collecting training data.

Following his high-profile departure, Newton-Rex has thrown himself into conversation after conversation about what building AI ethically would look like in practice. “It surprised me that there are so many people who want to use generative AI models that treat creators fairly,” he says. “If you can give them tools to make better decisions, that's helpful.”


Ed Newton-Rex's nonprofit is trying to encourage companies to be more thoughtful about sourcing training data for AI projects.

Courtesy of Ed Newton-Rex

Now Newton-Rex has launched a new nonprofit, Fairly Trained, to give people the tools to make exactly that kind of decision. It offers a certification program that recognizes AI companies that license their training data. The AI industry now has its own version of the “fair trade” certification labels you see on coffee.

To earn Fairly Trained's certification label, which it calls L Certification, a company must prove that its training data was explicitly licensed for training purposes, in the public domain, offered under an appropriate open license, or already owned by the company.

So far, nine companies have received certification, including image generator Bria AI, which trains exclusively on data licensed from sources such as Getty Images, and music production platform LifeScore Music, which licenses works from all the major record labels. Several others are close to completing their certification. Depending on the size of the applicant's business, the nonprofit charges a fee of $500 to $6,000.

OpenAI—the world's leading generative AI company—recently argued that it is impossible to create generative AI services like ChatGPT without using unlicensed data. Newton-Rex and the first companies to receive Fairly Trained's seal of approval disagree. “We already consider it an essential thing,” Bria CEO Yair Adato says of licensing data. He compares AI models built on unlicensed data to Napster and The Pirate Bay, and his company to Spotify. “It's really easy for us to be compliant,” says Tom Gruber, co-founder of LifeScore Music, who is also a consultant to Fairly Trained. “The music business really cares about provenance and rights.”

Newton-Rex says he has the support of trade groups such as the Association of American Publishers and the Association of Independent Music Publishers, as well as companies such as Universal Music Group. But the movement to overturn the AI industry's standard approach of scraping training data at will is still in its infancy. And Fairly Trained is a one-man operation. Not that Newton-Rex minds; he still has the mindset of a startup founder. “I believe in shipping things quickly,” he says.

The nonprofit isn't the only effort trying to standardize the idea of labeling AI products with information about their ingredients. Howie Singer, a former Warner Music Group executive who now studies how technology is changing the music industry, sees parallels between Fairly Trained and the Content Authenticity Initiative, a project run by Adobe that aims to help people track the authenticity of images. “It's a good step forward,” he says of Newton-Rex's project.