Developers must disclose all AI-generated art, code, or music used in their games for Steam to review before their titles can be publicly released via the software souk.
Devs will be required to fill out an AI disclosure section in the paperwork they submit to Steam for approval. Steam announced the changes yesterday, and said it will scrutinize any AI-generated content used to make a game, as well as content created on the fly during gameplay.
The form requires game makers to promise that pre-generated materials don't include any illegal or infringing content, and don't mislead players with false marketing. If their games support live-generated content, developers will have to explain what guardrails are in place to prevent AI from producing anything inappropriate or unlawful on the fly.
On top of this, Steam is asking players to report any illegal AI-generated material they spot, to keep developers on their toes. Gamers will also be made aware by Steam if a title uses machine-generated content.
“Today’s changes are the result of us improving our understanding of the landscape and risks in this space, as well as talking to game developers using AI, and those building AI tools,” the video game hosting giant said in a statement.
“This will allow us to be much more open to releasing games using AI technology on Steam. The only exception to this will be Adult Only Sexual Content that is created with Live-Generated AI – we are unable to release that type of content right now.”
It's not surprising that businesses like Steam are being cautious: many fear they could be legally liable for hosting illicit synthetic content.
“It’s taken us some time to figure this out, and we’re sorry that has made it harder for some developers to make decisions around their games. But we don’t feel like we serve our players or developer partners by rushing into decisions that have this much complexity. We’ll continue to learn from the games being submitted to Steam, and the legal progress around AI, and will revisit this decision when necessary,” the biz added.
Copyright infringement is a top concern right now in the AI world. Generative AI model makers, such as OpenAI, Google, Microsoft, Midjourney, and Stability AI, are facing or have faced lawsuits accusing them of unlawfully ripping off people’s intellectual property to train and run neural networks. These AI model makers deny any wrongdoing.
In an attempt to reassure customers worried about legal liability from content-creating AI tools, Big Tech names including Microsoft and Google have promised to defend their subscribers if they're hit with copyright infringement lawsuits over material generated with those models (caveats apply). Whether the use of copyrighted content to train AI models is protected under fair use remains an open question. ®