A new wave is crashing over the creative world, and it isn’t made of water. It’s made of algorithms. Just weeks ago, Xania Monet, an artist who doesn’t breathe or feel, debuted on the Billboard Adult R&B chart. At the same time, “Walk My Walk” by Breaking Rust topped country digital song sales. Both acts are entirely generated by artificial intelligence.
This isn’t a futuristic fantasy; it’s happening now. Monet marked the first time an AI-created artist appeared on a major Billboard airplay chart, a milestone that’s sent tremors of fear and frustration through the human creative community. The sheer volume of AI-generated content is already overwhelming the system.
“People are scared, angry, and upset,” says Arun Chaturvedi, president of the Songwriters’ Association of Canada. He paints a stark picture: roughly 100,000 songs from human artists already flood Spotify every day, and AI is poised to multiply that number exponentially, drowning out genuine voices.
The concern isn’t limited to music. Canada’s House of Commons Heritage committee is now grappling with the implications for all creative sectors – publishing, film, television, and more. Hearings revealed a deep anxiety about AI’s insatiable appetite for existing work.
At the heart of the issue lies the way AI learns: by consuming massive amounts of copyrighted material, often without permission. Creative groups are demanding a licensing system to ensure artists are compensated when their work fuels these AI engines.
The problem extends beyond professional artists. Even public figures aren’t safe. A search for a biography of a prominent Canadian leader yielded a flood of AI-generated books, some ranking higher in search results than the official autobiography. Distinguishing authentic work from “incoherent slop” is becoming impossible for consumers.
Writers and artists feel they’ve unwittingly fed a beast that now threatens to consume them. Victoria Shen of the Writers Guild of Canada stated that generative AI is “trained on the work of artists and creators and now threatens their livelihood.” The scale of the potential disruption is staggering.
Experts predict that within a matter of years, over 90% of the content Canadians encounter online could be AI-generated. Even major entertainment companies like Disney are exploring letting users create their own AI-powered content, further blurring the lines between human and machine creativity.
However, the creative industry isn’t seeking to halt AI’s progress. Its plea to the government is simple: demand transparency. Artists need to know when and how their work is being used to train AI systems, a crucial step toward establishing a fair licensing framework.
Without transparency, artists are left in the dark, unable to protect their intellectual property. “Should I license? Can I license? Is it being used? I don’t know,” explains Erin Finlay, legal counsel for Access Copyright, highlighting the current power imbalance.
The legal battleground is already forming. Courts in both Canada and the United States are currently examining how copyright law applies to AI training. Creative groups are urging lawmakers to resist adding new exceptions to copyright law that would further benefit AI companies.
The debate centers on a proposed “text and data mining” exception, favored by the tech industry, which would allow the use of copyrighted material for AI training. Opponents fear this would effectively grant AI companies a free pass to exploit creative work.
Some argue that overly strict regulations could drive AI development elsewhere. Michael Geist, a law professor at the University of Ottawa, warns that Canada must remain “globally competitive” to avoid losing out on AI opportunities. The stakes are high, and the future of creativity hangs in the balance.