
Last Modified: November 17, 2023
Artificial Expectations: Readers, Writers, Publishers, and AI

By Glenn Rollans

This fall, Read Alberta and the WGA are collaborating on a series of articles that consider artificial intelligence and how it will affect writers and publishers. WGA members who wish to read more can continue the conversation that has already been started in the latest edition of WestWord.

Readers who are not WGA members are invited to join the WGA to get access to the latest issue, and to the many other benefits WGA membership can offer them.

Earlier this year, I moderated a panel on artificial intelligence (AI) for SaskBooks. One of the panellists was Neil Clarke, of Clarkesworld science fiction and fantasy magazine. Neil described receiving up to 700 AI-generated submissions per month: fiction “written” by large language model (LLM) generative AIs, such as ChatGPT, that generate text or images. He and his team trash them as fast as they detect them.

Neil is well known in publishing circles partly for setting out his principles regarding AI early and publicly. His statement, “Where we stand on AI and publishing,” is emphatically human-centred and ethics driven.

“AI technologies are here to stay,” he says, “and even their developers are stating that there is a need for regulation. If we want to be a part of that conversation, publishing professionals need to be clear about our concerns, expectations, and intentions.” He goes on to set out some of the pressing issues confronting readers, writers and publishers: Who has the right to allow developers to “train” their AIs on copyright-protected works, and who should benefit? How transparent should developers be about what works they have used to train their AIs? What responsibility do publishers have to inform their readers when text or images have been created, or partially created, by AIs? What responsibility do authors and artists have to inform publishers when they’ve used AIs as part of their process? Should works created by AIs be protected by copyright?

During the panel, Neil described his own position in favour of human-created content, and left the rest of us to examine our own principles rather than drift into an AI future.

When ChatGPT burst onto the scene mere months ago, it felt like the sudden beginning of a revolution, but in fact it’s a revolution that has been building steam for many years. We have all used the editorial tools in word-processing applications, browsed the titles that an online book retailer tells us we might be interested in, and enjoyed films that used motion-capture techniques to convert live actors into avatars of themselves. So it’s hard to join the discussion around AI without feeling like we’re already involved, but that it has somehow got away from us. I picture Buster Keaton being flung into the air after grabbing hold of the back rail of a speeding streetcar in Day Dreams, a 1922 silent film. Somehow, he hauls himself on board, which is what we’re trying to do with AI.

Not that long ago, we were used to hearing that some kinds of work could never be automated: teaching, counselling, medicine, law, the visual arts, the caring professions. The qualities of empathy, taste, intuition, inspiration and perceptiveness, we thought, meant that only warm, caring, breathing humans could accomplish the challenges presented by this work. Yet here we are, witnessing as AI elbows its way into all these fields.

I feel like I was slow waking up to the issues involving copyright and AI, but back in October 2018 I said this to the Heritage Committee in Ottawa: “Some commentators argue … that research into artificial intelligence requires broad copyright exceptions allowing machine learning to ‘ingest’ large volumes of published works. [But] there is no justification for … requiring a sector that operates on very thin margins to subsidize a sector that can well afford to pay a fair price to its suppliers.”

The many copyright holders who have filed lawsuits against AI developers this year in the US and the UK seem to feel the same way.

Yet the Government of Canada is still far from getting its house in order. The Globe and Mail reported on 28 September 2023 that several large tech companies will join a voluntary code of conduct, which is intended to apply to all companies creating or managing generative AIs. The government hopes the code will work as a sort of stopgap while waiting for the provisions of the Artificial Intelligence and Data Act (AIDA) to come into force in 2025.

The code of conduct itself has eighteen points ranging across categories that include “Accountability,” “Fairness and Equity” and “Transparency,” but none of these categories include any mention of the importance of getting permission from copyright holders to use their works. The closest it comes is a point committing to publishing “a description of the types of training data used to develop the system.” I don’t expect those “types” will include “training data owned by other people and used without their permission.”

Canadian tech unicorn Shopify won’t be signing on. In response to the AI Code of Conduct, CEO Tobi Lütke is quoted as saying, “We don’t need more referees in Canada. We need more builders. Let other countries regulate while we take the more courageous path and say ‘come build here.’” When you picture the Government of Canada in this context, remember Buster Keaton.

Right about now you may be saying to yourself, “This guy thinks the internet is a series of tubes.” In fact, Alberta’s book publishers have long been early adopters of new technologies, myself included.

Many of those innovations have been important to our ability to stay in business. I’m thinking, for example, of generic coding of text, digital typesetting, digital design and layout, print-on-demand technologies, ebook publishing, digital marketing, web-based sales, electronic data interchange, generating ONIX-based metadata, digital audiobooks, accessible ebooks and audiobooks … the list goes on. As small and medium-sized shops, we have usually embraced new technologies as ways of making good work possible. AIs offer potential efficiencies in all these areas of creating works and bringing them to their audiences, and many more. I expect most of us are approaching AI with our usual optimism about new technologies.

This spring, the Writers Guild of America went on strike over a variety of issues, with AI right in the centre of their concerns. They made three AI-related demands of the producers: no scripts written by AI, no scripts based on works written by AI (novels or treatments, for example), and no using the writers’ scripts to train AI.

As I’m finishing this article, the strike has been settled, but details are still to come. It has been reported so far that studios have agreed to disclose when they provide writers with AI-generated concepts, or concepts that contain AI-generated materials. This, after 148 days of picketing, feels a lot like stepping onto a slippery slope, but at least their strike has sounded an alarm for other writers, editors, illustrators, and artists that the streetcar is speeding away.

Promoters sell AI as a revolutionary tool for human progress that will solve climate change, deliver new drugs and vaccines, optimize transportation, and take us to Mars. But we have also seen their intentions to replace many workers who really don’t need replacing: car-hire drivers, visual artists, and movie extras, for example. And we’ve seen their intentions to maximize the value extracted from underpaid human efforts—on my mind these days is the brutal irony of using without compensation the creative work of writers and publishers to replace writers and publishers.

And then there’s the important issue of B.S. Gary Marcus, professor of psychology and neural science at NYU, has said that truth and falsehood are different sides of the same game, but AI is not even in the game: it produces what the philosopher Harry Frankfurt termed “bullshit,” language indifferent to truth and falsehood alike. Think of Star Trek when they’re talking about the allure of “gold-pressed latinum”: words that sound as if they should mean something real, but in fact don’t.

As entities literally without conscience or skin in the game, AIs cannot act in good faith or in bad; they have nothing at stake; they have none of the motivations of mortality; they have no experience.

When Joni Mitchell sings in “A Case of You” that she “drew a map of Canada,” I feel like I know the map she remembers: familiar to every kid on the prairies, it was a promotion piece with chocolate bars in the four corners and bold colours distinguishing ten provinces and two territories. We thought mostly of the candy, but the idea of Canada stayed with us.

When an AI says it drew a map of Canada, it might mean that it drew a map of Canada, although maybe it didn’t draw one at all.

So, what do I conclude? It’s foolish, vain, and risky for readers, writers, and publishers to claim that artificial intelligence can never replace what we do. It’s nonsense to pretend that we will never use AI-driven tools to work more efficiently, to reach more people, or to jog our stuck imaginations.

As we’re all struggling to catch up to things as they are now, developers are using reinforcement learning techniques to take AI beyond the mimicry we see in DALL-E or ChatGPT into realms where it solves problems and pursues goals. Quantum computing will add brute strength to AIs, along with capacities that behave a lot like human intuition. Whether you believe the on-rush of advancements in AI will achieve something close to human intelligence, however, depends a lot on whether you believe, along with many AI developers, that the human brain is in essence a complex and subtle machine that can be fully understood and fully replicated. Debate among yourselves.

AI developers often collect training data by “scraping” the internet, a term I want to pause on for a moment. What else do we “scrape”? Rust, caked-on food sticking to dirty dishes, old paint, the inside of our cheek when Ancestry asks for a DNA sample, the bottom of the ocean to scoop up all living things and thrash them to death … none of these strikes me as a particularly attractive reference.

“Scrape” is an apt complement to “content”: that which is scraped. “Content,” a term popularized on the early internet, encompasses words, images, movies, sounds, the great masterpieces of art and literature, holy scriptures, the inscriptions on the Rosetta Stone, the Magna Carta, “Four Strong Winds,” Einstein’s Special Theory of Relativity, and baby talk—AI renders them all as a kind of slurry that can be stirred and reconstituted, then squirted back at humans through a variety of innovative nozzles. Yum!

When my colleague Tom Lore prompted ChatGPT to tell me about the role of AI in the publishing industry, it responded boldly, as is its style: “The publishing industry … is undergoing a profound transformation thanks to the advent of artificial intelligence (AI) .… For example, GPT-3 … can generate coherent and contextually relevant text, making it a valuable resource for publishers seeking to create content at scale.” This happened on 22 August 2023.

I know that a different prompt would have yielded a different response, but I can’t get past how bloodless this is. Digesting scraped content and producing “coherent and contextually relevant text … at scale.” As an artist’s statement, it’s a long way from Stephen Dedalus’s exultant cry at the end of A Portrait of the Artist as a Young Man by James Joyce: “Welcome, O Life! I go to encounter for the millionth time the reality of experience and to forge in the smithy of my soul the uncreated conscience of my race.”

Instead, it reminds me of the rueful observation in Alberta songwriter Joel Stretch’s “Low-hanging Fruit”: “Somebody, somewhere, is making lots of money.” (Copyright © Starpainter 2023)


Glenn Rollans is Publisher and owner of Brush Education (a higher-education publisher based in Edmonton) and owner of Freehand Books (a literary publisher based in Calgary). His experience includes serving as the Director of the University of Alberta Press, co-owner of Les Éditions Duval / Duval House Publishing (a K-12 publisher based in Edmonton), Co-director of the Business of Publishing program (University of Chicago), Director of the Banff Publishing Program (Banff Centre), Co-chair of Access Copyright, President of the Book Publishers Association of Alberta, and President of the Association of Canadian Publishers. He serves as the representative of the Canadian Copyright Institute to the Standing Committee on Copyright and Related Rights of WIPO, and on the Copyright Committee and the Copyright Policy Working Group of the International Publishers Association. He chairs the Copyright Committee and Copyright Policy Advisory Group of the Association of Canadian Publishers.