By Annette Seargent

Three speakers gave their views at an EdsQ event in August on the implications of generative artificial intelligence (AI) for editors, authors and the publishing industry.

Ben Cottier (Epoch, a research company)

Ben’s key message was that the editing industry should approach the AI future with both vigilance and optimism. AI is good at certain tasks and will keep improving. It can streamline editors’ work, but editors may also have to offer more diverse services.

Chatbots all work on the same underlying basis, even though they differ in knowledge, pricing, personality and features. Examples include OpenAI’s ChatGPT, Microsoft’s Bing AI, Google’s Bard and Anthropic’s Claude.

Chatbots are good at retaining information from their training data as they try to predict what comes next. They play the role of “helpful assistant”, based on the scene that is set for them. In this role-playing they can make errors at any time, such as giving factually incorrect replies, so careful checking is always necessary.

Chatbots are not good at planning or reasoning that involves many steps, nor at anything that requires high reliability.

Nevertheless, AI will keep improving, steadily delivering better writing and reasoning.

In conclusion, editing could become automated within a few years. Editors will not be out of work, but the nature of their work will change, possibly towards high-level guidance, management or niche services. Fortunately, AI can boost productivity and automate the “boring” parts of the work.

Emily Halloran (the Plain English Foundation)

Emily discussed research by the Plain English Foundation, which compared the results after AI editing tools and human editors edited the same executive summary. The study aimed to see how much the edits improved the document’s structure, design and expression.

The findings are captured in a white paper: Will you still need an editor if you use ChatGPT?

The key findings were:

  • AI ignores structure, but human editors improve it
  • AI doesn’t understand design, but human editors do
  • AI is OK at expression, but human editors are better
  • AI is good at grammar, but human editors are more thorough.

AI left the structure as it was, so the findings were not highlighted upfront. The human editor placed the findings early on, which is more useful for a reader or decision-maker. To improve the design, AI suggested two things: numbered headings and bullet lists. The editor suggested the same but also many other improvements, including using more white space, icons, lists and tables.

The quality of expression was measured in terms of how nuanced, consistent and radical the edits were, and in all three areas the human editor outperformed the AI. On consistency, AI replaced some words to improve the text but left many unchanged, and it replaced words inconsistently; the editor replaced far more words. AI’s grammar suggestions were also inconsistent.

Overall, the human editor demonstrated the value of editors, who bring insight into structure and design and a human understanding of how language works.

Olivia Lanchester (CEO, Australian Society of Authors)

The Australian Society of Authors (ASA) acknowledges that AI brings opportunities and efficiencies, but it has concerns about the risks and implications of AI. A core risk relates to AI training, which draws on text obtained from the internet – including digitised books whose authors have not permitted their use. The process lacks transparency: authors cannot opt out, and they receive no fair compensation. Authors also worry that AI will be used to narrate their audiobooks, translate their books or design their covers.

Short-term implications include a loss of income as paid opportunities contract, and the appearance of imitations. AI may also flood an already crowded book market. The question arises whether consumers want to buy cheap AI-generated work; if there is transparency, they can make an informed choice.

A long-term implication concerns Australian cultural output, which may suffer if writing loses its unique Australian perspectives. A future skills gap is also possible, given the potentially difficult time ahead for creators.

ASA worries about whether the human connection in creative works can be preserved. Editing and writing involve human judgement and empathy, whereas AI’s superficial plausibility can conceal bias and inaccuracies.

ASA recommends greater transparency and licensing for copyright owners. Copyright laws should be upheld and authors compensated. It wants the government to appoint a dedicated expert group and to work internationally on solutions to this global problem.


A common thread through the talks was that there is reason for both optimism and caution. AI can help editors with many tasks, but editors must not be complacent; they must be prepared to innovate as they look for ways to support writers and publishers in ensuring high-quality outcomes.