
The rapid expansion of artificial intelligence (AI) built on large language model (LLM) technologies is of increasing concern to publishing industries around the world. CSIRO defines AI as “a collection of interrelated technologies used to solve problems autonomously and perform tasks to achieve defined objectives without explicit guidance from a human being”. Colorado State University (CSU) Global further notes that AI “allows machines and computer applications to mimic human intelligence, learning from experience via iterative processing and algorithmic training”.

Many of the current tools in the editor’s toolkit have emerged only in the last 20 years. While the recent development of AI and LLMs brings opportunity, it also poses a risk to the work of editors and raises new ethical issues they must face in their editorial practice. Research by Australian editors, as outlined recently in The Conversation, exposes the weaknesses of the currently available ChatGPT tool for editing.

In particular, we are concerned that:

  • To learn, these systems rely on data created by humans – data that may be collected and monetised without the creators’ permission or remuneration.
  • As AI and LLMs expand, editors could be caught up in the process of misusing intellectual property.
  • Generated content is often poorly written, biased and inaccurate, if not entirely fabricated, which puts extra pressure on editors.
  • Editors are at risk of losing work because potential clients may believe our services can be adequately replaced by AI or LLMs.

IPEd joins others in the publishing industry who are calling on the Australian Government to develop a framework that protects both authors and editors. We also acknowledge the New Zealand Government’s work with the World Economic Forum on AI governance and regulation.

The potential impact on editors

IPEd supports emerging technologies when they have been developed and are used ethically, transparently, sustainably and with appropriate remuneration to the relevant parties (authors and publishers).

Serious issues that could arise for editors who unknowingly edit AI- or LLM-generated material include:

  • being unaware of the use of fabricated data, quotes and sources
  • working on text that has been appropriated from an author without their knowledge or permission
  • being unaware that they are working with sourced material that has not been acknowledged
  • working with material that breaches privacy and data laws
  • editing material that lacks quality control measures.

Concerns that can arise when publishers use AI to create or edit material, rather than using the services of human authors and editors, include:

  • lack of contextual understanding (nuances such as tone, context, style)
  • absence of moral judgement
  • lack of cultural sensitivity or awareness of diversity and inclusion.

Other concerns arise for human authors who use AI rather than a human editor, because AI is unable to:

  • offer creative direction
  • mentor and guide content creators.

Steps to protect editors

IPEd supports the statements on AI issued by the Australian Society of Authors and the Australian Publishers Association. These call for the Australian Government to develop a framework that will help to protect the work of editors and authors and provide “authorisation, fair compensation and transparency”. IPEd endorses the Australian Government’s AI Ethics Principles, and acknowledges the New Zealand Government for sponsoring a project with the World Economic Forum that is “co-designing actionable governance frameworks for AI regulation”.

IPEd upholds quality standards in the editing profession by managing its accreditation scheme, sponsoring awards for professional excellence and advocating to raise the profile of editing and editors. IPEd also hosts professional development events and a biennial conference, and promotes the IPEd standards for editing practice.

IPEd reaffirms its commitment to supporting Australian and New Zealand editors through training, advocacy and promotion, including professional development in the use of AI technologies where these can be applied in appropriate, transparent and ethical ways. IPEd is establishing a Working Party to undertake continuing work on AI technologies as they affect editors and the editing profession.

IPEd stands in solidarity with authors and publishers in calling for the preservation of their moral and material rights over their work. We urge the Australian and New Zealand governments to address this as a matter of urgency.