
Publishing Industry’s Gatekeeping and AI Environmental Impacts


21@7 Issue #16, April 2026

Cath compares the environmental carbon footprint of AI use to that of human work, an angle missed in most discussions. It’s an honest look at the publishing industry’s gatekeeping around AI use and the ethics of reporting one-sided arguments.

Image: Penned words “21@7 Cath Unfiltered” emerge from black ink splashed onto a white background, with the title Publishing Industry’s Gatekeeping and AI Environmental Impacts and a photo of Catharina Steel sitting on a log in a forest, looking contemplatively outward.


Over the last five or six months, I have seen several videos and articles claiming that using generative AI platforms, for both text and images, is bad for the environment.

These articles and videos are one-sided: they either ignore the energy humans consume when performing the same tasks, or simply assume that human work is greener.

When providing information to the public, ethics demands that both sides of the argument be presented. Ignoring one part or misrepresenting another is misinformation. It serves only the goal of embarrassing and admonishing those who use AI; it is not based on facts.

If both sides aren’t presented, the person has not discharged their ethical duty to be fair and honest with their audience. They have misled an audience that believes it has all the facts needed to make an informed decision when it does not.

Below is an overall summary of aspects to consider when evaluating the use of AI compared to a human.

It is a form of gatekeeping to insist that a creator must always use human editors, and that their work should go out unedited, and potentially incoherent, if they can’t afford one. Suggesting that somebody must have the resources, such as time and money, to hire a human to edit their work or do their research is also gatekeeping.

Putting up gates stops a person from using a tool that could efficiently help them produce higher-quality work: edited, cohesive, and interesting for the viewer or reader.

A person who doesn’t have the time to do the research themselves, at a library or by finding and reading documents online, is penalized. The implication is that anyone who can’t afford to have every single article edited should put out rubbish. That would reflect poorly on their professionalism, and it is an unreasonable suggestion.

When somebody says you can’t use AI because you are damaging the environment, they assume that humans have less effect on the environment, and they miss the point that the greater impact occurs only during the training of the various AI models. Training is a one-time energy cost; each prompt afterward has a minuscule impact. Weigh that against humans performing the same tasks over and over, each time consuming energy. This is a question of scale and efficiency.

Most people don’t realize that AI’s main environmental cost comes from training, when a model ingests massive datasets for learning purposes. (I don’t support books being used illegally for training, but that is a separate subject.) It is during this phase, when an AI model is being trained for use on a platform, that the bulk of the environmental impact occurs, and it is a one-time event per model.

People arguing that AI is bad for the environment fail to discuss the reality that a human takes much longer to complete the same task.

While a person works, they consume energy: lights, computers, food. All of this has an environmental impact. An AI bot can complete the same task in seconds, resulting in a smaller overall impact on the environment.

Once the AI model has been trained, the ongoing cost to the environment is minor, which is the opposite of what these critics claim.

I believe it’s important to fully understand something before informing others about it. Without that understanding, you risk spreading false information, and creating public fear based on false information is unethical.

It’s also important to remember that humans take much longer to do certain tasks than an AI bot does, and that extra time contributes a greater environmental impact than the equivalent AI usage.

I don’t advocate for unethical use of AI, and have discussed this aspect in other articles. I’m a strong believer in doing the writing first because that is the only way to ensure that the work remains both yours and human.

Using an AI to edit has less environmental impact than using a human editor would. For articles, which are produced at a much greater rate than fiction or non-fiction books, my understanding is that using the AI bot makes both ethical and environmental sense.

Most people don’t have the money to get an editor to edit every single article or every single thing that they produce. Most people don’t have the funds to outsource somebody to create images for them for every single thing that they do.

To tell them they can’t use AI is limiting their ability to do the work they need to do for their business. As previously mentioned, this is a form of gatekeeping.

I don’t believe people realize that this is what they are doing, but that doesn’t change the fact that this is effectively what they are doing.

They rely on two main arguments for why we shouldn’t use AI, without considering how AI can be used ethically or the other side of the environmental argument.

The first argument is that if you use AI at all, you are not producing your own work. This misses the point of ethical use, where a person drafts their own article or post, or sketches their own starting image, and then gets AI assistance.

AI is a powerful, fast tool that can provide feedback, or tidy up a provided image into a higher-quality one, in a matter of seconds. Ignoring this type of use overlooks the many ways AI can be applied.

While AI can certainly be used unethically, as any tool can, it is always the human element that determines the ethical use, not the tool itself.

Failing to provide a balanced perspective on AI is a failure of intellectual integrity in presenting all the facts; it uses environmentalism as a proxy for protectionism. However, I believe most of it stems from people not fully comprehending this reality, because they have been fed false and misleading information themselves.

The question around the cost of AI on the environment and the loss of creator jobs has no easy answer. Here are some things to think about:

  • Server infrastructure embodied carbon (mining, manufacturing, shipping) amortized across model lifespan and shared workloads
  • Data center operational energy per inference versus human labor energy consumption
  • Timeline: human weeks/months of work versus AI seconds—what’s the actual carbon comparison?
  • Job displacement in creative/research fields versus job creation in AI development, training data curation, infrastructure maintenance, prompt engineering, mining rare earth elements and metals, manufacturing server components, shipping and logistics, data center construction and maintenance, hardware installation and support, e-waste management and recycling.
  • AI has limits and simply will never be able to replace the human element necessary for connected writing and art. While it can mimic emotion, it often misses the mark, so real job loss will have limits and simply push creators to do the more interesting, difficult, and enjoyable work while letting the bots manage the basic tasks.
  • Economic accessibility: who can afford human editing/research versus who can access AI tools
  • Long-term: Does AI efficiency enable more people to produce higher-quality work at scale, or does it concentrate production in fewer hands?
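The scale question raised in the bullets above can be made concrete with a rough back-of-envelope sketch. Every number below is an illustrative assumption I have chosen for the example, not a measured figure; real training costs, query volumes, and workstation power draws vary enormously, and grid carbon intensity would be needed to turn energy into actual emissions.

```python
# Back-of-envelope energy comparison: AI-assisted editing vs. human editing.
# ALL constants are illustrative assumptions, not measurements.

TRAINING_ENERGY_KWH = 1_300_000    # assumed one-time cost to train a large model
LIFETIME_QUERIES = 10_000_000_000  # assumed queries served over the model's lifespan
INFERENCE_KWH = 0.003              # assumed energy per query (inference only)

HUMAN_HOURS_PER_TASK = 3.0         # assumed time for a human to edit one article
WORKSTATION_KW = 0.15              # assumed draw of lights + computer while working


def ai_energy_per_task() -> float:
    """Per-query inference energy plus the training cost amortized
    across every query the model serves in its lifetime."""
    amortized_training = TRAINING_ENERGY_KWH / LIFETIME_QUERIES
    return INFERENCE_KWH + amortized_training


def human_energy_per_task() -> float:
    """Energy a person's workspace consumes for the task's duration."""
    return HUMAN_HOURS_PER_TASK * WORKSTATION_KW


if __name__ == "__main__":
    ai = ai_energy_per_task()
    human = human_energy_per_task()
    print(f"AI per task:      {ai:.5f} kWh")
    print(f"Human per task:   {human:.5f} kWh")
    print(f"Ratio (human/AI): {human / ai:.0f}x")
```

With these assumed numbers, the amortized training share per query is tiny next to the inference cost, which is the article’s scale argument in miniature. The point of the sketch is the structure of the comparison, not the specific ratio: swap in different assumptions and the conclusion may shift.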

This free kids’ worksheet called What’s My Footprint helps a child think like an environmental detective. It guides them to trace where everyday things come from—the food they eat, the games they play, the lights in their home—and where it all goes. By following the journey of what they consume, they’ll discover how resources, energy, and systems connect to their daily life. The focus is on curiosity and thinking.

If a child prefers to draw, give them some paper and pencils or pens so they can engage that way, or they can use the lines provided to write, whatever works best for them.

I have several topic ideas to select from, but I will see what seems relevant when I come to writing it.

To read my previous post about The Fallacy of the Negative Reframe, click here.

To read my author 21@7 with Catharina Steel, go to: Substack
