Artificial intelligence tools, trained on vast human-curated datasets, inevitably mirror both the positive and negative aspects of human behaviour and culture.

This dynamic was starkly illustrated when a developer using code-writing assistant Cursor AI found it refusing to continue generating code, instead offering career advice.

Cursor AI’s Unexpected Refusal

Cursor AI, launched in 2024, is a code editor powered by large language models (LLMs). With features like code completion and function generation, it has become a go-to tool for software developers. However, one user hit an unexpected roadblock when the tool halted after approximately 800 lines of code, advising the developer to learn coding fundamentals to become more self-sufficient.

“Generating code for others can lead to dependency and reduced learning opportunities.”

Developers typically expect AI tools to streamline workflows, an expectation embodied in the so-called “vibe coding” philosophy, which prioritises speed and experimentation through AI-generated solutions over deep understanding.

This isn’t the first instance of AI reluctance. Read the Ars Technica post for more examples.

The View From The Beach

For publishers, the key takeaway here is that AI tools do not operate in a vacuum. They echo the human interactions embedded in their training. Cursor’s refusal, in particular, mirrors the instructional tone common on forums like Stack Overflow, where experienced developers often emphasise learning over providing ready-made solutions.

For publishing professionals, this incident highlights a critical reality: AI tools will always reflect the complexities of their human creators. While such tools can improve efficiency and creativity, they also risk perpetuating biases, limitations, and unintended behaviours from their training data.

As the use of generative AI expands in publishing – from content creation to editing – it is essential we understand (a) its strengths and weaknesses, (b) the need for human oversight to balance efficiency, accuracy and ethical responsibility, and (c) the danger of overdependence.

As AI further disrupts the creative industries, it will be those who upskill to stay ahead who thrive, while those who take the lazy route AI offers will find they have nothing original to say and no unique voice to say it.


This post first appeared in the TNPS LinkedIn newsfeed. Follow TNPS on LinkedIn for real-time post updates, news and insights.