Over at Book Riot, Arvyn Cerezo takes us through the process and then explains why these systems will still recommend a book you have absolutely no interest in.

Machine learning systems called recommender systems, or recommendation systems, use data to assist users in finding new products and services … These algorithms, however, need a decent amount of data to choose a recommendation strategy in order to produce meaningful and personalized recommendations. This data may include past purchase histories, contextual data, business-related data, user profile-based information about products, or content-based information. Then, all of these are combined and analyzed using artificial intelligence models so that the recommender system can predict what similar users will do in the future.

All very clever, but…

The limitations of content-based filtering include its inability to comprehend user interests beyond simple preferences. It knows some basic stuff about me, but that’s as far as it can get. What if it recommends a racist book? What if it recommends a book that might trigger readers without some heads-up? What if it recommends a book that is problematic? The keyword is nuance, and algorithms can’t tell the difference between two books that have similar stories.
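To see why that nuance problem is baked in, here is a minimal sketch of content-based filtering in Python. The titles and tags are invented for illustration, and this is not Amazon's actual system, just the general technique Cerezo describes: the filter only sees surface metadata, so two books with identical tags are interchangeable to it, whatever else separates them.

```python
# Toy content-based recommender: each book is reduced to coarse feature tags,
# and "similarity" is just cosine similarity between tag vectors.
# Titles and tags are invented purely for illustration.
from math import sqrt

# Surface-level metadata -- the kind of signal a content-based filter sees.
BOOKS = {
    "Gothic Classic":       {"gothic": 1, "horror": 1, "classic": 1},
    "Problematic Pastiche": {"gothic": 1, "horror": 1, "classic": 1},
    "Cosy Mystery":         {"mystery": 1, "humour": 1},
}

def cosine(a: dict, b: dict) -> float:
    """Cosine similarity between two sparse tag vectors."""
    dot = sum(a[k] * b.get(k, 0) for k in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def recommend(liked: str) -> str:
    """Return the book scored most 'similar' to one the reader liked."""
    scores = {title: cosine(BOOKS[liked], tags)
              for title, tags in BOOKS.items() if title != liked}
    return max(scores, key=scores.get)

# The filter treats two identically tagged books as interchangeable,
# even if one is exactly the book the reader would never want handed to them.
print(recommend("Gothic Classic"))  # -> "Problematic Pastiche"
```

That, in miniature, is the problem: the arithmetic is sound, but the inputs carry none of the context a reader actually cares about.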

And don’t we know it? Fifteen or more years of buying books on Amazon, and it will still recommend books I would rather eat shards of glass than read.

I always figured that was just Uncle Jeff getting revenge for one of my less complimentary posts about Amazon, but it seems it’s simply that the recommendation system is as useless today as it was fifteen years ago.

Cerezo concludes:

“With all the pitfalls of algorithms — and AI in general — it seems like nothing beats book recommendations done by an actual human being. They are more accurate and more personal. Most of all, you can also find hidden gems that you really like rather than the bestsellers (and what everyone’s reading) that these machine learning systems always spit out.”

Two points arise.

First, “rather than the bestsellers (and what everyone’s reading) that these machine learning systems always spit out” is fundamental to the problem. Algorithms – especially for a commercial operation like Amazon – have the sole purpose of selling more books. They and the company do not give a flying fig about our personal preferences.

Second, however, is the self-perpetuating fallacy that humans are the solution. The simple reality is that the person in the book shop recommending books may have a very limited reading range and no idea what to recommend in the genre you want to read.

That’s always assuming they read at all. I’m still traumatised and in deep need of therapy after, twenty years back now, going into WH Smiths in the UK and, having failed to find a particular book on the shelf where it ought to be, asking the assistant at the books counter for a copy of Mary Shelley’s Frankenstein. She told me the music department was upstairs.

Head over to Book Riot to read Arvyn Cerezo’s full essay.