“The multitude of books, the shortness of time and the slipperiness of memory…” bemoans a surprisingly relatable Dominican monk, Vincent of Beauvais, in 1255. Our anxieties – it bears repeating – don’t change as much as we like to think. In the 13th century, Vincent aimed to organise and summarise the world’s knowledge into a single resource, an encyclopaedia he called the Great Mirror: a nifty one-man Wikipedia running to 3.25 million words. In 2019, Wikipedia is pushing 6.5 billion words – meaning that for all his prolificity, Vincent would have needed around 2,000 additional lifetimes to jot our contemporary Great Mirror down.
There’s a term for paralysis felt in the face of too much available knowledge: ‘information overload’. I’d like to tell you more, but I got distracted – by the seven hyperlinks before the first full stop on the relevant Wikipedia page:
“For the album by Alien Sex Fiend, see Information Overload (album). Information overload (also known as infobesity, infoxication, information anxiety, and information explosion) is the difficulty in understanding an issue and effectively making decisions when one has too much information about that issue.”
In New Dark Age (2018), writer James Bridle suggests the Enlightenment’s guiding mantra – that more knowledge leads to better decisions – has skewed technological progress in favour of amassing and accessing information, without our ever agreeing on a definition of ‘better’ against which to process it. “We find ourselves today connected to vast repositories of knowledge, and yet we have not yet learned how to think,” he writes. “It is on this contradiction that the idea of a new dark age turns, an age in which the value we have placed on knowledge is destroyed by the abundance of this profitable commodity, and in which we look about ourselves in search of new ways to understand the world.”
This is the landscape within which books are desired, discovered, discussed, devoured and shared. Behind every book or ebook is another recommendation. For every ‘must-read’ there’s a ‘must-also-now-read’. For every good review, you can squirrel out a bad one. The browser bookmarks breed ad infinitum, the favourites pile up. The ‘multitude of books’ has far, far outrun the ‘shortness of time’, and TL;DR (too long, didn’t read) is one of the more telling acronyms to have emerged from the online lexicon. Add to our ‘infoxication’ the busyness of modern lives and the FOMO-inducing power of social sharing and you’ll arrive at a pretty solid business case for pre-processed reads in bitesize packages.
Digestive lit bits
Content condensing is nothing new. In a Welsh cottage where family summers passed slowly, I’d turn – three damp days in – to a pile of abandoned Reader’s Digest condensed books. Mouldering and kitsch in faux leather and foil-blocked spines, they were the second-class citizens of the otherwise well-stocked bookshelf. I’d decide the originals couldn’t possibly have been any good to bear such butchery – euphemised ‘abridged’. I was a 12-year-old snob, who then devoured the lot.
Vast collections of Reader’s Digest editions haunt the corners of eBay or languish in charity shops, awaiting their fate as gastropub décor or falling victim to the rampant scalpel of an Etsy decoupager. But, in the ‘80s, Reader’s Digest was selling 10 million books a year, sating the literary appetite of 1.5 million Americans. Each volume, comprising up to five abridged novels and non-fiction books, was bought and sent by mail order. Decades before Spotify or Netflix, Reader’s Digest was packaging, curating and condensing content for loyal subscribers.
“In 700 condensations we have never had a complaint from an author.”
Barbara Morgan, Reader's Digest
In a 1987 New York Times interview, then Editor-in-Chief Barbara Morgan batted away the scoffers. “In 700 condensations we have never had a complaint from an author,” she says. “Some have even admitted that our version reads better than the original, and many have written to thank us.”
The manuscripts, contemporary work picked up pre-publication, included novels by Steinbeck and Faulkner. And supposedly, of everyone asked, only Agatha Christie refused to give her novels up. “We don’t think of everyone as a potential subscriber,” Morgan is quoted as saying, sounding not unlike a Silicon Valley start-up kid, “but we’re performing a valuable service by encouraging people to read by introducing them to a variety of books that they might never have read or even heard of.”
Blink or you’ll miss out
Some people are still getting Reader’s Digest books in the post. But, in the last decade, Morgan’s vision of publishing as a ‘service’ has outgrown its origins. Blinkist, a summary app describing itself as “big ideas in small packages”, promises “a smarter way to get smarter”. Its subscribers can access 15-minute versions (the ‘key insights’) of non-fiction books, notably quantified by time, not pages. You can choose to read the summaries, divided into digestible ‘blinks’, or listen to them, allowing you to get smarter and cook your dinner at the same time.
“…spend your book budget in the most optimal way possible.”
There’s an assumption implicit in condensation sold this way: reading is a means to an end. There’s a goal to be reached, and a summary’s the shortcut. Perhaps both the seduction of, and snobbery towards, abridgement stem from the sense that it’s allowing the reader to cheat, gifting wisdom without the legwork, a sort of crib sheet for maxing cultural capital. Blinkist’s language embraces this cost-benefit implication, where the cost is time and the benefit is knowledge, using economics-flavoured wording like “spend your book budget in the most optimal way possible.”
In a move that reveals a lot about what we choose to read in certain ways, and why, the app trades only in non-fiction. Business, wellbeing and popular science books top the most-read. The summaries are delivered second-hand, and don’t feign the author’s voice. They are a schoolfriend, hurriedly imparting the plot and themes of Othello before GCSE English, when you hadn’t done the reading. Blinkist seems designed to prep you for conference small talk, or a dinner party where everyone’s going to bring up Sapiens again. It beautifully taps into our information anxiety, and there’s no doubt it enables you to get the gist quicker. Though, once the synopsis is stockpiled and the dinner party’s over, I’m not sure I’ve found the blink that’ll tell me what it’s all for. Where was it we were trying to get to – faster? Wherever that is, 11 million subscribers are currently signed up for the journey.
Blinkist may be a digital service, but for now its editors are all human. They may not always be. In 2013 Yahoo acquired Summly, the app and automatic text summarisation algorithm developed by (then) 17-year-old Nick D’Aloisio, for $30 million. D’Aloisio’s algorithm could scan webpages and automatically spit out bulleted summaries. In 2017, researchers at US software company Salesforce published the results of their own attempt. It’s surprisingly effective. In 2019, a quick Google search will throw up multiple coding how-tos for automatic text summarisation, DIY style.
To ‘condense’ a book, Reader’s Digest would filter it through multiple editors, sometimes removing just one word at a time.
A text can be shortened in two ways. The first, ‘subtractive’, is exemplified in the Reader’s Digest approach. Text is shortened by removing words, adding nothing (to ‘condense’ a book, Reader’s Digest would filter it through multiple editors, sometimes removing just one word at a time). The second, ‘abstractive’ summarisation, rewrites the gist of a text into new and fewer words, à la Blinkist. Abstractive summaries, in particular, require sophisticated language processing and a subjective feel for significant features. The original text has to be understood holistically. Take Phoebe Waller-Bridge’s synopsis of Killing Eve – ‘murder, murder, good hair’ – as a highly abstractive example.
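The ‘subtractive’ route is simple enough to sketch in code. The toy Python below is my own illustration – it has nothing to do with Reader’s Digest’s editorial process or any commercial product – and it uses one of the crudest possible heuristics: score each sentence by the average frequency of its words across the whole text, then keep only the top scorers. Nothing is rewritten, only removed:

```python
import re
from collections import Counter

def subtractive_summary(text, max_sentences=2):
    """Shorten a text by keeping only its highest-scoring sentences.

    Nothing is rewritten: sentences are either kept verbatim or cut,
    which makes this 'subtractive' (extractive) summarisation."""
    # Naive sentence split: break after ., ! or ? followed by whitespace.
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    # Count how often each word appears across the whole text.
    freq = Counter(re.findall(r'[a-z]+', text.lower()))

    def score(sentence):
        # A sentence's score is the average frequency of its words,
        # so sentences full of the text's recurring vocabulary win.
        tokens = re.findall(r'[a-z]+', sentence.lower())
        return sum(freq[t] for t in tokens) / max(len(tokens), 1)

    # Keep the top-scoring sentences, preserving their original order.
    ranked = sorted(sentences, key=score, reverse=True)[:max_sentences]
    return ' '.join(s for s in sentences if s in ranked)
```

Real systems are far more sophisticated, but the principle is the same: shorten by selection, not paraphrase, so every surviving word is the author’s own.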
The Salesforce algorithm gets close to effective abstractive summarisation, using machine learning techniques to interpret the failure or success of its own generated words and phrases. In the MIT Technology Review, one researcher describes how “the system learns from examples of good summaries…but also employs a kind of artificial attention to the text it is ingesting and outputting. This helps ensure that it doesn’t produce too many repetitive strands of text.” To take the researcher too literally, we’re effectively workshopping ways to have our attention paid for us. Can we trust the code to root out genuine relevance, or will we always wonder what got lost in the cut? If some words are pointless enough to be routinely detected as such, shouldn’t we be writing more concisely in the first place?
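That ‘artificial attention’ is, at heart, a weighting trick. The sketch below is a bare illustration – the relevance scores are invented for the example, standing in for values a real model would learn – showing the softmax step that turns raw scores into attention weights summing to one, so some words simply matter more than others:

```python
import math

def attention_weights(scores):
    """Softmax: convert raw relevance scores into positive weights
    that sum to 1, so the model 'pays' proportionally more attention
    to higher-scoring words."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical relevance scores for four words in a sentence;
# in a trained model these would be learned, not hand-picked.
weights = attention_weights([2.0, 0.5, 0.5, 1.0])
```

The word with the highest score ends up with the largest share of attention, equal scores get equal shares, and everything still adds up to a single budget of attention to be spent.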
Salesforce see numerous uses for their tech. A press release states “improved natural language text summaries present great potential for people and businesses of all industries to boost productivity,” and they’re working towards “serving up coherent, digestible highlights that help you stay informed in a fraction of the time.” Their words wouldn’t look out of place on the homepage of Blinkist, where efficiency is emphasised, information overload diagnosed. Proposed applications span news outlets to social media marketing to optimised SEO.
The weight of words
Different machine learning techniques replicate the ways our minds act upon words. Their application shows the myriad ways words work for us. Communication, play, instruction – it’s a post-digital platitude, but the more we learn to manipulate language at one step removed, the more we learn about the mechanics of our language as it is.
“New types of language, new language games… come into existence and others become obsolete and get forgotten.”
It comes down to the weightings we give content and form, how much we judge a given book by its adjectives, alliteration or rhythm. Attempting summarisation puts us in the shoes of a translator, dancing between communicating sense and communicating style. In the case of Reader’s Digest, where nothing was ever rewritten, only excised from the original text, the words are inextricable from the essence of the work. Some must survive the journey. It makes sense to subtractively summarise fiction (if it makes sense to summarise fiction at all), where we’re as interested in the voice as in what it’s saying. For Blinkist’s roster of non-fiction, the words are less important than the ideas they express. It seems a paraphrase will do.
When it comes to language, philosopher Ludwig Wittgenstein can be counted on to state the invisible obvious, balm for the anxious information age. “There are countless kinds of use of what we call ‘symbols’, ‘words’, ‘sentences,’” he says, exasperated by his peers’ tendency to forensically analyse words in isolation, rather than language’s flexible use in life. “This multiplicity is not something fixed, given once and for all; but new types of language, new language games… come into existence and others become obsolete and get forgotten.”
In other words, as Wittgenstein might (possibly) say if he’d ever had to grapple with Google Analytics: for every SEO’ed sentence stripped bare of language’s sensuality for maximum communicative efficiency, there’s an overwrought metaphor being deployed in a literary article somewhere. The two have as much in common as a pencil and a boat, though both are made of wood.
Depending on their reasons for being, some writing can bear a cut, some can’t. I’d like to see an algorithm attempt to condense Lucy Ellmann’s Booker-nominated, 1,050-page Ducks, Newburyport (2019) – in which single sentences fill 100 pages at a time. What turns out to be more interesting is our motivation for condensing. If we could throw off the weight of words entirely, and inject Blinkist’s ‘insights’ right into our veins, would we? If the author’s voice is more disposable in non-fiction than fiction, are we in danger of losing the ability to distinguish between subjective opinion and fact?
As natural language processing in machine learning becomes more sophisticated, there’ll be plenty of new language games to get our heads around, and plenty of unexpected players. There’s also the simple truth that, while I like long books, thanks to Wikipedia I now know the plot of every horror film I’m too squeamish to see. Life is short, and sometimes I want to cut to the twist.