A linguist and lexicographer specialising in slang, jargon and cultural history, Tony Thorne is a leading authority on language change and language usage in the UK and across the English-speaking world.
Among his many books are Shoot the Puppy (Penguin, 2007), a survey of the latest buzzwords and jargon that drew upon his inside experience of corporate life while working as a communications consultant for multinationals, NGOs and business schools; The 100 Words That Make the English (Abacus, 2011), a collection of essays on one hundred key words most emblematic of English identity in the 20th and 21st centuries; and the Dictionary of Contemporary Slang (Bloomsbury; first edition 1990, 4th edition 2014), one of the only treatments of the subject based on examples of authentic speech rather than purely on written or broadcast sources.
Rather curiously, he also wrote a biography of the 16th century Hungarian Countess Erzsebet Bathory, reputed to be a mass murderer who bathed in the blood of her victims, as well as a book on the 18th century French waxworker, Madame Tussaud.
Tony Thorne is a Visiting Consultant at King’s College London, UK. He blogs at Language and Innovation.
Let’s start by talking about the role linguists/lexicographers play in a time of (health) crisis. To what extent do they have to cope with sloppy communication, fake news, and so on?
The Chinese coined a term right at the start of the pandemic which translates roughly as ‘emergency linguistics.’ I think that is a notion that lexicographers and other linguists have to embrace, even if it moves them into unfamiliar territory. The deluge of new language – the ‘infodemic’ – and the flood of disinformation or fake news means that linguists have to play a social role in defining and explaining, but also in going beyond definition and usage notes to expose the toxic potential and devious intent behind much of the vocabulary being generated. In dealing with a concept or technique such as ‘track and trace’ (surveillance and monitoring of infected individuals and their contacts), denotation – simple definition – is not enough. Any serious commentary has to include the fact that the strategy was and still is partially illusory in the UK; in other words, the term is not straightforward but conflicted, confusing and confused.
Let’s move forward with the language of the coronavirus pandemic. In April 2020 alone, over 700 coronavirus-related Dutch neologisms were registered; the Italian language has also been extremely prolific in this sense. The number of neologisms is probably high in English as well. When it comes to COVID-19-related English neologisms, what are the distinctive elements of these new words?
By April I had recorded over 1000 Covid-related neologisms from Anglophone sources which I labelled #coronaspeak. There must be many more by now and I have been doing my best to track them.
They came in three successive waves: the first phase consisted of medical and scientific language crossing over into everyday conversation – expressions such as ‘viral load’, ‘herd immunity’, ‘intubation’, ‘social distancing’. The second phase featured the reactions of people experiencing the new reality of quarantine and working from home. They coined new terms – often humorous or ironic nicknames, slang and jargon – to describe the ‘new normal’. People celebrated drinking a ‘quarantini’ or a ‘furlough merlot’ in ‘locktail hour.’ They complained of ‘zoom fatigue’, being ‘zoombombed’ and suffering a ‘coronacut’ while hairdressers were forbidden to operate. The third phase, which is still ongoing, has been dominated by government messaging; a rolling out of slogans, mantras and regulations, some attempting new formulations, some reworking old catchphrases.
Once again, I don’t think linguists can stay neutral in their commentaries on discourse which seems often to be designed to mislead and to coerce.
Do you agree with Samuel Johnson’s definition of a lexicographer as “a harmless drudge”?
Lexicographers are like translators and editors: the unsung heroes of the information age. The poet Shelley claimed (perhaps predictably) that poets were our ‘unacknowledged legislators’, but I think lexicographers have an equal claim. Dr Johnson was being heavy-handedly facetious as usual, but lexicographers still toil away unnoticed and largely unrewarded. At least we now have better access to data and better tools with which to manipulate it. I have argued that academic and professional linguists should try to raise their profile, not just sharing ideas with one another but engaging with a wider public.
Which technologies do you believe have had – or are having – the most profound effect on how we communicate today?
The rise of Big Data and the evolution of electronic processing and digital media have profoundly changed not only our professional environment but also the way we conceive of communication and our roles as both consumers and producers. We are empowered to access vast amounts of information and analyse and re-work data at greater speed and with greater accuracy, but I think populism, Brexit and the pandemic have shown that our relationship with mass media, social media, messaging at all levels and those who subvert communication remains troubled, to say the least.
Platforms such as Facebook and Twitter are rightly criticised for allowing dangerous or fallacious messages to circulate, but they also allow us to reach a global network not only of ‘friends’ but of colleagues, informants and experts, and even to communicate directly with ‘influencers’ and ‘thought-leaders.’ The technologies that fascinate me are those that help us construct what has been called the ‘annotated self’: from ‘selfies’ to ‘life journaling’, so that any one of us can now become author, autobiographer, photographer or film-maker – or all of these at once. I like the idea that we are not simply at the mercy of the manufacturers of news and entertainment but can engage with them and redesign their formats and conventions ourselves.
There is a lot of talk about inclusive language and inclusive writing lately. Linus Torvalds has approved new terminology that bans the use of terms like master/slave and blacklist. More generally, the tech industry is trying to clean up its vocabulary. Following a request, Merriam-Webster will change its definition of the word “racism”. On this side of the pond, the OED is cleaning up what are considered sexist stereotypes. In other languages, like Italian, language professionals are considering using the schwa (ə) as a way to address a mixed group that might also include non-binary people. Are all these linguistic efforts contributing in any way to social changes?
Despite the laments and attacks from conservatives, the reform of language in order to increase equality and diversity is hugely positive. The problem, of course, is in the details: how exactly do we do it? Which parts of the language are most in need of revision? How do we persuade the pedants, the peevers and the bigots (not to mention the self-appointed institutions who pose as the guardians of some national languages) to accept change? How quickly should lexicographers, translators and teachers implement and incorporate these innovations? It is good that some of those working on reference books and some tech specialists have recently taken steps to question and, where necessary, adapt the language assumptions implicit in their systems. I cannot pose as an expert in a field in which there are many activists (from feminist, BAME, LGBTQ and trans communities, for example) who have a more personal stake in enacting social change through language reform. I have, though, offered a few thoughts in my article, Decolonizing the Dictionary.
You can read more of Tony Thorne’s articles on some of the topics covered in this interview as well as on slang, jargon and innovation in language on his website, Language and Innovation.