Monthly Archives: January, 2014



why do we need punctuation anyhow

isnt the meaning obvious

Best-selling punctuation guru Lynne Truss tells us in Eats, Shoots & Leaves (2003) that punctuation ‘directs you how to read, in the way musical notation directs a musician how to play.’ In other words, punctuation aids the reader as well as the writer.

By definition, manuscripts were hand-written, and there was a wide variety of acceptable conventions. Imagine you were scribing away and needed to consult an in-monastery style guide. Good luck with that. You relied on custom and practice rather than any published authority. Extant manuscripts from the Dark Ages and early Middle Ages show little or no use of punctuation; other samples show haphazard, non-standard marks. Punctuation’s major reason for coming into existence was to help with reading aloud, usually in church. Monks and priests needed to know where to take a breath, even in Latin.

Through history, punctuation marks have come and gone. We used to have one called the ‘pilcrow.’ It looked like a backwards capital P (though originally it was a C with strokes through it, used to introduce a new chapter). Another widely used mark was the ‘hedera’: heart-shaped with an ivy leaf flourish on top, and used for marking paragraph breaks. Also fallen into obscurity is the point d’ironie, a reversed question mark used for rhetorical questions.

Will today’s emoticons, or revived marks like the ampersand, the ‘at’ sign or the near-ubiquitous hash sign, become lasting punctuation marks of the future? Or will they fade like the ‘virgule’ and the ‘Tironian et’?

Computers already dislike any interruption to strings of numbers and letters. They really do not like commas. An extremist view is that all punctuation will disappear. But if humans continue to read and write, how will we understand one another without some means of orthography, even if it relates to colour or typeface? A recent article in Slate magazine, discussing our endangered comma, cited a University of Michigan linguist’s view that the decreasing use of commas in everyday texts and tweets may be the result of seeking to make communications stylistically fun and closer in tone to conversation. Students have reported that the period (full stop) is being re-imagined to signify seriousness or anger, while the ellipsis is used to convey scepticism or unhappiness. The comma doesn’t even rate.

I believe the comma is truly at risk. And it’s one mark we would struggle to manage without. For as long as words have been written down we have needed something like the comma. A form of this mark dates back to Ancient Greece, though the modern punctuation mark was invented by a fifteenth-century Venetian called Aldus Manutius the Elder. Prior to that, scribes used a variety of other marks, including the virgule (also known as the ‘forward slash,’ ‘oblique,’ ‘solidus,’ ‘slope bar,’ ‘shilling mark’ or ‘diagonal’).

I would not necessarily mourn such language niceties as the so-called ‘Oxford comma,’ the final comma before the conjunction when listing items. Oh, yes, it can certainly save confusion (e.g. ‘tea, bread, butter and cake’ is less clear than ‘tea, bread and butter, and cake’) but is it essential? Nor would I grieve long for the demise of the introductory adverbial comma, which follows a word, phrase or clause at the start of a sentence. But I would mourn the loss of all commas; they have a style and grace that no dash or emoticon could replace.

The demise might be inevitable though. With increased reliance on electronic writing, people are already skipping commas in text messages, tweets and other posts to social media, as well as email, and still making themselves understood. Loss of commas might cause the odd moment of unintended hilarity (e.g. ‘Let’s eat grandma.’ or ‘After eating the dogs slept.’) but their absence would hardly be noticed in the majority of informal forms of communication.

As the line between formal and informal writing has blurred the poor comma is often associated with an academic end of the writing spectrum. For some it could be a generational shift (‘Dad the comma is so 2010!’). For others the comma could represent a social or ethnic hegemony where the democratising nature of an informal style inexorably crowds out the seemingly old-fashioned or elevated form.

I just went back and excised all commas from that last paragraph. The meaning did not suffer. Such a pity. Perhaps commas will go the way of the pilcrow or the manicule. But for now, I’ll keep using the little thing. Let us stay on our comma-toes to make sure the poor critter doesn’t go comatose.


When did ‘whitewash’ get a whitewash?


Recently Australia’s national (men’s) cricket team defeated their English counterparts in a 5-0 series clean sweep. Australian media outlets went a little crazy, raving about this historic ‘whitewash’. The English media went a little crazy for different reasons, seeking culprits. Meanwhile I started wondering about the term ‘whitewash’. When did that become a legitimate usage?

Plenty of words and phrases change, or even reverse, their meanings. One thinks of ‘hoi polloi’. This is Ancient Greek for ‘the common people’ but it has been adopted by some as a synonym for ‘the elite’ [see earlier blog Misuse or Evolution?]. One thinks of a phrase like ‘going for a song’. Originally the ‘song’ in question was an epic poem: The Faerie Queene (1590) by Edmund Spenser, dedicated to his sovereign Elizabeth I. When the queen’s treasurer, Lord Burleigh, heard that the queen would pay one hundred pounds for the work, he said: ‘All this for a song?’ In time this phrase came to mean a bargain rather than an extravagant purchase. So I assumed that a similar process of evolutionary inversion had taken place with ‘whitewash’.

The whitewash I knew was a low-cost paint, with a lime or chalk base. I’ve seen examples of English church walls that Henry VIII’s lieutenants enthusiastically whitewashed. During the Reformation, reformers painted over religious frescoes across the land, in a bid to take the capital ‘C’ out of Catholic and to remove papist Christianity from the post-Thomas More realm of former chum Henry.

Whitewash is also used on building exteriors, such as workers’ cottages: cheap, but needing regular application. I remember first reading about this in The Adventures of Tom Sawyer, by Mark Twain, as a punishment task meted out to the naughty Tom, which he turned into a triumph. There’s an extant American expression: ‘too proud to whitewash and too poor to paint.’ Whitewash can also be used on tree trunks, especially fruit trees, to prevent a condition known as ‘sun scald’. It also keeps the trunk cool, helping prevent the tree from blooming too soon.

Another use for ‘whitewash’ is in the metaphorical sense, for censorship. ‘Whitewashing’ can mean covering up unpalatable truths by corporations, politicians or the military. Many an authoritarian regime, and a few democratic ones, are guilty of whitewashing. Critics of the way in which Australian history used to be taught in schools refer to decades before the 1980s as an era of ‘whitewashing,’ when indigenous people hardly featured in the syllabus: a scholastic Terra Nullius. I’m reminded of Animal Farm, by George Orwell, in which the porcine Stalin-like figure of Napoleon seeks to whitewash the farm’s history by erasing certain animals from the record.

Versions of spin have been practised by others with a different revisionist agenda, known as ‘greenwashing’ (products packaged as environmentally friendly), ‘bluewashing’ (governments packaging humanitarian aid as having a small water footprint) and ‘pinkwashing’ (products packaged for breast-cancer awareness).

A further use of the word reflects the ‘white’ in whitewashing. One definition from Urban Dictionary cites an example of someone viewed as leaving behind or neglecting their culture and assimilating to a western culture, e.g. ‘That exchange student has really been white-washed.’ Sometimes this whitewashing favours people of Anglo-Celtic origin over others; sometimes it can work in reverse. News Corp recently reported on a move by some British politicians to downplay the role of Australian soldiers in World War One in favour of so-called developing (i.e. predominantly non-white) countries such as India, allegedly in a bid to win political and economic favour in multicultural Britain. This comes at a time of heated debate about immigration numbers. Naturally, people who object to this idea can’t help but regurgitate the phrase ‘political correctness gone mad’.

So much for chemical, historical, political and ethnic whitewashing. Where did the sporting definition of ‘whitewash’ come from? When journalists from Fleet Street and Australia wrote about the recent Ashes series, they were talking about a clean sweep of victories that left the opponent scoreless. This was a whitewash? I thought this use of the word was just not cricket.

And it’s not. It comes from baseball.

The term ‘whitewash’ arose in that great game (not invented in the USA, but that hardly matters). ‘Whitewash’ started in the nineteenth century as a description for preventing an opposing team from scoring any runs: in other words, a clean sweep. Now that’s the phrase Australians ought to be using. Otherwise, journalists might find themselves in all kinds of trouble if and when an Australian side defeats the West Indies, Pakistan or Sri Lanka in a series clean sweep. Whitewash might be a most unfortunate term.

Wet liquid water


The redundant phrase I most dislike is ‘forward planning,’ as if anyone ever engages in backward planning. But we all have our gripes. Perhaps you cringe at the phrase ‘basic fundamentals’ or ‘free gift’. Hey, if it’s a gift, why would you pay for it?

We’ve all heard the complaints about ‘ATM machine’ and ‘HIV virus’ as redundant phrases. The damn things are tautological: a collection of words meaning the same thing, like ‘hot burning fire’ or ‘frozen cold ice’. There’s a related phenomenon known as the pleonasm, which is closer to a rhetorical repetition or excess loquacious verbosity (like that). As an extreme illustration, you could refer to your spade as a ‘single-bladed garden-digging implement’.

Tautologies as a verbal tic can really annoy people. In writing, they’re just sloppy. Why write ‘added bonus’ when you mean ‘bonus’? Why describe someone as an ‘armed gunman,’ ‘bald-headed’ or ‘fellow colleague’?

To create some emotional detachment, which helps prevent me from physical pain or inflicting pain on others, I’ve been keeping a collection of these irritants. Here are a few: ‘actual facts,’ ‘assembled together,’ ‘9 AM in the morning,’ ‘disappear from sight,’ ‘end result,’ ‘cooperate together,’ and ‘foreign imports’. Do you collect such howlers? I think it’s even more satisfying than collecting clichés.

But a more important question than why we must put up with them is why people use them in the first place. Perhaps this habit arises from anxiety about being understood, or a related wish to be unambiguous, or even a form of emphasis. These are not motives to be sneered at. Rather than trying to irritate others, the speaker or writer believes he or she is making statements clearer. It’s akin to the use in non-standard English of the double negative as a form of highlighting, e.g. ‘I never done nothing’.

My greater concern is that some words have become devalued. ‘Essential’ isn’t strong enough, so ‘absolutely essential’ takes its place. Is it a form of linguistic inflation, whereby units of currency are worth less than they used to be? Why do we say ‘add up’ instead of ‘add’? Why is it necessary to ‘completely annihilate’ when ‘annihilate’ already means ‘reduce to nothing’?

Could such depreciation in word value be attributable to the hyperbole we are subjected to via advertising? For example: ‘The best part of waking up is Folgers in your cup’ – which is neither true nor even partly true, and false if you’re not a coffee drinker or you enjoy sleeping-in.

Is a pleonastic tendency related to the phenomenon of creative exaggeration used in real estate palaver, e.g. ‘beginner’s luck’ or ‘renovator’s delight’ to describe a run-down bungalow?

Even job titles are becoming inflated. Teachers become ‘Educators,’ Sales Managers become ‘Account Executives’ and a Storeman becomes an ‘Inventory Control Manager’.

In such a hyperflationary language ethos, increases in redundancy and pleonasm are not surprising (or should that be upgraded to ‘unexpectedly surprising’?). In the exaggerated word world, ‘necessary’ becomes ‘absolutely necessary,’ ‘ultimatum’ becomes ‘final ultimatum’ and ‘prospects’ become ‘future prospects’.

I give you advance warning [ouch!] of an affirmative yes [please, stop!] when you assemble together [No…!] such repeat repetitions.

With careful scrutiny you may completely eliminate current trends or else confer together with fellow classmates to reach a consensus opinion about this exact same crisis situation. You can meet together for joint collaboration on a major breakthrough which may possibly achieve a final result as never before so you make a new beginning out of past experience. You can pick and choose your advance plan and proceed ahead to a safe haven where you won’t revert back to old habits. You will no longer be in serious danger of repeating again your incorrect errors. A few unintentional mistakes might still persist but you can resist the sudden impulse of your usual custom and make a truly sincere effort to reach the sum total of your ultimate goal.

Think I feel nauseous. Need a horizontal lie-down.