Language and the Internet
The offending language constructs:
TBH I hate all those TLA’s that you can find floating around on the WWW . I 4l50 f1|\|d 4ll 7h15 733t 5p34k 4nn0y1|\|9 (I also find all this leet speak annoying), nt 2 mntn txt spk & al its abbrvs.
i hate the langauge that’s used on countless forums to where all the above offending communication methods get rolled into one and theres a complete lack off any gramatical input in a passage of text no full stops no commas no capitals no anything in fact and your left trying to figure out where thoughts stop and new ones begin and weather they really ment weather or whether because they dont seem to understand the difference especially in words like there theyre their of off to too etc
And wen gramers used it’s usually the rong gramer in the wrong place at the rong time, like, there can be, countless redundant, commas!!!! And exclamations!!!!!!! w00t!!!!! and misplaced apostroph’s.
When I encounter passages of text written like those above I tend to ignore them. They convey the impression that the author couldn’t give a damn about what they are actually trying to say. It takes serious effort to decode the meaning behind the words, and too often the meaning is some ill-considered, opinionated garbage written by a lazy, illiterate muppet who couldn’t justify their viewpoint any better than they can write or explain it.
The real problem:
The Internet has massive potential to become the vehicle that creates a common language understood by everyone on the planet. Britain spread the English language through conquest and colonisation, and it took many decades for English to become one of the most spoken tongues on Earth. The Internet could match and exceed that rate of adoption in far less time. Considering that the World Wide Web has only really existed as a publicly available entity for ten years or so, it is remarkable that it has already reached billions of people. Google indexes results for 8 billion individual web pages; the global population is only around 6 billion, so there’s a web page and a bit for every person on Earth, and the numbers are only going to increase.

It’s getting easier to get connected, and the online world is seeping into the ‘real’ world more each day. You order books online, you order your shopping, you listen to radio from other countries, you use it as an encyclopaedia, dictionary and reference tome, you use it to create social networks and keep in touch with people, you use it to bank. In ten years the world has changed massively, everyone knows what a web address is, and the Internet isn’t going to go away; it will only evolve and expand. The days of being able to live disconnected from the networked electronic world are fading fast. It’s my belief that in another two or three decades it will be near impossible to live ‘off the net’ - something somewhere in your house will be linked to the Internet, be it your house alarm, fire alarms, fridge, TV, electricity meter, or PC.
Language evolves, it always has, but the rate of evolution will be that much faster in the online world. What I really don’t want to see is a global language based on a bastardised English that has dialects. Bastardise the English, that’s fine as long as everything has a globally understood meaning, but I feel that’s not what will happen. Geeks will use words and TLAs one way and other people will use them with different meanings. Dialects won’t be based on geography, but on ‘digital class’, of which there will be many.
Why the bastardisation of the language in the first place?
txt spk evolved for a reason, which was the limited character count when sending SMS over mobile phone networks. You might think it would disappear once the restrictions were lifted, but that hasn’t turned out to be the case. The reason is simple: it’s faster to type in txt spk. You can think far faster than you can talk, you can talk far faster than you can type, and you can type far faster than you can enter text on a numerical keypad. Each one is a step down in expressing your thoughts. Each one takes longer and makes communication that much harder or less effective.
As long as producing written words takes as long as it does, abbreviations will continue, because the written word is now used for everyday conversation rather than predominantly article-based writing. Hundreds of millions of PMs and instant messages are sent worldwide every day. That’s the written word being used in a way it never has been before - attempting to get real-time communication via the written word, which is a method ill suited to the task.
With articles you can use the time it takes to write to add extra thought to your content. The objective is a well-thought-out, coherent, self-contained entity. That’s not the goal of ‘conversational’ written word, and so ‘conversational’ written words trn in2 txt spk in the rush 2 say wot u wnt rite there rite then.
The solution is either to modify the language into something which conveys more meaning in a shorter space and time, or to create a radically accelerated user input method.
- Sun, 10th Apr 2005 at 15:04 UTC
Comments
I think I understand what you are saying, at least I think I do, but surely it is too late to do anything about it, surely the ‘Genie’ is out of the bottle and it will be impossible to get people to use any other system, after all people tend to use whatever is the easiest. Don’t you think?
Yes, I’d agree. Which is why the only solution would be to design some way of making ‘proper’ English faster and easier to type. That’s one heck of a user-input problem though.
But txt spk is a quicker way of writing? You want to modify the English language into something more efficient yet you turn your nose up when it happens?
The problem is that compression requires decompression. When someone writes in txt spk it is a lot faster to write (when you get used to it - it’s just like another version of shorthand - and no I can’t write in it) but of course it requires the reader to spend extra time translating it all back.
What’s needed is a system that decodes the likes of txt spk back into normal English, so the typist can type that fast and the reader still gets what they want. This wouldn’t be too hard to implement on most forums using the swear filter. Maybe I should experiment on LAW.
I don’t doubt that there will be an Internet universal language before long. But at the same time expect there to be various sub languages the users don’t want others to understand. That’s why various languages, accents and slang exist - exclusivity.
I’m turning my nose up at txt spk because it is only more efficient one-way, and the price for writing efficiency is a loss of reading efficiency - so, in my mind, txt spk isn’t a viable answer to the problem.
You’ve got an interesting possibility with the ‘swear filter’ - that’s something I hadn’t thought of: processing strings to decode common shortcuts back into readable output. I might have to play with that myself; it’s a partial solution perhaps.
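The swear-filter idea discussed above is essentially a word-level substitution pass. A minimal sketch in Python, assuming a hand-built glossary (the entries here are illustrative, not a real txt spk dictionary; a working filter would need a far larger glossary plus rules for leet-speak digit substitutions):

```python
import re

# Illustrative glossary only - a real filter would need many more entries.
GLOSSARY = {
    "wot": "what",
    "u": "you",
    "rite": "right",
    "b4": "before",
    "gr8": "great",
    "ne1": "anyone",
}

def expand_txt_spk(message: str) -> str:
    """Replace known shortcuts with full English, matching whole
    'words' only, so e.g. the '4' inside '2005' is left untouched."""
    def substitute(match: re.Match) -> str:
        word = match.group(0)
        # Fall back to the original word when it isn't a known shortcut.
        return GLOSSARY.get(word.lower(), word)
    return re.sub(r"[A-Za-z0-9']+", substitute, message)

print(expand_txt_spk("wot u doin rite now"))
# -> what you doin right now
```

Note the one-way loss this illustrates: compression is ambiguous, so a simple lookup can mis-expand (a lone "2" might mean "to", "too", or the number two), which is exactly why the decoded output can only ever be an approximation of proper English.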
I dnt c nethin rong wiv bein a 1337 h4x0r!
*blushes* sorry…I’ll get my coat.
What did he say?
errr… ‘Rolling On the Floor Laughing’
from two entries previous to this one: “synomonos”
STOP BASTARDISING OUR BEAUTIFUL LANGUAGE! ;¬)
Matt says: I will have to remember to use Firefox’s spell checker more often! My apologies Reverend