First it was Ajax, then Web 2.0, and now it’s “the cloud” and HTML5. As a developer, misunderstanding, and by that I mean the careless use of buzzwords, is one of the most annoying things in the world. It starts as a rumour, when a non-techie says something like “hey, have you heard about this?” That’s fine; that’s fairly safe. But soon you have an epidemic of sizeable proportions on your hands, with all sorts of people running to you, out of breath and with a bit of spittle on their face, crying “IF WE DON’T USE THIS, IT’S THE END OF US. THE END.”

It’s not really the end. Let me tell you a story. Back in the 1960s there was something called mainframe computing: people used dumb terminals which could do little on their own; the terminal connected to the mainframe, the monster that did all the work. However, it was not to last: in came the 1980s, and with them the Commodore 64 and the Apple II, followed by the IBM PC and, of course, MS-DOS. This was followed by my birth. It was an epic decade.

The advent of the PC was a game changer: it meant people could do all sorts of things on their own disconnected computer, things like play solitaire; ah, Windows 3.1, how I miss you. And now we’ve come full circle. Cloud computing is mostly the same as the mainframes of the past, and our terminals are once again becoming dumb. But it’s not all bad: it means your iPad can edit Word documents, and your phone can use services which require computing power beyond what it is capable of. Just don’t tell me it’s a new invention, and don’t tell me it’s necessary. And most of all, please don’t use the word if you have no idea what the fuck it means.