Over the holidays a reader wrote in mourning for a manuscript he had placed on the Google Docs site. Because he had placed his trust totally in “the cloud,” he had not backed up the file, and when it disappeared nobody at Google could help him find it. Actually, according to my reader the company wouldn’t even respond. I figured they were on vacation.
I blogged the letter and lamented the situation in a podcast. The reader eventually got the document back because his colleague, who was working on the same manuscript, still had his document intact. But it was only after some thought that I realized that none of this was about the cloud. This was about retro computing, the kind we used to do on the mainframe with a terminal. Retro computing—in which data is stored on a centralized system rather than on your own PC and storage media—remains peculiarly attractive to many people.
The loss of data is nothing new. You can hardly expect better when you allow someone other than yourself to have ultimate control. Just recently AOL shut down its Hometown Web sites, and people complained bitterly because they didn’t get their data off the system before its abrupt end. Podrango decided to stop giving away bandwidth to podcasters and has given users until the end of the month to get their content off the site or risk losing it. In the late 1990s, as the dot-com crash gathered speed, free services started being shut down one by one, often with no warning. Despite all this, people keep coming back for more agony.
Exactly what is driving retro computing? There’s been a movement back to this form of architecture ever since the PC gained a foothold on the computing scene and resisted all efforts to remove it. Retro computing has taken many forms, from “client-server” schemes in the 1980s to the ASPs (application service providers) of the 1990s to thin-client computing to “the network is the computer” to “cloud computing.” There have been dozens of other forays into this architecture, and the results are always a dead end. Well, almost always.
What makes the whole Google Docs thing kind of silly is that fully loaded computers are more powerful and cheaper than ever. And you can get a terabyte of storage for around $100. How much cheaper can personal storage get?
But no matter how cheap personal storage becomes, people keep gravitating toward giving control to someone other than themselves. This is reflected in society as a whole, in the current financial crisis and various other scams. The Bernie Madoff $50 billion swindle is a perfect case in point. People would rather trust him with their money than trust themselves. After all, he was a professional.
So, too, with retro computing. People would rather trust Google with their documents than trust themselves. I mean, your house could burn down and take all your files with it, right? And of course, there are security issues with a PC. You could get a virus, and it could destroy your machine and its contents…and so on. Generally speaking, if you feel this way and are fearful, I can almost guarantee that you have some malware on your machine at this moment. The malware isn’t out to destroy your machine but to send out spam at night on your dime.
But I digress. I blame AOL for the entire mess we are in. AOL was the first online entity that did everything in the cloud and led us to think that upgrading online from some mainframe was somehow okay.
Well, this sure was a lot cheaper for AOL than sending out new discs when it wanted to upgrade, but the side effects were many. First of all, the method worked—and thus created a false sense of trust in the…mmm…let’s call it the proto-cloud. Week after week AOL would take over the machine to download and install something. And from what I recall, it never failed, ever.
The conditioning that we all received from AOL predated the Web and Google and the rest of it and paved the way for cloud computing and all the buzz it was to generate. AOL was also responsible for popularizing e-mail despite all that MCI Mail, CompuServe, AT&T Mail, and even the early Internet were trying to do.
Once AOL’s vision of the cloud was adopted by Microsoft, the rest was inevitable. Retro was in. Instead of powerful standalone computing, we do things the slow way—online. Instead of utilizing our 1-gigabit-per-second LANs, we go online to a retro 10/100-Mbps IP network and hope for the best.
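The bandwidth gap is easy to quantify. Here’s a back-of-the-envelope sketch in Python; the 10-megabyte document size and the ideal, overhead-free throughput are illustrative assumptions, not measurements:

```python
# Rough transfer-time comparison: local gigabit LAN vs. a retro 10-Mbps link.
# Assumes ideal throughput with no protocol overhead (illustrative only).

def transfer_seconds(size_mb: float, link_mbps: float) -> float:
    """Time to move size_mb megabytes over a link_mbps link."""
    return (size_mb * 8) / link_mbps  # megabits divided by megabits/second

doc_mb = 10                                # hypothetical document size
lan_s = transfer_seconds(doc_mb, 1000)     # 1-gigabit LAN
wan_s = transfer_seconds(doc_mb, 10)       # 10-Mbps connection

print(f"LAN: {lan_s:.2f} s, online: {wan_s:.1f} s")  # LAN: 0.08 s, online: 8.0 s
```

Same document, same math: the local network moves it a hundred times faster than the retro pipe we cheerfully settle for.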
We got suckered into thinking that this way is better. Retro.
Lulled by a false sense of security, nostalgia for the past, and years of conditioning by AOL, how could the public resist the cloud? The cloud is the future of computing. In other words, the past is the future. You’ve got the mainframe at the end of a slow network. You’ve got oversight and control turned over to someone else. Do you feel better now?
Now if someone could just come up with a batch-processing mechanism, the circle would be complete. “Submit all your work at once; we’ll take care of the rest!”
Do I think this is all bad? No, I suspect there are plenty of things that the cloud can do better than the PC—though not many examples spring to mind. Here’s the thing to think about: A giant word processor in the sky should do more than the PC, since there’s more power to be utilized. So why are the current tools less powerful? What’s the point?
To sell me on cloud computing, make the apps hyper-powerful. With Google Docs that means adding incredible features such as automated copyediting, grammar analysis, and correction. And how about adding real inter-language translation, not the rinky-dink translators Google uses to display a foreign-language Web page quickly. Grind on a document for an hour if you have to. I want a real translation. I’ll wait for it.
Now that sort of thing would make the cloud work for me. A version of Microsoft Word that’s been gutted is just silly.