The Everlasting PC

As I was graduating from college, one thing became apparent: Desktops were going the way of the dinosaur. Laptops were going to replace them. Why shouldn’t they? I mean, you can actually take a laptop places. It can do everything a desktop can do, but in a portable way, right?

Well, laptops have displaced desktops as the most common form of personal computing (at least, I believe so). Yet… desktops are still around. In large numbers. And they aren’t going away. It’s likely that they never will. Why not? Because they serve a valuable purpose as work machines. The workstation I have upstairs? Laptops can’t do that. Multiple monitors, large monitors, a more workish environment that requires no set-up. Even if a laptop is more flexible, there are a number of things that are easier to do on a desktop than on a laptop. And given how cheap both are, it’s easy to have both. And so while the laptop has thrived, the desktop has remained, and it appears it will remain indefinitely. Why wouldn’t it?

With this in mind, my mind boggles any time anyone talks about the post-PC world. A few years ago it was smartphones that were going to replace PCs. Now, tablets. Okay, tablets with keyboards, maybe. So sort of laptops. There’ll be a merger. Something will surely happen to kill off the PC, right?

No. Not at all.

Just as was the case with the smartphone, the notion that we will settle on any single device is very short-sighted. Why should we? Different tasks beg for different tools. Sometimes you need to sit and work. Sometimes you want to sprawl on the sofa and write a blog post. Sometimes you need extra monitors, sometimes you want to be comfortable. Sometimes you want something you can take with you, sometimes you want something that fits in your pocket, and sometimes those things don’t matter and you want performance (and on performance, the desktop will always rule over laptops, tablets, and smartphones). This notion that we will end up settling on a single device implies a degree of scarcity that does not actually exist. Now, more than ever, we can afford a desktop and a laptop and a smartphone and a tablet. We can cut out this one or that one, and few will have all four, but there’s not much reason to believe we will all cut out the same ones.

Ironically, the thing that is going to make this easier is actually the thing that leads some to say that the future is not going to be a PC one: cloud computing. I am actually skeptical of the extent to which we will ever move completely to cloud computing, but it does make switching between devices easier. Which not only means that we can do more on our tertiary devices, but also means that we can use these devices in complement with one another.

I was first told about “cloud computing” when I was in college. It wasn’t called that yet, but went by the less marketable name of the “Dumb Terminal”: the belief that, in the future, computers wouldn’t actually be anything but terminals into larger and more powerful machines. It took fifteen years, and it’s still not happening quite like it was supposed to. Why should it happen? Our individual computers now are just as powerful as the mainframe they would have been connected to 15 years ago. Given the constant state of advancement, there’s no reason to outsource what our computer does. At least, not completely. Enough, though, to make owning lots of devices easier. As Americans, we like to own stuff.

Will Truman

Will Truman is the Editor-in-Chief of Ordinary Times. He is also on Twitter.

37 Comments

  1. I’m typing this on my 10-year-old desktop, to which I’ve added a hard drive and gone through 3 DVD burners. Of course, I could have gotten external versions of those, but that would have required a bigger desktop-replacement laptop, and I couldn’t have as easily added other upgrades to a laptop. Desktops will always have a function and a place.

    I’m really sceptical of cloud computing. It’s partially me, but I don’t want to always have to have a good, fast intertoob connection to access my stuff. I don’t think fast universal web connections are close enough. Yeah, telephones have good connectivity, but I don’t think they are there yet. Even still, it’s much simpler to have stuff on my own hard disk.

    • I’m also skeptical of cloud computing, at least as envisioned. I am a little less skeptical since joining the Android ecosphere and increasingly relying on Google to keep things synced, but I think there are limitations that will make (for instance) Chromebooks unworkable.

      Basically, we lack the Internet infrastructure. Especially in places like where I live now, where its reliability is less than rock-solid and we are metered on data. That means that using the cloud, at least on the phone, costs me money. So I’m not going to use it for anything but syncing relatively simple data or retrieving something rather specific.

      Increasingly, laptops/netbooks are not even coming with optical players/burners. Tablets don’t come with them at all. This would be another thing that tells me that we’re not going to move towards a singular device.

  2. Not sure if it’s a gen y thing, but this feels like common sense to me. Of course I’m going to have several devices and they each have their pros and cons. I will just own them all.

  3. I prefer PCs for the computing power, storage space and the monitors. At work I run a ridiculously large PC with a dual monitor setup. I keep all of my documents on my PC for quick access and then back them up to an internal server weekly. I wouldn’t trade that setup for anything. When I’m in a meeting and see upper managers squinting at their small laptop screens, and I think about doing that all of the time (they mostly work from home and don’t have docking stations), I cringe.

    At home I also prefer a PC for many of the same reasons. We have a laptop that we use for vacations but with our Android phones and tablets I don’t see much need for the laptop anymore.

    As for the cloud – it’s awesome for home. I’m a Google whore and I love the sync-ability between my PC and my phone for Gmail, Reader, etc. At work we have serious security protocols and so far we have been very reluctant to use the cloud for most of our work. We mainly use it for very low-level stuff that we are willing to risk getting out.

    • Google really has changed my thinking on the cloud. Especially when it comes to things like schedules, bookmarks, and other things Morat20 points out below.

      • The cloud is great for tasks, bookmarks, and calendar events.

        I’m not so sure that it makes sense for huge media files. I like to carry those around with me, and storage is dirt cheap. I don’t want to hit my 3GB limit on my phone streaming music that I could just as easily carry around with me.

        And my home/work internet service provider, AT&T, caps me at 150GB / month.

        The cloud makes sense for some things, and not others. But I’m still going to take responsibility for backing up the things I hope to keep.

        • I’d like to find something that works with our family media library. We have over 20K songs, and a ton of gigabytes devoted to TV shows and movies. It’s near impossible to figure out a way that everyone in the family can access the library on their own devices in a way that allows my adding an album, for example, and everyone being able to see and access it without having to KNOW that it was just added and then find and add it to the library on their device.

          I’m a little surprised all of this is still an issue.

          • Your Corporate Overlords are highly invested in not making this easy.

          • DroboShare? Don’t have it myself (I just have a Drobo), but I have eyed it a few times.

          • If you are using iTunes for your music and movies, you can just get the basic Drobo to store it all (redundantly, I might add – Drobo is kind of like RAID for dummies), then turn on iTunes home sharing to let the other computers on your network see & access the songs and movies – they don’t have to add them to their libraries, they are accessing the library of the computer hooked up to the Drobo. This is basically what I do.

  4. Troglodyte here.
    Figure we’ll be done with the big ol’ desktop eventually. Build microprocessors into everything. But that’s already happened too, hasn’t it?

    My computer should not be needed to play a movie. Give the hardware geeks a couple of years, and it won’t be. Parallel, parallel, parallel, and less heat-producing, means that you don’t need the one big box.

    My home PC has a brilliant aluminum case. Built the whole thing myself.

    • We’ll build microprocessors into everything, but a whole lot of people will have something sitting on a desk, with [a] large monitor(s), used for getting some serious work done. Also, maybe, for running all of the other things that have microprocessors (changing the thermostat from a computer) – though a laptop or possibly tablet may be able to do that job.

      • always gonna need a keyboard, until we get the EEG working better. 😉
        Also, always want large monitors, but maybe build them into walls??

        Optimal for many things is voice recognition — for a thermostat, just shout down the hall. Or some other walkie-talkie.

  5. Cloud Computing might better be called “Cloud Keeping Your Data Backed Up and Storing Your Bookmarks and Stuff”

    • In the business world, Cloud Computing means “You don’t need a data center.” Instead of having your own (in-house or outsourced) computers where you install the software your company runs on, you use Google for e-mail, Salesforce for customer data, etc. You may still have IT people, but the systems they’re managing or working with are all remote. And, because most of the Cloud software has web interfaces, your people don’t need to install and update any applications beyond a web browser.

      • Well, sorta. Some things have to remain in-house. I won’t put HIPAA data in the cloud: patient privacy. I’m not exposing that information to anyone with #root on some third-party site. The cloud might be as or more secure than something in the old server room but ultimately any security model fails once you’ve decided to expose too much of your system.

        • And ideally you don’t want so much in the cloud that, if your ‘net access is down, no one can do anything but play Minesweeper. Still, you know how business people are: overdo to the point of insanity anything that looks like it’ll cut costs.

          • During a Space Shuttle sim, sometime several years back, the email servers crashed. And crashed hard. Whatever happened took a few days to restore to rights.

            The sim was a long-sim, so it ran more than a standard shift. It was chaos. Nobody had even considered multi-day loss of email. It really hosed lines of communication internally, and to participants at other centers.

            IIRC, NASA took some lessons from this (not that I can recall an email outage ever happening during a real flight), but it’s a lesson I wonder how many businesses really consider — how much we rely on email and network access to do practically anything.

  6. I’m not very knowledgeable about computers, but I prefer my desktop to laptops, and if the desktop ever breaks down, I’ll probably buy another desktop instead of a laptop.

    When I did have a laptop, it was my short-lived, three-times-as-expensive Toshiba (ave, may it Rest In Chaos). I used it as a desktop and never took it anywhere. I imagine laptop prices have gone down from then (2005), but I still prefer my desktop and would probably continue to unless the price difference changes dramatically.

    • I find desktops to be more comfortable. The only thing I like about laptops are their portability.

      • Laptops are fine, other than screens that are too small, third-rate keyboards, and fake mice. I’m using one right now, but with a real keyboard and mouse and a 27-inch monitor attached to it.

          • I used to do that. Eventually I gave up and got good with the eraserhead. If I didn’t have the eraserhead and was stuck with a trackpad (as is the case on most non-Thinkpads), I’d probably be plugging my mouse in with more regularity.

        • Laptops are fine, so long as you aren’t running anything on them.

  7. The computer has always been a terminal in one form or another. Cloud computing is yet another manifestation of what we’ve had since the days of the earliest phone modems and CICS.

    The heyday of Microsoft’s hegemony led us down perverse alleys. Microsoft did not foresee the Internet, and its reaction to its appearance was unwise. Microsoft attempted to pervert common standards to its own ends. HTTP always supported a PUT method, and DELETE, too: I’m only surprised that it’s taken this long for “cloud” computing to take off.

    • I wonder if maybe its boosters had to get rid of the idea of Dumb Terminal. Stop thinking about how it could change everything, and start thinking of more specific things it could do better than the local workstation. The march of progress needs a rationale. When they were telling me that it was all going to be Dumb Terminal, nobody did a good job of explaining why that was preferable. Just that it was The Future.

      • The only constant in computers is that things oscillate. Before my time came the 3270 terminal, where you’d fill in a screenful of information and then hit “GO” to send it to the server. I started programming back in the days of the VT100, when all local intelligence (like full-screen editing) was simulated by the main computer sending control sequences to move the cursor around, but clever software could really give the illusion that you were in control. Next came workstations and PCs of various kinds, where the intelligence really was local, and client programs could be incredibly powerful. Then the browser, which was the 3270 model again. And the current “smart” web pages, full of JavaScript messing with the page without having to talk to the server, act very much like a fully graphical VT100.

        • Ah, Javascript. It opened the flood-gates to the horrible wastelands of the modern Internet.

          Between unnecessary javascript, flash, badly formatted and vetted ads (so many JS errors), malware, spyware, stupid-ware, ‘please install my toolbar’ ware, and the fact that Microsoft refuses to support a common standard (although I’m not sure Firefox and Chrome play along anymore either), opening a webpage — even a simple news site — can sometimes be an awful game called “Who just pegged my processor and ballooned their memory footprint? WHY DOES IT NEED TO DO SO MUCH STUFF SO I CAN READ TEXT!”

          There are days when I consider a career of travelling around the country, and whacking web designers on the head to the applause of a million angry users whose browsers keep crashing trying to see who won the game last night.

          Unsurprisingly, the websites that want to sell you stuff (besides ads) tend to work much better.

          • Heckuva good rant.

            I’m actually writing a lot of JavaScript these days, but it’s server-side (node.js), unrelated to web pages, and thus Not Evil.

          • You’re still stained by the blood of innocents.

          • It hasn’t reached “Webpages that automatically play music or embedded video” levels of seething hatred, but it’s getting close.

            “Which one of my tabs is playing that godawful song??”


          • try {
                fs.readFile(fileName, function (err, data) {
                    if (err) {
                        console.log(err);
                    } else {
                        process(data);
                    }
                });
            } catch (err) {
                console.log(err);
            }

            The catch block doesn’t catch errors in the data-processing routine, because that’s an asynchronous callback. So the innocents are getting some of their own back.
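
            A minimal sketch of the usual fix (readAndProcess and onError are names invented here for illustration; process and fileName come from the snippet above): handle both the I/O error and any processing error inside the callback itself, routing them to a single error path.

            ```javascript
            // Sketch: in callback-style Node code, errors must be handled inside
            // the callback; an outer try/catch never sees them once the work is
            // asynchronous. readAndProcess and onError are hypothetical names.
            const fs = require('fs');

            function readAndProcess(fileName, process, onError) {
              fs.readFile(fileName, function (err, data) {
                if (err) {
                  onError(err); // I/O error arrives via the callback's err argument
                  return;
                }
                try {
                  process(data); // a throw here happens long after the caller's
                } catch (e) {    // try/catch has exited, so catch it locally
                  onError(e);
                }
              });
            }
            ```

            Promises (and later async/await) exist largely to untangle exactly this split between synchronous and asynchronous error paths.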

        • Odd little corners of the computing landscape show tremors in that pendulum. I used to do a lot of Java Swing development. I taught a lot of it, too. As you probably know, there are only a few lines of code separating a Java Application from a Java Applet.

          So here’s what I used to do:

          A Java Applet can only talk to the machine which launched it, via a Socket. If this is engineered correctly, it’s all done exactly like a browser. A Thread starts up, makes a connection, sends some data, gets a response and shuts down. On the server side, a threaded ServerSocket accepts the connection, spawns a thread and handles the interaction with the caller. This scalable back end is surprisingly powerful, each client interaction like a bubble in a champagne glass. If all the transactions conform to a standard MVC pattern, even a small server can easily support thousands of threads.
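
          That accept-and-spawn loop can be sketched briefly. This is an event-driven Node analogue rather than the Java original, and the uppercase-echo protocol is invented for illustration: net.createServer hands each accepted connection to its own handler, much as the threaded ServerSocket handed each accepted Socket its own Thread.

          ```javascript
          // Event-driven analogue of the threaded ServerSocket pattern: the
          // server accepts connections and gives each one its own handler.
          // The uppercase-echo "protocol" is a stand-in for the MVC
          // transactions described above.
          const net = require('net');

          const server = net.createServer(function (socket) {
            // One handler per client, like one Thread per accepted Socket.
            socket.on('data', function (chunk) {
              socket.write(chunk.toString().toUpperCase());
            });
          });
          ```

          One event loop rather than thousands of threads, but the shape is the same: accept a connection, spawn a handler, interact, shut down.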

          The client originally wanted Applets. Took a long time to load them. The users loved the idea of the Application on their own box. So I worked out a way for the server to do sanity checks on the Application: the client would send in its own version number and the Java version they were using and the server could tell them to upgrade the Application and Java virtual machine.

          The Application could persist what they’d retrieved from the database on their local machine. I’d set it up originally as a debugging procedure. The users loved it. It wasn’t a complete copy, mind you, just the data they’d encountered. If the application couldn’t connect to the server, it would open their local copy of the database in read-only mode. The sanity checks came to include checking database metadata and on-the-fly database rebuilding. It got weirder and weirder: I’d watch the application logs: the salespeople were issuing bizarrely extensive reporting queries at night, running reports in connected mode so they didn’t have to worry about that data being available the next day on the client site.

          I’ve always loved watching users on the systems I write. It’s the same sensation every cook gets watching someone forego the pleasantries of compliments and gettin’ busy eating and asking if there’s any more in the pan.

          • Heh. The software I work on now is a GUI front end for a complex engineering backend. (Lots of Fortran and lots and lots of valuable data and expertise.) Have a yearly meeting with the big users, and each year there’s at least one presentation on “How we used your system to simulate a problem it doesn’t do directly and you never designed, and how it worked”.

            Users can be clever as well as dumb. We pay close attention to the smart ones, and turn their work-arounds into products. (Often by asking them “How would you like to do that directly, and what else would you like?”)


Comments are closed.