Steve Jobs

Just when we need more jobs, we lose Steve Jobs.

Sorry, I couldn’t resist.

I hate the iPhone passionately. I hate what it has done to smartphones. I have little use for Macs. I have little use for a computing ecosystem that doesn't allow me to build to my own specifications. Once you get past the graphical interface, Steve Jobs's vision of technology is in many ways antithetical to my own. I never appreciated Microsoft and Windows until the smartphone market showed what things might have looked like if Jobs and Apple had won.

But… what an amazing life and career. My views (outside of the PC market) are evidently in the minority. He found a way to deliver products that people did not know they wanted. There's not much I can say that people who actually appreciated what he delivered cannot say with more feeling and affection. But I will focus on one thing he did that was a godsend at the time. It is, ironically, one of his biggest failures: NeXT.

NeXT had a variant that ran as a shell over Windows 3.1. Windows 3.1's stock interface, Program Manager, was clunky and inflexible; Progman was terrible. The NeXT shell, though, let you put icons directly onto the screen: once set up, it was one click and you were there. You could also put bars on the side of the screen. All these years later, I still use something like that on my Windows XP machines (alas, Win7 won't let me do it, bastards). And it was Jobs and NeXT that gave that to me. It got me through until Windows 95 came out.

So, in addition to appreciating the man’s genius, I will raise a glass to his memory for that.

Will Truman

Will Truman is the Editor-in-Chief of Ordinary Times. He is also on Twitter.

46 Comments

  1. Once you get past the graphical interface

    Oh, and getting the record labels to sign on to digital music was a neat trick.

    My first computer was an Apple IIe. I briefly gave him that, but then realized that I wanted a Commodore.

    • Bingo. To be honest, the biggest contributions of Steve Jobs were not so much hardware or software as the concepts that he brought to the computer technology world. In his 1985 Playboy interview, he says

      “Some people are saying that we ought to put an IBM PC on every desk in America to improve productivity. It won’t work. The special incantations you have to learn this time are the “slash q-zs” and things like that. The manual for WordStar, the most popular word-processing program, is 400 pages thick. To write a novel, you have to read a novel––one that reads like a mystery to most people. They’re not going to learn slash q-z any more than they’re going to learn Morse code. That is what Macintosh is all about.”

      (I’m assuming that “blockquote” works here.)

      That is, the notion that it was not only possible but desirable for a computer to be something that didn’t require specialist training and knowledge.

      Or, as you point out, the idea that people downloading music as files was something that could be a successful business–and this in the same years that Napster was running!

      • Much of that conceptual stuff he just stole from Xerox PARC.

        Not that anybody else was going to use it at the time; nor that it wasn't brilliant to see something and know intuitively that it was very useful. But I always found the “Bill Gates stole Windows from Apple” line hilarious, because it ignored the fact that the Macintosh interface was basically lifted, kit and caboodle, from somebody else.

        While I like keyboard shortcuts and think the mouse is the number one health hazard in the modern workplace, it's definitely the case that the graphical interface flattened the learning curve for basic operations on a computer, and for the general public, that was invaluable.

        If they ask, I tell most people to just buy a Mac. Not because it “just works”… because doing anything complicated on a Mac is just as bad as doing anything complicated on any computer… but because the top 10 things 95% of people want to do on a computer are all done most easily on a Mac, and I'm tired of providing free helpdesk support 🙂

  2. It’s interesting. I have always found that computer people hate Apple, and those who aren’t computer people but use computers a lot love Apple.

    • A lot of computer people love Apple these days. Especially the iPhone and iPad and such. Very tragic. 🙂

      • No. Join us, Will. Join us. There will be no more pain, and you will be one with us. The Apple loves you.

        Join us.

        • I WILL NEVER JOIN YOU.

          (I have a Mac, it runs Windows. Only because it can’t run Linux. Well, okay, that and I still play Civ III).

          • I have a Mac, it runs Windows.

            What, precisely, is the purpose of this?

            (I once had a conversation with an Applyte who bragged that he had a Mac with triple-boot system that covered “every major OS platform.” To which I replied, “Mine has every major OS. There’s only one.”)

          • I had to buy a Mac for the department to build MacBook images: we merged three departments together, and most of one of the other two departments live and die by their Macs.

            I needed a laptop, it’s a laptop, and I couldn’t really justify buying myself an upgraded Fujitsu just because I loved my tablet PC. Ergo, I now carry a Mac.

            At least it has a decent video card!

            But for most of the tools I run, I need both a *nix environment and a Windows environment (I'm the primary Windows dude and the secondary Linux one). Cygwin makes my teeth ache, but it works. Wine makes my teeth ache worse, and it doesn't.

            So, I run Windows on a Mac so that I can run Microsoft AD tools and whatnot on their native OS, and still do command-line jiggery-pokery.

            Plus, it infuriates Mac people. I’m a terrible person, I’ve mentioned that, yes?

          • That makes sense.

            Tangent (I love tangents): I learned how to use a Mac a couple years ago, working at a very, very large software company on the outskirts of a major city in the Pacific Northwest that one would not associate with having Macs sitting around.

          • learned how to use a Mac a couple years ago, working at a very, very large software company on the outskirts of a major city in the Pacific Northwest that one would not associate with having Macs sitting around.

            Not the one that makes Office for Macs, then? 🙂

          • No comment. But assuming for a moment that you have identified the company, it wouldn’t have been a division within the company that would so transparently lend itself to Macintosh usage. That being said, it basically was a “testing for compatibility” sort of thing. Which, when you think further on it, makes sense. But it’s not the first thing you think of.

          • Pat,
            Irritating Mac nerds is a public service. Irritating those who are proud of the “Jewish inventor” is a public good.

        • You know how in some of those shows they have a cult leader who singles somebody out from the pack and yells “Heretic! Be gone with you!”*

          Yeah, that's me. Jobs just has (err, had) no interest in providing what I want, and he and the Applytes presented my wants as some sort of shortcoming.

          So I have been deemed the heretic.

          * – I’m not entirely sure where I’m getting this from, though there was a related scene in X-Files where Mulder was randomly tagged as an unbeliever.

      • I have this debate at work all the time, and “computer people” are definitely split on Apple products. I, as a full-blown computer person, fall mostly on the pro-Apple side.

        I love my iPhones (all three of them in my household), our Apple TV, iPods, and our AirPorts. These are areas where I want appliances that do what I want when I want them to, with a minimum of fuss, and I don't much care to spend time and effort hacking them to my own specifications.

        I am also blown away by the fact that our two-year-old son can navigate an iPhone or iPad on his own; that kind of design is insanely cool.

        All our PCs are Windows 7 or Linux, though, and I can't stand OS X and the current generation of Macs. On the other hand, my first PC was an Apple ][+ and my second was an original Fat Mac, both of which I still have.

        Steve Jobs probably contributed more to my career path than any other individual whom I never met, so tonight I’m raising a glass of the Good Stuff in his memory.

        • I am also blown away by the fact that our two-year-old son can navigate an iPhone or iPad on his own; that kind of design is insanely cool.

          Yeah, I was really amazed when our daughter was able to take her mom’s iPhone and use it to find her little apps and play them not long after she could walk.

          It's never what I wanted in a computer; I think I'll go to my grave wanting to be able to change and control everything the way the IBM PC and Windows have allowed me to.

          • This one always makes me shake my head.

            I get where it comes from, granted.

            Still, I have a hard time wrapping my brain around, “This thing is functional for three-year-olds, and this says something meaningful about why I, an adult, should use it.”

          • So, what, things “for adults” should intentionally be hard and obscure and difficult to use?

            “I know that this hammer would work better if the head were rigidly connected to the shaft. But that’s so simple that a baby could use it. I prefer a hammer where the head can slide along the shaft freely, and it must be held in a specific alignment or you’ll bang your thumb. That way if I ever need a smaller hammer, I just slide the head down the shaft, and bam–smaller hammer.”

          • I have a hard time wrapping my brain around, “This thing is functional for three-year-olds, and this says something meaningful about why I, an adult, should use it.”

            I think it does. Developing user interface design patterns that aren’t based on English (or any other natural language) and are so intuitive that they can be used effectively by not only untrained adults, but small children, is a significant achievement and goes back to my point about wanting my phone to be an appliance, not a toolkit.

            I want my phone/mobile computing device to perform a variety of tasks on demand without having to invest a lot of time learning, building, or customizing it, and the iOS UI enables that better than any other platform I’ve seen.

          • @ DD & DG

            I get what you guys are saying. You’re not wrong, I’m just coming at it differently.

            Let me rephrase in terms familiar to IT people, ’cause that’s what I am and this is actually what we’re talking about.

            Complexity in UI should match the complexity in what the user wants to accomplish, in a given use case. Optimally, this will scale accordingly.

            If you want to use the device in simple ways, the UI will be simple.

            If you want to use the device in very complex ways, the UI will switch abstraction.

            Why?

            Well, which is faster, on a computer with a word processor:

            Ctrl-A, Ctrl-C, Alt-Tab, one mouse click, Ctrl-V.

            Click, drag, scroll, scroll, scroll, unclick, click “Edit”, click “Copy”, unfocus window, focus other window, one mouse click in a data entry box on a web form, move mouse more, click “Edit”, click “Paste”.

            (discounting the “oh crap, I clicked in the wrong place” moments, which happen).

            *Obviously*, the first case is *enormously* better, right?

            My problem with the iPhone is that it is designed for simple things. It's designed so simply that gobs of swiping can be necessary to execute something that I'd want to do as routinely as Ctrl-A, Ctrl-C, Alt-Tab, one mouse click, Ctrl-V (there's a scripted version of that sequence sketched at the end of this comment).

            If you don’t ever do that stuff, there’s no reason to care.

            If you do that stuff every once in a while, it might not bother you.

            If you do that stuff all the time, you’ll smash an iPhone with a hammer.
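            And when you do that stuff constantly, you stop typing the keystrokes and script them. Here's a minimal Python sketch of that exact sequence as a one-call macro; the third-party pyautogui library and the mouse already sitting over the target box are both assumptions of the sketch, not anybody's shipping feature:

                import pyautogui  # third-party package that synthesizes keyboard/mouse events

                def copy_all_to_other_window():
                    """Select-all, copy, switch windows, focus, paste: one call."""
                    pyautogui.hotkey('ctrl', 'a')   # Ctrl-A: select all
                    pyautogui.hotkey('ctrl', 'c')   # Ctrl-C: copy
                    pyautogui.hotkey('alt', 'tab')  # Alt-Tab: flip to the other window
                    pyautogui.click()               # one click to focus the box (position assumed)
                    pyautogui.hotkey('ctrl', 'v')   # Ctrl-V: paste

            Bind that to a hotkey and the five-step dance becomes one keystroke. That's the scaling I mean: the cost of the operation keeps dropping as the user's fluency rises, and an all-swiping UI has no equivalent.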

          • Complexity in UI should match the complexity in what the user wants to accomplish, in a given use case. Optimally, this will scale accordingly.

            This is exactly where I think you’re wrong and why I love the iPhone — it thoroughly breaks this assumption, and you can perform enormously complex actions through a very simple UI running on a very small form-factor device.

            No, it doesn’t make a good word processing platform when compared to something with a 24″ screen and full-sized tactile keyboard. What smartphone or tablet does? Comparing a pocket-size mobile device to a desktop PC is apples and oranges. My iPhone didn’t replace any of my PCs at work or at home, nor would I want it to.

            Even so, I don’t buy your comparison since an actual gestural equivalent to your keyboard sequence would be more like:

            Triple-click, right-click, select “Copy”, click other window, right-click, select “Paste”. Often just as simple as, or simpler than, keyboard shortcuts, and definitely simpler to learn if you weren't weaned on early versions of MS Office.

            If you approach every UI like it’s just an upgraded version of a VT-100 terminal, yes, you will be flummoxed and frustrated; if you’re willing and able to look at smartphones and tablets as a new device class and discard assumptions that they should work like a screen+keyboard (or even screen+keyboard+mouse), you quickly find you can do a lot of complex things very simply without trial and error or rote memorization.

          • > Comparing a pocket-size mobile device to
            > a desktop PC is apples and oranges.

            Yes, because a pocket-sized mobile device currently has about as much power as (if not more than) the machine I'm currently using to write this comment. Oh wait…

            Which reminds me, it’s the new fiscal year, I need to order a desktop.

            The problem, Darren, is that once you decide that the pocket-sized mobile device is never going to be like the desktop, you're chopping off whole swaths of functionality that the device is actually capable of executing.

            > If you approach every UI like it’s just
            > an upgraded version of a VT-100 terminal

            That’s a fair point, but give me a little credit, eh?

            At the same time, if you approach every UI like it's something that surpasses everything that came before it, you ignore the fact that we learned stuff about UI from the VT-100.

            And from Windows 3.1/Mac OS 7.

            And from OS/2 Warp, for that matter. And dozens of others, besides.

            > you can perform enormously complex
            > actions through a very simple UI
            > running on a very small form-factor device.

            I will grant this might be the case; I don't use one.

            However, this is not the pattern of use I see other people executing. I see other people doing lots o’ swipin’ to perform what I consider to be pretty simple tasks.

            Now, maybe they don't know how to use the device either, but they've had 'em for years, and that would belie the “three-year-old can do it” idea.

          • When I hit the volume button on my phone twice, it sends the A2DP audio signal to my non-A2DP earpiece. When I hit the volume button once, it opens my media player. When I hit the volume button with my media player open, it plays. Hit it again, it pauses. I can stop/start audiobooks and listen to TV shows without ever having to take the device into my hands.

            When I hit the PROG button once, it opens up a screen of my most-used and most-recently-used apps. If I hit it twice, it opens up a list of apps I have open so I can easily switch between them. I hit an on-screen button and I can see all my apps.

            When I hit the SEND button once, it opens up the dialpad and the last few calls I’ve made/received. I hit it twice, it opens up a screen of selected contacts. Three times, my entire contacts list.

            When I hit the BACK button, it closes whatever I have open. Unless I have a browser open, in which case it sends me back to the previous page, because I want it to.

            When I hit the END button (and am not on a call), it gives me straight access to the home screen.

            My phone didn't come with these capabilities. I put them there because I wanted it to suit particular needs. Even without all of these things, I still have easy access to my apps with a click here and a click there. I can make my home screen look pretty similar to the iPhone's (more or less). I sure as hell can't make the iPhone do anything like the above, not least because there is one basic model that doesn't have so many buttons. It's simpler that way, provided that you want to use your phone in a particular way. (All of those remaps boil down to a simple lookup table; there's a sketch at the end of this comment.)

            And, of course, that is the way a lot of users want to do it. Which is why the iPhone sells well. But for people who want something scalable from simple to complex the way that I do? Forget about it. You need to use that phone in a rather particular way. The apps I use to rig my phone the way I like it wouldn't make it through the App Store. It just makes things too… complex.
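            To make that concrete: every one of those remaps is just an entry in a button-event-to-action table. A purely illustrative Python sketch, where every name is invented for the example (it's not any real remapper's API):

                # Hypothetical dispatch table mirroring the remaps described above.
                # Keys are (button, press_count); values name the action to run.
                REMAPS = {
                    ('VOLUME', 2): 'send_a2dp_audio_to_earpiece',
                    ('VOLUME', 1): 'open_media_player',             # play/pause once it's open
                    ('PROG',   1): 'show_frequent_and_recent_apps',
                    ('PROG',   2): 'show_open_app_list',
                    ('SEND',   1): 'show_dialpad_and_recent_calls',
                    ('SEND',   2): 'show_selected_contacts',
                    ('SEND',   3): 'show_all_contacts',
                    ('BACK',   1): 'close_or_page_back',            # page back when a browser is up
                    ('END',    1): 'go_to_home_screen',             # only when not on a call
                }

                def handle_button(button, presses):
                    # Anything unmapped falls through to the phone's stock behavior.
                    return REMAPS.get((button, presses), 'stock_behavior')

            A table that small buys arbitrarily rich behavior, and that's exactly the simple-to-complex scaling I can't get out of the iPhone.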

          • @ Will

            All win, that last comment.

            Note: I’m not discounting this, which you also say:

            “And, of course, that is the way a lot of users want to do it. Which is why the iPhone sells well.”

            This is a huge deal. It is a device that fits most people’s complete set of use cases, really easily. It’s a dead-on, balls-accurate example of device creation that is beautiful and perfectly suited to a very large audience of people who want to use that device a certain way.

            I’m just not that guy 🙂

          • “I'm not that guy” is different from “This thing is functional for three-year-olds, and this says something meaningful about why I, an adult, should use it.”

  3. I think of Steve Jobs as being a lot like Walt Disney. Both of them had the same kind of genius:

    They knew what they wanted when they saw it, and even more emphatically what they didn’t want. They couldn’t build it themselves (Jobs wasn’t a techie, and Disney was a mediocre animator), but they could inspire talented people to build it for them. Getting to the final product was an iterative process: They would tell the troops what they wanted, and after that version was done, figure out how to make the next one closer to the ideal. They were ruthlessly able to look at someone else’s hard work and say “Wrong!”, and keep saying that until their dream was realized. They were absolute motherfishers to work for, but the results speak for themselves.

    • We need more people like that.

      If we can cultivate them, we need to put our efforts into cultivation.

      If we can’t, we need to put our efforts into figuring out how to best get out of the way when they do show up.

      • The hard part is that there are a lot of people every bit that difficult, and only one in a thousand is worth it. We need to learn to be patient and give them a chance to prove that they’re genius motherfishers and not just run-of-the-mill motherfishers before we lose our tempers and beat the crap out of them.

        • > The hard part is that there are a lot
          > of people every bit that difficult, and
          > only one in a thousand is worth it.

          You’re being generous.

          But I concur with the sentiment.

          Note: let us imagine an alternate universe where Steve Jobs never left Apple.

          Imagine the Newton and the NeXT as major foci of Apple’s production line.

          Do you think Apple would have survived that?

          I agree basically with what Will’s saying, but the tangent is that Steve suffered just a bit from reality disassociation disorder. He wanted an iPhone fifteen years before he could have one, and this produced the Newton, which nobody actually wanted.

          Genius can get too far ahead of the curve.

          • Wow, you just put into words something I very much wanted to say (with regard to Jobs being a bit too far ahead of the curve – thank God) but unusually couldn't find the words for.

          • Lest this whole thing be misconstrued:

            When it comes to information systems, thinking too far ahead is what drives innovation.

            Trying to do things before they might be ready for prime time means you’re going to be the first one to do it when it *is* ready for prime time.

            There is a lot to be said for this. I’m glad people do it. A lot of times, I’m an early adopter when someone hits the homer.

          • Trying to do things before they might be ready for prime time means you’re going to be the first one to do it when it *is* ready for prime time.

            Or that you'll fail, but develop technology that someone else can use when prime time arrives (and yes, Xerox PARC, I am looking at you).

  4. Jobs was a marketeering bastard.
    Still, he made money the old-fashioned way — by bloody well earning it, creating something new.

    L’Chaim!

  5. Having owned an Android phone for over a year, I am about as happy as anyone has ever been to have an iPhone, because it fishing works.

  6. “Just when we need more jobs, we lose Steve Jobs. ”

    heh. I made a similar joke to my wife, this morning; the radio said “the upcoming jobs report for this month looks grim”, and I said “well of course it does, he’s dead!”

    She did not find this amusing.

  7. Incidentally, what has the iPhone done to smartphones, and why do you hate it?

    • iPhones changed the dynamics so that basic usability became the primary thing that mattered. Tailoring the user to the phone, rather than vice-versa. A lot of folks are happy with the walled garden, which is why the market turned, but I hate it. They set a standard that is in the opposite direction of what I want.

      • The biggest thing for computer users is to learn enough to recognize when failure is because they want something impossible and not because they did it wrong, or because the software was buggy.

        Most casual users give up long before they reach that point.

        Apple’s concept for smartphones is terribly annoying to people who are willing to learn the difference between “can’t” and “wrong”. But–pace the Playboy interview–Jobs decided that the user should bump up against “can’t” sooner than they bump up against “wrong”, even if that means that the boundary of “can’t” is placed further in than the boundary of “wrong”.

        • I don’t disagree with any of this (actually, I commend you on the excellent phrasing). This is why the iPhone was so wildly successful. This is why I do not like it, and why I do not like the devices that seek to emulate it.

  8. I can't even list all my intersections with Jobs, because Patrick is reading. But overall, what Jobs had that all the other geeks lacked was something called “taste”. We may not like his taste, we may not agree with it, but it was solid, it was real, and it had its own audience. The Xerox Star (which I ran) had plenty of issues, and Jobs fixed them. I had one of the first Macs (hacked into a Fat Mac) and that was my last Apple. Two things about it irritated me no end: a one-button mouse and a single floppy drive. The lack of a CLI (command-line interface, for you youngins) was also a downer, but not ruinous. Someone gave me a Lisa, but I never ran it. Bonus points if you know Lisa was Jobs's daughter's name.

    There is something to be said for a left-brain vs. right-brain device construct. Unfortunately for Jobs, he designed for a right-brain world when 90% of society is left-brained (and right-handed). Unfortunately for me, I was in my 20s before I realized I should have been left-handed my whole life. By then I had simply done what humans do best: compensate.

    I didn't like Jobs creating little sandboxes for users to play in, much preferring the II+ and those magical open slots (which my S-100 card cage had). I didn't like not being able to add my own devices (and device drivers), but I understood his gestalt of keeping things simple, and he clearly understood that 90% of his 10% audience were not to be trusted with the keys to the car anyway.

    Jobs also gave us the new, improved Pixar; he kept it alive and gave it direction. We wouldn't have Toy Story or any of the rest without him and it, and we shouldn't forget that either. As for his future vision being too far ahead of his time, no problem there. We wouldn't have the present we do without the future visionaries upon whose shoulders we stand. Whether they're all successful businessmen is more a matter of luck and temperament than vision. To me it's worth it to have the reality I wanted, regardless of whose face is on the cover of Fortune or Forbes. We definitely need more like him, now more than ever.

    • Well done, Ward. I especially like the point about taste.

      I've always thought the guy made some sleek gear, that's for certain.

    • Everyone who draws with a mouse ought to draw left-handed. In fact, it's better for artists to just get used to using the left hand, and scaling down until they can compensate for the bloody poor fine-motor coordination.

  9. And now Al Davis, who was, many, many years ago, a brilliant innovator in the realm of professional football. Maybe Jobs was fortunate in some way to go at the peak of his success, rather than being remembered as the guy who was still trying to sell smartphones in the age of telepathic implants.
