The Steve Jobs Myth – D. Jason Fleming

Steve Jobs is often held up as something of a modern hero. The man was undeniably a genius. And he did a lot of good in his quest to “change the world”.

However, he also had two problems, and one of them might do nearly as much damage to the world as he did good.

The first problem was that his genius caused people to excuse his a-hole tendencies, and he exploited that to the fullest. (He also had a massive charismatic effect on people which he used ruthlessly, the so-called “Jobs Reality Distortion Field”.)

The second problem follows from the first: People everywhere are always looking for The Easy Answer. Jobs presents two paths to worldwide fame and riches: Be a genius, or be an a-hole.

Guess which one is easier. Take as many guesses as you need. I’ll wait.

Why, yes, you’re right.

Witness, as merely the most recent and most egregious example, Miss Elizabeth Holmes.

This Vanity Fair article (archived version) lays out, in fascinating detail, how Holmes followed the Jobs “Be a jerk to everybody, all the time, and as opaque as possible” playbook virtually line by line, except that her company, Theranos, wasn’t founded on genius. It was founded on the illusion of genius. An illusion made easier to maintain by the precedent that Steve Jobs set.

She started the company as a 19-year-old college dropout, and rode her constructed legend until Theranos was a $9 billion empire, before the curtain was pulled back and it all collapsed around her.

Holmes wore black turtlenecks every day, “a homogeneity that she had borrowed from her idol, the late Steve Jobs.”

Holmes had learned a lot from Jobs. Like Apple, Theranos was secretive, even internally. Just as Jobs had famously insisted at 1 Infinite Loop, 10 minutes away, that departments were generally siloed, Holmes largely forbade her employees from communicating with one another about what they were working on — a culture that resulted in a rare form of executive omniscience. At Theranos, Holmes was founder, C.E.O., and chairwoman. There wasn’t a decision — from the number of American flags framed in the company’s hallway (they are ubiquitous) to the compensation of each new hire — that didn’t cross her desk.

And like Jobs, crucially, Holmes also paid indefatigable attention to her company’s story, its “narrative.” Theranos was not simply endeavoring to make a product that sold off the shelves and lined investors’ pockets; rather, it was attempting something far more poignant. In interviews, Holmes reiterated that Theranos’s proprietary technology could take a pinprick’s worth of blood, extracted from the tip of a finger, instead of intravenously, and test for hundreds of diseases — a remarkable innovation that was going to save millions of lives and, in a phrase she often repeated, “change the world.”

That “change the world” riff is directly from Jobs, and if you’ve read Walter Isaacson’s bio, you know that he used it to seduce a lot of people into doing his bidding.

And the rest of it shows that she studied Jobs very carefully. And learned how to manipulate people, individually and en masse, by selling them a vision. Like any good sociopath, she learned the form in great detail, and eschewed the substance. (No, I’m not saying she is a sociopath. Only that she apparently operated like one. The whole thing was a confidence game, one way or another.)

And I’m afraid that Jobs’s malicious influence is much larger than just Miss Holmes and Theranos.

Have you noticed how programs and apps and websites have taken to “improving” by taking away functions you liked and used every day?

Now, everybody thinks they know what you want better than you do. With the extra added side benefit of “molding” your actions to conform to what they think is preferable.

Again, Jobs was very, very good at actually determining what people really wanted, versus what they held onto simply because it was familiar. He killed the floppy disk drive. He veered away from power-on buttons. He got lots of changes through that seemed huge at the time, but in hindsight are natural.

And because of his precedent, in addition to (at least) fifty-plus years of marketing “wisdom” that treats customers as mindless sheep, everybody now treats you, the user, as a “moist robot” who does not think, but merely needs the proper stimulus to behave the way they want you to.

Steve Jobs was the outlyingest outlier there is: He was a jerk, but he actually was a genius, and he actually did want to change the world, and he actually was very good at figuring out what people would want before they even knew they wanted it.

The foundation on which the Cult of Jobs was built was, wonder of wonders, actually pretty solid.

I would bet that not one single emulator of his has the same solid basis on which to stand. They all learned how to imitate him, to give the impression of integrity as it is currently misunderstood, thanks in part to Jobs’s antics. But I would be surprised if any copied his substance. Because genius cannot be faked. Only the appearance of it can.

159 responses to “The Steve Jobs Myth – D. Jason Fleming”

  1. Let’s not make Jobs out to have been infallible. After he was booted out of Apple, he started NeXT. The workstation they made was almost literally unsalable. He made so many bad decisions in its design that you couldn’t give the things away.

    • True enough, I was glossing. But it is interesting to note that his hubris remained unchanged, and he didn’t actually meet nemesis because of it, at least professionally. He started by having failures, but learned from them.

    • NeXT Step did have a decent life and descendants are in many places even if the hardware wasn’t as innovative as he claimed, although I’d love an old cube to gut and build a decent desktop inside.

    • Yes, and the Darth Vader of the mythology, John Sculley, was correct, just early, when he launched “the brick”, aka the Apple Newton, as the first PDA. And he oversaw the expansion of the Macintosh market as well as the first PowerBooks, growing the company well beyond the Apple II days.

      And I know multiple people who had face to face interactions with Jobs – he was a total jerk.

      • I worked at Apple for five years, mostly during Jobs’ first time there. Came just about *this close* to being fired by him, except he didn’t know who had asked a question on the other side of the Herman Miller partition and I didn’t answer his “who said that?”

        Folks I know who were there when he returned to Apple figured he grew up a lot during the NeXT period. At least some of the rough edges had been sanded down a little.

    • As any good engineer or scientist knows, you learn more from your failures than you do from your successes. Thus the only true way to know the worth of a product is to test to failure.
      Jobs and Apple came out with a whole bunch of innovative dogs, often at the bleeding edge of technology if not pushing the slightest bit beyond it. And when so many of those failed he and the company studied on why they did and learned from it.

    • I temped at a place with NeXTs. They were lovable, at least for what we were doing. But way too expensive.

    • Turbo Beholder (@TBeholder)

      Hey, at least NeXTSTEP found the right way to glory: “rip off the *NIX crowd”. 😀

      • To be honest, I never knew anyone from the *NIX crowd who fell for the NeXT boxes. We were all running Sun, AIX, or SVR4. With the occasional Silicon Graphics system for those looking for high-end graphic goodness.

  2. I wonder how much cash she has hidden in secret overseas bank accounts… With the supposed net worth of her company at its peak, the temptation to sell small pieces and squirrel away cash would have been strong.

  3. The second problem follows from the first: People everywhere are always looking for The Easy Answer.

    We are also conditioned, by evolution and culture, to look for The One, a Leader.

    For a variety of reasons the American presidency has become such a position, when it ought not be. The president is not a leader, is not a Captain Kirk boldly going. The position is essentially a CEO, a staff management job, and we need to look at it as such, evaluating a candidate’s skill set for managerial ability — the ability to keep a team rowing in unison — and communication capability, the ability to communicate policy goals and agendas in ways that motivate people to pursue those.


    An ordinary person who understands his limits is what the presidency requires, not an ubermensch who micromanages the world, knows the details of everything and snaps orders to all underlings.

    • Ummmm … apparently I should have ticked the little box at bottom left.

    • I think it’s great how Jeffrey Brown (the PBS interviewer) sits there squirming as he talks positives about Ronald Reagan. You can almost smell his discomfort throughout, except when he gets to induce Professor Brands to shoot some zingers, as he does by pointing to the SNL sketch. But Reagan is safely dead, and thus eligible for PBS praise, though transparently faint, while the current Republicans are all obviously Hitler.

      And I find it interesting that Prof. Brands thinks the Tea Party folks would be uncomfortable with Reagan the pragmatist – if they could get someone who achieved 80% of what he wanted, but shared their underlying philosophy, I think most would be ecstatic.

      One of the things that’s been the most instructive about being around this long is that I remember the vitriol and hatred underlying all of the press coverage of Ronald Reagan. They absolutely loathed him. But now those same folks are all over themselves to praise him whenever they get a chance.

      I suppose that’s why when you enter the fourth estate they make you take the hypocratic oath.

      • I have noticed nothing so raises the Left’s esteem for a conservative as death. Only a deplorable cynic would notice that this elevation is generally expressed in comparison to contemporary conservatives, as when Hillary recently asked what Ronald Reagan would say about Trump.

        Funny, she (and her fellow Proglodytes) showed little concern for Reagan’s opinions while he was alive. Now that he (and Nancy) are dead they feel free to put words in his mouth or (very selectively) quote him.

        The apex of this elevation occurs when they declare that, if (for example) Lincoln were alive today he would be a Democrat. Apparently they can conceive no greater achievement than agreeing with them.

        It is clear that in the eyes of the Left, the only good conservatives are dead conservatives.

        • if (for example) Lincoln were alive today he would be a Democrat

          The fact that this assertion would absolutely astonish Lincoln – him being one of the founding members of the Republican Party at a time when the Democratic Party that exists today already existed, its membership in fact available to him yet not chosen – somehow escapes their notice.

        • It does rather nicely inoculate them against anybody pointing out how different they are from just ten years ago, never mind 40– they can just pull a “oh, you’re always claiming that, you guys are just radicals now, here I’ll misrepresent Reagan, nay nay we’re unchanging and perfect.”

          • Don’t forget, “The parties traded places under Nixon. His Southern Strategy took those bad old southern Democrats and made them into today’s Republicans, while all the good Republicans became Democrats.” It’s not even a laughable claim, it’s just stupid, but they keep chanting it and chanting it.

            • I haven’t actually heard it given a specific date before!

            • BobtheRegisterredFool

              Always have available for citation a) the xkcd infographic on federal legislative faction history b) the political history of at least one state, which supports the case for continuity in that state’s parties.

              • I’ve tried finding that xkcd graphic several times and my search-fu isn’t up to it. Do you have a link?

                Sort of on topic of the post, I didn’t think the Mac I had to use at work was all that intuitive and strongly disliked the way I couldn’t do certain batch jobs. My husband had to use a NeXT at one point in his career and reported that it may have looked different but behaved like a Mac (or whatever the Apple computer of the era was). And he didn’t think much of it or Macs. His most recent encounter with adults trying to use an iPod for something to do with training class didn’t leave him impressed with that gadget’s intuitiveness either.

      • I think the Left’s idea of “Good Republicans” is like Phil Sheridan’s idea of “Good Indians.”

  4. Bill Gates: Here’s billions to vaccinate the poor, saving them from needless death and suffering!
    Steve Jobs: I’d rather die of cancer than use conventional medicine!

  5. And if you remove things from the user’s sticky little paws, they can’t screw things up as much, which makes things more reliable. Of course, reliable in performing the tasks Your Betters have determined you want. Which is why I turn the air blue when I have to test on Apple platforms. Yes I DO need to muck with the registry/domain change/use nonbranded peripherals, you stupid fruit-powered device!

    • I’ve been a devout fan-girl of Apple since the IIe. And then along came Yosemite. I’m thinking hard about learning Linux and going open-source from now on.

      • I went Linux in ’05, after being Mac my entire computing life up to that point. It’s not terribly hard.

        • I’ve ended up using both; they’re roughly complementary for me.

        • Linux since ’96. As you say, if you’ve used OS X in any depth, Linux shouldn’t be very difficult. The biggest differences will probably be the filesystem directory structures and the available command line commands.

          • My only remaining Windows portions are music production which I don’t see changing soon (although recent changes there may move me to Mac the next time the studio upgrades).

          • That’s about the time I switched from OS/2 to Linux.

            I no longer try to help people with Linux because the way I’ve always done things is so totally not the way you’re supposed to do them now.

      • Download and install VirtualBox on your Mac. Download a Linux distribution ISO and install it in a virtual machine. You can run it in a window, or zoom it fullscreen and use the Mac as if it had booted Linux natively. (There’s a rough script sketch at the end of this comment.)

        There are many different flavors of Linux, most of them forks of Debian or Fedora. Among those forks, the main differences are which window manager is best supported – Gnome, KDE, LXDE, Enlightenment… download half a dozen distributions and set up VMs. Push all the buttons and tweak all the fiddlies until they’re geshtupfed, blow them away, and make new ones.

        If you find something you like, great. If not… your only investment was some entertainment time.

        The Apple user interface and apps were designed all of a piece. KDE, maybe not as tightly integrated, but close. Gnome is a bunch of separate programs flying mostly in the same direction. And there are a dozen others you can try… and you can run them *all at once*, switching between virtual desktop sessions.

        I think of Linux as the “Mister Potato Head” of operating systems; the Linux kernel is the potato, and you just add the “stuff” you want until you’re happy.
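
        For what it’s worth, the whole try-a-distro loop above can be scripted. Here’s a minimal sketch in Python, assuming VirtualBox is installed and VBoxManage is on your PATH; the VM name, ISO filename, disk path, and sizes are all placeholders to substitute with your own:

          # Sketch: spin up a throwaway Linux VM via VBoxManage, per the advice above.
          # Assumes VirtualBox is installed; every name, path, and size is a placeholder.
          import subprocess

          def vbox(*args):
              # Run a single VBoxManage subcommand; raise if it exits nonzero.
              subprocess.run(["VBoxManage", *args], check=True)

          def make_test_vm(name, iso_path, disk_path, mem_mb=2048, disk_mb=20000):
              vbox("createvm", "--name", name, "--ostype", "Linux_64", "--register")
              vbox("modifyvm", name, "--memory", str(mem_mb), "--cpus", "2")
              vbox("createhd", "--filename", disk_path, "--size", str(disk_mb))
              vbox("storagectl", name, "--name", "SATA", "--add", "sata")
              vbox("storageattach", name, "--storagectl", "SATA", "--port", "0",
                   "--device", "0", "--type", "hdd", "--medium", disk_path)
              vbox("storagectl", name, "--name", "IDE", "--add", "ide")
              vbox("storageattach", name, "--storagectl", "IDE", "--port", "0",
                   "--device", "0", "--type", "dvddrive", "--medium", iso_path)
              vbox("startvm", name)  # boots from the ISO; install or run live

          # e.g.: make_test_vm("mint-test", "linuxmint.iso", "mint-test.vdi")

        When you’re done pushing the buttons and tweaking the fiddlies, “VBoxManage unregistervm mint-test --delete” blows the whole thing away (the ISO survives), and you’re ready for the next distro.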

        • If you don’t want to mess with Linux too much, Ubuntu isn’t a bad choice. It pretty much sets itself up and has most things you need. Linux in a VM is a great idea! I use it that way myself. Installing on the hard drive does interesting things to your Master Boot Record.

          I’m noticing this ‘We know what you want better than you do’ trend in the new Warcraft release. There are some cool new features but Blizzard’s focus seems to be to force users to do content (enter PvP zones for recipes, crafting requires dungeon runs).

          • There are some cool new features but Blizzard’s focus seems to be to force users to do content (enter PvP zones for recipes, crafting requires dungeon runs).

            Argh, I hate it when they do that; most annoying part of the first several expansions.
            Not happy to hear they’re “going back” to THAT aspect.

          • If you don’t want to mess with Linux too much, Ubuntu isn’t a bad choice.

            Six years ago (or so) I would have agreed with you 100%. But now I’m afraid that Ubuntu IS a bad choice, because they’ve made terrible UI design decisions recently. For anyone who wants to just use Linux without messing with it too much, I would strongly recommend Linux Mint, which is basically Ubuntu with the terrible UI decisions fixed. I recently set up Mint for my father when his Windows 7 installation decided to give up the ghost (and be unrepairable even with a reformat and reinstall, which was truly weird). And though he’s had to ask me several questions in the past month, mostly along the lines of, “How do I do (task that he knew how to do in Windows) in Linux Mint?”, he’s mostly been able to just get his work done. (It didn’t hurt that even on Windows, he had been using LibreOffice for several years, so most of his workflow needed almost no change).

            But my point is that I would NOT have given him Ubuntu. They made the same design mistake with their “Unity” UI that Microsoft made with Windows 8: trying to make a single UI design for both mouse/keyboard interactions and for touchscreens (tablets, phones). BAD IDEA. That’s the world’s third greatest blunder, JUST behind going up against a Sicilian when death is on the line.

            So just skip Ubuntu and go straight to Mint; you’ll be happier. (I recommend you install the “Cinnamon” version of Mint, unless you know for a fact that you want MATE or KDE.)

            • I have used neither Unity nor Windows 8, but I DO have a touch screen on my laptop that I don’t use, because Debian seems to think it’s just a funny mouse (it’s particularly weird when I have an extra monitor, and the touch screen “expands” to cover both monitor and laptop screen)…and I HAVE plugged in a mouse to my Android tablet, which treats it as a funny way to touch the screen.

              I REALLY wish operating systems would simply design interfaces that act like they are supposed to: touch interfaces for touch screens, mouse interfaces for mice, etc, rather than try to have these devices act like something they aren’t!

        • It’s probably worth mentioning that Window Maker is “an X11 window manager originally designed to provide integration support for the GNUstep Desktop Environment. In every way possible, it reproduces the elegant look and feel of the NEXTSTEP user interface.” I ran it for a while and tried very hard to like it, unsuccessfully. YMMV.

          https://windowmaker.org/

      • I liked the ][+ and the //e (still have one and yes, it runs, fwiw) but switched from Windows to Linux years and years ago. For a while I did try using $HOUSEMATE’s old Mac laptop and found it frustrating and infuriating. There were things of the Apple/Jobs design philosophy so baked in that there was no way of changing them – and those were often precisely the things I wanted to change – while on the other platforms (even Windows… even 3.1) I could. I’ve not used new-ish Apple products regularly since. Had enough.

        As for Jobs, well I suppose he did have some insights, but he often had them far too early, with disappointing results. Woz kept him from killing or crippling Apple at least once, probably more than that. (“We only need two expansion slots. Nobody will run more than a modem and a disk drive.” “Technical reason: eight slots will make the design easier to work with.”) The idea of doing away with interconnecting wires is a neat one that for now only almost works. Yes, some peripherals (printers) can be on wifi or otherwise networked, and there is bluetooth – which connects up to a whopping two devices, making it rather limited.

        • In the NeXT days, he insisted that read/write optical drives were The Future, and that this should be the primary storage device for the cube. His vision of a classroom computer was every student walking in, booting up their own copy of the OS from a personally-owned optical disc, shutting it down at the end of class, and carrying all their data around in their backpacks.

          Unfortunately, the bleeding-edge single-sided optical drive he chose was incredibly slow and flaky, and took forever to boot the OS, especially with the limited memory he also insisted on. It also made every user a Unix sysadmin, and their attempts to hide that under the hood were “charmingly inept”.

          The nicest thing I can say about the original cube is that it was the best damn networked 400dpi PostScript printer in the world for years. We hated giving it up.

          (and, yes, when Jobs would visit campus to sell our department on his vision, he was the finest used-car salesman in the world, but you could not change his mind without high explosives; central management of labs just Did Not Compute for him, even in the days where we just connected the whole campus directly to the Internet, without the concept of firewalls)

          -j

    • I found a metaphor that seems to work to most folks’ satisfaction– Apple stuff is an automatic, while Windows and Android are manual transmission.

      It kinda glosses over that you can get at least some automatics to drive much better than programmed, and that shifting is incredibly simpler than what you do on a computer, but the idea seems sound.

    • Airbus style computing

  6. He veered away from power-on buttons.

    Nit: If you mean hard power switches (no voltage present), yes; otherwise, no, as far as I know there are still “soft” power buttons (my certifications are several years out of date, but I haven’t read of any changes to the power buttons). In other words, there is a voltage on the motherboard from the power supply whenever the computer is plugged in. This speeds up power-up, I guess.

    • As far as my peripheral point of view permits, I think you’ve captured the Jobs effect. He was brilliant; there was a substantial building behind the facade.

      • Yes he was. It’s just too bad he was so invested in flaunting his superiority, and acting like it meant the rules didn’t apply to him. Wouldn’t have taken much effort to simply be a decent person, and then people would be following that example, instead of what we’re now enduring.

        • Rules—like his trick with his car license plates; California would let you go months (six?) without buying a license plate. Jobs would trade cars at six months to avoid buying plates. I think California has closed the “Jobs loophole.”

          Power Buttons: I’m using a Mac now; it has a power button—but it’s in the back out of sight. The iMacs at work have power buttons that are in the back, and the iMacs have been that way for over ten years. Now, most of the towers going back to the G3 tower (the bluish (or teal?) rectangular prism, the same basic shape that was kept and enlarged all the way until the last change resulted in the small column form factor of the current Mac Pro) had fairly obvious power buttons on their front panels. The exception was the G4 Cube (see http://lowendmac.com/2000/power-mac-g4-cube/); its power button was not a mechanical button but a touch button (a static electric effect, I believe). That’s very slick, in keeping with the stylish nature of the Cube. However, I know of at least one person who ruined theirs by accidentally touching the Cube’s power button while doing a firmware upgrade, corrupting the firmware. A new logic board would have been required to fix it, and the last I heard of it, the 400 MHz G4 Cube wasn’t worth that much money.

          What I was told that Steve Jobs hated was fans. The original iMac (Bondi, 233 MHz G3, tray-loading CD) had a fan; Jobs killed it on the next major model (slot loading CD drive).

            • But those aren’t real power buttons, they are soft power buttons.

            • So Jobs is the bastage I am cussing about removing the kill switch on my PC every time I have to crawl under the desk and unplug it because it froze up?

              Yeah it has a “power” button on the front, but that only works if the computer isn’t froze up. A power button ought to be like a flipping toggle switch, it interrupts the flow of POWER and doesn’t allow any power to reach the equipment it is controlling.

              • There’s supposed to be a pure-electrical circuit that cuts all power to the mainboard when you hold down that powerish key for N seconds, based on electronics completely separate from any and all processors or clock signals or anything else on the mainboard.

                I’ve hit freezes that only responded to the power-plug-pull on desktops, but I’ve not yet (knock wood) had that happen on a laptop with internal battery power supply… um, wait, on further reflection I did have to kill an old laptop once in a lab by pulling both the battery and external plug. OK then, so much for knocking on wood.

              • Pressing and holding the power button doesn’t work? 😕

    • I’m going off a two-year-old memory of Isaacson’s bio, but my memory is that he made a big deal about not having power buttons on Apple devices, and that Isaacson queried him about it near the end, and Jobs replied with something about not liking them because they reminded him of his own mortality. The details might be wrong, but I’m pretty sure I’ve got the gist of it right.

  7. BobtheRegisterredFool

    Wait, he’s, specifically, the mother fucker that inspired all that ‘change the world’ crap?

    He died too clean a death.

    • That’s unfair. He had a lot of hippie-dippie ideals, but he did, in fact, change the world. For the better. More than once.

      • BobtheRegisterredFool

        I’d already taken such a dislike to him as to suspect that pancreatic cancer was a just end.

        I am skeptical that his achievements are significant enough to matter that much, as I calculate the world.

        Enough to offset a fad among possible or young technologists that directs their energies to wasteful and destructive ends? I doubt it.

      • Depends on your definition of better.

  8. Next, someone will write an item on the Mac myth.

    Note: not me, it won’t be pretty.

  9. I guess I’m not sold on Jobs as much as most people. Intelligent? Yes. Good salesman? Yes. Genius? No.

    Granted I haven’t delved into his actual work much, but from everything people have told me he didn’t actually do the engineering work. He seemed more like a cult of personality.

    • He didn’t do the engineering but I think his ability to see how people would naturally use a category of devices qualifies as genius. Go back and read the response to the iPad when he first showed it. It was panned because he wasn’t doing anything a laptop couldn’t do. Now see how many tablets are in the world. He knew better than most about the potential in the tablet.

      If one definition of genius, one I like to use, is recognizing something no one saw before, but which, once you see it and explain it, everyone thinks is obvious (say, Kerberos authentication), then Jobs had it, at least in terms of matching technology products to how people would use them.

    • And I didn’t say his genius was engineering. Jobs’s genius was more abstract than that.

      A good example from the Isaacson biography: Isaacson notes, at one point, that he took an iPad on vacation to a Central or South American country, and at one point let a local child play with it. The kid, with no English, possibly no literacy, and no previous experience with computers, took to it instantly, had fun, did several different things, and at no point was puzzled or frustrated with it, and also didn’t break any of the software by making a wrong choice or doing something it didn’t expect.

      That’s because of Jobs’s obsession not with engineering, but with getting his engineers and programmers to deliver things that anybody can use, and intuitively. I’m sorry, but that’s a very rare kind of genius.

      The iPod interface is another example. If you remember what every other mp3 player was like until it came along, and for a little while after, you can’t argue that engineering is enough. Everything else changed to the new model, which worked brilliantly because of Jobs, not his engineers.

      • Wozniak was the engineering genius. Jobs was all about The Big Idea and, as Jason notes, about making things work for other than the traditional users.
        Even with all their faults, and all the valid complaints that Macs really don’t do anything a Win or Linux box can’t do, there’s a reason Macs are still the norm in video production – they paid attention to what that market needed, and, with a few missteps, have continued to do so to this day.

        • In no way was I trying to slight Woz. He’s great, and as far as I can tell, a totally lovable human being. He just didn’t have much to do with where I was going with the post.

            • Agreed – this was in response to the “…Jobs engineering genius…” stuff: There was one in early Apple, but it was Wozniak, not Jobs, and to some extent everything that came after could be viewed as Jobs trying, and eventually succeeding, to prove he was not just resting on the other Steve’s laurels.

      • Yeah, I’ve used some Apple products and I don’t find them as ‘intuitive’ as people like to claim. If Jobs was a genius it was in marketing. I liken it to calling my grandfather a genius because he was good at sales. To me that isn’t genius.

        • So, you dismiss the indisputable fact that people around the planet find them intuitive because you don’t, and use that dismissal to pretend that his genius was marketing.

          Guess there’s no point talking to you, is there.

          • I don’t find it intuitive, either. You can usually get motion, but getting anywhere is much more difficult.

            And it’s not like there aren’t a LOT of other folks who don’t agree it’s “intuitive” for actually using it to do things.

            Exactly how much support is required before it’s not instantly dismissed as “no use talking to someone like you”?

            • Had I said “every single human being ever born finds it intuitive”, you’d have a point.

              Since I didn’t say that, and since the UI designs were revolutionary and adopted across industries precisely because most people found them intuitive, I fail to see how single examples of people not finding them intuitive detracts from my point.

              My “instant” dismissal was not because Kamas disagreed with me, but because he dismissed a hell of a lot of data because he, personally, didn’t like something. He doesn’t like it, therefore it’s not real, therefore marketing. There is literally nothing to say to that, because he has elevated his personal preferences, and substituted them for what the market went for. I don’t care what his preferences are, and I’m not in the business of trying to change them. They have jack and squat to do with what the market rewarded.

              You don’t like it either, and good for you. Your dislike doesn’t change the facts I’ve already presented.

              But I suppose another example won’t hurt anything, just to bolster my point.

              The iPad interface is so intuitive that a baby not only learns it, but expects other objects to behave the same way.

              An interface that a baby learns before learning language, without frustration, is intuitive, whether you or I personally approve of it or not.

              • Had I said “every single human being ever born finds it intuitive”, you’d have a point.

                So then as long as ONE ever born found it intuitive, you can claim it’s intuitive.

                Oh, wait, that’s only so long as the standards you applied to a rather innocuous couple of comments also apply to you… no wonder you find it of no use to talk with those who don’t just accept your claims without support.

                • So then as long as ONE ever born found it intuitive, you can claim it’s intuitive.

                  Damn, you killed that straw man dead.

                  What I actually said was:

                  most people found them intuitive…

                  Which I even emphasized, but that was too opaque for you.

                  Do come back when you can deal with what I actually say.

                • Intuitive does not mean “easy to figure out.” That’s just what they want you to intuit. What it actually means is that they expect you to figure it out for yourself, or that it’s easy to use for those who find it easy to use. Intuitive is one of those fuzzy marketing words that doesn’t actually mean what you think it means, like “inconceivable.”

                  • Just like liking Tolkien is a sort of neurolinguistic diagnostic in certain ways, and just like how much you like algebra or geometry or both tells me a lot about you, the supposed intuitiveness of Apple products is really just something that only appeals to certain kinds of people, with certain ways of thinking and interacting with the world.

                    And yes, I’m sure Apple knows that, and deliberately sacrifices more than half their potential market in favor of having bigger appeal to Apple-type thinkers. What is weird is how many Apple lovers do not recognize this.

                    • I wonder what the correlation is with having a great liking for the thing that Apple does extremely well – basically, that everything is complete off the shelf, so to speak. Even when you add software, if it’s allowed to be added, then it’ll work. There are some exceptions, but….

                    • Yep, there is a broad class of people who embraced the Android smart phones because the Apple way of doing things (and I quote) “makes my brain hurt.”

                    • As long as Hollywood and other high end markets are willing to pay the Apple price premium, of course Apple is going to tilt the MacOS interface towards that market’s preconceptions.

                      That and the narrow pre-tested configurations of hardware are all Apple really has left – you can get faster Intel processors, more flexibility in configurations, and the latest and greatest in peripherals on the Win-hardware side while you have to wait years for the Mac to catch up – but they have cooler product intros up in SF.

                      The company I’m working for now is a Mac shop so I’m back to using a MacBook after about 15 years using laptops from the Win side. If all you are doing is Word and Excel and Powerpoint, it really makes little difference as long as someone else is paying for the machine – though the Mac Office stuff is still buggier and lags in features, I’ve been able to make it work. Even on the video editing/creative/Hollywood stuff, you can do it all on a Win machine if you want to – but if you are doing CAD, you’d be crazy to pick a Mac.

                      So the preference question you raise is valid, but there’s a lot of other factors that feed into what someone is using.

                    • Except, Hollywood doesn’t use Apple machines anywhere near as much as Apple likes to make believe.

                    • Them and Left government, such as schools etc. LA County uses them for all their mobile applications, and assumes you will too. To the point that one of their upgrade projects wanted to be Apple ready right out of the gate, but put off Android for a year or more.

                    • And neither do a large quantity of indies.

                    • And it occurs to me this won’t stack where I sent it. It was directed at Draven’s “Except Hollywood doesn’t use Apple machines anywhere near as much as Apple likes to make believe.”

                    • My short answer for them is, “name one visual effects studio that is Mac-based”.

                    • There are two things to keep in mind about “intuition”: first, it’s just the internalization and extrapolation of various rules. Once you’ve learned the axioms and theorems of Euclidean geometry, for example, it becomes fairly easy to prove, and even anticipate, new theorems, because they are intuitive…and if you try to change an axiom (say, for example, by supposing it’s possible that two intersecting lines can still be parallel to a third line), the resulting system is counter-intuitive — until you internalize *those* rules.

                      Second, even when you internalize the rules, every so often you get a theorem that pops up, for which both the truthfulness and the proof goes against every conceivable intuition that you’ve developed so far. Intuition can help us, but it only takes us so far.

                      So, this notion that “something is so intuitive that even a child can use it!” is rather silly. A child’s intuition is simply “if I play with this, I can probably learn it”. My 2-year-old daughter finds both Android and FirefoxOS (long story) intuitive — and she can sometimes surprise us with what she finds (such as finding the games after having deleted the icons to them…). There are 5-year-olds that find *the Linux Command Line* intuitive. It all depends on what you are brought up on.

                      I personally find Apple OS decisions both counter-intuitive and, once I learn the intuition, even a little insulting. It annoys me when people like Jobs try to make the claim that Apple OS decisions are uniquely intuitive, because they aren’t. They are merely a decent (although sometimes jarring) set of decisions that make sense to a lot of people.

                • Well I’ll back you two up, by saying that I don’t find them intuitive either. And I’m the tech illiterate type who is supposed to find the Apple products so much more intuitive and easier to use. No, actually I find Windows or Android much easier.

                  So how many of us have to jump on the bandwagon to dispel the “most people” narrative?

            • A friend of mine worked tech support for Apple about ten years ago. He said they lost several techs per year to suicide.

          • I’ve found Apple products no more intuitive than Microsoft or Goldstar products. My first Leading Edge computer was just as intuitive as anything I’ve ever seen from Apple. My then-two-year-old figured out how to do most everything on my wife’s Nook all by herself. Is she a genius, or is the Nook Tablet interface just really ‘intuitive’ to the same extent as an iPad? So yes, I think the ‘intuitive’ thing is simply marketing.

            If you don’t want to talk to me, that’s fine. Fuck off.

            • I reiterate, your personal preferences do not alter the reality that most people, which set apparently does not include you, find Jobs-shepherded Apple interfaces far more intuitive than anything that came before them.

              That you think your preferences are some kind of game-ending argument is your problem, not mine.

              It does, however, make you immune to arguments you personally do not like, which makes arguing with you pointless, since all you need do is say “Well, I don’t like it, so there!”

              And you know how to spell fuck. How lovely for you.

            • I’ve watched a series of two year olds pick up various tablets or phones and have no issue with having a ton of fun– including getting to apps that I did not put on the front screen because I didn’t want them to mess with them!

              I’m also now up to three three-year-olds who do the same with a mouse-based computer; it takes until three because of the finer motor control required to use one.

              On the flipside, my mother (who has been using iPods since about ’04 and an iPhone since about ’14) spent several hours trying to figure out my sister’s “phone” – before I figured out it’s an iPod.

              • I think Sugata Mitra’s “Hole In the Wall” experiments in India show how just about anything can be considered ‘intuitive’. I first heard about him on one of the Ted Radio Hour broadcasts, but I think he’s talked about it other places and may even have a book out.

                • Oooh, this is nice…..

                  “They immediately say you cannot replace a teacher with a computer. It is impossible. I do not know why it is impossible….”

                  I chuckled.

        • Personally, I think that someone can be a genius at sales. It’s something that I certainly can’t do well. Some people have a natural aptitude for sales, and I consider that to be as much a kind of genius as being naturally gifted at mathematics or language or engineering.

      • > at no point was puzzled or frustrated

        *Every* experience I’ve had with *every* Apple product has been akin to the old TV commercial about the gorilla trying to smash a suitcase. WTF were they thinking?!

        I find the “Apple experience” to be counterintuitive, non-obvious, and sharply limited in functionality.

        • I find the “Apple experience” to be counterintuitive, non-obvious, and sharply limited in functionality.

          If that were the usual user experience, Apple wouldn’t have gone from all but failing in the mid-90s to the most successful company in the world.

          I know people who feel that the Command Line Interface is the only really intuitive interface. And for them, it is.

          For the great mass of consumers, however, it is anything but. Whereas, again, Apple rode their UIs to unparalleled success.

          • No, actually the great mass of people dislike Apple interfaces. But a very large minority like it very passionately, and that is more than enough to make big money.

            • Yes, this.

              If MOST users (as you keep repeating) preferred Apple products because they were intuitive, then MOST users would own Apple products. But in reality Apple interfaces are only used by a minority of users. Are they popular? Yes. Are they successful? Yes. But they are NOT so with the majority, only with a significant minority.

          • Apple products are quite intuitive if you are following the path laid out by Jobs (PBUH). Straying from that path is either impossible or very non-intuitive. Jobs’ genius was in figuring out a path that 80% of the potential computer market wanted to travel without deviation.

            That 80% may be a generous estimate given how quickly Apple falls to ~10% market share.

            • It’s always a low market share, but Apple customers spend a lot more per unit due to Apple’s premium pricing – and they do it willingly. That was Steve Jobs’ true genius – instead of joining Dell in the race to commoditize PCs, find a market that will support extravagant pricing levels, and then feed that market’s perceived needs.

              He did the same thing on the phones.

              • Profit margins are always huge in fad products. Just imagine the markup on Pogs. The problem with fad products is that fads end. Someday people will look back at spending 2-10x market price for Apple gear and shake their heads.

              • This largely reinforces the point of this post, that Al Capone was right: You can get much farther with a decent product and hype than you can with a decent product alone. What the Jobs Cultists took from that was that you didn’t need a decent product if you had hype enough.

              • So what you’re saying is, Apple fanbois are the IT equivalent of hipsters. That doesn’t surprise me at all.

                • Hypothesis:
                  Apple fanbois are the IT equivalent of hipsters

                  Proof:

                  In depth analysis of contextual elements supporting the thesis seems gratuitous, but note body types, modes of dress and body language.

          • Having fixed the never-sufficiently-damned things for 2 years: they’re about as intuitive as anything ELSE out there. If you were accurate about how ‘intuitive’ they were, Mac computers would have a larger than 20% market share. They’ve done better than most at getting a decent chunk of multiple DIFFERENT markets. That doesn’t make them intuitive. I find them more irritatingly nannyish than Microsoft’s “Do you really want to do that?”, and I can at least turn that off. Unless you can write Unix scripts, there’s no getting around a Mac interface at any level higher than ‘call a tech, the button didn’t do what the button was supposed to do’. They hamstring your ability to do ANYTHING with the hardware.

            Macs are adequate computers with snob value. They sell like the main appeal is snob value. I will give Jobs this: He managed to create a brand that has unassailable snob value.

          • You say “Command Line Interface” as if it’s the most dreadful thing in the world, except for a few quirky people for whom it’s wonderful…yet this “counter-intuitive” thing that Jobs himself tried to kill in Apple products can be found easily enough in every OSX product.

            I’m not familiar with whether it’s available in some form or another on iOS, but I do know that there are apps that provide such an interface for Android.

            I would propose that this “counter-intuitive” interface is far more common (and far more intuitive) than you would propose. It may, perhaps, be even more common than Apple’s other interfaces.

            • Gosh, I’m tired of people putting words in my mouth.

              Here’s what I mean by “intuitive”. Give a two year old a bash shell, and an iPad, and which one does he figure out, on his own, without help?

              Do the same thing with some adult who has precisely zero interest in computers or programming.

              At no point did I say or imply that the CLI was “the most dreadful thing in the world”. If you inferred that, that’s you bringing your own baggage to a fairly simple point.

              • You keep saying give a two year old an iPad or iPhone. From what I’ve seen of how fragile they are, I’m surprised a two year old would do anything with them but immediately break them. Which of course has nothing to do with their operating system, but personally it would be an additional big deterrent to spending the kind of money they want for them, on something so unnecessarily fragile.

                • Which of course has nothing to do with their operating system…

                  No, but it let you ignore the substance of my point, so you’ve got that going for you.

                • Bearcat,
                  Strike — ALL OF MY FRIENDS give ipods and ipads to their two year olds, and they’re fine.

                  • I’d guess most of them spring for the nice holders/protectors? (I do that for ANY electronic I can manage it with– it’s a relatively small investment with multiple large payoffs.)

                    I do know of kids that broke the I-whatzit, but they’re also the destructive cousins from heck. (AKA, “timmy, don’t.”)

                  • I’ll note that I have only ever seen two iphones handed to two year olds. One of them got changed to Chinese, which wasn’t exactly intuitive, since the two year old neither spoke nor read Chinese. And was actually quite humorous, as the owner of the phone had to borrow someone else’s phone to call customer service and figure out how to change it back to English. The other one was quickly frozen up (which means that particular two year old was about on my tech level, I seem to have a talent at getting any operating system to freeze up). So neither one was broken.

                    However, practically every one of my adult friends who owns an iPhone has broken it; they are remarkably more fragile than other brands. Which is why I have only ever seen two given to little kids, while I’ve seen any number of various other brands, with other operating systems, given to kids to play with.

  10. > forbade her employees from communicating with
    > one another about what they were working on

    Oh. Like the NSDAP and the CPUSSR…

  11. Perhaps the world should not be subject to techno prophets.
    They fix stuff that is not broken. Of course, that bleeds out.

    Probably makes it easier for the next crop of primates to play games.

    At least a mad prophet just channels bad news, and the world as being imprinted with good intentions. No profit in that.

    Of course, not really your world. Just overwritten with whatever pays the bills.

  12. I personally have to wonder how much of the Cult of Jobs and the portrayal of him as a visionary and philosopher is to avoid admitting that he was, first and foremost, a shrewd capitalist.

    He saw that computers would become increasingly important in the world and that they wouldn’t just be used by engineers and mathematicians, but also by people who needed the technology but didn’t want to have to learn the details of how it worked.

    So he created Apple to sell computers to artists and writers and other non-technical users. He was the first one into a large underserved market and he worked very hard to build a loyal customer base. Donating computers to schools, for example, was a cold-blooded marketing tool to ensure that his operating system was the one that young adults would be familiar with when it came time for them to buy their own.

    The Apple brand was consistently higher priced than other computers, and he made a lot of money by getting people to buy based on style and image rather than quality. I can’t judge him as an engineer (although I’ve read a number of people who say he wasn’t that great of one) but he was a brilliant marketer.

    But the kind of people who are big fans of the Apple line of hardware also tend to be people for whom “capitalist” and “business” and “advertising” are dirty words, so they have to praise him as some kind of philosophical guru instead.

    • He saw potential markets, serviced and encouraged them, and profited by it. No argument with any of that.

      But given how flagrantly wasteful he was with money long before his projects showed any profit at all (for example, the NextStep headquarters), not sure I can go with “shrewd”. He was lucky that his perceptions of where things would go eventually turned out to be correct, but when he was wrong, he blew millions and millions that were not his.

    • He saw that computers would become increasingly important in the world and that they wouldn’t just be used by engineers and mathematicians, but also by people who needed the technology but didn’t want to have to learn the details of how it worked.

      That’s why we bought an Amiga 500. Went to the computer store (remember those?), where the salesman sat my wife down and told her to point and click on the different icons. After two minutes she turned and said, “I want one.” She absolutely hated C: prompts to get to the word processor.

  13. c4c

  14. I work in biomed. Thranos (or however it’s spelled) irritates me on a personal level because it’s astoundingly wrong in its science. There is a counter-example of industry being awesome (the Human Genome Project and the rapid development of NextGen sequencing), but there were a lot of intelligent and dedicated people working on those technologies. It wasn’t one special genius. They also didn’t bill what they did as this ‘miracle technology’; they focused on better tech and cutting costs. Thranos tried to do a ‘big’ scientific revolution with ZERO ability to back it up.

  15. Gee, has this turned into “Find the Apple FanBoi?”

    Many times I’ve suspected that kids, who are in experimenting-learning mode, will figure out whatever interface is used and they then teach others, so the key is getting your stuff into schools (or to folks with young kids) as early as possible and then proclaiming obviousness. It has been said that the nipple is the only truly intuitive interface – and I’ve seen mothers who said even that was not the case.

    Apple products are not bad, but they are also not as great as oft claimed.

    • Paul (Drak Bibliophile) Howard

      One of the Greatest Religious Fights of the Modern Era is PC vs Apple. 😈 😈 😈 😈 😈 😈

      Note, my first “PC” was an AMSTRAD 9512 and from then on I’ve used PCs.

      So I’ll not take part in the PC vs Apple fight. 😉

      • IMSAI 8080 in HS as well as a teletype punched-paper-tape over an acoustically coupled modem to a VAX in the school district main offices, then the HS lab got an Apple II. I bought an Atari 800 (yes it was a computer – I still have that one around somewhere – I could barely afford it with the employee discount working in a computer store selling them) roughly the same time I was taking programming classes in college still using punch cards (JCL cards anyone?), and later on my younger brother had an Amiga. When I graduated I started on Windows PCs – I used an actual IBM PC with Visicalc at my first post-college job, and then an early COMPAQ luggable. After a job change I used Solaris boxes and earlyish (pre-return-of-Jobs) Macs, then by corporate No-More-Macs edict I was switched to PCs. Next job change was all Dell laptops, but I installed Linux on a bunch of older towers as a mini SQL query process server farm. Along in there I’d built up three or four Windows PCs from components for home. Now with the latest job I’m back to MacOS on a MacBook.

        In the end it’s really not all that different. I guess I’m something of a heretic to all sides.

      • If you ever read David Drake’s Raj Whitehall books, one of their “scriptures” describes the War in Heaven between the followers of the Apple of Knowledge and the IBeMmeraphim. 😎

        • Yep. I wondered if Drake got caught at the edge of the debate/argument/discussion. I had to tell the students that they can only argue iPhone vs Android in religion class.

        • Reminds me of an SF novel where a passenger ship had two groups of religious pilgrims aboard – the Lennonites and the Lovecraftians. One group was laid back, the other was… disturbing.

          It was John Morressy, but I forget whether it was “A Law for the Stars” or “The Mansions of Space.”

      • vim vs. emacs can be just as bad as PC vs. Mac.

        • When $HOUSEMATE acquired the first Mac that had the (then new) OS X on it there was some remark about it having everything. “Even the kitchen sink?” I asked. $HOUSEMATE tried to invoke emacs – and it came up. Reply, “Yes.”

          As for editor of choice: joe. Why? In $HOUSEMATE’s case, because it can be set to use Wordstar key bindings.

        • When I was an active sysadmin/netadmin, I always used vi/vim. It wasn’t so much a question of features or religion as it was that no matter what platform, manufacturer, OS, etc. I was using, there was ALWAYS a version of vi/vim there. Well, not on Windows, but we never allowed Windows on any of our servers or network devices.

          • That was the reason for at least having a vi reference sheet handy. I was quite stunned when, after one Linux install and a need to edit something, vi wasn’t there! They went with pico or nano or such, but it took a while to discover.

          • VIM is one of the first things I install on any new Windows box I have to work with.

        • Nah, vim vs. emacs is small potatoes. Pretty much nobody actually argues that one seriously anymore.

          You want to see a REAL religious war? Go to a group of programmers and ask “Spaces or tabs?” Then duck, so that you don’t get hit with a flying chair when the foo fight* breaks out.

          * Well, you can’t have a bar fight without having a foo fight first, can you? I mean, that would just be wrong. If you are now chuckling, you are a programmer. If you are currently puzzled, see http://www.catb.org/jargon/html/F/foo.html for enlightenment.

      • Not sure what the first computer I used was (terminal connected to… who knows anymore), then there was a TRS-80 (not sure which model) at school. Eventually Apple ][+ and equivalents (Franklins). The first home machine was a Netronics ELF II – starting with the base board: 256 bytes of RAM, a few switches, a couple of 7-segment displays, and a hex pad. Gave up on Tom Pittman’s attempt at explaining things and read RCA’s CPU manual instead. It’s a bit amusing, knowing the other usage, that for years Pa referred to No-Ops as C4s, as that’s what the RCA CDP1802 had as the hex code for No-Op.

        • Paul (Drak Bibliophile) Howard

          When I started my first programming classes, it was on a NCR machine using punched cards.

          On the job, I used punched cards as “input” and later got into the dumb terminals.

          Later, my job computers have been PCs so I’ve never used Apple computers.

          • Punched cards? Oi, you had it posh. When I first used a computer it required menhirs… trilithons… punched paper tape for programming it.

    • It has been said that the nipple is the only truly intuitive interface – and I’ve seen mothers who said even that was not the case.

      Eldest daughter.

      No trouble with Windows; took MONTHS to nurse without a guard….
      And in the animal realm, there’s always a couple of dumb calves who really can’t figure out how to nurse without a cluebat, which can be REALLY fun when momma isn’t happy about a human being there. About one in four hundred don’t figure it out at all. (Going off of how often dad has to bring one in and tube him to keep him alive, and they still never figure out how to nurse off a cow. Usually sell those as feeder calves.)

  16. I have noticed in my career, both civilian and military, that there are way too many people willing to listen to the person yelling the loudest rather than to the person who knows what they’re doing and quietly goes about doing it.

  17. BobtheRegisterredFool

    ISIS will be on the United Nations Human Rights Council.

    Yea or Nay?

  18. I personally suspect that a large part of the reason Mac originally took off was PageMaker, Photoshop, and similar software where GUI was important, rather than the Mac itself (hardware or OS). Before Windows was ubiquitous, that type of software was much easier to use on a Mac.

    • And that’s not the first time, either. The Apple ][ could take 64 KB (and make easy use of 48 KB, yes) which meant it could handle this new spreadsheet thing at useful scale… and VisiCalc helped sell the Apple ][ for a while.