…because success doesn’t automatically equal being right.
I’ve been an Apple observer for as many years as I’ve been in IT support, though I haven’t always purchased or used their products. My first computer was a TRS-80 Model I; I moved up to a Commodore 64, then to a Commodore 128, before jumping to the IBM PC-clone platform. Apple was a ‘bit’ player in those early days (pun intended), but the Apple II and III never appealed to me for some reason. For the longest time I mocked the fledgling IBM PC from behind my C-128 (“The 128 has everything built-in—it all costs extra on the PC!”). After a while I came around to the idea that this was actually a Good Thing, especially after leaving college and realizing the PC was the platform that businesses used. How times change!
Some History, and an Attitude Change
Apple always had the cachet of being different. The idea of a WIMP (windows, icons, menus, pointer) interface was a game-changer, and Micro$oft realized it when they saw Apple’s take on Xerox PARC’s interface and copied it themselves. Over the years the Mac’s interface underwent constant tweaking and improvement, and despite Micro$oft’s best efforts to duplicate its usability their implementation still lacked the grace of Apple’s. As a PC wonk I often failed to see the appeal of MacOS. Things were just, well, too different. There was no command prompt, no easy way to get past the OS if you needed to do something the windowing interface wouldn’t allow. When I had to support a Mac user I dreaded it. Apple’s way of doing things was so foreign to me that what would take a few minutes to fix in DOS/Windows would often take several hours in MacOS. My very negative experience with MacOS and a messaging server package at the local PBS station (detailed in an earlier posting) didn’t help my perceptions of the platform.
Then I started working at a local university. Several users in my department had desktop Macs, and some of our electronic classrooms had Macs alongside PCs, with a switching arrangement between the two platforms. I gradually became used to them and, as I started imaging both PCs and Macs (which were, by this time, running OS X), I grew to appreciate the ease with which Macs could be imaged and those images deployed to any number of identical machines. While Micro$oft insisted on tying its OS so tightly to the hardware that an image made on one PC wouldn’t work on another with a different video card, you could simply image the most powerful Mac in your fleet and use that image on lesser-equipped machines, as long as the CPUs were in the same class; an image for a G5 wouldn’t work on a G4. Windows has improved over the years, but it still takes some black magic to move an image across differing hardware. I ended up purchasing my first Mac, a G4 machine with styling so radical that some in the computer press called it the iTit, when the university surplused them. I picked up three (hey, the price was right), intending to use them to display images around my home as Objets d’Art. Although I’d finally gone over to the “dark side,” my main home computers continued to be PCs.
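For anyone curious what that kind of Mac imaging pass can look like, here’s a minimal sketch built around Apple’s asr (Apple Software Restore) command-line tool. The master image path and target volume names are made up for illustration, and our actual deployment process differed in the details.

```python
#!/usr/bin/env python3
# Rough sketch (not our actual process) of pushing one master Mac image to
# several target volumes with Apple's built-in asr (Apple Software Restore) tool.
# The image path and volume names are hypothetical; asr normally needs to run
# as root (e.g. via sudo).
import subprocess

MASTER_IMAGE = "/Volumes/Deploy/classroom-master.dmg"   # hypothetical master image
TARGET_VOLUMES = ["/Volumes/Classroom-01", "/Volumes/Classroom-02"]

def restore(volume):
    """Erase the target volume and lay the master image down onto it."""
    cmd = [
        "asr", "restore",
        "--source", MASTER_IMAGE,
        "--target", volume,
        "--erase",        # wipe the target before restoring
        "--noprompt",     # skip the confirmation prompt on each pass
    ]
    return subprocess.call(cmd) == 0   # asr returns 0 on success

if __name__ == "__main__":
    for vol in TARGET_VOLUMES:
        print(vol, "OK" if restore(vol) else "FAILED")
```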
Apple Forges New Territory
Meanwhile Apple branched out into consumer electronics and, in turn, set the music and telecom industries on their ears with the introduction of the iPod and iPhone. I never bought into either platform and only ended up with an iPod Mini because I won it. It’s still a neat device, but I hate the restrictiveness of iTunes, Apple’s digital rights management, and the limited codec support. When the time came to get a smartphone I went Android. Then in 2010 Apple introduced the iPad. Ever since I discovered newsgroups and the large amount of free reading material there, I had wanted some type of tablet that would let me read PDFs. The iPad seemed to fit the bill, but I was still put off by iTunes. I waited until after the iPad 2 was released, still hoping for an Android tablet, but Android’s developers and the device manufacturers couldn’t seem to get their acts together, and a viable iPad competitor wasn’t predicted to arrive until 2012.
The Apple Advantage
I bit the bullet this past spring, drank Apple’s Kool-Aid, and bought a closeout iPad 1 with 3G and 64GB of storage. I’d read about the available apps and liked the idea of connecting USB peripherals through a backdoor method using the Camera Connection Kit. There were USB microphones and recording gear that could be harnessed to the iPad this way, making high-quality portable recording and editing an intriguing option. The ability to hook a wired keyboard to the iPad was the clincher.
Almost two years ago my department was dismantled by the university’s administration, and our various areas of expertise were folded into other departments. The task of classroom technology support moved to the campus IT department. Their mantra of “We do IT faster, better, and cheaper” dictated slashing support costs in a number of ways. One major change was to begin eliminating the dual PC-and-Mac setups in classroom podiums and replacing them with dual-boot Mac Minis.

When my department was responsible for classroom support we’d looked at doing the same thing but couldn’t come to a consensus. We rack-mounted our podium electronics for installation convenience and theft prevention. Small rack-mount server-chassis PCs were used in the beginning, eventually giving way to desktop PCs secured to rack shelves with inaccessible bolts. In the rooms with both PCs and Macs we bolted the aluminum-cased boat-anchor Macs vertically to rack shelves and secured the cabling out of users’ reach. Those Macs were overkill for the classrooms, but we needed user-accessible power switches, and Apple’s decision to put the Mini’s power switch on its backside meant we kept putting sub-$1000 PCs in the classrooms alongside Macs that cost twice as much. Dual-booting wasn’t even a possibility until Apple switched to Intel processors, and even then the IT folks (to whose guidance we deferred, since they set campus computing standards) said dual-booting was a kludge and extremely difficult to image. They worked out the problems over the course of a few years and, by the time they assumed classroom support, felt comfortable enough to convert the classrooms to a single dual-boot machine.
Securing Mac Minis in public environments was a challenge before several companies introduced mounting options for them. The IT crew found several that allowed rack mounting, but those were deemed too expensive; they settled on a design that let the Minis be securely mounted to tabletops. This past summer the IT department replaced the in-podium PC/Mac pairs in 38 of the 120-odd classrooms with podium-top-mounted dual-boot Mac Minis. It seemed like a good move, one that would eventually bring Mac capabilities to all campus classrooms while saving money.
The IT folks also maintain a pool of short-term checkout laptops. After using PC laptops for years they began replacing them with dual-boot MacBooks. My department replaced its PC-only laptops with dual-boot MacBooks three years ago, which allows us to teach workshops involving either PC- or Mac-based software. Once the OS-specific wall came down, Apple came out the winner with our institution in a number of ways. I even considered replacing my desktop PC with a dual-boot Mac when the time came to upgrade.
Customer Needs vs. Jobs’s Vision of the Future
Apple pushed an iOS upgrade last spring that rendered most USB peripherals attached through the iPad Camera Connection Kit useless, except for solid-state memory devices. Now, when plugging in most USB devices, the user is presented with a dialog box that reads “This device draws too much power,” even with peripherals that worked before the ‘upgrade.’ Online forums lit up with user complaints, and people who’d purchased the connection kits were returning them to Apple as broken, getting surprised looks from the geniuses at the Apple Stores. It turns out you can still use those peripherals, but only if you connect them through powered USB hubs. The iPad is all about portability; what’s portable about having to connect extra gear through another box that requires AC power? There’s only one company I know of that made a battery-powered USB hub, and it’s long discontinued. Some forum posts I’ve read claim that you can hook up devices through a hub without AC power if the hub identifies itself to the iPad as a powered USB device, which would apply to hubs that have an optional AC power port but aren’t always sold with the adapter. I’ve gone through a half-dozen models during testing and have yet to find one that eliminates the need for an external power supply. Even if a user finds a hub that works, they’ll have a nasty-looking kludge with an extra box to juggle.
After our IT crew committed to the Mac Mini conversion, Apple revamped its product line and eliminated the Mini’s optical drive. Further purchases for classroom conversions will now have to include some flavor of external CD/DVD drive, along with some way to secure the thing to the podium tops. Even when those drives are secured they’ll have highly fragile trays, unlike the slot-loading drives in the older Mac Minis, which means equipment maintenance costs just went up as well. “But,” I hear some of you saying, “what about the cloud? Users don’t need optical drives anymore! They’re so last-century! Besides, hasn’t everyone switched to USB drives by now?” Not in the educational environment. Instructors still get presentation materials from textbook publishers, who often use DVD-ROMs as the distribution medium, and instructors also want to play DVDs and music CDs in their classrooms.
Apple also decided to drop the MacBook from the consumer product line, and though institutional education customers can still buy them, the current models no longer have user-replaceable batteries. When you’re checking laptops out to a series of borrowers throughout the day, you have to be able to swap in freshly charged batteries. Pulling a high-demand circulating item from the pool just because it needs charging cuts down on resource availability, and that’s a deal-killer. As a result the IT department has decided to run the current MacBooks until they die, at which point they’ll go back to PCs. As for those classroom machines, well…the jury’s still out, but it doesn’t look likely that they’ll go back to installing Mac towers. Perhaps we’ll finally become a PC-only campus except for special-purpose labs.
Apple’s never been afraid to orphan its users in the march toward the future. In the early Mac days new models came out every six months, so whenever you bought your Mac you could be sure it would soon be outdated. When the changeover to OS X occurred they supported OS 9 applications for a little while, but soon stopped. A few years after the introduction of Intel-based Macs, Apple dropped PowerPC support altogether. Many of those PowerPC machines would still be powerful enough to use today if development for that branch of the OS hadn’t halted. By contrast, take almost any old PC-compatible software and try to run it on today’s hardware: unless it was timing-dependent (tied to the CPU speed for proper operation) or relied on hardware features that no longer exist, it will still run.

Apple would argue that’s been IBM/Intel/Microsoft’s problem: platform advancement held hostage by users’ demands to support “legacy” systems and software. Perhaps that’s true; however, aside from a few features like FireWire, what innovations made Apple’s computers so “revolutionary” that they simply blew the PC away? Even those Apple-exclusive features eventually migrated to the PC, and Apple has remained a minority player in the computer field. Marching forward isn’t a bad thing, but ignoring your customers’ needs and wants certainly is. Apple has traditionally sucked at listening to customers and accepting criticism; ask anyone who’s had their Apple support forum posts deleted because their opinions or complaints were deemed, in some way or other, ‘offensive’ to the company. The only time they listened was when a lawsuit was filed, as in the case of the defective LCD screens a few years ago. They’ve never depended on focus groups; Steve Jobs trusted his instincts instead.
Where Apple has made revolutionary changes to the consumer electronics industry and to people’s lifestyles is in consumer media. The iPod changed the portable music player market, and iTunes revolutionized how people buy their entertainment. The iPad has been a runaway success, so much so that the competition is still playing catch-up. Electronic gadget consumers expect rapid innovation and obsolescence, so Apple’s orphaning tactics may not irritate those customers as much as they irritate the users of its computers. However, there’s one thing Apple absolutely mustn’t do, something I’ve learned through years of supporting faculty members. I call it Combs’s Maxim: a company must not remove functionality from an already-released product, even when that functionality isn’t expressly noted, marketed, or endorsed. The USB connectivity issue I mentioned with the iPad is a great example: Apple not only left users high and dry with no good reason or explanation for its actions, it also violated the trust of the companies that market add-on products for the iPad. Sure, those companies should’ve used the official iPad dock connector for connectivity, but that limits sales of their hardware to iDevice owners, whereas a USB peripheral can also be sold to work on other platforms. Sony pulled the same trick with its PlayStation 3 when it removed the ability to run Linux on the box, a move that angered users and spawned a class-action lawsuit.
It’s Time for a New Apple
Apple was recently named the world’s most valuable company, surpassing Exxon Mobil in market capitalization. Their sales figures are phenomenal and their cash reserves are formidable; they can do just about anything they want and buy just about any company they want. If they want to keep pushing the envelope and shaping the future of media, more power to them; someone has to be the innovator. But pissing off your present customers ensures they won’t be your future customers, and you’ll be left alone in the future you tried to create.
I used to say that Sony made products that were 90% perfect for the market niche where 80% of consumers lived. I bought their stereo and home theater gear exclusively because it would all work together with a single remote control. The products were well built and sounded good, but other manufacturers offered similar products with more and newer features (number of inputs, audio processing modes, et cetera) at the same price point; getting those features on a Sony added several hundred dollars to the cost. I finally abandoned them when I bought my Onkyo pre/pro and haven’t looked back.
I started writing this piece several weeks before the passing of Steve Jobs on October 5th. I believe many of the nits I’ve picked here can be traced directly to Mr. Jobs and his instincts. Under his guidance Apple seemed more Sony-like to me: they made great products that were hamstrung in some frustrating ways. I’d written the following point more crudely in earlier drafts, but it still bears expression: with Steve Jobs gone, it’s high time Apple goes the extra 10% and shakes off its self-imposed limitations. They must listen to their customers, especially in markets like education where they’ve traditionally dominated. In this economy customers cannot, and will not, invest in short-lived products with unrealistic limitations on their capabilities. Smart consumers put their money into technologies with upgrade paths and long lives, and anyone spending money without those points in mind is either a fool or won’t stay financially liquid for long.
I want Apple to be around for a long time, but in a post-Jobs future they must change and improve to ensure their longevity. Let’s all hope they recognize the need for it, and have the courage to do so.