MacBooks updated, but even consumers should go Pro


Apple has just updated their entry-level MacBook models to match the recent 13″ MacBook Pros in terms of speed, battery life, and graphics performance, whilst maintaining a fair-sounding USD$999 (SGD$1488) price point.

That money will get you a 2.4GHz Intel Core 2 Duo processor, an Nvidia GeForce 320M graphics processor with 256MB of memory, and a non-removable battery capacious enough to last 10 hours of typical use. That’s really the best feature here; five years ago you’d be happy to get three hours out of a low-end machine.

But if you upgrade a MacBook to 4GB of RAM ($1648) and compare it to a 13″ MacBook Pro (which comes with 4GB of RAM as standard, $1788), the MacBook looks like a much poorer deal: the difference is just $140.

Here’s what that $140 gets you:

  • A sturdier aluminium body that’s slimmer all around and just a bit lighter
  • The option of upgrading to a maximum of 8GB of RAM, instead of 4GB for the MacBook
  • An illuminated keyboard that dims in response to ambient lighting conditions
  • Firewire 800
  • An SD card slot
  • The appearance of not being a cheapskate/noob/student.

Jokes aside, I can’t see why it would be in anyone’s interest to buy this model over a MacBook Pro. Sure, mainstream consumers will appreciate the SD card support when dealing with digital cameras, and the metal body probably handles heat better, but the ability to install RAM past 4GB is the closer for me. If you buy your computers with the intention of using them up to the three-year mark and beyond, you’ll want that upgrade path in your future. A little extra memory in the later years can go a long way towards rejuvenating an old computer and preparing it for the demands of more advanced operating systems.

The New Apple

There’s a phrase that tends to pop up in conversations about the latest divisive move from Cupertino: “the new Apple”. There’s always a new Apple that threatens the way things have been, or turns its back on a loyal segment; doing something other than what we, presumably desirable, tech-savvy customers want for our money.

Lately, it’s been the iPad and its being in bed with the iPhone OS when we’d already arranged for a marriage to Mac OS X. It’s a computer for grandparents that will have severe implications for their grandchildren’s ability to grow up into the kind of curious, tinkering hackers who poke their noses where they don’t belong and thereby discover new and better ways to write software and build hardware and renew the flattened spirit of progress, thus we are destroying the cycle itself! the naysayers charge, gasping for air.

With the iPhone model, software developers leave Apple a cut of every sale on the sides of their plates, while suffering the indignity of letting the publisher have final veto rights. Tinkering and sales aside, the goddamned thing wants to be a computer but has no multitasking! – This is the work of the new Apple.

When new MacBook Pros were released with the same glossy, reflective screens as consumer MacBooks, pissing off graphics professionals who needed color accuracy and glare-free visibility in daylight, that too was the new Apple. The new Apple ditched PowerPC chips for Intel’s, after trumpeting the former’s superiority for a decade; the new Apple said no removable batteries for any portable device, too bad if you have a 20-hour flight; the new Apple also developed an odd nippled mouse that stopped scrolling after just months of use, ironically named after an unstoppable cartoon character; the new Apple resembles the Orwellian state in the old Apple’s ‘1984’ ad, year after year.

The truth is, of course, that there is no new Apple. The ones who talk about it, imagine it, are mostly from a core of computing enthusiasts and creative professionals who have had love affairs with their Macs from before the second coming of Jobs. When consumers flocked en masse to cheaper PCs, they stayed with the ship and played music like nothing was happening. And edited video. And designed layouts. And touched up photos. The creative industry stayed with the Mac because it had the best software for their needs. Over time, they made the platform their own.

Theorists might point to Jobs’ return and subsequent introduction of colorful, family-friendly iMacs as the day when new Apple began, but only because of how long it had been since Apple last produced anything of interest to the public. If anything, the new Apple was born right after the Apple II.

Designed to be a computer for the everyman, the first Macintosh was built on the same fundamental principles as the iPad 26 years later. Intuitive to use above all else, thanks to new technologies: a mouse then, multi-touch now. Resistant to tinkering: both are sealed with limited options for expansion. The inexplicable absence of features that might have been trivial to add: a color screen and hard drive on the Mac, a camera and multitasking on the iPad. Both were doubtlessly shaped by the idiosyncratic tastes and insights of Steve Jobs, whose involvement and personality defines Apple to the point that the idea of a ‘new’ direction seems flawed. It has always been Steve’s way.

Professionals need to believe that because they kept the company going for much of the 80s and 90s, their needs are still important to it. But the Mac Pro is the last remaining concession to this group of customers. It’s the only Mac that can be upgraded, and to which more than one non-glossy display can be connected for serious graphics work. Ever since the explosion of Mac use in the home, with the help of iLife and iWork as key selling points, the face of Apple has changed. If I’d asked you ten years ago to describe the Mac for me, you’d have said “used by video editors and designers”. Chances are, that’s not your first thought today.

I don’t suggest that Apple is leaving professionals out to dry. The segment is obviously still extremely important for the brand’s prestige, and these customers are useful for pushing engineering efforts into things like octo-core and 64-bit computing, all of which eventually trickle down to the consumer products. But there have been bumps in the road to show that the company’s attention is slipping now that it’s gained the widespread consumer adoration it has courted all along. Case in point: the recent debacle over the MacBook Pro’s downgraded SATA interface. We’ve also reached a point where the Pro products are bought by regular consumers just because they look cooler or carry more status. It was a recognizable trend by the time MacBooks sold out at a premium price just for being painted black, and it made a sort of poetic sense when the unibody aluminum consumer MacBooks morphed overnight into 13″ MacBook Pros last year.

With the help of pundits and analysts who, at best, bat a little over 50%, it’s all too easy to fall into the trap of thinking you know the game plan, which is how all ‘new Apple’ complaints begin. If you want to know what the new Apple is liable to do, just ask if it’s something the common man will understand, notice is missing or broken, and still buy the hell out of anyway. Just like the first floppy drive-less Macs, less-space-than-a-Nomad iPods, and 2G-only iPhones.

Fear of a Pad Planet

There’s been a certain reaction to the iPad from some quarters of the tech-inclined community, inspired by the belief that the device signals a shift towards a new form of computing that old people can finally understand. That reaction has been fear and apprehension.

It begins by looking at the iPad as a better personal computer for the majority of people. After all, it surfs the web, does email, plays games, and that’s what most people do with their computers most of the time, right? Better yet, it does all of those things without a long boot-up sequence, viruses, and confusing computery concepts like a filesystem, administrator rights, directories (recently renamed ‘Folders’ for these same users), registries, multi-step installation procedures, and the list goes on. Parents will finally stop calling us for help with strange error messages, and we will forget that it was ever hard.

But if people start to prefer the iPad and its descendants to ‘real’ computers, so the argument goes, then we will have robbed the next generation of a basic foundational understanding of computers. Because there will be no tinkering in Apple’s clinical workshop, they will never see the crucial workings of a program beneath its simplified user interface, and we will not have people to build the next Google, YouTube, or Bittorrent. The iPad/iPhone were built to enable end-users to consume content, and so it must be that creativity stands to suffer.

As I wrote yesterday, I currently see the iPad as a great way to access information and interact with media, freed from the physical constraints of an iPhone’s smaller screen and shorter battery life. Apple sees it, quite necessarily, as something more*. Which is why they built iWork productivity apps and demonstrated Brushes, an application that lets the large screen be used as a drawing surface for artists.

Offering a new breed of computer to an older person and seeing them take to it with joy and wonderment, as opposed to frustration and confusion, is a wonderful image, and it’s what the industry should work towards. But just because a filesystem is obscured doesn’t mean the curious can’t get to it. One might argue that jailbreaking an iPad is no different from the things people did to their computers in the past. There will always be unauthorized tools for messing around, and one day you may even be able to write, compile, and test code for an iPad on the thing itself. I wouldn’t worry about the younger generation of hackers.

My parents online

I want to talk about two tasks I’ve observed my parents and people their age doing on their computers.

1 – My mother mainly works with email. She receives documents relating to her church activities, which she must save locally before editing and sending them out again to other members of her group. She organizes these files in folders, which are really good metaphors that she understands, and often keeps multiple dated versions.

Of course, the iPad of today can’t save email attachments for working on in the Pages word processor. One day it will. But that sort of management is bound to increase the level of complexity. Lists of documents, tags or folders, deleting and renaming, and so on. I thought of introducing her to Google Docs, which would let her work with live documents in the cloud, and even collaborate in real-time with her friends. When changes are made, instead of emailing a copy of a document to other people, she would only have to send invites to view the document online. The iPad would work well with that approach – no local storage necessary. The responsibility and blame for any complexity is passed off onto the web service provider, in this case Google, leaving the iPad’s reputation to remain spotless.

2 – My father (and other fathers I hear about) likes to download videos off YouTube for later viewing, both on the desktop and on his iPhone. These are usually music videos and funny but horrifying accidents. This requires using a program or website like KeepVid to save them locally, and then often another program to re-encode the clips for use on the iPhone.

I believe saving videos off YouTube is a copyright gray area that Apple will never touch by sanctioning an app that exists to do it. Music videos are often removed from YouTube when found to be unauthorized uploads, which might explain the compulsion to save them. But even if they stayed online, is streaming instead of saving an ideal solution? That’s a lot of wasted bandwidth, and what if they want a Taylor Swift video or two while traveling by air? Apple will never allow the YouTube app to save video and compete with iTunes sales.

Both of these scenarios and their cloud-based natures highlight the need for increased openness and cooperation on the web. If we can’t have open computing systems, then we need an open internet to take its place. My mother’s friends shouldn’t all have to have Google accounts to access her shared documents, and YouTube shouldn’t have a monopoly on streaming video just because the iPad comes with an app built-in. The widespread adoption of HTML5 video in lieu of Flash would be fantastic, and remove the need for a native YouTube viewer. Likewise, online storage accounts like the ones offered by Dropbox and Microsoft Live Mesh should be able to trade files and work together. Productivity and content creation services should have a way of talking to each other across networks.

I like Google Wave’s implementation of federated servers. You can run your own private Wave system, really make it your own for whatever purposes, but the underlying protocol can communicate with every other Wave server if/when you need it to.

If that kind of openness were applied to all other services, companies would stand to lose their ‘stickiness’, but they’d surely find other ways to retain users. Should a landscape of interoperability and sharing ever come to pass in every corner of the web, it would be to the benefit of us all. How fitting, then, if we were steered in that direction by the threat of having to work on oversimplified computers.


With apologies to Public Enemy for the title.

* When Nintendo first launched the DS in 2004, they called it a “third pillar” to allay fears that the company was going mad and replacing its popular and very profitable Game Boy Advance series with a risky touchscreen experiment. The DS went on to become a huge hit, accelerating the GBA’s demise and eventually becoming their main handheld product. You may wish to see Apple’s positioning of the iPad as a similar play: someday it may overtake the MacBook completely.

Alex Payne on the iPad

Alex Payne, in a widely-linked article, wrote today that:

The thing that bothers me most about the iPad is this: if I had an iPad rather than a real computer as a kid, I’d never be a programmer today. I’d never have had the ability to run whatever stupid, potentially harmful, hugely educational programs I could download or write. I wouldn’t have been able to fire up ResEdit and edit out the Mac startup sound so I could tinker on the computer at all hours without waking my parents. The iPad may be a boon to traditional education, insofar as it allows for multimedia textbooks and such, but in its current form, it’s a detriment to the sort of hacker culture that has propelled the digital economy.

As far as I can tell, Apple never intended for young Alex Payne to access the Mac’s startup sound any more than they intend for a future programmer to hack an iPad’s filesystem and do some tinkering of his own tomorrow. Sure it’s harder with DRM and encryption, but we’re united by the internet these days, and breaking those walls down has become a group effort. No young hacker today has to learn alone. We change with the territory and so nothing has really changed at all.

The Causes of iPad Disappointment

Letdown. Disappointment. Anticlimax. These words have appeared in nearly all the first articles written about the newly unveiled Apple iPad, barely a day old in the world. The reasons are not entirely important in the long run, but many of these stories themselves will admit that expectations were raised beyond reasonable levels, that Apple had no hope of impressing everyone the way they did in 2007 with the first iPhone. This environment of fanciful conjecture and presumptuous theorizing was the result of an industry’s decade-long fascination with getting the idea of a tablet computer to stick – seemingly against the wishes of the consumers meant to buy them – and the belief that Apple can succeed where other companies embarrass themselves. They’ve done it before, after all.

It never helps that Apple says very little about what it’s got until it’s got it. The veil of secrecy provides theatrical levels of entertainment at every event: charged affairs where people whoop and whistle from the moment Jobs takes the stage. As press conferences go, they overdeliver. But on the eve of an event like the iPad’s debut, such enthusiasm cuts both ways, and the company is left with the unenviable task of managing expectations without any direct communication – a task that has become increasingly hard over the past few years.

In the beginning, Macs were a relatively quiet business: high-profile products that most people saw but never considered owning. The success of the iPod energized Apple’s public image, and eventually sensational moves like the decision to cancel its hottest product, the iPod mini, in favor of an impossibly innovative new replacement raised the bar not only for its hapless competitors, but for the company itself. Even then, the shifts from small to smaller iPods, with color screens where black and white displays were once standard, gave no indication of the iPhone’s shape or form before January 2007. That device’s unprecedented introduction, so many orders of magnitude beyond what had been expected of Apple (a phone that played music, and that wasn’t as ugly or challenged as the Motorola ROKR, would have sufficed for many), changed the pre-event guessing game forever. Do one magic trick, and you’re always going to be asked for more.

Consider that at the time of the iPhone’s release, touchscreens were not a new technology. Palm’s PDAs and countless phones running Microsoft’s Windows Mobile operating system had touchscreens for years, and were fairly well liked. Yet as the tech world watched Steve Jobs scroll a contact list with a flick of his finger, it was impossible to make the connection between that and the experience we had become accustomed to. The older technology, resistive touchscreens, required styluses or fingernails, with scrolling conducted with bars that were clicked and dragged. A series of small innovations (capacitive touchscreens, direct contact momentum scrolling, and a smoothly-animated graphical interface) combined to redefine the way we expect to interact with handheld computers today. It’s a classic Apple play: refine existing technologies, add advancements in software, and produce an entirely new class of product.

Given the rise of ebook readers like the Kindle, and the continuing efforts of PC makers to fashion smaller and cheaper computers from low-power CPUs like Intel’s Atom, it was only natural to think that Apple would soon do the same for the popular netbook category or a tablet*. It would be another game changer equal to or greater than the iPhone, we began to hear. In the days leading up to January 27, a quote attributed to Steve Jobs was circulated, to the effect that the tablet would become his greatest achievement. John Gruber of the Daring Fireball blog predicted that Apple was ‘swinging big’ with a new product that one would buy instead of a laptop. Others dreamt of new multi-touch interfaces that would further bury Microsoft’s second stab at tablet computing, shown to be an HP “slate” running Windows 7 without any modifications that might make operation with a finger, rather than a mouse, possible. In all fairness, many of these outsized rumors were based on the presumption that the tablet would cost up to USD$1000. What could possibly cost that much, close to a full computer, except a full computer? It would be ironic if the thousand-dollar figure were leaked by Apple to increase the impact of the final $499 price point, only to backfire by raising expectations.

These pre-announcement assumptions by enthusiasts and tech writers are now par for the course, as are the disappointments that follow each new product revelation. The iPhone 3G largely met expectations because it corrected the one deficiency that kept the original iPhone from greatness**: the speed at which it accessed the internet. It also coincided with the opening of the platform with the iPhone SDK, which led to the app-happy state of affairs we now enjoy. Regardless, complaints about the low resolution camera, unremovable battery, etc. continued to get a public airing.

Last year’s iPhone 3GS was roundly criticized for being an unexciting upgrade, retaining the same looks as its predecessor with little more than a megapixel and speed bump, effectively delivering the previous year’s expected iPhone but late. It went on to become a huge success. The buying public is immune to disappointment, it seems, perhaps because they don’t read blogs that sell them pipe dreams.

The iPhone 3GS announcement, and the internet’s lukewarm reactions to it, would be a good analogy for what’s happening with the iPad, except nobody hoped for the next iPhone to summarize a decade of engineering efforts. Like the iPhone 3GS, the iPad initially appears to be an evolutionary product, being based on existing technology Apple has repeatedly shown in public. It’s faster and more powerful, but not radically different from known territory. At first sight, it’s hard to imagine what the changes mean in actual use. You might think you can live with the old model and how things used to be. That’s a shame.

Apple is positioning the iPad as a third pillar in their portable product lineup: more than a smartphone, less than a laptop, yet better at some things than either of the other two. This instantly invalidates the idea of buying one “instead of a laptop”. Clearly you are meant to have both. It syncs with a Mac or PC the same way an iPhone or iPod does. It’s a secondary computer, but it’s also an appliance (see Farhad Manjoo’s articles on the subject).

It sends email, it plays games, and it will be fantastic for Twitter, but in my opinion, the iPad in its current form holds the most value as an interactive document, or to use a term repeated many times last night, an ‘intimate’ way of experiencing media. Despite having no plans to purchase a Kindle DX or similar reader, I suspect that I will fall in love with the thing the moment I hold a book-sized slice of full-color webpage in my hands. As Manjoo writes, “Everything about it—its size, shape, weight, and fantastically intuitive user interface—feels just right.”

With the first iPhone, Apple understood that touch interfaces are an emotional experience. Pressing the pad of your finger to a virtual page (in Eucalyptus, for example) and turning it fools some part of the brain that isn’t dedicated to understanding that a screen is not the same thing as paper. It’s satisfying, even though it’s a facsimile of a real experience. It’s more realistic than using a fingernail, which one never applies to a real page, and more personal than a stylus. My guess is that if capacitive touch of that quality hadn’t been available, the iPhone project would never have gotten the green light.

I submit that the iPad takes one more step towards solidifying the illusion of digital media with real, physical presence. It’s not just a bigger iPhone, it’s an iPhone big enough to pass for a printed page and fool your mind. A frame that holds websites with long-form writing, augmented with video and animation, that we can hold lazily, effortlessly, in our hands or laps like the glossy magazines or newspapers they approximate, is nothing short of magical, to borrow another marketing word. If the iPad became transparent like a slab of glass when turned off, wouldn’t it be exactly like the science fiction movie ideal of a portable computer? Don’t you want to live in the future? I say use your imagination.

And yet more frustration sprang from the absence of anticipated features, some of which are explained by the iPad’s positioning as a third pillar, while others invite more guesswork and predictions. Q. Why doesn’t it have a camera? A. It’s not for photography, and videoconferencing is the realm of a phone. Q. Why doesn’t it have USB ports/connect to an iPhone for syncing and tethering? A. It’s not a primary computer. Q. Where’s the multitasking and improved notifications? A. iPhone 4.0?

I believed that an update to the current iPhone OS, version 3.2, would be announced during yesterday’s event, which didn’t happen. As we know, the iPad runs a version of the OS with this number, but for the next 60 days it’s not something you can buy. That gives a 60-day window in which to expect iPhone OS 3.2 to be released for existing iPhone users. Will this bring some of the iPad’s new features to the iPhone, like the iBooks reader and iBookstore? My guess is no. These will remain exclusive iPad features for the time being. In that case, iPhone OS 3.2 might only bring a few bugfixes and trivial UI enhancements. I believe Apple is already looking ahead to iPhone OS 4.0, to be announced in the March-May window, and released in concert with the next iPhone model. Improved notifications are a must, and I’m fairly confident they will be present.

The other concern, third-party app multitasking, is far and away the number one demanded feature for the iPhone OS amongst people I know, but I’m becoming ever more skeptical that it will materialize. Apple has invested heavily in push notifications as an alternative, too much for it to be merely a stopgap measure. With the iPad, which Apple is attempting to push as a viable machine for occasional work (more than a smartphone, less than a laptop), the lack of multitasking is even more apparent. If I’m writing a document in Pages and need to move back and forth quickly between a website, email, and a notes app from which I might want to copy information, iPhone OS 3.x requires me to switch in and out of the Home Screen each time, closing and reopening the apps.

The non-multitasking answer? Persistence and a quick launcher. iPhone OS 4.0 could enable a system-wide method for saving an app’s state (what you’re doing in it) when you quit, and having it restore upon the next launch. And according to Gruber, everyone who’s laid hands on it agrees the iPad and its new A4 processor are incredibly fast. Add that combination to a home button double-click that pops up a list of your last five apps, and you’re effectively alt-tabbing between running applications without the battery drain.
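That persistence-and-launcher scheme is easy to picture in code. Here is a minimal sketch in TypeScript; every name in it (`AppSwitcher`, `quit`, `launch`, `recentApps`) is hypothetical, invented purely for illustration, and has nothing to do with any real iPhone OS API:

```typescript
// Hypothetical sketch of "fast app switching without multitasking":
// each app saves a snapshot of its state on quit and restores it on the
// next launch, while the system keeps a short list of recent apps that a
// home-button double-click could pop up.

type AppState = Record<string, unknown>;

class AppSwitcher {
  private savedStates = new Map<string, AppState>();
  private recents: string[] = [];

  constructor(private maxRecents = 5) {}

  // Called when the user leaves an app: persist its state snapshot
  // and move the app to the front of the recents list.
  quit(appId: string, state: AppState): void {
    this.savedStates.set(appId, state);
    this.recents = [appId, ...this.recents.filter(id => id !== appId)]
      .slice(0, this.maxRecents);
  }

  // Called on next launch: return the saved snapshot, if any,
  // so the app can pick up exactly where it left off.
  launch(appId: string): AppState | undefined {
    return this.savedStates.get(appId);
  }

  // What the double-click popup would show: last few apps, newest first.
  recentApps(): string[] {
    return [...this.recents];
  }
}

const sw = new AppSwitcher();
sw.quit("Pages", { document: "Draft.pages", cursor: 1024 });
sw.quit("Safari", { url: "http://example.com" });
console.log(sw.recentApps());    // most recent first
console.log(sw.launch("Pages")); // the restored state snapshot
```

The effect is the alt-tab illusion described above: nothing actually keeps running in the background, but with fast launches and faithful state restoration, hopping between the last five apps would feel like switching between live ones.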

iPhone OS 4.0 will be a big deal when announced, whatever it actually does, and Apple understandably couldn’t show its features yesterday as part of the iPad. But because they share the same OS, it stands to reason that whatever the iPhone gains from here on out will also be on the iPad. Given that the iPhone is incredibly important to Apple and already accounts for 32% of smartphone profits worldwide, while receiving rapid software development both from within the company and without, it’s inevitable that the iPad’s image will soon evolve beyond that of an underwhelming giant iPod touch.


* Legend has it that research for a “Safari Pad” tablet wound up becoming the basis of the iPhone, so it’s possible that the universe is completely backward.

** It’s worth noting that the original iPhone was also criticized for its camera, built-in battery, price, and not supporting third-party applications, Java, Microsoft Exchange, MMS, copy and paste, etc. despite being ridiculously ahead of the pack in terms of miniaturization, engineering, web browsing, media playback, and user experience. After three years, competitors have yet to match the software or browsing. Someday, the iPad’s reception will probably be remembered in a similar way.