iPhone 5 Camera Comparison vs. Ricoh GR Digital III

The new iPhone 5 features an improved camera, mainly in the area of image signal processing in the A6 chip, which reportedly allows it to do intelligent sharpening, noise removal, and pixel binning for low-light situations. The lens elements have also been rearranged, resulting in a slightly different field of view from the iPhone 4S. There’s also the new sapphire crystal lens cover, which resists scratches — unfortunately, I already have a tiny speck of dust on the inside of mine, which I’ll have to get them to clean at some point.

I’m more interested in seeing how the iPhone 5 competes with other point-and-shoot cameras than with the iPhone 4S. Here are two scenes taken with the Ricoh GR Digital III (my review here), a high-end compact comparable to Panasonic’s LUMIX LX3/5/7 series and Canon’s S90/95/100 cameras.

The photos below are direct from camera and have not been fixed or enhanced. The GRD III is something of a prosumer camera, and if handled correctly, i.e. with manual controls and lots of fiddling, is capable of some great results. For parity with the iPhone 5, these photos were taken in fully automatic mode, letting the camera figure things out.

Ricoh GRD 3

I had to take this shot twice because the Ricoh chose a very shallow depth of field, focused on the leaves in the middle, which left the stone duck and foreground leaves blurred out. It’s a little underexposed, but the larger sensor gives some beautiful detail to the fern.

iPhone 5

The iPhone 5 analyzed the same scene, and chose to keep a relatively deep focus for a usable shot the first time around. The photo is also noticeably warmer (pleasant, but perhaps inaccurate) and brighter. This photo is good to go without any editing, which is how most users want it. No problems with sharpness in the details.

Ricoh GRD 3

The GRD had trouble focusing again, and ended up with a spot in the middle (above and to the right of her nose), which keeps the dog’s legs in focus but not the face. Although the fine details that were in focus were captured with good clarity, the photo is pretty dull and boring on the whole. Your aunt would not consider this a keeper without a trip to iPhoto.

iPhone 5

Again, brighter and warmer. I don’t think the iPhone makes everything warmer; it seems to happen only in shade and indoor lighting conditions. None of the daylight shots I’ve seen so far look overly warm. Sharpness is consistent across all areas of interest, and noise is acceptable for ISO 400. Fine fur details are not as well resolved as in the GRD photo, but this may be down to JPEG compression; an app that allows a lower compression setting, such as 645 Pro, might compensate for it.

For most purposes, I can’t see why the iPhone 5 wouldn’t be an adequate camera replacement. In terms of straight-from-the-camera usability, these photos are astounding compared to the GRD III, which used to cost in the region of USD$500-600 (it has now been replaced by the GRD IV model).

I’ve gone on a few trips where I ended up taking all or most of my photos on an iPhone 4/4S, with few regrets. Focusing on the 4S was a little touchy; if you hit the button too soon, it would take the photo before focus had fully locked. This seems to work the way it should on the iPhone 5.

Nudged

It’s been a long day. I heard the news about Steve Jobs from Facebook and Twitter while I was still in bed in the morning. I didn’t think it’d be this soon; like John Gruber, I kept believing he’d pull through again. Not shocked, not depressed, but deeply moved by the enormity of what had been lost.

I said to someone that future generations capable of mapping time and parallel dimensions might look at their charts and see how the course of our world changed at this moment. Things are different now, for us all, than they might have been if he lived to be 90. I don’t know anyone who could doubt that.

At lunch, I bought the iPod classic I’ve been thinking about for the past week. Silver, not black. Closest to the original. I remember getting an iPod with my first Mac, an iBook, and loving it passionately as an extension of that computing experience, one that I was thrilled to take out with me each day. The music player and laptop had nothing in common from a technical point of view, but they were both imbued with the same values.

Steve’s values, or Apple’s values? The common theory is that these days, they’re indistinguishable thanks to codifying efforts by Jobs himself, but I can’t discount the value of great leadership or ignore the subtle differences present even in people who share the same values. The company he founded will continue to succeed within the trajectory they’re now on, but we’re missing a nudger now. A man who puts the rest of us on a different course as a matter of his own existence.

I didn’t want this to be a Steve-Jobs-changed-my-life post, but crossing paths with those first two instances of his work caused my own views and interests to be nudged, my trajectory recalculated. Until the maps of some time travelers fall into my lap, I can’t imagine the life I was going to have before he touched it.

Ditching Read It Later for Instapaper

This evening I made the switch from Read It Later to Instapaper. The latter is by far the more popular service. On the surface, it might be hard to choose one over the other. Their iPhone apps both cost $4.99 (Read It Later, Instapaper), they both have free-to-use websites, they both suck the text out of a web article you’re too busy to read at the moment of encounter, and store it online for later enjoyment. Well, at least that’s the idea.

It seems grabbing the right text off a page isn’t that easy, and RIL was just letting me down too many times. Quite often I’d have words like Home, About, and Related Articles – clearly bits of the navigational interface missed by the dust filter – appearing before or in the middle of the story I wanted to read. Sometimes they’d be the only words on display: the article itself having been weeded out and tossed aside, 90% of the page’s content or not!

The RIL text engine wasn’t very smart about pretending to be a normal browser either. Sometimes the policing mechanisms of a website would prevent it from loading the intended content and direct RIL to the front page instead. In the instances where I might only get around to reading the article months later, there’d be simply no way to remember what I was supposed to have been saving. Salon, Edge Magazine, Wired Mobile, and The New York Times all gave it trouble, among others.

There were reasons I stayed this long, though. Read It Later excels at being social. After reading an item I really liked, I could send it to Diigo for full-text archiving, or Evernote, or tweet it, Facebook it, bookmark it in Delicious, share it in Google Reader, or even email the plain text to a friend who might be interested. The Diigo bit was closest to my heart. But for every sweet feature – a full-screen view and a scrollbar for quick skimming are two examples worth mentioning – there’d also be the disadvantages of being second-best.

I think the reason Instapaper has such a knack for sniffing out the right words from a page is that dedicated users send Marco Arment emails whenever something doesn’t work right. By his own admission, the system is a pile of hacks, but as far as the end user (me) is concerned, it just works. I wish it didn’t always have to be about Features vs. Excellence, but Instapaper definitely wins the lower-my-blood-pressure challenge. RIL probably doesn’t get enough feedback to develop a comparably intelligent engine, but missing the first paragraph of every article on the New York Times? Come on.

Also, most apps install support for Instapaper first, and the wait for RIL integration is always long and uncertain. I don’t know if Nate Weiner, Read It Later’s developer, does anything to help adoption of his service along, but like in the case of the new Twitterrific for iPhone, users like me end up being the ones petitioning other app developers to please please please consider adding RIL support. It sucks.

Plus, in the time since I last saw Instapaper, it’s received a bunch of great new features like a paginated viewing method, and an enhanced presentation with inline graphics. I’ll miss RIL’s sharing features, and hope Instapaper adds just a couple more export options to the current choices of Tumblr & Twitter (Diigo, please!), but for the moment it’s enough that I can bookmark stuff and be secure in the knowledge that they’ll be waiting for me, complete, when I get to them.

The fact that this blog somehow appears in the screenshot for Instapaper in the App Store has nothing to do with it, I swear!

➟ Apple’s new iPhone 4 ads

I posted these four ads on Twitter earlier, calling them a cut above Apple’s recent advertising; each one a force of emotion. It strikes me now that these ads are so natural, so well conceived and performed, that they’re more moving than scenes several times their length in a Hollywood film. In any case, they are a refreshing change from disembodied hands and players introducing themselves as metaphors for machines.

What’s remarkable about Apple’s advertising is how accurately it has come to reflect the brand’s approach. It’s a lot rarer than you’d think, and most communications from large companies with offices in multiple countries inevitably veer into “off-brand” territory. Just as the modern Mac and iPhone are familiar tools whittled down to their purest forms – no extraneous buttons or indicator lights, solid blocks of CNC-machined material, and straightforward “naturalistic” user interfaces – the modern Apple ad is simple, uncluttered, and devoid of transitions and flashy effects.

They keep the basics: a story, a product, and a pay-off. These iPhone 4 ads all have the same straightforward presentation, an over-the-shoulder shot of someone having a FaceTime conversation, and yet they look like no other ads on TV. You’d recognize the next one in a heartbeat. They’ve taken out everything that could be a distraction, and there’s nothing you could add to make them better. That’s good work, and the craftsmanship is impeccable. I imagine being on the Apple account at TBWA\Chiat\Day is like being an honorary Apple employee.

Link (Apple.com – four new ads total)

Smartphone usability and my parents

I just watched my mother try to take a photo with her Nokia smartphone for the first time. An orchid in the home was blooming, and it was the closest camera within reach. She only uses it as a regular phone, and as the least technically-minded member of the family, is strangely the only one not using an iPhone. Needless to say, she was baffled by the Symbian OS. The camera is a primary hardware feature of the device, yet its icon was buried in a submenu. Afterwards, she asked my father where to find the file so she could email it to herself, and he couldn’t readily answer her.

His last phone before the iPhone 3GS was a Nokia E90 Communicator, a top-of-the-line Symbian workhorse business machine. He’d spent so much time understanding how it worked that the iPhone’s simplicity initially confused him. He’d ask how to access the file system so he could manage his data. Coming around to a task-centric model (photos are always available in the Photos app; music lives in the iPod app, managed with iTunes) took a while, but now that he gets it, the Nokia way is unfathomable. Managing a nested file system on a mobile device is no consumer’s idea of fun.

There’s always been the image of Macs being for stupid/lazy people who can’t work “real” computers and handle complexity in the user interface. Now the iPhone has inherited that reputation in the face of competition from Android, a system that David Pogue calls “best suited for technically proficient high-end users who don’t mind poking around online to get past the hiccups” in his review of the new Droid X. This became clearer as I got older, but I don’t consider most people over the age of 40 who struggle with technology to be stupid or lazy. It comes down to privilege, familiarity, and priorities.

One of Apple’s most prominent user experience attempts at improving accessibility involves mimicking real-world interfaces, such as using a yellow notepaper background and handwriting fonts in Notes, and superfluous flipping page animations in iBooks. Marco Arment has a good post on this: Overdoing the interface metaphor. It’s a divisive strategy that works well in the early stages of familiarization, but soon becomes a hindrance as one grows more proficient/confident. One of the best metaphors I’ve ever encountered on a mobile device was the lens cover on one of my old Sony-Ericsson cameraphones. Slide it open, and the camera application started up. Along with a physical shutter button, it was perfect, and my mother would have understood it instantly. Such a design benefits even experienced users who know how to start the camera up from the main menu. It’s easy to see how a physical feature can offer that experience, but the real challenge is finding that middle ground in software.

* I ended up taking the photo with my Panasonic LX3.

➟ Steve Jobs at D1

From 2003, Steve Jobs at the first All Things Digital conference. At this point, Apple had only sold 700,000 iPods after two years on the market, and the question of whether Apple would build PDAs and tablets was in the air. Steve said no, but his replies were conditional, and it’s clear that the iPhone/iPad were brought to market only after addressing all the shortcomings these concepts had in 2003.

On everything else they talked about, he was dead right. Microsoft’s just-announced tablets did fail, and handwriting technology is now irrelevant because everyone prefers the speed of typing. And yet Bill Gates just repeated the other day on Larry King that he still doesn’t believe in the iPad because it lacks pen support.

Update: 27 minutes in, Walt Mossberg gives him a few minutes to demonstrate the newly-launched iTunes Music Store. It’s a real masterclass in sales pitch delivery: passionate, concise, human.

Link [45min video at allthingsd.com]

MacBooks updated, but even consumers should go Pro

Image: Apple.com

Apple has just updated their entry-level MacBook models to match the recent 13″ MacBook Pros in terms of speed, battery life, and graphics performance, whilst maintaining a fair-sounding USD$999 (SGD$1488) price point.

That money will get you a 2.4GHz Intel Core 2 Duo processor, an Nvidia GeForce 320M graphics processor with 256MB of memory, and a non-removable battery capacious enough to last through 10 hours of typical use. That’s really the best feature here; five years ago you’d be happy to get three hours out of a low-end machine.

But if you upgrade a MacBook to have 4GB of RAM ($1648) and compare that to a 13″ MacBook Pro (with 4GB of RAM as standard, $1788), it looks like a much poorer deal. $1648 vs $1788, for a difference of $140.

Here’s what that $140 gets you:

  • A sturdier aluminium body that’s slimmer all around and just a bit lighter
  • The option of upgrading to a maximum of 8GB of RAM, instead of 4GB for the MacBook
  • An illuminated keyboard that dims in response to ambient lighting conditions
  • FireWire 800
  • An SD card slot
  • The appearance of not being a cheapskate/noob/student.
Jokes aside, I can’t see why it would be in anyone’s interest to buy this model over a MacBook Pro. Sure, mainstream consumers will appreciate the SD card support when dealing with digital cameras, and the metal body probably handles heat better, but the ability to install RAM past 4GB is the closer for me. If you buy your computers with the intention of using them up to the three-year mark and beyond, you’ll want that upgrade path in your future. A little extra memory in the later years can go a long way towards rejuvenating an old computer and preparing it for the demands of more advanced operating systems.

The New Apple

There’s a phrase that tends to pop up in conversations about the latest divisive move from Cupertino: “the new Apple”. There’s always a new Apple that threatens the way things have been, or turns its back on a loyal segment; doing something other than what we, presumably desirable, tech-savvy customers want for our money.

Lately, it’s been the iPad and its being in bed with the iPhone OS when we’d already arranged for a marriage to Mac OS X. It’s a computer for grandparents that will have severe implications for their grandchildren’s ability to grow up into the kind of curious, tinkering hackers who poke their noses where they don’t belong and thereby discover new and better ways to write software and build hardware and renew the flattened spirit of progress – thus we are destroying the circle itself! – the naysayers charge, gasping for air.

With the iPhone model, software developers leave Apple a cut of every sale on the sides of their plates, while suffering the indignity of letting the publisher have final veto rights. Tinkering and sales aside, the goddamned thing wants to be a computer but has no multitasking! – This is the work of the new Apple.

When new MacBook Pros were released with the same glossy, reflective screens as consumer MacBooks, pissing off graphics professionals who needed color accuracy and glare-free visibility in daylight, that too was the new Apple. The new Apple ditched PowerPC chips for Intel’s, after trumpeting the former’s superiority for a decade; the new Apple said no removable batteries for any portable device, too bad if you have a 20-hour flight; the new Apple also developed an odd nippled mouse that stopped scrolling after just months of use, ironically named after an unstoppable cartoon character; the new Apple resembles the Orwellian state in the old Apple’s ‘1984’ ad, year after year.

The truth is, of course, that there is no new Apple. The ones who talk about it, imagine it, are mostly from a core of computing enthusiasts and creative professionals who have had love affairs with their Macs from before the second coming of Jobs. When consumers flocked en masse to cheaper PCs, they stayed with the ship and played music like nothing was happening. And edited video. And designed layouts. And touched up photos. The creative industry stayed with the Mac because it had the best software for their needs. Over time, they made the platform their own.

Theorists might point to Jobs’ return and subsequent introduction of colorful, family-friendly iMacs as the day when new Apple began, but only because of how long it had been since Apple last produced anything of interest to the public. If anything, the new Apple was born right after the Apple II.

Designed to be a computer for the everyman, the first Macintosh was built on the same fundamental principles as the iPad 26 years later. Intuitive to use above all else, thanks to new technologies: a mouse then, multi-touch now. Resistant to tinkering: both are sealed with limited options for expansion. The inexplicable absence of features that might have been trivial to add: a color screen and hard drive on the Mac, a camera and multitasking on the iPad. Both were doubtlessly shaped by the idiosyncratic tastes and insights of Steve Jobs, whose involvement and personality define Apple to the point that the idea of a ‘new’ direction seems flawed. It has always been Steve’s way.

Professionals need to believe that because they kept the company going for much of the 80s and 90s, their needs are still important to it. But the Mac Pro is the last remaining concession to this group of customers. It’s the only Mac that can be upgraded, and to which more than one non-glossy display can be connected for serious graphics work. Ever since the explosion of Mac use in the home, with the help of iLife and iWork as key selling points, the face of Apple has changed. If I’d asked you ten years ago to describe the Mac for me, you’d have said “used by video editors and designers”. Chances are, that’s not your first thought today.

I don’t suggest that Apple is leaving professionals out to dry. The segment is obviously still extremely important for the brand’s prestige, and these customers are useful for pushing engineering efforts into things like octo-core and 64-bit computing, all of which eventually trickle down to the consumer products. But there have been bumps in the road to show that the company’s attention is slipping now that it has gained the widespread consumer adoration it has courted all along. Case in point: the recent debacle over the MacBook Pro’s downgraded SATA interface. By the way, we’ve reached a point where the Pro products are bought by regular consumers just because they look cooler or carry more status. It was a recognizable trend by the time MacBooks sold out at a premium price just for being painted black, and it made a sort of poetic sense when the unibody aluminum consumer MacBooks morphed overnight into 13″ MacBook Pros earlier last year.

With the help of pundits and analysts who, at best, bat a little over 50%, it’s all too easy to fall into the trap of thinking you know the game plan, which is how all ‘new Apple’ complaints begin. If you want to know what the new Apple is liable to do, just ask whether the common man will understand it, notice what’s missing or broken, and still buy the hell out of it anyway. Just like the first floppy-drive-less Macs, less-space-than-a-Nomad iPods, and 2G-only iPhones.

Fear of a Pad Planet

There’s been a certain reaction to the iPad from some quarters of the tech-inclined community, inspired by the belief that the device signals a shift towards a new form of computing that old people can finally understand. That reaction has been fear and apprehension.

It begins by looking at the iPad as a better personal computer for the majority of people. After all, it surfs the web, does email, plays games, and that’s what most people do with their computers most of the time, right? Better yet, it does all of those things without a long boot-up sequence, viruses, and confusing computery concepts like a filesystem, administrator rights, directories (recently renamed ‘Folders’ for these same users), registries, multi-step installation procedures, and the list goes on. Parents will finally stop calling us for help with strange error messages, and we will forget that it was ever hard.

But if people start to prefer the iPad and its descendants to ‘real’ computers, so the argument goes, then we will have robbed the next generation of a basic foundational understanding of computers. Because there will be no tinkering in Apple’s clinical workshop, they will never see the crucial workings of a program beneath its simplified user interface, and we will not have people to build the next Google, YouTube, or BitTorrent. The iPad/iPhone were built to enable end-users to consume content, and so it must be that creativity stands to suffer.

As I wrote yesterday, I currently see the iPad as a great way to access information and interact with media, freed from the physical constraints of an iPhone’s smaller screen and shorter battery life. Apple sees it, quite necessarily, as something more*. Which is why they built iWork productivity apps and demonstrated Brushes, an application that lets the large screen be used as a drawing surface for artists.

Offering a new breed of computer to an older person and seeing them take to it with joy and wonderment, as opposed to frustration and confusion, is a wonderful image and what the industry should work towards, but just because a filesystem is obscured doesn’t mean the curious can’t get to it. One might argue that jailbreaking an iPad is no different from the things people did to their computers in the past. There will always be unauthorized tools for messing around, and one day you may even be able to write, compile, and test code for an iPad on the thing itself. I wouldn’t worry about the younger generation of hackers.

My parents online

I want to talk about two tasks I’ve observed my parents and people their age doing on their computers.

1 – My mother mainly works with email. She receives documents relating to her church activities, which she must save locally before editing and sending them out again to other members of her group. She organizes these files in folders, which are really good metaphors that she understands, and often keeps multiple dated versions.

Of course, the iPad of today can’t save email attachments for working on in the Pages word processor. One day it will. But that sort of management is bound to increase the level of complexity. Lists of documents, tags or folders, deleting and renaming, and so on. I thought of introducing her to Google Docs, which would let her work with live documents in the cloud, and even collaborate in real-time with her friends. When changes are made, instead of emailing a copy of a document to other people, she would only have to send invites to view the document online. The iPad would work well with that approach – no local storage necessary. The responsibility and blame for any complexity is passed off onto the web service provider, in this case Google, leaving the iPad’s reputation to remain spotless.

2 – My father (and other fathers I hear about) likes to download videos off YouTube for later viewing, both on the desktop and on his iPhone. These are usually music videos and funny but horrifying accidents. This requires using a program or website like KeepVid to save them locally, and then often another program to re-encode the clips for use on the iPhone.
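
As an aside, here’s a minimal sketch of what that second, re-encoding step can look like when scripted instead of done through a GUI tool. It assumes ffmpeg is installed (the /usr/local/bin path and the file names are placeholders, not anything my father actually uses) and simply shells out to it from Swift to produce an iPhone-friendly H.264/AAC file.

    import Foundation

    // Illustrative only: re-encode a downloaded clip into an
    // iPhone-friendly H.264/AAC MP4 by shelling out to ffmpeg.
    let input = "saved-clip.flv"      // placeholder file name
    let output = "clip-iphone.mp4"    // placeholder file name

    let ffmpeg = Process()
    ffmpeg.executableURL = URL(fileURLWithPath: "/usr/local/bin/ffmpeg") // assumed install path
    ffmpeg.arguments = [
        "-i", input,            // source file
        "-vf", "scale=640:-2",  // scale width to 640px, keep aspect ratio
        "-c:v", "libx264",      // H.264 video
        "-c:a", "aac",          // AAC audio
        output
    ]

    do {
        try ffmpeg.run()
        ffmpeg.waitUntilExit()
        print(ffmpeg.terminationStatus == 0 ? "Re-encoded \(output)" : "ffmpeg exited with an error")
    } catch {
        print("Could not launch ffmpeg: \(error)")
    }

Of course, the whole point is that this is exactly the kind of thing a parent shouldn’t have to do, or even know exists.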

I believe saving videos off YouTube is a copyright gray area that Apple will never touch by sanctioning an app that exists to do it. Music videos are often removed from YouTube when found to be unauthorized uploads, which might explain the compulsion to save them. But even if they stayed online, is streaming instead of saving an ideal solution? That’s a lot of wasted bandwidth, and what if they want a Taylor Swift video or two while traveling by air? Apple will never allow the YouTube app to save video and compete with iTunes sales.

Both of these scenarios and their cloud-based natures highlight the need for increased openness and cooperation on the web. If we can’t have open computing systems, then we need an open internet to take its place. My mother’s friends shouldn’t all have to have Google accounts to access her shared documents, and YouTube shouldn’t have a monopoly on streaming video just because the iPad comes with an app built-in. The widespread adoption of HTML5 video in lieu of Flash would be fantastic, and remove the need for a native YouTube viewer. Likewise, online storage accounts like the ones offered by Dropbox and Microsoft Live Mesh should be able to trade files and work together. Productivity and content creation services should have a way of talking to each other across networks.

I like Google Wave’s implementation of federated servers. You can run your own private Wave system, really make it your own for whatever purposes, but the underlying protocol can communicate with every other Wave server if/when you need it to.

If that kind of openness were applied to all other services, companies would stand to lose their ‘stickiness’, but they’d surely find other ways to retain users. Should a landscape of interoperability and sharing ever come to pass in every corner of the web, it would be to the benefit of us all. How fitting, then, if we were steered in that direction by the threat of having to work on oversimplified computers.

—-

With apologies to Public Enemy for the title.

* When Nintendo first launched the DS in 2004, they called it a “third pillar” to allay fears that the company was going mad and replacing its popular and very profitable Game Boy Advance series with a risky touchscreen experiment. The DS went on to become a huge hit, accelerating the GBA’s demise and eventually becoming their main handheld product. You may wish to see Apple’s positioning of the iPad as a similar play: someday it may overtake the MacBook completely.

Alex Payne on the iPad

Alex Payne, in a widely-linked article, wrote today that:

The thing that bothers me most about the iPad is this: if I had an iPad rather than a real computer as a kid, I’d never be a programmer today. I’d never have had the ability to run whatever stupid, potentially harmful, hugely educational programs I could download or write. I wouldn’t have been able to fire up ResEdit and edit out the Mac startup sound so I could tinker on the computer at all hours without waking my parents. The iPad may be a boon to traditional education, insofar as it allows for multimedia textbooks and such, but in its current form, it’s a detriment to the sort of hacker culture that has propelled the digital economy.

As far as I can tell, Apple never intended for young Alex Payne to access the Mac’s startup sound any more than they intend for a future programmer to hack an iPad’s filesystem and do some tinkering of his own tomorrow. Sure it’s harder with DRM and encryption, but we’re united by the internet these days, and breaking those walls down has become a group effort. No young hacker today has to learn alone. We change with the territory and so nothing has really changed at all.

The Causes of iPad Disappointment

Letdown. Disappointment. Anticlimax. These words have appeared in nearly all the first articles written about the newly unveiled Apple iPad, barely a day old in the world. The reasons are not entirely important in the long run, but many of these stories themselves will admit that expectations were raised beyond reasonable levels, that Apple had no hope of impressing everyone the way they did in 2007 with the first iPhone. This environment of fanciful conjecture and presumptuous theorizing was the result of an industry’s decade-long fascination with getting the idea of a tablet computer to stick – seemingly against the wishes of the consumers meant to buy them – and the belief that Apple can succeed where other companies embarrass themselves. They’ve done it before, after all.

It never helps that Apple says very little about what it’s got until it’s got it. The veil of secrecy provides theatrical levels of entertainment at every event; charged affairs where people whoop and whistle from the moment Jobs takes the stage. As press conferences go, they overdeliver. But on the eve of events such as the iPad’s debut, that enthusiasm cuts both ways, and the company is left with the unenviable task of managing expectations without any direct communication – a task that has become increasingly hard over the past few years.

In the beginning, Macs were a relatively quiet business; high-profile products that most people saw but never considered owning. The success of the iPod energized Apple’s public image, and eventually sensational moves like the decision to cancel its hottest product, the iPod mini, in favor of an impossibly innovative new replacement, raised the bar not only for its hapless competitors, but for the company itself. Even then, the shifts from small to smaller iPods with color screens where black and white displays were once standard were no indication of the iPhone’s shape or form before January 2007. That device’s unprecedented introduction, so many orders of magnitude beyond what had been expected of Apple (a phone that played music, and that wasn’t as ugly or challenged as the Motorola ROKR, would have sufficed for many), changed the pre-event guessing game forever. Do one magic trick, and you’re always going to be asked for more.

Consider that at the time of the iPhone’s release, touchscreens were not a new technology. Palm’s PDAs and countless phones running Microsoft’s Windows Mobile operating system had touchscreens for years, and were fairly well liked. Yet as the tech world watched Steve Jobs scroll a contact list with a flick of his finger, it was impossible to make the connection between that and the experience we had become accustomed to. The older technology, resistive touchscreens, required styluses or fingernails, with scrolling conducted with bars that were clicked and dragged. A series of small innovations (capacitive touchscreens, direct contact momentum scrolling, and a smoothly-animated graphical interface) combined to redefine the way we expect to interact with handheld computers today. It’s a classic Apple play: refine existing technologies, add advancements in software, and produce an entirely new class of product.

Given the rise of ebook readers like the Kindle, and the continuing efforts of PC makers to fashion smaller and cheaper computers from low-power CPUs like Intel’s Atom, it was only natural to think that Apple would soon do the same for the popular netbook category or a tablet*. It would be another game changer equal to or greater than the iPhone, we began to hear. In the days leading up to January 27, a quote attributed to Steve Jobs was circulated, to the effect that the tablet would become his greatest achievement. John Gruber of the Daring Fireball blog predicted that Apple was ‘swinging big’ with a new product that one would buy instead of a laptop. Others dreamt of new multi-touch interfaces that would further bury Microsoft’s second stab at tablet computing, shown to be an HP “slate” running Windows 7 without any modifications that might make operation with a finger possible in lieu of a mouse. In all fairness, many of these outsized rumors were based on the presumption that the tablet would cost up to USD$1000. What could possibly cost that much, close to a full computer, except a full computer? It would be ironic if the thousand-dollar figure was leaked by Apple to increase the impact of the final $499 price point, only to backfire by raising expectations.

These pre-announcement assumptions by enthusiasts and tech writers are now par for the course, as are the disappointments that follow each new product revelation. The iPhone 3G largely met expectations because it corrected the one deficiency that kept the original iPhone from greatness**: the speed at which it accessed the internet. It also coincided with the opening of the platform with the iPhone SDK, which led to the app-happy state of affairs we now enjoy. Regardless, complaints about the low-resolution camera, non-removable battery, etc. continued to get a public airing.

Last year’s iPhone 3GS was roundly criticized for being an unexciting upgrade, retaining the same looks as its predecessor with little more than a megapixel and speed bump, effectively delivering the previous year’s expected iPhone but late. It went on to become a huge success. The buying public is immune to disappointment, it seems, perhaps because they don’t read blogs that sell them pipe dreams.

The iPhone 3GS announcement, and the internet’s lukewarm reactions to it, would be a good analogy for what’s happening with the iPad, except nobody hoped for the next iPhone to summarize a decade of engineering efforts. Like the iPhone 3GS, the iPad initially appears to be an evolutionary product, being based on existing technology Apple has repeatedly shown in public. It’s faster and more powerful, but not radically different from known territory. At first sight, it’s hard to imagine what the changes mean in actual use. You might think you can live with the old model and how things used to be. That’s a shame.

Apple is positioning the iPad as a third pillar in their portable product lineup: more than a smartphone, less than a laptop, yet better at some things than either of the other two. This instantly invalidates the idea of buying one “instead of a laptop”. Clearly you are meant to have both. It syncs with a Mac or PC the same way an iPhone or iPod does. It’s a secondary computer, but it’s also an appliance (see Farhad Manjoo’s articles on Slate.com – here and here).

It sends email, it plays games, and it will be fantastic for Twitter, but in my opinion, the iPad in its current form holds the most value as an interactive document, or to use a term repeated many times last night, an ‘intimate’ way of experiencing media. Despite having no plans to purchase a Kindle DX or similar reader, I suspect that I will fall in love with the thing the moment I hold a book-sized slice of full-color webpage in my hands. As Manjoo writes, “Everything about it—its size, shape, weight, and fantastically intuitive user interface—feels just right.”

With the first iPhone, Apple understood that touch interfaces are an emotional experience. Pressing the pad of your finger to a virtual page (in Eucalyptus, for example) and turning it fools some part of the brain that isn’t dedicated to understanding that a screen is not the same thing as paper. It’s satisfying, even though it’s a facsimile of a real experience. It’s more realistic than using a fingernail, which one never applies to a real page, and more personal than a stylus. My guess is that if capacitive touch of that quality wasn’t available, the iPhone project would never have gotten the green light.

I submit that the iPad takes one more step towards solidifying the illusion of digital media with real, physical presence. It’s not just a bigger iPhone, it’s an iPhone big enough to pass for a printed page and fool your mind. A frame that holds websites with long-form writing, augmented with video and animation, that we can hold lazily, effortlessly, in our hands or laps like the glossy magazines or newspapers they approximate, is nothing short of magical, to borrow another marketing word. If the iPad became transparent like a slab of glass when turned off, wouldn’t it be exactly like the science fiction movie ideal of a portable computer? Don’t you want to live in the future? I say use your imagination.

And yet more frustration sprang from the absence of anticipated features, some of which are explained by the iPad’s positioning as a third pillar, while others invite more guesswork and predictions. Q. Why doesn’t it have a camera? A. It’s not for photography, and videoconferencing is the realm of a phone. Q. Why doesn’t it have USB ports/connect to an iPhone for syncing and tethering? A. It’s not a primary computer. Q. Where’s the multitasking and improved notifications? A. iPhone 4.0?

I believed that an update to the current iPhone OS, version 3.2, would be announced during yesterday’s event, which didn’t happen. As we know, the iPad runs a version of the OS with this number, but for the next 60 days it’s not something you can buy. That gives a 60-day window in which to expect iPhone OS 3.2 to be released for existing iPhone users. Will this bring some of the iPad’s new features to the iPhone, like the iBooks reader and iBookstore? My guess is no. These will remain exclusive iPad features for the time being. In that case, iPhone OS 3.2 might only bring a few bugfixes and trivial UI enhancements. I believe Apple is already looking ahead to iPhone OS 4.0, to be announced in the March-May window, and released in concert with the next iPhone model. Improved notifications are a must, and I’m fairly confident they will be present.

The other concern, third-party app multitasking, is far and away the number one demanded feature for the iPhone OS amongst people I know, but I’m becoming ever more skeptical that it will materialize. Apple has invested heavily in push notifications as an alternative, too much for it to be merely a stopgap measure. With the iPad, which Apple is attempting to push as a viable machine for occasional work (more than a smartphone, less than a laptop), the lack of multitasking is even more apparent. If I’m writing a document in Pages and need to move back and forth quickly between a website, email, and a notes app from which I might want to copy information, iPhone OS 3.x requires me to switch in and out of the Home Screen each time, closing and reopening the apps.

The non-multitasking answer? Persistence and a quick launcher. iPhone OS 4.0 could enable a system-wide method for saving an app’s state (what you’re doing in it) when you quit, and having it restore upon the next launch. And according to Gruber, everyone who’s laid hands on it agrees the iPad and its new A4 processor are incredibly fast. Add that combination to a home button double-click that pops up a list of your last five apps, and you’re effectively alt-tabbing between running applications without the battery drain.
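
To make the idea concrete, here’s a rough sketch of what per-app state saving and restoring could look like in principle. None of this is a real iPhone OS API – the DraftState type, the file location, and the hook points are all made up for illustration – but the mechanics are simply “serialize what the user was doing on quit, read it back on launch”.

    import Foundation

    // Hypothetical snapshot of what the user was doing in a writing app.
    struct DraftState: Codable {
        var documentName: String
        var cursorPosition: Int
        var scrollOffset: Double
    }

    // Illustrative save location; a real system would manage this per app.
    let stateURL = URL(fileURLWithPath: NSTemporaryDirectory())
        .appendingPathComponent("draft-state.json")

    // Called when the app quits: write the snapshot to disk.
    func saveState(_ state: DraftState) throws {
        let data = try JSONEncoder().encode(state)
        try data.write(to: stateURL, options: .atomic)
    }

    // Called on the next launch: read the snapshot back, or start fresh.
    func restoreState() -> DraftState? {
        guard let data = try? Data(contentsOf: stateURL) else { return nil }
        return try? JSONDecoder().decode(DraftState.self, from: data)
    }

    // Simulated quit/relaunch cycle.
    do {
        try saveState(DraftState(documentName: "Report.pages", cursorPosition: 128, scrollOffset: 0.42))
    } catch {
        print("Could not save state: \(error)")
    }
    if let resumed = restoreState() {
        print("Resumed \(resumed.documentName) at character \(resumed.cursorPosition)")
    }

Pair something like that with a fast launcher and quick app start-up, and switching between apps would feel like multitasking even though only one app is ever actually running.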

iPhone OS 4.0 will be a big deal when announced, whatever it actually does, and Apple understandably couldn’t show its features yesterday as part of the iPad. But because they share the same OS, it stands to reason that whatever the iPhone gains from here on out will also be on the iPad. Given that the iPhone is incredibly important to Apple, already accounts for 32% of smartphone profits worldwide, and receives rapid software development both from within the company and out, it’s inevitable that the iPad’s image will soon evolve beyond that of an underwhelming giant iPod touch.

~

* Legend has it that research for a “Safari Pad” tablet wound up becoming the basis of the iPhone, so it’s possible that the universe is completely backward.

** It’s worth noting that the original iPhone was also criticized for its camera, built-in battery, price, and not supporting third-party applications, Java, Microsoft Exchange, MMS, copy and paste, etc. despite being ridiculously ahead of the pack in terms of miniaturization, engineering, web browsing, media playback, and user experience. After three years, competitors have yet to match the software or browsing. Someday, the iPad’s reception will probably be remembered in a similar way.