Tag: Apple

  • ➟ Apple’s new iPhone 4 ads

    I posted these four ads on Twitter earlier, calling them a cut above Apple’s recent advertising – each one a force of emotion. It strikes me now that these ads are so natural, so well conceived and performed, that they’re more moving than scenes several times their length in Hollywood films. In any case, they are a refreshing change from disembodied hands and actors introducing themselves as metaphors for machines.

    What’s remarkable about Apple’s advertising is how accurately it has come to reflect the brand’s approach. That’s rarer than you’d think: most communications from large companies with offices in multiple countries inevitably veer into “off-brand” territory. Just as the modern Mac and iPhone are familiar tools whittled down to their purest forms – no extraneous buttons or indicator lights, solid blocks of CNC-machined material, and straightforward “naturalistic” user interfaces – the modern Apple ad is simple, uncluttered, and devoid of transitions and flashy effects.

    They keep the basics: a story, a product, and a pay-off. These iPhone 4 ads all have the same straightforward presentation, an over-the-shoulder shot of someone having a FaceTime conversation, and yet they look like no other ads on TV. You’d recognize the next one in a heartbeat. They’ve taken out everything that could be a distraction, and there’s nothing you could add to make them better. That’s good work, and the craftsmanship is impeccable. I imagine being on the Apple account at TBWA\Chiat\Day is like being an honorary Apple employee.

    Link [Apple.com – four new ads total]

  • Smartphone usability and my parents

    I just watched my mother try to take a photo with her Nokia smartphone for the first time.* An orchid in the home was blooming, and the phone was the closest camera within reach. She only uses it as a regular phone and, as the least technically minded member of the family, is strangely the only one not using an iPhone. Needless to say, she was baffled by the Symbian OS. The camera is a primary hardware feature on the device, yet its icon was buried in a submenu. Afterwards, she asked my father where to find the file so she could email it to herself, and he couldn’t readily answer her.

    His last phone before the iPhone 3GS was a Nokia E90 Communicator, a top-of-the-line Symbian workhorse business machine. He’d spent so much time learning how it worked that the iPhone’s simplicity initially confused him. He’d ask how to access the file system so he could manage his data. Coming around to a task-centric model (photos are always available in the Photos app; music lives in the iPod app, managed with iTunes) took a while, but now that he gets it, the Nokia way is unfathomable. Managing a nested file system on a mobile device is no consumer’s idea of fun.

    There’s always been this image of Macs as being for stupid/lazy people who can’t work “real” computers or handle complexity in the user interface. Now the iPhone has inherited that reputation in the face of competition from Android, a system that David Pogue, in his review of the new Droid X, calls “best suited for technically proficient high-end users who don’t mind poking around online to get past the hiccups”. The older I get, the clearer this becomes: most people over the age of 40 who struggle with technology aren’t stupid or lazy. It comes down to privilege, familiarity, and priorities.

    One of Apple’s most prominent user-experience strategies for improving accessibility is mimicking real-world interfaces, such as the yellow notepaper background and handwriting fonts in Notes, and the superfluous page-flipping animations in iBooks. Marco Arment has a good post on this: Overdoing the interface metaphor. It’s a divisive strategy that works well in the early stages of familiarization, but soon becomes a hindrance as one grows more proficient/confident.

    One of the best metaphors I’ve ever encountered on a mobile device was the lens cover on one of my old Sony Ericsson cameraphones. Slide it open, and the camera application started up. Along with a physical shutter button, it was perfect, and my mother would have understood it instantly. Such a design benefits even experienced users who know how to start the camera from the main menu. It’s easy to see how a physical feature can offer that experience; the real challenge is finding that middle ground in software.

    * I ended up taking the photo with my Panasonic LX3.

  • ➟ Steve Jobs at D1

    From 2003, Steve Jobs at the first All Things Digital conference. At this point, Apple had sold only 700,000 iPods after two years on the market, and the question of whether Apple would build PDAs and tablets was in the air. Steve said no, but his replies were conditional, and it’s clear that the iPhone/iPad were brought to market only after addressing all the shortcomings those concepts had in 2003.

    On everything else they talked about, he was dead right. Microsoft’s just-announced tablets did fail, and handwriting technology is now irrelevant because everyone prefers the speed of typing. And yet Bill Gates just repeated the other day on Larry King that he still doesn’t believe in the iPad because it lacks pen support.

    Update: 27 minutes in, Walt Mossberg gives him a few minutes to demonstrate the newly launched iTunes Music Store. It’s a real masterclass in sales-pitch delivery: passionate, concise, human.

    Link [45min video at allthingsd.com]

  • ➟ iPad magic in Tokyo

    A Japanese magician performs a multimedia (and multi-prop) presentation with an iPad, out on the street by Ginza’s iconic Apple store. It’s a pretty impressive string of visual effects, one after another in under three minutes.

    Link [YouTube]

  • MacBooks updated, but even consumers should go Pro

    Image: Apple.com

    Apple has just updated their entry-level MacBook models to match the recent 13″ MacBook Pros in terms of speed, battery life, and graphics performance, whilst maintaining a fair-sounding US$999 (S$1488) price point.

    That money will get you a 2.4GHz Intel Core 2 Duo processor, an Nvidia GeForce 320M graphics processor with 256MB of memory, and a non-removable battery capacious enough to last through 10 hours of typical use. That battery is really the best feature here; five years ago you’d have been happy to get three hours out of a low-end machine.

    But spec a MacBook up to 4GB of RAM ($1648) and compare it to a 13″ MacBook Pro (with 4GB as standard, $1788), and the entry-level machine looks like a much poorer deal: $1648 vs $1788, a difference of $140.

    Here’s what that $140 gets you:

    • A sturdier aluminium body that’s slimmer all around and just a bit lighter
    • The option of upgrading to a maximum of 8GB of RAM, instead of 4GB for the MacBook
    • An illuminated keyboard that dims in response to ambient lighting conditions
    • FireWire 800
    • An SD card slot
    • The appearance of not being a cheapskate/noob/student

    Jokes aside, I can’t see why it would be in anyone’s interest to buy this model over a MacBook Pro. Sure, mainstream consumers will appreciate the SD card support when dealing with digital cameras, and the metal body probably handles heat better, but the ability to install RAM past 4GB is the closer for me. If you buy your computers with the intention of using them up to the three-year mark and beyond, you’ll want that upgrade path in your future. A little extra memory in the later years can go a long way towards rejuvenating an old computer and preparing it for the demands of more advanced operating systems.

  • The New Apple

    There’s a phrase that tends to pop up in conversations about the latest divisive move from Cupertino: “the new Apple”. There’s always a new Apple that threatens the way things have been, or turns its back on a loyal segment, doing something other than what we, the presumably desirable, tech-savvy customers, want for our money.

    Lately, it’s been the iPad and its being in bed with the iPhone OS when we’d already arranged for a marriage to Mac OS X. It’s a computer for grandparents that will have severe implications for their grandchildren’s ability to grow up into the kind of curious, tinkering hackers who poke their noses where they don’t belong and thereby discover new and better ways to write software and build hardware and renew the flattened spirit of progress, thus we are destroying the circle itself!, the naysayers charge, gasping for air.

    With the iPhone model, software developers leave Apple a cut of every sale on the sides of their plates, while suffering the indignity of letting the publisher have final veto rights. Tinkering and sales aside, the goddamned thing wants to be a computer but has no multitasking! – This is the work of the new Apple.

    When new MacBook Pros were released with the same glossy, reflective screens as consumer MacBooks, pissing off graphics professionals who needed color accuracy and glare-free visibility in daylight, that too was the new Apple. The new Apple ditched PowerPC chips for Intel’s, after trumpeting the former’s superiority for a decade; the new Apple said no removable batteries for any portable device, too bad if you have a 20-hour flight; the new Apple also developed an odd nippled mouse that stopped scrolling after just months of use, ironically named after an unstoppable cartoon character; the new Apple resembles the Orwellian state in the old Apple’s ‘1984’ ad, year after year.

    The truth is, of course, that there is no new Apple. The ones who talk about it, imagine it, are mostly from a core of computing enthusiasts and creative professionals who have had love affairs with their Macs from before the second coming of Jobs. When consumers flocked en masse to cheaper PCs, they stayed with the ship and played music like nothing was happening. And edited video. And designed layouts. And touched up photos. The creative industry stayed with the Mac because it had the best software for their needs. Over time, they made the platform their own.

    Theorists might point to Jobs’ return and subsequent introduction of colorful, family-friendly iMacs as the day when new Apple began, but only because of how long it had been since Apple last produced anything of interest to the public. If anything, the new Apple was born right after the Apple II.

    Designed to be a computer for the everyman, the first Macintosh was built on the same fundamental principles as the iPad 26 years later. Intuitive to use above all else, thanks to new technologies: a mouse then, multi-touch now. Resistant to tinkering: both are sealed, with limited options for expansion. The inexplicable absence of features that might have been trivial to add: a color screen and hard drive on the Mac, a camera and multitasking on the iPad. Both were doubtlessly shaped by the idiosyncratic tastes and insights of Steve Jobs, whose involvement and personality define Apple to the point that the very idea of a ‘new’ direction seems flawed. It has always been Steve’s way.

    Professionals need to believe that because they kept the company going for much of the 80s and 90s, their needs are still important to it. But the Mac Pro is the last remaining concession to this group of customers. It’s the only Mac that can be upgraded, and to which more than one non-glossy display can be connected for serious graphics work. Ever since the explosion of Mac use in the home, with the help of iLife and iWork as key selling points, the face of Apple has changed. If I’d asked you ten years ago to describe the Mac for me, you’d have said “used by video editors and designers”. Chances are, that’s not your first thought today.

    I don’t suggest that Apple is leaving professionals out to dry; the segment is obviously still extremely important for the brand’s prestige, and these customers are useful for pushing engineering efforts into things like octo-core and 64-bit computing, all of which eventually trickle down to the consumer products. But there have been bumps in the road to show that the company’s attention is slipping now that it has gained the widespread consumer adoration it courted all along. Case in point: the recent debacle over the MacBook Pro’s downgraded SATA interface. We’ve also reached a point where the Pro products are bought by regular consumers just because they look cooler or carry more status. It was a recognizable trend by the time MacBooks sold out at a premium price just for being painted black, and it made a sort of poetic sense when the unibody aluminium consumer MacBooks morphed overnight into 13″ MacBook Pros last year.

    With the help of pundits and analysts who, at best, bat a little over 50%, it’s all too easy to fall into the trap of thinking you know the game plan, which is how all ‘new Apple’ complaints begin. If you want to know what the new Apple is liable to do, just ask: will the common man understand it, notice what’s missing or broken, and still buy the hell out of it anyway? Just like the first floppy-drive-less Macs, less-space-than-a-Nomad iPods, and 2G-only iPhones.

  • Fear of a Pad Planet

    There’s been a certain reaction to the iPad from some quarters of the tech-inclined community, inspired by the belief that the device signals a shift towards a new form of computing that old people can finally understand. That reaction has been fear and apprehension.

    It begins by looking at the iPad as a better personal computer for the majority of people. After all, it surfs the web, does email, plays games, and that’s what most people do with their computers most of the time, right? Better yet, it does all of those things without a long boot-up sequence, viruses, and confusing computery concepts like a filesystem, administrator rights, directories (recently renamed ‘Folders’ for these same users), registries, multi-step installation procedures, and the list goes on. Parents will finally stop calling us for help with strange error messages, and we will forget that it was ever hard.

    But if people start to prefer the iPad and its descendants to ‘real’ computers, so the argument goes, then we will have robbed the next generation of a basic foundational understanding of computers. Because there will be no tinkering in Apple’s clinical workshop, they will never see the crucial workings of a program beneath its simplified user interface, and we will not have people to build the next Google, YouTube, or BitTorrent. The iPad/iPhone were built to enable end-users to consume content, and so it must be that creativity stands to suffer.

    As I wrote yesterday, I currently see the iPad as a great way to access information and interact with media, freed from the physical constraints of an iPhone’s smaller screen and shorter battery life. Apple sees it, quite necessarily, as something more*. Which is why they built the iWork productivity apps and demonstrated Brushes, an application that lets the large screen be used as a drawing surface for artists.

    Offering a new breed of computer to an older person and seeing them take to it with joy and wonderment, as opposed to frustration and confusion, is a wonderful image, and it’s what the industry should work towards. But just because a filesystem is obscured doesn’t mean the curious can’t get to it. One might argue that jailbreaking an iPad is no different from the things people did to their computers in the past. There will always be unauthorized tools for messing around, and one day you may even be able to write, compile, and test code for an iPad on the thing itself. I wouldn’t worry about the younger generation of hackers.

    My parents online

    I want to talk about two tasks I’ve observed my parents and people their age doing on their computers.

    1 – My mother mainly works with email. She receives documents relating to her church activities, which she must save locally before editing and sending out again to other members of her group. She organizes these files in folders – a metaphor she understands well – and often keeps multiple dated versions.

    Of course, the iPad of today can’t save email attachments for working on in the Pages word processor. One day it will, but that sort of management is bound to increase the level of complexity: lists of documents, tags or folders, deleting and renaming, and so on. I thought of introducing her to Google Docs, which would let her work with live documents in the cloud, and even collaborate in real time with her friends. Instead of emailing a copy of a document to other people whenever changes are made, she would only have to send invites to view the document online. The iPad would work well with that approach – no local storage necessary. The responsibility and blame for any complexity is passed off onto the web service provider, in this case Google, leaving the iPad’s reputation spotless.

    2 – My father (and other fathers I hear about) likes to download videos off YouTube for later viewing, both on the desktop and on his iPhone. These are usually music videos and funny but horrifying accidents. This requires using a program or website like KeepVid to save them locally, and then often another program to re-encode the clips for use on the iPhone.
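
    The second step, at least, is easy to script. Here’s a minimal sketch in Python of what that re-encoding pass might look like – assuming the clip has already been saved locally, that the free ffmpeg tool is installed, and with the file names made up for illustration:

      import subprocess

      def encode_for_iphone(source, target="clip.mp4"):
          """Re-encode a saved clip into an iPhone-friendly H.264/AAC MP4."""
          subprocess.run(
              ["ffmpeg",
               "-i", source,           # the clip saved via KeepVid or similar
               "-vcodec", "libx264",   # H.264 video, which the iPhone plays natively
               "-acodec", "aac",       # AAC audio
               target],
              check=True,              # surface an error if ffmpeg fails
          )

      encode_for_iphone("funny_accident.flv")

    Which is precisely the sort of plumbing no parent should ever have to see.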

    I believe saving videos off YouTube is a copyright gray area, one that Apple will never touch by sanctioning an app that exists to do it. Music videos are often removed from YouTube when found to be unauthorized uploads, which might explain the compulsion to save them. But even if they stayed online, is streaming instead of saving an ideal solution? That’s a lot of wasted bandwidth, and what if they want a Taylor Swift video or two while traveling by air? Apple will never allow the YouTube app to save video and compete with iTunes sales.

    Both of these scenarios, with their cloud-based natures, highlight the need for increased openness and cooperation on the web. If we can’t have open computing systems, then we need an open internet to take their place. My mother’s friends shouldn’t all have to have Google accounts to access her shared documents, and YouTube shouldn’t have a monopoly on streaming video just because the iPad comes with an app built in. The widespread adoption of HTML5 video in lieu of Flash would be fantastic, and would remove the need for a native YouTube viewer. Likewise, online storage services like Dropbox and Microsoft Live Mesh should be able to trade files and work together. Productivity and content-creation services should have a way of talking to each other across networks.

    I like Google Wave’s implementation of federated servers. You can run your own private Wave system, really make it your own for whatever purposes, but the underlying protocol can communicate with every other Wave server if/when you need it to.

    If that kind of openness were applied to all other services, companies would stand to lose their ‘stickiness’, but they’d surely find other ways to retain users. Should a landscape of interoperability and sharing ever come to pass in every corner of the web, it would be to the benefit of us all. How fitting, then, if we were steered in that direction by the threat of having to work on oversimplified computers.

    ----

    With apologies to Public Enemy for the title.

    * When Nintendo first launched the DS in 2004, they called it a “third pillar” to allay fears that the company was going mad and replacing its popular and very profitable Game Boy Advance series with a risky touchscreen experiment. The DS went on to become a huge hit, accelerating the GBA’s demise and eventually becoming their main handheld product. You may wish to see Apple’s positioning of the iPad as a similar play: someday it may overtake the MacBook completely.

  • Alex Payne on the iPad

    Alex Payne, in a widely-linked article, wrote today that:

    The thing that bothers me most about the iPad is this: if I had an iPad rather than a real computer as a kid, I’d never be a programmer today. I’d never have had the ability to run whatever stupid, potentially harmful, hugely educational programs I could download or write. I wouldn’t have been able to fire up ResEdit and edit out the Mac startup sound so I could tinker on the computer at all hours without waking my parents. The iPad may be a boon to traditional education, insofar as it allows for multimedia textbooks and such, but in its current form, it’s a detriment to the sort of hacker culture that has propelled the digital economy.

    As far as I can tell, Apple never intended for young Alex Payne to access the Mac’s startup sound any more than they intend for a future programmer to hack an iPad’s filesystem and do some tinkering of his own tomorrow. Sure it’s harder with DRM and encryption, but we’re united by the internet these days, and breaking those walls down has become a group effort. No young hacker today has to learn alone. We change with the territory and so nothing has really changed at all.