SAN FRANCISCO—In his Jan. 9 keynote address before the Macworld Expo crowd in San Francisco, Apple CEO Steve Jobs offered what has become an annual lesson in user values in technology design. But why do so few in the industry seem to be taking the course?
That is the great mystery. It's often easy to dismiss the power of integration, whether it's hardware and software or software and services. Or, with Apple's iPhone, something of all three.
When Steve Jobs demonstrated the first iPod and the iTunes music store, many analysts dismissed them. While the iPod had new, useful technology in its hardware interface (the click wheel) that meshed well with the content-management software on the computer, the whole thing appeared to some as just another product in a crowded category.
However, what sold customers on the iPod was the way that all parts of the system worked together: the elegance of the computer-side application; the expression of technology in hardware and user interface; its easy integration with the music store. Each component separately expressed excellence and usability, and together they were amazing.
Jobs said today that Apple has sold 2 billion songs to date from that store, 1.2 billion of them in 2006. Certainly, that's its own mark of success.
Yet some folks in the press section of the audience weren't sold on the iPhone. Of course, some of them also missed the value of the Internet on first demonstration.
However, I admit that it can be hard to check reality while in the bubble of a Steve Jobs Macworld demo. He is the master of such demonstrations, and the Mac-phile crowd hangs on every word from his gigantic projected mouth on the tall-and-wide screen in the Moscone Center.
Still, the device rang my bell in the cool department. With its full browser implementation, it's almost like a tablet PC, only smaller and with telephony. As Jobs said, most smart phones aren't very smart. The iPhone's IQ must be off the scale.
Here are a few notes that I scribbled in the dark:
- Details matter. The best single demonstration for me was what Apple calls the "pinch," which lets users zoom in on a part of the iPhone's display by sliding the thumb and index finger away from one another. What counts here is that the screen can handle multiple simultaneous inputs and interpret them correctly.
The pinch gesture was natural and immediately understandable, and it is made possible by the screen technology. Of course, Apple often provides several ways to do something; simply double-tapping the display also zoomed in on an area.
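The idea behind interpreting two simultaneous touches is straightforward to sketch. The following is a minimal illustration, not Apple's actual implementation: the zoom level changes by the ratio of the distance between the two fingers at the end of the gesture to the distance at the start. All names here are hypothetical.

```python
import math

def pinch_zoom(p1_start, p2_start, p1_end, p2_end, current_scale):
    """Derive a new zoom scale from two touch points (illustrative sketch).

    Each p* argument is an (x, y) tuple. The scale changes by the ratio
    of the finger separation after the gesture to the separation before.
    This is a hypothetical helper, not Apple's firmware.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    start = dist(p1_start, p2_start)
    end = dist(p1_end, p2_end)
    if start == 0:
        return current_scale  # degenerate touch; leave the zoom unchanged
    return current_scale * (end / start)

# Fingers spread from 100 pixels apart to 200 pixels apart: zoom doubles.
print(pinch_zoom((0, 0), (100, 0), (0, 0), (200, 0), 1.0))  # 2.0
```

Spreading the fingers apart yields a ratio above 1 (zoom in); pinching them together yields a ratio below 1 (zoom out), which is what makes the gesture feel symmetrical and natural.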
I also appreciated how automatic functions are enabled by small sensors in the phone. Apple leads in these little hardware-interface touches, such as the ambient light sensors in its MacBook Pro notebooks that can automatically control the screen's brightness or bring up the keyboard backlight for working in the dark.
Jobs said an accelerometer and proximity and ambient light sensors are built into the phone. These benefit the user interface by enabling automatic functions.
So, in the demonstration of the iPhone's photo capabilities, when users come across an image in landscape aspect, they just turn the phone sideways. The accelerometer senses the movement and automatically rotates the screen from portrait to landscape, or vice versa. This will also be useful when you're looking at a Web page in the iPhone's browser.
When users lift the phone to their ears to take a call, the proximity sensor turns off the display, which saves power and prevents inadvertent input from the ear or cheek.
And as in Apple's notebooks, the built-in ambient light sensor can dim or brighten the display's backlight to match surrounding light, which can save power.
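The logic behind these sensor-driven behaviors can be sketched in a few lines. This is a simplified illustration under assumed inputs, not Apple's actual code: orientation follows whichever screen axis gravity dominates, and the backlight level follows proximity and ambient-light readings. All function names and thresholds here are hypothetical.

```python
def orientation(ax, ay):
    """Pick portrait vs. landscape from accelerometer readings (sketch).

    ax and ay are the gravity components along the screen's horizontal
    and vertical axes; whichever dominates tells us how the phone is held.
    """
    return "landscape" if abs(ax) > abs(ay) else "portrait"

def backlight_level(near_ear, ambient_lux, max_level=100):
    """Decide a backlight level from proximity and ambient light (sketch).

    Returns 0 (display off) when the phone is held to the ear; otherwise
    scales the backlight with surrounding light, clamped to max_level.
    """
    if near_ear:
        return 0
    return min(max_level, int(ambient_lux / 10))

print(orientation(ax=9.8, ay=0.5))                      # landscape
print(backlight_level(near_ear=True, ambient_lux=500))  # 0: off during a call
print(backlight_level(near_ear=False, ambient_lux=300)) # 30
```

The point of the demo was exactly this kind of rule: the user never asks for any of it, yet the phone rotates the picture, kills the screen during a call, and dims itself in a dark room.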