What Lies Beyond the Macintosh Desktop Metaphor?

The Post-PC era, the era of iPads and other tablets, suggests that it is time to re-evaluate the archaic computer interface developed by Xerox and Apple 30 years ago. However, before we get too excited, there are lots of questions to ask and answer.

The interface we use on our PCs and Macs is generally known as WIMP: windows, icons, menus & pointer. The assumption is that the pointer is some kind of device like a mouse, trackball, or trackpad. We move a cursor to an object and perform some kind of drag or click.

Nowadays, we're all very familiar with iOS and how that differs from the OS X interface, and, naturally, questions arise.

  1. Is there a better way to interface with a desktop computer?
  2. Is iOS one of those ways?
  3. If not, what distinctions are important when considering the differences in the tasks and the way we use a tablet and a desktop or notebook computer?
  4. Is the movement to a more modern interface driven by user needs, a natural technical evolution, or developer considerations?
  5. What features of the traditional WIMP interface might be lost in a more modern interface and how important are they?


Like previous generations of interfaces, this too will pass. (Credit: Apple)

The Guardian

In Star Trek: TOS, “The City on the Edge of Forever,” the Guardian asks, “Since before your sun burned hot in space and before your race was born, I have awaited a question.”

Actually, I have several. And so, here they are:

1. Is there a better way to interface with a desktop computer? The answer is, yes, but not everyone is going to embrace it after using Macs since the beginning of time. Well, 1984 anyway.

I recall the transition from DOS to Windows and Apple II to Macs. It didn't happen overnight, and not everyone was enthusiastic. Heck, there are still people today using DOS.

Gregg Keizer quotes Patrick Moorhead, principal of Moor Insights & Strategy: “Just as it took 10 years for DOS to get out of everyone's system, only when 'Modern' is completely ready will the desktop disappear. It will take five, six or seven years, to bring all the important desktop apps into the Modern UI.”

We got over those transitions. We'll get over WIMP.

The key thing to remember is that the original character-based interfaces (UNIX, Apple II SOS, and DOS) were all that the CPU/GPU of the day could handle. As hardware improved, it became natural to represent our window (lowercase "w") into the workings of the computer with a virtual window, hence Windows and Mac OS. Today, with much more amazing hardware at our disposal, it makes sense to marry that technology with new ways of interacting with a computer: Siri, touchscreen interfaces, and perhaps 3D hand gestures.

This evolution of the interface, driven by hardware and advanced software, is a natural thing, not something to be dreaded.

2. Is iOS one of those ways? For now, Apple, at least, is acknowledging that different tasks are accomplished on the desktop and notebook computer than on the tablet. Tablets are limited in their size and battery power — and therefore their CPU/GPU capabilities. So it doesn't make a lot of sense to graft a tablet-oriented interface like iOS, by brute force, onto a much more powerful notebook or desktop computer.

For example, Macs are positioned differently, and so the notion of “gorilla arm” comes into play. We touch our tablets as they rest in our laps, but we don't want to hold an arm outstretched for long periods of time manipulating a more powerful machine's vertical screen.

At least, that's what Apple is doing. Microsoft and its partners, however, seem to be rushing headlong into a unified touchscreen interface with Windows 8 and the tiled UI common across PCs and tablets like the Microsoft Surface. I asked Microsoft whether the plan is to eventually get rid of the classic Desktop, but the company declined to comment for this article. There is much discussion, for and against, of a vigorous Microsoft vision that deprecates the classic Desktop and moves smartly to the tiled UI exclusively.

It's unlikely that Apple will follow Microsoft. For now, Apple appears to be bringing some of the best ideas of iOS into OS X without suggesting that iOS should be the future UI for all Apple computers. That process has been called the iOS-ification of OS X, but it doesn't mean that OS X becomes iOS or gets dumbed down. iOS currently doesn't have the flexibility (legacy apps), facilities (daemons), or legacy hardware support (drivers) needed for serious desktop or notebook use. So one has to ask certain additional questions. Namely:

3. What distinctions are important when considering the differences in the tasks and the way we use a tablet and a desktop or notebook computer?

Steve Jobs once said that desktops are like trucks. We need trucks for certain kinds of heavy lifting, but they'll become rare compared to the passenger cars most people drive. For example, to conduct certain kinds of system maintenance, advanced users (the truck drivers) need a very detailed window into the file system, perhaps down to UNIX permissions. On the other hand, many users only need to see some basic information, such as when they created a document.
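To make the truck-versus-car distinction a bit more concrete, here's a minimal sketch in Python (the file name and function names are hypothetical, chosen only for illustration) contrasting the detailed, permissions-level view a truck driver expects with the simple timestamp view most users need:

```python
import os
import stat
import time

# Hypothetical file name, used purely for illustration.
EXAMPLE = "example.txt"

def truck_driver_view(path):
    """The detailed window: UNIX permission bits, owner, and group."""
    st = os.stat(path)
    mode = stat.filemode(st.st_mode)   # e.g. "-rw-r--r--"
    print(f"{mode}  uid={st.st_uid}  gid={st.st_gid}  {path}")

def passenger_car_view(path):
    """The basic view: roughly when the document was last changed."""
    st = os.stat(path)
    print(f"{path}: last changed {time.ctime(st.st_mtime)}")

if __name__ == "__main__":
    # Create the hypothetical file so the example runs anywhere.
    open(EXAMPLE, "a").close()
    truck_driver_view(EXAMPLE)
    passenger_car_view(EXAMPLE)
```

Both views come from the same underlying metadata; the only question is how much of it the interface chooses to surface.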

So the real question is: when we decide what kinds of information need to be displayed, like file information in the Finder, or even the data folders and OS files themselves, are we dumbing things down too far for advanced users who have come to expect that detail? The answer is probably yes, and so there will remain a need for apps like PathFinder for the truck drivers.

On the other hand, what facility for the creation of content could be lost? The answer should be none at all. So the real question relates to separating technical information about the system from the means by which content gets created. iOS does that. When we confuse the two, we can become unglued and nervous about the developer's intentions.

Screen sizes come into play strongly. Tablets, so far, are perfect in the 7 to 10-inch range, and that dictates the kinds of tasks we tackle. Very large displays, perhaps even dual 27-inch displays, lend themselves to much more ambitious tasks and object manipulation. OS X remains superior there and isn't likely to lose that advantage.

Desktops have access to vast amounts of electrical power. That implies GPUs that are an order of magnitude or more faster than a tablet's. That power will be exploited to our advantage rather than asking a more limited tablet OS to scale those heights. At least for now.

A next generation user interface will use advanced hardware to make it easier, more natural and more intuitive — more human — to interact with a computer. For example, we don't generally have a strong urge to observe the DNA activity or digestive processes of our cats to enjoy their companionship. (However, we do pray they won't puke on the carpet.)

4. Is the movement to a more modern interface driven by user needs, a natural technical evolution, or developer considerations?

In the end, every computer maker wants to sell more devices and make customers happier. When technological development makes it possible to achieve a higher level of abstraction, that will come as a matter of course, just as WIMP replaced the command line.

I don't see this kind of evolution as some evil plot being perpetrated for the selfish ends of the OS designer. We simply can't move forward in technology without higher levels of abstraction and more human-like interfaces with computers. In fact, no OS developer is going to consciously take away our ability to create content. And if we seem to lose some element of nerdiness, we can always apply for that truck driver's license to hold us over while we size up the next-generation UI.

Heck, after a while, we'll get to like it. It's a natural technical evolution. It may just take 10 years to be able to say, “We knew this was the best idea all along!”

5. What features of the traditional WIMP interface might be lost in a more modern interface and how important are they?

One has to admit that a computer mouse is a crutch. In principle, it's just as unnatural as the mechanical typewriter of old. In both cases, it was the only technological means available to achieve the end. The mouse-driven pointer is going to disappear.

Windows and icons, on the other hand, have their uses. While the representation of our content may change a little, we can still expect to visualize, in some way, what we've created, even if we don't know (or care) where it's stored. iOS does that nicely; we've simply gone from one way of picturing our content to another.

The real questions are: can we keep from losing our work? Do we know where to find it? Can we recognize it when we find it? Can we back it up? If yes, then fine, and we let the computer take care of the housekeeping.

Touching apps to launch them, touching the screen to manipulate content (or asking Siri to do it), or something similar with a wave of the hand doesn't really make us stupid. It's just a faster, more intuitive way to get the computer to cooperate. The real issue here is that we often don't have a clear idea of how the incremental evolution of UI technique will carry us forward; instead, we tend to invoke preconceptions of abruptness and awkwardness that lead to alarm. We jump to conclusions. We shouldn't do that.

On a desktop, the act of creation involves folding in content from lots of sources of different kinds, and iOS, with its data model and sandboxing, probably can't handle that. That's why we'll see some minor iOS-ification without damaging the ability of OS X to create content.

In general, in the past we needed to “see” our content in certain ways because we had limited ways of manipulating it. Seeing into the guts meant verification of an action. As we advance in our abilities to interact with the computer, the historical crutch of how we see content will subtly change. But that doesn't mean we won't “see” our content in ways that are useful to the task at hand.

Each year at Apple's Worldwide Developers Conference (WWDC) we get a glimpse of the direction Apple is headed. The changes happen incrementally, year by year. We don't fall off a cliff. We gain facility but don't lose capability in our OSes. But we do encounter change, even as we try not to grumble too much and keep the big picture in mind.

That's how we stumble forward, learning as we go. It's been that way since day one of personal computers.
