Dear Herb and WeatherCat armchair user-interface researchers,
Well, Edouard, there is another way to look at that, as I'm sure you already know. "Their [the guys at Xerox PARC] solution was to create a desktop 'virtual reality.'" Some say that invention is more discovery than invention. In that instance, the symbolic metaphor between desktop activity and the abstract file housekeeping within a storage device already existed. The guys at Xerox PARC were bright enough to notice it.
I think you are missing the real benefit that the desktop metaphor provided. People without any insight into how a computer actually worked could operate a computer and accomplish tasks. My Mom never did understand how a file system works, and she gets into real trouble every now and then. Still, she has been using a Mac since the late 1980s and would be extremely unhappy without a computer. It's her way to keep in touch with her relatives and friends in France. The big idea I was trying to advance is that the Mac allowed common people to use computers, people who otherwise would never have been able to understand what was going on.
In this view of invention and discovery, it may be that there is no discoverable metaphor to be applied, given the extensive complexity added to the digital landscape by multiple devices of different sizes and functions, interconnected at multiple junctions, and shaped by legacy processes evolving with changing usage profiles. Put another way, we're in unknown and hitherto unexplored territory here with this century's explosion in the popularity of digital technology.
Unfortunately, what you are calling a virtue is a vice by any other name. Before Apple, computers were strictly in the realm of nerds and geeks. Thanks to Apple and the desktop paradigm, people were able to learn enough about computers that many could purchase, set up, maintain, and manage one. The desktop paradigm was enough of a window into how computers actually worked that most people were able to figure out the complexities underneath. Still, the virtual environment served as the "shallow end of the pool." Without it, people would have drowned in the complexity.
Today people are drowning in the complexity, and Silicon Valley throws everything and the kitchen sink at the general public. Not only is the public asked to beta-test buggy software; the public is now also the user-interface guinea pig. The very complexity is driven in part by the open market and the wild-west nature of Silicon Valley. While competition might provide better products in terms of speed, storage capacity, or efficiency, how on earth can the general public be asked which is the best user interface? Especially when the question isn't even really asked. Whereas once products really were scrutinized by user-interface engineers, today products have multiple user interfaces, and it does not appear that much thought is given to how they are supposed to work together.
Devices that use the human body as a pointing and control device desperately need to be designed in a way that is consistent with how human beings actually use their fingers, hands, and minds. iOS is frustrating because it is way too easy to attempt to do one thing and end up doing something else. Worse, on iOS your pointing device can interfere with the task you are trying to perform. When using a mouse, the cursor is always in view; using your fingers forces them to cover the very spot you are trying to work on.
Sadly, iOS suffers from its piecemeal and incremental creation. Today's iPhones and iPads have so much more processing capacity than the first iPhone. If iOS had been designed from the ground up with this sort of power in mind, what would the user interface look like?
This is one of those sad moments where I plainly see that profit and novelty are working to the genuine detriment of humankind. It isn't as if a real effort has been made to develop the same sort of analogies that the desktop user interface provided to get us out of the command line. Xerox PARC was free to explore its ideas because nobody could profit from personal computers yet. Is there any attempt to create a cooperative effort between industry, research institutes, academia, and government to actually see what could be done? The answer is obviously no. Companies like Apple want their innovations to remain secret, even though no single company can do the sort of research needed to solve hard problems such as what the best paradigm should be for a device controlled by our own hands.
As a result, we get mediocre products that are heavily biased by geek culture and lacking the broader perspective that, say, academics could bring to such technology. Are we worse off because of it? Honestly, can there really be any doubt?

Edouard