A Brief History of the Problem


From the 1940s through about 2003, the development of computer interface technology for real (that is, mainframe and later UNIX-based) computers was, for the most part, a story of continuous progress. Progress began to falter around then, but it wasn't until 2007 that the fatal blow to good computer interface technology was struck: the iPhone. Mind you, I'm not bashing the iPhone; far from it. It's a marvelous device, and the interface technologies developed for it are equally wonderful. They're extremely well adapted to their purpose, which is to let you interface with a few square inches of completely un-ergonomically packaged glass while on the run.

The problem is that in this period computer interface developers, in both the Linux world and elsewhere, came to believe that computer users were stupid, timid creatures who could not bear to face the actual power of their machines. Beginning in 2003, they ripped useful features out of interfaces (example: viewports in Enlightenment/Gnome), and within a decade were ripping out essential principles of computer organization (example: the concept of the Current Working Directory, removed from GTK). The common Linux user interfaces promoted in 2013 simply lacked basic elements of the programmer's working environment that had been ordinary in 2003. Simultaneously, developers began trying to entice users by turning the interface into a kind of animated videogame, complete with chorus lines of dancing fish to accompany every click.

The iPhone accelerated this as these same developers went through the agony of not having been as successful as Steve Jobs. Since then, they've resolutely been mutating the major Linux distributions and graphical development systems (e.g., Ubuntu, Gnome/GTK) into iClones. But of course the issues of communicating with a few square inches on the run, so well handled by the iPhone, have nothing to do with the issues of doing serious computer work on a few square feet of visual real estate in an office. If you wish to experience the future they envision, turn your computer off, tape your iPhone to its screen, and lean across your desk to do all of your work with your thumbs.

Google's great success has been a bad influence as well. The default configurations of some modern Linux systems now attempt aggressive "semantic" indexing (it isn't really semantic - the developers have no concept of what "semantic" means), which can easily suck up 50 percent of the CPU for days and fill large disks by analyzing every aspect of your computer and data without your knowledge. In addition to the ethical problem here, I have a pragmatic one. My file system is, as I write this, something over 7 terabytes in size, with over 4.3 million files. Nepomuk and Friends constitute a very effective denial-of-service attack on my system.

The result is that now a stock installation of a well-known Linux distribution comes with a standard environment which uses gigabytes of RAM and which can easily consume 50 or 60 percent of a powerful CPU while just sitting there. Semantic indexing and poor URL completion can consume the rest. (I have one relatively modern system which takes so much time trying to guess what I'll type that it drops keystrokes. This behavior would have been unacceptable sixty years ago.) At the same time these systems lack basic features which were once common - but they do have windows in spinning cubes and dancing fish.

It is possible, still, to install and configure a computer system upon which you can get real work done. But where in 2003 you could do this with a stock installation of a major distribution, now you've got to work at it. The Notebooks here (which double as my own installation notes) discuss the ways in which I presently do this.
