Going Nowhere Really Fast, or How Computers Only Come in Two Speeds.

Is there a ballpoint pen in your pocket?  How fast is it?
What do you mean, you don't know? You didn't ask the salesman?

There is indeed a maximum speed at which the little ball in the pen can roll and still leave a satisfactory trace of ink upon the page.  Would you pay extra for a faster model?  If not, why not?  What would you do if someone were to sell you a costly yet noticeably slow pen?

Despite a multi-billion-dollar propaganda industry's best efforts, it remains obvious that computers come in just two speeds: slow and fast. A slow computer is one which cannot keep up with the operator's actions in real time, and forces the hapless human to wait.  A fast computer is one which can, and does not.

Today's personal computers (with a few possible exceptions) are only available in the "slow" speed grade.  Modern word-processing bloatware would make a time-traveling salesman of Underwood manual typewriters retch in disgust: it is not rare to see a palpable (and all the more so for its unpredictability) delay between pressing a key and the drawing of a symbol on the screen.

Millions of people have been bamboozled into thinking that editing a letter in real time requires a supercomputer.

The GUI of my 4MHz Symbolics 3620 lisp machine is more responsive on average than that of my 3GHz office PC.  The former boots (into a graphical everything-visible-and-modifiable programming environment, the most expressive ever created) faster than the latter boots into its syrupy imponade hell.  And this is true in spite of an endless parade of engineering atrocities committed in the name of "speed."

Computer systems could in principle have the reasonable, sane design we expect of all other everyday objects.  The bloat-free, genuinely fast computer is not some marvel of a far-off quasi-mystical science fiction future - it existed decades ago.  And could exist again.



2020 Update: Some comparative input lag measurements by Dan Luu, after yet another decade of verschlimmbesserung.

This entry was written by Stanislav, posted on Thursday, September 30, 2010, filed under Hot Air, LispMachine, NonLoper, Philosophy, Progress, SoftwareSucks, Symbolics. Bookmark the permalink.

23 Responses to “Going Nowhere Really Fast, or How Computers Only Come in Two Speeds.”

  • jseliger says:

    And could exist again.

    I hope it does.

    Still, this reminds me of Joel Spolsky's Bloatware and the 80/20 myth. The reason this computer doesn't exist is that people don't want it—or, to be more precise, they don't want to make the trade-offs it implies in sufficient numbers for there to be a market for such a computer.

    Nothing is stopping someone from making a stripped-down version of, say, Linux that will boot "into a graphical everything-visible-and-modifiable programming environment, the most expressive ever created" faster than a stock desktop boots into its "syrupy imponade hell." But most people evidently prefer the features that modern OSes and programs offer. Or, rather, they prefer that modern OSes support THEIR pet feature and make everything as easy to accomplish as possible at the expense of some speed.

    • Stanislav says:

      Dear jseliger,

      If you think that a static-language-kernel abomination like Linux (or any other UNIX clone) could be turned into a civilized programming environment, you are gravely mistaken.

      And if only the bloat and waste consisted of actual features that someone truly wants to use. As things now stand, countless CPU cycles are burned on total losses that no human may even be consciously aware of, such as the impedance mismatch between an idiot kernel's slab allocator and the garbage collector in the runtime of your favorite dynamic language. See: this.

      Yours,
      -Stanislav

      • Matt Campbell says:

        Dear Stanislav,

        On what basis do you claim (or at least imply) that the combination of a typical OS kernel's memory allocator and a dynamic language runtime's garbage collector has any perceptible impact on responsiveness? I'm genuinely curious.

        Thanks,
        Matt

        • Stanislav says:

          Dear Matt Campbell,

          Ever use a Java app?

          One can argue about exactly why a 4GHz, 12-core modern monstrosity is less responsive than a 1980s microcomputer when used in all kinds of everyday ways (word processing, etc.), but the fact remains.

          Yours,
          -Stanislav

          • Mike Stimpson says:

            You have totally failed to answer Matt Campbell's question. All you did was more implication of... something unspecified.

            Given that a 4GHz machine can be less responsive than a 1980s minicomputer (or even a 1980s DOS machine), on what factual basis do you claim that the problem is between the kernel's memory allocator and the runtime's garbage collector?

            • Stanislav says:

              Dear Mike Stimpson,

              Of course, this is by far not the only issue at work here. To understand what was meant, consider the concept of impedance mismatch - as applied to cases such as the expansion of a road causing an increase in traffic jams - or, more appropriately to our discussion - Bélády's anomaly and related phenomena.

              You really don't want memory allocators (esp. with GC) operating on top of one another. Feel free to verify this experimentally.
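
              One crude way to run that experiment, sketched below in Python purely for brevity (it exercises only the runtime-GC layer sitting on top of malloc, not the kernel's half of the mismatch, and the numbers depend entirely on machine and interpreter): keep allocating while everything stays live, and watch the worst-case pause.

              import gc
              import time

              def worst_alloc_latency(n_objects, chunk=1000):
                  keep = []                  # hold everything live so the heap keeps growing
                  worst = 0.0
                  for _ in range(n_objects // chunk):
                      t0 = time.perf_counter()
                      keep.append([{"value": i} for i in range(chunk)])
                      worst = max(worst, time.perf_counter() - t0)
                  return worst

              for label, auto_gc in (("cyclic GC on ", True), ("cyclic GC off", False)):
                  if auto_gc:
                      gc.enable()
                  else:
                      gc.disable()
                  gc.collect()               # start each run from a clean slate
                  ms = 1000 * worst_alloc_latency(500_000)
                  print("%s worst per-chunk allocation pause: %.1f ms" % (label, ms))
              gc.enable()

              With the collector enabled, the worst pause grows with the size of the live heap it has to scan; with it disabled, per-chunk latency should stay roughly flat.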

              Yours,
              -Stanislav

  • Eric Normand says:

    You hit the nail on the head.

    I've been thinking about this for years now. Recently, I devoted several sleepless nights to this problem. Why is it that our computers are getting faster and faster yet we are using them more and more to render things generated on some distant server? And it's still slow!

    I hope we can find a solution to this problem. I think one thing that needs to change is that every graduate of Computer Science should have to write an OS from scratch and not just learn Java. That will at least prepare people to begin tackling this problem.

    Oh, well! I'd love to rant on, but there's so much, I'm afraid of how long it will take.

  • jseliger says:

    If you think that a static-language-kernel abomination like Linux (or any other UNIX clone) could be turned into a civilized programming environment, you are gravely mistaken.

    That may be true: my programming skill and knowledge end around relatively simple scripting and so forth. But whatever the weaknesses of Linux, OS X, and Windows, taken together they represent uncounted hours of programming and debugging time and effort. To try to replicate all that would be, to say the least, hard. Probably effectively impossible. If you want to do it, though, I would be your first cheerleader—but the history of computing is also rife with massive rewrites of existing software and paradigms that fail. See GNU Hurd (http://www.gnu.org/software/hurd/hurd.html) for a classic example.

    And if only the bloat and waste consisted of actual features that someone truly wants to use.

    One man's feature is another's bloat, and vice versa, which is what Joel Spolsky points out: if 80% of the users use only 20% of the features, it seems you could cut the other 80% of the features and still keep 80% of the users. The problem is that the 20% is different for each person. That's why the computer experience looks like it does today: people hate bloat, unless it's their bloat, in which case they'll tolerate it.
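
    A deliberately crude simulation makes the arithmetic vivid (a sketch only; it assumes each user's 20% is an independent random draw, which overstates the scatter, and all the numbers are made up):

    import random

    N_FEATURES = 100                 # the full, "bloated" feature set
    NEEDED_PER_USER = 20             # each user's personal 20%
    KEPT = set(range(20))            # ship only the 20 "most popular" features
    N_USERS = 10_000

    fully_served = 0
    coverage = 0.0
    for _ in range(N_USERS):
        needed = set(random.sample(range(N_FEATURES), NEEDED_PER_USER))
        coverage += len(needed & KEPT) / NEEDED_PER_USER
        if needed <= KEPT:           # happy only if every needed feature survived
            fully_served += 1

    print("fully served users: %.2f%%" % (100 * fully_served / N_USERS))
    print("average share of a user's needs met: %.0f%%" % (100 * coverage / N_USERS))

    Under those (unfair) assumptions, essentially nobody gets everything they need from the trimmed build, and the average user gets only about a fifth of it, which is the 80/20 trap in miniature.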

    I don't think the power grid metaphor is a good one because transmission lines do one thing: move electricity. Computers can be programmed to do effectively anything, and, because users' needs vary so much, so does the software used.

    Note the last line of the analysis of the Lisp Machine linked to in your about page:

    Symbolics is a classic example of a company failing at heterogeneous engineering. Focusing exclusively on the technical aspects of engineering led to great technical innovation. However, Symbolics did not successfully engineer its environment, custormers [sic], competitors and the market. This made the company unable to achieve long term success.

    That kind of thinking sounds, to me, like the kind of thinking that leads one to lament how "slow" modern computers are. They are—from one perspective. From another, they enable things that the Lisp machine didn't have (like, say, YouTube).

    However, I'm a random armchair quarterback, and code talks while BS walks. If you think you can produce an OS that people want to use, write it. But when it doesn't support X, where "X" is whatever they want, don't be surprised when those people don't use it. There is a massive amount of computing history pointing to this syndrome; for another example, see Dreaming in Code.

  • ScottA says:

    Another example: I still use Word 2000.

    Why? Well, when I installed Word 2003, I noticed it sometimes took up to 3-4 seconds for the software to react to a key-press. So I de-installed Word 2003 and re-installed Word 2000. I was under a deadline - I had to, you know, get some actual "work" done. I'm a very fast typist (up to 100 words a minute) and I'd like to keep it that way.

    And don't get me started on Word 2007 - with that ridiculous "ribbon" across the top of the screen taking up about 15% of the screen real estate.

    Bill Gates must be a total retard if he thinks it's "progress" to suddenly get rid of standard, widely-used and accepted conventions such as 'File' and 'Edit' menus, which had been working fine for millions of people for about 25 years.

    Just imagine how I felt when I fired up Word 2007 for the first time, and couldn't find the command for Page Setup or Print Preview. Twenty-five years of considering myself a more-or-less knowledgeable computer operator - gone out the window(s).

  • Steve Webb says:

    I blame Java and bloated, resource-hogging platforms. The disk drive has increased in speed quite a bit in the last 10 years and is still the slowest thing in a computer (excluding the user). If we wrote all programs in efficient languages like C and ASM, our computers would feel thousands of times faster than they do today. Instead we rely on interpreted languages and bloated platforms (even the browser is considered a platform nowadays, and is mega-bloated if you ask me), yet people continue to think that advancements in hardware will make up for the lazy and bloated programming solutions that people come up with nowadays.

    I write C programs once in a while and marvel at how fast they run. Secretly, I wish that everyone programmed in C still.

    Solution: Give all developers a 200MHz Pentium I computer with 64MB of RAM to program on. Their code won't be bloated and will run mega-fast on current-day hardware.
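
    For what it's worth, the interpreter overhead I'm describing is easy to measure; here is a rough sketch (Python racing its own C-implemented built-in, so it understates the gap to hand-written C, and the exact ratio depends on the machine):

    import time

    N = 10_000_000

    t0 = time.perf_counter()
    total = 0
    for i in range(N):               # every iteration goes through the interpreter loop
        total += i
    py_seconds = time.perf_counter() - t0

    t0 = time.perf_counter()
    total = sum(range(N))            # same arithmetic, but the loop runs in C
    c_seconds = time.perf_counter() - t0

    print("pure Python: %.2f s, built-in sum: %.2f s, ratio: %.0fx"
          % (py_seconds, c_seconds, py_seconds / c_seconds))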

  • dmbarbour says:

    A fast computer is one which can, and does not.

    That is very zen. A fast computer can, but does not.

    But I do believe that this is, in significant part, a function of increased demand. In ye'old days, we ran one program at a time. (Even then, not many were especially fast.) Nowadays, I tend to have twenty or thirty elements open (notepads, music, browser with a dozen tabs, IDE with tabs, IM, etc.)

    It wasn't so long ago that a half-hour video was compressed to 30-40MB and you were happy with what today would be considered a pixelated mess. And, honestly, you couldn't even run those on your 4MHz machine... you'd need to downsample it further. Today, a half-hour show will cost you 300-600MB.

    But I'll grant that much of it is application development. Applications turn into monolithic messes because it is more convenient to build them upwards (more abstractions atop the OS) than it is to build them sideways (pluggable or service architectures). Why is it more convenient? Because modern OS abstractions suck compared to language abstractions. To use separate processes, developers would need to implement serialization, parsing, validation, state and synchronization, etc., per domain abstraction. Not wishing to suffer that, they shove everything into one process, and add new features in libraries rather than separate apps/services.

    Monolithic applications might not be a huge problem, but they are also developed in imperative paradigms. It is difficult, in imperative paradigm, to properly handle concurrent behavior and ensure responsiveness, especially in combination with modular composition (i.e. working with third-party libraries). Synchronous IO is unique to imperative programming, and is horrible for composable responsiveness. We could better achieve responsiveness with a paradigm designed for it, such as reactive or dataflow (seen in Lustre, Spreadsheets, Max/MSP) or functional reactive (seen in Fran or FrTime).
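
    A toy illustration of that point, sketched in Python's asyncio (nothing to do with any particular product): a "UI" heartbeat shares one event loop with a worker, and it only stays smooth when the worker yields instead of doing blocking, synchronous IO.

    import asyncio
    import time

    async def heartbeat(ticks=10):
        prev = time.perf_counter()
        for _ in range(ticks):
            await asyncio.sleep(0.1)      # stands in for "redraw the screen, poll input"
            now = time.perf_counter()
            print("tick after %4.0f ms" % (1000 * (now - prev)))
            prev = now

    async def blocking_worker():
        time.sleep(1.0)                   # synchronous IO stand-in: stalls the whole loop

    async def cooperative_worker():
        await asyncio.sleep(1.0)          # yields to the loop: ticks stay ~100 ms apart

    async def main():
        print("-- with a blocking worker --")
        await asyncio.gather(heartbeat(), blocking_worker())
        print("-- with a cooperative worker --")
        await asyncio.gather(heartbeat(), cooperative_worker())

    asyncio.run(main())

    The first heartbeat tick arrives roughly a full second late in the blocking case; in the cooperative case the ticks stay near 100 ms.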

    If we were to target both issues, we could keep applications smaller and achieve responsiveness between applications. Publish-subscribe architectures (such as Data Distribution Service) have already demonstrated this to a significant degree. I'm developing a paradigm and new language based on the goals of reactive dataflow and service/app composition and near-real-time responsiveness.

    a palpable (and all the more so for its unpredictability) delay

    Besides real-time concurrent garbage collection without paging, do you have any plans for supporting responsiveness inside user-developed applications in Loper?

    How will your applications interact? How do you plan to express concurrency? I do not believe that the imperative pseudo-functional nature of regular Lisp will help you here; synchronous IO should be rejected as a possibility if you wish to ensure responsiveness.

    But you could use a small framework - perhaps an event loop similar to Apple's Grand Central Dispatch - to at least simplify reasoning about responsiveness (app is responsive if you keep event-processing in small chunks).
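
    The "small chunks" discipline itself is easy to sketch (illustrative names only, no particular framework): slice the long job so the loop can drain input events between slices, which bounds input latency by the largest chunk rather than by the whole job.

    from collections import deque

    events = deque(["keypress a", "keypress b"])   # pending input
    work = deque()                                 # pending slices of one long job

    def schedule_long_job(n_chunks=5):
        for i in range(n_chunks):
            work.append(lambda i=i: print("worked chunk", i))

    schedule_long_job()

    # One pass of the loop: drain input first, then run a single slice, so the
    # user never waits behind more than one chunk of the long job.
    while events or work:
        while events:
            print("handled", events.popleft())
        if work:
            work.popleft()()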

    • Mike Stimpson says:

      I believe you misread one line in the article.

      A fast computer is one which can (keep up with the operator in real time) and does not (force the human to wait). It's not zen; it's parallelism with the previous sentence.

      • James Pelham says:

        I also misunderstood that line. At first, I read it more like "A fast computer is one which can (keep up with the operator in real time)... but still does not (because all the modern software written for it is needlessly slow)."

  • Roger says:

    I have a fairly old computer by today's standards; however, it has one of the nicer Intel Core 2 Duo processors and 4GB of RAM. I can bog it down, but usually when I do, it is because I am running a virtual machine as well as doing a bunch of other stuff.

    For regular usage, I run pretty wide open and rarely bog it down, using the GNOME desktop on Linux. Even though it has a quarter of the RAM and half the cores of my friend's gaming machine, it is still a really relevant machine.

    However, it is not uncommon for people who have a problem with their system, or who run a bunch of background stuff (viruses and malware included), to come to the conclusion that the only way to fix it is an upgrade. And, as you point out, bloat probably is one of the bigger factors that makes people think they need to upgrade.

    One of the things that gets me is that it also often ends up being an "upgrade" to a Walmart grade computer...

  • dmbarbour says:

    Here is a performance comparison of the 1986 Mac Plus vs. 2007 AMD DualCore, measuring user experience characteristics.

  • John says:

    Dear Stanislav,

    Do you know of Bruce Tognazzini, usability expert? I think you'd like his work. He's written quite a lot, but here he has an entry (search the page for "Bug Name: Instant Feedback lacking") on this specific issue. Basically he agrees and shares your disgust and disappointment:

    Discussion by Tog: The studies that established necessary response times were completed before the first commercial GUI ever appeared. No excuse exists for violating this rule. It reflects incompetent design or implementation, as well as incompetent quality assurance. It reflects badly on an entire engineering organization.

  • Knuth says:

    "We shouldn't be penny wise and pound foolish, nor should we always think of efficiency in terms of so many percent gained or lost in total running time or space. When we buy a car, many of us are almost oblivious to a difference of $50 or $100 in its price, while we might make a special trip to a particular store in order to buy a 50 cent item for only 25 cents. My point is that there is a time and place for efficiency; I have discussed its proper role in my paper on structured programming, which appears in the current issue of Computing Surveys"

  • Handle says:

    By coincidence, I just wrote to some friends about how things have gotten even worse in the decade since you originally wrote this post. I would be happier with the worst lags of 2010.

    How did it get worse? The """cloud""". I work at one of the USG headquarters, and I'm told my office just finished the grand cloudification project, larded up with however many of the latest cybersecurity requirements there are (really cyber insecurity, so everything I do can be monitored), such that my laptop terminal is apparently some kind of complicated Netflix box streaming me a video of a simulation of what it would be like to operate normal office productivity software on an actual personal computer, in which every keystroke and pixel-movement of my mouse must be relayed through seventeen encryption certificate authority intermediaries, with copies sent to the entire law enforcement and intelligence community, a few more to Microsoft, Adobe, and Google, and probably also to Beijing, Moscow, Berlin, and Tel Aviv.

    My productivity has cratered. There is a 30-second, exasperating, circle-spinning lag to literally everything I do. I cannot actually make a printer start printing for about 75-90 seconds. I cannot "save as" onto the network drive for about 45 seconds. It takes about a minute or two just to "boot up" one instance of Word, which in the Office 365 instantiation is carrying on constant encyclopedic conversations with some enterprise back end while I am just trying to type a two-paragraph memo. I am not exaggerating in the slightest when I say the ratio of network communication to actual content must be something like 50,000 to 1: every 100 bytes I type seems to require 50 MB of *who the heck knows what* communication activity.

    All this is about as frustrating as playing in a first-person shooter tournament with world champions and for high stakes, on the world's crappiest hardware with the worst latency and lag and ending up at about 1 frame every 5 seconds, with every shot you fire not even getting there before you've been stabbed in the face 40 times, which all gets rendered together in one "you died" frame. This is every minute of my workday life now. "You Died" - "You Died" - "You Died".

    Maybe ... I did die? Is this bureaucrat hell? After being 100% cloudified, who can tell the difference?

    • Stanislav says:

      Dear Handle,

      Funnily enough, I originally wrote this piece -- with pen and paper -- while waiting for the mandatory "McAfee" on a MEDCOM-owned WinXP box to churn. As it did every morning, rendering the thing quite unusable. But at least it wasn't a dumb terminal...

      There's hope, though: perhaps, eventually, your shop will re-introduce mechanical typewriters, as Putin supposedly has.

      Either that, or you make colonel (there's gotta be some upside to working in that darkest of heathen pits? I assume...) and some other poor bugger can type the memos, while you play golf.

      Yours,
      -S
