
Analyzing mainframe data in light of the IBM antitrust cases

The decades of decreasing cost and increasing performance of microcomputers, powered by Intel’s x86/Pentium processors, have been extensively analyzed. There has also been a lot of analysis of pre-Intel computers, in particular mainframes. Is there any major difference?

Mainframes certainly got a lot faster throughout the 1960s and 1970s, and Computers and Automation’s monthly census shows substantial decreases in rental prices (the OCR error rate is not yet low enough for me to be willing to spend time extracting 300 pages of tabular data).

During the 1960s and 1970s the computer market was dominated by IBM, whose market share was over 70% (its nearest rivals, Sperry Rand, Honeywell, Control Data, General Electric, RCA, Burroughs, and NCR, were known as the seven dwarfs).

Some papers analyzing the mainframe market mention the antitrust cases against IBM, but most don’t. There are some interesting papers on the evolution of families of IBM products, but how should this analysis be interpreted in light of IBM’s dominant market position?

For me, the framework for interpreting IBM mainframe cost/performance/sales data is provided by the book The U.S. Computer Industry: A Study of Market Power by Gerald Brock.

Brock compares the expected performance of a dominant company in a hypothetical computer industry, where anticompetitive practices do not occur, with IBM’s performance in the real world. There were a variety of mismatches (multiple antitrust actions have found IBM guilty of abusing its dominant market power).

Any abuse of market power by IBM does not change what happened in the 1950s/1960s/1970s, but the possibility of this behavior introduces uncertainty into any analysis of why things happened.

Intel also had its share of antitrust cases, which will be of interest to people analyzing the x86/Pentium-compatible processor market.

  1. David in Tokyo
    November 17, 2023 12:39 | #1

    Probably not much help, but it has crossed my mind over the years that what a bank wants from (or thinks of, or needs in) computer performance may not be all that aligned with what someone interested in AI and symbolic mathematics wants. I showed up at MIT in the fall of ’72, and fell in with both SIPB and the MACSYMA Group (one of the profs hired me as a part-time Lisp programmer in January ’73). SIPB was a student group tasked with doling out Multics time for individual undergrad projects on the Comp. Sci. department’s GE 645 Multics system.

    There was a tad of rivalry between these two worlds, with the Multics folks thinking Multics was a serious OS on a serious computer, while the AI/MACSYMA types weren’t particularly interested in Multics. In particular, the Multics folks thought that the PDP-6 and PDP-10 were overgrown toy minicomputers. And they did have a point: the PDP-6/10 had an 18-bit address space, while the GE 645 had a 32-bit address space. Between the PDP-6/10 having 36-bit words that could hold the two pointers needed for a Lisp cell, and memory being scarce and expensive, the 18-bit address space wasn’t that much of a limitation. But Macsyma was getting bigger, and some mathematicians were pushing the limits of Macsyma on an 18-bit machine, so someone reimplemented the MIT AI Lab’s Lisp on the GE-645.
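
    Out of curiosity, the word-packing trick is easy to sketch in modern terms. A minimal Python sketch, assuming a CAR-in-the-left-half layout (the field layout is my assumption for illustration, not the exact ITS MacLisp representation):

        # A cons cell as two 18-bit half-word pointers packed into one
        # 36-bit PDP-6/10-style word; the field layout is an assumption.
        HALF = 18
        MASK = (1 << HALF) - 1          # 18 bits => 262,144 word addresses

        def cons(car_ptr, cdr_ptr):
            # Both pointers fit in a single machine word.
            assert car_ptr <= MASK and cdr_ptr <= MASK
            return (car_ptr << HALF) | cdr_ptr

        def car(cell):
            return (cell >> HALF) & MASK

        def cdr(cell):
            return cell & MASK

        cell = cons(0o1234, 0o5670)
        assert car(cell) == 0o1234 and cdr(cell) == 0o5670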

    It.Wasn’t.Nice. Joel Moses’ secretary was an art school graduate, and she rose to the occasion by drawing up some beautifully lettered posters announcing the results: “MACSYMA is FOUR times larger and TEN times slower on Multics than on ITS.”

    Anyway, I don’t doubt but that the GE 645 was a serious mainframe, but it wasn’t the right thing for Lisp…

  2. November 17, 2023 23:45 | #2

    @David in Tokyo
    Lisp has always struck me as a virus that only infects the minds of certain kinds of very clever people. Lisp has a long history in AI because this is the only application domain where it is not outcompeted by other languages, at least until recently, when LLMs anointed Python as the AI language.

    I think that the success of projects using Lisp has more to do with the abilities of those it has infected than with any characteristics of the language.

    MIT went for the GE-645 because of its support for virtual memory, which IBM was not willing to offer at the time. Watts Humphrey’s reflections in the book In the Beginning 2.0: Personal Recollections of Software Pioneers give the details.

  3. David in Tokyo
    November 18, 2023 02:41 | #3

    Thanks for the point about why MIT used the GE machines, and the book reference. (Ha! Glass seems to be a kewl bloke; thanks!)

    Lisp wins for symbol-based AI and math because it’s so easy to use. Symbol/hash tables, associative data structures, and input, output, and formatting all come for free. So for any application where you associate information with symbols, you just write your app with no BS. It’s not so much that Lisp isn’t outcompeted as that it’s simply the only game in town.

    Python’s dictionaries get you part way to Lisp, but they don’t tell you that. I’m using Python for some Japanese text processing (counting words in corpuses, finding usage examples, classifying things by type of Chinese character used, slinging around definitions from a Japanese to English dictionary) and it works OK, but much of that is because it provides a string data type that handles Unicode strings as fixed-width arrays.
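
    For what it’s worth, the dictionary part of that kind of word counting is only a few lines. A minimal sketch, where the file path and whitespace tokenizer are placeholder assumptions (real Japanese text needs a proper segmenter):

        # Dictionary-based word counting; the path and tokenizer are
        # placeholders (Japanese needs a morphological analyzer to split).
        from collections import Counter

        def count_words(path):
            counts = Counter()
            with open(path, encoding="utf-8") as f:
                for line in f:
                    counts.update(line.split())
            return counts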

    I’d rather write in C++, but every time I try, I find bugs in the UTF-32 implementation. Every other language that I’ve looked at that claims “Unicode support” only “handles” (by passing through) UTF-8, and I’m not interested in coding hash tables and string search functions and the like to work with a variable-width encoding. That’d be insane, but that’s all you get.
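
    The fixed-width point is easy to demonstrate: Python strings index by code point, while the underlying UTF-8 encoding is variable-width bytes.

        s = "日本語テキスト"
        assert len(s) == 7                    # 7 code points, fixed-width view
        assert s[3] == "テ"                   # O(1) indexing by code point
        assert len(s.encode("utf-8")) == 21   # 21 bytes in variable-width UTF-8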

    But Python is used for current AI because it supports libraries and people write efficient (AI and other) libraries for it. It’s not a programming language, it’s a library interface.
