Happy Mainframe Day

Uncategorized · 30 Posts · 13 Posters
markd@hachyderm.io wrote:

@aka_pugs Really was the beginning of the modern era of computing, starting with the normalisation of 8-bit bytes and character-addressable architecture.

Well, that's all true so long as we don't mention EBCDIC 🙂

aka_pugs@mastodon.social (#10):

@markd They had ASCII mode, but the peripherals never got the memo.

aka_pugs@mastodon.social wrote:

Happy Mainframe Day!
OTD 1964: IBM announces the System/360 family. 8-bit bytes ftw!

Shown: Operator at console of Princeton's IBM/360 Model 91.

debaer@23.social (#11):

@aka_pugs Only 8 bits per word? That will never be enough for anyone! 18-bit words like the PDP-1 or 12-bit words like the PDP-8, now that's serious computing! Also, 8-bit words make octal representation quite pointless…
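[Editor's aside: the octal jab is easy to make concrete. An octal digit carries 3 bits and a hex digit 4, so a radix only lines up with a word size its digit width divides. A quick Python sketch, not from the thread:]

```python
# Why octal suited 12-, 18-, and 36-bit machines while hex suits 8-bit
# bytes and 32-bit words: a radix lines up cleanly only when its digit
# width (3 bits for octal, 4 for hex) divides the word width exactly.
def leftover_bits(word_bits: int, bits_per_digit: int) -> int:
    """Bits left over in the top digit when a word is printed in this radix."""
    return word_bits % bits_per_digit

for bits in (8, 12, 18, 32, 36):
    print(f"{bits:2}-bit word: octal leftover {leftover_bits(bits, 3)}, "
          f"hex leftover {leftover_bits(bits, 4)}")
```

On a 36-bit word octal wastes nothing; on an 8-bit byte it strands 2 bits in the top digit, which is why hex won.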

stevebellovin@infosec.exchange (#12):

@aka_pugs @markd ASCII mode was only about how some of the decimal arithmetic instructions behaved. For the printers, the character set was pretty arbitrary, and the Translate instruction would have allowed for easy compatibility no matter what. The real EBCDIC issue was the card reader—and per Fred Brooks, IBM wanted to go with ASCII but their big data processing customers talked them out of it. But that's a story for another post. (And 8-bit bytes? Brooks felt that 8-bit bytes and 32-bit words were among the most important innovations in the S/360 line. It wasn't a foregone conclusion—many scientific computing folks really wanted to stick with 36-bit words, for extra precision. IBM ran *lots* of simulations to assure everyone that 32-bit floating point was OK.)

Why yes, in grad school I did take computer architecture from Brooks…
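[Editor's aside: the Translate (TR) instruction mentioned above is, in effect, a 256-entry table lookup applied to every byte of a buffer. A rough Python model, using Python's code page 037 codec as a stand-in for period EBCDIC:]

```python
# Rough model of the S/360 TR (Translate) instruction: each byte of the
# operand is replaced by the table entry it indexes. The table here maps
# EBCDIC to ASCII, built from Python's cp037 codec (a modern stand-in
# for 1964 EBCDIC, close enough for the printable characters).
table = bytes(ord(bytes([i]).decode("cp037")) & 0xFF for i in range(256))

def tr(buf: bytes, table: bytes) -> bytes:
    """TR: translate every byte of buf through a 256-entry table."""
    return bytes(table[b] for b in buf)

ebcdic_text = "HELLO, S/360".encode("cp037")
print(tr(ebcdic_text, table).decode("ascii"))  # -> HELLO, S/360
```

One table per target encoding was all the compatibility story required, which is Steve's point: the printers were never the hard part.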

olbohlen@norden.social (#13):

@markd a former colleague of mine used to joke that EBCDIC was the first strong crypto algorithm to be exported from the US 😉

gerardo@mast.hpc.social (#14):

@markd: Or, as my computing mentor, Harold V. McIntosh, called it, "Moby Dick".

@aka_pugs

markd@hachyderm.io (#15):

@SteveBellovin @aka_pugs If you were on the non-EBCDIC side of the fence you got the impression that IBM sales pushed EBCDIC pretty hard as a competitive advantage - even if their engineering covertly preferred ASCII.

The 32-bit word must have been a harder sell for the blue suits, since the competition were selling 60-bit and 36-bit words amongst other oddballs.

Fortunately the emergence of commercial customers marked the declining relevance of scientific computing... Did IBM get lucky, or were they prescient?

But yeah, the S/360 definitely marked the end of the beginning of computing in multiple ways.

aka_pugs@mastodon.social (#16):

@markd @SteveBellovin IBM had a huge lead in commercial data processing because of their punch card business. And that world did not care about floating point. The model 91 was an ego-relief product, not a real business. IMO.

Data processing and HPC markets never converged - until maybe AI.

johnmashey@mstdn.social (#17):

@aka_pugs @markd @SteveBellovin
But still, FORTRAN IV got lots of use, especially on 360/50…85 in universities & R&D labs. I suspect not much on /30 or /40.
I still think of 360 as a huge bet to consolidate the chaos of the 701…7094 36-bit path and the 702…7074 & 1401 variable-string paths.
And for fun: I asked both Gene Amdahl & Fred Brooks why they used 24-bit addressing, ignoring the high 8 bits… which caused a lot of problems/complexity later.
A: save hardware on the 360/30, with its 8-bit data paths.

aka_pugs@mastodon.social (#18):

@JohnMashey @markd @SteveBellovin I love this quote from Boeing about the 360.
See https://drive.google.com/file/d/1Zb6s_Ti7ON-6DNGg8VB2VvLSbh72_bgM/view?usp=sharing

stuartmarks@mastodon.social (#19):

@aka_pugs @JohnMashey @markd @SteveBellovin Huh, interesting comment on hex floating point. I’ve long thought that a controversial choice. I remember hearing an IBM numerical analyst claim that the hex floating point was “cleaner” than competing formats (this predated IEEE 754), but much literature today echoes the criticism given here that the hex format effectively shortens the significand.

hyc@mastodon.social (#20):

@JohnMashey @aka_pugs @markd @SteveBellovin sounds like Motorola copied their reasoning years later, with the MC68000.

For us UMich folks, the 360/67 was the machine that mattered...

stevebellovin@infosec.exchange (#21):

@stuartmarks @aka_pugs @JohnMashey @markd There are many different points here to respond to; let me first address the EBCDIC/ASCII issue, and why IBM's sales reps pushed it.

As @aka_pugs pointed out, IBM had a huge commercial data processing business dating back to the pre-computer punch card days. (IBM was formed by the merger of several companies, including Hollerith's own.) Look at the name of the company: International *Business* Machines. You could do amazing things with just punch card machines—Arthur Clarke's classic 1946 story Rescue Party (https://en.wikipedia.org/wiki/Rescue_Party) referred in passing to "Hollerith analyzers". But punch cards had a crucial limit: your life was much better if all of the data for a record fit onto a single 80-column card. This meant that space was at a premium, so it was common to overpunch numeric columns, e.g., age, with a "zone punch" in the 11 or 12 rows. Thus, a card column with just a punch in the 1 row was the digit 1, but if it had a row 12 punch as well, it was either the letter A *or* the digit 1 plus a binary signal for whatever was encoded by the row 12 punch. The commercial computers of the 1950s, which used 6-bit "bytes" for decimal digits as "BCD"—binary-coded decimal—mirrored this: the two high-order bits could be encoded data.

The essential point here is that with BCD, it was possible to do context-free decomposition in a way that you couldn't with ASCII. The IBM engineers wanted the S/360 to be an ASCII machine, but the big commercial customers pushed back very hard. IBM bowed to the commercial reality (but with the ASCII bit for dealing with "packed decimal" conversions), and marketed the machine that way: "you don't have to worry about your old data, because EBCDIC"—extended BCD interchange code—"your old files are still good." That's why the sales people talked it up—they saw this as a major commercial advantage.
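[Editor's aside: the zone-punch trick described above survives in EBCDIC "zoned decimal", where the zone nibble of the last digit doubles as the sign. A small Python sketch of the decoding:]

```python
# EBCDIC zoned decimal, the card-era overpunch carried forward: every
# digit byte is 0xF0..0xF9, except the last, whose zone nibble doubles
# as the sign (0xC = +, 0xD = -, 0xF = unsigned). So +123 ends in 0xC3
# (which prints as 'C') and -123 in 0xD3 ('L'): a digit and a flag
# sharing one column, exactly as on the punched cards.
def decode_zoned(data: bytes) -> int:
    """Decode an EBCDIC zoned-decimal field with an overpunched sign."""
    digits = [b & 0x0F for b in data]      # numeric nibbles
    zone = data[-1] >> 4                   # zone nibble of the last byte
    sign = -1 if zone == 0xD else 1
    value = 0
    for d in digits:
        value = value * 10 + d
    return sign * value

plus_123 = bytes([0xF1, 0xF2, 0xC3])
minus_123 = bytes([0xF1, 0xF2, 0xD3])
print(decode_zoned(plus_123), decode_zoned(minus_123))  # -> 123 -123
```

This is the "context-free decomposition" in miniature: strip the low nibbles and you always have the digits, whatever the zone bits are being used for.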

johnmashey@mstdn.social (#22):

@hyc @aka_pugs @markd @SteveBellovin
16MB in S/360 & 68K, ignoring high bits, => clever programmers used the high 8 bits for flags, as I did when writing ASSIST in 1970, still running as late as 2015, likely still.
68000 to 68020, 24- to 32-bit, caused trouble for Mac II software.
I wrote of this in BYTE 1991; see the section "The mainframe, minicomputer, and microprocessor":
https://www.bourguet.org/v2/comparch/mashey-byte-1991
MIPS R4000 was released later in 1991. It translated 40 bits, but trapped if the high-order bits were not all 0s/1s.
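[Editor's aside: the trap John describes can be sketched in a few lines. On a 24-bit-address machine the high byte of a 32-bit pointer word is ignored, so stashing flags there is free — until the address space grows:]

```python
# Flags in the ignored high byte of a 32-bit pointer word: fine while
# the hardware masks addresses to 24 bits, fatal when a successor
# (68020, S/370-XA) starts honoring all 32.
ADDR_24 = 0x00FFFFFF

def pack(addr: int, flags: int) -> int:
    """Stash 8 flag bits above a 24-bit address in one 32-bit word."""
    return (flags << 24) | (addr & ADDR_24)

def unpack_24(word: int) -> int:
    # What S/360 or 68000 hardware effectively does: ignore the high byte.
    return word & ADDR_24

def unpack_32(word: int) -> int:
    # What a full-32-bit successor does: the "flags" become address bits.
    return word & 0xFFFFFFFF

word = pack(0x123456, flags=0x80)
print(hex(unpack_24(word)))  # -> 0x123456 : still the right address
print(hex(unpack_32(word)))  # -> 0x80123456 : garbage on the new machine
```

The same trick has resurfaced repeatedly (tagged pointers on 64-bit machines today), and it breaks the same way every time the architects reclaim the "spare" bits.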

johnmashey@mstdn.social (#23):

@hyc @aka_pugs @markd @SteveBellovin

Some users who knew about R4000 wanted to use the high bits as tag bits. I said NO!

stevebellovin@infosec.exchange (#24):

@JohnMashey @hyc @aka_pugs @markd Brooks said that the 24-bit address decision was an economic one. But he also recognized, and stated, that "every successful architecture runs out of address space." (Aside: that's one reason why IPv6 addresses are 128 bits instead of 64—I and a few others insisted on it, and I specifically quoted Brooks' observation.) But there was one really crucial error in the S/360 architecture: the Load Address instruction was defined by the architecture to zero the high-order byte, making it impossible to use that instruction on 32-bit address machines. Since LA was the most common instruction used, per actual hardware traces, this was a serious issue. (It wasn't only used for addresses; indeed, many of the instances were to provide what Brooks called the "indispensable small positive constant".) The I/O architecture was also 24-bit, but that didn't bother the architects—they figured it would be replaced with something smarter later on anyway.

Update: I forgot about the Branch and Link instructions, which were used for subroutine calls. Per the Principles of Operation manual, "The rightmost 32 bits of the PSW, including the updated instruction address, are stored." The high-order 8 bits of the stored word included the "condition code", used for conditional branches, and the "program mask", which could be and was changed by application programs to disable some software-related interrupts, e.g., fixed-point overflow. This instruction was also not 32-bit-address compatible. (In Blaauw and Brooks, they note that extension to 32-bit addressing was seen as desirable and necessary from the very beginning.)
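[Editor's aside: a sketch of why BAL's link word could not survive wider addressing. Per the S/360 PSW layout, the stored 32 bits pack the instruction-length code, condition code, and program mask into the byte above the 24-bit return address:]

```python
# Model of the S/360 BAL link word: the rightmost 32 bits of the PSW.
# Bits 0-1 = instruction-length code, 2-3 = condition code, 4-7 =
# program mask, 8-31 = 24-bit return address. There is simply no room
# for a 32-bit return address, which is why the instruction could not
# be made 32-bit-address compatible.
def bal_link_word(ilc: int, cc: int, prog_mask: int, return_addr: int) -> int:
    """Pack the fields BAL stores into the link register (24-bit mode)."""
    high = ((ilc & 0x3) << 6) | ((cc & 0x3) << 4) | (prog_mask & 0xF)
    return (high << 24) | (return_addr & 0x00FFFFFF)

link = bal_link_word(ilc=2, cc=1, prog_mask=0x8, return_addr=0x004A10)
print(hex(link))               # status byte rides above the address
print(hex(link & 0x00FFFFFF))  # -> 0x4a10 : only 24 address bits survive
```

The later BAS/BASR instructions fixed this by storing only the address, which is the shape 31-bit S/370-XA code then relied on.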

johnmashey@mstdn.social (#25):

@SteveBellovin @hyc @aka_pugs @markd
Agreed, I wrote a lot of S/360 assembly code & LA was very useful.
Although not architectural but software convention, using the high-order bit in the last ptr argument of an argument list persisted.

On economics, I do wonder how much $ the 360/30-based decision cost IBM in the long term, in terms of software/hardware complexity.

stevebellovin@infosec.exchange (#26):

@stuartmarks @aka_pugs @JohnMashey @markd I checked what Blaauw and Brooks said about the S/360 floating point architecture. "The use of a hexadecimal base was intended to speed up the implementation, yet the resulting loss of precision was underestimated. The absence of a guard digit in the 64-bit format had to be corrected soon after the first machines were delivered."

stevebellovin@infosec.exchange (#27):

@markd @aka_pugs It's not clear to me that scientific computing declined in relevance then. But the S/360 line was, as @JohnMashey indicated, a way to unify the scientific and commercial lines of computers. (To be sure, on the lower-end models, the decimal instruction set and the floating point instruction set were options—and the 360/91 emulated the decimal instructions in the kernel, which IBM calls a nucleus.)

One interesting way this was relevant: memory parity. Before the S/360, commercial computers had parity bits on memory; scientific ones did not. After all, commercial computers were used for things like banking, where you couldn't afford to lose money because of a hardware problem, whereas scientific computers were only used for things like bridge and nuclear reactor design, which of course don't cost money… There was also the moral issue—lives could be at stake—which was also Brooks' justification for insisting that all S/360s (including the /44, intended only for scientific computing) would have parity.

The CDC 6600, a supercomputer of the day, did not have parity; the designer, Seymour Cray, said "Parity is for farmers" (https://en.wikipedia.org/wiki/ECC_memory#Personal_computers). The successor, the 7600, did have parity. (Note: to understand Cray's line, see https://en.wikipedia.org/wiki/Doctrine_of_parity.)
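[Editor's aside: the parity scheme at issue is the simplest possible check — one extra bit per stored unit, chosen so the count of 1 bits comes out even. A minimal Python sketch:]

```python
# Memory parity: store one extra bit per byte so the total number of
# 1 bits is even. Any single-bit flip is then detected (though not
# corrected) when the byte is read back.
def parity_bit(byte: int) -> int:
    """Even-parity bit: 1 if the byte has an odd number of 1 bits."""
    return bin(byte & 0xFF).count("1") & 1

def store(byte: int) -> tuple[int, int]:
    return byte, parity_bit(byte)

def check(byte: int, p: int) -> bool:
    return parity_bit(byte) == p

data, p = store(0b1011_0010)
assert check(data, p)                # clean read passes
flipped = data ^ 0b0000_1000         # a single bit flips in memory
print(check(flipped, p))             # -> False : error detected
```

Detection without correction is exactly the banking-era bargain: halt on a bad read rather than silently move the wrong money. ECC, which corrects as well, came later.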

stevebellovin@infosec.exchange (#28):

@JohnMashey @aka_pugs @markd The purpose of the higher-end machines was to sell the very profitable lower-end machines, by showing that there was an upgrade path. And then they blew it by having incompatible operating systems…

markd@hachyderm.io (#29):

@SteveBellovin @stuartmarks @aka_pugs @JohnMashey All of which (punch card focus, overloading high-order pointer bits, packed decimal, 6-bit bytes, scientific vs commercial, memory parity, two-speed memory) signalled the beginning of the end of an era where programmers and engineers counted every bit, every machine cycle and every memory reference. An era where programmers optimised for the hardware rather than the other way round.

While the need to deal with feeble compute power created interesting and novel architectures (Singer System Ten anyone? - https://en.wikipedia.org/wiki/Singer_System_Ten), the lock-in was a nightmare for customers embarking on their (oftentimes first) tech refresh.

So sure, one can readily admire the S/360 design; nonetheless, its biggest contribution may have been as an extinction event for all those oddball architectures, due to market dominance.
