“Man is a slow, sloppy, and brilliant thinker; computers are fast, accurate, and stupid.” ― John Pfeiffer


Later Computing

IBM to the Rescue

As the development of computers progressed, so did the variety of their uses. A common problem began to become apparent: code that was machine-specific, such that it ran only on one model of computer, was impractical and cumbersome. The time had come for cross-platform computing, in which code could be compiled and run across many different machines. The most significant solution to this problem arrived courtesy of IBM. The company released System/360, a single instruction set implemented across an entire family of machines, working much like a virtual machine. The idea was that any computer in the family could execute or simulate the System/360 instruction set at varying speeds while still delivering the same output from programs. Thus a customer only had to choose how much performance they wanted from their machine, not worry about what code would compile and execute.
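The mechanism behind this kind of compatibility can be pictured as a simple fetch-decode-execute loop: any host machine that can run the loop can execute the same programs, just at different speeds. The sketch below is a minimal illustration in C; the three-instruction set and its opcode names are invented for the example and are not the actual System/360 instructions.

#include <stdio.h>

/* A toy common instruction set: any host that can run this loop can
   execute the same programs, only at a different speed. */
enum { OP_LOAD, OP_ADD, OP_PRINT, OP_HALT };

int main(void) {
    /* Program: load 2 into the accumulator, add 3, print, halt. */
    int program[] = { OP_LOAD, 2, OP_ADD, 3, OP_PRINT, OP_HALT };
    int acc = 0;   /* single accumulator register */
    int pc  = 0;   /* program counter */

    for (;;) {
        switch (program[pc++]) {
        case OP_LOAD:  acc = program[pc++];   break;
        case OP_ADD:   acc += program[pc++];  break;
        case OP_PRINT: printf("%d\n", acc);   break;
        case OP_HALT:  return 0;
        }
    }
}

A slower host simply steps through the loop more slowly; the program's output is identical, which is exactly the guarantee IBM was selling.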


RISC and CISC

IBM’s approach eventually led to a style of instruction set called CISC (Complex Instruction Set Computer), in which the instruction set contains a very large number of instructions for the programmer to utilize in order to write effective code. As a reaction to CISC, RISC (Reduced Instruction Set Computer) was developed. See the other page for more information on RISC.
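To make the contrast concrete, here is a hypothetical sketch in C. The assembly mnemonics in the comments are invented for illustration: a CISC-style machine might offer one complex memory-to-memory add, while a RISC-style machine expresses the same work as several simple register operations.

#include <stdio.h>

int main(void) {
    /* "Memory" operands for the computation a = b + c. */
    int mem_b = 2, mem_c = 3, mem_a;

    /* CISC style: one complex, memory-to-memory instruction,
       e.g. ADD [a], [b], [c], modeled here as a single statement. */
    mem_a = mem_b + mem_c;
    printf("CISC result: %d\n", mem_a);

    /* RISC style: only simple register operations; memory is touched
       exclusively through explicit loads and stores. */
    int r1, r2, r3;
    r1 = mem_b;        /* LOAD  r1, [b]     */
    r2 = mem_c;        /* LOAD  r2, [c]     */
    r3 = r1 + r2;      /* ADD   r3, r1, r2  */
    mem_a = r3;        /* STORE r3, [a]     */
    printf("RISC result: %d\n", mem_a);

    return 0;
}

The RISC version takes more instructions, but each one is simple enough to execute quickly and to pipeline easily.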



Parallelism: A Technique for the Modern CPU

As the 1980s progressed, parallelism became the new hot topic in CPU architecture design. The new concepts involved allowing the CPU to execute multiple operations at the same time. Instruction pipelining, in which the CPU can begin acting on the next instruction even before the previous one has finished processing, became common. Another revolutionary idea was the superscalar architecture, in which instructions are distributed to several arithmetic logic units (ALUs) inside a single CPU, as opposed to the usual single ALU.
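A rough way to see the effect of pipelining and superscalar execution from software is to compare a chain of dependent operations against independent ones. The C sketch below is a hypothetical benchmark: the iteration count is arbitrary, results vary by compiler and CPU, and it assumes light optimization (such as -O1) so the accumulators stay in registers. The two independent accumulators usually finish faster because the hardware can keep both in flight at once.

#include <stdio.h>
#include <time.h>

#define N 100000000L

int main(void) {
    clock_t t0, t1;

    /* One dependent chain: every add needs the previous result, so the
       pipeline cannot overlap successive iterations. */
    double s = 0.0;
    t0 = clock();
    for (long i = 0; i < N; i++)
        s += 1.0;
    t1 = clock();
    printf("dependent chain:    %.3fs (s = %.0f)\n",
           (double)(t1 - t0) / CLOCKS_PER_SEC, s);

    /* Two independent chains doing the same total work: a pipelined,
       superscalar CPU can keep both in flight at once. */
    double s1 = 0.0, s2 = 0.0;
    t0 = clock();
    for (long i = 0; i < N; i += 2) {
        s1 += 1.0;
        s2 += 1.0;
    }
    t1 = clock();
    printf("independent chains: %.3fs (s = %.0f)\n",
           (double)(t1 - t0) / CLOCKS_PER_SEC, s1 + s2);

    return 0;
}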