

In its most basic form a computer is any device which aids humans in performing various kinds of computations or calculations. In that respect the earliest computer was the abacus, used to perform basic arithmetic operations.

Every computer supports some form of input, processing, and output. This is less obvious on a primitive device such as the abacus, where input, output and processing are simply the act of moving the pebbles into new positions, seeing the changed positions, and counting. Regardless, this is what computing is all about, in a nutshell. We input information, the computer processes it according to its basic logic or the program currently running, and outputs the results.

Modern computers do this electronically, which enables them to perform a vastly greater number of calculations or computations in less time. Despite the fact that we currently use computers to process images, sound, text and other non-numerical forms of data, all of it depends on nothing more than basic numerical calculations. In other words, every image, every sound, and every word has a corresponding binary code. In digital computers these are the ones and zeros, representing electrical on and off states, and endless combinations of those; images, sounds, and words are merely abstractions of the numbers being crunched within the machine.

While the abacus may have technically been the first computer, most people today associate the word "computer" with electronic computers, which were invented in the last century and have evolved into the modern computers we know of today.

The invention of the integrated circuit (IC), also known as the microchip, paved the way for computers as we know them today. Making circuits out of single pieces of silicon, which is a semiconductor, allowed them to be much smaller and more practical to produce. This also started the ongoing process of integrating an ever larger number of transistors onto a single microchip.

During the sixties microchips started making their way into computers, but the process was gradual, and the second generation of computers still held on. First appeared minicomputers, the first of which were still based on non-microchip transistors, and later versions of which were hybrids, based on both transistors and microchips, such as IBM's System/360. They were much smaller and cheaper than the first and second generations of computers, also known as mainframes. Minicomputers can be seen as a bridge between mainframes and microcomputers, which came later as the proliferation of microchips in computers grew.

Fourth Generation Computers (1971 – present)

The first microchip-based central processing units consisted of multiple microchips for different CPU components. The drive for ever greater integration and miniaturization led towards single-chip CPUs, where all of the necessary CPU components were put onto a single microchip, called a microprocessor. The first single-chip CPU, or microprocessor, was the Intel 4004. The advent of the microprocessor spawned the evolution of microcomputers, the kind that would eventually become the personal computers we are familiar with today.

First Generation of Microcomputers (1971 – 1976)

The first microcomputers were a weird bunch. They often came in kits, and many were essentially just boxes with lights and switches, usable only to engineers and hobbyists who could understand binary code. Some, however, did come with a keyboard and/or a monitor, bearing somewhat more resemblance to modern computers. It is arguable which of the early microcomputers could be called the first.
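As a closing illustration of the article's point that every word has a corresponding binary code, here is a short Python sketch (the example word and variable names are my own, not from the article): each character maps to a numeric code, and that number in turn is stored as a pattern of binary on/off states.

```python
# Toy example: text is "just numbers" to a computer.
word = "abacus"
codes = [ord(ch) for ch in word]          # Unicode code points, e.g. 'a' -> 97
bits = [format(c, "08b") for c in codes]  # the same numbers written in binary

print(codes)    # [97, 98, 97, 99, 117, 115]
print(bits[0])  # 01100001  (the letter 'a' as eight on/off states)
```

An image or a sound file works the same way, only with far more numbers: pixel intensities or audio samples instead of character codes.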
