WELCOME TO MY BLOG !!!

Tuesday, September 21, 2010

HISTORY OF ELECTRONIC DIGITAL COMPUTATION

The era of modern computing began with a flurry of development before and during World War II, as electronic circuit elements replaced mechanical equivalents and digital calculations replaced analog calculations. Machines such as the Z3, the Atanasoff–Berry Computer, the Colossus computers, and the ENIAC were built by hand using circuits containing relays or valves (vacuum tubes), and often used punched cards or punched paper tape for input and as the main (non-volatile) storage medium. Defining a single point in the series as the "first computer" misses many subtleties.

Alan Turing's 1936 paper proved enormously influential in computing and computer science in two ways. Its main purpose was to prove that there were problems (namely the halting problem) that could not be solved by any sequential process. In doing so, Turing provided a definition of a universal computer which executes a program stored on tape. This construct came to be called a Turing machine. Except for the limitations imposed by their finite memory stores, modern computers are said to be Turing-complete, which is to say, they have algorithm execution capability equivalent to a universal Turing machine.
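To make "a program stored on tape" concrete, here is a minimal Turing machine simulator in Python (my own sketch, not part of the original history): the whole program is just a transition table, and the machine reads and writes one symbol at a time on the tape.

```python
# A minimal Turing machine simulator (illustrative sketch only).
# A "program" is a transition table: (state, symbol) -> (new state, write, move).

def run_turing_machine(program, tape, state="start", blank="_", max_steps=1000):
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        state, write, move = program[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Example program: flip every bit, halt at the first blank cell.
flip_bits = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}

print(run_turing_machine(flip_bits, "1011"))  # -> 0100
```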

For a computing machine to be a practical general-purpose computer, there must be some convenient read-write mechanism (punched tape, for example). With knowledge of Alan Turing's theoretical 'universal computing machine', John von Neumann defined an architecture that uses the same memory both to store programs and data: virtually all contemporary computers use this architecture (or some variant). While it is theoretically possible to implement a full computer entirely mechanically (as Babbage's design showed), electronics made possible the speed, and later the miniaturization, that characterize modern computers.
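The stored-program idea itself fits in a few lines. In the toy machine below (an illustration of the concept, not any historical design), one memory array holds both the instructions and the data they operate on, which is exactly what lets a program be loaded, and even modified, like data:

```python
# Toy von Neumann machine (illustrative sketch): a single memory array
# holds both the program and the data it operates on.

def run(memory):
    acc, pc = 0, 0              # accumulator and program counter
    while True:
        op, arg = memory[pc]    # fetch the next instruction *from memory*
        pc += 1
        if op == "LOAD":        # acc = memory[arg]
            acc = memory[arg]
        elif op == "ADD":       # acc = acc + memory[arg]
            acc += memory[arg]
        elif op == "STORE":     # memory[arg] = acc
            memory[arg] = acc
        elif op == "HALT":
            return memory

# The program occupies cells 0-3; the data lives in cells 4-6 of the same memory.
memory = [
    ("LOAD", 4),   # 0: acc = 7
    ("ADD", 5),    # 1: acc = acc + 35
    ("STORE", 6),  # 2: store the result in cell 6
    ("HALT", 0),   # 3: stop
    7, 35, 0,      # 4-6: data
]
print(run(memory)[6])  # -> 42
```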
There were three parallel streams of computer development in the World War II era; the first stream was largely ignored, and the second was deliberately kept secret. The first was the German work of Konrad Zuse. The second was the secret development of the Colossus computers in the UK. Neither of these had much influence on the various computing projects in the United States. The third stream of computer development, Eckert and Mauchly's ENIAC and EDVAC, was widely publicized.
George Stibitz is internationally recognized as one of the fathers of the modern digital computer. While working at Bell Labs in November 1937, Stibitz invented and built a relay-based calculator that he dubbed the "Model K" (for "kitchen table", on which he assembled it), which was the first to calculate using binary form.
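The binary calculation such a relay machine performs is easy to mimic in software. The one-bit full adder below (a sketch of the general principle, not of Stibitz's actual circuit) is the building block from which multi-bit binary addition is chained:

```python
# One-bit full adder: the building block of binary calculators.
# (A sketch of the logic only, not of Stibitz's relay circuit.)

def full_adder(a, b, carry_in):
    s = a ^ b ^ carry_in                        # sum bit
    carry_out = (a & b) | (carry_in & (a ^ b))  # carry bit
    return s, carry_out

def add_binary(x, y, width=8):
    """Add two integers by chaining full adders, one bit at a time."""
    result, carry = 0, 0
    for i in range(width):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

print(add_binary(0b1011, 0b0110))  # 11 + 6 -> 17
```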

Tuesday, April 20, 2010

SIMPLE PAST TENSE HISTORY OF COMPUTER ARCHITECTURE

THE PAST TENSE - Brief history of computer architecture

1943-46 - ENIAC (Electronic Numerical Integrator and Calculator), by J. Mauchly and J. Presper Eckert, the first general-purpose electronic computer. Its numerical word size was 10 decimal digits, and it could perform 5,000 additions and 357 multiplications per second.

It was built to calculate trajectories for ballistic shells during WWII and was programmed by setting switches and plugging and unplugging cables. It used 18,000 tubes, weighed 30 tons, and consumed 160 kilowatts of electrical power.
1951 - UNIVAC (Universal Automatic Computer) - the first commercial computer, built by Eckert and Mauchly; it cost around $1 million, and 46 machines were sold.


UNIVAC had an add time of 120 microseconds, a multiply time of 1,800 microseconds, and a divide time of 3,600 microseconds, and it used magnetic tape for input.
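Those operation times translate directly into throughput (rate = 1 second / time per operation); a quick back-of-the-envelope check:

```python
# Converting UNIVAC's quoted operation times into operations per second.
times_us = {"add": 120, "multiply": 1800, "divide": 3600}  # microseconds
for op, t in times_us.items():
    print(f"{op}: {1_000_000 / t:,.0f} per second")
# add: 8,333 per second; multiply: 556; divide: 278
```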

1953 - IBM's 701, the first commercially successful general-purpose computer. The 701 had electrostatic storage-tube memory, used magnetic tape to store information, and had binary, fixed-point, single-address hardware.


IBM 650 - the first mass-produced computer (450 machines sold in one year)

Source: Elissaveta Arnaoudova, www.mgnet.org

Monday, April 19, 2010

UNDERLINE THE VERB "TO BE" INCLUDING THE IMPERSONAL FORMS

COMPUTER ARCHITECTURE


In computer science, computer architecture or digital computer organization is the conceptual design and fundamental operational structure of a computer system. It is a blueprint and functional description of requirements and design implementations for the various parts of a computer.
It is also defined as the science and art of selecting and interconnecting hardware components to create computers that meet functional, performance and cost goals.
Computer architecture has three main subcategories:


1- Instruction set architecture, or ISA, is the abstract image of a computing system that is seen by a machine language (or assembly language) programmer, including the instruction set, word size, memory addressing modes, processor registers, and address and data formats (a toy example is sketched after this list).

2- Microarchitecture, also known as computer organization, is a lower-level, more concrete and detailed description of the system. It describes how the constituent parts of the system are interconnected and how they interoperate in order to implement the ISA. The size of a computer's cache, for instance, is an organizational issue.


3- System design: it is all of the other hardware components within a computing system, such as:

-- System interconnects such as computer buses and switches

-- Memory controllers and hierarchies

-- CPU off-load mechanisms such as direct memory access (DMA)

-- Issues like multiprocessing
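To make the first two categories concrete, here is a toy ISA sketched in Python (my own illustration, not any real instruction set). The opcodes, registers, and operand formats are the programmer-visible contract (the ISA); the simple interpreter is just one possible organization of it, and a pipelined or cached implementation would run the same programs unchanged:

```python
# A toy ISA (illustrative only): the programmer-visible contract is the
# opcodes, the registers, and the operand formats. The interpreter below
# is one possible "organization" that implements this ISA.

REGISTERS = ["r0", "r1", "r2", "r3"]   # visible processor registers

def execute(program):
    regs = {r: 0 for r in REGISTERS}
    for op, dst, src in program:
        if op == "li":      # load immediate: dst <- constant
            regs[dst] = src
        elif op == "add":   # dst <- dst + src register
            regs[dst] += regs[src]
        elif op == "mul":   # dst <- dst * src register
            regs[dst] *= regs[src]
    return regs

program = [
    ("li", "r0", 6),
    ("li", "r1", 7),
    ("add", "r2", "r0"),   # r2 = 0 + 6
    ("mul", "r2", "r1"),   # r2 = 6 * 7
]
print(execute(program)["r2"])  # -> 42
```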

Adapted from http://www.wikipedia.org/, April 2010