Sunday, November 11, 2012

ENIAC


ENIAC (/ˈɛni.æk/; Electronic Numerical Integrator And Computer)[1][2] was the first electronic general-purpose computer. It was Turing-complete, digital, and capable of being reprogrammed to solve a full range of computing problems.[3]
ENIAC was designed to calculate artillery firing tables for the United States Army's Ballistic Research Laboratory.[4][5] When ENIAC was announced in 1946, it was heralded in the press as a "Giant Brain". It boasted speeds one thousand times faster than those of electro-mechanical machines, a leap in computing power that no single machine has since matched. This mathematical power, coupled with general-purpose programmability, excited scientists and industrialists. The inventors promoted the spread of these new ideas by conducting a series of lectures on computer architecture.
ENIAC's design and construction were financed by the United States Army during World War II. The construction contract was signed on June 5, 1943, and work on the computer began in secret at the University of Pennsylvania's Moore School of Electrical Engineering the following month under the code name "Project PX". The completed machine was announced to the public on the evening of February 14, 1946,[6] and formally dedicated the next day[7] at the University of Pennsylvania, having cost almost $500,000 (approximately $6,000,000 today). It was formally accepted by the U.S. Army Ordnance Corps in July 1946. ENIAC was shut down on November 9, 1946, for a refurbishment and a memory upgrade, and was transferred to Aberdeen Proving Ground, Maryland, in 1947. There, on July 29, 1947, it was turned on and remained in continuous operation until 11:45 p.m. on October 2, 1955.[2]
ENIAC was conceived and designed by John Mauchly and J. Presper Eckert of the University of Pennsylvania.[8] The team of design engineers assisting the development included Robert F. Shaw (function tables), Jeffrey Chuan Chu (divider/square-rooter), Thomas Kite Sharpless (master programmer), Arthur Burks (multiplier), Harry Huskey (reader/printer) and Jack Davis (accumulators). ENIAC was named an IEEE Milestone in 1987.[9]

History of computer science


The earliest foundations of what would become computer science predate the invention of the modern digital computer. Machines for calculating fixed numerical tasks, such as the abacus, have existed since antiquity. Wilhelm Schickard designed the first mechanical calculator in 1623, but did not complete its construction.[2] Blaise Pascal designed and constructed the first working mechanical calculator, the Pascaline, in 1642. In 1694 Gottfried Wilhelm Leibniz completed the Step Reckoner, the first calculator that could perform all four arithmetic operations. Charles Babbage designed a difference engine and then a general-purpose Analytical Engine in Victorian times,[3] for which Ada Lovelace wrote a manual. Because of this work she is regarded today as the world's first programmer.[4] Around 1900, punched card machines were introduced.
During the 1940s, as new and more powerful computing machines were developed, the term computer came to refer to the machines rather than their human predecessors.[5] As it became clear that computers could be used for more than just mathematical calculations, the field of computer science broadened to study computation in general. Computer science began to be established as a distinct academic discipline in the 1950s and early 1960s.[6][7] The world's first computer science degree program, the Cambridge Diploma in Computer Science, began at the University of Cambridge Computer Laboratory in 1953. The first computer science degree program in the United States was formed at Purdue University in 1962.[8] Since practical computers became available, many applications of computing have become distinct areas of study in their own right.
Although many initially doubted that computers themselves could be a genuine field of scientific study, the idea gradually gained acceptance in the broader academic community during the late 1950s.[9] The now well-known IBM brand was a major part of the computer science revolution during this time. IBM (short for International Business Machines) released the IBM 704 and later the IBM 709 computers, which were widely used during the exploration period of such devices. "Still, working with the IBM [computer] was frustrating...if you had misplaced as much as one letter in one instruction, the program would crash, and you would have to start the whole process over again".[9] During the late 1950s, the computer science discipline was very much in its developmental stages, and such issues were commonplace.
Time has seen significant improvements in the usability and effectiveness of computing technology. Modern society has seen a significant shift in the users of computer technology, from usage only by experts and professionals to a near-ubiquitous user base. Initially, computers were quite costly, and using them efficiently required some degree of human aid, often from professional computer operators. As computer adoption became more widespread and affordable, less human assistance was needed for common usage.

Computer science


Computer science or computing science (abbreviated CS or CompSci) is the scientific and practical approach to computation and its applications. A computer scientist specializes in the theory of computation and the design of computational systems.[1]
Its subfields can be divided into a variety of theoretical and practical disciplines. Some fields, such as computational complexity theory (which explores the fundamental properties of computational problems), are highly abstract, whilst fields such as computer graphics emphasise real-world applications. Still other fields focus on the challenges in implementing computation. For example, programming language theory considers various approaches to the description of computation, whilst the study of computer programming itself investigates various aspects of the use of programming languages and complex systems. Human-computer interaction considers the challenges in making computers and computations useful, usable, and universally accessible to humans.