
Thursday, December 17, 2009

computer

The Fifth Generation Computer Systems project (FGCS) was an initiative by Japan's Ministry of International Trade and Industry, begun in 1982, to create a "fifth generation computer" (see history of computing hardware) that would perform large amounts of calculation using massively parallel processing. It was to be the end result of a massive government/industry research project in Japan during the 1980s. It aimed to create an "epoch-making computer" with supercomputer-like performance and usable artificial intelligence capabilities.

The term fifth generation was intended to convey the system as being a leap beyond existing machines. Computers using vacuum tubes were called the first generation; transistors and diodes, the second; integrated circuits, the third; and those using microprocessors, the fourth. Whereas previous computer generations had focused on increasing the number of logic elements in a single CPU, the fifth generation, it was widely believed at the time, would instead turn to massive numbers of CPUs for added performance.

Opinions about its outcome are divided: either it was a failure, or it was ahead of its time.

History

In the late 1960s and early '70s, there was much talk about "generations" of computer hardware — usually "three generations".

  1. First generation: Vacuum tubes. Mid-1940s. IBM pioneered the arrangement of vacuum tubes in pluggable modules. The IBM 650 was a first-generation computer.
  2. Second generation: Transistors. 1956. The era of miniaturization begins. Transistors are much smaller than vacuum tubes, draw less power, and generate less heat. Discrete transistors are soldered to circuit boards, with interconnections accomplished by stencil-screened conductive patterns on the reverse side. The IBM 7090 was a second-generation computer.
  3. Third generation: Integrated circuits (silicon chips containing multiple transistors). 1964. A pioneering example is the ACPX module used in the IBM 360/91, which, by stacking layers of silicon over a ceramic substrate, accommodated over 20 transistors per chip; the chips could be packed together onto a circuit board to achieve unheard-of logic densities. The IBM 360/91 was a hybrid second- and third-generation computer.

Omitted from this taxonomy is the "zeroth-generation" computer based on metal gears (such as the IBM 407) or mechanical relays (such as the Mark I), and the post-third-generation computers based on Very Large Scale Integrated (VLSI) circuits.

There was also a parallel set of generations for software:

  1. First generation: Machine language.
  2. Second generation: Assembly language.
  3. Third generation: Structured programming languages such as C, COBOL and FORTRAN.
  4. Fourth generation: Domain-specific languages such as SQL (for database access) and TeX (for text formatting).
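
To make the fourth point concrete, here is a small sketch in Python using the standard sqlite3 module (the table and data are invented for illustration): a fourth-generation language such as SQL declares what result is wanted, while third-generation code spells out how to compute it step by step.

    # Illustrative sketch: "what" (SQL, 4th generation) vs. "how" (procedural, 3rd generation).
    # The table name and data below are hypothetical, not from the original article.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)",
                     [("east", 100.0), ("west", 250.0), ("east", 50.0)])

    # Fourth-generation style: declare the result, let the engine plan the work.
    total_sql = conn.execute(
        "SELECT SUM(amount) FROM sales WHERE region = 'east'").fetchone()[0]

    # Third-generation style: spell out the loop and the accumulation explicitly.
    total_loop = 0.0
    for region, amount in conn.execute("SELECT region, amount FROM sales"):
        if region == "east":
            total_loop += amount

    assert total_sql == total_loop == 150.0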

    Background and design philosophy

    Throughout these multiple generations up to the 1980s, Japan had largely been a follower in the computing arena, building computers following U.S. and British leads. The Ministry of International Trade and Industry (MITI) decided to attempt to break out of this follow-the-leader pattern, and in the mid-1970s started looking, on a small scale, into the future of computing. They asked the Japan Information Processing Development Center (JIPDEC) to indicate a number of future directions, and in 1979 offered a three-year contract to carry out more in-depth studies along with industry and academia. It was during this period that the term "fifth-generation computer" started to be used.

    Prior to the 1970s, MITI guidance had successes such as an improved steel industry, the creation of the oil supertanker, the automotive industry, consumer electronics, and computer memory. MITI decided that the future was going to be information technology. However, the Japanese language, in both written and spoken form, presented and still presents major obstacles for computers. These hurdles could not be taken lightly. So MITI held a conference and invited people from around the world to help them.

    The primary fields for investigation from this initial project were:

    • Inference computer technologies for knowledge processing
    • Computer technologies to process large-scale data bases and knowledge bases
    • High performance workstations
    • Distributed functional computer technologies
    • Super-computers for scientific calculation

    The project imagined a parallel processing computer running on top of massive databases (as opposed to a traditional filesystem) using a logic programming language to define and access the data. They envisioned building a prototype machine with performance between 100M and 1G LIPS, where one LIPS is one Logical Inference Per Second. At the time, typical workstation machines were capable of about 100k LIPS. They proposed to build this machine over a ten-year period: 3 years for initial R&D, 4 years for building various subsystems, and a final 3 years to complete a working prototype system. In 1982 the government decided to go ahead with the project, and established the Institute for New Generation Computer Technology (ICOT) through joint investment with various Japanese computer companies.
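
    For scale, the 1G LIPS target was between one thousand and ten thousand times the roughly 100k LIPS of a contemporary workstation. The sketch below (in Python, purely hypothetical: the names and the toy rule are invented, and this is not the benchmark ICOT used) only conveys what the unit measures, namely how many elementary rule applications a machine performs per second.

        # Hypothetical illustration of "logical inferences per second" (LIPS):
        # count how many rule applications a naive chaining loop performs per second.
        import time

        # Toy rule base: person i is a parent of person i + 1, forming one long chain.
        parents = {i: i + 1 for i in range(10_000)}

        def count_inferences(seconds=1.0):
            inferences = 0
            deadline = time.perf_counter() + seconds
            while time.perf_counter() < deadline:
                x = 0
                while x in parents:          # each step derives one new "ancestor" fact
                    x = parents[x]
                    inferences += 1
            return inferences

        print(f"~{count_inferences():,} toy inferences per second on this machine")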

    Implementation

    So ingrained was the belief that parallel computing was the future of all performance gains that the Fifth-Generation project generated a great deal of apprehension in the computer field. Having taken over the consumer electronics field during the 1970s and apparently being on the way to doing the same in the automotive world during the 1980s, the Japanese had a reputation for invincibility. Soon parallel projects were set up in the US as the Strategic Computing Initiative and the Microelectronics and Computer Technology Corporation (MCC), in the UK as Alvey, and in Europe as the European Strategic Program of Research in Information Technology (ESPRIT), as well as ECRC (European Computer Research Centre) in Munich, a collaboration between ICL in Britain, Bull in France, and Siemens in Germany.

    Five running Parallel Inference Machines (PIM) were eventually produced: PIM/m, PIM/p, PIM/i, PIM/k, and PIM/c. The project also produced applications to run on these systems, such as the parallel database management system Kappa, the legal reasoning system HELIC-II, and the automated theorem prover MGTP, as well as applications in bioinformatics.

    Failure

    The FGCS Project did not meet with commercial success, for reasons similar to those that doomed the Lisp machine companies and Thinking Machines. The highly parallel computer architecture was eventually surpassed in speed by less specialized hardware (for example, Sun workstations and Intel x86 machines). The project did produce a new generation of promising Japanese researchers, but after the FGCS Project, MITI stopped funding large-scale computer research projects, and the research momentum developed by the FGCS Project dissipated. It should be noted, however, that MITI/ICOT embarked on a Sixth Generation Project in the 1990s.

    A primary problem was the choice of concurrent logic programming as the bridge between the parallel computer architecture and the use of logic as a knowledge representation and problem solving language for AI applications. This never happened cleanly; a number of languages were developed, all with their own limitations. In particular, the committed choice feature of concurrent constraint logic programming interfered with the logical semantics of the languages.[1]
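
    A rough illustration of the committed-choice problem (hypothetical clauses in Python, not KL1 or GHC syntax): once a guarded clause's guard succeeds, the system commits to that clause and discards the alternatives, so answers that a backtracking Prolog search would still find are silently lost.

        # Hypothetical sketch: committed-choice clause selection vs. Prolog-style search.
        # Each "clause" is a (guard, body) pair relating a number to a description of it.
        clauses = [
            (lambda n: n % 2 == 0, lambda n: f"{n} is even"),
            (lambda n: n > 10,     lambda n: f"{n} is large"),
        ]

        def committed_choice(n):
            """Commit to the first clause whose guard succeeds; the rest are never tried."""
            for guard, body in clauses:
                if guard(n):
                    return [body(n)]
            return []

        def backtracking(n):
            """Complete search: every clause whose guard succeeds contributes an answer."""
            return [body(n) for guard, body in clauses if guard(n)]

        print(committed_choice(12))   # ['12 is even']  ('12 is large' is lost)
        print(backtracking(12))       # ['12 is even', '12 is large']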

    Another problem was that existing CPU performance quickly pushed through the "obvious" barriers that experts perceived in the 1980s, and the value of parallel computing quickly dropped to the point where it was for some time used only in niche situations. Although a number of workstations of increasing capacity were designed and built over the project's lifespan, they generally found themselves soon outperformed by "off the shelf" units available commercially.

    The project also suffered from being on the wrong side of the technology curve. During its lifespan, Apple Computer introduced the GUI to the masses; the internet enabled locally stored databases to become distributed; and even simple research projects provided better real-world results in data mining. Moreover, the project found that the promises of logic programming were largely negated by the use of committed choice.

    At the end of the ten-year period the project had spent over 50 billion yen (about US$400 million at 1992 exchange rates) and was terminated without having met its goals. The workstations had no appeal in a market where general-purpose systems could now take over their job and even outrun them. This parallels the Lisp machine market, where rule-based systems such as CLIPS could run on general-purpose computers, making expensive Lisp machines unnecessary.[2]

    Even if the project is considered a failure, many of the approaches envisioned in the Fifth-Generation project, such as logic programming distributed over massive knowledge bases, are now being re-interpreted in current technologies. The Web Ontology Language (OWL) employs several layers of logic-based knowledge representation systems, while many flavors of parallel computing proliferate, including multi-core architectures at the low end and massively parallel processing at the high end.

    Timeline

    • 1982: the FGCS project begins and receives $450,000,000 worth of industry funding and an equal amount of government funding.
    • 1985: the first FGCS hardware, known as the Personal Sequential Inference Machine (PSI), and the first version of the Sequential Inference Machine Programming Operating System (SIMPOS) are released. SIMPOS is programmed in Kernel Language 0 (KL0), a concurrent Prolog variant with object-oriented extensions.
    • 1987: a prototype of truly parallel hardware, the Parallel Inference Machine (PIM), is built using several PSIs connected in a network. The project receives funding for five more years. A new version of the kernel language, Kernel Language 1 (KL1), which looks very similar to "Flat GDC" (Flat Guarded Definite Clauses), is created, influenced by developments in Prolog. The operating system written in KL1 is renamed the Parallel Inference Machine Operating System, or PIMOS.

      The history of computer development is often described in terms of generations of computing devices. A generation refers to a stage of improvement in the development of a product, and the term is also applied to successive advances in computer technology. With each new generation, the circuitry has become smaller and more advanced than in the previous generation. As a result of this miniaturization, the speed, power, and memory of computers have increased proportionally. New discoveries are constantly being made that affect the way we live, work and play.

      Each generation of computer is characterized by a major technological development that fundamentally changed the way computers operate, resulting in increasingly smaller, cheaper, more powerful, more efficient and more reliable devices. Read about each generation and the developments that led to the devices we use today.

      First Generation - 1940-1956: Vacuum Tubes

      The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. A magnetic drum, also referred to simply as a drum, is a metal cylinder coated with magnetic iron-oxide material on which data and programs can be stored. Magnetic drums were once used as a primary storage device but have since been used as auxiliary storage devices.

      The tracks on a magnetic drum are assigned to channels located around the circumference of the drum, forming adjacent circular bands that wind around the drum. A single drum can have up to 200 tracks. As the drum rotates at a speed of up to 3,000 rpm, the device's read/write heads deposit magnetized spots on the drum during the write operation and sense these spots during a read operation. This action is similar to that of a magnetic tape or disk drive.
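
      Those figures allow a back-of-the-envelope estimate of the drum's access delay (a hypothetical calculation, not a quoted specification): at 3,000 rpm one revolution takes 20 ms, so a request waits on average half a revolution, about 10 ms, for the wanted spot to pass under a read/write head.

          # Back-of-the-envelope rotational latency for a 3,000 rpm drum (illustrative only).
          rpm = 3_000
          revolution_ms = 60_000 / rpm            # 60,000 ms per minute -> 20 ms per turn
          average_wait_ms = revolution_ms / 2     # on average, wait half a turn -> 10 ms
          print(f"one revolution: {revolution_ms:.0f} ms, average wait: {average_wait_ms:.0f} ms")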

      They were very expensive to operate and, in addition to using a great deal of electricity, generated a lot of heat, which was often the cause of malfunctions. First generation computers relied on machine language to perform operations, and they could only solve one problem at a time. Machine languages are the only languages understood by computers. While easily understood by computers, machine languages are almost impossible for humans to use because they consist entirely of numbers. Programmers, therefore, use either a high-level programming language or an assembly language. An assembly language contains the same instructions as a machine language, but the instructions and variables have names instead of being just numbers.

      Programs written in high-level languages are translated into assembly language or machine language by a compiler. Assembly language programs are translated into machine language by a program called an assembler.

      Every CPU has its own unique machine language. Programs must be rewritten or recompiled, therefore, to run on different types of computers. Input was based on punched cards and paper tape, and output was displayed on printouts.
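
      A loose modern analogy (using Python's standard dis module, and bytecode rather than true machine code) shows the same layering: a statement written at a high level is translated into a sequence of lower-level instructions that only the underlying machine (here, the Python virtual machine) executes directly.

          # Loose analogy for compilation: one high-level statement becomes many
          # low-level instructions. dis shows Python bytecode, not real CPU machine code.
          import dis

          def fahrenheit(celsius):
              return celsius * 9 / 5 + 32   # a single high-level statement

          dis.dis(fahrenheit)               # prints the instruction-by-instruction form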

      The UNIVAC and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first commercial computer delivered to a business client, the U.S. Census Bureau, in 1951.

      Second Generation - 1956-1963: Transistors

      Transistors replaced vacuum tubes and ushered in the second generation of computers. A transistor is a device composed of semiconductor material that amplifies a signal or opens or closes a circuit. Invented in 1947 at Bell Labs, transistors have become the key ingredient of all digital circuits, including computers. Today's microprocessors contain tens of millions of microscopic transistors.

      Prior to the invention of transistors, digital circuits were composed of vacuum tubes, which had many disadvantages. They were much larger, required more energy, dissipated more heat, and were more prone to failures. It's safe to say that without the invention of transistors, computing as we know it today would not be possible.

      The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that subjected the computer to damage, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output.

      Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers that stored their instructions in their memory, which moved from a magnetic drum to magnetic core technology.

      The first computers of this generation were developed for the atomic energy industry.

      Third Generation - 1964-1971: Integrated Circuits

      The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, made of semiconductor material, which drastically increased the speed and efficiency of computers.

      Silicon - atomic symbol "Si" - is a nonmetallic chemical element in the carbon family of elements and the second most abundant element in the earth's crust, surpassed only by oxygen. Silicon does not occur uncombined in nature. Sand and almost all rocks contain silicon combined with oxygen, forming silica. When silicon combines with other elements, such as iron, aluminum or potassium, a silicate is formed. Compounds of silicon also occur in the atmosphere, natural waters, many plants and in the bodies of some animals.

      Silicon is the basic material used to make computer chips, transistors, silicon diodes and other electronic circuits and switching devices because its atomic structure makes the element an ideal semiconductor. Silicon is commonly doped, or mixed, with other elements, such as boron, phosphorous and arsenic, to alter its conductive properties.

      A chip is a small piece of semiconducting material (usually silicon) on which an integrated circuit is embedded. A typical chip is less than a quarter of a square inch and can contain millions of electronic components (transistors). Computers consist of many chips placed on electronic boards called printed circuit boards. There are different types of chips. For example, CPU chips (also called microprocessors) contain an entire processing unit, whereas memory chips contain blank memory.

      A semiconductor is a material that is neither a good conductor of electricity (like copper) nor a good insulator (like rubber). The most common semiconductor materials are silicon and germanium. These materials are then doped to create an excess or lack of electrons.

      Computer chips, both for CPU and memory, are composed of semiconductor materials. Semiconductors make it possible to miniaturize electronic components, such as transistors. Not only does miniaturization mean that the components take up less space, it also means that they are faster and require less energy.

      Instead of punched cards and printouts, users interacted with third generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.

      Fourth Generation - 1971-Present: Microprocessors

      The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. A microprocessor is a silicon chip that contains a CPU. In the world of personal computers, the terms microprocessor and CPU are used interchangeably. At the heart of all personal computers and most workstations sits a microprocessor. Microprocessors also control the logic of almost all digital devices, from clock radios to fuel-injection systems for automobiles.

      Three basic characteristics differentiate microprocessors:

      • Instruction Set: The set of instructions that the microprocessor can execute.


      • Bandwidth: The number of bits processed in a single instruction.


      • Clock Speed: Given in megahertz (MHz), the clock speed determines how many instructions per second the processor can execute.

      For the latter two characteristics, the higher the value, the more powerful the CPU. For example, a 32-bit microprocessor that runs at 50 MHz is more powerful than a 16-bit microprocessor that runs at 25 MHz.
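
      Taken literally, that comparison multiplies word width by clock rate, a deliberately naive figure of merit (real performance also depends on the instruction set, memory, and much else); the short sketch below just restates the article's example in Python.

          # Naive figure of merit: bits handled per instruction x instructions per second.
          # Real CPU performance depends on far more than this; illustration only.
          def naive_throughput(bits, mhz):
              return bits * mhz * 1_000_000    # rough "bits processed per second"

          print(naive_throughput(32, 50))      # 1600000000 -> the 32-bit, 50 MHz chip
          print(naive_throughput(16, 25))      #  400000000 -> the 16-bit, 25 MHz chip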

      What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer - from the central processing unit and memory to input/output controls - on a single chip.

      CPU is an abbreviation of central processing unit, pronounced as separate letters. The CPU is the brains of the computer. Sometimes referred to simply as the processor or central processor, the CPU is where most calculations take place. In terms of computing power, the CPU is the most important element of a computer system.

      On large machines, CPUs require one or more printed circuit boards. On personal computers and small workstations, the CPU is housed in a single chip called a microprocessor.

      Two typical components of a CPU are:

      • The arithmetic logic unit (ALU), which performs arithmetic and logical operations.


      • The control unit, which extracts instructions from memory and decodes and executes them, calling on the ALU when necessary.
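
      Together, these two parts carry out the basic fetch-decode-execute cycle. The sketch below is a minimal, made-up accumulator machine in Python (a hypothetical three-instruction set, not any real CPU): the "control unit" loop fetches and decodes each instruction and calls on the "ALU" for the arithmetic.

          # Toy fetch-decode-execute loop for a made-up accumulator machine (illustration only).
          def alu(op, a, b):
              """The 'ALU': performs the arithmetic operation it is asked for."""
              return a + b if op == "ADD" else a - b

          def run(program):
              acc = 0                              # single accumulator register
              pc = 0                               # program counter
              while pc < len(program):
                  op, operand = program[pc]        # fetch and decode
                  if op == "LOAD":
                      acc = operand
                  elif op in ("ADD", "SUB"):
                      acc = alu(op, acc, operand)  # the control unit calls on the ALU
                  pc += 1                          # advance to the next instruction
              return acc

          print(run([("LOAD", 7), ("ADD", 5), ("SUB", 2)]))   # prints 10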

      In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use microprocessors.

      As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth generation computers also saw the development of GUIs, the mouse and handheld devices.

      Fifth Generation - Present and Beyond: Artificial Intelligence

      Fifth generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today.

      Artificial Intelligence is the branch of computer science concerned with making computers behave like humans. The term was coined in 1956 by John McCarthy, then at Dartmouth College. Artificial intelligence includes:

      • Games Playing: programming computers to play games such as chess and checkers


      • Expert Systems: programming computers to make decisions in real-life situations (for example, some expert systems help doctors diagnose diseases based on symptoms)


      • Natural Language: programming computers to understand natural human languages


      • Neural Networks: Systems that simulate intelligence by attempting to reproduce the types of physical connections that occur in animal brains


      • Robotics: programming computers to see and hear and react to other sensory stimuli

      Currently, no computers exhibit full artificial intelligence (that is, are able to simulate human behavior). The greatest advances have occurred in the field of games playing. The best computer chess programs are now capable of beating humans. In May 1997, an IBM supercomputer called Deep Blue defeated world chess champion Garry Kasparov in a chess match.

      In the area of robotics, computers are now widely used in assembly plants, but they are capable only of very limited tasks. Robots have great difficulty identifying objects based on appearance or feel, and they still move and handle objects clumsily.

      Natural-language processing offers the greatest potential rewards because it would allow people to interact with computers without needing any specialized knowledge. You could simply walk up to a computer and talk to it. Unfortunately, programming computers to understand natural languages has proved to be more difficult than originally thought. Some rudimentary translation systems that translate from one human language to another are in existence, but they are not nearly as good as human translators.

      There are also voice recognition systems that can convert spoken sounds into written words, but they do not understand what they are writing; they simply take dictation. Even these systems are quite limited -- you must speak slowly and distinctly.

      In the early 1980s, expert systems were believed to represent the future of artificial intelligence and of computers in general. To date, however, they have not lived up to expectations. Many expert systems help human experts in such fields as medicine and engineering, but they are very expensive to produce and are helpful only in special situations.
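
      A toy sketch of the if-then, rule-based style such systems use (in Python, with invented symptoms and rules; nothing like a real medical knowledge base): facts are matched against rule conditions, and conclusions are added until no rule produces anything new.

          # Toy forward-chaining "expert system": if all of a rule's conditions are among
          # the known facts, its conclusion is added. Rules are invented for illustration.
          rules = [
              ({"fever", "cough"}, "possible flu"),
              ({"possible flu", "short of breath"}, "refer to a doctor"),
          ]

          def diagnose(facts):
              facts = set(facts)
              changed = True
              while changed:                       # repeat until nothing new is derived
                  changed = False
                  for conditions, conclusion in rules:
                      if conditions <= facts and conclusion not in facts:
                          facts.add(conclusion)
                          changed = True
              return facts

          print(diagnose({"fever", "cough", "short of breath"}))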

      Today, the hottest area of artificial intelligence is neural networks, which are proving successful in a number of disciplines such as voice recognition and natural-language processing.
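
      As a minimal sketch of the underlying idea (a single artificial "neuron" in Python learning the logical AND function from four examples; invented numbers, nowhere near a speech or language system): connection weights are nudged up or down after each mistake, loosely like strengthening or weakening synapses.

          # A single perceptron, the simplest neural-network unit, learning logical AND.
          # Illustration only: weights act loosely like connection strengths (synapses).
          samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
          w = [0.0, 0.0]
          bias = 0.0
          rate = 0.1

          for _ in range(20):                           # a few passes over the data suffice
              for (x1, x2), target in samples:
                  output = 1 if w[0] * x1 + w[1] * x2 + bias > 0 else 0
                  error = target - output
                  w[0] += rate * error * x1             # adjust the connection strengths
                  w[1] += rate * error * x2
                  bias += rate * error

          for (x1, x2), _ in samples:
              print((x1, x2), 1 if w[0] * x1 + w[1] * x2 + bias > 0 else 0)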

      There are several programming languages that are known as AI languages because they are used almost exclusively for AI applications. The two most common are LISP and Prolog.


      ENIAC is an acronym for Electronic Numerical Integrator And Computer, the world's first operational electronic digital computer, developed by Army Ordnance to compute World War II ballistic firing tables. The ENIAC, weighing 30 tons, using 200 kilowatts of electric power and consisting of 18,000 vacuum tubes, 1,500 relays, and hundreds of thousands of resistors, capacitors, and inductors, was completed in 1945. In addition to ballistics, the ENIAC's fields of application included weather prediction, atomic-energy calculations, cosmic-ray studies, thermal ignition, random-number studies, wind-tunnel design, and other scientific uses. The ENIAC soon became obsolete as the need arose for faster computing speeds.

