TEXT B. SIX COMPUTER GENERATIONS

 

The first three generations of computers have traditionally been identified as those using vacuum tubes, transistors, and integrated circuits, respectively. The fourth generation was never so clearly delineated, but has generally been associated with the use of large-scale integrated circuits that enabled the creation of microprocessor chips. The next major advance in computer technology, therefore, could be considered (in 1980) to be the fifth generation.

The development of the fifth generation of computer systems is characterized mainly by the acceptance of parallel processing. Until this time, parallelism was limited to pipelining and vector processing, or at most to a few processors sharing jobs. The fifth generation saw the introduction of machines with hundreds of processors that could all be working on different parts of a single program. The scale of integration in semiconductors continued at an incredible pace - by 1990 it was possible to build chips with a million components - and semiconductor memories became standard on all computers.
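
To make "different parts of a single program" concrete, here is a minimal sketch in C using POSIX threads: one array sum is split across several workers, each responsible for its own slice of the data. The worker count and array size are arbitrary illustrations, not details from the text.

```c
/* Minimal sketch: several workers cooperate on different parts of
 * one program's data, as fifth-generation parallel machines did.
 * Assumes a POSIX system (compile with: cc sum.c -lpthread). */
#include <pthread.h>
#include <stdio.h>

#define N 1000000
#define WORKERS 4

static double data[N];
static double partial[WORKERS];

static void *worker(void *arg) {
    long id = (long)arg;
    long lo = id * (N / WORKERS);
    long hi = (id + 1) * (N / WORKERS);   /* N divisible by WORKERS here */
    double s = 0.0;
    for (long i = lo; i < hi; i++)
        s += data[i];
    partial[id] = s;                      /* each worker writes its own slot */
    return NULL;
}

int main(void) {
    for (long i = 0; i < N; i++)
        data[i] = 1.0;

    pthread_t t[WORKERS];
    for (long id = 0; id < WORKERS; id++)
        pthread_create(&t[id], NULL, worker, (void *)id);

    double total = 0.0;
    for (long id = 0; id < WORKERS; id++) {
        pthread_join(t[id], NULL);
        total += partial[id];
    }
    printf("sum = %f\n", total);   /* prints 1000000.000000 */
    return 0;
}
```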

All of the mainstream commercial computers to date have followed very much in the footsteps of the original stored-program computer, the EDVAC, attributed to John von Neumann. Thus, this conventional computer architecture is referred to as "von Neumann" architecture. It has been generally accepted that the computers of the future would need to break away from this traditional, sequential kind of processing in order to achieve the speeds necessary to support the applications expected of them. Future computers are expected to be more intelligent: providing natural language interfaces, able to "see" and "hear", and having a large store of knowledge. The amount of computing power required to support these capabilities will naturally be immense.
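
The defining features of the von Neumann design - a single memory holding both program and data, and a control unit that fetches and executes instructions strictly one after another - can be shown with a toy interpreter. The three-instruction machine below is invented for illustration; it is a sketch of the idea, not any real instruction set.

```c
/* Toy von Neumann machine: instructions and data share one memory,
 * and a single control loop fetches and executes them sequentially.
 * The 3-instruction ISA (LOAD/ADD/HALT) is invented for this sketch. */
#include <stdio.h>

enum { HALT = 0, LOAD = 1, ADD = 2 };

int main(void) {
    /* One flat memory: a program at addresses 0..5, data at 8..9. */
    int mem[16] = {
        LOAD, 8,      /* acc = mem[8]  */
        ADD,  9,      /* acc += mem[9] */
        HALT, 0,
        0, 0,
        30, 12        /* the data the program operates on */
    };
    int pc = 0, acc = 0;

    for (;;) {                       /* sequential fetch-decode-execute */
        int op  = mem[pc];           /* fetch */
        int arg = mem[pc + 1];
        pc += 2;
        switch (op) {                /* decode + execute */
        case LOAD: acc = mem[arg];  break;
        case ADD:  acc += mem[arg]; break;
        case HALT: printf("acc = %d\n", acc); return 0;  /* prints 42 */
        }
    }
}
```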

Other new developments were the widespread use of computer networks and the increasing use of single-user workstations. Prior to 1985, large-scale parallel processing was viewed as a research goal, but two systems introduced around this time are typical of the first commercial products to be based on parallel processing. The Sequent Balance 8000 connected up to 20 processors to a single shared memory module (but each processor had its own local cache). The machine was designed to compete with the DEC VAX-780 as a general-purpose Unix system, with each processor working on a different user's job. However, Sequent also provided a library of subroutines that would allow programmers to write programs that would use more than one processor, and the machine was widely used to explore parallel algorithms and programming techniques.
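
Sequent's actual library is not reproduced here, but the idea it represents - hide the processor bookkeeping inside a reusable subroutine so an ordinary program can farm a loop out to several processors - can be sketched as follows. The parallel_for name and interface are invented for this example, not Sequent's real API.

```c
/* Sketch of a "parallel subroutine" in the spirit of the Sequent
 * library: the caller hands over a loop body and the helper spreads
 * the iterations across threads. Compile with: cc pfor.c -lpthread */
#include <pthread.h>
#include <stdio.h>

#define WORKERS 4

struct task { void (*body)(long); long lo, hi; };

static void *run(void *p) {
    struct task *t = p;
    for (long i = t->lo; i < t->hi; i++)
        t->body(i);
    return NULL;
}

/* Run body(i) for every i in [0, n), split across WORKERS threads. */
static void parallel_for(long n, void (*body)(long)) {
    pthread_t th[WORKERS];
    struct task tasks[WORKERS];
    for (long w = 0; w < WORKERS; w++) {
        tasks[w].body = body;
        tasks[w].lo = w * n / WORKERS;
        tasks[w].hi = (w + 1) * n / WORKERS;
        pthread_create(&th[w], NULL, run, &tasks[w]);
    }
    for (long w = 0; w < WORKERS; w++)
        pthread_join(th[w], NULL);
}

static double a[1000];
static void square(long i) { a[i] = (double)i * i; }

int main(void) {
    parallel_for(1000, square);
    printf("a[999] = %.0f\n", a[999]);   /* prints 998001 */
    return 0;
}
```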

The Intel iPSC-1, nicknamed "the hypercube", took a different approach. Instead of using one memory module, Intel connected each processor to its own memory and used a network interface to connect the processors. This distributed memory architecture meant memory was no longer a bottleneck, and large systems (using more processors) could be built. The largest iPSC-1 had 128 processors. Toward the end of this period a third type of parallel processor was introduced to the market. In this style of machine, known as data-parallel or SIMD, there are several thousand very simple processors. All processors work under the direction of a single control unit.
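
The shared-memory and distributed-memory styles differ in how one processor gets at another's results. In a distributed-memory machine such as the iPSC-1 there is no common address space, so data must travel as messages over the interconnect. The sketch below imitates this with two operating-system processes and a pipe standing in for the network link; it is a stand-in for the idea, not Intel's actual node interface.

```c
/* Distributed-memory sketch: parent and child have separate address
 * spaces (fork), so the child's partial sum must travel over a pipe,
 * much as iPSC-1 nodes exchanged data over the hypercube network.
 * Assumes a POSIX system. */
#include <stdio.h>
#include <unistd.h>
#include <sys/wait.h>

int main(void) {
    int link[2];
    pipe(link);                       /* stands in for the interconnect */

    if (fork() == 0) {                /* "node 1": its own private memory */
        long sum = 0;
        for (long i = 500; i < 1000; i++)
            sum += i;
        write(link[1], &sum, sizeof sum);   /* send message to node 0 */
        _exit(0);
    }

    long mine = 0, theirs = 0;        /* "node 0" computes its half... */
    for (long i = 0; i < 500; i++)
        mine += i;
    read(link[0], &theirs, sizeof theirs);  /* ...and receives the rest */
    wait(NULL);
    printf("total = %ld\n", mine + theirs); /* prints 499500 */
    return 0;
}
```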

Scientific computing in this period was still dominated by vector processing. Most manufacturers of vector processors introduced parallel models, but there were very few (two to eight) processors in these parallel machines. In the area of computer networking, both wide area network (WAN) and local area network (LAN) technology developed at a rapid pace, stimulating a transition from the traditional mainframe computing environment toward a distributed computing environment in which each user has their own workstation for relatively simple tasks (editing and compiling programs, reading mail).
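
Vector processing means hardware that applies one instruction to a whole array of operands at once. The classic kernel is the loop below (often called SAXPY): every iteration is independent, so a vector processor - or a modern compiler targeting SIMD units - can process many elements per instruction rather than one at a time. This is a plain C sketch for illustration.

```c
/* SAXPY (y = a*x + y), the textbook vector-processing kernel: no
 * iteration depends on another, so the loop maps directly onto
 * vector (or SIMD) hardware. */
#include <stdio.h>

#define N 1024

int main(void) {
    static float x[N], y[N];
    float a = 2.0f;

    for (int i = 0; i < N; i++) { x[i] = 1.0f; y[i] = 3.0f; }

    /* The vectorizable loop: independent element-wise updates. */
    for (int i = 0; i < N; i++)
        y[i] = a * x[i] + y[i];

    printf("y[0] = %.1f\n", y[0]);   /* prints 5.0 */
    return 0;
}
```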

One of the most dramatic changes in the sixth generation will be the explosive growth of wide area networking. Network bandwidth has expanded tremendously in the last few years and will continue to improve for the next several years. T1 transmission rates are now standard for regional networks, and the national "backbone" that interconnects regional networks uses T3. Networking technology is becoming more widespread than its original strong base in universities and government laboratories, as it is rapidly finding application in K-12 education, community networks, and private industry. A little over a decade after the warning voiced in the Lax report, the future of a strong computational science infrastructure is bright. The federal commitment to high performance computing has been further strengthened with the passage of two particularly significant pieces of legislation: the High Performance Computing Act of 1991, which established the High Performance Computing and Communications Program (HPCCP), and Sen. Gore's Information Infrastructure and Technology Act of 1992, which addresses a broad spectrum of issues ranging from high performance computing to expanded network access and the necessity of making leading-edge technologies available to educators from kindergarten through graduate school.

 

