Von Neumann architecture

The von Neumann architecture, also known as the von Neumann model or Princeton architecture, is a computer architecture based on a 1945 description by the mathematician and physicist John von Neumann and others in the First Draft of a Report on the EDVAC. [1] That document describes a design architecture for an electronic digital computer with these components: a processing unit containing an arithmetic logic unit and processor registers; a control unit containing an instruction register and program counter; a memory that stores both data and instructions; external mass storage; and input and output mechanisms. [1] [2] The meaning has evolved to refer to any stored-program computer in which an instruction fetch and a data operation cannot occur at the same time because they share a common bus. This is referred to as the von Neumann bottleneck and often limits the performance of the system. [3]
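To make the description above concrete, the following minimal Python sketch (purely illustrative, not taken from the EDVAC report) models a von Neumann-style machine: one memory array holds both instruction words and data, and a single fetch-decode-execute loop reads both through the same path. The accumulator design and the opcode*100 + address encoding are invented for this example.

```python
# A toy von Neumann-style machine: code and data share one memory.
# Invented encoding: each instruction word is opcode * 100 + address.
LOAD, ADD, STORE, HALT = 1, 2, 3, 6

memory = [0] * 32
memory[:4] = [
    LOAD * 100 + 20,    # acc = memory[20]
    ADD * 100 + 21,     # acc += memory[21]
    STORE * 100 + 22,   # memory[22] = acc
    HALT * 100,         # stop
]
memory[20], memory[21] = 7, 35      # the data sits in the same memory as the code

pc, acc = 0, 0                      # program counter and accumulator register
while True:
    opcode, addr = divmod(memory[pc], 100)   # instruction fetch and decode
    pc += 1
    if opcode == LOAD:                       # data accesses use the same memory,
        acc = memory[addr]                   # i.e. the same "bus", as fetches
    elif opcode == ADD:
        acc += memory[addr]
    elif opcode == STORE:
        memory[addr] = acc
    elif opcode == HALT:
        break

print(memory[22])   # 42
```

Because every instruction fetch and every data access in this model go through the same memory, only one of them can happen at a time, which is exactly the bottleneck described above.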

The design of a von Neumann architecture machine is simpler than that of a Harvard architecture machine, which is also a stored-program system but has one dedicated set of address and data buses for reading and writing to memory, and another set of address and data buses for fetching instructions.

A stored-program computer keeps both its program instructions and its data in read-write, random-access memory (RAM). Stored-program computers were an advance over the program-controlled computers of the 1940s, such as the Colossus and the ENIAC, which were programmed by setting switches and inserting patch cables to route data and control signals between various functional units. In the vast majority of modern computers, the same memory is used for both data and program instructions, and the von Neumann vs. Harvard distinction applies to the cache architecture, not the main memory (split cache architecture).


The earliest computing machines had fixed programs. Some very simple computers still use this design, either for simplicity or for training purposes. For example, a calculator (in principle) is a fixed-program computer. It can do basic mathematics, but it cannot be used as a word processor or a gaming console. Changing the program of a fixed-program machine requires rewiring, restructuring, or redesigning the machine. The earliest computers were not so much "programmed" as they were "designed". "Reprogramming", when it was possible at all, was a laborious process, starting with flowcharts and paper notes, followed by detailed engineering designs, and then the often-arduous process of physically rewiring and rebuilding the machine. It could take a few weeks to set up a program on ENIAC and get it working. [4]

With the proposal of the stored-program computer, this changed. A stored-program computer includes, by design, an instruction set, and can store in memory a set of instructions (a program) that details the computation.

A stored-program design also allows for self-modifying code. One early motivation for such a facility was the need for a program to increment or otherwise modify the address portion of instructions, which operators had to do manually in early designs. This became less important when index registers and indirect addressing became usual features of machine architecture. Another use was to embed frequently used data in the instruction stream using immediate addressing. Self-modifying code has largely fallen out of favor, since it is usually hard to understand and debug, as well as being inefficient under modern processor pipelining and caching schemes.
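The following sketch illustrates the address-patching technique described above on the same invented accumulator machine (opcode*100 + address words); it is an illustration only, not any historical instruction set. The program sums four numbers by loading its own ADD instruction, adding one to its address field, and storing it back.

```python
# Self-modifying code on a toy single-memory machine (invented encoding:
# instruction word = opcode * 100 + address, so instructions can be loaded,
# incremented, and stored back exactly like data).
LOAD, ADD, STORE, SUB, JNZ, HALT = 1, 2, 3, 4, 5, 6

memory = [0] * 32
program = [
    LOAD * 100 + 24,    # 0: acc = sum
    ADD * 100 + 20,     # 1: acc += data[i]     (address field is patched below)
    STORE * 100 + 24,   # 2: sum = acc
    LOAD * 100 + 1,     # 3: acc = instruction word at address 1
    ADD * 100 + 26,     # 4: acc += 1           -> bumps the address field
    STORE * 100 + 1,    # 5: write the modified instruction back
    LOAD * 100 + 25,    # 6: acc = counter
    SUB * 100 + 26,     # 7: acc -= 1
    STORE * 100 + 25,   # 8: counter = acc
    JNZ * 100 + 0,      # 9: if acc != 0, jump back to 0
    HALT * 100,         # 10: stop
]
memory[: len(program)] = program
memory[20:24] = [1, 2, 3, 4]   # data to be summed
memory[24] = 0                 # running sum
memory[25] = 4                 # loop counter
memory[26] = 1                 # the constant 1

pc, acc = 0, 0
while True:
    opcode, addr = divmod(memory[pc], 100)
    pc += 1
    if opcode == LOAD:
        acc = memory[addr]
    elif opcode == ADD:
        acc += memory[addr]
    elif opcode == SUB:
        acc -= memory[addr]
    elif opcode == STORE:
        memory[addr] = acc
    elif opcode == JNZ:
        if acc != 0:
            pc = addr
    elif opcode == HALT:
        break

print(memory[24])   # 10, the sum of memory[20..23]
```

Because instructions are just numbers in the same memory as data, nothing distinguishes patching an instruction from updating a variable; that is both the power and the danger discussed later under "Self-modifying code".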


On a larger scale, the ability to treat instructions as data is what makes assemblers, compilers, linkers, loaders, and other automated programming tools possible; it makes it possible to write "programs which write programs". [5] This has allowed a sophisticated self-hosting computing ecosystem to flourish around von Neumann architecture machines.
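As a small illustration of a program that writes a program, here is a toy "assembler" for the hypothetical encoding used in the sketches above: it turns symbolic source text into instruction words, which are ordinary numbers that can then be placed in memory and executed.

```python
# A toy "assembler": a program that writes a program. It translates symbolic
# source text into the numeric instruction words of the invented accumulator
# machine sketched above, treating instructions purely as data.
OPCODES = {"LOAD": 1, "ADD": 2, "STORE": 3, "SUB": 4, "JNZ": 5, "HALT": 6}

def assemble(source):
    """Turn lines like 'ADD 20' into instruction words (opcode*100 + address)."""
    words = []
    for line in source.strip().splitlines():
        mnemonic, *operand = line.split()
        addr = int(operand[0]) if operand else 0
        words.append(OPCODES[mnemonic] * 100 + addr)
    return words

program = assemble("""
    LOAD 20
    ADD 21
    STORE 22
    HALT
""")
print(program)   # [120, 221, 322, 600] -- ordinary data, ready to be loaded into memory
```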

Some high-level languages, such as LISP, leverage the von Neumann architecture by providing an abstract, machine-independent way to manipulate executable code at runtime, or by using runtime information to tune just-in-time compilation (e.g. in the case of languages hosted on the Java virtual machine, or languages embedded in web browsers).
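The idea of manipulating executable code at runtime can be illustrated in Python (chosen only to keep this article's examples in one language; LISP and JVM-hosted languages offer analogous facilities). The helper name make_power_function is invented for the example.

```python
# Runtime code generation in a high-level language: the program builds source
# text at runtime, compiles it, and immediately calls the result.
def make_power_function(n):
    """Generate and compile a specialised function computing x**n at runtime."""
    source = f"def power(x):\n    return x ** {n}\n"
    namespace = {}
    exec(compile(source, "<generated>", "exec"), namespace)
    return namespace["power"]

cube = make_power_function(3)
print(cube(4))   # 64 -- code that did not exist until the program wrote it
```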

On a smaller scale, some repetitive operations such as BITBLT or pixel and vertex shaders can be accelerated on general-purpose processors with just-in-time compilation techniques. This is one use of self-modifying code that has remained popular.

Development of the stored-program concept

The mathematician Alan Turing, who had been alerted to a problem of mathematical logic by the lectures of Max Newman at the University of Cambridge, wrote a paper in 1936 entitled On Computable Numbers, with an Application to the Entscheidungsproblem, which was published in the Proceedings of the London Mathematical Society. [6] In it he described a hypothetical machine which he called a "universal computing machine", and which is now known as the "Universal Turing Machine". The hypothetical machine had an infinite store (memory in today's terminology) that contained both instructions and data. John von Neumann became acquainted with Turing while he was a visiting professor at Cambridge in 1935, and also during Turing's PhD year at the Institute for Advanced Study in Princeton, New Jersey, during 1936-1937. Whether he knew of Turing's paper of 1936 at that time is not clear.

In 1936, Konrad Zuse also anticipated that machine instructions could be stored in the same storage used for data. [7]

Independently, J. Presper Eckert and John Mauchly, who were developing the ENIAC at the Moore School of Electrical Engineering of the University of Pennsylvania, wrote about the stored-program concept in December 1943. [8] [9] In planning a new machine, EDVAC, Eckert wrote in January 1944 that they would store data and programs in a new addressable memory device, a mercury metal delay-line memory. This was the first time the construction of a practical stored-program machine was proposed. At that time, he and Mauchly were not aware of Turing's work.

Von Neumann was involved in the Manhattan Project at the Los Alamos National Laboratory, which required huge amounts of calculation. This drew him to the ENIAC project during the summer of 1944, where he joined the ongoing discussions on the design of a stored-program successor, the EDVAC. As part of that group, he wrote up a description titled First Draft of a Report on the EDVAC [1] based on the work of Eckert and Mauchly. It was unfinished when his colleague Herman Goldstine circulated it with only von Neumann's name on it, to the consternation of Eckert and Mauchly. [10] The paper was read by von Neumann's colleagues in America and Europe, and influenced the next round of computer designs.

Jack Copeland considers that it is "historically inappropriate to refer to electronic stored-program digital computers as 'von Neumann machines'". [11] His Los Alamos colleague Stan Frankel said of von Neumann's regard for Turing's ideas:

I know that in or about 1943 or '44 von Neumann was well aware of the fundamental importance of Turing's paper of 1936 ... Von Neumann introduced me to that paper and at his urging I studied it with care. Many people have acclaimed von Neumann as the "father of the computer" (in a modern sense of the term) but I am sure that he would never have made that mistake himself. He might well be called the midwife, perhaps, but he firmly emphasized to me, and to others I am sure, that the fundamental conception is owing to Turing, in so far as not anticipated by Babbage ... Both Turing and von Neumann, of course, also made substantial contributions to the "reduction to practice". [12]

At the time that the "First Draft" report was circulated, Turing was producing a report entitled Proposed Electronic Calculator, which described in engineering and programming detail his idea of a machine he called the Automatic Computing Engine (ACE). [13] He presented this to the Executive Committee of the British National Physical Laboratory on February 19, 1946. Although Turing knew from his wartime experience at Bletchley Park that what he proposed was feasible, the secrecy surrounding Colossus, which was subsequently maintained for several decades, prevented him from saying so. Various successful implementations of the ACE design were produced.

Both von Neumann's and Turing's papers described stored-program computers, but von Neumann's earlier paper achieved greater circulation, and the computer architecture it outlined became known as the "von Neumann architecture". In the 1953 publication Faster than Thought: A Symposium on Digital Computing Machines (edited by B. V. Bowden), a section in the chapter on Computers in America reads as follows: [14]

The Machine of the Institute for Advanced Studies, Princeton

In 1945, Professor J. von Neumann, who was then working at the Moore School of Engineering in Philadelphia, where the ENIAC had been built, issued on behalf of a group of his co-workers a report on the logical design of digital computers. The report contained a detailed proposal for the design of the machine which has since become known as the EDVAC (electronic discrete variable automatic computer). This machine has only recently been completed in America, but the von Neumann report inspired the construction of the EDSAC (electronic delay-storage automatic calculator) in Cambridge (see page 130).

In 1947, Burks, Goldstine and von Neumann published another report that outlined the design of another type of machine (a parallel machine this time) which would be exceedingly fast, capable perhaps of 20,000 operations per second. They pointed out that the outstanding problem in constructing such a machine was the development of suitable memory with instantaneously accessible contents. At first they suggested using a special vacuum tube, called the "Selectron", which the Princeton Laboratories of RCA had invented. These tubes were expensive and difficult to make, so von Neumann subsequently decided to build a machine based on the Williams memory. This machine, which was completed in June 1952 in Princeton, has become popularly known as the Maniac. The design of this machine inspired that of half a dozen or more machines now being built in America, all of which are known affectionately as "Johniacs".

In the same book, the first two paragraphs of a chapter on ACE read as follows: [15]

Automatic Computation at the National Physical Laboratory

A modern digital computer has recently been completed at the National Physical Laboratory, Teddington, where it has been designed and built by a small team of mathematicians and electronics research engineers on the staff of the Laboratory, assisted by a number of production engineers from the English Electric Company, Limited. The equipment so far erected at the Laboratory is only the pilot model of a much larger installation which will be known as the Automatic Computing Engine, but although it is comparatively small in bulk and contains only about 800 thermionic valves, as can be judged from Plates XII, XIII and XIV, it is an extremely rapid and versatile calculating machine.

The basic concepts and abstract principles of computation by a machine were formulated by Dr. A. M. Turing, F.R.S., in a paper read before the London Mathematical Society in 1936, but work on such machines in Britain was delayed by the war. In 1945, however, an examination of the problems was made at the National Physical Laboratory by Mr. J. R. Womersley, then superintendent of the Mathematics Division of the Laboratory. He was joined by Dr. Turing and a small staff of specialists, and, by 1947, the preliminary planning was sufficiently advanced to warrant the establishment of the special group already mentioned. In April, 1948, the latter became the Electronics Section of the Laboratory, under the charge of Mr. F. M. Colebrook.

Early von Neumann-architecture computers

The First Draft described a design that was used by many universities and corporations to construct their computers. [16] Among these various computers, only ILLIAC and ORDVAC had compatible instruction sets.

  • ARC2 (Birkbeck, University of London) officially came online on May 12, 1948. [17]
  • Manchester Small-Scale Experimental Machine (SSEM), nicknamed "Baby" (University of Manchester, England), made its first successful run of a stored program on June 21, 1948.
  • EDSAC (University of Cambridge, England) was the first practical stored-program electronic computer (May 1949)
  • Manchester Mark 1 (University of Manchester, England), developed from the SSEM (June 1949)
  • CSIRAC (Council for Scientific and Industrial Research), Australia (November 1949)
  • EDVAC (Ballistic Research Laboratory, Computing Laboratory at Aberdeen Proving Ground, 1951)
  • ORDVAC (U-Illinois) at Aberdeen Proving Ground, Maryland (completed November 1951) [18]
  • IAS machine at Princeton University (January 1952)
  • MANIAC at Los Alamos Scientific Laboratory (March 1952)
  • ILLIAC at the University of Illinois (September 1952)
  • BESM-1 in Moscow (1952)
  • AVIDAC at Argonne National Laboratory (1953)
  • ORACLE at Oak Ridge National Laboratory (June 1953)
  • BESK in Stockholm (1953)
  • JOHNNIAC at RAND Corporation (January 1954)
  • DASK in Denmark (1955)
  • WEIZAC at the Weizmann Institute of Science in Rehovot, Israel (1955)
  • PERM in Munich (1956?)
  • SILLIAC in Sydney (1956)

Early stored-program computers

The date information in the following chronology is difficult to put into proper order. Some dates are for first running a test program, some dates are the first time the computer was demonstrated or completed, and some dates are for the first delivery or installation.

  • The IBM SSEC had the ability to treat instructions as data, and was publicly demonstrated on January 27, 1948. [19] [20] However, it was partially electromechanical, not fully electronic. In practice, instructions were read from paper tape because of its limited memory. [21]
  • The ARC2, developed by Andrew Booth and Kathleen Booth at Birkbeck, University of London, officially came online on May 12, 1948. [17] It featured the first rotating drum storage device. [22] [23]
  • The Manchester SSEM (the Baby ) was the first fully electronic computer to run a stored program. It ran a factoring program for 52 minutes on June 21, 1948, after running a simple division program and a program to show that two numbers were relatively prime .
  • The ENIAC was modified to a primitive read-only stored-program computer (using the Function Tables for Program ROM ) and was demonstrated as such on September 16, 1948, by Adele Goldstine for von Neumann.
  • The BINAC ran some tests in February, March, and April 1949, although not completed until September 1949.
  • The Manchester Mark 1 developed from the SSEM project. An intermediate version of the Mark 1 was available to run programs in April 1949, but it was not completed until October 1949.
  • The EDSAC ran its first program on May 6, 1949.
  • The EDVAC was delivered in August 1949, but it did not start operation until 1951.
  • The CSIR Mk I ran its first program in November 1949.
  • The SEAC was demonstrated in April 1950.
  • The Pilot ACE ran its first program on May 10, 1950 and was demonstrated in December 1950.
  • The SWAC was completed in July 1950.
  • The Whirlwind was completed in December 1950 and was in actual use in April 1951.
  • The first ERA Atlas (later the commercial ERA 1101/UNIVAC 1101) was installed in December 1950.


Evolution

Through the decades of the 1960s and 1970s computers generally became both smaller and faster, which led to evolutions in their architecture. For example, memory-mapped I/O lets input and output devices be treated the same as memory. [24] A single system bus could be used to provide a modular system with lower cost. This is sometimes called a "streamlining" of the architecture. [25] In subsequent decades, simple microcontrollers would sometimes omit features of the model to lower cost and size. Larger computers added features for higher performance.
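As a hedged illustration of memory-mapped I/O, the toy machine from the earlier sketches can be extended so that one address (31, chosen arbitrarily here) is "wired" to an output device; an ordinary STORE to that address then performs output, with no special I/O instructions.

```python
# Memory-mapped I/O on the invented toy machine (illustrative only):
# address 31 is treated as an output port, so a plain STORE to that
# address reaches the device instead of RAM.
LOAD, ADD, STORE, HALT = 1, 2, 3, 6
OUTPUT_PORT = 31

memory = [0] * 32
memory[:4] = [
    LOAD * 100 + 20,                # acc = memory[20]
    ADD * 100 + 21,                 # acc += memory[21]
    STORE * 100 + OUTPUT_PORT,      # "store" to the device address -> prints
    HALT * 100,
]
memory[20], memory[21] = 7, 35

pc, acc = 0, 0
while True:
    opcode, addr = divmod(memory[pc], 100)
    pc += 1
    if opcode == LOAD:
        acc = memory[addr]
    elif opcode == ADD:
        acc += memory[addr]
    elif opcode == STORE:
        if addr == OUTPUT_PORT:
            print("device output:", acc)   # the device, not RAM, sees the write
        else:
            memory[addr] = acc
    elif opcode == HALT:
        break
```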

Design limitations

Von Neumann bottleneck

The shared bus between the program memory and data memory leads to the von Neumann bottleneck, the limited throughput (data transfer rate) between the central processing unit (CPU) and memory compared to the amount of memory. Because the single bus can only access one of the two classes of memory at a time, throughput is lower than the rate at which the CPU can work. This seriously limits the effective processing speed when the CPU is required to perform minimal processing on large amounts of data. The CPU is continually forced to wait for needed data to be transferred to or from memory. Since CPU speed and memory size have grown much faster than the throughput between them, the bottleneck has become more of a problem.
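As a rough, back-of-the-envelope illustration using made-up round numbers (not figures from the cited sources): a processor issuing one instruction per nanosecond, i.e. 10^9 instructions per second, whose average instruction word plus data operand amounts to about 8 bytes, needs roughly 8 GB/s of memory traffic just to stay busy; if the shared bus can sustain only 2 GB/s, the CPU sits idle about three quarters of the time, no matter how fast its arithmetic units are.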

The von Neumann bottleneck was described by John Backus in his 1977 ACM Turing Award lecture. According to Backus:

Surely there must be a less primitive way of making big changes in the store than by pushing vast numbers of words back and forth through the von Neumann bottleneck. Not only is this tube a literal bottleneck for the data traffic of a problem, but, more importantly, it is an intellectual bottleneck that has kept us tied to word-at-a-time thinking instead of encouraging us to think in terms of the larger conceptual units of the task at hand. Thus programming is basically planning and detailing the enormous traffic of words through the von Neumann bottleneck, and much of that traffic concerns not significant data itself, but where to find it. [26] [27]


There are several known methods for mitigating the von Neumann performance bottleneck. For example, the following can all improve performance:

  • Providing a cache between the CPU and the main memory
  • Providing separate caches or separate access paths for data and instructions (the so-called Modified Harvard architecture; a minimal sketch of this idea follows the list)
  • Using branch predictor algorithms and logic
  • Providing a limited CPU stack or other on-chip scratchpad memory to reduce memory access
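The following deliberately simplified simulation sketches why the split caches mentioned in the list help: with a single shared memory path, an instruction fetch and a data access issued in the same cycle must be serialized, while split caches let them overlap. The hit rates, miss penalty, and workload mix are made-up assumptions, not measurements.

```python
# Toy model of unified vs. split instruction/data caches (all numbers are
# invented assumptions; real memory hierarchies are far more complex).
import random

CYCLES = 100_000
I_HIT, D_HIT = 0.95, 0.90      # assumed cache hit rates
MISS_PENALTY = 10              # assumed extra cycles on a miss

def run(split_caches):
    random.seed(0)             # same synthetic workload for both runs
    total = 0
    for _ in range(CYCLES):
        i_cost = 1 if random.random() < I_HIT else MISS_PENALTY
        d_cost = 0
        if random.random() < 0.33:      # assume ~1/3 of instructions touch data
            d_cost = 1 if random.random() < D_HIT else MISS_PENALTY
        if split_caches:
            total += max(i_cost, d_cost)   # fetch and data access overlap
        else:
            total += i_cost + d_cost       # they contend for the single path
    return total

print("unified path:", run(False), "cycles")
print("split caches:", run(True), "cycles")
```

On this made-up workload the split-cache run should report noticeably fewer total cycles, which is the point of separating the instruction and data paths.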

The problem can also be sidestepped somewhat by using parallel computing, for example with the non-uniform memory access (NUMA) architecture; this approach is commonly employed by supercomputers. It is less clear whether the intellectual bottleneck that Backus criticized has changed much since 1977. Backus's proposed solution has not had a major influence. Modern functional programming and object-oriented programming are much less geared towards "pushing vast numbers of words back and forth" than earlier languages like FORTRAN were, but internally, that is still what computers spend much of their time doing, even highly parallel supercomputers.

As of 1996, a database benchmark study found that three out of four CPU cycles were spent waiting for memory. Researchers expect that increasing the number of simultaneous instruction streams with multithreading or single-chip multiprocessing will make this bottleneck even worse. [28]

Self-modifying code

Aside from the von Neumann bottleneck, program modifications can be quite harmful, by accident or design. In some simple stored-program computer designs, a malfunctioning program can damage itself, other programs, or the operating system, possibly leading to a computer crash. Memory protection and other forms of access control can usually protect against both accidental and malicious program modification.

Program modifications can be beneficial. The von Neumann architecture allows for encryption.

See also

  • CARDboard Illustrative Aid to Computation
  • Interconnect bottleneck
  • Little man computer
  • Random-access machine
  • Turing machine
  • Neuromorphic engineering
  • Eckert architecture


References

  1. von Neumann, John (1945), First Draft of a Report on the EDVAC (PDF), archived from the original (PDF) on 2013-03-14, retrieved 2011-08-24
  2. Ganesan 2009
  3. Markgraf, Joey D. (2007), The Von Neumann Bottleneck, archived from the original on December 12, 2013
  4. Copeland 2006, p. 104
  5. MFTL (My Favorite Toy Language) entry, Jargon File 4.4.7, retrieved 2008-07-11
  6. Turing, Alan M. (1936), "On Computable Numbers, with an Application to the Entscheidungsproblem", Proceedings of the London Mathematical Society, 2 (published 1937), 42, pp. 230-265, doi:10.1112/plms/s2-42.1.230 (and Turing, Alan M. (1938), "On Computable Numbers, with an Application to the Entscheidungsproblem. A correction", Proceedings of the London Mathematical Society, 2 (published 1937), 43 (6), pp. 544-546, doi:10.1112/plms/s2-43.6.544)
  7. "Electronic Digital Computers", Nature, 162: 487, September 25, 1948, doi:10.1038/162487a0, archived from the original on April 6, 2009, retrieved April 10, 2009
  8. Lukoff, Herman (1979). From Dits to Bits: A Personal History of the Electronic Computer. Portland, Oregon, USA: Robotics Press. ISBN 0-89661-002-0. LCCN 79-90567.
  9. ENIAC project administrator Grist Brainerd's December 1943 progress report for the first period of the ENIAC's development implicitly proposed the stored-program concept (while simultaneously rejecting its implementation in the ENIAC) by stating that "in order to have the simplest project and not to complicate matters" the ENIAC would be constructed without any "automatic regulation".
  10. Copeland 2006, p. 113
  11. Copeland, Jack (2000), A Brief History of Computing: ENIAC and EDVAC, retrieved 2010-01-27
  12. Copeland, Jack (2000), A Brief History of Computing: ENIAC and EDVAC, retrieved 2010-01-27, which quotes Randell, Brian (1972), Meltzer, B.; Michie, D., eds., "On Alan Turing and the Origins of Digital Computers", Machine Intelligence, Edinburgh: Edinburgh University Press, 7: 10, ISBN 0-902383-26-4
  13. Copeland 2006, pp. 108-111
  14. Bowden 1953, pp. 176-177
  15. Bowden 1953, p. 135
  16. "Electronic Computer Project". Institute for Advanced Study. Retrieved 2011-05-26.
  17. Campbell-Kelly, Martin (April 1982). "The Development of Computer Programming in Britain (1945 to 1955)". IEEE Annals of the History of Computing. 4 (2): 121-139. doi:10.1109/MAHC.1982.10016.
  18. Robertson, James E. (1955), Illiac Design Techniques, report number UIUCDCS-R-1955-146, Digital Computer Laboratory, University of Illinois at Urbana-Champaign
  19. Selective Sequence Electronic Calculator (USPTO website)
  20. Selective Sequence Electronic Calculator (Google Patents)
  21. Grosch, Herbert R. J. (1991), Computer: Bit Slices From a Life, Third Millennium Books, ISBN 0-88733-085-1
  22. Lavington, Simon, ed. (2012). Alan Turing and his Contemporaries: Building the World's First Computers. London: British Computer Society. p. 61. ISBN 9781906124908.
  23. Johnson, Roger (April 2008). "School of Computer Science & Information Systems: A Short History" (PDF). Birkbeck College, University of London. Retrieved 2017-07-23.
  24. Bell, C. Gordon; Cady, R.; McFarland, H.; O'Laughlin, J.; Noonan, R.; Wulf, W. (1970), "A New Architecture for Mini Computers: The DEC PDP-11" (PDF), Spring Joint Computer Conference, pp. 657-675
  25. Null, Linda; Lobur, Julia (2010), The Essentials of Computer Organization and Architecture (3rd ed.), Jones & Bartlett Learning, pp. 36, 199-203, ISBN 978-1-4496-0006-8
  26. Backus, John W. "Can Programming Be Liberated from the von Neumann Style? A Functional Style and Its Algebra of Programs". doi:10.1145/359576.359579.
  27. Dijkstra, Edsger W. "EW Dijkstra Archive: A Review of the 1977 Turing Award Lecture". Retrieved 2008-07-11.
  28. Sites, Richard L.; Patt, Yale. "Architects Look to Processors of Future". Microprocessor Report. 1996.
