Rochus 21 hours ago [-]
Eniac was indeed impressive and an important milestone. I recommend the 1999 book "ENIAC - The triumphs and tragedies of the world's first computer" by Scott McCartney, which is both interesting to read and very informative. Also the review of the book by the late Jean Bartik, one of the "computers" and thus an eyewitness, is very interesting: https://web.archive.org/web/20221101120020/https://www.amazo....
The article is very US-focused, though, keeping quiet that German engineer Konrad Zuse completed the Z3 in May 1941, five years before ENIAC, effectively creating the world's first working programmable and fully automatic digital computer. While ENIAC required days of manual cable patching to program, the Z3 was quickly programmed via punched tape ("Lochstreifen"), and Zuse also invented Plankalkül between 1942 and 1945, which is widely recognized as the world's first high-level programming language. The cooperation between Zuse and ETH Zurich eventually led to the first self-compiling compiler and, later, to Algol 60 (see "The European Side of the Last Phase of the Development of ALGOL 60" by Peter Naur in ACM SIGPLAN "History of Programming Languages" from 1978). And there was also the British Colossus, also a "programmable computer", which successfully utilized vacuum tubes for code-breaking by early 1944.
jcranmer 16 hours ago [-]
There are three main problems with trying to offer a simple answer to the question of "what is the first computer?"
The most obvious of the problems is that a computer isn't a singular technology that springs up de novo, but something that develops from antecedents over a long, messy transition process that requires a judgement call as to when the proto-computer becomes an actual computer, a judgement call which is obviously going to be biased by the other considerations. Consider, for a more contemporary example, what you would argue is the "first smartphone" or the "first LLM." Personally, I think the ENIAC is still somewhat too proto-computer for my tastes: I'd prefer a "first" that uses binary arithmetic and has stored programs, neither of which is true of the ENIAC.
The second major issue is that it's also instructive to look at the candidates' influence on later development. Among the contenders for "first computer," it's unfortunately kinda clear that ENIAC has the most lasting influence. ENIAC's development produced the papers that directly inspired the next generation of machines. Colossus is screwed here because of the secrecy of the code-breaking effort. Meanwhile, Zuse and the Z3 suffer from being on the losing end of WW2. The ABC has a claim here, but it's not clear whether the developers of ENIAC drew influence from the ABC or not.
The final major issue isn't so much an issue by itself but rather something that colors the interpretation of the first two issues: national pride. An American is far more likely to weight the influence and ingenuity of the ENIAC and similar machines to label one of them the "first computer." A UK person would instead prefer to crown Colossus or the Manchester Baby. A German would prefer the Z3.
alephnil 14 hours ago [-]
In many ways the ENIAC was more like an FPGA than a computer. It was programmed with patch cables connecting the different computational units as well as switches, and had no CPU as such. The cables had to be physically rerouted when changing to a new program, which took weeks. My understanding is that it was eventually programmed to emulate a von Neumann machine around 1948/49. As far as I understand, this was done mainly by Jean Bartik based on von Neumann's ideas.
If this is correct, it was not a von Neumann machine originally, but it eventually became one, and at approximately the same time as the Manchester Baby.
mrob 20 hours ago [-]
The Z3 was only general purpose by accident, and this was only discovered in 1997 (published 1998). [0] It's only of theoretical interest because the technique required is too inefficient for real-world applications.
ENIAC is notable because it was the first intentionally general purpose computer to be built.
I do not think that it is right at all to say "intentionally general purpose computer".
ENIAC was built for a special purpose, the computation of artillery tables.
It was a bespoke computer built for a single customer: the United States Army's Ballistic Research Laboratory.
This is why it was designed as the digital electronic equivalent of the analog mechanical computers previously used by the Army, and why it does not resemble at all what is now meant by "general-purpose computer".
The computers of Aiken and Zuse were really intentionally general-purpose: their designers did not have in mind any specific computation, which is why they were controlled by a program memory, not by a wiring diagram.
What you claim about the Z3 being general purpose by accident does not refer to the intention of its designer, but only to the fact that its instruction set happened to be powerful enough, because at that early time it was not understood which kinds of instructions are necessary for completeness.
All the claims made now about ENIAC being general-purpose are retroactive. Only after the war ended and the concept of a digital computer became well understood was ENIAC repurposed to do tasks other than those originally planned.
The first truly general-purpose electronic digital computers that were intentionally designed to be so were those designed based on the von Neumann report.
Before the completion of the first of those, there were general-purpose hybrid electronic-electromechanical digital computers, IBM SSEC being the most important of them, which solved a lot of scientific and technical problems, before electronic computers became available.
rootbear 13 hours ago [-]
A counterargument is that Mauchly was actually interested in using computers for weather modeling, and I’m sure that influenced the design of ENIAC. He could only get ENIAC funded if it was valuable to the war effort. I’ve read quite a lot about that machine and I’m not aware of any architectural features that were specific to ballistics calculations. This is unlike the British Colossus, another early computer, which was specifically designed for code breaking and wasn’t general purpose.
As for the objection that it wasn’t stored program, I was interested to learn that it was converted to stored-program operation after only two years or so of operation, using the constant table switches as the program store. But the Manchester Baby, which used the same memory for code and data, was more significant in the history of stored-program machines.
On the general question of “first computer”, I think the answer is whatever machine you want it to be if you heap enough conditional adjectives on it.
Rochus 13 hours ago [-]
> Mauchly was actually interested in using computers for weather modeling and I’m sure that influenced the design of ENIAC
True. Mauchly was a physics professor interested in meteorology, and he knew that predicting the weather and calculating an artillery shell's flight are mathematically the same type of problem, which was important for getting funding. In the fifties, Eniac was even used to calculate weather forecasts (see https://ams.confex.com/ams/2020Annual/webprogram/Manuscript/...). So these were just two related special problems, and it would be a stretch to interpret this as an intention to build a general-purpose computer. The latter had to wait until the sixties.
Rochus 18 hours ago [-]
> The Z3 was only general purpose by accident ... ENIAC [..] was the first intentionally general purpose computer
That's a pretty academic take. Neither Eckert, nor Mauchly, nor Zuse knew about Alan Turing’s 1936 paper when they designed their machines. The classification of ENIAC (and the Z3) as a "universal Turing machine" is entirely a retroactive reinterpretation by later computer scientists. John von Neumann knew the paper and was aware of its significance, but he only turned up in the ENIAC project when the design was complete. By that time, Eckert and Mauchly were already well aware of ENIAC's biggest flaw (the massive effort to reprogram the machine), and in fact they came up with the stored-program concept which von Neumann later formalized. ENIAC’s funding and primary justification were for the very specific purpose of calculating artillery firing tables for the military. The machine was built for this purpose, which included the features that retroactively led to the mentioned classification.
ahartmetz 20 hours ago [-]
Still feels like history written by the victors (of WW2 and computing, eventually) in this case. If you want to be mathematically precise, it's been proven to be Turing-complete. If you want to use common sense (IMO better), it was one of the most significant leaps in automated computation and simply didn't need to do more for its intended applications. For conditional branches to make sense, you also need a fast temporary storage array (since it would be awfully slow running directly off tape like a Turing machine), and to realize that all that effort makes sense, you first need to play with a computer for a while and discover the new possibilities.
yaakov34 17 hours ago [-]
The Z3 was not a general purpose computer; it was a calculator that performed a predetermined sequence of operations written to its tape. It was remarkable for being all-binary in an era when differential gears and cams were very common in computing devices, and it had some other advanced features. But the 1990s article that declared it Turing-complete is just silly. It would apply to every four-function calculator that supports rounding, and programming a computer like that is not just "impractical" - both the tape and the execution time would grow exponentially in the number of branches - but it is also not the model that Turing proposed. The whole point of Turing's (theoretical) device is that a short program using the abilities of that device could perform unlimited computations; if you make the program length unlimited instead, that's a much less interesting model of computation.
The problem is that anything that gets into Wikipedia becomes ingrained in the Internet's collective mind, which then can't be changed.
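The unrolling argument above can be sketched concretely. This is a hypothetical illustration, not actual Z3 code or instructions: a machine with no conditional branch can still emulate one by computing both arms and selecting arithmetically, at the cost of unrolling every conditional onto the tape.

```python
# Sketch of the "accidental Turing-completeness" trick (my gloss on the
# Rojas-style argument; all names here are invented for illustration).

def select(cond, a, b):
    """Branch-free conditional: cond must be 0 or 1."""
    return cond * a + (1 - cond) * b

def emulated_branch(x):
    """Emulate `x*2 if x > 0 else x+1` without a branch instruction."""
    # On branchless hardware this flag would itself come from arithmetic,
    # e.g. extracting a sign bit; Python's ternary stands in for that here.
    cond = 1 if x > 0 else 0
    return select(cond, x * 2, x + 1)

def unrolled_length(k, arm_len=1):
    """Straight-line tape length to cover k nested conditionals:
    both arms of every conditional must be written out, so the
    number of arm-sequences is on the order of 2**k."""
    return arm_len * (2 ** k)
```

This is why "it can compute anything in principle" and "it is a usable general-purpose machine" come apart: the tape, not the machine, carries the control structure.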
ogogmad 14 hours ago [-]
Would it not have been easy to add branch instructions to it? Just rewind the instruction tape however many places. It seems 99% of the job was done.
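As a toy illustration of that suggestion (every opcode below is invented, nothing resembles the Z3's actual instruction set), a single "jump back n places if nonzero" instruction is indeed enough to turn a forward-only instruction tape into a loop-capable machine:

```python
# Minimal tape interpreter: instructions are (op, arg) pairs read in
# order; "jbnz" rewinds the read head, which is what gives us loops.

def run(tape, acc=0, max_steps=1000):
    pc = 0       # position of the read head on the instruction tape
    steps = 0
    while pc < len(tape) and steps < max_steps:
        op, arg = tape[pc]
        if op == "add":
            acc += arg
        elif op == "jbnz" and acc != 0:
            pc -= arg          # rewind the tape `arg` places
            steps += 1
            continue
        pc += 1
        steps += 1
    return acc

# A countdown loop: impossible on a strictly forward-moving tape,
# trivial once one instruction can move the head backwards.
countdown = [("add", -1), ("jbnz", 1)]
```

For example, `run(countdown, acc=5)` loops until the accumulator reaches 0 and then falls off the end of the tape.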
adrian_b 20 hours ago [-]
The shortened title is very incorrect.
What the article says is different: "the first large-scale, general-purpose, programmable electronic digital computer".
The claim of the article can be considered correct, and "electronic" is a part that cannot be deleted from it without falsifying the claim.
Before ENIAC, there had been digital computers that were much more general-purpose, because they ran programs written on punched tape instead of requiring rewiring like ENIAC.
ENIAC, which evolved from the analog computers known as differential analyzers, had a structure closer to an FPGA than to a modern digital computer.
In contrast, an earlier relay computer like the Harvard Mark I was intended as a successor of the mechanical digital computer designed by Charles Babbage, so it already had the same structure as a modern digital computer, except that it used different kinds of memory for data and for programs, hence the name "Harvard architecture". The same was true for the Zuse computers.
The earlier ABC digital computer was electronic, but it can be considered as special-purpose, not general-purpose. The first relay computers at Bell Labs may also be considered as special purpose.
mrob 20 hours ago [-]
In this context, "general purpose" means "Turing complete" in the informal sense of handwaving away the requirement for infinite storage space.
adrian_b 20 hours ago [-]
What you say changes nothing.
The earlier relay computers were Turing complete.
For ENIAC it also does not make sense to claim that it was Turing complete. Such a claim can be made for a computer controlled by a program memory, where you have a defined instruction set, and the instruction set may be complete or not. If you may arbitrarily rewire the execution units, any computer is Turing complete.
The earlier ABC electronic computer was built for a special purpose, the solution of systems of linear algebraic equations, just as ENIAC was built only for a special purpose, the computation of artillery tables.
By rewiring the ABC electronic computer you could have also computed anything, so you can say that it was Turing complete, if rewiring is allowed.
The only difference is that rewiring was simpler in ENIAC, because it had been planned to be easy, so there were special panels where you could alter the connections.
Neither ABC nor ENIAC had enough memory to be truly general-purpose. By the end of the war it was recognized that this was the main limitation on extending the domain of applications, so the ENIAC team proposed ultrasonic delay lines as the solution for a big memory (inspired by the use of delay lines as analog memory in radar), while von Neumann proposed using a cathode-ray tube of the kind used in video cameras (the iconoscope); this was implemented first in the Manchester computers.
Because ENIAC was not really designed as general-purpose, its designers originally did not think about high-capacity memories. On the other hand, John Vincent Atanasoff, the main designer of the ABC computer, wrote a very insightful document about the requirements for memories in digital computers years before ENIAC, in which he analyzed all the known possibilities and invented the concept of DRAM, though implemented with discrete capacitors. Later, von Neumann's proposal was also to use a DRAM, but with a cheaper and more compact iconoscope CRT instead of discrete capacitors.
While the ABC computer was not general-purpose as built, the document written by Atanasoff in 1940, “Computing Machine for the Solution of Large Systems of Linear Algebraic Equations”, demonstrated a much better understanding of the concept of a general-purpose electronic digital computer than the designers of ENIAC would show before the end of 1944 / beginning of 1945, when they realized that a bigger memory was needed to make a computer suitable for other applications, i.e. to really make it "general purpose".
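The capacitor-based memory idea attributed to Atanasoff above is essentially the DRAM concept: a capacitor stores a bit as charge, the charge leaks, so every cell must be periodically read and rewritten. A minimal sketch, with made-up leak and threshold constants (nothing here models the ABC's actual circuit):

```python
# Toy model of capacitor memory with decay and refresh.

LEAK = 0.8         # assumed fraction of charge surviving one time step
THRESHOLD = 0.5    # read as 1 if remaining charge exceeds this

class CapacitorMemory:
    def __init__(self, nbits):
        self.charge = [0.0] * nbits

    def write(self, i, bit):
        self.charge[i] = 1.0 if bit else 0.0

    def read(self, i):
        return 1 if self.charge[i] > THRESHOLD else 0

    def tick(self):
        # Charge decays every time step.
        self.charge = [c * LEAK for c in self.charge]

    def refresh(self):
        # Read each cell and rewrite it at full charge.
        for i in range(len(self.charge)):
            self.write(i, self.read(i))
```

Without a refresh, a stored 1 decays below the threshold after a few ticks (0.8**4 = 0.41 < 0.5) and reads back as 0; refreshing every couple of ticks keeps it alive indefinitely. That refresh obligation is the defining cost of DRAM, whether built from discrete capacitors or from an iconoscope CRT.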
klelatti 21 hours ago [-]
ENIAC was very important, but this article overstates its significance and ignores other (non-US) machines to the point of historical inaccuracy. No mention of the Z3 or the Manchester Baby, for example; the latter, based on the von Neumann paper, was arguably a more accurate pointer towards how computer architecture would develop.
BirAdam 17 hours ago [-]
ENIAC was far more important for the general computing industry than other machines of its time. ENIAC led to EMCC (the first computer company) and UNIVAC. It was UNIVAC and IBM struggling against one another that created the entire industry.
adrian_b 2 hours ago [-]
The publication of the von Neumann report was many orders of magnitude more important for the computing industry than ENIAC.
Soon after that publication, many teams in many places and in several countries started designing computers, and a lot of research results were published by them, which led to the establishment of the computer industry.
The first computer company was a flop; they got lucky that they were eventually bought, otherwise they would not have been able to deliver a useful product. UNIVAC was indeed the first important commercial computer (not the first commercial computer). Nevertheless, UNIVAC was mostly already obsolete at the time of its introduction, due to the use of delay-line memories, for which better alternatives were known. UNIVAC did have a technical first nonetheless: it was the first to use magnetic tapes. However, the competition (i.e. IBM) soon made more reliable and also cheaper magnetic tape units.
In the USA, UNIVAC had the advantage of being first on the market, but the IBM computers that followed shortly were more innovative, so IBM deserved to become the leader of the market instead of UNIVAC. Moreover, IBM was more open at that time and published a lot of useful technical information about its computers, which contributed to the advancement of the entire computing industry.
The ENIAC team and their successors made a quite minor contribution to the early years of the US computing industry, in comparison with research centers like IAS, universities like MIT, government agencies like NBS (the predecessor of NIST), and companies like IBM, AT&T and a few others, all of which introduced essential innovations in computers and also published the results of their work, enabling reuse by others.
hedora 16 hours ago [-]
IBM built the Harvard Mark 1 in 1944, before EMCC existed.
adrian_b 2 hours ago [-]
Then IBM also built the hybrid electronic-electromechanical IBM SSEC computer (operational from January 1948), which was a truly general-purpose digital computer, available before any fully electronic computer; for a few years it was the most powerful computer in the world and it solved many important problems.
While ENIAC, being completely electronic, remained faster than SSEC for a few problems, most problems could not be solved at all on ENIAC, because it had no big-capacity memory, so for most computing problems SSEC was the best choice until the completion of the first electronic computers with memories based either on cathode-ray tubes or on delay lines or on magnetic drums.
IBM SSEC was available as a public computing service, so it was used by many companies and institutions. Besides SSEC, before the first electronic computers there were a few other electromechanical computers, e.g. at Bell Labs or at Harvard, but those were slower and had fewer users.
sebastos 19 hours ago [-]
Wait, where do you think the von Neumann paper came from?
adrian_b 2 hours ago [-]
The von Neumann report was written after von Neumann had several discussions with the ENIAC team about how to make a better computer as a successor for ENIAC.
The report was not published formally, but it was "leaked", so it does not give any credit for the ideas contained in it.
Because of this, with few exceptions it is impossible to determine with certainty which parts of the report are original ideas of von Neumann and which parts are ideas that von Neumann might have learned during the discussions with the ENIAC team.
An example of an idea that certainly did not come from the ENIAC team was the proposal to use an iconoscope CRT as the main memory (which was implemented first in the British Manchester computers, so such a memory became known as a Williams-Kilburn tube). The ENIAC team had a different idea of what to use as a memory, i.e. delay lines taken from radars. Von Neumann replaced this suggestion with a CRT because he thought that a random-access memory was better.
The von Neumann report had an exceptional importance because it defined with perfect clarity what a digital computer should be and what its structure should be, and then provided a detailed description of how such a computer should be designed, good enough to enable anyone who read the report to build one. This effect really happened: a great number of teams at universities, government agencies, independent research centers like IAS, and various companies, both in the USA and in other countries, built electronic computers in the following decade, exploring various design options.
There is no doubt that the clarity of the report is due to von Neumann; whatever the ENIAC team's ideas about a future computer were, they were much more jumbled.
Because the ENIAC team did not publish their ideas (and they did not intend to, because they already wanted to monetize what they had learned about computers by founding a private company), it does not really matter what they thought. The world learned how to make general-purpose electronic computers from the von Neumann report.
ENIAC was a programmable computing automaton, but it was not a digital computer in the modern sense of the word, i.e. a digital system with 4 levels of nested positive-feedback loops (the complexity of a digital system is determined by the number of levels of nested positive-feedback loops: combinational logic has 0 levels, a memory has 1, an automaton has 2, a processor has 3, and a computer has 4; these are minimum numbers, as a real device may have more levels than strictly necessary, to achieve various advantages).
klelatti 18 hours ago [-]
The paper came out of work on ENIAC, and ENIAC was later adapted to follow the paper's approach, but Baby was built from the outset to use that approach, and its design much more closely matches the architecture used by almost all digital computers since. I don’t dispute that ENIAC is important, but its role is more nuanced than this article implies.
JoeDaDude 15 hours ago [-]
For a while I worked at what was then the Sperry Rand Corporation (now Unisys) which had some pride in their heritage as the descendant of the Univac Corporation founded by ENIAC inventors Eckert and Mauchly. In a glass case there was a vacuum tube circuit said to be a memory unit of the original ENIAC. No one seemed to know much about it, casting doubt on the claimed provenance of the device.
The tube circuit resembled the ones shown in the photo linked below (although none of those in the photo are from ENIAC).
ENIAC is where the profession of programming was born — and the first programmers were six women: Kay McNulty, Betty Jennings, Betty Snyder, Marlyn Meltzer, Frances Bilas, and Ruth Lichterman.
They had to program it by physically rewiring patch cables and flipping switches. There was no programming language, no stored program. The "software" was the hardware configuration itself.
It took another decade before FORTRAN (1957) gave programmers a way to write instructions in something resembling human language.
adrian_b 2 hours ago [-]
There were programmers before that, e.g. for the IBM ASCC at Harvard, which was based on the ideas of Howard Aiken (inspired by Babbage).
Programming the IBM ASCC (a.k.a. Harvard Mark I) was much closer to programming a modern computer than programming ENIAC was, as it had an instruction set and programmers wrote a sequence of instructions on punched tape. Even the ASCC, however, had a panel where it was possible to rewire some of the execution units to change their behavior, i.e. to change what some of the instructions in the instruction set did, but that was not the primary means of programming the computer. In the ASCC, rewiring was akin to the microprogramming available in some later electronic computers, where you could change what some instructions did or add custom instructions.
Among the programmers of the IBM ASCC, Grace Hopper became later famous due to her contributions to the first high-level programming languages.
Therefore the profession of programmer did not start with ENIAC, even if the ENIAC programmers were among the first programmers.
AnimalMuppet 12 hours ago [-]
Even in the 1950s, my mother worked on a machine that could be programmed in octal, but you could change the instruction set with patch cables.
adrian_b 2 hours ago [-]
Perhaps one of the IBM 60x programmable calculator series, which were widely used.
LetsGetTechnicl 14 hours ago [-]
Thank you for highlighting the contributions of women in computing, especially at its inception! That is so easily forgotten or intentionally ignored in the age of the "tech bro".
adrian_b 1 hours ago [-]
There were other programmers before those of ENIAC, but they also included women, like Grace Hopper, who later had an important role in the development of programming languages.
gerikson 22 hours ago [-]
> The computer contained about 18,000 vacuum tubes, which were cooled by 80 air blowers. More than 30 meters long, it filled a 9 m by 15 m room and weighed about 30 kilograms. It consumed as much electricity as a small town.
Surprisingly light though...
ahartmetz 20 hours ago [-]
The vacuum in the tubes weighs nothing, so they produce lift.
adrian_b 19 hours ago [-]
An obvious typo. It was tons, not kilograms.
Perhaps AI aided?
gus_massa 20 hours ago [-]
I still remember the CRT TV we had at home when I was a kid. It was big but almost empty.
Vacuum tubes fail fairly often. Once per year? But if you have a thousand of them, you have to change one very often. So I guess they left a lot of space for humans repairing it.
jonjacky 10 hours ago [-]
A newer book, from 2016: ENIAC in Action: Making and Remaking the Modern Computer by Thomas Haigh, Mark Priestley, and Crispin Rope.
An interesting revelation here is that, although ENIAC was not originally conceived as a stored program computer, it was quite early converted to one. They repurposed a lookup table intended to calculate functions to store instructions instead. Many of the well-known ENIAC calculations, such as Monte Carlo simulations, were programmed in this mode.
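The conversion described above can be sketched as a toy fetch-decode-execute loop running out of a read-only table. The opcodes below are invented for illustration and are not ENIAC's actual orders; the point is only that a table meant to hold function values can just as well hold instruction codes:

```python
# A "function table" repurposed as a read-only program store.
FUNCTION_TABLE = [
    ("load", 7),     # acc <- 7
    ("add", 5),      # acc <- acc + 5
    ("store", 0),    # memory[0] <- acc
    ("halt", 0),
]

def run_from_table(table):
    """Fetch instructions from the table exactly as if they were data."""
    acc = 0
    memory = {}
    pc = 0
    while True:
        op, arg = table[pc]   # fetch
        pc += 1
        if op == "load":      # decode + execute
            acc = arg
        elif op == "add":
            acc += arg
        elif op == "store":
            memory[arg] = acc
        elif op == "halt":
            return memory

# run_from_table(FUNCTION_TABLE) leaves {0: 12} in memory
```

Once the machine fetches its next operation from a numbered table entry rather than from a plugboard, reprogramming means rewriting table entries, which is the essence of the stored-program mode of operation described in the book.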
bpoyner 16 hours ago [-]
Reminds me of stumbling across the Harvard Mark I, which is about 1-2 years older, while wandering through the Harvard Science Center (as one does). By far the oldest computer I've seen in person. Seems they moved it since then to the Science and Engineering Complex.
ux266478 18 hours ago [-]
For the curious programmers who are wondering what it was like to program ENIAC, a simulator is available:
I find it interesting that, unless I’m mistaken, this was purely an engineering effort.
That is, they were not trying to follow the notion of a universal computing device that had already been defined by Turing and Church at the time. They were just trying to build something like a huge programmable calculator, but they ended up building a universal computation device anyway.
My alma mater, Ursinus, is a very small school and has few claims to fame; but one of them is that John Mauchly taught there before going to Penn to design ENIAC. Wikipedia puts it bluntly:
> Mauchly's teaching career truly began in 1933 at Ursinus College where he was appointed head of the physics department, where he was, in fact, the only staff member.
josefritzishere 18 hours ago [-]
One of the engineers who worked on the Eniac lived next door to my old landlord in New Jersey. They played chess in the back yard. He was reputedly quite good.
Finnucane 15 hours ago [-]
"More than 30 meters long, it filled a 9 m by 15 m room and weighed about 30 kilograms."
It was a balloon?
rvz 24 hours ago [-]
Unfortunately, if the subject doesn't have "AI", no one cares.
xattt 22 hours ago [-]
It has IA in its name so that must count for something.
embedding-shape 22 hours ago [-]
That's AI in Spanish, so checks out in large parts of the world :)
casey2 4 hours ago [-]
It's wild that in 100 years we went from room sized calculators to AGSI capable of destroying the planet.
Though the article is very US focussed, keeping quiet that German engineer Konrad Zuse completed the Z3 in May 1941, five years before ENIAC, effectively creating the world's first working programmable and fully automatic digital computer. While ENIAC required days of manual cable patching to program, the Z3 was quickly programmed by a punched tape ("Lochstreifen"), and Zuse also has invented Plankalkül between 1942 and 1945, which is widely recognized as the world's first high-level programming language. The cooperation between Zuse and ETH Zurich eventually led to the first self-compiling compiler and eventually Algol 60 (see "The European Side of the Last Phase of the Development of ALGOL 60" by Peter Naur in ACM SIGPLAN "History of Programming Languages" from 1978). And there was also the British Colossus, which was also a "programmable computer" and successfully utilized vacuum tubes for code-breaking by early 1944.
The most obvious of the problems is that a computer isn't a singular technology that springs up de novo, but something that develops from antecedents over a long, messy transition problem that requires a judgement call as to when the proto-computer becomes an actual computer. A judgement call which is obviously going to be biased based on the other considerations. Consider, for a more contemporary example, what you would argue as the "first smartphone" or the "first LLM." Personally, I think the ENIAC is still somewhat too proto-computer for my tastes: I'd prefer a "first" that uses binary arithmetic and has stored programs, neither of which is true for the ENIAC.
The second major issue is it's also instructive to look at the candidates' influence on later development. Among the contenders for "first computer," it's unfortunately kinda clear that ENIAC has the most lasting influence. ENIAC's development produced the papers that directly inspires the next generation of machines. Colossus is screwed here because of the secrecy of the code-breaking effort. Meanwhile, Zuse and Z3 suffer from being on the losing end of WW2. ABC has a claim here, but it's not clear whether or not the developers of ENIAC drew influence from ABC or not.
The final major issue isn't so much an issue by itself but rather something that colors the interpretation of the first two issues: national pride. An American is far more likely to weight the influence and ingenuity of the ENIAC and similar machines to label one of them the "first computer." A UK person would instead prefer to crown Colossus or the Manchester Baby. A German would prefer the Z3.
If this is correct, it was not a von Neuman machine originally, but it eventually became one, and at approximately the same time as the Manchester Baby.
ENIAC is notable because it was the first intentionally general purpose computer to be built.
[0] https://www.inf.fu-berlin.de/inst/ag-ki/rojas_home/documents...
ENIAC was built for a special purpose, the computation of artillery tables.
It was a bespoke computer built for a single customer: the United States Army's Ballistic Research Laboratory.
This is why it has been designed as the digital electronic equivalent of the analog mechanical computers that were previously used by the Army and why it does not resemble at all what is now meant by "general-purpose computer".
The computers of Aiken and Zuse were really intentionally general-purpose, their designers did not have in mind any specific computation, which is why they were controlled by a program memory, not by a wiring diagram.
What you claim about Z3 being general purpose by accident does not refer to the intention of its designer, but only to the fact that its instruction set was actually powerful enough by accident, because at that early time it was not understood which kinds of instructions are necessary for completeness.
All the claims made now about ENIAC being general-purpose are retroactive. Only after the war ended and the concept of a digital computer became well understood, the ENIAC was repurposed to also do other tasks than originally planned.
The first truly general-purpose electronic digital computers that were intentionally designed to be so were those designed based on the von Neumann report.
Before the completion of the first of those, there were general-purpose hybrid electronic-electromechanical digital computers, IBM SSEC being the most important of them, which solved a lot of scientific and technical problems, before electronic computers became available.
As for the objection that it wasn't a stored-program machine, I was interested to learn that it was converted to stored-program operation after only about two years of operation, using the constant-table switches as the program store. But the Manchester Baby, which used the same memory for code and data, was more significant in the history of stored-program machines.
On the general question of “first computer”, I think the answer is whatever machine you want it to be if you heap enough conditional adjectives on it.
True. Mauchly was a physics professor interested in meteorology, and he knew that predicting the weather and calculating an artillery shell's flight are mathematically the same type of problem, which was important for getting funding. In the fifties, ENIAC was even used to calculate weather forecasts (see https://ams.confex.com/ams/2020Annual/webprogram/Manuscript/...). So these were just two related special problems, and it would be a stretch to interpret this as an intention to build a general-purpose computer. The latter had to wait until the sixties.
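The mathematical kinship claimed above is that both ballistics and weather prediction come down to numerically integrating differential equations. As an illustrative sketch only (not ENIAC's actual procedure, and with made-up round constants), here is a shell trajectory with quadratic air drag stepped forward by the explicit Euler method:

```python
import math

def trajectory(v0, angle_deg, k=0.00005, g=9.81, dt=0.01):
    """Return (range_m, flight_time_s) for a drag-affected shell.

    k is a hypothetical drag coefficient; all constants are illustrative.
    """
    vx = v0 * math.cos(math.radians(angle_deg))
    vy = v0 * math.sin(math.radians(angle_deg))
    x = y = t = 0.0
    while y >= 0.0:
        v = math.hypot(vx, vy)
        # drag decelerates each component in proportion to v * component
        vx -= k * v * vx * dt
        vy -= (g + k * v * vy) * dt
        x += vx * dt
        y += vy * dt
        t += dt
    return x, t

rng, t = trajectory(v0=450.0, angle_deg=45.0)
```

A firing table is essentially this loop re-run over a grid of muzzle velocities and elevations; a weather model steps a much larger system of the same kind of equations.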
That's a pretty academic take. Neither Eckert, nor Mauchly, nor Zuse knew about Alan Turing's 1936 paper when they designed their machines. The classification of ENIAC (and the Z3) as a "universal Turing machine" is entirely a retroactive reinterpretation by later computer scientists. John von Neumann knew the paper and was aware of its significance, but he only turned up in the ENIAC project when the design was complete. By that time, Eckert and Mauchly were already well aware of ENIAC's biggest flaw, the massive effort needed to reprogram the machine; in fact, they came up with the stored-program concept which von Neumann later formalized. ENIAC's funding and primary justification were for the very specific purpose of calculating artillery firing tables for the military. The machine was built for this purpose, which included the feature that retroactively led to the mentioned classification.
The problem is that anything that gets into Wikipedia becomes ingrained in the Internet's collective mind, which then can't be changed.
What the article says is different: "the first large-scale, general-purpose, programmable electronic digital computer".
The claim of the article can be considered correct, and "electronic" is a part that cannot be deleted from it without falsifying the claim.
Before ENIAC, there had been digital computers that were much more general-purpose, because they ran programs written on punched tape instead of requiring rewiring like ENIAC.
ENIAC, which evolved from the analog computers known as differential analyzers, had a structure closer to an FPGA than to a modern digital computer.
In contrast, an earlier relay computer like the Harvard Mark I was intended as a successor of the mechanical digital computer designed by Charles Babbage, so it already had the same structure as a modern digital computer, except that it used different kinds of memory for data and for programs, hence the name "Harvard architecture". The same was true for the Zuse computers.
The earlier ABC digital computer was electronic, but it can be considered special-purpose, not general-purpose. The first relay computers at Bell Labs may also be considered special-purpose.
The earlier relay computers were Turing complete.
For ENIAC it also does not make sense to claim that it was Turing complete. Such a claim can be made for a computer controlled by a program memory, where there is a defined instruction set, and that instruction set may or may not be complete. If you may arbitrarily rewire the execution units, any computer is Turing complete.
The earlier ABC electronic computer was built for a special purpose, the solution of systems of linear algebraic equations, just as ENIAC was built for a special purpose, the computation of artillery tables.
By rewiring the ABC electronic computer you could also have computed anything, so you could say that it was Turing complete, if rewiring is allowed.
The only difference is that rewiring was simpler in ENIAC, because it had been planned to be easy, so there were special panels where you could alter the connections.
Neither ABC nor ENIAC had enough memory to be truly general-purpose, and by the end of the war it was recognized that this was the main limitation for extending the domain of applications, so the ENIAC team proposed ultrasonic delay lines as the solution for a big memory (inspired by the use of delay lines as an analog memory in radars), while von Neumann proposed the use of a cathode ray tube of the kind used in video cameras (iconoscope; this was implemented first in the Manchester computers).
Because ENIAC was not really designed to be general-purpose, its designers originally did not think about high-capacity memories. On the other hand, John Vincent Atanasoff, the main designer of the ABC computer, wrote a very insightful document about the requirements for memories in digital computers, years before ENIAC, in which he analyzed all the known possibilities and invented the concept of DRAM, implemented with discrete capacitors. Later, von Neumann's proposal was also to use a DRAM, but with a cheaper and more compact iconoscope CRT instead of discrete capacitors.
While the ABC computer was not general-purpose as built, the document written by Atanasoff in 1940, "Computing Machine for the Solution of Large Systems of Linear Algebraic Equations", demonstrated a much better understanding of the concept of a general-purpose electronic digital computer than the designers of ENIAC would show before the end of 1944 / beginning of 1945, when they realized that a bigger memory was needed to make a computer suitable for other applications, i.e. for really making it "general purpose".
Soon after that publication, many teams in many places and in several countries started designing computers, and a lot of research results were published by them, which led to the establishment of the computer industry.
The first computer company was a flop; they got lucky that they were eventually bought, otherwise they would not have been able to deliver a useful product. UNIVAC was indeed the first important commercial computer (not the first commercial computer). Nevertheless, UNIVAC was already mostly obsolete at the time of its introduction, due to its use of delay-line memories, for which better alternatives were known. UNIVAC did have a technical first nonetheless: it was the first to use magnetic tapes. However, the competition (i.e. IBM) soon made more reliable and also cheaper magnetic tape units.
In the USA, UNIVAC had the advantage of being first on the market, but the IBM computers that followed shortly after were more innovative, so IBM deserved to become the market leader instead of UNIVAC. Moreover, IBM was more open at that time and published a lot of useful technical information about its computers, which contributed to the advancement of the entire computing industry.
The ENIAC team and their successors made a rather minor contribution to the early years of the US computing industry, in comparison with research centers like IAS, universities like MIT, government agencies like NBS (the predecessor of NIST), or companies like IBM, AT&T and a few others, all of which introduced essential innovations in computers and also published the results of their work, enabling reuse by others.
While ENIAC, being completely electronic, remained faster than the SSEC for a few problems, most problems could not be solved on ENIAC at all, because it had no large-capacity memory, so for most computing problems the SSEC was the best choice until the completion of the first electronic computers with memories based on cathode-ray tubes, delay lines or magnetic drums.
The IBM SSEC was available as a public computing service, so it was used by many companies and institutions. Besides the SSEC, before the first electronic computers there were a few other electromechanical computers, e.g. at Bell Labs or at Harvard, but those were slower and had fewer users.
The report was not published formally, but was "leaked", so it does not contain any credits for the ideas presented in it.
Because of this, with few exceptions it is impossible to determine with certainty which parts of the report are original ideas of von Neumann and which parts are ideas that von Neumann might have learned during the discussions with the ENIAC team.
An example of an idea that certainly did not come from the ENIAC team was the proposal to use an iconoscope CRT as the main memory (which was implemented first in the British Manchester computers, so such a memory became known as a Williams-Kilburn tube). The ENIAC team had a different idea of what to use as a memory, i.e. delay lines taken from radars. Von Neumann replaced this suggestion with a CRT, because he thought that a random-access memory is better.
The von Neumann report had exceptional importance because it defined with perfect clarity what a digital computer should be and what its structure should be, and then provided a detailed description of how such a computer should be designed, good enough to enable anyone who read the report to build one. This effect really happened: a great number of teams at universities, government agencies, independent research centers like IAS, and various companies, both in the USA and in other countries, built electronic computers in the following decade, exploring various design options.
There is no doubt that the clarity of the report is due to von Neumann; whatever the ENIAC team's ideas about a future computer were, they were much more jumbled.
Because the ENIAC team did not publish their ideas (and they did not intend to, because they already wanted to monetize what they had learned about computers by founding a private company), it does not really matter what they thought. The world learned how to make general-purpose electronic computers from the von Neumann report.
ENIAC was a programmable computing automaton, but it was not a digital computer in the modern sense of the word, i.e. a digital system with four levels of closed positive-feedback loops (the complexity of a digital system is determined by the number of levels of nested positive-feedback loops: combinational logic has 0 levels, a memory has 1 level, an automaton has 2 levels, a processor has 3 levels, and a computer has 4 levels; these are minimum numbers, as a real device may have more levels than strictly necessary, to achieve various advantages).
The tube circuit resembled the ones shown in the photo linked below (although none of those in the photo are from ENIAC).
https://en.wikipedia.org/wiki/File:Women_holding_parts_of_th...
They had to program it by physically rewiring patch cables and flipping switches. There was no programming language, no stored program. The "software" was the hardware configuration itself.
It took another decade before FORTRAN (1957) gave programmers a way to write instructions in something resembling human language.
Programming the IBM ASCC (a.k.a. Harvard Mark I) was much closer to programming a modern computer than programming ENIAC was, as it had an instruction set and programmers wrote sequences of instructions on punched tape. However, even the ASCC had a panel where it was possible to rewire some of the execution units to change their behavior, i.e. to change what some of the instructions from the instruction set did, but that was not the primary means of programming the computer. In the ASCC the rewiring was akin to the microprogramming available in some later electronic computers, where you could change what some instructions did or add custom instructions.
Among the programmers of the IBM ASCC, Grace Hopper later became famous for her contributions to the first high-level programming languages.
Therefore the profession of programmer did not start with ENIAC, even if the ENIAC programmers were among the first programmers.
Surprisingly light though...
Perhaps AI aided?
Vacuum tubes break too often. Once per year? But if you have a thousand of them, you have to change one very often. So I guess they left a lot of space for humans to repair it.
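The intuition above is easy to check with a back-of-the-envelope calculation. Assuming (hypothetically) that each tube fails independently about once per year, the expected time between failures anywhere in the machine shrinks to 1/N of that for N tubes; ENIAC's commonly cited tube count is about 17,468:

```python
HOURS_PER_YEAR = 8760.0

def hours_between_failures(n_tubes, failures_per_tube_per_year=1.0):
    """Mean hours between tube failures, assuming independent failures."""
    machine_failures_per_year = n_tubes * failures_per_tube_per_year
    return HOURS_PER_YEAR / machine_failures_per_year

print(hours_between_failures(1_000))   # ~8.8 hours between failures
print(hours_between_failures(17_468))  # ~0.5 hours, hence constant repair work
```

In practice ENIAC reportedly did better than this naive estimate, partly by running tubes well below their rated voltages, but the calculation shows why tube reliability dominated the engineering.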
https://mitpress.mit.edu/9780262334433/eniac-in-action/
An interesting revelation here is that, although ENIAC was not originally conceived as a stored-program computer, it was converted to one quite early. They repurposed a lookup table intended for calculating functions to store instructions instead. Many of the well-known ENIAC calculations, such as the Monte Carlo simulations, were programmed in this mode.
https://www.cs.drexel.edu/~bls96/eniac/simulator.html
And a programming manual:
https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=846...
It's really got a nice archaic character.
The Harvard Mark 1 ran its first program in 1944, but didn’t have branches as we understand them until 1946:
https://en.wikipedia.org/wiki/Harvard_Mark_I
Apparently, you could achieve loops by taping the input program into a physical loop, even in 1944.
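The tape-loop trick amounts to this: with no branch instructions, splicing the punched tape into a physical ring makes the instruction stream simply repeat, i.e. an unconditional loop over the whole program. A toy model (the three-instruction "program" is hypothetical, not real Mark I code):

```python
from itertools import cycle, islice

# One trip around the physical tape loop: a fixed instruction sequence.
program = ["LOAD 1", "ADD 2", "STORE"]
acc, store = 0, []

# cycle() models the spliced tape; run three full revolutions (9 instructions).
for instr in islice(cycle(program), 9):
    op, *arg = instr.split()
    if op == "LOAD":
        acc = int(arg[0])
    elif op == "ADD":
        acc += int(arg[0])
    elif op == "STORE":
        store.append(acc)

print(store)  # [3, 3, 3] -- each revolution re-executes identically
```

Note what the model makes obvious: without conditional branches the loop can never terminate on a data-dependent condition, which is exactly what the 1946 branching hardware added.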
https://en.wikipedia.org/wiki/Colossus_computer
That is, they were not trying to follow the notion of a universal computing device that had already been defined by Turing and Church at the time. They were just trying to build something like a huge programmable calculator, but they ended up building a universal computation device anyway.
https://en.wikipedia.org/wiki/Colossus_computer
> Mauchly's teaching career truly began in 1933 at Ursinus College where he was appointed head of the physics department, where he was, in fact, the only staff member.
It was a balloon?
next 20 years ;)