Cheap handheld camera. We don't remember how bad "consumer-grade" technology was before the age of Apple ;) Since Apple improved the quality of everything by being popular and cool, now pretty much everything is good.
While I really appreciate the work that went into these kinds of machines, I think I'm going to stick with my 8-core 4GHz system. It's a little bit faster... But only just a little bit. This thing doesn't even have a GPU. :-\ How am I supposed to do important things on it, like play Fallout 4 and watch porn?
It can't be on 100 videos! I never got any sort of reply, so I kept asking around. So I'll ask again: have you ever seen such an air computer? Even the computer museums don't seem to have any info on this tech. Patent 3190554. I'd also like to know if NASA ever thought about using this in the manned missions.
+ufoengines Pneumatic logic was commonplace in industrial applications though hardly as complex as a computer. Analogue pneumatic computers were also commonly used for control systems. I doubt that a pneumatic digital computer would have gained much interest for speed reasons.
I was thinking that as a 3D printing project for some tech high school, designing/building a Turing-complete digital computer running on air would be a gas! Might get a little more speed if you ran it with helium. Also read somewhere that live crabs have been run through logic gates to demonstrate computation. However, you might pull the same trick using brush bots. Kids would have to dig this: first you make a whole bunch of brush bots, then run them through a maze of logic gates to compute something. That would make a YouTube video that I'd like for sure!
+ufoengines I don't know about that particular patent, but pneumatic computers were in use for years in industrial process control. They are slow and limited, but they work.
Thanks for posting this. You could argue endlessly about who was first or who got inspiration from whom. The ABC is a toy compared to ENIAC, and while it's amazing in its own right, there is no real way to compare the two.
@@david203 Hmm, to me general purpose just means you can program it generally, aka Turing complete. A computer can easily be Turing complete while the program is input "on the fly" from some sort of offline storage, like cards, tape, whatever. People tend to use the term "stored program computer" when the program is stored in the computer's memory.
@@michaelbauers8800 Yes. See the other comments for much the same information. Back in the beginnings of computer history there were many devices that were intended to solve different domains of problems. The range of programming and the ease of reprogramming increased over time. In this case, only systems of linear equations were solved, but the input was by punched cards instead of the more usual wiring boards. Special/General purpose are relative terms, and can be defined externally in terms of the range of problems solved and internally in terms of the functionality used to program and compute the results. There is no one "first computer", merely a large number of precursors to the modern Von Neumann architecture. See en.wikipedia.org/wiki/Von_Neumann_architecture .
Looks to me more like a semi-digital calculator. What I want to know is: what was the true first computer? It needs to have a processor with an instruction set and a program counter.
Z1 built by Konrad Zuse in 1935/6 (there's a replica but the original was destroyed by british bombing in ww2) en.wikipedia.org/wiki/Z1_(computer) www.computinghistory.org.uk/det/6170/Zuse-Z1-built-by-Konrad-Zuse/
"First true computer" is subjective though. This is why people debate these points all the time. If they were precise, there's less debate perhaps. For example, people could ask questions like "What was the first electronic, digital, stored program computer?" That's probably easier to answer. Of course you have to educate yourself on the variables first, but by the time you start learning about all the early computers, and how they varied, you probably know enough, to not be so worried about "firsts". Progress is iterative.
Atanasoff actually developed this computer back in the 1930s to help speed up the solving of complex equations. My guess is that it would still take you much longer today if you were to do it by hand.
They made a machine that could solve sets of simultaneous equations that nobody would even attempt to do by hand, that was the point. Dr. Atanasoff needed to solve sets that were too complex for humans to attempt. It didn't bother him that it took a day to do each one, because they couldn't be done at all before.
Yeah, America is great, BUT John Atanasoff was a Bulgarian immigrant, so basically he used USA financing (because Bulgaria was poor) and invented it. That means he is BULGARIAN, NOT AMERICAN, and BULGARIANS INVENTED the first computer. Greetings from Bulgaria.
This comment (from 12 years ago) is 100% false. Atanasoff's father immigrated from Bulgaria, not Atanasoff himself. As another poster pointed out, John was born in New York state; he actually grew up in Florida. (That's where he learned about electronics.) Therefore, John was 100% AMERICAN, of Bulgarian extraction, and his computer was a 100% AMERICAN invention. Notwithstanding, Atanasoff was always proud of being of Bulgarian extraction.
+edgeeffect Of course not in the modern sense. But it is a programmable *compute*-r. A calculator would just do simple arithmetic, so a human would still have to sequence the operations to do any useful work, but this could be programmed to solve actual equations. This was one of a kind in its time.
edgeeffect Right. Like I said, it wasn't programmable in the modern sense of being Turing-complete; it would solve equations. It was different from a calculator in that it sequenced separate calculations; it didn't just do simple arithmetic. It *does* compute equations, though. I guess commonly accepted terminology doesn't make this point clear enough.
+NuggetOfBlueGold "A computer is a device that computes" is an over-simplistic definition based on the derivation of its name and isn't particularly workable. At college, I remember, I was taught a quick and easy definition of what a computer really is, side-stepping potentially complex concepts such as Turing completeness: "A device for processing raw data into meaningful information under self-modifiable program control". When we're discussing computer history, we tend to waive a few of these requirements. We say it doesn't strictly have to be self-modifying so that early stored-program computers like the Manchester Mk 1 are "allowed" to be computers. We say the program doesn't necessarily have to be stored in the memory so that the likes of ENIAC are "allowed in". The point at which just too many rules have been waived is what makes the Analytical Engine a computer but the Difference Engine "not a computer", and that is the requirement that it is PROGRAMMABLE. And this is the nub of Turing completeness... computers are general purpose machines that can perform different functions based on their software, and the Atanasoff-Berry could only solve simultaneous equations and nothing else. Which makes it very nearly a computer, but not quite. A calculator is a special-purpose computing device. But to be a computer it needs to be a GENERAL purpose computing device, and part of that remit would, sadly, include the playing of games.
This was a 'special purpose computer'! It solved simultaneous linear equations. It did not have a stored program. Its programmability was limited. It was not fully automated. Its ROM sequencer was mechanical. Conclusion: it was not the first modern all-electronic, general purpose, fully and freely programmable digital computer.
The only reason this computer is considered "first" is because some lawyers needed to come up with a way to break the ENIAC patent. It has more in common with a Babbage engine in the sense that a modern replica is what actually worked!
The genius and engineering behind this machine is amazing.
I solved it in my head dummy.
Thanks, Kanye! Very cool.
He is from my country ;D
Пилешка Супа - Atanasoff was American. His father was from Bulgaria.
@@GH-oi2jf Atanasoff was a prominent American/Bulgarian inventor who took pride in his Bulgarian heritage and maintained strong ties to his ancestral home of Bulgaria. From a letter he wrote, "To My Fatherland": "I have always felt that the Bulgarian heritage in my blood has kept my spirit. And now, as I am growing old, I am even happier for my good fortune. My people have met me warmly and have given me a high prize, the Cyril and Methodius Order First Class, my first public recognition. I was elected a member of the Bulgarian Academy of Sciences and I am in touch with many friends in Bulgaria."
This blows me away! This is utterly fantastic! What a love for truth in history!
Great work!!!!!!
I absolutely love this! This is an awesome explanation of how computers got where they are today.
Though not quite: for a decade or two after this, computers took a big silly detour, and only after that did we return to and recognize Atanasoff's contribution, which was WAY ahead of its time.
It looks like a drum scanner; I saw one at my Graphic Lyceum in Eindhoven, the Netherlands. Thanks for showing, and kind regards.
I live about 30 miles east of the university, and I happen to know one of the engineers who was on the team that built the replica. I was at the unveiling and saw it work. It is, of course, "primitive" by today's standards, but try to imagine one man, in the course of one long evening, coming up with the 7-tube adder/subtractor module, the input method, and the memory drum (which is, BTW, not DRAM but DSAM, since it is sequentially accessed as the drum rotates). This was an AMAZING piece of work.
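The adder/subtractor mentioned above worked on one bit per drum position, with a carry passed along as the drum rotated. Here is a rough sketch of that bit-serial idea (my own illustration in Python, not the actual tube circuit; word length and all electrical details are simplified):

```python
# Illustrative sketch of bit-serial addition: one bit per "drum position",
# with a carry flowing into the next position, least significant bit first.

def bit_serial_add(a_bits, b_bits):
    """Add two numbers given as lists of bits, least significant first."""
    carry = 0
    out = []
    for a, b in zip(a_bits, b_bits):
        s = a ^ b ^ carry                    # sum bit for this position
        carry = (a & b) | (carry & (a ^ b))  # carry into the next position
        out.append(s)
    out.append(carry)  # final carry-out
    return out

# 6 + 3 = 9; LSB-first, 6 = [0,1,1] and 3 = [1,1,0]
print(bit_serial_add([0, 1, 1], [1, 1, 0]))  # [1, 0, 0, 1] -> 9
```

The point of the sketch is that only one full-adder's worth of logic is needed, reused for every bit as the drum turns, which is why seven tubes could do the whole job.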
I've huge respect for these pioneers... building such machines is a harmony of science and art (it looks cool, don't you think?).
The original machine probably didn't look so nice. The designers of this machine were motivated by the need to solve real-world problems. They did not aim to make use of either science or art, just practical technologies.
Every major technological product should be either stored or replicated like this.
Amazing to see the ABC in action. And the effort that you guys put in to recreating this! Wow! Much less Atanasoff's building it in the first place!!
Just think about creating this machine in 1939, 73 years ago... it's mindblowing 🤯 to see how they must have come to the idea and the engineering to do this in 1939. Hats off to the inventors of the ABC.
Wow, a binary conversion via table using a huge metal drum. Very creative!
What an amazing machine! Although I got totally lost after the first few minutes, there's certainly a lot going on in there.
Literally 1 and 0, or on and off states. This gives a good analogy for me to understand how my "modern" computer works. What is old is new again. A lot of the tech reminds me of old-style player piano rolls and such.
The dynamic RAM in your modern computer also uses capacitors to store data, but they need to be refreshed every fifty milliseconds or more often, and the difference between a one and a zero might be a dozen or so electrons.
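The comment above describes regenerative (refreshed) capacitor storage. A toy model of the idea (invented numbers for illustration only, not real ABC or DRAM timing): a stored "1" is a charge that leaks away each tick, and a periodic refresh reads the bit and rewrites it at full strength before it decays past the sense threshold.

```python
# Toy model of refreshed capacitor memory. THRESHOLD and DECAY are
# made-up illustrative values, not real device parameters.

THRESHOLD = 0.5   # below this, a stored "1" reads back as "0"
DECAY = 0.9       # fraction of charge remaining after each tick

def read_bit(charge):
    return 1 if charge >= THRESHOLD else 0

charge = 1.0
for tick in range(12):
    if tick % 5 == 0:  # periodic refresh: read the bit, rewrite full charge
        charge = 1.0 if read_bit(charge) else 0.0
    charge *= DECAY    # leakage between refreshes

print(read_bit(charge))  # 1: the refresh kept the bit alive
```

Without the refresh step, the charge after 12 ticks would be 0.9 ** 12 ≈ 0.28, below the threshold, and the bit would be lost; that is the trade both the ABC's drum and modern DRAM make for cheap one-capacitor storage.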
And I am here watching this on a foldable smartphone. Technology has come a long way, huh...
As Spock would say: fascinating !
The ingenuity of mankind will never cease to amaze me.
Now I understand the point of binary and Boolean... it's almost like the drum/wheel was created first, before the concept of binary. I wish my teachers would teach it like this. Makes me feel like I'm part of the solution.
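For anyone making the same connection: each binary digit is one on/off state, and Boolean algebra is just the arithmetic of those states. A quick sketch in Python:

```python
# Binary is a positional encoding of on/off states.
n = 13
print(bin(n))   # '0b1101': each digit is one on/off state

# Boolean operations act on whole bit patterns at once:
print(13 & 6)   # AND: 0b1101 & 0b0110 = 0b0100 = 4
print(13 | 6)   # OR:  0b1101 | 0b0110 = 0b1111 = 15
print(13 ^ 6)   # XOR: 0b1101 ^ 0b0110 = 0b1011 = 11
```

The ABC's tube logic computed exactly these kinds of AND/OR operations on the bits streaming off the drum.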
Wow. That's a thing of beauty. Amazing that it actually works!
It is pointless to argue about who invented the computer, particularly by arguing what constitutes a computer. Atanasoff himself said "I have always taken the position that there is enough credit for everyone in the invention and development of the electronic computer.”
Just remember people for what they contributed and when.
He was modest to a fault. People who know his contribution rightly credit him and Berry.
Not to bust your B's here, but "primitive" is a term that should never be used when looking back in time at machines. I can't imagine anything more simplified than this. Write a simple program, or draw out a single-digit calculator on paper. The sheer genius involved is evident. Once you have transistors, and realize they can be used as switches, things become easy. But without efforts like this, when and why do transistors even get invented? Vacuum tubes work great for amplification.
A marvelous machine. I especially like the way none of the controls are marked. You just have to know what they do.
Steam locomotives are the same way. All you see is a couple of dozen red handles. Turn the wrong one, and everybody dies. Less at stake with the computer.
Not the same way with modern computers. Someone clicks on the wrong email, then an entire banking system goes down.
So the re-creators of the replica computer were able to get it working, unlike the original, which was never fully operational. Very cool.
Would love to see this unit done in modern electronics instead of tubes. This must have been fantastic to see in its day.
But it's trivial with modern digital electronics, IMO.
Now I'll switch on the memory drum, so you can't hear me talk anymore.
Absolutely amazing.
Wow. You don't truly appreciate your personal computers today until you see this video.
And now we have very powerful computers in our pockets that also operate as phones, connect to the internet, tell us where on the earth we are, and take photos :)
@mattitheowl Colossus wasn't in prototype form until 1943; the ABC was working in 1942.
The ABC's first prototype was in 1939, and it was the basis for ENIAC, which lost a patent suit to Atanasoff and Berry.
If the drum is rotated too fast, the bits will reach the escape velocity and be flung into space.
Wow! I would love to revamp this design using transistors and LEDS to replace the vacuum tubes. The LEDS will show you the logic state of each transistor.
The rotating drum looks like the biggest challenge. I think magnetic coils and small magnets might do the same job.
The mechanical engineering that went into making these electronic wonders is just as impressive to me, as a modern electrical engineer. It's fascinating to see what one could do with great ingenuity!
It was NOT invented by "extraterrestrial aliens", like so many Hollywood movies suggest 😞🙄
Impressive. Computers are recent, and the technology advanced with giant, accelerated steps.
If, say, one hundred of these computers were built and connected together, could a distributed computing network have been formed and been useful in some way?
I can see how it is better than pencil and paper when working on larger, more complex math problems. Thank you for the demonstration :) QC
+Quaalude Charlie As explained, it makes many equations in many unknowns solvable. Without such a calculator, and without a computer, it can take weeks to solve such large algebraic systems. This was explained in the video.
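For a sense of what the machine mechanized: solving simultaneous linear equations means eliminating one unknown at a time, which is tedious and error-prone by hand at any real size. A small Gaussian-elimination sketch (plain Python, my own illustration of the general method, not the ABC's exact procedure):

```python
# Solve A x = b by Gaussian elimination with partial pivoting and
# back-substitution. Pure Python, for illustration only.

def solve(A, b):
    n = len(A)
    # Augmented matrix [A | b]
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    # Forward elimination: zero out everything below each pivot
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    # Back substitution: recover unknowns from the triangular system
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# 2x + y = 5, x + 3y = 10  ->  x = 1, y = 3
print(solve([[2, 1], [1, 3]], [5, 10]))  # [1.0, 3.0]
```

Each elimination step here corresponds to one pass the ABC performed on a pair of equations held on its drums; at the 29-unknown scale the machine targeted, the arithmetic count is what makes hand solution take weeks.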
A great man
They forgot to show the dwarf who solves the equations and lives inside the cabinet.
Yes, but that was a relay computer. The ABC is all-electronic, even when you consider that there is a motor to rotate the memory drums.
Man... if I was in charge of this, I would have stopped and been like, "We can calculate faster by hand... what the heck is the purpose of this!?" It amazes me how some people just refuse to do things the easy way in order to achieve a vision that no one else has.
Can it run Doom?
Paegr arcade pink rock n rolls his stor reads psyco craft maze esc enc bypass
RUN Doom ? . . . It is Doom !
Good question.
Or how about Wordstar?
3000 bits... damn, pretty large storage space this computer has
No one man or woman is entirely, or even primarily, responsible for the invention of the modern digital computer.
agree
But if ANYONE is, it's probably John Atanasoff, considering the Eckert and Mauchly one is widely considered 'the first' by many, yet we know it was a bit of a ripoff of Atanasoff's.
antigen4 - But it was Atanasoff who said there was enough credit for everybody. He did not claim priority except, in court, to describe what he did and when.
I wish the audio quality was better.
It's sad to see that almost nobody mentions that John Atanasoff's father was Bulgarian.
very cool.
Well *ACTUALLY* the Zuse Z3 was first (being digital, but electromechanical), even able to process floating point numbers, able to be programmed, and Turing complete. It was built with electromagnetic telephone relays in May 1941.
And you are wrong; this was invented in the '30s. It just means the first computer wasn't Turing complete or programmable.
@@joshcorley9220 There are multiple sources; some say it was built earlier but didn't work flawlessly, others say it was built in the same year. The latest and most certain date is probably 1942, when it was *successfully* tested.
@@heynic37 Their prototype was finished and demonstrated in 1939, so who cares; that would be the first one. And no, it was finished in the first weeks of January 1941 and demonstrated in 1942. I'm pretty sure the Z was done later in 1941, and it wasn't demonstrated until a couple of years later.
In the last days knowledge will be abundant
PS
let me know if it is up for sale
Information is not knowledge, knowledge is not wisdom
The Zuse Z3 was completed in 1941 and was a fully programmable digital computer, though it was electro-mechanical. But even the ABC has some significant mechanical elements. The ENIAC and the Colossus were quite early, too. A statement like that at the beginning is difficult, especially when you think of the several development prototypes that were built around that time. But never forget the Zuses. Just sayin'.
Oh get over it, this one was done in the 30s before any of the others.
I try to remember as many of them as I can, because they probably all made contributions. Babbage. The British wartime stuff like Colossus. Zuse machines. ABC. ENIAC. Even the UNIVAC I, the first commercial electronic digital computer. People have this desire for "firsts", but progress is usually more iterative, with the limitations of each iteration driving the next. In any case, these fledgling computers are all interesting to me.
Computer operators had to be hands-on and attentive with all those mechanical parts. Imagine how badly you could mess up a computation if you put a card in backwards or if it wasn't seated properly.
This computer was a one-off, built as proof of concept. It was never a production machine. Computers very quickly became more practical operationally. Hollerith cards have one corner cut, so a card in a deck cannot be oriented incorrectly without it being obvious.
The Apollo space program had a computer whose program was thin wires passing through metal cores, basically a metal core memory, but ROM not RAM. They hired women skilled in needlework I believe, to "weave" the program. Lots of room for error with that process!
so cool!
Fascinating. It's amazing what was done with zinc-plated vacuum tubes.
Wow, Bulgaria is such a small country, but we made the thing that the whole world uses. I am so proud.
John Atanasoff was born in the United States and built this machine in the United States. His father was from Bulgaria.
Well, as a Bulgarian, I can say that he wanted the computer to be known as coming from Bulgaria, but of course his wish did not happen. @@GH-oi2jf
Brilliant stuff! But whoever did the end credits for this video was pretty dyslexic... ;^)
Spark-gap printer, haha. A good way to get a $10,000 NAL from the FCC if you don't properly shield that thing before running it.
amazing
The bit drums are like player piano rolls or a music box. Same concept.
very much agree
Wow! That is a primitive computer, but amazing nevertheless. It seems to be very complicated for what it does. Was there no attempt to simplify the operation of the machine, or is this the end result of some simplifications?
Could we have gotten to this without the player piano and automatic musical instruments?
echodelta9 Nope, unless you had access to knowledge 30 years in the future. en.wikipedia.org/wiki/Leibniz_wheel
echodelta9 - yes.
From the moment they turn the drums on, the narration is overwhelmed by the noise. :(
It was not the first computer to have parallel circuits, since the 1941 Z3 had a parallel adder.
The early Zuse machines were relay devices, not electronic. The ABC was the first electronic computing device.
I know you weren't. It's hard to get everything you want in with a 500-letter limit. But that machine is stripped down to the very basics. With the limitations of mechanical components (as opposed to electronic ones), and the size of their fingers and whatnot, it's remarkably small. And there's always a time factor; because mathematics is everything for an engineer, you could spend your whole life refining and calculating. Check out Charles Babbage and his Difference Engine #2 here on YT.
Stop comparing.
The computer was invented in the U.S.A..!!!!!
No, it was in Britain, the Colossus code breaking computer used in WW2
David Ascroft Dude, there is the real inventions and inventors and always a British versions of them !!! lol ..Computer was not invented in Britain period.. Colossus was simply a different version of computer which was not invented in Britain..
Try to explain to a German WHO invented the computer ..lol
www.german-way.com/notable-people/featured-bios/konrad-zuse/
The vast majority of the so-called British inventions (jet engine, radar, TV, electric motor, magnetron, etc.) as presented by British sources to a gullible British audience are figments of British imagination!!
You go on maintaining the same old fiction, you make yourself the laughing stock of the world.
Today, there wouldn't be modern computers without giant contributions made by the US..
Transistor was invented in the US..
www.physlink.com/education/askexperts/ae414.cfm
Three Americans received Nobel Prize for it .
And also the SOFTWARE was invented in America during the Apollo moon landing project.
www.biography.com/people/jack-kilby-40499
The most important invention came from this man !!!
There is no Brit here !!!
David Ascroft this was built before colossus.
And the computer chip was invented in the USA.
This is as simple as a computer can get using 1930s technology.
I would absolutely use it for mining purposes
huh? how does the machine know how many digits there are to each number?
okay so the finished card has 3 fields, one for each number, but how did the operator decide in which field the pressed digits would go? i only see him pressing the numbers in an immediate sequence
Gigaguenther I imagine it's modeled after the IBM tabulating machines of the era. Those determined which digits go where by use of a plug board. Each column is wired so its value is assigned to different places. There are plenty of explanations of those machines on the net.
Hi Matt. I was not trying to be derogatory using the word primitive and I did say it is an amazing device for its time. I suppose my comment was a reaction to what was back then compared to what is now. However, I still think that the operation of this machine seems overly complex and wonder if it could not have been simplified. Maybe the version was much simpler?
The Zuse Z1 was made about 1936. Is that before the ABC?
If the ABC had been mass produced during WW2 for distributed computing, how do you think that would have helped with the war effort?
With a bit of over clocking i think they could get it to run Crysis 2
@HerrXRDS It can only handle three pixels before you start to get major lag.
The digital computer was invented by John Vincent Atanasoff..... Is this information correct????
Yes, well, the first digital computer
Neat.
but will it blend?
This can't have been recorded in '99. It looks like the 80s?
cheap handheld camera. we don't remember how bad "consumer-grade" technology was before the age of Apple ;) Since Apple improved the quality of everything by being popular and cool, now pretty much everything is good
nice troll. Apple didn't improve anything at all. Apple was just good at marketing
While I really appreciate the work that went into these kinds of machines, I think I'm going to stick with my 8-core 4GHz system. It's a little bit faster... But only just a little bit. This thing doesn't even have a GPU. :-\
How am I supposed to do important things on it, like play Fallout 4 and watch porn?
To start with, it obviously was not constructed to deal with either Fallout 4 or porn...
I was making a sarcastic joke. I know it wasn't made for that.
+Spacekriek. In other words, you are saying its useless! :D
Ran across this old patent 3190554 on a digital computer that ran on air. Wonder if anything was done with this idea.
do you have to spam this comment on 100 videos?
It can't be on 100 videos! I never got any sort of reply so I kept asking around. So I'll ask again, have you ever seen such an air computer? Even the computer museums don't seem to have any info on this tech. Patent 3190554. I'd also like to know if N.A.S.A. ever thought about using this in the manned missions.
+ufoengines Pneumatic logic was commonplace in industrial applications though hardly as complex as a computer. Analogue pneumatic computers were also commonly used for control systems.
I doubt that a pneumatic digital computer would have gained much interest for speed reasons.
I was thinking that as a 3D printing project for some tech high school that designing/building a Turing Complete digital computing running on air would be a gas! Might get a little more speed if you ran it with helium. Also read somewhere that live crabs have been run though logic gates to demonstrate computation. However you might pull the same trick using brush bots. Kids would have to dig this. First you make up a whole bunch of brush bots and then run through a maze/logic gates to compute something. That would make a You Tube that I'd like for sure!
+ufoengines I don't know about that particular patent, but pneumatic computers were in use for years in industrial process control. They are slow and limited, but they work.
10:06 PRIVATLY ? REASCERCH ? May we please have these words in English?
+MrCuddlyable3 You missed INSITUTE.
sums up Iowa state quality
Thanks for posting this. You could argue endlessly about who was first or who got inspiration from whom. The ABC is a toy compared to ENIAC, and while it's amazing in its own right, there is no real way to compare the two.
+Mike Young I agree with the comparison. ABC is not a true computer, since there is no stored program, no conditionals, and only trivial loops.
+David Spector The British Colossus was no computer!
So I finally get to see it work... I don't see it as general purpose, though.
+scowell Can't be general purpose without being able to hold a program.
@@david203 Hmm, to me general purpose just means you can program it generally, aka Turing complete. A computer can easily be Turing complete, but the program is input "on the fly" from some sort of offline storage, like cards, tape, whatever. People tend to use the term "stored program computer" when the compute program is stored in the computer's memory.
@@michaelbauers8800 Yes. See the other comments for much the same information.
Back in the beginnings of computer history there were many devices that were intended to solve different domains of problems. The range of programming and the ease of reprogramming increased over time.
In this case, only systems of linear equations were solved, but the input was by punched cards instead of the more usual wiring boards.
Special/General purpose are relative terms, and can be defined externally in terms of the range of problems solved and internally in terms of the functionality used to program and compute the results.
There is no one "first computer", merely a large number of precursors to the modern Von Neumann architecture. See en.wikipedia.org/wiki/Von_Neumann_architecture .
i am late
John Atanasoff's dad was Bulgarian, so he is half Bulgarian.
Looks to me more like a semi digital calculator.
What I want to know is: what was the true first computer? It needs to have a processor with an instruction set and a program counter.
Mum Blic ugh
Z1 built by Konrad Zuse in 1935/6
(there's a replica but the original was destroyed by british bombing in ww2)
en.wikipedia.org/wiki/Z1_(computer)
www.computinghistory.org.uk/det/6170/Zuse-Z1-built-by-Konrad-Zuse/
"First true computer" is subjective though. This is why people debate these points all the time. If they were precise, there's less debate perhaps. For example, people could ask questions like "What was the first electronic, digital, stored program computer?" That's probably easier to answer. Of course you have to educate yourself on the variables first, but by the time you start learning about all the early computers, and how they varied, you probably know enough, to not be so worried about "firsts". Progress is iterative.
that's why nowadays most kids question why it is called a computer and not a "GAMEr"
I punched cards for a class in 1982. And don't forget the hanging chads in Florida 2000 election recount.
Why use a pocket calculator when you can take one of these to school............? And you will have the answer within a few hours, too ! Wow.....
cool:)
Greetings from Bulgaria! This PC was created and designed by a Bulgarian.
i'm just gonna use pen and paper
But does it blend?
These computers were semi digital.
no - fully digital. this is the computer that was 'ripped off' to build the ENIAC
What does “semi-digital” mean? What part was not digital?
Well, those gears might be controlled digitally, but the gears themselves are not digital, hence the term semi-digital.
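For what it's worth, the arithmetic itself really was fully digital: the ABC's add/subtract modules combined one bit pair plus a carry per drum position, least significant bit first. A minimal Python sketch of that bit-serial scheme (the function names and the 8-bit word width here are mine for illustration, not the ABC's actual 50-bit words):

```python
def serial_add(a_bits, b_bits):
    """Bit-serial addition, LSB first, one bit pair per step,
    roughly as the ABC's add/subtract modules consumed drum positions."""
    carry = 0
    out = []
    for a, b in zip(a_bits, b_bits):
        out.append(a ^ b ^ carry)              # sum bit
        carry = (a & b) | (carry & (a ^ b))    # carry into next position
    return out

def to_bits(n, width=8):
    """Integer -> list of bits, least significant first."""
    return [(n >> i) & 1 for i in range(width)]

def from_bits(bits):
    """List of bits (LSB first) -> integer."""
    return sum(b << i for i, b in enumerate(bits))

print(from_bits(serial_add(to_bits(13), to_bits(29))))  # 42
```

The point is that at no stage is a quantity represented by a gear angle or a voltage level in between: every value is strictly 0 or 1, which is what "fully digital" means here.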
can i put linux on this
Having problems with circuits? Try circuit solver seek androidcircuitsolver on google!
so they made a machine that can do it slower than we can do by hand????
Atanasoff actually developed this computer back in the 1930's to help speed up the solving of complex equations. My guess is that it will still take you much longer today if you were to do it by hand.
They made a machine that could solve sets of simultaneous equations that nobody would even attempt to do by hand, that was the point. Dr. Atanasoff needed to solve sets that were too complex for humans to attempt. It didn't bother him that it took a day to do each one, because they couldn't be done at all before.
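To give a sense of what that work involves: solving simultaneous linear equations means eliminating one variable at a time between pairs of equations, essentially Gaussian elimination, which the ABC mechanized. A rough Python sketch of the idea (a textbook floating-point version, not the ABC's exact fixed-point procedure):

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with back-substitution.
    A is an n x n matrix (list of rows), b the right-hand side."""
    n = len(A)
    M = [row[:] + [rhs] for row, rhs in zip(A, b)]  # augmented matrix
    for col in range(n):
        # Partial pivoting: swap in the row with the largest entry.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        # Eliminate this variable from all rows below.
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    # Back-substitution, bottom row up.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

print(solve([[2.0, 1.0], [1.0, 3.0]], [5.0, 10.0]))  # [1.0, 3.0]
```

For the systems of up to 29 equations the ABC targeted, the elimination step alone is thousands of multiply-subtract operations, which is why nobody attempted them by hand.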
@alexcandy1411 Only in RL mode.
YEAH AMERICA IS GREAT BUT John Atanasoff was a Bulgarian immigrant, soooooo basically he used USA financing cus Bulgaria was poor and invented it. That means he is BULGARIAN NOT AMERICAN AND BULGARIANS INVENTED the first computer. Greetings from Bulgaria
He was a Bulgarian American, and also his coworker Berry was American, so no, the invention was not Bulgarian
bob4o99 - His father was from Bulgaria. He was born in New York state.
This comment (from 12 years ago) is 100% false. Atanasoff's father immigrated from Bulgaria - not Atanasoff himself. As another poster pointed out, John was born in New York state; he actually grew up in Florida. (That's where he learned about electronics.) Therefore, John was 100% AMERICAN - of Bulgarian extraction - & his computer was a 100% AMERICAN invention. Notwithstanding, Atanasoff was always proud of being of Bulgarian extraction.
I like potatoes
Computacion ABC
Somehow it reminds me of Atmel's 8-bit MCUs
First digital computer is colossus
Seems like a lot of work and money, that can be done by pencil and paper very quickly.
+TickyTack23 Large algebraic systems cannot be solved manually in a reasonable time. This was explained in the video.
@ryanch94 It will, most likely, blend just about anything you throw into it.
This thing can run crysis at 60fps.
"Institute for physical reascerch" *WHAT?*
Plays World of Warcraft like a dream.
computer? or just a calculator?
+edgeeffect Of course not in the modern sense. But it is a programmable *compute*-r. A calculator would just do simple arithmetic, so a human would still have to sequence the operations to do any useful work, but this could be programmed to solve actual equations. This was one of a kind in its time.
+frtard It wasn't programmable.. it could only solve simultaneous equations. plato.stanford.edu/entries/computing-history/#Atan
edgeeffect Right. Like I said, it wasn't programmable in the modern sense of being Turing-complete; it would solve equations. It was different from a calculator in that it sequenced separate calculations; it didn't just do simple arithmetic. It *does* compute equations, though. I guess commonly accepted terminology doesn't make this point clear enough.
+NuggetOfBlueGold "A computer is a device that computes" is an over-simplistic definition based on the derivation of its name and isn't particularly workable. At college, I remember, I was taught a quick and easy definition of what a computer really is, side-stepping potentially complex concepts such as Turing completeness: "A device for processing raw data into meaningful information under self-modifiable program control". When we're discussing computer history, we tend to waive a few of these requirements. We say it doesn't strictly have to be self-modifying so that early stored-program computers like the Manchester Mk 1 are "allowed" to be computers. We say the program doesn't necessarily have to be stored in the memory so that the likes of ENIAC are "allowed in". The point at which just too many rules have been waived is what makes the Analytical Engine a computer but the Difference Engine "not a computer", and that is the requirement that it is PROGRAMMABLE. And this is the nub of Turing completeness... computers are general-purpose machines that can perform different functions based on their software, and the Atanasoff-Berry could only solve simultaneous equations and nothing else. Which makes it very nearly a computer but not quite.
A calculator is a special-purpose computing device. But to be a computer it needs to be a GENERAL purpose computing device and part of that remit would, sadly, include the playing of games.
This was a 'Special Purpose Computer'! It solved simultaneous linear equations.
It did not have a stored program.
Its programmability was limited.
It was not fully automated.
Its ROM sequencer was mechanical.
Conclusion: it was not the first modern, all-electronic, general-purpose, fully and freely programmable digital computer.
it's the first digital computer. how do you expect it to be other than primitive?
zarni000 - Actually, the first electronic digital computer. Relay-based computers were earlier. And it was a special purpose machine.
The only reason this computer is considered "first" is because some lawyers needed to come up with a way to break the ENIAC patent. It has more in common with a Babbage engine in the sense that a modern replica is what actually worked!