Well, back in '42 ... computers were just emerging, so to speak. They had things like the ENIAC down at University of Pennsylvania. ... Now they were slow, they were very cumbersome and huge and all, there were computers that would fill a couple rooms this size and they would have about the ability of one of the little calculators that you can buy now for $10. But nevertheless we could see the potential of this, the thing that happened here if things ever got cheaper and we could ever make the up-time better, sort of keep the machines working for more than ten minutes, things like that. It was really very exciting.


We had dreams, Turing and I used to talk about the possibility of simulating entirely the human brain, could we really get a computer which would be the equivalent of the human brain or even a lot better? And it seemed easier then than it does now maybe. We both thought that this should be possible in not very long, in ten or 15 years. Such was not the case, it hasn't been done in thirty years.


Shannon, 1977; as cited in Soni, Jimmy, and Rob Goodman. A Mind at Play: How Claude Shannon Invented the Information Age. Simon & Schuster, 2017, p. 106.

Here is the page in Google Books.

Also, since you are here, check out this Twitter thread by @dabacon. The cited article is infuriating; for example, look at this:

Indeed, all of the assumptions that theorists make about the preparation of qubits into a given state, the operation of the quantum gates, the reliability of the measurements, and so forth, cannot be fulfilled exactly. They can only be approached with some limited precision. So, the real question is: What precision is required? With what exactitude must, say, the square root of 2 (an irrational number that enters into many of the relevant quantum operations) be experimentally realized? Should it be approximated as 1.41 or as 1.41421356237? Or is even more precision needed? Amazingly, not only are there no clear answers to these crucial questions, but they were never even discussed!
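For what it's worth, the kind of question the author claims was "never even discussed" can be checked numerically in a few lines. Here is my own rough sketch (not from the article or the thread): it builds a Hadamard gate from truncated values of sqrt(2) and measures how far the result is from the exact gate.

```python
# Rough illustration (my own, not from the cited article): how much does a
# truncated value of sqrt(2) perturb a Hadamard gate and its output state?
import numpy as np

def hadamard(sqrt2):
    """Hadamard-like gate built with a given approximation of sqrt(2)."""
    return np.array([[1.0, 1.0], [1.0, -1.0]]) / sqrt2

H_exact = hadamard(np.sqrt(2))
ket0 = np.array([1.0, 0.0])  # the |0> state

for approx in (1.41, 1.4142, 1.41421356237):
    H_approx = hadamard(approx)
    # Spectral-norm distance between the exact and approximate gates.
    gate_err = np.linalg.norm(H_exact - H_approx, ord=2)
    # Distance between the output states when acting on |0>.
    state_err = np.linalg.norm((H_exact - H_approx) @ ket0)
    print(f"sqrt(2) ~ {approx:<14} gate error {gate_err:.1e}, state error {state_err:.1e}")
```

Unsurprisingly, the gate error simply tracks the error in the approximation, and per-gate error of exactly this kind is what the fault-tolerance threshold theorem addresses: keep it below the threshold and the imprecision can be corrected.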