How did past mathematicians feel about giant computations? Did those who saw the advent of computers get jealous?
A bit of a soft question.
Did past mathematicians ever reveal how they felt while carrying out a massive computation by hand? Did they hate it? Did they wish they didn't have to do it? Might some of them even have loved it?
Did any of them see computers rise and rise, and regret spending time on tedious computations in their early years?
Before electronic computers, mathematicians could still sometimes use computers, but human ones. A notable example is Cayley's article "On Tschirnhausen's transformation", on the Bring–Jerrard reduction of the quintic, where the bulk of the calculation was delegated to a hired human computer.
In many other articles in classical invariant theory, Cayley carried out some rather frightening computations himself; but this particular one must have been too hard even for him.
To get an idea of the kind of computations by hand that classical invariant theorists were able to pull off, see George Salmon's Lessons Introductory to the Modern Higher Algebra, as well as Faà di Bruno's book on binary forms, which contains the full explicit expansion of the degree-18 invariant of binary quintics. For comparison, see this answer to Explicit formulas for invariants of binary quintic forms.
Gauss once wrote in a letter that using a certain table of primes to count their frequencies during brief periods of free time gave him "much pleasure". See the first page of
https://www.ams.org/journals/bull/2006-43-01/S0273-0979-05-01096-7/S0273-0979-05-01096-7.pdf
and note his comment that he did "not have the patience for a continuous count".
There is the famous quote by Babbage in the 1800s:
I wish to God these calculations had been executed by steam.
See https://en.m.wikiquote.org/wiki/Charles_Babbage.
Emmy Noether's 1907 Ph.D. thesis was full of hundreds of calculations, and she later called it "crap". I think that refers more to her unhappiness with the lack of conceptual clarity in all those calculations, but I presume she also resented spending so much time on computations that would now be far shorter with computer algebra systems.
The book When Computers Were Human by David Alan Grier contains much information that is relevant to the historical question.
The birth of the electronic computer was intertwined with the military needs of World War II. To understand how people felt about computing, one should first recognize that many of the people who were employed as (human) computers had lower social status. Grier writes:
It was really the job of the dispossessed, the opportunity granted to those who lacked the financial or societal standing to pursue a scientific career. Women probably constituted the largest number of computers, but they were joined by African Americans, Jews, the Irish, the handicapped, and the merely poor. The Mathematical Tables Project employed several polio victims as computers, while the Langley research center kept an office of twelve African American computers carefully segregated from the rest of the staff.
Such people may have welcomed a computing job as a good source of income in a world of limited options, and those with a patriotic spirit may have been happy to be able to contribute concretely to the war effort.
As for the threat of unemployment, that was probably occasioned more by the end of the war (and the resulting drop in demand for massive computation) than by the replacement of humans by machines, which were still expensive and limited at the time.
Anyway, I think it's important to recognize that when it comes to the history of "giant computations," the image of an ivory-tower mathematician doing calculations in pursuit of rarefied truth is a narrow one. Numerically at least, there were far more people for whom computation was simply paid work. If computation is just your day job, that will color the way you think about it, as well as the way you think about automation of your job.
In the context of mathematical physics, the use of computers has long been frowned upon. While not a mathematician himself, P.W. Anderson in his Nobel lecture referred to the "indignity of numerical simulations":
Localization was a different matter: very few believed it at the time, and even fewer saw its importance; among those who failed to fully understand it at first was certainly its author. It has yet to receive adequate mathematical treatment, and one has to resort to the indignity of numerical simulations to settle even the simplest questions about it.
Leibniz famously designed and built a working mechanical calculator. The quote below comes from what one might call an advertisement he wrote for his machine, so it may not reflect his true feelings about long computations.
[It is] unworthy to waste the time of outstanding people with menial arithmetic work, because even the most simple-minded person can write down the results reliably when using a machine.
The quote can be found all over the internet; the version I copied is from the webpage of the city of Hannover. The best reference I found is a translation of his "Machina Arithmetica in qua non additio tantum et subtractio sed et multiplicatio nullo, divisio vero paene nullo animi labore peragantur" (1685), translated by Mark Kormes in David Eugene Smith's A Source Book in Mathematics (1929), p. 181.
Dunno whether "the early me" counts as a "past mathematician", but I can confirm that, even after various successful computations of theta-correspondence and other things (about mapping Casimir eigenfunctions to Casimir eigenfunctions), involving a roller-coaster of "clever" integral computations... soon after I "succeeded", I realized that the whole thing was simply not persuasive. That is, I was glad to have "succeeded", and I knew for other reasons (!?!) that the conclusion was correct, but the direct (kinda-fun/wacky) computation was not persuasive.
That is, in my current opinion, a computation that cannot really be human verified (e.g., by me!?!) is less interesting than one that can.
E.g., many computer systems can compute high powers of $2$, within seconds, which is kinda fun, but I can't personally verify the outcomes.
... nor could I easily verify my own computation of such things. :)
Oh, maybe it's about verifiability. My first encounter with this mathematical possibility was with "certifiable primes"... for which a description of an (eminently feasible, easily reproducible) computation is given that will indeed prove that a certain large number is prime. As opposed to the non-certifiable claim that some large number is prime because one has attempted trial division up to its square root... No way to "prove" that.
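To make the idea of a primality certificate concrete, here is a minimal sketch of one standard flavor, the Pratt certificate (my choice of illustration, not something the answer above names): each prime $p$ comes with a witness $a$ whose multiplicative order modulo $p$ is exactly $p-1$, together with the prime factors of $p-1$, each of which is certified recursively. The numbers and the `cert` dictionary below are illustrative.

```python
def verify_pratt(n, cert):
    """Check a Pratt-style certificate that n is prime.

    cert maps each odd prime p in the certificate to a pair
    (a, factors), where `a` is a witness and `factors` lists
    the prime factors of p - 1 (each certified recursively).
    """
    if n == 2:
        return True  # base case of the recursion
    a, factors = cert[n]
    # The listed factors must account for all of n - 1.
    m = n - 1
    for q in set(factors):
        while m % q == 0:
            m //= q
    if m != 1:
        return False
    # Lucas test: a must have order exactly n - 1 modulo n.
    if pow(a, n - 1, n) != 1:
        return False
    for q in set(factors):
        if pow(a, (n - 1) // q, n) == 1:
            return False
    # Each prime factor of n - 1 must itself be certified.
    return all(verify_pratt(q, cert) for q in set(factors))

# Example: certify that 23 is prime.
# 22 = 2 * 11, and 5 has order 22 modulo 23;
# the factors 11 and 5 are certified in turn.
cert = {
    23: (5, [2, 11]),
    11: (2, [2, 5]),
    5: (2, [2]),
}
print(verify_pratt(23, cert))  # True
```

The point is exactly the one made above: the certificate is short, and checking it needs only a handful of modular exponentiations, whereas reproducing a trial-division run up to $\sqrt{n}$ offers a verifier nothing shorter than redoing the whole computation.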