Three Mathematicians We Lost in 2020 (newyorker.com)
220 points by _pius on Jan 19, 2021 | 37 comments



Terence Tao wrote a great obituary for Conway last year:

https://terrytao.wordpress.com/2020/04/12/john-conway/

"Gödel’s Lost Letter and P=NP" also does obituaries of mathematicians quite often, and goes more into detail on their work.

https://rjlipton.wordpress.com/2020/04/14/john-horton-conway...


Tao: "Conway was arguably an extreme point in the convex hull of all mathematicians." What a beautiful and mathy way to describe him.


I didn't know about FRACTRAN, that's pretty cool: https://en.wikipedia.org/wiki/FRACTRAN


One of my favorite facts about FRACTRAN, often missing from articles about it [0]:

> And it follows that FRACTRAN games are undecidable. This doesn’t speak directly to the Collatz conjecture, because (1/2)n | 3n + 1 is not a FRACTRAN game. But it does follow that arbitrary Collatz games are undecidable, since the set of all Collatz games includes the set of all FRACTRAN games.

[0] https://raganwald.com/2020/05/03/fractran.html#why-fractran-...
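
For anyone who wants to see what "running" a FRACTRAN program looks like, here's a tiny interpreter sketch (Python, purely illustrative). The single fraction 2/3 is already enough to implement addition: started on 2^a * 3^b, it halts on 2^(a+b).

  from fractions import Fraction

  def fractran(n, program, max_steps=10_000):
      # Repeatedly multiply n by the first fraction that yields an
      # integer; halt when no fraction applies.
      program = [Fraction(p, q) for p, q in program]
      for _ in range(max_steps):
          for f in program:
              if (n * f).denominator == 1:
                  n = int(n * f)
                  break
          else:
              return n
      raise RuntimeError("did not halt within max_steps")

  # Addition: input 2^3 * 3^4, expect 2^(3+4) = 128.
  print(fractran(2**3 * 3**4, [(2, 3)]))   # -> 128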


I remember reading one of Ron Graham's early papers on scheduling [1] when I was a grad student and thinking to myself: I'll never have the mental clarity to achieve what this guy has done. What a mind. His wife shares [2]:

``On the wall in Ron's office, he hung a poster of squares arranged in 90 lines each consisting of 52 little squares. Later on he modified it so it contains 100 lines. (He sometimes joked that his grandma lived to 99 and then hit by a truck.) The rule is to fill one square each week. Thus, one can see how many squares are left and how finite and precious life is. He only used 84 lines but every square was gloriously filled.''

[1] http://www.math.ucsd.edu/~ronspubs/72_04_two_processors.pdf [2] http://www.math.ucsd.edu/~fan/ron/kayak.html


Surprisingly to me, the passing of John D. Barrow went largely unremarked. He wrote some great books about the idea of the mathematical universe, and his lectures (on YouTube) are quite interesting.


His book Pi in the Sky was one of the two most influential math books for me in my youth; the other was Modern Mathematics by Edna Kramer.

I had not heard of his death. Thanks for posting about it.


We lost a bunch of excellent scientists, teachers, and all-around good human beings, both well known and less known. Besides Barrow, it was personally painful to learn that Malcolm J. Skove also passed away last June. He co-authored one of the best introductory physics textbooks I have read, "Physics: Classical and Modern".


Vaughan Jones, winner of the 1990 Fields Medal, was also lost to 2020.


My first thought was: “at least it’s not the Jones polynomial guy”... oh. Damn it.

I know him from “Gauge Fields, Knots and Gravity” and the expanding popularity of low-dimensional topology generally.



Boris Tsirelson


Vaughan was a fucking mensch. I doubt the world realizes how much worse it is without him.


I recommend Dyson's Maker of Patterns [0], an autobiography told primarily through his letters to his family throughout his life. It includes great stories about Feynman, Oppenheimer, and other heavy hitters of mid-century physics. The format was surprisingly narrative; it makes me wish I wrote as many longer-form personal letters to family and friends. Siobhan Roberts' biography of Conway, Genius at Play [1], also told a great story. It's low on math content, but Conway's personality and the narrator's attempts to corral it were quick fun.

Both are recommended. Does anyone have a recommendation for a popular-science-level book on Graham? I'm only familiar with him from the Numberphile video series.

[0] https://www.harvard.com/book/maker_of_patterns_an_autobiogra...

[1] short snippet from it https://www.ias.edu/ideas/2015/roberts-john-horton-conway



One of the puzzles I wrote for the EFF30 puzzlehunt was dedicated to the memory of Conway and Guy (for reasons that should be pretty apparent if you solve it).

https://eff30.cat/ctf/largest/

All three of the Winning Ways authors died within the past two years. :-(


I'm still sad about Conway. He was on my list of people that I was hoping to meet some day.


Same here. RIP Conway.


Richard K. Guy also died in 2020. He coauthored Winning Ways on combinatorial game theory and had some notable contributions to number theory.


Conway is often remembered for his Game of Life, but IMHO that is trivia in comparison with his surreal number system [1]. I predict that while surreal numbers today are an under-appreciated toy, in coming centuries they will gradually become more and more important, eventually surpassing even what we currently call real numbers.

[1] https://en.wikipedia.org/wiki/Surreal_number


This seems a strange prediction to me. Can you elaborate on why you think this? Do you think the reals will be replaced by surreals in common usage by laypeople, by working mathematicians or in mathematical pedagogy? While I certainly think surreal numbers are really nifty for tying together different number systems into one rigorous foundation, your claim sounds like someone saying that the algebraic numbers would surpass the rational numbers.


I think they'll be important in AGI research for one thing. Take reinforcement learning (RL) for example: the rewards are arbitrarily constrained to be real numbers (or rational numbers). But why? There is a rich variety of RL environments one can construct using non-real number reward systems, and there's nothing special about the reals (or rationals) that suggests any connection to RL. The real numbers, recall, are the unique complete ordered field. What do complete ordered fields intrinsically have to do with reinforcement learning? Nothing, as far as I can tell (if there were some such connection, it would certainly be interesting to professors of real analysis).

As you point out, surreal numbers do a great job of tying together many different number systems, and that's inherently appropriate for studying generalized reinforcement learning where rewards come from many different number systems.

I wrote a whole paper on this topic: https://philpapers.org/archive/ALETAT-12.pdf
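
If it helps to make "non-Archimedean rewards" concrete, here's a toy sketch (not from the paper, just to give the flavor): treat a reward as a pair compared lexicographically, which Python tuples already do. No number of copies of the "small" reward ever adds up to the "big" one.

  # Rewards as (primary, secondary) pairs, ordered lexicographically.
  big, small = (1, 0), (0, 1)

  def total(rewards):
      # Componentwise sum of pair-valued rewards.
      return tuple(map(sum, zip(*rewards)))

  # Ten thousand "small" rewards still rank below a single "big" one,
  # so the second component behaves like an infinitesimal.
  print(total([small] * 10_000) < big)   # True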


I remain skeptical because all practical computing necessarily has finite representation. We don't compute with real numbers, we use floats or doubles of a given precision. While this very occasionally leads to weird errors (from the perspective of pure mathematics), it's usually good enough.

Having briefly read your paper, I think it makes a mistake analogous to arguments that machine intelligence is impossible by Gödel's incompleteness theorems. Consider the statement:

> This statement is not true if written by xamuel

Despite the fact that you cannot consistently assert that statement, it doesn't strike me as a strong argument against your general intelligence.

By analogy, why don't your arguments in section 3 prove the impossibility of Human General Intelligence? You are replacing the usual meaning of AGI (a machine that can execute a wide variety of tasks in human to superhuman fashion) with a far stronger one (a machine that can complete arbitrary tasks perfectly). A machine that (as in example 5) has an `incorrect' loss function may still be generally an AGI.

I'd challenge the notion that non-Archimedean tasks are relevant to AGI. Here is another example of your non-Archimedean tasks. Pick a pair of integers. The reward for this task is such that for points p and q, p < q if p comes before q lexicographically, and |p-q| >= 1 if p != q. All the complexity is baked into the structure of the task, but it doesn't seem to be a compelling barrier to an AGI any more than the search task "Find the largest integer" is.

I'd expect AGIs to be able to approach non-Archimedean tasks at least as well as humans, but I expect they'd do it in an analogous way -- by loading surreal numbers or the equivalent into their software rather than their hardware. That is to say, an AGI should be able to reason about these concepts despite not being built out of them.


>You are replacing the usual meaning of AGI (a machine that can execute a wide variety of tasks in human to superhuman fashion) with a far stronger one (a machine that can complete arbitrary tasks perfectly).

No, I never make any assumption about AGIs being capable of completing arbitrary tasks perfectly. Indeed, that would be quite impossible, no agent could do that. The point is rather that the traditional RL agent cannot even comprehend environments with non-Archimedean rewards, but a genuine AGI would be able to comprehend them.


Interesting. Most ML models presumably operate over IEEE double precision floating point values. These have a couple of interesting properties like Inf and NaN, but not infinitesimals.
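
A quick illustration of that point (Python floats are IEEE 754 doubles):

  import math

  print(math.inf > 1e308)        # True:  infinity is representable
  print(math.nan == math.nan)    # False: NaN is unequal even to itself
  print(5e-324 > 0)              # True:  the smallest positive double...
  print(1.0 + 5e-324 == 1.0)     # True:  ...but it vanishes under addition,
                                 #        unlike a genuine infinitesimal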


*speculation warning*

We're still not comfortable checking if rationals are equal in any major language... there's a long way to go.

I could see a future high-level programming language (think Python design goals) using the rigorous foundation of surreals as the underlying numeric system. Sure 95% of the time it'll look the same to the programmer, and the performance will be worse, but you get inherent stability around infinities, irrationals, NaNs, floating-point comparisons, etc

It's reasonable to think that ~200 years from now, high schoolers will learn that numbers all "really" come from this structure of surreal numbers, and that most computers, physics processing, etc. use them behind the scenes.
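
For a sense of what that underlying numeric system would have to look like, here's a minimal sketch (Python, purely illustrative) of the {L | R} construction and the recursive comparison it rests on. Anything you build in finitely many steps is a dyadic rational, which hints at why this is harder to make practical than it sounds.

  class Surreal:
      # A surreal number { left | right }, where both sides are
      # collections of previously constructed surreals.
      def __init__(self, left=(), right=()):
          self.left, self.right = left, right

  def leq(x, y):
      # x <= y  iff  no member of x.left is >= y
      #         and  no member of y.right is <= x
      return (not any(leq(y, xl) for xl in x.left) and
              not any(leq(yr, x) for yr in y.right))

  zero = Surreal()                              # { | }
  one  = Surreal(left=(zero,))                  # { 0 | }
  half = Surreal(left=(zero,), right=(one,))    # { 0 | 1 }

  print(leq(zero, half) and leq(half, one))     # True:  0 <= 1/2 <= 1
  print(leq(one, half))                         # False: not (1 <= 1/2)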


> We're still not comfortable checking if rationals are equal in any major language...

This seems to be confusing the issue to me. Checking for rational equality is trivial (just cross multiply). What we don't have is major languages deciding that the type of rational numbers is useful enough to get first-class support. And even that is overstating the issue; some languages do care and make representing rationals easy. In Julia, rational numbers can just be written like 2//3.

If people try to use floats to represent rationals and run into errors, the problem isn't that languages are incapable of representing rationals.
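
For example, in Python the exact comparison is easy, either by cross-multiplying or with the built-in Fraction type; it's only the detour through floats that goes wrong:

  from fractions import Fraction

  # 1/3 == 2/6 by cross-multiplication: a*d == c*b
  print(1 * 6 == 2 * 3)                    # True
  # ...or with a first-class rational type
  print(Fraction(1, 3) == Fraction(2, 6))  # True

  # The float version of a "rational" comparison can fail
  print(0.1 + 0.2 == 0.3)                  # False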

I don't see any upsides to using surreal numbers as a datatype. If you're restricting them to a finite size, you're still going to have problems analogous to those of float or double, and I don't see any advantages in stability or whatever. If you're allowing arbitrary-size (but obviously still finite) representation, you can still only represent the dyadic rationals, so you can't even express 1/3. The infinitesimals and infinities would require infinite size, so they're never practical.


Yes, it's instructive to recall that the Pythagorean Cult believed all numbers were rational (and this was a religious sort of belief). There's a legend (albeit probably not historically accurate) that when a member of the cult proved that sqrt(2) was irrational, they put him to death... https://en.wikipedia.org/wiki/Hippasus


I wonder how they would have reacted to a proof that the reals are a larger infinity.


Every now and then I try to understand them, and I think if they fulfill what I believe is claimed (putting Leibniz's calculus on a rigorous basis), then you are right. For me the litmus test is how to prove the chain rule (since the classical proof and the rigorous one are conceptually different). But when I read about surreal numbers it somehow seems you need the modern proofs as a prerequisite.

Let me ask you: how do you prove the chain rule with surreal numbers?


I think you might be mixing up surreals with hyperreals.


Yes, most likely. Although I just read that the hyperreals can be a subfield of the surreals.


I saw Graham walking around UCSD. I didn't recognize him but my friend pointed him out. I wanted to meet him.


John Conway's loss will live in my memory for some time, just like losing John Nash in a car accident. Their contributions to math will outlive everything else.


Wear your seat belts, people...


... even in a Taxi cab.


This is the saddest loss of all time



