I don't want to formalize any of my work on mathematics.
-
@cbaberle said "And this process itself can lead to a lot of "a ha" moments and "radical" new ideas, itself."
I have found the same thing. When I am working on things that are not directly about formalized mathematics, but with using a proof assistant as a blackboard (echoing Martin's wonderful phrasing), I feel that I am much freer to make wild conjectures, because I can disprove them equally quickly.
The number of "models" of quantum programming based on traced monoidal categories that did not in fact work is staggering. The failures were usually quite subtle. For each and every one of them, my co-author(s) and I had convinced ourselves via 'paper math' that they worked.
@johncarlosbaez @andrejbauer @dougmerritt @MartinEscardo @pigworker @xenaproject
@JacquesC2 @cbaberle @johncarlosbaez @andrejbauer @dougmerritt @MartinEscardo @pigworker @xenaproject I feel like there is a bit of a selection bias here. Would you say that formalization is as useful a blackboard for, e.g., number theory or geometric measure theory as it is for type theory/logic/computer science?
There is a huge UX problem, due to the fact that most mathematical research is done with objects and methods that fit badly in the current text-based, heavily formal (I don't know how to say this better) proof assistants. So my impression is that for most mathematics working formalization-first would be as painful and counterproductive as it would be for a PL/type theorist to work only on a whiteboard.
Hopefully this will change in the future! I do believe that formalization is very useful for mathematics.
-
@mc These areas might be more painful, but they contain mistakes. I found a few measure theoretic mistakes in some recent papers / theses over the last year or two.
@JacquesC2 @cbaberle @johncarlosbaez @andrejbauer @dougmerritt @MartinEscardo @pigworker @xenaproject
-
@andrejbauer - "Formalized mathematics makes us think "the right way" in the sense that it requires mental hygiene, it encourages better organization, it invites abstraction, and it demands honesty."
I did read it differently. I was really worried that Kevin meant formalizing mathematics in *Lean* forces us to think the right way. But in fact I don't think formalizing mathematics at all makes us think "the" right way. It has good sides, which you mention, so it's *a* right way to do mathematics. But it also has bad sides. Mostly, it doesn't encourage radical new ideas that don't fit well in existing formalisms. Newton, Euler, Dirac, Feynman and Witten are just a few of the most prominent people who broke out of existing frameworks, didn't think formally, and did work that led to a huge growth of mathematics. If you say "those people are physicists, not mathematicians", then you're slicing disciplines differently than I do. I find their ideas more mathematically interesting than most mathematics that fits into existing frameworks.
@dougmerritt @MartinEscardo @JacquesC2 @pigworker @xenaproject
@johncarlosbaez @dougmerritt @MartinEscardo @JacquesC2 @pigworker @xenaproject Well, it should be obvious, but let me say it anyway.
We need mathematicians of every kind: the thinkers, the dreamers, the formalizers, and even physicists.
-
@mc @JacquesC2 @cbaberle @johncarlosbaez @dougmerritt @MartinEscardo @pigworker @xenaproject I'd go further and "blame" the philosophers of language, too.
-
@MartinEscardo - how long would it take to "try it once"? Then I can weigh that against other things I could do with that time.
I have absolutely nothing I want to formalize, and I'd be weighing time spent on this against things like learning how to make music on a digital audio workstation, which I put off because it doesn't provide the instant gratification of playing a piano, but which I know I'd like in the end. In either case I know I'll start by suffering for hours, learning someone else's syntax and conventions instead of thinking my own thoughts. But for music software I know I'll eventually get a product I enjoy that I couldn't make any other way.
@johncarlosbaez @MartinEscardo It took me weeks to finish my first non-trivial proof (preservation in an STLC). I'd think there's stuff that can be knocked out in a few hours that still retains the flavor, but no example leaps to mind.
-
This issue of fit (which is indeed related to linguistics) is why CASes have a huge leg up on ITPs regarding UX. But they also have their fit problems, never mind their correctness problems.
Mathematica's UI coupled with a good ITP would be a vast improvement, though both halves would need a complete rewrite to make it viable.
Then we'd be in a good position, as a community, to actually figure out the "next generation" system.
@cbaberle @johncarlosbaez @andrejbauer @dougmerritt @MartinEscardo @pigworker @xenaproject
-
@mc I definitely agree with this, and I have a bunch of thoughts about fixing it. Some programmatic remarks:
- Interactive theorem proving and text-based representations are already uneasy bedfellows. Let's embrace the "interactive" part and move to true structure editing where proof data is stored as abstract syntax trees and the language provides an open protocol for interfacing with such ASTs, on top of which one can implement whatever UI one pleases. Text-based editing becomes just one such "view" of the underlying proof data.
- A lot of "ordinary" math comes down to: draw a structured graph (in *both* senses of the word "graph") of some sort, and make some conclusion based on its structure. By now there's plenty of existing work on compiling things like commutative diagrams/string diagrams/etc. to syntax trees, etc., and we can and *should* make use of this work to provide more convenient interfaces to ITPs, via protocols as above.
- Interactive theorem proving deserves interactive documentation. We should have an analogue of Jupyter/Mathematica Notebooks for ITPs where different editor UIs can be mixed and matched, data can be displayed in a variety of formats, etc.
@JacquesC2 @johncarlosbaez @andrejbauer @dougmerritt @MartinEscardo @pigworker @xenaproject
-
Sorry, let me adopt your convention to make my reply clearer -- but you can tell I was replying to John, since it's his post that shows up above mine when you click my post (at least on my instance).
@johncarlosbaez @andrejbauer @dougmerritt @JacquesC2 @pigworker @xenaproject
@hallasurvivor - I don't think Grothendieck broke out of existing formalisms in quite the way I'm talking about. Though his work broke radically new ground, all of it was rigorized very quickly. The people I mentioned did work that either took a century or more to make rigorous, or is still in the process of being brought into the fold of rigorous mathematics. I'm talking about Newton's calculus, Euler's manipulations of divergent series to compute the zeta function, Feynman's path integrals, and the many path integral "proofs" given by Witten.
There are dozens of less famous but still interesting examples. I would never have written a paper about the Cobordism Hypothesis (and the still less finished Tangle Hypothesis and Generalized Tangle Hypothesis) if I had been thinking about formal definitions or proofs.
(That's an example that comes from outside analysis! But my other examples suggest analysis is a lot more future-leaning than algebra, in the sense of racking up debts against future formalization.)
-
@andrejbauer - I agree: I want a diversity of approaches. Quotes like "It forces you to think about mathematics in the right way" are what scare me - together with the money that's getting poured into Lean right now, which could fool a young mathematician into thinking this is "the" right way to do math.
I have no fear that *you* will be pressured into doing anything you don't want to, just because it's the hot new trend.
-
@johncarlosbaez What I find interesting about this is that during my mathematical training I was taught there is but one way to do mathematics: ZFC. When I got to meet people working in proof verification I learnt that there is a whole world of formalisms out there.
@dpiponi @johncarlosbaez I managed to get as far as a DPhil without any formal (sorry) education in the foundations of maths. The only times I recall it rearing its head were the passing comments in topology that Tychonoff's theorem needed the axiom of choice, and likewise in measure theory for non-measurable sets.
-
@johncarlosbaez @andrejbauer You don't have to think in the formally correct way, but it helps to know what the formally correct way is.
-
@MartinEscardo Re: trusting C code, I heard horror stories about compilers for aeronautics, where one of the criteria for acceptance is that the "compiler" is very transparent and that the assembly code is auditable on its own. Apparently, getting these sorts of people to accept an optimising compiler, even CompCert, was a hard battle… (But it was won!)
@mevenlennonbertrand @MartinEscardo I have developed software in C for many years. The thing is, it requires a good understanding of C to program properly. C is notorious for its many UBs (undefined behaviors), for lax typing, etc. And it requires more coding and debugging to achieve things. This is one reason why many other languages such as Python are used. I myself use Python instead of C for quick programming (zero worry about memory leaks or corruption). Just FYI.
-
@RobJLow @dpiponi - I spent a huge amount of time learning logic, computability theory, and set theory as a kid, then got sick of that because it seemed the rest involved large cardinals and forcing, which didn't appeal to me. So around 1982 I quit thinking about this stuff. By the time I heard about the exciting newer approaches to logic, like topos theory and type theory, I was much less engrossed and picked it up slowly except for the categorical approach to quantum logic. Then homotopy type theory was born... right around when I'd lost interest in homotopy theory.
-
@johncarlosbaez @dougmerritt @MartinEscardo @JacquesC2 @pigworker Somewhat unexpectedly, I find myself on the same side as @xenaproject on this one, I suppose because I read "the right way" differently from @johncarlosbaez
Formalized mathematics makes us think "the right way" in the sense that it requires mental hygiene, it encourages better organization, it invites abstraction, and it demands honesty.
Formalized mathematics does not at all impose "One and Only Truth", nor does it "nail things down with rigidity" or "impose consensus". Those are impressions that an outsider might get by observing how, for the first time, some mathematicians have banded together to produce the largest library of formalized mathematics in history. But let's be honest, it's minuscule.
Even within a single proof assistant, there is a great deal of freedom of exploration of foundations, and there are many different ways to formalize any given topic. Not to mention that having several proof assistants, each peddling its own foundation, has only contributed to plurality of mathematical thought.
Current tools are relatively immature and do indeed steal time from creative thought to some degree, although people who are proficient in their use regularly explore mathematics with proof assistants (for example @MartinEscardo and myself), testifying to their creative potential.
Finally, any fear that Mathlib and Lean will dominate mathematical thought, or even just formalized mathematics, is a hollow one. Mathlib will soon be left in the dust of history, but it will always be remembered as the project that brought formalized mathematics from the fringes of computer science to the mainstream of mathematics.
@pigworker @andrejbauer @xenaproject @JacquesC2 @MartinEscardo @johncarlosbaez @dougmerritt btw, in light of this discussion, you folks might be interested in this work: https://hci.social/@chrisamaphone/116324984760765758
-
@sjb I think the point that both @johncarlosbaez and @andrejbauer are making is that there is no single correct way to think, whether you are working formally or traditionally.
-
Have a look at these Lean games:
It's good to work a few examples to see how they spell out in a proof assistant. It's not difficult! It's fun!
I see proof assistants as a sort of calculator for assumptions. They really shine when you throw lots of stuff at them, like verifying computer programs. Which involves lots of boring stuff, and a proof assistant can help with that! If it's a burden then don't use a proof assistant.
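As a taste of how "spelling out" an example might look, here is a hedged sketch in Lean 4 (my own snippet, not taken from any of the games): commutativity of addition on the naturals, by induction, using only the core lemmas `Nat.add_succ` and `Nat.succ_add`.

```lean
-- Commutativity of addition on Nat, by induction on the second argument.
theorem add_comm' (m n : Nat) : m + n = n + m := by
  induction n with
  | zero => simp
  | succ k ih => rw [Nat.add_succ, ih, Nat.succ_add]
```

Each `rw` step is exactly the kind of small, checkable move the games teach; the assistant confirms every step or refuses to proceed.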
-
@johncarlosbaez asked:
> how long would it take?

I just noticed I didn't answer your question. An hour is fine. More is more.
-
@johncarlosbaez It is absolutely fine not wanting to try it.
But then please stop saying things such as "I don't think formalizing mathematics at all makes us think the right way" or other claims about formalization.
If you want to speak with authority about formalization, then learn it first. And, yes, it is not easy or fast to learn it.
-
@JacquesC2 @johncarlosbaez @dougmerritt @MartinEscardo @andrejbauer @pigworker Lean is worse, but, infamously, https://en.wikipedia.org/wiki/Worse_is_better
@markusde @JacquesC2 @johncarlosbaez @dougmerritt @MartinEscardo @andrejbauer @pigworker Ok, so what is better (ie worse) than Lean?
-
@ohad @JacquesC2 @cbaberle @johncarlosbaez @andrejbauer @dougmerritt @MartinEscardo @pigworker @xenaproject Yeah, but that's not my point. Is formal-first, agda-as-a-blackboard, a good fit for those disciplines with the current tools? One can definitely argue for formalization of the results *at some point* as a means to check their correctness (I'm a bit skeptical of that too, but it's not a hill I want to die on).