alan-crowe 16 hours ago

The review distills the book's view of the difference between pure mathematics and applied mathematics: "applied" split off from "pure" to meet the technical needs of the US military during WW2.

My best example of the split is the symmetry of second derivatives (https://en.wikipedia.org/wiki/Symmetry_of_second_derivatives). Wikipedia notes that "The list of unsuccessful proposed proofs started with Euler's, published in 1740,[3] although already in 1721 Bernoulli had implicitly assumed the result with no formal justification." The split between pure (Euler) and applied (Bernoulli) is already there.

The result is hard to prove because it isn't actually true. A simple proof would also apply to a counterexample, so it cannot be correct. A correct proof has to use the additional hypotheses needed to block the counterexamples, so it cannot be simple.
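
For the curious, here is how the standard counterexample behaves (this function is the usual textbook one, my own illustration rather than anything quoted from the review): f(x,y) = xy(x^2 - y^2)/(x^2 + y^2), with f(0,0) = 0, has both mixed partial derivatives at the origin, yet they disagree, because f_xy and f_yx fail to be continuous there. A quick sympy check:

    import sympy as sp

    x, y = sp.symbols('x y', real=True)
    f = x*y*(x**2 - y**2)/(x**2 + y**2)        # with f(0,0) defined separately to be 0

    fx = sp.diff(f, x)                         # first partials away from the origin
    fy = sp.diff(f, y)

    fx_on_y_axis = sp.simplify(fx.subs(x, 0))  # f_x(0, y) = -y
    fy_on_x_axis = sp.simplify(fy.subs(y, 0))  # f_y(x, 0) =  x

    # Mixed partials at the origin, as limits of difference quotients
    # (f_x(0,0) = f_y(0,0) = 0 straight from the definition of f at the origin).
    f_xy_at_origin = sp.limit(fx_on_y_axis / y, y, 0)   # -> -1
    f_yx_at_origin = sp.limit(fy_on_x_axis / x, x, 0)   # ->  1
    print(f_xy_at_origin, f_yx_at_origin)               # the two values disagree

Adding the hypothesis that the mixed partials are continuous at the point is exactly what rules this function out.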

Since the human life span is 70 years, I face an urgent dilemma. Do I master the technique needed to understand the proof (fun), or do I crack on and build things (satisfaction)? Pure mathematicians plan on constructing long and intricate chains of reasoning; a small error can get amplified into an error that matters. From a contradiction one can prove anything. Applied mathematics gets applied to engineering: build a prototype and discover problems with tolerances, material impurities, and annoying edge cases in the mathematical analysis. An error will likely show up in the prototype. Pure? Applied? It is really about the ticking of the clock.

  • jkhdigital 15 hours ago

    I’m not sure if I am mathematically sophisticated enough to follow along but I’ll try. This chain of thought reminds me of the present state of cryptography, which is built on unproven assumptions about the computational hardness of certain problems. Meanwhile Satoshi Nakamoto hacks together some of these cryptographic components into a novel system for decentralized payments with a few hand-wavy arguments about why it will work and it grows into a $1+ trillion asset class.

    • DiscourseFan 15 hours ago

      Yes, the cool thing about tech is that you don't have to know why it will work, or even how, just so long as it does.

    • wslh 14 hours ago

      The innovation in Bitcoin is not the cryptography but the game theory at work. For example, is it in a miner's interest to destroy the system or to continue mining? There are theoretical attacks at around 20%, not just 51%. A state actor could also attack the system if it is willing to invest enough resources.

  • vector_spaces 14 hours ago

    The chains of reasoning are only long and intricate if you trace each result back to axiomatics. Most meaningful results are made up of a handful of higher-level building blocks -- similar to how software is crafted out of modules rather than implementing low-level functionality from scratch each time (yes, similar but also quite different)

    • zmgsabst 10 hours ago

      Literally the same:

      A type is a theorem and its implementation a proof, if you believe that Curry-Howard stuff.

      We “prove” (implement) advanced “theorems” (types) using already “proven” (implemented) bodies of work rather than return to “axioms” (machine code).
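
      For example (a minimal Lean 4 sketch of the correspondence, my own illustration rather than something from the thread), the signature is the statement and the body is the proof:

          -- The type is the proposition "if P implies Q and Q implies R, then P implies R";
          -- the term we write below is its proof.
          theorem imp_trans (P Q R : Prop) (h₁ : P → Q) (h₂ : Q → R) : P → R :=
            fun p => h₂ (h₁ p)

      Read with ordinary types instead of Prop, the same term is just function composition built out of already-written pieces.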

      • practal 8 hours ago

        No, it is not the same; Curry-Howard is just a particular instance of it, much as "shape" is not the same thing as "triangle".

  • practal 15 hours ago

    I took a look at the book a while ago, and I like how it treats abstraction as its guiding theme. For my project Practal (https://practal.com), I've recently pivoted to a new tagline which now includes "Designing with Abstractions". And I think that suggests how to resolve the dilemma you describe between pure and applied: we soon will not have to choose between pure and applied mathematics. Designing (≈ applied math) will be done most efficiently by finding and using the right abstractions (≈ pure math).

  • ogogmad 13 hours ago

    I think the problem is that theoretical real analysis is often presented as if it's nothing but a validation of things people already knew to be true -- but maybe it's not?

    The example you gave concerns differentiation. Differentiation is messy in real analysis because it's messy in numerical computing. How real analysis fixes this mess parallels how numerical computing must fix the mess. How do we make differentiation - or just derivatives, perhaps - computable?

    The rock-bottom condition for computability is continuity: all discontinuous functions are uncomputable. It turns out that it is sufficient, for the theorem to hold, that the second partial derivatives f_{xy} and f_{yx} be continuous. They wouldn't even be computable otherwise!

    One of the proofs provided uses integration. In numerical contexts, it is integration that is considered "easy" and differentiation that is considered "hard". This is the reverse of symbolic calculus, where differentiation is mechanical and integration is the hard part.
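
    A quick numerical sketch of that point (my own toy example, not something from the article or the thread): differentiate and integrate the same noisy samples of sin(x). The finite difference amplifies the noise, while the sum averages it away.

        import numpy as np

        rng = np.random.default_rng(0)
        x = np.linspace(0.0, np.pi, 10_001)
        dx = x[1] - x[0]
        f = np.sin(x) + 1e-6 * rng.standard_normal(x.size)    # noisy samples of sin

        deriv_err = np.max(np.abs(np.gradient(f, dx) - np.cos(x)))
        integral_err = abs(np.sum(f) * dx - 2.0)               # exact integral over [0, pi] is 2

        # The derivative error is dominated by the amplified noise (roughly noise/dx);
        # the integration error is orders of magnitude smaller, since the noise cancels.
        print(deriv_err, integral_err)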

    The article also mentions Distribution Theory. This is important in the theory of linear PDEs. I suspect it is implicit in the algorithmic theory as well, whether practitioners have spelled this out or not. This is a theory that makes the differentiation operator itself computable, but at the cost of making the derivatives weaker than ordinary functions. How so? On the one hand, it allows us to obtain things like the Dirac delta as derivatives, but those aren't even functions. On the other hand, these objects behave like functions - let's say f(x,y) - but we can't evaluate them at points; instead, we can take their inner product with test functions, which we can use to approximate evaluation. This is important because PDE solvers may only be able to provide solutions in the weak, distribution-theoretic sense.
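
    To make "inner product with test functions" concrete (again a toy sketch of my own, not from the article): the distributional derivative of the Heaviside step H is the Dirac delta. We never evaluate H' at a point; we only pair it with a smooth bump function phi, using the defining identity <H', phi> = -<H, phi'>, and the answer comes out to phi(0).

        import numpy as np

        x = np.linspace(-1.0, 1.0, 200_001)
        dx = x[1] - x[0]
        H = (x >= 0).astype(float)                 # Heaviside step

        def bump(t):                               # smooth test function, zero outside (-1, 1)
            inside = np.abs(t) < 1
            out = np.zeros_like(t)
            out[inside] = np.exp(-1.0 / (1.0 - t[inside] ** 2))
            return out

        def bump_prime(t):                         # its derivative, by the chain rule
            inside = np.abs(t) < 1
            out = np.zeros_like(t)
            ti = t[inside]
            out[inside] = np.exp(-1.0 / (1.0 - ti ** 2)) * (-2.0 * ti / (1.0 - ti ** 2) ** 2)
            return out

        pairing = -np.sum(H * bump_prime(x)) * dx  # -<H, phi'>
        print(pairing, bump(np.zeros(1))[0])       # both are approximately exp(-1) ≈ 0.368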

    • 082349872349872 8 hours ago

      Do I understand properly that in a different universe distributions could have been called prefunctions?

      • practal 8 hours ago

        A distribution is a function on the space of test functions.

        • gunnihinn an hour ago

          A distribution is not a function. It is a continuous linear functional on a space of functions.

          Functions define distributions, but not all distributions are defined that way, like the Dirac delta or integration over a subset.

          • skhunted an hour ago

            A functional is a function.

            • ogogmad 33 minutes ago

              The term "function" sadly means different things in different contexts. I feel like this whole thread is evidence of a need for reform in maths education from calculus up. I wouldn't be surprised if you understood all of this, but I'm worried about students encountering this for the first time.

              • skhunted 30 minutes ago

                Don’t know if you are a mathematician or not, but mathematically speaking “function” has a definition that is valid in all mathematical contexts. A functional clearly meets the criteria to be a function, since being a function is part of the definition of being a functional.

                • ogogmad 20 minutes ago

                  The situation is worse than I thought. The term "function", as used in foundations of mathematics, includes functionals as a special case. By contrast, the term "function", as used in mathematical analysis, explicitly excludes functionals. The two definitions of the word "function" are both common, and directly contradict one another.

                  • skhunted 13 minutes ago

                    > By contrast, the term "function", as used in mathematical analysis, explicitly excludes functionals. The two definitions of the word "function" are both common, and directly contradict one another.

                    This is incorrect. In mathematics there is a single definition of function. There is no conflict or contradiction. In all cases a function is a subset of the Cartesian product of two spaces that satisfies a certain condition.

                    What changes from subject to subject is what the underlying spaces of interest are.

        • keithalewis an hour ago

          Try composing two distributions.

          • practal an hour ago

            Try composing f : A -> B with g : A -> B, for A ≠ B. Still, f and g are functions. So, what exactly is your point?