# Interesting mathematical statements

#### InteGrand

##### Well-Known Member
I have always found fixed point theorems quite pretty.

Probably the most well-known one is the Brouwer fixed point theorem. One version of this states that if you have a continuous function f from a closed n-ball (e.g. the set of points at distance at most 1 from the origin in n-dimensional Euclidean space) to itself, then this function must have a fixed point, i.e. an x such that f(x) = x.

So in one dimension, this says that a continuous function f defined on the interval [0,1] that takes values in [0,1] must have a fixed point. (The 1-d version can be proved at high school level, try it!)
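A sketch of the 1-D case via the intermediate value theorem (my write-up, spoilers for anyone who wants to try it first):

```latex
% Let f : [0,1] \to [0,1] be continuous and set g(x) = f(x) - x. Then
g(0) = f(0) - 0 \geq 0, \qquad g(1) = f(1) - 1 \leq 0,
% so, since g is continuous, the intermediate value theorem gives some
% c \in [0,1] with g(c) = 0, i.e. f(c) = c, a fixed point.
```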

Fixed point theorems can have some pretty whack consequences. E.g., if I am in Russia and I put a map of Russia on a table, there will be a point on this map that lies directly above the actual physical spot in Russia that it represents.
Was the Brouwer fixed point theorem used by Nash to prove the existence of a Nash Equilibrium in a game or something? I don't know too much about game theory haha but this was something I think I heard. And is your use of Russia as the country at all a subtle reference to this game theory of Nash's time? Lol (seems too coincidental that you chose Russia )

#### glittergal96

##### Active Member
Was the Brouwer fixed point theorem used by Nash to prove the existence of a Nash Equilibrium in a game or something? I don't know too much about game theory haha but this was something I think I heard. And is your Russia thing at all a subtle reference to this game theory of Nash's time? Lol (seems too coincidental that you chose Russia)
Haha yes, spot on with it being a crucial ingredient in Nash's work. This is one of the applications in abstract maths I referred to in my last line which I added after you took your quote.

And nope, not an intentional reference. Russia was actually an arbitrary choice lol.

##### Active Member
$\bg_white \noindent Something an MX2 student may find either interesting or intriguing is the closed-form expression for the following infinite exponential tetration:$

$\bg_white h(x) = x^{x^{x^{.^{.^{.}}}}} = \frac{\mbox{W}_0 (-\ln x)}{-\ln x}, \,\, \mbox{for} \,\, e^{-e} \leqslant x \leqslant e^{1/e}.$

$\bg_white \noindent Here \mbox{W}_0 (x) is the principal branch of the \textit{Lambert W function} which is defined as the inverse of the function y = x e^x.$

$\bg_white \noindent Within the interval of convergence this allows values for the following infinite exponential tetrations to be found:\\\begin{align*}h(\sqrt{2}) &= \sqrt{2}^{\sqrt{2}^{\sqrt{2}^{.^{.^{.}}}}} = \frac{\mbox{W}_0 (-\ln \sqrt{2})}{-\ln \sqrt{2}} = - \frac{2}{\ln 2} \mbox{W}_0 \left (-\frac{\ln 2}{2} \right ) = 2\end{align*}$

$\bg_white \begin{align*}h \left (\frac{1}{4} \right ) &= {\frac{1}{4}}^{{\frac{1}{4}}^{{\frac{1}{4}}^{.^{.^{.}}}}}=\frac{\mbox{W}_0 (-\ln (1/4))}{-\ln (1/4)} = \frac{\mbox{W}_0 \left (2\ln 2 \right )}{2 \ln 2} = \frac{1}{2}\end{align*}$

$\bg_white \begin{align*}h \left (\frac{1}{e} \right ) &= {\frac{1}{e}}^{{\frac{1}{e}}^{{\frac{1}{e}}^{.^{.^{.}}}}}=\frac{\mbox{W}_0 (-\ln (1/e))}{-\ln (1/e)} = \mbox{W}_0 (1) = \Omega,\end{align*}\\where \Omega is the so-called \textit{omega constant} (0.567 \, 143 \, 29\ldots).$

$\bg_white \begin{align*}h (\sqrt[3]{3}) &= {\sqrt[3]{3}}^{{\sqrt[3]{3}}^{{\sqrt[3]{3}}^{.^{.^{.}}}}}= -\frac{3}{\ln 3} \mbox{W}_0 \left (-\frac{\ln 3}{3} \right ) \end{align*}$

$\bg_white \noindent etc.$
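These values can be sanity-checked numerically: inside the convergence interval, iterating a -> x^a converges to h(x). A minimal Python sketch (function name is mine, not from the post):

```python
import math

def tetration(x, iters=400):
    """Approximate h(x) = x^x^x^... by iterating a <- x**a from a = x."""
    a = x
    for _ in range(iters):
        a = x ** a
    return a

print(tetration(math.sqrt(2)))   # close to 2
print(tetration(0.25))           # close to 1/2
print(tetration(1 / math.e))     # close to the omega constant 0.56714...
print(tetration(3 ** (1 / 3)))   # the h(3^(1/3)) value from the last line
```

Note x = 3^(1/3) ≈ 1.4422 only just squeezes inside the interval, since e^(1/e) ≈ 1.4447, so the iteration converges quite slowly there.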


#### InteGrand

##### Well-Known Member
$\bg_white \noindent Another really nice one is Bayes' Billiard Ball argument that proves that \int _0 ^1 \binom{n}{k}x^k (1-x)^{n-k}\text{ d}x=\frac{1}{n+1} without needing to do any integration at all. The 18^{\text{th}} century probabilist Bayes essentially proved this by considering two equivalent approaches to the same probability question (I don't think he was cooking up this problem with the specific aim of proving this integration identity though, of course; he was working on probability).$

$\bg_white \underline{\textbf{Problem}}$

$\bg_white \noindent Suppose a point A is chosen uniformly in [0,1], and n billiard balls (points) are then thrown randomly (uniformly, identically and independently) onto this interval. Let X be the number of billiard balls that end up landing left of A (so X can be any integer from 0 to n inclusive). What is the probability that X=k (without knowing what A is)?$

$\bg_white \underline{\textbf{First approach}}$

$\bg_white \noindent Recall that A was chosen uniformly on the interval [0,1]. Call its value (point in the interval) \mathcal{A}. (So \mathcal{A} is a random variable uniformly distributed on [0,1], so its probability density function is f_{\mathcal{A}} (x) = 1, for 0\leq x \leq 1.)$

$\bg_white \noindent Suppose we knew the value of \mathcal{A} to be x. \textsl{Given} this value, the probability of obtaining k billiard balls left of this is just \mathbb{P} \left(X=k | \mathcal{A}=x \right)=\binom{n}{k}x^k (1-x)^{n-k}.^\dagger$

$\bg_white \noindent This is a conditional probability, namely the probability that X=k given we already know what x (the value of the point A) is. Since A is random too, the probability that X=k is given by integrating this conditional probability multiplied by the p.d.f. of \mathcal{A} over all possible values of x. ^{\ddagger} Since the probability density function is just 1, the desired probability is \mathbb{P}\left( X = k\right) = \int _0 ^1 \mathbb{P} \left(X=k | \mathcal{A}=x \right) \cdot f_{\mathcal{A}}(x) \text{ d}x = \int _0 ^1\binom{n}{k}x^k (1-x)^{n-k} \text{ d}x .$

$\bg_white \underline{\textbf{Second approach}}$

$\bg_white \noindent Alternatively, consider having chosen uniformly and independently (n+1) points in [0,1] to start with, and then choosing one of these to be the point A, so that the remaining n are billiard balls. When we have our (n+1) points on the interval (doesn't matter where they ended up being chosen), clearly for each k from 0 to n, to have k balls left of A, there is exactly one point from these (n+1) points we can choose, namely the point that is (k+1)^{\text{th}} from the left out of these points (e.g. to have A have 2 billiard balls left of it, we'd have to choose the third point of these (n+1) points to be A). So as there are (n+1) choices and exactly 1 favourable choice, \mathbb{P}\left(X=k\right) = \frac{1}{n+1}.$

$\bg_white \noindent So comparing this with the first approach, it follows that$

$\bg_white \int _0 ^1 \binom{n}{k}x^k (1-x)^{n-k}\text{ d}x=\frac{1}{n+1}.$

$\bg_white \hrulefill$

$\bg_white \noindent \dagger Why? Because if x is given, when we throw a billiard ball onto the interval, the probability of it landing left of x, that is, in the interval [0,x], is simply x. E.g. if x were \frac{1}{3}, then the chance of a billiard ball landing left of this is just 1 in 3, as the total interval is of length 1. The \binom{n}{k}x^k (1-x)^{n-k} is then just a standard binomial probability, familiar to students of HSC 3U/4U.$

$\bg_white \noindent \ddagger This is the continuous version of something known as the Law of Total Probability. Its discrete analogue is probably known by most students here intuitively. It basically states that if an event C occurs with one of events D_1, D_2, \ldots , D_n, and the events D_i are pairwise mutually exclusive events (i.e. can't have two of them happen together, e.g. a coin landing heads or a coin landing tails), then \mathbb{P}\left(C\right) = \sum_{i=1} ^n \mathbb{P}\left(C| D_i \right) \mathbb{P} \left(D_i\right). See Wikipedia here for an example:$

https://en.wikipedia.org/wiki/Law_of_total_probability#Example
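The billiard-ball setup is easy to check by simulation: throw A and the n balls down at random many times and the empirical P(X = k) should come out near 1/(n+1) for every k. A Monte Carlo sketch in Python (names mine, not part of Bayes' argument):

```python
import random

def estimate_p(n, k, trials=100_000, seed=42):
    """Estimate P(X = k): choose A uniformly on [0,1], throw n balls
    uniformly onto [0,1], and count how many land left of A."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        a = rng.random()
        x = sum(rng.random() < a for _ in range(n))
        if x == k:
            hits += 1
    return hits / trials

n = 5
print([round(estimate_p(n, k), 3) for k in range(n + 1)])
# each entry should sit near 1/(n+1) = 0.1667, independent of k
```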


#### glittergal96

##### Active Member
100 people are standing on the positive real axis looking in the positive direction, each wearing hats coloured either black or white.

Each person can see infinitely far and hear from infinite distances.

Going in increasing order, these people are asked the colour of the hat on their head. (They are allowed to discuss a strategy before this whole guessing process starts).

It is slightly surprising that there is a strategy that guarantees correct guesses from 99 of these people.

What is slightly more surprising is that there is still such a strategy if more hat colours are allowed (even an uncountable infinitude of hat colours!)

What is most surprising of all is that if there is a countable infinity of people in this line, and each of these people is deaf, AND there is an uncountable infinity of possible hat colours, we can STILL guarantee the correctness of all but finitely many guesses!!

This is another example of a consequence of the axiom of choice that some people often find unsettling. It also relies on the impossible assumption that a person can have infinite memory, i.e. they can recall infinite sets, functions on infinite sets, etc.

Removing the human / real world element though and viewing the result purely abstractly, it is still immensely weird.
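For the 100-person, two-colour version, the standard strategy (not spelled out above) is a parity argument: the first person asked announces the XOR of all the hats they can see, sacrificing their own guess, and everyone after combines that announcement with what they see and what they've heard. A Python sketch of mine, encoding the colours as 0/1:

```python
import random
from functools import reduce
from operator import xor

def run_strategy(hats):
    """hats[i] is 0 or 1; person i sees hats[i+1:] and hears all earlier
    guesses. Returns everyone's guesses under the parity strategy."""
    n = len(hats)
    p = reduce(xor, hats[1:], 0)     # person 0 announces the parity of all hats ahead
    guesses = [p]                    # person 0's own guess may be wrong
    revealed = 0                     # parity of hats already guessed (correctly) so far
    for i in range(1, n):
        seen = reduce(xor, hats[i + 1:], 0)
        g = p ^ revealed ^ seen      # own hat = announced parity minus what's accounted for
        guesses.append(g)
        revealed ^= g
    return guesses

random.seed(0)
hats = [random.randint(0, 1) for _ in range(100)]
guesses = run_strategy(hats)
print(sum(g != h for g, h in zip(guesses, hats)))  # at most 1 wrong
```

Only the first speaker can be wrong, which is the "99 guaranteed" claim; the infinite, deaf, uncountably-coloured version genuinely needs the axiom of choice and can't be simulated like this.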

#### leehuan

##### Well-Known Member
This isn't exactly an interesting maths statement but more or less something I found humourous.

##### -insert title here-
This isn't exactly an interesting maths statement but more or less something I found humourous.
That is rather amusing.

I have seen people do the same for exponential limits.

##### -insert title here-
$\bg_white \noindent Not so much a surprising fact as its consequences, but... \\ Let F(x) be the integral of the invertible function f(x) and f^{-1}(x) be the inverse of f(x).\\Then the following statement is always true: \\ \int f^{-1}(x) dx = xf^{-1}(x)-F\left(f^{-1}(x)\right) +C\\Due to the arbitrariness of f, the relationship is completely symmetric between f and f^{-1}. This means that if a function and its inverse are expressible in elementary terms, and only one of them has an obvious integral in elementary form, then the other function is guaranteed to have an integral expressible in elementary terms.$
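A quick numerical check of the identity (my sketch, with the arbitrary choice f = exp, so f^{-1} = ln and F = exp; the identity then says the antiderivative of ln x is x ln x - x + C):

```python
import math

def antideriv(x):
    """Right-hand side with f = exp: x*f_inv(x) - F(f_inv(x)) = x*ln(x) - x."""
    finv = math.log(x)
    return x * finv - math.exp(finv)

# Its numerical derivative should reproduce f_inv(x) = ln(x).
h = 1e-6
for x in (0.5, 2.0, 10.0):
    d = (antideriv(x + h) - antideriv(x - h)) / (2 * h)
    print(x, round(d, 6), round(math.log(x), 6))
```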

#### leehuan

##### Well-Known Member
Carmichael's theorem:

$\bg_white \\ When n>12, the nth Fibonacci number has at least one prime divisor that DOES NOT divide any earlier Fibonacci number.$
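A brute-force check of the statement for small n (a Python sketch of mine; trial division only, so it's kept to n <= 30):

```python
def fib(n):
    """n-th Fibonacci number, F(1) = F(2) = 1."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

def prime_factors(m):
    """Set of prime factors of m, by trial division."""
    fs, d = set(), 2
    while d * d <= m:
        while m % d == 0:
            fs.add(d)
            m //= d
        d += 1
    if m > 1:
        fs.add(m)
    return fs

# For each n > 12 (checked up to 30), list the prime factors of F(n)
# that divide none of F(1), ..., F(n-1).
for n in range(13, 31):
    earlier = [fib(k) for k in range(1, n)]
    primitive = [p for p in prime_factors(fib(n))
                 if all(f % p != 0 for f in earlier)]
    print(n, primitive)
```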


#### KingOfActing

##### Well-Known Member
I don't know if anyone's posted this yet:
$\bg_white If a theory \varphi (strong enough to formalise arithmetic) can prove that \varphi is consistent, then \varphi is inconsistent.$

#### braintic

##### Well-Known Member
Carmichael's theorem:

$\bg_white \\ When n>12, the nth Fibonacci number has at least one prime divisor that DOES NOT divide any Fibonacci number.$
Any EARLIER Fibonacci number.

It wouldn't make any sense without that word.

##### -insert title here-
I don't know if anyone's posted this yet:
$\bg_white If a theory \varphi (strong enough to formalise arithmetic) can prove that \varphi is consistent, then \varphi is inconsistent.$
Then the computational extension of the theorem is as follows:

If a (Turing-complete) machine can decide whether any given input will halt or not, then said machine cannot decide the input comprising itself.

By the law of the excluded middle, this is a contradiction, and as such, any such machine cannot possibly exist.

#### GoldyOrNugget

##### Señor Member
Then the computational extension of the theorem is as follows:

If a (Turing-complete) machine can decide whether any given input will halt or not, then said machine cannot decide the input comprising itself.

By the law of the excluded middle, this is a contradiction, and as such, any such machine cannot possibly exist.
I don't think this statement is quite true. Specifically, the "input comprising itself" bit - I don't know of a proof that shows that this is the case. If you're alluding to the diagonalisation argument, the result is slightly more complex.

Suppose that A is a Turing machine that, given any machine T and input I, decides whether T halts on I.

Let C be a machine that, when given I, calls A with machine I and input I and halts iff A concludes that I doesn't halt on input I.

Now run C on input C. If C halts when given C, then by construction A must have concluded that C doesn't halt on C. Conversely, if C doesn't halt when given C, then A must have concluded that C does halt on (C, C). Either way A decides (C, C) incorrectly. Here lies the contradiction. (unless I've made a mistake)

So C is the input that breaks A, not A itself. Unless I've misunderstood what you've said.

But the more important question is - who are you? What kind of high school kid has this kind of background in maths? Where did you learn this stuff?

#### KingOfActing

##### Well-Known Member
He's more gay for maths than leehuan

#### leehuan

##### -insert title here-
That's why I said comprising. There are a few gaps that I wasn't bothered filling in, because I know the input requires two machines in order to break the supposed machine.

#### GoldyOrNugget

##### Señor Member
That's why I said comprising. There are a few gaps that I wasn't bothered filling in, because I know the input requires two machines in order to break the supposed machine.
You're an IMO guy/girl aren't you?