The True Pattern

From [Aigner] I learned that it's surprisingly easy to prove a surprisingly large amount about the pattern.

The first step is to realize that it was a mistake to normalize the Markov solutions. Here's the structure that we want.

    2                   5                   1
    2        29         5        13         1
    2  169   29  433    5  194   13   34    1

Here we have solutions where the largest value is always in the center. In the first row we have the solution (2,5,1). In the second row there are two solutions. On the left we have (2,29,5), constructed by flipping the value on the right in (2,5,1) and putting it in the center, and on the right we have (5,13,1), constructed by flipping the value on the left in (2,5,1) and putting it in the center. In the third row we have four solutions, and so on.
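
If you'd like to watch the rows grow, here's a little Python sketch of the construction. (The code is just my illustration of the flipping rule, not anything from [Aigner].)

  # Generate rows of the tree of Markov solutions. Flipping the value
  # on the right of (a,b,c) gives (a, 3ab - c, b); flipping the value
  # on the left gives (b, 3bc - a, c).

  def children(triple):
      a, b, c = triple
      return [(a, 3*a*b - c, b), (b, 3*b*c - a, c)]

  row = [(2, 5, 1)]
  for _ in range(3):
      print(row)
      row = [child for triple in row for child in children(triple)]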

Now let's do the same thing with large and small units, which can be either expansions or matrices. (We know from A Nice Correspondence that the two are essentially the same.)

L                           LS                          S
L             LLS           LS            LSS           S
L      LLLS   LLS    LLSLS  LS     LSLSS  LSS    LSSS   S

Here we have triples where the center is always just the product of the sides. In the first row we have the triple (L,LS,S). In the second row we have two triples, (L,LLS,LS) and (LS,LSS,S). In the third row we have four triples, and so on.
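
The same sketch works for the units; only the flipping rule changes, because now the center of each triple is the product (here, concatenation) of the sides.

  # Generate rows of the tree of triples of units. The children of
  # (x, m, y) are (x, xm, m) and (m, my, y), so the center of each
  # triple is always the product of the sides.

  def children(triple):
      x, m, y = triple
      return [(x, x + m, m), (m, m + y, y)]

  row = [('L', 'LS', 'S')]
  for _ in range(3):
      print(row)
      row = [child for triple in row for child in children(triple)]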

Here are the expansions and matrices for the initial triple, along with the corresponding Markov numbers u.

             L            LS           S
u            2            5            1
expansion    2 2          2 2 1 1      1 1
matrix       [ 5 2 ]      [ 12 7 ]     [ 2 1 ]
             [ 2 1 ]      [  5 3 ]     [ 1 1 ]

For each of those matrices, the trace is 3u and the entry in the lower left corner is u. We can and will prove that the rest of the matrices have the same properties. By the way, a matrix with those properties is called a Cohn matrix. The definition in [Aigner] says that the value u ought to appear in the upper right, but that's not what we want here, so I've transposed everything.
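
Here's a quick numeric check of those two properties for the initial triple, with the matrices entered as shown above.

  # Check that for each matrix, trace = 3u and lower left entry = u.

  def mul(A, B):
      return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
              for i in range(2)]

  L = [[5, 2], [2, 1]]                # Markov number 2
  S = [[2, 1], [1, 1]]                # Markov number 1
  LS = mul(L, S)                      # [[12, 7], [5, 3]], Markov number 5

  for name, M, u in [('L', L, 2), ('LS', LS, 5), ('S', S, 1)]:
      print(name, M[0][0] + M[1][1] == 3*u, M[1][0] == u)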

Before we prove anything, though, we'll need to know four interrelated facts about 2×2 matrices with determinant 1. I think of myself as having a pretty good understanding of matrices, but I was very surprised by these facts.

First, suppose we have a matrix G, and suppose we also write down the transposed cofactor matrix.

[ a b ]      [  d −b ]
[ c d ]      [ −c  a ]

The product of the two is (ad − bc)I, but ad − bc = det G, and here det G = 1, so the second matrix is actually the inverse of the first.

Second, the matrices have the same trace.

tr G = (a + d) = tr G⁻¹

Third, if we add the matrices together, the off-diagonal entries cancel.

G + G⁻¹ = (a + d)I = (tr G)I

Fourth, we can rearrange that last equation to get a strange result.

G = (tr G)I − G⁻¹
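
All four facts are easy to check numerically; here's a sketch that uses the matrix L from above as the sample G.

  # Verify the four facts for a sample 2x2 matrix with determinant 1.

  def mul(A, B):
      return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
              for i in range(2)]

  a, b, c, d = 5, 2, 2, 1             # ad - bc = 1
  G    = [[a, b], [c, d]]
  Ginv = [[d, -b], [-c, a]]           # the transposed cofactor matrix
  t    = a + d                        # tr G

  print(mul(G, Ginv))                               # fact 1: the identity
  print(t == Ginv[0][0] + Ginv[1][1])               # fact 2: same trace
  print([[G[i][j] + Ginv[i][j] for j in range(2)]
         for i in range(2)])                        # fact 3: (tr G) I
  print(all(G[i][j] == t*(i == j) - Ginv[i][j]
            for i in range(2) for j in range(2)))   # fact 4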

Now we can prove things! We'll use induction over the tree structure, so suppose that we have a triple (L,LS,S) of matrices, that we have a solution (v,u,w), and that the traces of the matrices are equal to three times the corresponding Markov numbers.

tr L  = 3v
tr LS = 3u
tr S  = 3w

Here we don't assume that v is larger than w or that L is in any sense larger than S.

Now let's look at tr LLS. We can apply the fourth matrix fact with G = LS.

tr LLS = (tr L)(tr LS) − tr L(LS)⁻¹

To simplify that last term, we can use (1) the fact that traces are invariant under rotation and (2) the second matrix fact.

tr L(LS)⁻¹ = tr LS⁻¹L⁻¹ = tr S⁻¹L⁻¹L = tr S⁻¹ = tr S

Here's the result.

tr LLS = (tr L)(tr LS) − tr S

We know the values of everything on the right-hand side.

tr LLS = (3v)(3u) − 3w = 3(3uv − w)

But, we've seen 3uv − w before! It's the value u_w that we get from flipping the value w. That takes care of the left branch of the tree.

tr L   = 3v
tr LLS = 3u_w
tr LS  = 3u

In the same way, we can show that the right branch of the tree contains the value u_v = 3uw − v that we get from flipping the value v.

tr LS  = 3u
tr LSS = 3u_v
tr S   = 3w
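
In other words, the traces follow exactly the flipping rule that generates the Markov solutions. Here's a sketch that carries the induction down a few rows of the tree and confirms it numerically.

  # Walk the tree of matrix triples alongside the tree of solutions and
  # check that the traces are three times the corresponding Markov numbers.

  def mul(A, B):
      return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
              for i in range(2)]

  def tr(M):
      return M[0][0] + M[1][1]

  L = [[5, 2], [2, 1]]
  S = [[2, 1], [1, 1]]

  triples = [(L, mul(L, S), S)]
  solutions = [(2, 5, 1)]
  for _ in range(4):
      for (X, M, Y), (v, u, w) in zip(triples, solutions):
          assert (tr(X), tr(M), tr(Y)) == (3*v, 3*u, 3*w)
      triples = [t for X, M, Y in triples
                   for t in ((X, mul(X, M), M), (M, mul(M, Y), Y))]
      solutions = [s for v, u, w in solutions
                     for s in ((v, 3*u*v - w, u), (u, 3*u*w - v, w))]
  print('traces match three times the Markov numbers')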

What about the entries in the lower left corners of the matrices? Well, first of all, to get a grip on them mathematically we can use the unit vectors e₁ and e₂. Here's one example.

e₂ᵀLe₁ = v

We can remove the Markov number from that equation by multiplying by 3.

3e₂ᵀLe₁ = 3v = tr L

Then we can swap the left and right sides to get nice initial conditions for induction.

tr L  = 3e₂ᵀLe₁
tr LS = 3e₂ᵀLSe₁
tr S  = 3e₂ᵀSe₁

To start the induction, we can take an equation from above …

tr LLS = (tr L)(tr LS) − tr S

… and substitute in the new values.

tr LLS = (3e₂ᵀLe₁)(3e₂ᵀLSe₁) − 3e₂ᵀSe₁

How anyone ever figured out this next part, I have no idea. On the right side of tr L we have Le₁, which is a column vector with components a and c, and on the left side of tr LS we have e₂ᵀL, which is a row vector with components c and d. Since they're next to each other, we can multiply them together, but we have to remember that matrix multiplication isn't commutative. When we multiply a row vector by a column vector, we get a scalar, the dot product, but when we multiply a column vector by a row vector, we get a matrix.

[ ac ad ]
[ cc cd ]

That matrix looks a lot like the matrix cL (or Lc, same thing).

[ ac bc ]
[ cc dc ]

In fact, the difference is zero except that the entry in the upper right corner is ad − bc = det L = 1.

[ 0 1 ]
[ 0 0 ]

We can write that as e₁e₂ᵀ; then we have the following.

Le₁e₂ᵀL = cL + e₁e₂ᵀ
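
That identity is easy to confirm numerically, again with L as the sample matrix.

  # Check the identity (L e1)(e2^T L) = cL + e1 e2^T for det L = 1.

  a, b, c, d = 5, 2, 2, 1             # ad - bc = 1
  Le1  = [a, c]                       # L e1: the first column of L
  e2TL = [c, d]                       # e2^T L: the second row of L

  outer = [[Le1[i] * e2TL[j] for j in range(2)] for i in range(2)]
  cL_plus = [[c*a, c*b + 1],          # the +1 is the e1 e2^T term,
             [c*c, c*d]]              # valid because ad - bc = 1
  print(outer == cL_plus)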

We can rewrite the matrix cL by tacking on a factor of 3 and then applying the third matrix fact.

3cL = 3vL = (tr L)L = (L + L⁻¹)L = LL + I

Let's put all the pieces together.

3Le₁e₂ᵀL = 3cL + 3e₁e₂ᵀ = LL + I + 3e₁e₂ᵀ

When we put that back into context, the third term becomes zero when it meets the e₂ᵀ in front, the second term becomes 3e₂ᵀSe₁ and cancels with the second term in the context, and the first term becomes the desired result.

tr LLS = 3e₂ᵀLLSe₁

If we copy and paste the equations for tr L and tr LS, that takes care of the left branch of the tree. And, of course we can handle the right branch of the tree in just the same way.

tr LSS = 3e₂ᵀLSSe₁
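
Here's one last sketch that carries this induction down a few rows: every matrix in the tree has its Markov number in the lower left corner, which is the same as saying that the trace is three times the lower left entry.

  # Walk a few rows of the tree of matrices and check that each matrix
  # is a (transposed) Cohn matrix: trace = 3 * (lower left entry).

  def mul(A, B):
      return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
              for i in range(2)]

  L = [[5, 2], [2, 1]]
  S = [[2, 1], [1, 1]]

  row = [L, mul(L, S), S]             # one row of the tree, like L LS S
  for _ in range(4):
      for M in row:
          assert M[0][0] + M[1][1] == 3 * M[1][0]
      # insert the product of each adjacent pair to form the next row
      row = [M for A, B in zip(row, row[1:])
               for M in (A, mul(A, B))] + [row[-1]]
  print('every matrix has its Markov number in the lower left corner')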

With that, the proof is complete. Unfortunately, it's only a proof, and not, as the Merovingian would say, a reason.

But this is not a reason, this is not a “why”.

Where does that leave us?

  • From Group Elements and Matrices we now really know that D = 9u² − 4. The pattern magic has become pattern technology!
  • From The Discriminant we now really know that M ≥ (root D)/u (because the entry in the lower left corner of the matrix is also the denominator q_(L−1)).
  • We still don't know how to prove that that inequality is actually an equality.
  • We still don't know how to prove that there are no other expansions with M < 3. (The sketch below shows the known values creeping up toward 3.)
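
As a numeric footnote (my own check, not from [Aigner]), here's what the values (root D)/u = (root (9u² − 4))/u look like for the Markov numbers in the first few rows of the tree: all less than 3, creeping upward toward it.

  # Print the bound (root D)/u for the Markov numbers in the first rows.

  import math

  row = [(2, 5, 1)]
  markov = {1, 2, 5}
  for _ in range(3):
      row = [t for v, u, w in row
               for t in ((v, 3*u*v - w, u), (u, 3*u*w - v, w))]
      markov.update(u for _, u, _ in row)

  for u in sorted(markov):
      print(u, math.sqrt(9*u*u - 4) / u)  # always just under 3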

 


@ November (2024)