Maker of Patterns: Part 1
A mathematician, like a painter or poet, is a maker of patterns. If his patterns are more permanent than theirs, it is because they are made with ideas.
G.H. Hardy
How can a mathematical proof or a block of code be beautiful? The usual descriptions are elegant, generalizable, insightful, or surprising. It's a question I've thought deeply about as an amateur (a generous description) mathematician and a professional software engineer.
One of the best descriptions of this phenomenon comes from the pure mathematician G.H. Hardy, known for his contributions to number theory and analysis (and, in biology, for the Hardy-Weinberg Principle). In 1940, he wrote A Mathematician's Apology, a defense of devoting his life to pure mathematics.
Hardy believed that pure mathematics was justified entirely by its aesthetic value and not by its practical uses. In particular, Hardy valued two ideas in pure mathematics: generality and depth. But these principles aren't just found in mathematics. They are central to computer science and the art of software engineering.
Generality is ambiguous and complex to define, but it's more easily shown than explained. Consider a trivial instance of the pigeonhole principle:
If three objects are each painted either red or blue, then there must be at least two objects of the same color.
And a proof:
Assume, without loss of generality, that the first object is red. If either of the other two objects is red, then we are finished; if not, then the other two must be blue.
Generality here refers to the fact that we could have made the objects red or blue, white or black. We could have generalized to n+1 objects and n colors (and if you want to sound like a mathematician, you can abbreviate the phrase "without loss of generality" to w.l.o.g.).
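To see the generalized version concretely, here's a minimal Python sketch (the function name and setup are mine, invented for illustration): with n colors but n + 1 objects, a repeated color is guaranteed, so the lookup below can never fail.

```python
# Pigeonhole, generalized: n colors but n + 1 objects, so some color must repeat.
import random

def first_repeated_color(objects):
    """Return the first color that appears twice."""
    seen = set()
    for color in objects:
        if color in seen:
            return color
        seen.add(color)
    raise ValueError("no repeat")  # unreachable: more objects than colors

colors = ["red", "blue", "white", "black"]                         # n colors
objects = [random.choice(colors) for _ in range(len(colors) + 1)]  # n + 1 objects
print(first_repeated_color(objects))  # always succeeds, by pigeonhole
```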
Take chess, for example. For mathematical problems involving chess, it doesn't matter whether the pieces are white and black, blue and red, or even physical pieces at all. So while Hardy believed these sorts of general problems weren't as pure as his beloved number theory, I think they're still in the same category.
Computer scientists seek out generality through abstraction. Generalization appears in object-oriented programming, in functions and methods, and in generic and dynamic typing. To a programmer, one general solution that achieves the same result as several case-specific solutions is objectively better.
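Here's a minimal Python sketch of that idea (the functions and data are mine, purely illustrative): two case-specific solutions collapse into one general one, parameterized by a measure.

```python
# Case-specific solutions: each solves one version of the same problem.
def oldest_person(people):
    return max(people, key=lambda person: person["age"])

def longest_word(words):
    return max(words, key=len)

# The general solution: any "largest by some measure" problem is one function.
def largest_by(items, measure):
    return max(items, key=measure)

print(largest_by(["ab", "abcd", "a"], len))  # 'abcd'
print(largest_by([3, -7, 2], abs))           # -7
```

Like the proof above, largest_by doesn't care what the objects are; w.l.o.g., they could be people, words, or numbers.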
I wrote about my philosophy of generality and how I used it to choose what to work on in First Principles. It's a slightly different way of looking at it, but it may be interesting if you're still curious about applying the concept to everyday life.
Depth is another term that's difficult to define but central to the aesthetic of mathematics and computer science. It has something to do with difficulty: deeper ideas tend to be harder to understand. But the two aren't the same; Pythagoras' theorem is deep, yet many mathematicians wouldn't find it difficult to understand. One way to think of depth is as the web of relationships between mathematical ideas. For example, irrational numbers (e.g., √2) are "deeper" than integers (i.e., whole numbers).
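For a concrete taste, here is a sketch of the classical proof, one Hardy himself presents in the Apology, that √2 is irrational. Every step is elementary, yet the fact it reaches is deep.

```latex
% Pythagoras' proof (by contradiction) that \sqrt{2} is irrational.
Suppose $\sqrt{2} = a/b$ with $a, b$ integers and the fraction in lowest terms.
Squaring gives $a^2 = 2b^2$, so $a^2$ is even, and therefore $a$ is even.
Write $a = 2c$: then $4c^2 = 2b^2$, so $b^2 = 2c^2$ and $b$ is even as well.
But now $a$ and $b$ share the factor $2$, contradicting lowest terms.
Hence $\sqrt{2}$ cannot be written as a ratio of integers.
```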
In computer science, depth is more literal. It can be thought of as the layers of the TCP/IP stack, or as the distance between low-level machine code and high-level human-readable code. Sometimes, problems can only be solved by going deeper, dropping one layer down the stack of abstractions. Both mathematicians and software engineers need the ability to map the relationships and hierarchies between these layers.
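As a sketch of what going one layer deeper looks like in practice (example.com and the hand-written request are my placeholders, not from the post): the same HTTP GET, first through a high-level library, then by speaking the protocol directly over a raw TCP socket.

```python
# The same HTTP GET at two depths of the stack.
import socket
import urllib.request

# High level: the library hides sockets, headers, and response parsing.
with urllib.request.urlopen("http://example.com/") as response:
    page = response.read()

# One layer down: open a TCP connection and write the HTTP request by hand.
with socket.create_connection(("example.com", 80)) as sock:
    sock.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
    raw = b""
    while chunk := sock.recv(4096):
        raw += chunk
# raw now holds the status line, headers, and body that urlopen parsed for us.
```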
In part two, I'll look at Hardy's second argument about pure mathematics: mathematics is a young person's game. It's an observation that also generalizes to computer science and software engineering. Finally, I'll investigate the link and what it means for the future of both disciplines.
Footnote: Hardy thought that pure mathematics was superior to applied mathematics. His field of number theory was the purest within mathematics because it had few or no real-world applications. Hardy was trying to distance himself from WWII, which had started a year earlier, in 1939. Ironically, number theory would become the foundation for cryptography, which played a central role in the war, driving encryption, code-breaking, and communication.