Handmade Folks,

UPDATE (5/22/2018)

rby90 recommended teachyourselfcs

Delix added a Handmade wiki entry; see The Pigeonhole Principle.

UPDATE (6/2/2018)

Halarious recommends The Napkin Project. At the time of writing, the last sections present set theory pretty accessibly.

---

In lieu of providing monthly news (nothing's changed since April), I want to give a brief and high-level overview of discrete mathematics. I've felt alarmed by rumors of programming students this year questioning its value, or just not knowing what it is! I can't confirm those rumors, but it won't hurt to present the topic.

*NOTE: I am not a master, but I can get by. It's also improved my skills as an engineer nontrivially, so I'd like beginners to know what it is. With a subject this close to my heart, I may come across as pushy and opinionated.*

**What is Discrete Mathematics?**

My old textbook, surprisingly, shows us a powerful sample of the problems discrete math can solve. So I'll start by quoting these directly:

How many ways are there to choose a valid password on a computer system?

What is the probability of winning a lottery?

Is there a link between two computers in a network?

How can I identify spam e-mail messages?

How can I encrypt a message so that no unintended recipient can read it?

What is the shortest path between two cities using a transportation system?

How can a list of integers be sorted so that the integers are in increasing order?

How many steps are required to do such a sorting?

How can it be proved that a sorting algorithm correctly sorts a list?

How can a circuit that adds two integers be designed?

How many valid Internet addresses are there?

All these problems are mathematical, and we can solve them as such. Also note the common thread underpinning these questions: they involve *discrete* entities (discrete as in separate, distinct, and unattached). Discrete mathematics, then, may be defined as the study of discrete structures. The problems listed can be viewed as instances of discrete structures with well-defined properties and behavior, and like any other mathy thing -- we can prove theorems about them as well.

The terminology here leads to a crucial observation: computers store and manipulate information in a discrete fashion. They must solve all problems, whether continuous or discrete in nature, discretely.

**What are these discrete structures?**

Too many to go over! I'll list the ones I've found most useful as a software engineer.

Sets

A set is an unordered collection of things. Anyone who's been a programmer for any length of time has probably managed a series of objects (stored as arrays, linked lists, elements in a hash table, etc.). Studying the set as a basic discrete structure is useful, because we can think about many programming problems as being a "set" of something. For example, I'm sure the reader has heard other programmers say *C++ is a 'superset' of C* or *this problem is a 'subset' of the longest path problem*. Please note those terms are rigorously defined and come from discrete math.

[NOTE: C++ is not a superset of C.]

At a minimum, one should learn about the set identities, null set, singleton set, unions of sets, intersections of sets and the cardinality of sets. Understand the difference between countable and uncountable sets. They give an intuition about the properties of entities belonging to a group. I've solved programming problems by proving two countable sets are disjoint; in other words, that their intersection is the empty set. It was a matter of *recognizing* the problem could be understood in terms of set theory.
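As a sketch of how directly this maps to code (the names below are made up for illustration), checking that two sets are disjoint is a one-liner once the data is modeled as sets:

```python
# Two made-up groups of identifiers, modeled as Python sets.
reserved_keywords = {"if", "else", "while", "return"}
user_identifiers = {"count", "total", "else_branch"}

# Two sets are disjoint iff their intersection is the empty set.
assert reserved_keywords & user_identifiers == set()
assert reserved_keywords.isdisjoint(user_identifiers)

# Cardinality of a union: |A ∪ B| = |A| + |B| - |A ∩ B|.
union_size = len(reserved_keywords | user_identifiers)
assert union_size == len(reserved_keywords) + len(user_identifiers)
```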

Should the reader not be convinced one should study this, be aware many discrete structures are constructed using sets. These include combinations and permutations, graphs, and finite-state machines.

Combinatorics

Combinatorics is the study of arranging objects, and with the exception of Turing machines, learning "how to count" is the most important mathematical subject I've ever studied. At least five of the problems listed at the beginning of this blog post *require* combinatorics, whether directly or implicitly. In addition, some intuition will develop by studying this topic: remember, computers do things step-by-step. Their effort is countable. If you study advanced counting for months, and you've done enough low-level programming, you'll develop a VERY strong sense of the effort involved in any programming task. Phrased another way, a software engineer who can foresee combinatorial explosions will prevent them through smarter architectural decisions, e.g. not over-engineering and not generalizing code when it's not worth the cost. Sounds obvious, but the difference between someone who knows this and someone who **knows** this is a true order of magnitude.

At a minimum, one should learn the basics: product rule, sum rule, principle of inclusion-exclusion, and the division rule. With that alone, one can answer the first and last questions posed at the start, along with questions I recommend programmers be capable of answering, such as:

How many variables can be named in Go? What about Rust?

HINT: My old textbook fleshes out the question for Java, quoted below:

The name of a variable in the JAVA programming language is a string of between 1 and 65,535 characters, inclusive, where each character can be an uppercase or a lowercase letter, a dollar sign, an underscore, or a digit, except that the first character must not be a digit. Determine the number of different variable names in JAVA.

How many different WiFi WPA keys are there?

HINT: My old textbook fleshes out the question for the (very old) WEP key, quoted below:

A wired equivalent privacy (WEP) key for a wireless fidelity (WiFi) network is a string of either 10, 26, or 58 hexadecimal digits. How many different WEP keys are there?
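Taking the textbook's WEP definition (keys of 10, 26, or 58 hexadecimal digits), the count is a direct sum-rule-over-product-rule computation; a minimal sketch:

```python
# Product rule: a key of length n has 16**n possibilities
# (16 choices of hex digit per position, positions independent).
# Sum rule: the three allowed lengths are mutually exclusive cases.
wep_keys = 16**10 + 16**26 + 16**58
assert wep_keys == sum(16**n for n in (10, 26, 58))
```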

How many positive integers less than 100,000 are not divisible by 6 or by 8? (Requires principle of inclusion-exclusion)
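The divisibility question above can be answered with inclusion-exclusion and brute-force checked in a few lines:

```python
# Inclusion-exclusion: |div by 6 or 8| = |div by 6| + |div by 8| - |div by 24|,
# where 24 = lcm(6, 8). We count positive integers below 100,000.
N = 99_999
div_6_or_8 = N // 6 + N // 8 - N // 24
answer = N - div_6_or_8

# Brute-force check over the same range confirms the formula.
assert answer == sum(1 for k in range(1, 100_000) if k % 6 and k % 8)
assert answer == 75_000
```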

After this, one could go for the Pigeonhole Principle, which is a fun and useful one. To top off the introduction to combinatorics I would say go with combinations and permutations. They are discrete structures built upon sets and basic counting.
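For readers who want to poke at these numbers immediately, Python's standard library exposes the counts directly (a quick sanity-check sketch, not a substitute for working the proofs):

```python
import math

# C(n, r): ways to *choose* r items from n when order is irrelevant.
assert math.comb(5, 2) == 10
# P(n, r): ways to *arrange* r items drawn from n when order matters.
assert math.perm(5, 2) == 20
# The generalized pigeonhole principle: with 366 people (pigeons) and
# 365 birthdays (holes), at least ceil(366/365) = 2 people share one.
assert math.ceil(366 / 365) == 2
```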

HINT: Let's consider a real-world example:

Recently at work, I've been using theorems about combinations and permutations to permute a subset of any valid input given to the compiler. This produces a basic version of a compiler fuzzer to catch parser bugs (a more serious fuzzer could be a random program generator, but I've found more compiler bugs in less time by permutation). The subset here is an R-combination of substrings, and I'm very specific about this:

In how many ways can I select R substrings from a source file with Y lines?

Let's say that after answering this, I choose to permute 8 substrings that seem likely to cause compiler bugs. Combinatorics tells us 8 strings can be rearranged in 8! = 40,320 ways. I might decide to care only about the first 10,000.

How do I produce 10,000 rearrangements of 8 strings without hard-coding them?

Fortunately, there's a well-known algorithm in permutation theory to efficiently generate all the permutations of an N-length set in lexicographic order, as needed. The proof is elegant, but more importantly for me: problem solved.

Counter-Argument

"But Abner," the reader protests, "why not just randomly choose some strings and randomly swap them around?" That's a fair observation. The reasoning is that I don't want to be at the full mercy of a PRNG to explore the sample space of inputs. By being in full control of how substrings are combined and permuted, I can generate sample inputs that are unique for every run. Even if I do decide to use PRNGs, it's better to apply them to varying the R-combination and N-permutation of my earlier approach, as needed. That way I keep a steering wheel into the input space, even if part of the process is randomized.

Furthermore, I can use discrete probability to predict the outcome of an event. For example, if my boss asks, *"What's the probability your fuzzer will produce the same result? We don't want to waste server cycles,"* I can tell him: *"I wouldn't worry, there's only a 1 in 1,231,323,000 chance we will obtain the same compiler input, and here's why."*

[NOTE: Discrete probability is another topic from discrete mathematics, and it's par for the course in ML and AI research. As the reader may have guessed, the theorems proved in discrete probability rely on sets, basic counting, combinations, and permutations.]
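A classic way to enumerate rearrangements on demand is the next-permutation algorithm, which steps a sequence through all of its orderings in lexicographic order. This is my own illustrative sketch, not code from any real fuzzer:

```python
def next_permutation(seq):
    """Advance seq (a list) to its next permutation in lexicographic
    order, in place. Returns False when seq is already the last one."""
    # 1. Find the rightmost position i with seq[i] < seq[i + 1].
    i = len(seq) - 2
    while i >= 0 and seq[i] >= seq[i + 1]:
        i -= 1
    if i < 0:
        return False  # seq is fully descending: no successor exists.
    # 2. Find the rightmost element greater than seq[i] and swap.
    j = len(seq) - 1
    while seq[j] <= seq[i]:
        j -= 1
    seq[i], seq[j] = seq[j], seq[i]
    # 3. Reverse the tail so it becomes the smallest arrangement.
    seq[i + 1:] = reversed(seq[i + 1:])
    return True

# Emit the first 10,000 of the 8! = 40,320 orderings of 8 substrings.
strings = ["s%d" % k for k in range(8)]  # placeholder substrings, sorted
orderings = [list(strings)]
while len(orderings) < 10_000 and next_permutation(strings):
    orderings.append(list(strings))
assert len(orderings) == 10_000
```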

Graphs

This will be a short section. We intuitively know what a graph is. Just like sets, they are foundational in understanding other discrete structures, but they can also be used directly for understanding patterns of programming behaviors. Social networks, the Web, airline routes, the module dependencies in someone's program, etc. are instances of graph theory problems. In addition, I'm sure the reader has heard other programmers say *my program's call 'graph' is too large* or *this problem amounts to collapsing 'nodes' in an abstract syntax 'tree'.* Please note those terms are rigorously defined, and come from discrete math.

At a minimum, one should learn the definitions of directed and undirected graphs, bipartite graphs, and the star, ring, and hybrid topologies.
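The "is there a link between two computers in a network?" question from the start of the post falls out of a breadth-first search over an adjacency list. A minimal sketch, with made-up machine names:

```python
from collections import deque

# An undirected network as an adjacency list; the names are illustrative.
network = {
    "alice": ["router"],
    "bob": ["router"],
    "router": ["alice", "bob", "backbone"],
    "backbone": ["router"],
    "island": [],  # a machine with no links at all
}

def linked(graph, start, goal):
    """Breadth-first search: is there a path from start to goal?"""
    seen, frontier = {start}, deque([start])
    while frontier:
        node = frontier.popleft()
        if node == goal:
            return True
        for neighbor in graph[node]:
            if neighbor not in seen:
                seen.add(neighbor)
                frontier.append(neighbor)
    return False

assert linked(network, "alice", "backbone")
assert not linked(network, "alice", "island")
```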

Turing Machines

In my view, for a programmer looking for immediate results, this topic takes longer to pay off than, say, combinatorics. However, long-term study of language grammars, finite-state automata, and ultimately Turing machines will open the floodgates. The key insight that changed my life (and career trajectory) was learning that *programming finds the right finite-state automaton for a specific task.*

HINT: To make more sense of this, I recommend Per's Bitwise. His first few episodes immediately use a simple grammar and a pushdown automaton (a type of state machine) in a practical setting using the C language.
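To make the idea concrete with something far smaller than a parser, here is a toy deterministic finite automaton of my own devising (not from Bitwise) that accepts unsigned decimal integers:

```python
# A two-state DFA that accepts unsigned decimal integers ("0", "42", ...).
# States: "start" (nothing read yet) and "digits" (the accepting state).
def accepts_unsigned_int(text):
    state = "start"
    for ch in text:
        if "0" <= ch <= "9":
            state = "digits"   # any digit moves/keeps us in "digits"
        else:
            return False       # no transition defined for ch: reject
    return state == "digits"   # accept only if at least one digit was read

assert accepts_unsigned_int("42")
assert not accepts_unsigned_int("")      # empty input never leaves "start"
assert not accepts_unsigned_int("4x2")   # 'x' has no transition
```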

**Conclusion**

**Resources for Mastering Discrete Math, Ranked by Difficulty**

I implore the community to take mathematics in their life seriously. Discrete math is to programmers what calculus is to physicists.

Discrete Mathematics with Applications - Anyone too rusty with their math will thank Susanna for her clarity of thought and easy-to-follow examples, ramping up the difficulty only after the reader has attained enough comfort.

My Old Textbook - Still as expensive as ever, I see. Sigh. However, no other book has a larger collection of exercises. I recommend Susanna to get an overview, and this book if someone gets serious about mastering the subject fully. The reader can get by with Susanna just fine, though.

[Optional] Concrete Mathematics: A Foundation for Computer Science - Donald Knuth's Concrete Mathematics is a blending of CONtinuous and disCRETE mathematics. It's a book that expands on papa Knuth's famous *Mathematical Preliminaries* in The Art of Computer Programming... I recommend it only after going through my old textbook.

[Optional] CLRS - The popular CLRS book is concerned with the modern anthology of classical computer science algorithms. Deep algorithmic thinking is a superpower most of us don't get to fully develop, and it should be the reader's next goal after learning discrete math. I'm serious, do NOT attempt this book until you understand discrete math. It's the language used for describing the algorithms. Some veteran readers may disagree with me on this, but at the very least, I strongly advise a decent overview of the specific discrete structures I presented in this post.

Yours,

Abner

