Why do I need to learn about discrete mathematics?
Discrete mathematics is a fundamental tool for understanding many theories and techniques behind artificial intelligence, machine learning, deep learning, data mining, security, digital image processing and natural language processing.
The problem-solving techniques and computational thinking introduced in discrete mathematics are also necessary for creating complicated software.
What can I do after learning discrete mathematics?
You will be prepared to learn the modern theories and techniques used to create security, machine learning, data mining, image processing or natural language processing software.
What should I do now?
Please read
– this book: Kenneth H. Rosen (2012). Discrete Mathematics and Its Applications. McGraw-Hill, and
– this book: Alfred V. Aho and Jeffrey D. Ullman (1994). Foundations of Computer Science (free online version).
Alternatively, please watch this course: MIT 6.042J – Mathematics for Computer Science, Fall 2010 (Textbook).
Terminology Review:
- Statement: An assertion that is either true or false.
- Mathematical Statements.
- Mathematical Proof: A convincing argument that a statement is true.
- If p, then q: p is the hypothesis and q is the conclusion.
- Proposition: A true statement.
- Theorem: An important proposition.
- Lemmas: Supporting propositions.
- Logic: A language for reasoning, consisting of a collection of rules that we use when reasoning.
- Propositional Logic: A logic about truth and falsity of statements.
- Logical Connectives: Not (Negation), And (Conjunction), Or (Disjunction), If ... then (Implication), If and only if (Equivalence). See the truth-table sketch after this list.
- Truth Table.
- Contrapositive of Proposition: The contrapositive of p ⇒ q is the proposition ¬q ⇒ ¬p.
- Modus Ponens: If both P ⇒ Q and P hold, then Q can be concluded.
- Predicate: A property of some objects or a relationship among objects represented by the variables.
- Quantifier: Tells how many objects have a certain property.
- Mathematical Induction: Base Case, Inductive Case.
- Recursion: A base case and a recursive step.
- Sum Example: Annuity (see the geometric-sum sketch after this list).
- Set.
- Subset.
- Set Operations: union A ∪ B, intersection A ∩ B, complement (for A ⊂ U) A’ = {x : x ∈ U and x ∉ A}, difference A \ B = A ∩ B’ = {x : x ∈ A and x ∉ B}. See the set-operations sketch after this list.
- Cartesian Product: A × B = {(a, b) : a ∈ A and b ∈ B}.
- A binary relation (or just relation) from X to Y is a subset R ⊆ X × Y. To describe the relation R, we may list the collection of all ordered pairs (x, y) such that x is related to y by R.
- A mapping or function f ⊆ A × B from a set A to a set B is a special type of relation in which for each element a ∈ A there is a unique element b ∈ B such that (a, b) ∈ f.
- Equivalence Relation.
- Equivalence Class.
- Partition.
- A state machine is a binary relation on a set: the elements of the set are called states, the relation is called the transition relation, and an arrow in the graph of the transition relation is called a transition.
- Greatest Common Divisor (see the Euclidean-algorithm sketch after this list).
- Division Algorithm.
- Prime Numbers.
- The Fundamental Theorem of Arithmetic: Let n be an integer such that n > 1. Then n can be factored, uniquely up to the order of the factors, as a product of prime numbers: n = p₁p₂ ⋯ pₖ.
- Congruence: a is congruent to b modulo n if n | (a – b), written a ≡ b (mod n).
- Fermat’s Little Theorem: If p is prime and p does not divide a, then aᵖ⁻¹ ≡ 1 (mod p). See the modular-exponentiation sketch after this list.
- Stirling’s Approximation.
- Probability.
- Example: The Monty Hall Problem (see the simulation sketch after this list).
- The Four Step Method: (1) Find the Sample Space (Set of possible outcomes), (2) Define Events of Interest (Subset of the sample space), (3) Determine Outcome Probabilities, (4) Compute Event Probabilities.
- A tree diagram is a graphical tool that can help us work through the four step approach when the number of outcomes is not too large or the problem is nicely structured.
- Example: The Strange Dice.
- Conditional Probability: P(A|B) = P(A ∩ B) / P(B).
- A conditional probability P(B|A) is called a posteriori if event B precedes event A in time.
- Example: Medical Testing (see the Bayes’ rule sketch after this list).
- Independence: P(B|A) = P(B) or P(A∩B) = P(A) · P(B).
- Mutual Independence: The probability of each event is the same no matter which of the other events has occurred.
- Pairwise Independence: Any two events are independent.
- Example: The Birthday Problem.
- The birthday paradox refers to the counterintuitive fact that in a group of only 23 people, the probability that two of them share a birthday already exceeds 50%; with 70 people it reaches 99.9%. See the sketch after this list.
- Bernoulli Random Variable (Indicator Random Variable): f: Ω → {0, 1}.
- Binomial Random Variable: The number of successes in an experiment consisting of n independent trials. P(X = x) = [n! / (x! · (n − x)!)] pˣ(1 − p)ⁿ⁻ˣ. See the sketch after this list.
- Expectation (Average, Mean): E[R] = Σ R(ω) · P(ω) (sum over outcomes ω) = Σ x · P(R = x) (sum over values x).
- Median: The value x for which P(R < x) ≤ 1/2 and P(R > x) < 1/2.
- Example: Splitting the Pot.
- Mean Time to Failure: If a system independently fails at each time step with probability p, then the expected number of steps up to the first failure is 1/p.
- Linearity of Expectation.
- Example: The Hat Check Problem.
- Example: Benchmark: E(Z/R) = 1.2 does NOT mean that E(Z) = 1.2E(R).
- Variance: var(X) = E[(X−E[X])²].
- Kurtosis: E[(X − E[X])⁴] / (var(X))², the fourth central moment E[(X − E[X])⁴] normalized by the squared variance.
- Markov’s Theorem: P(R ≥ x) ≤ E(R)/x (R ≥ 0, x > 0).
- Chebyshev’s Theorem: P(|R – E(R)| ≥ x) ≤ var(R)/x². A bound on the probability of deviating from the mean. See the simulation sketch after this list.
- The Chernoff Bound: P(T ≥ c·E(T)) ≤ e^(−z·E(T)), where z = c·ln(c) − c + 1, T = Sum(Tᵢ), the Tᵢ are mutually independent, and 0 ≤ Tᵢ ≤ 1.
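Code Sketches (Python):
The truth tables for the logical connectives can be checked mechanically. Here is a minimal Python sketch (the helper names implies and iff are mine, not from the textbooks above) that prints the truth table for ¬p, p ∧ q, p ∨ q, p ⇒ q and p ⇔ q.

# Print a truth table for the five logical connectives.
def implies(p, q):
    return (not p) or q   # p => q is false only when p is true and q is false

def iff(p, q):
    return p == q         # p <=> q is true when p and q have the same truth value

print("p", "q", "not p", "p and q", "p or q", "p => q", "p <=> q", sep="\t")
for p in (True, False):
    for q in (True, False):
        print(p, q, not p, p and q, p or q, implies(p, q), iff(p, q), sep="\t")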
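The annuity sum example is a geometric sum. The sketch below, with made-up payment, interest-rate and term values, compares the term-by-term sum with the closed-form geometric-sum formula.

# Present value of an annuity paying m at the start of each of n years,
# with annual interest rate p: V = sum over i = 0..n-1 of m / (1 + p)^i.
def annuity_sum(m, p, n):
    return sum(m / (1 + p) ** i for i in range(n))

def annuity_closed_form(m, p, n):
    x = 1 / (1 + p)                       # common ratio of the geometric series
    return m * (1 - x ** n) / (1 - x)     # geometric-sum formula

m, p, n = 50_000, 0.08, 20                # illustrative values only
print(annuity_sum(m, p, n))               # term-by-term sum
print(annuity_closed_form(m, p, n))       # closed form; the two agree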
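The set operations and the Cartesian product map directly onto Python's built-in set type; the sets A, B and the universe U below are made-up examples.

# Set operations on small example sets.
U = {0, 1, 2, 3, 4, 5, 6, 7, 8, 9}
A = {1, 2, 3, 4}
B = {3, 4, 5, 6}

print(A | B)                             # union A ∪ B
print(A & B)                             # intersection A ∩ B
print(U - A)                             # complement A’ relative to the universe U
print(A - B)                             # difference A \ B
print({(a, b) for a in A for b in B})    # Cartesian product A × B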
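For the greatest common divisor, one standard construction (not necessarily the one followed in the textbooks above) is Euclid's algorithm, which repeatedly applies the division algorithm a = qb + r:

# Euclid's algorithm: gcd(a, b) = gcd(b, a mod b), and gcd(a, 0) = a.
def gcd(a, b):
    while b != 0:
        a, b = b, a % b    # division algorithm: a = q*b + r with 0 <= r < b
    return a

print(gcd(1147, 899))      # 31
import math
print(math.gcd(1147, 899)) # the standard library agrees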
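Fermat's little theorem is easy to spot-check with modular exponentiation; the prime and the bases below are arbitrary choices.

# Check Fermat's little theorem: a^(p-1) ≡ 1 (mod p) for prime p and p not dividing a.
def is_prime(n):
    if n < 2:
        return False
    return all(n % d != 0 for d in range(2, int(n ** 0.5) + 1))

p = 101                          # an arbitrary prime
for a in (2, 3, 10, 57):         # arbitrary bases not divisible by p
    print(a, pow(a, p - 1, p))   # every result is 1
print(is_prime(p))               # True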
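The Monty Hall problem can also be settled by simulation rather than by the four step method; a rough Monte Carlo sketch (the trial count is arbitrary) that estimates the win probability of the "always switch" and "never switch" strategies:

# Monte Carlo simulation of the Monty Hall problem.
import random

def play(switch, trials=100_000):
    wins = 0
    for _ in range(trials):
        prize = random.randrange(3)     # door hiding the car
        choice = random.randrange(3)    # contestant's first pick
        # The host opens a door that is neither the pick nor the prize.
        opened = next(d for d in range(3) if d != choice and d != prize)
        if switch:
            choice = next(d for d in range(3) if d != choice and d != opened)
        wins += (choice == prize)
    return wins / trials

print(play(switch=True))    # ≈ 2/3
print(play(switch=False))   # ≈ 1/3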
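The medical testing example is an application of the conditional-probability definition (Bayes' rule). The prevalence, sensitivity and false-positive rate below are made-up illustrative numbers, not figures from the course.

# P(disease | positive test) via the definition of conditional probability.
p_disease = 0.01                 # prior: 1% of the population has the disease (illustrative)
p_pos_given_disease = 0.95       # sensitivity (illustrative)
p_pos_given_healthy = 0.05       # false-positive rate (illustrative)

p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))   # total probability of a positive test
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos

print(p_disease_given_pos)       # ≈ 0.161: most positive results are false positives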
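The birthday-paradox percentages quoted above can be reproduced from the exact product P(all n birthdays distinct) = (365/365) · (364/365) · ... · ((365 − n + 1)/365):

# Probability that at least two of n people share a birthday (365 equally likely birthdays).
def birthday_collision(n):
    p_all_distinct = 1.0
    for i in range(n):
        p_all_distinct *= (365 - i) / 365
    return 1 - p_all_distinct

print(birthday_collision(23))   # ≈ 0.507, already above 50%
print(birthday_collision(70))   # ≈ 0.999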
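The binomial formula and the definition of expectation can be checked together; the sketch below verifies that the probabilities sum to 1 and that Σ x · P(X = x) equals n·p for made-up parameters n and p.

# Binomial distribution: probability mass function, normalization, and expectation.
from math import comb

def binom_pmf(x, n, p):
    return comb(n, x) * p ** x * (1 - p) ** (n - x)   # [n!/(x!(n-x)!)] p^x (1-p)^(n-x)

n, p = 10, 0.3                                        # illustrative parameters
pmf = [binom_pmf(x, n, p) for x in range(n + 1)]

print(sum(pmf))                                       # 1.0: the outcomes cover the sample space
print(sum(x * pmf[x] for x in range(n + 1)))          # expectation ≈ n * p = 3.0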
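Markov's and Chebyshev's theorems give bounds, not exact values; the sketch below compares the simulated deviation probability of a binomial random variable (made-up parameters) with the Chebyshev bound var(R)/x².

# Compare the empirical probability of a large deviation with Chebyshev's bound.
import random

n, p, x = 100, 0.5, 10           # illustrative: R ~ Binomial(100, 0.5), deviation threshold x = 10
mean, var = n * p, n * p * (1 - p)

trials = 100_000
count = 0
for _ in range(trials):
    r = sum(random.random() < p for _ in range(n))    # one binomial sample
    count += abs(r - mean) >= x
print(count / trials)            # empirical P(|R - E(R)| >= 10), ≈ 0.057
print(var / x ** 2)              # Chebyshev bound: 0.25, much looser but always valid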
After finishing learning about discrete mathematics, please click Topic 21 – Introduction to Computation and Programming using Python to continue.