
From The Vault: Diving into Big O, Big Theta, and Big Omega

Welcome to the first post in a new series I’m starting, “From The Vault”. In this series, I’ll be sharing some of the notes, thoughts, and insights I’ve recorded over the course of my studies in Software Engineering at Iowa State and beyond. These notes come straight from my personal Obsidian vault.

Today, we’re delving into the world of computational complexity, specifically focusing on Big O, Big Theta, and Big Omega. They are fundamental concepts in the study of algorithm efficiency and performance, and they were covered during my COM S 311 class at Iowa State University.

Note: While these notes are based on content directly from my vault, they have been parsed and reorganized prior to sharing so that they make more sense within a blog post.



Understanding Asymptotic Efficiency

Asymptotic Efficiency examines how the running time of an algorithm increases with the input size, particularly as the input size approaches infinity. It’s vital for evaluating algorithms beyond raw execution times, which can vary across different hardware.


The Big Three of Computational Complexity

Big-O Notation ($O$)

Big-O gives an upper bound on the asymptotic behavior of a function, essentially defining the maximum rate at which the function can grow.

Mathematical Definition: If positive constants $c$ and $n_{0}$ exist such that $f(n) \leq c \cdot g(n)$ for all $n \geq n_{0}$, then $f \in O(g(n))$.

Size of $n$ based on input type:

| Type    | Input Size             |
| ------- | ---------------------- |
| integer | # of bits or digits    |
| array   | # of elements          |
| string  | length of string       |
| graph   | # of vertices or edges |
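
To see the definition in action, here is a tiny sketch (my own illustration, not from the class notes) that spot-checks the inequality $f(n) \leq c \cdot g(n)$ for $f(n) = 3n + 2$, $g(n) = n$, and candidate constants $c = 5$, $n_{0} = 1$. A few sample points don’t constitute a proof, but they’re a quick way to convince yourself a choice of constants is plausible:

public class BigOCheck {
	public static void main(String[] args) {
		// Hypothetical witnesses (my own choice): c = 5, n0 = 1
		// for f(n) = 3n + 2 and g(n) = n.
		long c = 5, n0 = 1;
		for (long n = n0; n <= 1_000_000; n *= 10) {
			long f = 3 * n + 2;  // f(n)
			long bound = c * n;  // c * g(n)
			System.out.println("n=" + n + ": f(n)=" + f
					+ " <= c*g(n)=" + bound + " ? " + (f <= bound));
		}
	}
}

(Algebraically, $3n + 2 \leq 5n$ whenever $n \geq 1$, which is what the loop confirms numerically.)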

Big-Omega Notation ($\Omega$)

Big-Omega is about the lower bound, indicating the minimum growth rate of a function.

Mathematical Definition: $\Omega(g(n)) = \{\, f(n) \mid \text{there exist positive } c, n_{0} \text{ such that } 0 \leq c \cdot g(n) \leq f(n) \text{ for all } n \geq n_{0} \,\}$


Big-Theta Notation ($\Theta$)

Big-Theta provides a tight bound, showing that a function grows at precisely a certain rate. Equivalently, $f \in \Theta(g(n))$ exactly when $f \in O(g(n))$ and $f \in \Omega(g(n))$.

Definition: $\Theta(g(n)) = \{\, f(n) \mid \text{there are positive } c_{1}, c_{2}, n_{0} \text{ such that } 0 \leq c_{1} g(n) \leq f(n) \leq c_{2} g(n) \text{ for all } n \geq n_{0} \,\}$


Simplifying with Examples

Big-O: If $f(n) = 3n + 2$, we simplify this to $O(n)$, disregarding constants and lower-order terms.

Big-Omega: For $f(n) = n^{2} + 3n$, we can say it is $\Omega(n^{2})$, meaning it grows at least as fast as $n^{2}$.

Big-Theta: If we have $f(n) = 2n^{2} + 3n + 1$, it simplifies to $\Theta(n^{2})$, indicating it grows at the same rate as $n^{2}$ for large values of $n$.
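
To make the Big-Theta example concrete, here is one possible choice of witnesses (my own pick; any valid constants work). For $n \geq 1$ we have $3n \leq 3n^{2}$ and $1 \leq n^{2}$, so

$$2n^{2} \;\leq\; 2n^{2} + 3n + 1 \;\leq\; 2n^{2} + 3n^{2} + n^{2} \;=\; 6n^{2} \quad \text{for all } n \geq 1,$$

meaning $c_{1} = 2$, $c_{2} = 6$, and $n_{0} = 1$ satisfy the definition of $\Theta(n^{2})$.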


Examples from Class

Example 1

$f(n) = 5n^{2} + 2n + 1$ and $g(n) = n^{2}$

Claim: $f(n) \in O(g(n))$

Proof:
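
A sketch of one standard argument (my reconstruction; the in-class proof may differ): for $n \geq 1$ we have $2n \leq 2n^{2}$ and $1 \leq n^{2}$, so

$$5n^{2} + 2n + 1 \;\leq\; 5n^{2} + 2n^{2} + n^{2} \;=\; 8n^{2}.$$

The constants $c = 8$ and $n_{0} = 1$ therefore satisfy the definition, and $f(n) \in O(n^{2})$.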


Example 2

$f(n) = 6n^{3} + 18n - 38$ and $g(n) = n^{3}$

Claim: $f(n) \in O(g(n))$

Proof:
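
The same bounding trick works here: for $n \geq 1$, $18n \leq 18n^{3}$, and the $-38$ only makes the left side smaller, so

$$6n^{3} + 18n - 38 \;\leq\; 6n^{3} + 18n^{3} \;=\; 24n^{3},$$

giving witnesses $c = 24$ and $n_{0} = 1$.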


Example 3

Is $n^{3} \in O(n^{2})$?

Claim: $n^{3} \notin O(n^{2})$

Proof by contradiction:
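
Suppose toward a contradiction that $n^{3} \in O(n^{2})$. Then there exist positive constants $c$ and $n_{0}$ with $n^{3} \leq c \cdot n^{2}$ for all $n \geq n_{0}$. Dividing both sides by $n^{2}$ gives $n \leq c$ for all $n \geq n_{0}$, but any $n > \max(n_{0}, c)$ violates this inequality. Contradiction, so $n^{3} \notin O(n^{2})$.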


Example 4

$f(n) = \log(n^{5})$ and $g(n) = \log(n) + 5$

Claim: $f(n) \in O(g(n))$

Mathematical Rule: $\log(n^{k}) = k\log(n)$

Proof:
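
Applying the rule above, $f(n) = \log(n^{5}) = 5\log(n)$. For $n \geq 1$ we have $\log(n) \geq 0$, so

$$5\log(n) \;\leq\; 5\log(n) + 25 \;=\; 5(\log(n) + 5) \;=\; 5 \cdot g(n).$$

Thus $c = 5$ and $n_{0} = 1$ witness $f(n) \in O(g(n))$. (These constants are my choice for the sketch; the in-class write-up may have used others.)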


Example 5

$f(n) = \log(n)$ and $g(n) = \sqrt{n}$

Claim: $f(n) \in O(g(n))$

Proof:
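
One way to argue this (a sketch, taking $\log = \log_{2}$; the in-class proof may differ): by L'Hôpital's rule,

$$\lim_{n \to \infty} \frac{\log_{2} n}{\sqrt{n}} = \lim_{n \to \infty} \frac{1 / (n \ln 2)}{1 / (2\sqrt{n})} = \lim_{n \to \infty} \frac{2}{\sqrt{n}\,\ln 2} = 0,$$

and a limit of $0$ means $\log(n)$ grows strictly slower than $\sqrt{n}$, so $f(n) \in O(g(n))$. Concretely, $\log_{2} n \leq \sqrt{n}$ for every $n \geq 16$ (both sides equal $4$ at $n = 16$, and $\sqrt{n}$ grows faster from there on), so $c = 1$ and $n_{0} = 16$ work.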


Extra: Review of log
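
A few standard logarithm identities that come up constantly in complexity analysis:

$$\log(ab) = \log a + \log b, \qquad \log\!\left(\frac{a}{b}\right) = \log a - \log b, \qquad \log(a^{k}) = k \log a, \qquad \log_{b} n = \frac{\log_{a} n}{\log_{a} b}$$

The change-of-base rule is why the base is irrelevant inside Big-O: switching bases only multiplies by the constant $1 / \log_{a} b$, so $O(\log_{2} n)$, $O(\ln n)$, and $O(\log_{10} n)$ are all the same class, written simply $O(\log n)$.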


Formal Derivations of Runtime

Expectation: Express runtime in Big-O notation

Example 1

for (int i = 1; i <= n; i++) {
	System.out.println(i); // a constant number c of primitive operations per iteration
}

Goal: Derive the runtime of this loop as a function of $n$.

Analysis:

| $i$ | # Primitive Operations |
| --- | ---------------------- |
| 1   | $c$                    |
| 2   | $c$                    |
| 3   | $c$                    |
| ... | ...                    |
| $n$ | $c$                    |
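
Every row contributes the same constant cost $c$, so summing over all $n$ iterations gives

$$T(n) = \sum_{i=1}^{n} c = c \cdot n,$$

which is $O(n)$.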

Example 2

for (int i = 1; i <= n; i++) {
	for (int j = 1; j <= n; j++) {
		// constant # of primitive operations
	}
}

Analysis:
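
The inner loop runs $n$ times for each of the $n$ iterations of the outer loop, and each inner iteration does constant work $c$:

$$T(n) = \sum_{i=1}^{n} \sum_{j=1}^{n} c = c \cdot n^{2} \in O(n^{2})$$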


Example 3

for (int i = 1; i <= n; i++) {
	for (int j = 1; j <= i; j++) {
		// constant # of primitive operations
	}
}

Analysis:
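
Summing the inner loop's $i$ iterations of constant work $c$ over the outer loop:

$$T(n) = \sum_{i=1}^{n} c \cdot i = c \cdot \frac{n(n+1)}{2} \in O(n^{2})$$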

This highlights an important point: a loop nested inside another loop that runs $n$ times does not automatically perform $n \cdot n$ iterations. In this case the inner loop runs only $i$ times on the $i$th iteration of the outer loop. The analysis still results in $O(n^{2})$, but the actual number of operations is roughly half that of the previous example, where both loops ran $n$ times.



I must admit that these notes, although relatively comprehensive, have room for improvement. So far, I’ve found Big O, Big Theta, and Big Omega challenging to fully grasp through their formal proofs.

My plan is to revisit and add to these notes as my understanding of the subject continues to grow. This is a journey, and I believe in the power of constant learning and improvement. Sharing these notes publicly isn’t just about providing a resource to others; it’s also about motivating myself.

By putting my notes out there, I’m compelled to take better, more deliberate notes during class. It’s a commitment to myself to strive for clarity in understanding and precision in note-taking. I hope that my journey will inspire you in some way and lead to shared growth.