Note: Due to the missed class day, the deadline for this assignment has been extended until Feb 1.
Consider \(T(n)=6n^2+4n+3\).
Is it correct to say that \(T(n)\) is \(O(n^4)\)? Give an informal one-sentence justification.
Is \(O(n^4)\) a tight upper bound for \(T(n)\)? If so, explain why. If not, give a tight upper bound (again with an informal one-sentence justification).
Is it correct to say that \(T(n)\) is \(\Omega(n)\)? Give an informal one-sentence justification.
Is \(\Omega(n)\) a tight lower bound for \(T(n)\)? If so, explain why. If not, give a tight lower bound (again with an informal one-sentence justification).
Give a formal proof that \((2n^2+n)(4n)\) is \(O(n^3)\) – “formal proof” means you must use the mathematical definition of big-oh, and not “rules of thumb” or other properties of big-oh.
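As a reminder (check that the wording matches the definition in your textbook exactly), the standard definition to use here is: \[ f(n) \text{ is } O(g(n)) \iff \text{there exist constants } c > 0 \text{ and } n_0 > 0 \text{ such that } 0 \le f(n) \le c\,g(n) \text{ for all } n \ge n_0. \]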
Simplify the following summation, giving a closed-form solution (with no summation). Use the summation properties and formulas from Section A.1 in the textbook, referencing each property that you use. \[ \sum_{i=0}^n (2i+2^i) \]
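For reference, two standard closed forms (presumably among those listed in Section A.1; verify the exact statements there) are \[ \sum_{i=0}^{n} i = \frac{n(n+1)}{2} \qquad \text{and} \qquad \sum_{i=0}^{n} 2^i = 2^{n+1}-1. \]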
Bill Doors is having an argument with Steve Careers over whose algorithm is faster. For an input of size \(n\), Bill’s algorithm runs in \(n^3\) microseconds, while Steve’s runs in \(100 n^2\) microseconds. For each answer below that is an amount of time, select the unit of time that gives the cleanest answer. In other words, if something takes 7,200,000,000 microseconds, that’s a terrible way to say it – you should instead say it takes 2 hours.
Bill claims that the big constant on Steve’s algorithm makes it slow, and to prove his point he runs both algorithms on an input of size 10. How much time does each algorithm take?
Bill’s conclusion is “See, your algorithm takes \(X\) times as long to run.” What is \(X\)?
Steve isn’t going to just give up the argument though! He says, “I can make your algorithm take \(X\) times longer by using an input of size \(n\).” What is \(n\)? In addition to giving \(n\), calculate the running time of each algorithm for that value of \(n\).
Elon Odor comes up and says “You guys aren’t thinking big enough! I want to change the world! I’m going to run your algorithms with inputs of size \(n=30,000\).” How much time does each algorithm take now?
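If you want to sanity-check your arithmetic for this problem, a short script along the following lines can help (a rough sketch, not part of the required written work; the function names and the unit table are just illustrative):

# Rough sanity-check sketch (not part of the required written work).
# Evaluates both running times, which the problem states in microseconds,
# and converts each to a readable unit.

def bill(n):
    """Bill's algorithm: n^3 microseconds."""
    return n ** 3

def steve(n):
    """Steve's algorithm: 100 * n^2 microseconds."""
    return 100 * n ** 2

def pretty(us):
    """Convert a duration in microseconds to the largest sensible unit."""
    units = [("days", 86_400_000_000), ("hours", 3_600_000_000),
             ("minutes", 60_000_000), ("seconds", 1_000_000),
             ("milliseconds", 1_000), ("microseconds", 1)]
    for name, factor in units:
        if us >= factor:
            return f"{us / factor:g} {name}"
    return f"{us} microseconds"

for n in (10, 30_000):  # input sizes mentioned in the problem
    print(f"n = {n}: Bill -> {pretty(bill(n))}, Steve -> {pretty(steve(n))}")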
The following pseudocode gives the selection sort algorithm for an input array with \(n\) elements: \(A=\langle a_1,a_2,\ldots,a_n\rangle\).
Selection-Sort(A)
1. for i = 1 to n-1
2.     mp = i
3.     for j = i+1 to n
4.         if A[j] < A[mp]
5.             mp = j
6.     t = A[i]
7.     A[i] = A[mp]
8.     A[mp] = t
Do a careful runtime analysis of this algorithm, using explicit constants, patterned after the analysis of Insertion-Sort in Section 2.2 of the textbook. As in the book’s analysis, introduce variables where you need them and derive a formula for the running time – this is like the general formula for \(T(n)\) on page 26. Next, use this formula to find the best-case and worst-case running times of Selection-Sort. After deriving the precise formulas for the best-case and worst-case running times, state each using \(\Theta\) notation.
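If it helps to experiment while working through the analysis, here is a direct Python translation of the pseudocode (a sketch for testing only, not part of the required answer; it uses 0-based indexing, so the bounds shift by one relative to the pseudocode):

def selection_sort(A):
    """Direct translation of Selection-Sort (0-based indexing)."""
    n = len(A)
    for i in range(n - 1):          # pseudocode line 1: i = 1 to n-1
        mp = i                      # line 2: index of smallest element seen so far
        for j in range(i + 1, n):   # line 3: j = i+1 to n
            if A[j] < A[mp]:        # line 4
                mp = j              # line 5
        A[i], A[mp] = A[mp], A[i]   # lines 6-8: swap A[i] and A[mp]
    return A

print(selection_sort([5, 2, 4, 6, 1, 3]))  # -> [1, 2, 3, 4, 5, 6]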
Each of the following functions gives the running time of an algorithm, measured in milliseconds. For each one, compute the largest problem size (the largest \(n\)) that can be run in one day. If the answer is over a billion, you can just say “over a billion.” Show your work!
\(T(n)=\lg n\)
\(T(n)=100 n\)
\(T(n)=n^2\)
\(T(n)=2^n\)
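One conversion you will need for every part: the running times are in milliseconds, so the time budget for one day is \[ 1 \text{ day} = 24 \cdot 60 \cdot 60 \cdot 1000 \text{ ms} = 86{,}400{,}000 \text{ ms}. \]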
Consider a function \(T(n)=g(n)+h(n)\), where \(f(n)\), \(g(n)\), and \(h(n)\) are positive functions such that \(g(n)\) is \(\Theta(f(n))\) and \(h(n)\) is \(O(f(n))\). Prove that \(T(n)\) is \(\Theta(f(n))\) (in other words, the added function \(h(n)\) is not important to the asymptotic analysis).
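A definition worth having in front of you for this proof (verify it against your textbook’s statement): \[ T(n) \text{ is } \Theta(f(n)) \iff \text{there exist constants } c_1, c_2 > 0 \text{ and } n_0 > 0 \text{ such that } c_1 f(n) \le T(n) \le c_2 f(n) \text{ for all } n \ge n_0. \]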
Prove that \((3n)^n\) is not \(O(n^n)\).
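Note that showing a function is not \(O(n^n)\) amounts to proving the negation of the definition: \[ f(n) \text{ is not } O(g(n)) \iff \text{for every } c > 0 \text{ and } n_0 > 0 \text{ there is some } n \ge n_0 \text{ with } f(n) > c \cdot g(n). \]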