For each answer, provide a short justification of your answer. Your justification should follow from the definitions of the order notations.
1. (Average 4.49 out of 5 points) O(n) ≠ Θ(n^{2})
The sets are unequal (in fact, they are disjoint, and have no members in common). The set Θ(n^{2}) is the set of all functions that grow as fast as n^{2}; the set O(n) is the set of all functions that grow no faster than n. Since n^{2} grows faster than n, there is no overlap between the sets.

2. (4.11 / 5) Θ(n) ⊂ O(2n)
First, note that the constant factor has no effect on the growth rate, so O(2n) is equivalent to O(n). The set Θ(n) is the set of all functions that grow as fast as n; this is a proper subset of O(n), the set of all functions that grow no faster than n. There are functions in O(n) that are not in Θ(n), such as constant functions or log n.

3. (4.14 / 5) ∅ = O(n!) ∩ Ω(n^{n})
The intersection of O(n!) and Ω(n^{n}) is empty, since n^{n} grows faster than n! (see Problem Set 1).

4. (4.82 / 5) O(1) ⊃ Θ(1)
This one is tricky, and we accepted =, ⊇, or ⊃ as a full credit answer (with a good explanation). Certainly we know every element of Θ(1) is also in O(1), from the definitions of Θ and O. The tougher question is figuring out whether there is any function in O(1) that is not in Θ(1). From the informal definitions, this asks whether there is any function that produces a positive value as output but grows slower than a constant. Intuitively, it would seem that functions like 1/n have this property: as n increases, the value of the function decreases asymptotically. To answer more convincingly, we need to consider the definitions of O and Θ.

If f is in O(1), then we know there are constants c > 0 and n_{0} ≥ 0 such that f(n) < c for all n ≥ n_{0}. (Since g(n) = 1, the g(n) term goes away.) For O(1) to be equal to Θ(1), we would need to show that this also implies f is in Ω(1): that is, that we can find constants c > 0 and n_{0} ≥ 0 such that f(n) > c for all n ≥ n_{0}. Equivalently, f is in Ω(1) if and only if 1 is in O(f). Consider f(n) = 1/n. We cannot have 1 in O(1/n), since for any choice of c we eventually have 1 > c * (1/n) (choose any n > c). Hence 1/n is in O(1) but not in Θ(1), so O(1) is a strict superset of Θ(1).
    l = LinkedList.LinkedList ().append(1).append(2).append(3)
    r = l.reverse ()

should make r the list [3, 2, 1] and leave l as the list [1, 2, 3].
The rest of the code is taken from the LinkedList.py implementation of an immutable list datatype from Problem Set 2.
The easiest way to think of reverse is simply adding the first element to the end of the list resulting from reversing the rest of the elements:

    def reverse (self):
        head = ListNode (self.__info)
        if self.__next == None:
            return head
        else:
            return self.__next.reverse().append (self.__info)

There are more complex iterative ways (that are more efficient) to do this by switching the __next pointers down the list, but they are much tougher than the simple recursive solution.

6. (8.46 / 10) What is the asymptotic running time of your reverse implementation? Explain your answer convincingly, and be sure to define any variables you use and state any assumptions you make clearly.
Our reverse implementation has running time in Θ(n^{2}), where n is the number of elements in the input list. (Note: it is possible to implement reverse with running time in Θ(n) by adjusting the __next pointers directly.)

There will be n calls to reverse, one for each element in the list (actually, there are n - 1, because the base case stops the recursion for the list of length 1). But each call to reverse involves a call to our append method. The provided append implementation has running time in Θ(m), where m is the size of the input to append (see Problem Set 2). The average length of the input list to append is n / 2, so the average running time of the append call is in Θ(n / 2) = Θ(n).
We are making n calls to reverse, each of which involves running time Θ(n) from the call to append, so the total running time will be in Θ(n^{2}).
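Since the PS2 LinkedList code isn't reproduced here, a self-contained sketch of the same recursive idea (the class and helper names below are stand-ins, not the PS2 ones), instrumented to count the nodes that append walks, exhibits the quadratic cost directly:

```python
# Stand-in for the PS2 immutable list (names are placeholders), with a
# counter tracking how many nodes append walks across all calls.
class ListNode:
    visits = 0

    def __init__(self, info, next=None):
        self.info = info
        self.next = next

    def append(self, value):
        # Walks to the end, rebuilding the list: Theta(m) for m nodes.
        ListNode.visits += 1
        if self.next is None:
            return ListNode(self.info, ListNode(value))
        return ListNode(self.info, self.next.append(value))

    def reverse(self):
        # Reverse the rest of the list, then append the first element.
        if self.next is None:
            return ListNode(self.info)
        return self.next.reverse().append(self.info)

def make_list(values):
    node = ListNode(values[0])
    for v in values[1:]:
        node = node.append(v)
    return node

def to_py(node):
    out = []
    while node is not None:
        out.append(node.info)
        node = node.next
    return out

n = 40
l = make_list(list(range(n)))
ListNode.visits = 0
r = l.reverse()
assert to_py(r) == list(range(n - 1, -1, -1))
# append walks lists of length 1, 2, ..., n - 1, so the total is
# n(n-1)/2 node visits: quadratic, as argued above.
assert ListNode.visits == n * (n - 1) // 2
```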
We just need to find an input list where the best match for the first student does not produce the optimal global match. Here's a simple example:

- students = { "Alice", "Bob", "Colleen" }
- The goodness scores are (Alice, Bob) = 10, (Alice, Colleen) = 20, and (Bob, Colleen) = 110. (Bob and Colleen are in the same section and different years; Alice and Bob are in different years and different sections; Alice and Colleen are in different majors, the same year, and different sections.)
- The greedy algorithm will first find the best partner for Alice, which is Colleen. Then, there is no partner left for Bob, so the total goodness score is 20. If we matched Bob with Colleen, and left Alice unpaired, the goodness score is 110.

8. (4.3 / 5) What is the asymptotic running time of assignPartners? Be sure to define any variables you use in your answer and state your assumptions about Python operations clearly.
Θ(n^{2}), where n is the number of students. We have two nested for loops, each of which iterates through the students. So, there are Θ(n^{2}) evaluations of the inner for loop body (there are not exactly n^{2} iterations, since we skip the inner loop if the student already has a partner; in the worst case, however, no students will be assigned partners, if the best match for each student is with None).

The body of the inner for loop evaluates goodnessScore twice. This involves indexing into records to find each student's record, and doing a dictionary lookup on the field we are comparing, as well as the find call on the notpartners string. We need to assume all these operations have running times in O(1) for the overall running time to be in Θ(n^{2}). The one we are most concerned about is find, which searches the notpartners string for each student to see if it contains the other student. This is likely to have running time in O(s), where s is the length of the notpartners string. If we assume those lengths are small and fixed (which is in fact the case for CS216 students), then this is still constant time as n grows. If instead we assume people have a fixed fraction of the rest of the world they wouldn't want to partner with, then we expect the lengths of the notpartners lists to grow as a fraction of n, which would make the overall running time of assignPartners in O(n^{3}).

9. (6.37 / 10) Define the allPossiblePartnerAssignments procedure Zulma needs.
The easiest way to do this was to realize that we can find all possible partner assignments by arranging the students in all possible orders, and then just having each student partner with the adjacent student. The subtlety (which no one got quite correctly) with this approach is dealing with the possibility that someone's best match is with None (recall that goodnessScore returns -1 if either partner is None, but can return a more negative score if one partner is on the other's notpartners list). So, we need to consider all possible orderings of the students with the list extended with enough None values that each student could potentially be partnered with None. We do this by appending enough Nones to the list before calling allLists, the procedure from Section 3 that produces all possible orderings of a list:

    def allPossiblePartnerAssignments (students):
        s = students[:]   # we use a copy to avoid modifying the input list
        for n in range (len (students)):
            s.append (None)
        for ordering in allLists (s):
            assignments = { }
            for i in range (len (ordering)):
                if i + 1 == len (ordering):
                    assignments[ordering[i]] = None
                else:
                    if not ordering[i] == None:   # avoid adding partners for None
                        assignments[ordering[i]] = ordering[i + 1]
                    if not ordering[i + 1] == None:
                        assignments[ordering[i + 1]] = ordering[i]
            yield assignments

(Note: I applied the same rule to myself as you had on this exam of not actually trying the code in an interpreter. So, there is probably at least one bug in it. The first person to find a non-trivial bug in this and explain how to fix it gets 10 bonus points.)

10. (7.63 / 10) Explain why Zulma's partner assignment algorithm would not run fast enough to be used to assign partners for PS4. (Note: a good answer would include an explanation of the running time of assignPartners. Assume you have a correct and optimally efficient allPossiblePartnerAssignments implementation regardless of your answer to question 9. You should be able to answer question 10 well, even if you could not answer question 9.)
The algorithm we used in question 9 involves creating all possible permutations of a list of length 2n, where n is the number of students (recall we added n Nones to the list we passed to allLists). Hence, there are (2n)! possible orderings. For each ordering, we loop through all the elements (up to 2n of them), so the total work allPossiblePartnerAssignments does across all possible assignments is 2n * (2n)!. Regardless of any other work, this exceeds 10^{357}. This far exceeds the number of atoms in the universe, so it is most definitely not something we could compute in time for PS4.

11. (average 0.98 bonus points) Suggest a better algorithm to use to assign partners for future problem sets.

This is not the best possible algorithm, however, especially because of how we dealt with the None matches. We could improve it by recognizing that each None is identical, so all the orderings that just involve moving Nones around with other Nones are identical. So, perhaps we could approach O(n!). We know the brute force algorithm can't do better than this, however: we need to try all possible assignments of students, so we need to try all possible orderings. But even 96! is way beyond the realm of anything tractable for computing.
I was surprised by the number of people who answered this by analyzing the complexity to be in something like O(n^{3}), and then arguing that performing 100^{3} iterations would be intractable. You should have a bit better sense of the power of the computers you use. When they say a 2GHz processor, that means it can perform about 2 billion operations per second (what counts as an operation is a bit complicated, but you can think of any simple calculation as a few operations). Hence, numbers like 100^{3} are not a problem at all. It's only 1 million, and the computers in lab can do around two thousand million operations in one second.
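The back-of-the-envelope arithmetic from the paragraph above takes only a few lines to check:

```python
# 100^3 operations on a machine doing 2 billion operations per second:
# a small fraction of a millisecond, nowhere near intractable.
ops = 100 ** 3
assert ops == 1_000_000          # "only 1 million"
seconds = ops / 2_000_000_000    # 2 GHz ~ 2e9 simple operations/second
assert seconds < 0.001           # under a millisecond
```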
We'll discuss this in a future class.
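As a footnote to the running-time estimate in question 10, the arithmetic is easy to verify directly (the class size of 96 is an assumption, taken from the 96! remark above):

```python
# Check the blow-up claimed in question 10: with n = 96 students, the
# list passed to allLists has 2n = 192 entries, giving (2n)! orderings,
# each scanned in up to 2n steps.
import math

n = 96
total_work = 2 * n * math.factorial(2 * n)
assert total_work > 10 ** 357       # matches the 10^{357} bound quoted above

# Even at 2 billion operations per second, this is hopeless
# (the universe is only about 4e17 seconds old).
seconds_needed = total_work // (2 * 10 ** 9)
assert seconds_needed > 10 ** 340
```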