
This assignment covers probability-related algorithm analysis.
EECS 376: Foundations of Computer Science
Fall 2020 Homework 10

Problems marked with E are graded on effort, which means that they are graded subjectively on
the perceived effort you put into them, rather than on correctness. For bonus questions, we will
not provide any insight during office hours or Piazza, and we do not guarantee anything about the
difficulty of these questions. We strongly encourage you to typeset your solutions in LaTeX.
If you collaborated with someone, you must state their names. You must write your own solution
and may not look at any other student’s write-up.
0. If applicable, state the name(s) and uniqname(s) of your collaborator(s).
1. One way to estimate √3 is by randomly throwing darts (represented as red points) uniformly into a 1 × 1 square, similar to how we estimated π in class:
In the above diagram, the gray shaded region is an equilateral triangle.
For any given dart thrown at point (x, y), we can determine if the following are true:
• 3x² > y²
• 3(1 − x)² > y²
A geometric analysis reveals that a dart thrown at (x, y) lies in the shaded triangle precisely
when both of these inequalities are true.
Let p be the probability that a randomly thrown dart lies in the shaded region. In other
words, p is the area of the triangle divided by the area of the square. By throwing many
darts, we can estimate this p and use it to approximate √3.
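For intuition, the following is a minimal sketch of the dart-throwing estimator described above. It is not the provided findSqrtThree.cpp; the trial count is an illustrative assumption, and converting the estimate of p into an estimate of √3 is deliberately left to part (a).

    #include <iostream>
    #include <random>

    int main() {
        const long long trials = 1000000;  // number of darts; illustrative choice
        std::mt19937_64 rng(std::random_device{}());
        std::uniform_real_distribution<double> unif(0.0, 1.0);

        long long hits = 0;
        for (long long i = 0; i < trials; ++i) {
            double x = unif(rng);  // dart lands at (x, y) in the 1 x 1 square
            double y = unif(rng);
            // The dart is inside the equilateral triangle exactly when both
            // inequalities from the problem statement hold.
            if (3 * x * x > y * y && 3 * (1 - x) * (1 - x) > y * y) {
                ++hits;
            }
        }

        double pHat = static_cast<double>(hits) / trials;  // estimate of p
        // Recovering an estimate of sqrt(3) from pHat uses the relationship
        // asked for in part (a).
        std::cout << "estimated p = " << pHat << "\n";
        return 0;
    }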
(a) What is the theoretical value of p? You will need to use this for part (c).
(b) Suppose we run t trials of the above process and arrive at an estimate of τ for √3. Using Chernoff bounds, give an upper bound for the number of trials n necessary to estimate √3 using this method such that
i. |τ − √3| < 0.1 = 10⁻¹ with probability 99%.
ii. |τ − √3| < 0.01 = 10⁻² with probability 99%.
iii. |τ − √3| < 0.001 = 10⁻³ with probability 99%.
E (c) On Canvas, we have provided a C++ program called findSqrtThree.cpp that calculates the value of √3 using a random number generator. If you encounter problems running it on your computer, please compile and run it on CAEN (we tested that the program should terminate in a reasonable amount of time on CAEN).
2. You are trying to connect a network of server clusters and decide to take a randomized approach for connecting one cluster to another. To represent this network, we denote by Gn,p a random, undirected graph on n vertices constructed as follows: for each distinct set of two vertices u, v ∈ V, include the edge (u, v) in Gn,p with probability p. (A small construction sketch appears after part (d) below.)
(a) What is the expected number of edges in Gn,p? In other words, what is the expected number of inter-cluster connections in the network?
(b) Given v ∈ Gn,p, what is the expected value of deg(v)? In other words, what is the expected number of other clusters that an arbitrary cluster is connected with?
(c) Assume n > 1. Let D be a black box that picks a random vertex (server cluster) v ∈
Gn,p and returns deg(v). Describe an algorithm that, using D as the only way to get
information about edges in G, produces output whose expected value is p. In other words,
solely by looking at the completed network, the algorithm estimates the probability that was originally used to randomly connect server clusters to each other.
(d) Let p′ be the estimate for p returned by your algorithm in part (c). Use Chernoff bounds to give an upper bound on the probability that p′ ≥ 3p. Your answer should depend on p and n.
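For reference, here is a minimal sketch of the Gn,p sampling procedure described at the start of this problem: each distinct pair of vertices receives an edge independently with probability p. The values of n, p, and the seed are illustrative assumptions, not taken from the problem, and this sketch is not a solution to parts (a)–(d).

    #include <iostream>
    #include <random>
    #include <utility>
    #include <vector>

    // Sample G_{n,p}: for each distinct pair of vertices {u, v}, include the
    // edge (u, v) independently with probability p.
    std::vector<std::pair<int, int>> sampleGnp(int n, double p, std::mt19937_64& rng) {
        std::bernoulli_distribution coin(p);
        std::vector<std::pair<int, int>> edges;
        for (int u = 0; u < n; ++u) {
            for (int v = u + 1; v < n; ++v) {
                if (coin(rng)) {
                    edges.emplace_back(u, v);
                }
            }
        }
        return edges;
    }

    int main() {
        std::mt19937_64 rng(376);   // fixed seed, illustrative only
        const int n = 100;          // illustrative parameters
        const double p = 0.05;
        auto edges = sampleGnp(n, p, rng);
        std::cout << "sampled " << edges.size() << " edges among " << n << " vertices\n";
        return 0;
    }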
3. A hash function maps data objects to the indices of an array where they are stored; we would
like the function to distribute the objects roughly evenly among the indices. Suppose we need
to hash a large number n of objects to k indices. Consider an “ideal” function f that maps
each object to a uniformly random and independent index in {1, 2, . . . , k}.
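The following minimal sketch simulates this "ideal" function f by drawing each object's index uniformly and independently, then tallying the load of each index. The values of n and k are illustrative assumptions only.

    #include <algorithm>
    #include <iostream>
    #include <random>
    #include <vector>

    int main() {
        const int n = 100000;  // number of objects (illustrative)
        const int k = 1000;    // number of indices (illustrative)

        std::mt19937_64 rng(std::random_device{}());
        std::uniform_int_distribution<int> index(1, k);

        // counts[i - 1] plays the role of X_i: the number of objects that the
        // ideal hash function sends to index i.
        std::vector<long long> counts(k, 0);
        for (int obj = 0; obj < n; ++obj) {
            ++counts[index(rng) - 1];  // each index is uniform and independent
        }

        long long maxLoad = *std::max_element(counts.begin(), counts.end());
        std::cout << "most loaded index holds " << maxLoad << " objects\n";
        return 0;
    }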
(a) Let Xi be a random variable for the number of objects that f hashes to index i. Find, with proof, a closed-form expression for Ex[Xi].
(b) Find, with proof, an upper bound on the probability that f maps at least 2 · Ex[Xi] different objects to index i:
i. Using Markov’s inequality
ii. Using Chernoff bounds
iii. Under what condition on n and k does each method yield a tighter bound?
(c) Assuming that n, k are such that the Chernoff bound is the tighter of the two bounds above, give an upper bound on the probability that f hashes at least 2 · Ex[Xi] objects to index i for some index i.
E (d) Consider the probability that some index i has more than 2 · Ex[Xi] objects hashing to it. Assuming that k ≪ n, describe how this probability behaves as n grows.