CMPSC 497: Advanced Algorithms
Problem Set 2
Due 03/03/2023 at 10:00 pm
Notice: Type your answers using LaTeX and make sure to upload the answer file on Gradescope before the deadline.
For more details and instructions, read the syllabus.
Problem 1. Stocks
A simple model for the stock market works as follows: on any given day, a stock with price q will increase by a factor r > 1 to qr with probability p and will fall to q/r with probability 1 − p. If we start with a stock of price 1, find formulae for the expected value and the variance of the stock's price after T days.
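Not part of the problem, but a quick Monte Carlo simulation is a handy way to sanity-check the formulae you derive. The sketch below (Python, with illustrative values of p, r and T that are not given in the problem) simulates the daily up/down moves and prints the empirical mean and variance of the final price.

import random

# Illustrative parameters; the problem leaves p, r and T symbolic.
p, r, T = 0.55, 1.02, 30
trials = 100_000

prices = []
for _ in range(trials):
    q = 1.0                      # starting price
    for _ in range(T):
        if random.random() < p:  # up-move with probability p
            q *= r
        else:                    # down-move with probability 1 - p
            q /= r
    prices.append(q)

mean = sum(prices) / trials
var = sum((x - mean) ** 2 for x in prices) / trials
print(f"empirical E[price after T days] = {mean:.4f}, Var = {var:.4f}")
# Compare these numbers against your closed-form expressions in p, r and T.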
Problem 2. Tight concentration
(a) Given a positive integer k, describe a random variable X assuming only non-negative values for which Markov's inequality is tight. That is, such that Pr[X ≥ a] = E[X]/a.
(b) Given a positive integer k, describe a random variable X for which Chebyshev's inequality is tight. That is, such that Pr[|X − E[X]| ≥ k√Var(X)] = 1/k².
Problem 3. Another Chernoff bound.
Let X1, . . . , Xn be n independent random variables such that Xi ∈ [ai, bi] for every i = 1, . . . , n. Let Sn = ∑_{i=1}^n Xi and let μ = E[Sn]. Show that for any δ > 0,

Pr[|Sn − μ| ≥ δ] ≤ 2 exp( −2δ² / ∑_{i=1}^n (bi − ai)² ).
Hint: Follow the proof approach from class for 0/1 random variables. When it is time to use the information about the Xi's, use the fact that if Y is a random variable such that E[Y] = 0 and Y ∈ [a, b], then for any t ≥ 0, E[e^{tY}] ≤ e^{t²(b−a)²/8}.
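Before proving the bound, it can help to see it numerically. The sketch below (Python, with arbitrary illustrative choices of n, the intervals [ai, bi], the distributions of the Xi, and δ) compares the empirical deviation probability of Sn with the right-hand side of the inequality.

import math
import random

n, trials, delta = 50, 100_000, 20.0
bounds = [(-1.0, 2.0)] * n          # illustrative intervals [a_i, b_i]

# Take each X_i uniform on [a_i, b_i] (any bounded distribution would do),
# so E[S_n] is the sum of the interval midpoints.
mu = sum((a + b) / 2 for a, b in bounds)

hits = 0
for _ in range(trials):
    s = sum(random.uniform(a, b) for a, b in bounds)
    if abs(s - mu) >= delta:
        hits += 1

bound = 2 * math.exp(-2 * delta ** 2 / sum((b - a) ** 2 for a, b in bounds))
print(f"empirical Pr[|Sn - mu| >= delta] = {hits / trials:.4f}")
print(f"claimed upper bound             = {bound:.4f}")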
Problem 4. Balance partitioning
Given an n×n matrix A whose entries are either 0 or 1, we want to find a column vector b ∈ {−1, 1}^n that minimizes ∥Ab∥∞. (Recall that if Ab = c with c = (c1, . . . , cn), then ∥Ab∥∞ = max_{i=1,...,n} |ci|.) Consider the following algorithm for choosing b: each entry of b is −1 with probability 1/2 and 1 otherwise. Show that for this choice of b, ∥Ab∥∞ = O(√(n ln n)) with probability 1 − O(1/n).
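A small experiment makes the claimed scale of ∥Ab∥∞ concrete. The sketch below (Python) draws the random sign vector b exactly as described in the problem; the 0/1 matrix A is generated at random purely for illustration, since the claim should hold for every 0/1 matrix.

import math
import random

n = 200
# Any 0/1 matrix works for the claim; a random one is used here only as a demo input.
A = [[random.randint(0, 1) for _ in range(n)] for _ in range(n)]

# The randomized choice from the problem: each entry of b is -1 or +1 with probability 1/2.
b = [random.choice((-1, 1)) for _ in range(n)]

inf_norm = max(abs(sum(A[i][j] * b[j] for j in range(n))) for i in range(n))
print(f"||Ab||_inf = {inf_norm},  sqrt(n ln n) = {math.sqrt(n * math.log(n)):.1f}")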
Problem 5. A random geometric network
Suppose n points are placed uniformly at random in the unit square. Each point is then connected to its k closest points. Let G be the resulting random graph. Show that there exists a constant c > 0 (independent of n and k) such that when k ≥ c log n, Pr[G is connected] → 1 as n → ∞.
Hint: Partition the unit square into small squares of area (log n)/n. Show that with high probability every small square contains at least one point. Then, use a Chernoff bound to show that with high probability the number of points in any disc of radius √(a log n / n) is at most b log n, for suitable constants a, b > 0. Show that these properties imply that G is connected.
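If you want to see the threshold behaviour before proving it, the sketch below (Python; brute-force nearest-neighbour search plus a BFS connectivity check, with illustrative values of n and of the constant multiplying log n) builds the random graph G and reports whether it is connected.

import math
import random
from collections import deque

def knn_graph_connected(n, k):
    """Place n uniform points in the unit square, join each point to its k
    nearest points (edges treated as undirected), and test connectivity by BFS."""
    pts = [(random.random(), random.random()) for _ in range(n)]
    adj = [set() for _ in range(n)]
    for i in range(n):
        order = sorted(range(n),
                       key=lambda j: (pts[i][0] - pts[j][0]) ** 2 + (pts[i][1] - pts[j][1]) ** 2)
        for j in order[1:k + 1]:     # skip the point itself
            adj[i].add(j)
            adj[j].add(i)
    seen, queue = {0}, deque([0])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return len(seen) == n

n = 500
for c in (0.3, 1.0, 3.0):            # illustrative constants, not the c from the proof
    k = max(1, int(c * math.log(n)))
    print(f"k = {k}: connected = {knn_graph_connected(n, k)}")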
Problem 6. Reduction I
Given two sets, P and Q, of n points each in R^n, your task is to find x ∈ P and y ∈ Q such that ∥x − y∥₂ is minimum.
(a) Show how to do this in O(n³) time.
(b) Show how to do this in time O(n² log n) if it suffices to find x ∈ P and y ∈ Q whose distance is at most 1.001 times the minimum possible distance. (If the minimum distance is 0, you must find x = y.)
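As a baseline (a sketch only, not necessarily the intended solution for part (a)), the exhaustive search below examines all n² pairs and spends O(n) per distance, matching the O(n³) bound; the point sets here are random illustrative inputs. The approximate O(n² log n) variant asked for in part (b) is not shown.

import math
import random

n = 100
# Illustrative random inputs; P and Q can be any two sets of n points in R^n.
P = [[random.gauss(0, 1) for _ in range(n)] for _ in range(n)]
Q = [[random.gauss(0, 1) for _ in range(n)] for _ in range(n)]

best = (math.inf, None, None)
for x in P:                                   # n choices
    for y in Q:                               # n choices
        d = math.sqrt(sum((xi - yi) ** 2 for xi, yi in zip(x, y)))   # O(n) per pair
        if d < best[0]:
            best = (d, x, y)
print(f"minimum distance found = {best[0]:.4f}")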
Problem 7. Reduction II
Given n data points in R^d, where d = n⁵, we want to find a subset of 4 data points whose sum vector has the smallest length. Formally, find 4 data points x1, x2, x3, x4 such that ∥x1 + x2 + x3 + x4∥₂ is minimized.
(a) Show how to do this in O(n⁹) time.
(b) Show how to do this in time O(n⁵ log n) if it suffices to find a subset whose sum has length at most 1.001 times the length of the smallest possible sum.
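Again as a baseline sketch (not necessarily the intended solution for part (a)): enumerating all O(n⁴) subsets of size 4 and spending O(d) = O(n⁵) per subset matches the O(n⁹) bound. The demo below (Python) uses a tiny n and random points, since d = n⁵ grows quickly; the approximate O(n⁵ log n) variant from part (b) is not shown.

import itertools
import math
import random

n = 8
d = n ** 5                        # 32768 dimensions; kept small so the demo runs quickly
pts = [[random.gauss(0, 1) for _ in range(d)] for _ in range(n)]

best_len, best_idx = math.inf, None
for idx in itertools.combinations(range(n), 4):           # O(n^4) subsets
    s = [sum(pts[i][j] for i in idx) for j in range(d)]   # O(d) = O(n^5) per subset
    length = math.sqrt(sum(v * v for v in s))
    if length < best_len:
        best_len, best_idx = length, idx
print(f"best 4-subset {best_idx} has sum of length {best_len:.3f}")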
