4005-801: Homework, slides, and other materials
Week 1
- Slides for this week: Section 1.1: Spanning trees, pdf (and ppt)
-
Tuesday, Dec 1
- Covered in class: Course logistics, see the main course website
- Covered in class: Section 1.1, Spanning trees - we will continue with the proof next time
- A note for students with disabilities:
RIT is committed to providing reasonable accommodations to students with disabilities.
If you would like to request accommodations such as special seating, testing modifications,
or note taking services due to a disability, please contact the Disability Services Office.
It is located in the Eastman Building, Room 2342; the Web site is www.rit.edu/dso.
After you receive accommodation approval, it is imperative that you see me during office hours
so that we can work out whatever arrangement is necessary.
Thursday, Dec 3
- Covered in class: finished Section 1.1
- Homework 1, due 12/10 12:00pm (in class):
- Consider Kn, the complete graph on n vertices. Compute the number of its spanning trees using Kirchhoff's Theorem. Give a closed form expression.
- In class we discussed unweighted graphs and we counted their spanning trees. Recall that we showed that D - A = N·N^T, where A is the adjacency matrix of a given (unweighted) graph,
D is the diagonal matrix with its vertex degrees on the diagonal, and N is the incidence matrix of any orientation of the graph. (A small numerical check of this identity appears after this homework.)
Now suppose we have a weighted graph and we want to count the overall weight of all its spanning trees.
Let Aw be the weighted adjacency matrix and let Dw be the diagonal matrix with the i-th diagonal entry equal to the sum of the edge weights for edges incident to the i-th vertex.
How would you define Nw so that Dw - Aw = Nw·Nw^T?
- Consider a graph G and one of its edges e. Let A be the number of spanning trees of G, B the number of spanning trees of G-e (the graph G with edge e removed), and C
be the number of spanning trees of the graph G/e (the graph G with edge e contracted and self-loops removed). Derive a relationship between A, B, and C
(the most accurate relationship you can come up with) and reason why the relationship holds.
Recall that a contraction of an edge means that we merge its two endpoints into a single vertex.
- Let G be a simple graph, i.e. no parallel edges are allowed. Suppose we duplicate every single edge of G, obtaining G'. What is the relationship between the number of spanning trees of G and the number of spanning trees of G'?
- Let G be a simple graph and suppose we place a vertex in the middle of every edge of G, obtaining G''. What is the relationship between the number of spanning trees of G and the number of spanning trees of G''?
Remark: you are free to discuss the solutions among yourselves (or with anybody else who is willing to listen), but the submission should capture your own understanding of the solutions.
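As a side note to the D - A = N·N^T discussion above, here is a minimal Python sketch (not a homework solution) that checks the identity numerically and counts spanning trees via Kirchhoff's Theorem; the example graph and all names are illustrative only.

```python
# A minimal sketch (not a homework solution): verify D - A = N N^T on a small
# example graph and count its spanning trees via Kirchhoff's Theorem.
import numpy as np

# Illustrative example: a 4-cycle plus the chord (0,2), with an arbitrary orientation.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
n = 4

A = np.zeros((n, n))             # adjacency matrix
N = np.zeros((n, len(edges)))    # incidence matrix of the chosen orientation
for j, (u, v) in enumerate(edges):
    A[u, v] = A[v, u] = 1
    N[u, j], N[v, j] = 1, -1     # +1 at the tail, -1 at the head of edge j

D = np.diag(A.sum(axis=1))       # diagonal matrix of vertex degrees
assert np.allclose(D - A, N @ N.T)

# Kirchhoff: delete any one row and the matching column of the Laplacian D - A;
# the determinant of the remaining minor equals the number of spanning trees.
L = D - A
print(round(np.linalg.det(L[1:, 1:])))   # 8 spanning trees for this graph
```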
Week 2
- Slides for this week: Section 1.2: Perfect matchings in a planar graph, pdf (and ppt)
Chapter 2: #P-completeness, pdf (and ppt)
-
Tuesday, Dec 8
- Covered in class: Section 1.2, Perfect matchings in planar graphs
-
Thursday, Dec 10
- Covered in class: Chapter 2, #P-completeness
- Covered in class: solution of Problem 1 from Hw1
- Homework 2, due 12/17 12:00pm (in class):
- Exercise 1.16
- Exercise 1.17
- Consider the following problems (a small brute-force sketch for the first one appears after this homework):
- #PERF-MATCH, Input: a graph G, Output: number of perfect matchings in G
- #ALL-MATCH, Input: a graph G, Output: number of all matchings in G
- #k-MATCH, Input: a graph G and an integer k, Output: number of matchings of G with exactly k edges
Show:
- #PERF-MATCH is polynomial-time Turing-reducible to #k-MATCH
- #ALL-MATCH is polynomial-time Turing-reducible to #k-MATCH
- Consider Kn,n, the complete bipartite graph on n+n vertices.
- How many different perfect matchings are there in Kn,n?
- Let e be any edge in Kn,n. What is the probability that e is in a perfect matching chosen uniformly at
random from the set of all perfect matchings?
- Let e be any edge in Kn,n. Let X be a random variable defined on a perfect matching chosen uniformly at random: if the matching
contains edge e, then X equals 1; otherwise X equals 0.
Compute the expected value of X and the variance of X.
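For experimenting with the counting problems above, here is a minimal brute-force sketch of #PERF-MATCH. It runs in exponential time, so it is only useful on small graphs; the function name and the example graph are illustrative, not part of the assignment.

```python
# A minimal brute-force sketch of #PERF-MATCH from the homework above.
# Exponential time - only for experimenting with small graphs.
from itertools import combinations

def count_perfect_matchings(n, edges):
    """Count perfect matchings of a graph on vertices 0..n-1."""
    if n % 2 == 1:
        return 0                     # odd number of vertices: no perfect matching
    count = 0
    for subset in combinations(edges, n // 2):
        covered = [v for e in subset for v in e]
        if len(set(covered)) == n:   # every vertex is covered exactly once
            count += 1
    return count

# Sanity check on K_{2,2} (a 4-cycle): it has exactly 2 perfect matchings.
print(count_perfect_matchings(4, [(0, 2), (0, 3), (1, 2), (1, 3)]))   # 2
```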
Week 3
- Slides for this week: Chapter 3: Sampling and Counting, pdf (and ppt)
-
Tuesday, Dec 15
- Covered in class: #P-completeness of counting perfect matchings in bipartite graphs (Section 2.2)
- Covered in class: solutions of Hw 1, Problems 2-5
-
Thursday, Dec 17
- Covered in class: reductions from counting to sampling and sampling to counting (Sections 3.1 and 3.2)
- Covered in class: solutions of Hw 2
- Have a nice break!
- Homework 3, due 1/7 12:00pm (in class):
- In class we showed how to sample perfect matchings of planar graphs (we used the fact that we know how to count all perfect matchings in any planar graph).
In other words, we reduced the problem of sampling perfect matchings to the problem of counting perfect matchings.
Sketch how we can sample spanning trees of any graph uniformly at random, that is, so that every spanning tree is equally likely.
Use an approach similar to the one we used for matchings. Hint: a problem from the first homework might be useful.
- In class we sketched how we can use sampling to count all matchings.
Suppose we have a sampling algorithm A that, for any input graph, returns a uniformly random spanning tree.
Sketch how we can count all spanning trees by using algorithm A, following an outline similar to what we did in class with matchings.
(In particular, do not use Kirchhoff's result - I'll discuss the rationale for this next week.)
- GRAPH-COLORING problem: given a graph G and a number k, we want to color every vertex of G by one of k colors so that no two adjacent vertices are colored with the same color.
#GRAPH-COLORING problem: given a graph G and a number k, count all possible k-colorings of G.
Consider the following Markov chain:
- Start with any k-coloring of G.
- In every move:
- Randomly choose one of the k colors, let it be c, and randomly choose one of the vertices, let it be v.
- If no neighbor of v is colored by c, recolor vertex v by color c. Otherwise, keep the original color of v. (A code sketch of one such move appears after this homework.)
Consider these two cases:
- Find a k and G such that it is not possible to get from every k-coloring of G to every other k-coloring of G using the above Markov chain. (In other words, the Markov chain does not connect the state space.)
- Show that if k is bigger than twice the maximum degree of G, then it is possible to move from every k-coloring of G to every other k-coloring of G (the Markov chain connects the state space). In particular, given any two k-colorings of G,
describe which vertices to recolor and to which colors so that you get from the first k-coloring to the second k-coloring.
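To make the coloring chain above concrete, here is a minimal sketch of a single move, written directly from the two-step description in the homework; the function and variable names (one_move, adj, and the triangle example) are illustrative only.

```python
# A minimal sketch of one move of the coloring Markov chain described above.
import random

def one_move(coloring, adj, k):
    """coloring: dict vertex -> color; adj: dict vertex -> list of neighbors."""
    c = random.randrange(k)                     # randomly choose one of the k colors
    v = random.choice(list(adj))                # randomly choose one of the vertices
    if all(coloring[u] != c for u in adj[v]):   # no neighbor of v is colored by c
        coloring[v] = c                         # recolor v; otherwise keep its color
    return coloring

# Example: a triangle (maximum degree 2) with k = 5 > 2*2 colors.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
coloring = {0: 0, 1: 1, 2: 2}                   # start with any proper k-coloring
for _ in range(1000):
    coloring = one_move(coloring, adj, 5)
print(coloring)
```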
Week 4
- Slides for this week: we continued with last week's slides
-
Tuesday, Jan 5
- Welcome back and happy new year 2010!
- Covered in class: technical details of the counting to sampling reduction
-
Thursday, Jan 7
- Covered in class: finished the technical details (now you are experts in random variables :) - I just realized that I haven't fixed the issue with the constant c, I'll do so as soon as I get the chance.
- Covered in class: definition of Markov chains, transition matrix
- Read ahead: the rest of Chapter 3 (we will briefly go over it next Tuesday)
- Possible presentation topics:
- Metropolis-Hastings filter: I would really like to see this one presented
- Simulated annealing: again, I would like to see this one
- Stopping heuristics for Markov chains (a survey of several stopping heuristics)
- Practical applications of Markov chains (choose one and tell us about it)
- #P-complete problems, including not too complicated (but not trivial either) reductions
- Deterministic counting algorithms
- Torpidly mixing Markov chains (Markov chains that do not mix/converge quickly)
The first three topics are well described in Markov Chain Monte Carlo in Practice, a book owned by the RIT library.
There are other interesting topics in the book, if you would prefer to pick one of those.
The other topics are more open-ended, where you search for a problem on your own. (Come talk to me if
you'd like me to help you find a suitable problem.)
Other logistics about the presentations:
- Your presentation will last about 30 minutes, with about 5 minutes for questions. You might join forces
and prepare an hour-long joint presentation where both presenters equally contribute to the preparation
and the presentation. Some topics go well together, for example Metropolis-Hastings and simulated annealing.
- The presentations take place in weeks 9 and 10. We will have 3 presentations per class.
- You are free to choose between preparing slides or presenting your topic using the whiteboard. If you choose
whiteboard, prepare a summary handout of your topic. I will post the slides and the handouts on this website.
- Recall that presentations are worth 25% of the final grade. This will be broken into 70% for your presentation and
30% for participation in other students' presentations. In particular, you get the full 30% if you ask
a relevant question every presentation day and if a relevant question is asked of every presenter.
- Homework 4, due 1/14 12:00pm (in class):
- Consider the following experiment: you flip a coin and if it comes up heads, your random variable X=1,
otherwise X=0. Repeat the experiment k times, thus you will have k independent random variables X1,
X2, ..., Xk. Let Y be the sum of the X's (therefore, Y counts how many heads you see during the experiment) and let Z be the average of the X's.
You want to know the probability that Y and Z are close to their expected values.
- What is the expected value of Y?
- What is the variance of Y?
- What is the expected value of Z?
- What is the variance of Z?
- Suppose epsilon is 1/10. Use Chebyshev's inequality to bound the probability of Y being outside of the interval defined by
(1±epsilon)E[Y].
- What k do you want to choose so that the probability of Y being within (1±epsilon)E[Y] is at least 3/4?
- Suppose epsilon is 1/10. Use Chebyshev's inequality to bound the probability of Z being outside of the interval defined by
(1±epsilon)E[Z].
- What k do you want to choose so that the probability of Z being within (1±epsilon)E[Z] is at least 3/4?
- Consider the following Markov chain on binary numbers of length n:
- With probability 1/2 do not change the binary number.
- Otherwise, randomly choose one of the bits and flip it (0 to 1 or 1 to 0).
Now answer the following questions:
- Give the transition matrix P of the chain for n=3 and draw the corresponding transition graph.
- Consider the vector mu0=(1,0,0,0,0,0,0,0). Multiply mu0 by P to
get mu1. Multiply mu1 by P to get mu2, etc.
Compute several values of the mu's. What is happening to the mu's? (You might want to write a little
program to help you with this - a sketch of one appears after this homework. Submit the printout with the values of the mu's.)
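Here is one possible version of the "little program" mentioned above, assuming the lazy bit-flip chain exactly as defined in the problem; state s is the binary number, and s ^ (1 << i) is s with its i-th bit flipped.

```python
# One possible version of the "little program" mentioned above: build the
# transition matrix P for n = 3 and watch mu0, mu1 = mu0 P, mu2 = mu1 P, ...
import numpy as np

n = 3
size = 2 ** n                       # states = binary numbers 0 .. 2^n - 1
P = np.zeros((size, size))
for s in range(size):
    P[s, s] = 0.5                   # with probability 1/2, do not change the number
    for i in range(n):              # otherwise flip a uniformly random bit:
        P[s, s ^ (1 << i)] += 0.5 / n   # s ^ (1 << i) flips the i-th bit of s

mu = np.zeros(size)
mu[0] = 1.0                         # mu0 = (1,0,0,0,0,0,0,0)
for t in range(10):
    print(t, np.round(mu, 4))
    mu = mu @ P                     # mu_{t+1} = mu_t P
```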
Week 5
- Slides for this week: Chapter 4: Coupling - pdf, ppt
-
Tuesday, Jan 12
- Covered in class: properties of Markov chains, mixing time (end of Chapter 3, beginning of Chapter 4)
- Covered in class: solutions of Hw3
-
Thursday, Jan 14
- Covered in class: the coupling technique, the first half of the coupling argument applied on the coloring Markov chain
- Homework 5, due 1/22 4:00pm:
- Let n,k be integers such that 0<=k<=n-1. Consider the following Markov chain on binary numbers of length n
that contain exactly k or k+1 ones:
- With probability 1/2 do not change the binary number.
- Otherwise, choose one of the bits uniformly at random, let it be the i-th bit.
- Flip the i-th bit if the resulting number contains between k and k+1 ones.
- Otherwise do not change the binary number.
Now answer the following questions:
- How many states does the Markov chain have?
- Is the Markov chain ergodic? Justify your answer.
- Describe a stationary distribution for this Markov chain. Is the stationary distribution unique? (A numerical sanity check for stationarity appears after this homework.)
- Design a (small) Markov chain with a unique non-uniform stationary distribution.
- Design a (small) Markov chain with several different stationary distributions.
How to submit: (a) bring your solutions to my office hours, (b) submit by e-mail, (c) bring them to the class on Thursday, or (d) slide them under my office door before the deadline.
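For the stationary-distribution questions above, a quick numerical sanity check can help: a distribution pi is stationary for a transition matrix P exactly when pi·P = pi. A minimal sketch (the function name and the example chain are illustrative only):

```python
# A minimal numerical sanity check: pi is stationary for P exactly when pi P = pi.
import numpy as np

def is_stationary(pi, P, tol=1e-9):
    pi = np.asarray(pi, dtype=float)
    return np.allclose(pi @ P, pi, atol=tol)

# Example: a lazy two-state chain for which the uniform distribution is stationary.
P = np.array([[0.75, 0.25],
              [0.25, 0.75]])
print(is_stationary([0.5, 0.5], P))   # True
print(is_stationary([0.9, 0.1], P))   # False
```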
Week 6
- Slides for this week: we continued with the coupling slides
-
Tuesday, Jan 19
- Covered in class: finishing the coupling argument
-
Thursday, Jan 21
- Covered in class: eigenvalues of a matrix and the intuition behind the connection between the mixing time and eigenvalues of the transition matrix
- Covered in class: Path Coupling
- Covered in class: Solutions of Hw 4
- Homework 6, due 1/29 4:00pm:
- Suppose we have a Markov chain on n states where:
- If the current state is state 1: we roll an n-sided die. We move to the state with the number shown on the die.
- If the current state is not state 1: we roll an n-sided die. If the die shows the number 1, we move to state 1.
Otherwise, we stay in the current state.
Answer the following questions:
- If we start in state 1, what is the mixing time of the Markov chain for epsilon = 0.1?
- If we start in state 2, intuitively, what do you think the mixing time is going to be? Justify your answer.
- Skim through http://en.wikipedia.org/wiki/Chernoff_bound.
Recall the first problem of Hw4, namely the definition of the random variable Z.
Apply the Chernoff-Hoeffding bound stated in the section titled "Theorem for additive form" to the following problem:
how many X's do you need to average so that Z will be within an additive error of epsilon = 0.05 with probability > 3/4?
- Consider the Markov chain from the last problem in Hw 4, sampling binary numbers of length n.
Use coupling to estimate the mixing time - how many steps do we need to get within epsilon of the stationary distribution (measured by the total variation distance)?
- Does your argument from the previous problem work on the Markov chain from the first problem in Hw 5?
If yes, explain how the argument goes. If not, explain where it breaks.
- What are the eigenvalues of the transition matrix of this Markov chain for n=3? List the eigenvalues and the corresponding
eigenvectors. You may use any available software. (A short numpy sketch appears after this homework.)
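For the eigenvalue question, here is a short numpy sketch, assuming "this Markov chain" refers to the lazy bit-flip chain from the last problem of Hw 4 (the matrix construction repeats the sketch posted under Hw 4):

```python
# A short numpy sketch: eigenvalues of the lazy bit-flip chain for n = 3.
import numpy as np

n = 3
size = 2 ** n
P = np.zeros((size, size))
for s in range(size):
    P[s, s] = 0.5                       # stay put with probability 1/2
    for i in range(n):                  # otherwise flip a uniformly random bit
        P[s, s ^ (1 << i)] += 0.5 / n

eigenvalues, eigenvectors = np.linalg.eig(P)
order = np.argsort(eigenvalues.real)[::-1]
print(np.round(eigenvalues.real[order], 4))       # P is symmetric, so all real
print(np.round(eigenvectors.real[:, order], 4))   # columns are the eigenvectors
```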
Week 7
- Slides for this week: pdf, ppt
-
Tuesday, Jan 26
- Covered in class: meaning of mu·P^t for some starting distribution mu and transition matrix P,
and the connection with the stationary distribution
- Covered in class: introduction to Canonical Paths (Chapter 5)
- Covered in class: solutions of Problem 1 from Hw 5
- The project has been posted. The presentation schedule will be posted soon.
-
Thursday, Jan 28
- Covered in class: canonical paths applied to the Markov chain on all matchings - we will finish the encoding idea
on Tuesday
- The presentation schedule has been posted.
- Recall, the exam is next Thursday, Feb 4, 12-1:50pm in class.
Week 8
- Slides for this week: Demo for the canonical paths argument - pdf, ppt
-
Tuesday, Feb 2
- Covered in class: finished the canonical paths argument applied to matchings
- Covered in class: solutions of Hw6
- Covered in class: solutions of the canonical path practice problem from the Exam website
-
Thursday, Feb 4