I should be able to offer some insight here, at least for the Propp-Wilson algorithm, as I studied it for my dissertation. "Coupling" is an important concept here. This is, I think, where the whole "share a single random number generator" statement comes from.
To answer your questions:
> Do "several Markov chains ... share a single random-number generator" for a(i) and "the proposed move at each step, 'left' or 'right', is obtained in the same way by all the chains at any timestep, independent of the current state" for (b) mean the same?
In (a), they use the same random number but different rules to determine each step. I.e. some processes are allowed to step left while others step right, depending on the current state. Different rules do different things using the same generated random number.
In (b), they use the same generated random number and the same rules to determine each step. E.g. if a random number less than $0.5$ means one process steps left, then all processes step left. Notice that just before coalescing, all processes are on one side of the thick line in (b)i.
You can't have coalescence in (b) happen from both sides onto some state in between, because of the principle I just stated. Let me clarify: you could break that rule and might still get coalescence, but not necessarily. Consider the process $i\rightarrow\{0,1\}$ uniformly at random for $i=0,1$ (each step lands on $0$ or $1$ with equal probability, regardless of the current state). Let's simulate the process using coin flips: if $H$ means "switch values" and $T$ means "keep the current value", then processes started at $0$ and $1$ will never coalesce. However, if $H$ means $0\rightarrow1$ and $1$ stays $1$, and $T$ means $0$ stays $0$ and $1\rightarrow0$, then the processes can coalesce (and in fact coalesce after the first coin flip).
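To make the contrast concrete, here is a minimal sketch in Python. The helper names `bad_step` and `good_step` are my own; each implements one of the two couplings above, and each is a perfectly valid simulation of the uniform chain on its own — only the joint behaviour differs:

```python
import random

def bad_step(x, flip):
    # H: switch values, T: keep current value.
    # A valid uniform chain, but chains started at 0 and 1 never meet.
    return 1 - x if flip == 'H' else x

def good_step(x, flip):
    # H: go to 1, T: go to 0.
    # Same marginal law, but all chains agree after a single flip.
    return 1 if flip == 'H' else 0

random.seed(0)
a, b = 0, 1
for _ in range(1000):
    flip = random.choice('HT')
    a, b = bad_step(a, flip), bad_step(b, flip)
assert a != b   # still distinct after 1000 shared flips

a, b = 0, 1
flip = random.choice('HT')
a, b = good_step(a, flip), good_step(b, flip)
assert a == b   # coalesced after one shared flip
```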
> If yes, why does the coalescence in figure 32.1 a(i) happen not when the chains are all in the leftmost state or all in the rightmost state?
The process can coalesce on states that are accessible from every other state. This ties in to the above question as well. You could potentially create a "bad enough coupling" where coalescence happens only on certain states, or that coalescence never happens, depending on the nature of the stochastic process.
Edit: After re-reading your question, I think you might be referring to the right-most and left-most states among the *current* states of the processes, as opposed to the right-most and left-most states of the state space. This issue is addressed above: (1) the processes use the same specific generated random number, but (2) different rules. They won't necessarily coalesce "from one side only" if they use different rules. If they used different random numbers, they would move independently, period.
More details about coupling and exact sampling:
For example, let's say the process has $X_n\in\{0,1\}$: when $X_n=0$, $X_{n+1}$ is determined by the flip of a fair coin, and when $X_n=1$, $X_{n+1}=0$ (with probability 1). I'm pretty sure this example is straight from a paper I read years ago. There are two initial states to choose from, and we could start in each state and simulate the chains independently using different coins. However, we can also couple two copies of the process, one starting at $0$ and the other starting at $1$, and simulate them both off of the same coin flips.
Now we have to choose our coupling. For example, a coin flip of $H$ means that $0$ becomes $1$ and $1$ becomes $0$, and a coin flip of $T$ means that $0$ stays zero and $1$ becomes $0$. You have to choose how to couple the processes carefully sometimes, but this example process is easy to work with.
The two coupled processes run forward in time will coalesce on the first tails coin flip. For example: $\begin{array}{cccccccc} \text{coin flips:} & H & H & H & T & H & H & T\\ X_0 & X_1 & X_2 & X_3 & X_4 & X_5 & X_6 & X_7 \\ 0 & 1 & 0 & 1 & 0 & 1 & 0 & 0 \\ 1 & 0 & 1 & 0 & 0 & 1 & 0 & 0 \end{array}$ After coalescing at the fourth step, both processes move together because they are coupled through a single set of coin flips. They are both following the rules of the process perfectly.
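The table above can be reproduced with a few lines of Python. The function name `step` is my own; it encodes the coupling chosen earlier ($H$: $0\rightarrow1$ and $1\rightarrow0$; $T$: both states go to $0$):

```python
def step(x, flip):
    if flip == 'H':
        return 1 - x   # H: 0 -> 1 and 1 -> 0
    return 0           # T: both 0 and 1 map to 0

flips = ['H', 'H', 'H', 'T', 'H', 'H', 'T']
x, y = 0, 1                      # the two coupled starting states
path_x, path_y = [x], [y]
for f in flips:                  # drive both chains off the same flips
    x, y = step(x, f), step(y, f)
    path_x.append(x)
    path_y.append(y)

# path_x == [0, 1, 0, 1, 0, 1, 0, 0]
# path_y == [1, 0, 1, 0, 0, 1, 0, 0]
```

Once the two paths agree at $X_4$, they agree forever after, since both apply the same deterministic update to the same state.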
The important thing for exact sampling is that you use "coupling from the past". In other words, you don't simulate them forward in time until they coalesce. You simulate one step (step $n=0$), and check if they coalesced (e.g. you got a $T$). If it wasn't a $T$, you flip the coin again, and put that coin flip ahead of the previous one (at step $n=-1$). If you still didn't get a $T$, you continue this process "into the past". Once you get a $T$ the process has coalesced, and you run it forward in time through your coupling and output the state at time $n=0$. This output state, $X_0$, is an exact sample from the stationary distribution of the Markov chain. The coupling from the past will be a $T$ and then some undetermined number of $H$'s, e.g. $THH$ or $THHH$. The following is an example for coupling from the past: $\begin{array}{ccccc} \text{coin flips:} & T & H & H & H \\ X_{-4} & X_{-3} & X_{-2} & X_{-1} & X_0 \\ 0 & 0 & 1 & 0 & 1 \\ 1 & 0 & 1 & 0 & 1 \end{array}$ Note that the $H$ at $n=0$ is the first coin flip, the $H$ at $n=-1$ is the second coin flip, etc. However, the initial values are set at $n=-4$ (in the past) and run forward to time $n=0$.
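Here is a sketch of that procedure in Python for this two-state chain. The name `cftp` is mine, and for simplicity I extend the past one flip at a time rather than using the usual doubling trick from the Propp-Wilson paper; the key point is that flips already assigned to a time step are reused, never redrawn:

```python
import random

def step(x, flip):
    # The coupling from above: H flips the state, T sends both states to 0.
    return (1 - x) if flip == 'H' else 0

def cftp(rng):
    # flips[0] is the flip at time n=0; each new flip goes one step
    # further into the past (n=-1, n=-2, ...), and old flips are reused.
    flips = []
    while True:
        flips.append(rng.choice('HT'))
        # Run every starting state forward from time -len(flips) to 0,
        # applying the oldest (furthest-past) flip first.
        states = {0, 1}
        for f in reversed(flips):
            states = {step(x, f) for x in states}
        if len(states) == 1:        # coalesced by time n=0
            return states.pop()     # exact sample from the stationary dist.

rng = random.Random(42)
sample = cftp(rng)  # a single exact draw, either 0 or 1
```

For this particular chain the loop terminates on the first $T$, matching the $THH\ldots H$ pattern described above.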
My intuitive understanding is that in order to get a true sample from the stationary distribution, you can't just output the state at the moment of first coalescence; you have to keep running the coupled chain forward for the (random) remaining length of time up to $n=0$.
For this process the stationary distribution is $2/3$ probability of state $0$ and $1/3$ probability for state $1$. The transition matrix is $T=\left[\begin{array}{cc} 1/2 & 1/2 \\ 1 & 0 \end{array}\right].$ Solving for the stationary distribution gives $\pi=\pi T$, thus $\pi=[2/3, 1/3]$.
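The stationary distribution can be double-checked numerically, e.g. by finding the left eigenvector of $T$ for eigenvalue $1$ (a quick sanity check, not part of the algorithm):

```python
import numpy as np

T = np.array([[0.5, 0.5],
              [1.0, 0.0]])

# pi = pi T means pi is a left eigenvector of T with eigenvalue 1,
# i.e. a right eigenvector of T transposed.
evals, evecs = np.linalg.eig(T.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi /= pi.sum()   # normalize to a probability vector

# pi is approximately [2/3, 1/3]
```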
To get state $0$ as output from the coupling from the past algorithm, you need an even number of $H$'s following the $T$ and an odd number of $H$'s gives state $1$ as the sample. These probabilities can be calculated directly using geometric series as well to confirm. $P(\text{even number of }H\text{'s before a }T)= \sum_{n=0}^\infty \frac{1}{2^{2n}}\cdot \frac{1}{2}=\frac{1}{2}\frac{1}{1-\frac{1}{4}}=\frac{2}{3}$ $P(\text{odd number of }H\text{'s before a }T)= \sum_{n=0}^\infty \frac{1}{2^{2n+1}}\cdot \frac{1}{2}=\frac{1}{4}\frac{1}{1-\frac{1}{4}}=\frac{1}{3}$
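The two geometric series can also be checked by just summing enough terms (partial sums converge very quickly here):

```python
# P(even number of H's before the T): sum over n of (1/2)^(2n) * (1/2)
p_even = sum(0.5 ** (2 * n) * 0.5 for n in range(60))
# P(odd number of H's before the T): sum over n of (1/2)^(2n+1) * (1/2)
p_odd = sum(0.5 ** (2 * n + 1) * 0.5 for n in range(60))

# p_even -> 2/3,  p_odd -> 1/3,  and they sum to 1
```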
I know this is an old question, but I hope someone finds this helpful!