Wikipedia tells us that the expected value is:
$E[X] = x_1p_1 + x_2p_2 + \cdots + x_kp_k$, where the $x_i$ are the values that $X$ can take, and the $p_i$ are the corresponding probabilities that $X$ takes each of those values.
There are 52 cards in the deck.
For game (A), the value of drawing your selected card is 1 dollar, and the value of drawing any other card is 0. The respective probabilities are 1/52 and 51/52. So we have:
$E = 1 \cdot \dfrac{1}{52} + 0 \cdot \dfrac{51}{52} = \dfrac{1}{52}$ dollars $\approx 1.9$ cents
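If you want a quick numerical sanity check, here is a minimal Python sketch of that same computation (the payoffs and probabilities are taken straight from the setup above):

```python
# Game (A): win $1 if the drawn card matches your pick, $0 otherwise,
# from a standard 52-card deck.
values = [1.0, 0.0]        # payoffs in dollars
probs  = [1/52, 51/52]     # corresponding probabilities

expected = sum(v * p for v, p in zip(values, probs))
print(f"E = {expected:.4f} dollars = {100 * expected:.1f} cents")  # ~1.9 cents
```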
Can you do game (B)?
(Added) You might want to consider how to generalize this, just in case you are accosted in a dark alley and forced to wager your life on a discrete random variable.
Let's say that you choose $1$ out of $n$ cards/roulette slots/numbers/etc., and you will receive $R$ if you are right, and $0$ if you are wrong. Then the expectation is
$E = R \cdot \dfrac{1}{n} + 0 \cdot \dfrac{n-1}{n} = \dfrac{R}{n}$
Comparing this with what it costs to play, the game is fair (you "break even") exactly when $\dfrac{R}{n}$ equals the cost of playing, i.e. when $R = n \times \text{cost}$.
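Here is a short Python sketch of that general formula and the break-even condition; the specific numbers at the end (a 52-card draw costing \$1 to play) are just illustrative assumptions, not part of the original problem:

```python
def expected_payoff(reward, n):
    """Expected value when 1 of n equally likely outcomes pays `reward` and the rest pay 0."""
    return reward * (1 / n) + 0 * ((n - 1) / n)   # simplifies to reward / n

def break_even_reward(cost, n):
    """Payoff R that makes the game fair: R / n == cost, so R = cost * n."""
    return cost * n

# Hypothetical example: a 52-card draw that costs $1 to play.
print(expected_payoff(52, 52))    # 1.0 -> a $52 payoff exactly breaks even on a $1 cost
print(break_even_reward(1, 52))   # 52.0
```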