First, a brief (I hope) digression regarding orientation. For Euclidean vector spaces of dimension $3$ or less, we have a physical intuition for orientation that is connected to our sense of direction: direction of movement along a line, clockwise vs. counterclockwise rotations in the plane, the right-hand rule for Euclidean space. If we examine the effects of (matrices of) linear transformations on these notions, we find that those with negative determinant reverse these directions, while those with positive determinant leave them unchanged: multiplying by a negative number reverses motion along a line; a reflection of the plane turns clockwise rotations into counterclockwise ones; and so on. We classify linear transformations as orientation-changing or orientation-preserving accordingly.
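To make the plane case concrete, here is a small numerical sketch of my own (the helper names `signed_area` and `reflect_x` are just illustrative, not from any library): a reflection across the $x$-axis has determinant $-1$, and it turns a counterclockwise pair of vectors into a clockwise one.

```python
def signed_area(v, w):
    """z-component of the cross product of v and w;
    positive exactly when (v, w) is a counterclockwise pair."""
    return v[0] * w[1] - v[1] * w[0]

def reflect_x(v):
    """Reflection across the x-axis: (x, y) -> (x, -y).
    Its matrix [[1, 0], [0, -1]] has determinant -1."""
    return (v[0], -v[1])

v, w = (1, 0), (0, 1)
print(signed_area(v, w))                        # 1: counterclockwise
print(signed_area(reflect_x(v), reflect_x(w)))  # -1: now clockwise
```

The sign of `signed_area` flips under the reflection, which is exactly the "clockwise becomes counterclockwise" behavior described above.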
Notice that there is no intrinsic property of these vector spaces qua vector spaces that allows us to say that a particular orientation is “positive,” “good,” or whatever term we’d like to use. We make a choice based on external criteria. From the point of view of a pure vector space, this choice is entirely arbitrary. As someone else put it recently, aside from some harmless changes of sign here and there, everything works the same way regardless of this choice.
In higher-dimensional spaces, we run into a problem: most of us don’t have a geometric intuition to fall back on that will let us define “right-handed.” When considering more abstract vector spaces, there may not even be a geometrical model available. What would the right-hand rule even mean for the space of polynomials of degree less than three, or the space of current flows in an electrical circuit with three resistors?
So, we basically define orientation via the determinants of change-of-basis matrices. Specifically, two bases $\mathscr B$ and $\mathscr B'$ of $\mathbb R^n$ have the same orientation if the change-of-basis matrix that relates them has a positive determinant, and opposite orientations if it is negative. Establishing the “good” orientation amounts to selecting one of these two equivalence classes. For more abstract real $n$-dimensional spaces, we observe that a choice of basis induces an isomorphism with $\mathbb R^n$, and that these isomorphisms then induce orientations in the abstract space.
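As a quick sanity check (again a sketch of my own, with a hand-rolled `det2` since nothing beyond a $2\times 2$ determinant is needed): take two bases of $\mathbb R^2$ and look at the sign of the determinant of the matrix whose columns are the new basis vectors written in the old basis.

```python
def det2(m):
    """Determinant of a 2x2 matrix given as [[a, b], [c, d]]."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

# Rotate the standard basis (e1, e2) counterclockwise by 90 degrees:
# the columns below are the rotated basis vectors in standard coordinates.
rotated = [[0, -1],
           [1,  0]]
print(det2(rotated))   # 1: positive, so same orientation as (e1, e2)

# Swap e1 and e2 instead:
swapped = [[0, 1],
           [1, 0]]
print(det2(swapped))   # -1: negative, so opposite orientation
```

A rotated basis stays in the same equivalence class as the standard basis; a swapped one lands in the other class.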
Moving finally to the specific transformations you’re considering, the matrix that corresponds to swapping axes around in $\mathbb R^n$ is a permutation matrix, that is, the identity matrix with its columns permuted in the same way that the axes are to be rearranged. (Remember that the columns of the matrix are the images of the original axes.) It’s a basic property of determinants that swapping two columns of a matrix changes the sign of its determinant, so swapping a pair of axes is an orientation-changing transformation. To preserve the original orientation, we need to combine it with something else that also changes the sign of the determinant. Flipping an axis multiplies the corresponding column by $-1$, so flipping an odd number of axes will do the trick, as will swapping other axes around so that you end up with an even permutation, or some other combination of these actions that results in a positive determinant. The bottom line is that, when you swap a pair of axes, some other axes have to come along for the ride as well if you want to keep the same orientation.
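The bookkeeping in that last paragraph can be sketched in a few lines (my own illustration; `det` and `axis_map` are ad hoc helpers, with a naive Laplace-expansion determinant that is fine for small $n$): build the matrix for an axis rearrangement, optionally flip some axes, and check the sign of the determinant.

```python
def det(m):
    """Determinant by Laplace expansion along the first row (fine for small n)."""
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * m[0][j] * det(minor)
    return total

def axis_map(perm, flips=()):
    """Matrix sending axis i to axis perm[i]; axes listed in `flips` are negated."""
    n = len(perm)
    m = [[0] * n for _ in range(n)]
    for i, p in enumerate(perm):
        m[p][i] = -1 if i in flips else 1
    return m

# Swapping the x- and y-axes of R^3 reverses orientation...
print(det(axis_map((1, 0, 2))))              # -1
# ...but swapping them *and* flipping the z-axis preserves it.
print(det(axis_map((1, 0, 2), flips=(2,))))  # 1
```

The second call is exactly the “some other axis comes along for the ride” situation: the swap contributes one sign change, the flip contributes another, and the product of the two signs is positive.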