My thoughts so far:
For any given matrix $A$ there is a function $f_A:{\cal P}([m])\to\{0,\dots,n\}$ (where $[m]$ denotes $\{1,\dots,m\}$) such that $f_A(X)$ is the size of the set $\{j\in[n]:\forall i\in X.\,a_{ij}=1\}$, i.e. the number of ones in the bitwise conjunction of the rows $A_i$ for $i\in X$ (note that the codomain must contain $0$, since this count can be zero). One can define the equivalence relation $f\sim^* f'$ to hold whenever $f\circ(X\mapsto\{\sigma(i):i\in X\})=f'$ for some $\sigma\in{\cal S}_m$. Then $A\sim A'$ implies $f_A\sim^*f_{A'}$, and my intuition says the converse holds as well. So (i) would reduce to studying the equivalence relation $\sim^*$, and for (ii) one could ask which functions $f:{\cal P}([m])\to\{0,\dots,n\}$ actually arise as $f_A$ for some matrix $A$.
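For concreteness, $f_A$ can be computed directly from the definition (a sketch; the matrix is represented as a list of Boolean rows, indices are $0$-based here, and `fA` is just a name I'm picking):

```haskell
import Data.List (foldl')

-- fA a xSet: the number of columns j with a_ij = True for every
-- i in xSet, i.e. the number of Trues in the conjunction of the
-- selected rows.  For xSet = [] the empty conjunction is all True,
-- so the result is n.  Assumes a has at least one row.
fA :: [[Bool]] -> [Int] -> Int
fA a xSet = length (filter id conj)
  where
    n    = length (head a)
    conj = foldl' (zipWith (&&)) (replicate n True) [a !! i | i <- xSet]
```

For example, with the rows `[True,False,True]` and `[True,True,False]` one gets `fA a [0,1] == 1`, since only the first column is all ones.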
Then I had another idea: one can simply order matrices lexicographically, interpreting each matrix as the concatenation of its rows. I wrote a Haskell script to find the minimal $A'$ in $[A]_\sim$ w.r.t. this ordering. I don't know exactly how efficient it is, but it should beat brute-forcing through all $\sigma$-$\tau$-combinations. Here foo is a helper function that finds the minimal element among all column-only permutations of a given matrix, so getMinEqu only brute-forces through the row permutations. foo works by first ordering the columns so that the first row looks like [f,...,f,t,...,t], and then applying itself to the two submatrices sitting below the fs and below the ts.
import Data.List (elemIndices, permutations, sort)

f :: Bool
f = False

t :: Bool
t = True

-- Select the entries of xs at the given positions.
getInd :: [Int] -> [a] -> [a]
getInd ks xs = [xs !! k | k <- ks]

-- Minimal representative of the equivalence class of xs w.r.t. the
-- lexicographic order on the concatenation of the rows: brute-force
-- over the row permutations, minimising over the column
-- permutations with foo.
getMinEqu :: [[Bool]] -> [[Bool]]
getMinEqu xs = head $ sort $ map foo $ permutations xs
  where
    -- foo finds the lexicographically minimal matrix among all
    -- column permutations of its argument: it sorts the columns so
    -- that the first row becomes [f,..,f,t,..,t], then recurses on
    -- the two submatrices below the f-block and below the t-block.
    foo :: [[Bool]] -> [[Bool]]
    foo [] = []
    foo (x:xs)
      | m * n > 1
          = zipWith (++)
              (map (const f) falPos : foo (map (getInd falPos) xs))
              (map (const t) truPos : foo (map (getInd truPos) xs))
      | otherwise = x : xs
      where
        m, n :: Int
        m = length (x : xs)  -- number of rows
        n = length x         -- number of columns
        falPos, truPos :: [Int]
        falPos = elemIndices f x
        truPos = elemIndices t x
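For reference, the naive baseline that brute-forces through all $\sigma$-$\tau$-combinations is easy to write down as an independent sanity check (a sketch; `getMinEqu'` is just a name I'm picking, and the column permutations are obtained by permuting the rows of the transpose):

```haskell
import Data.List (permutations, transpose)

-- Naive canonical form: the minimum, in the same lexicographic
-- order, over all row permutations combined with all column
-- permutations.  This generates m! * n! candidate matrices, so it
-- is only feasible for tiny inputs.
getMinEqu' :: [[Bool]] -> [[Bool]]
getMinEqu' a = minimum
  [ transpose cols                         -- matrix with columns permuted
  | rows <- permutations a                 -- all row permutations sigma
  , cols <- permutations (transpose rows)  -- all column permutations tau
  ]
```

This makes it easy to test getMinEqu against the brute force on small random matrices, since both should return the same representative.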