Let $(\Omega,\mathcal{F},P)$ be any probability space, and let $X:\Omega\to\mathbb{R}$ be any random variable (i.e., any measurable function). We say that $X$ has the binomial distribution if there exist $p\in[0,1]$ and $n\in\{0,1,2,\ldots\}$ such that $ P(\{\omega:X(\omega)=k\}) = \binom nk p^k(1-p)^{n-k}, $ for all $k\in\{0,1,2,\ldots,n\}$.
As an example, we could fix some $p\in[0,1]$ and $n\in\{0,1,2,\ldots\}$, then take $\Omega=\{(a_i)_{i=1}^n : a_i\in\{0,1\}\}$, take $\mathcal{F}$ to be the power set of $\Omega$, take $P$ to be the probability measure on $(\Omega,\mathcal{F})$ determined by the identity $ P(\{(a_1,\ldots,a_n)\}) = \prod_{i=1}^n p^{a_i}(1 - p)^{1 - a_i}, $ and finally take $X$ to be the random variable defined by $ X((a_i)_{i=1}^n) = a_1 + \cdots + a_n. $ We would then have to go through the calculations and prove that $X$ has the binomial distribution.
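As a sanity check (not part of the argument itself), this finite sample space can be enumerated exactly and the claimed distribution verified numerically. Here is a short Python sketch; the values $n=5$ and $p=0.3$ are arbitrary illustrative choices:

```python
from itertools import product
from math import comb

n, p = 5, 0.3  # illustrative values, not from the text

# Enumerate Omega = {0,1}^n and accumulate P(X = k) exactly,
# where P assigns each tuple the product measure above and
# X is the number of 1's in the tuple.
pmf = {k: 0.0 for k in range(n + 1)}
for a in product([0, 1], repeat=n):
    prob = 1.0
    for ai in a:
        prob *= p**ai * (1 - p)**(1 - ai)
    pmf[sum(a)] += prob

# Compare with the binomial pmf C(n, k) p^k (1-p)^(n-k).
for k in range(n + 1):
    exact = comb(n, k) * p**k * (1 - p)**(n - k)
    assert abs(pmf[k] - exact) < 1e-12
```

The assertion passes because the tuples with exactly $k$ ones each have probability $p^k(1-p)^{n-k}$, and there are $\binom nk$ of them.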
But this is just one example. Here is another. Let $p\in[0,1]$ and $n\in\{0,1,2,\ldots\}$ be given. Let $\Omega=\mathbb{R}^n$, let $\mathcal{F}$ be the collection of Lebesgue measurable sets, and let $P$ be the standard Gaussian measure defined by $ P(A) = \frac1{(2\pi)^{n/2}}\int_A \exp\left({-\frac12(x_1^2+\cdots+x_n^2)}\right)\,dx_1\cdots dx_n. $ Choose $x_0\in[-\infty,\infty]$ such that $ \frac1{\sqrt{2\pi}}\int_{-\infty}^{x_0}e^{-x^2/2}\,dx = p. $ Define $H:\mathbb{R}\to\mathbb{R}$ by $H(x)=1$ if $x\le x_0$, and $H(x)=0$ otherwise. Finally, define $X:\Omega\to\mathbb{R}$ by $ X(x_1,\ldots,x_n) = \sum_{i=1}^n H(x_i). $ It might be a little harder to check in this case, but one can still go through the calculations and verify that $X$ has the binomial distribution. (In this case, the components of $(x_1,\ldots,x_n)$ are independent standard Gaussians on $\mathbb{R}$, and $X$ counts how many of them are less than or equal to $x_0$.)
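This construction can also be illustrated by simulation (a sketch, not a proof). The quantile $x_0$ with $\Phi(x_0)=p$ is found here by bisection on the normal CDF via `math.erf`, and $n=5$, $p=0.3$ are again arbitrary illustrative values:

```python
import math
import random
from math import comb

n, p = 5, 0.3  # illustrative values, not from the text

# Standard normal CDF Phi(x) = P(Z <= x), via the error function.
def Phi(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Solve Phi(x0) = p by bisection (x0 is finite since 0 < p < 1).
lo, hi = -10.0, 10.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if Phi(mid) < p else (lo, mid)
x0 = 0.5 * (lo + hi)

# Sample (x_1, ..., x_n) from the standard Gaussian measure on R^n
# and let X count the coordinates at or below x0.
random.seed(0)
trials = 200_000
counts = [0] * (n + 1)
for _ in range(trials):
    X = sum(1 for _ in range(n) if random.gauss(0.0, 1.0) <= x0)
    counts[X] += 1

# The empirical pmf should be close to Binomial(n, p).
empirical = [c / trials for c in counts]
exact = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
```

With 200,000 trials the empirical frequencies agree with the binomial pmf to within ordinary Monte Carlo error.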
Finally, one last example. Let $\Omega=\{0,1,2,\ldots,n\}$, let $\mathcal{F}$ be the power set, and let $P$ be determined by the identities $ P(\{k\}) = \binom nk p^k(1-p)^{n-k}, $ for all $k\in\Omega$. It now follows that if $X(\omega)=\omega$, then $X$ has the binomial distribution.
The point here is that the distribution of a random variable does not tell us what the underlying probability space is, or what the function $X$ is. Different random variables on different probability spaces can have the same distribution. The distribution of $X$ is just a rule telling us how to determine $P(X\in A)$ for all measurable sets $A\subset\mathbb{R}$. It does not tell us what $X(\omega)$ is, nor does it tell us what $\Omega$ is.
Edit:
In light of the edited question, the answer is (in general) no. Since $\Omega$ has only two points, any random variable $X$ on $(\Omega,\Sigma,P)$ can take on at most two values. So if $p\in(0,1)$ and $n\ge 2$, then $X$ cannot have the binomial distribution. However, the product space $(\Omega^n,\Sigma^n,P^n)$ is exactly the probability space I used in the first example above.
To add further details, there are only four $\{0,1\}$-valued random variables on $(\Omega,\Sigma,P)$. They are $ U(0) = U(1) = 0; \quad X(0)=1,\ X(1) = 0; \quad Y(0)=0,\ Y(1)=1; \quad Z(0)=Z(1)=1. $ Of these four, only $X$ is Bernoulli with parameter $p$; there are no others. So it is not possible to construct $n$ independent Bernoulli($p$) random variables on $(\Omega,\Sigma,P)$. To do that, one must use $(\Omega^n,\Sigma^n,P^n)$.