As Jyrki notes, your condition is a special case of linear disjointness. For example, the following is taken from Lang's Algebra, Section VIII.3:
Definition. Let $K$ and $L$ be extensions of $k$, contained in some common algebraically closed field $\Omega$ that contains $k$. We say that $K$ is linearly disjoint from $L$ over $k$ if every finite set of elements of $K$ that is linearly independent over $k$ is also linearly independent over $L$.
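For a concrete illustration (a standard example, not from Lang): with $k=\mathbb{Q}$, $K=\mathbb{Q}(\sqrt{2})$, and $L=\mathbb{Q}(\sqrt{3})$, the $\mathbb{Q}$-basis $\{1,\sqrt{2}\}$ of $K$ remains linearly independent over $L$ (because $\sqrt{2}\notin\mathbb{Q}(\sqrt{3})$), and hence so does every $\mathbb{Q}$-linearly independent subset of $K$; so $K$ is linearly disjoint from $L$ over $\mathbb{Q}$. By contrast, taking $K=L=\mathbb{Q}(\sqrt{2})$, the set $\{1,\sqrt{2}\}$ is linearly independent over $\mathbb{Q}$ but satisfies the $L$-linear relation $\sqrt{2}\cdot 1+(-1)\cdot\sqrt{2}=0$, so $\mathbb{Q}(\sqrt{2})$ is not linearly disjoint from itself over $\mathbb{Q}$.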
Although the definition is asymmetric, the condition is in fact symmetric:
Proposition. $K$ is linearly disjoint from $L$ over $k$ if and only if $L$ is linearly disjoint from $K$ over $k$.
Proof. Let $y_1,\ldots,y_n\in L$ be elements that are linearly independent over $k$, and let $\alpha_1y_1+\cdots +\alpha_ny_n = 0\tag{1}$ be a $K$-linear combination equal to $0$. Reordering if necessary, assume that $\alpha_1,\ldots,\alpha_r$ are linearly independent over $k$, and $\alpha_{r+1},\ldots,\alpha_n$ are $k$-linear combinations of $\alpha_1,\ldots,\alpha_r$; that is, $\alpha_i = \sum_{j=1}^r \beta_{ij}\alpha_j,\qquad i=r+1,\ldots,n.$
We can rewrite $(1)$ to get $\begin{align*} \sum_{j=1}^r \alpha_j y_j + \sum_{i=r+1}^{n}\left(\sum_{j=1}^r \beta_{ij}\alpha_j\right)y_i &=0\\ \sum_{j=1}^r \left(y_j + \sum_{i=r+1}^n\beta_{ij}y_i\right)\alpha_j&=0 \end{align*}$ Since $K$ is linearly disjoint from $L$ over $k$ and $\alpha_1,\ldots,\alpha_r$ are $k$-linearly independent, it follows that for each $j=1,\ldots,r$ we have $y_j + \sum_{i=r+1}^n\beta_{ij}y_i = 0.$ But since $y_1,\ldots,y_n$ are $k$-linearly independent, no such relation can hold when $r\geq 1$; thus $r=0$, so that every $\alpha_i$ is a $k$-linear combination of the empty family, that is, $\alpha_1=\cdots=\alpha_n=0$, as desired. $\Box$
Added to address a question raised in comments.
A question was raised in the comments on whether, like linear disjointness, the condition of being weakly independent can be made one-sided. That is, suppose that $a$ and $b$ are such that $[K(a,b):K(b)] = [K(a):K]$. Does it follow that $[K(a,b):K(a)] = [K(b):K]$?
Theorem. Let $K$ be a field, and let $a$ and $b$ be elements of some overfield that contains $K$. If $[K(a,b):K(b)] = [K(a):K]$, then $[K(a,b):K(a)] = [K(b):K]$.
Proof. It suffices to show that, for every $m$, if $1,b,b^2,\ldots,b^{m-1}$ are linearly independent over $K$, then they are linearly independent over $K(a)$. Suppose we have $p_0(a) + p_1(a)b + \cdots +p_{m-1}(a)b^{m-1}=0,$ where each $p_i(x)$ is a rational function with coefficients in $K$; clearing denominators, we may assume that they are in fact polynomials, and that they are of degree less than $[K(a):K]$ (of arbitrary degree if $a$ is transcendental). Let $n$ be the highest power of $a$ that occurs. Then we can rewrite this expression in the form $q_0(b) + q_1(b)a + \cdots + q_{n}(b)a^{n}=0$ where the $q_j$ are polynomials with coefficients in $K$ (namely, the coefficient of degree $i$ in $q_j(x)$ is the coefficient of degree $j$ in $p_i(x)$). Since $1,a,a^2,\ldots,a^n$ are linearly independent over $K$ and $[K(a,b):K(b)]=[K(a):K]$, they are linearly independent over $K(b)$, so we conclude that $q_j(x)=0$ for all $j$; this yields that the $p_i$ are $0$ for all $i$ as well, which establishes the claim. $\Box$
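As a quick sanity check of the Theorem (with standard degree computations over $\mathbb{Q}$): take $K=\mathbb{Q}$, $a=\sqrt{2}$, $b=\sqrt{3}$. Then $[\mathbb{Q}(\sqrt{2},\sqrt{3}):\mathbb{Q}(\sqrt{3})]=2=[\mathbb{Q}(\sqrt{2}):\mathbb{Q}]$, and indeed $[\mathbb{Q}(\sqrt{2},\sqrt{3}):\mathbb{Q}(\sqrt{2})]=2=[\mathbb{Q}(\sqrt{3}):\mathbb{Q}]$, as the Theorem predicts. The hypothesis can of course fail: with $a=\sqrt[3]{2}$ and $b=\omega\sqrt[3]{2}$, where $\omega$ is a primitive cube root of unity, $\mathbb{Q}(a,b)$ is the splitting field of $x^3-2$, of degree $6$ over $\mathbb{Q}$, so $[\mathbb{Q}(a,b):\mathbb{Q}(b)]=2\neq 3=[\mathbb{Q}(a):\mathbb{Q}]$; and correspondingly $[\mathbb{Q}(a,b):\mathbb{Q}(a)]=2\neq 3=[\mathbb{Q}(b):\mathbb{Q}]$.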
Theorem. Let $a$ and $b$ be algebraic over $K$. Then $a$ and $b$ are weakly independent over $K$ if and only if $K(a)$ is linearly disjoint from $K(b)$ over $K$.
Proof. Let $n=[K(a):K]$ and $m=[K(b):K]$. Assume first that $K(a)$ is linearly disjoint from $K(b)$ over $K$. Since $1,a,a^2,\ldots,a^{n-1}$ are $K$-linearly independent in $K(a)$, it follows that they are $K(b)$-linearly independent, and therefore the minimal polynomial of $a$ over $K(b)$ has degree at least $n$. Since the minimal polynomial of $a$ over $K$ has degree exactly $n$, it follows that $[K(a,b):K(b)]=n$. A symmetric argument shows that $[K(a,b):K(a)]=m$, proving that $a$ and $b$ are weakly independent over $K$.
Conversely, assume that $a$ and $b$ are weakly independent over $K$. Let $y_1,\ldots,y_r$ be elements of $K(a)$ that are linearly independent over $K$. We can write them in terms of $1,a,\ldots,a^{n-1}$, and we get: $\begin{align*} y_1 &= \alpha_{01} + \alpha_{11}a +\alpha_{21}a^2+\cdots +\alpha_{n-1,1}a^{n-1}\\ y_2 &= \alpha_{02} + \alpha_{12}a + \alpha_{22}a^2+\cdots + \alpha_{n-1,2}a^{n-1}\\ &\vdots\\ y_r &= \alpha_{0r} + \alpha_{1r}a + \alpha_{2r}a^2 + \cdots + \alpha_{n-1,r}a^{n-1}. \end{align*}$ Because $y_1,\ldots,y_r$ are linearly independent over $K$, some $r\times r$ subdeterminant of the $\alpha_{ij}$ is nonzero; that is, the matrix of the $\alpha_{ij}$ has rank $r$.
Suppose that $\beta_1,\ldots,\beta_r\in K(b)$ are such that $\beta_1y_1+\cdots + \beta_ry_r = 0$. Plugging in and reordering, we get: $0 = \sum_{i=1}^r \beta_i\alpha_{0i} + \left(\sum_{i=1}^r\beta_i\alpha_{1i}\right)a + \cdots + \left(\sum_{i=1}^r\beta_i\alpha_{n-1,i}\right)a^{n-1}.$ Since $1,a,a^2,\ldots,a^{n-1}$ are linearly independent over $K(b)$ (because $[K(b)(a):K(b)]=n$), it follows that $\sum_{i=1}^r \beta_i\alpha_{ji} = 0$ for $j=0,\ldots,n-1$. Viewing this as a homogeneous system of $n$ linear equations in the unknowns $\beta_1,\ldots,\beta_r$, the coefficient matrix has full rank $r$, so the only solution is $\beta_1=\cdots=\beta_r=0$. Thus, $y_1,\ldots,y_r$ are linearly independent over $K(b)$, showing that $K(a)$ is linearly disjoint from $K(b)$, as claimed. $\Box$
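To illustrate the equivalence (again over $\mathbb{Q}$, as a standard example): take $a=\sqrt[3]{2}$ and $b=\omega$, a primitive cube root of unity. Then $[\mathbb{Q}(a,b):\mathbb{Q}(b)]=3=[\mathbb{Q}(a):\mathbb{Q}]$ and $[\mathbb{Q}(a,b):\mathbb{Q}(a)]=2=[\mathbb{Q}(b):\mathbb{Q}]$, so $a$ and $b$ are weakly independent over $\mathbb{Q}$, and by the theorem $\mathbb{Q}(\sqrt[3]{2})$ and $\mathbb{Q}(\omega)$ are linearly disjoint over $\mathbb{Q}$. By contrast, the pair $a=\sqrt[3]{2}$, $b=\omega\sqrt[3]{2}$ mentioned above is not weakly independent, and accordingly $\mathbb{Q}(a)$ fails to be linearly disjoint from $\mathbb{Q}(b)$: the elements $1,\sqrt[3]{2},\sqrt[3]{4}$ of $\mathbb{Q}(a)$ are linearly independent over $\mathbb{Q}$ but become linearly dependent over $\mathbb{Q}(b)$, since $[\mathbb{Q}(a,b):\mathbb{Q}(b)]=2$.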
Added. The nonalgebraic cases.
Proposition. If $a$ is algebraic over $K$ and $b$ is transcendental over $K$, then both of the following conditions hold:
- $a$ and $b$ are weakly independent over $K$; and
- $K(a)$ and $K(b)$ are linearly disjoint over $K$.
Proof. If $a$ is algebraic and $b$ is transcendental over $K$, then $b$ is transcendental over $K(a)$; hence $[K(a,b):K(a)] = \infty = [K(b):K]$; by the theorem established above (with the roles of $a$ and $b$ exchanged), it follows that $[K(a,b):K(b)] = [K(a):K]$. Alternatively, the fact that $[K(a,b):K(b)]=[K(a):K]$ when $a$ is algebraic and $b$ is transcendental is well known. Thus, $a$ and $b$ are weakly independent.
For the second part, we can proceed as above: let $y_1,\ldots,y_m$ be elements of $K(a)$ that are linearly independent over $K$; to show that they are linearly independent over $K(b)$, write $r_1(b)y_1 + \cdots + r_m(b)y_m = 0$, where the $r_i$ are rational functions in $b$. Clearing denominators, we may assume that they are in fact polynomials. Since $b$ is transcendental over $K(a)$, the coefficient of each power of $b$ must vanish; each such coefficient is a $K$-linear combination of $y_1,\ldots,y_m$, so by the linear independence of $y_1,\ldots,y_m$ over $K$ all the coefficients of the $r_i$ are $0$, that is, $r_1(b)=\cdots=r_m(b)=0$. Thus, $y_1,\ldots,y_m$ remain independent over $K(b)$, so $K(a)$ is linearly disjoint from $K(b)$, as claimed. $\Box$
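For a concrete case of the algebraic/transcendental situation (granting the transcendence of $\pi$): take $K=\mathbb{Q}$, $a=\sqrt{2}$, $b=\pi$. Since $\mathbb{Q}(\pi)$ is isomorphic to a rational function field over $\mathbb{Q}$, we have $\sqrt{2}\notin\mathbb{Q}(\pi)$, so $[\mathbb{Q}(\sqrt{2},\pi):\mathbb{Q}(\pi)]=2=[\mathbb{Q}(\sqrt{2}):\mathbb{Q}]$, and $\pi$ remains transcendental over $\mathbb{Q}(\sqrt{2})$. Thus $\sqrt{2}$ and $\pi$ are weakly independent, and $\mathbb{Q}(\sqrt{2})$ and $\mathbb{Q}(\pi)$ are linearly disjoint over $\mathbb{Q}$, in line with the proposition.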
Proposition. Let $K$ be a field, and let $a$ and $b$ be transcendental over $K$. Then the following are equivalent:
- $a$ and $b$ are weakly independent over $K$;
- $K(a)$ and $K(b)$ are linearly disjoint over $K$.
Proof. (2)$\implies$(1): Since $1,a,a^2,\ldots,a^n$ are linearly independent over $K$ for every $n$, they are linearly independent over $K(b)$. Thus, $[K(a,b):K(b)]\gt n$ for all $n$, so $[K(a,b):K(b)]=\infty = [K(a):K]$. By the symmetry of linear disjointness, the same argument gives $[K(a,b):K(a)]=\infty=[K(b):K]$. Thus, $a$ and $b$ are weakly independent over $K$.
(1)$\implies$(2): Since $a$ and $b$ are weakly independent and transcendental, it follows that $a$ is transcendental over $K(b)$, and $b$ is transcendental over $K(a)$. If $r_1(a),\ldots,r_n(a)$ are $K$-linearly independent rational functions in $a$, and $s_1(b),\ldots,s_n(b)$ are rational functions in $b$ such that $s_1(b)r_1(a)+\cdots+s_n(b)r_n(a)=0,$ then clearing denominators we may assume that the $s_i$ are polynomials in $b$ and that the $r_i$ are (still $K$-linearly independent) polynomials in $a$. Since $b$ is transcendental over $K(a)$, the coefficient of each power of $b$ must vanish; each such coefficient is a $K$-linear combination of $r_1(a),\ldots,r_n(a)$, so by their linear independence all the coefficients of the $s_i$ are $0$, which yields $s_1(b)=\cdots=s_n(b)=0$. Thus, $K(a)$ and $K(b)$ are linearly disjoint. $\Box$
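A quick illustration of the transcendental case (with $t$ an indeterminate over $K$): the elements $a=t$ and $b=t^2$ of $K(t)$ are both transcendental over $K$, but they are not weakly independent, since $[K(a,b):K(b)]=[K(t):K(t^2)]=2\neq\infty$; correspondingly, $K(a)$ is not linearly disjoint from $K(b)$, as the $K$-linearly independent elements $1,t,t^2$ of $K(a)$ satisfy the $K(t^2)$-linear relation $t^2\cdot 1+0\cdot t-1\cdot t^2=0$. On the other hand, two algebraically independent indeterminates $a=t$ and $b=u$ are weakly independent over $K$, and $K(t)$ and $K(u)$ are linearly disjoint over $K$.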
The fact that linear disjointness is weaker than algebraic independence is also well known. For example, Proposition VIII.3.3 in Lang reads:
Proposition. Let $L$ be an extension of $k$, and let $\{u_1,\ldots,u_r\}$ be a set of quantities algebraically independent over $L$. Then $k(u)$ is linearly disjoint from $L$ over $k$.
On the other hand, we have the following:
Definition. We say that $K$ is free from $L$ over $k$ if every finite set of elements of $K$ algebraically independent over $k$ remains algebraically independent over $L$. If $(x)$ and $(y)$ are two sets of elements in $\Omega$, we say that they are free over $k$ (or independent over $k$) if $k(x)$ and $k(y)$ are free over $k$.
Proposition. If $K$ and $L$ are linearly disjoint over $k$, then they are free over $k$.
Proof. Let $x_1,\ldots,x_n$ be elements of $K$ algebraically independent over $k$. Suppose we have a relation $\sum y_iM_i(x_1,\ldots,x_n)=0$ where the $M_i(x_1,\ldots,x_n)$ are distinct monomials in $x_1,\ldots,x_n$ and $y_i\in L$. This gives a linear relation over $L$ among the $M_i(x_1,\ldots,x_n)$; but because $x_1,\ldots,x_n$ are algebraically independent over $k$, their distinct monomials are linearly independent over $k$, and so by linear disjointness they are linearly independent over $L$. Therefore, $y_i=0$ for all $i$, so $x_1,\ldots,x_n$ are algebraically independent over $L$. $\Box$
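The converse of this last proposition fails, and a standard counterexample can be made explicit (over $\mathbb{Q}$): take $k=\mathbb{Q}$, $K=\mathbb{Q}(\sqrt[3]{2})$, and $L=\mathbb{Q}(\omega\sqrt[3]{2})$. Both are algebraic over $\mathbb{Q}$, so neither contains a nonempty set that is algebraically independent over $\mathbb{Q}$, and hence $K$ and $L$ are (vacuously) free over $\mathbb{Q}$. But they are not linearly disjoint over $\mathbb{Q}$: $KL$ is the splitting field of $x^3-2$, of degree $6$ over $\mathbb{Q}$, so $[KL:L]=2<3=[K:\mathbb{Q}]$, and the $\mathbb{Q}$-linearly independent elements $1,\sqrt[3]{2},\sqrt[3]{4}$ of $K$ become linearly dependent over $L$.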