(Note: In mathematics, $\{\cdot\}$ traditionally denotes a set: a construction where order and repetitions have no significance. However, in many programming languages it denotes something like an ordered tuple, so in deference to the OP, I have preserved their original notational choice.)
We're looking for a function $f$ which takes in a finite sequence and outputs a number, so that:
- $f\left(\{ 0, 0, 0, 5, 5, 5, 5, 10, 10, 10, 10, 15, 15, 15, 15 \}\right)$ is slightly higher than $f\left(\{ 50, 50, 50, 50, 62, 62, 50, 62, 80, 80, 80, 80, 100, 100, 80, 100 \}\right)$
- $f\left(\{ 0, 1,2,3,4,5,6,7,8,9 \}\right)$ is higher than $f\left(\{ 0,0,0,0,0,9,9,9,9,9\}\right)$
- $f\left(\{ 0,0,0,0,9,0,9,9,9,9 \}\right)$ is higher than $f\left(\{9,0,0,0,0,9,9,9,9,0\}\right)$
I'm going to walk through a series of candidate functions, hopefully ending with one you find satisfying; the goal is to explain the thought process I followed, rather than to dump a complicated formula with no motivation.
Firstly, since "the differences between each step should not matter" we can make a rough measure of "going up" by giving a point each time the next value is bigger, and taking away a point each time the next value is smaller. Said another way, consider the sequence of differences: e.g. {50,50,50,50,62,62,50,62,80,80,80,80,100,100,80,100} has differences {0,0,0,12,0,-12,12,18,0,0,0,20,0,-20,20}. We give +1 point for each positive difference, and -1 point for each negative difference. Call this function $f_1$. One problem is that both $f_1\left(\{0,0,0,5,5,5,5,10,10,10,10,15,15,15,15\}\right)$ and $f_1\left(\{50,50,50,50,62,62,50,62,80,80,80,80,100,100,80,100\}\right)$ are $3$.
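A minimal sketch of $f_1$ in Python (my own illustration, not part of the original answer), applied to the first example pair:

```python
def f1(xs):
    # +1 for each positive consecutive difference, -1 for each negative one
    diffs = [b - a for a, b in zip(xs, xs[1:])]
    return sum((d > 0) - (d < 0) for d in diffs)

a = [0, 0, 0, 5, 5, 5, 5, 10, 10, 10, 10, 15, 15, 15, 15]
b = [50, 50, 50, 50, 62, 62, 50, 62, 80, 80, 80, 80, 100, 100, 80, 100]
print(f1(a), f1(b))  # 3 3 -- f1 can't tell these apart
```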
To fix this, we should normalize so that length is not a factor (the second sequence in that pair is longer). The highest number of times a sequence could go up is one less than its length, so consider $f_2(X)=\frac{f_1(X)}{\text{length}(X)-1}\text{.}$ Then, for the three example pairs, $f_2$ outputs $3/14>1/5$, $1>1/9$, and $1/9>-1/9$, so it seems to work. However, "either increasing or decreasing sets of values are both acceptable", and a strictly decreasing sequence would get a negative value.
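Extending the same Python sketch (again my own illustration), $f_2$ on the three example pairs:

```python
from fractions import Fraction

def f1(xs):
    diffs = [b - a for a, b in zip(xs, xs[1:])]
    return sum((d > 0) - (d < 0) for d in diffs)

def f2(xs):
    # normalize by the maximum possible number of rises
    return Fraction(f1(xs), len(xs) - 1)

print(f2([0, 0, 0, 5, 5, 5, 5, 10, 10, 10, 10, 15, 15, 15, 15]))            # 3/14
print(f2([50, 50, 50, 50, 62, 62, 50, 62, 80, 80, 80, 80, 100, 100, 80, 100]))  # 1/5
print(f2([0, 1, 2, 3, 4, 5, 6, 7, 8, 9]), f2([0, 0, 0, 0, 0, 9, 9, 9, 9, 9]))   # 1 1/9
print(f2([0, 0, 0, 0, 9, 0, 9, 9, 9, 9]), f2([9, 0, 0, 0, 0, 9, 9, 9, 9, 0]))   # 1/9 -1/9
```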
To fix that, we could just take $f_3=|f_2|$, so that an $f_2$ of $-1$ is just as good as an $f_2$ of $1$. But then both entries of the third pair have an $f_3$ value of $1/9$: this function thinks {9,0,0,0,0,9,9,9,9,0} is trying to be decreasing (down 2, up 1) just as much as {0,0,0,0,9,0,9,9,9,9} is trying to be increasing (up 2, down 1). It therefore misses the fact, which the OP brought up, that {9,0,0,0,0,9,9,9,9,0} "looks more 'undecided' as to which direction it is going".
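In the same sketch (my own illustration), $f_3$ produces exactly this tie on the third pair:

```python
from fractions import Fraction

def f1(xs):
    diffs = [b - a for a, b in zip(xs, xs[1:])]
    return sum((d > 0) - (d < 0) for d in diffs)

def f2(xs):
    return Fraction(f1(xs), len(xs) - 1)

def f3(xs):
    # direction-blind: a falling staircase scores like a rising one
    return abs(f2(xs))

print(f3([0, 0, 0, 0, 9, 0, 9, 9, 9, 9]), f3([9, 0, 0, 0, 0, 9, 9, 9, 9, 0]))  # 1/9 1/9
```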
To get a clue as to the best way to fix this, note that the OP uses words like "'smoother' staircase" and "noise". If we don't want to be too sensitive to noise, then we should be looking for a way of smoothing things out. One way is with sequences of simple moving averages. If we just look at the averages of adjacent pairs, {0,0,0,0,9,0,9,9,9,9} smooths out to {0,0,0,4.5,4.5,4.5,9,9,9}, which never goes down. But {9,0,0,0,0,9,9,9,9,0} smooths out to {4.5,0,0,0,4.5,9,9,9,4.5}, which goes down twice and up twice. This lets us see that, modulo a bit of noise, {0,0,0,0,9,0,9,9,9,9} is clearly going up, but {9,0,0,0,0,9,9,9,9,0} isn't as clearly doing anything. An even stronger indication of the undecidedness of {9,0,0,0,0,9,9,9,9,0} comes from the three-term averages: it becomes {3,0,0,3,6,9,9,6}, so at this scale it's going more up than down! The next candidate is $f_4$: the average $f_3$ score across all scales.
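The smoothing step, sketched in Python (my own illustration; the $k$-term simple moving average of a length-$n$ sequence has $n-k+1$ terms):

```python
def moving_average(xs, k):
    # simple moving average over windows of k consecutive terms
    return [sum(xs[i:i + k]) / k for i in range(len(xs) - k + 1)]

print(moving_average([0, 0, 0, 0, 9, 0, 9, 9, 9, 9], 2))
# [0.0, 0.0, 0.0, 4.5, 4.5, 4.5, 9.0, 9.0, 9.0] -- never goes down
print(moving_average([9, 0, 0, 0, 0, 9, 9, 9, 9, 0], 2))
# [4.5, 0.0, 0.0, 0.0, 4.5, 9.0, 9.0, 9.0, 4.5] -- down twice, up twice
print(moving_average([9, 0, 0, 0, 0, 9, 9, 9, 9, 0], 3))
# [3.0, 0.0, 0.0, 3.0, 6.0, 9.0, 9.0, 6.0] -- more up than down
```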
However, it's much better if a sequence is strictly monotonic before smoothing than if we have to smooth it to see the trend, so we should weight the $f_3$ scores. One weighting that feels natural to me is that the $f_3$ score of the $k$-term averages should be worth $1/k$ as much as the $f_3$ score of the original sequence. So define $f_5(X)=\left(\sum_{k=1}^{\text{length}(X)-1}\frac{f_3(k\text{-term averages of }X)}k\right)/\left(\sum_{k=1}^{\text{length}(X)-1}\frac1k\right)$ (Here we divide by the harmonic number to make the maximum score $1$ regardless of length.) However, this has a subtle problem. What if the sequence looks monotonic at several scales, but in opposite directions at different scales? For example, the plot may look something like this:
Then $f_5$ would score it fairly highly, even though it's very confused.
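To make this concrete in the Python sketch (the zig-zag sequence below is my own illustrative stand-in, not from the question): it climbs in small steps but trends downward, so it looks increasing at the raw scale and decreasing once smoothed, yet $f_5$ still gives it roughly $0.66$.

```python
from fractions import Fraction

def f2(xs):
    # sign of each difference, per the definition of f1, normalized by length - 1
    diffs = [b - a for a, b in zip(xs, xs[1:])]
    return Fraction(sum((d > 0) - (d < 0) for d in diffs), len(xs) - 1)

def moving_average(xs, k):
    return [Fraction(sum(xs[i:i + k]), k) for i in range(len(xs) - k + 1)]

def f5(xs):
    # f3 = |f2| at each smoothing scale, weighted 1/k, normalized by the harmonic number
    n = len(xs)
    num = sum(abs(f2(moving_average(xs, k))) / k for k in range(1, n))
    return num / sum(Fraction(1, k) for k in range(1, n))

zigzag = [6, 7, 8, 3, 4, 5, 0, 1, 2]   # up in steps of 1, but trending down
print(f2(zigzag))                       # 1/2: "going up" at the raw scale
print(f2(moving_average(zigzag, 3)))    # -1: "going down" once smoothed
print(float(f5(zigzag)))                # ~0.658 -- fairly high, despite the confusion
```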
If we don't want that picture to score highly, then we should instead insist that a sequence is either going up or going down, consistently across all scales. We should replace $f_3$ with $f_2$ in the above definition and take the absolute value at the end, to get $f_6(X)=\left|\sum_{k=1}^{\text{length}(X)-1}\frac{f_2(k\text{-term averages of }X)}k\right| /\left(\sum_{k=1}^{\text{length}(X)-1}\frac1k\right)$ When we apply $f_6$ to the three example pairs, we get approximately $0.603>0.571$, $1>0.456$, and $0.428>0.056$. This satisfies the "slightly higher" condition and shows just how confused {9,0,0,0,0,9,9,9,9,0} is.
With a bit of inspiration from Counting negative values in list, this can be implemented in Mathematica:

    f2[list_] := Mean[UnitStep[Differences[list] - 1] - UnitStep[-Differences[list] - 1]];
    f6[list_] := Abs[Total[Table[f2[MovingAverage[list, i]]/i, {i, 1, Length[list] - 1}]]]/HarmonicNumber[Length[list] - 1];
    N[f6[{9, 0, 0, 0, 0, 9, 9, 9, 9, 0}], 10]
would output 0.05644550428
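For readers without Mathematica, here is my own direct Python port of the code above (it reproduces the UnitStep thresholding, which only counts differences of magnitude at least $1$):

```python
def f2(xs):
    # +1 per difference >= 1, -1 per difference <= -1 (the UnitStep thresholds),
    # averaged over the number of differences
    diffs = [b - a for a, b in zip(xs, xs[1:])]
    return sum((d >= 1) - (d <= -1) for d in diffs) / (len(xs) - 1)

def moving_average(xs, k):
    # equivalent of Mathematica's MovingAverage[list, k]
    return [sum(xs[i:i + k]) / k for i in range(len(xs) - k + 1)]

def f6(xs):
    n = len(xs)
    total = sum(f2(moving_average(xs, k)) / k for k in range(1, n))
    harmonic = sum(1 / k for k in range(1, n))
    return abs(total) / harmonic

print(f6([9, 0, 0, 0, 0, 9, 9, 9, 9, 0]))  # ≈ 0.05644550428, as above
```

On the three example pairs this port reproduces the approximate values quoted earlier.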
However, there's one thing I didn't address: "the differences between each step...should be internally fairly consistent within the set". I feel like that's a separate issue from what was discussed above, because a strictly increasing sequence gets a maximum score of 1 for all the candidates. I suppose the thing to do is to define some function $g$ based on the variance or standard deviation of the differences, normalized in some way. Then, depending on how important the regularity of the differences is to you, you could pick a function like $\alpha f_6+(1-\alpha)g$ for some number $\alpha\in[0,1]$ based on your preference.
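As a sketch of one possible $g$ (my own illustration; the coefficient-of-variation normalization here is just one arbitrary choice among many):

```python
import statistics

def g(xs):
    # 1 when all nonzero step sizes are identical, decaying toward 0 as they
    # spread out; the mapping 1/(1 + cv) is an arbitrary illustrative choice
    steps = [abs(b - a) for a, b in zip(xs, xs[1:]) if b != a]
    if len(steps) < 2:
        return 1.0
    cv = statistics.pstdev(steps) / statistics.mean(steps)  # coefficient of variation
    return 1 / (1 + cv)

# The combined score would then be alpha * f6(xs) + (1 - alpha) * g(xs)
# for a chosen alpha in [0, 1].
print(g([0, 5, 10, 15]))  # 1.0 -- perfectly regular steps
print(g([0, 1, 9, 10]))   # smaller -- step sizes 1, 8, 1 are inconsistent
```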