Do you need to come up with the function before you have the data, or can you look at the whole data set before choosing the function? If you have all the data, percentiles sound like they do what you want.
If you can't examine the data in advance, it still helps to have some idea of the scale. For example, the specific scores might always stay within $\pm 10\%$ of the average, or they might range over a factor of $1000$. If you choose a function that keeps values ranging over a factor of $1000$ within bounds, the ones within $10\%$ will be very compressed. One option is to start with the logistic function $f(t)=\frac{1}{1+\exp(-t)}$, which takes input in $(-\infty,\infty)$ and returns values in $(0,1)$. You could set $t=(\text{specific percent}-\text{average percent})/\text{scale}$, then score $=\frac{1}{1+\exp(-t)}$ (multiplied by $100$ if you want values in $(0,100)$), choosing the scale to reflect the range of interest.
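As a sketch of that mapping (in Python; the `average` and `scale` values below are placeholders, not values from your data):

```python
import math

def logistic_score(specific, average, scale):
    """Map a raw percentage to a score in (0, 100) via the logistic function."""
    t = (specific - average) / scale
    return 100 / (1 + math.exp(-t))

# A value exactly at the average always maps to the midpoint, 50:
print(logistic_score(5.0, 5.0, 1.0))  # 50.0
```

Values below the average land below 50, values above land above, and the `scale` parameter controls how quickly the score saturates toward 0 or 100.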
Added: with your edit, maybe setting the average percent to $1\%$ and the scale to $0.5\%$ or $1\%$ will meet your needs. You could try some values in a spreadsheet and see what you think.
Added2: that is why I talked about the scale factor, as your values are so close arithmetically. If scale $=1$, then $t=0.003-0.0042=-0.0012$ and score $=0.4997$. But maybe you should use a scale of $0.001$; then $t=-1.2$ and score $\approx 23\%$. An idea of how this works using average $=0.0042=0.42\%$ and scale $=0.1\%$ is (all amounts in percent; the percent signs ruin the formatting) $$\begin{array}{c c} \text{raw} & \text{score} \\ 0.00 & 1.48 \\ 0.10 & 3.92 \\ 0.20 & 9.98 \\ 0.30 & 23.15 \\ 0.40 & 45.02 \\ 0.50 & 69.00\\ 0.60 & 85.81\\ 0.70 & 94.27\\ 0.80 & 97.81\\ 0.90 & 99.18\\ 1.00 & 99.70 \end{array}$$
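The table above can be regenerated with a few lines of Python (a sketch; `average` and `scale` are the example values from the table, in percent):

```python
import math

def score(raw, average=0.42, scale=0.1):
    # raw, average, and scale are all in percent; result is in (0, 100)
    t = (raw - average) / scale
    return 100 / (1 + math.exp(-t))

for i in range(11):
    raw = i / 10
    print(f"{raw:.2f}  {score(raw):5.2f}")
```

Changing `scale` in the call shows the compression effect: a larger scale flattens the curve, a smaller one makes the jump from low to high scores sharper.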