
Hello, I am trying to model the relationship between two variables, say x and y. I have a number of subjects; for each subject I have a number of x values with corresponding y values, both of which are always positive. The data tend to be very sparse. There are some problem-specific constraints:

1. y(0) = 0 (or very close to it)
2. y is increasing as a function of x
3. y' is decreasing as a function of x

This is rather nebulous, but I have a feeling that the most important difference between subjects is in the height of the curve, not in the slope. Because of the sparsity, I think I can get away with forcing each subject to have the same "slope" (perhaps at a specified x) while allowing the height to vary. I have been playing around with various sorts of logistic functions, but the asymptote isn't really justifiable. I have also been looking at things like a*log(x+b), but this doesn't really conform to the intuition outlined above. Does anyone have any suggestions?
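For concreteness, here is a minimal sketch of fitting the a*log(x+b) candidate to a single subject, assuming Python with SciPy; the arrays `x_subj` and `y_subj` below are hypothetical placeholders, not real data:

```python
# Minimal sketch: fit the a*log(x + b) candidate to one subject with SciPy.
# x_subj and y_subj are hypothetical placeholders for one subject's sparse data.
import numpy as np
from scipy.optimize import curve_fit

def candidate(x, a, b):
    # Increasing and concave for a, b > 0, but candidate(0) = a*log(b),
    # which is zero only when b = 1 -- hence the tension with constraint 1 above.
    return a * np.log(x + b)

x_subj = np.array([0.5, 1.0, 2.0, 4.0, 8.0])   # hypothetical sparse measurements
y_subj = np.array([0.4, 0.7, 1.1, 1.5, 1.9])

popt, pcov = curve_fit(candidate, x_subj, y_subj, p0=[1.0, 1.0],
                       bounds=([0.0, 0.0], [np.inf, np.inf]))
print("fitted a, b:", popt)
```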

  • As non-expert said, if we could look at the graph we might be able to help more. (2011-02-01)

2 Answers


(This is supposed to be a comment.)

I would say that without knowing the physical process(es) that generated each $y$ from its corresponding $x$, any number of functions would be admissible. Barring that, one usually graphs the data first before even thinking about models...
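For instance, a minimal plotting sketch (assuming Python with matplotlib; the arrays `x` and `y` are hypothetical placeholders for one subject's measurements):

```python
# Quick look at the raw data before committing to a model.
# x and y are hypothetical placeholders for one subject's measurements.
import numpy as np
import matplotlib.pyplot as plt

x = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
y = np.array([0.4, 0.7, 1.1, 1.5, 1.9])

plt.scatter(x, y)
plt.xlabel("x")
plt.ylabel("y")
plt.title("Raw data, one subject")
plt.show()
```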

  • Maybe you can edit your post to show that noisy graph you speak of, and then we can start from there. (2011-01-02)

First, draw a distinction between what you expect and what you measure.

Since those expectations alone don't dictate a functional form, investigate a generic method like polynomial approximation. Then you can look at the result and see how well it matches your expectations.
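As a quick sketch of that generic route (assuming Python with NumPy; `x` and `y` are hypothetical placeholders for one subject's data), a single library call does the whole fit:

```python
# Minimal sketch of the generic polynomial route with NumPy.
# x and y are hypothetical placeholders for one subject's measurements.
import numpy as np

x = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
y = np.array([0.4, 0.7, 1.1, 1.5, 1.9])

d = 2                                  # order of the approximation
coeffs = np.polyfit(x, y, d)           # least squares fit; highest power first
print(coeffs)
print(np.polyval(coeffs, 3.0))         # evaluate the fitted polynomial at a new x
```

The construction below makes explicit what such a call does under the hood.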

Start with a set of $m$ measurements $\left\{ x_{k}, y_{k} \right\}_{k=1}^{m}$ and a $d$th-order approximation.

Model

$ y(x) = a_{0} + \sum_{\mu=1}^{d} a_{\mu} x^{\mu} $

Linear system

$ \begin{align} \mathbf{A} a &= y \\ \left[ \begin{array}{ccccc} 1 & x_{1} & x_{1}^{2} & \dots & x_{1}^{d} \\ 1 & x_{2} & x_{2}^{2} & \dots & x_{2}^{d} \\ \vdots & \vdots & \vdots & & \vdots \\ 1 & x_{m} & x_{m}^{2} & \dots & x_{m}^{d} \end{array} \right] \left[ \begin{array}{c} a_{0} \\ a_{1} \\ \vdots \\ a_{d} \end{array} \right] &= \left[ \begin{array}{c} y_{1} \\ y_{2} \\ \vdots \\ y_{m} \end{array} \right] \end{align} $
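A sketch of assembling that design matrix with NumPy (same hypothetical `x` as in the sketch above):

```python
# Assemble the m x (d+1) design matrix A written above.
# x is the hypothetical array of measurement locations.
import numpy as np

x = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
d = 2

# Columns 1, x, x^2, ..., x^d, one row per measurement.
A = np.vander(x, N=d + 1, increasing=True)
print(A.shape)   # (5, 3) here, i.e. (m, d + 1)
```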

Least squares solution

The least squares solution is $ a_{LS} = \mathbf{A}^{+}y + \left( \mathbf{I}_{d+1} - \mathbf{A}^{+} \mathbf{A} \right) z, \quad z \in \mathbb{R}^{d+1}, $ where $\mathbf{A}^{+}$ is the Moore-Penrose pseudoinverse and $z$ is arbitrary; taking $z = 0$ gives the minimum-norm solution $\mathbf{A}^{+}y$.
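A sketch of computing this with NumPy (hypothetical `A` and `y` as above); both routes below return the minimum-norm solution, i.e. the $z = 0$ case:

```python
# Solve the least squares problem for the polynomial coefficients.
# A is built as in the previous sketch; y is the hypothetical measurement vector.
import numpy as np

x = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
y = np.array([0.4, 0.7, 1.1, 1.5, 1.9])
d = 2
A = np.vander(x, N=d + 1, increasing=True)

a_pinv = np.linalg.pinv(A) @ y                    # z = 0: minimum-norm solution A^+ y
a_lstsq, *_ = np.linalg.lstsq(A, y, rcond=None)   # same solution without forming A^+

print(a_pinv)
print(np.allclose(a_pinv, a_lstsq))               # True
```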