I am trying to numerically integrate spectral distribution data. A spectral distribution function gives the luminosity per wavelength of an object (such as a galaxy) as a function of wavelength. In more mathematical terms,
$$ dL(\lambda) = \rho(\lambda) d\lambda$$
where $L$ is the luminosity in the interval $(\lambda, \lambda+d\lambda)$, $\rho$ is the spectral distribution function, and $\lambda$ is the wavelength; the total luminosity is therefore $L = \int \rho(\lambda)\,d\lambda$.
The problem is that I am not given the spectral distribution function itself. What I have are the values of this function at discrete wavelengths. For example, my data look like:
| Wavelength | Spectral Distribution |
|------------|-----------------------|
| ...        | ...                   |
| 248        | 500                   |
| 300        | 510                   |
| 305        | 512                   |
| 306        | 518                   |
| ...        | ...                   |
Please observe that the wavelengths are not regularly spaced. Up until now I have just been using the following primitive formula (the trapezoidal rule applied to each interval) to find the total luminosity:
$$\Delta L_i=\frac{\lambda_{i+1}-\lambda_i}{2}\left[\rho_{i+1}+\rho_i\right]$$ $$L = \sum_i \Delta L_i$$
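For concreteness, here is a minimal sketch of that trapezoidal sum in Python (the array names `wavelength` and `rho` are placeholders, and the sample values are just the rows from the table above):

```python
import numpy as np
from scipy.integrate import trapezoid

# Hypothetical arrays holding the tabulated data (irregular wavelength grid)
wavelength = np.array([248.0, 300.0, 305.0, 306.0])
rho        = np.array([500.0, 510.0, 512.0, 518.0])

# Per-interval contribution: dL_i = (lambda_{i+1} - lambda_i)/2 * (rho_i + rho_{i+1})
dL = 0.5 * np.diff(wavelength) * (rho[:-1] + rho[1:])
L_manual = dL.sum()

# SciPy's trapezoid() implements the same rule and handles irregular spacing
L_scipy = trapezoid(rho, wavelength)

print(L_manual, L_scipy)  # both give 29330.0 for these sample rows
```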
Is there another method I could use that would give an improvement over this one?
EDIT: Here is a plot of $\log_{10}(\rho)$ vs $\lambda$: