I have a system of ODEs whose time courses I'd like to integrate numerically. One of the equations should have a continuous, strictly decreasing term and a discontinuous, increasing term. It looks something like this:
d[A]/dt = -k*A over the whole interval
and additionally
d[A]/dt = c at every 10th time unit (i.e. if t % 10 == 0)
I have encoded the first term as an ODE in SBML and the second as a discrete event. Numerical solvers like LSODE or CVODE, however, fail to integrate this and abort with a root-finding error.
What is the best way to encode such a system so that the solvers above can handle it?
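For reference, the behaviour I am trying to reproduce looks roughly like this when sketched outside SBML with plain scipy, integrating the decay piecewise and applying the jump by hand every 10 time units (the values of k, c, A(0) and the time span are made up for illustration):

    # Sketch of the intended dynamics (not the SBML encoding): exponential decay
    # integrated piecewise, with a discrete increase A -> A + c every 10 time units.
    import numpy as np
    from scipy.integrate import solve_ivp

    k, c = 0.1, 2.0                  # decay rate and size of the discrete increase
    A, t0, t_end, period = 10.0, 0.0, 50.0, 10.0

    t_parts, A_parts = [], []
    while t0 < t_end:
        t1 = min(t0 + period, t_end)
        # continuous part: d[A]/dt = -k*A over this window
        sol = solve_ivp(lambda t, y: [-k * y[0]], (t0, t1), [A],
                        t_eval=np.linspace(t0, t1, 101))
        t_parts.append(sol.t)
        A_parts.append(sol.y[0])
        A = sol.y[0][-1] + c         # discontinuous part: jump at the window boundary
        t0 = t1

    t = np.concatenate(t_parts)
    A_traj = np.concatenate(A_parts)
    print(A_traj[-1])                # A at the end of the run, before the final jump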
OK, I found the issue:
d[A]/dt = c at every 10th time unit (if t % 10 == 0)
does not work because the integrator's time points never have to land exactly on a multiple of 10, so an exact-equality trigger is never satisfied and the event detection has nothing to find. Giving the trigger a small tolerance window, so that it stays true over a finite interval, works:
d[A]/dt = c at every 10th time unit (if t % 10 < 0.01)
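In case it helps anyone, this is roughly what the working event looks like if the SBML happens to be generated from Antimony via tellurium (that toolchain is just an assumption on my side; roadrunner integrates with CVODE by default, and k, c and the 0.01 tolerance are illustrative values):

    # Hedged sketch: Antimony model with a tolerance-window event trigger,
    # simulated through tellurium/roadrunner (CVODE). Values are illustrative.
    import tellurium as te

    r = te.loada("""
        model pulsed_decay
          species A
          A = 10;  k = 0.1;  c = 2
          A' = -k * A   // continuous decay: d[A]/dt = -k*A

          // time - 10*floor(time/10) is t % 10; the trigger uses a small
          // tolerance window instead of an exact equality, so the event
          // detection can see it switch from false to true
          E1: at ((time - 10*floor(time/10)) < 0.01): A = A + c
        end
    """)

    result = r.simulate(0, 50, 501)   # columns: time and [A]
    print(result[-1])                 # state at the end of the run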