I am trying to fit a series of points to a function of the form `f(A) = a·(1 - exp(-21.6·A)) + b·A + c·A^2 + d·A^3`.

I want this function to satisfy the condition `f(1) = 1`; since `exp(-21.6)` is small, I approximate this condition as `d = 1 - a - b - c`.

The problem is that I also want the function `f(A) - A` to have no zero in the interval. A little calculation shows that this imposes certain upper (or lower) bounds on `a`, `b` and `c`, but I do not know how to impose these conditions during the curve fit (or whether there is a better method to do so) in either of the programs that I use for this kind of task, since setting initial conditions does not help to solve the problem.
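Since the post does not name the fitting programs, here is one possible sketch in Python with SciPy, assuming the data lie on `A ∈ [0, 1]`. The idea is to substitute the constraint `d = 1 - a - b - c` directly into the model (so `f(1) = 1` holds up to the negligible `a·exp(-21.6)` term by construction) and to pass box bounds on `a`, `b`, `c` to `scipy.optimize.curve_fit`. The data and the bound values below are hypothetical placeholders; the real bounds would come from the no-zero calculation on `f(A) - A`.

```python
import numpy as np
from scipy.optimize import curve_fit

# Model with d = 1 - a - b - c substituted in, so that
# f(1) = 1 - a*exp(-21.6) ~ 1 holds by construction.
def f(A, a, b, c):
    d = 1.0 - a - b - c
    return a * (1.0 - np.exp(-21.6 * A)) + b * A + c * A**2 + d * A**3

# Synthetic example data (hypothetical -- replace with the real points).
rng = np.random.default_rng(0)
A_data = np.linspace(0.0, 1.0, 50)
y_data = f(A_data, 0.3, 0.4, 0.2) + rng.normal(0.0, 0.01, A_data.size)

# Hypothetical box bounds on (a, b, c); replace with the bounds
# derived from the no-zero condition on f(A) - A.
lower = [0.0, 0.0, 0.0]
upper = [1.0, 1.0, 1.0]

popt, pcov = curve_fit(f, A_data, y_data,
                       p0=[0.2, 0.3, 0.3], bounds=(lower, upper))
a, b, c = popt
d = 1.0 - a - b - c
print("a, b, c, d =", a, b, c, d)
```

If the condition on `f(A) - A` cannot be reduced to simple box bounds on the parameters, a more general route is `scipy.optimize.minimize` with a `NonlinearConstraint` on the least-squares objective, which accepts arbitrary inequality constraints at the cost of a less specialized solver.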