Description
I am trying to compute the Jacobian of a function that solves an ODE internally (using DifferentialEquations.jl). The function has the following structure:
```julia
function fConstraint(x::Vector)::Vector
    sol = calc_data(x)  # ODE solving
    # println(x)
    res = [
        sol[1][2] - data["hgoal"] * 1e3 / datos_prob["r12"],
        -sol[1][2],
        sol[2][2] + data["tspan"][2] * 0.99,
        -sol[1][1] + data["hmmin"] * 1e3 / datos_prob["r12"],
    ]
end
```
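For context, `calc_data` is shaped roughly like the following (a hypothetical sketch: the right-hand side, initial state, and solver choice here are placeholders, not my real code):

```julia
using DifferentialEquations

# Hypothetical sketch of calc_data: the actual ODE right-hand side,
# initial state, and time span are simplified placeholders.
function calc_data(x::Vector)
    u0 = [1.0, 0.0]                  # note: a plain Vector{Float64}
    tspan = (0.0, data["tspan"][2])
    rhs(u, p, t) = -p[1] .* u        # placeholder dynamics
    prob = ODEProblem(rhs, u0, tspan, x)
    solve(prob, Tsit5())
end
```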
The function computes the constraints of an optimization problem, and I am using a solver that requires the gradient of the constraints. Is there a way to do this with ForwardDiff? I ask because I get an enormous error message that starts like this:
```julia
ForwardDiff.jacobian(fConstraint, [0.001, 0.1, 0.1])
ERROR: MethodError: no method matching Float64(::ForwardDiff.Dual{ForwardDiff.Tag{typeof(fConstraint),Float64},Float64,3})
Closest candidates are:
  Float64(::Real, ::RoundingMode) where T<:AbstractFloat at rounding.jl:194
  Float64(::T<:Number) where T<:Number at boot.jl:741
  Float64(::Int8) at float.jl:60
  ...
```
I guess this is caused by the internal ODE solve; otherwise it is a very simple function and it evaluates without problems. (I have tried another, much simpler function that only uses sums, and ForwardDiff computes its Jacobian fine.)
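In case it helps pin down where the conversion fails: I can reproduce the same MethodError with no ODE at all whenever a Dual number has to be written into Float64 storage, which is what I suspect happens somewhere inside the solve. A minimal toy example (my own, not taken from `calc_data`):

```julia
using ForwardDiff

function fBad(x::Vector)
    buf = zeros(Float64, 2)   # concretely typed Float64 buffer
    buf[1] = x[1]^2           # assigning a Dual here forces Float64(::Dual)...
    buf[2] = x[1] + x[2]      # ...which is exactly the conversion that fails
    return buf
end

ForwardDiff.jacobian(fBad, [1.0, 2.0])
# ERROR: MethodError: no method matching Float64(::ForwardDiff.Dual{...})
```

If that is the cause, would the fix be to build the initial state (and any caches) from `eltype(x)`, e.g. `u0 = convert.(eltype(x), u0)`, so the Dual numbers can propagate through the solver? Or is there more to it?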
Thanks in advance :)