
Hessian AD for logjoint and second-order optim interface broken #1878


Closed
bgroenks96 opened this issue Aug 20, 2022 · 4 comments

Comments

@bgroenks96
Contributor

Support for second-order forward-mode differentiation appears to be broken in the latest version (v0.21.10).

MWE:

using Turing
using Optim
using ForwardDiff
@model function gaussian()
    μ ~ Normal(0,1)
    σ ~ InverseGamma(2,3)
    x ~ Normal(μ, σ)
    return x
end
m = gaussian()
nlj(x) = -logjoint(m, (μ=x[1],σ=x[2]))
ForwardDiff.hessian(nlj, [0.0,3.0]) # fails

Error:

ERROR: TypeError: in setfield!, expected Float64, got a value of type ForwardDiff.Dual{Nothing, ForwardDiff.Dual{Nothing, Float64, 2}, 2}
Stacktrace:
  [1] setproperty!
    @ .\Base.jl:43 [inlined]
  [2] setindex!
    @ .\refvalue.jl:57 [inlined]
  [3] acclogp!!
    @ C:\Users\bgroenke\.julia\packages\DynamicPPL\1qg3U\src\threadsafe.jl:19 [inlined]
  [4] tilde_assume!!
    @ C:\Users\bgroenke\.julia\packages\DynamicPPL\1qg3U\src\context_implementations.jl:118 [inlined]
  [5] gaussian(__model__::DynamicPPL.Model{typeof(gaussian), (), (), (), Tuple{}, Tuple{}, DynamicPPL.DefaultContext}, __varinfo__::DynamicPPL.ThreadSafeVarInfo{DynamicPPL.SimpleVarInfo{NamedTuple{(:μ, :σ), Tuple{ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, Float64, 2}, 2}, ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, 
ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, Float64, 2}, 2}}}, Float64, DynamicPPL.NoTransformation}, Vector{Base.RefValue{Float64}}}, __context__::DynamicPPL.DefaultContext)
    @ Main c:\Users\bgroenke\dmawi\data\sparc\personal_accounts\03_PhD\Brian\repos\CryoGrid\CryoGridML\experiments\ensemble\eks_inference.jl:40
  [6] macro expansion
    @ C:\Users\bgroenke\.julia\packages\DynamicPPL\1qg3U\src\model.jl:493 [inlined]
  [7] _evaluate!!(model::DynamicPPL.Model{typeof(gaussian), (), (), (), Tuple{}, Tuple{}, DynamicPPL.DefaultContext}, varinfo::DynamicPPL.ThreadSafeVarInfo{DynamicPPL.SimpleVarInfo{NamedTuple{(:μ, :σ), Tuple{ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, Float64, 2}, 2}, ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, Float64, 2}, 2}}}, Float64, DynamicPPL.NoTransformation}, Vector{Base.RefValue{Float64}}}, context::DynamicPPL.DefaultContext)
    @ DynamicPPL C:\Users\bgroenke\.julia\packages\DynamicPPL\1qg3U\src\model.jl:476
  [8] evaluate_threadsafe!!(model::DynamicPPL.Model{typeof(gaussian), (), (), (), Tuple{}, Tuple{}, DynamicPPL.DefaultContext}, varinfo::DynamicPPL.SimpleVarInfo{NamedTuple{(:μ, :σ), Tuple{ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, Float64, 2}, 2}, ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, Float64, 2}, 2}}}, Float64, DynamicPPL.NoTransformation}, context::DynamicPPL.DefaultContext)
    @ DynamicPPL C:\Users\bgroenke\.julia\packages\DynamicPPL\1qg3U\src\model.jl:467
  [9] evaluate!!(model::DynamicPPL.Model{typeof(gaussian), (), (), (), Tuple{}, Tuple{}, DynamicPPL.DefaultContext}, varinfo::DynamicPPL.SimpleVarInfo{NamedTuple{(:μ, :σ), Tuple{ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, Float64, 2}, 2}, ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, Float64, 2}, 2}}}, Float64, DynamicPPL.NoTransformation}, context::DynamicPPL.DefaultContext)
    @ DynamicPPL C:\Users\bgroenke\.julia\packages\DynamicPPL\1qg3U\src\model.jl:402
 [10] logjoint(model::DynamicPPL.Model{typeof(gaussian), (), (), (), Tuple{}, Tuple{}, DynamicPPL.DefaultContext}, varinfo::DynamicPPL.SimpleVarInfo{NamedTuple{(:μ, :σ), Tuple{ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, Float64, 2}, 2}, ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, Float64, 2}, 2}}}, Float64, DynamicPPL.NoTransformation})
    @ DynamicPPL C:\Users\bgroenke\.julia\packages\DynamicPPL\1qg3U\src\model.jl:548
 [11] logjoint(model::DynamicPPL.Model{typeof(gaussian), (), (), (), Tuple{}, Tuple{}, DynamicPPL.DefaultContext}, θ::NamedTuple{(:μ, :σ), Tuple{ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, Float64, 2}, 2}, ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, Float64, 2}, 2}}})
    @ DynamicPPL C:\Users\bgroenke\.julia\packages\DynamicPPL\1qg3U\src\simple_varinfo.jl:570
 [12] nlj(x::Vector{ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, Float64, 2}, 2}})
    @ Main c:\Users\bgroenke\dmawi\data\sparc\personal_accounts\03_PhD\Brian\repos\CryoGrid\CryoGridML\experiments\ensemble\eks_inference.jl:47
 [13] vector_mode_dual_eval!
    @ C:\Users\bgroenke\.julia\packages\ForwardDiff\pDtsf\src\apiutils.jl:37 [inlined]
 [14] vector_mode_gradient(f::typeof(nlj), x::Vector{ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, Float64, 2}}, cfg::ForwardDiff.GradientConfig{ForwardDiff.Tag{typeof(nlj), Float64}, ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, Float64, 2}, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, Float64, 2}, 
2}}})
    @ ForwardDiff C:\Users\bgroenke\.julia\packages\ForwardDiff\pDtsf\src\gradient.jl:106
 [15] gradient
    @ C:\Users\bgroenke\.julia\packages\ForwardDiff\pDtsf\src\gradient.jl:19 [inlined]
 [16] #114
    @ C:\Users\bgroenke\.julia\packages\ForwardDiff\pDtsf\src\hessian.jl:16 [inlined]
 [17] vector_mode_dual_eval!
    @ C:\Users\bgroenke\.julia\packages\ForwardDiff\pDtsf\src\apiutils.jl:37 [inlined]
 [18] vector_mode_jacobian(f::ForwardDiff.var"#114#115"{typeof(nlj), ForwardDiff.HessianConfig{ForwardDiff.Tag{typeof(nlj), Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), 
Float64}, ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, Float64, 2}, 2}}, Vector{ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, Float64, 2}}}}, x::Vector{Float64}, cfg::ForwardDiff.JacobianConfig{ForwardDiff.Tag{typeof(nlj), Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, Float64, 2}}})
    @ ForwardDiff C:\Users\bgroenke\.julia\packages\ForwardDiff\pDtsf\src\jacobian.jl:148
 [19] jacobian(f::Function, x::Vector{Float64}, cfg::ForwardDiff.JacobianConfig{ForwardDiff.Tag{typeof(nlj), Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, Float64, 2}}}, ::Val{false})
    @ ForwardDiff C:\Users\bgroenke\.julia\packages\ForwardDiff\pDtsf\src\jacobian.jl:21
 [20] hessian(f::Function, x::Vector{Float64}, cfg::ForwardDiff.HessianConfig{ForwardDiff.Tag{typeof(nlj), Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, Float64, 2}, 2}}, Vector{ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, Float64, 2}}}, ::Val{true})
    @ ForwardDiff C:\Users\bgroenke\.julia\packages\ForwardDiff\pDtsf\src\hessian.jl:17
 [21] hessian(f::Function, x::Vector{Float64}, cfg::ForwardDiff.HessianConfig{ForwardDiff.Tag{typeof(nlj), Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, Float64, 2}, 2}}, Vector{ForwardDiff.Dual{ForwardDiff.Tag{typeof(nlj), Float64}, Float64, 2}}}) (repeats 2 times)
    @ ForwardDiff C:\Users\bgroenke\.julia\packages\ForwardDiff\pDtsf\src\hessian.jl:15
 [22] top-level scope
    @ c:\Users\bgroenke\dmawi\data\sparc\personal_accounts\03_PhD\Brian\repos\CryoGrid\CryoGridML\experiments\ensemble\eks_inference.jl:52
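For comparison, the same nested second-order evaluation succeeds when the negative log joint is written by hand with Distributions, which suggests the failure is in DynamicPPL's evaluation machinery rather than in ForwardDiff itself. A minimal sketch (the fixed observation `x_obs = 1.0` is an assumption here; the MWE above leaves `x` free):

```julia
using Distributions, ForwardDiff, LinearAlgebra

# Hand-written negative log joint for the same model, with the
# observation fixed at x_obs = 1.0 (an assumption for this sketch).
const x_obs = 1.0
function nlj_manual(θ)
    μ, σ = θ[1], θ[2]
    return -(logpdf(Normal(0, 1), μ) +
             logpdf(InverseGamma(2, 3), σ) +
             logpdf(Normal(μ, σ), x_obs))
end

# Nested Duals pass through this pure function without issue.
H = ForwardDiff.hessian(nlj_manual, [0.0, 3.0])
```

Since this works, the `setfield!` failure in the stack trace points at the `Base.RefValue{Float64}` log-probability accumulator inside `ThreadSafeVarInfo`, which cannot store the nested `Dual` numbers produced by `ForwardDiff.hessian`.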

Also with the Optim interface:

optimize(Turing.condition(m, x=1.0), MAP(), Newton()) # also fails
ERROR: type TwiceDifferentiableHV has no field h_calls
Stacktrace:
  [1] getproperty
    @ .\Base.jl:42 [inlined]
  [2] dotgetproperty(x::NLSolversBase.TwiceDifferentiableHV{Float64, Vector{Float64}, Vector{Float64}, Vector{Float64}}, f::Symbol)
    @ Base .\Base.jl:45
  [3] hessian!!(obj::NLSolversBase.TwiceDifferentiableHV{Float64, Vector{Float64}, Vector{Float64}, Vector{Float64}}, x::Vector{Float64})
    @ NLSolversBase C:\Users\bgroenke\.julia\packages\NLSolversBase\cfJrN\src\interface.jl:93
  [4] initial_state(method::Newton{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}}, options::Optim.Options{Float64, Nothing}, d::NLSolversBase.TwiceDifferentiableHV{Float64, Vector{Float64}, Vector{Float64}, Vector{Float64}}, initial_x::Vector{Float64})
    @ Optim C:\Users\bgroenke\.julia\packages\Optim\rpjtl\src\multivariate\solvers\second_order\newton.jl:46
  [5] optimize(d::NLSolversBase.TwiceDifferentiableHV{Float64, Vector{Float64}, Vector{Float64}, Vector{Float64}}, initial_x::Vector{Float64}, method::Newton{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}}, options::Optim.Options{Float64, Nothing})
    @ Optim C:\Users\bgroenke\.julia\packages\Optim\rpjtl\src\multivariate\optimize\optimize.jl:36
  [6] optimize(f::NLSolversBase.InplaceObjective{Nothing, Turing.LogDensityFunction{DynamicPPL.TypedVarInfo{NamedTuple{(:μ, :σ), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:μ, Setfield.IdentityLens}, Int64}, Vector{Normal{Float64}}, Vector{AbstractPPL.VarName{:μ, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:σ, 
Setfield.IdentityLens}, Int64}, Vector{InverseGamma{Float64}}, Vector{AbstractPPL.VarName{:σ, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Model{typeof(gaussian), (), (), (), Tuple{}, Tuple{}, DynamicPPL.ConditionContext{(:x,), NamedTuple{(:x,), Tuple{Float64}}, DynamicPPL.DefaultContext}}, DynamicPPL.SampleFromPrior, Turing.ModeEstimation.OptimizationContext{DynamicPPL.DefaultContext}}, Nothing, Nothing, Nothing}, initial_x::Vector{Float64}, method::Newton{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}}, options::Optim.Options{Float64, Nothing}; inplace::Bool, autodiff::Symbol)
    @ Optim C:\Users\bgroenke\.julia\packages\Optim\rpjtl\src\multivariate\optimize\interface.jl:142
  [7] optimize(f::NLSolversBase.InplaceObjective{Nothing, Turing.LogDensityFunction{DynamicPPL.TypedVarInfo{NamedTuple{(:μ, :σ), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:μ, Setfield.IdentityLens}, Int64}, Vector{Normal{Float64}}, Vector{AbstractPPL.VarName{:μ, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:σ, 
Setfield.IdentityLens}, Int64}, Vector{InverseGamma{Float64}}, Vector{AbstractPPL.VarName{:σ, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Model{typeof(gaussian), (), (), (), Tuple{}, Tuple{}, DynamicPPL.ConditionContext{(:x,), NamedTuple{(:x,), Tuple{Float64}}, DynamicPPL.DefaultContext}}, DynamicPPL.SampleFromPrior, Turing.ModeEstimation.OptimizationContext{DynamicPPL.DefaultContext}}, Nothing, Nothing, Nothing}, initial_x::Vector{Float64}, method::Newton{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}}, options::Optim.Options{Float64, Nothing})
    @ Optim C:\Users\bgroenke\.julia\packages\Optim\rpjtl\src\multivariate\optimize\interface.jl:141
  [8] _optimize(::DynamicPPL.Model{typeof(gaussian), (), (), (), Tuple{}, Tuple{}, DynamicPPL.ConditionContext{(:x,), NamedTuple{(:x,), Tuple{Float64}}, DynamicPPL.DefaultContext}}, ::Turing.LogDensityFunction{DynamicPPL.TypedVarInfo{NamedTuple{(:μ, :σ), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:μ, Setfield.IdentityLens}, Int64}, Vector{Normal{Float64}}, Vector{AbstractPPL.VarName{:μ, 
Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:σ, Setfield.IdentityLens}, Int64}, Vector{InverseGamma{Float64}}, Vector{AbstractPPL.VarName{:σ, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Model{typeof(gaussian), (), (), (), Tuple{}, Tuple{}, DynamicPPL.ConditionContext{(:x,), NamedTuple{(:x,), Tuple{Float64}}, DynamicPPL.DefaultContext}}, DynamicPPL.SampleFromPrior, Turing.ModeEstimation.OptimizationContext{DynamicPPL.DefaultContext}}, ::Vector{Float64}, ::Newton{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}}, ::Optim.Options{Float64, Nothing}; kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
    @ Turing C:\Users\bgroenke\.julia\packages\Turing\Tpj0b\src\modes\OptimInterface.jl:242
  [9] _optimize(::DynamicPPL.Model{typeof(gaussian), (), (), (), Tuple{}, Tuple{}, DynamicPPL.ConditionContext{(:x,), NamedTuple{(:x,), Tuple{Float64}}, DynamicPPL.DefaultContext}}, ::Turing.LogDensityFunction{DynamicPPL.TypedVarInfo{NamedTuple{(:μ, :σ), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:μ, Setfield.IdentityLens}, Int64}, Vector{Normal{Float64}}, Vector{AbstractPPL.VarName{:μ, 
Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:σ, Setfield.IdentityLens}, Int64}, Vector{InverseGamma{Float64}}, Vector{AbstractPPL.VarName{:σ, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Model{typeof(gaussian), (), (), (), Tuple{}, Tuple{}, DynamicPPL.ConditionContext{(:x,), NamedTuple{(:x,), Tuple{Float64}}, DynamicPPL.DefaultContext}}, DynamicPPL.SampleFromPrior, Turing.ModeEstimation.OptimizationContext{DynamicPPL.DefaultContext}}, ::Vector{Float64}, ::Newton{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}}, ::Optim.Options{Float64, Nothing})
    @ Turing C:\Users\bgroenke\.julia\packages\Turing\Tpj0b\src\modes\OptimInterface.jl:233
 [10] _optimize(model::DynamicPPL.Model{typeof(gaussian), (), (), (), Tuple{}, Tuple{}, DynamicPPL.ConditionContext{(:x,), NamedTuple{(:x,), Tuple{Float64}}, DynamicPPL.DefaultContext}}, f::Turing.LogDensityFunction{DynamicPPL.TypedVarInfo{NamedTuple{(:μ, :σ), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:μ, Setfield.IdentityLens}, Int64}, Vector{Normal{Float64}}, Vector{AbstractPPL.VarName{:μ, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:σ, Setfield.IdentityLens}, Int64}, Vector{InverseGamma{Float64}}, Vector{AbstractPPL.VarName{:σ, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Model{typeof(gaussian), (), (), (), Tuple{}, Tuple{}, DynamicPPL.ConditionContext{(:x,), NamedTuple{(:x,), Tuple{Float64}}, DynamicPPL.DefaultContext}}, DynamicPPL.SampleFromPrior, Turing.ModeEstimation.OptimizationContext{DynamicPPL.DefaultContext}}, optimizer::Newton{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}}, args::Optim.Options{Float64, Nothing}; kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
    @ Turing C:\Users\bgroenke\.julia\packages\Turing\Tpj0b\src\modes\OptimInterface.jl:199
 [11] _optimize(model::DynamicPPL.Model{typeof(gaussian), (), (), (), Tuple{}, Tuple{}, DynamicPPL.ConditionContext{(:x,), NamedTuple{(:x,), Tuple{Float64}}, DynamicPPL.DefaultContext}}, f::Turing.LogDensityFunction{DynamicPPL.TypedVarInfo{NamedTuple{(:μ, :σ), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:μ, Setfield.IdentityLens}, Int64}, Vector{Normal{Float64}}, Vector{AbstractPPL.VarName{:μ, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:σ, Setfield.IdentityLens}, Int64}, Vector{InverseGamma{Float64}}, Vector{AbstractPPL.VarName{:σ, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Model{typeof(gaussian), (), (), (), Tuple{}, Tuple{}, DynamicPPL.ConditionContext{(:x,), NamedTuple{(:x,), Tuple{Float64}}, DynamicPPL.DefaultContext}}, DynamicPPL.SampleFromPrior, Turing.ModeEstimation.OptimizationContext{DynamicPPL.DefaultContext}}, optimizer::Newton{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}}, args::Optim.Options{Float64, Nothing})
    @ Turing C:\Users\bgroenke\.julia\packages\Turing\Tpj0b\src\modes\OptimInterface.jl:199
 [12] _map_optimize(::DynamicPPL.Model{typeof(gaussian), (), (), (), Tuple{}, Tuple{}, DynamicPPL.ConditionContext{(:x,), NamedTuple{(:x,), Tuple{Float64}}, DynamicPPL.DefaultContext}}, ::Newton{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}}, ::Vararg{Any}; kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
    @ Turing C:\Users\bgroenke\.julia\packages\Turing\Tpj0b\src\modes\OptimInterface.jl:184
 [13] _map_optimize(::DynamicPPL.Model{typeof(gaussian), (), (), (), Tuple{}, Tuple{}, DynamicPPL.ConditionContext{(:x,), NamedTuple{(:x,), Tuple{Float64}}, DynamicPPL.DefaultContext}}, ::Newton{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}}, ::Vararg{Any})
    @ Turing C:\Users\bgroenke\.julia\packages\Turing\Tpj0b\src\modes\OptimInterface.jl:183
 [14] optimize(model::DynamicPPL.Model{typeof(gaussian), (), (), (), Tuple{}, Tuple{}, DynamicPPL.ConditionContext{(:x,), NamedTuple{(:x,), Tuple{Float64}}, DynamicPPL.DefaultContext}}, ::MAP, optimizer::Newton{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}}, options::Optim.Options{Float64, Nothing}; kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
    @ Turing C:\Users\bgroenke\.julia\packages\Turing\Tpj0b\src\modes\OptimInterface.jl:169
 [15] optimize(model::DynamicPPL.Model{typeof(gaussian), (), (), (), Tuple{}, Tuple{}, DynamicPPL.ConditionContext{(:x,), NamedTuple{(:x,), Tuple{Float64}}, DynamicPPL.DefaultContext}}, ::MAP, optimizer::Newton{LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}}, options::Optim.Options{Float64, Nothing}) (repeats 2 times)
    @ Turing C:\Users\bgroenke\.julia\packages\Turing\Tpj0b\src\modes\OptimInterface.jl:169
 [16] top-level scope
    @ c:\Users\bgroenke\dmawi\data\sparc\personal_accounts\03_PhD\Brian\repos\CryoGrid\CryoGridML\experiments\ensemble\eks_inference.jl:53

I think there have been some recent changes to the JuliaOptim ecosystem packages; perhaps one of those broke something?

@devmotion
Member

devmotion commented Aug 20, 2022

IIRC second order derivatives have never been supported (at least in the Optim interface).

@bgroenks96
Contributor Author

So what was going on in #1232 and #1369 then?

I do recall that using Newton previously produced an explicit error saying that second-order derivatives were not supported; that error no longer appears, however, and instead we get the strange NLSolversBase error above.

Is there a discussion somewhere on why second-order derivatives are not currently possible and what needs to be done to make it happen?

@bgroenks96
Contributor Author

@cpfiffer Could you clarify?

@torfjelde
Member

It seems like it's just an issue with ThreadSafeVarInfo, no?
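That matches the stack trace: `acclogp!!` in threadsafe.jl accumulates per-thread log probabilities in `Base.RefValue{Float64}` cells, and a concretely typed `RefValue` cannot hold the nested `Dual` numbers that `ForwardDiff.hessian` pushes through the model. A stripped-down mimic of that pattern (the names below are illustrative, not DynamicPPL's actual API):

```julia
using ForwardDiff

# Concretely typed accumulator cell, as in ThreadSafeVarInfo.
acc = Base.RefValue{Float64}(0.0)

# A first-order dual like the one ForwardDiff.hessian's inner pass produces.
d = ForwardDiff.Dual(1.0, 1.0)

# Storing a Dual in a Float64 cell throws (a TypeError in the
# environment from the report above):
err = try
    acc[] = acc[] + d
    nothing
catch e
    e
end
```

With a type-generic (or `Dual`-aware) accumulator, the same accumulation would succeed, which is presumably what a fix in DynamicPPL needs to arrange.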

bors bot pushed a commit to TuringLang/DynamicPPL.jl that referenced this issue Sep 7, 2022
`ThreadSafeVarInfo` still has quite a few implementation details inherited from compatibility with `VarInfo`. This PR makes `ThreadSafeVarInfo` work better with other implementations, e.g. `SimpleVarInfo`.

It also fixes TuringLang/Turing.jl#1878 (comment)