A universal differential equation (UDE) is a differential equation in which some parts are replaced by universal approximators, i.e., neural networks (NNs). Embedding machine learning into the equation this way allows us to approximate a wide, if not infinite, variety of functional relationships. As an example, I will test how well the UDE approach approximates a sub-exponential growth model, which is challenging to fit with a plain exponential growth model.
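To make the target concrete: sub-exponential growth can be written as dI/dt = r·I^p with 0 < p < 1, which has a closed-form, polynomial-like solution, while the exponential model (p = 1) overshoots it badly at longer times. A minimal sketch (the parameter values r, p, and I0 here are hypothetical, chosen only for illustration):

```julia
# Sub-exponential growth: dI/dt = r * I^p with 0 < p < 1.
# Closed form: I(t) = ((1 - p) * r * t + I0^(1 - p))^(1 / (1 - p)).
r, p, I0 = 0.5, 0.7, 1.0          # hypothetical values for illustration
subexp(t) = ((1 - p) * r * t + I0^(1 - p))^(1 / (1 - p))
expmodel(t) = I0 * exp(r * t)     # exponential model with the same rate

println(subexp(20.0))   # grows like a polynomial in t
println(expmodel(20.0)) # grows far faster
```

At t = 20 the exponential curve is already two orders of magnitude above the sub-exponential one, which is why forcing an exponential fit onto such data fails.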
I am using Julia for the UDE approach, as the Julia ecosystem appears to be the most advanced in this regard.
# Define the hybrid model
function ude_dynamics!(du, u, p, t, p_true)
    û = U(u, p, _st)[1] # Network prediction
    du[1] = dS = -û[1]
    du[2] = dI = +û[1] - p_true[3] * u[2]
    du[3] = dR = +p_true[3] * u[2]
end
ude_dynamics! (generic function with 1 method)
# Closure with the known parameter
nn_dynamics!(du, u, p, t) = ude_dynamics!(du, u, p, t, p_)
nn_dynamics! (generic function with 1 method)
# Define the problem
prob_nn = ODEProblem(nn_dynamics!, Xn[:, 1], tspan, p)
ODEProblem with uType Vector{Float64} and tType Float64. In-place: true
timespan: (0.0, 20.0)
u0: 3-element Vector{Float64}:
1.0239505612968622
0.0034985090690380412
0.00031492340046744696
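The network `U`, its parameters `p`, and its state `_st` used inside `ude_dynamics!` are set up before this point. A sketch of how they could be built with Lux.jl, following the SciML showcase pattern (the layer widths and the tanh activation are my assumptions, not the exact architecture used here; the output has one component because only `û[1]` appears in the dynamics):

```julia
using Lux, ComponentArrays, Random

rng = Random.default_rng()

# Small MLP: 3 inputs for the state u = (S, I, R), 1 output since only
# û[1] is used in ude_dynamics!. Widths/activation are illustrative.
U = Lux.Chain(Lux.Dense(3, 16, tanh),
              Lux.Dense(16, 16, tanh),
              Lux.Dense(16, 1))
p_init, _st = Lux.setup(rng, U)
p = ComponentArray(p_init)   # flat parameter vector handed to ODEProblem
```

`Lux.setup` returns the parameters and state as named tuples; wrapping the parameters in a `ComponentArray` gives the flat vector that the optimizer and `ODEProblem` expect.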
# I don't understand the details of the algorithm
# sensealg = QuadratureAdjoint(autojacvec = ReverseDiffVJP(true))
# I just adopted what's provided in the web page:
# https://docs.sciml.ai/Overview/stable/showcase/missing_physics/
function predict(θ, X = Xn[:, 1], T = t)
    _prob = remake(prob_nn, u0 = X, tspan = (T[1], T[end]), p = θ)
    Array(solve(_prob, Tsit5(), saveat = T,
                abstol = 1e-6, reltol = 1e-6,
                sensealg = QuadratureAdjoint(autojacvec = ReverseDiffVJP(true))))
end
res2 = Optimization.solve(optprob2, Optim.LBFGS(), callback = callback, maxiters = 1000);
println("Final training loss after $(length(losses)) iterations: $(losses[end])")
Final training loss after 3002 iterations: 0.00037095070511105146
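The `losses` vector, the `callback`, and the loss behind `optprob2` are defined earlier in the full workflow. A sketch of the usual pattern from the SciML showcase (the printing interval of 50 iterations is my assumption; `predict` and `Xn` are the names defined above):

```julia
using Statistics

# Mean-squared error between the UDE prediction and the noisy data Xn
function loss(θ)
    X̂ = predict(θ)
    mean(abs2, Xn .- X̂)
end

losses = Float64[]

# Optimization.jl passes (state, loss_value) to the callback;
# returning false lets the optimization continue.
callback = function (state, l)
    push!(losses, l)
    if length(losses) % 50 == 0
        println("Current loss after $(length(losses)) iterations: $(losses[end])")
    end
    return false
end
```

Because `losses` accumulates across solves, the 3002 iterations reported above reflect the total over both optimization stages, not just the LBFGS run with `maxiters = 1000`.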
# Trained on noisy data vs real solution
pl_trajectory = plot(ts, transpose(Xhat), xlabel = "t", ylabel = "S(t), I(t), R(t)",
                     color = :red, label = ["UDE Approximation" nothing])
scatter!(solution.t, transpose(Xn), color = :black, label = ["Measurements" nothing])
# Ideal unknown interactions of the predictor
# Ybar = [-p_[2] * (Xhat[1, :] .* Xhat[2, :])'; p_[3] * (Xhat[1, :] .* Xhat[2, :])']
# Ybar = [p_[2] .* Xhat[1, :] .* (Xhat[2, :] .^ p_[1])]
Ybar = transpose([p_[2] * Xhat[1, i] * (Xhat[2, i]^p_[1]) for i ∈ 1:41, j ∈ 1:1])