src/variational/VariationalInference.jl (19 additions & 26 deletions)
@@ -1,23 +1,16 @@
 module Variational
 
-using ..Core, ..Utilities
-using DocStringExtensions: TYPEDEF, TYPEDFIELDS
-using Distributions, Bijectors, DynamicPPL
-using LinearAlgebra
-using ..Turing: PROGRESS, Turing
-using DynamicPPL: Model, SampleFromPrior, SampleFromUniform
-using Random: AbstractRNG
+import AdvancedVI
+import Bijectors
+import DistributionsAD
+import DynamicPPL
+import StatsBase
+import StatsFuns
 
-using ForwardDiff
-using Tracker
-
-import ..Core: getchunksize, getADbackend
-
-import AbstractMCMC
-import ProgressLogging
-
-using AdvancedVI
+import Random
 
+# Reexports
+using AdvancedVI: vi, ADVI, ELBO, elbo, TruncatedADAGrad, DecayedADAGrad
 export
     vi,
     ADVI,
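The hunk above replaces broad `using` statements with qualified `import`s and adds an explicit reexport block: the module pulls a handful of names out of AdvancedVI with `using AdvancedVI: vi, ADVI, ...` and then `export`s them itself. A minimal sketch of that reexport pattern, with hypothetical module names (not from Turing):

```julia
# `Inner` stands in for a dependency such as AdvancedVI.
module Inner
export greet
greet() = "hello"
end

# `Wrapper` stands in for the Variational module: it binds one specific
# name from the dependency and re-exports it, so downstream code can
# write `using Wrapper` without depending on Inner directly.
module Wrapper
using ..Inner: greet   # bind only `greet`, not everything Inner exports
export greet           # re-export it as part of Wrapper's public API
end

using .Wrapper
greet()  # resolved through Wrapper's reexport → "hello"
```

Plain `import AdvancedVI`, by contrast, only makes the qualified form `AdvancedVI.vi` available, which is why the reexported names need the explicit `using AdvancedVI: ...` line.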
@@ -34,38 +27,38 @@ use `DynamicPPL.MiniBatch` context to run the `Model` with a weight `num_total_o
 ## Notes
 - For the sake of efficiency, the returned function closes over an instance of `VarInfo`. This means that you *might* run into some weird behaviour if you call this method sequentially using different types; if that's the case, just generate a new one for each type using `make_logjoint`.
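The note above warns that the function returned by `make_logjoint` closes over a `VarInfo`, so it stays specialized to the types it was built with. A hypothetical illustration of the general pitfall (the names here are ours, not Turing's): a closure that captures a concretely typed cache cannot later accept values of an incompatible type, so you build a fresh closure per type instead.

```julia
# Build a function that closes over a concretely typed cache, the way
# make_logjoint closes over a VarInfo specialized to one element type.
function make_accumulator(x)
    cache = [x]               # e.g. Vector{Float64} when x::Float64
    return y -> (push!(cache, y); sum(cache))
end

acc = make_accumulator(1.0)
acc(2.0)                      # fine: Float64 into the Float64 cache → 3.0

# acc(1 + 2im) would throw an InexactError: the captured cache was
# specialized to Float64. The fix mirrors the note above — generate a
# new closure for the new type:
acc_c = make_accumulator(1.0 + 0im)
acc_c(1 + 2im)                # works: Complex cache → 2.0 + 2.0im
```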