Commit 107ef76

update placeholder text

1 parent 1fbe6fd commit 107ef76

1 file changed: index.qmd (93 additions, 18 deletions)
````diff
@@ -33,23 +33,30 @@ Turing is written entirely in Julia, and is interoperable with its powerful ecos
 
 ::: {.example-text style="text-align:right;padding:0.5rem;"}
 
-<div class="fs-4 fw-bold pb-1">Hello, World in Turing</div>
+<div class="fs-4 fw-bold pb-1">Intuitive syntax</div>
 
-Some text about how easy it is to [get going](https://turinglang.org/docs/tutorials/00-introduction/).
+The [modelling syntax of Turing.jl](https://turinglang.org/docs/core-functionality) closely resembles the mathematical specification of a probabilistic model.
+For example, the following model describes a coin flip experiment with `N` flips, where `p` is the probability of heads.
+
+```math
+\begin{align*}
+p &\sim \text{Beta}(1, 1) \\
+y_i &\sim \text{Bernoulli}(p) \quad \text{for } i = 1, \ldots, N
+\end{align*}
+```
 
 :::
 
 ::: {.example-code style="overflow-x: scroll;"}
 ```{.julia .code-overflow-scroll}
+# Define the model
 @model function coinflip(; N::Int)
-    # Prior belief about the probability of heads
     p ~ Beta(1, 1)
-
-    # Heads or tails of a coin are drawn from `N`
-    # Bernoulli distributions with success rate `p`
     y ~ filldist(Bernoulli(p), N)
+end
 
-end;
+# Condition on data
+data = [0, 1, 1, 0, 1]
+model = coinflip(; N = length(data)) | (; y = data)
 ```
 :::
 
````
````diff
@@ -59,22 +66,90 @@ end;
 
 ::: {.example-text style="padding:0.5rem;"}
 
-<div class="fs-4 fw-bold pb-1">Goodbye, World in Turing</div>
+<div class="fs-4 fw-bold pb-1">Flexible parameter inference</div>
+
+Turing.jl provides full support for sampling one or more MCMC chains from the posterior distribution, including options for parallel sampling.
+
+Variational inference and point estimation methods are also available.
+
+:::
+
+::: {.example-code style="overflow-x: scroll;"}
+```{.julia .code-overflow-scroll}
+# Sample one chain
+chain = sample(model, NUTS(), 1000)
+
+# Sample four chains, one per thread
+# Note: to obtain speedups, Julia must be started
+# with multiple threads enabled, e.g. `julia -t 4`
+chains = sample(model, NUTS(), MCMCThreads(), 1000, 4)
+```
+:::
+
+:::::
+
+
````
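The panel above mentions variational inference and point estimation without showing code. A hedged sketch of those entry points, assuming the `model` from the first hunk; the exact names and signatures vary across Turing.jl versions, so check them against the current documentation:

```julia
using Turing

# Variational inference: mean-field ADVI with 10 Monte Carlo samples
# per gradient step and 1000 optimisation steps (assumed API).
q = vi(model, ADVI(10, 1000))

# Point estimation: maximum likelihood and maximum a posteriori.
mle_estimate = maximum_likelihood(model)
map_estimate = maximum_a_posteriori(model)
```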
````diff
+::::: {.d-flex .flex-row .flex-wrap .panel-wrapper .gap-3 .pb-2}
+
+::: {.example-text style="text-align:right;padding:0.5rem;"}
+
+<div class="fs-4 fw-bold pb-1">MCMC sampling algorithms</div>
+
+A number of MCMC sampling algorithms are available in Turing.jl, including (but not limited to) HMC, NUTS, Metropolis–Hastings, particle samplers, and Gibbs.
 
-Some text about how easy it is to interface with external packages like AbstractGPs. Learn more about modelling [Gaussian Processes](https://turinglang.org/docs/tutorials/15-gaussian-processes/) with Turing.jl.
+Turing.jl also supports ['external samplers'](https://turinglang.org/docs/usage/external-samplers/) which conform to the AbstractMCMC.jl interface, meaning that users can implement their own algorithms.
 
 :::
 
 ::: {.example-code style="overflow-x: scroll;"}
 ```{.julia .code-overflow-scroll}
-@model function putting_model(d, n; jitter=1e-4)
-    v ~ Gamma(2, 1)
-    l ~ Gamma(4, 1)
-    f = GP(v * with_lengthscale(SEKernel(), l))
-    f_latent ~ f(d, jitter)
-    binomials = Binomial.(n, logistic.(f_latent))
-    y ~ product_distribution(binomials)
-    return (fx=f(d, jitter), f_latent=f_latent, y=y)
+sample(model, NUTS(), 1000)
+
+sample(model, MH(), 1000)
+
+using SliceSampling
+sample(model, externalsampler(SliceSteppingOut(2.0)), 1000)
+```
+:::
+
+:::::
+
````
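For a sense of what "conform to the AbstractMCMC.jl interface" involves, here is a bare-bones, assumption-laden skeleton (not the full set of requirements for `externalsampler`): a sampler subtypes `AbstractMCMC.AbstractSampler` and implements two `step` methods, one for the first iteration and one for every iteration after it.

```julia
using AbstractMCMC, Random

# A toy sampler that ignores the target and draws from Uniform(0, 1);
# a real sampler would evaluate the log density wrapped in `model`.
struct ToySampler <: AbstractMCMC.AbstractSampler end

# Initial step: no previous state exists yet.
function AbstractMCMC.step(rng::Random.AbstractRNG, model, ::ToySampler; kwargs...)
    draw = rand(rng)
    return draw, draw  # (sample to save, state for the next step)
end

# Subsequent steps: receive the state returned by the previous step.
function AbstractMCMC.step(rng::Random.AbstractRNG, model, ::ToySampler, state; kwargs...)
    draw = rand(rng)
    return draw, draw
end
```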
````diff
+::::: {.d-flex .flex-row-reverse .flex-wrap .panel-wrapper .gap-3 .pt-2 .section-end-space}
+
+::: {.example-text style="padding:0.5rem;"}
+
+<div class="fs-4 fw-bold pb-1">Composability with Julia</div>
+
+As Turing.jl models are simply Julia functions under the hood, they can contain arbitrary Julia code.
+
+For example, [differential equations](https://turinglang.org/docs/tutorials/bayesian-differential-equations/) can be added to a model using `DifferentialEquations.jl`, which is a completely independent package.
+
+:::
+
+::: {.example-code style="overflow-x: scroll;"}
+```{.julia .code-overflow-scroll}
+using DifferentialEquations
+
+# Define the system of equations
+function lotka_volterra(du, u, params, t)
+    α, β, δ, γ = params
+    x, y = u
+    du[1] = (α * x) - (β * x * y)
+    du[2] = (δ * x * y) - (γ * y)
+end
+prob = ODEProblem(lotka_volterra, ...)
+
+# Use it in a model
+@model function fit_lotka_volterra()
+    # Priors
+    α ~ Normal(0, 1)
+    # ...
+
+    # Solve the ODE
+    predictions = solve(prob, Tsit5(); p=p)
+
+    # Likelihood
+    data ~ Poisson.(predictions, ...)
 end
 ```
 :::
````
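The `...` placeholders above are deliberate elisions in the committed snippet. A hypothetical fully-worked version, with assumed initial conditions, priors, solver options, and data layout (a 2×T matrix of counts), none of which come from the commit:

```julia
using Turing, DifferentialEquations

function lotka_volterra(du, u, params, t)
    α, β, δ, γ = params
    x, y = u
    du[1] = (α * x) - (β * x * y)
    du[2] = (δ * x * y) - (γ * y)
end

# Assumed initial populations, time span, and nominal parameters.
u0 = [1.0, 1.0]
tspan = (0.0, 10.0)
prob = ODEProblem(lotka_volterra, u0, tspan, [1.5, 1.0, 3.0, 1.0])

@model function fit_lotka_volterra(data, prob)
    # Assumed priors, truncated so the rates stay positive.
    α ~ truncated(Normal(1.5, 0.5); lower=0.0)
    β ~ truncated(Normal(1.0, 0.5); lower=0.0)
    δ ~ truncated(Normal(3.0, 0.5); lower=0.0)
    γ ~ truncated(Normal(1.0, 0.5); lower=0.0)

    # Solve the ODE with the sampled parameters.
    predictions = solve(prob, Tsit5(); p=[α, β, δ, γ], saveat=0.5)

    # Poisson likelihood: one count per species per saved time point.
    for i in eachindex(predictions.u)
        data[:, i] ~ arraydist(Poisson.(max.(predictions.u[i], 1e-3)))
    end
end

# chain = sample(fit_lotka_volterra(data, prob), NUTS(), 1000)
```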
````diff
@@ -179,4 +254,4 @@ Placeholder text introducing the Bayesian Workflow diagram from the ACM special
 
 <div class="section-start-space">
 
-{{< include _includes//citation/cite.qmd >}}
+{{< include _includes//citation/cite.qmd >}}
````
