
Commit fa288c6

Merge branch 'main' into cgarling-patch-1
2 parents b878046 + b309f81 commit fa288c6

6 files changed

Lines changed: 112 additions & 122 deletions

File tree

docs/src/index.md

Lines changed: 0 additions & 1 deletion
@@ -18,7 +18,6 @@ Our contributors come from diverse backgrounds and have various levels of intera
 - [#astronomy](slack://channel?id=CMXU6SD7V&team=T68168MUP) on [JuliaLang Slack](https://julialang.org/slack/)
 - [#astronomy](https://julialang.zulipchat.com/#narrow/channel/astronomy) on [JuliaLang Zulip](https://julialang.zulipchat.com/register/)
 - [Astro/Space](https://discourse.julialang.org/c/domain/astro) topics on JuliaLang Discourse
-- [julia-astro](https://groups.google.com/forum/#!forum/julia-astro) mailing list on Google Groups
 - [Monthly JuliaAstro community call](https://julialang.org/community/#events) - Fourth Wednesday of each month at 12:00 ET
 
 ## Google Summer of Code (GSoC)

docs/src/tutorials/curve-fit.md

Lines changed: 53 additions & 71 deletions
@@ -1,7 +1,7 @@
 # [Curve Fitting](@id curve-fit)
 
 
-This tutorial will demonstrate fitting data with a straight line (linear regression), an abitrary non-linear model, and finally a Bayesian model.
+This tutorial will demonstrate fitting data with a straight line (linear regression), an arbitrary non-linear model, and finally a Bayesian model.
 
 ## Packages
 
@@ -21,9 +21,9 @@ using Pkg; Pkg.add(["Plots", "Optimization", "OptimizationOptimJL", "Turing", "P
 ```
 In your own code, you most likely won't need all of these packages. Pick and choose the one that best fits your problem.
 
-If you will be using these tools as part of a bigger project, it's strongly recommended to create a [Julia Project](https://pkgdocs.julialang.org/v1/environments/) to record package versions. If you're just experimenting, you can create a temporary project by running `] activate --temp`.
+If you will be using these tools as part of a bigger project, it's strongly recommended to create a [Julia Project](https://pkgdocs.julialang.org/v1/environments/) to record package versions. If you're just experimenting, you can create a temporary project by running `] activate --temp` in the Julia REPL.
 
-If you're using [Pluto notebooks](https://github.com/fonsp/Pluto.jl), installing and recording package versions in a project are handled for you automatically.
+If you're using [Pluto notebooks](https://plutojl.org), installing and recording package versions in a project are handled for you automatically.
 
 
 ## Generating the data
@@ -66,17 +66,14 @@ julia> scatter(x, y; xlabel="x", ylabel="y", label="data")
 
 Before using any packages, let's perform a linear fit from scratch using some linear algebra.
 
-The equation of a line can be written in matrix form as
+The equation of a line can be written in matrix form as
 ```math
-\quad
-\begin{pmatrix}
-N & \sum y_i \\
-\sum y_i & \sum y_{i}^2
-\end{pmatrix}
 \begin{pmatrix}
-c_1 \\
-c_2 \\
-\end{pmatrix}=
+N & \sum x_i \\
+\sum x_i & \sum x_i^2
+\end{pmatrix}
+\begin{pmatrix} c_1 \\ c_2 \end{pmatrix}
+=
 \begin{pmatrix}
 \sum y_i \\
 \sum y_i x_i
@@ -88,34 +85,27 @@ where $c_1$ and $c_2$ are the intercept and slope.
 Multiplying both sides by the inverse of the first matrix gives
 
 ```math
-\quad
-\begin{pmatrix}
-c_1 \\
-c_2 \\
-\end{pmatrix}=
-\begin{pmatrix}
-N & \sum y_i \\
-\sum y_i & \sum y_{i}^2
+\begin{pmatrix} c_1 \\ c_2 \end{pmatrix}
+= \begin{pmatrix}
+N & \sum x_i \\
+\sum x_i & \sum x_i^2
 \end{pmatrix}^{-1}
-\begin{pmatrix}
-\sum y_i \\
-\sum y_i x_i
-\end{pmatrix}
+\begin{pmatrix} \sum y_i \\ \sum y_i x_i \end{pmatrix}
 ```
 
 We can write the right-hand side matrix and vector (let's call them `A` and `b`) in Julia notation like so:
-```julia
+```julia-repl
 julia> A = [
 length(x) sum(x)
 sum(x) sum(x.^2)
-]
+]
 2×2 Matrix{Int64}:
 21 1050
 1050 71750
 
 julia> b = [
 sum(y)
-sum(y .* x)
+sum(y .* x)
 ]
 2-element Vector{Float64}:
 210.4250937868108
@@ -128,14 +118,12 @@ julia> c = A\b
 2-element Vector{Float64}:
 -1.67268257376372
 0.2338585027008085
-
 ```
 
 Let's make a helper function `linfunc` that takes an x value, a slope, and an intercept and calculates the corresponding y value:
 ```julia-repl
 julia> linfunc(x; slope, intercept) = slope*x + intercept
 linfunc (generic function with 1 method)
-
 ```
 
 Finally, we can plot the solution:
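For readers following along in another language, the closed-form normal-equations solve in this hunk can be sketched in pure Python. This is an illustrative translation, not part of the tutorial; `linear_fit` and the sample data are assumptions, and the 2×2 system is solved by Cramer's rule instead of `A\b`.

```python
def linear_fit(x, y):
    """Return (intercept, slope) from the 2x2 normal equations
    A = [[N, sum(x)], [sum(x), sum(x^2)]], b = [sum(y), sum(x*y)],
    solved by Cramer's rule."""
    n = len(x)
    sx = sum(x)
    sxx = sum(xi * xi for xi in x)
    sy = sum(y)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    det = n * sxx - sx * sx  # determinant of the 2x2 matrix A
    intercept = (sy * sxx - sx * sxy) / det
    slope = (n * sxy - sx * sy) / det
    return intercept, slope

# Noiseless points on y = 0.5x - 1 are recovered exactly.
x = [0.0, 1.0, 2.0, 3.0]
y = [0.5 * xi - 1.0 for xi in x]
c1, c2 = linear_fit(x, y)
```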
@@ -151,9 +139,9 @@ The packages [LsqFit](https://julianlsolvers.github.io/LsqFit.jl/dev/) and [GLM]
 
 ## (Non-)linear curve fit
 
-The packages above can be used to fit different polynomial models, but if we have a truly arbitrary Julia function we would like to fit to some data we can use the [Optimization.jl](http://optimization.sciml.ai/stable/) package. Through its various backends, Optimization.jl supports a very wide range of algorithms for local, global, convex, and non-convex optimization.
+The packages above can be used to fit different polynomial models, but if we have a truly arbitrary Julia function we would like to fit to some data we can use the [Optimization.jl](http://optimization.sciml.ai/stable/) package. Through its various backends, Optimization.jl supports a very wide range of algorithms for local, global, convex, and non-convex optimization.
 
-The first step is to define our objective function. We'll reuse our simple `linfunc` linear function from above:
+The first step is to define our objective function. We'll reuse our simple `linfunc` linear function from above and create an objective function based on the sum of the squared errors:
 ```julia
 linfunc(x; slope, intercept) = slope*x + intercept
 
@@ -169,38 +157,39 @@ function objective(u, data)
 
 # Get the x and y vectors from data
 x, y = data
-
+
 # Calculate the residuals between our model and the data
 residuals = linfunc.(x; slope, intercept) .- y
 
 # Return the sum of squares of the residuals to minimize
 return sum(residuals.^2)
 end
+```
 
+Now, we'll use SciML's problem-algorithm-solve workflow to solve our optimization problem:
+```julia
 # Define the initial parameter values for slope and intercept
 u0 = [1.0, 1.0]
 # Pass through the data we want to fit
-data = [x,y]
+data = [x, y]
 
 # Create an OptimizationProblem object to hold the function, initial
 # values, and data.
 using Optimization
-prob = OptimizationProblem(objective,u0,data)
+prob = OptimizationProblem(objective, u0, data)
 
 # Import the optimization backend we want to use
 using OptimizationOptimJL
 
-# Minimize the function. Optimization.jl uses the SciML common solver
-# interface. Pass the problem you want to solve (optimization problem
-# here) and a solver to use.
-# NelderMead() is a derivative-free method for finding a function's
-# local minimum.
-sol = solve(prob,NelderMead())
+# Minimize the function. Optimization.jl uses the SciML common solver interface.
+# Pass the problem you want to solve (optimization problem here) and a solver to use.
+# NelderMead() is a derivative-free method for finding a function's local minimum.
+sol = solve(prob, NelderMead())
 
 # Extract the best-fitting parameters
 slope, intercept = sol.u
 ```
-Note: the `NelderMead()` algorithm behaves nearly identically to MATLAB's `fminsearch`.
+Note: the `NelderMead()` algorithm behaves nearly identically to MATLAB's `fminsearch`.
 
 
 We can now plot the solution:
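The objective-then-minimize pattern in this hunk can be illustrated outside Julia as well. Below is a hedged Python sketch using a toy compass (pattern) search as a stand-in for `NelderMead()`; the function names and the search routine are assumptions for illustration, not Optim.jl's implementation.

```python
def linfunc(x, slope, intercept):
    return slope * x + intercept

def objective(u, data):
    """Sum of squared residuals between the line and the data."""
    slope, intercept = u
    x, y = data
    return sum((linfunc(xi, slope, intercept) - yi) ** 2 for xi, yi in zip(x, y))

def compass_search(f, u0, data, step=1.0, tol=1e-10):
    """Derivative-free minimization: try +/-step moves along each axis,
    halving the step whenever no move improves the objective."""
    u = list(u0)
    best = f(u, data)
    while step > tol:
        improved = False
        for i in range(len(u)):
            for delta in (step, -step):
                trial = list(u)
                trial[i] += delta
                val = f(trial, data)
                if val < best:
                    u, best, improved = trial, val, True
        if not improved:
            step /= 2
    return u

# Fit noiseless data from y = 0.5x - 1; the search recovers both parameters.
x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [0.5 * xi - 1.0 for xi in x]
slope, intercept = compass_search(objective, [1.0, 1.0], (x, y))
```

The same problem/solve separation as in the tutorial applies: the objective knows nothing about the search method, so swapping the minimizer requires no change to the model.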
@@ -211,11 +200,11 @@ julia> plot!(x, yfit, label="best fit")
 ```
 ![](../assets/tutorials/curve-fit/optimization-linear-regression.svg)
 
-We can now test out a quadratic fit using the same package:
+We can now test out a quadratic fit using the same package, by defining a new objective function:
 ```julia
 function objective(u, data)
 x, y = data
-
+
 # Define an equation of a quadratic, e.g.:
 # 3x^2 + 2x + 1
 model = u[1] .* x.^2 .+ u[2] .* x .+ u[3]
@@ -226,12 +215,14 @@ function objective(u, data)
 # Return the sum of squares of the residuals to minimize
 return sum(residuals.^2)
 end
+```
 
+```julia
 u0 = [1.0, 1.0, 1.0]
-data = [x,y]
-prob = OptimizationProblem(objective,u0,data)
+data = [x, y]
+prob = OptimizationProblem(objective, u0, data)
 using OptimizationOptimJL
-sol = solve(prob,NelderMead())
+sol = solve(prob, NelderMead())
 u = sol.u
 
 yfit = u[1] .* x.^2 .+ u[2] .* x .+ u[3]
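As a cross-check on the quadratic fit in this hunk, the same least-squares problem has a closed-form solution via 3×3 normal equations. This is a hedged pure-Python sketch; `polyfit2` and the test data are illustrative assumptions, not the tutorial's code.

```python
def polyfit2(x, y):
    """Least-squares fit of u1*x^2 + u2*x + u3 via the normal equations
    (C^T C) u = C^T y, solved by Gaussian elimination with partial pivoting."""
    # Design-matrix columns: x^2, x, 1
    cols = [[xi ** 2 for xi in x], [xi for xi in x], [1.0 for _ in x]]
    A = [[sum(a * b for a, b in zip(ci, cj)) for cj in cols] for ci in cols]
    b = [sum(c * yi for c, yi in zip(ci, y)) for ci in cols]
    n = 3
    M = [row[:] + [bi] for row, bi in zip(A, b)]  # augmented matrix
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))  # partial pivot
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    u = [0.0] * n
    for k in reversed(range(n)):  # back-substitution
        u[k] = (M[k][n] - sum(M[k][c] * u[c] for c in range(k + 1, n))) / M[k][k]
    return u

# Exactly quadratic data from 3x^2 + 2x + 1 (the example in the comment above).
x = [-2.0, -1.0, 0.0, 1.0, 2.0]
y = [3 * xi ** 2 + 2 * xi + 1 for xi in x]
u = polyfit2(x, y)
```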
@@ -248,8 +239,8 @@ First, you can use automatic differentiation and a higher order optimization alg
 ```julia
 using ForwardDiff
 optf = OptimizationFunction(objective, Optimization.AutoForwardDiff())
-prob = OptimizationProblem(optf,u0,data)
-@time sol = solve(prob,BFGS()) # another good algorithm is Newton()
+prob = OptimizationProblem(optf, u0, data)
+@time sol = solve(prob, BFGS()) # another good algorithm is Newton()
 ```
 You can also write an "in-place" version of `objective` that doesn't allocate new arrays with each iteration.
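This hunk is about supplying derivatives to the solver. In miniature, the idea can be sketched with a central-difference gradient and plain gradient descent; this is a hedged Python toy (step size, iteration count, and data are assumptions), not Optimization.jl's `AutoForwardDiff` machinery.

```python
def objective(u, data):
    """Sum of squared residuals for the linear model."""
    slope, intercept = u
    x, y = data
    return sum((slope * xi + intercept - yi) ** 2 for xi, yi in zip(x, y))

def grad(f, u, data, h=1e-6):
    """Central finite differences: df/du_i ~ (f(u + h*e_i) - f(u - h*e_i)) / 2h."""
    g = []
    for i in range(len(u)):
        up = list(u); up[i] += h
        dn = list(u); dn[i] -= h
        g.append((f(up, data) - f(dn, data)) / (2 * h))
    return g

# Plain gradient descent on noiseless data from y = 0.5x - 1.
x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [0.5 * xi - 1.0 for xi in x]
u = [1.0, 1.0]
for _ in range(5000):
    g = grad(objective, u, (x, y))
    u = [ui - 0.01 * gi for ui, gi in zip(u, g)]
slope, intercept = u
```

Automatic differentiation gives the same gradients exactly and cheaply, which is why the quasi-Newton `BFGS()` run above is typically much faster than the derivative-free `NelderMead()`.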

@@ -305,15 +296,15 @@ parameters = σ₂, intercept, slope
 internals = lp, n_steps, is_accept, acceptance_rate, log_density, hamiltonian_energy, hamiltonian_energy_error, max_hamiltonian_energy_error, tree_depth, numerical_error, step_size, nom_step_size
 
 Summary Statistics
-parameters mean std naive_se mcse ess rhat ess_per_sec
+parameters mean std naive_se mcse ess rhat ess_per_sec
 Symbol Float64 Float64 Float64 Float64 Float64 Float64 Float64
 
 σ₂ 6.7431 2.6279 0.0166 0.0265 10640.9415 1.0000 1810.6077
 intercept -1.5979 1.0739 0.0068 0.0105 10239.7534 1.0001 1742.3436
 slope 0.2328 0.0186 0.0001 0.0002 10306.9493 1.0001 1753.7773
 
 Quantiles
-parameters 2.5% 25.0% 50.0% 75.0% 97.5%
+parameters 2.5% 25.0% 50.0% 75.0% 97.5%
 Symbol Float64 Float64 Float64 Float64 Float64
 
 σ₂ 3.3126 4.9457 6.1965 7.9372 13.3608
@@ -322,37 +313,33 @@ Quantiles
 ```
 
 
-```
+```julia
 intercept = chain["intercept"]
 slope = chain["slope"]
 σ₂ = chain["σ₂"]
 
-plot(x, x .* slope' .+ intercept';
-    label="",
-    color=:gray,
-    alpha=0.05
-)
+plot(x, x .* slope' .+ intercept'; label="", color=:gray, alpha=0.05)
 scatter!(x, y, xlabel="x", ylabel="y", label="data", color=1)
 ```
 ![](../assets/tutorials/curve-fit/bayesian-lin-regression.svg)
 
 Each gray curve is a sample from the posterior distribution of this model. To examine the model parameters and their covariance in greater detail, we can make a corner plot using the PairPlots.jl package. We'll need a few more samples for a nice plot, so re-run the NUTS sampler with more iterations first.
-```
+```julia
 Random.seed!(1234)
 chain = sample(model, NUTS(0.65), 25_000)
 
 using PairPlots
 table = (;
-    intercept= chain["intercept"],
-    slope= chain["slope"],
-    σ= sqrt.(chain["σ₂"])
+    intercept = chain["intercept"],
+    slope = chain["slope"],
+    σ = sqrt.(chain["σ₂"])
 )
 PairPlots.corner(table)
 ```
 ![](../assets/tutorials/curve-fit/lin-regress-corner.svg)
 
 
-Let's now repeat this proceedure with a Bayesian quadratic model.
+Let's now repeat this procedure with a Bayesian quadratic model.
 
 ```julia
 @model function quad_regression(x, y)
@@ -393,7 +380,7 @@ parameters = σ₂, u1, u2, u3
 internals = lp, n_steps, is_accept, acceptance_rate, log_density, hamiltonian_energy, hamiltonian_energy_error, max_hamiltonian_energy_error, tree_depth, numerical_error, step_size, nom_step_size
 
 Summary Statistics
-parameters mean std naive_se mcse ess rhat ess_per_sec
+parameters mean std naive_se mcse ess rhat ess_per_sec
 Symbol Float64 Float64 Float64 Float64 Float64 Float64 Float64
 
 σ₂ 1.5698 0.6322 0.0283 0.0518 117.5553 0.9994 19.9517
@@ -402,7 +389,7 @@ Summary Statistics
 u3 2.1371 0.6109 0.0273 0.0562 87.2121 0.9995 14.8018
 
 Quantiles
-parameters 2.5% 25.0% 50.0% 75.0% 97.5%
+parameters 2.5% 25.0% 50.0% 75.0% 97.5%
 Symbol Float64 Float64 Float64 Float64 Float64
 
 σ₂ 0.8757 1.1468 1.3945 1.8181 3.3834
@@ -411,18 +398,13 @@ Quantiles
 u3 0.9635 1.7155 2.1211 2.5172 3.3960
 ```
 
-
-```
+```julia
 u1 = chain["u1"]
 u2 = chain["u2"]
 u3 = chain["u3"]
 posterior = u1' .* x.^2 .+ u2' .* x .+ u3'
 
-plot(x, posterior;
-    label="",
-    color=:gray,
-    alpha=0.1
-)
+plot(x, posterior; label="", color=:gray, alpha=0.1)
 scatter!(x, y, xlabel="x", ylabel="y", label="data", color=1)
 ```
 ![](../assets/tutorials/curve-fit/bayesian-quad-regression.svg)
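The posterior sampling behind these plots can be imitated in miniature with a random-walk Metropolis sampler. This hedged Python sketch is only an assumption-laden stand-in for the tutorial's Turing/NUTS models: flat priors, a fixed noise variance, and hand-picked proposal scales, applied to the linear case.

```python
import math
import random

def log_likelihood(intercept, slope, sigma2, x, y):
    """Gaussian log-likelihood: y_i ~ Normal(intercept + slope*x_i, sqrt(sigma2))."""
    n = len(x)
    sse = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))
    return -0.5 * n * math.log(2 * math.pi * sigma2) - sse / (2 * sigma2)

random.seed(0)
x = [float(i) for i in range(21)]
y = [0.23 * xi - 1.6 + random.gauss(0, 1.0) for xi in x]

# Random-walk Metropolis over (intercept, slope) with flat priors;
# the noise variance is held fixed at its true value for simplicity.
cur = (0.0, 0.0)
cur_lp = log_likelihood(cur[0], cur[1], 1.0, x, y)
samples = []
for _ in range(20000):
    prop = (cur[0] + random.gauss(0, 0.2), cur[1] + random.gauss(0, 0.02))
    lp = log_likelihood(prop[0], prop[1], 1.0, x, y)
    if math.log(random.random()) < lp - cur_lp:  # Metropolis acceptance rule
        cur, cur_lp = prop, lp
    samples.append(cur)

burned = samples[5000:]  # discard burn-in
mean_intercept = sum(s[0] for s in burned) / len(burned)
mean_slope = sum(s[1] for s in burned) / len(burned)
```

Each retained sample plays the role of one gray curve in the plots above; NUTS differs mainly in using gradients to propose far better moves.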
@@ -437,7 +419,7 @@ table = (;
 u_1 = chain["u1"],
 u_2 = chain["u2"],
 u_3 = chain["u3"],
-σ= sqrt.(chain["σ₂"])
+σ = sqrt.(chain["σ₂"])
 )
 PairPlots.corner(table)
 ```

docs/src/tutorials/index.md

Lines changed: 3 additions & 3 deletions
@@ -9,21 +9,21 @@ For more advanced usage examples, please see the documentation pages of the indi
 
 ## Installing Julia
 
-To install Julia, it's strongly recommended to download the official binary for your operating system.
+To install Julia, it's strongly recommended to download the official binary for your operating system.
 Visit the [julialang.org Downloads page](https://julialang.org/downloads), and select the latest stable version for your operating system. Currently, this is 1.7.3. Click the [help] links next to your operating system if you require more detailed instructions.
 
 ## Installing Packages
 
 Julia packages are installed and managed using the built-in package manager called [Pkg.jl](https://pkgdocs.julialang.org/v1/).
-You can either use it interactively by entering "Pkg mode" in your terminal, or programatically. We'll illustrate interactive use here.
+You can either use it interactively by entering "Pkg mode" in your terminal, or programmatically. We'll illustrate interactive use here.
 
 1. Start julia in a terminal by running `julia`.
 2. Type `]` to enter package-mode (see Julia documentation for more details)
 3. Type `add SomePackage` to install a package, replacing `SomePackage` with the desired package name without the `.jl` extension.
 
 This will take a little while to download all the required packages and precompile for your system. If you have several packages to install, list them all at once instead of one by one to save time: `add SomePackage1 SomePackage2 SomePackage3`
 
-It's recommended to use Julia projects to store what packages you use and make it easier to reproduce your work. You can create or activate a previously created project by entering Pkg-mode (type `]`) and running `activate myproject`. Another option is to create a folder for your project, and start julia in that folder with the option `julia --project=./`.
+It's recommended to use Julia projects to store what packages you use and make it easier to reproduce your work. You can create or activate a previously created project by entering Pkg-mode (type `]`) and running `activate myproject`. Another option is to create a folder for your project, and start julia in that folder with the option `julia --project`.
 
 For more information on how to use the Julia package manager, refer to the [Pkg.jl documentation](https://pkgdocs.julialang.org/v1/repl/).
 
0 commit comments
