Commit b6dad47

Add more context for instructions and break up large code blocks
1 parent 6bcba41 commit b6dad47

1 file changed

Lines changed: 8 additions & 3 deletions

docs/src/tutorials/curve-fit.md

@@ -21,7 +21,7 @@ using Pkg; Pkg.add(["Plots", "Optimization", "OptimizationOptimJL", "Turing", "P
 ```
 In your own code, you most likely won't need all of these packages. Pick and choose the one that best fits your problem.
 
-If you will be using these tools as part of a bigger project, it's strongly recommended to create a [Julia Project](https://pkgdocs.julialang.org/v1/environments/) to record package versions. If you're just experimenting, you can create a temporary project by running `] activate --temp`.
+If you will be using these tools as part of a bigger project, it's strongly recommended to create a [Julia Project](https://pkgdocs.julialang.org/v1/environments/) to record package versions. If you're just experimenting, you can create a temporary project by running `] activate --temp` in the Julia REPL.
 
 If you're using [Pluto notebooks](https:/plutojl.org), installing and recording package versions in a project are handled for you automatically.
 
@@ -151,7 +151,7 @@ The packages [LsqFit](https://julianlsolvers.github.io/LsqFit.jl/dev/) and [GLM]
 
 The packages above can be used to fit different polynomial models, but if we have a truly arbitrary Julia function we would like to fit to some data we can use the [Optimization.jl](http://optimization.sciml.ai/stable/) package. Through its various backends, Optimization.jl supports a very wide range of algorithms for local, global, convex, and non-convex optimization.
 
-The first step is to define our objective function. We'll reuse our simple `linfunc` linear function from above:
+The first step is to define our objective function. We'll reuse our simple `linfunc` linear function from above and create an objective function based on the sum of the squared errors
 ```julia
 linfunc(x; slope, intercept) = slope*x + intercept
 
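The changed line above frames the objective as a sum of squared errors. As a rough, hand-rolled Python analogue of that idea (the tutorial's real code is Julia; `linfunc` and `objective` here are re-implemented for illustration, not taken from any library):

```python
# Illustrative Python analogue of the tutorial's Julia objective
# (the tutorial itself uses Optimization.jl, not this code).

def linfunc(x, slope, intercept):
    # Simple linear model: y = slope*x + intercept
    return slope * x + intercept

def objective(u, data):
    slope, intercept = u
    x, y = data
    # Residuals between observed y values and model predictions
    residuals = [yi - linfunc(xi, slope, intercept) for xi, yi in zip(x, y)]
    # Sum of squared errors: the quantity a solver would minimize
    return sum(r ** 2 for r in residuals)

x = [0.0, 1.0, 2.0, 3.0]
y = [1.0, 3.0, 5.0, 7.0]              # generated from y = 2x + 1
print(objective([2.0, 1.0], (x, y)))  # 0.0 at the true parameters
```

At the true parameters the residuals vanish, so the objective is exactly zero; any other parameter choice gives a strictly positive value.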
@@ -174,7 +174,10 @@ function objective(u, data)
     # Return the sum of squares of the residuals to minimize
     return sum(residuals.^2)
 end
+```
 
+Now, we'll use SciML's problem-algorithm-solve workflow to solve our optimization problem:
+```julia
 # Define the initial parameter values for slope and intercept
 u0 = [1.0, 1.0]
 # Pass through the data we want to fit
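This hunk splits the code block so the problem-algorithm-solve workflow stands on its own. To make that loop concrete without Optimization.jl, here is a hedged Python sketch that minimizes the same sum-of-squares objective with plain gradient descent (the function name, learning rate, and step count are arbitrary choices for this toy data, not anything from the tutorial):

```python
# Hypothetical stand-in for "define problem, pick algorithm, solve":
# plain gradient descent on the sum-of-squared-errors loss.

def solve_linear_fit(x, y, u0, lr=0.02, steps=2000):
    slope, intercept = u0
    for _ in range(steps):
        # Residuals of the current fit
        r = [yi - (slope * xi + intercept) for xi, yi in zip(x, y)]
        # Gradient of sum(r^2) with respect to slope and intercept
        g_slope = -2.0 * sum(ri * xi for ri, xi in zip(r, x))
        g_intercept = -2.0 * sum(r)
        slope -= lr * g_slope
        intercept -= lr * g_intercept
    return slope, intercept

x = [0.0, 1.0, 2.0, 3.0]
y = [1.0, 3.0, 5.0, 7.0]  # generated from y = 2x + 1
slope, intercept = solve_linear_fit(x, y, [1.0, 1.0])
# slope and intercept converge to roughly 2.0 and 1.0
```

The quadratic loss is convex, so with a small enough learning rate the iterates contract toward the least-squares solution; Optimization.jl's solvers play the same role far more robustly.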
@@ -209,7 +212,7 @@ julia> plot!(x, yfit, label="best fit")
 ```
 ![](../assets/tutorials/curve-fit/optimization-linear-regression.svg)
 
-We can now test out a quadratic fit using the same package:
+We can now test out a quadratic fit using the same package, by defining a new objective function:
 ```julia
 function objective(u, data)
     x, y = data
@@ -224,7 +227,9 @@ function objective(u, data)
     # Return the sum of squares of the residuals to minimize
     return sum(residuals.^2)
 end
+```
 
+```julia
 u0 = [1.0, 1.0, 1.0]
 data = [x,y]
 prob = OptimizationProblem(objective,u0,data)
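For the quadratic fit in this hunk, the least-squares solution can also be cross-checked without any iterative optimizer, by solving the normal equations directly. The sketch below is pure Python with an invented name (`polyfit_normal_eqs` is not part of the tutorial or of any library):

```python
# Hypothetical helper: least-squares polynomial fit via the normal
# equations (X^T X) u = X^T y, solved by Gaussian elimination.

def polyfit_normal_eqs(x, y, degree):
    n = degree + 1
    # Design matrix rows: [x^degree, ..., x, 1] for each data point
    X = [[xi ** p for p in range(degree, -1, -1)] for xi in x]
    A = [[sum(row[i] * row[j] for row in X) for j in range(n)] for i in range(n)]
    b = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(n)]
    # Forward elimination with partial pivoting
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back substitution
    u = [0.0] * n
    for i in range(n - 1, -1, -1):
        u[i] = (b[i] - sum(A[i][j] * u[j] for j in range(i + 1, n))) / A[i][i]
    return u

x = [0.0, 1.0, 2.0, 3.0]
y = [1.0, 2.0, 5.0, 10.0]   # generated from y = x^2 + 1
coeffs = polyfit_normal_eqs(x, y, 2)
# coeffs is approximately [1.0, 0.0, 1.0]
```

For well-conditioned low-degree fits this closed-form route agrees with what an iterative solver like the one in the tutorial converges to.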
