Happy Fibonacci Day! The first four terms of the Fibonacci sequence, given by

$$ F_k = F_{k-1} + F_{k-2}, \qquad F_1 = F_2 = 1, $$

are 1, 1, 2, 3, which form the date 11/23. However, this sequence is also useful in an application that surprised me a lot when I learned about it earlier this year in my optimization class: one-dimensional line search methods.
Optimization often involves algorithms that are iterative in some way, that is, you start with a guess for the minimizer and repeat a certain computation until you arrive at the actual minimizer (or within an acceptable tolerance). A big part of the study of these methods is characterizing their convergence rates, stability, and so on.
However, even though these problems live in multiple dimensions, a one-dimensional optimization problem often shows up as a building block of such iterative algorithms. This is the case for popular methods such as steepest descent or the conjugate gradient algorithm.
It is in these 1D line-search methods that the Fibonacci sequence pops up. In fact, another fun number pops up as well: the golden ratio! Let’s build up to it.
Section Search Methods
Consider a 1D unimodal function $f$, that is, a function with a single minimizer $x^*$ in the closed interval $[a_0, b_0]$. Such a function is depicted below:
How could we find the minimizer efficiently if we could evaluate the function at any point $x$? That is, how could we be smart about picking where to evaluate the function in a way that leads us closer to the minimizer?
One such way is to recognize that we can try to narrow our range from $[a_0, b_0]$ to something smaller. We could try to pick some point in the interval and compare the value of the function there to the values at $a_0$ and $b_0$. However, that wouldn’t tell us much: if the value is smaller than $f(a_0)$, it could just be the case that we overshot the minimizer. Likewise for comparisons to $f(b_0)$.
To narrow the range, therefore, we must evaluate the function at two different points. One way to go about this is to pick two points $a_1$ and $b_1$ that are each equally distant from their closest interval endpoint, as shown in the figure below:
where

$$ a_1 = a_0 + \rho\,(b_0 - a_0) \quad \text{and} \quad b_1 = a_0 + (1 - \rho)\,(b_0 - a_0), $$

with $\rho < \tfrac{1}{2}$. We then evaluate the function at those new points. If $f(a_1) < f(b_1)$, then the minimizer must be within $[a_0, b_1]$. Otherwise, it must be within $[a_1, b_0]$. An example of such a range reduction is shown below:
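Before specializing $\rho$, here is a minimal sketch of a single reduction step in Python (the function name and the numbers in the example are made up for illustration):

def section_step(f, left, right, rho):
    # Place two points symmetrically inside [left, right].
    a = left + rho * (right - left)
    b = left + (1 - rho) * (right - left)
    # Keep the sub-interval that must contain the minimizer.
    if f(a) < f(b):
        return left, b
    return a, right

# One step on f(x) = (x - 0.3)**2 over [0, 1]:
print(section_step(lambda x: (x - 0.3) ** 2, 0.0, 1.0, 0.25))  # (0.0, 0.75)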
Golden Section Search
Now here’s a neat trick: to make this algorithm faster, it would be nice to minimize the number of times we have to evaluate the function $f$. Consider the case where $f(a_1) < f(b_1)$, as shown above. Note that we know the minimizer lies in the range $[a_0, b_1]$ and that $a_1$ already lies in that range. On top of that, we already evaluated $f(a_1)$! Therefore, we can reuse $a_1$ as the rightmost of the two evaluation points for the next iteration, that is, $b_2 = a_1$. The range is reduced as follows:
We can write an expression for how fast this range is reduced. For the reused point to line up, we need $b_2 = a_1$, that is:

$$ a_0 + (1 - \rho)\,(b_1 - a_0) = a_0 + \rho\,(b_0 - a_0). $$

But $b_1 = a_0 + (1 - \rho)\,(b_0 - a_0)$, so $b_1 - a_0 = (1 - \rho)\,(b_0 - a_0)$. Therefore:

$$ (1 - \rho)^2\,(b_0 - a_0) = \rho\,(b_0 - a_0) \quad\Longrightarrow\quad (1 - \rho)^2 = \rho, $$

which expands to the following quadratic equation for $\rho$:

$$ \rho^2 - 3\rho + 1 = 0. $$

Solving this for $\rho$ and demanding, as stated before, that $\rho < \tfrac{1}{2}$ gives:

$$ \rho = \frac{3 - \sqrt{5}}{2} \approx 0.382. $$

Some more manipulation yields the following observation:

$$ \frac{\rho}{1 - \rho} = \frac{1 - \rho}{1}. $$

In other words, dividing the range from $a_0$ to $b_0$ at $b_1$ (or, symmetrically, at $a_1$) makes the ratio of the shorter segment to the longer one equal to the ratio of the longer segment to the sum of the two. This is exactly the golden section, and the corresponding ratio $1/(1 - \rho) = \tfrac{1 + \sqrt{5}}{2} \approx 1.618$ is the golden ratio!
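As a quick sanity check (a throwaway snippet, not part of the search routine we’ll write below), we can verify both properties numerically:

import math

rho = (3 - math.sqrt(5)) / 2                    # ~0.3819660
print((1 - rho) ** 2, rho)                      # both ~0.3819660, so (1 - rho)^2 = rho
print(rho / (1 - rho), 1 - rho)                 # shorter/longer equals longer/whole
print(1 / (1 - rho), (1 + math.sqrt(5)) / 2)    # both ~1.6180339, the golden ratio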
Fibonacci Search
Here comes Fibonacci: note that in the golden section search, we cut the range down by the same factor $1 - \rho \approx 0.618$ in every iteration, that is, the range reduction is fixed for every step. The following question arises: what if we allow this range reduction to vary from iteration to iteration?
Just like before, we would ideally only have to evaluate the function at a single new point at every iteration. Now, however, we allow ourselves to choose a different $\rho_k$ at every iteration. From the previous figure, we can conclude that the condition on $\rho_{k+1}$ that allows a single new evaluation at each iteration is:

$$ \rho_{k+1}\,(1 - \rho_k) = 1 - 2\rho_k, $$

where any sequence $\rho_1, \rho_2, \ldots$ that satisfies the condition above is valid. Rearranging leads to:

$$ \rho_{k+1} = 1 - \frac{\rho_k}{1 - \rho_k}. $$
An example of a sequence that satisfies the condition above is the one chosen for the golden section search, that is, $\rho_k = \rho = \frac{3 - \sqrt{5}}{2}$ for all $k$. Indeed, plugging $\rho_k \approx 0.382$ into the recursion gives $1 - 0.382/0.618 \approx 0.382$ back.
After $N$ iterations, the range will be reduced by a factor of:

$$ (1 - \rho_1)(1 - \rho_2)\cdots(1 - \rho_N). $$
So the question to be asked here is: how can we choose the sequence $\rho_1, \ldots, \rho_N$ so that we minimize this reduction factor? That is, how can we pick a sequence of cuts such that we shrink the range by as much as possible over the $N$ iterations?
The proof for this is somewhat long (it can be found in Chong & Zak [1]), but it can be shown that the optimal sequence is of the form:

$$ \rho_k = 1 - \frac{F_{N - k + 2}}{F_{N - k + 3}}, \qquad k = 1, \ldots, N, $$

where $F_k$ is the $k$th number of the Fibonacci sequence! Note also that, after the $N$ iterations, the range is reduced by a factor of $1 / F_{N+2}$ (up to a small $\varepsilon$ adjustment in the last iteration, where $\rho_N = 1/2$ would otherwise make the two evaluation points coincide).
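As a small illustration (the variable names here are mine), we can build this sequence for, say, $N = 10$ and check that the accumulated reduction indeed matches $1/F_{N+2}$ and beats the golden section’s $(1 - \rho)^N$:

import math

N = 10
F = [1, 1]                      # same 0-indexed list as in the snippet further down
while len(F) < N + 2:
    F.append(F[-1] + F[-2])

# Fibonacci reduction sequence (ignoring the epsilon tweak in the last step)
rhos = [1 - F[N + 1 - k] / F[N + 2 - k] for k in range(1, N + 1)]
# F[N + 1] in this 0-indexed list is F_{N+2} in the formula above (F[0] = F[1] = 1)
print(math.prod(1 - r for r in rhos), 1 / F[N + 1])   # both ~0.00694

# Fixed golden-section reduction, for comparison
rho_gs = (3 - math.sqrt(5)) / 2
print((1 - rho_gs) ** N)                              # ~0.00813, slightly worse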
Example: minimizing $f(x) = x^2$
Let’s try this in Python.
Below is a snippet for a Fibonacci Search function:
def fibonacciSection(objective, interval, N, epsilon=0.05):
    # Pre-compute the Fibonacci numbers needed for N iterations.
    F = [1, 1]
    for _ in range(2, N + 2):
        F.append(F[-1] + F[-2])

    intervals = []  # history of (left, right) at the start of each iteration
    points = []     # history of the two evaluation points (a, b)

    left, right = interval
    lower = 'a'     # flags which of the two points gets reused next iteration

    # Initial evaluation points inside [left, right].
    a = left + (F[N] / F[N + 1]) * (right - left)
    f_a = objective(a)
    b = left + (F[N - 1] / F[N + 1]) * (right - left)
    f_b = objective(b)

    for i in range(1, N + 1):
        # Reduction parameter for this iteration; the last one is perturbed
        # by epsilon so the two evaluation points do not coincide.
        if i != N:
            rho = 1 - F[N + 1 - i] / F[N + 2 - i]
        else:
            rho = 0.5 - epsilon

        # Reuse the surviving point and evaluate the objective at one new point.
        if lower == 'a':
            b = a
            f_b = f_a
            a = left + rho * (right - left)
            f_a = objective(a)
        else:
            a = b
            f_a = f_b
            b = left + (1 - rho) * (right - left)
            f_b = objective(b)

        intervals.append((left, right))
        points.append((a, b))

        # Keep the sub-interval that must contain the minimizer.
        if f_a < f_b:
            right = b
            lower = 'a'
        else:
            left = a
            lower = 'b'

    return intervals, points
This can be evaluated for any given objective function, an initial interval, and a number of iterations. Note that, in practice, the number of iterations is often determined by some fixed tolerance: since the interval reduction rate is known, the minimum number of iterations can be calculated. I am avoiding this for simplicity here.
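For the curious, a rough sketch of that calculation could look like this (min_iterations is just a name I made up here, not something defined above):

def min_iterations(interval_length, tol):
    # Smallest N such that interval_length / F[N + 1] <= tol,
    # using the same 0-indexed Fibonacci list as fibonacciSection.
    F = [1, 1]
    N = 0
    while interval_length / F[-1] > tol:
        F.append(F[-1] + F[-2])
        N += 1
    return N

print(min_iterations(2.0, 1e-4))   # iterations needed to shrink [-1, 1] below 1e-4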
Let’s run this for $f(x) = x^2$ on the interval $[-1, 1]$. We call the function with:
objective = lambda x: x**2
interval = [-1, 1]
N = 20
intervals, points = fibonacciSection(objective, interval, N)
Plotting the interval length over the iterations, you can see how quickly it shrinks as the algorithm narrows down to the region around the minimizer at $x = 0$:
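If you would like to reproduce such a plot from the intervals returned above, a minimal matplotlib sketch (the plotting choices here are my own) is:

import matplotlib.pyplot as plt

lengths = [right - left for left, right in intervals]   # interval length per iteration
plt.plot(range(1, len(lengths) + 1), lengths, marker='o')
plt.xlabel('Iteration')
plt.ylabel('Interval length')
plt.show()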
Even more fun: we can animate this! Below is a GIF showing the algorithm at work:
Conclusion
That’s about it for this post. This was a quick one for Fibonacci Day. Hopefully the GIF above works nicely. Will try to bring the topic back to aero next time. Woops!
Thanks for reading!
References
[1]: Chong, E. K. P., & Zak, S. H. (2013). An introduction to optimization. John Wiley & Sons.