## fminsearch

Find minimum of unconstrained multivariable function using derivative-free method

```matlab
x = fminsearch(fun,x0)
x = fminsearch(fun,x0,options)
[x,fval] = fminsearch(...)
[x,fval,exitflag] = fminsearch(...)
[x,fval,exitflag,output] = fminsearch(...)
```

`fminsearch` finds the minimum of a scalar
function of several variables, starting at an initial estimate. This
is generally referred to as *unconstrained nonlinear optimization*.

`x = fminsearch(fun,x0)` starts
at the point `x0` and returns a value `x` that
is a local minimizer of the function described in `fun`. `x0` can
be a scalar, vector, or matrix. `fun` is a `function_handle`.

Parameterizing Functions in the MATLAB® Mathematics
documentation explains how to pass additional parameters to your objective
function `fun`. See also Example 2 and Example 3 below.

`x = fminsearch(fun,x0,options)` minimizes
with the optimization parameters specified in the structure `options`.
You can define these parameters using the `optimset` function. `fminsearch` uses
these `options` structure fields:

| Option | Description |
| --- | --- |
| `Display` | Level of display. |
| `FunValCheck` | Check whether objective function values are valid. |
| `MaxFunEvals` | Maximum number of function evaluations allowed. |
| `MaxIter` | Maximum number of iterations allowed. |
| `OutputFcn` | User-defined function that is called at each iteration. See Output Functions in MATLAB Mathematics for more information. |
| `PlotFcns` | Plots various measures of progress while the algorithm executes; select from predefined plots or write your own. Pass a function handle or a cell array of function handles. The default is none. Predefined plot functions: `@optimplotx` plots the current point, `@optimplotfval` plots the function value, and `@optimplotfunccount` plots the function count. See Plot Functions in MATLAB Mathematics for more information. |
| `TolFun` | Termination tolerance on the function value. |
| `TolX` | Termination tolerance on `x`. |
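For example, the following sketch builds an options structure with `optimset` and passes it to `fminsearch`; the quadratic objective and starting point here are illustrative choices, not part of the reference material:

```matlab
% Request iterative display and a tighter termination tolerance on x.
options = optimset('Display','iter','TolX',1e-8);

% Minimize a simple quadratic, whose minimum is at [0 0],
% starting from x0 = [3 3].
[x,fval] = fminsearch(@(x) x(1)^2 + x(2)^2, [3 3], options);
```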

`[x,fval] = fminsearch(...)` returns
in `fval` the value of the objective function `fun` at
the solution `x`.

`[x,fval,exitflag] = fminsearch(...)` returns
a value `exitflag` that describes the exit condition
of `fminsearch`:

| `exitflag` | Description |
| --- | --- |
| `1` | `fminsearch` converged to a solution `x`. |
| `0` | Maximum number of function evaluations or iterations was reached. |
| `-1` | The algorithm was terminated by the output function. |

`[x,fval,exitflag,output] = fminsearch(...)` returns
a structure `output` that contains information about
the optimization in the following fields:

| Field | Description |
| --- | --- |
| `algorithm` | Optimization algorithm used (`'Nelder-Mead simplex direct search'`). |
| `funcCount` | Number of function evaluations |
| `iterations` | Number of iterations |
| `message` | Exit message |
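A call that requests all four outputs might look like the following sketch; the one-dimensional objective is an illustrative choice:

```matlab
% Request all four outputs from fminsearch.
[x,fval,exitflag,output] = fminsearch(@(x) (x-2)^2, 0);

% exitflag is 1 when fminsearch converged to a solution x;
% output.iterations and output.funcCount record the work done.
if exitflag == 1
    fprintf('Converged in %d iterations (%d evaluations)\n', ...
        output.iterations, output.funcCount);
end
```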

`fun` is the function to be minimized. It
accepts an input `x` and returns a scalar `f`,
the objective function evaluated at `x`. The function `fun` can
be specified as a function handle for a function file

```matlab
x = fminsearch(@myfun, x0)
```

where `myfun` is a function file such as

```matlab
function f = myfun(x)
f = ...   % Compute function value at x
```

or as a function handle for an anonymous function, such as

```matlab
x = fminsearch(@(x)sin(x^2), x0);
```

Other arguments are described in the syntax descriptions above.

The Rosenbrock banana function is a classic test example for multidimensional minimization:

`f(x) = 100*(x(2)-x(1)^2)^2 + (1-x(1))^2`

The minimum is at `(1,1)` and has the value `0`.
The traditional starting point is `(-1.2,1)`. The
anonymous function shown here defines the function and returns a function
handle called `banana`:

```matlab
banana = @(x)100*(x(2)-x(1)^2)^2+(1-x(1))^2;
```

Pass the function handle to `fminsearch`:

```matlab
[x,fval] = fminsearch(banana,[-1.2, 1])
```

This produces

```
x =
    1.0000    1.0000

fval =
   8.1777e-010
```

This indicates that the minimizer was found to at least four decimal places with a value near zero.

If `fun` is parameterized, you can use anonymous
functions to capture the problem-dependent parameters. For example,
suppose you want to minimize the objective function `myfun` defined
by the following function file:

```matlab
function f = myfun(x,a)
f = x(1)^2 + a*x(2)^2;
```

Note that `myfun` has an extra parameter `a`,
so you cannot pass it directly to `fminsearch`. To
optimize for a specific value of `a`, such as `a = 1.5`,
first define the parameter:

```matlab
a = 1.5; % define parameter first
```

Then call `fminsearch` with a one-argument anonymous function that
captures that value of `a` and calls `myfun` with two arguments:

```matlab
x = fminsearch(@(x) myfun(x,a),[0,1])
```

You can modify the first example by adding a parameter *a* to
the second term of the banana function:

`f(x) = 100*(x(2)-x(1)^2)^2 + (a-x(1))^2`

This changes the location of the minimum to the point `[a,a^2]`.
To minimize this function for a specific value of `a`,
for example `a = sqrt(2)`,
create a one-argument anonymous function that captures the value of `a`.

```matlab
a = sqrt(2);
banana = @(x)100*(x(2)-x(1)^2)^2+(a-x(1))^2;
```

Then the statement

```matlab
[x,fval] = fminsearch(banana, [-1.2, 1], ...
    optimset('TolX',1e-8));
```

seeks the minimum `[sqrt(2), 2]` to an accuracy
higher than the default on `x`.

`fminsearch` can often handle discontinuities, particularly if they
do not occur near the solution. `fminsearch` may only give local
solutions.

`fminsearch` only minimizes over the real numbers,
that is, *x* must only consist of real numbers and *f*(*x*) must
only return real numbers. When *x* has complex variables,
they must be split into real and imaginary parts.
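For instance, to minimize the modulus of the complex-valued expression `z^2 + 1` over complex `z`, treat the real and imaginary parts as two real variables; the objective here is an illustrative choice, not part of `fminsearch`:

```matlab
% Represent z = x(1) + 1i*x(2) and minimize |z^2 + 1|.
% The true minimizers are z = 1i and z = -1i, where |z^2 + 1| = 0.
g = @(x) abs((x(1) + 1i*x(2))^2 + 1);
x = fminsearch(g, [1, 1]);
```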

[1] Lagarias, J. C., J. A. Reeds, M. H. Wright,
and P. E. Wright, "Convergence Properties of the Nelder-Mead
Simplex Method in Low Dimensions," *SIAM Journal
on Optimization*, Vol. 9, No. 1, pp. 112-147, 1998.
