nhill bracket search (see `scipy.optimize.bracket`). The
    minimizer ``x`` will not necessarily satisfy ``xa <= x <= xb``.
tol : float, optional
    Relative error in solution `xopt` acceptable for convergence.
full_output : bool, optional
    If True, return all output args (xmin, fval, iter, funcalls).
maxiter : int, optional
    Maximum number of iterations in solution.

Returns
-------
xmin : ndarray
    Optimum point.
fval : float
    (Optional output) Optimum function value.
iter : int
    (Optional output) Number of iterations.
funcalls : int
    (Optional output) Number of objective function evaluations made.

See also
--------
minimize_scalar : Interface to minimization algorithms for scalar
    univariate functions. See the 'Brent' `method` in particular.

Notes
-----
Uses inverse parabolic interpolation when possible to speed up
convergence of the golden section method. Does not ensure that the
minimum lies in the range specified by `brack`. See
`scipy.optimize.fminbound`.

Examples
--------
We illustrate the behaviour of the function when `brack` is of
size 2 and 3, respectively. In the case where `brack` is of the
form ``(xa, xb)``, we can see that, for the given values, the output
does not necessarily lie in the range ``(xa, xb)``.

>>> def f(x):
...     return (x-1)**2
>>> from scipy import optimize

>>> minimizer = optimize.brent(f, brack=(1, 2))
>>> minimizer
1
>>> res = optimize.brent(f, brack=(-1, 0.5, 2), full_output=True)
>>> xmin, fval, iter, funcalls = res
>>> f(xmin), fval
(0.0, 0.0)
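To illustrate the golden-section component mentioned in the Notes, here is a minimal, self-contained sketch of a plain golden-section search. It is illustrative only and is not SciPy's implementation: the real `brent` additionally uses inverse parabolic interpolation to accelerate convergence, and the function name `golden_section` and its signature are our own choices for this sketch.

```python
import math

def golden_section(f, a, b, tol=1e-8):
    """Illustrative golden-section search for a minimum of f on [a, b].

    Each iteration shrinks the bracket by the factor 1/phi ~= 0.618,
    reusing one interior evaluation point; brent speeds this up with
    inverse parabolic interpolation when the parabola fit is safe.
    """
    invphi = (math.sqrt(5) - 1) / 2  # 1/phi, the golden ratio conjugate
    # Two interior points dividing [a, b] in golden-ratio proportions.
    c = b - invphi * (b - a)
    d = a + invphi * (b - a)
    fc, fd = f(c), f(d)
    while abs(b - a) > tol:
        if fc < fd:
            # Minimum lies in [a, d]; c becomes the new upper interior point.
            b, d, fd = d, c, fc
            c = b - invphi * (b - a)
            fc = f(c)
        else:
            # Minimum lies in [c, b]; d becomes the new lower interior point.
            a, c, fc = c, d, fd
            d = a + invphi * (b - a)
            fd = f(d)
    return (a + b) / 2

xmin = golden_section(lambda x: (x - 1) ** 2, -1, 2)
```

Without the parabolic-interpolation step, convergence is only linear (each step multiplies the bracket width by about 0.618), which is why `brent` is generally preferred over pure golden-section search.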