(e.g. ``-myfprime(xk)``) to find a step length that satisfies the strong
Wolfe conditions. If the search direction is not a descent direction (e.g.
``myfprime(xk)``), then `alpha`, `new_fval`, and `new_slope` will be None.

Examples
--------
>>> import numpy as np
>>> from scipy.optimize import line_search

An objective function and its gradient are defined.

>>> def obj_func(x):
...     return (x[0])**2 + (x[1])**2
>>> def obj_grad(x):
...     return [2*x[0], 2*x[1]]

We can find alpha that satisfies strong Wolfe conditions.

>>> start_point = np.array([1.8, 1.7])
>>> search_gradient = np.array([-1.0, -1.0])
>>> line_search(obj_func, obj_grad, start_point, search_gradient)
(1.0, 2, 1, 1.1300000000000001, 6.13, [1.6, 1.4])
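The non-descent case described above can also be sketched as a standalone script. This is an illustrative example, not part of the official docstring: it passes the gradient itself (an ascent direction) as the search direction, so per the behavior documented here `alpha` comes back as None (a ``LineSearchWarning`` is also emitted, which the sketch suppresses).

```python
import warnings

import numpy as np
from scipy.optimize import line_search

def obj_func(x):
    return x[0]**2 + x[1]**2

def obj_grad(x):
    return np.array([2*x[0], 2*x[1]])

start_point = np.array([1.8, 1.7])
# The gradient points uphill, so it is NOT a descent direction.
ascent_direction = obj_grad(start_point)

with warnings.catch_warnings():
    warnings.simplefilter("ignore")  # hide the LineSearchWarning
    alpha, fc, gc, new_fval, old_fval, new_slope = line_search(
        obj_func, obj_grad, start_point, ascent_direction)

print(alpha)  # None: no step satisfies the strong Wolfe conditions
```

Checking ``alpha is None`` after calling ``line_search`` is a simple way to detect and handle a failed search in optimization loops.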