    1 / m'`. If ``"randomized"`` the initial search position is
    :math:`P_0 = (J + K) / 2`, where :math:`J` is the barycenter and
    :math:`K` is a random doubly stochastic matrix.
shuffle_input : bool (default: False)
    Set to `True` to resolve degenerate gradients randomly. For
    non-degenerate gradients this option has no effect.
maxiter : int, positive (default: 30)
    Integer specifying the max number of Frank-Wolfe iterations performed.
tol : float (default: 0.03)
    Tolerance for termination. Frank-Wolfe iteration terminates when
    :math:`\frac{||P_{i}-P_{i+1}||_F}{\sqrt{m'}} \leq tol`, where
    :math:`i` is the iteration number.

Returns
-------
res : OptimizeResult
    `OptimizeResult` containing the following fields.

    col_ind : 1-D array
        Column indices corresponding to the best permutation found of the
        nodes of `B`.
    fun : float
        The objective value of the solution.
    nit : int
        The number of Frank-Wolfe iterations performed.

Notes
-----
The algorithm may be sensitive to the initial permutation matrix (or
search "position") due to the possibility of several local minima within
the feasible region. A barycenter initialization is more likely to result
in a better solution than a single random initialization. However,
calling ``quadratic_assignment`` several times with different random
initializations may result in a better optimum at the cost of longer
total execution time.

Examples
--------
As mentioned above, a barycenter initialization often results in a better
solution than a single random initialization.
>>> from scipy.optimize import quadratic_assignment
>>> import numpy as np
>>> rng = np.random.default_rng()
>>> n = 15
>>> A = rng.random((n, n))
>>> B = rng.random((n, n))
>>> options = {"rng": rng}
>>> res = quadratic_assignment(A, B, options=options)  # FAQ is default method
>>> print(res.fun)
47.797048706380636  # may vary

>>> options = {"rng": rng, "P0": "randomized"}  # use randomized initialization
>>> res = quadratic_assignment(A, B, options=options)
>>> print(res.fun)
47.37287069769966  # may vary

However, consider running from several randomized initializations and
keeping the best result.

>>> res = min([quadratic_assignment(A, B, options=options)
...            for i in range(30)], key=lambda x: x.fun)
>>> print(res.fun)
46.55974835248574  # may vary

The '2-opt' method can be used to attempt to refine the results.

>>> options = {"partial_guess": np.array([np.arange(n), res.col_ind]).T,
...            "rng": rng}
>>> res = quadratic_assignment(A, B, method="2opt", options=options)
>>> print(res.fun)
46.55974835248574  # may vary
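The objective value reported in ``fun`` can be reproduced directly from
``col_ind``: building the permutation matrix :math:`P` from the returned
column indices and evaluating :math:`\text{trace}(A^T P B P^T)` recovers
the same number. The following standalone sketch (not part of the
official examples; the seed and the index-based shortcut are illustrative
choices) demonstrates this relationship:

```python
import numpy as np
from scipy.optimize import quadratic_assignment

rng = np.random.default_rng(0)
n = 15
A = rng.random((n, n))
B = rng.random((n, n))
res = quadratic_assignment(A, B)

# Build the permutation matrix P from col_ind: row i of P selects
# node col_ind[i] of B, so P[i, col_ind[i]] = 1.
P = np.eye(n)[res.col_ind]

# Evaluate the objective trace(A.T @ P @ B @ P.T) explicitly.
fun_manual = np.trace(A.T @ P @ B @ P.T)

# Equivalent evaluation by fancy indexing, without forming P:
# sum over i, j of A[i, j] * B[col_ind[i], col_ind[j]].
fun_indexed = (A * B[res.col_ind][:, res.col_ind]).sum()

print(np.isclose(fun_manual, res.fun))   # True
print(np.isclose(fun_indexed, res.fun))  # True
```

The index-based form avoids constructing the :math:`n \times n`
permutation matrix and is how the objective is typically evaluated in
practice.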