especially with one-hot encoded categorical features with rare categories.
Be aware that the memory usage of this solver has a quadratic dependency
on `n_features` because it explicitly computes the Hessian matrix.

.. versionadded:: 1.2

max_iter : int, default=100
    The maximal number of iterations for the solver.
    Values must be in the range `[1, inf)`.

tol : float, default=1e-4
    Stopping criterion. For the lbfgs solver,
    the iteration will stop when ``max{|g_j|, j = 1, ..., d} <= tol``
    where ``g_j`` is the j-th component of the gradient (derivative) of
    the objective function. Values must be in the range `(0.0, inf)`.

warm_start : bool, default=False
    If set to ``True``, reuse the solution of the previous call to ``fit``
    as initialization for `coef_` and `intercept_`.

verbose : int, default=0
    For the lbfgs solver set verbose to any positive number for verbosity.
    Values must be in the range `[0, inf)`.

Attributes
----------
coef_ : array of shape (n_features,)
    Estimated coefficients for the linear predictor (`X @ coef_ +
    intercept_`) in the GLM.

intercept_ : float
    Intercept (a.k.a. bias) added to linear predictor.

n_features_in_ : int
    Number of features seen during :term:`fit`.

    .. versionadded:: 0.24

n_iter_ : int
    Actual number of iterations used in the solver.

feature_names_in_ : ndarray of shape (`n_features_in_`,)
    Names of features seen during :term:`fit`. Defined only when `X`
    has feature names that are all strings.

    .. versionadded:: 1.0

See Also
--------
PoissonRegressor : Generalized Linear Model with a Poisson distribution.
TweedieRegressor : Generalized Linear Model with a Tweedie distribution.

Examples
--------
>>> from sklearn import linear_model
>>> clf = linear_model.GammaRegressor()
>>> X = [[1, 2], [2, 3], [3, 4], [4, 3]]
>>> y = [19, 26, 33, 30]
>>> clf.fit(X, y)
GammaRegressor()
>>> clf.score(X, y)
np.float64(0.773...)
>>> clf.coef_
array([0.072..., 0.066...])
>>> clf.intercept_
np.float64(2.896...)
>>> clf.predict([[1, 0], [2, 8]])
array([19.483..., 35.795...])
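The `coef_` and `intercept_` attributes parameterize the linear predictor
`X @ coef_ + intercept_`; because `GammaRegressor` uses a log link, the
predictions returned by `predict` are the elementwise exponential of that
linear predictor. A minimal supplementary sketch (not part of the original
docstring) reusing the fitted `clf` and `X` from the Examples above:

>>> import numpy as np
>>> # linear predictor eta = X @ coef_ + intercept_
>>> eta = np.asarray(X) @ clf.coef_ + clf.intercept_
>>> # with the log link, predict(X) == exp(eta)
>>> np.allclose(clf.predict(X), np.exp(eta))
True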