dual_gap_ : float or ndarray of shape (n_targets,)
    The dual gap at the end of the optimization for the optimal alpha
    (``alpha_``).

n_iter_ : int
    Number of iterations run by the coordinate descent solver to reach
    the specified tolerance for the optimal alpha.

n_features_in_ : int
    Number of features seen during :term:`fit`.

    .. versionadded:: 0.24

feature_names_in_ : ndarray of shape (`n_features_in_`,)
    Names of features seen during :term:`fit`. Defined only when `X`
    has feature names that are all strings.

    .. versionadded:: 1.0

See Also
--------
lars_path : Compute Least Angle Regression or Lasso path using LARS
    algorithm.
lasso_path : Compute Lasso path with coordinate descent.
Lasso : The Lasso is a linear model that estimates sparse coefficients.
LassoLars : Lasso model fit with Least Angle Regression a.k.a. Lars.
LassoCV : Lasso linear model with iterative fitting along a regularization
    path.
LassoLarsCV : Cross-validated Lasso using the LARS algorithm.

Notes
-----
In `fit`, once the best parameter `alpha` is found through
cross-validation, the model is fit again using the entire training set.

To avoid unnecessary memory duplication, the `X` argument of the `fit`
method should be directly passed as a Fortran-contiguous numpy array.

For an example, see
:ref:`examples/linear_model/plot_lasso_model_selection.py
<sphx_glr_auto_examples_linear_model_plot_lasso_model_selection.py>`.

:class:`LassoCV` leads to different results than a hyperparameter
search using :class:`~sklearn.model_selection.GridSearchCV` with a
:class:`Lasso` model. In :class:`LassoCV`, a model for a given
penalty `alpha` is warm started using the coefficients of the
closest model (trained at the previous iteration) on the
regularization path. This tends to speed up the hyperparameter
search.

Examples
--------
>>> from sklearn.linear_model import LassoCV
>>> from sklearn.datasets import make_regression
>>> X, y = make_regression(noise=4, random_state=0)
>>> reg = LassoCV(cv=5, random_state=0).fit(X, y)
>>> reg.score(X, y)
0.9993...
>>> reg.predict(X[:1,])
array([-78.4951...])
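
A minimal sketch of the memory advice in the Notes, reusing the data from
above (``X_f`` and ``reg_f`` are illustrative names, not part of the API):
making a Fortran-ordered copy once up front avoids an extra copy inside
`fit` and leaves the result unchanged.

>>> import numpy as np
>>> X_f = np.asfortranarray(X)  # Fortran-ordered copy made once, up front
>>> reg_f = LassoCV(cv=5, random_state=0).fit(X_f, y)
>>> reg_f.score(X_f, y)
0.9993...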
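
As a sketch of the comparison drawn in the Notes, the equivalent (but not
warm-started) search can be run with
:class:`~sklearn.model_selection.GridSearchCV` over the ``alphas_`` grid
found above; as noted, the selected ``alpha`` may differ slightly.

>>> from sklearn.model_selection import GridSearchCV
>>> from sklearn.linear_model import Lasso
>>> grid = {"alpha": reg.alphas_}  # reuse the path grid chosen by LassoCV
>>> search = GridSearchCV(Lasso(), grid, cv=5)
>>> search = search.fit(X, y)  # assignment suppresses the doctest repr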