``'multiclass'`` or ``'multilabel'``. See the documentation of
:func:`~torchmetrics.functional.classification.binary_precision_recall_curve`,
:func:`~torchmetrics.functional.classification.multiclass_precision_recall_curve` and
:func:`~torchmetrics.functional.classification.multilabel_precision_recall_curve` for the specific
influence of each argument and for examples.

Legacy Example:
    >>> pred = torch.tensor([0, 0.1, 0.8, 0.4])
    >>> target = torch.tensor([0, 1, 1, 0])
    >>> precision, recall, thresholds = precision_recall_curve(pred, target, task='binary')
    >>> precision
    tensor([0.5000, 0.6667, 0.5000, 1.0000, 1.0000])
    >>> recall
    tensor([1.0000, 1.0000, 0.5000, 0.5000, 0.0000])
    >>> thresholds
    tensor([0.0000, 0.1000, 0.4000, 0.8000])

    >>> pred = torch.tensor([[0.75, 0.05, 0.05, 0.05, 0.05],
    ...                      [0.05, 0.75, 0.05, 0.05, 0.05],
    ...                      [0.05, 0.05, 0.75, 0.05, 0.05],
    ...                      [0.05, 0.05, 0.05, 0.75, 0.05]])
    >>> target = torch.tensor([0, 1, 3, 2])
    >>> precision, recall, thresholds = precision_recall_curve(pred, target, task='multiclass', num_classes=5)
    >>> precision
    [tensor([0.2500, 1.0000, 1.0000]), tensor([0.2500, 1.0000, 1.0000]), tensor([0.2500, 0.0000, 1.0000]),
     tensor([0.2500, 0.0000, 1.0000]), tensor([0., 1.])]
    >>> recall
    [tensor([1., 1., 0.]), tensor([1., 1., 0.]), tensor([1., 0., 0.]), tensor([1., 0., 0.]), tensor([nan, 0.])]
    >>> thresholds
    [tensor([0.0500, 0.7500]), tensor([0.0500, 0.7500]), tensor([0.0500, 0.7500]), tensor([0.0500, 0.7500]),
     tensor([0.0500])]
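To make the binary numbers above less mysterious, here is a minimal pure-Python sketch of how a precision-recall curve can be computed: one (precision, recall) pair per unique score threshold, plus the conventional final point at precision 1 and recall 0. This is an illustrative re-derivation, not the library's tensor-based implementation (which also supports binned evaluation via a `thresholds` argument); the helper name `pr_curve_binary` is invented for this sketch.

```python
def pr_curve_binary(preds, target):
    """Sketch: precision-recall pairs at each unique score threshold.

    ``preds`` are scores in [0, 1]; ``target`` holds 0/1 labels.
    A sample is predicted positive at threshold ``t`` if its score >= t.
    """
    thresholds = sorted(set(preds))
    total_pos = sum(target)
    precision, recall = [], []
    for t in thresholds:
        tp = sum(1 for p, y in zip(preds, target) if p >= t and y == 1)
        fp = sum(1 for p, y in zip(preds, target) if p >= t and y == 0)
        # If nothing is predicted positive, precision is defined as 1.
        precision.append(tp / (tp + fp) if tp + fp else 1.0)
        recall.append(tp / total_pos if total_pos else 0.0)
    # Conventional end point: precision 1, recall 0.
    precision.append(1.0)
    recall.append(0.0)
    return precision, recall, thresholds

precision, recall, thresholds = pr_curve_binary(
    [0.0, 0.1, 0.8, 0.4], [0, 1, 1, 0]
)
```

With the inputs from the legacy binary example this reproduces the documented outputs: thresholds `[0.0, 0.1, 0.4, 0.8]`, precision `[0.5, 0.6667, 0.5, 1.0, 1.0]` and recall `[1.0, 1.0, 0.5, 0.5, 0.0]`.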