log_loss

pai4sk.metrics.log_loss(y_true, y_pred, eps=1e-15, normalize=True, sample_weight=None, labels=None)
Log loss, aka logistic loss or cross-entropy loss.
This is the loss function used in (multinomial) logistic regression and extensions of it such as neural networks, defined as the negative log-likelihood of the true labels given a probabilistic classifier’s predictions. The log loss is only defined for two or more labels. For a single sample with true label yt in {0,1} and estimated probability yp that yt = 1, the log loss is
-log P(yt|yp) = -(yt log(yp) + (1 - yt) log(1 - yp))
For the SnapML solver, this supports both local and distributed (MPI) execution.
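The binary formula above can be sketched directly with NumPy. This is an illustration of the definition, not the pai4sk implementation; the function name binary_log_loss is hypothetical, and the clipping mirrors the eps parameter described below:

```python
import numpy as np

def binary_log_loss(y_true, y_prob, eps=1e-15):
    """Sketch of the binary log loss formula (hypothetical helper, not pai4sk's code).

    y_true: array of 0/1 labels; y_prob: estimated probability that y = 1.
    Probabilities are clipped to [eps, 1 - eps] to avoid log(0).
    """
    y_true = np.asarray(y_true, dtype=float)
    p = np.clip(np.asarray(y_prob, dtype=float), eps, 1 - eps)
    # -(yt*log(yp) + (1 - yt)*log(1 - yp)), averaged over the samples
    return float(np.mean(-(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))))

# Same data as the doctest further down, with "spam" as the positive class
print(binary_log_loss([1, 0, 0, 1], [0.9, 0.1, 0.2, 0.65]))  # ≈ 0.21616
```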
Read more in the User Guide.
Parameters:  y_true (array-like or label indicator matrix) – Ground truth (correct) labels for n_samples samples. It also accepts a SnapML data partition, which includes the correct labels.
 y_pred (array-like of float, shape = (n_samples, n_classes) or (n_samples,)) – Predicted probabilities, as returned by a classifier’s predict_proba method. If y_pred.shape = (n_samples,), the probabilities provided are assumed to be those of the positive class. The labels in y_pred are assumed to be ordered alphabetically, as done by preprocessing.LabelBinarizer.
 eps (float) – Log loss is undefined for p=0 or p=1, so probabilities are clipped to max(eps, min(1 - eps, p)).
 normalize (bool, optional (default=True)) – If true, return the mean loss per sample. Otherwise, return the sum of the per-sample losses.
 sample_weight (array-like of shape = [n_samples], optional) – Sample weights.
 labels (array-like, optional (default=None)) – If not provided, labels will be inferred from y_true. If labels is None and y_pred has shape (n_samples,), the labels are assumed to be binary and are inferred from y_true.
New in version 0.18.
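The labels parameter matters when y_true does not contain every class that y_pred has a column for. The sketch below uses scikit-learn’s log_loss, assumed here to share pai4sk’s interface; the columns of y_pred follow the alphabetical ordering of labels, as the y_pred description above specifies:

```python
from sklearn.metrics import log_loss  # stand-in assumed to match pai4sk.metrics.log_loss

# "eggs" never occurs in y_true, so it must be declared explicitly via `labels`.
# Columns of y_pred are ordered alphabetically: P(eggs), P(ham), P(spam).
y_true = ["ham", "spam", "ham"]
y_pred = [[0.1, 0.7, 0.2],
          [0.1, 0.2, 0.7],
          [0.2, 0.6, 0.2]]
loss = log_loss(y_true, y_pred, labels=["eggs", "ham", "spam"])
print(loss)  # mean of -log(0.7), -log(0.7), -log(0.6)
```

Without labels, the call would fail because only two classes can be inferred from y_true while y_pred has three columns.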
Returns: loss
Return type: float
Examples
>>> log_loss(["spam", "ham", "ham", "spam"],  # doctest: +ELLIPSIS
...          [[.1, .9], [.9, .1], [.8, .2], [.35, .65]])
0.21616...
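On the same data, normalize=False returns the summed rather than mean per-sample loss. Shown with scikit-learn’s log_loss, assumed here to share pai4sk’s signature:

```python
from sklearn.metrics import log_loss  # stand-in assumed to match pai4sk.metrics.log_loss

y_true = ["spam", "ham", "ham", "spam"]
y_pred = [[.1, .9], [.9, .1], [.8, .2], [.35, .65]]

mean_loss = log_loss(y_true, y_pred)                    # mean over the 4 samples
total_loss = log_loss(y_true, y_pred, normalize=False)  # sum over the 4 samples
print(mean_loss, total_loss)  # total_loss == 4 * mean_loss
```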
References
C.M. Bishop (2006). Pattern Recognition and Machine Learning. Springer, p. 209.
Notes
The logarithm used is the natural logarithm (base e).