Can the Precision, Recall and F1 be the same value?

Yes, this is possible. Let's assume binary classification with

Pr = TP / (TP + FP); Re = TP / (TP + FN); F1 = 2TP / (2TP + FP + FN)

The trivial solution to Pr = Re = F1 is TP = 0, where all three metrics are zero. So precision, recall and F1 can certainly take the same value in general, although that case presumably does not apply to your specific result. Solving the system of equations gives another solution: FP = FN. If the number of false positives equals the number of false negatives, then Pr = TP / (TP + FP) = TP / (TP + FN) = Re, and F1 = 2TP / (2TP + 2FP) = TP / (TP + FP) = Pr, so all three metrics have identical values.
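As a quick sanity check, here is a minimal sketch using scikit-learn, with made-up labels chosen so that TP = 4 and FP = FN = 2:

```python
from sklearn.metrics import precision_score, recall_score, f1_score

# Hypothetical binary labels constructed so that FP == FN:
# TP = 4, FN = 2, FP = 2, TN = 2
y_true = [1, 1, 1, 1, 1, 1, 0, 0, 0, 0]
y_pred = [1, 1, 1, 1, 0, 0, 1, 1, 0, 0]

print(precision_score(y_true, y_pred))  # 4 / (4 + 2) = 0.666...
print(recall_score(y_true, y_pred))     # 4 / (4 + 2) = 0.666...
print(f1_score(y_true, y_pred))         # 2*4 / (2*4 + 2 + 2) = 0.666...
```

All three calls print the same value, as the FP = FN argument predicts.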

For multiclass classification problems we have

F1 = 2 * (Pr * Re) / (Pr + Re)

Since F1 is the harmonic mean of precision and recall, Pr = Re immediately gives F1 = Pr = Re, so again all three metrics are identical.
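The harmonic mean of two equal numbers is just that number, which a two-line sketch makes obvious:

```python
def f1(pr, re):
    # F1 is the harmonic mean of precision and recall
    return 2 * (pr * re) / (pr + re)

print(f1(0.8, 0.8))  # 0.8 -- whenever Pr == Re, F1 equals that same value
```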


This seems to be because of the option average='weighted'.

Refer: https://scikit-learn.org/stable/modules/generated/sklearn.metrics.precision_recall_fscore_support.html

'weighted': Calculate metrics for each label, and find their average weighted by support (the number of true instances for each label). This alters ‘macro’ to account for label imbalance; it can result in an F-score that is not between precision and recall.
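One situation where all three weighted averages coincide is a symmetric confusion matrix, since FP = FN then holds for every class individually. A hypothetical 3-class sketch (labels invented for illustration):

```python
from sklearn.metrics import precision_recall_fscore_support

# Hypothetical labels whose confusion matrix is symmetric:
# [[5, 1, 0],
#  [1, 4, 1],
#  [0, 1, 5]]
# FP == FN for every class, so per-class precision == recall.
y_true = [0]*6 + [1]*6 + [2]*6
y_pred = [0]*5 + [1] + [0] + [1]*4 + [2] + [1] + [2]*5

pr, re, f1, _ = precision_recall_fscore_support(
    y_true, y_pred, average='weighted'
)
print(pr, re, f1)  # all three are ~0.778
```

With per-class precision equal to per-class recall, the support-weighted averages of the three metrics come out identical as well.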