This repository was archived by the owner on Nov 22, 2022. It is now read-only.

Commit 63e4aba

rutyrinott authored and facebook-github-bot committed
add informative prints to assert statements (#1360)
Summary: Pull Request resolved: #1360. Adds informative messages to the assert statements about metrics, to save the user time when debugging an assertion failure. Reviewed By: m3rlin45. Differential Revision: D21528177. fbshipit-source-id: b1527278877cc5f914a327d64881e40cc9861e96
1 parent 3bba58a commit 63e4aba

File tree

1 file changed: +6 −2 lines


pytext/metric_reporters/classification_metric_reporter.py

Lines changed: 6 additions & 2 deletions
@@ -102,13 +102,17 @@ def from_config_and_label_names(cls, config, label_names: List[str]):
             ComparableClassificationMetric.LABEL_AVG_PRECISION,
             ComparableClassificationMetric.LABEL_ROC_AUC,
         ):
-            assert config.target_label is not None
+            assert (
+                config.target_label is not None
+            ), "target_label must be set for selected metric"
             assert config.target_label in label_names
         if config.model_select_metric in (
             ComparableClassificationMetric.ROC_AUC,
             ComparableClassificationMetric.MCC,
         ):
-            assert len(label_names) == 2
+            assert (
+                len(label_names) == 2
+            ), "selected metric is valid for binary labels only"
 
         return cls(
             label_names,
