mlebench/competitions/jigsaw-toxic-comment-classification-challenge/classes.py:

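# The six toxicity label columns from the Kaggle Jigsaw Toxic Comment
# Classification Challenge. Each comment in the competition's train.csv
# carries a binary value for every class, so this is a multi-label task:
# the classes are not mutually exclusive.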
CLASSES = ["toxic", "severe_toxic", "obscene", "threat", "insult", "identity_hate"]