src/lighteval/models/nanotron/nanotron_model.py (7 lines):
- line 187: # TODO @nouamane: this is only needed for training, can we just mark params as NanotronParameter instead?
- line 256: TODO: Remove these conditionals once HuggingFace supports a way to
- line 323: # TODO: Merge `tok_encode_batch` here.
- line 573: ), # [padding_length - seq] #TODO: padding_token not always 0
- line 584: ), # [padding_length - seq] #TODO: padding_token not always 0
- line 954: False # if we unpad in modeling, it doesn't matter if left or right # TODO: not supported yet
- line 1013: ) # [1, seq, voc] #TODO: this assumes padding on the right

src/lighteval/metrics/harness_compatibility/truthful_qa.py (1 line):
- line 57: # TODO: This completely ignores any normalization, but keeping it as is

src/lighteval/metrics/metrics_sample.py (1 line):
- line 788: TODO: Will have to move this to sacrebleu.
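
Several of the `nanotron_model.py` items concern padding assumptions (a hard-coded pad id of 0 and right-side padding). Below is a minimal sketch of how the pad-id part could be addressed, assuming a Hugging Face-style tokenizer that exposes `pad_token_id`/`eos_token_id`; the helper name and arguments are illustrative, not lighteval's actual code.

```python
# Sketch only: pad with the tokenizer's real pad token instead of assuming 0,
# which is what the "padding_token not always 0" TODOs point at.
import torch


def pad_right(tokens: torch.Tensor, padding_length: int, tokenizer) -> torch.Tensor:
    """Right-pad a 1D tensor of token ids up to `padding_length`."""
    pad_id = tokenizer.pad_token_id
    if pad_id is None:  # fall back to EOS when no dedicated pad token exists
        pad_id = tokenizer.eos_token_id
    padding = torch.full(
        (padding_length - tokens.shape[0],),  # [padding_length - seq]
        pad_id,
        dtype=tokens.dtype,
    )
    return torch.cat([tokens, padding], dim=0)
```

The `metrics_sample.py` item suggests delegating BLEU to sacrebleu. A hedged sketch of what that could look like with sacrebleu's public `corpus_bleu` API (the example strings are made up):

```python
import sacrebleu

hypotheses = ["the cat sat on the mat"]
references = [["the cat is sitting on the mat"]]  # one reference corpus, aligned with hypotheses

score = sacrebleu.corpus_bleu(hypotheses, references).score
print(f"BLEU: {score:.2f}")
```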