pytorch-transformers/pytorch_transformers/modeling_xlnet.py (2 lines):
- line 750: qlen: TODO Lysandre didn't fill
- line 751: mlen: TODO Lysandre didn't fill

XLM/src/trainer.py (2 lines):
- line 455: pred_mask[0] = 0  # TODO: remove
- line 557: if False:  # AMP checkpoint reloading is buggy, we cannot do that - TODO: fix - https://github.com/NVIDIA/apex/issues/250

pytorch-transformers/pytorch_transformers/tokenization_xlm.py (1 line):
- line 699: # TODO: make sure we are using `xlm-mlm-enro-1024`, since XLM-100 doesn't have this step

XLM/src/model/transformer.py (1 line):
- line 417: # TODO: add extra layer norm here?

pytorch-transformers/pytorch_transformers/modeling_transfo_xl.py (1 line):
- line 1357: # TODO: This is not implemented

XLM/src/model/__init__.py (1 line):
- line 143: encoder = TransformerModel(params, dico, is_encoder=True, with_output=False)  # TODO: only output when necessary - len(params.clm_steps + params.mlm_steps) > 0