Summary: 35 instances, 27 unique

| Text | Count |
| --- | --- |
| `# TODO: Move this logic inside` | 1 |
| `# TODO: Remove after transformers package upgrade to 2.5` | 3 |
| `# TODO: Log a warning here` | 1 |
| `# TODO: Fix later when we move to proper standardized features` | 1 |
| `# TODO: Deprecate this method and move configuration updates directly to processors` | 4 |
| `# TODO: Update` | 2 |
| `# TODO: Later test if this can be removed` | 1 |
| `# TODO remove OCR order vectors; they are not needed` | 1 |
| `# TODO: Do clean implementation without Sequential` | 1 |
| `# TODO: Later support multihead` | 2 |
| `# TODO: To remove technical debt, a possible solution is to use` | 1 |
| `# TODO: we want to make the padding_idx==0, however, with custom initilization,` | 1 |
| `# TODO: Deprecate support for this` | 1 |
| `# TODO: Update after implementing decoder` | 1 |
| `# TODO: Update kwargs with defaults` | 1 |
| `# TODO possibly replace it with another sample list` | 1 |
| `# TODO: Make the data update generic for any type of model` | 1 |
| `# TODO: Clean up VizWiz IMDB from copy tokens` | 1 |
| `# TODO : Switch to AutoModelForPreTraining after transformers` | 1 |
| `# TODO: Fix question model retrieval` | 1 |
| `# TODO: there may still be some instability in the exponent calculation.` | 2 |
| `# NOTE, TODO: Code duplication w.r.t to STVQA, revisit` | 1 |
| `# TODO: Once omegaconf fixes int keys issue, bring this back` | 1 |
| `# TODO: Later upload visual dialog features as well` | 1 |
| `# TODO: Add Help flag here describing MMF Configuration` | 1 |
| `# TODO: Move this logic inside` | 1 |
| `TODO: Update on docs sprint` | 1 |