Summary: 41 instances, 36 unique

| Text | Count |
| --- | ---: |
| `# TODO: Update to also compute m2c_align_mat and m2t_align_mat` | 1 |
| `# TODO (PVP) still a bit hacky here - there might be a better solution` | 2 |
| `# TODO: sum the probability of all occurrences` | 1 |
| `# TODO fix` | 1 |
| `if column.type in ["number", "time"]: # TODO fine-grained date` | 1 |
| `# TODO: Fix this` | 1 |
| `# TODO (PVP): this should later be handled by the forward fn() in each model in the future see PR 3140` | 2 |
| `# TODO batching` | 2 |
| `if "name" in kwargs: del kwargs["name"] #TODO: fix this` | 1 |
| `# TODO: Write 'train', 'val', 'test' somewhere else` | 1 |
| `# TODO initialize using xavier` | 1 |
| `# TODO: words should match memory` | 3 |
| `# TODO: Mask other probabilities first?` | 1 |
| `# TODO: from_cond should be true from non-bert model` | 1 |
| `# TODO batching` | 1 |
| `# TODO: Record that this constructor is a sequence fragment?` | 1 |
| `assert factorize_sketch == 2 #TODO support other grammars` | 1 |
| `# TODO: Support multiple layers` | 1 |
| `#TODO: should get types from the data` | 1 |
| `# TODO: In Torch 1.3, discontinue use of torch.jit.Attribute so that` | 1 |
| `# TODO: Figure out how to get right sizes (query, key) to module` | 1 |
| `# TODO: the dropout mask should be stored in the state instead?` | 1 |
| `# TODO: not nice` | 1 |
| `# TODO: better heuristic e.g., tables that have exact match` | 1 |
| `# TODO: Support batch_first` | 1 |
| `## TODO: Don't hardcode train` | 1 |
| `TODO:` | 1 |
| `# TODO: support slices` | 1 |
| `# TODO: Get rid of surrounding quotes` | 1 |
| `# TODO: smarter mapping.` | 1 |
| `# TODO initialize using xavier` | 1 |
| `# TODO: Figure out how to get right sizes (query, value) to module` | 1 |
| `# TODO: This should be automatically inferred from encoder` | 1 |
| `#TODO: from_cond should be true from non-bert model` | 1 |
| `# TODO: make it posible to deal with "singleton?"` | 1 |
| `# TODO: Check that things don't blow up when some lengths are 0` | 1 |
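
A tally like the one above can be reproduced with a short script. The sketch below is only an assumption about how such a summary might be generated, not the tool that actually produced it: it treats every stripped line containing `TODO` as one instance, counts duplicates with a `Counter`, and prints the instance and unique totals. The `collect_todos` helper name and the `*.py` glob are hypothetical choices for a Python repository.

```python
import sys
from collections import Counter
from pathlib import Path

def collect_todos(root="."):
    """Count stripped lines containing 'TODO' across all .py files under root.

    Hypothetical helper; the real summary tool may match differently.
    """
    counts = Counter()
    for path in Path(root).rglob("*.py"):
        try:
            text = path.read_text(encoding="utf-8")
        except (UnicodeDecodeError, OSError):
            continue  # skip unreadable or non-UTF-8 files
        for line in text.splitlines():
            if "TODO" in line:
                counts[line.strip()] += 1
    return counts

if __name__ == "__main__":
    counts = collect_todos(sys.argv[1] if len(sys.argv) > 1 else ".")
    total = sum(counts.values())
    print(f"Summary: {total} instances, {len(counts)} unique\n")
    print(f"{'Count':>5}  Text")
    for text, n in counts.most_common():
        print(f"{n:>5}  {text}")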