duplicated block id: 1 size: 25 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (74:99)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (244:269)
duplicated block id: 2 size: 24 cleaned lines of code in 2 files:
- stubs/torch/nn/functional.pyi (15:39)
- stubs/torch/nn/functional/__init__.pyi (15:39)
duplicated block id: 3 size: 23 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (128:151)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (301:324)
duplicated block id: 4 size: 22 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (242:263)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (421:442)
duplicated block id: 5 size: 22 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (159:180)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (332:353)
duplicated block id: 6 size: 20 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (74:93)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (423:442)
duplicated block id: 7 size: 20 cleaned lines of code in 2 files:
- xformers/components/attention/feature_maps/softmax.py (180:211)
- xformers/components/attention/feature_maps/softmax.py (240:271)
duplicated block id: 8 size: 19 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (215:234)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (394:413)
duplicated block id: 9 size: 14 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (612:625)
- xformers/components/attention/csrc/cuda/spmm.cu (628:641)
duplicated block id: 10 size: 14 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (595:608)
- xformers/components/attention/csrc/cuda/spmm.cu (684:697)
duplicated block id: 11 size: 14 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (612:625)
- xformers/components/attention/csrc/cuda/spmm.cu (645:658)
duplicated block id: 12 size: 14 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (595:608)
- xformers/components/attention/csrc/cuda/spmm.cu (665:678)
duplicated block id: 13 size: 14 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sparse_softmax.cu (4:18)
- xformers/components/attention/csrc/cuda/spmm.cu (3:17)
duplicated block id: 14 size: 14 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (665:678)
- xformers/components/attention/csrc/cuda/spmm.cu (684:697)
duplicated block id: 15 size: 14 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (579:592)
- xformers/components/attention/csrc/cuda/spmm.cu (665:678)
duplicated block id: 16 size: 14 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (628:641)
- xformers/components/attention/csrc/cuda/spmm.cu (665:678)
duplicated block id: 17 size: 14 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (579:592)
- xformers/components/attention/csrc/cuda/spmm.cu (684:697)
duplicated block id: 18 size: 14 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (628:641)
- xformers/components/attention/csrc/cuda/spmm.cu (684:697)
duplicated block id: 19 size: 14 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (44:57)
- xformers/components/attention/csrc/cuda/spmm.cu (476:489)
duplicated block id: 20 size: 14 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (645:658)
- xformers/components/attention/csrc/cuda/spmm.cu (665:678)
duplicated block id: 21 size: 14 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (595:608)
- xformers/components/attention/csrc/cuda/spmm.cu (612:625)
duplicated block id: 22 size: 14 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (548:561)
- xformers/components/attention/csrc/cuda/spmm.cu (719:732)
duplicated block id: 23 size: 14 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (701:714)
- xformers/components/attention/csrc/cuda/spmm.cu (718:731)
duplicated block id: 24 size: 14 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (595:608)
- xformers/components/attention/csrc/cuda/spmm.cu (628:641)
duplicated block id: 25 size: 14 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (595:608)
- xformers/components/attention/csrc/cuda/spmm.cu (645:658)
duplicated block id: 26 size: 14 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (645:658)
- xformers/components/attention/csrc/cuda/spmm.cu (684:697)
duplicated block id: 27 size: 14 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (579:592)
- xformers/components/attention/csrc/cuda/spmm.cu (595:608)
duplicated block id: 28 size: 14 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (579:592)
- xformers/components/attention/csrc/cuda/spmm.cu (612:625)
duplicated block id: 29 size: 14 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (579:592)
- xformers/components/attention/csrc/cuda/spmm.cu (628:641)
duplicated block id: 30 size: 14 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (612:625)
- xformers/components/attention/csrc/cuda/spmm.cu (665:678)
duplicated block id: 31 size: 14 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (579:592)
- xformers/components/attention/csrc/cuda/spmm.cu (645:658)
duplicated block id: 32 size: 14 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (612:625)
- xformers/components/attention/csrc/cuda/spmm.cu (684:697)
duplicated block id: 33 size: 14 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (628:641)
- xformers/components/attention/csrc/cuda/spmm.cu (645:658)
duplicated block id: 34 size: 13 cleaned lines of code in 2 files:
- experimental/ragged_inference/triton_v2_matmul.py (17:31)
- experimental/ragged_inference/triton_v2_qk_dotprod.py (16:30)
duplicated block id: 35 size: 13 cleaned lines of code in 2 files:
- xformers/benchmarks/benchmark_core.py (64:76)
- xformers/benchmarks/benchmark_sddmm.py (68:80)
duplicated block id: 36 size: 13 cleaned lines of code in 2 files:
- xformers/components/attention/compositional.py (126:138)
- xformers/components/multi_head_dispatch.py (92:104)
duplicated block id: 37 size: 13 cleaned lines of code in 2 files:
- xformers/components/attention/local.py (84:99)
- xformers/components/attention/random.py (77:92)
duplicated block id: 38 size: 13 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (548:560)
- xformers/components/attention/csrc/cuda/spmm.cu (702:714)
duplicated block id: 39 size: 13 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm.cu (3:15)
- xformers/components/attention/csrc/cuda/spmm.cu (3:15)
duplicated block id: 40 size: 13 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm.cu (3:15)
- xformers/components/attention/csrc/cuda/sparse_softmax.cu (4:16)
duplicated block id: 41 size: 13 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (336:348)
- xformers/components/attention/csrc/cuda/spmm.cu (366:378)
duplicated block id: 42 size: 12 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm.cu (292:303)
- xformers/components/attention/csrc/cuda/sddmm.cu (307:318)
duplicated block id: 43 size: 12 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm.cu (263:274)
- xformers/components/attention/csrc/cuda/sddmm.cu (277:288)
duplicated block id: 44 size: 12 cleaned lines of code in 2 files:
- experimental/ragged_inference/seq_kv_cache.py (80:95)
- experimental/ragged_inference/seq_kv_cache.py (114:129)
duplicated block id: 45 size: 12 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm.cu (207:218)
- xformers/components/attention/csrc/cuda/sddmm.cu (246:257)
duplicated block id: 46 size: 11 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (1209:1219)
- stubs/torch/__init__.pyi (1775:1785)
duplicated block id: 47 size: 11 cleaned lines of code in 2 files:
- xformers/components/attention/feature_maps/softmax.py (140:163)
- xformers/components/attention/feature_maps/softmax.py (197:220)
duplicated block id: 48 size: 11 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (501:511)
- xformers/components/attention/csrc/cuda/spmm.cu (514:524)
duplicated block id: 49 size: 10 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (14:23)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (188:197)
duplicated block id: 50 size: 10 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (283:294)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (373:384)
duplicated block id: 51 size: 10 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (14:23)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (101:110)
duplicated block id: 52 size: 10 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (1231:1240)
- stubs/torch/__init__.pyi (1797:1806)
duplicated block id: 53 size: 10 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (26:37)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (113:124)
duplicated block id: 54 size: 10 cleaned lines of code in 2 files:
- xformers/triton/k_dropout.py (108:119)
- xformers/triton/k_dropout.py (199:210)
duplicated block id: 55 size: 10 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (101:110)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (188:197)
duplicated block id: 56 size: 10 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sparse_softmax.cu (36:45)
- xformers/components/attention/csrc/cuda/sparse_softmax.cu (182:191)
duplicated block id: 57 size: 10 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sparse_softmax.cu (264:274)
- xformers/components/attention/csrc/cuda/spmm.cu (807:817)
duplicated block id: 58 size: 10 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (271:280)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (361:370)
duplicated block id: 59 size: 10 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (113:124)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (200:211)
duplicated block id: 60 size: 10 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (1198:1207)
- stubs/torch/__init__.pyi (1764:1773)
duplicated block id: 61 size: 10 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (26:37)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (200:211)
duplicated block id: 62 size: 9 cleaned lines of code in 2 files:
- xformers/components/attention/ortho.py (75:83)
- xformers/components/attention/random.py (84:92)
duplicated block id: 63 size: 9 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (1232:1240)
- stubs/torch/__init__.pyi (1777:1785)
duplicated block id: 64 size: 9 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (2061:2069)
- stubs/torch/__init__.pyi (2318:2326)
duplicated block id: 65 size: 9 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (324:332)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (413:421)
duplicated block id: 66 size: 9 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (1200:1208)
- stubs/torch/__init__.pyi (1222:1230)
duplicated block id: 67 size: 9 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (1766:1774)
- stubs/torch/__init__.pyi (1788:1796)
duplicated block id: 68 size: 9 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (64:72)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (234:242)
duplicated block id: 69 size: 9 cleaned lines of code in 2 files:
- xformers/components/attention/local.py (91:99)
- xformers/components/attention/ortho.py (75:83)
duplicated block id: 70 size: 9 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (477:485)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (489:497)
duplicated block id: 71 size: 9 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (1777:1785)
- stubs/torch/__init__.pyi (1798:1806)
duplicated block id: 72 size: 9 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (1211:1219)
- stubs/torch/__init__.pyi (1798:1806)
duplicated block id: 73 size: 9 cleaned lines of code in 2 files:
- xformers/factory/block_factory.py (178:189)
- xformers/factory/block_factory.py (237:247)
duplicated block id: 74 size: 9 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (2341:2349)
- stubs/torch/__init__.pyi (2352:2360)
duplicated block id: 75 size: 9 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (151:159)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (234:242)
duplicated block id: 76 size: 9 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (530:538)
- xformers/components/attention/csrc/cuda/spmm.cu (736:744)
duplicated block id: 77 size: 9 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (520:528)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (532:540)
duplicated block id: 78 size: 9 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (64:72)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (151:159)
duplicated block id: 79 size: 9 cleaned lines of code in 2 files:
- xformers/components/attention/ortho.py (179:189)
- xformers/components/attention/ortho.py (214:224)
duplicated block id: 80 size: 9 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (1221:1229)
- stubs/torch/__init__.pyi (1787:1795)
duplicated block id: 81 size: 9 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (1211:1219)
- stubs/torch/__init__.pyi (1232:1240)
duplicated block id: 82 size: 9 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (465:473)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (489:497)
duplicated block id: 83 size: 9 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (465:473)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (477:485)
duplicated block id: 84 size: 9 cleaned lines of code in 2 files:
- experimental/ragged_inference/triton_v2_matmul.py (34:46)
- experimental/ragged_inference/triton_v2_qk_dotprod.py (33:45)
duplicated block id: 85 size: 9 cleaned lines of code in 2 files:
- xformers/components/residual.py (56:66)
- xformers/components/residual.py (75:85)
duplicated block id: 86 size: 8 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (1248:1255)
- stubs/torch/__init__.pyi (1493:1500)
duplicated block id: 87 size: 8 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (1222:1229)
- stubs/torch/__init__.pyi (1766:1773)
duplicated block id: 88 size: 8 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (178:186)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (261:269)
duplicated block id: 89 size: 8 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (1259:1266)
- stubs/torch/__init__.pyi (1505:1512)
duplicated block id: 90 size: 8 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sparse_softmax.cu (139:146)
- xformers/components/attention/csrc/cuda/spmm.cu (788:795)
duplicated block id: 91 size: 8 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (296:304)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (387:397)
duplicated block id: 92 size: 8 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (197:206)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (370:379)
duplicated block id: 93 size: 8 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (91:99)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (178:186)
duplicated block id: 94 size: 8 cleaned lines of code in 2 files:
- xformers/benchmarks/LRA/code/model_wrapper.py (161:169)
- xformers/benchmarks/LRA/code/model_wrapper.py (202:210)
duplicated block id: 95 size: 8 cleaned lines of code in 2 files:
- xformers/components/attention/global_tokens.py (14:22)
- xformers/components/attention/random.py (14:22)
duplicated block id: 96 size: 8 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm.cu (389:398)
- xformers/components/attention/csrc/cuda/spmm.cu (839:848)
duplicated block id: 97 size: 8 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cpu/sparse_softmax.cpp (152:159)
- xformers/components/attention/csrc/cpu/spmm.cpp (58:65)
duplicated block id: 98 size: 8 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (1200:1207)
- stubs/torch/__init__.pyi (1788:1795)
duplicated block id: 99 size: 8 cleaned lines of code in 2 files:
- xformers/components/attention/feature_maps/softmax.py (30:37)
- xformers/components/attention/feature_maps/softmax.py (180:187)
duplicated block id: 100 size: 8 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (110:119)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (280:289)
duplicated block id: 101 size: 8 cleaned lines of code in 2 files:
- xformers/components/attention/feature_maps/softmax.py (30:37)
- xformers/components/attention/feature_maps/softmax.py (240:247)
duplicated block id: 102 size: 8 cleaned lines of code in 2 files:
- experimental/ragged_inference/seq_kv_cache.py (98:106)
- experimental/ragged_inference/seq_kv_cache.py (124:132)
duplicated block id: 103 size: 8 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (1813:1820)
- stubs/torch/__init__.pyi (1826:1833)
duplicated block id: 104 size: 8 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sparse_softmax.cu (139:146)
- xformers/components/attention/csrc/cuda/sparse_softmax.cu (245:252)
duplicated block id: 105 size: 8 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cpu/sparse_softmax.cpp (94:101)
- xformers/components/attention/csrc/cpu/sparse_softmax.cpp (152:159)
duplicated block id: 106 size: 8 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cpu/sparse_softmax.cpp (103:112)
- xformers/components/attention/csrc/cpu/sparse_softmax.cpp (162:171)
duplicated block id: 107 size: 8 cleaned lines of code in 2 files:
- experimental/ragged_inference/triton_v2_matmul.py (109:116)
- experimental/ragged_inference/triton_v2_qk_dotprod.py (108:115)
duplicated block id: 108 size: 8 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sparse_softmax.cu (245:252)
- xformers/components/attention/csrc/cuda/spmm.cu (788:795)
duplicated block id: 109 size: 8 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cpu/sparse_softmax.cpp (94:101)
- xformers/components/attention/csrc/cpu/spmm.cpp (58:65)
duplicated block id: 110 size: 8 cleaned lines of code in 2 files:
- xformers/triton/k_dropout.py (67:84)
- xformers/triton/k_dropout.py (152:170)
duplicated block id: 111 size: 7 cleaned lines of code in 2 files:
- xformers/benchmarks/benchmark_core.py (175:181)
- xformers/benchmarks/benchmark_core.py (224:230)
duplicated block id: 112 size: 7 cleaned lines of code in 2 files:
- xformers/benchmarks/LRA/setup/retrieval.py (30:37)
- xformers/benchmarks/LRA/setup/text.py (28:35)
duplicated block id: 113 size: 7 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (1685:1691)
- stubs/torch/__init__.pyi (1739:1745)
duplicated block id: 114 size: 7 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (504:510)
- xformers/components/attention/csrc/cuda/spmm.cu (706:712)
duplicated block id: 115 size: 7 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (504:510)
- xformers/components/attention/csrc/cuda/spmm.cu (723:729)
duplicated block id: 116 size: 7 cleaned lines of code in 2 files:
- xformers/components/attention/core.py (85:93)
- xformers/ops.py (21:29)
duplicated block id: 117 size: 7 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (517:523)
- xformers/components/attention/csrc/cuda/spmm.cu (706:712)
duplicated block id: 118 size: 7 cleaned lines of code in 2 files:
- xformers/benchmarks/benchmark_core.py (157:164)
- xformers/benchmarks/benchmark_core.py (204:211)
duplicated block id: 119 size: 7 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (2064:2070)
- stubs/torch/__init__.pyi (2073:2079)
duplicated block id: 120 size: 7 cleaned lines of code in 2 files:
- xformers/components/attention/feature_maps/softmax.py (218:229)
- xformers/components/attention/feature_maps/softmax.py (278:289)
duplicated block id: 121 size: 7 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (517:523)
- xformers/components/attention/csrc/cuda/spmm.cu (723:729)
duplicated block id: 122 size: 7 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (504:510)
- xformers/components/attention/csrc/cuda/spmm.cu (552:558)
duplicated block id: 123 size: 7 cleaned lines of code in 2 files:
- xformers/benchmarks/LRA/setup/listops.py (29:35)
- xformers/benchmarks/LRA/setup/text.py (30:36)
duplicated block id: 124 size: 7 cleaned lines of code in 2 files:
- xformers/factory/block_factory.py (293:300)
- xformers/factory/block_factory.py (370:377)
duplicated block id: 125 size: 7 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (1838:1844)
- stubs/torch/__init__.pyi (1851:1857)
duplicated block id: 126 size: 7 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (517:523)
- xformers/components/attention/csrc/cuda/spmm.cu (552:558)
duplicated block id: 127 size: 7 cleaned lines of code in 2 files:
- xformers/factory/block_factory.py (193:199)
- xformers/factory/block_factory.py (252:258)
duplicated block id: 128 size: 7 cleaned lines of code in 2 files:
- xformers/benchmarks/benchmark_encoder.py (336:342)
- xformers/benchmarks/benchmark_vit_timm.py (320:326)
duplicated block id: 129 size: 7 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (859:865)
- stubs/torch/__init__.pyi (867:873)
duplicated block id: 130 size: 7 cleaned lines of code in 2 files:
- xformers/benchmarks/benchmark_core.py (204:211)
- xformers/benchmarks/benchmark_nystrom_utils.py (19:26)
duplicated block id: 131 size: 7 cleaned lines of code in 2 files:
- xformers/benchmarks/benchmark_core.py (168:174)
- xformers/benchmarks/benchmark_core.py (187:193)
duplicated block id: 132 size: 7 cleaned lines of code in 2 files:
- xformers/benchmarks/benchmark_nystrom_utils.py (31:37)
- xformers/benchmarks/benchmark_nystrom_utils.py (48:54)
duplicated block id: 133 size: 7 cleaned lines of code in 2 files:
- experimental/ragged_inference/triton_v2_qk_dotprod.py (108:114)
- experimental/ragged_inference/triton_v2_ragged_qk_dotprod.py (43:49)
duplicated block id: 134 size: 7 cleaned lines of code in 2 files:
- xformers/benchmarks/benchmark_triton_dropout.py (40:48)
- xformers/benchmarks/benchmark_triton_layernorm.py (31:39)
duplicated block id: 135 size: 7 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (351:357)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (440:446)
duplicated block id: 136 size: 7 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (1153:1159)
- stubs/torch/__init__.pyi (1165:1171)
duplicated block id: 137 size: 7 cleaned lines of code in 2 files:
- xformers/triton/k_softmax.py (22:28)
- xformers/triton/k_softmax.py (108:114)
duplicated block id: 138 size: 7 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (2082:2088)
- stubs/torch/__init__.pyi (2091:2097)
duplicated block id: 139 size: 7 cleaned lines of code in 2 files:
- experimental/ragged_inference/triton_v2_matmul.py (109:115)
- experimental/ragged_inference/triton_v2_ragged_qk_dotprod.py (43:49)
duplicated block id: 140 size: 7 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (494:500)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (537:543)
duplicated block id: 141 size: 7 cleaned lines of code in 2 files:
- stubs/numpy/__init__.pyi (18:28)
- stubs/torch/nn/__init__.pyi (14:26)
duplicated block id: 142 size: 7 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (2257:2263)
- stubs/torch/__init__.pyi (2267:2273)
duplicated block id: 143 size: 7 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (2080:2086)
- stubs/torch/__init__.pyi (2339:2345)
duplicated block id: 144 size: 7 cleaned lines of code in 2 files:
- xformers/components/feedforward/fused_mlp.py (36:42)
- xformers/components/feedforward/mlp.py (25:31)
duplicated block id: 145 size: 7 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (1176:1182)
- stubs/torch/__init__.pyi (1188:1194)
duplicated block id: 146 size: 7 cleaned lines of code in 2 files:
- xformers/benchmarks/LRA/run_tasks.py (274:282)
- xformers/benchmarks/LRA/run_tasks.py (284:292)
duplicated block id: 147 size: 7 cleaned lines of code in 2 files:
- xformers/benchmarks/benchmark_core.py (157:164)
- xformers/benchmarks/benchmark_nystrom_utils.py (19:26)
duplicated block id: 148 size: 7 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (450:456)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (504:510)
duplicated block id: 149 size: 7 cleaned lines of code in 2 files:
- experimental/ragged_inference/triton_v2_matmul.py (88:96)
- experimental/ragged_inference/triton_v2_qk_dotprod.py (87:95)
duplicated block id: 150 size: 7 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (1824:1830)
- stubs/torch/__init__.pyi (1849:1855)
duplicated block id: 151 size: 6 cleaned lines of code in 2 files:
- xformers/benchmarks/benchmark_core.py (147:153)
- xformers/benchmarks/benchmark_core.py (246:252)
duplicated block id: 152 size: 6 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (1308:1313)
- stubs/torch/__init__.pyi (1319:1324)
duplicated block id: 153 size: 6 cleaned lines of code in 2 files:
- xformers/components/attention/global_tokens.py (83:88)
- xformers/components/attention/random.py (84:89)
duplicated block id: 154 size: 6 cleaned lines of code in 2 files:
- xformers/components/attention/global_tokens.py (83:88)
- xformers/components/attention/ortho.py (75:80)
duplicated block id: 155 size: 6 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm.cu (42:47)
- xformers/components/attention/csrc/cuda/sddmm.cu (199:204)
duplicated block id: 156 size: 6 cleaned lines of code in 2 files:
- xformers/benchmarks/benchmark_triton_stride_sum.py (29:35)
- xformers/benchmarks/utils.py (93:99)
duplicated block id: 157 size: 6 cleaned lines of code in 2 files:
- xformers/components/attention/compositional.py (162:167)
- xformers/components/attention/compositional.py (169:174)
duplicated block id: 158 size: 6 cleaned lines of code in 2 files:
- xformers/components/attention/blocksparse.py (121:126)
- xformers/components/attention/ortho.py (75:80)
duplicated block id: 159 size: 6 cleaned lines of code in 2 files:
- xformers/components/attention/blocksparse.py (121:126)
- xformers/components/attention/local.py (91:96)
duplicated block id: 160 size: 6 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (26:32)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (373:379)
duplicated block id: 161 size: 6 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (561:567)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (591:597)
duplicated block id: 162 size: 6 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (2071:2076)
- stubs/torch/__init__.pyi (2328:2333)
duplicated block id: 163 size: 6 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm.cu (53:58)
- xformers/components/attention/csrc/cuda/sddmm.cu (211:216)
duplicated block id: 164 size: 6 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (1274:1279)
- stubs/torch/__init__.pyi (1285:1290)
duplicated block id: 165 size: 6 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm.cu (32:40)
- xformers/components/attention/csrc/cuda/sparse_softmax.cu (23:31)
duplicated block id: 166 size: 6 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (48:53)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (135:140)
duplicated block id: 167 size: 6 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (200:206)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (283:289)
duplicated block id: 168 size: 6 cleaned lines of code in 2 files:
- experimental/ragged_inference/triton_v2_qk_dotprod.py (97:106)
- experimental/ragged_inference/triton_v2_ragged_qk_dotprod.py (33:41)
duplicated block id: 169 size: 6 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (113:119)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (373:379)
duplicated block id: 170 size: 6 cleaned lines of code in 2 files:
- experimental/ragged_inference/triton_v2_ragged_qk_dotprod.py (282:288)
- experimental/ragged_inference/triton_v2_ragged_qk_dotprod.py (337:343)
duplicated block id: 171 size: 6 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (17:22)
- stubs/torch/nn/__init__.pyi (14:19)
duplicated block id: 172 size: 6 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sparse_softmax.cu (100:105)
- xformers/components/attention/csrc/cuda/sparse_softmax.cu (282:287)
duplicated block id: 173 size: 6 cleaned lines of code in 2 files:
- xformers/components/attention/blocksparse.py (121:126)
- xformers/components/attention/random.py (84:89)
duplicated block id: 174 size: 6 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (2073:2078)
- stubs/torch/__init__.pyi (2321:2326)
duplicated block id: 175 size: 6 cleaned lines of code in 2 files:
- xformers/triton/k_softmax.py (40:57)
- xformers/triton/k_softmax.py (125:139)
duplicated block id: 176 size: 6 cleaned lines of code in 2 files:
- xformers/benchmarks/benchmark_core.py (194:200)
- xformers/benchmarks/benchmark_core.py (246:252)
duplicated block id: 177 size: 6 cleaned lines of code in 2 files:
- stubs/torch/linalg/__init__.pyi (38:46)
- stubs/torch/linalg/__init__.pyi (50:55)
duplicated block id: 178 size: 6 cleaned lines of code in 2 files:
- xformers/triton/k_dropout.py (58:63)
- xformers/triton/k_dropout.py (142:147)
duplicated block id: 179 size: 6 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (1812:1817)
- stubs/torch/__init__.pyi (1837:1842)
duplicated block id: 180 size: 6 cleaned lines of code in 2 files:
- xformers/benchmarks/benchmark_core.py (147:153)
- xformers/benchmarks/benchmark_core.py (194:200)
duplicated block id: 181 size: 6 cleaned lines of code in 2 files:
- xformers/benchmarks/LRA/setup/listops.py (29:34)
- xformers/benchmarks/LRA/setup/retrieval.py (32:37)
duplicated block id: 182 size: 6 cleaned lines of code in 2 files:
- xformers/factory/block_factory.py (168:174)
- xformers/factory/block_factory.py (223:230)
duplicated block id: 183 size: 6 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (26:32)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (283:289)
duplicated block id: 184 size: 6 cleaned lines of code in 2 files:
- xformers/components/attention/random.py (48:66)
- xformers/components/attention/scaled_dot_product.py (46:52)
duplicated block id: 185 size: 6 cleaned lines of code in 2 files:
- xformers/components/attention/global_tokens.py (83:88)
- xformers/components/attention/local.py (91:96)
duplicated block id: 186 size: 6 cleaned lines of code in 2 files:
- xformers/factory/block_factory.py (279:286)
- xformers/factory/block_factory.py (355:362)
duplicated block id: 187 size: 6 cleaned lines of code in 2 files:
- experimental/ragged_inference/seq_kv_cache.py (90:95)
- experimental/ragged_inference/seq_kv_cache.py (98:103)
duplicated block id: 188 size: 6 cleaned lines of code in 2 files:
- xformers/benchmarks/LRA/setup/listops.py (37:42)
- xformers/benchmarks/LRA/setup/text.py (38:43)
duplicated block id: 189 size: 6 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (910:915)
- stubs/torch/__init__.pyi (939:944)
duplicated block id: 190 size: 6 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (1097:1102)
- stubs/torch/__init__.pyi (1982:1987)
duplicated block id: 191 size: 6 cleaned lines of code in 2 files:
- experimental/ragged_inference/triton_v2_matmul.py (98:107)
- experimental/ragged_inference/triton_v2_ragged_qk_dotprod.py (33:41)
duplicated block id: 192 size: 6 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (48:53)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (308:313)
duplicated block id: 193 size: 6 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cpu/sddmm.cpp (92:101)
- xformers/components/attention/csrc/cpu/spmm.cpp (94:104)
duplicated block id: 194 size: 6 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (328:333)
- xformers/components/attention/csrc/cuda/spmm.cu (482:487)
duplicated block id: 195 size: 6 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/spmm.cu (50:55)
- xformers/components/attention/csrc/cuda/spmm.cu (328:333)
duplicated block id: 196 size: 6 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sparse_softmax.cu (255:261)
- xformers/components/attention/csrc/cuda/spmm.cu (798:804)
duplicated block id: 197 size: 6 cleaned lines of code in 2 files:
- experimental/ragged_inference/triton_v2_matmul.py (98:107)
- experimental/ragged_inference/triton_v2_qk_dotprod.py (97:106)
duplicated block id: 198 size: 6 cleaned lines of code in 2 files:
- xformers/triton/k_layer_norm.py (284:289)
- xformers/triton/k_layer_norm.py (293:298)
duplicated block id: 199 size: 6 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (1083:1088)
- stubs/torch/__init__.pyi (1971:1976)
duplicated block id: 200 size: 6 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (903:908)
- stubs/torch/__init__.pyi (932:937)
duplicated block id: 201 size: 6 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (482:487)
- xformers/components/attention/csrc/cuda/sddmm2_cuda.cu (525:530)
duplicated block id: 202 size: 6 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (768:773)
- stubs/torch/__init__.pyi (2019:2024)
duplicated block id: 203 size: 6 cleaned lines of code in 2 files:
- xformers/components/attention/blocksparse.py (121:126)
- xformers/components/attention/global_tokens.py (83:88)
duplicated block id: 204 size: 6 cleaned lines of code in 2 files:
- xformers/components/attention/csrc/cuda/sddmm.cu (53:58)
- xformers/components/attention/csrc/cuda/sddmm.cu (250:255)
duplicated block id: 205 size: 6 cleaned lines of code in 2 files:
- stubs/numpy/__init__.pyi (18:23)
- stubs/torch/__init__.pyi (17:22)
duplicated block id: 206 size: 6 cleaned lines of code in 2 files:
- stubs/torch/__init__.pyi (2090:2095)
- stubs/torch/__init__.pyi (2351:2356)
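The report does not say how it should be consumed, but every entry follows the same three-line pattern (id and size, then the two locations), so it is easy to post-process. The snippet below is only a minimal sketch of such post-processing, not part of the tooling that produced the report: it parses the entries and counts how many duplicated blocks touch each file. The duplication_report.txt file name and the duplication_counts helper are assumptions for illustration.

```python
import re
from collections import Counter

# Matches one report entry:
#   duplicated block id: 1 size: 25 cleaned lines of code in 2 files:
#   - path/to/file_a (74:99)
#   - path/to/file_b (244:269)
ENTRY_RE = re.compile(
    r"duplicated block id: (\d+) size: (\d+) cleaned lines of code in 2 files:\n"
    r"- (\S+) \((\d+):(\d+)\)\n"
    r"- (\S+) \((\d+):(\d+)\)"
)


def duplication_counts(report_text: str) -> Counter:
    """Count how many duplicated blocks involve each file."""
    counts: Counter = Counter()
    for match in ENTRY_RE.finditer(report_text):
        groups = match.groups()
        counts[groups[2]] += 1  # first file in the pair
        counts[groups[5]] += 1  # second file in the pair
    return counts


if __name__ == "__main__":
    # Assumes the report text above has been saved to this (hypothetical) file.
    with open("duplication_report.txt") as fh:
        for path, n in duplication_counts(fh.read()).most_common(10):
            print(f"{n:3d}  {path}")
```

Run against this report, such a tally would surface the hot spots (for example, xformers/components/attention/csrc/cuda/sddmm2_cuda.cu and spmm.cu appear in a large share of the entries), which is usually where a refactor pays off first.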