huggingface / hf-nix
Duplication

Duplicates are places in the code where 6 or more consecutive lines are exactly the same.
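The metric above can be sketched with a small detector: index one file's lines, then extend every shared line into a maximal run of identical consecutive lines and keep runs of 6 or more. This is a minimal illustration of the idea, not the tool's actual implementation:

```python
from collections import defaultdict

MIN_RUN = 6  # only runs of 6+ identical consecutive lines count as duplicates


def duplicate_runs(a, b, min_run=MIN_RUN):
    """Return (start_a, start_b, length) tuples for maximal runs of
    identical consecutive lines shared by the line lists `a` and `b`."""
    # Index every line of `b` by its content.
    positions = defaultdict(list)
    for j, line in enumerate(b):
        positions[line].append(j)

    runs = []
    seen = set()  # (i, j) pairs already covered by a processed run
    for i, line in enumerate(a):
        for j in positions.get(line, []):
            if (i, j) in seen:
                continue  # inside a run we already extended
            k = 0
            while i + k < len(a) and j + k < len(b) and a[i + k] == b[j + k]:
                seen.add((i + k, j + k))
                k += 1
            if k >= min_run:
                runs.append((i, j, k))
    return runs
```

For example, two files sharing the same six-line block at different offsets would yield a single run reporting both start positions and the length 6.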

Duplication Overall
system: 49% (1,951 lines)
Duplication per Extension
nix: 51% (1,951 lines)
Duplication per Component (primary)
pkgs: 53% (1,951 lines)
ROOT: 0% (0 lines)
Longest Duplicates
The 50 longest duplicates, out of 190 total, are listed below.
Size     File and duplicated line range (share of the file's code lines)
119 x 2  pkgs/python-modules/torch_2_6  358:494 (17%)
         pkgs/python-modules/torch_2_7  371:507 (17%)
 70 x 2  pkgs/python-modules/torch_2_6  568:651 (10%)
         pkgs/python-modules/torch_2_7  583:666 (10%)
 52 x 2  pkgs/python-modules/torch_2_6  226:280 (7%)
         pkgs/python-modules/torch_2_7  232:286 (7%)
 44 x 2  pkgs/python-modules/torch_2_6  668:718 (6%)
         pkgs/python-modules/torch_2_7  683:733 (6%)
 36 x 2  pkgs/python-modules/torch_2_6  1:36 (5%)
         pkgs/python-modules/torch_2_7  1:36 (5%)
 36 x 2  pkgs/python-modules/torch_2_6  496:533 (5%)
         pkgs/python-modules/torch_2_7  512:549 (5%)
 32 x 2  pkgs/python-modules/torch_2_6  85:124 (4%)
         pkgs/python-modules/torch_2_7  85:124 (4%)
 31 x 2  pkgs/python-modules/torch_2_6  51:83 (4%)
         pkgs/python-modules/torch_2_7  51:83 (4%)
 31 x 2  pkgs/python-modules/torch_2_6  534:564 (4%)
         pkgs/python-modules/torch_2_7  551:581 (4%)
 30 x 2  pkgs/python-modules/flash-attn-rotary  31:67 (47%)
         pkgs/python-modules/flash-attn  29:65 (48%)
 28 x 2  pkgs/magma  12:39 (40%)
         pkgs/magma  43:70 (40%)
 28 x 2  pkgs/python-modules/attention-kernels  29:60 (40%)
         pkgs/python-modules/marlin-kernels  40:71 (36%)
 28 x 2  pkgs/python-modules/torch_2_6  301:330 (4%)
         pkgs/python-modules/torch_2_7  309:338 (4%)
 25 x 2  pkgs/python-modules/flash-attn-layer-norm  38:68 (39%)
         pkgs/python-modules/flash-attn-rotary  37:67 (39%)
 25 x 2  pkgs/python-modules/flash-attn-layer-norm  38:68 (39%)
         pkgs/python-modules/flash-attn  35:65 (40%)
 24 x 2  pkgs/python-modules/attention-kernels  27:55 (34%)
         pkgs/python-modules/moe-kernels  27:55 (36%)
 24 x 2  pkgs/python-modules/torch_2_6  167:193 (3%)
         pkgs/python-modules/torch_2_7  175:201 (3%)
 23 x 2  pkgs/python-modules/torch_2_6  720:743 (3%)
         pkgs/python-modules/torch_2_7  735:758 (3%)
 23 x 2  pkgs/python-modules/marlin-kernels  40:66 (29%)
         pkgs/python-modules/moe-kernels  29:55 (34%)
 20 x 2  pkgs/python-modules/flash-attn-v1  27:50 (32%)
         pkgs/python-modules/flash-attn  27:50 (32%)
 19 x 2  pkgs/python-modules/flash-attn-rotary  31:52 (30%)
         pkgs/python-modules/flash-attn-v1  29:50 (30%)
 18 x 2  pkgs/python-modules/torch_2_6  150:167 (2%)
         pkgs/python-modules/torch_2_7  148:165 (2%)
 18 x 2  pkgs/python-modules/torch_2_6  131:148 (2%)
         pkgs/python-modules/torch_2_7  138:155 (2%)
 17 x 2  pkgs/python-modules/awq-inference-engine  30:49 (25%)
         pkgs/python-modules/mamba-ssm  30:49 (25%)
 17 x 2  pkgs/python-modules/attention-kernels  1:18 (24%)
         pkgs/python-modules/moe-kernels  1:18 (25%)
 16 x 2  pkgs/python-modules/awq-inference-engine  1:17 (23%)
         pkgs/python-modules/flash-attn-rotary  1:17 (25%)
 16 x 2  pkgs/python-modules/flash-attn-rotary  1:17 (25%)
         pkgs/python-modules/flashinfer  1:17 (26%)
 16 x 2  pkgs/python-modules/causal-conv1d  1:17 (24%)
         pkgs/python-modules/flash-attn-layer-norm  1:17 (25%)
 16 x 2  pkgs/python-modules/causal-conv1d  1:17 (24%)
         pkgs/python-modules/flash-attn-rotary  1:17 (25%)
 16 x 2  pkgs/python-modules/awq-inference-engine  1:17 (23%)
         pkgs/python-modules/flash-attn-layer-norm  1:17 (25%)
 16 x 2  pkgs/python-modules/flash-attn-layer-norm  1:17 (25%)
         pkgs/python-modules/flash-attn  1:17 (25%)
 16 x 2  pkgs/python-modules/flash-attn-rotary  1:17 (25%)
         pkgs/python-modules/flash-attn-v1  1:17 (25%)
 16 x 2  pkgs/python-modules/flash-attn  1:17 (25%)
         pkgs/python-modules/flashinfer  1:17 (26%)
 16 x 2  pkgs/python-modules/causal-conv1d  1:17 (24%)
         pkgs/python-modules/flash-attn-v1  1:17 (25%)
 16 x 2  pkgs/python-modules/awq-inference-engine  1:17 (23%)
         pkgs/python-modules/flash-attn  1:17 (25%)
 16 x 2  pkgs/python-modules/flash-attn-v1  1:17 (25%)
         pkgs/python-modules/flashinfer  1:17 (26%)
 16 x 2  pkgs/python-modules/causal-conv1d  1:17 (24%)
         pkgs/python-modules/flashinfer  1:17 (26%)
 16 x 2  pkgs/python-modules/flash-attn-v1  1:17 (25%)
         pkgs/python-modules/flash-attn  1:17 (25%)
 16 x 2  pkgs/python-modules/causal-conv1d  1:17 (24%)
         pkgs/python-modules/flash-attn  1:17 (25%)
 16 x 2  pkgs/python-modules/awq-inference-engine  1:17 (23%)
         pkgs/python-modules/causal-conv1d  1:17 (24%)
 16 x 2  pkgs/python-modules/awq-inference-engine  1:17 (23%)
         pkgs/python-modules/flashinfer  1:17 (26%)
 16 x 2  pkgs/python-modules/flash-attn-rotary  1:17 (25%)
         pkgs/python-modules/flash-attn  1:17 (25%)
 16 x 2  pkgs/python-modules/awq-inference-engine  1:17 (23%)
         pkgs/python-modules/flash-attn-v1  1:17 (25%)
 16 x 2  pkgs/python-modules/flash-attn-layer-norm  1:17 (25%)
         pkgs/python-modules/flash-attn-v1  1:17 (25%)
 16 x 2  pkgs/python-modules/flash-attn-layer-norm  1:17 (25%)
         pkgs/python-modules/flashinfer  1:17 (26%)
 16 x 2  pkgs/python-modules/flash-attn-layer-norm  1:17 (25%)
         pkgs/python-modules/flash-attn-rotary  1:17 (25%)
 15 x 2  pkgs/python-modules/torch_2_6  198:213 (2%)
         pkgs/python-modules/torch_2_7  203:218 (2%)
 14 x 2  pkgs/python-modules/awq-inference-engine  1:14 (20%)
         pkgs/python-modules/exllamav2  1:14 (20%)
 14 x 2  pkgs/python-modules/exllamav2  1:14 (20%)
         pkgs/python-modules/flash-attn-rotary  1:14 (22%)
 14 x 2  pkgs/python-modules/exllamav2  1:14 (20%)
         pkgs/python-modules/flash-attn-layer-norm  1:14 (21%)
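The percentages in the table are each duplicate's size as a share of the containing file's code lines. As a rough sanity check of the notation (the file size below is an inferred, hypothetical figure, not stated in the report), a 119-line duplicate reported at 17% implies a file of roughly 119 / 0.17 ≈ 700 code lines:

```python
# Hypothetical sanity check of the table's percentage column.
duplicate_lines = 119
file_code_lines = 700  # assumed file size consistent with the 17% figure
share = round(100 * duplicate_lines / file_code_lines)
print(f"{share}%")  # -> 17%
```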