facebookresearch / xformers
Duplication

Places in code with 6 or more lines that are exactly the same.

Intro
  • For duplication, we look at places in code where there are 6 or more lines of code that are exactly the same.
  • Before duplication is calculated, the code is cleaned to remove empty lines, comments, and frequently duplicated constructs such as imports.
  • You should aim for as little duplicated code as possible (<5%), as a high level of duplication can lead to maintenance difficulties, poor factoring, and logical contradictions.
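The detection rule above can be sketched as a small Python script. This is a simplified illustration, not the tool's actual implementation; the cleaning heuristics (which comment and import forms are stripped) are assumptions:

```python
import re

def clean(lines):
    """Drop empty lines, comments, and import statements before comparing."""
    cleaned = []
    for line in lines:
        stripped = line.strip()
        if not stripped:
            continue
        if stripped.startswith("#") or stripped.startswith("//"):
            continue
        if re.match(r"^(import|from)\b", stripped):
            continue
        cleaned.append(stripped)
    return cleaned

def find_duplicates(lines, window=6):
    """Return (i, j) index pairs where `window` consecutive cleaned lines
    match exactly (j is a later occurrence of the block first seen at i)."""
    cleaned = clean(lines)
    seen = {}
    hits = []
    for i in range(len(cleaned) - window + 1):
        key = tuple(cleaned[i:i + window])
        if key in seen:
            hits.append((seen[key], i))
        else:
            seen[key] = i
    return hits
```

A real analyzer would also normalize whitespace and map hits back to original file positions; this sketch only shows the sliding-window comparison on cleaned lines.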
Duplication Overall
  • 15% duplication:
    • 14,155 cleaned lines of code (after removing empty lines, comments, and frequently duplicated constructs such as imports)
    • 2,217 duplicated lines
  • 206 duplicates
system: 15% (2,217 lines)
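As a quick sanity check, the overall figure follows from the two counts above, assuming the percentage is the duplicated-to-cleaned line ratio truncated to a whole number:

```python
duplicated = 2217
cleaned = 14155

# 2217 / 14155 ≈ 15.66%; integer division truncates to the reported 15%.
pct = duplicated * 100 // cleaned
print(pct)  # 15
```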
Duplication per Extension
cu: 45% (941 lines)
py: 8% (766 lines)
pyi: 16% (458 lines)
cpp: 11% (52 lines)
Duplication per Component (primary)
xformers/components: 23% (1,240 lines)
stubs/torch: 16% (451 lines)
xformers/benchmarks: 7% (211 lines)
experimental/ragged_inference: 19% (145 lines)
xformers/triton: 7% (86 lines)
xformers/factory: 15% (70 lines)
xformers: 6% (7 lines)
stubs/numpy: 7% (7 lines)
stubs/fvcore: 0% (0 lines)
stubs: 0% (0 lines)
stubs/triton: 0% (0 lines)
stubs/sklearn: 0% (0 lines)
stubs/matplotlib: 0% (0 lines)
stubs/recommonmark: 0% (0 lines)
packaging: 0% (0 lines)
experimental: 0% (0 lines)
ROOT: 0% (0 lines)
xformers/sparse: 0% (0 lines)
xformers/helpers: 0% (0 lines)
Longest Duplicates
The list of 20 longest duplicates.
See data for all 206 duplicates...
Entry format: size x count | extension, followed by each duplicated location (file path, start:end lines, % of file).

25 x 2 | cu
  xformers/components/attention/csrc/cuda/sddmm2_cuda.cu  74:99 (4%)
  xformers/components/attention/csrc/cuda/sddmm2_cuda.cu  244:269 (4%)

24 x 2 | pyi
  stubs/torch/nn/functional.pyi  15:39 (100%)
  stubs/torch/nn/functional/__init__.pyi  15:39 (100%)

23 x 2 | cu
  xformers/components/attention/csrc/cuda/sddmm2_cuda.cu  128:151 (3%)
  xformers/components/attention/csrc/cuda/sddmm2_cuda.cu  301:324 (3%)

22 x 2 | cu
  xformers/components/attention/csrc/cuda/sddmm2_cuda.cu  242:263 (3%)
  xformers/components/attention/csrc/cuda/sddmm2_cuda.cu  421:442 (3%)

22 x 2 | cu
  xformers/components/attention/csrc/cuda/sddmm2_cuda.cu  159:180 (3%)
  xformers/components/attention/csrc/cuda/sddmm2_cuda.cu  332:353 (3%)

20 x 2 | cu
  xformers/components/attention/csrc/cuda/sddmm2_cuda.cu  74:93 (3%)
  xformers/components/attention/csrc/cuda/sddmm2_cuda.cu  423:442 (3%)

20 x 2 | py
  xformers/components/attention/feature_maps/softmax.py  180:211 (14%)
  xformers/components/attention/feature_maps/softmax.py  240:271 (14%)

19 x 2 | cu
  xformers/components/attention/csrc/cuda/sddmm2_cuda.cu  215:234 (3%)
  xformers/components/attention/csrc/cuda/sddmm2_cuda.cu  394:413 (3%)

14 x 2 | cu
  xformers/components/attention/csrc/cuda/spmm.cu  612:625 (1%)
  xformers/components/attention/csrc/cuda/spmm.cu  628:641 (1%)

14 x 2 | cu
  xformers/components/attention/csrc/cuda/spmm.cu  595:608 (1%)
  xformers/components/attention/csrc/cuda/spmm.cu  684:697 (1%)

14 x 2 | cu
  xformers/components/attention/csrc/cuda/spmm.cu  612:625 (1%)
  xformers/components/attention/csrc/cuda/spmm.cu  645:658 (1%)

14 x 2 | cu
  xformers/components/attention/csrc/cuda/spmm.cu  595:608 (1%)
  xformers/components/attention/csrc/cuda/spmm.cu  665:678 (1%)

14 x 2 | cu
  xformers/components/attention/csrc/cuda/sparse_softmax.cu  4:18 (5%)
  xformers/components/attention/csrc/cuda/spmm.cu  3:17 (1%)

14 x 2 | cu
  xformers/components/attention/csrc/cuda/spmm.cu  665:678 (1%)
  xformers/components/attention/csrc/cuda/spmm.cu  684:697 (1%)

14 x 2 | cu
  xformers/components/attention/csrc/cuda/spmm.cu  579:592 (1%)
  xformers/components/attention/csrc/cuda/spmm.cu  665:678 (1%)

14 x 2 | cu
  xformers/components/attention/csrc/cuda/spmm.cu  628:641 (1%)
  xformers/components/attention/csrc/cuda/spmm.cu  665:678 (1%)

14 x 2 | cu
  xformers/components/attention/csrc/cuda/spmm.cu  579:592 (1%)
  xformers/components/attention/csrc/cuda/spmm.cu  684:697 (1%)

14 x 2 | cu
  xformers/components/attention/csrc/cuda/spmm.cu  628:641 (1%)
  xformers/components/attention/csrc/cuda/spmm.cu  684:697 (1%)

14 x 2 | cu
  xformers/components/attention/csrc/cuda/spmm.cu  44:57 (1%)
  xformers/components/attention/csrc/cuda/spmm.cu  476:489 (1%)

14 x 2 | cu
  xformers/components/attention/csrc/cuda/spmm.cu  645:658 (1%)
  xformers/components/attention/csrc/cuda/spmm.cu  665:678 (1%)
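Pairs like the two softmax.py ranges above are typically removed by extracting the repeated lines into a shared helper. A hypothetical sketch of the pattern, with illustrative names not taken from xformers:

```python
import math

def _normalize(scores, scale):
    """Shared, numerically stable normalization (hypothetical helper).

    Subtracting the max before exponentiating avoids overflow; both
    callers previously repeated these lines inline.
    """
    m = max(scores)
    exps = [math.exp(scale * (s - m)) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# After extraction, each former duplicate becomes a one-line call site.
def softmax_feature_map(scores):
    return _normalize(scores, scale=1.0)

def scaled_feature_map(scores, temperature):
    return _normalize(scores, scale=1.0 / temperature)
```

For the repeated CUDA kernel bodies in sddmm2_cuda.cu and spmm.cu, the analogous fix is usually a templated device function (or a shared `__device__` helper) parameterized over the parts that differ between kernels.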
Duplicated Units
The list of top 9 duplicated units.
See data for all 9 unit duplicates...
Entry format: size x count | extension, followed by each duplicated location (file path; unit line ranges were not reported, shown as 0:0).

9 x 2 | pyi
  stubs/torch/__init__.pyi  0:0
  stubs/torch/__init__.pyi  0:0

8 x 2 | pyi
  stubs/torch/__init__.pyi  0:0
  stubs/torch/__init__.pyi  0:0

8 x 2 | pyi
  stubs/torch/__init__.pyi  0:0
  stubs/torch/__init__.pyi  0:0

7 x 2 | pyi
  stubs/torch/__init__.pyi  0:0
  stubs/torch/__init__.pyi  0:0

7 x 2 | pyi
  stubs/torch/__init__.pyi  0:0
  stubs/torch/__init__.pyi  0:0

7 x 3 | py
  xformers/components/attention/ortho.py  0:0
  xformers/components/attention/local.py  0:0
  xformers/components/attention/random.py  0:0

15 x 2 | py
  xformers/components/attention/feature_maps/softmax.py  0:0
  xformers/components/attention/feature_maps/softmax.py  0:0

6 x 2 | py
  xformers/components/attention/feature_maps/softmax.py  0:0
  xformers/components/attention/feature_maps/softmax.py  0:0

6 x 2 | py
  xformers/components/residual.py  0:0
  xformers/components/residual.py  0:0