facebookresearch / xformers
File Size

The distribution of file sizes, measured in lines of code.

Intro
  • File size measurements show how lines of code are distributed across the files of the project.
  • Files are classified into five categories based on their size in lines of code: 1-100 (very small files), 101-200 (small files), 201-500 (medium size files), 501-1000 (long files), and 1001+ (very long files); a classification sketch follows this list.
  • It is good practice to keep files small. Long files can become "bloaters": code that has grown to such gargantuan proportions that it is hard to work with.
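A minimal sketch of this bucketing rule in Python (the thresholds are taken from the list above; the function name is our own, not part of the analysis tool):

    def size_category(loc: int) -> str:
        """Map a file's lines of code to the report's size buckets."""
        if loc <= 100:
            return "1-100 (very small)"
        if loc <= 200:
            return "101-200 (small)"
        if loc <= 500:
            return "201-500 (medium)"
        if loc <= 1000:
            return "501-1000 (long)"
        return "1001+ (very long)"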
File Size Overall
  • There are 157 files with 14,905 lines of code.
    • 1 very long file (2,344 lines of code)
    • 2 long files (1,351 lines of code)
    • 12 medium size files (3,530 lines of code)
    • 25 small files (3,519 lines of code)
    • 117 very small files (4,161 lines of code)
Share of lines of code by category (1001+ | 501-1000 | 201-500 | 101-200 | 1-100):
15% | 9% | 23% | 23% | 27%
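These shares can be reproduced from per-file line counts. A sketch, assuming a hypothetical `file_locs` mapping from file path to lines of code and building on the `size_category` helper above:

    from collections import Counter

    def loc_share_by_category(file_locs: dict[str, int]) -> dict[str, float]:
        """Sum lines of code per size bucket and convert each sum to a share."""
        totals: Counter = Counter()
        for loc in file_locs.values():
            totals[size_category(loc)] += loc
        grand_total = sum(totals.values())
        return {category: 100.0 * count / grand_total for category, count in totals.items()}

For example, the 1001+ bucket holds 2,344 of the 14,905 lines, which rounds to the 15% shown above.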


File Size per Extension
Extension: 1001+ | 501-1000 | 201-500 | 101-200 | 1-100
pyi:  80% |  0% |  8% |  0% |  11%
cu:    0% | 65% | 30% |  0% |   4%
py:    0% |  0% | 29% | 35% |  35%
cpp:   0% |  0% |  0% | 32% |  67%
h:     0% |  0% |  0% | 87% |  12%
bash:  0% |  0% |  0% |  0% | 100%
cfg:   0% |  0% |  0% |  0% | 100%
yml:   0% |  0% |  0% |  0% | 100%
in:    0% |  0% |  0% |  0% | 100%
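A rough sketch of how such a per-extension breakdown can be computed; it counts non-blank lines as a proxy for lines of code (a real analysis tool will also exclude comments):

    from collections import Counter
    from pathlib import Path

    def loc_by_extension(repo_root: str) -> dict[str, Counter]:
        """Group lines of code by file extension and size bucket."""
        stats: dict[str, Counter] = {}
        for path in Path(repo_root).rglob("*"):
            if not path.is_file() or not path.suffix:
                continue
            text = path.read_text(errors="ignore")
            loc = sum(1 for line in text.splitlines() if line.strip())
            bucket = size_category(loc)
            stats.setdefault(path.suffix.lstrip("."), Counter())[bucket] += loc
        return stats

Swapping the grouping key from path.suffix to the leading path components (e.g. the first one or two entries of path.parts) yields the per-logical-decomposition view below.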
File Size per Logical Decomposition (primary)
Component: 1001+ | 501-1000 | 201-500 | 101-200 | 1-100
stubs/torch: 83% | 0% | 8% | 0% | 7%
xformers/components: 0% | 23% | 14% | 29% | 31%
xformers/benchmarks: 0% | 0% | 44% | 22% | 33%
xformers/sparse: 0% | 0% | 65% | 21% | 12%
xformers/factory: 0% | 0% | 62% | 31% | 5%
experimental/ragged_inference: 0% | 0% | 32% | 54% | 12%
xformers/triton: 0% | 0% | 18% | 33% | 48%
ROOT: 0% | 0% | 0% | 94% | 5%
xformers: 0% | 0% | 0% | 0% | 100%
stubs/numpy: 0% | 0% | 0% | 0% | 100%
packaging: 0% | 0% | 0% | 0% | 100%
xformers/helpers: 0% | 0% | 0% | 0% | 100%
experimental: 0% | 0% | 0% | 0% | 100%
stubs: 0% | 0% | 0% | 0% | 100%
stubs/triton: 0% | 0% | 0% | 0% | 100%
stubs/fvcore: 0% | 0% | 0% | 0% | 100%
stubs/sklearn: 0% | 0% | 0% | 0% | 100%
stubs/matplotlib: 0% | 0% | 0% | 0% | 100%
stubs/recommonmark: 0% | 0% | 0% | 0% | 100%
Longest Files (Top 50)
File | Location | # lines | # units
__init__.pyi | stubs/torch | 2344 | 419
spmm.cu | xformers/components/attention/csrc/cuda | 775 | -
sddmm2_cuda.cu | xformers/components/attention/csrc/cuda | 576 | -
run_tasks.py | xformers/benchmarks/LRA | 467 | 10
csr_tensor.py | xformers/sparse | 365 | 23
sddmm.cu | xformers/components/attention/csrc/cuda | 351 | -
benchmark_encoder.py | xformers/benchmarks | 314 | 7
benchmark_vit_timm.py | xformers/benchmarks | 311 | 13
block_factory.py | xformers/factory | 294 | 18
sparse_softmax.cu | xformers/components/attention/csrc/cuda | 268 | -
triton_v2_ragged_qk_dotprod.py | experimental/ragged_inference | 252 | 7
__init__.pyi | stubs/torch/nn | 238 | 50
k_layer_norm.py | xformers/triton | 225 | 10
benchmark_core.py | xformers/benchmarks | 225 | 4
compositional.py | xformers/components/attention | 220 | 4
ortho.py | xformers/components/attention | 199 | 6
nystrom.py | xformers/components/attention | 194 | 4
benchmark_pytorch_transformer.py | xformers/benchmarks | 183 | 9
sparse_softmax.cpp | xformers/components/attention/csrc/cpu | 168 | 4
triton_v2_qk_dotprod.py | experimental/ragged_inference | 159 | 6
core.py | xformers/components/attention | 154 | 11
model_factory.py | xformers/factory | 149 | 6
dropout.py | xformers/triton | 148 | 5
attention_patterns.py | xformers/components/attention | 148 | 21
model_wrapper.py | xformers/benchmarks/LRA/code | 147 | 12
softmax.py | xformers/components/attention/feature_maps | 146 | 11
triton_v2_matmul.py | experimental/ragged_inference | 144 | 6
k_dropout.py | xformers/triton | 141 | 4
multi_head_dispatch.py | xformers/components | 141 | 6
in_proj_container.py | xformers/components | 131 | 5
blocksparse.py | xformers/components/attention | 129 | 3
computeUtil.h | xformers/components/attention/csrc | 122 | 14
_csr_ops.py | xformers/sparse | 121 | 9
seq_kv_cache.py | experimental/ragged_inference | 119 | 14
run_grid_search.py | xformers/benchmarks/LRA | 116 | 2
softmax.py | xformers/triton | 114 | 5
setup.py | root | 113 | 4
favor.py | xformers/components/attention | 113 | 4
run_with_submitit.py | xformers/benchmarks/LRA | 110 | 8
benchmark_triton_fused_linear.py | xformers/benchmarks | 110 | 3
reversible.py | xformers/components | 100 | 10
__init__.pyi | stubs/numpy | 99 | 22
k_fused_matmul_fw.py | xformers/triton | 98 | 3
k_softmax.py | xformers/triton | 98 | 3
benchmark_triton_blocksparse.py | xformers/benchmarks | 96 | 1
garbage_pad_ragged_acts.py | experimental/ragged_inference | 93 | 11
spmm.cpp | xformers/components/attention/csrc/cpu | 90 | 2
_sputnik_sparse.py | xformers/components/attention | 88 | 23
fused_linear_layer.py | xformers/triton | 87 | 5
sddmm.cpp | xformers/components/attention/csrc/cpu | 87 | 2
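Producing such a ranking from per-file counts is a one-liner; a sketch reusing the hypothetical `file_locs` mapping from above:

    def longest_files(file_locs: dict[str, int], top: int = 50) -> list[tuple[str, int]]:
        """Rank files by lines of code, largest first."""
        return sorted(file_locs.items(), key=lambda item: item[1], reverse=True)[:top]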
Files With Most Units (Top 20)
File | Location | # lines | # units
__init__.pyi | stubs/torch | 2344 | 419
__init__.pyi | stubs/torch/nn | 238 | 50
_sputnik_sparse.py | xformers/components/attention | 88 | 23
csr_tensor.py | xformers/sparse | 365 | 23
__init__.pyi | stubs/numpy | 99 | 22
attention_patterns.py | xformers/components/attention | 148 | 21
block_factory.py | xformers/factory | 294 | 18
seq_kv_cache.py | experimental/ragged_inference | 119 | 14
computeUtil.h | xformers/components/attention/csrc | 122 | 14
attention_mask.py | xformers/components/attention | 84 | 14
benchmark_vit_timm.py | xformers/benchmarks | 311 | 13
k_activations.py | xformers/triton | 72 | 12
model_wrapper.py | xformers/benchmarks/LRA/code | 147 | 12
garbage_pad_ragged_acts.py | experimental/ragged_inference | 93 | 11
core.py | xformers/components/attention | 154 | 11
softmax.py | xformers/components/attention/feature_maps | 146 | 11
k_layer_norm.py | xformers/triton | 225 | 10
reversible.py | xformers/components | 100 | 10
run_tasks.py | xformers/benchmarks/LRA | 467 | 10
utils.py | xformers/sparse | 70 | 10
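The "# units" column counts units of code such as functions and methods. For the Python files, a rough approximation using the standard library's ast module (the analysis tool's exact definition of a unit may differ):

    import ast

    def count_units(py_source: str) -> int:
        """Count function and method definitions as a proxy for '# units'."""
        tree = ast.parse(py_source)
        return sum(
            isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef))
            for node in ast.walk(tree)
        )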
Files With Long Lines (Top 4)

There are 4 files with lines longer than 120 characters. In total, there are 6 long lines.

File | Location | # lines | # units | # long lines
sddmm2_cuda.cu | xformers/components/attention/csrc/cuda | 576 | - | 2
sparse_softmax.cpp | xformers/components/attention/csrc | 8 | - | 2
sddmm.cpp | xformers/components/attention/csrc | 6 | - | 1
spmm.cpp | xformers/components/attention/csrc | 6 | - | 1
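A check like the one behind this table is easy to script. A minimal sketch using the 120-character threshold from this report:

    def long_lines(path: str, limit: int = 120) -> list[int]:
        """Return the 1-based numbers of lines longer than `limit` characters."""
        with open(path, errors="ignore") as handle:
            return [
                number
                for number, line in enumerate(handle, start=1)
                if len(line.rstrip("\n")) > limit
            ]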