pytorch / extension-cpp
Unit Size

The distribution of unit sizes, measured in lines of code.

Intro
  • Unit size measurements show the distribution of the sizes of units of code (methods, functions, ...).
  • Units are classified into five categories based on their size in lines of code: 1-10 (very small units), 11-20 (small units), 21-50 (medium-size units), 51-100 (long units), and 101+ (very long units).
  • Aim to keep units small (under 20 lines). Long units tend to become "bloaters": code that has grown to such gargantuan proportions that it is hard to work with. A sketch of how units can be measured against these categories follows below.
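For illustration, the sketch below shows one way per-unit line counts like these could be gathered for the Python sources in this repository. It is a minimal sketch using Python's standard ast module, not the tool that generated this report, and its raw counts (which include blank and comment lines inside a unit) may differ slightly from the figures shown here.

    # Illustrative sketch only -- not the analysis tool behind this report.
    # Counts the source lines of every function/method in a Python file and
    # assigns each unit to one of the size categories listed above.
    import ast
    import sys

    BUCKETS = [(10, "1-10"), (20, "11-20"), (50, "21-50"), (100, "51-100")]

    def bucket(lines):
        for upper, label in BUCKETS:
            if lines <= upper:
                return label
        return "101+"

    def unit_sizes(path):
        tree = ast.parse(open(path).read())
        for node in ast.walk(tree):
            if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
                size = node.end_lineno - node.lineno + 1  # requires Python 3.8+
                yield node.name, size, bucket(size)

    if __name__ == "__main__":
        for name, size, label in unit_sizes(sys.argv[1]):
            print(f"{name}: {size} lines ({label})")

Run against a file such as python/lltm_baseline.py, it prints one line per unit; the C++ and CUDA units counted in this report would need a separate parser.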
Unit Size Overall
  • There are 33 units, containing 285 lines of code in total (43.3% of all code):
    • 0 very long units (0 lines of code)
    • 0 long units (0 lines of code)
    • 4 medium size units (106 lines of code)
    • 6 small units (77 lines of code)
    • 23 very small units (102 lines of code)
Distribution of lines of code by unit size category (derived from the counts above; a cross-check follows below):
  101+: 0% | 51-100: 0% | 21-50: 37% | 11-20: 27% | 1-10: 35%
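As a quick cross-check, the percentages in the bar above follow from the line counts in the overall summary; the report appears to truncate to whole percentages rather than round. A minimal sketch, assuming those counts:

    # 285 lines of code in units, split across the five size buckets above.
    loc_by_bucket = {"101+": 0, "51-100": 0, "21-50": 106, "11-20": 77, "1-10": 102}
    total = sum(loc_by_bucket.values())           # 285
    for label, loc in loc_by_bucket.items():
        print(f"{label}: {100 * loc // total}%")  # 0%, 0%, 37%, 27%, 35%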
Unit Size per Extension
  Extension | 101+ | 51-100 | 21-50 | 11-20 | 1-10
  cpp       |   0% |     0% |   76% |   12% |  11%
  py        |   0% |     0% |   13% |   35% |  50%
Unit Size per Logical Component
primary logical decomposition
  Component | 101+ | 51-100 | 21-50 | 11-20 | 1-10
  cpp       |   0% |     0% |   59% |    0% |  40%
  cuda      |   0% |     0% |   43% |   19% |  37%
  python    |   0% |     0% |   29% |   27% |  42%
  ROOT      |   0% |     0% |    0% |   89% |  10%
Longest Units
Top 20 longest units
  Unit                                               | # lines | McCabe index | # params
                                                     |      30 |            1 |        9
  std::vector lltm_backward() in cuda/lltm_cuda.cpp  |      29 |            1 |        9
  def backward() in python/lltm_baseline.py          |      25 |            5 |        3
                                                     |      22 |            1 |        5
                                                     |      19 |            2 |        3
  std::vector lltm_forward() in cuda/lltm_cuda.cpp   |      13 |            1 |        5
  def forward() in python/lltm_baseline.py           |      12 |            1 |        6
                                                     |      11 |            4 |        3
                                                     |      11 |            2 |        3
  def forward() in python/lltm.py                    |      11 |            3 |        3
  def __init__() in python/lltm.py                   |       8 |            3 |        3
  def __init__() in python/lltm_baseline.py          |       8 |            1 |        3
  def __init__() in cuda/lltm.py                     |       8 |            1 |        3
  def __init__() in cpp/lltm.py                      |       8 |            1 |        3
  def forward() in cuda/lltm.py                      |       6 |            1 |        6
  def forward() in cpp/lltm.py                       |       6 |            1 |        6
  def backward() in cuda/lltm.py                     |       5 |            1 |        3
  torch::Tensor d_elu() in cpp/lltm.cpp              |       5 |            1 |        2
  def reset_parameters() in python/lltm.py           |       4 |            2 |        1
  def d_elu() in python/lltm_baseline.py             |       4 |            1 |        2
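A brief note on the McCabe index column above: it counts the number of independent paths through a unit, so a straight-line unit scores 1 and each additional decision point (an if, a loop, a boolean short-circuit) adds one. The hypothetical Python example below is not code from this repository; it simply shows how a unit with four if statements reaches an index of 5, the highest value in the table.

    # Hypothetical example, not taken from pytorch/extension-cpp.
    # McCabe (cyclomatic) complexity = number of decision points + 1.

    def straight_line(x):
        # No branches: a single path through the unit, McCabe index 1.
        return 2 * x

    def four_branches(a, b, c, d):
        # Four independent if statements: 4 decision points + 1 = index 5.
        total = 0
        if a:
            total += 1
        if b:
            total += 1
        if c:
            total += 1
        if d:
            total += 1
        return total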