facebookresearch / UnsupervisedDecomposition
Conditional Complexity

The distribution of the complexity of units, measured with the McCabe index.

Intro
  • Conditional complexity (also called cyclomatic complexity) is a measure of the complexity of software. It refers to the number of possible execution paths through a program function. A higher value often means higher maintenance and testing costs (infosecinstitute.com).
  • Conditional complexity is calculated by counting all conditions in the program that can affect the execution path (e.g. if statements, loops, switch cases, logical and/or operators, try/catch blocks); see the counting sketch after this list.
  • Conditional complexity is measured at the unit level (methods, functions...).
  • Units are classified into five categories based on the measured McCabe index: 1-5 (very simple units), 6-10 (simple units), 11-25 (medium complex units), 26-50 (complex units), 51+ (very complex units).
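To make the counting rule above concrete, here is a small hand-annotated example (a hypothetical function, not taken from this repository): the index starts at 1 for the single linear path and increases by one for each condition that adds a branch.

    # Hypothetical unit, not from this repository. Each marked construct adds 1
    # to the McCabe index, starting from a baseline of 1 (one linear path).
    def toy_unit(xs):                     # baseline        -> 1
        total = 0
        for x in xs:                      # loop            -> +1
            if x > 0 and x < 10:          # if, and         -> +1 +1
                total += x
            elif x < 0:                   # elif            -> +1
                total -= x
        try:
            return total / len(xs)
        except ZeroDivisionError:         # except clause   -> +1
            return 0.0
    # McCabe index = 6, which puts this unit in the 6-10 (simple) bucket.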
Conditional Complexity Overall
  • There are 790 units containing 11,911 lines of code (80.0% of all code).
    • 1 very complex unit (68 lines of code)
    • 16 complex units (1,360 lines of code)
    • 65 medium complex units (2,859 lines of code)
    • 112 simple units (2,679 lines of code)
    • 596 very simple units (4,945 lines of code)
Distribution of lines of code in units by McCabe index bucket: 51+: <1% | 26-50: 11% | 11-25: 24% | 6-10: 22% | 1-5: 41% (this split follows from the counts above; see the sketch below).
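As a quick check, the percentage split can be reproduced from the per-bucket line counts reported above; a minimal arithmetic sketch:

    # Lines of code per McCabe index bucket, as reported above for this repository.
    loc = {"51+": 68, "26-50": 1360, "11-25": 2859, "6-10": 2679, "1-5": 4945}
    total = sum(loc.values())             # 11,911 lines of code in units
    for bucket, lines in loc.items():
        share = 100 * lines / total
        print(f"{bucket}: {share:.1f}%")  # 0.6, 11.4, 24.0, 22.5, 41.5 -> <1% | 11% | 24% | 22% | 41%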
Conditional Complexity per Extension
% of lines of code per McCabe index bucket (51+ | 26-50 | 11-25 | 6-10 | 1-5):
  • py: <1% | 11% | 24% | 22% | 41%
  • perl: 0% | 0% | 0% | 0% | 100%
Conditional Complexity per Logical Component
primary logical decomposition; % of lines of code per McCabe index bucket (51+ | 26-50 | 11-25 | 6-10 | 1-5):
  • XLM/src/data: 11% | 0% | 28% | 15% | 44%
  • pytorch-transformers/pytorch_transformers: 0% | 9% | 19% | 24% | 47%
  • XLM/src/model: 0% | 30% | 23% | 13% | 31%
  • XLM/src/model/memory: 0% | 20% | 12% | 22% | 45%
  • pytorch-transformers/pseudoalignment: 0% | 17% | 53% | 8% | 20%
  • pytorch-transformers: 0% | 19% | 11% | 32% | 37%
  • XLM: 0% | 27% | 15% | 0% | 57%
  • XLM/src: 0% | 6% | 25% | 36% | 31%
  • XLM/src/evaluation: 0% | 0% | 54% | 17% | 28%
  • XLM/tools: 0% | 0% | 0% | 0% | 100%
Most Complex Units
Top 20 most complex units
Unit | File | # lines | McCabe index | # params
def check_data_params() | XLM/src/data/loader.py | 68 | 73 | 1
def forward() | pytorch-transformers/pytorch_transformers/modeling_xlnet.py | 101 | 50 | 9
def _forward() | pytorch-transformers/pytorch_transformers/modeling_transfo_xl.py | 108 | 41 | 4
def _from_pretrained() | pytorch-transformers/pytorch_transformers/tokenization_utils.py | 104 | 41 | 4
def generate_beam() | XLM/src/model/transformer.py | 94 | 41 | 9
def check_model_params() | XLM/src/model/__init__.py | 49 | 39 | 1
def build_model() | XLM/src/model/__init__.py | 58 | 37 | 2
def from_pretrained() | pytorch-transformers/pytorch_transformers/modeling_utils.py | 106 | 34 | 4
def __init__() | XLM/src/trainer.py | 72 | 34 | 3
def main() | XLM/translate.py | 86 | 33 | 1
def __init__() | XLM/src/model/memory/memory.py | 83 | 32 | 4
def check_params() | XLM/src/model/memory/memory.py | 49 | 31 | 1
def forward() | pytorch-transformers/pytorch_transformers/modeling_xlm.py | 66 | 29 | 9
def main() | pytorch-transformers/pytorch_transformers/__main__.py | 112 | 29 | 0
def _init_weights() | pytorch-transformers/pytorch_transformers/modeling_transfo_xl.py | 38 | 28 | 2
def main_bert_nsp() | pytorch-transformers/pseudoalignment/pseudo_decomp_bert_nsp.py | 128 | 28 | 0
def evaluate_ensemble() | pytorch-transformers/ensemble_answers_by_confidence_script.py | 106 | 26 | 3
def load_data() | XLM/src/evaluation/glue.py | 44 | 23 | 2
def load_tf_weights_in_bert() | pytorch-transformers/pytorch_transformers/modeling_bert.py | 59 | 22 | 3
def get_from_cache() | pytorch-transformers/pytorch_transformers/file_utils.py | 51 | 22 | 4
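A similar listing can be reproduced locally with the third-party radon package (a rough sketch under that assumption; it is not necessarily the tool that produced the report above):

    # Sketch: walk a Python source tree, measure per-unit cyclomatic complexity
    # with radon (pip install radon), and print the top 20 most complex units.
    import os
    from radon.complexity import cc_visit

    def most_complex_units(root, top_n=20):
        units = []
        for dirpath, _, filenames in os.walk(root):
            for filename in filenames:
                if not filename.endswith(".py"):
                    continue
                path = os.path.join(dirpath, filename)
                with open(path, encoding="utf-8") as handle:
                    source = handle.read()
                try:
                    blocks = cc_visit(source)   # one block per function/method/class
                except SyntaxError:
                    continue                    # skip files that do not parse
                for block in blocks:
                    units.append((block.complexity, block.name, path))
        return sorted(units, reverse=True)[:top_n]

    for complexity, name, path in most_complex_units("."):
        print(f"{complexity:>3}  {name}  ({path})")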