aws / sagemaker-spark-container
Conditional Complexity

The distribution of the complexity of units (measured with the McCabe index).

Intro
  • Conditional complexity (also called cyclomatic complexity) is a measure of the complexity of software. The term refers to the number of possible execution paths through a program function. A higher value often means higher maintenance and testing costs (infosecinstitute.com).
  • Conditional complexity is calculated by counting all conditions in the program that can affect the execution path (e.g. if statements, loops, switch cases, and/or operators, try and catch blocks); see the sketch after this list.
  • Conditional complexity is measured at the unit level (methods, functions...).
  • Units are classified in five categories based on the measured McCabe index: 1-5 (very simple units), 6-10 (simple units), 11-25 (medium complex units), 26-50 (complex units), 51+ (very complex units).
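To make the counting rule concrete, here is a small, hypothetical Python function (not taken from this repository) annotated with the decision points that each add one to its McCabe index:

    def classify_values(values):
        """Classify each value; the McCabe index of this function is 7."""
        results = []                            # a unit starts at 1 (one linear path)
        for v in values:                        # +1 (loop)
            if v is None or v == "":            # +1 (if), +1 (or)
                results.append("empty")
            elif isinstance(v, int) and v > 0:  # +1 (elif), +1 (and)
                results.append("positive")
            else:                               # else adds no new condition
                results.append("other")
        try:
            share = results.count("positive") / len(results)
        except ZeroDivisionError:               # +1 (each except handler)
            share = 0.0
        return results, share                   # total: 1 + 6 = 7, i.e. the 6-10 bucket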
Conditional Complexity Overall
  • There are 82 units containing 692 lines of code (61.0% of all code).
    • 0 very complex units (0 lines of code)
    • 0 complex units (0 lines of code)
    • 1 medium complex unit (27 lines of code)
    • 5 simple units (148 lines of code)
    • 76 very simple units (517 lines of code)
Distribution of lines of code in units by McCabe index (51+ | 26-50 | 11-25 | 6-10 | 1-5): 0% | 0% | 3% | 21% | 74%
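The percentages in the bar above appear to be weighted by lines of code in units rather than by unit count. A minimal sketch of that calculation, using the category totals reported above (small differences from the reported 3%/21%/74% come down to rounding):

    # LOC per complexity bucket, taken from the overall numbers above
    loc_per_bucket = {"51+": 0, "26-50": 0, "11-25": 27, "6-10": 148, "1-5": 517}
    total_loc = sum(loc_per_bucket.values())  # 692 lines of code in units
    for bucket, loc in loc_per_bucket.items():
        print(f"{bucket}: {100 * loc / total_loc:.1f}%")  # 0.0, 0.0, 3.9, 21.4, 74.7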
Conditional Complexity per Extension
Distribution per McCabe index (51+ | 26-50 | 11-25 | 6-10 | 1-5):
  py: 0% | 0% | 3% | 21% | 74%
Conditional Complexity per Logical Component
Distribution for the primary logical decomposition, per McCabe index (51+ | 26-50 | 11-25 | 6-10 | 1-5):
  src/smspark: 0% | 0% | 3% | 21% | 74%
Most Complex Units
Top 20 most complex units
Unit                                 File                                         # lines  McCabe index  # params
def _get_list_of_files()             src/smspark/cli.py                                27            15         1
                                                                                       18            10         3
def __post_init__()                  src/smspark/config.py                             16             9         1
def _render_spark_opts()             src/smspark/cli.py                                 8             7         2
def run()                            src/smspark/job.py                                64             7         4
def write_runtime_cluster_config()   src/smspark/bootstrapper.py                       42             6         1
def run()                            src/smspark/spark_event_logs_publisher.py         17             5         1
def submit_main()                    src/smspark/cli.py                                13             5         0
def write_user_configuration()       src/smspark/bootstrapper.py                       29             5         1
def run()                            src/smspark/spark_executor_logs_watcher.py        14             4         1
def _get_hadoop_jar()                src/smspark/bootstrapper.py                        6             4         1
def start_hadoop_daemons()           src/smspark/bootstrapper.py                       22             4         1
def get_regional_configs()           src/smspark/bootstrapper.py                       14             4         1
def env_serializer()                 src/smspark/config.py                             12             4         1
def write_config()                   src/smspark/config.py                             25             4         1
def _config_event_log()              src/smspark/spark_event_logs_publisher.py          7             3         1
def start_history_server()           src/smspark/history_server_utils.py               21             3         1
def on_created()                     src/smspark/spark_executor_logs_watcher.py         8             3         1
def copy_aws_jars()                  src/smspark/bootstrapper.py                       11             3         1
def _copy_optional_jars()            src/smspark/bootstrapper.py                        9             3         1