aws-deepracer / aws-deepracer-inference-pkg
Unit Size

The distribution of unit sizes, measured in lines of code.

Intro
  • Unit size measurements show the distribution of the sizes of units of code (methods, functions...).
  • Units are classified into five categories based on their size in lines of code: 1-10 (very small units), 11-20 (small units), 21-50 (medium size units), 51-100 (long units), 101+ (very long units).
  • You should aim to keep units small (< 20 lines). Long units may become "bloaters": code that has grown to such gargantuan proportions that it is hard to work with.
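The bucketing described above can be sketched as follows. This is an illustrative Python sketch, not the report generator's actual code; `bucket_for` and `distribution` are hypothetical helpers:

```python
# Illustrative sketch (not the analyzer's actual code): bucket unit sizes
# into the five ranges used in this report and compute each bucket's
# share of the total lines of code in units.
from collections import OrderedDict

# Upper bound (inclusive) for each bucket, smallest first.
BUCKETS = OrderedDict([
    ("1-10", 10),
    ("11-20", 20),
    ("21-50", 50),
    ("51-100", 100),
    ("101+", float("inf")),
])

def bucket_for(loc):
    """Return the bucket label for a unit of `loc` lines of code."""
    for label, upper in BUCKETS.items():
        if loc <= upper:
            return label
    raise ValueError(loc)

def distribution(unit_sizes):
    """Percentage of unit lines of code falling in each bucket."""
    total = sum(unit_sizes)
    shares = dict.fromkeys(BUCKETS, 0)
    for loc in unit_sizes:
        shares[bucket_for(loc)] += loc
    return {label: round(100 * n / total, 1) for label, n in shares.items()}

# Example with a few hypothetical unit sizes:
print(distribution([50, 30, 19, 9, 4]))
```

Note that the shares are weighted by lines of code, not by unit count, which is why a handful of medium-size units can dominate the distribution.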
Unit Size Overall
  • There are 29 units, containing 390 lines of code in total (63.6% of all code).
    • 0 very long units (0 lines of code)
    • 0 long units (0 lines of code)
    • 6 medium size units (184 lines of code)
    • 8 small units (138 lines of code)
    • 15 very small units (68 lines of code)
Overall distribution (share of unit lines of code per category):
101+: 0% | 51-100: 0% | 21-50: 47% | 11-20: 35% | 1-10: 17%
Unit Size per Extension
Extension | 101+ | 51-100 | 21-50 | 11-20 | 1-10
cpp | 0% | 0% | 48% | 36% | 14%
py | 0% | 0% | 0% | 0% | 100%
hpp | 0% | 0% | 0% | 0% | 100%
Unit Size per Logical Component (primary logical decomposition)
Component | 101+ | 51-100 | 21-50 | 11-20 | 1-10
src | 0% | 0% | 48% | 36% | 14%
launch | 0% | 0% | 0% | 0% | 100%
include/inference_pkg | 0% | 0% | 0% | 0% | 100%
Longest Units
Top 20 longest units
Unit | Location | # lines | McCabe index | # params
void RLInferenceModel::sensorCB() | inference_pkg/src/intel_inference_eng.cpp | 50 | 11 | 1
InferenceEngine::InferRequest setMultiHeadModel() | inference_pkg/src/intel_inference_eng.cpp | 30 | 9 | 8
bool RLInferenceModel::loadModel() | inference_pkg/src/intel_inference_eng.cpp | 29 | 5 | 2
bool cvtToCVObjResize() | inference_pkg/src/image_process.cpp | 29 | 5 | 3
void InferStateHdl() | inference_pkg/src/inference_node.cpp | 23 | 4 | 3
void LoadModelHdl() | inference_pkg/src/inference_node.cpp | 23 | 5 | 3
template void loadStereoImg() | inference_pkg/src/intel_inference_eng.cpp | 20 | 5 | 5
void Grey::processImage() | inference_pkg/src/image_process.cpp | 20 | 5 | 3
void GreyDiff::processImage() | inference_pkg/src/image_process.cpp | 19 | 4 | 3
void stack() | inference_pkg/src/image_process.cpp | 18 | 4 | 4
template void load1DImg() | inference_pkg/src/intel_inference_eng.cpp | 17 | 4 | 5
void Grey::processImageVec() | inference_pkg/src/image_process.cpp | 17 | 3 | 3
template void loadStackImg() | inference_pkg/src/intel_inference_eng.cpp | 16 | 4 | 5
void masking() | inference_pkg/src/image_process.cpp | 11 | 4 | 3
int main() | inference_pkg/src/inference_node.cpp | 10 | 1 | 2
def generate_launch_description() | inference_pkg/launch/inference_pkg_launch.py | 9 | 1 | 0
void loadLidarData() | inference_pkg/src/intel_inference_eng.cpp | 8 | 2 | 2
void threshold() | inference_pkg/src/image_process.cpp | 8 | 2 | 3
void RLInferenceModel::startInference() | inference_pkg/src/intel_inference_eng.cpp | 6 | 2 | 0
void RGB::processImage() | inference_pkg/src/image_process.cpp | 4 | 1 | 3
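For reference, the McCabe index reported per unit is cyclomatic complexity, commonly computed as one plus the number of decision points in the unit. The following is a rough, hypothetical sketch for Python source using the standard-library `ast` module; the analyzer's actual counting rules may differ (e.g. how boolean operators or `try` blocks are counted):

```python
# Hedged sketch: approximate cyclomatic complexity as 1 + the number of
# decision points found in the syntax tree. Illustrative only, not the
# analyzer's implementation.
import ast

DECISION_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                  ast.BoolOp, ast.IfExp, ast.comprehension)

def mccabe(source):
    """Approximate the McCabe index of the given function source."""
    tree = ast.parse(source)
    decisions = sum(isinstance(node, DECISION_NODES)
                    for node in ast.walk(tree))
    return 1 + decisions

src = """
def classify(loc):
    if loc > 100:
        return "101+"
    elif loc > 50:
        return "51-100"
    return "small"
"""
print(mccabe(src))  # → 3 (two `if` decision points + 1)
```

A straight-line unit with no branches scores 1, which matches the entries at the bottom of the table such as `int main()` and `void RGB::processImage()`.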