awslabs / dynamic-training-with-apache-mxnet-on-aws

Dynamic training with Apache MXNet leverages AWS cloud elasticity and scale to reduce the cost and time of training deep neural networks. It does so by dynamically resizing the training cluster during a training run, with minimal impact on model accuracy.
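The repository's own elastic-training APIs are not reproduced here, but the core idea (re-checking the desired cluster size between epochs and re-sharding work across whatever workers currently exist) can be illustrated with a small, self-contained Python sketch. Everything in it, including the `desired_workers.txt` membership file, is a hypothetical stand-in rather than this project's actual interface.

```python
# Hypothetical sketch of the dynamic-training idea: the training loop
# re-reads the desired worker count each epoch (here from a plain file,
# standing in for a scheduler or cluster-membership service) and
# rebalances the data shards across the workers that currently exist.

def current_worker_count(path="desired_workers.txt", default=2):
    """Read the desired cluster size; illustrative, not the repo's API."""
    try:
        with open(path) as f:
            return max(1, int(f.read().strip()))
    except (OSError, ValueError):
        return default

def shard(samples, worker_id, num_workers):
    """Deterministically assign every sample to exactly one worker."""
    return [s for i, s in enumerate(samples) if i % num_workers == worker_id]

samples = list(range(100))
for epoch in range(3):
    n = current_worker_count()  # cluster size may change between epochs
    for worker_id in range(n):
        batch = shard(samples, worker_id, n)
        # ... run forward/backward on `batch` and aggregate gradients ...
    print(f"epoch {epoch}: trained with {n} worker(s)")
```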

Summary
Main Code: 252,286 LOC (1584 files) = H (25%) + PY (24%) + CC (18%) + PM (8%) + CU (5%) + SCALA (4%) + CUH (2%) + R (1%) + JL (1%) + I (1%) + PROTO (1%) + CLJ (<1%) + CPP (<1%) + JAVA (<1%) + HPP (<1%) + CMAKE (<1%) + ST (<1%) + GV (<1%) + M (<1%) + PYX (<1%) + PL (<1%) + YML (<1%) + CFG (<1%) + GROOVY (<1%) + PYI (<1%) + YAML (<1%) + G4 (<1%) + IN (<1%) + T (<1%) + PERL (<1%)
Secondary code: Test: 55,422 LOC (298 files); Generated: 0 LOC (0 files); Build & Deploy: 3,831 LOC (109 files); Other: 22,992 LOC (347 files)
File Size: 13% long (>1000 LOC), 34% short (<= 200 LOC)
Unit Size: 9% long (>100 LOC), 41% short (<= 10 LOC)
Conditional Complexity: 5% complex (McCabe index > 50), 57% simple (McCabe index <= 5)
Logical Component Decomposition: primary (19 components)
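As a rough illustration of how size and complexity figures like those above are derived, the following Python sketch buckets files by LOC and approximates a McCabe index by counting decision points. Sokrates' real rules span many languages and are more nuanced; the thresholds and the branch-node list here are assumptions, applied to Python sources only.

```python
# Sketch: bucket files by non-blank LOC and flag units whose
# approximate McCabe index exceeds 50, mirroring the report's cutoffs.
import ast
import pathlib

# Decision-point node types; a rough stand-in for Sokrates' rules.
BRANCH_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                ast.BoolOp, ast.comprehension)

def mccabe(func):
    """1 + number of decision points: a common cyclomatic approximation."""
    return 1 + sum(isinstance(n, BRANCH_NODES) for n in ast.walk(func))

total = long_files = short_files = 0
for path in pathlib.Path(".").rglob("*.py"):
    try:
        source = path.read_text(errors="ignore")
        tree = ast.parse(source)
    except (OSError, SyntaxError):
        continue
    loc = sum(1 for line in source.splitlines() if line.strip())
    total += 1
    long_files += loc > 1000
    short_files += loc <= 200
    for node in ast.walk(tree):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            if mccabe(node) > 50:
                print(f"complex unit: {path}:{node.name}")

if total:
    print(f"{100 * long_files // total}% long files, "
          f"{100 * short_files // total}% short files")
```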
File Age & Freshness

3 years, 2 months old

  • 100% of code older than 365 days
  • 100% of code not updated in the past 365 days

0% of code updated more than 50 times

Also see temporal dependencies for files frequently changed in the same commits.
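One plausible way to mine such temporal dependencies is to read `git log --name-only` and count how often each pair of files appears in the same commit, as in the sketch below; the exact metric Sokrates computes may differ.

```python
# Sketch: count co-change pairs ("temporal dependencies") from git history.
import subprocess
from collections import Counter
from itertools import combinations

# An empty --pretty format leaves one blank-line-separated block of
# file names per commit.
log = subprocess.run(
    ["git", "log", "--name-only", "--pretty=format:"],
    capture_output=True, text=True, check=True,
).stdout

pairs = Counter()
for block in log.split("\n\n"):
    files = sorted(set(filter(None, block.splitlines())))
    if len(files) > 50:  # skip bulk commits that would flood the pair counts
        continue
    for a, b in combinations(files, 2):
        pairs[(a, b)] += 1

for (a, b), n in pairs.most_common(10):
    print(f"{n:4d} commits touch both {a} and {b}")
```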

Goals: Keep the system simple and easy to change (4)
Features of interest: TODOs (48 files)
Commits Trend

Latest commit date: 2021-04-28

Past 30 days: 0 commits, 0 contributors
Commits and contributors per year:

  Year  Commits  Contributors
  2021        6             2
  2020        6             2
  2019        1             1
  2018       31             6
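Yearly figures like these can be reproduced from the repository's history. The sketch below emits one year/author line per commit via `git log` and aggregates it in Python; it assumes a git version new enough (2.6+) to support `--date=format:`.

```python
# Sketch: per-year commit and contributor counts from git history.
import subprocess
from collections import defaultdict

# "%ad" honors --date=format:, so each line is "<year>\t<author>".
out = subprocess.run(
    ["git", "log", "--date=format:%Y", "--pretty=format:%ad\t%an"],
    capture_output=True, text=True, check=True,
).stdout

commits = defaultdict(int)
authors = defaultdict(set)
for line in out.splitlines():
    year, author = line.split("\t", 1)
    commits[year] += 1
    authors[year].add(author)

for year in sorted(commits, reverse=True):
    print(f"{year}: {commits[year]} commits, "
          f"{len(authors[year])} contributors")
```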
Reports

  • Analysis Report
  • Duplication
  • Trend
Notes & Findings
Links

generated by sokrates.dev (configuration) on 2022-01-31