aws / sagemaker-hyperpod-recipes
File Change Frequency

File change frequency (churn) shows how often each file is updated, counted as the number of days with at least one commit touching the file.
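The banding used throughout this report (1-5, 6-20, 21-50, 51-100, 101+ change days) can be sketched as follows. This is a minimal illustration: the band boundaries come from the report's legend, while the file data and function names are hypothetical.

```python
from bisect import bisect_left
from collections import Counter

# Churn bands as used in this report; the last band is open-ended.
BANDS = ["1-5", "6-20", "21-50", "51-100", "101+"]
UPPER_BOUNDS = [5, 20, 50, 100]

def churn_band(changes: int) -> str:
    """Return the churn band label for a number of change days."""
    return BANDS[bisect_left(UPPER_BOUNDS, changes)]

def summarize(files: dict[str, tuple[int, int]]) -> dict[str, tuple[int, int]]:
    """Map {path: (change_days, lines_of_code)} to {band: (file_count, total_loc)}."""
    counts, loc = Counter(), Counter()
    for changes, lines in files.values():
        band = churn_band(changes)
        counts[band] += 1
        loc[band] += lines
    return {b: (counts[b], loc[b]) for b in BANDS}
```

For example, `summarize({"launcher/nemo/stages.py": (14, 618)})` places that file in the 6-20 band, matching the overview below.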

Overview
File Change Frequency Overall
  • There are 116 files with 11,453 lines of code.
    • 0 files changed more than 100 times (0 lines of code)
    • 0 files changed 51-100 times (0 lines of code)
    • 0 files changed 21-50 times (0 lines of code)
    • 1 file changed 6-20 times (618 lines of code)
    • 115 files changed 1-5 times (10,835 lines of code)
0% | 0% | 0% | 5% | 94%
Legend: 101+ | 51-100 | 21-50 | 6-20 | 1-5

Contributors Count Frequency Overall
  • There are 116 files with 11,453 lines of code.
    • 0 files changed by more than 25 contributors (0 lines of code)
    • 0 files changed by 11-25 contributors (0 lines of code)
    • 1 file changed by 6-10 contributors (618 lines of code)
    • 100 files changed by 2-5 contributors (10,313 lines of code)
    • 15 files changed by 1 contributor (522 lines of code)
0% | 0% | 5% | 90% | 4%
Legend: 26+ | 11-25 | 6-10 | 2-5 | 1

File Change Frequency per File Extension
Extensions present: yaml, sh, py, md, txt, gitignore, json, gitmodules, toml
The number of recorded file updates per extension (legend: 101+ | 51-100 | 21-50 | 6-20 | 1-5):
py: 0% | 0% | 0% | 35% | 64%
yaml: 0% | 0% | 0% | 0% | 100%
toml: 0% | 0% | 0% | 0% | 100%
File Change Frequency per Logical Decomposition
primary (file change frequency)
The number of recorded file updates per component (legend: 101+ | 51-100 | 21-50 | 6-20 | 1-5):
launcher: 0% | 0% | 0% | 34% | 65%
recipes_collection: 0% | 0% | 0% | 0% | 100%
ROOT: 0% | 0% | 0% | 0% | 100%
template: 0% | 0% | 0% | 0% | 100%
launcher_scripts: 0% | 0% | 0% | 0% | 100%
Most Frequently Changed Files (Top 50)


File | # lines | # units | created | last modified | # changes (days) | # contributors | first contributor | latest contributor
stages.py
in launcher/nemo
618 49 2024-12-03 2025-04-17 14 9 arjkrish@amazon.com rnadimp@amazon.com
hf_llama3_2_1b_seq8k_gpu_p5x1_pretrain.yaml
in recipes_collection/recipes/training/llama
108 - 2024-12-03 2025-02-25 5 3 arjkrish@amazon.com rnadimp@amazon.com
hf_llama3_2_3b_seq8k_gpu_p5x1_pretrain.yaml
in recipes_collection/recipes/training/llama
108 - 2024-12-03 2025-02-25 5 3 arjkrish@amazon.com rnadimp@amazon.com
values.yaml
in launcher/nemo/k8s_templates/training
36 - 2024-12-03 2025-03-07 4 4 arjkrish@amazon.com htzhong@amazon.com
hf_llama3_2_90b_seq8k_gpu_p5x32_pretrain.yaml
in recipes_collection/recipes/training/llama
86 - 2024-12-03 2025-02-25 4 3 arjkrish@amazon.com rnadimp@amazon.com
hf_llama3_2_11b_seq8k_gpu_p5x4_pretrain.yaml
in recipes_collection/recipes/training/llama
86 - 2024-12-03 2025-02-25 4 3 arjkrish@amazon.com rnadimp@amazon.com
hf_deepseek_r1_distilled_qwen_1_dot_5b_seq8k_gpu_lora.yaml
in recipes_collection/recipes/fine-tuning/deepseek
106 - 2025-02-01 2025-02-25 4 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_deepseek_r1_distilled_qwen_1_dot_5b_seq8k_gpu_fine_tuning.yaml
in recipes_collection/recipes/fine-tuning/deepseek
106 - 2025-02-01 2025-02-25 4 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_deepseek_r1_distilled_qwen_1_dot_5b_seq16k_gpu_lora.yaml
in recipes_collection/recipes/fine-tuning/deepseek
106 - 2025-02-01 2025-02-25 4 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_deepseek_r1_distilled_qwen_1_dot_5b_seq16k_gpu_fine_tuning.yaml
in recipes_collection/recipes/fine-tuning/deepseek
106 - 2025-02-01 2025-02-25 4 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
training.yaml
in launcher/nemo/k8s_templates/training
178 - 2024-12-03 2025-03-21 4 5 arjkrish@amazon.com rnadimp@amazon.com
main.py
in root
195 4 2024-12-03 2025-03-06 4 3 arjkrish@amazon.com rnadimp@amazon.com
config.yaml
in recipes_collection
24 - 2024-12-03 2025-02-17 3 3 arjkrish@amazon.com julian1@cimarron.me
train-script-gpu.yaml
in launcher/nemo/k8s_templates/training
49 - 2024-12-03 2025-02-17 3 3 arjkrish@amazon.com julian1@cimarron.me
accelerator_devices.py
in launcher
82 2 2024-12-03 2024-12-24 3 3 arjkrish@amazon.com jesseche@amazon.com
train-script-trn.yaml
in launcher/nemo/k8s_templates/training
88 - 2024-12-03 2025-02-17 3 3 arjkrish@amazon.com julian1@cimarron.me
hf_mistral_7b_seq16k_gpu_p5x32_pretrain.yaml
in recipes_collection/recipes/training/mistral
100 - 2024-12-03 2025-02-25 3 3 arjkrish@amazon.com rnadimp@amazon.com
hf_mistral_7b_seq8k_gpu_p5x16_pretrain.yaml
in recipes_collection/recipes/training/mistral
100 - 2024-12-03 2025-02-25 3 3 arjkrish@amazon.com rnadimp@amazon.com
hf_mistral_7b_seq16k_gpu_p5x16_pretrain.yaml
in recipes_collection/recipes/training/mistral
100 - 2024-12-03 2025-02-25 3 3 arjkrish@amazon.com rnadimp@amazon.com
hf_mistral_7b_seq8k_gpu_p5x32_pretrain.yaml
in recipes_collection/recipes/training/mistral
100 - 2024-12-03 2025-02-25 3 3 arjkrish@amazon.com rnadimp@amazon.com
hf_deepseek_r1_671b_seq8k_gpu_qlora.yaml
in recipes_collection/recipes/fine-tuning/deepseek
102 - 2025-02-18 2025-02-25 3 2 rnadimp@amazon.com rnadimp@amazon.com
hf_deepseek_r1_671b_seq8k_gpu_lora.yaml
in recipes_collection/recipes/fine-tuning/deepseek
102 - 2025-02-18 2025-02-25 3 2 rnadimp@amazon.com rnadimp@amazon.com
hf_mixtral_8x7b_seq8k_gpu_p5x16_pretrain.yaml
in recipes_collection/recipes/training/mixtral
104 - 2024-12-03 2025-02-25 3 3 arjkrish@amazon.com rnadimp@amazon.com
hf_mixtral_8x22b_seq8k_gpu_p5x64_pretrain.yaml
in recipes_collection/recipes/training/mixtral
104 - 2024-12-03 2025-02-25 3 3 arjkrish@amazon.com rnadimp@amazon.com
hf_mixtral_8x22b_seq8k_gpu_p5x32_pretrain.yaml
in recipes_collection/recipes/training/mixtral
104 - 2024-12-03 2025-02-25 3 3 arjkrish@amazon.com rnadimp@amazon.com
hf_mixtral_8x7b_seq8k_gpu_p5x32_pretrain.yaml
in recipes_collection/recipes/training/mixtral
104 - 2024-12-03 2025-02-25 3 3 arjkrish@amazon.com rnadimp@amazon.com
hf_mixtral_8x7b_seq16k_gpu_p5x32_pretrain.yaml
in recipes_collection/recipes/training/mixtral
104 - 2024-12-03 2025-02-25 3 3 arjkrish@amazon.com rnadimp@amazon.com
hf_mixtral_8x7b_seq16k_gpu_p5x16_pretrain.yaml
in recipes_collection/recipes/training/mixtral
104 - 2024-12-03 2025-02-25 3 3 arjkrish@amazon.com rnadimp@amazon.com
hf_mixtral_8x22b_seq16k_gpu_p5x32_pretrain.yaml
in recipes_collection/recipes/training/mixtral
104 - 2024-12-03 2025-02-25 3 3 arjkrish@amazon.com rnadimp@amazon.com
hf_mixtral_8x22b_seq8k_gpu_p5x128_pretrain.yaml
in recipes_collection/recipes/training/mixtral
104 - 2024-12-31 2025-02-25 3 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_mixtral_8x22b_seq16k_gpu_p5x128_pretrain.yaml
in recipes_collection/recipes/training/mixtral
105 - 2024-12-31 2025-02-25 3 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_mixtral_8x22b_seq16k_gpu_p5x64_pretrain.yaml
in recipes_collection/recipes/training/mixtral
105 - 2024-12-03 2025-02-25 3 3 arjkrish@amazon.com rnadimp@amazon.com
hf_deepseek_r1_distilled_llama_8b_seq16k_gpu_lora.yaml
in recipes_collection/recipes/fine-tuning/deepseek
106 - 2025-01-30 2025-02-25 3 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_deepseek_r1_distilled_llama_70b_seq8k_gpu_lora.yaml
in recipes_collection/recipes/fine-tuning/deepseek
106 - 2025-01-30 2025-02-25 3 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_deepseek_r1_distilled_llama_70b_seq16k_gpu_lora.yaml
in recipes_collection/recipes/fine-tuning/deepseek
106 - 2025-01-30 2025-02-25 3 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_deepseek_r1_distilled_llama_8b_seq8k_gpu_lora.yaml
in recipes_collection/recipes/fine-tuning/deepseek
106 - 2025-01-30 2025-02-25 3 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_deepseek_r1_distilled_llama_70b_seq16k_gpu_fine_tuning.yaml
in recipes_collection/recipes/fine-tuning/deepseek
108 - 2025-01-30 2025-02-25 3 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_deepseek_r1_distilled_llama_70b_seq8k_gpu_fine_tuning.yaml
in recipes_collection/recipes/fine-tuning/deepseek
108 - 2025-01-30 2025-02-25 3 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_deepseek_r1_distilled_llama_8b_seq8k_gpu_fine_tuning.yaml
in recipes_collection/recipes/fine-tuning/deepseek
108 - 2025-01-30 2025-02-25 3 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_deepseek_r1_distilled_llama_8b_seq16k_gpu_fine_tuning.yaml
in recipes_collection/recipes/fine-tuning/deepseek
108 - 2025-01-30 2025-02-25 3 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
efa.py
in launcher
156 - 2024-12-03 2024-12-24 3 3 arjkrish@amazon.com jesseche@amazon.com
megatron_llama3_1_8b_nemo.yaml
in recipes_collection/recipes/training/llama
162 - 2024-12-03 2025-02-19 3 3 arjkrish@amazon.com 35045363+rohithn1@users.nor...
k8s.yaml
in recipes_collection/cluster
12 - 2024-12-03 2025-03-07 2 2 arjkrish@amazon.com htzhong@amazon.com
constants.py
in launcher/nemo
14 - 2024-12-03 2025-01-30 2 3 arjkrish@amazon.com rnadimp@amazon.com
config_k8s.yaml
in launcher_scripts/custom_script
41 - 2024-12-03 2025-03-07 2 2 arjkrish@amazon.com htzhong@amazon.com
falcon.yaml
in recipes_collection/recipes/training/custom_model
83 - 2024-12-03 2025-02-25 2 3 arjkrish@amazon.com rnadimp@amazon.com
type_validator.py
in launcher/config_validator
99 10 2024-12-03 2025-03-07 2 2 arjkrish@amazon.com htzhong@amazon.com
hf_llama3_8b_seq8k_trn1x4_pretrain.yaml
in recipes_collection/recipes/training/llama
101 - 2024-12-03 2025-02-25 2 3 arjkrish@amazon.com rnadimp@amazon.com
hf_deepseek_r1_distilled_qwen_7b_seq16k_gpu_lora.yaml
in recipes_collection/recipes/fine-tuning/deepseek
104 - 2025-02-01 2025-02-25 2 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_deepseek_r1_distilled_qwen_7b_seq8k_gpu_lora.yaml
in recipes_collection/recipes/fine-tuning/deepseek
104 - 2025-02-01 2025-02-25 2 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
Files With Most Contributors (Top 50)
Based on the number of unique email addresses found in commits.
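The metric described above can be sketched with a minimal example: count the unique author emails across the commits that touch each file. The commit data below is hypothetical.

```python
from collections import defaultdict

# Each commit is (author_email, [files touched]); hypothetical data.
commits = [
    ("arjkrish@amazon.com", ["launcher/nemo/stages.py", "main.py"]),
    ("rnadimp@amazon.com", ["launcher/nemo/stages.py"]),
    ("arjkrish@amazon.com", ["main.py"]),
]

# Collect the set of unique contributor emails per file.
contributors: dict[str, set[str]] = defaultdict(set)
for email, files in commits:
    for path in files:
        contributors[path].add(email)

counts = {path: len(emails) for path, emails in contributors.items()}
# stages.py was touched by two distinct emails, main.py by one.
```

Note that this counts distinct email addresses, not distinct people: one contributor committing under two addresses is counted twice.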


File | # lines | # units | created | last modified | # changes (days) | # contributors | first contributor | latest contributor
stages.py
in launcher/nemo
618 49 2024-12-03 2025-04-17 14 9 arjkrish@amazon.com rnadimp@amazon.com
training.yaml
in launcher/nemo/k8s_templates/training
178 - 2024-12-03 2025-03-21 4 5 arjkrish@amazon.com rnadimp@amazon.com
values.yaml
in launcher/nemo/k8s_templates/training
36 - 2024-12-03 2025-03-07 4 4 arjkrish@amazon.com htzhong@amazon.com
hf_llama3_2_3b_seq8k_gpu_p5x1_pretrain.yaml
in recipes_collection/recipes/training/llama
108 - 2024-12-03 2025-02-25 5 3 arjkrish@amazon.com rnadimp@amazon.com
hf_llama3_2_1b_seq8k_gpu_p5x1_pretrain.yaml
in recipes_collection/recipes/training/llama
108 - 2024-12-03 2025-02-25 5 3 arjkrish@amazon.com rnadimp@amazon.com
hf_llama3_2_11b_seq8k_gpu_p5x4_pretrain.yaml
in recipes_collection/recipes/training/llama
86 - 2024-12-03 2025-02-25 4 3 arjkrish@amazon.com rnadimp@amazon.com
hf_llama3_2_90b_seq8k_gpu_p5x32_pretrain.yaml
in recipes_collection/recipes/training/llama
86 - 2024-12-03 2025-02-25 4 3 arjkrish@amazon.com rnadimp@amazon.com
main.py
in root
195 4 2024-12-03 2025-03-06 4 3 arjkrish@amazon.com rnadimp@amazon.com
hf_mixtral_8x22b_seq16k_gpu_p5x32_pretrain.yaml
in recipes_collection/recipes/training/mixtral
104 - 2024-12-03 2025-02-25 3 3 arjkrish@amazon.com rnadimp@amazon.com
hf_mixtral_8x22b_seq16k_gpu_p5x64_pretrain.yaml
in recipes_collection/recipes/training/mixtral
105 - 2024-12-03 2025-02-25 3 3 arjkrish@amazon.com rnadimp@amazon.com
hf_mixtral_8x7b_seq16k_gpu_p5x16_pretrain.yaml
in recipes_collection/recipes/training/mixtral
104 - 2024-12-03 2025-02-25 3 3 arjkrish@amazon.com rnadimp@amazon.com
hf_mixtral_8x7b_seq16k_gpu_p5x32_pretrain.yaml
in recipes_collection/recipes/training/mixtral
104 - 2024-12-03 2025-02-25 3 3 arjkrish@amazon.com rnadimp@amazon.com
hf_mixtral_8x7b_seq8k_gpu_p5x32_pretrain.yaml
in recipes_collection/recipes/training/mixtral
104 - 2024-12-03 2025-02-25 3 3 arjkrish@amazon.com rnadimp@amazon.com
hf_mixtral_8x22b_seq8k_gpu_p5x32_pretrain.yaml
in recipes_collection/recipes/training/mixtral
104 - 2024-12-03 2025-02-25 3 3 arjkrish@amazon.com rnadimp@amazon.com
hf_mixtral_8x22b_seq8k_gpu_p5x64_pretrain.yaml
in recipes_collection/recipes/training/mixtral
104 - 2024-12-03 2025-02-25 3 3 arjkrish@amazon.com rnadimp@amazon.com
hf_mixtral_8x7b_seq8k_gpu_p5x16_pretrain.yaml
in recipes_collection/recipes/training/mixtral
104 - 2024-12-03 2025-02-25 3 3 arjkrish@amazon.com rnadimp@amazon.com
hf_mistral_7b_seq8k_gpu_p5x32_pretrain.yaml
in recipes_collection/recipes/training/mistral
100 - 2024-12-03 2025-02-25 3 3 arjkrish@amazon.com rnadimp@amazon.com
hf_mistral_7b_seq16k_gpu_p5x16_pretrain.yaml
in recipes_collection/recipes/training/mistral
100 - 2024-12-03 2025-02-25 3 3 arjkrish@amazon.com rnadimp@amazon.com
hf_mistral_7b_seq8k_gpu_p5x16_pretrain.yaml
in recipes_collection/recipes/training/mistral
100 - 2024-12-03 2025-02-25 3 3 arjkrish@amazon.com rnadimp@amazon.com
hf_mistral_7b_seq16k_gpu_p5x32_pretrain.yaml
in recipes_collection/recipes/training/mistral
100 - 2024-12-03 2025-02-25 3 3 arjkrish@amazon.com rnadimp@amazon.com
megatron_llama3_1_8b_nemo.yaml
in recipes_collection/recipes/training/llama
162 - 2024-12-03 2025-02-19 3 3 arjkrish@amazon.com 35045363+rohithn1@users.nor...
config.yaml
in recipes_collection
24 - 2024-12-03 2025-02-17 3 3 arjkrish@amazon.com julian1@cimarron.me
train-script-trn.yaml
in launcher/nemo/k8s_templates/training
88 - 2024-12-03 2025-02-17 3 3 arjkrish@amazon.com julian1@cimarron.me
train-script-gpu.yaml
in launcher/nemo/k8s_templates/training
49 - 2024-12-03 2025-02-17 3 3 arjkrish@amazon.com julian1@cimarron.me
efa.py
in launcher
156 - 2024-12-03 2024-12-24 3 3 arjkrish@amazon.com jesseche@amazon.com
accelerator_devices.py
in launcher
82 2 2024-12-03 2024-12-24 3 3 arjkrish@amazon.com jesseche@amazon.com
hf_llama3_8b_seq8k_gpu_p5x16_pretrain.yaml
in recipes_collection/recipes/training/llama
107 - 2024-12-03 2025-02-25 2 3 arjkrish@amazon.com rnadimp@amazon.com
hf_llama3_8b_seq16k_gpu_p5x16_pretrain.yaml
in recipes_collection/recipes/training/llama
107 - 2024-12-03 2025-02-25 2 3 arjkrish@amazon.com rnadimp@amazon.com
hf_llama3_70b_seq16k_gpu_p5x64_pretrain.yaml
in recipes_collection/recipes/training/llama
107 - 2024-12-03 2025-02-25 2 3 arjkrish@amazon.com rnadimp@amazon.com
hf_llama3_8b_seq8k_trn1x4_pretrain.yaml
in recipes_collection/recipes/training/llama
101 - 2024-12-03 2025-02-25 2 3 arjkrish@amazon.com rnadimp@amazon.com
hf_llama3_70b_seq8k_trn1x16_pretrain.yaml
in recipes_collection/recipes/training/llama
104 - 2024-12-03 2025-02-25 2 3 arjkrish@amazon.com rnadimp@amazon.com
hf_llama3_8b_seq8k_gpu_p5x32_pretrain.yaml
in recipes_collection/recipes/training/llama
107 - 2024-12-03 2025-02-25 2 3 arjkrish@amazon.com rnadimp@amazon.com
hf_llama3_70b_seq8k_gpu_p5x32_pretrain.yaml
in recipes_collection/recipes/training/llama
107 - 2024-12-03 2025-02-25 2 3 arjkrish@amazon.com rnadimp@amazon.com
hf_llama3_70b_seq8k_gpu_p5x64_pretrain.yaml
in recipes_collection/recipes/training/llama
107 - 2024-12-03 2025-02-25 2 3 arjkrish@amazon.com rnadimp@amazon.com
hf_llama3_8b_seq16k_gpu_p5x32_pretrain.yaml
in recipes_collection/recipes/training/llama
107 - 2024-12-03 2025-02-25 2 3 arjkrish@amazon.com rnadimp@amazon.com
p4_hf_llama3_70b_seq8k_gpu.yaml
in recipes_collection/recipes/training/llama
106 - 2024-12-03 2025-02-25 2 3 arjkrish@amazon.com rnadimp@amazon.com
hf_llama3_70b_seq16k_gpu_p5x32_pretrain.yaml
in recipes_collection/recipes/training/llama
107 - 2024-12-03 2025-02-25 2 3 arjkrish@amazon.com rnadimp@amazon.com
falcon.yaml
in recipes_collection/recipes/training/custom_model
83 - 2024-12-03 2025-02-25 2 3 arjkrish@amazon.com rnadimp@amazon.com
hf_llama3_70b_seq8k_gpu_fine_tuning.yaml
in recipes_collection/recipes/fine-tuning/llama
108 - 2024-12-03 2025-02-25 2 3 arjkrish@amazon.com rnadimp@amazon.com
hf_llama3_70b_seq16k_gpu_lora.yaml
in recipes_collection/recipes/fine-tuning/llama
106 - 2024-12-03 2025-02-25 2 3 arjkrish@amazon.com rnadimp@amazon.com
hf_llama3_70b_seq8k_gpu_lora.yaml
in recipes_collection/recipes/fine-tuning/llama
106 - 2024-12-03 2025-02-25 2 3 arjkrish@amazon.com rnadimp@amazon.com
hf_llama3_405b_seq128k_gpu_qlora.yaml
in recipes_collection/recipes/fine-tuning/llama
106 - 2024-12-03 2025-02-25 2 3 arjkrish@amazon.com rnadimp@amazon.com
hf_llama3_405b_seq16k_gpu_qlora.yaml
in recipes_collection/recipes/fine-tuning/llama
106 - 2024-12-03 2025-02-25 2 3 arjkrish@amazon.com rnadimp@amazon.com
hf_llama3_8b_seq8k_gpu_fine_tuning.yaml
in recipes_collection/recipes/fine-tuning/llama
108 - 2024-12-03 2025-02-25 2 3 arjkrish@amazon.com rnadimp@amazon.com
hf_llama3_405b_seq8k_gpu_lora.yaml
in recipes_collection/recipes/fine-tuning/llama
106 - 2024-12-03 2025-02-25 2 3 arjkrish@amazon.com rnadimp@amazon.com
hf_llama3_70b_seq16k_gpu_fine_tuning.yaml
in recipes_collection/recipes/fine-tuning/llama
108 - 2024-12-03 2025-02-25 2 3 arjkrish@amazon.com rnadimp@amazon.com
hf_llama3_405b_seq8k_gpu_qlora.yaml
in recipes_collection/recipes/fine-tuning/llama
106 - 2024-12-03 2025-02-25 2 3 arjkrish@amazon.com rnadimp@amazon.com
hf_llama3_8b_seq8k_gpu_lora.yaml
in recipes_collection/recipes/fine-tuning/llama
106 - 2024-12-03 2025-02-25 2 3 arjkrish@amazon.com rnadimp@amazon.com
hf_llama3_8b_seq16k_gpu_fine_tuning.yaml
in recipes_collection/recipes/fine-tuning/llama
108 - 2024-12-03 2025-02-25 2 3 arjkrish@amazon.com rnadimp@amazon.com
hf_llama3_405b_seq16k_gpu_lora.yaml
in recipes_collection/recipes/fine-tuning/llama
106 - 2024-12-03 2025-02-25 2 3 arjkrish@amazon.com rnadimp@amazon.com
Files With Least Contributors (Top 50)
Based on the number of unique email addresses found in commits.


File | # lines | # units | created | last modified | # changes (days) | # contributors | first contributor | latest contributor
sm_jobs.py
in template
109 2 2024-12-03 2024-12-03 1 1 arjkrish@amazon.com arjkrish@amazon.com
slurm_launcher.py
in launcher/nemo
93 7 2024-12-03 2024-12-03 1 1 arjkrish@amazon.com arjkrish@amazon.com
recipe_stages.py
in launcher/nemo
91 14 2024-12-03 2024-12-03 1 1 arjkrish@amazon.com arjkrish@amazon.com
telemetry.py
in launcher
78 4 2024-12-03 2024-12-03 1 1 arjkrish@amazon.com arjkrish@amazon.com
config_slurm.yaml
in launcher_scripts/custom_script
34 - 2024-12-03 2024-12-03 1 1 arjkrish@amazon.com arjkrish@amazon.com
launchers.py
in launcher/nemo
33 4 2024-12-03 2024-12-03 1 1 arjkrish@amazon.com arjkrish@amazon.com
sm_jobs.yaml
in recipes_collection/cluster
18 - 2024-12-03 2024-12-03 1 1 arjkrish@amazon.com arjkrish@amazon.com
pyproject.toml
in root
15 - 2024-12-03 2024-12-03 1 1 arjkrish@amazon.com arjkrish@amazon.com
15 1 2024-12-03 2024-12-03 1 1 arjkrish@amazon.com arjkrish@amazon.com
slurm.yaml
in recipes_collection/cluster
11 - 2024-12-03 2024-12-03 1 1 arjkrish@amazon.com arjkrish@amazon.com
custom_allreduce.py
in launcher_scripts/custom_script
10 - 2024-12-03 2024-12-03 1 1 arjkrish@amazon.com arjkrish@amazon.com
training-config.yaml
in launcher/nemo/k8s_templates/training
8 - 2024-12-03 2024-12-03 1 1 arjkrish@amazon.com arjkrish@amazon.com
Chart.yaml
in launcher/nemo/k8s_templates/training
5 - 2024-12-03 2024-12-03 1 1 arjkrish@amazon.com arjkrish@amazon.com
__init__.py
in launcher/nemo
1 - 2024-12-03 2024-12-03 1 1 arjkrish@amazon.com arjkrish@amazon.com
__init__.py
in launcher
1 - 2024-12-03 2024-12-03 1 1 arjkrish@amazon.com arjkrish@amazon.com
value_validator.py
in launcher/config_validator
138 14 2024-12-03 2025-03-07 2 2 arjkrish@amazon.com htzhong@amazon.com
hf_llama3_8b_seq8k_gpu_dpo.yaml
in recipes_collection/recipes/fine-tuning/llama
110 - 2025-04-09 2025-04-09 1 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_deepseek_r1_distilled_llama_8b_seq16k_gpu_fine_tuning.yaml
in recipes_collection/recipes/fine-tuning/deepseek
108 - 2025-01-30 2025-02-25 3 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_deepseek_r1_distilled_llama_8b_seq8k_gpu_fine_tuning.yaml
in recipes_collection/recipes/fine-tuning/deepseek
108 - 2025-01-30 2025-02-25 3 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_deepseek_r1_distilled_llama_70b_seq8k_gpu_fine_tuning.yaml
in recipes_collection/recipes/fine-tuning/deepseek
108 - 2025-01-30 2025-02-25 3 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_deepseek_r1_distilled_llama_70b_seq16k_gpu_fine_tuning.yaml
in recipes_collection/recipes/fine-tuning/deepseek
108 - 2025-01-30 2025-02-25 3 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_llama3_3_70b_seq8k_gpu_fine_tuning.yaml
in recipes_collection/recipes/fine-tuning/llama
108 - 2024-12-31 2025-02-25 2 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_llama3_3_70b_seq16k_gpu_fine_tuning.yaml
in recipes_collection/recipes/fine-tuning/llama
108 - 2024-12-31 2025-02-25 2 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_llama3_70b_seq16k_gpu_p5x128_pretrain.yaml
in recipes_collection/recipes/training/llama
107 - 2024-12-31 2025-02-25 2 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_llama3_70b_seq8k_gpu_p5x128_pretrain.yaml
in recipes_collection/recipes/training/llama
107 - 2024-12-31 2025-02-25 2 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_deepseek_r1_distilled_qwen_32b_seq16k_gpu_fine_tuning.yaml
in recipes_collection/recipes/fine-tuning/deepseek
107 - 2025-02-01 2025-02-25 2 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_deepseek_r1_distilled_qwen_32b_seq16k_gpu_lora.yaml
in recipes_collection/recipes/fine-tuning/deepseek
107 - 2025-02-01 2025-02-25 2 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_deepseek_r1_distilled_qwen_32b_seq8k_gpu_fine_tuning.yaml
in recipes_collection/recipes/fine-tuning/deepseek
107 - 2025-02-01 2025-02-25 2 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_deepseek_r1_distilled_qwen_32b_seq8k_gpu_lora.yaml
in recipes_collection/recipes/fine-tuning/deepseek
107 - 2025-02-01 2025-02-25 2 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_llama4_17b_16e_seq4k_gpu_lora_text_to_text.yaml
in recipes_collection/recipes/fine-tuning/llama
107 - 2025-04-21 2025-04-21 1 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_llama4_17b_16e_seq8k_gpu_lora_multimodal_finetuning.yaml
in recipes_collection/recipes/fine-tuning/llama
107 - 2025-04-17 2025-04-21 2 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_llama4_17b_16e_seq4k_gpu_lora_multimodal_finetuning.yaml
in recipes_collection/recipes/fine-tuning/llama
107 - 2025-04-21 2025-04-21 1 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_llama4_17b_16e_seq8k_gpu_lora_text_to_text.yaml
in recipes_collection/recipes/fine-tuning/llama
107 - 2025-04-09 2025-04-17 2 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_deepseek_r1_distilled_qwen_1_dot_5b_seq16k_gpu_fine_tuning.yaml
in recipes_collection/recipes/fine-tuning/deepseek
106 - 2025-02-01 2025-02-25 4 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_deepseek_r1_distilled_qwen_1_dot_5b_seq16k_gpu_lora.yaml
in recipes_collection/recipes/fine-tuning/deepseek
106 - 2025-02-01 2025-02-25 4 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_deepseek_r1_distilled_qwen_14b_seq16k_gpu_fine_tuning.yaml
in recipes_collection/recipes/fine-tuning/deepseek
106 - 2025-02-01 2025-02-25 2 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_deepseek_r1_distilled_llama_8b_seq8k_gpu_lora.yaml
in recipes_collection/recipes/fine-tuning/deepseek
106 - 2025-01-30 2025-02-25 3 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_deepseek_r1_distilled_qwen_7b_seq16k_gpu_fine_tuning.yaml
in recipes_collection/recipes/fine-tuning/deepseek
106 - 2025-02-01 2025-02-25 2 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_deepseek_r1_distilled_qwen_14b_seq8k_gpu_fine_tuning.yaml
in recipes_collection/recipes/fine-tuning/deepseek
106 - 2025-02-01 2025-02-25 2 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_deepseek_r1_distilled_qwen_1_dot_5b_seq8k_gpu_fine_tuning.yaml
in recipes_collection/recipes/fine-tuning/deepseek
106 - 2025-02-01 2025-02-25 4 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_deepseek_r1_distilled_qwen_1_dot_5b_seq8k_gpu_lora.yaml
in recipes_collection/recipes/fine-tuning/deepseek
106 - 2025-02-01 2025-02-25 4 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_deepseek_r1_distilled_qwen_7b_seq8k_gpu_fine_tuning.yaml
in recipes_collection/recipes/fine-tuning/deepseek
106 - 2025-02-01 2025-02-25 2 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_deepseek_r1_distilled_qwen_14b_seq8k_gpu_lora.yaml
in recipes_collection/recipes/fine-tuning/deepseek
106 - 2025-02-01 2025-02-25 2 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_deepseek_r1_distilled_llama_70b_seq16k_gpu_lora.yaml
in recipes_collection/recipes/fine-tuning/deepseek
106 - 2025-01-30 2025-02-25 3 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_deepseek_r1_distilled_llama_70b_seq8k_gpu_lora.yaml
in recipes_collection/recipes/fine-tuning/deepseek
106 - 2025-01-30 2025-02-25 3 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_deepseek_r1_distilled_llama_8b_seq16k_gpu_lora.yaml
in recipes_collection/recipes/fine-tuning/deepseek
106 - 2025-01-30 2025-02-25 3 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_llama3_3_70b_seq16k_gpu_lora.yaml
in recipes_collection/recipes/fine-tuning/llama
106 - 2024-12-31 2025-02-25 2 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_llama3_405b_seq32k_gpu_qlora.yaml
in recipes_collection/recipes/fine-tuning/llama
106 - 2024-12-31 2025-02-25 2 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_llama3_3_70b_seq8k_gpu_lora.yaml
in recipes_collection/recipes/fine-tuning/llama
106 - 2024-12-31 2025-02-25 2 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
hf_mixtral_8x22b_seq16k_gpu_p5x128_pretrain.yaml
in recipes_collection/recipes/training/mixtral
105 - 2024-12-31 2025-02-25 3 2 35045363+rohithn1@users.nor... rnadimp@amazon.com
Correlations

File Size vs. Number of Changes: 116 points

[Scatter plot data: one point per file, x = lines of code, y = # changes. Most recipe YAML files cluster around 100-110 lines with 1-5 changes; launcher/nemo/stages.py is the outlier at 618 lines and 14 changes.]
# changes: min 1.0 | average 2.41 | 25th percentile 2.0 | median 2.0 | 75th percentile 3.0 | max 14.0
lines of code: min 1.0 | average 98.73 | 25th percentile 100.0 | median 106.0 | 75th percentile 107.0 | max 618.0
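A sketch of how summary statistics like these can be computed. Note the caveat: the report's quantile method is not documented, and `statistics.quantiles` uses the exclusive method by default, so results on small samples may differ slightly from the report's.

```python
import statistics

def summary(values: list[float]) -> dict[str, float]:
    """Five-number-style summary as shown in the correlation panels."""
    # quantiles(n=4) returns the three quartile cut points.
    q1, med, q3 = statistics.quantiles(values, n=4)
    return {
        "min": min(values),
        "average": round(statistics.mean(values), 2),
        "25th percentile": q1,
        "median": med,
        "75th percentile": q3,
        "max": max(values),
    }
```

For instance, `summary([1, 2, 3, 4, 5])` reports a median of 3 with quartiles 1.5 and 4.5.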

Number of Contributors vs. Number of Changes: 116 points

[Scatter plot data: one point per file, x = # contributors, y = # changes. Most files have 1-3 contributors and 1-5 changes; launcher/nemo/stages.py is the outlier at 9 contributors and 14 changes.]
# changes: min 1.0 | average 2.41 | 25th percentile 2.0 | median 2.0 | 75th percentile 3.0 | max 14.0
# contributors: min 1.0 | average 2.44 | 25th percentile 2.0 | median 2.0 | 75th percentile 3.0 | max 9.0

Number of Contributors vs. File Size: 116 points

[Scatter plot data: one point per file, x = # contributors, y = lines of code. Single-contributor files range from 1 to 109 lines; launcher/nemo/stages.py is the outlier at 9 contributors and 618 lines.]
lines of code: min 1.0 | average 98.73 | 25th percentile 100.0 | median 106.0 | 75th percentile 107.0 | max 618.0
# contributors: min 1.0 | average 2.44 | 25th percentile 2.0 | median 2.0 | 75th percentile 3.0 | max 9.0