An overview of contributor trends.
Contributors who committed in the past 6 months (a rookie is a contributor whose first commit was within the past year)
Past 30 days (2):
Past 31 to 90 days (4):
Past 91 to 180 days (8):
Contributors whose last commit was more than 6 months ago
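The recency buckets above can be derived from each contributor's last commit date. A minimal sketch (the function name and thresholds are assumptions based on the bucket labels in this report):

```python
from datetime import date

def bucket_by_recency(last_commit, today=None):
    """Classify a contributor's last commit date into the report's
    recency buckets: 0-30, 31-90, 91-180 days, or inactive (>180)."""
    today = today or date.today()
    days = (today - last_commit).days
    if days <= 30:
        return "past 30 days"
    if days <= 90:
        return "past 31 to 90 days"
    if days <= 180:
        return "past 91 to 180 days"
    return "more than 6 months ago"

print(bucket_by_recency(date(2025, 4, 21), today=date(2025, 5, 1)))  # past 30 days
```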
Contributor | Commits (3m) | Commit Days |
---|---|---|
35045363+rohithn1@users.noreply.github.com | 10 | 13 |
rnadimp@amazon.com | 10 | 13 |
htzhong@amazon.com | 2 | 1 |
changnit@amazon.com | 2 | 1 |
julian1@cimarron.me | 1 | 1 |
julianhr@amazon.com | 1 | 1 |
jesseche@amazon.com | - | 2 |
tomhbous@amazon.com | - | 1 |
80425164+can-sun@users.noreply.github.com | - | 1 |
spiderpower02@gmail.com | - | 1 |
shchinmy@amazon.com | - | 1 |
agankta@amazon.com | - | 1 |
39390679+ankitaagarwal015@users.noreply.github.com | - | 1 |
arjkrish@amazon.com | - | 1 |
2 contributors (6 commits):
# | Contributor | First Commit | Latest Commit | Commits Count | File Updates (per extension) |
---|---|---|---|---|---|
1. | 35045363+rohithn1@users.noreply.github.com | 2024-12-31 | 2025-04-21 | 3 (50%) | yaml (7), sh (6), md (3), py (2) |
2. | rnadimp@amazon.com | 2024-12-31 | 2025-04-21 | 3 (50%) | yaml (7), sh (6), md (3), py (2) |
A contributor dependency is detected when two contributors have changed the same files within the past 30 days.
The number on each connection shows how many of the same files both contributors changed in the past 30 days.
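The detection rule above amounts to intersecting each pair of contributors' changed-file sets. A minimal sketch (function name and data shape are assumptions; emails are placeholders):

```python
from itertools import combinations

def contributor_dependencies(changes):
    """Given {contributor: set of files changed in the window}, return
    {(a, b): shared_file_count} for every pair sharing at least one file."""
    deps = {}
    for a, b in combinations(sorted(changes), 2):
        shared = changes[a] & changes[b]  # files both contributors touched
        if shared:
            deps[(a, b)] = len(shared)
    return deps

changes = {
    "alice@example.com": {"launcher/nemo/stages.py", "README.md"},
    "bob@example.com": {"launcher/nemo/stages.py", "tests/test_utils.py"},
}
print(contributor_dependencies(changes))
# {('alice@example.com', 'bob@example.com'): 1}
```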
# | Contributor 1 | Contributor 2 | # shared files |
---|---|---|---|
1. | 35045363+rohithn1@users.noreply.github.com | rnadimp@amazon.com | 12 shared files |
recipes_collection/recipes/fine-tuning/llama/hf_llama4_17b_16e_seq4k_gpu_lora_text_to_text.yaml launcher_scripts/llama/run_hf_llama3_8b_seq8k_gpu_dpo.sh recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq8k_gpu_dpo.yaml launcher_scripts/llama/run_hf_llama4_17b_16e_seq4k_gpu_lora_text_to_text.sh recipes_collection/recipes/fine-tuning/llama/hf_llama4_17b_16e_seq8k_gpu_lora_text_to_text.yaml launcher_scripts/llama/run_hf_llama4_17b_16e_seq4k_gpu_lora_multimodal_finetuning.sh recipes_collection/recipes/fine-tuning/llama/hf_llama4_17b_16e_seq8k_gpu_lora_multimodal_finetuning.yaml launcher_scripts/llama/run_hf_llama4_17b_16e_seq8k_gpu_lora_multimodal_finetuning.sh recipes_collection/recipes/fine-tuning/llama/hf_llama4_17b_16e_seq4k_gpu_lora_multimodal_finetuning.yaml launcher/nemo/stages.py README.md launcher_scripts/llama/run_hf_llama4_17b_16e_seq8k_gpu_lora_text_to_text.sh |
# | Contributor | # connections | # commits |
---|---|---|---|
1. | 35045363+rohithn1@users.noreply.github.com | 1 | 3 |
2. | rnadimp@amazon.com | 1 | 3 |
C-median: 1.0
Half of the contributors have more than 1.0 connections, and half have fewer.
C-mean: 1.0
The average number of connections a contributor has with other contributors.
C-index: 1.0
There are 1.0 contributors with 1.0 or more connections.
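The three C-metrics can be reproduced with a short sketch. The function name is an assumption, and the C-index is implemented h-index-style (the largest c such that c contributors have at least c connections), which matches the description in this report:

```python
from statistics import mean, median

def connection_metrics(connections):
    """Compute C-median, C-mean, and C-index from a list of
    per-contributor connection counts."""
    counts = sorted(connections, reverse=True)
    # C-index: largest c such that at least c contributors
    # have c or more connections (analogous to the h-index).
    c_index = 0
    for i, c in enumerate(counts, start=1):
        if c >= i:
            c_index = i
    return {"median": median(counts), "mean": round(mean(counts), 1), "index": c_index}

# The 90-day connection counts from this report: [5, 5, 5, 4, 4, 3]
print(connection_metrics([5, 5, 5, 4, 4, 3]))
# {'median': 4.5, 'mean': 4.3, 'index': 4}
```

This reproduces the C-median 4.5, C-mean 4.3, and C-index 4.0 reported for the 90-day window below.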
6 contributors (26 commits):
# | Contributor | First Commit | Latest Commit | Commits Count | File Updates (per extension) |
---|---|---|---|---|---|
1. | 35045363+rohithn1@users.noreply.github.com | 2024-12-31 | 2025-04-21 | 10 (38%) | yaml (113), sh (15), py (13), md (5), txt (1) |
2. | rnadimp@amazon.com | 2024-12-31 | 2025-04-21 | 10 (38%) | yaml (113), sh (15), py (13), md (5), txt (1) |
3. | htzhong@amazon.com | 2025-03-07 | 2025-03-07 | 2 (7%) | yaml (16), py (10) |
4. | changnit@amazon.com | 2025-02-27 | 2025-02-27 | 2 (7%) | yaml (6) |
5. | julian1@cimarron.me | 2025-02-17 | 2025-02-17 | 1 (3%) | yaml (6), py (4), sh (3) |
6. | julianhr@amazon.com | 2025-02-03 | 2025-02-03 | 1 (3%) | yaml (6), py (4), sh (3) |
A contributor dependency is detected when two contributors have changed the same files within the past 90 days.
The number on each connection shows how many of the same files both contributors changed in the past 90 days.
# | Contributor 1 | Contributor 2 | # shared files |
---|---|---|---|
1. | 35045363+rohithn1@users.noreply.github.com | rnadimp@amazon.com | 122 shared files |
recipes_collection/recipes/fine-tuning/llama/hf_llama3_70b_seq8k_gpu_lora.yaml tests/sm_jobs_workflow/test_sm_jobs_workflow.py recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_1_dot_5b_seq16k_gpu_fine_tuning.yaml tests/sm_jobs_workflow/sm_jobs_baseline_artifacts/multimodal/llama3-2-11b/recipe.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama4_17b_16e_seq8k_gpu_lora_multimodal_finetuning.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x22b_seq8k_gpu_p5x128_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_3_70b_seq16k_gpu_lora.yaml tests/sm_jobs_workflow/sm_jobs_baseline_artifacts/multimodal/llama3-2-11b/launch.py recipes_collection/recipes/training/mixtral/hf_mixtral_8x7b_seq8k_gpu_p5x16_pretrain.yaml recipes_collection/recipes/training/mistral/hf_mistral_7b_seq16k_gpu_p5x16_pretrain.yaml tests/sm_jobs_workflow/sm_jobs_baseline_artifacts/with_kwargs/llama-8b/llama-8b_hydra.yaml recipes_collection/recipes/training/llama/megatron_llama3_1_8b_nemo.yaml recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_671b_seq8k_gpu_qlora.yaml tests/sm_jobs_workflow/sm_jobs_baseline_artifacts/multimodal/llama3-2-11b/llama3-2-11b_hydra.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x7b_seq8k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x22b_seq16k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_671b_seq8k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_405b_seq8k_gpu_lora.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x22b_seq16k_gpu_p5x64_pretrain.yaml recipes_collection/recipes/training/mistral/hf_mistral_7b_seq16k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_70b_seq16k_gpu_fine_tuning.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq16k_gpu_lora.yaml 
recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_32b_seq16k_gpu_lora.yaml recipes_collection/recipes/training/llama/hf_llama3_8b_seq16k_gpu_p5x32_pretrain.yaml launcher_scripts/llama/run_hf_llama3_2_3b_seq8k_gpu_p5x1_pretrain.sh tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/templates/training.yaml recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_llama_8b_seq8k_gpu_lora.yaml tests/sm_jobs_workflow/sm_jobs_baseline_artifacts/multimodal/llama3-2-11b/requirements.txt recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_14b_seq16k_gpu_fine_tuning.yaml tests/test_launcher_scripts.py recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_1_dot_5b_seq16k_gpu_lora.yaml tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/config/llama-8b_hydra.yaml recipes_collection/recipes/training/llama/hf_llama3_8b_seq8k_trn1x4_pretrain.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x7b_seq16k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_llama_70b_seq8k_gpu_lora.yaml launcher_scripts/llama/run_hf_llama4_17b_16e_seq4k_gpu_lora_multimodal_finetuning.sh recipes_collection/recipes/training/llama/hf_llama3_70b_seq8k_trn1x16_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama4_17b_16e_seq4k_gpu_lora_multimodal_finetuning.yaml recipes_collection/recipes/training/llama/p4_hf_llama3_70b_seq8k_gpu.yaml tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/llama-8b_hydra.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_qwen_1_dot_5b_seq16k_gpu_fine_tuning.sh recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_llama_70b_seq8k_gpu_fine_tuning.yaml recipes_collection/recipes/training/llama/hf_llama3_8b_seq8k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq8k_trn1_fine_tuning.yaml 
recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_1_dot_5b_seq8k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_llama_70b_seq16k_gpu_lora.yaml recipes_collection/recipes/training/llama/hf_llama3_70b_seq16k_gpu_p5x128_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/p4_hf_llama3_8b_seq8k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_14b_seq16k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_70b_seq16k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_405b_seq16k_gpu_qlora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_70b_seq8k_gpu_fine_tuning.yaml recipes_collection/recipes/fine-tuning/llama/p4_hf_llama3_8b_seq8k_gpu_fine_tuning.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x22b_seq8k_gpu_p5x32_pretrain.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_qwen_1_dot_5b_seq8k_gpu_fine_tuning.sh recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_1_dot_5b_seq8k_gpu_fine_tuning.yaml recipes_collection/recipes/fine-tuning/llama/p4_hf_llama3_70b_seq8k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_405b_seq8k_gpu_qlora.yaml recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_llama_8b_seq16k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_3_70b_seq8k_gpu_lora.yaml recipes_collection/recipes/training/llama/hf_llama3_2_90b_seq8k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_32b_seq16k_gpu_fine_tuning.yaml recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_7b_seq8k_gpu_lora.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_671b_seq8k_gpu_qlora.sh recipes_collection/recipes/training/llama/hf_llama3_2_1b_seq8k_gpu_p5x1_pretrain.yaml 
recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_7b_seq16k_gpu_lora.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_qwen_1_dot_5b_seq16k_gpu_lora.sh tests/sm_jobs_workflow/sm_jobs_baseline_artifacts/multimodal/llama3-2-11b/sm_jobs_config.yaml tests/sm_jobs_workflow/sm_jobs_baseline_artifacts/no_kwargs/llama-8b/llama-8b_hydra.yaml launcher_scripts/llama/run_hf_llama4_17b_16e_seq4k_gpu_lora_text_to_text.sh recipes_collection/recipes/fine-tuning/llama/hf_llama3_405b_seq16k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_32b_seq8k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq8k_gpu_fine_tuning.yaml recipes_collection/recipes/training/llama/hf_llama3_2_3b_seq8k_gpu_p5x1_pretrain.yaml recipes_collection/recipes/training/llama/hf_llama3_2_11b_seq8k_gpu_p5x4_pretrain.yaml recipes_collection/recipes/training/llama/hf_llama3_70b_seq8k_gpu_p5x128_pretrain.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x22b_seq16k_gpu_p5x128_pretrain.yaml recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_14b_seq8k_gpu_fine_tuning.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama4_17b_16e_seq4k_gpu_lora_text_to_text.yaml recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_14b_seq8k_gpu_lora.yaml recipes_collection/recipes/training/llama/hf_llama3_70b_seq8k_gpu_p5x32_pretrain.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_671b_seq8k_gpu_lora.sh tests/sm_jobs_workflow/sm_jobs_baseline_artifacts/multimodal/llama3-2-11b/llama3-2-11b_submission.sh launcher/nemo/k8s_templates/training/training.yaml launcher_scripts/llama/run_hf_llama3_2_1b_seq8k_gpu_p5x1_pretrain.sh recipes_collection/recipes/training/llama/hf_llama3_70b_seq16k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/training/mistral/hf_mistral_7b_seq8k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_405b_seq32k_gpu_qlora.yaml 
tests/test_config_files.py recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq8k_gpu_lora.yaml tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/templates/training.yaml recipes_collection/recipes/training/mistral/hf_mistral_7b_seq8k_gpu_p5x16_pretrain.yaml README.md launcher_scripts/llama/run_hf_llama3_8b_seq8k_gpu_dpo.sh recipes_collection/recipes/training/llama/hf_llama3_8b_seq8k_gpu_p5x16_pretrain.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_qwen_1_dot_5b_seq8k_gpu_lora.sh recipes_collection/recipes/training/custom_model/falcon.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x7b_seq16k_gpu_p5x16_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_405b_seq128k_gpu_qlora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq16k_gpu_fine_tuning.yaml ... |
2. | julianhr@amazon.com | julian1@cimarron.me | 13 shared files |
recipes_collection/config.yaml tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml launcher/nemo/k8s_templates/training/train-script-trn.yaml tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/train_script.sh tests/slurm_workflow/slurm_baseline_artifacts/test_custom/train_script.sh tests/test_readme.py launcher/nemo/k8s_templates/training/train-script-gpu.yaml tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/train_script.sh tests/test_utils.py launcher/nemo/k8s_templates/training/values.yaml tests/test_recipes.py launcher/nemo/stages.py |
3. | htzhong@amazon.com | julian1@cimarron.me | 4 shared files |
tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml launcher/nemo/k8s_templates/training/values.yaml launcher/nemo/stages.py |
4. | htzhong@amazon.com | julianhr@amazon.com | 4 shared files |
tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml launcher/nemo/k8s_templates/training/values.yaml launcher/nemo/stages.py |
5. | 35045363+rohithn1@users.noreply.github.com | htzhong@amazon.com | 4 shared files |
tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/templates/training.yaml launcher/nemo/k8s_templates/training/training.yaml tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/templates/training.yaml launcher/nemo/stages.py |
6. | rnadimp@amazon.com | htzhong@amazon.com | 4 shared files |
tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/templates/training.yaml launcher/nemo/k8s_templates/training/training.yaml tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/templates/training.yaml launcher/nemo/stages.py |
7. | 35045363+rohithn1@users.noreply.github.com | changnit@amazon.com | 3 shared files |
tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/templates/training.yaml launcher/nemo/k8s_templates/training/training.yaml tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/templates/training.yaml |
8. | rnadimp@amazon.com | changnit@amazon.com | 3 shared files |
tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/templates/training.yaml launcher/nemo/k8s_templates/training/training.yaml tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/templates/training.yaml |
9. | htzhong@amazon.com | changnit@amazon.com | 3 shared files |
tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/templates/training.yaml launcher/nemo/k8s_templates/training/training.yaml tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/templates/training.yaml |
10. | 35045363+rohithn1@users.noreply.github.com | julian1@cimarron.me | 2 shared files |
tests/test_utils.py launcher/nemo/stages.py |
11. | 35045363+rohithn1@users.noreply.github.com | julianhr@amazon.com | 2 shared files |
tests/test_utils.py launcher/nemo/stages.py |
12. | rnadimp@amazon.com | julian1@cimarron.me | 2 shared files |
tests/test_utils.py launcher/nemo/stages.py |
13. | rnadimp@amazon.com | julianhr@amazon.com | 2 shared files |
tests/test_utils.py launcher/nemo/stages.py |
# | Contributor | # connections | # commits |
---|---|---|---|
1. | 35045363+rohithn1@users.noreply.github.com | 5 | 10 |
2. | rnadimp@amazon.com | 5 | 10 |
3. | htzhong@amazon.com | 5 | 2 |
4. | julian1@cimarron.me | 4 | 1 |
5. | julianhr@amazon.com | 4 | 1 |
6. | changnit@amazon.com | 3 | 2 |
C-median: 4.5
Half of the contributors have more than 4.5 connections, and half have fewer.
C-mean: 4.3
The average number of connections a contributor has with other contributors.
C-index: 4.0
There are 4.0 contributors with 4.0 or more connections.
14 contributors (44 commits):
# | Contributor | First Commit | Latest Commit | Commits Count |
---|---|---|---|---|
1. | 35045363+rohithn1@users.noreply.github.com | 2024-12-31 | 2025-04-21 | 15 (34%) |
2. | rnadimp@amazon.com | 2024-12-31 | 2025-04-21 | 13 (29%) |
3. | htzhong@amazon.com | 2025-03-07 | 2025-03-07 | 2 (4%) |
4. | changnit@amazon.com | 2025-02-27 | 2025-02-27 | 2 (4%) |
5. | jesseche@amazon.com | 2024-12-07 | 2024-12-24 | 2 (4%) |
6. | agankta@amazon.com | 2024-12-04 | 2024-12-04 | 2 (4%) |
7. | julian1@cimarron.me | 2025-02-17 | 2025-02-17 | 1 (2%) |
8. | julianhr@amazon.com | 2025-02-03 | 2025-02-03 | 1 (2%) |
9. | tomhbous@amazon.com | 2024-12-23 | 2024-12-23 | 1 (2%) |
10. | 80425164+can-sun@users.noreply.github.com | 2024-12-16 | 2024-12-16 | 1 (2%) |
11. | spiderpower02@gmail.com | 2024-12-06 | 2024-12-06 | 1 (2%) |
12. | shchinmy@amazon.com | 2024-12-05 | 2024-12-05 | 1 (2%) |
13. | 39390679+ankitaagarwal015@users.noreply.github.com | 2024-12-04 | 2024-12-04 | 1 (2%) |
14. | arjkrish@amazon.com | 2024-12-03 | 2024-12-03 | 1 (2%) |
A contributor dependency is detected when two contributors have changed the same files within the past 180 days.
The number on each connection shows how many of the same files both contributors changed in the past 180 days.
# | Contributor 1 | Contributor 2 | # shared files |
---|---|---|---|
1. | 35045363+rohithn1@users.noreply.github.com | rnadimp@amazon.com | 156 shared files |
tests/sm_jobs_workflow/test_sm_jobs_workflow.py launcher_scripts/llama/run_hf_llama3_70b_seq16k_gpu_p5x128.pretrain.sh recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_1_dot_5b_seq16k_gpu_fine_tuning.yaml tests/sm_jobs_workflow/sm_jobs_baseline_artifacts/multimodal/llama3-2-11b/launch.py recipes_collection/recipes/training/mistral/hf_mistral_7b_seq16k_gpu_p5x16_pretrain.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_llama_8b_seq8k_gpu_lora.sh recipes_collection/recipes/training/llama/megatron_llama3_1_8b_nemo.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_qwen_7b_seq16k_gpu_fine_tuning.sh recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_671b_seq8k_gpu_qlora.yaml tests/sm_jobs_workflow/sm_jobs_baseline_artifacts/multimodal/llama3-2-11b/llama3-2-11b_hydra.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x7b_seq8k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x22b_seq16k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x22b_seq16k_gpu_p5x64_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq16k_gpu_lora.yaml recipes_collection/recipes/training/llama/hf_llama3_8b_seq16k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_1_dot_5b_seq16k_gpu_lora.yaml launcher/nemo/constants.py recipes_collection/recipes/training/llama/hf_llama3_70b_seq8k_trn1x16_pretrain.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_llama_8b_seq16k_gpu_lora.sh recipes_collection/recipes/fine-tuning/llama/hf_llama4_17b_16e_seq4k_gpu_lora_multimodal_finetuning.yaml recipes_collection/recipes/training/llama/p4_hf_llama3_70b_seq8k_gpu.yaml tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/llama-8b_hydra.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq8k_trn1_fine_tuning.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_qwen_14b_seq16k_gpu_fine_tuning.sh 
launcher_scripts/llama/run_hf_llama3_405b_seq32k_gpu_qlora.sh launcher_scripts/deepseek/run_hf_deepseek_r1_qwen_1_dot_5b_seq8k_gpu_fine_tuning.sh recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_1_dot_5b_seq8k_gpu_fine_tuning.yaml recipes_collection/recipes/fine-tuning/llama/p4_hf_llama3_70b_seq8k_gpu_lora.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_llama_8b_seq16k_gpu_fine_tuning.sh recipes_collection/recipes/fine-tuning/llama/hf_llama3_405b_seq8k_gpu_qlora.yaml recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_llama_8b_seq16k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_3_70b_seq8k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_32b_seq16k_gpu_fine_tuning.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_671b_seq8k_gpu_qlora.sh recipes_collection/recipes/training/llama/hf_llama3_2_1b_seq8k_gpu_p5x1_pretrain.yaml recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_7b_seq16k_gpu_lora.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_qwen_14b_seq8k_gpu_fine_tuning.sh launcher_scripts/llama/run_hf_llama4_17b_16e_seq4k_gpu_lora_text_to_text.sh recipes_collection/recipes/fine-tuning/llama/hf_llama3_405b_seq16k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq8k_gpu_fine_tuning.yaml recipes_collection/recipes/training/llama/hf_llama3_70b_seq8k_gpu_p5x128_pretrain.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x22b_seq16k_gpu_p5x128_pretrain.yaml recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_14b_seq8k_gpu_fine_tuning.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama4_17b_16e_seq4k_gpu_lora_text_to_text.yaml recipes_collection/recipes/training/llama/hf_llama3_70b_seq8k_gpu_p5x32_pretrain.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_671b_seq8k_gpu_lora.sh 
recipes_collection/recipes/training/llama/hf_llama3_70b_seq16k_gpu_p5x32_pretrain.yaml launcher_scripts/llama/run_hf_llama3_2_11b_seq8k_gpu_p5x4_pretrain.sh recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq8k_gpu_lora.yaml recipes_collection/recipes/training/mistral/hf_mistral_7b_seq8k_gpu_p5x16_pretrain.yaml launcher_scripts/llama/run_hf_llama3_70b_seq8k_gpu_p5x128_pretrain.sh recipes_collection/recipes/training/llama/hf_llama3_8b_seq8k_gpu_p5x16_pretrain.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_qwen_32b_seq8k_gpu_lora.sh launcher_scripts/llama/run_hf_llama3_3_70b_seq16k_gpu_fine_tuning.sh launcher_scripts/deepseek/run_hf_deepseek_r1_qwen_32b_seq8k_gpu_fine_tuning.sh recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq16k_gpu_fine_tuning.yaml recipes_collection/recipes/fine-tuning/llama/p4_hf_llama3_70b_seq8k_gpu_fine_tuning.yaml recipes_collection/recipes/training/llama/hf_llama3_70b_seq8k_gpu_p5x64_pretrain.yaml tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/llama-8b_hydra.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_llama_70b_seq8k_gpu_lora.sh launcher_scripts/llama/run_hf_llama4_17b_16e_seq8k_gpu_lora_text_to_text.sh launcher_scripts/llama/run_hf_llama3_3_70b_seq8k_gpu_fine_tuning.sh recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_7b_seq16k_gpu_fine_tuning.yaml recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_7b_seq8k_gpu_fine_tuning.yaml tests/test_recipes.py launcher_scripts/deepseek/run_hf_deepseek_r1_qwen_14b_seq8k_gpu_lora.sh recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_llama_8b_seq16k_gpu_fine_tuning.yaml recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_llama_70b_seq16k_gpu_fine_tuning.yaml launcher/nemo/stages.py recipes_collection/recipes/fine-tuning/llama/hf_llama3_70b_seq8k_gpu_lora.yaml tests/sm_jobs_workflow/sm_jobs_baseline_artifacts/multimodal/llama3-2-11b/recipe.yaml 
recipes_collection/recipes/fine-tuning/llama/hf_llama4_17b_16e_seq8k_gpu_lora_multimodal_finetuning.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x22b_seq8k_gpu_p5x128_pretrain.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_qwen_7b_seq8k_gpu_lora.sh recipes_collection/recipes/fine-tuning/llama/hf_llama3_3_70b_seq16k_gpu_lora.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x7b_seq8k_gpu_p5x16_pretrain.yaml tests/sm_jobs_workflow/sm_jobs_baseline_artifacts/with_kwargs/llama-8b/llama-8b_hydra.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_qwen_32b_seq16k_gpu_fine_tuning.sh recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_671b_seq8k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_405b_seq8k_gpu_lora.yaml recipes_collection/recipes/training/mistral/hf_mistral_7b_seq16k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_70b_seq16k_gpu_fine_tuning.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_qwen_7b_seq16k_gpu_lora.sh recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_32b_seq16k_gpu_lora.yaml launcher_scripts/llama/run_hf_llama3_2_3b_seq8k_gpu_p5x1_pretrain.sh tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/templates/training.yaml recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_llama_8b_seq8k_gpu_lora.yaml tests/sm_jobs_workflow/sm_jobs_baseline_artifacts/multimodal/llama3-2-11b/requirements.txt recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_14b_seq16k_gpu_fine_tuning.yaml tests/test_launcher_scripts.py tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/config/llama-8b_hydra.yaml recipes_collection/recipes/training/llama/hf_llama3_8b_seq8k_trn1x4_pretrain.yaml launcher_scripts/mixtral/run_hf_mixtral_8x22b_seq16k_gpu_p5x128_pretrain.sh recipes_collection/recipes/training/mixtral/hf_mixtral_8x7b_seq16k_gpu_p5x32_pretrain.yaml 
recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_llama_70b_seq8k_gpu_lora.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_qwen_14b_seq16k_gpu_lora.sh launcher_scripts/llama/run_hf_llama4_17b_16e_seq4k_gpu_lora_multimodal_finetuning.sh launcher_scripts/llama/run_hf_llama3_2_90b_seq8k_gpu_p5x32_pretrain.sh launcher_scripts/deepseek/run_hf_deepseek_r1_llama_70b_seq16k_gpu_lora.sh launcher_scripts/deepseek/run_hf_deepseek_r1_qwen_1_dot_5b_seq16k_gpu_fine_tuning.sh ... |
2. | 35045363+rohithn1@users.noreply.github.com | arjkrish@amazon.com | 67 shared files |
tests/sm_jobs_workflow/test_sm_jobs_workflow.py recipes_collection/recipes/training/mistral/hf_mistral_7b_seq16k_gpu_p5x16_pretrain.yaml recipes_collection/recipes/training/llama/megatron_llama3_1_8b_nemo.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x7b_seq8k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x22b_seq16k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x22b_seq16k_gpu_p5x64_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq16k_gpu_lora.yaml recipes_collection/recipes/training/llama/hf_llama3_8b_seq16k_gpu_p5x32_pretrain.yaml launcher/nemo/constants.py recipes_collection/recipes/training/llama/hf_llama3_70b_seq8k_trn1x16_pretrain.yaml recipes_collection/recipes/training/llama/p4_hf_llama3_70b_seq8k_gpu.yaml tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/llama-8b_hydra.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq8k_trn1_fine_tuning.yaml recipes_collection/recipes/fine-tuning/llama/p4_hf_llama3_70b_seq8k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_405b_seq8k_gpu_qlora.yaml recipes_collection/recipes/training/llama/hf_llama3_2_1b_seq8k_gpu_p5x1_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_405b_seq16k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq8k_gpu_fine_tuning.yaml recipes_collection/recipes/training/llama/hf_llama3_70b_seq8k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/training/llama/hf_llama3_70b_seq16k_gpu_p5x32_pretrain.yaml launcher_scripts/llama/run_hf_llama3_2_11b_seq8k_gpu_p5x4_pretrain.sh recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq8k_gpu_lora.yaml recipes_collection/recipes/training/mistral/hf_mistral_7b_seq8k_gpu_p5x16_pretrain.yaml recipes_collection/recipes/training/llama/hf_llama3_8b_seq8k_gpu_p5x16_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq16k_gpu_fine_tuning.yaml 
recipes_collection/recipes/fine-tuning/llama/p4_hf_llama3_70b_seq8k_gpu_fine_tuning.yaml recipes_collection/recipes/training/llama/hf_llama3_70b_seq8k_gpu_p5x64_pretrain.yaml tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/llama-8b_hydra.yaml launcher/nemo/stages.py recipes_collection/recipes/fine-tuning/llama/hf_llama3_70b_seq8k_gpu_lora.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x7b_seq8k_gpu_p5x16_pretrain.yaml tests/sm_jobs_workflow/sm_jobs_baseline_artifacts/with_kwargs/llama-8b/llama-8b_hydra.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_405b_seq8k_gpu_lora.yaml recipes_collection/recipes/training/mistral/hf_mistral_7b_seq16k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_70b_seq16k_gpu_fine_tuning.yaml launcher_scripts/llama/run_hf_llama3_2_3b_seq8k_gpu_p5x1_pretrain.sh tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/templates/training.yaml tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/config/llama-8b_hydra.yaml recipes_collection/recipes/training/llama/hf_llama3_8b_seq8k_trn1x4_pretrain.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x7b_seq16k_gpu_p5x32_pretrain.yaml launcher_scripts/llama/run_hf_llama3_2_90b_seq8k_gpu_p5x32_pretrain.sh recipes_collection/recipes/training/llama/hf_llama3_8b_seq8k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/p4_hf_llama3_8b_seq8k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_70b_seq16k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_405b_seq16k_gpu_qlora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_70b_seq8k_gpu_fine_tuning.yaml recipes_collection/recipes/fine-tuning/llama/p4_hf_llama3_8b_seq8k_gpu_fine_tuning.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x22b_seq8k_gpu_p5x32_pretrain.yaml launcher_scripts/llama/run_hf_llama3_8b_seq8k_trn1_fine_tuning.sh 
recipes_collection/recipes/training/llama/hf_llama3_2_90b_seq8k_gpu_p5x32_pretrain.yaml tests/sm_jobs_workflow/sm_jobs_baseline_artifacts/no_kwargs/llama-8b/llama-8b_hydra.yaml recipes_collection/recipes/training/llama/hf_llama3_2_3b_seq8k_gpu_p5x1_pretrain.yaml recipes_collection/recipes/training/llama/hf_llama3_2_11b_seq8k_gpu_p5x4_pretrain.yaml launcher/nemo/k8s_templates/training/training.yaml launcher_scripts/llama/run_hf_llama3_2_1b_seq8k_gpu_p5x1_pretrain.sh recipes_collection/recipes/training/mistral/hf_mistral_7b_seq8k_gpu_p5x32_pretrain.yaml tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/templates/training.yaml README.md recipes_collection/recipes/training/custom_model/falcon.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x7b_seq16k_gpu_p5x16_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_405b_seq128k_gpu_qlora.yaml recipes_collection/recipes/training/llama/hf_llama3_70b_seq16k_gpu_p5x64_pretrain.yaml recipes_collection/recipes/training/llama/hf_llama3_8b_seq16k_gpu_p5x16_pretrain.yaml main.py tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/llama-8b_hydra.yaml tests/test_utils.py recipes_collection/recipes/training/mixtral/hf_mixtral_8x22b_seq8k_gpu_p5x64_pretrain.yaml |
3. | rnadimp@amazon.com | arjkrish@amazon.com |
67 shared files
tests/sm_jobs_workflow/test_sm_jobs_workflow.py recipes_collection/recipes/training/mistral/hf_mistral_7b_seq16k_gpu_p5x16_pretrain.yaml recipes_collection/recipes/training/llama/megatron_llama3_1_8b_nemo.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x7b_seq8k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x22b_seq16k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x22b_seq16k_gpu_p5x64_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq16k_gpu_lora.yaml recipes_collection/recipes/training/llama/hf_llama3_8b_seq16k_gpu_p5x32_pretrain.yaml launcher/nemo/constants.py recipes_collection/recipes/training/llama/hf_llama3_70b_seq8k_trn1x16_pretrain.yaml recipes_collection/recipes/training/llama/p4_hf_llama3_70b_seq8k_gpu.yaml tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/llama-8b_hydra.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq8k_trn1_fine_tuning.yaml recipes_collection/recipes/fine-tuning/llama/p4_hf_llama3_70b_seq8k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_405b_seq8k_gpu_qlora.yaml recipes_collection/recipes/training/llama/hf_llama3_2_1b_seq8k_gpu_p5x1_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_405b_seq16k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq8k_gpu_fine_tuning.yaml recipes_collection/recipes/training/llama/hf_llama3_70b_seq8k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/training/llama/hf_llama3_70b_seq16k_gpu_p5x32_pretrain.yaml launcher_scripts/llama/run_hf_llama3_2_11b_seq8k_gpu_p5x4_pretrain.sh recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq8k_gpu_lora.yaml recipes_collection/recipes/training/mistral/hf_mistral_7b_seq8k_gpu_p5x16_pretrain.yaml recipes_collection/recipes/training/llama/hf_llama3_8b_seq8k_gpu_p5x16_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq16k_gpu_fine_tuning.yaml 
recipes_collection/recipes/fine-tuning/llama/p4_hf_llama3_70b_seq8k_gpu_fine_tuning.yaml recipes_collection/recipes/training/llama/hf_llama3_70b_seq8k_gpu_p5x64_pretrain.yaml tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/llama-8b_hydra.yaml launcher/nemo/stages.py recipes_collection/recipes/fine-tuning/llama/hf_llama3_70b_seq8k_gpu_lora.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x7b_seq8k_gpu_p5x16_pretrain.yaml tests/sm_jobs_workflow/sm_jobs_baseline_artifacts/with_kwargs/llama-8b/llama-8b_hydra.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_405b_seq8k_gpu_lora.yaml recipes_collection/recipes/training/mistral/hf_mistral_7b_seq16k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_70b_seq16k_gpu_fine_tuning.yaml launcher_scripts/llama/run_hf_llama3_2_3b_seq8k_gpu_p5x1_pretrain.sh tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/templates/training.yaml tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/config/llama-8b_hydra.yaml recipes_collection/recipes/training/llama/hf_llama3_8b_seq8k_trn1x4_pretrain.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x7b_seq16k_gpu_p5x32_pretrain.yaml launcher_scripts/llama/run_hf_llama3_2_90b_seq8k_gpu_p5x32_pretrain.sh recipes_collection/recipes/training/llama/hf_llama3_8b_seq8k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/p4_hf_llama3_8b_seq8k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_70b_seq16k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_405b_seq16k_gpu_qlora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_70b_seq8k_gpu_fine_tuning.yaml recipes_collection/recipes/fine-tuning/llama/p4_hf_llama3_8b_seq8k_gpu_fine_tuning.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x22b_seq8k_gpu_p5x32_pretrain.yaml launcher_scripts/llama/run_hf_llama3_8b_seq8k_trn1_fine_tuning.sh 
recipes_collection/recipes/training/llama/hf_llama3_2_90b_seq8k_gpu_p5x32_pretrain.yaml tests/sm_jobs_workflow/sm_jobs_baseline_artifacts/no_kwargs/llama-8b/llama-8b_hydra.yaml recipes_collection/recipes/training/llama/hf_llama3_2_3b_seq8k_gpu_p5x1_pretrain.yaml recipes_collection/recipes/training/llama/hf_llama3_2_11b_seq8k_gpu_p5x4_pretrain.yaml launcher/nemo/k8s_templates/training/training.yaml launcher_scripts/llama/run_hf_llama3_2_1b_seq8k_gpu_p5x1_pretrain.sh recipes_collection/recipes/training/mistral/hf_mistral_7b_seq8k_gpu_p5x32_pretrain.yaml tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/templates/training.yaml README.md recipes_collection/recipes/training/custom_model/falcon.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x7b_seq16k_gpu_p5x16_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_405b_seq128k_gpu_qlora.yaml recipes_collection/recipes/training/llama/hf_llama3_70b_seq16k_gpu_p5x64_pretrain.yaml recipes_collection/recipes/training/llama/hf_llama3_8b_seq16k_gpu_p5x16_pretrain.yaml main.py tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/llama-8b_hydra.yaml tests/test_utils.py recipes_collection/recipes/training/mixtral/hf_mixtral_8x22b_seq8k_gpu_p5x64_pretrain.yaml |
4. | htzhong@amazon.com | arjkrish@amazon.com |
13 shared files
tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml recipes_collection/cluster/k8s.yaml tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml launcher_scripts/custom_script/config_k8s.yaml launcher/nemo/k8s_templates/training/values.yaml launcher/nemo/stages.py tests/config_validator/test_value_validator.py launcher/config_validator/type_validator.py tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/templates/training.yaml launcher/config_validator/value_validator.py launcher/nemo/k8s_templates/training/training.yaml tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/templates/training.yaml tests/config_validator/test_type_validator.py |
5. | julian1@cimarron.me | julianhr@amazon.com |
13 shared files
tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/train_script.sh tests/test_readme.py launcher/nemo/k8s_templates/training/train-script-gpu.yaml tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml launcher/nemo/k8s_templates/training/values.yaml tests/test_recipes.py launcher/nemo/stages.py recipes_collection/config.yaml launcher/nemo/k8s_templates/training/train-script-trn.yaml tests/slurm_workflow/slurm_baseline_artifacts/test_custom/train_script.sh tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/train_script.sh tests/test_utils.py |
6. | julian1@cimarron.me | arjkrish@amazon.com |
11 shared files
tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/train_script.sh launcher/nemo/k8s_templates/training/train-script-gpu.yaml tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml launcher/nemo/k8s_templates/training/values.yaml launcher/nemo/stages.py recipes_collection/config.yaml launcher/nemo/k8s_templates/training/train-script-trn.yaml tests/slurm_workflow/slurm_baseline_artifacts/test_custom/train_script.sh tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/train_script.sh tests/test_utils.py |
7. | julianhr@amazon.com | arjkrish@amazon.com |
11 shared files
tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/train_script.sh launcher/nemo/k8s_templates/training/train-script-gpu.yaml tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml launcher/nemo/k8s_templates/training/values.yaml launcher/nemo/stages.py recipes_collection/config.yaml launcher/nemo/k8s_templates/training/train-script-trn.yaml tests/slurm_workflow/slurm_baseline_artifacts/test_custom/train_script.sh tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/train_script.sh tests/test_utils.py |
8. | jesseche@amazon.com | arjkrish@amazon.com |
11 shared files
tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/sagemaker-llama-8b_submission.sh tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/train_script.sh tests/slurm_workflow/slurm_baseline_artifacts/test_custom/testcustom_slurm_test_custom_submission.sh tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml launcher/nemo/stages.py tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/sagemaker-hf-llama3-8b_submission.sh tests/slurm_workflow/slurm_baseline_artifacts/test_custom/train_script.sh tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/train_script.sh launcher/efa.py launcher/accelerator_devices.py |
9. | jesseche@amazon.com | spiderpower02@gmail.com |
9 shared files
tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/sagemaker-llama-8b_submission.sh tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/train_script.sh tests/slurm_workflow/slurm_baseline_artifacts/test_custom/testcustom_slurm_test_custom_submission.sh tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml launcher/nemo/stages.py tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/sagemaker-hf-llama3-8b_submission.sh tests/slurm_workflow/slurm_baseline_artifacts/test_custom/train_script.sh tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/train_script.sh |
10. | spiderpower02@gmail.com | arjkrish@amazon.com |
9 shared files
tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/sagemaker-llama-8b_submission.sh tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/train_script.sh tests/slurm_workflow/slurm_baseline_artifacts/test_custom/testcustom_slurm_test_custom_submission.sh tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml launcher/nemo/stages.py tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/sagemaker-hf-llama3-8b_submission.sh tests/slurm_workflow/slurm_baseline_artifacts/test_custom/train_script.sh tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/train_script.sh |
11. | julian1@cimarron.me | jesseche@amazon.com |
6 shared files
tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/train_script.sh tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml launcher/nemo/stages.py tests/slurm_workflow/slurm_baseline_artifacts/test_custom/train_script.sh tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/train_script.sh |
12. | julian1@cimarron.me | spiderpower02@gmail.com |
6 shared files
tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/train_script.sh tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml launcher/nemo/stages.py tests/slurm_workflow/slurm_baseline_artifacts/test_custom/train_script.sh tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/train_script.sh |
13. | julianhr@amazon.com | jesseche@amazon.com |
6 shared files
tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/train_script.sh tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml launcher/nemo/stages.py tests/slurm_workflow/slurm_baseline_artifacts/test_custom/train_script.sh tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/train_script.sh |
14. | julianhr@amazon.com | spiderpower02@gmail.com |
6 shared files
tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/train_script.sh tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml launcher/nemo/stages.py tests/slurm_workflow/slurm_baseline_artifacts/test_custom/train_script.sh tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/train_script.sh |
15. | htzhong@amazon.com | julian1@cimarron.me |
4 shared files
tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml launcher/nemo/k8s_templates/training/values.yaml launcher/nemo/stages.py |
16. | htzhong@amazon.com | julianhr@amazon.com |
4 shared files
tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml launcher/nemo/k8s_templates/training/values.yaml launcher/nemo/stages.py |
17. | rnadimp@amazon.com | htzhong@amazon.com |
4 shared files
launcher/nemo/stages.py tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/templates/training.yaml launcher/nemo/k8s_templates/training/training.yaml tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/templates/training.yaml |
18. | 35045363+rohithn1@users.noreply.github.com | htzhong@amazon.com |
4 shared files
launcher/nemo/stages.py tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/templates/training.yaml launcher/nemo/k8s_templates/training/training.yaml tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/templates/training.yaml |
19. | htzhong@amazon.com | jesseche@amazon.com |
3 shared files
tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml launcher/nemo/stages.py |
20. | htzhong@amazon.com | spiderpower02@gmail.com |
3 shared files
tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml launcher/nemo/stages.py |
21. | julian1@cimarron.me | 35045363+rohithn1@users.noreply.github.com |
3 shared files
tests/test_recipes.py launcher/nemo/stages.py tests/test_utils.py |
22. | julian1@cimarron.me | rnadimp@amazon.com |
3 shared files
tests/test_recipes.py launcher/nemo/stages.py tests/test_utils.py |
23. | julianhr@amazon.com | 35045363+rohithn1@users.noreply.github.com |
3 shared files
tests/test_recipes.py launcher/nemo/stages.py tests/test_utils.py |
24. | julianhr@amazon.com | rnadimp@amazon.com |
3 shared files
tests/test_recipes.py launcher/nemo/stages.py tests/test_utils.py |
25. | 35045363+rohithn1@users.noreply.github.com | changnit@amazon.com |
3 shared files
tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/templates/training.yaml launcher/nemo/k8s_templates/training/training.yaml tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/templates/training.yaml |
26. | rnadimp@amazon.com | changnit@amazon.com |
3 shared files
tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/templates/training.yaml launcher/nemo/k8s_templates/training/training.yaml tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/templates/training.yaml |
27. | htzhong@amazon.com | changnit@amazon.com |
3 shared files
tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/templates/training.yaml launcher/nemo/k8s_templates/training/training.yaml tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/templates/training.yaml |
28. | changnit@amazon.com | arjkrish@amazon.com |
3 shared files
tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/templates/training.yaml launcher/nemo/k8s_templates/training/training.yaml tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/templates/training.yaml |
29. | arjkrish@amazon.com | 80425164+can-sun@users.noreply.github.com |
2 shared files
launcher/efa.py launcher/accelerator_devices.py |
30. | 80425164+can-sun@users.noreply.github.com | jesseche@amazon.com |
2 shared files
launcher/efa.py launcher/accelerator_devices.py |
31. | arjkrish@amazon.com | tomhbous@amazon.com |
1 shared file
launcher/nemo/stages.py |
32. | spiderpower02@gmail.com | tomhbous@amazon.com |
1 shared file
launcher/nemo/stages.py |
33. | spiderpower02@gmail.com | rnadimp@amazon.com |
1 shared file
launcher/nemo/stages.py |
34. | spiderpower02@gmail.com | 35045363+rohithn1@users.noreply.github.com |
1 shared file
launcher/nemo/stages.py |
35. | jesseche@amazon.com | tomhbous@amazon.com |
1 shared file
launcher/nemo/stages.py |
36. | jesseche@amazon.com | rnadimp@amazon.com |
1 shared file
launcher/nemo/stages.py |
37. | jesseche@amazon.com | 35045363+rohithn1@users.noreply.github.com |
1 shared file
launcher/nemo/stages.py |
38. | tomhbous@amazon.com | julianhr@amazon.com |
1 shared file
launcher/nemo/stages.py |
39. | tomhbous@amazon.com | julian1@cimarron.me |
1 shared file
launcher/nemo/stages.py |
40. | tomhbous@amazon.com | rnadimp@amazon.com |
1 shared file
launcher/nemo/stages.py |
41. | tomhbous@amazon.com | 35045363+rohithn1@users.noreply.github.com |
1 shared file
launcher/nemo/stages.py |
42. | tomhbous@amazon.com | htzhong@amazon.com |
1 shared file
launcher/nemo/stages.py |
43. | 35045363+rohithn1@users.noreply.github.com | shchinmy@amazon.com |
1 shared file
README.md |
44. | 35045363+rohithn1@users.noreply.github.com | agankta@amazon.com |
1 shared file
README.md |
45. | 35045363+rohithn1@users.noreply.github.com | 39390679+ankitaagarwal015@users.noreply.github.com |
1 shared file
README.md |
46. | rnadimp@amazon.com | shchinmy@amazon.com |
1 shared file
README.md |
47. | rnadimp@amazon.com | agankta@amazon.com |
1 shared file
README.md |
48. | rnadimp@amazon.com | 39390679+ankitaagarwal015@users.noreply.github.com |
1 shared file
README.md |
49. | shchinmy@amazon.com | agankta@amazon.com |
1 shared file
README.md |
50. | shchinmy@amazon.com | 39390679+ankitaagarwal015@users.noreply.github.com |
1 shared file
README.md |
51. | shchinmy@amazon.com | arjkrish@amazon.com |
1 shared file
README.md |
52. | agankta@amazon.com | 39390679+ankitaagarwal015@users.noreply.github.com |
1 shared file
README.md |
53. | agankta@amazon.com | arjkrish@amazon.com |
1 shared file
README.md |
54. | 39390679+ankitaagarwal015@users.noreply.github.com | arjkrish@amazon.com |
1 shared file
README.md |
# | Contributor | # connections | # commits |
---|---|---|---|
1. | arjkrish@amazon.com | 13 | 1 |
2. | 35045363+rohithn1@users.noreply.github.com | 12 | 15 |
3. | rnadimp@amazon.com | 12 | 13 |
4. | htzhong@amazon.com | 9 | 2 |
5. | jesseche@amazon.com | 9 | 2 |
6. | julian1@cimarron.me | 8 | 1 |
7. | julianhr@amazon.com | 8 | 1 |
8. | tomhbous@amazon.com | 8 | 1 |
9. | spiderpower02@gmail.com | 8 | 1 |
10. | agankta@amazon.com | 5 | 2 |
11. | shchinmy@amazon.com | 5 | 1 |
12. | 39390679+ankitaagarwal015@users.noreply.github.com | 5 | 1 |
13. | changnit@amazon.com | 4 | 2 |
14. | 80425164+can-sun@users.noreply.github.com | 2 | 1 |
C-median: 8.0
Half of the contributors have more than 8.0 connections, and half have fewer.
C-mean: 7.7
The average number of connections a contributor has with other contributors.
C-index: 8
There are 8 contributors with 8 or more connections.
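The three connection metrics above can be reproduced from the per-contributor connection counts in the table below. A minimal sketch (the `c_index` helper is an assumed reading of C-index as an h-index over connection counts, which matches the reported value):

```python
from statistics import mean, median

# Connection counts per contributor, taken from the table below.
connections = [13, 12, 12, 9, 9, 8, 8, 8, 8, 5, 5, 5, 4, 2]

def c_index(counts):
    """Largest h such that at least h contributors have h or more
    connections (an h-index computed over connection counts)."""
    h = 0
    for rank, c in enumerate(sorted(counts, reverse=True), start=1):
        if c >= rank:
            h = rank
    return h

print(round(median(connections), 1))  # 8.0  (C-median)
print(round(mean(connections), 1))    # 7.7  (C-mean)
print(c_index(connections))           # 8    (C-index)
```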
14 contributors (44 commits):
# | Contributor | First Commit | Latest Commit | Commits Count |
---|---|---|---|---|
1. | 35045363+rohithn1@users.noreply.github.com | 2024-12-31 | 2025-04-21 | 15 (34%) |
2. | rnadimp@amazon.com | 2024-12-31 | 2025-04-21 | 13 (29%) |
3. | htzhong@amazon.com | 2025-03-07 | 2025-03-07 | 2 (4%) |
4. | changnit@amazon.com | 2025-02-27 | 2025-02-27 | 2 (4%) |
5. | jesseche@amazon.com | 2024-12-07 | 2024-12-24 | 2 (4%) |
6. | agankta@amazon.com | 2024-12-04 | 2024-12-04 | 2 (4%) |
7. | julian1@cimarron.me | 2025-02-17 | 2025-02-17 | 1 (2%) |
8. | julianhr@amazon.com | 2025-02-03 | 2025-02-03 | 1 (2%) |
9. | tomhbous@amazon.com | 2024-12-23 | 2024-12-23 | 1 (2%) |
10. | 80425164+can-sun@users.noreply.github.com | 2024-12-16 | 2024-12-16 | 1 (2%) |
11. | spiderpower02@gmail.com | 2024-12-06 | 2024-12-06 | 1 (2%) |
12. | shchinmy@amazon.com | 2024-12-05 | 2024-12-05 | 1 (2%) |
13. | 39390679+ankitaagarwal015@users.noreply.github.com | 2024-12-04 | 2024-12-04 | 1 (2%) |
14. | arjkrish@amazon.com | 2024-12-03 | 2024-12-03 | 1 (2%) |
A contributor dependency is detected when two contributors have changed the same files within the past 365 days.
The number shown for each pair is how many of the same files both contributors changed in that period.
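The detection rule above amounts to a pairwise set intersection over each contributor's touched files. A minimal sketch with hypothetical contributors and file sets (the emails and file choices here are illustrative, not taken from the report):

```python
from itertools import combinations

# Hypothetical sample: files each contributor changed in the past 365 days.
files_by_author = {
    "alice@example.com": {"launcher/nemo/stages.py", "main.py"},
    "bob@example.com": {"launcher/nemo/stages.py", "README.md"},
    "carol@example.com": {"README.md"},
}

# A dependency exists between two contributors when they share at least
# one changed file; its weight is the number of shared files.
dependencies = {
    (a, b): len(files_by_author[a] & files_by_author[b])
    for a, b in combinations(files_by_author, 2)
    if files_by_author[a] & files_by_author[b]
}

for (a, b), n in sorted(dependencies.items(), key=lambda kv: -kv[1]):
    print(f"{a} | {b} | {n} shared file(s)")
```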
# | Contributor 1 | Contributor 2 | # shared files |
---|---|---|---|
1. | 35045363+rohithn1@users.noreply.github.com | rnadimp@amazon.com |
156 shared files
tests/sm_jobs_workflow/test_sm_jobs_workflow.py launcher_scripts/llama/run_hf_llama3_70b_seq16k_gpu_p5x128.pretrain.sh recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_1_dot_5b_seq16k_gpu_fine_tuning.yaml tests/sm_jobs_workflow/sm_jobs_baseline_artifacts/multimodal/llama3-2-11b/launch.py recipes_collection/recipes/training/mistral/hf_mistral_7b_seq16k_gpu_p5x16_pretrain.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_llama_8b_seq8k_gpu_lora.sh recipes_collection/recipes/training/llama/megatron_llama3_1_8b_nemo.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_qwen_7b_seq16k_gpu_fine_tuning.sh recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_671b_seq8k_gpu_qlora.yaml tests/sm_jobs_workflow/sm_jobs_baseline_artifacts/multimodal/llama3-2-11b/llama3-2-11b_hydra.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x7b_seq8k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x22b_seq16k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x22b_seq16k_gpu_p5x64_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq16k_gpu_lora.yaml recipes_collection/recipes/training/llama/hf_llama3_8b_seq16k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_1_dot_5b_seq16k_gpu_lora.yaml launcher/nemo/constants.py recipes_collection/recipes/training/llama/hf_llama3_70b_seq8k_trn1x16_pretrain.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_llama_8b_seq16k_gpu_lora.sh recipes_collection/recipes/fine-tuning/llama/hf_llama4_17b_16e_seq4k_gpu_lora_multimodal_finetuning.yaml recipes_collection/recipes/training/llama/p4_hf_llama3_70b_seq8k_gpu.yaml tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/llama-8b_hydra.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq8k_trn1_fine_tuning.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_qwen_14b_seq16k_gpu_fine_tuning.sh 
launcher_scripts/llama/run_hf_llama3_405b_seq32k_gpu_qlora.sh launcher_scripts/deepseek/run_hf_deepseek_r1_qwen_1_dot_5b_seq8k_gpu_fine_tuning.sh recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_1_dot_5b_seq8k_gpu_fine_tuning.yaml recipes_collection/recipes/fine-tuning/llama/p4_hf_llama3_70b_seq8k_gpu_lora.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_llama_8b_seq16k_gpu_fine_tuning.sh recipes_collection/recipes/fine-tuning/llama/hf_llama3_405b_seq8k_gpu_qlora.yaml recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_llama_8b_seq16k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_3_70b_seq8k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_32b_seq16k_gpu_fine_tuning.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_671b_seq8k_gpu_qlora.sh recipes_collection/recipes/training/llama/hf_llama3_2_1b_seq8k_gpu_p5x1_pretrain.yaml recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_7b_seq16k_gpu_lora.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_qwen_14b_seq8k_gpu_fine_tuning.sh launcher_scripts/llama/run_hf_llama4_17b_16e_seq4k_gpu_lora_text_to_text.sh recipes_collection/recipes/fine-tuning/llama/hf_llama3_405b_seq16k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq8k_gpu_fine_tuning.yaml recipes_collection/recipes/training/llama/hf_llama3_70b_seq8k_gpu_p5x128_pretrain.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x22b_seq16k_gpu_p5x128_pretrain.yaml recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_14b_seq8k_gpu_fine_tuning.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama4_17b_16e_seq4k_gpu_lora_text_to_text.yaml recipes_collection/recipes/training/llama/hf_llama3_70b_seq8k_gpu_p5x32_pretrain.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_671b_seq8k_gpu_lora.sh 
recipes_collection/recipes/training/llama/hf_llama3_70b_seq16k_gpu_p5x32_pretrain.yaml launcher_scripts/llama/run_hf_llama3_2_11b_seq8k_gpu_p5x4_pretrain.sh recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq8k_gpu_lora.yaml recipes_collection/recipes/training/mistral/hf_mistral_7b_seq8k_gpu_p5x16_pretrain.yaml launcher_scripts/llama/run_hf_llama3_70b_seq8k_gpu_p5x128_pretrain.sh recipes_collection/recipes/training/llama/hf_llama3_8b_seq8k_gpu_p5x16_pretrain.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_qwen_32b_seq8k_gpu_lora.sh launcher_scripts/llama/run_hf_llama3_3_70b_seq16k_gpu_fine_tuning.sh launcher_scripts/deepseek/run_hf_deepseek_r1_qwen_32b_seq8k_gpu_fine_tuning.sh recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq16k_gpu_fine_tuning.yaml recipes_collection/recipes/fine-tuning/llama/p4_hf_llama3_70b_seq8k_gpu_fine_tuning.yaml recipes_collection/recipes/training/llama/hf_llama3_70b_seq8k_gpu_p5x64_pretrain.yaml tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/llama-8b_hydra.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_llama_70b_seq8k_gpu_lora.sh launcher_scripts/llama/run_hf_llama4_17b_16e_seq8k_gpu_lora_text_to_text.sh launcher_scripts/llama/run_hf_llama3_3_70b_seq8k_gpu_fine_tuning.sh recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_7b_seq16k_gpu_fine_tuning.yaml recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_7b_seq8k_gpu_fine_tuning.yaml tests/test_recipes.py launcher_scripts/deepseek/run_hf_deepseek_r1_qwen_14b_seq8k_gpu_lora.sh recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_llama_8b_seq16k_gpu_fine_tuning.yaml recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_llama_70b_seq16k_gpu_fine_tuning.yaml launcher/nemo/stages.py recipes_collection/recipes/fine-tuning/llama/hf_llama3_70b_seq8k_gpu_lora.yaml tests/sm_jobs_workflow/sm_jobs_baseline_artifacts/multimodal/llama3-2-11b/recipe.yaml 
recipes_collection/recipes/fine-tuning/llama/hf_llama4_17b_16e_seq8k_gpu_lora_multimodal_finetuning.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x22b_seq8k_gpu_p5x128_pretrain.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_qwen_7b_seq8k_gpu_lora.sh recipes_collection/recipes/fine-tuning/llama/hf_llama3_3_70b_seq16k_gpu_lora.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x7b_seq8k_gpu_p5x16_pretrain.yaml tests/sm_jobs_workflow/sm_jobs_baseline_artifacts/with_kwargs/llama-8b/llama-8b_hydra.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_qwen_32b_seq16k_gpu_fine_tuning.sh recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_671b_seq8k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_405b_seq8k_gpu_lora.yaml recipes_collection/recipes/training/mistral/hf_mistral_7b_seq16k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_70b_seq16k_gpu_fine_tuning.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_qwen_7b_seq16k_gpu_lora.sh recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_32b_seq16k_gpu_lora.yaml launcher_scripts/llama/run_hf_llama3_2_3b_seq8k_gpu_p5x1_pretrain.sh tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/templates/training.yaml recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_llama_8b_seq8k_gpu_lora.yaml tests/sm_jobs_workflow/sm_jobs_baseline_artifacts/multimodal/llama3-2-11b/requirements.txt recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_qwen_14b_seq16k_gpu_fine_tuning.yaml tests/test_launcher_scripts.py tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/config/llama-8b_hydra.yaml recipes_collection/recipes/training/llama/hf_llama3_8b_seq8k_trn1x4_pretrain.yaml launcher_scripts/mixtral/run_hf_mixtral_8x22b_seq16k_gpu_p5x128_pretrain.sh recipes_collection/recipes/training/mixtral/hf_mixtral_8x7b_seq16k_gpu_p5x32_pretrain.yaml 
recipes_collection/recipes/fine-tuning/deepseek/hf_deepseek_r1_distilled_llama_70b_seq8k_gpu_lora.yaml launcher_scripts/deepseek/run_hf_deepseek_r1_qwen_14b_seq16k_gpu_lora.sh launcher_scripts/llama/run_hf_llama4_17b_16e_seq4k_gpu_lora_multimodal_finetuning.sh launcher_scripts/llama/run_hf_llama3_2_90b_seq8k_gpu_p5x32_pretrain.sh launcher_scripts/deepseek/run_hf_deepseek_r1_llama_70b_seq16k_gpu_lora.sh launcher_scripts/deepseek/run_hf_deepseek_r1_qwen_1_dot_5b_seq16k_gpu_fine_tuning.sh ... |
2. | 35045363+rohithn1@users.noreply.github.com | arjkrish@amazon.com |
67 shared files
tests/sm_jobs_workflow/test_sm_jobs_workflow.py recipes_collection/recipes/training/mistral/hf_mistral_7b_seq16k_gpu_p5x16_pretrain.yaml recipes_collection/recipes/training/llama/megatron_llama3_1_8b_nemo.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x7b_seq8k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x22b_seq16k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x22b_seq16k_gpu_p5x64_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq16k_gpu_lora.yaml recipes_collection/recipes/training/llama/hf_llama3_8b_seq16k_gpu_p5x32_pretrain.yaml launcher/nemo/constants.py recipes_collection/recipes/training/llama/hf_llama3_70b_seq8k_trn1x16_pretrain.yaml recipes_collection/recipes/training/llama/p4_hf_llama3_70b_seq8k_gpu.yaml tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/llama-8b_hydra.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq8k_trn1_fine_tuning.yaml recipes_collection/recipes/fine-tuning/llama/p4_hf_llama3_70b_seq8k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_405b_seq8k_gpu_qlora.yaml recipes_collection/recipes/training/llama/hf_llama3_2_1b_seq8k_gpu_p5x1_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_405b_seq16k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq8k_gpu_fine_tuning.yaml recipes_collection/recipes/training/llama/hf_llama3_70b_seq8k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/training/llama/hf_llama3_70b_seq16k_gpu_p5x32_pretrain.yaml launcher_scripts/llama/run_hf_llama3_2_11b_seq8k_gpu_p5x4_pretrain.sh recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq8k_gpu_lora.yaml recipes_collection/recipes/training/mistral/hf_mistral_7b_seq8k_gpu_p5x16_pretrain.yaml recipes_collection/recipes/training/llama/hf_llama3_8b_seq8k_gpu_p5x16_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq16k_gpu_fine_tuning.yaml 
recipes_collection/recipes/fine-tuning/llama/p4_hf_llama3_70b_seq8k_gpu_fine_tuning.yaml recipes_collection/recipes/training/llama/hf_llama3_70b_seq8k_gpu_p5x64_pretrain.yaml tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/llama-8b_hydra.yaml launcher/nemo/stages.py recipes_collection/recipes/fine-tuning/llama/hf_llama3_70b_seq8k_gpu_lora.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x7b_seq8k_gpu_p5x16_pretrain.yaml tests/sm_jobs_workflow/sm_jobs_baseline_artifacts/with_kwargs/llama-8b/llama-8b_hydra.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_405b_seq8k_gpu_lora.yaml recipes_collection/recipes/training/mistral/hf_mistral_7b_seq16k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_70b_seq16k_gpu_fine_tuning.yaml launcher_scripts/llama/run_hf_llama3_2_3b_seq8k_gpu_p5x1_pretrain.sh tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/templates/training.yaml tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/config/llama-8b_hydra.yaml recipes_collection/recipes/training/llama/hf_llama3_8b_seq8k_trn1x4_pretrain.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x7b_seq16k_gpu_p5x32_pretrain.yaml launcher_scripts/llama/run_hf_llama3_2_90b_seq8k_gpu_p5x32_pretrain.sh recipes_collection/recipes/training/llama/hf_llama3_8b_seq8k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/p4_hf_llama3_8b_seq8k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_70b_seq16k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_405b_seq16k_gpu_qlora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_70b_seq8k_gpu_fine_tuning.yaml recipes_collection/recipes/fine-tuning/llama/p4_hf_llama3_8b_seq8k_gpu_fine_tuning.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x22b_seq8k_gpu_p5x32_pretrain.yaml launcher_scripts/llama/run_hf_llama3_8b_seq8k_trn1_fine_tuning.sh 
recipes_collection/recipes/training/llama/hf_llama3_2_90b_seq8k_gpu_p5x32_pretrain.yaml tests/sm_jobs_workflow/sm_jobs_baseline_artifacts/no_kwargs/llama-8b/llama-8b_hydra.yaml recipes_collection/recipes/training/llama/hf_llama3_2_3b_seq8k_gpu_p5x1_pretrain.yaml recipes_collection/recipes/training/llama/hf_llama3_2_11b_seq8k_gpu_p5x4_pretrain.yaml launcher/nemo/k8s_templates/training/training.yaml launcher_scripts/llama/run_hf_llama3_2_1b_seq8k_gpu_p5x1_pretrain.sh recipes_collection/recipes/training/mistral/hf_mistral_7b_seq8k_gpu_p5x32_pretrain.yaml tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/templates/training.yaml README.md recipes_collection/recipes/training/custom_model/falcon.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x7b_seq16k_gpu_p5x16_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_405b_seq128k_gpu_qlora.yaml recipes_collection/recipes/training/llama/hf_llama3_70b_seq16k_gpu_p5x64_pretrain.yaml recipes_collection/recipes/training/llama/hf_llama3_8b_seq16k_gpu_p5x16_pretrain.yaml main.py tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/llama-8b_hydra.yaml tests/test_utils.py recipes_collection/recipes/training/mixtral/hf_mixtral_8x22b_seq8k_gpu_p5x64_pretrain.yaml |
3. | rnadimp@amazon.com | arjkrish@amazon.com |
67 shared files
tests/sm_jobs_workflow/test_sm_jobs_workflow.py recipes_collection/recipes/training/mistral/hf_mistral_7b_seq16k_gpu_p5x16_pretrain.yaml recipes_collection/recipes/training/llama/megatron_llama3_1_8b_nemo.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x7b_seq8k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x22b_seq16k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x22b_seq16k_gpu_p5x64_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq16k_gpu_lora.yaml recipes_collection/recipes/training/llama/hf_llama3_8b_seq16k_gpu_p5x32_pretrain.yaml launcher/nemo/constants.py recipes_collection/recipes/training/llama/hf_llama3_70b_seq8k_trn1x16_pretrain.yaml recipes_collection/recipes/training/llama/p4_hf_llama3_70b_seq8k_gpu.yaml tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/llama-8b_hydra.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq8k_trn1_fine_tuning.yaml recipes_collection/recipes/fine-tuning/llama/p4_hf_llama3_70b_seq8k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_405b_seq8k_gpu_qlora.yaml recipes_collection/recipes/training/llama/hf_llama3_2_1b_seq8k_gpu_p5x1_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_405b_seq16k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq8k_gpu_fine_tuning.yaml recipes_collection/recipes/training/llama/hf_llama3_70b_seq8k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/training/llama/hf_llama3_70b_seq16k_gpu_p5x32_pretrain.yaml launcher_scripts/llama/run_hf_llama3_2_11b_seq8k_gpu_p5x4_pretrain.sh recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq8k_gpu_lora.yaml recipes_collection/recipes/training/mistral/hf_mistral_7b_seq8k_gpu_p5x16_pretrain.yaml recipes_collection/recipes/training/llama/hf_llama3_8b_seq8k_gpu_p5x16_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_8b_seq16k_gpu_fine_tuning.yaml 
recipes_collection/recipes/fine-tuning/llama/p4_hf_llama3_70b_seq8k_gpu_fine_tuning.yaml recipes_collection/recipes/training/llama/hf_llama3_70b_seq8k_gpu_p5x64_pretrain.yaml tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/llama-8b_hydra.yaml launcher/nemo/stages.py recipes_collection/recipes/fine-tuning/llama/hf_llama3_70b_seq8k_gpu_lora.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x7b_seq8k_gpu_p5x16_pretrain.yaml tests/sm_jobs_workflow/sm_jobs_baseline_artifacts/with_kwargs/llama-8b/llama-8b_hydra.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_405b_seq8k_gpu_lora.yaml recipes_collection/recipes/training/mistral/hf_mistral_7b_seq16k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_70b_seq16k_gpu_fine_tuning.yaml launcher_scripts/llama/run_hf_llama3_2_3b_seq8k_gpu_p5x1_pretrain.sh tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/templates/training.yaml tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/config/llama-8b_hydra.yaml recipes_collection/recipes/training/llama/hf_llama3_8b_seq8k_trn1x4_pretrain.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x7b_seq16k_gpu_p5x32_pretrain.yaml launcher_scripts/llama/run_hf_llama3_2_90b_seq8k_gpu_p5x32_pretrain.sh recipes_collection/recipes/training/llama/hf_llama3_8b_seq8k_gpu_p5x32_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/p4_hf_llama3_8b_seq8k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_70b_seq16k_gpu_lora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_405b_seq16k_gpu_qlora.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_70b_seq8k_gpu_fine_tuning.yaml recipes_collection/recipes/fine-tuning/llama/p4_hf_llama3_8b_seq8k_gpu_fine_tuning.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x22b_seq8k_gpu_p5x32_pretrain.yaml launcher_scripts/llama/run_hf_llama3_8b_seq8k_trn1_fine_tuning.sh 
recipes_collection/recipes/training/llama/hf_llama3_2_90b_seq8k_gpu_p5x32_pretrain.yaml tests/sm_jobs_workflow/sm_jobs_baseline_artifacts/no_kwargs/llama-8b/llama-8b_hydra.yaml recipes_collection/recipes/training/llama/hf_llama3_2_3b_seq8k_gpu_p5x1_pretrain.yaml recipes_collection/recipes/training/llama/hf_llama3_2_11b_seq8k_gpu_p5x4_pretrain.yaml launcher/nemo/k8s_templates/training/training.yaml launcher_scripts/llama/run_hf_llama3_2_1b_seq8k_gpu_p5x1_pretrain.sh recipes_collection/recipes/training/mistral/hf_mistral_7b_seq8k_gpu_p5x32_pretrain.yaml tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/templates/training.yaml README.md recipes_collection/recipes/training/custom_model/falcon.yaml recipes_collection/recipes/training/mixtral/hf_mixtral_8x7b_seq16k_gpu_p5x16_pretrain.yaml recipes_collection/recipes/fine-tuning/llama/hf_llama3_405b_seq128k_gpu_qlora.yaml recipes_collection/recipes/training/llama/hf_llama3_70b_seq16k_gpu_p5x64_pretrain.yaml recipes_collection/recipes/training/llama/hf_llama3_8b_seq16k_gpu_p5x16_pretrain.yaml main.py tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/llama-8b_hydra.yaml tests/test_utils.py recipes_collection/recipes/training/mixtral/hf_mixtral_8x22b_seq8k_gpu_p5x64_pretrain.yaml |
4. | htzhong@amazon.com | arjkrish@amazon.com |
13 shared files
tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml recipes_collection/cluster/k8s.yaml tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml launcher_scripts/custom_script/config_k8s.yaml launcher/nemo/k8s_templates/training/values.yaml launcher/nemo/stages.py tests/config_validator/test_value_validator.py launcher/config_validator/type_validator.py tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/templates/training.yaml launcher/config_validator/value_validator.py launcher/nemo/k8s_templates/training/training.yaml tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/templates/training.yaml tests/config_validator/test_type_validator.py |
5. | julian1@cimarron.me | julianhr@amazon.com |
13 shared files
tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/train_script.sh tests/test_readme.py launcher/nemo/k8s_templates/training/train-script-gpu.yaml tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml launcher/nemo/k8s_templates/training/values.yaml tests/test_recipes.py launcher/nemo/stages.py recipes_collection/config.yaml launcher/nemo/k8s_templates/training/train-script-trn.yaml tests/slurm_workflow/slurm_baseline_artifacts/test_custom/train_script.sh tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/train_script.sh tests/test_utils.py |
6. | julian1@cimarron.me | arjkrish@amazon.com |
11 shared files
tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/train_script.sh launcher/nemo/k8s_templates/training/train-script-gpu.yaml tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml launcher/nemo/k8s_templates/training/values.yaml launcher/nemo/stages.py recipes_collection/config.yaml launcher/nemo/k8s_templates/training/train-script-trn.yaml tests/slurm_workflow/slurm_baseline_artifacts/test_custom/train_script.sh tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/train_script.sh tests/test_utils.py |
7. | julianhr@amazon.com | arjkrish@amazon.com |
11 shared files
tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/train_script.sh launcher/nemo/k8s_templates/training/train-script-gpu.yaml tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml launcher/nemo/k8s_templates/training/values.yaml launcher/nemo/stages.py recipes_collection/config.yaml launcher/nemo/k8s_templates/training/train-script-trn.yaml tests/slurm_workflow/slurm_baseline_artifacts/test_custom/train_script.sh tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/train_script.sh tests/test_utils.py |
8. | jesseche@amazon.com | arjkrish@amazon.com |
11 shared files
tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/sagemaker-llama-8b_submission.sh tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/train_script.sh tests/slurm_workflow/slurm_baseline_artifacts/test_custom/testcustom_slurm_test_custom_submission.sh tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml launcher/nemo/stages.py tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/sagemaker-hf-llama3-8b_submission.sh tests/slurm_workflow/slurm_baseline_artifacts/test_custom/train_script.sh tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/train_script.sh launcher/efa.py launcher/accelerator_devices.py |
9. | jesseche@amazon.com | spiderpower02@gmail.com |
9 shared files
tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/sagemaker-llama-8b_submission.sh tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/train_script.sh tests/slurm_workflow/slurm_baseline_artifacts/test_custom/testcustom_slurm_test_custom_submission.sh tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml launcher/nemo/stages.py tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/sagemaker-hf-llama3-8b_submission.sh tests/slurm_workflow/slurm_baseline_artifacts/test_custom/train_script.sh tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/train_script.sh |
10. | spiderpower02@gmail.com | arjkrish@amazon.com |
9 shared files
tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/sagemaker-llama-8b_submission.sh tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/train_script.sh tests/slurm_workflow/slurm_baseline_artifacts/test_custom/testcustom_slurm_test_custom_submission.sh tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml launcher/nemo/stages.py tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/sagemaker-hf-llama3-8b_submission.sh tests/slurm_workflow/slurm_baseline_artifacts/test_custom/train_script.sh tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/train_script.sh |
11. | julian1@cimarron.me | jesseche@amazon.com |
6 shared files
tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/train_script.sh tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml launcher/nemo/stages.py tests/slurm_workflow/slurm_baseline_artifacts/test_custom/train_script.sh tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/train_script.sh |
12. | julian1@cimarron.me | spiderpower02@gmail.com |
6 shared files
tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/train_script.sh tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml launcher/nemo/stages.py tests/slurm_workflow/slurm_baseline_artifacts/test_custom/train_script.sh tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/train_script.sh |
13. | julianhr@amazon.com | jesseche@amazon.com |
6 shared files
tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/train_script.sh tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml launcher/nemo/stages.py tests/slurm_workflow/slurm_baseline_artifacts/test_custom/train_script.sh tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/train_script.sh |
14. | julianhr@amazon.com | spiderpower02@gmail.com |
6 shared files
tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml tests/slurm_workflow/slurm_baseline_artifacts/llama-8b/train_script.sh tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml launcher/nemo/stages.py tests/slurm_workflow/slurm_baseline_artifacts/test_custom/train_script.sh tests/slurm_workflow/slurm_baseline_artifacts/hf-llama3-8b/train_script.sh |
15. | htzhong@amazon.com | julian1@cimarron.me |
4 shared files
tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml launcher/nemo/k8s_templates/training/values.yaml launcher/nemo/stages.py |
16. | htzhong@amazon.com | julianhr@amazon.com |
4 shared files
tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml launcher/nemo/k8s_templates/training/values.yaml launcher/nemo/stages.py |
17. | rnadimp@amazon.com | htzhong@amazon.com |
4 shared files
launcher/nemo/stages.py tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/templates/training.yaml launcher/nemo/k8s_templates/training/training.yaml tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/templates/training.yaml |
18. | 35045363+rohithn1@users.noreply.github.com | htzhong@amazon.com |
4 shared files
launcher/nemo/stages.py tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/templates/training.yaml launcher/nemo/k8s_templates/training/training.yaml tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/templates/training.yaml |
19. | htzhong@amazon.com | jesseche@amazon.com |
3 shared files
tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml launcher/nemo/stages.py |
20. | htzhong@amazon.com | spiderpower02@gmail.com |
3 shared files
tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/values.yaml tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/values.yaml launcher/nemo/stages.py |
21. | julian1@cimarron.me | 35045363+rohithn1@users.noreply.github.com |
3 shared files
tests/test_recipes.py launcher/nemo/stages.py tests/test_utils.py |
22. | julian1@cimarron.me | rnadimp@amazon.com |
3 shared files
tests/test_recipes.py launcher/nemo/stages.py tests/test_utils.py |
23. | julianhr@amazon.com | 35045363+rohithn1@users.noreply.github.com |
3 shared files
tests/test_recipes.py launcher/nemo/stages.py tests/test_utils.py |
24. | julianhr@amazon.com | rnadimp@amazon.com |
3 shared files
tests/test_recipes.py launcher/nemo/stages.py tests/test_utils.py |
25. | 35045363+rohithn1@users.noreply.github.com | changnit@amazon.com |
3 shared files
tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/templates/training.yaml launcher/nemo/k8s_templates/training/training.yaml tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/templates/training.yaml |
26. | rnadimp@amazon.com | changnit@amazon.com |
3 shared files
tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/templates/training.yaml launcher/nemo/k8s_templates/training/training.yaml tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/templates/training.yaml |
27. | htzhong@amazon.com | changnit@amazon.com |
3 shared files
tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/templates/training.yaml launcher/nemo/k8s_templates/training/training.yaml tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/templates/training.yaml |
28. | changnit@amazon.com | arjkrish@amazon.com |
3 shared files
tests/k8s_workflow/k8s_baseline_artifacts/test_custom/k8s_template/templates/training.yaml launcher/nemo/k8s_templates/training/training.yaml tests/k8s_workflow/k8s_baseline_artifacts/llama-8b/k8s_template/templates/training.yaml |
29. | arjkrish@amazon.com | 80425164+can-sun@users.noreply.github.com |
2 shared files
launcher/efa.py launcher/accelerator_devices.py |
30. | 80425164+can-sun@users.noreply.github.com | jesseche@amazon.com |
2 shared files
launcher/efa.py launcher/accelerator_devices.py |
31. | arjkrish@amazon.com | tomhbous@amazon.com |
1 shared file
launcher/nemo/stages.py |
32. | spiderpower02@gmail.com | tomhbous@amazon.com |
1 shared file
launcher/nemo/stages.py |
33. | spiderpower02@gmail.com | rnadimp@amazon.com |
1 shared file
launcher/nemo/stages.py |
34. | spiderpower02@gmail.com | 35045363+rohithn1@users.noreply.github.com |
1 shared file
launcher/nemo/stages.py |
35. | jesseche@amazon.com | tomhbous@amazon.com |
1 shared file
launcher/nemo/stages.py |
36. | jesseche@amazon.com | rnadimp@amazon.com |
1 shared file
launcher/nemo/stages.py |
37. | jesseche@amazon.com | 35045363+rohithn1@users.noreply.github.com |
1 shared file
launcher/nemo/stages.py |
38. | tomhbous@amazon.com | julianhr@amazon.com |
1 shared file
launcher/nemo/stages.py |
39. | tomhbous@amazon.com | julian1@cimarron.me |
1 shared file
launcher/nemo/stages.py |
40. | tomhbous@amazon.com | rnadimp@amazon.com |
1 shared file
launcher/nemo/stages.py |
41. | tomhbous@amazon.com | 35045363+rohithn1@users.noreply.github.com |
1 shared file
launcher/nemo/stages.py |
42. | tomhbous@amazon.com | htzhong@amazon.com |
1 shared file
launcher/nemo/stages.py |
43. | 35045363+rohithn1@users.noreply.github.com | shchinmy@amazon.com |
1 shared file
README.md |
44. | 35045363+rohithn1@users.noreply.github.com | agankta@amazon.com |
1 shared file
README.md |
45. | 35045363+rohithn1@users.noreply.github.com | 39390679+ankitaagarwal015@users.noreply.github.com |
1 shared file
README.md |
46. | rnadimp@amazon.com | shchinmy@amazon.com |
1 shared file
README.md |
47. | rnadimp@amazon.com | agankta@amazon.com |
1 shared file
README.md |
48. | rnadimp@amazon.com | 39390679+ankitaagarwal015@users.noreply.github.com |
1 shared file
README.md |
49. | shchinmy@amazon.com | agankta@amazon.com |
1 shared file
README.md |
50. | shchinmy@amazon.com | 39390679+ankitaagarwal015@users.noreply.github.com |
1 shared file
README.md |
51. | shchinmy@amazon.com | arjkrish@amazon.com |
1 shared file
README.md |
52. | agankta@amazon.com | 39390679+ankitaagarwal015@users.noreply.github.com |
1 shared file
README.md |
53. | agankta@amazon.com | arjkrish@amazon.com |
1 shared file
README.md |
54. | 39390679+ankitaagarwal015@users.noreply.github.com | arjkrish@amazon.com |
1 shared file
README.md |
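The pair ranking above can, in principle, be reproduced by intersecting each contributor's set of touched files: two contributors are "connected" when they have edited at least one file in common, and pairs are ordered by the size of that intersection. A minimal sketch, using made-up addresses and a tiny sample of file paths rather than the real commit data:

```python
from itertools import combinations

# Hypothetical touched-file sets per contributor (illustrative only;
# the real report derives these from the commit history).
touched = {
    "a@example.com": {"launcher/nemo/stages.py", "README.md"},
    "b@example.com": {"launcher/nemo/stages.py", "main.py"},
    "c@example.com": {"README.md"},
}

# Rank every contributor pair by the number of files both have edited.
pairs = sorted(
    ((len(touched[x] & touched[y]), x, y) for x, y in combinations(touched, 2)),
    reverse=True,
)

for shared, x, y in pairs:
    if shared:  # only pairs with at least one shared file count as a connection
        print(f"{x} | {y} | {shared} shared file(s)")
```

A contributor's "# connections" in the table below is then simply the number of pairs with a non-empty intersection that the contributor appears in.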
# | Contributor | # connections | # commits |
---|---|---|---|
1. | arjkrish@amazon.com | 13 | 1 |
2. | 35045363+rohithn1@users.noreply.github.com | 12 | 15 |
3. | rnadimp@amazon.com | 12 | 13 |
4. | htzhong@amazon.com | 9 | 2 |
5. | jesseche@amazon.com | 9 | 2 |
6. | julian1@cimarron.me | 8 | 1 |
7. | julianhr@amazon.com | 8 | 1 |
8. | tomhbous@amazon.com | 8 | 1 |
9. | spiderpower02@gmail.com | 8 | 1 |
10. | agankta@amazon.com | 5 | 2 |
11. | shchinmy@amazon.com | 5 | 1 |
12. | 39390679+ankitaagarwal015@users.noreply.github.com | 5 | 1 |
13. | changnit@amazon.com | 4 | 2 |
14. | 80425164+can-sun@users.noreply.github.com | 2 | 1 |
C-median: 8.0
Half of the contributors have at least 8.0 connections, and half have at most that many.
C-mean: 7.7
The average number of connections a contributor has with other contributors.
C-index: 8
At least 8 contributors have 8 or more connections (the largest n such that n contributors have n or more connections).
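The three summary metrics can be checked directly from the "# connections" column of the table above. C-median and C-mean are the ordinary median and mean; C-index follows the h-index pattern: the largest n such that at least n contributors have n or more connections. A short sketch:

```python
from statistics import mean, median

# Connection counts copied from the contributor table above.
connections = [13, 12, 12, 9, 9, 8, 8, 8, 8, 5, 5, 5, 4, 2]

def c_index(counts):
    """h-index-style metric: the largest n such that at least n
    contributors have n or more connections."""
    n = 0
    for i, c in enumerate(sorted(counts, reverse=True), start=1):
        if c >= i:
            n = i
    return n

print(median(connections))           # 8.0  (C-median)
print(round(mean(connections), 1))   # 7.7  (C-mean)
print(c_index(connections))          # 8    (C-index)
```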