GoogleCloudPlatform / nvidia-nemo-on-gke
Training NVIDIA NeMo Megatron Large Language Model (LLM) using NeMo Framework on Google Kubernetes Engine
GitHub Repo 
Code volume
  Main code:  2.9K lines (48 files)
  Test code:  0 lines (0 files)
  Other code: 0.5K lines (4 files)

History
  Age: 1 year (416 days)
  Main code touched in the past year: 68% (2.8K LOC)
  New main code in the past year: 22% (2.7K LOC)
Main code by extension
  tf:     1.8K lines, 13 files
  yaml:   0.9K lines, 31 files
  tpl:    0.2K lines, 3 files
  tfvars: 0.07K lines, 4 files
Commit activity: 2024, 2025

generated by sokrates.dev (configuration) on 2025-05-04