aws/amazon-sagemaker-examples
Temporal Dependencies

A temporal dependency occurs when developers change two or more files together, i.e., the files are part of the same commit. Files that repeatedly change together often share hidden coupling even when there is no explicit import or call between them.
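The counting behind the table below can be sketched in a few lines. This is an illustrative helper, not the report's actual tooling: the `cochange_pairs` function and the toy commit history are assumptions for the sake of the example; in practice the per-commit file lists would come from `git log --name-only`.

```python
from collections import Counter
from itertools import combinations

def cochange_pairs(commits):
    """Count how often each pair of files appears in the same commit.

    `commits` is a list of iterables of file paths, one per commit.
    Returns a Counter mapping sorted (file_a, file_b) tuples to the
    number of commits in which both files changed.
    """
    pairs = Counter()
    for files in commits:
        # Sort and deduplicate so each unordered pair has one canonical key.
        for a, b in combinations(sorted(set(files)), 2):
            pairs[(a, b)] += 1
    return pairs

# Hypothetical commit history: three commits touching overlapping files.
history = [
    {"gpt2/data_pipeline.py", "bert/sagemaker_smp_pretrain.py"},
    {"gpt2/data_pipeline.py", "gpt2/fp16/fp16.py"},
    {"bert/sagemaker_smp_pretrain.py"},
]
counts = cochange_pairs(history)
```

Dividing each pair count by the total number of commits touching each file yields the per-file percentages shown in the table.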

Files Most Frequently Changed Together (Top 20)

 
| File 1 | File 2 | Shared commits | Commits (file 1) | Commits (file 2) | Latest commit |
| --- | --- | --- | --- | --- | --- |
| `sagemaker-pipelines/nlp/amazon_comprehend_sagemaker_pipeline/iam_helper.py` | `sagemaker-pipelines/nlp/amazon_comprehend_sagemaker_pipeline/deploy_comprehend.py` | 1 | 2 (50%) | 2 (50%) | 2022-01-13 |
| `training/distributed_training/pytorch/model_parallel/gpt2/data_pipeline.py` | `training/distributed_training/pytorch/model_parallel/bert/bert_example/sagemaker_smp_pretrain.py` | 1 | 1 (100%) | 8 (12%) | 2022-01-04 |
| `training/distributed_training/pytorch/model_parallel/gpt2/data_prep_512.py` | `training/distributed_training/pytorch/model_parallel/bert/bert_example/sagemaker_smp_pretrain.py` | 1 | 1 (100%) | 8 (12%) | 2022-01-04 |
| `training/distributed_training/pytorch/model_parallel/gpt2/data_prep_512.py` | `training/distributed_training/pytorch/model_parallel/gpt2/data_pipeline.py` | 1 | 1 (100%) | 1 (100%) | 2022-01-04 |
| `training/distributed_training/pytorch/model_parallel/gpt2/fp16/__init__.py` | `training/distributed_training/pytorch/model_parallel/bert/bert_example/sagemaker_smp_pretrain.py` | 1 | 1 (100%) | 8 (12%) | 2022-01-04 |
| `training/distributed_training/pytorch/model_parallel/gpt2/fp16/__init__.py` | `training/distributed_training/pytorch/model_parallel/gpt2/data_pipeline.py` | 1 | 1 (100%) | 1 (100%) | 2022-01-04 |
| `training/distributed_training/pytorch/model_parallel/gpt2/fp16/__init__.py` | `training/distributed_training/pytorch/model_parallel/gpt2/data_prep_512.py` | 1 | 1 (100%) | 1 (100%) | 2022-01-04 |
| `training/distributed_training/pytorch/model_parallel/gpt2/fp16/fp16.py` | `training/distributed_training/pytorch/model_parallel/bert/bert_example/sagemaker_smp_pretrain.py` | 1 | 1 (100%) | 8 (12%) | 2022-01-04 |
| `training/distributed_training/pytorch/model_parallel/gpt2/fp16/fp16.py` | `training/distributed_training/pytorch/model_parallel/gpt2/data_pipeline.py` | 1 | 1 (100%) | 1 (100%) | 2022-01-04 |
| `training/distributed_training/pytorch/model_parallel/gpt2/fp16/fp16.py` | `training/distributed_training/pytorch/model_parallel/gpt2/data_prep_512.py` | 1 | 1 (100%) | 1 (100%) | 2022-01-04 |
| `training/distributed_training/pytorch/model_parallel/gpt2/fp16/fp16.py` | `training/distributed_training/pytorch/model_parallel/gpt2/fp16/__init__.py` | 1 | 1 (100%) | 1 (100%) | 2022-01-04 |
| `training/distributed_training/pytorch/model_parallel/gpt2/fp16/fp16util.py` | `training/distributed_training/pytorch/model_parallel/bert/bert_example/sagemaker_smp_pretrain.py` | 1 | 1 (100%) | 8 (12%) | 2022-01-04 |
| `training/distributed_training/pytorch/model_parallel/gpt2/fp16/fp16util.py` | `training/distributed_training/pytorch/model_parallel/gpt2/data_pipeline.py` | 1 | 1 (100%) | 1 (100%) | 2022-01-04 |
| `training/distributed_training/pytorch/model_parallel/gpt2/fp16/fp16util.py` | `training/distributed_training/pytorch/model_parallel/gpt2/data_prep_512.py` | 1 | 1 (100%) | 1 (100%) | 2022-01-04 |
| `training/distributed_training/pytorch/model_parallel/gpt2/fp16/fp16util.py` | `training/distributed_training/pytorch/model_parallel/gpt2/fp16/__init__.py` | 1 | 1 (100%) | 1 (100%) | 2022-01-04 |
| `training/distributed_training/pytorch/model_parallel/gpt2/fp16/fp16util.py` | `training/distributed_training/pytorch/model_parallel/gpt2/fp16/fp16.py` | 1 | 1 (100%) | 1 (100%) | 2022-01-04 |
| `training/distributed_training/pytorch/model_parallel/gpt2/fp16/loss_scaler.py` | `training/distributed_training/pytorch/model_parallel/bert/bert_example/sagemaker_smp_pretrain.py` | 1 | 1 (100%) | 8 (12%) | 2022-01-04 |
| `training/distributed_training/pytorch/model_parallel/gpt2/fp16/loss_scaler.py` | `training/distributed_training/pytorch/model_parallel/gpt2/data_pipeline.py` | 1 | 1 (100%) | 1 (100%) | 2022-01-04 |
| `training/distributed_training/pytorch/model_parallel/gpt2/fp16/loss_scaler.py` | `training/distributed_training/pytorch/model_parallel/gpt2/data_prep_512.py` | 1 | 1 (100%) | 1 (100%) | 2022-01-04 |
| `training/distributed_training/pytorch/model_parallel/gpt2/fp16/loss_scaler.py` | `training/distributed_training/pytorch/model_parallel/gpt2/fp16/__init__.py` | 1 | 1 (100%) | 1 (100%) | 2022-01-04 |

Percentages show the share of each file's total commits that the pair has in common: "2 (50%)" means the file changed in 2 commits, half of which also touched its partner file.
File Change History per Logical Decomposition
Decomposition: primary

[Dependency graph omitted. The number on each edge shows the number of shared commits.]

No temporal cross-component dependencies found.