syne_tune/optimizer/schedulers/searchers/bayesopt/gpautograd/learncurve/freeze_thaw.py (4 lines):
- line 201: # TODO: This code is complex. If it does not run faster than
- line 247: # TODO: Do we need AddJitterOp here? [see the jitter sketch after this list]
- line 313: # TODO: It is not clear whether this code is slower, and it is certainly
- line 337: # TODO: Do we need AddJitterOp here?

syne_tune/optimizer/schedulers/fifo.py (2 lines):
- line 234: # TODO: Need load
- line 274: # TODO: Need mkdir, save

benchmarking/blackbox_repository/tabulated_benchmark.py (2 lines):
- line 82: # TODO we feed a dummy value for entry_point since they are not required
- line 208: # TODO: This could fail earlier

syne_tune/backend/sagemaker_backend/sagemaker_backend.py (1 line):
- line 139: # TODO once we have a multiobjective scheduler, we should add an example on how to tune instance-type/count.

syne_tune/backend/sagemaker_backend/instance_info.py (1 line):
- line 33: # TODO right now, we use a static file but some services are available to get updated information

syne_tune/report.py (1 line):
- line 49: # TODO dollar-cost computation is not available for file-based backends, what would be

syne_tune/optimizer/schedulers/searchers/bayesopt/gpautograd/kernel/__init__.py (1 line):
- line 13: # TODO wildcard import should be avoided [see the explicit-import sketch after this list]

syne_tune/experiments.py (1 line):
- line 177: # TODO: Use conditional imports, in order not to fail if dependencies are not [see the conditional-import sketch after this list]

syne_tune/optimizer/schedulers/searchers/__init__.py (1 line):
- line 13: # TODO wildcard import should be avoided

benchmarking/blackbox_repository/blackbox_tabular.py (1 line):
- line 219: TODO: the API is currently dissonant with `serialize`, `deserialize` for BlackboxOffline as `serialize` is there a member.

syne_tune/optimizer/schedulers/searchers/bayesopt/gpautograd/learncurve/issm.py (1 line):
- line 541: # TODO: Do we need AddJitterOp here?

syne_tune/optimizer/schedulers/hyperband_pasha.py (1 line):
- line 24: TODO: add link

syne_tune/optimizer/schedulers/searchers/bayesopt/gpautograd/kernel/cross_validation.py (1 line):
- line 65: TODO: Right now, all HPs are encoded, and the resource attribute counts as

syne_tune/optimizer/schedulers/synchronous/hyperband.py (1 line):
- line 107: TODO: Support model-based searchers.

benchmarking/blackbox_repository/conversion_scripts/scripts/nasbench201_import.py (1 line):
- line 166: # TODO: Try to save dummy file to S3 at start, to fail fast if the user [see the S3 write-check sketch after this list]
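
The repeated question about `AddJitterOp` (freeze_thaw.py lines 247 and 337, issm.py line 541) refers to the gpautograd operator that, roughly speaking, adds a small positive term to the diagonal of a covariance matrix so its Cholesky factorization stays numerically stable. The sketch below illustrates that general jitter technique in plain NumPy; it is not the Syne Tune operator, and the constants and retry policy are assumptions.

```python
import numpy as np


def cholesky_with_jitter(cov: np.ndarray, initial_jitter: float = 1e-9, max_tries: int = 5):
    """Cholesky factorization with growing diagonal jitter.

    Generic illustration of the jitter idea only; constants and the retry
    schedule are assumptions, not the gpautograd AddJitterOp implementation.
    """
    jitter = initial_jitter * float(np.mean(np.diag(cov)))
    for _ in range(max_tries):
        try:
            return np.linalg.cholesky(cov + jitter * np.eye(cov.shape[0]))
        except np.linalg.LinAlgError:
            jitter *= 10.0  # matrix not numerically positive definite: add more jitter
    raise np.linalg.LinAlgError("matrix is not positive definite, even with jitter")


# Example: an RBF kernel matrix on closely spaced inputs is nearly singular.
x = np.linspace(0.0, 1.0, 50)[:, None]
cov = np.exp(-0.5 * (x - x.T) ** 2 / 0.01)
chol = cholesky_with_jitter(cov)
```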
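
The two `__init__.py` TODOs about wildcard imports (in the searchers package and the gpautograd kernel package) point at the standard fix: re-export the public names explicitly and declare `__all__`. The module and class names in this sketch are hypothetical placeholders, not the actual Syne Tune symbols.

```python
# Hypothetical __init__.py sketch: instead of `from .some_module import *`,
# list the public names explicitly so the package surface is visible at a glance.
from .base_searcher import BaseSearcher      # hypothetical module and class
from .random_searcher import RandomSearcher  # hypothetical module and class

# __all__ keeps `from package import *` limited to the intended public names.
__all__ = [
    "BaseSearcher",
    "RandomSearcher",
]
```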
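
The experiments.py TODO about conditional imports describes the usual guard for optional dependencies: import inside try/except and only raise when the dependent feature is actually used. The dependency name (`matplotlib`) and the helper function below are illustrative assumptions, not what syne_tune/experiments.py actually needs to guard.

```python
# Sketch of a conditional import for an optional dependency.
try:
    import matplotlib.pyplot as plt  # assumed optional dependency, for illustration

    _HAS_MATPLOTLIB = True
except ImportError:
    plt = None
    _HAS_MATPLOTLIB = False


def plot_learning_curves(values):
    """Hypothetical helper: fail with a clear message only when plotting is requested."""
    if not _HAS_MATPLOTLIB:
        raise ImportError(
            "matplotlib is required for plotting; install it via `pip install matplotlib`"
        )
    plt.plot(values)
    plt.show()
```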
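
The nasbench201_import.py TODO proposes writing a dummy file to S3 at the start, presumably so that problems with S3 access surface before the long conversion runs. Below is a sketch of such a check with boto3; the bucket name, key prefix, and helper are hypothetical, and the real script may configure its client differently.

```python
import boto3
from botocore.exceptions import ClientError


def check_s3_write_access(bucket: str, prefix: str) -> None:
    """Fail fast by writing and deleting a tiny marker object (hypothetical helper)."""
    s3 = boto3.client("s3")
    key = f"{prefix}/.write_check"
    try:
        s3.put_object(Bucket=bucket, Key=key, Body=b"")
        s3.delete_object(Bucket=bucket, Key=key)
    except ClientError as ex:
        raise RuntimeError(
            f"Cannot write to s3://{bucket}/{prefix}: check credentials and bucket policy"
        ) from ex


# check_s3_write_access("my-blackbox-bucket", "nasbench201")  # hypothetical names
```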