task-sdk/src/airflow/sdk/execution_time/task_runner.py (20 lines):
  - line 114: # TODO: Move this entire class into a separate file:
  - line 152: # TODO: Move this to `airflow.sdk.execution_time.context`
  - line 170: # TODO: Ensure that ti.log_url and such are available to use in context
  - line 177: # TODO: Make this go through Public API longer term.
  - line 188: # TODO: Assess if we need to pass these through timezone.coerce_datetime
  - line 279: key: str = "return_value", # TODO: Make this a constant (``XCOM_RETURN_KEY``)
  - line 280: include_prior_dates: bool = False, # TODO: Add support for this
  - line 354: # TODO: AIP 72 Execution API only allows working with a single map_index at a time
  - line 389: # TODO: Implement this method
  - line 550: # TODO: Task-SDK:
  - line 637: # TODO: Investigate why some empty lines are sent to the processes stdin.
  - line 714: # TODO: Port one of the following to Task SDK
  - line 759: # TODO: Call pre execute etc.
  - line 773: # TODO: Should we use structlog.bind_contextvars here for dag_id, task_id & run_id?
  - line 880: # TODO: Handle fail_stop here: https://github.com/apache/airflow/issues/44951
  - line 881: # TODO: Handle addition to Log table: https://github.com/apache/airflow/issues/44952
  - line 1115: # TODO: handle timeout in case of deferral
  - line 1127: # TODO: handle on kill callback here
  - line 1192: # TODO: Use constant for XCom return key & use serialize_value from Task SDK
  - line 1257: # TODO: add an exception here, it causes an oof of a stack trace!
task-sdk/src/airflow/sdk/definitions/dag.py (9 lines):
  - line 108: # TODO: Task-SDK: we should be hashing on timetable now, not scheulde!
  - line 248: # TODO: Task-SDK: look at re-enabling slots after we remove pickling
  - line 396: # TODO: Task-SDK: Work out how to not import jinj2 until we need it! It's expensive
  - line 508: # TODO: Once
  - line 561: # TODO: This subclassing behaviour seems wrong, but it's what Airflow has done for ~ever.
  - line 673: # TODO: TaskSDK: move this on to BaseOperator and remove the check?
  - line 732: # TODO: Remove in RemovedInAirflow3Warning
  - line 944: # TODO: Task-SDK: this type ignore shouldn't be needed!
  - line 1096: # TODO: Task-SDK: remove __DAG_class
airflow-core/src/airflow/api_fastapi/execution_api/routes/task_instances.py (7 lines):
  - line 175: # TODO: Pass a RFC 9457 compliant error message in "detail" field
  - line 254: # TODO: Add variables and connections that are needed (and has perms) for the task
  - line 417: # TODO: HANDLE execution timeout later as it requires a call to the DB
  - line 473: # TODO: Replace this with FastAPI's Custom Exception handling:
  - line 600: # TODO: Add description to the operation
  - line 601: # TODO: Add Operation ID to control the function name in the OpenAPI spec
  - line 602: # TODO: Do we need to use create_openapi_http_exception_doc here?
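The task_runner.py items at lines 279 and 1192 point at the same cleanup: replace the repeated "return_value" literal with one shared constant. A minimal sketch of that extraction, with the module placement and function names purely illustrative:

    # Sketch only: where the constant ultimately lives in the Task SDK is not decided here.
    XCOM_RETURN_KEY = "return_value"  # single source of truth instead of scattered string literals


    def xcom_pull(key: str = XCOM_RETURN_KEY, include_prior_dates: bool = False):
        """Mirrors the default at line 279, now referencing the constant."""
        ...


    def push_return_value(ti, result):
        # Mirrors line 1192: the push site uses the same constant (and, per that TODO,
        # would also route the value through the Task SDK's serializer).
        ti.xcom_push(key=XCOM_RETURN_KEY, value=result)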
task-sdk/src/airflow/sdk/execution_time/context.py (7 lines):
  - line 123: # TODO: check cache first
  - line 144: # TODO: This should probably be moved to a separate module like `airflow.sdk.execution_time.comms`
  - line 169: # TODO: check cache first
  - line 191: # TODO: This should probably be moved to a separate module like `airflow.sdk.execution_time.comms`
  - line 217: # TODO: This should probably be moved to a separate module like `airflow.sdk.execution_time.comms`
  - line 263: # TODO: This should probably be moved to a separate module like `airflow.sdk.execution_time.comms`
  - line 375: # TODO: This is temporary to avoid code duplication between here & airflow/models/taskinstance.py
airflow-core/src/airflow/jobs/triggerer_job_runner.py (6 lines):
  - line 138: # TODO: signal instead.
  - line 642: # TODO: Is a separate dict worth it, or should we make `self.running_triggers` a dict?
  - line 676: # TODO: convert the dict back to a pretty stack trace
  - line 723: # TODO: set this in a sig-int handler
  - line 726: # TODO: connect this to the parent process
  - line 930: # TODO: better formatting of the exception?
task-sdk/src/airflow/sdk/bases/operator.py (5 lines):
  - line 100: # TODO: Task-SDK
  - line 552: # TODO: The following mapping is used to validate that the arguments passed to the BaseOperator are of the
  - line 875: # TODO: Task-SDK: Make these ClassVar[]?
  - line 1096: # TODO:
  - line 1265: sys.setrecursionlimit(5000) # TODO fix this in a better way
airflow-core/src/airflow/dag_processing/manager.py (5 lines):
  - line 491: # TODO: AIP-66 handle errors in the case of incomplete cloning? And test this.
  - line 499: # TODO: AIP-66 test to make sure we get a fresh record from the db and it's not cached
  - line 799: # TODO: Use an explicit session in this fn
  - line 1090: # TODO: This reloading should be removed when we fix our logging behaviour
  - line 1120: # TODO: AIP-66 emit metrics
airflow-core/src/airflow/models/taskinstance.py (5 lines):
  - line 368: # TODO: We don't push TaskMap for mapped task instances because it's not
  - line 2124: # TODO: TaskSDK this function will need 100% re-writing
  - line 2244: # TODO: TaskSDK add start_trigger_args to SDK definitions
  - line 2948: # TODO: TaskSDK: We should remove this, but many tests still currently call `ti.run()`. See #45549
  - line 3225: # TODO: Compare it with self._set_duration method
task-sdk/src/airflow/sdk/api/client.py (4 lines):
  - line 153: # TODO: handle the naming better. finish sounds wrong as "even" deferred is essentially finishing.
  - line 369: # TODO: check if we need to use map_index as params in the uri
  - line 408: # TODO: check if we need to use map_index as params in the uri
  - line 565: # TODO: Error handling
airflow-core/src/airflow/models/baseoperator.py (4 lines):
  - line 380: # TODO: Task-SDK: We need to set this to the scheduler DAG until we fully separate scheduling and
  - line 431: # TODO: Task-SDK: We need to set this to the scheduler DAG until we fully separate scheduling and
  - line 642: # TODO: TaskSDK This is only needed to support `dag.test()` etc until we port it over to use the
  - line 682: # TODO: TaskSDK This is only needed to support `dag.test()` etc until we port it over to use the
airflow-core/src/airflow/models/variable.py (4 lines):
  - line 140: # TODO: This is not the best way of having compat, but it's "better than erroring" for now. This still
  - line 197: # TODO: This is not the best way of having compat, but it's "better than erroring" for now. This still
  - line 257: # TODO: This is not the best way of having compat, but it's "better than erroring" for now. This still
  - line 312: # TODO: This is not the best way of having compat, but it's "better than erroring" for now. This still
task-sdk/src/airflow/sdk/execution_time/supervisor.py (4 lines):
  - line 335: # TODO: Make this process a session leader
  - line 767: # TODO: This should come from airflow.cfg: [core] task_success_overtime
  - line 1232: # TODO: convert the dict back to a pretty stack trace
  - line 1316: # TODO: Use logging providers to handle the chunked upload for us etc.
providers/microsoft/azure/src/airflow/providers/microsoft/azure/hooks/wasb.py (4 lines):
  - line 214: # TODO: rework the interface as it might also return AsyncContainerClient
  - line 417: # TODO: rework the interface as it might also return Awaitable
  - line 433: # TODO: rework the interface as it might also return Awaitable
  - line 687: # TODO: rework the interface as in parent Hook it returns ContainerClient
providers/cncf/kubernetes/src/airflow/providers/cncf/kubernetes/operators/pod.py (4 lines):
  - line 97: # TODO: Remove once provider drops support for Airflow 2
  - line 419: # TODO: remove it from here and from the operator's parameters list when the next major version bumped
  - line 425: self._config_dict: dict | None = None # TODO: remove it when removing convert_config_file_to_dict
  - line 503: # TODO: Remove this when the minimum version of Airflow is bumped to 3.0
pyproject.toml (4 lines):
  - line 485: # TODO: We can remove it once boto3 and aiobotocore both have compatible botocore version or
  - line 616: "D102", # TODO: Missing docstring in public method
  - line 617: "D103", # TODO: Missing docstring in public function
  - line 646: "ASYNC110", # TODO: Use `anyio.Event` instead of awaiting `anyio.sleep` in a `while` loop
airflow-core/src/airflow/serialization/serialized_objects.py (4 lines):
  - line 19: # TODO: update test_recursive_serialize_calls_must_forward_kwargs and re-enable RET505
  - line 759: # FIXME: casts set to list in customized serialization in future.
  - line 771: # FIXME: casts tuple to list in customized serialization in future.
  - line 1438: # TODO: refactor deserialization of BaseOperator and MappedOperator (split it out), then check
task-sdk/src/airflow/sdk/definitions/_internal/abstractoperator.py (3 lines):
  - line 71: # TODO: Task-SDK -- these defaults should be overridable from the Airflow config
  - line 328: # TODO: Task-SDK -- Should the following methods removed?
  - line 381: # TODO: Mask the value. Depends on https://github.com/apache/airflow/issues/45438
airflow-core/src/airflow/executors/base_executor.py (3 lines):
  - line 426: # TODO: TaskSDK: Compat, remove when KubeExecutor is fully moved over to TaskSDK too.
  - line 427: # TODO: TaskSDK: We need to minimum version requirements on executors with Airflow 3.
  - line 498: # TODO: This should not be using `TaskInstanceState` here, this is just "did the process complete, or did
airflow-core/src/airflow/api_fastapi/execution_api/routes/xcoms.py (3 lines):
  - line 46: # TODO: Placeholder for actual implementation
  - line 193: # TODO: once we have JWT tokens, then remove dag_id/run_id/task_id from the URL and just use the info in
  - line 265: # TODO: Can/should we check if a client _hasn't_ provided this for an upstream of a mapped task? That
airflow-core/src/airflow/dag_processing/processor.py (3 lines):
  - line 96: # TODO: Set known_pool names on DagBag!
  - line 116: # TODO: Make `bag.dag_warnings` not return SQLA model objects
  - line 163: # TODO:We need a proper context object!
airflow-core/src/airflow/models/dag.py (3 lines):
  - line 1746: # TODO: Task-SDK: This check is transitionary. Remove once all executors are ported over.
  - line 1856: # TODO: Task-SDK: remove this assert
  - line 1932: # TODO: AIP-66 should this be in the model?
scripts/ci/pre_commit/validate_operators_init.py (3 lines):
  - line 35: TODO: Enhance this function to work with nested inheritance trees through dynamic imports.
  - line 86: TODO: Enhance this function to work with nested inheritance trees through dynamic imports.
  - line 176: TODO: Enhance this function to work with nested inheritance trees through dynamic imports.
performance/src/performance_dags/performance_dag/performance_dag_utils.py (3 lines):
  - line 205: # TODO: allow every environment type to specify its own "forbidden" matching dag ids
  - line 523: # TODO: if PERF_MAX_RUNS is missing from configuration, then PERF_SCHEDULE_INTERVAL must
  - line 527: # TODO: we should not ban PERF_SCHEDULE_INTERVAL completely because we will make it impossible
providers/google/src/airflow/providers/google/cloud/hooks/cloud_storage_transfer_service.py (3 lines):
  - line 257: # TODO: remove one day
  - line 418: # TODO: remove one day
  - line 619: # TODO: remove one day
providers/standard/src/airflow/providers/standard/sensors/date_time.py (2 lines):
  - line 32: # TODO: Remove this when min airflow version is 2.10.0 for standard provider
  - line 50: # TODO: Remove once provider drops support for Airflow 2
task-sdk/src/airflow/sdk/definitions/decorators/task_group.py (2 lines):
  - line 122: # TODO: FIXME when mypy gets compatible with new attrs
  - line 129: # TODO: FIXME when mypy gets compatible with new attrs
providers/celery/src/airflow/providers/celery/executors/celery_kubernetes_executor.py (2 lines):
  - line 61: # TODO: Remove this flag once providers depend on Airflow 3.0
  - line 181: # TODO: Remove this once providers depend on Airflow 3.0
providers/standard/src/airflow/providers/standard/operators/python.py (2 lines):
  - line 74: except ImportError: # TODO: Remove once provider drops support for Airflow 2
  - line 1124: # TODO: To be removed when Airflow 2 support is dropped
airflow-core/src/airflow/dag_processing/collection.py (2 lines):
  - line 368: # TODO: This won't clear errors for files that exist that no longer contain DAGs. Do we need to pass
  - line 495: # FIXME: STORE NEW REFERENCES.
providers/google/src/airflow/providers/google/cloud/operators/gcs.py (2 lines):
  - line 848: # TODO: download in parallel.
  - line 888: # TODO: upload in parallel.
dev/breeze/src/airflow_breeze/utils/selective_checks.py (2 lines):
  - line 375: # TODO: In Python 3.12 we will be able to use itertools.batched
  - line 1003: # TODO: In Python 3.12 we will be able to use itertools.batched
providers/cncf/kubernetes/src/airflow/providers/cncf/kubernetes/executors/kubernetes_executor.py (2 lines):
  - line 153: # TODO: TaskSDK: move this type change into BaseExecutor
  - line 303: # TODO: AIP-72 handle populating tokens once https://github.com/apache/airflow/issues/45107 is handled.
airflow-core/src/airflow/models/connection.py (2 lines):
  - line 460: # TODO: This is not the best way of having compat, but it's "better than erroring" for now. This still
  - line 467: # TODO: AIP 72: Add deprecation here once we move this module to task sdk.
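The two selective_checks.py items above note that itertools.batched (new in Python 3.12) could replace hand-rolled chunking. A small stdlib-only sketch of that call next to a pre-3.12 fallback; the fallback helper name is made up for this illustration:

    import sys
    from itertools import islice


    def batched_compat(iterable, n):
        """Same behaviour as itertools.batched for Pythons older than 3.12."""
        it = iter(iterable)
        while chunk := tuple(islice(it, n)):
            yield chunk


    if sys.version_info >= (3, 12):
        from itertools import batched  # available in the stdlib from 3.12 onwards
    else:
        batched = batched_compat

    print(list(batched(range(7), 3)))  # [(0, 1, 2), (3, 4, 5), (6,)]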
providers/google/src/airflow/providers/google/cloud/transfers/gcs_to_gcs.py (2 lines):
  - line 469: # TODO: After deprecating delimiter and wildcards in source objects,
  - line 478: # TODO: After deprecating delimiter and wildcards in source objects,
airflow-core/src/airflow/cli/commands/task_command.py (2 lines):
  - line 164: # TODO: Task-SDK: Shouldn't really happen, and this command will go away before 3.0
  - line 197: # TODO: Validate map_index is in range?
providers/openlineage/src/airflow/providers/openlineage/utils/utils.py (2 lines):
  - line 34: # TODO: move this maybe to Airflow's logic?
  - line 749: # TODO: FIXME when mypy gets compatible with new attrs
providers/standard/src/airflow/providers/standard/sensors/time.py (2 lines):
  - line 30: # TODO: Remove this when min airflow version is 2.10.0 for standard provider
  - line 48: # TODO: Remove once provider drops support for Airflow 2
airflow-core/src/airflow/api_fastapi/common/db/common.py (2 lines):
  - line 115: # TODO: Re-enable when permissions are handled. Readable / writable entities,
  - line 174: # TODO: Re-enable when permissions are handled. Readable / writable entities,
providers/samba/src/airflow/providers/samba/transfers/gcs_to_samba.py (2 lines):
  - line 38: # TODO: Remove once provider drops support for Airflow 2
  - line 160: # TODO: After deprecating delimiter and wildcards in source objects,
task-sdk/src/airflow/sdk/definitions/_internal/expandinput.py (2 lines):
  - line 144: # TODO: This initiates one API call for each XComArg. Would it be
  - line 203: # TODO: This initiates one API call for each XComArg. Would it be
providers/cncf/kubernetes/src/airflow/providers/cncf/kubernetes/executors/local_kubernetes_executor.py (2 lines):
  - line 52: # TODO: Remove this attribute once providers rely on Airflow >=3.0.0
  - line 87: # TODO: fix this, there is misalignment between the types of queued_tasks so it is likely wrong
providers/amazon/src/airflow/providers/amazon/aws/executors/ecs/ecs_executor.py (2 lines):
  - line 109: # TODO: TaskSDK: move this type change into BaseExecutor
  - line 604: # TODO: remove this method when min_airflow_version is set to higher than 2.10.0
airflow-core/src/airflow/migrations/versions/0032_3_0_0_rename_execution_date_to_logical_date_and_nullable.py (2 lines):
  - line 97: op.execute(f"-- TODO: DAG runs unable to be downgraded are moved to {offending_table_name}.")
  - line 98: op.execute(f"-- TODO: Table {offending_table_name} can be removed after contained data are reviewed.")
task-sdk/src/airflow/sdk/execution_time/comms.py (2 lines):
  - line 129: key: str = "return_value", # TODO: Make this a constant; see RuntimeTaskInstance.
  - line 266: # TODO: Create a convert api_response to result classes so we don't need to do this
providers/cncf/kubernetes/src/airflow/providers/cncf/kubernetes/operators/spark_kubernetes.py (2 lines):
  - line 42: # TODO: Remove once provider drops support for Airflow 2
  - line 215: # TODO: Remove this when the minimum version of Airflow is bumped to 3.0
task-sdk/src/airflow/sdk/definitions/mappedoperator.py (2 lines):
  - line 251: # TODO: Move these to task SDK's BaseOperator and remove getattr
  - line 758: # TODO: TaskSDK: This probably doesn't need to live in definition time as the next section of code is
providers/google/src/airflow/providers/google/cloud/hooks/gcs.py (2 lines):
  - line 334: # TODO: future improvement check file size before downloading,
  - line 1341: # TODO: Add batch. I tried to do it, but the Google library is not stable at the moment.
providers/standard/src/airflow/providers/standard/sensors/filesystem.py (2 lines):
  - line 37: # TODO: Remove this when min airflow version is 2.10.0 for standard provider
  - line 53: # TODO: Remove once provider drops support for Airflow 2
airflow-core/src/airflow/jobs/scheduler_job_runner.py (2 lines):
  - line 701: # TODO: Task-SDK: This check is transitionary. Remove once all executors are ported over.
  - line 1712: # TODO: Logically, this should be DagRunInfo.run_after, but the
providers/papermill/src/airflow/providers/papermill/operators/papermill.py (2 lines):
  - line 36: # TODO: Remove once provider drops support for Airflow 2
  - line 63: # TODO: Remove this when provider drops 2.x support.
providers/common/sql/src/airflow/providers/common/sql/hooks/sql.py (2 lines):
  - line 97: # TODO: this check can be removed once common sql provider depends on Airflow 3.0 or higher,
  - line 102: # TODO: this can be removed once common sql provider depends on Airflow 3.0 or higher
providers/teradata/src/airflow/providers/teradata/operators/teradata_compute_cluster.py (2 lines):
  - line 34: # TODO: Remove once provider drops support for Airflow 2
  - line 47: # TODO: Remove once provider drops support for Airflow 2
providers/slack/src/airflow/providers/slack/operators/slack.py (1 line):
  - line 36: # TODO: Remove once provider drops support for Airflow 2
task-sdk/src/airflow/sdk/definitions/context.py (1 line):
  - line 64: # TODO: Remove Operator from below once we have MappedOperator to the Task SDK
providers/amazon/src/airflow/providers/amazon/aws/executors/batch/batch_executor.py (1 line):
  - line 465: # TODO: remove this method when min_airflow_version is set to higher than 2.10.0
providers/mysql/src/airflow/providers/mysql/transfers/vertica_to_mysql.py (1 line):
  - line 44: # TODO: Remove once provider drops support for Airflow 2
providers/tableau/src/airflow/providers/tableau/operators/tableau.py (1 line):
  - line 34: # TODO: Remove once provider drops support for Airflow 2
providers/amazon/pyproject.toml (1 line):
  - line 99: # TODO: We can remove it once boto3 and aiobotocore both have compatible botocore version or
providers/databricks/src/airflow/providers/databricks/operators/databricks_sql.py (1 line):
  - line 343: # TODO: think on how to make sure that table_name and expression_list aren't used for SQL injection
providers/microsoft/winrm/src/airflow/providers/microsoft/winrm/operators/winrm.py (1 line):
  - line 110: # TODO: Remove this after minimum Airflow version is 3.0
task-sdk/src/airflow/sdk/definitions/xcom_arg.py (1 line):
  - line 594: # TODO: How to tell if all the upstream TIs finished?
providers/imap/src/airflow/providers/imap/sensors/imap_attachment.py (1 line):
  - line 32: # TODO: Remove once provider drops support for Airflow 2
airflow-core/src/airflow/executors/local_executor.py (1 line):
  - line 121: # This is the "wrong" ti type, but it duck types the same. TODO: Create a protocol for this.
providers/common/sql/src/airflow/providers/common/sql/operators/sql.py (1 line):
  - line 155: # TODO: can be removed once Airflow min version for this provider is 3.0.0 or higher
airflow-core/src/airflow/utils/cli.py (1 line):
  - line 227: # TODO: AIP-66 - investigate more, can we use serdag?
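The databricks_sql.py item above (line 343) flags that table_name and expression_list could be abused for SQL injection if interpolated unchecked. One common mitigation, sketched here purely as an illustration and not as the operator's actual approach, is to allow-list identifier characters before building the statement:

    from __future__ import annotations

    import re

    # Accept plain or dot-qualified identifiers only (e.g. "my_table" or "schema.table").
    _IDENTIFIER_RE = re.compile(r"^[A-Za-z_][A-Za-z0-9_.]*$")


    def validate_identifier(name: str) -> str:
        """Reject anything that does not look like a bare SQL identifier."""
        if not _IDENTIFIER_RE.match(name):
            raise ValueError(f"Unsafe SQL identifier: {name!r}")
        return name


    def build_copy_into(table_name: str, expression_list: str | None = None) -> str:
        # Illustrative statement shape only. expression_list is not a single identifier,
        # so it would still need stricter handling (e.g. parsing into known column names).
        columns = f" ({expression_list})" if expression_list else ""
        return f"COPY INTO {validate_identifier(table_name)}{columns}"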
providers/standard/src/airflow/providers/standard/sensors/external_task.py (1 line):
  - line 52: # TODO: Remove once provider drops support for Airflow 2
providers/slack/src/airflow/providers/slack/operators/slack_webhook.py (1 line):
  - line 33: # TODO: Remove once provider drops support for Airflow 2
airflow-core/src/airflow/ui/src/pages/Asset/CreateAssetEventModal.tsx (1 line):
  - line 72: // TODO move validate + prettify into JsonEditor
chart/values.yaml (1 line):
  - line 478: # TODO: difference from `env`? This is a templated string. Probably should template `env` and remove this.
providers/arangodb/src/airflow/providers/arangodb/operators/arangodb.py (1 line):
  - line 31: # TODO: Remove once provider drops support for Airflow 2
airflow-core/src/airflow/ui/src/pages/Run/Details.tsx (1 line):
  - line 51: // TODO : Render DagRun configuration object
providers/http/src/airflow/providers/http/operators/http.py (1 line):
  - line 42: # TODO: Remove once provider drops support for Airflow 2
providers/databricks/src/airflow/providers/databricks/operators/databricks_repos.py (1 line):
  - line 36: # TODO: Remove once provider drops support for Airflow 2
providers/influxdb/src/airflow/providers/influxdb/operators/influxdb.py (1 line):
  - line 30: # TODO: Remove once provider drops support for Airflow 2
providers/databricks/src/airflow/providers/databricks/sensors/databricks_sql.py (1 line):
  - line 36: # TODO: Remove once provider drops support for Airflow 2
airflow-ctl/src/airflowctl/ctl/cli_config.py (1 line):
  - line 261: # TODO change this while removing Python 3.9 support
providers/segment/src/airflow/providers/segment/operators/segment_track_event.py (1 line):
  - line 30: # TODO: Remove once provider drops support for Airflow 2
providers/standard/src/airflow/providers/standard/operators/bash.py (1 line):
  - line 43: # TODO: Remove once provider drops support for Airflow 2
providers/sftp/src/airflow/providers/sftp/sensors/sftp.py (1 line):
  - line 40: # TODO: Remove once provider drops support for Airflow 2
providers/amazon/src/airflow/providers/amazon/aws/hooks/redshift_cluster.py (1 line):
  - line 79: # TODO: Wrap create_cluster_snapshot
providers/microsoft/winrm/src/airflow/providers/microsoft/winrm/hooks/winrm.py (1 line):
  - line 32: # TODO: FIXME please - I have too complex implementation
providers/google/src/airflow/providers/google/cloud/hooks/datafusion.py (1 line):
  - line 472: # TODO: This API endpoint starts multiple pipelines. There will eventually be a fix
providers/tableau/src/airflow/providers/tableau/sensors/tableau.py (1 line):
  - line 33: # TODO: Remove once provider drops support for Airflow 2
airflow-core/src/airflow/ui/src/components/ActionAccordion/ActionAccordion.tsx (1 line):
  - line 36: // TODO: Make a front-end only unconnected table component with client side ordering and pagination
providers/mysql/src/airflow/providers/mysql/transfers/trino_to_mysql.py (1 line):
  - line 31: # TODO: Remove once provider drops support for Airflow 2
providers/apache/tinkerpop/src/airflow/providers/apache/tinkerpop/operators/gremlin.py (1 line):
  - line 28: # TODO: Remove once provider drops support for Airflow 2
providers/opensearch/src/airflow/providers/opensearch/log/os_json_formatter.py (1 line):
  - line 35: # TODO: Use airflow.utils.timezone.from_timestamp(record.created, tz="local")
airflow-core/src/airflow/migrations/versions/0017_2_9_2_fix_inconsistency_between_ORM_and_migration_files.py (1 line):
  - line 46: # TODO: Rewrite these queries to use alembic when lowest MYSQL version supports IF EXISTS
airflow-core/src/airflow/api_fastapi/auth/tokens.py (1 line):
  - line 294: # TODO: We could probably populate this from the jwks document, but we don't have that at
providers/microsoft/azure/src/airflow/providers/microsoft/azure/hooks/msgraph.py (1 line):
  - line 326: # TODO: Once provider depends on Airflow 2.10 or higher code below won't be needed anymore as
providers/common/io/src/airflow/providers/common/io/xcom/backend.py (1 line):
  - line 134: # TODO: Remove this branch once we drop support for Airflow 2
providers/openlineage/src/airflow/providers/openlineage/extractors/manager.py (1 line):
  - line 176: # TODO: Re-enable in Extractor PR
providers/google/src/airflow/providers/google/cloud/sensors/tasks.py (1 line):
  - line 81: # TODO uncomment page_size once https://issuetracker.google.com/issues/155978649?pli=1 gets fixed
providers/google/src/airflow/providers/google/suite/transfers/gcs_to_gdrive.py (1 line):
  - line 140: # TODO: After deprecating delimiter and wildcards in source objects,
airflow-core/src/airflow/executors/workloads.py (1 line):
  - line 74: # TODO: Task-SDK: Can we replace TastInstanceKey with just the uuid across the codebase?
providers/opensearch/src/airflow/providers/opensearch/operators/opensearch.py (1 line):
  - line 37: # TODO: Remove once provider drops support for Airflow 2
airflow-core/src/airflow/__init__.py (1 line):
  - line 69: # TODO: Remove this module in Airflow 3.2
providers/docker/src/airflow/providers/docker/operators/docker_swarm.py (1 line):
  - line 38: # TODO: Remove once provider drops support for Airflow 2
providers/standard/src/airflow/providers/standard/operators/trigger_dagrun.py (1 line):
  - line 59: # TODO: Remove once provider drops support for Airflow 2
providers/google/src/airflow/providers/google/cloud/links/dataproc.py (1 line):
  - line 47: # TODO: remove DATAPROC_JOB_LOG_LINK alias in the next major release
providers/jenkins/src/airflow/providers/jenkins/sensors/jenkins.py (1 line):
  - line 31: # TODO: Remove once provider drops support for Airflow 2
providers/github/src/airflow/providers/github/sensors/github.py (1 line):
  - line 32: # TODO: Remove once provider drops support for Airflow 2
providers/cohere/src/airflow/providers/cohere/operators/embedding.py (1 line):
  - line 34: # TODO: Remove once provider drops support for Airflow 2
providers/opsgenie/src/airflow/providers/opsgenie/operators/opsgenie.py (1 line):
  - line 30: # TODO: Remove once provider drops support for Airflow 2
providers/asana/src/airflow/providers/asana/operators/asana_tasks.py (1 line):
  - line 29: # TODO: Remove once provider drops support for Airflow 2
airflow-core/src/airflow/exceptions.py (1 line):
  - line 416: # TODO: workout this to correct place https://github.com/apache/airflow/issues/44353
airflow-core/src/airflow/ui/src/pages/Dag/Tasks/TaskCard.tsx (1 line):
  - line 92: {/* TODO: Handled mapped tasks to not plot each map index as a task instance */}
providers/standard/src/airflow/providers/standard/sensors/bash.py (1 line):
  - line 33: # TODO: Remove once provider drops support for Airflow 2
scripts/ci/pre_commit/check_base_operator_partial_arguments.py (1 line):
  - line 83: # TODO: Task-SDK: Look at the BaseOperator init functions in both airflow.models.baseoperator and combine
providers/standard/src/airflow/providers/standard/operators/datetime.py (1 line):
  - line 31: # TODO: Remove once provider drops support for Airflow 2
providers/grpc/src/airflow/providers/grpc/operators/grpc.py (1 line):
  - line 30: # TODO: Remove once provider drops support for Airflow 2
airflow-core/src/airflow/api_fastapi/core_api/services/ui/connections.py (1 line):
  - line 247: default_conn_name=None, # TODO: later
providers/telegram/src/airflow/providers/telegram/operators/telegram.py (1 line):
  - line 33: # TODO: Remove once provider drops support for Airflow 2
providers/celery/src/airflow/providers/celery/sensors/celery_queue.py (1 line):
  - line 30: # TODO: Remove once provider drops support for Airflow 2
providers/presto/src/airflow/providers/presto/transfers/gcs_to_presto.py (1 line):
  - line 36: # TODO: Remove once provider drops support for Airflow 2
providers/weaviate/src/airflow/providers/weaviate/operators/weaviate.py (1 line):
  - line 34: # TODO: Remove once provider drops support for Airflow 2
providers/teradata/src/airflow/providers/teradata/operators/teradata.py (1 line):
  - line 31: # TODO: Remove once provider drops support for Airflow 2
providers/yandex/src/airflow/providers/yandex/links/yq.py (1 line):
  - line 28: # TODO: Remove once provider drops support for Airflow 2
providers/pinecone/src/airflow/providers/pinecone/operators/pinecone.py (1 line):
  - line 33: # TODO: Remove once provider drops support for Airflow 2
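Many provider entries in this report carry the same marker, "Remove once provider drops support for Airflow 2". The code such comments typically sit next to is a version gate along these lines; the constant name and the particular import being gated are illustrative, not quoted from any one provider:

    from __future__ import annotations

    from packaging.version import Version

    from airflow import __version__ as AIRFLOW_VERSION

    # Strip pre-release/dev suffixes before comparing (so "3.0.0.dev0" counts as 3.0.0).
    AIRFLOW_V_3_0_PLUS = Version(Version(AIRFLOW_VERSION).base_version) >= Version("3.0.0")

    if AIRFLOW_V_3_0_PLUS:
        from airflow.sdk import BaseOperator  # Airflow 3 / Task SDK import path
    else:
        from airflow.models import BaseOperator  # Airflow 2 fallback; the branch these TODOs want removed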
airflow-core/src/airflow/models/dagrun.py (1 line):
  - line 1473: # TODO: Logically, this should be DagRunInfo.run_after, but the
providers/redis/src/airflow/providers/redis/sensors/redis_key.py (1 line):
  - line 30: # TODO: Remove once provider drops support for Airflow 2
providers/trino/src/airflow/providers/trino/transfers/gcs_to_trino.py (1 line):
  - line 36: # TODO: Remove once provider drops support for Airflow 2
providers/edge3/src/airflow/providers/edge3/cli/edge_command.py (1 line):
  - line 283: # This is the "wrong" ti type, but it duck types the same. TODO: Create a protocol for this.
providers/github/src/airflow/providers/github/hooks/github.py (1 line):
  - line 61: # TODO: When/If other auth methods are implemented this exception should be removed/modified.
providers/opsgenie/src/airflow/providers/opsgenie/notifications/opsgenie.py (1 line):
  - line 33: # TODO: Remove once provider drops support for Airflow 2
airflow-core/src/airflow/api_fastapi/execution_api/datamodels/token.py (1 line):
  - line 26: # TODO: This is a placeholder for Task Identity Token schema.
providers/standard/src/airflow/providers/standard/operators/weekday.py (1 line):
  - line 31: # TODO: Remove once provider drops support for Airflow 2
airflow-core/src/airflow/api_fastapi/core_api/security.py (1 line):
  - line 84: # TODO remove try-except when authentication integrated everywhere, safeguard for non integrated clients and endpoints
providers/discord/src/airflow/providers/discord/operators/discord_webhook.py (1 line):
  - line 31: # TODO: Remove once provider drops support for Airflow 2
providers/oracle/src/airflow/providers/oracle/transfers/oracle_to_oracle.py (1 line):
  - line 30: # TODO: Remove once provider drops support for Airflow 2
airflow-core/src/airflow/api_fastapi/execution_api/routes/assets.py (1 line):
  - line 29: # TODO: Add dependency on JWT token
airflow-core/src/airflow/api_fastapi/execution_api/routes/variables.py (1 line):
  - line 39: # TODO: Placeholder for actual implementation
providers/google/src/airflow/providers/google/cloud/sensors/bigquery.py (1 line):
  - line 81: # TODO: Remove once deprecated
providers/redis/src/airflow/providers/redis/operators/redis_publish.py (1 line):
  - line 30: # TODO: Remove once provider drops support for Airflow 2
airflow-core/src/airflow/ui/src/queries/useDagParams.ts (1 line):
  - line 31: // TODO define the structure on API as generated code
clients/python/openapi_v1.yaml (1 line):
  - line 5081: # TODO: Why we need these fields?
providers/google/src/airflow/providers/google/cloud/hooks/bigquery.py (1 line):
  - line 2230: # TODO: Convert get_records into an async method
task-sdk/src/airflow/sdk/types.py (1 line):
  - line 74: # TODO: `include_prior_dates` isn't yet supported in the SDK
providers/teradata/src/airflow/providers/teradata/transfers/s3_to_teradata.py (1 line):
  - line 38: # TODO: Remove once provider drops support for Airflow 2
scripts/ci/pre_commit/update_airflow_pyproject_toml.py (1 line):
  - line 94: # TODO: when min Python version is 3.11 change back the code to fromisoformat
providers/edge3/src/airflow/providers/edge3/cli/api_client.py (1 line):
  - line 55: # TODO: Consider these env variables jointly in task sdk together with task_sdk/src/airflow/sdk/api/client.py
providers/smtp/src/airflow/providers/smtp/operators/smtp.py (1 line):
  - line 32: # TODO: Remove once provider drops support for Airflow 2
providers/http/src/airflow/providers/http/sensors/http.py (1 line):
  - line 35: # TODO: Remove once provider drops support for Airflow 2
airflow-core/src/airflow/utils/dag_parsing_context.py (1 line):
  - line 25: # TODO: Remove this module in Airflow 3.2
providers/standard/src/airflow/providers/standard/sensors/python.py (1 line):
  - line 31: # TODO: Remove once provider drops support for Airflow 2
dev/stats/get_important_pr_candidates.py (1 line):
  - line 311: @option_github_token # TODO: this should only be required if --load isn't provided
task-sdk/src/airflow/sdk/definitions/_internal/contextmanager.py (1 line):
  - line 54: # TODO: Task-SDK:
providers/github/src/airflow/providers/github/operators/github.py (1 line):
  - line 32: # TODO: Remove once provider drops support for Airflow 2
task-sdk/src/airflow/sdk/definitions/connection.py (1 line):
  - line 153: # TODO: Mask sensitive keys from this list or revisit if it will be done in server
providers/google/src/airflow/providers/google/cloud/transfers/gcs_to_sftp.py (1 line):
  - line 154: # TODO: After deprecating delimiter and wildcards in source objects,
task-sdk/src/airflow/sdk/definitions/variable.py (1 line):
  - line 46: # TODO: Extend this definition for reading/writing variables without context
airflow-core/src/airflow/api_fastapi/execution_api/routes/connections.py (1 line):
  - line 35: # TODO: Placeholder for actual implementation
scripts/ci/pre_commit/check_init_decorator_arguments.py (1 line):
  - line 127: # TODO: This one is legacy access control. Remove it in 3.0. RemovedInAirflow3Warning
providers/microsoft/azure/src/airflow/providers/microsoft/azure/log/wasb_task_handler.py (1 line):
  - line 90: # TODO: fix this - "relative path" i.e currently REMOTE_BASE_LOG_FOLDER should start with "wasb"
providers/oracle/src/airflow/providers/oracle/operators/oracle.py (1 line):
  - line 33: # TODO: Remove once provider drops support for Airflow 2
providers/dingding/src/airflow/providers/dingding/operators/dingding.py (1 line):
  - line 30: # TODO: Remove once provider drops support for Airflow 2
providers/databricks/src/airflow/providers/databricks/sensors/databricks_partition.py (1 line):
  - line 39: # TODO: Remove once provider drops support for Airflow 2
airflow-core/src/airflow/models/taskmap.py (1 line):
  - line 235: # TODO: Make more efficient with bulk_insert_mappings/bulk_save_mappings.
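The update_airflow_pyproject_toml.py item above (line 94) wants to switch back to datetime.fromisoformat once Python 3.11 is the floor. A frequent reason for avoiding it on older interpreters is its limited ISO-8601 support before 3.11 (for example the trailing "Z" designator), though whether that is the exact reason in this script is an assumption. A before/after sketch:

    import sys
    from datetime import datetime, timezone

    stamp = "2025-01-15T10:30:00Z"

    if sys.version_info >= (3, 11):
        parsed = datetime.fromisoformat(stamp)  # 3.11+ accepts the trailing "Z"
    else:
        parsed = datetime.fromisoformat(stamp.replace("Z", "+00:00"))  # common pre-3.11 workaround

    assert parsed.astimezone(timezone.utc).hour == 10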
providers/standard/src/airflow/providers/standard/sensors/weekday.py (1 line):
  - line 31: # TODO: Remove once provider drops support for Airflow 2
providers/qdrant/src/airflow/providers/qdrant/operators/qdrant.py (1 line):
  - line 33: # TODO: Remove once provider drops support for Airflow 2
providers/google/src/airflow/providers/google/cloud/utils/external_token_supplier.py (1 line):
  - line 160: # TODO more information about the error can be provided in the exception by inspecting the response
providers/http/src/airflow/providers/http/hooks/http.py (1 line):
  - line 349: # TODO: remove ignore type when https://github.com/jd/tenacity/issues/428 is resolved
airflow-ctl/src/airflowctl/api/operations.py (1 line):
  - line 152: # TODO: Get all with limit and offset to overcome default 100 limit for all list operations
providers/databricks/src/airflow/providers/databricks/hooks/databricks_sql.py (1 line):
  - line 255: # TODO: adjust this to make testing easier
scripts/ci/pre_commit/check_template_context_variable_in_sync.py (1 line):
  - line 158: # TODO: These are the keys that are yet to be ported over to the Task SDK.
task-sdk/src/airflow/sdk/definitions/asset/__init__.py (1 line):
  - line 152: # TODO: Collect this into a DagWarning.
providers/celery/src/airflow/providers/celery/executors/celery_executor.py (1 line):
  - line 229: # TODO: TaskSDK: move this type change into BaseExecutor
airflow-core/src/airflow/ui/src/queries/useLogs.tsx (1 line):
  - line 80: // TODO: Add support for nested groups
providers/opensearch/src/airflow/providers/opensearch/log/os_task_handler.py (1 line):
  - line 307: # TODO: Task-SDK: Where should this function be?
airflow-core/src/airflow/sensors/__init__.py (1 line):
  - line 29: # TODO: Add definition from Task SDK here and remove `base.py` file
task-sdk/src/airflow/sdk/definitions/asset/decorators.py (1 line):
  - line 216: outlets: Collection[BaseAsset] # TODO: Support non-asset outlets?
providers/redis/src/airflow/providers/redis/sensors/redis_pub_sub.py (1 line):
  - line 31: # TODO: Remove once provider drops support for Airflow 2
airflow-core/src/airflow/ui/src/pages/XCom/XComEntry.tsx (1 line):
  - line 48: width={200} // TODO: Make Skeleton take style from column definition
providers/google/src/airflow/providers/google/cloud/hooks/compute_ssh.py (1 line):
  - line 38: # TODO:(potiuk) We should add test harness detecting such cases shortly
providers/salesforce/src/airflow/providers/salesforce/operators/bulk.py (1 line):
  - line 32: # TODO: Remove once provider drops support for Airflow 2
providers/jenkins/src/airflow/providers/jenkins/operators/jenkins_job_trigger.py (1 line):
  - line 157: # TODO Use get_queue_info instead
providers/google/src/airflow/providers/google/cloud/hooks/pubsub.py (1 line):
  - line 159: # TODO: remove one day
providers/mysql/src/airflow/providers/mysql/transfers/presto_to_mysql.py (1 line):
  - line 31: # TODO: Remove once provider drops support for Airflow 2
airflow-core/src/airflow/models/expandinput.py (1 line):
  - line 87: # TODO: This initiates one database call for each XComArg. Would it be
providers/neo4j/src/airflow/providers/neo4j/operators/neo4j.py (1 line):
  - line 30: # TODO: Remove once provider drops support for Airflow 2
providers/slack/src/airflow/providers/slack/transfers/sql_to_slack_webhook.py (1 line):
  - line 32: # TODO: Remove once provider drops support for Airflow 2
devel-common/pyproject.toml (1 line):
  - line 103: # TODO: upgrade to newer versions of MyPy continuously as they are released
providers/sftp/src/airflow/providers/sftp/hooks/sftp.py (1 line):
  - line 94: # TODO: remove support for ssh_hook when it is removed from SFTPOperator
providers/mongo/src/airflow/providers/mongo/sensors/mongo.py (1 line):
  - line 30: # TODO: Remove once provider drops support for Airflow 2
airflow-core/src/airflow/api_fastapi/auth/managers/base_auth_manager.py (1 line):
  - line 68: # TODO: Move this inside once all providers drop Airflow 2.x support.
airflow-core/src/airflow/api_fastapi/execution_api/datamodels/taskinstance.py (1 line):
  - line 281: # TODO: `dag_id` and `run_id` are duplicated from TaskInstance
providers/standard/src/airflow/providers/standard/sensors/time_delta.py (1 line):
  - line 37: # TODO: Remove once provider drops support for Airflow 2
providers/docker/src/airflow/providers/docker/decorators/docker.py (1 line):
  - line 43: # TODO: Remove once provider drops support for Airflow 2
airflow-core/src/airflow/api_fastapi/execution_api/routes/asset_events.py (1 line):
  - line 33: # TODO: Add dependency on JWT token
providers/datadog/src/airflow/providers/datadog/sensors/datadog.py (1 line):
  - line 32: # TODO: Remove once provider drops support for Airflow 2
providers/snowflake/src/airflow/providers/snowflake/operators/snowflake.py (1 line):
  - line 40: # TODO: Remove once provider drops support for Airflow 2
task-sdk/src/airflow/sdk/definitions/_internal/mixins.py (1 line):
  - line 30: # TODO: Should this all just live on DAGNode?
providers/standard/src/airflow/providers/standard/operators/latest_only.py (1 line):
  - line 40: # TODO: Remove once provider drops support for Airflow 2
providers/ftp/src/airflow/providers/ftp/sensors/ftp.py (1 line):
  - line 32: # TODO: Remove once provider drops support for Airflow 2
airflow-core/src/airflow/datasets/metadata.py (1 line):
  - line 24: # TODO: Remove this module in Airflow 3.2
providers/databricks/src/airflow/providers/databricks/hooks/databricks_base.py (1 line):
  - line 639: # TODO: get rid of explicit 'api/' in the endpoint specification
providers/salesforce/src/airflow/providers/salesforce/operators/salesforce_apex_rest.py (1 line):
  - line 28: # TODO: Remove once provider drops support for Airflow 2
providers/common/sql/src/airflow/providers/common/sql/operators/generic_transfer.py (1 line):
  - line 36: # TODO: Remove once provider drops support for Airflow 2
providers/elasticsearch/src/airflow/providers/elasticsearch/log/es_task_handler.py (1 line):
  - line 253: # TODO: Task-SDK: Where should this function be?
airflow-core/src/airflow/utils/db.py (1 line):
  - line 75: # TODO: Import this from sqlalchemy.orm instead when switching to SQLA 2.
providers/yandex/src/airflow/providers/yandex/operators/yq.py (1 line):
  - line 31: # TODO: Remove once provider drops support for Airflow 2
providers/slack/src/airflow/providers/slack/transfers/sql_to_slack.py (1 line):
  - line 35: # TODO: Remove once provider drops support for Airflow 2
task-sdk/src/airflow/sdk/execution_time/execute_workload.py (1 line):
  - line 64: # This is the "wrong" ti type, but it duck types the same. TODO: Create a protocol for this.
providers/celery/src/airflow/providers/celery/executors/celery_executor_utils.py (1 line):
  - line 168: # This is the "wrong" ti type, but it duck types the same. TODO: Create a protocol for this.
providers/mysql/src/airflow/providers/mysql/transfers/s3_to_mysql.py (1 line):
  - line 31: # TODO: Remove once provider drops support for Airflow 2
airflow-core/src/airflow/datasets/__init__.py (1 line):
  - line 30: # TODO: Remove this module in Airflow 3.2
providers/arangodb/src/airflow/providers/arangodb/sensors/arangodb.py (1 line):
  - line 30: # TODO: Remove once provider drops support for Airflow 2
providers/yandex/src/airflow/providers/yandex/operators/dataproc.py (1 line):
  - line 30: # TODO: Remove once provider drops support for Airflow 2
dev/breeze/src/airflow_breeze/global_constants.py (1 line):
  - line 698: # TODO: bring back common-messaging when we bump airflow to 3.0.1
providers/openai/src/airflow/providers/openai/operators/openai.py (1 line):
  - line 35: # TODO: Remove once provider drops support for Airflow 2
task-sdk/src/airflow/sdk/definitions/taskgroup.py (1 line):
  - line 148: # TODO: If attrs supported init only args we could use that here
providers/elasticsearch/src/airflow/providers/elasticsearch/log/es_json_formatter.py (1 line):
  - line 35: # TODO: Use airflow.utils.timezone.from_timestamp(record.created, tz="local")
providers/google/src/airflow/providers/google/cloud/operators/dataproc.py (1 line):
  - line 652: # TODO: remove one day
providers/teradata/src/airflow/providers/teradata/transfers/teradata_to_teradata.py (1 line):
  - line 31: # TODO: Remove once provider drops support for Airflow 2
providers/microsoft/azure/src/airflow/providers/microsoft/azure/operators/adx.py (1 line):
  - line 81: # TODO: Remove this after minimum Airflow version is 3.0
airflow-ctl/pyproject.toml (1 line):
  - line 25: # TODO there could be still missing deps such as airflow-core
providers/docker/src/airflow/providers/docker/operators/docker.py (1 line):
  - line 54: # TODO: Remove once provider drops support for Airflow 2
airflow-core/src/airflow/ui/src/pages/Dag/Dag.tsx (1 line):
  - line 57: // TODO: replace with with a list dag runs by dag id request
providers/ssh/src/airflow/providers/ssh/operators/ssh.py (1 line):
  - line 174: # TODO: Remove this after minimum Airflow version is 3.0
providers/fab/src/airflow/providers/fab/auth_manager/security_manager/override.py (1 line):
  - line 1091: # TODO: An action can't be removed from a role.
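The note "This is the 'wrong' ti type, but it duck types the same. TODO: Create a protocol for this." recurs in local_executor.py, edge_command.py, execute_workload.py and celery_executor_utils.py above. What the TODO asks for is a structural type; a minimal typing.Protocol sketch, with the attribute set chosen for illustration rather than taken from the code:

    from __future__ import annotations

    from typing import Protocol, runtime_checkable
    from uuid import UUID


    @runtime_checkable
    class TaskInstanceLike(Protocol):
        """Structural type covering only the fields the callers actually touch (names illustrative)."""

        id: UUID
        dag_id: str
        task_id: str
        run_id: str
        map_index: int


    def describe(ti: TaskInstanceLike) -> str:
        # Any object exposing these attributes satisfies the protocol, whatever its class.
        return f"{ti.dag_id}.{ti.task_id} [{ti.run_id}] map_index={ti.map_index}"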