cli/foundation-models/system/distillation/nli/serverless_endpoint.yaml (2 lines of code):
name: llama-nli-distilled
model_id: azureml://locations/{AI_PROJECT_LOCATION}/workspaces/{WORKSPACE_ID}/models/llama-nli-distilled/versions/{VERSION}
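A minimal usage sketch, assuming the Azure ML CLI v2 extension is installed and the {AI_PROJECT_LOCATION}, {WORKSPACE_ID}, and {VERSION} placeholders have been substituted; the resource group and workspace names shown here are hypothetical:

az ml serverless-endpoint create --file serverless_endpoint.yaml --resource-group {RESOURCE_GROUP} --workspace-name {WORKSPACE_NAME}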