{
"cells": [
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "ur8xi4C7S06n"
},
"outputs": [],
"source": [
"# Copyright 2023 Google LLC\n",
"#\n",
"# Licensed under the Apache License, Version 2.0 (the \"License\");\n",
"# you may not use this file except in compliance with the License.\n",
"# You may obtain a copy of the License at\n",
"#\n",
"# https://www.apache.org/licenses/LICENSE-2.0\n",
"#\n",
"# Unless required by applicable law or agreed to in writing, software\n",
"# distributed under the License is distributed on an \"AS IS\" BASIS,\n",
"# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n",
"# See the License for the specific language governing permissions and\n",
"# limitations under the License."
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "JAPoU8Sm5E6e"
},
"source": [
"# FraudFinder - Feature Engineering (batch)\n",
"\n",
"<table align=\"left\">\n",
" <td>\n",
" <a href=\"https://console.cloud.google.com/ai-platform/notebooks/deploy-notebook?&download_url=https://github.com/GoogleCloudPlatform/fraudfinder/raw/main/02_feature_engineering_batch.ipynb\">\n",
" <img src=\"https://www.gstatic.com/cloud/images/navigation/vertex-ai.svg\" alt=\"Google Cloud Notebooks\">Open in Cloud Notebook\n",
" </a>\n",
" </td> \n",
" <td>\n",
" <a href=\"https://colab.research.google.com/github/GoogleCloudPlatform/fraudfinder/blob/main/02_feature_engineering_batch.ipynb\">\n",
" <img src=\"https://cloud.google.com/ml-engine/images/colab-logo-32px.png\" alt=\"Colab logo\"> Open in Colab\n",
" </a>\n",
" </td>\n",
" <td>\n",
" <a href=\"https://github.com/GoogleCloudPlatform/fraudfinder/blob/main/02_feature_engineering_batch.ipynb\">\n",
" <img src=\"https://cloud.google.com/ml-engine/images/github-logo-32px.png\" alt=\"GitHub logo\">\n",
" View on GitHub\n",
" </a>\n",
" </td>\n",
"</table>"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "tvgnzT1CKxrO"
},
"source": [
"## Overview\n",
"\n",
"[FraudFinder](https://github.com/googlecloudplatform/fraudfinder) is a series of labs on how to build a real-time fraud detection system on Google Cloud. Throughout the FraudFinder labs, you will learn how to read historical bank transaction data stored in data warehouse, read from a live stream of new transactions, perform exploratory data analysis (EDA), do feature engineering, ingest features into a feature store, train a model using feature store, register your model in a model registry, evaluate your model, deploy your model to an endpoint, do real-time inference on your model with feature store, and monitor your model."
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "25113176ad6a"
},
"source": [
"### Objective\n",
"\n",
"As you engineer features for model training, it's important to consider how the features are computed when making predictions with new data. For online predictions, you may have features that can be pre-computed via _batch feature engineering_. You may also features that need to be computed on-the-fly via _streaming-based feature engineering_. For these Fraudfinder labs, for computing features based on the last n _days_, you will use _batch_ feature engineering in BigQuery; for computing features based on the last n _minutes_, you will use _streaming-based_ feature engineering using Dataflow.\n",
"\n",
"This notebook shows how to generate new features on bank transactions by customer and terminal over the last n days, by doing batch feature engineering in SQL with BigQuery. Then, you will create a feature store using Vertex AI Feature Store, and ingest your newly-created features from BigQuery into Vertex AI Feature Store, so that a feature store can become the single source of data for both training and model inference. \n",
"\n",
"You will also create some placeholder values for streaming-based feature engineering, which is covered in the next notebook, `03_feature_engineering_streaming.ipynb`.\n",
"\n",
"This lab uses the following Google Cloud services and resources:\n",
"\n",
"- [Vertex AI](https://cloud.google.com/vertex-ai/)\n",
"- [BigQuery](https://cloud.google.com/bigquery/)\n",
"\n",
"\n",
"Steps performed in this notebook:\n",
"\n",
"- Build customer and terminal-related features\n",
"- Create Feature store, entities and features\n",
"- Ingest feature values in Feature store from BigQuery table\n",
"- Read features from the feature store"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Costs"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This tutorial uses billable components of Google Cloud:\n",
"\n",
"* Vertex AI\n",
"* Cloud Storage\n",
"* BigQuery\n",
"\n",
"Learn about [Vertex AI\n",
"pricing](https://cloud.google.com/vertex-ai/pricing), [BigQuery pricing](https://cloud.google.com/bigquery/pricing) and use the [Pricing\n",
"Calculator](https://cloud.google.com/products/calculator/)\n",
"to generate a cost estimate based on your projected usage."
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "811f5d41ab7f",
"tags": []
},
"source": [
"### Load configuration settings from the setup notebook\n",
"\n",
"Set the constants used in this notebook and load the config settings from the `00_environment_setup.ipynb` notebook."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "d31d1487ac34",
"tags": []
},
"outputs": [],
"source": [
"GCP_PROJECTS = !gcloud config get-value project\n",
"PROJECT_ID = GCP_PROJECTS[0]\n",
"BUCKET_NAME = f\"{PROJECT_ID}-fraudfinder\"\n",
"config = !gsutil cat gs://{BUCKET_NAME}/config/notebook_env.py\n",
"print(config.n)\n",
"exec(config.n)"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "XoEqT2Y4DJmf"
},
"source": [
"### Import libraries"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "pRUOFELefqf1",
"tags": []
},
"outputs": [],
"source": [
"# General\n",
"import datetime as dt\n",
"import json\n",
"import os\n",
"import random\n",
"import sys\n",
"import time\n",
"from datetime import datetime, timedelta\n",
"from typing import List, Union\n",
"\n",
"# Data Engineering\n",
"import numpy as np\n",
"import pandas as pd\n",
"pd.set_option('display.max_columns', 500)\n",
"\n",
"# Vertex AI and Vertex AI Feature Store\n",
"from google.cloud import aiplatform as vertex_ai\n",
"from google.cloud import bigquery\n",
"from google.cloud.aiplatform import EntityType, Feature, Featurestore"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "Moq4QZKjY4fv"
},
"source": [
"### Define constants"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "G-4MU7kF3t4x",
"tags": []
},
"outputs": [],
"source": [
"# Define the date range of transactions for feature engineering (last 10 days up until yesterday)\n",
"YESTERDAY = datetime.today() - timedelta(days=1)\n",
"YEAR_MONTH_PREFIX = YESTERDAY.strftime(\"%Y-%m\")\n",
"DATAPROCESSING_START_DATE = (YESTERDAY - timedelta(days=10)).strftime(\"%Y-%m-%d\")\n",
"DATAPROCESSING_END_DATE = YESTERDAY.strftime(\"%Y-%m-%d\")\n",
"\n",
"# Define BiqQuery dataset and tables to calculate features.\n",
"RAW_BQ_TRANSACTION_TABLE_URI = f\"{PROJECT_ID}.tx.tx\"\n",
"RAW_BQ_LABELS_TABLE_URI = f\"{PROJECT_ID}.tx.txlabels\"\n",
"FEATURES_BQ_TABLE_URI = f\"{PROJECT_ID}.tx.wide_features_table\"\n",
"\n",
"# Define Vertex AI Feature store settings.\n",
"CUSTOMERS_TABLE_NAME = f\"customers_{DATAPROCESSING_END_DATE.replace('-', '')}\"\n",
"CUSTOMERS_BQ_TABLE_URI = f\"{PROJECT_ID}.tx.{CUSTOMERS_TABLE_NAME}\"\n",
"TERMINALS_TABLE_NAME = f\"terminals_{DATAPROCESSING_END_DATE.replace('-', '')}\" \n",
"TERMINALS_BQ_TABLE_URI = f\"{PROJECT_ID}.tx.{TERMINALS_TABLE_NAME}\"\n",
"ONLINE_STORAGE_NODES = 1\n",
"FEATURE_TIME = \"feature_ts\"\n",
"CUSTOMER_ENTITY_ID = \"customer\"\n",
"TERMINAL_ENTITY_ID = \"terminal\""
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "bedfb5ba7f57"
},
"source": [
"### Helpers\n",
"\n",
"Define a set of helper functions to run BigQuery query and create features. "
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "6b81dbbaa636",
"tags": []
},
"outputs": [],
"source": [
"def run_bq_query(sql: str, show=False) -> Union[str, pd.DataFrame]:\n",
" \"\"\"\n",
" Run a BigQuery query and return the job ID or result as a DataFrame\n",
" Args:\n",
" sql: SQL query, as a string, to execute in BigQuery\n",
" show: A flag to show query result in a Pandas Dataframe\n",
" Returns:\n",
" df: DataFrame of results from query, or error, if any\n",
" \"\"\"\n",
"\n",
" bq_client = bigquery.Client()\n",
"\n",
" # Try dry run before executing query to catch any errors\n",
" job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)\n",
" bq_client.query(sql, job_config=job_config)\n",
"\n",
" # If dry run succeeds without errors, proceed to run query\n",
" job_config = bigquery.QueryJobConfig()\n",
" client_result = bq_client.query(sql, job_config=job_config)\n",
"\n",
" job_id = client_result.job_id\n",
"\n",
" # Wait for query/job to finish running. then get & return data frame\n",
" result = client_result.result()\n",
" print(f\"Finished job_id: {job_id}\")\n",
" \n",
" if show:\n",
" df = result.to_arrow().to_pandas()\n",
" return df"
]
},
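{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick usage example of the helper (assuming the raw `tx.tx` table was created in the earlier FraudFinder notebooks), you can count the rows in the raw transactions table:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"# Example usage of the helper: count rows in the raw transactions table.\n",
"run_bq_query(\n",
"    f\"SELECT COUNT(*) AS n_transactions FROM `{RAW_BQ_TRANSACTION_TABLE_URI}`\",\n",
"    show=True,\n",
")"
]
},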
{
"cell_type": "markdown",
"metadata": {
"id": "jKaROgJZ3t4y"
},
"source": [
"## Feature Engineering"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "0NDaDFyj3t4z"
},
"source": [
"### Define customer and terminal-related features for batch feature engineering\n",
"\n",
"In this section, you will create features, based on historical customer behaviour and historical terminal activities. This features will be batch-generated using SQL in BigQuery, where the historical data is stored.\n",
"\n",
"The query below will calculate 2 sets of features: \n",
"\n",
"1. **Customer-related features**: which describes the spending behaviour of customer within 1, 7 and 15 days time windows using number of transactions and average amount spent in dollars ($)\n",
"\n",
"2. **Terminal-related features** which describes the risk of a given terminal to be exposed to fraudulent transactions within 1, 7 and 15 days using average number of fraudulent transactions in dollars ($), the number of transactions and risk index. One thing to note is that you will add some delay to take into account time that would pass between the time of transaction and the result of fraud investigation or customer claim.\n",
"\n",
"You will use one month of transaction data starting from the end of January and going back to compute the features.\n",
"\n",
"Below is the schema you should expect to see, after doing the batch feature engineering in BigQuery:\n",
"\n",
"|feature_time |customer_id| customer batch features |\n",
"|-----------------------|-----------|---------------------------|\n",
"|2022-01-01 17:20:15 UTC|1 |(e.g., nb_tx, avg_tx) |\n",
"|2022-01-02 12:08:40 UTC|2 |(e.g., nb_tx, avg_tx) |\n",
"|2022-01-03 17:30:48 UTC|3 |(e.g., nb_tx, avg_tx) |\n",
"\n",
"\n",
"|feature_time |terminal_id| terminal batch features|\n",
"|-----------------------|-----------|------------------------|\n",
"|2022-01-01 17:20:15 UTC|12345 |(e.g., risk_x_days) |\n",
"|2022-01-02 12:08:40 UTC|26789 |(e.g., risk_x_days) |\n",
"|2022-01-03 17:30:48 UTC|101112 |(e.g., risk_x_days) |\n"
]
},
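{
"cell_type": "markdown",
"metadata": {},
"source": [
"Before computing these aggregations at scale in SQL, the cell below is a minimal pandas sketch (on hypothetical toy data) of the same per-customer trailing-window logic: for each transaction, count the customer's transactions and average the amounts over the preceding 1-, 7- and 14-day windows. Note that pandas' time-based `rolling` windows only approximate the SQL `RANGE BETWEEN n PRECEDING AND CURRENT ROW` frames used below, as the window boundary handling differs slightly."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"# Minimal sketch (toy data) of the per-customer windowed aggregations that\n",
"# the BigQuery query below computes. Column names mirror the SQL output.\n",
"toy_tx = pd.DataFrame(\n",
"    {\n",
"        \"TX_TS\": pd.to_datetime(\n",
"            [\n",
"                \"2022-01-01 10:00:00\",\n",
"                \"2022-01-01 18:00:00\",\n",
"                \"2022-01-03 09:30:00\",\n",
"                \"2022-01-09 12:00:00\",\n",
"            ]\n",
"        ),\n",
"        \"CUSTOMER_ID\": [\"1\", \"1\", \"1\", \"1\"],\n",
"        \"TX_AMOUNT\": [20.0, 35.0, 10.0, 50.0],\n",
"    }\n",
").sort_values([\"CUSTOMER_ID\", \"TX_TS\"])\n",
"\n",
"amounts_by_customer = toy_tx.set_index(\"TX_TS\").groupby(\"CUSTOMER_ID\")[\"TX_AMOUNT\"]\n",
"for days in (1, 7, 14):\n",
"    # Trailing time window per customer, ending at each transaction.\n",
"    toy_tx[f\"customer_id_nb_tx_{days}day_window\"] = (\n",
"        amounts_by_customer.rolling(f\"{days}D\").count().values\n",
"    )\n",
"    toy_tx[f\"customer_id_avg_amount_{days}day_window\"] = (\n",
"        amounts_by_customer.rolling(f\"{days}D\").mean().values\n",
"    )\n",
"\n",
"toy_tx"
]
},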
{
"cell_type": "markdown",
"metadata": {
"id": "43265bd3002f"
},
"source": [
"#### Create the query to create batch features"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Date settings to be used:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"print(f\"\"\"\n",
"DATAPROCESSING_START_DATE: {DATAPROCESSING_START_DATE}\n",
"DATAPROCESSING_END_DATE: {DATAPROCESSING_END_DATE}\n",
"\"\"\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"##### Customer feature table\n",
"\n",
"Customer table SQL query string:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "43Ck_vAc3t4z",
"tags": []
},
"outputs": [],
"source": [
"create_customer_batch_features_query = f\"\"\"\n",
"CREATE OR REPLACE TABLE `{CUSTOMERS_BQ_TABLE_URI}` AS\n",
"WITH\n",
" -- query to join labels with features -------------------------------------------------------------------------------------------\n",
" get_raw_table AS (\n",
" SELECT\n",
" raw_tx.TX_TS,\n",
" raw_tx.TX_ID,\n",
" raw_tx.CUSTOMER_ID,\n",
" raw_tx.TERMINAL_ID,\n",
" raw_tx.TX_AMOUNT,\n",
" raw_lb.TX_FRAUD\n",
" FROM (\n",
" SELECT\n",
" *\n",
" FROM\n",
" `{RAW_BQ_TRANSACTION_TABLE_URI}`\n",
" WHERE\n",
" DATE(TX_TS) BETWEEN DATE_SUB(\"{DATAPROCESSING_END_DATE}\", INTERVAL 15 DAY) AND \"{DATAPROCESSING_END_DATE}\"\n",
" ) raw_tx\n",
" LEFT JOIN \n",
" `{RAW_BQ_LABELS_TABLE_URI}` as raw_lb\n",
" ON raw_tx.TX_ID = raw_lb.TX_ID),\n",
"\n",
" -- query to calculate CUSTOMER spending behaviour --------------------------------------------------------------------------------\n",
" get_customer_spending_behaviour AS (\n",
" SELECT\n",
" TX_TS,\n",
" TX_ID,\n",
" CUSTOMER_ID,\n",
" TERMINAL_ID,\n",
" TX_AMOUNT,\n",
" TX_FRAUD,\n",
" \n",
" # calc the number of customer tx over daily windows per customer (1, 7 and 15 days, expressed in seconds)\n",
" COUNT(TX_FRAUD) OVER (PARTITION BY CUSTOMER_ID ORDER BY UNIX_SECONDS(TX_TS) ASC RANGE BETWEEN 86400 PRECEDING\n",
" AND CURRENT ROW ) AS CUSTOMER_ID_NB_TX_1DAY_WINDOW,\n",
" COUNT(TX_FRAUD) OVER (PARTITION BY CUSTOMER_ID ORDER BY UNIX_SECONDS(TX_TS) ASC RANGE BETWEEN 604800 PRECEDING\n",
" AND CURRENT ROW ) AS CUSTOMER_ID_NB_TX_7DAY_WINDOW,\n",
" COUNT(TX_FRAUD) OVER (PARTITION BY CUSTOMER_ID ORDER BY UNIX_SECONDS(TX_TS) ASC RANGE BETWEEN 1209600 PRECEDING\n",
" AND CURRENT ROW ) AS CUSTOMER_ID_NB_TX_14DAY_WINDOW,\n",
" \n",
" # calc the customer average tx amount over daily windows per customer (1, 7 and 15 days, expressed in seconds, in dollars ($))\n",
" AVG(TX_AMOUNT) OVER (PARTITION BY CUSTOMER_ID ORDER BY UNIX_SECONDS(TX_TS) ASC RANGE BETWEEN 86400 PRECEDING\n",
" AND CURRENT ROW ) AS CUSTOMER_ID_AVG_AMOUNT_1DAY_WINDOW,\n",
" AVG(TX_AMOUNT) OVER (PARTITION BY CUSTOMER_ID ORDER BY UNIX_SECONDS(TX_TS) ASC RANGE BETWEEN 604800 PRECEDING\n",
" AND CURRENT ROW ) AS CUSTOMER_ID_AVG_AMOUNT_7DAY_WINDOW,\n",
" AVG(TX_AMOUNT) OVER (PARTITION BY CUSTOMER_ID ORDER BY UNIX_SECONDS(TX_TS) ASC RANGE BETWEEN 1209600 PRECEDING\n",
" AND CURRENT ROW ) AS CUSTOMER_ID_AVG_AMOUNT_14DAY_WINDOW,\n",
" FROM get_raw_table)\n",
"\n",
"# Create the table with CUSTOMER and TERMINAL features ----------------------------------------------------------------------------\n",
"SELECT\n",
" PARSE_TIMESTAMP(\"%Y-%m-%d %H:%M:%S\", FORMAT_TIMESTAMP(\"%Y-%m-%d %H:%M:%S\", TX_TS, \"UTC\")) AS feature_ts,\n",
" CUSTOMER_ID AS customer_id,\n",
" CAST(CUSTOMER_ID_NB_TX_1DAY_WINDOW AS INT64) AS customer_id_nb_tx_1day_window,\n",
" CAST(CUSTOMER_ID_NB_TX_7DAY_WINDOW AS INT64) AS customer_id_nb_tx_7day_window,\n",
" CAST(CUSTOMER_ID_NB_TX_14DAY_WINDOW AS INT64) AS customer_id_nb_tx_14day_window,\n",
" CAST(CUSTOMER_ID_AVG_AMOUNT_1DAY_WINDOW AS FLOAT64) AS customer_id_avg_amount_1day_window,\n",
" CAST(CUSTOMER_ID_AVG_AMOUNT_7DAY_WINDOW AS FLOAT64) AS customer_id_avg_amount_7day_window,\n",
" CAST(CUSTOMER_ID_AVG_AMOUNT_14DAY_WINDOW AS FLOAT64) AS customer_id_avg_amount_14day_window,\n",
"FROM\n",
" get_customer_spending_behaviour\n",
"\"\"\""
]
},
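{
"cell_type": "markdown",
"metadata": {},
"source": [
"The window bounds in the query above are expressed in seconds. As a quick sanity check, the following cell confirms that the `RANGE BETWEEN ... PRECEDING` values (86400, 604800 and 1209600) correspond to 1, 7 and 14 days."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"# Sanity check: the RANGE window bounds used in the query, in seconds.\n",
"for days in (1, 7, 14):\n",
"    print(f\"{days:>2} day(s) = {int(timedelta(days=days).total_seconds()):>7} seconds\")"
]
},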
{
"cell_type": "markdown",
"metadata": {},
"source": [
"##### Run the query \n",
"\n",
"You create the customer features table"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"run_bq_query(create_customer_batch_features_query)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"##### Inspect the result \n",
"\n",
"You can query some data rows to validate the result of the query"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"run_bq_query(\n",
" f\"SELECT * FROM `{CUSTOMERS_BQ_TABLE_URI}` LIMIT 10\",\n",
" show=True\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"##### Terminal feature table\n",
"\n",
"Terminal table SQL query string:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "43Ck_vAc3t4z",
"tags": []
},
"outputs": [],
"source": [
"create_terminal_batch_features_query = f\"\"\"\n",
"CREATE OR REPLACE TABLE `{TERMINALS_BQ_TABLE_URI}` AS\n",
"WITH\n",
" -- query to join labels with features -------------------------------------------------------------------------------------------\n",
" get_raw_table AS (\n",
" SELECT\n",
" raw_tx.TX_TS,\n",
" raw_tx.TX_ID,\n",
" raw_tx.CUSTOMER_ID,\n",
" raw_tx.TERMINAL_ID,\n",
" raw_tx.TX_AMOUNT,\n",
" raw_lb.TX_FRAUD\n",
" FROM (\n",
" SELECT\n",
" *\n",
" FROM\n",
" `{RAW_BQ_TRANSACTION_TABLE_URI}`\n",
" WHERE\n",
" DATE(TX_TS) BETWEEN DATE_SUB(\"{DATAPROCESSING_END_DATE}\", INTERVAL 15 DAY) AND \"{DATAPROCESSING_END_DATE}\"\n",
" ) raw_tx\n",
" LEFT JOIN \n",
" `{RAW_BQ_LABELS_TABLE_URI}` as raw_lb\n",
" ON raw_tx.TX_ID = raw_lb.TX_ID),\n",
"\n",
" # query to calculate TERMINAL spending behaviour --------------------------------------------------------------------------------\n",
" get_variables_delay_window AS (\n",
" SELECT\n",
" TX_TS,\n",
" TX_ID,\n",
" CUSTOMER_ID,\n",
" TERMINAL_ID,\n",
" \n",
" # calc total amount of fraudulent tx and the total number of tx over the delay period per terminal (7 days - delay, expressed in seconds)\n",
" SUM(TX_FRAUD) OVER (PARTITION BY TERMINAL_ID ORDER BY UNIX_SECONDS(TX_TS) ASC RANGE BETWEEN 604800 PRECEDING\n",
" AND CURRENT ROW ) AS NB_FRAUD_DELAY,\n",
" COUNT(TX_FRAUD) OVER (PARTITION BY TERMINAL_ID ORDER BY UNIX_SECONDS(TX_TS) ASC RANGE BETWEEN 604800 PRECEDING\n",
" AND CURRENT ROW ) AS NB_TX_DELAY,\n",
" \n",
" # calc total amount of fraudulent tx and the total number of tx over the delayed window per terminal (window + 7 days - delay, expressed in seconds)\n",
" SUM(TX_FRAUD) OVER (PARTITION BY TERMINAL_ID ORDER BY UNIX_SECONDS(TX_TS) ASC RANGE BETWEEN 691200 PRECEDING\n",
" AND CURRENT ROW ) AS NB_FRAUD_1_DELAY_WINDOW,\n",
" SUM(TX_FRAUD) OVER (PARTITION BY TERMINAL_ID ORDER BY UNIX_SECONDS(TX_TS) ASC RANGE BETWEEN 1209600 PRECEDING\n",
" AND CURRENT ROW ) AS NB_FRAUD_7_DELAY_WINDOW,\n",
" SUM(TX_FRAUD) OVER (PARTITION BY TERMINAL_ID ORDER BY UNIX_SECONDS(TX_TS) ASC RANGE BETWEEN 1814400 PRECEDING\n",
" AND CURRENT ROW ) AS NB_FRAUD_14_DELAY_WINDOW,\n",
" COUNT(TX_FRAUD) OVER (PARTITION BY TERMINAL_ID ORDER BY UNIX_SECONDS(TX_TS) ASC RANGE BETWEEN 691200 PRECEDING\n",
" AND CURRENT ROW ) AS NB_TX_1_DELAY_WINDOW,\n",
" COUNT(TX_FRAUD) OVER (PARTITION BY TERMINAL_ID ORDER BY UNIX_SECONDS(TX_TS) ASC RANGE BETWEEN 1209600 PRECEDING\n",
" AND CURRENT ROW ) AS NB_TX_7_DELAY_WINDOW,\n",
" COUNT(TX_FRAUD) OVER (PARTITION BY TERMINAL_ID ORDER BY UNIX_SECONDS(TX_TS) ASC RANGE BETWEEN 1814400 PRECEDING\n",
" AND CURRENT ROW ) AS NB_TX_14_DELAY_WINDOW,\n",
" FROM get_raw_table),\n",
"\n",
" # query to calculate TERMINAL risk factors ---------------------------------------------------------------------------------------\n",
" get_risk_factors AS (\n",
" SELECT\n",
" TX_TS,\n",
" TX_ID,\n",
" CUSTOMER_ID,\n",
" TERMINAL_ID,\n",
" # calculate numerator of risk index\n",
" NB_FRAUD_1_DELAY_WINDOW - NB_FRAUD_DELAY AS TERMINAL_ID_NB_FRAUD_1DAY_WINDOW,\n",
" NB_FRAUD_7_DELAY_WINDOW - NB_FRAUD_DELAY AS TERMINAL_ID_NB_FRAUD_7DAY_WINDOW,\n",
" NB_FRAUD_14_DELAY_WINDOW - NB_FRAUD_DELAY AS TERMINAL_ID_NB_FRAUD_14DAY_WINDOW,\n",
" # calculate denominator of risk index\n",
" NB_TX_1_DELAY_WINDOW - NB_TX_DELAY AS TERMINAL_ID_NB_TX_1DAY_WINDOW,\n",
" NB_TX_7_DELAY_WINDOW - NB_TX_DELAY AS TERMINAL_ID_NB_TX_7DAY_WINDOW,\n",
" NB_TX_14_DELAY_WINDOW - NB_TX_DELAY AS TERMINAL_ID_NB_TX_14DAY_WINDOW,\n",
" FROM\n",
" get_variables_delay_window),\n",
"\n",
" # query to calculate the TERMINAL risk index -------------------------------------------------------------------------------------\n",
" get_risk_index AS (\n",
" SELECT\n",
" TX_TS,\n",
" TX_ID,\n",
" CUSTOMER_ID,\n",
" TERMINAL_ID,\n",
" TERMINAL_ID_NB_TX_1DAY_WINDOW,\n",
" TERMINAL_ID_NB_TX_7DAY_WINDOW,\n",
" TERMINAL_ID_NB_TX_14DAY_WINDOW,\n",
" # calculate the risk index\n",
" (TERMINAL_ID_NB_FRAUD_1DAY_WINDOW/(TERMINAL_ID_NB_TX_1DAY_WINDOW+0.0001)) AS TERMINAL_ID_RISK_1DAY_WINDOW,\n",
" (TERMINAL_ID_NB_FRAUD_7DAY_WINDOW/(TERMINAL_ID_NB_TX_7DAY_WINDOW+0.0001)) AS TERMINAL_ID_RISK_7DAY_WINDOW,\n",
" (TERMINAL_ID_NB_FRAUD_14DAY_WINDOW/(TERMINAL_ID_NB_TX_14DAY_WINDOW+0.0001)) AS TERMINAL_ID_RISK_14DAY_WINDOW\n",
" FROM get_risk_factors \n",
" )\n",
"\n",
"# Create the table with CUSTOMER and TERMINAL features ----------------------------------------------------------------------------\n",
"SELECT\n",
" PARSE_TIMESTAMP(\"%Y-%m-%d %H:%M:%S\", FORMAT_TIMESTAMP(\"%Y-%m-%d %H:%M:%S\", TX_TS, \"UTC\")) AS feature_ts,\n",
" TERMINAL_ID AS terminal_id,\n",
" CAST(TERMINAL_ID_NB_TX_1DAY_WINDOW AS INT64) AS terminal_id_nb_tx_1day_window,\n",
" CAST(TERMINAL_ID_NB_TX_7DAY_WINDOW AS INT64) AS terminal_id_nb_tx_7day_window,\n",
" CAST(TERMINAL_ID_NB_TX_14DAY_WINDOW AS INT64) AS terminal_id_nb_tx_14day_window,\n",
" CAST(TERMINAL_ID_RISK_1DAY_WINDOW AS FLOAT64) AS terminal_id_risk_1day_window,\n",
" CAST(TERMINAL_ID_RISK_7DAY_WINDOW AS FLOAT64) AS terminal_id_risk_7day_window,\n",
" CAST(TERMINAL_ID_RISK_14DAY_WINDOW AS FLOAT64) AS terminal_id_risk_14day_window,\n",
"FROM\n",
" get_risk_index\n",
"\"\"\""
]
},
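{
"cell_type": "markdown",
"metadata": {},
"source": [
"To make the risk index in the query above explicit: with a fixed 7-day delay $d$ (allowing fraud investigations and customer claims to conclude) and a window of $w \\in \\{1, 7, 14\\}$ days, the risk per terminal is\n",
"\n",
"$$\\text{risk}_w = \\frac{\\text{NB\\_FRAUD}_{w+d} - \\text{NB\\_FRAUD}_{d}}{\\text{NB\\_TX}_{w+d} - \\text{NB\\_TX}_{d} + 10^{-4}}$$\n",
"\n",
"that is, the fraud rate over the $w$-day window ending $d$ days before each transaction, with a small constant in the denominator to avoid division by zero."
]
},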
{
"cell_type": "markdown",
"metadata": {},
"source": [
"##### Run the query \n",
"\n",
"You create the customer features table"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"run_bq_query(create_terminal_batch_features_query)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"##### Inspect the result \n",
"\n",
"You can query some data rows to validate the result of the query"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"run_bq_query(\n",
" f\"SELECT * FROM `{TERMINALS_BQ_TABLE_URI}` LIMIT 10\",\n",
" show=True\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "4d22f76fcd85"
},
"source": [
"### Define customer and terminal-related features for _real-time_ feature engineering\n",
"\n",
"To make more accurate predictions, you can also create real-time features to inspect, for example, the most recent minutes of activity for both customers and terminals.\n",
"\n",
"In order to have these features available for training and inference, we first need to make sure they are defined in a BigQuery table as placeholders to be ingested into Vertex AI Feature Store.\n",
"\n",
"In the query below, you initialize two sets of features for real-time feature engineering: \n",
"\n",
"1. Customer features which describes the spending behaviour of customer within 15, 30 and 60 minutes time windows using number of transactions and average amount spent in dollars ($)\n",
"\n",
"2. Terminal features which describes the risk of a given terminal to be exposed to fraudulent transactions within 15, 30 and 60 minutes using average number of fraudulent transactions in dollars ($) and the number of transactions. \n",
"\n",
"To do so, you will:\n",
"\n",
"- Add one column for each real time feature\n",
"- Set 0 as default values for each of them\n",
"- Update all real-time columns with default values\n",
"\n",
"Then you will create the actual values for real-time feature engineering in the next `03_feature_engineering_streaming.ipynb` notebook. "
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "612e2fa65748"
},
"source": [
"#### Define the query to initialize the real-time features."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"##### Customer feature table\n",
"\n",
"Customer table SQL query string:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "c1aed0001232",
"tags": []
},
"outputs": [],
"source": [
"initiate_real_time_customer_features_query = f\"\"\"\n",
"ALTER TABLE `{CUSTOMERS_BQ_TABLE_URI}`\n",
"ADD COLUMN customer_id_nb_tx_15min_window INT64,\n",
"ADD COLUMN customer_id_nb_tx_30min_window INT64,\n",
"ADD COLUMN customer_id_nb_tx_60min_window INT64,\n",
"ADD COLUMN customer_id_avg_amount_15min_window FLOAT64,\n",
"ADD COLUMN customer_id_avg_amount_30min_window FLOAT64,\n",
"ADD COLUMN customer_id_avg_amount_60min_window FLOAT64;\n",
"\n",
"ALTER TABLE `{CUSTOMERS_BQ_TABLE_URI}`\n",
"ALTER COLUMN customer_id_nb_tx_15min_window SET DEFAULT 0,\n",
"ALTER COLUMN customer_id_nb_tx_30min_window SET DEFAULT 0,\n",
"ALTER COLUMN customer_id_nb_tx_60min_window SET DEFAULT 0,\n",
"ALTER COLUMN customer_id_avg_amount_15min_window SET DEFAULT 0,\n",
"ALTER COLUMN customer_id_avg_amount_30min_window SET DEFAULT 0,\n",
"ALTER COLUMN customer_id_avg_amount_60min_window SET DEFAULT 0;\n",
"\n",
"UPDATE `{CUSTOMERS_BQ_TABLE_URI}`\n",
"SET customer_id_nb_tx_15min_window = 0,\n",
" customer_id_nb_tx_30min_window = 0,\n",
" customer_id_nb_tx_60min_window = 0, \n",
" customer_id_avg_amount_15min_window = 0,\n",
" customer_id_avg_amount_30min_window = 0,\n",
" customer_id_avg_amount_60min_window = 0\n",
"WHERE TRUE; \n",
"\"\"\""
]
},
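{
"cell_type": "markdown",
"metadata": {},
"source": [
"##### Terminal feature table\n",
"\n",
"Terminal table SQL query string:"
]
},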
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"initiate_real_time_terminal_features_query = f\"\"\"\n",
"ALTER TABLE `{TERMINALS_BQ_TABLE_URI}`\n",
"ADD COLUMN terminal_id_nb_tx_15min_window INT64,\n",
"ADD COLUMN terminal_id_nb_tx_30min_window INT64,\n",
"ADD COLUMN terminal_id_nb_tx_60min_window INT64,\n",
"ADD COLUMN terminal_id_avg_amount_15min_window FLOAT64,\n",
"ADD COLUMN terminal_id_avg_amount_30min_window FLOAT64,\n",
"ADD COLUMN terminal_id_avg_amount_60min_window FLOAT64;\n",
"\n",
"ALTER TABLE `{TERMINALS_BQ_TABLE_URI}`\n",
"ALTER COLUMN terminal_id_nb_tx_15min_window SET DEFAULT 0,\n",
"ALTER COLUMN terminal_id_nb_tx_30min_window SET DEFAULT 0,\n",
"ALTER COLUMN terminal_id_nb_tx_60min_window SET DEFAULT 0,\n",
"ALTER COLUMN terminal_id_avg_amount_15min_window SET DEFAULT 0,\n",
"ALTER COLUMN terminal_id_avg_amount_30min_window SET DEFAULT 0,\n",
"ALTER COLUMN terminal_id_avg_amount_60min_window SET DEFAULT 0;\n",
"\n",
"UPDATE `{TERMINALS_BQ_TABLE_URI}`\n",
"SET terminal_id_nb_tx_15min_window = 0,\n",
" terminal_id_nb_tx_30min_window = 0,\n",
" terminal_id_nb_tx_60min_window = 0,\n",
" terminal_id_avg_amount_15min_window = 0,\n",
" terminal_id_avg_amount_30min_window = 0,\n",
" terminal_id_avg_amount_60min_window = 0\n",
"WHERE TRUE; \n",
"\"\"\""
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "1d35d1e69d72"
},
"source": [
"#### Run the query above to initialize the real-time features."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "aae355c66e4a",
"tags": []
},
"outputs": [],
"source": [
"for query in [initiate_real_time_customer_features_query, initiate_real_time_terminal_features_query]:\n",
" run_bq_query(query)"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "76733185d022"
},
"source": [
"#### Inspect BigQuery features tables"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "774a90225747",
"tags": []
},
"outputs": [],
"source": [
"run_bq_query(\n",
" f\"SELECT * FROM `{CUSTOMERS_BQ_TABLE_URI}` LIMIT 5\", show=True\n",
")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"run_bq_query(\n",
" f\"SELECT * FROM `{TERMINALS_BQ_TABLE_URI}` LIMIT 5\", show=True\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Let's look at the final schema of the features table:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"run_bq_query(f\"\"\"\n",
"SELECT column_name, data_type\n",
"FROM tx.INFORMATION_SCHEMA.COLUMNS\n",
"WHERE table_name = '{CUSTOMERS_TABLE_NAME}'\n",
"\"\"\", show=True)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"run_bq_query(f\"\"\"\n",
"SELECT column_name, data_type\n",
"FROM tx.INFORMATION_SCHEMA.COLUMNS\n",
"WHERE table_name = '{TERMINALS_TABLE_NAME}'\n",
"\"\"\", show=True)"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "T9xB8hVU3t40"
},
"source": [
"## Feature store for feature management\n",
"\n",
"### What is a feature store?\n",
"\n",
"The features generated are great examples of features that we can store the [Vertex AI Feature Store](https://cloud.google.com/vertex-ai/docs/featurestore). This is because:\n",
"\n",
"- The features are needed for real-time prediction\n",
"- feature values in a feature store can be used for both training and serving\n",
"- if needed, features can be shared with other use cases beyond fraud detection\n",
"\n",
"Vertex AI Feature Store provides a centralized repository for organizing, storing, and serving ML features. Using a central featurestore enables an organization to efficiently share, discover, and re-use ML features at scale, which can increase the velocity of developing and deploying new ML applications.\n",
"\n",
"### Why would you like to set up it?\n",
"\n",
"So far you've built and stored features in BigQuery. \n",
"\n",
"Now, in order to predict fraud, you want to serve those features in real-time with millisecond scale latency. In particular, when the ML gateway receives a prediction request for a specific transaction (including customer, terminal, and transaction ids), the system needs to fetch the features related to that transaction and pass them as inputs to the model for online prediction. As you can imagine, an analytical data warehouse such as BigQuery is not able to provide low-latency near real-time read operations. \n",
"\n",
"Vertex AI Feature Store provides a managed service for low latency scalable feature serving. It also provides a centralized feature repository with easy APIs to search and discover features, as well as feature monitoring capabilities to track drift and other quality issues. \n",
"\n",
"Vertex AI Feature Store uses a time series data model to store a series of values for features, which enables Vertex AI Feature Store to maintain feature values as they change over time and to support point-in-time queries of feature values. Feature Store organizes resources hierarchically (`Featurestore -> EntityType -> Feature`) in the following order: \n",
"\n",
"- **Featurestore**: the resource to contains entities and features.\n",
" - **EntityType**: under a Featurestore, an EntityType describes an minimal data entry.\n",
" - **Feature**: under an EntityType, a feature is an attribute of the EntityType. \n",
"\n",
"\n",
"You must create these resources before you can ingest data into a Feature Store. \n",
"\n",
"In the follow section, you will use create a feature store using Vertex AI Feature Store, and ingest data into it to be used later for training and model inference."
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "XoEqT2Y4DJmf"
},
"source": [
"### Initialize Vertex AI SDK\n",
"\n",
"Initialize the Vertex AI SDK to get access to Vertex AI services programmatically. "
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "c85ab891607c",
"tags": []
},
"outputs": [],
"source": [
"vertex_ai.init(project=PROJECT_ID, location=REGION, staging_bucket=BUCKET_NAME)"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "AfKeQmf63t42"
},
"source": [
"### Create featurestore, `fraudfinder_<ID>`\n",
"\n",
"A featurestore is the top-level container for entity types, features, and feature values. Typically, an organization creates one shared featurestore for feature ingestion, serving, and sharing across all teams in the organization.\n",
"\n",
"Below you create a `featurestore` resources with different labels. "
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "nQmwn7agJZ6o",
"tags": []
},
"outputs": [],
"source": [
"try:\n",
" # Checks if there is already a Featurestore\n",
" ff_feature_store = vertex_ai.Featurestore(f\"{FEATURESTORE_ID}\")\n",
" print(f\"\"\"The feature st01_exploratory_data_analysisore {FEATURESTORE_ID} already exists.\"\"\")\n",
"except:\n",
" # Creates a Featurestore\n",
" print(f\"\"\"Creating new feature store {FEATURESTORE_ID}.\"\"\")\n",
" ff_feature_store = Featurestore.create(\n",
" featurestore_id=f\"{FEATURESTORE_ID}\",\n",
" online_store_fixed_node_count=ONLINE_STORAGE_NODES,\n",
" labels={\"team\": \"cymbal_bank\", \"app\": \"fraudfinder\"},\n",
" sync=True,\n",
" )"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "GkhVJqph3t42"
},
"source": [
"### Create the main entity types and their features\n",
"\n",
"An entity type is a collection of semantically related features. You define your own entity types, based on the concepts that are relevant to your use case. \n",
"\n",
"In this case, you create `customer` and `transaction` entity types. "
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "c0gimd2Y3t44"
},
"source": [
"#### Create the ```customer``` entity type "
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "TBPk3KdRQXec",
"tags": []
},
"outputs": [],
"source": [
"try:\n",
" # get entity type, if it already exists\n",
" customer_entity_type = ff_feature_store.get_entity_type(entity_type_id=CUSTOMER_ENTITY_ID)\n",
"except:\n",
" # else, create entity type\n",
" customer_entity_type = ff_feature_store.create_entity_type(\n",
" entity_type_id=CUSTOMER_ENTITY_ID, description=\"Customer Entity\", sync=True\n",
" )"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "gHglWPU43t45"
},
"source": [
"#### Create features of the ```customer``` entity type"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "Y_TlUTtkRKaN",
"tags": []
},
"outputs": [],
"source": [
"customer_feature_configs = {\n",
" \"customer_id_nb_tx_1day_window\": {\n",
" \"value_type\": \"INT64\",\n",
" \"description\": \"Number of transactions by the customer in the last day\",\n",
" \"labels\": {\"status\": \"passed\"},\n",
" },\n",
" \"customer_id_nb_tx_7day_window\": {\n",
" \"value_type\": \"INT64\",\n",
" \"description\": \"Number of transactions by the customer in the last 7 days\",\n",
" \"labels\": {\"status\": \"passed\"},\n",
" },\n",
" \"customer_id_nb_tx_14day_window\": {\n",
" \"value_type\": \"INT64\",\n",
" \"description\": \"Number of transactions by the customer in the last 14 days\",\n",
" \"labels\": {\"status\": \"passed\"},\n",
" },\n",
" \"customer_id_avg_amount_1day_window\": {\n",
" \"value_type\": \"DOUBLE\",\n",
" \"description\": \"Average spending amount in the last day\",\n",
" \"labels\": {\"status\": \"passed\"},\n",
" },\n",
" \"customer_id_avg_amount_7day_window\": {\n",
" \"value_type\": \"DOUBLE\",\n",
" \"description\": \"Average spending amount in the last 7 days\",\n",
" \"labels\": {\"status\": \"passed\"},\n",
" },\n",
" \"customer_id_avg_amount_14day_window\": {\n",
" \"value_type\": \"DOUBLE\",\n",
" \"description\": \"Average spending amount in the last 14 days\",\n",
" \"labels\": {\"status\": \"passed\"},\n",
" },\n",
" \"customer_id_nb_tx_15min_window\": {\n",
" \"value_type\": \"INT64\",\n",
" \"description\": \"Number of transactions by the customer in the last 15 minutes\",\n",
" \"labels\": {\"status\": \"passed\"},\n",
" },\n",
" \"customer_id_nb_tx_30min_window\": {\n",
" \"value_type\": \"INT64\",\n",
" \"description\": \"Number of transactions by the customer in the last 30 minutes\",\n",
" \"labels\": {\"status\": \"passed\"},\n",
" },\n",
" \"customer_id_nb_tx_60min_window\": {\n",
" \"value_type\": \"INT64\",\n",
" \"description\": \"Number of transactions by the customer in the last 60 minutes\",\n",
" \"labels\": {\"status\": \"passed\"},\n",
" },\n",
" \"customer_id_avg_amount_15min_window\": {\n",
" \"value_type\": \"DOUBLE\",\n",
" \"description\": \"Average spending amount in the last 15 minutes\",\n",
" \"labels\": {\"status\": \"passed\"},\n",
" },\n",
" \"customer_id_avg_amount_30min_window\": {\n",
" \"value_type\": \"DOUBLE\",\n",
" \"description\": \"Average spending amount in the last 30 minutes\",\n",
" \"labels\": {\"status\": \"passed\"},\n",
" },\n",
" \"customer_id_avg_amount_60min_window\": {\n",
" \"value_type\": \"DOUBLE\",\n",
" \"description\": \"Average spending amount in the last 60 minutes\",\n",
" \"labels\": {\"status\": \"passed\"},\n",
" },\n",
"}"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "Upho8Vg13t46",
"tags": []
},
"outputs": [],
"source": [
"customer_feature_ids = customer_entity_type.batch_create_features(\n",
" feature_configs=customer_feature_configs, sync=True\n",
")"
]
},
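{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick check, you can list the features now registered on the `customer` entity type (the same `list_features` call is used again below when selecting features to ingest):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"# List the features registered on the customer entity type.\n",
"for feature in customer_entity_type.list_features():\n",
"    print(feature.name)"
]
},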
{
"cell_type": "markdown",
"metadata": {
"id": "H7Qb-5aI3t46"
},
"source": [
"#### Create the ```terminal``` entity type"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "Pje6CLgj3t46",
"tags": []
},
"outputs": [],
"source": [
"try:\n",
" # get entity type, if it already exists\n",
" terminal_entity_type = ff_feature_store.get_entity_type(entity_type_id=TERMINAL_ENTITY_ID)\n",
"except:\n",
" # else, create entity type\n",
" terminal_entity_type = ff_feature_store.create_entity_type(\n",
" entity_type_id=TERMINAL_ENTITY_ID, description=\"Terminal Entity\", sync=True\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "7tUvpIfR3t47"
},
"source": [
"#### Create features of the ```terminal``` entity type"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "Hwkz6FSeXAUN",
"tags": []
},
"outputs": [],
"source": [
"terminal_feature_configs = {\n",
" \"terminal_id_nb_tx_1day_window\": {\n",
" \"value_type\": \"INT64\",\n",
" \"description\": \"Number of transactions by the terminal in the last day\",\n",
" \"labels\": {\"status\": \"passed\"},\n",
" },\n",
" \"terminal_id_nb_tx_7day_window\": {\n",
" \"value_type\": \"INT64\",\n",
" \"description\": \"Number of transactions by the terminal in the 7 days\",\n",
" \"labels\": {\"status\": \"passed\"},\n",
" },\n",
" \"terminal_id_nb_tx_14day_window\": {\n",
" \"value_type\": \"INT64\",\n",
" \"description\": \"Number of transactions by the terminal in the 14 days\",\n",
" \"labels\": {\"status\": \"passed\"},\n",
" },\n",
" \"terminal_id_risk_1day_window\": {\n",
" \"value_type\": \"DOUBLE\",\n",
" \"description\": \"Risk score calculated average number of frauds on the terminal in the last day\",\n",
" \"labels\": {\"status\": \"passed\"},\n",
" },\n",
" \"terminal_id_risk_7day_window\": {\n",
" \"value_type\": \"DOUBLE\",\n",
" \"description\": \"Risk score calculated average number of frauds on the terminal in the last 7 days\",\n",
" \"labels\": {\"status\": \"passed\"},\n",
" },\n",
" \"terminal_id_risk_14day_window\": {\n",
" \"value_type\": \"DOUBLE\",\n",
" \"description\": \"Risk score calculated average number of frauds on the terminal in the last 14 day\",\n",
" \"labels\": {\"status\": \"passed\"},\n",
" },\n",
" \"terminal_id_nb_tx_15min_window\": {\n",
" \"value_type\": \"INT64\",\n",
" \"description\": \"Number of transactions by the terminal in the last 15 minutes\",\n",
" \"labels\": {\"status\": \"passed\"},\n",
" },\n",
" \"terminal_id_nb_tx_30min_window\": {\n",
" \"value_type\": \"INT64\",\n",
" \"description\": \"Number of transactions by the terminal in the last 30 minutes\",\n",
" \"labels\": {\"status\": \"passed\"},\n",
" },\n",
" \"terminal_id_nb_tx_60min_window\": {\n",
" \"value_type\": \"INT64\",\n",
" \"description\": \"Number of transactions by the terminal in the last 60 minutes\",\n",
" \"labels\": {\"status\": \"passed\"},\n",
" },\n",
" \"terminal_id_avg_amount_15min_window\": {\n",
" \"value_type\": \"DOUBLE\",\n",
" \"description\": \"Average spending amount in the last 15 minutes\",\n",
" \"labels\": {\"status\": \"passed\"},\n",
" },\n",
" \"terminal_id_avg_amount_30min_window\": {\n",
" \"value_type\": \"DOUBLE\",\n",
" \"description\": \"Average spending amount in the last 30 minutes\",\n",
" \"labels\": {\"status\": \"passed\"},\n",
" },\n",
" \"terminal_id_avg_amount_60min_window\": {\n",
" \"value_type\": \"DOUBLE\",\n",
" \"description\": \"Average spending amount in the last 60 minutes\",\n",
" \"labels\": {\"status\": \"passed\"},\n",
" },\n",
"}"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "SxNJmlCbX5HK",
"tags": []
},
"outputs": [],
"source": [
"terminal_feature_ids = terminal_entity_type.batch_create_features(\n",
" feature_configs=terminal_feature_configs, sync=True\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "a1335c1b0761"
},
"source": [
"### Inspect your feature store in the Vertex AI console\n",
"\n",
"You can also inspect your feature store in the [Vertex AI Feature Store console](https://console.cloud.google.com/vertex-ai/features)"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "WwVoJ42m3t48"
},
"source": [
"### Ingest feature values in Vertex AI Feature Store\n",
"\n",
"Now we need to ingest the actual feature values you created in BigQuery into the Vertex AI Feature Store.\n",
"\n",
"To ingest features values in Vertex AI Feature Store, you need to check the following requirements related to **Source Data format and Layout**:\n",
"\n",
"- Features values have to [be stored](https://cloud.google.com/vertex-ai/docs/featurestore/source-data) in BigQuery tables or Avro and CSV files on Google Cloud Storage.\n",
"- Each imported feature entity *must* have an ID.\n",
"- Each feature entity can *optionally* have a timestamp, to specifying when the feature values are generated.\n",
"\n",
"In the following queries, you ingest feature values from those BigQuery tables into Vertex AI Feature Store."
]
},
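{
"cell_type": "markdown",
"metadata": {},
"source": [
"Optionally, before starting the ingestion jobs, you can confirm that the source tables expose the required entity ID and feature timestamp columns. This is a sketch reusing the `INFORMATION_SCHEMA` pattern from earlier:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"# Optional sanity check: the BigQuery source tables must contain the entity ID\n",
"# columns and the feature timestamp column required for ingestion.\n",
"run_bq_query(f\"\"\"\n",
"SELECT table_name, column_name, data_type\n",
"FROM tx.INFORMATION_SCHEMA.COLUMNS\n",
"WHERE table_name IN ('{CUSTOMERS_TABLE_NAME}', '{TERMINALS_TABLE_NAME}')\n",
"AND column_name IN ('customer_id', 'terminal_id', '{FEATURE_TIME}')\n",
"\"\"\", show=True)"
]
},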
{
"cell_type": "markdown",
"metadata": {
"id": "usby4wvu3t49"
},
"source": [
"#### Ingest customer feature values into `customers` entity in Vertex AI Feature Store \n",
"\n",
"In the following section, you will import customer feature values into your feature store."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "TVrPDdZi3t4-",
"tags": []
},
"outputs": [],
"source": [
"CUSTOMERS_FEATURES_IDS = [\n",
" feature.name for feature in customer_feature_ids.list_features()\n",
"]\n",
"CUSTOMER_BQ_SOURCE_URI = f\"bq://{CUSTOMERS_BQ_TABLE_URI}\"\n",
"CUSTOMER_ENTITY_ID_FIELD = \"customer_id\""
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "CQ1oDfu5oHXO",
"tags": []
},
"outputs": [],
"source": [
"customer_entity_type.ingest_from_bq(\n",
" feature_ids=CUSTOMERS_FEATURES_IDS,\n",
" feature_time=FEATURE_TIME,\n",
" bq_source_uri=CUSTOMER_BQ_SOURCE_URI,\n",
" entity_id_field=CUSTOMER_ENTITY_ID_FIELD,\n",
" disable_online_serving=False,\n",
" worker_count=10,\n",
" sync=True,\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "493c957a2b08"
},
"source": [
"#### Monitor the `customer` features ingestion job in the console.\n"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "7892e9f545fc"
},
"source": [
"You can go to the [Feature Store Console](https://console.cloud.google.com/vertex-ai/ingestion-jobs) to view your ingestion job. "
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "2MPhVzBO3t4-"
},
"source": [
"#### Ingest terminal feature values into `terminal` entity in Vertex AI Feature Store \n",
"\n",
"In the following section, you will import terminal feature values into your feature store."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "rLji8AwL3t4_",
"tags": []
},
"outputs": [],
"source": [
"TERMINAL_ENTITY_ID = \"terminal\"\n",
"TERMINALS_FEATURES_IDS = [\n",
" feature.name for feature in terminal_feature_ids.list_features()\n",
"]\n",
"TERMINALS_BQ_SOURCE_URI = f\"bq://{TERMINALS_BQ_TABLE_URI}\"\n",
"TERMINALS_ENTITY_ID_FIELD = \"terminal_id\""
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "W0ztQUh8pFUD",
"tags": []
},
"outputs": [],
"source": [
"terminal_entity_type.ingest_from_bq(\n",
" feature_ids=TERMINALS_FEATURES_IDS,\n",
" feature_time=FEATURE_TIME,\n",
" bq_source_uri=TERMINALS_BQ_SOURCE_URI,\n",
" entity_id_field=TERMINALS_ENTITY_ID_FIELD,\n",
" disable_online_serving=False,\n",
" worker_count=10,\n",
" sync=True,\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "cd6b315b44d3"
},
"source": [
"#### Monitor the ingestion jobs in the console."
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "ebc0be367098"
},
"source": [
"The ingestion jobs you just created run asynchronously and they should take several minutes to complete. Please monitoring them in the [console](https://console.cloud.google.com/vertex-ai/ingestion-jobs).\n"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "eZzgeozE3t4_"
},
"source": [
"### Search for feature values \n",
"In this section, you'll run a search query on your feature store to validate that some data was ingested, as expected."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "y13EaqMnzibE",
"tags": []
},
"outputs": [],
"source": [
"customer_aggregated_features = customer_entity_type.read(\n",
" entity_ids=[\"5830444124423549\", \"5469689693941771\", \"1361459972478769\"],\n",
" feature_ids=CUSTOMERS_FEATURES_IDS,\n",
")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "987a0403c5be",
"tags": []
},
"outputs": [],
"source": [
"customer_aggregated_features"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### END\n",
"\n",
"Now you can go to the next notebook `03_feature_engineering_streaming.ipynb`"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "4c5b719f3dcd"
},
"source": [
"## Clean up"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "a2068db4e5f5"
},
"outputs": [],
"source": [
"# ff_feature_store.delete(sync=True, force=True)"
]
}
],
"metadata": {
"colab": {
"collapsed_sections": [],
"name": "02_feature_engineering_batch.ipynb",
"toc_visible": true
},
"environment": {
"kernel": "python3",
"name": "common-cpu.m115",
"type": "gcloud",
"uri": "gcr.io/deeplearning-platform-release/base-cpu:m115"
},
"kernelspec": {
"display_name": "Python 3 (Local)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.13"
}
},
"nbformat": 4,
"nbformat_minor": 4
}