training/built-in-algorithms/xgboost_parquet_input_training.ipynb

{ "cells": [ { "cell_type": "markdown", "id": "checked-satisfaction", "metadata": { "tags": [] }, "source": [ "# Regression with Amazon SageMaker XGBoost (Parquet input)\n" ] }, { "attachments": {}, "cell_type": "markdown", "metadata": {}, "source": [ "---\n", "\n", "This notebook's CI test result for us-west-2 is as follows. CI test results in other regions can be found at the end of the notebook. \n", "\n", "![This us-west-2 badge failed to load. Check your device's internet connectivity, otherwise the service is currently unavailable](https://h75twx4l60.execute-api.us-west-2.amazonaws.com/sagemaker-nb/us-west-2/aws_sagemaker_studio|introduction_to_amazon_algorithms|xgboost_abalone|xgboost_parquet_input_training.ipynb)\n", "\n", "---" ] }, { "cell_type": "markdown", "id": "notebook-overview", "metadata": { "tags": [] }, "source": [ "\n", "This notebook demonstrates the use of a Parquet dataset with the SageMaker XGBoost algorithm. The example here is almost the same as [Regression with Amazon SageMaker XGBoost algorithm](xgboost_abalone.ipynb).\n", "\n", "This notebook tackles the exact same problem with the same solution, but has been modified to accept Parquet input. \n", "The original notebook provides details of the dataset and the machine learning use case.\n", "\n", "This notebook has been tested using the Python 3 (Data Science) kernel." ] }, { "cell_type": "code", "execution_count": null, "id": "invisible-percentage", "metadata": { "execution": {}, "tags": [] }, "outputs": [], "source": [ "import os\n", "import boto3\n", "import re\n", "import sagemaker\n", "from sagemaker import get_execution_role\n", "\n", "role = get_execution_role()\n", "region = boto3.Session().region_name\n", "\n", "# S3 bucket for saving code and model artifacts.\n", "# Feel free to specify a different bucket here if you wish.\n", "bucket = sagemaker.Session().default_bucket()\n", "prefix = \"sagemaker/DEMO-xgboost-parquet\"\n", "bucket_path = \"https://s3-{}.amazonaws.com/{}\".format(region, bucket)" ] }, { "cell_type": "markdown", "id": "strong-study", "metadata": { "tags": [] }, "source": [ "We will use the [PyArrow](https://arrow.apache.org/docs/python/) library to store the Abalone dataset in the Parquet format.
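\n", "\n", "As an optional sanity check (not required for training), PyArrow can read a Parquet file back and show the schema and row count that SageMaker will receive. This is a minimal, illustrative sketch, assuming the files written by the conversion cell below already exist locally:\n", "\n", "```python\n", "import pyarrow.parquet as pq\n", "\n", "# Inspect the schema and row count of the training file written below\n", "table = pq.read_table(\"abalone_train.parquet\")\n", "print(table.schema)\n", "print(table.num_rows)\n", "```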
" ] }, { "cell_type": "code", "execution_count": null, "id": "cutting-feelings", "metadata": { "execution": {}, "tags": [] }, "outputs": [], "source": [ "!pip install -Uq pyarrow" ] }, { "cell_type": "code", "execution_count": null, "id": "decent-cycling", "metadata": { "execution": {}, "tags": [] }, "outputs": [], "source": [ "%%time\n", "\n", "import numpy as np\n", "import pandas as pd\n", "from sklearn.datasets import load_svmlight_file\n", "\n", "# Download the dataset and load into a pandas dataframe\n", "FILE_NAME = \"abalone.csv\"\n", "s3 = boto3.client(\"s3\")\n", "s3.download_file(f\"sagemaker-sample-files\", \"datasets/tabular/uci_abalone/abalone.csv\", FILE_NAME)\n", "feature_names = [\n", " \"Sex\",\n", " \"Length\",\n", " \"Diameter\",\n", " \"Height\",\n", " \"Whole weight\",\n", " \"Shucked weight\",\n", " \"Viscera weight\",\n", " \"Shell weight\",\n", " \"Rings\",\n", "]\n", "data = pd.read_csv(FILE_NAME, header=None, names=feature_names)\n", "\n", "# SageMaker XGBoost has the convention of label in the first column\n", "data = data[feature_names[-1:] + feature_names[:-1]]\n", "data[\"Sex\"] = data[\"Sex\"].astype(\"category\").cat.codes\n", "\n", "# Split the downloaded data into train/test dataframes\n", "train, test = np.split(data.sample(frac=1), [int(0.8 * len(data))])\n", "\n", "# requires PyArrow installed\n", "train.to_parquet(\"abalone_train.parquet\")\n", "test.to_parquet(\"abalone_test.parquet\")" ] }, { "cell_type": "code", "execution_count": null, "id": "unnecessary-diary", "metadata": { "execution": {}, "tags": [] }, "outputs": [], "source": [ "%%time\n", "sagemaker.Session().upload_data(\n", " \"abalone_train.parquet\", bucket=bucket, key_prefix=prefix + \"/\" + \"train\"\n", ")\n", "\n", "sagemaker.Session().upload_data(\n", " \"abalone_test.parquet\", bucket=bucket, key_prefix=prefix + \"/\" + \"test\"\n", ")" ] }, { "cell_type": "markdown", "id": "simple-pulse", "metadata": { "tags": [] }, "source": [ "We obtain the new container by specifying the framework version (0.90-1). This version specifies the upstream XGBoost framework version (0.90) and an additional SageMaker version (1). If you have an existing XGBoost workflow based on the previous (0.72) container, this would be the only change necessary to get the same workflow working with the new container." ] }, { "cell_type": "code", "execution_count": null, "id": "australian-block", "metadata": { "execution": {}, "tags": [] }, "outputs": [], "source": [ "from sagemaker.amazon.amazon_estimator import get_image_uri\n", "\n", "container = get_image_uri(region, \"xgboost\", \"0.90-1\")" ] }, { "cell_type": "markdown", "id": "according-flashing", "metadata": { "tags": [] }, "source": [ "After setting training parameters, we kick off training, and poll for status until training is completed, which in this example, takes between 5 and 6 minutes." 
] }, { "cell_type": "code", "execution_count": null, "id": "smaller-louis", "metadata": { "execution": {}, "tags": [] }, "outputs": [], "source": [ "%%time\n", "import time\n", "from time import gmtime, strftime\n", "\n", "job_name = \"xgboost-parquet-example-training-\" + strftime(\"%Y-%m-%d-%H-%M-%S\", gmtime())\n", "print(\"Training job\", job_name)\n", "\n", "# Ensure that the training and validation data folders generated above are reflected in the \"InputDataConfig\" parameter below.\n", "\n", "create_training_params = {\n", " \"AlgorithmSpecification\": {\"TrainingImage\": container, \"TrainingInputMode\": \"Pipe\"},\n", " \"RoleArn\": role,\n", " \"OutputDataConfig\": {\"S3OutputPath\": bucket_path + \"/\" + prefix + \"/single-xgboost\"},\n", " \"ResourceConfig\": {\"InstanceCount\": 1, \"InstanceType\": \"ml.m5.24xlarge\", \"VolumeSizeInGB\": 20},\n", " \"TrainingJobName\": job_name,\n", " \"HyperParameters\": {\n", " \"max_depth\": \"5\",\n", " \"eta\": \"0.2\",\n", " \"gamma\": \"4\",\n", " \"min_child_weight\": \"6\",\n", " \"subsample\": \"0.7\",\n", " \"silent\": \"0\",\n", " \"objective\": \"reg:linear\",\n", " \"num_round\": \"10\",\n", " },\n", " \"StoppingCondition\": {\"MaxRuntimeInSeconds\": 3600},\n", " \"InputDataConfig\": [\n", " {\n", " \"ChannelName\": \"train\",\n", " \"DataSource\": {\n", " \"S3DataSource\": {\n", " \"S3DataType\": \"S3Prefix\",\n", " \"S3Uri\": bucket_path + \"/\" + prefix + \"/train\",\n", " \"S3DataDistributionType\": \"FullyReplicated\",\n", " }\n", " },\n", " \"ContentType\": \"application/x-parquet\",\n", " \"CompressionType\": \"None\",\n", " },\n", " {\n", " \"ChannelName\": \"validation\",\n", " \"DataSource\": {\n", " \"S3DataSource\": {\n", " \"S3DataType\": \"S3Prefix\",\n", " \"S3Uri\": bucket_path + \"/\" + prefix + \"/test\",\n", " \"S3DataDistributionType\": \"FullyReplicated\",\n", " }\n", " },\n", " \"ContentType\": \"application/x-parquet\",\n", " \"CompressionType\": \"None\",\n", " },\n", " ],\n", "}\n", "\n", "\n", "client = boto3.client(\"sagemaker\", region_name=region)\n", "client.create_training_job(**create_training_params)\n", "\n", "status = client.describe_training_job(TrainingJobName=job_name)[\"TrainingJobStatus\"]\n", "print(status)\n", "while status != \"Completed\" and status != \"Failed\":\n", " time.sleep(60)\n", " status = client.describe_training_job(TrainingJobName=job_name)[\"TrainingJobStatus\"]\n", " print(status)" ] }, { "cell_type": "code", "execution_count": null, "id": "wicked-baseline", "metadata": { "execution": {}, "tags": [] }, "outputs": [], "source": [ "%matplotlib inline\n", "from sagemaker.analytics import TrainingJobAnalytics\n", "\n", "metric_name = \"validation:rmse\"\n", "\n", "metrics_dataframe = TrainingJobAnalytics(\n", " training_job_name=job_name, metric_names=[metric_name]\n", ").dataframe()\n", "plt = metrics_dataframe.plot(\n", " kind=\"line\", figsize=(12, 5), x=\"timestamp\", y=\"value\", style=\"b.\", legend=False\n", ")\n", "plt.set_ylabel(metric_name);" ] }, { "attachments": {}, "cell_type": "markdown", "metadata": {}, "source": [ "## Notebook CI Test Results\n", "\n", "This notebook was tested in multiple regions. The test results are as follows, except for us-west-2 which is shown at the top of the notebook.\n", "\n", "![This us-east-1 badge failed to load. 
Check your device's internet connectivity, otherwise the service is currently unavailable](https://h75twx4l60.execute-api.us-west-2.amazonaws.com/sagemaker-nb/us-east-1/aws_sagemaker_studio|introduction_to_amazon_algorithms|xgboost_abalone|xgboost_parquet_input_training.ipynb)\n", "\n", "![This us-east-2 badge failed to load. Check your device's internet connectivity, otherwise the service is currently unavailable](https://h75twx4l60.execute-api.us-west-2.amazonaws.com/sagemaker-nb/us-east-2/aws_sagemaker_studio|introduction_to_amazon_algorithms|xgboost_abalone|xgboost_parquet_input_training.ipynb)\n", "\n", "![This us-west-1 badge failed to load. Check your device's internet connectivity, otherwise the service is currently unavailable](https://h75twx4l60.execute-api.us-west-2.amazonaws.com/sagemaker-nb/us-west-1/aws_sagemaker_studio|introduction_to_amazon_algorithms|xgboost_abalone|xgboost_parquet_input_training.ipynb)\n", "\n", "![This ca-central-1 badge failed to load. Check your device's internet connectivity, otherwise the service is currently unavailable](https://h75twx4l60.execute-api.us-west-2.amazonaws.com/sagemaker-nb/ca-central-1/aws_sagemaker_studio|introduction_to_amazon_algorithms|xgboost_abalone|xgboost_parquet_input_training.ipynb)\n", "\n", "![This sa-east-1 badge failed to load. Check your device's internet connectivity, otherwise the service is currently unavailable](https://h75twx4l60.execute-api.us-west-2.amazonaws.com/sagemaker-nb/sa-east-1/aws_sagemaker_studio|introduction_to_amazon_algorithms|xgboost_abalone|xgboost_parquet_input_training.ipynb)\n", "\n", "![This eu-west-1 badge failed to load. Check your device's internet connectivity, otherwise the service is currently unavailable](https://h75twx4l60.execute-api.us-west-2.amazonaws.com/sagemaker-nb/eu-west-1/aws_sagemaker_studio|introduction_to_amazon_algorithms|xgboost_abalone|xgboost_parquet_input_training.ipynb)\n", "\n", "![This eu-west-2 badge failed to load. Check your device's internet connectivity, otherwise the service is currently unavailable](https://h75twx4l60.execute-api.us-west-2.amazonaws.com/sagemaker-nb/eu-west-2/aws_sagemaker_studio|introduction_to_amazon_algorithms|xgboost_abalone|xgboost_parquet_input_training.ipynb)\n", "\n", "![This eu-west-3 badge failed to load. Check your device's internet connectivity, otherwise the service is currently unavailable](https://h75twx4l60.execute-api.us-west-2.amazonaws.com/sagemaker-nb/eu-west-3/aws_sagemaker_studio|introduction_to_amazon_algorithms|xgboost_abalone|xgboost_parquet_input_training.ipynb)\n", "\n", "![This eu-central-1 badge failed to load. Check your device's internet connectivity, otherwise the service is currently unavailable](https://h75twx4l60.execute-api.us-west-2.amazonaws.com/sagemaker-nb/eu-central-1/aws_sagemaker_studio|introduction_to_amazon_algorithms|xgboost_abalone|xgboost_parquet_input_training.ipynb)\n", "\n", "![This eu-north-1 badge failed to load. Check your device's internet connectivity, otherwise the service is currently unavailable](https://h75twx4l60.execute-api.us-west-2.amazonaws.com/sagemaker-nb/eu-north-1/aws_sagemaker_studio|introduction_to_amazon_algorithms|xgboost_abalone|xgboost_parquet_input_training.ipynb)\n", "\n", "![This ap-southeast-1 badge failed to load. 
Check your device's internet connectivity, otherwise the service is currently unavailable](https://h75twx4l60.execute-api.us-west-2.amazonaws.com/sagemaker-nb/ap-southeast-1/aws_sagemaker_studio|introduction_to_amazon_algorithms|xgboost_abalone|xgboost_parquet_input_training.ipynb)\n", "\n", "![This ap-southeast-2 badge failed to load. Check your device's internet connectivity, otherwise the service is currently unavailable](https://h75twx4l60.execute-api.us-west-2.amazonaws.com/sagemaker-nb/ap-southeast-2/aws_sagemaker_studio|introduction_to_amazon_algorithms|xgboost_abalone|xgboost_parquet_input_training.ipynb)\n", "\n", "![This ap-northeast-1 badge failed to load. Check your device's internet connectivity, otherwise the service is currently unavailable](https://h75twx4l60.execute-api.us-west-2.amazonaws.com/sagemaker-nb/ap-northeast-1/aws_sagemaker_studio|introduction_to_amazon_algorithms|xgboost_abalone|xgboost_parquet_input_training.ipynb)\n", "\n", "![This ap-northeast-2 badge failed to load. Check your device's internet connectivity, otherwise the service is currently unavailable](https://h75twx4l60.execute-api.us-west-2.amazonaws.com/sagemaker-nb/ap-northeast-2/aws_sagemaker_studio|introduction_to_amazon_algorithms|xgboost_abalone|xgboost_parquet_input_training.ipynb)\n", "\n", "![This ap-south-1 badge failed to load. Check your device's internet connectivity, otherwise the service is currently unavailable](https://h75twx4l60.execute-api.us-west-2.amazonaws.com/sagemaker-nb/ap-south-1/aws_sagemaker_studio|introduction_to_amazon_algorithms|xgboost_abalone|xgboost_parquet_input_training.ipynb)\n" ] } ], "metadata": { "anaconda-cloud": {}, "instance_type": "ml.t3.medium", "kernelspec": { "display_name": "Environment (conda_anaconda3)", "language": "python", "name": "conda_anaconda3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.7.6" } }, "nbformat": 4, "nbformat_minor": 5 }