course/videos/debug_error.ipynb

{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "This notebook contains the code samples from the video below, which is part of the [Hugging Face course](https://huggingface.co/course)." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "cellView": "form" }, "outputs": [ { "data": { "text/html": [ "<iframe width=\"560\" height=\"315\" src=\"https://www.youtube.com/embed/DQ-CpJn6Rc4?rel=0&amp;controls=0&amp;showinfo=0\" frameborder=\"0\" allowfullscreen></iframe>" ], "text/plain": [ "<IPython.core.display.HTML object>" ] }, "execution_count": null, "metadata": {}, "output_type": "execute_result" } ], "source": [ "#@title\n", "from IPython.display import HTML\n", "\n", "HTML('<iframe width=\"560\" height=\"315\" src=\"https://www.youtube.com/embed/DQ-CpJn6Rc4?rel=0&amp;controls=0&amp;showinfo=0\" frameborder=\"0\" allowfullscreen></iframe>')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Install the Transformers and Datasets libraries to run this notebook." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "! pip install datasets transformers[sentencepiece]" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "from transformers import pipeline\n", "\n", "model_checkpoint = \"distillbert-base-cased-distilled-squad\"\n", "question_answerer = pipeline(\"question_answering\", model=model_checkpoint)\n", "\n", "context = \"\"\"\n", "🤗 Transformers is backed by the three most popular deep learning libraries — Jax, PyTorch and TensorFlow — with a seamless integration between them. It's straightforward to train your models with one before loading them for inference with the other.\n", "\"\"\"\n", "question = \"Which deep learning libraries back 🤗 Transformers?\"\n", "question_answerer(question=question, context=context)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "from transformers import pipeline\n", "\n", "model_checkpoint = \"distillbert-base-cased-distilled-squad\"\n", "question_answerer = pipeline(\"question-answering\", model=model_checkpoint)\n", "\n", "context = \"\"\"\n", "🤗 Transformers is backed by the three most popular deep learning libraries — Jax, PyTorch and TensorFlow — with a seamless integration between them. It's straightforward to train your models with one before loading them for inference with the other.\n", "\"\"\"\n", "question = \"Which deep learning libraries back 🤗 Transformers?\"\n", "question_answerer(question=question, context=context)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "from transformers import pipeline\n", "\n", "model_checkpoint = \"distilbert-base-cased-distilled-squad\"\n", "question_answerer = pipeline(\"question-answering\", model=model_checkpoint)\n", "\n", "context = \"\"\"\n", "🤗 Transformers is backed by the three most popular deep learning libraries — Jax, PyTorch and TensorFlow — with a seamless integration between them. It's straightforward to train your models with one before loading them for inference with the other.\n", "\"\"\"\n", "question = \"Which deep learning libraries back 🤗 Transformers?\"\n", "question_answerer(question=question, context=context)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] } ], "metadata": { "colab": { "name": "What to do when you get an error?", "provenance": [] } }, "nbformat": 4, "nbformat_minor": 4 }