uczenie-maszynowe/wyk/13_CNN.ipynb
2023-06-01 10:31:04 +02:00


{
"cells": [
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"# 13. Splotowe sieci neuronowe"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"Splotowe sieci neuronowe, inaczej konwolucyjne sieci neuronowe (*convolutional neural networks*, CNN, ConvNet)"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"Konwolucyjne sieci neuronowe wykorzystuje się do:\n",
"\n",
"* rozpoznawania obrazu\n",
"* analizy wideo\n",
"* innych zagadnień o podobnej strukturze"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"Innymi słowy, CNN przydają się, gdy mamy bardzo dużo danych wejściowych, w których istotne jest ich sąsiedztwo."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"### Warstwy konwolucyjne"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"Dla uproszczenia przyjmijmy, że mamy dane w postaci jednowymiarowej np. chcemy stwierdzić, czy na danym nagraniu obecny jest głos człowieka."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"Nasze nagranie możemy reprezentować jako ciąg $n$ próbek dźwiękowych:\n",
"$$(x_0, x_1, \\ldots, x_n)$$\n",
"(możemy traktować je jak jednowymiarowe „piksele”)."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"Najprostsza metoda „zwykła” jednowarstwowa sieć neuronowa (każdy z każdym) nie poradzi sobie zbyt dobrze w tym przypadku:\n",
"\n",
"* dużo danych wejściowych\n",
"* nie wykrywa własności „lokalnych” wejścia"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"Chcielibyśmy wykrywać pewne lokalne „wzory” w danych wejściowych.\n",
"\n",
"W tym celu tworzymy mniejszą sieć neuronową (mniej neuronów wejściowych) i _kopiujemy_ ją tak, żeby każda jej kopia działała na pewnym fragmencie wejścia (fragmenty mogą nachodzić na siebie)."
]
},
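{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"The sliding copies can be sketched in a few lines of NumPy. This is only an illustrative sketch: the signal and the 3-input detector weights below are made up."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"signal = np.array([0.0, 1.0, 1.0, 0.0, -1.0, -1.0, 0.0])\n",
"weights = np.array([1.0, 0.0, -1.0])  # a tiny 'network' with 3 inputs\n",
"\n",
"# each 'copy' of the small network sees one window of the input;\n",
"# consecutive windows overlap (stride 1)\n",
"outputs = np.array([signal[i:i + 3] @ weights for i in range(len(signal) - 2)])\n",
"outputs"
]
},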
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"Warstwę sieci A nazywamy **warstwą konwolucyjną** (konwolucja = splot).\n",
"\n",
"Warstw konwolucyjnych może być więcej niż jedna."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"Tak definiujemy formalnie funckję splotu dla 2 wymiarów:\n",
"\n",
"$$\n",
"\\left[\\begin{array}{ccc}\n",
"a & b & c\\\\\n",
"d & e & f\\\\\n",
"g & h & i\\\\\n",
"\\end{array}\\right]\n",
"*\n",
"\\left[\\begin{array}{ccc}\n",
"1 & 2 & 3\\\\\n",
"4 & 5 & 6\\\\\n",
"7 & 8 & 9\\\\\n",
"\\end{array}\\right] \n",
"=\\\\\n",
"(1 \\cdot a)+(2 \\cdot b)+(3 \\cdot c)+(4 \\cdot d)+(5 \\cdot e)\\\\+(6 \\cdot f)+(7 \\cdot g)+(8 \\cdot h)+(9 \\cdot i)\n",
"$$\n",
"\n",
"Więcej: https://en.wikipedia.org/wiki/Kernel_(image_processing)"
]
},
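{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"The formula above can be checked directly in NumPy: the operation applied in CNNs is an element-wise product of the kernel with an image patch, summed up (frameworks typically apply the kernel without flipping it, i.e. they compute a cross-correlation). The sample patch values below are made up."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"patch = np.array([[1., 0., 2.],\n",
"                  [3., 1., 0.],\n",
"                  [0., 2., 1.]])\n",
"kernel = np.array([[1., 2., 3.],\n",
"                   [4., 5., 6.],\n",
"                   [7., 8., 9.]])\n",
"\n",
"# (1*1) + (2*0) + (3*2) + (4*3) + (5*1) + (6*0) + (7*0) + (8*2) + (9*1)\n",
"value = float(np.sum(patch * kernel))\n",
"value"
]
},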
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"Jednostka warstwy konwolucyjnej może się składać z jednej lub kilku warstw neuronów."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"Jeden neuron może odpowiadać np. za wykrywanie pionowych krawędzi, drugi poziomych, a jeszcze inny np. krzyżujących się linii."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"### _Pooling_"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"Obrazy składają się na ogół z milionów pikseli. Oznacza to, że nawet po zastosowaniu kilku warstw konwolucyjnych mielibyśmy sporo parametrów do wytrenowania.\n",
"\n",
"Żeby zredukować liczbę parametrów, a dzięki temu uprościć obliczenia, stosuje się warstwy ***pooling***.\n",
"\n",
"*Pooling* to rodzaj próbkowania. Najpopularniejszą jego odmianą jest *max-pooling*, czyli wybieranie najwyższej wartości spośród kilku sąsiadujących pikseli (rys. 13.1)."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"![Rys. 13.1. Pooling](Max_pooling.png \"Rys. 13.1. Pooling\")\n",
"\n",
"Rys. 13.1. - źródło: [Aphex34](https://commons.wikimedia.org/wiki/File:Max_pooling.png), [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0), Wikimedia Commons"
]
},
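{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"A minimal NumPy sketch of the 2x2 max-pooling with stride 2 shown in Fig. 13.1 (the input values below are made up): each 2x2 block of the input is replaced by its maximum."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"x = np.array([[1., 3., 2., 1.],\n",
"              [4., 6., 5., 0.],\n",
"              [3., 1., 1., 2.],\n",
"              [8., 2., 4., 3.]])\n",
"\n",
"# split the 4x4 image into 2x2 blocks and keep the maximum of each block\n",
"pooled = x.reshape(2, 2, 2, 2).max(axis=(1, 3))\n",
"pooled"
]
},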
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"Warstwy _pooling_ i konwolucyjne można przeplatać ze sobą (rys. 13.2)."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"![Rys. 13.2. CNN](Typical_cnn.png \"Rys. 13.2. CNN\")\n",
"\n",
"Rys. 13.2. - źródło: [Aphex34](https://commons.wikimedia.org/wiki/File:Typical_cnn.png), [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0), Wikimedia Commons"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"_Pooling_ idea: nie jest istotne, w którym *dokładnie* miejscu na obrazku dana cecha (krawędź, oko, itp.) się znajduje, wystarczy przybliżona lokalizacja."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"Do sieci konwolucujnych możemy dokładać też warstwy ReLU."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"https://www.youtube.com/watch?v=FmpDIaiMIeA"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "notes"
}
},
"source": [
"Zobacz też: https://colah.github.io/posts/2014-07-Conv-Nets-Modular/"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"### Przykład: MNIST"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {
"slideshow": {
"slide_type": "notes"
}
},
"outputs": [],
"source": [
"%matplotlib inline\n",
"\n",
"import math\n",
"import matplotlib.pyplot as plt\n",
"import numpy as np\n",
"import random\n",
"\n",
"from IPython.display import YouTubeVideo"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {
"slideshow": {
"slide_type": "notes"
}
},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"2023-01-27 12:50:47.601029: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX2 FMA\n",
"To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.\n",
"2023-01-27 12:50:48.662241: W tensorflow/compiler/xla/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libcudart.so.11.0'; dlerror: libcudart.so.11.0: cannot open shared object file: No such file or directory\n",
"2023-01-27 12:50:48.662268: I tensorflow/compiler/xla/stream_executor/cuda/cudart_stub.cc:29] Ignore above cudart dlerror if you do not have a GPU set up on your machine.\n",
"2023-01-27 12:50:51.653864: W tensorflow/compiler/xla/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libnvinfer.so.7'; dlerror: libnvinfer.so.7: cannot open shared object file: No such file or directory\n",
"2023-01-27 12:50:51.654326: W tensorflow/compiler/xla/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libnvinfer_plugin.so.7'; dlerror: libnvinfer_plugin.so.7: cannot open shared object file: No such file or directory\n",
"2023-01-27 12:50:51.654341: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Cannot dlopen some TensorRT libraries. If you would like to use Nvidia GPU with TensorRT, please make sure the missing libraries mentioned above are installed properly.\n"
]
}
],
"source": [
"import keras\n",
"from keras.datasets import mnist\n",
"\n",
"from keras.models import Sequential\n",
"from keras.layers import Dense, Dropout, Flatten\n",
"from keras.layers import Conv2D, MaxPooling2D\n",
"\n",
"# załaduj dane i podziel je na zbiory uczący i testowy\n",
"(x_train, y_train), (x_test, y_test) = mnist.load_data()"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {
"slideshow": {
"slide_type": "notes"
}
},
"outputs": [],
"source": [
"def draw_examples(examples, captions=None):\n",
" plt.figure(figsize=(16, 4))\n",
" m = len(examples)\n",
" for i, example in enumerate(examples):\n",
" plt.subplot(100 + m * 10 + i + 1)\n",
" plt.imshow(example, cmap=plt.get_cmap('gray'))\n",
" plt.show()\n",
" if captions is not None:\n",
" print(6 * ' ' + (10 * ' ').join(str(captions[i]) for i in range(m)))"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAABQcAAADFCAYAAADpJUQuAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjYuMiwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8o6BhiAAAACXBIWXMAAA9hAAAPYQGoP6dpAAAomElEQVR4nO3de3RU1d3G8V8CJFyTyC2BQiQqFRQJGgFRFqBE0KoQoKIUBKwFCwFEK6X4oqCIQVBbrmK1kKIoaBFQvFKuVUPK1S5AIlKEIEkAJRcCJErO+4fLVNx78ExmJjOz9/ez1vnDh30m+4Qnh2Eznh3hOI4jAAAAAAAAAKwTGewJAAAAAAAAAAgOFgcBAAAAAAAAS7E4CAAAAAAAAFiKxUEAAAAAAADAUiwOAgAAAAAAAJZicRAAAAAAAACwFIuDAAAAAAAAgKVYHAQAAAAAAAAsxeIgAAAAAAAAYCkWBwEAAAAAAABL1QzUC8+fP19mzZol+fn5kpycLHPnzpVOnTr97HkVFRVy9OhRadCggURERARqerCU4zhSUlIizZs3l8hI79bGq9ppEXqNwPGl0yLcqxF6gtVpEXqNwOFeDdNwr4aJuFfDNF512gmAZcuWOVFRUc6iRYucPXv2OCNGjHDi4uKcgoKCnz03NzfXEREOjoAeubm51dZpes1RHYe3nfa113SaI9BHdXeaXnNUx8G9msO0g3s1h4kH92oO0w43nQ7I4mCnTp2c9PT0yv8+d+6c07x5cycjI+Nnzy0sLAz6N47D/KOwsLDaOk2vOarj8LbTvvaaTnME+qjuTtNrjuo4uFdzmHZwr+Yw8eBezWHa4abTfn/mYHl5uWzfvl1SU1Mrs8jISElNTZWsrCxlfFlZmRQXF1ceJSUl/p4SoPDm49redlqEXqP6efu/IHCvRqgLdKdF6DWqH/dqmIZ7NUzEvRqmcdNpvy8OnjhxQs6dOyfx8fHn5fHx8ZKfn6+Mz8jIkNjY2MqjZcuW/p4S4BNvOy1CrxH6uFfDNNyrYSLu1TAN92qYiHs1TBD03YonTZokRUVFlUdubm6wpwT4jF7DNHQaJqLXMA2dhonoNUxDpxGK/L5bcePGjaVGjRpSUFBwXl5QUCAJCQnK+OjoaImOjvb3NAC/8bbTIvQaoY97NUzDvRom4l4N03Cvhom4V8MEfv/kYFRUlKSkpMi6desqs4qKClm3bp106dLF318OCDg6DRPRa5iGTsNE9BqmodMwEb2GEbzehseFZcuWOdHR0U5mZqazd+9eZ+TIkU5cXJyTn5//s+cWFRUFfScXDvOPoqKiaus0veaojsPbTvvaazrNEeijujtNrzmq4+BezWHawb2aw8SDezWHaYebTgdkcdBxHGfu3LlOYmKiExUV5XTq1MnZsmWLq/P4weCojqMqN/yqdppec1THUZVO+9JrOs0R6KO6O02vOarj4F7NYdrBvZrDxIN7NYdph5tORziO40gIKS4ultjY2GBPA4YrKiqSmJiYavt69BqBRqdhmurutAi9RuBxr4ZpuFfDRNyrYRo3nQ76bsUAAAAAAAAAgoPFQQAAAAAAAMBSLA4CAAAAAAAAlmJxEAAAAAAAALAUi4MAAAAAAACApVgcBAAAAAAAACzF4iAAAAAAAABgKRYHAQAAAAAAAEuxOAgAAAAAAABYisVBAAAAAAAAwFIsDgIAAAAAAACWqhnsCQCAWykpKdp8zJgxSjZ06FDt2CVLlijZ3LlztWN37NjhxewAAAAAAFUxe/ZsbT5u3Dgl2717t5Ldfvvt2vMPHTrk28QswScHAQAAAAAAAEuxOAgAAAAAAABYisVBAAAAAAAAwFIsDgIAAAAAAACWYkOSEFWjRg1tHhsb69Pr6jZuqFu3rnbs5ZdfrmTp6enasc8884ySDRo0SMnOnj2rPX/GjBlK9vjjj2vHwg4dOnRQsrVr12rHxsTEKJnjONqx99xzj5L16dNHO7ZRo0YXmC
EQfnr27KlkS5cuVbLu3btrz8/JyfH7nACdyZMnK5mn9wWRkeq/dffo0UPJNm3a5PO8AMAUDRo00Ob169dXsttuu007tkmTJkr23HPPaceWlZV5MTuYrlWrVko2ZMgQ7diKigola9u2rZK1adNGez4bkrjDJwcBAAAAAAAAS7E4CAAAAAAAAFiKxUEAAAAAAADAUiwOAgAAAAAAAJZicRAAAAAAAACwFLsV+ygxMVGbR0VFKdn111+vHdu1a1cli4uL044dMGCA+8n56MiRI0o2Z84c7dh+/fopWUlJiZJ9+umn2vPZQdBunTp1UrIVK1YomafdunU7E+v6JyJSXl6uZJ52Jb7uuuuUbMeOHa5eE1XXrVs3JfP0e7Ry5cpAT8coHTt2VLKtW7cGYSbA94YPH67NJ06cqGS63Qo98bRjPQCYTrcLrO6e2qVLF+357dq18+nrN2vWTJuPGzfOp9eFWY4fP65kmzdv1o7t06dPoKcD4ZODAAAAAAAAgLVYHAQAAAAAAAAsxeIgAAAAAAAAYCkWBwEAAAAAAABLsSGJFzp06KBk69ev1471tHFCKPL0gO/Jkycr2alTp7Rjly5dqmR5eXlKdvLkSe35OTk5F5oiwlDdunWV7JprrtGOfeWVV5TM08OM3dq/f782nzlzppItW7ZMO/bjjz9WMt3PRUZGhpezw4X06NFDyVq3bq0dy4YkepGR+n/7S0pKUrKLL75YySIiIvw+J0BH1z8Rkdq1a1fzTGCyzp07a/MhQ4YoWffu3ZXsyiuvdP21Hn74YW1+9OhRJdNtSiiif1+UnZ3teg4wT5s2bZRs/Pjx2rGDBw9Wsjp16iiZpz/rc3NzlczTRn9t27ZVsoEDB2rHLliwQMn27dunHQvzlZaWKtmhQ4eCMBP8gE8OAgAAAAAAAJZicRAAAAAAAACwFIuDAAAAAAAAgKVYHAQAAAAAAAAsxYYkXjh8+LCSff3119qx1bkhie4BxYWFhdqxN954o5KVl5drx7788ss+zQt2e+GFF5Rs0KBB1fb1PW1+Ur9+fSXbtGmTdqxuY4z27dv7NC/8vKFDhypZVlZWEGYSvjxt6DNixAgl0z34ngeEIxBSU1OVbOzYsa7P99TL22+/XckKCgrcTwxGueuuu5Rs9uzZ2rGNGzdWMt0mDRs3btSe36RJEyWbNWvWz8zwwl/L0+vefffdrl8X4UH398Wnn35aO1bX6wYNGvj09T1t3te7d28lq1Wrlnas7r6s+7m6UA47xcXFKVlycnL1TwSV+OQgAAAAAAAAYCkWBwEAAAAAAABLsTgIAAAAAAAAWIrFQQAAAAAAAMBSLA4CAAAAAAAAlmK3Yi988803SjZhwgTtWN3OeTt37tSOnTNnjus57Nq1S8luvvlmJSstLdWef+WVVyrZAw884PrrAz+VkpKizW+77TYl87Qrn45uB+G3335bO/aZZ55RsqNHj2rH6n4OT548qR170003KZk314CqiYzk36189dJLL7ke62m3QqCqunbtqs0XL16sZLrdOj3xtAvsoUOHXL8GwlPNmupfWa699lrt2BdffFHJ6tatqx27efNmJZs2bZqSffTRR9rzo6Ojlez111/Xju3Vq5c219m2bZvrsQhf/fr1U7Lf/e53AflaBw4cUDLd3yFFRHJzc5Xssssu8/ucYDfdfTkxMdGn1+zYsaM21+2qzXsHFX8DAwAAAAAAACzF4iAAAAAAAABgKRYHAQAAAAAAAEuxOAgAAAAAAABYyusNSTZv3iyzZs2S7du3S15enqxcuVLS0tIqf91xHJkyZYq8+OKLUlhYKDfccIM8//zz0rp1a3/OO2SsWrVKm69fv17JSkpKtGOTk5OV7L777tOO1W284GnzEZ09e/Yo2ciRI12fbyI67V6HDh2UbO3atdqxMTExSuY4jnbse++9p2SDBg1Ssu7du2vPnzx5spJ52pDh+PHjSvbpp59qx1ZUVCiZbqOVa665Rnv+jh07tH
mghUun27dvr83j4+OrdR4m8maTB08/w6EmXHoNkWHDhmnz5s2bu36NjRs3KtmSJUuqOqWQRKfdGzJkiJJ5s/GSp/vcXXfdpWTFxcWuX1d3vjcbjxw5ckSb//3vf3f9GqGGXrt35513+nT+l19+qc23bt2qZBMnTlQy3cYjnrRt29b1WNPQ6cDQbR6ZmZmpHTt16lRXr+lpXGFhoZLNmzfP1WvaxOtPDpaWlkpycrLMnz9f++szZ86UOXPmyMKFCyU7O1vq1asnvXv3lrNnz/o8WSAQ6DRMQ6dhInoN09BpmIhewzR0Grbw+pODt956q9x6663aX3McR/7yl7/I5MmTpW/fviLy/b/yxsfHy6pVq+Tuu+/2bbZAANBpmIZOw0T0Gqah0zARvYZp6DRs4ddnDh48eFDy8/MlNTW1MouNjZXOnTtLVlaW9pyysjIpLi4+7wBCRVU6LUKvEbroNExEr2EaOg0T0WuYhk7DJH5dHMzPzxcR9XlR8fHxlb/2UxkZGRIbG1t5tGzZ0p9TAnxSlU6L0GuELjoNE9FrmIZOw0T0Gqah0zBJ0HcrnjRpkhQVFVUe3jwYFQhV9BqmodMwEb2Gaeg0TESvYRo6jVDk9TMHLyQhIUFERAoKCqRZs2aVeUFBgXaXUxGR6OhoiY6O9uc0QoI3Hw0uKipyPXbEiBFKtnz5ciXT7bIK71Wl0yLh3+tf/vKX2nzChAlK5mlX1BMnTihZXl6edqxuV75Tp04p2TvvvKM931MeCHXq1FGyP/zhD9qxgwcPDvR0vBZKnf7Vr36lzXXfY3im2905KSnJ9flfffWVP6cTFKHUa9s0btxYyX77299qx+rem+h2EBQRefLJJ32aV7iztdPTpk3T5o888oiSOY6jHbtgwQIlmzx5snasr/8r3//93//5dP64ceO0+fHjx3163VBla6890f29buTIkdqxH374oZJ98cUX2rHHjh3zbWIauvcaoNP+5unPALe7FcM3fv3kYFJSkiQkJMi6desqs+LiYsnOzpYuXbr480sB1YJOwzR0Giai1zANnYaJ6DVMQ6dhEq8/OXjq1Knz/pXi4MGDsmvXLmnYsKEkJibK+PHj5cknn5TWrVtLUlKSPProo9K8eXNJS0vz57wBv6HTMA2dhonoNUxDp2Eieg3T0GnYwuvFwW3btsmNN95Y+d8PPfSQiIgMGzZMMjMz5Y9//KOUlpbKyJEjpbCwULp27Srvv/++1K5d23+zBvyITsM0dBomotcwDZ2Gieg1TEOnYQuvFwd79Ojh8RkfIiIRERHyxBNPyBNPPOHTxIDqQqdhGjoNE9FrmIZOw0T0Gqah07CFXzckQdXoHrCZkpKiHdu9e3clS01NVTLdQ2sBHd3DcJ955hntWN0GEiUlJdqxQ4cOVbJt27Zpx4b7BhSJiYnBnkJYuvzyy12P3bNnTwBnEt50P6+eHhz++eefK5mnn2Hgp1q1aqVkK1as8Ok1586dq803bNjg0+si9D322GNKptt4RESkvLxcyT744APt2IkTJyrZmTNnXM9L92mfXr16acfq/vyPiIjQjtVtsrN69WrX84J5jh49qmShuvECz89DMEVGqltlsAGr//l1QxIAAAAAAAAA4YPFQQAAAAAAAMBSLA4CAAAAAAAAlmJxEAAAAAAAALAUi4MAAAAAAACApditOASUlpYq2YgRI7Rjd+zYoWQvvviiknna5U+3W+z8+fO1Yy+0ZTvMcfXVVyuZbldiT/r27avNN23aVOU5AT+1devWYE8hIGJiYrT5LbfcomRDhgzRjvW0i6bOtGnTlKywsND1+bCbrpft27d3ff66deuUbPbs2T7NCeEhLi5OyUaPHq1knt576nYmTktL83VactlllynZ0qVLlSwlJcX1a/7jH//Q5jNnznQ/McAH48aNU7J69er59JpXXXWV67GffPKJNs/KyvJpDrCXbmdi1ir8j08OAgAAAAAAAJZicRAAAAAAAACwFIuDAA
AAAAAAgKVYHAQAAAAAAAAsxYYkIerAgQPafPjw4Uq2ePFiJbvnnnu05+tyTw+oXbJkiZLl5eVpxyJ8Pffcc0oWERGhHavbZMTkjUciI9V/P9E9EBeB17Bhw4C8bnJyspJ56n9qaqqStWjRQjs2KipKyQYPHqxkuo6JiJw5c0bJsrOztWPLysqUrGZN/R/v27dv1+bAj3na6GHGjBmuzv/oo4+0+bBhw5SsqKjI9bwQvnT3xMaNG7s+X7fBQtOmTbVj7733XiXr06ePdmy7du2UrH79+krm6cH3uvyVV17RjtVtQAj8VN26dbX5FVdcoWRTpkzRjnW7saCn9yDevNc9evSokul+BkVEzp075/p1AVQ/PjkIAAAAAAAAWIrFQQAAAAAAAMBSLA4CAAAAAAAAlmJxEAAAAAAAALAUG5KEmZUrVyrZ/v37lUy3yYSISM+ePZXsqaee0o69+OKLlWz69OnasV999ZU2R2i5/fbblaxDhw5K5unB22+99Za/pxTSdA9k1n1vdu3aVQ2zMY9u0w0R/fd44cKF2rGPPPKIT3No3769knnakOS7775TstOnT2vH7t27V8kWLVqkZNu2bdOer9vop6CgQDv2yJEjSlanTh3t2H379mlz2KtVq1ZKtmLFCp9e87///a8299RhmK+8vFzJjh8/rmRNmjTRnn/w4EEl8/RexRu6zRSKi4uVrFmzZtrzT5w4oWRvv/22z/OCWWrVqqXNr776aiXzdP/VddDT+yhdr7OyspTslltu0Z7vaVMUHd0GaP3799eOnT17tpLp7g0AgoNPDgIAAAAAAACWYnEQAAAAAAAAsBSLgwAAAAAAAIClWBwEAAAAAAAALMXiIAAAAAAAAGApdis2wO7du5Vs4MCB2rF33HGHki1evFg79v7771ey1q1ba8fefPPNF5oiQoRuB9OoqCglO3bsmPb85cuX+31O1S06OlrJpk6d6vr89evXK9mkSZN8mZK1Ro8erc0PHTqkZNdff31A5nD48GElW7VqlXbsZ599pmRbtmzx95Q8GjlypDbX7e7pabdY4KcmTpyoZLqd2r0xY8YMn86HeQoLC5UsLS1NydasWaM9v2HDhkp24MAB7djVq1crWWZmpnbsN998o2TLli1TMk+7FevGwm6699WedgV+8803Xb/u448/rmS696QiIh9//LGS6X6GPJ3frl071/PSvQfJyMjQjnX7nqusrMz114cdIiPVz7R5816lW7duSjZv3jyf5mQiPjkIAAAAAAAAWIrFQQAAAAAAAMBSLA4CAAAAAAAAlmJxEAAAAAAAALAUG5IYSvfgZxGRl19+Wcleeukl7diaNdV66B7mKSLSo0cPJdu4caPH+SG0eXoQcF5eXjXPpOp0G4+IiEyePFnJJkyYoB175MgRJXv22WeV7NSpU17ODhfy9NNPB3sKIalnz56ux65YsSKAM0E46tChgzbv1auXT6+r2/whJyfHp9eEHbKzs5VMt7lBIOne13bv3l3JPD34ns2f7FarVi0l020c4ul9ps57772nzefOnatknv6+p/s5evfdd5Xsqquu0p5fXl6uZDNnztSO1W1e0rdvX+3YpUuXKtk///lPJfP0PvDkyZPaXGfXrl2uxyL06e7BjuO4Pr9///5KdsUVV2jH7t271/3EDMMnBwEAAAAAAABLsTgIAAAAAAAAWIrFQQAAAAAAAMBSLA4CAAAAAAAAlmJxEAAAAAAAALAUuxUboH379kr261//Wju2Y8eOSqbbldgTT7v3bN682fVrIPS99dZbwZ6CV3S7cHraGe6uu+5SMt1umyIiAwYM8GleQLCsXLky2FNAiPnwww+1+UUXXeT6NbZs2aJkw4cPr+qUgKCrU6eOknmzK+ayZcv8PieEnho1amjzadOmKdnDDz+sZKWlpdrz//SnPymZp07pdia+9tprtWPnzZunZFdffbWS7d+/X3v+qFGjlGzDhg3asTExMUp2/fXXa8cOHjxYyf
r06aNka9eu1Z6vk5ubq82TkpJcvwZC38KFC5Xs/vvv9+k1R44cqc3Hjx/v0+uGMz45CAAAAAAAAFiKxUEAAAAAAADAUiwOAgAAAAAAAJZicRAAAAAAAACwFBuShKjLL79cm48ZM0bJ+vfvr2QJCQk+z+HcuXNKlpeXpx2re3gzQk9ERISrLC0tTXv+Aw884O8peeXBBx/U5o8++qiSxcbGascuXbpUyYYOHerbxAAgxDVq1Eibe/Pn94IFC5Ts1KlTVZ4TEGwffPBBsKeAMOBp4wLd5iOnT59WMk8bJ+g2irruuuu0Y++9914lu/XWW7VjdRvtPPHEE0q2ePFi7fmeNvnQKS4uVrL3339fO1aXDxo0SMl+85vfuP76nv5uALPs27cv2FOwAp8cBAAAAAAAACzF4iAAAAAAAABgKRYHAQAAAAAAAEuxOAgAAAAAAABYyqvFwYyMDOnYsaM0aNBAmjZtKmlpaZKTk3PemLNnz0p6ero0atRI6tevLwMGDJCCggK/ThrwJ3oN09BpmIhewzR0Gqah0zARvYYtIhzHcdwOvuWWW+Tuu++Wjh07ynfffSePPPKI7N69W/bu3Sv16tUTEZFRo0bJO++8I5mZmRIbGytjxoyRyMhI+fjjj119jeLiYo+7jIY7TzsI63Zp0u1KLCLSqlUrf05JRES2bdumzadPn65kb731lt+/fjAUFRVJTEyMiNjV6zvvvFPJXnvtNSXT7VQtIvLCCy8o2aJFi7Rjv/76ayXztAPbPffco2TJyclK1qJFC+35hw8fVrItW7Zox86ePdv12HBia6dtsnz5cm0+cOBAJRs2bJh27JIlS/w6p0D6cadF6LU3dDtQDh8+XDvWm92KL7nkEiU7dOiQ6/PBvTrU9O7dW8neffddJfP016VmzZop2fHjx32fWBip7k6LVH+v8/LytHmTJk2UrKysTMk87bT6w/fkxy677DIvZ6eaOnWqkmVkZCiZp/f74F4dDj7//HMlu/TSS12fHxmp/5yc7mfwwIED7icWon76vlqnpjcv+NPtxzMzM6Vp06ayfft26datmxQVFcnf/vY3efXVV+Wmm24Ske/foLZt21a2bNnicWEACCZ6DdPQaZiIXsM0dBqmodMwEb2GLXx65mBRUZGIiDRs2FBERLZv3y7ffvutpKamVo5p06aNJCYmSlZWlvY1ysrKpLi4+LwDCCZ6DdPQaZiIXsM0dBqm8UenReg1Qgv3apiqyouDFRUVMn78eLnhhhukXbt2IiKSn58vUVFREhcXd97Y+Ph4yc/P175ORkaGxMbGVh4tW7as6pQAn9FrmIZOw0T0Gqah0zCNvzotQq8ROrhXw2RVXhxMT0+X3bt3y7Jly3yawKRJk6SoqKjyyM3N9en1AF/Qa5iGTsNE9BqmodMwjb86LUKvETq4V8NkXj1z8AdjxoyRNWvWyObNm8/bICAhIUHKy8ulsLDwvJXzgoICj5txREdHS3R0dFWmERLi4+O1+RVXXKFk8+bN045t06aNX+ckIpKdna3NZ82apWSrV6/WjvXmIeUmoNf/U6NGDW0+evRoJRswYIB2rO7j8a1bt/ZpXp988ok237Bhg5I99thjPn0tE9BpO+gelO/pIcsmoNf/06FDB23+4/+16Qee/kwvLy9Xsvnz52vHsvNiYNDp4NFtsgPf+bPTIsHvtadPf+k2JNHNU7fJnie6DXFERDZv3qxkq1at0o798ssvlYzNR3zHvTq07NmzR8m8uafbttbhhld/e3AcR8aMGSMrV66U9evXS1JS0nm/npKSIrVq1ZJ169ZVZjk5OXL48GHp0qWLf2YM+Bm9hmnoNExEr2EaOg3T0GmYiF7DFl59cjA9PV1effVVWb16tTRo0KDyX1FiY2OlTp06EhsbK/fdd5889NBD0rBhQ4mJiZGxY8dKly5d2KUHIYtewzR0Giai1zANnYZp6DRMRK9hC68WB59//nkREenRo8d5+eLFi2X48OEiIv
LnP/9ZIiMjZcCAAVJWVia9e/eWBQsW+GWyQCDQa5iGTsNE9BqmodMwDZ2Gieg1bOHV4qDuuUY/Vbt2bZk/f77H59UAoYZewzR0Giai1zANnYZp6DRMRK9hiyptSGKDhg0bKtkLL7ygZJ4eBh6oBxzrNmR49tlnleyDDz7Qnn/mzBm/zwnhIysrS8m2bt2qZB07dnT9mp4etOtpsx6dr7/+Wsl0u4A98MADrl8TsJmnZ9xkZmZW70QQUD9+8PmPXejB/j/11VdfKdnDDz9c1SkBYeVf//qXkuk2dOLB9Xbr1q2bNk9LS1Oya665RsmOHTumPX/RokVKdvLkSe1Y3eZRgM3++te/Ktkdd9wRhJmYw9ztDAEAAAAAAABcEIuDAAAAAAAAgKVYHAQAAAAAAAAsxeIgAAAAAAAAYCkWBwEAAAAAAABLWbVbcefOnZVswoQJ2rGdOnVSsl/84hd+n5OIyOnTp5Vszpw52rFPPfWUkpWWlvp9TjDTkSNHlKx///5Kdv/992vPnzx5sk9ff/bs2dr8+eefV7IvvvjCp68F2CIiIiLYUwCAsLR7924l279/v5Jdcskl2vMvvfRSJTt+/LjvE0NIKSkp0eYvv/yyqwyA/+3du1fJPvvsM+3Ytm3bBno6RuCTgwAAAAAAAIClWBwEAAAAAAAALMXiIAAAAAAAAGApFgcBAAAAAAAAS1m1IUm/fv1cZd7QPQhTRGTNmjVK9t1332nHPvvss0pWWFjo07wAt/Ly8pRs6tSp2rGecgCB995772nzO++8s5pnglCxb98+bf7JJ58oWdeuXQM9HcAIus3/XnrpJe3Y6dOnK9nYsWO1Yz39nQEA4L1Dhw4p2VVXXRWEmZiDTw4CAAAAAAAAlmJxEAAAAAAAALAUi4MAAAAAAACApVgcBAAAAAAAACzF4iAAAAAAAABgqQjHcZxgT+LHiouLJTY2NtjTgOGKiookJiam2r4evUag0WmYpro7LUKvEXjcq0Of7vfn9ddf145NTU1VsjfffFM79t5771Wy0tJSL2cXerhXw0Tcq2EaN53mk4MAAAAAAACApVgcBAAAAAAAACzF4iAAAAAAAABgKRYHAQAAAAAAAEvVDPYEAAAAACAUFBcXK9nAgQO1Y6dPn65ko0aN0o6dOnWqku3du9e7yQEAECB8chAAAAAAAACwFIuDAAAAAAAAgKVYHAQAAAAAAAAsxeIgAAAAAAAAYCkWBwEAAAAAAABLsVsxAAAAAHig28FYRGTs2LGuMgAAQh2fHAQAAAAAAAAsxeIgAAAAAAAAYCkWBwEAAAAAAABLhdzioOM4wZ4CLFDdPaPXCDQ6DdMEo2P0GoHGvRqm4V4NE3GvhmncdCzkFgdLSkqCPQVYoLp7Rq8RaHQapglGx+g1Ao17NUzDvRom4l4N07jpWIQTYsvUFRUVcvToUWnQoIGUlJRIy5YtJTc3V2JiYoI9Nb8pLi7muoLEcRwpKSmR5s2bS2Rk9a2N0+vwFerXRacDJ9R/76sq1K8rWJ0W+V+vHceRxMTEkP0eVVWo/95XVThcF/fqwAmH3/+qCPXr4l4dOKH+e19V4XBd3KsDJxx+/6si1K/Lm07XrKY5uRYZGSktWrQQEZGIiAgREYmJiQnJb7SvuK7giI2NrfavSa/DXyhfF50OLK6r+gWj0yL/63VxcbGIhPb3yBdcV3Bwrw4srqv6ca8OLK4rOLhXBxbXVf3cdjrk/rdiAAAAAAAAANWDxUEAAAAAAADAUiG9OBgdHS1TpkyR6OjoYE/Fr7guu5n6feK67GXq94jrspep3yOuy26mfp+4LnuZ+j3iuuxm6veJ6wp9IbchCQAAAAAAAIDqEdKfHAQAAAAAAAAQOCwOAgAAAAAAAJZicRAAAAAAAACwFIuDAAAAAAAAgKVYHAQAAAAAAAAsFdKLg/Pnz5dWrVpJ7dq1pXPnzvLvf/872FPyyubNm+WOO+6Q5s2bS0REhKxateq8X3ccRx577D
Fp1qyZ1KlTR1JTU2X//v3BmaxLGRkZ0rFjR2nQoIE0bdpU0tLSJCcn57wxZ8+elfT0dGnUqJHUr19fBgwYIAUFBUGacWih06GJXldduHdaxMxe02nfhHuvTey0CL32Rbh3WsTMXtNp34R7r03stAi99kW4d1rEzF7b0umQXRxcvny5PPTQQzJlyhTZsWOHJCcnS+/eveXYsWPBnpprpaWlkpycLPPnz9f++syZM2XOnDmycOFCyc7Olnr16knv3r3l7Nmz1TxT9zZt2iTp6emyZcsWWbt2rXz77bfSq1cvKS0trRzz4IMPyttvvy1vvPGGbNq0SY4ePSr9+/cP4qxDA50OXfS6akzotIiZvabTVWdCr03stAi9rioTOi1iZq/pdNWZ0GsTOy1Cr6vKhE6LmNlrazrthKhOnTo56enplf997tw5p3nz5k5GRkYQZ1V1IuKsXLmy8r8rKiqchIQEZ9asWZVZYWGhEx0d7bz22mtBmGHVHDt2zBERZ9OmTY7jfH8NtWrVct54443KMZ999pkjIk5WVlawphkS6HT4oNfumNZpxzG313TaPdN6bWqnHYdeu2Vapx3H3F7TafdM67WpnXYceu2WaZ12HHN7bWqnQ/KTg+Xl5bJ9+3ZJTU2tzCIjIyU1NVWysrKCODP/OXjwoOTn5593jbGxsdK5c+ewusaioiIREWnYsKGIiGzfvl2+/fbb866rTZs2kpiYGFbX5W90OryukV7/PBs6LWJOr+m0Ozb02pROi9BrN2zotIg5vabT7tjQa1M6LUKv3bCh0yLm9NrUTofk4uCJEyfk3LlzEh8ff14eHx8v+fn5QZqVf/1wHeF8jRUVFTJ+/Hi54YYbpF27diLy/XVFRUVJXFzceWPD6boCgU6HzzXSa3ds6LSIGb2m0+7Z0GsTOi1Cr92yodMiZvSaTrtnQ69N6LQIvXbLhk6LmNFrkztdM9gTQPhKT0+X3bt3y0cffRTsqQB+Q69hGjoNE9FrmIZOw0T0GqYxudMh+cnBxo0bS40aNZTdXQoKCiQhISFIs/KvH64jXK9xzJgxsmbNGtmwYYO0aNGiMk9ISJDy8nIpLCw8b3y4XFeg0OnwuEZ67Z4NnRYJ/17Tae/Y0Otw77QIvfaGDZ0WCf9e02nv2NDrcO+0CL32hg2dFgn/Xpve6ZBcHIyKipKUlBRZt25dZVZRUSHr1q2TLl26BHFm/pOUlCQJCQnnXWNxcbFkZ2eH9DU6jiNjxoyRlStXyvr16yUpKem8X09JSZFatWqdd105OTly+PDhkL6uQKPToX2N9Np7NnRaJHx7TaerxoZeh2unReh1VdjQaZHw7TWdrhobeh2unRah11VhQ6dFwrfX1nQ6iJuhXNCyZcuc6OhoJzMz09m7d68zcuRIJy4uzsnPzw/21FwrKSlxdu7c6ezcudMREee5555zdu7c6Rw6dMhxHMeZMWOGExcX56xevdr5z3/+4/Tt29dJSkpyzpw5E+SZezZq1CgnNjbW2bhxo5OXl1d5nD59unLM73//eycxMdFZv369s23bNqdLly5Oly5dgjjr0ECnQxe9rhoTOu04ZvaaTledCb02sdOOQ6+ryoROO46ZvabTVWdCr03stOPQ66oyodOOY2avbel0yC4OOo7jzJ0710lMTHSioqKcTp06OVu2bAn2lLyyYcMGR0SUY9iwYY7jfL+V96OPPurEx8c70dHRTs+ePZ2cnJzgTvpn6K5HRJzFixdXjjlz5owzevRo56KLLnLq1q3r9OvXz8nLywvepEMInQ5N9Lrqwr3TjmNmr+m0b8K91yZ22nHotS/CvdOOY2av6bRvwr3XJnbacei1L8K9045jZq9t6XSE4ziOd581BAAAAAAAAGCCkHzmIAAAAAAAAIDAY3EQAAAAAAAAsBSLgwAAAAAAAIClWBwEAAAAAAAALMXiIAAAAAAAAGApFgcBAAAAAAAAS7E4CAAAAAAAAFiKxUEAAAAAAADAUiwOAg
AAAAAAAJZicRAAAAAAAACwFIuDAAAAAAAAgKX+H9kDRDMQW71hAAAAAElFTkSuQmCC\n",
"text/plain": [
"<Figure size 1600x400 with 7 Axes>"
]
},
"metadata": {},
"output_type": "display_data"
},
{
"name": "stdout",
"output_type": "stream",
"text": [
" 5 0 4 1 9 2 1\n"
]
}
],
"source": [
"draw_examples(x_train[:7], captions=y_train)"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"outputs": [],
"source": [
"batch_size = 128\n",
"num_classes = 10\n",
"epochs = 12\n",
"\n",
"# input image dimensions\n",
"img_rows, img_cols = 28, 28"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {
"slideshow": {
"slide_type": "notes"
}
},
"outputs": [],
"source": [
"if keras.backend.image_data_format() == 'channels_first':\n",
" x_train = x_train.reshape(x_train.shape[0], 1, img_rows, img_cols)\n",
" x_test = x_test.reshape(x_test.shape[0], 1, img_rows, img_cols)\n",
" input_shape = (1, img_rows, img_cols)\n",
"else:\n",
" x_train = x_train.reshape(x_train.shape[0], img_rows, img_cols, 1)\n",
" x_test = x_test.reshape(x_test.shape[0], img_rows, img_cols, 1)\n",
" input_shape = (img_rows, img_cols, 1)"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"x_train shape: (60000, 28, 28, 1)\n",
"60000 train samples\n",
"10000 test samples\n"
]
}
],
"source": [
"x_train = x_train.astype('float32')\n",
"x_test = x_test.astype('float32')\n",
"x_train /= 255\n",
"x_test /= 255\n",
"print('x_train shape: {}'.format(x_train.shape))\n",
"print('{} train samples'.format(x_train.shape[0]))\n",
"print('{} test samples'.format(x_test.shape[0]))\n",
"\n",
"# convert class vectors to binary class matrices\n",
"y_train = keras.utils.to_categorical(y_train, num_classes)\n",
"y_test = keras.utils.to_categorical(y_test, num_classes)"
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"2023-01-27 12:51:13.294000: W tensorflow/compiler/xla/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libcuda.so.1'; dlerror: libcuda.so.1: cannot open shared object file: No such file or directory\n",
"2023-01-27 12:51:13.295301: W tensorflow/compiler/xla/stream_executor/cuda/cuda_driver.cc:265] failed call to cuInit: UNKNOWN ERROR (303)\n",
"2023-01-27 12:51:13.295539: I tensorflow/compiler/xla/stream_executor/cuda/cuda_diagnostics.cc:156] kernel driver does not appear to be running on this host (ELLIOT): /proc/driver/nvidia/version does not exist\n",
"2023-01-27 12:51:13.298310: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX2 FMA\n",
"To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.\n"
]
}
],
"source": [
"model = Sequential()\n",
"model.add(Conv2D(32, kernel_size=(3, 3),\n",
" activation='relu',\n",
" input_shape=input_shape))\n",
"model.add(Conv2D(64, (3, 3), activation='relu'))\n",
"model.add(MaxPooling2D(pool_size=(2, 2)))\n",
"model.add(Dropout(0.25))\n",
"model.add(Flatten())\n",
"model.add(Dense(128, activation='relu'))\n",
"model.add(Dropout(0.5))\n",
"model.add(Dense(num_classes, activation='softmax'))"
]
},
{
"cell_type": "code",
"execution_count": 11,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Model: \"sequential\"\n",
"_________________________________________________________________\n",
" Layer (type) Output Shape Param # \n",
"=================================================================\n",
" conv2d (Conv2D) (None, 26, 26, 32) 320 \n",
" \n",
" conv2d_1 (Conv2D) (None, 24, 24, 64) 18496 \n",
" \n",
" max_pooling2d (MaxPooling2D (None, 12, 12, 64) 0 \n",
" ) \n",
" \n",
" dropout (Dropout) (None, 12, 12, 64) 0 \n",
" \n",
" flatten (Flatten) (None, 9216) 0 \n",
" \n",
" dense (Dense) (None, 128) 1179776 \n",
" \n",
" dropout_1 (Dropout) (None, 128) 0 \n",
" \n",
" dense_1 (Dense) (None, 10) 1290 \n",
" \n",
"=================================================================\n",
"Total params: 1,199,882\n",
"Trainable params: 1,199,882\n",
"Non-trainable params: 0\n",
"_________________________________________________________________\n"
]
}
],
"source": [
"model.summary()"
]
},
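{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"The parameter counts in the summary can be verified by hand: each filter has kernel height x kernel width x input channels weights plus one bias, and each dense unit has one weight per input plus a bias."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"outputs": [],
"source": [
"conv1_params = (3 * 3 * 1 + 1) * 32      # 32 filters over 1 input channel\n",
"conv2_params = (3 * 3 * 32 + 1) * 64     # 64 filters over 32 input channels\n",
"dense_params = (12 * 12 * 64 + 1) * 128  # 9216 flattened inputs -> 128 units\n",
"out_params = (128 + 1) * 10              # 128 units -> 10 classes\n",
"\n",
"conv1_params + conv2_params + dense_params + out_params"
]
},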
{
"cell_type": "code",
"execution_count": 1,
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"outputs": [
{
"ename": "NameError",
"evalue": "name 'model' is not defined",
"output_type": "error",
"traceback": [
"\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
"\u001b[0;31mNameError\u001b[0m Traceback (most recent call last)",
"Cell \u001b[0;32mIn [1], line 1\u001b[0m\n\u001b[0;32m----> 1\u001b[0m model\u001b[38;5;241m.\u001b[39mcompile(loss\u001b[38;5;241m=\u001b[39mkeras\u001b[38;5;241m.\u001b[39mlosses\u001b[38;5;241m.\u001b[39mcategorical_crossentropy,\n\u001b[1;32m 2\u001b[0m optimizer\u001b[38;5;241m=\u001b[39mkeras\u001b[38;5;241m.\u001b[39moptimizers\u001b[38;5;241m.\u001b[39mAdadelta(),\n\u001b[1;32m 3\u001b[0m metrics\u001b[38;5;241m=\u001b[39m[\u001b[38;5;124m'\u001b[39m\u001b[38;5;124maccuracy\u001b[39m\u001b[38;5;124m'\u001b[39m])\n",
"\u001b[0;31mNameError\u001b[0m: name 'model' is not defined"
]
}
],
"source": [
"model.compile(loss=keras.losses.categorical_crossentropy,\n",
" optimizer=keras.optimizers.Adadelta(),\n",
" metrics=['accuracy'])"
]
},
{
"cell_type": "code",
"execution_count": 32,
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Train on 60000 samples, validate on 10000 samples\n",
"Epoch 1/12\n",
"60000/60000 [==============================] - 333s - loss: 0.3256 - acc: 0.9037 - val_loss: 0.0721 - val_acc: 0.9780\n",
"Epoch 2/12\n",
"60000/60000 [==============================] - 342s - loss: 0.1088 - acc: 0.9683 - val_loss: 0.0501 - val_acc: 0.9835\n",
"Epoch 3/12\n",
"60000/60000 [==============================] - 366s - loss: 0.0837 - acc: 0.9748 - val_loss: 0.0429 - val_acc: 0.9860\n",
"Epoch 4/12\n",
"60000/60000 [==============================] - 311s - loss: 0.0694 - acc: 0.9788 - val_loss: 0.0380 - val_acc: 0.9878\n",
"Epoch 5/12\n",
"60000/60000 [==============================] - 325s - loss: 0.0626 - acc: 0.9815 - val_loss: 0.0334 - val_acc: 0.9886\n",
"Epoch 6/12\n",
"60000/60000 [==============================] - 262s - loss: 0.0552 - acc: 0.9835 - val_loss: 0.0331 - val_acc: 0.9890\n",
"Epoch 7/12\n",
"60000/60000 [==============================] - 218s - loss: 0.0494 - acc: 0.9852 - val_loss: 0.0291 - val_acc: 0.9903\n",
"Epoch 8/12\n",
"60000/60000 [==============================] - 218s - loss: 0.0461 - acc: 0.9859 - val_loss: 0.0294 - val_acc: 0.9902\n",
"Epoch 9/12\n",
"60000/60000 [==============================] - 219s - loss: 0.0423 - acc: 0.9869 - val_loss: 0.0287 - val_acc: 0.9907\n",
"Epoch 10/12\n",
"60000/60000 [==============================] - 218s - loss: 0.0418 - acc: 0.9875 - val_loss: 0.0299 - val_acc: 0.9906\n",
"Epoch 11/12\n",
"60000/60000 [==============================] - 218s - loss: 0.0388 - acc: 0.9879 - val_loss: 0.0304 - val_acc: 0.9905\n",
"Epoch 12/12\n",
"60000/60000 [==============================] - 218s - loss: 0.0366 - acc: 0.9889 - val_loss: 0.0275 - val_acc: 0.9910\n"
]
},
{
"data": {
"text/plain": [
"<keras.callbacks.History at 0x7f70b80b1a10>"
]
},
"execution_count": 32,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"model.fit(x_train, y_train,\n",
" batch_size=batch_size,\n",
" epochs=epochs,\n",
" verbose=1,\n",
" validation_data=(x_test, y_test))"
]
},
{
"cell_type": "code",
"execution_count": 33,
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"('Test loss:', 0.027530849870144449)\n",
"('Test accuracy:', 0.99099999999999999)\n"
]
}
],
"source": [
"score = model.evaluate(x_test, y_test, verbose=0)\n",
"print('Test loss:', score[0])\n",
"print('Test accuracy:', score[1])"
]
}
],
"metadata": {
"author": "Paweł Skórzewski",
"celltoolbar": "Slideshow",
"email": "pawel.skorzewski@amu.edu.pl",
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"lang": "pl",
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.6"
},
"livereveal": {
"start_slideshow_at": "selected",
"theme": "white"
},
"subtitle": "12.Splotowe sieci neuronowe[wykład]",
"title": "Uczenie maszynowe",
"vscode": {
"interpreter": {
"hash": "31f2aee4e71d21fbe5cf8b01ff0e069b9275f58929596ceb00d14d90e3e16cd6"
}
},
"year": "2021"
},
"nbformat": 4,
"nbformat_minor": 4
}