{
"cells": [
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Machine Learning – Applications\n",
"# 9. Neural Networks – Introduction"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {
"slideshow": {
"slide_type": "notes"
}
},
"outputs": [],
"source": [
"# Useful imports\n",
"\n",
"import matplotlib\n",
"import matplotlib.pyplot as plt\n",
"import numpy as np\n",
"\n",
"%matplotlib inline"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## 9.1. Perceptron"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"https://www.youtube.com/watch?v=cNxadbrN_aI"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"<img style=\"margin: auto\" height=\"100%\" src=\"http://m.natemat.pl/b94a41cd7322e1b8793e4644e5f82683,641,0,0,0.png\" alt=\"Frank Rosenblatt\"/>"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"<img style=\"margin: auto\" src=\"http://m.natemat.pl/02943a7dc0f638d786b78cd5c9e75742,641,0,0,0.png\" height=\"100%\" alt=\"Frank Rosenblatt\"/>"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"<img style=\"margin: auto\" height=\"100%\" src=\"https://upload.wikimedia.org/wikipedia/en/5/52/Mark_I_perceptron.jpeg\" alt=\"perceptron\"/>"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"### The first linear perceptron\n",
"\n",
"* Frank Rosenblatt, 1957\n",
"* a camera connected to 400 photocells (image resolution: 20 x 20)\n",
"* weights – potentiometers updated by small electric motors"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"### Training the perceptron\n",
"\n",
"The training cycle of Rosenblatt's perceptron:\n",
"\n",
"1. Photograph the board with the next object.\n",
"1. Observe which lamp lights up at the output.\n",
"1. Check whether it is the correct lamp.\n",
"1. Send a “reward” or “punishment” signal."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"### Activation function\n",
"\n",
"The bipolar function:\n",
"\n",
"$$ g(z) = \\left\\{ \n",
"\\begin{array}{rl}\n",
"1 & \\textrm{if $z > \\theta_0$} \\\\\n",
"-1 & \\textrm{otherwise}\n",
"\\end{array}\n",
"\\right. $$\n",
"\n",
"where $z = \\theta_0x_0 + \\ldots + \\theta_nx_n$,<br/>\n",
"$\\theta_0$ is the activation threshold,<br/>\n",
"$x_0 = 1$ (see the code sketch below). "
]
},
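{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "notes"
}
},
"source": [
"A minimal sketch (not part of the original lecture code) of the bipolar activation defined above; the helper name `bipolar` and the example values are illustrative assumptions."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"slideshow": {
"slide_type": "notes"
}
},
"outputs": [],
"source": [
"# A sketch of the bipolar activation g(z); `bipolar` is an assumed helper name.\n",
"\n",
"def bipolar(z, threshold=0.0):\n",
"    \"\"\"Return 1 if z exceeds the threshold, otherwise -1.\"\"\"\n",
"    return 1 if z > threshold else -1\n",
"\n",
"print(bipolar(0.5), bipolar(-0.5))  # -> 1 -1"
]
},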
{
"cell_type": "code",
"execution_count": 3,
"metadata": {
"slideshow": {
"slide_type": "notes"
}
},
"outputs": [],
"source": [
"def bipolar_plot():\n",
" matplotlib.rcParams.update({'font.size': 16})\n",
"\n",
" plt.figure(figsize=(8,5))\n",
" x = [-1,-.23,1] \n",
" y = [-1, -1, 1]\n",
" plt.ylim(-1.2,1.2)\n",
" plt.xlim(-1.2,1.2)\n",
" plt.plot([-2,2],[1,1], color='black', ls=\"dashed\")\n",
" plt.plot([-2,2],[-1,-1], color='black', ls=\"dashed\")\n",
" plt.step(x, y, lw=3)\n",
" ax = plt.gca()\n",
" ax.spines['right'].set_color('none')\n",
" ax.spines['top'].set_color('none')\n",
" ax.xaxis.set_ticks_position('bottom')\n",
" ax.spines['bottom'].set_position(('data',0))\n",
" ax.yaxis.set_ticks_position('left')\n",
" ax.spines['left'].set_position(('data',0))\n",
"\n",
" plt.annotate(r'$\\theta_0$',\n",
" xy=(-.23,0), xycoords='data',\n",
" xytext=(-50, +50), textcoords='offset points', fontsize=26,\n",
" arrowprops=dict(arrowstyle=\"->\"))\n",
"\n",
" plt.show()"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"outputs": [
{
"data": {
"text/plain": [
"<Figure size 576x360 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"bipolar_plot()"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"### Perceptron – diagram\n",
"\n",
"<img src=\"perceptron.png\" width=\"60%\"/>"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"#### Perceptron – how it works\n",
"\n",
"1. Set the initial values of $\\theta$ (the zero vector or small random numbers close to 0).\n",
"1. For each example $(x^{(i)}, y^{(i)})$, for $i=1,\\ldots,m$:\n",
"  * Compute the output value $o^{(i)} = g(\\theta^{T}x^{(i)}) = g(\\sum_{j=0}^{n} \\theta_jx_j^{(i)})$\n",
"  * Update the weights (the so-called *perceptron rule*; see the code sketch below):\n",
"  $$ \\theta := \\theta + \\Delta \\theta $$\n",
"  $$ \\Delta \\theta = \\alpha(y^{(i)}-o^{(i)})x^{(i)} $$"
]
},
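{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "notes"
}
},
"source": [
"A minimal sketch of the perceptron rule above, assuming bipolar labels $y \\in \\{-1, 1\\}$ and a bias column $x_0 = 1$; the helper `perceptron_train` and the AND example are illustrative assumptions, not the lecture's original code."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"slideshow": {
"slide_type": "notes"
}
},
"outputs": [],
"source": [
"# A sketch of the perceptron rule (illustrative assumptions, not lecture code).\n",
"\n",
"def perceptron_train(X, y, alpha=0.1, epochs=10):\n",
"    \"\"\"Train weights theta with the perceptron rule; X has a leading column of ones.\"\"\"\n",
"    theta = np.zeros(X.shape[1])\n",
"    for _ in range(epochs):\n",
"        for x_i, y_i in zip(X, y):\n",
"            o_i = 1 if theta @ x_i > 0 else -1  # bipolar activation\n",
"            theta += alpha * (y_i - o_i) * x_i  # perceptron rule\n",
"    return theta\n",
"\n",
"# Example: the AND function with inputs and labels in bipolar form\n",
"X_and = np.array([[1, -1, -1], [1, -1, 1], [1, 1, -1], [1, 1, 1]], dtype=float)\n",
"y_and = np.array([-1, -1, -1, 1], dtype=float)\n",
"print(perceptron_train(X_and, y_and))"
]
},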
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"$$\\theta_j := \\theta_j + \\Delta \\theta_j $$\n",
"\n",
"If the example was classified **correctly**:\n",
"\n",
"* $y^{(i)}=1$ and $o^{(i)}=1$ : $$\\Delta\\theta_j = \\alpha(1 - 1)x_j^{(i)} = 0$$\n",
"* $y^{(i)}=-1$ and $o^{(i)}=-1$ : $$\\Delta\\theta_j = \\alpha(-1 - (-1))x_j^{(i)} = 0$$"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"In other words: if the prediction was right, change nothing."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"$$\\theta_j := \\theta_j + \\Delta \\theta_j $$\n",
"\n",
"If the example was classified **incorrectly**:\n",
"\n",
"* $y^{(i)}=1$ and $o^{(i)}=-1$ : $$\\Delta\\theta_j = \\alpha(1 - (-1))x_j^{(i)} = 2 \\alpha x_j^{(i)}$$\n",
"* $y^{(i)}=-1$ and $o^{(i)}=1$ : $$\\Delta\\theta_j = \\alpha(-1 - 1)x_j^{(i)} = -2 \\alpha x_j^{(i)}$$"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"In other words: shift the weights in the appropriate direction."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"### Perceptron – advantages\n",
"\n",
"* intuitive and simple\n",
"* easy to implement\n",
"* if the data are linearly separable, the algorithm converges in finite time"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"### Perceptron – disadvantages\n",
"\n",
"* if the data are not linearly separable, the algorithm does not converge"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {
"slideshow": {
"slide_type": "notes"
}
},
"outputs": [],
"source": [
"def plot_perceptron():\n",
" plt.figure(figsize=(12,3))\n",
"\n",
" plt.subplot(131)\n",
" plt.ylim(-0.2,1.2)\n",
" plt.xlim(-0.2,1.2)\n",
"\n",
" plt.title('AND')\n",
" plt.plot([1,0,0], [0,1,0], 'ro', markersize=10)\n",
" plt.plot([1], [1], 'go', markersize=10)\n",
"\n",
" ax = plt.gca()\n",
" ax.spines['right'].set_color('none')\n",
" ax.spines['top'].set_color('none')\n",
" ax.xaxis.set_ticks_position('none')\n",
" ax.spines['bottom'].set_position(('data',0))\n",
" ax.yaxis.set_ticks_position('none')\n",
" ax.spines['left'].set_position(('data',0))\n",
"\n",
" plt.xticks(np.arange(0, 2, 1.0))\n",
" plt.yticks(np.arange(0, 2, 1.0))\n",
"\n",
"\n",
" plt.subplot(132)\n",
" plt.ylim(-0.2,1.2)\n",
" plt.xlim(-0.2,1.2)\n",
"\n",
" plt.plot([1,0,1], [0,1,1], 'go', markersize=10)\n",
" plt.plot([0], [0], 'ro', markersize=10)\n",
"\n",
" ax = plt.gca()\n",
" ax.spines['right'].set_color('none')\n",
" ax.spines['top'].set_color('none')\n",
" ax.xaxis.set_ticks_position('none')\n",
" ax.spines['bottom'].set_position(('data',0))\n",
" ax.yaxis.set_ticks_position('none')\n",
" ax.spines['left'].set_position(('data',0))\n",
"\n",
" plt.title('OR')\n",
" plt.xticks(np.arange(0, 2, 1.0))\n",
" plt.yticks(np.arange(0, 2, 1.0))\n",
"\n",
"\n",
" plt.subplot(133)\n",
" plt.ylim(-0.2,1.2)\n",
" plt.xlim(-0.2,1.2)\n",
"\n",
" plt.title('XOR')\n",
" plt.plot([1,0], [0,1], 'go', markersize=10)\n",
" plt.plot([0,1], [0,1], 'ro', markersize=10)\n",
"\n",
" ax = plt.gca()\n",
" ax.spines['right'].set_color('none')\n",
" ax.spines['top'].set_color('none')\n",
" ax.xaxis.set_ticks_position('none')\n",
" ax.spines['bottom'].set_position(('data',0))\n",
" ax.yaxis.set_ticks_position('none')\n",
" ax.spines['left'].set_position(('data',0))\n",
"\n",
" plt.xticks(np.arange(0, 2, 1.0))\n",
" plt.yticks(np.arange(0, 2, 1.0))\n",
"\n",
" plt.show()"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"outputs": [
{
"data": {
"text/plain": [
"<Figure size 864x216 with 3 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"plot_perceptron()"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"### Activation functions\n",
"\n",
"Instead of the bipolar function, we can use a sigmoid function as the activation function."
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {
"slideshow": {
"slide_type": "notes"
}
},
"outputs": [],
"source": [
"def plot_activation_functions():\n",
" plt.figure(figsize=(16,7))\n",
" plt.subplot(121)\n",
" x = [-2,-.23,2] \n",
" y = [-1, -1, 1]\n",
" plt.ylim(-1.2,1.2)\n",
" plt.xlim(-2.2,2.2)\n",
" plt.plot([-2,2],[1,1], color='black', ls=\"dashed\")\n",
" plt.plot([-2,2],[-1,-1], color='black', ls=\"dashed\")\n",
" plt.step(x, y, lw=3)\n",
" ax = plt.gca()\n",
" ax.spines['right'].set_color('none')\n",
" ax.spines['top'].set_color('none')\n",
" ax.xaxis.set_ticks_position('bottom')\n",
" ax.spines['bottom'].set_position(('data',0))\n",
" ax.yaxis.set_ticks_position('left')\n",
" ax.spines['left'].set_position(('data',0))\n",
"\n",
" plt.annotate(r'$\\theta_0$',\n",
" xy=(-.23,0), xycoords='data',\n",
" xytext=(-50, +50), textcoords='offset points', fontsize=26,\n",
" arrowprops=dict(arrowstyle=\"->\"))\n",
"\n",
" plt.subplot(122)\n",
" x2 = np.linspace(-2,2,100)\n",
" y2 = np.tanh(x2+ 0.23)\n",
" plt.ylim(-1.2,1.2)\n",
" plt.xlim(-2.2,2.2)\n",
" plt.plot([-2,2],[1,1], color='black', ls=\"dashed\")\n",
" plt.plot([-2,2],[-1,-1], color='black', ls=\"dashed\")\n",
" plt.plot(x2, y2, lw=3)\n",
" ax = plt.gca()\n",
" ax.spines['right'].set_color('none')\n",
" ax.spines['top'].set_color('none')\n",
" ax.xaxis.set_ticks_position('bottom')\n",
" ax.spines['bottom'].set_position(('data',0))\n",
" ax.yaxis.set_ticks_position('left')\n",
" ax.spines['left'].set_position(('data',0))\n",
"\n",
" plt.annotate(r'$\\theta_0$',\n",
" xy=(-.23,0), xycoords='data',\n",
" xytext=(-50, +50), textcoords='offset points', fontsize=26,\n",
" arrowprops=dict(arrowstyle=\"->\"))\n",
"\n",
" plt.show()"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"outputs": [
{
"data": {
"text/plain": [
"<Figure size 1152x504 with 2 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"plot_activation_functions()"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"### Perceptron vs. linear regression"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"<img src=\"reglin.png\" width=\"70%\"/>"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"### Training linear regression:\n",
"* Model: $$h_{\\theta}(x) = \\sum_{i=0}^n \\theta_ix_i$$\n",
"* Cost function (mean squared error): $$J(\\theta) = \\frac{1}{m} \\sum_{i=1}^{m} (h_{\\theta}(x^{(i)}) - y^{(i)})^2$$\n",
"* After computing $\\nabla J(\\theta)$: plain SGD (see the code sketch below)."
]
},
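{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "notes"
}
},
"source": [
"A minimal SGD sketch for the linear-regression cost above; the learning rate, epoch count, synthetic data and the helper name `sgd_linear_regression` are illustrative assumptions, not the lecture's original code."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"slideshow": {
"slide_type": "notes"
}
},
"outputs": [],
"source": [
"# A sketch of plain SGD for linear regression with the MSE cost above.\n",
"\n",
"def sgd_linear_regression(X, y, alpha=0.01, epochs=100):\n",
"    \"\"\"Per-example updates with the gradient of the squared-error term.\"\"\"\n",
"    theta = np.zeros(X.shape[1])\n",
"    for _ in range(epochs):\n",
"        for x_i, y_i in zip(X, y):\n",
"            error = x_i @ theta - y_i            # h_theta(x) - y\n",
"            theta -= alpha * 2 * error * x_i     # d/dtheta of (h_theta(x) - y)^2\n",
"    return theta\n",
"\n",
"# Example on synthetic data: y ~ 1 + 2*x plus a little noise\n",
"rng = np.random.default_rng(0)\n",
"x1 = rng.uniform(-1, 1, size=50)\n",
"X_lin = np.column_stack([np.ones_like(x1), x1])   # x_0 = 1\n",
"y_lin = 1.0 + 2.0 * x1 + 0.1 * rng.standard_normal(50)\n",
"print(sgd_linear_regression(X_lin, y_lin))        # roughly [1, 2]"
]
},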
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"### Perceptron vs. two-class logistic regression"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<img src=\"reglog.png\" width=\"60%\"/>"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"### Training two-class logistic regression:\n",
"* Model: $h_{\\theta}(x) = \\sigma(\\sum_{i=0}^n \\theta_ix_i) = P(1|x,\\theta)$\n",
"* Cost function (cross-entropy): $$\\begin{eqnarray} J(\\theta) &=& -\\frac{1}{m} \\sum_{i=1}^{m} \\big( y^{(i)}\\log P(1|x^{(i)},\\theta) \\\\ && + (1-y^{(i)})\\log(1-P(1|x^{(i)},\\theta)) \\big) \\end{eqnarray}$$\n",
"* After computing $\\nabla J(\\theta)$: plain SGD (see the code sketch below)."
]
},
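{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "notes"
}
},
"source": [
"A minimal SGD sketch for the binary cross-entropy cost above, assuming labels $y \\in \\{0, 1\\}$; the `sigmoid` helper and the hyperparameters are illustrative assumptions, not the lecture's original code."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"slideshow": {
"slide_type": "notes"
}
},
"outputs": [],
"source": [
"# A sketch of SGD for binary logistic regression (labels y in {0, 1}).\n",
"\n",
"def sigmoid(z):\n",
"    return 1.0 / (1.0 + np.exp(-z))\n",
"\n",
"def sgd_logistic_regression(X, y, alpha=0.1, epochs=200):\n",
"    \"\"\"Per-example updates with the cross-entropy gradient (sigma(theta^T x) - y) * x.\"\"\"\n",
"    theta = np.zeros(X.shape[1])\n",
"    for _ in range(epochs):\n",
"        for x_i, y_i in zip(X, y):\n",
"            theta -= alpha * (sigmoid(x_i @ theta) - y_i) * x_i\n",
"    return theta"
]
},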
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"### Perceptron vs. multi-class logistic regression"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"<img src=\"multireglog.png\" width=\"40%\"/>"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"### Multi-class logistic regression\n",
"* Model (for $c$ binary classifiers): \n",
"$$\\begin{eqnarray}\n",
"h_{(\\theta^{(1)},\\dots,\\theta^{(c)})}(x) &=& \\mathrm{softmax}(\\sum_{i=0}^n \\theta_{i}^{(1)}x_i, \\ldots, \\sum_{i=0}^n \\theta_i^{(c)}x_i) \\\\ \n",
"&=& \\left[ P(k|x,\\theta^{(1)},\\dots,\\theta^{(c)}) \\right]_{k=1,\\dots,c} \n",
"\\end{eqnarray}$$"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"* Cost function (**assuming the binary regression model**): $$\\begin{eqnarray} J(\\theta^{(k)}) &=& -\\frac{1}{m} \\sum_{i=1}^{m} \\big( y^{(i)}\\log P(k|x^{(i)},\\theta^{(k)}) \\\\ && + (1-y^{(i)})\\log P(\\neg k|x^{(i)},\\theta^{(k)}) \\big) \\end{eqnarray}$$\n",
"* After computing $\\nabla J(\\theta)$, run SGD **$c$ times** and apply $\\mathrm{softmax}(X)$ to the independently obtained binary classifiers."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"* Let us write: \n",
"$$ \\Theta = (\\theta^{(1)},\\dots,\\theta^{(c)}) $$\n",
"\n",
"$$h_{\\Theta}(x) = \\left[ P(k|x,\\Theta) \\right]_{k=1,\\dots,c}$$\n",
"\n",
"$$\\delta(x,y) = \\left\\{\\begin{array}{cl} 1 & \\textrm{if } x=y \\\\ 0 & \\textrm{otherwise}\\end{array}\\right.$$"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"* Multi-class cost function $J(\\Theta)$ (categorical cross-entropy):\n",
"$$ J(\\Theta) = -\\frac{1}{m}\\sum_{i=1}^{m}\\sum_{k=1}^{c} \\delta({y^{(i)},k}) \\log P(k|x^{(i)},\\Theta) $$"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"* Gradient $\\nabla J(\\Theta)$:\n",
"$$ \\dfrac{\\partial J(\\Theta)}{\\partial \\Theta_{j,k}} = -\\frac{1}{m}\\sum_{i = 1}^{m} (\\delta({y^{(i)},k}) - P(k|x^{(i)}, \\Theta)) x^{(i)}_j \n",
"$$\n",
"\n",
"* We compute all the weights with a single run of SGD (see the code sketch below)."
]
},
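{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "notes"
}
},
"source": [
"A minimal sketch of softmax regression trained with SGD on the categorical cross-entropy above; the one-hot encoding, hyperparameters and helper names (`softmax`, `sgd_softmax_regression`) are illustrative assumptions, not the lecture's original code."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"slideshow": {
"slide_type": "notes"
}
},
"outputs": [],
"source": [
"# A sketch of softmax regression with the categorical cross-entropy gradient above.\n",
"\n",
"def softmax(z):\n",
"    z = z - np.max(z)            # subtract the maximum for numerical stability\n",
"    e = np.exp(z)\n",
"    return e / e.sum()\n",
"\n",
"def sgd_softmax_regression(X, y, c, alpha=0.1, epochs=100):\n",
"    \"\"\"Theta has shape (n_features, c); y holds integer class indices 0..c-1.\"\"\"\n",
"    Theta = np.zeros((X.shape[1], c))\n",
"    for _ in range(epochs):\n",
"        for x_i, y_i in zip(X, y):\n",
"            p = softmax(x_i @ Theta)   # predicted class probabilities\n",
"            delta = np.zeros(c)\n",
"            delta[y_i] = 1.0           # one-hot target, i.e. delta(y, k) above\n",
"            # per-example gradient of the cross-entropy: (p - delta) * x_j\n",
"            Theta -= alpha * np.outer(x_i, p - delta)\n",
"    return Theta"
]
},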
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"## Summary\n",
"\n",
"* For a single-layer neural network it is enough to know the gradient of the cost function.\n",
"* Then the computation is the same as for linear regression, logistic regression, multi-class logistic regression, etc. (the models listed above are special cases of single-layer neural networks).\n",
"* Linear regression and binary logistic regression are a single neuron.\n",
"* Multi-class logistic regression uses as many neurons as there are classes."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"The activation function and the cost function are **chosen to match the problem**."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## 9.2. Multi-layer neural networks\n",
"\n",
"also known as _Artificial Neural Networks_ (ANN) or _Multi-Layer Perceptrons_ (MLP)"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"<img src=\"nn1.png\" width=\"70%\"/>"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"### Network architecture\n",
"\n",
"* A neural network as a graph of neurons. \n",
"* The network is organized into layers.\n",
"* Feedforward, densely connected networks are used most often."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"* An $n$-layer neural network has $n+1$ layers of units (the input layer is not counted in $n$).\n",
"* Network size is described by the number of neurons or the number of parameters."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"### Feedforward neural network\n",
"\n",
"* Given an $n$-layer neural network and its parameters $\\Theta^{(1)}, \\ldots, \\Theta^{(L)} $ and $\\beta^{(1)}, \\ldots, \\beta^{(L)} $, we compute:<br/><br/> \n",
"$$a^{(l)} = g^{(l)}\\left( a^{(l-1)} \\Theta^{(l)} + \\beta^{(l)} \\right). $$"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"<img src=\"nn2.png\" width=70%/>"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"* The functions $g^{(l)}$ are the so-called **activation functions**.<br/>\n",
"For $l = 0$ we take $a^{(0)} = \\mathrm{x}$ (the row vector of features) and $g^{(0)}(x) = x$ (the identity)."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"* The parameters $\\Theta$ are the weights on the connections between the neurons of two consecutive layers.<br/>\n",
"The matrix $\\Theta^{(l)}$, i.e. the weight matrix of the connections between layers $a^{(l-1)}$ and $a^{(l)}$, has size $\\dim(a^{(l-1)}) \\times \\dim(a^{(l)})$."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"* The parameters $\\beta$ replace adding a column of ones to the feature matrix.<br/>The matrix $\\beta^{(l)}$ has size equal to the number of neurons in the corresponding layer, i.e. $1 \\times \\dim(a^{(l)})$."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"* **Classification**: for the last layer $L$ (whose size equals the number of classes) one takes $g^{(L)}(x) = \\mathop{\\mathrm{softmax}}(x)$.\n",
"* **Regression**: a single output neuron, as in the picture. The activation function can then be, e.g., the identity function. (A forward-pass sketch follows below.)"
]
},
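{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "notes"
}
},
"source": [
"A minimal sketch of the forward pass $a^{(l)} = g^{(l)}(a^{(l-1)} \\Theta^{(l)} + \\beta^{(l)})$; the layer sizes, tanh activations, random parameters and the helper name `feed_forward` are illustrative assumptions, not the lecture's original code."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"slideshow": {
"slide_type": "notes"
}
},
"outputs": [],
"source": [
"# A sketch of the forward pass a^(l) = g^(l)(a^(l-1) Theta^(l) + beta^(l)).\n",
"\n",
"def feed_forward(x, Thetas, betas, activations):\n",
"    \"\"\"x is a row vector (1, n); Thetas[l] has shape (dim a^(l-1), dim a^(l)).\"\"\"\n",
"    a = x\n",
"    for Theta, beta, g in zip(Thetas, betas, activations):\n",
"        a = g(a @ Theta + beta)\n",
"    return a\n",
"\n",
"# Example: a 3-layer network with tanh activations and layer sizes 2 -> 4 -> 4 -> 1\n",
"rng = np.random.default_rng(42)\n",
"sizes = [2, 4, 4, 1]\n",
"Thetas = [rng.standard_normal((m, n)) for m, n in zip(sizes[:-1], sizes[1:])]\n",
"betas = [np.zeros((1, n)) for n in sizes[1:]]\n",
"x = np.array([[0.5, -1.0]])\n",
"print(feed_forward(x, Thetas, betas, [np.tanh, np.tanh, np.tanh]))"
]
},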
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"* The remaining activation functions most often have a sigmoid shape, e.g. the logistic sigmoid or the hyperbolic tangent.\n",
"* They can also have other shapes, e.g. ReLU, leaky ReLU, maxout."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"### Training multi-layer neural networks"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"Given the SGD algorithm and the gradients of all the weights, we could train any network."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"* Let:\n",
"$$\\Theta = (\\Theta^{(1)},\\Theta^{(2)},\\Theta^{(3)},\\beta^{(1)},\\beta^{(2)},\\beta^{(3)})$$\n",
"\n",
"* The function computed by the network from the figure:\n",
"\n",
"$$\\small h_\\Theta(x) = \\tanh(\\tanh(\\tanh(x\\Theta^{(1)}+\\beta^{(1)})\\Theta^{(2)} + \\beta^{(2)})\\Theta^{(3)} + \\beta^{(3)})$$\n",
"* Cost function for regression:\n",
"$$J(\\Theta) = \\dfrac{1}{2m} \\sum_{i=1}^{m} (h_\\Theta(x^{(i)})- y^{(i)})^2 $$"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"How do we compute the gradients?\n",
"\n",
"$$\\nabla_{\\Theta^{(l)}} J(\\Theta) = ? \\quad \\nabla_{\\beta^{(l)}} J(\\Theta) = ?$$\n",
"\n",
"* The form of the cost function depends on the chosen network architecture and activation functions."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"$$\\small J(\\Theta) = \\frac{1}{2}(a^{(L)} - y)^2 $$\n",
"$$\\small \\dfrac{\\partial}{\\partial a^{(L)}} J(\\Theta) = a^{(L)} - y$$\n",
"\n",
"$$\\small \\tanh^{\\prime}(x) = 1 - \\tanh^2(x)$$"
]
}
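,
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "notes"
}
},
"source": [
"A minimal sketch (an illustrative assumption, not the lecture's code) of how the two derivatives above combine for the last layer: with $z^{(L)} = a^{(L-1)}\\Theta^{(L)} + \\beta^{(L)}$ and $a^{(L)} = \\tanh(z^{(L)})$, the chain rule gives $\\partial J / \\partial z^{(L)} = (a^{(L)} - y)(1 - \\tanh^2(z^{(L)}))$."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"slideshow": {
"slide_type": "notes"
}
},
"outputs": [],
"source": [
"# A sketch of the last-layer gradients for the tanh network above (single example).\n",
"\n",
"def last_layer_gradients(a_prev, Theta_L, beta_L, y):\n",
"    \"\"\"Return dJ/dTheta^(L) and dJ/dbeta^(L) for J = 1/2 (a^(L) - y)^2.\"\"\"\n",
"    z = a_prev @ Theta_L + beta_L\n",
"    a = np.tanh(z)\n",
"    dJ_dz = (a - y) * (1 - np.tanh(z) ** 2)   # chain rule, tanh'(z) = 1 - tanh^2(z)\n",
"    dJ_dTheta = a_prev.T @ dJ_dz              # shape: dim(a^(L-1)) x dim(a^(L))\n",
"    dJ_dbeta = dJ_dz                          # shape: 1 x dim(a^(L))\n",
"    return dJ_dTheta, dJ_dbeta"
]
}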
],
"metadata": {
"celltoolbar": "Slideshow",
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.3"
},
"livereveal": {
"start_slideshow_at": "selected",
"theme": "white"
}
},
"nbformat": 4,
"nbformat_minor": 4
}