{
"cells": [
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Uczenie maszynowe UMZ 2019/2020\n",
|
|||
|
"### 16 czerwca 2020\n",
|
|||
|
"# 15. Uczenie przez wzmacnianie i systemy dialogowe"
|
|||
|
]
|
|||
|
},
|
|||
|
{
|
|||
|
"cell_type": "markdown",
|
|||
|
"metadata": {
|
|||
|
"slideshow": {
|
|||
|
"slide_type": "slide"
|
|||
|
}
|
|||
|
},
|
|||
|
"source": [
|
|||
|
"## 15.1. Uczenie przez wzmacnianie"
|
|||
|
]
|
|||
|
},
|
|||
|
{
|
|||
|
"cell_type": "markdown",
|
|||
|
"metadata": {
|
|||
|
"slideshow": {
|
|||
|
"slide_type": "slide"
|
|||
|
}
|
|||
|
},
|
|||
|
"source": [
|
|||
|
"### Paradygmat uczenia przez wzmacnianie"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"<img src=\"https://cdn-images-1.medium.com/max/1560/1*Yf8rcXiwvqEAinDTWTnCPA.jpeg\">"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"<img src=\"https://bigdata-madesimple.com/wp-content/uploads/2018/02/Machine-Learning-Explained1.png\">"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"<img src=\"https://bigdata-madesimple.com/wp-content/uploads/2018/02/Machine-Learning-Explained2.png\">"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"<img src=\"https://bigdata-madesimple.com/wp-content/uploads/2018/02/Machine-Learning-Explained3.png\">"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"* The reinforcement learning paradigm mimics the way children learn.\n",
"* Learning happens through interaction with the environment."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"* At time $t$, the agent in state $S_t$ takes action $A_t$, then observes the resulting change in the environment, ending up in state $S_{t+1}$ and receiving reward $R_{t+1}$:"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"<img src=\"https://cdn-images-1.medium.com/max/1600/1*WOYVzYnF-rbdcgZU2Wt9Yw.png\">"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"* The goal is to find a policy for choosing successive actions that maximizes the value of the final (cumulative) reward."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"Applications of reinforcement learning:\n",
"* game-playing strategies\n",
"* dialogue systems\n",
"* control systems"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"### Reinforcement learning as a Markov decision process"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"The reinforcement learning paradigm can be formally described as a Markov decision process:\n",
"$$ (S, A, T, R) $$\n",
"where:\n",
"* $S$ – a finite set of states\n",
"* $A$ – a finite set of actions\n",
"* $T \\colon A \\times S \\to S$ – the transition function, which describes how the environment changes under the chosen actions\n",
"* $R \\colon A \\times S \\to \\mathbb{R}$ – the reward function"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"Or, if we assume that the environment changes nondeterministically:\n",
"$$ (S, A, P, R) $$\n",
"where:\n",
"* $S$ – a finite set of states\n",
"* $A$ – a finite set of actions\n",
"* $P \\colon A \\times S \\times S \\to [0, 1]$ – the transition probability\n",
"* $R \\colon A \\times S \\times S \\to \\mathbb{R}$ – the reward function"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"For example, the probability that action $a$ causes a transition from state $s$ to $s'$ is:\n",
"$$ P_a(s, s') \\; = \\; \\mathbf{P}( \\, s_{t+1} = s' \\, | \\, s_t = s, a_t = a \\,) $$"
]
},
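{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"A minimal sketch (added for illustration, not part of the original lecture) of how such a nondeterministic MDP can be represented in plain Python; the two-state, two-action MDP below is made up:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"outputs": [],
"source": [
"# A hypothetical two-state, two-action MDP: P[s][a] is a list of\n",
"# (next state, transition probability, reward) triples.\n",
"P = {\n",
"    0: {0: [(0, 0.9, 0.0), (1, 0.1, 1.0)],\n",
"        1: [(1, 1.0, 1.0)]},\n",
"    1: {0: [(0, 1.0, 0.0)],\n",
"        1: [(1, 0.8, 2.0), (0, 0.2, 0.0)]},\n",
"}\n",
"\n",
"def transition_prob(s, a, s_next):\n",
"    # P_a(s, s'): probability that action a leads from state s to s_next\n",
"    return sum(p for sn, p, _ in P[s][a] if sn == s_next)\n",
"\n",
"print(transition_prob(0, 0, 1))  # 0.1"
]
},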
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"### Policy\n",
"\n",
"* A _policy_ is a mapping $\\pi \\colon S \\to A$ that assigns to the current state the next action to take.\n",
"* A reinforcement learning algorithm tries to optimize the policy so that the final reward is as high as possible.\n",
"* At time $t$, the return (the total discounted reward) is defined as:\n",
"$$ R_t := r_{t+1} + \\gamma \\, r_{t+2} + \\gamma^2 \\, r_{t+3} + \\ldots = \\sum_{k=0}^T \\gamma^k \\, r_{t+k+1} \\; , $$\n",
"where $0 < \\gamma < 1$ is the discount factor, which determines how much weight we give to rewards received in the distant future."
]
},
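{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"A quick sketch (added for illustration) of the return computed for a short, made-up reward sequence:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"outputs": [],
"source": [
"# R_t = sum_k gamma^k * r_{t+k+1} for a finite (episodic) reward sequence\n",
"def discounted_return(rewards, gamma=0.9):\n",
"    return sum(gamma**k * r for k, r in enumerate(rewards))\n",
"\n",
"print(discounted_return([1.0, 0.0, 2.0, 1.0]))  # 1.0 + 0.81*2.0 + 0.729*1.0 = 3.349"
]
},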
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"The algorithm searches for the optimal policy by trial and error, taking actions and observing their effect on the environment. Its decisions are guided by estimates of the following functions:\n",
"* The value function ($V$) reflects how attractive it is, in the long run, to reach a given state:\n",
"$$ V_{\\pi}(s) = \\mathbf{E}_{\\pi}(R \\, | \\, s_t = s) $$\n",
"* The $Q$ function reflects how attractive it is, in the long run, to reach a given state by taking a given action:\n",
"$$ Q_{\\pi}(s, a) = \\mathbf{E}_{\\pi}(R \\, | \\, s_t = s, a_t = a) $$"
]
},
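{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"For a fixed policy, $V_{\\pi}$ can be estimated, for instance, by iterative policy evaluation. A minimal sketch (added for illustration) on a made-up three-state chain, where `P[s]` lists the `(next state, probability, reward)` transitions of the action the policy chooses in state `s`:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"outputs": [],
"source": [
"# Iterative policy evaluation: V(s) <- sum over s' of p * (r + gamma * V(s'))\n",
"P = {\n",
"    0: [(1, 1.0, 0.0)],\n",
"    1: [(2, 0.8, 1.0), (0, 0.2, 0.0)],\n",
"    2: [(2, 1.0, 0.0)],\n",
"}\n",
"gamma = 0.9\n",
"V = {s: 0.0 for s in P}\n",
"for _ in range(100):  # repeat the backup until the values stabilize\n",
"    V = {s: sum(p * (r + gamma * V[sn]) for sn, p, r in P[s]) for s in P}\n",
"print({s: round(v, 3) for s, v in V.items()})  # roughly {0: 0.859, 1: 0.955, 2: 0.0}"
]
},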
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"### Reinforcement learning algorithms\n",
"* Dynamic programming (DP):\n",
"    * _bootstrapping_ – updating the estimates for a given state based on the estimates for its possible successor states\n",
"* Monte Carlo methods (MC)\n",
"* Temporal difference learning (TD):\n",
"    * _on-policy_ – updating the current policy:\n",
"        * SARSA (_state–action–reward–state–action_)\n",
"    * _off-policy_ – exploring policies other than the current one (see the Q-learning sketch below):\n",
"        * _Q-learning_\n",
"        * _Actor–Critic_"
]
},
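{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"As an illustration of the off-policy TD approach, here is a minimal tabular _Q-learning_ sketch (added for illustration) on a made-up five-state chain: action 1 moves right, action 0 moves left, and reaching the rightmost state ends the episode with reward 1. The environment and hyperparameters are assumptions, not part of the lecture:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"outputs": [],
"source": [
"import random\n",
"\n",
"N_STATES, ACTIONS = 5, (0, 1)\n",
"Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}\n",
"alpha, gamma, epsilon = 0.1, 0.9, 0.1\n",
"\n",
"def step(s, a):\n",
"    # deterministic chain dynamics; the terminal state is N_STATES - 1\n",
"    s_next = max(0, min(N_STATES - 1, s + (1 if a == 1 else -1)))\n",
"    reward = 1.0 if s_next == N_STATES - 1 else 0.0\n",
"    return s_next, reward, s_next == N_STATES - 1\n",
"\n",
"for episode in range(500):\n",
"    s, done = 0, False\n",
"    while not done:\n",
"        # epsilon-greedy action selection with random tie-breaking\n",
"        if random.random() < epsilon:\n",
"            a = random.choice(ACTIONS)\n",
"        else:\n",
"            a = max(ACTIONS, key=lambda a2: (Q[(s, a2)], random.random()))\n",
"        s_next, r, done = step(s, a)\n",
"        # off-policy TD update: the target uses the greedy (max) action\n",
"        best_next = max(Q[(s_next, a2)] for a2 in ACTIONS)\n",
"        Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])\n",
"        s = s_next\n",
"\n",
"# state values approach roughly [gamma^3, gamma^2, gamma, 1, 0]\n",
"print([round(max(Q[(s, a)] for a in ACTIONS), 2) for s in range(N_STATES)])"
]
},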
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"### Example: inverted pendulum (_cart and pole_)"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {
"scrolled": true,
"slideshow": {
"slide_type": "fragment"
}
},
"outputs": [
{
"data": {
"text/html": [
"\n",
"        <iframe\n",
"            width=\"800\"\n",
"            height=\"600\"\n",
"            src=\"https://www.youtube.com/embed/46wjA6dqxOM\"\n",
"            frameborder=\"0\"\n",
"            allowfullscreen\n",
"        ></iframe>\n",
"        "
],
"text/plain": [
"<IPython.lib.display.YouTubeVideo at 0x7fa9a4350050>"
]
},
"execution_count": 2,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"import IPython\n",
"IPython.display.YouTubeVideo('46wjA6dqxOM', width=800, height=600)"
]
},
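{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"This task can also be simulated directly, e.g. with the `CartPole-v1` environment from the OpenAI Gym package. A minimal sketch (added for illustration), assuming the classic `gym` API in which `step` returns four values; newer `gym`/`gymnasium` releases differ slightly. The policy here is purely random, just to show the agent–environment interaction loop:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"outputs": [],
"source": [
"import gym\n",
"\n",
"env = gym.make('CartPole-v1')\n",
"observation = env.reset()\n",
"total_reward, done = 0.0, False\n",
"while not done:\n",
"    action = env.action_space.sample()  # random policy, illustration only\n",
"    observation, reward, done, info = env.step(action)\n",
"    total_reward += reward\n",
"print(total_reward)  # the pole falls quickly under a random policy"
]
},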
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"### Example: autonomous car simulation"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {
"scrolled": true,
"slideshow": {
"slide_type": "fragment"
}
},
"outputs": [
{
"data": {
"text/html": [
"\n",
"        <iframe\n",
"            width=\"800\"\n",
"            height=\"600\"\n",
"            src=\"https://www.youtube.com/embed/G-GpY7bevuw\"\n",
"            frameborder=\"0\"\n",
"            allowfullscreen\n",
"        ></iframe>\n",
"        "
],
"text/plain": [
"<IPython.lib.display.YouTubeVideo at 0x7fa9a4332750>"
]
},
"execution_count": 3,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"import IPython\n",
"IPython.display.YouTubeVideo('G-GpY7bevuw', width=800, height=600)"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## 15.2. Dialogue systems"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"### Types of dialogue systems\n",
"* Chatbots\n",
"* Task-oriented systems (_goal-oriented systems_):\n",
"    * information retrieval\n",
"    * form filling\n",
"    * problem solving\n",
"    * educational and tutoring systems\n",
"    * intelligent assistants"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"### Dialogue system architecture\n",
"\n",
"<img src=\"sds.png\">"
]
}
],
"metadata": {
"celltoolbar": "Slideshow",
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.3"
},
"livereveal": {
"start_slideshow_at": "selected",
"theme": "amu"
}
},
"nbformat": 4,
"nbformat_minor": 4
}