widzenie-komputerowe-MP/wko-08.ipynb

{
"cells": [
{
"cell_type": "markdown",
"id": "909d3c02",
"metadata": {},
"source": [
"![Logo 1](img/aitech-logotyp-1.jpg)\n",
"<div class=\"alert alert-block alert-info\">\n",
"<h1> Widzenie komputerowe </h1>\n",
"<h2> 08. <i>Rozpoznawanie twarzy</i> [laboratoria]</h2> \n",
"<h3>Andrzej Wójtowicz (2021)</h3>\n",
"</div>\n",
"\n",
"![Logo 2](img/aitech-logotyp-2.jpg)"
]
},
{
"cell_type": "markdown",
"id": "7a9fde6b",
"metadata": {},
"source": [
"W poniższych materiałach zaprezentujemy klasyczne metody rozpoznawania twarzy. Opisywane zagadnienia można odnaleźć w *5.2.3 Principal component analysis* R. Szeliski (2022) *Computer Vision: Algorithms and Applications* oraz [dokumentacji](https://docs.opencv.org/4.5.3/da/d60/tutorial_face_main.html).\n",
"\n",
"Na początku załadujmy niezbędne biblioteki."
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "1d86977a",
"metadata": {},
"outputs": [],
"source": [
"import cv2 as cv\n",
"import numpy as np\n",
"import matplotlib.pyplot as plt\n",
"%matplotlib inline\n",
"import sklearn.metrics\n",
"import ipywidgets\n",
"import os\n",
"import random"
]
},
{
"cell_type": "markdown",
"id": "c5a62135",
"metadata": {},
"source": [
"Rozpakujmy zbiór danych, na którym będziemy pracować:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "0e0f1723",
"metadata": {},
"outputs": [],
"source": [
"!cd datasets && unzip -qo yaleextb.zip"
]
},
{
"cell_type": "markdown",
"id": "e6a0efb1",
"metadata": {},
"source": [
"Nasz zbiór zawiera po kilkadziesiąt zdjęć kilkudziesięciu osób, które zostały sfotografowane w różnych warunkach oświetlenia. Wczytane zdjęcia podzielimy na zbiór treningowy i testowy w stosunku 3/1 oraz wyświetlimy kilka przykładowych zdjęć:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "7b775bbf",
"metadata": {},
"outputs": [],
"source": [
"dataset_dir = \"datasets/yaleextb\"\n",
"\n",
"img_data = []\n",
"img_labels = []\n",
"\n",
"images = os.listdir(dataset_dir)\n",
"\n",
"n_examples = 15\n",
"\n",
"for i in range(1, 40):\n",
" i_str = str(i).zfill(2)\n",
" images_p = [img for img in images if img.startswith(f\"yaleB{i_str}\")]\n",
" \n",
" for img in images_p[:n_examples]:\n",
" img_data.append(cv.imread(f\"{dataset_dir}/{img}\", cv.IMREAD_GRAYSCALE))\n",
" img_labels.append(i)\n",
"\n",
"random.seed(1337)\n",
"selector = random.choices([False, True], k=len(images), weights=[3, 1])\n",
"train_data = [x for x, y in zip(img_data, selector) if not y]\n",
"train_labels = [x for x, y in zip(img_labels, selector) if not y]\n",
"test_data = [x for x, y in zip(img_data, selector) if y]\n",
"test_labels = [x for x, y in zip(img_labels, selector) if y]\n",
"\n",
"plt.figure(figsize=(12,5))\n",
"for i in range(4):\n",
" plt.subplot(251 + i)\n",
" plt.imshow(train_data[i], cmap='gray');\n",
"for i in range(4):\n",
" plt.subplot(256 + i)\n",
" plt.imshow(train_data[-i-20], cmap='gray');"
]
},
{
"cell_type": "markdown",
"id": "6e315630",
"metadata": {},
"source": [
"Pierwszym modelem jest *Eigenfaces* zaimplementowany w [`EigenFaceRecognizer`](https://docs.opencv.org/4.5.3/dd/d7c/classcv_1_1face_1_1EigenFaceRecognizer.html). Główny pomysł polega na użyciu PCA do redukcji wymiarów. W naszym przykładzie zachowamy 60 wektorów własnych."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "0473c8ae",
"metadata": {},
"outputs": [],
"source": [
"model = cv.face.EigenFaceRecognizer_create(60)\n",
"model.train(np.array(train_data), np.array(train_labels))"
]
},
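{
"cell_type": "markdown",
"id": "a3f1c9d2",
"metadata": {},
"source": [
"Under the hood, this training step amounts to PCA on the flattened training images. The cell below is a minimal NumPy sketch of that computation, added here for illustration only; it is not OpenCV's actual implementation, and the names `mean_vec` and `eigenfaces` are ours."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "b7e2d4f8",
"metadata": {},
"outputs": [],
"source": [
"# Minimal PCA sketch of what EigenFaceRecognizer computes during training\n",
"# (illustration only; OpenCV's internals may differ in details and scaling)\n",
"X = np.array([img.flatten() for img in train_data], dtype=np.float64)\n",
"mean_vec = X.mean(axis=0)  # the 'average face'\n",
"Xc = X - mean_vec          # centre the data\n",
"# rows of Vt are the principal directions, i.e. the eigenfaces\n",
"_, _, Vt = np.linalg.svd(Xc, full_matrices=False)\n",
"eigenfaces = Vt[:60]       # keep 60 components, as in the model above\n",
"coeffs = Xc[0] @ eigenfaces.T  # a face becomes 60 projection coefficients\n",
"print(coeffs.shape)"
]
},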
{
"cell_type": "markdown",
"id": "7a753f2d",
"metadata": {},
"source": [
"Zachowane wektory własne możemy zwizualizować:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "f797fe86",
"metadata": {},
"outputs": [],
"source": [
"img_shape = train_data[0].shape\n",
"plt.figure(figsize=(12,5))\n",
"for i in range(5):\n",
" e_v = model.getEigenVectors()[:,i]\n",
" e_v = np.reshape(e_v, img_shape)\n",
"\n",
" plt.subplot(151+i)\n",
" plt.imshow(e_v, cmap='gray');"
]
},
{
"cell_type": "markdown",
"id": "19545151",
"metadata": {},
"source": [
"Możemy zobaczyć jakie potencjalne twarze znajdują się w naszej przestrzeni. Do *uśrednionej* twarzy dodajemy kolejne wektory własne z odpowiednimi wagami. Poniżej mamy przykład wykorzystujący 6 wektorów:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "5265f337",
"metadata": {},
"outputs": [],
"source": [
"mean = model.getMean()\n",
"W = model.getEigenVectors()\n",
"\n",
"def generate_face(**args):\n",
" img = mean.copy()\n",
" for i, k in enumerate(args.keys()):\n",
" img = np.add(img, W[:,i]*(10*args[k]))\n",
" \n",
" img = np.reshape(img, img_shape)\n",
" plt.figure(figsize=(5,5))\n",
" plt.imshow(img, cmap='gray')\n",
" plt.show()\n",
" \n",
"ipywidgets.interactive(generate_face, \n",
" w_0=ipywidgets.IntSlider(min=-128, max=128),\n",
" w_1=ipywidgets.IntSlider(min=-128, max=128),\n",
" w_2=ipywidgets.IntSlider(min=-128, max=128),\n",
" w_3=ipywidgets.IntSlider(min=-128, max=128),\n",
" w_4=ipywidgets.IntSlider(min=-128, max=128),\n",
" w_5=ipywidgets.IntSlider(min=-128, max=128))"
]
},
{
"cell_type": "markdown",
"id": "fd4bdce6",
"metadata": {},
"source": [
"Możemy teraz spróbować zrobić rekonstrukcję np. pierwszej twarzy ze zbioru treningowego. Pobieramy dla niej projekcje (wagi) z naszego modelu i podobnie jak wyżej wykorzystujemy uśrednioną twarz i wektory własne. Możemy zobaczyć, że użycie większej liczby wektorów powoduje zwiększenie precyzji rekonstrukcji:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "2619c6f9",
"metadata": {},
"outputs": [],
"source": [
"pro = model.getProjections()[0]\n",
"\n",
"def reconstruct_face(k):\n",
" img = mean.copy()\n",
"\n",
" for i in range(k):\n",
" img = np.add(img, W[:,i]*pro[0,i])\n",
" \n",
" return img\n",
"\n",
"plt.figure(figsize=(12,6))\n",
"for i in range(6):\n",
" k = (i+1)*10\n",
" r_face = np.reshape(reconstruct_face(k), img_shape)\n",
" j = 0 if i <= 4 else 10\n",
" plt.subplot(151+i+100)\n",
" plt.imshow(r_face, cmap='gray')\n",
" plt.title(f\"k = {k}\")\n",
" \n",
"plt.subplot(257)\n",
"plt.imshow(train_data[0], cmap='gray');\n",
"plt.title(\"original\");"
]
},
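{
"cell_type": "markdown",
"id": "c4d5e6f7",
"metadata": {},
"source": [
"To quantify the claim that more eigenvectors increase precision, we can measure the mean squared error of the reconstruction for a growing `k` (a quick sketch reusing the `reconstruct_face` helper defined above):"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "d8e9f0a1",
"metadata": {},
"outputs": [],
"source": [
"# Reconstruction error vs. the number of eigenvectors used\n",
"orig = np.reshape(train_data[0], mean.shape).astype(np.float64)\n",
"for k in [10, 20, 30, 40, 50, 60]:\n",
"    mse = np.mean((reconstruct_face(k) - orig) ** 2)\n",
"    print(f\"k = {k:2d}: MSE = {mse:10.1f}\")"
]
},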
{
"cell_type": "markdown",
"id": "ae87277a",
"metadata": {},
"source": [
"Spróbujmy teraz odnaleźć osobny znajdujące się na dwóch przykładowych obrazach ze zbioru testowego. Dla nieznanej twarzy obliczamy projekcje i szukamy metodą najbliższego sąsiada projekcji ze zbioru treningowego. Poniżej mamy przykład z poprawnym rozpoznaniem osoby oraz z niepoprawnym rozpoznaniem:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "828f3134",
"metadata": {},
"outputs": [],
"source": [
"def find_face(query_id):\n",
" query_face = test_data[query_id]\n",
" query_label = test_labels[query_id]\n",
"\n",
" x = np.reshape(query_face, mean.shape)\n",
" x_coeff = np.dot(x - mean, W)\n",
"\n",
" best_face = None\n",
" best_label = None\n",
" best_dist = float('inf')\n",
"\n",
" for i, p in enumerate(model.getProjections()):\n",
" dist = np.linalg.norm(np.reshape(p, 60) - np.reshape(x_coeff, 60))\n",
"\n",
" if dist < best_dist:\n",
" best_face = train_data[i]\n",
" best_label = train_labels[i]\n",
" best_dist = dist\n",
" \n",
" return query_face, query_label, best_face, best_label\n",
"\n",
"qf_1, ql_1, bf_1, bl_1 = find_face(45)\n",
"qf_2, ql_2, bf_2, bl_2 = find_face(10)\n",
"\n",
"plt.figure(figsize=(8,11))\n",
"plt.subplot(221)\n",
"plt.imshow(qf_1, cmap='gray')\n",
"plt.title(f\"Face 1: query label = {ql_1}\")\n",
"plt.subplot(222)\n",
"plt.imshow(bf_1, cmap='gray');\n",
"plt.title(f\"Face 1: best label = {bl_1}\")\n",
"plt.subplot(223)\n",
"plt.imshow(qf_2, cmap='gray')\n",
"plt.title(f\"Face 2: query label = {ql_2}\")\n",
"plt.subplot(224)\n",
"plt.imshow(bf_2, cmap='gray');\n",
"plt.title(f\"Face 2: best label = {bl_2}\");"
]
},
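{
"cell_type": "markdown",
"id": "e1f2a3b4",
"metadata": {},
"source": [
"As a side note, the linear scan in `find_face()` can be vectorised with NumPy, which helps for larger training sets. A sketch using the same `model`, `mean` and `W` as above (the query index 45 is just an example):"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "f5a6b7c8",
"metadata": {},
"outputs": [],
"source": [
"# Vectorised nearest-neighbour search over all training projections\n",
"query = np.reshape(test_data[45], mean.shape)\n",
"q_coeff = np.ravel(np.dot(query - mean, W))                  # (60,)\n",
"P = np.array([np.ravel(p) for p in model.getProjections()])  # (n_train, 60)\n",
"dists = np.linalg.norm(P - q_coeff, axis=1)\n",
"best = int(np.argmin(dists))\n",
"print(test_labels[45], train_labels[best])"
]
},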
{
"cell_type": "markdown",
"id": "43f9a8e5",
"metadata": {},
"source": [
"Bardziej kompaktowe wykonanie predykcji możemy uzyskać poprzez metodę `predict()`:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "bf736bdd",
"metadata": {},
"outputs": [],
"source": [
"print(test_labels[45], model.predict(test_data[45])[0])\n",
"print(test_labels[10], model.predict(test_data[10])[0])"
]
},
{
"cell_type": "markdown",
"id": "eeaf62b5",
"metadata": {},
"source": [
"Jak widać poniżej, metoda ta nie uzyskuje szczególnie zadowalających wyników (generalnie słabo sobie radzi w sytuacji zmian oświetlenia):"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "12c65438",
"metadata": {},
"outputs": [],
"source": [
"predictions = []\n",
"for test_img in test_data:\n",
" p_label, p_conf = model.predict(test_img)\n",
" predictions.append(p_label)\n",
" \n",
"print(f\"Accuracy: {sklearn.metrics.accuracy_score(test_labels, predictions) * 100:.2f} %\")"
]
},
{
"cell_type": "markdown",
"id": "ea5d879b",
"metadata": {},
"source": [
"Poniżej krótko zaprezentujemy jeszcze dwa rozwinięcia tego algorytmu. Pierwszym z nich jest *Fisherfaces* zaimplementowany w [`FisherFaceRecognizer`](https://docs.opencv.org/4.5.3/d2/de9/classcv_1_1face_1_1FisherFaceRecognizer.html). Tym razem przy pomocy LDA chcemy dodatkowo uwzględnić rozrzut pomiędzy klasami (por. [przykład](https://sthalles.github.io/fisher-linear-discriminant/)). Poniżej tworzymy model z 40 komponentami:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "4eb5b746",
"metadata": {},
"outputs": [],
"source": [
"model = cv.face.FisherFaceRecognizer_create(40)\n",
"model.train(np.array(train_data), np.array(train_labels))"
]
},
{
"cell_type": "markdown",
"id": "e9f334be",
"metadata": {},
"source": [
"Zauważmy, że uzyskujemy tutaj ponad dwukrotnie lepszy wynik:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "96faa192",
"metadata": {},
"outputs": [],
"source": [
"predictions = []\n",
"for test_img in test_data:\n",
" p_label, p_conf = model.predict(test_img)\n",
" predictions.append(p_label)\n",
" \n",
"print(f\"Accuracy: {sklearn.metrics.accuracy_score(test_labels, predictions) * 100:.2f} %\")"
]
},
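{
"cell_type": "markdown",
"id": "a9b8c7d6",
"metadata": {},
"source": [
"For intuition, the Fisherfaces pipeline (PCA followed by LDA) can be approximated with scikit-learn. The sketch below is a hypothetical re-implementation for comparison only, not what OpenCV does internally; the PCA component count of 100 is our assumption, and LDA itself yields at most `n_classes - 1` discriminant axes."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "b5c4d3e2",
"metadata": {},
"outputs": [],
"source": [
"# Rough PCA -> LDA pipeline in scikit-learn, mimicking Fisherfaces\n",
"from sklearn.decomposition import PCA\n",
"from sklearn.discriminant_analysis import LinearDiscriminantAnalysis\n",
"\n",
"X_tr = np.array([img.flatten() for img in train_data], dtype=np.float64)\n",
"X_te = np.array([img.flatten() for img in test_data], dtype=np.float64)\n",
"\n",
"pca = PCA(n_components=100).fit(X_tr)  # PCA first keeps the scatter matrices non-singular\n",
"lda = LinearDiscriminantAnalysis().fit(pca.transform(X_tr), train_labels)\n",
"print(f\"Accuracy: {lda.score(pca.transform(X_te), test_labels) * 100:.2f} %\")"
]
},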
{
"cell_type": "markdown",
"id": "02220e5f",
"metadata": {},
"source": [
"Dalszym rozwinięciem jest model *Local Binary Patterns Histograms* (LBPH) zaimplementowany w [`LBPHFaceRecognizer`](https://docs.opencv.org/4.5.3/df/d25/classcv_1_1face_1_1LBPHFaceRecognizer.html). W tym wypadku chcemy np. uwzględnić możliwość innego oświetlenia osób niż taki, który występuje w naszym zbiorze treningowym. Podobnie jak wcześniej zależy nam na redukcji wymiarów, ale tym razem uzyskamy to poprzez wyliczanie cech (progowanie) dla poszczególnych pikseli w zadanych regionach."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "61eeffdf",
"metadata": {},
"outputs": [],
"source": [
"model = cv.face.LBPHFaceRecognizer_create(radius=10, neighbors=10, grid_x=32, grid_y=32)\n",
"model.train(np.array(train_data), np.array(train_labels))"
]
},
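{
"cell_type": "markdown",
"id": "c1d2e3f4",
"metadata": {},
"source": [
"The per-pixel thresholding mentioned above is easy to show directly. Below is a minimal sketch of the basic 8-neighbour LBP code with radius 1, written by us for illustration; OpenCV's LBPH uses a generalised circular neighbourhood and concatenated per-cell histograms on top of this idea."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "d5e6f7a8",
"metadata": {},
"outputs": [],
"source": [
"# Basic LBP: compare each interior pixel with its 8 neighbours and\n",
"# pack the comparison results into an 8-bit code\n",
"def lbp_basic(img):\n",
"    img = img.astype(np.int32)\n",
"    c = img[1:-1, 1:-1]  # centre pixels\n",
"    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1),\n",
"              (1, 1), (1, 0), (1, -1), (0, -1)]\n",
"    code = np.zeros_like(c)\n",
"    h, w = img.shape\n",
"    for bit, (dy, dx) in enumerate(shifts):\n",
"        nb = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]\n",
"        code |= (nb >= c).astype(np.int32) << bit\n",
"    return code.astype(np.uint8)\n",
"\n",
"plt.imshow(lbp_basic(train_data[0]), cmap='gray');"
]
},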
{
"cell_type": "markdown",
"id": "0d64cb5a",
"metadata": {},
"source": [
"Uzyskany wynik jest o kilka punktów procentowy lepszy od poprzedniego modelu, jednak możemy zauważyć, że zmiana domyślnych parametrów na takie, które zwiększają precyzję, powoduje również zwiększenie czasu potrzebnego na wykonanie predykcji:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "ca2e319d",
"metadata": {},
"outputs": [],
"source": [
"predictions = []\n",
"for test_img in test_data:\n",
" p_label, p_conf = model.predict(test_img)\n",
" predictions.append(p_label)\n",
" \n",
"print(f\"Accuracy: {sklearn.metrics.accuracy_score(test_labels, predictions) * 100:.2f} %\")"
]
},
{
"cell_type": "markdown",
"id": "00196405",
"metadata": {},
"source": [
"# Zadanie 1\n",
"\n",
"W katalogu `datasets` znajduje się zbiór zdjęć `att_faces`. Sprawdź jakiego typu są to zdjęcia oraz jak powyższe algorytmy działają na tym zbiorze."
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "51b8a256",
"metadata": {},
"outputs": [],
"source": [
"!cd datasets && unzip -qo att_faces.zip"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "ec6e8ea1",
"metadata": {},
"outputs": [],
"source": [
"dataset_dir = \"datasets/att_faces\"\n",
"\n",
"img_data = []\n",
"img_labels = []\n",
"\n",
"images = os.listdir(dataset_dir)\n",
"\n",
"n_examples = 15\n",
"\n",
"for i in range(1, 41):\n",
" i_str = str(i).zfill(2)\n",
" images_p = [img for img in images if img.startswith(f\"s{i_str}\")]\n",
" \n",
" for img in images_p[:n_examples]:\n",
" img_data.append(cv.imread(f\"{dataset_dir}/{img}\", cv.IMREAD_GRAYSCALE))\n",
" img_labels.append(i)\n",
"\n",
"random.seed(1337)\n",
"selector = random.choices([False, True], k=len(images), weights=[3, 1])\n",
"train_data = [x for x, y in zip(img_data, selector) if not y]\n",
"train_labels = [x for x, y in zip(img_labels, selector) if not y]\n",
"test_data = [x for x, y in zip(img_data, selector) if y]\n",
"test_labels = [x for x, y in zip(img_labels, selector) if y]"
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "734b37c0",
"metadata": {},
"outputs": [],
"source": [
"model = cv.face.EigenFaceRecognizer_create(60)\n",
"model.train(np.array(train_data), np.array(train_labels))"
]
},
{
"cell_type": "code",
"execution_count": 6,
"id": "28a27f46",
"metadata": {},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAsUAAACvCAYAAAAVO6MNAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjQuMiwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8rg+JYAAAACXBIWXMAAAsTAAALEwEAmpwYAAD7t0lEQVR4nOz9XYxs23bfh41V1R9VtXvvcw4vdY/uB6FLUgQvIkthAoExICAgrAhSaCEUIIlwJCiUwuA+xWG+EJF+kR8cgAECWXySdRExoSEjpOwYoAELTiJCRKAHCjZlAlYsUCAUyuQFJTIS7z377O7q3d218tD7t+q3/jVX7z5nV/VthT2BQnetWmt+jDk+/mPMMefq+r6vp/JUnspTeSpP5ak8lafyVH4nl9k3uwNP5ak8lafyVJ7KU3kqT+WpfLPLEyh+Kk/lqTyVp/JUnspTeSq/48sTKH4qT+WpPJWn8lSeylN5Kr/jyxMofipP5ak8lafyVJ7KU3kqv+PLEyh+Kk/lqTyVp/JUnspTeSq/48sTKH4qT+WpPJWn8lSeylN5Kr/jy0FAcdd1f6zrul/uuu5Xuq770UO08VT+/6M88cpTuW954pWncp/yxCdP5b7liVeeSpZu3+cUd103r6p/VFV/pKp+var+86r6H/d9/1/vtaGn8i99eeKVp3Lf8sQrT+U+5YlPnsp9yxOvPJVWOUSk+Hur6lf6vv/Hfd+/rqqfrqofOEA7T+Vf/vLEK0/lvuWJV57KfcoTnzyV+5YnXnkqO+XoAHV+oap+Td9/var+e3c9cHp62p+dnU3+3nVdEdHuum7nt9b//t513c7/rWtvq2+qXX//JJF37m0942ut//136vdW/1tt3tWPu65P/fYv/sW/+P/2ff+7Jh/alk/MKycnJ/1isbhH1eOS/Wzx1NTc35c/Pglvtvpxn77nvZ/kO7zS4qGpNlu/3SUXn6RcXl7W1dXVfSv7RLzywQcf9J///Oebv93V/7fNRYsP+J6/tXRCymvW+0noTV1TvOr6ZrPd+Efrvuxji/eyPw/xVtR/8A/+wcF0ymc+85n+277t295aMbR+m9wkXe8jd3nffXTy1D2teaXf9527Kd5o2VH/NmVX/Sy8+En1SPJfazxf+9rX6rd/+7cPolPee++9/rOf/ewn7m/+vS9WSF7wtalyFwa6r255W3t38czb2pjNZsO8TeGylk77JO3dt/zKr/xKU6ccAhTfq3Rd95Wq+kpV1Wq1qj/6R//o6PfZbDYi1GazqdlsNlxHsPK+rutqPp8P9x4dHdXx8fHoM5/P6/j4ePT70dFRU8Dn8/nQtts9Ojoa/t9sNiNG32w2o4nnOuNI5bfZbIYPE81kc72q6vr6um5ubqrv+7q+vq7r6+uhvs1mUzc3N3VzczOqzyUVqZ/hOfeVtvq+H/7nd0qrjb/xN/7GP/nkHDFdzCunp6f1B//gH7zz/tlsNhrLmzqayh0+mfrM5/OBn46Ojobv8/l8qMf3+9rJycmonfl8Xn3f7/As/XVfb25u6vr6epinqqqrq6vROOGB5LmqGs1tVY344vXr1wMP5fybH5l36HhzczPq+32KaZLK7pd+6ZfuVcd9i/nkc5/7XP3Mz/zMZJ9QzlU1ovFsNhvxusfqsSP/zB88Mp/Pd4we8+Z5RqccHR2N2oX2vrdFa3TJ9fX10CayCp/m8+4vuo1xwJfWHdfX1wOvJJ2s36y7/L9pkKA8Qddd36uqfu/v/b0H0ylf/OIX62//7b89Gif9benMpIN/u76+Hsng9fV1XV5eDnJKWa/XO3L3+vXrQadbVq1jqQce9bxbbq2fmJeqLQ9Yx7t+606+i2Yj/rcczGazOj4+Hp5BFo6Pj+v09HQkH9hgf9JWIgM5B2lvGA/lT//pP30/BrhnMZ/8rt/1u+ov/+W//FaHYcomQ3Pk7fr6etSW7f9ms6mrq6vq+74uLy9Hc311dTWyOZYx6yCesb1KXMDvjMX4xHgDnsPmUFrOduo46xvm3ra167o6OTkZMJltKfemPjCfWi8nRuEebK/H/f3f//1NnXIIUPy1qrLb/cU310al7/uvVtVXq249df+WnqSZyX8NUg1iDWIMiiE8k4NQ28BRlyfS9bsNC0EKdEsJcS/3WZmlcvLk8pnP58N9BlMw4dXV1Q4oqqrRMxaW9NQsFNm2DYHLJ/X2o3xiXnnx4sW9XENoc5fHCY9M8Y6VPUIMvyBk6YQZJHEf7SYv8X8aoqqt8kKpdl03KKOWQ2MjibHzdXiAuvhrJeTfG3PQBLb3KR7rO5S38or55Pf9vt/X6/pO21OOXSpXivWDFb7n0vW0ZBl59DMtp4Xnp8CJ+5B8Q302bh4P/Np13aADW5EZeOPo6GhkBFN/uG9pdBIw5hy8DQi37rtH+cQ65Xu+53v6Frhx3zwW61Oupawa2BgUATZev369E8Cwg0t9DnzwrHW++9DS1eYzbEgCUI/d/OPnbRN5Dj3CdTt06ETaJfBE37DLtJVg7m38QtlsNoMutu26Z/lEOuW7vuu7dipv8a3n9ObmZgCx3G8da9DJ74zl8vJyqI9iOhmjWJdXjW2NwXMGAwzSW0CSPjIvthWJIVr6Dx7PICZ94y98B9+gm5LWWT+08nh9Xwac7lMOAYr/86r6rq7rvr1uGezfqKo/80kraSlpT3CC4ozwGfweHx/XYrEYAA2f9KJcP9fsqfBxlDiVSgsUu7TuRbG0AKmLBW0+nw8K18bPCtPRCSswmBwaY/gs4L7HIMDeLPXeZVDeUvbCK58GqBm4el4NZvmkA+VVhYwStqLKGUFu0SvH4EgtvJRGEwPpaARt+R6iicz/fD6v6+vrgRf44FShoFyY99ZvSVfzyh7LXnilamuEW04SxeAvnXTLqa/NZrMdmWuBFkdt/JsdG4MS8wz98ipAzjtzS90JNDAi5oNWhJvrjv6mQTPQMm8k4M3nWyV146cse7M/CVaqxoDH93hlB1oYFHvOXr9+XZeXl6OImvW2AXFG6/L7VGkB+evr65GjPiWjtiOUViAogQ79sk6EH09PT+v09HTkiNmuOMpsuXCf3tGpbpVPxSvW1cnnVWMH13rZDi+61/Npum82t6sELXzD6lJVjWyYA3AZYbVepg8u6cQaX7X0GDrEuKjlyMDT3OtgHfVZb1KvdY/7Rx0G+i3w3OpH0mCq7B0U931/3XXd/7yq/u9VNa+qn+z7/v/9SepoRWGSGbhWVaOoHsJ7cnIyXF+tVnVyctIEL2ac/N0AJ5nEhjIVSBq4VLA8V1WDUTJDsmzi+21IUa72xAFtV1dXO0YMQASIsoduOuYYPLajo6ORETaTpjDdt+yDV1Kxm+GnmJ/5y3mez+d1eno64if4KI1J1VYh4YD5PvOOQUbO2V3KxHzz+vXrYW49/zamouuOcrVBxVBdXV0NUQyi4fDj0dHRoKBoy/ObPJ9zwl/z/7uUd+GVBLD0n+LlQ6eLZJSl5ZinE5tzm+Cbech5Y07oz83NzShaQv1Ot8hoITogQXnXdaNoL/MKPzB+t0Md8
EM61Z7jqWcdgUpn2jRqOdlTc/W28mn5xO1n5MxjcGqNwWvqxkxlIzrMh/lwECKfef36da3X61E/EhSnXKHHrq6uRs6R6zdoJbXHNtB1Qvurq6vBDgJkEoxdXFyMwNt8Pq/Ly8s6Pj6uy8vLWi6XtVwuR5Fd+A/ettykfaLd5Hvz2ycp72p/HGDyeBIUm/70G13ulQTmg/GZzh4n3017046+JaCmPq8S0q75ybSnbjv7/J6rpXaK0DnYFGiBXWE80C3ns+/7Ojk5GelP2km+N16E7lO8kPzdKgfJKe77/m9V1d/6BPePvtvo2AhV1chbxZAzcRh35wkfHR2NAHKCXAjddd3wTOa+uLSMXRo+e4g8k3mK7ksqYAComcXMYKaB4aqqTk5OBiFzpAH6HB8fjwC3QZXHakb1mJwTayDXMmSfYO4/Ea9Ab/rkay2vnZJLTFNpEvAL3xeLxQjkUod5iwhIAmcDFPqI8qAkIDaYPT4+HkVaPLdWplaGKC8r6ZOTk1G9m81mMGDHx8dDffP5fCe
"text/plain": [
"<Figure size 864x360 with 5 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"img_shape = train_data[0].shape\n",
"plt.figure(figsize=(12,5))\n",
"for i in range(5):\n",
" e_v = model.getEigenVectors()[:,i]\n",
" e_v = np.reshape(e_v, img_shape)\n",
"\n",
" plt.subplot(151+i)\n",
" plt.imshow(e_v, cmap='gray');"
]
},
{
"cell_type": "code",
"execution_count": 7,
"id": "7c5fb5be",
"metadata": {},
"outputs": [
{
"data": {
"application/vnd.jupyter.widget-view+json": {
"model_id": "382d87032618478fb1c211a4e8e55185",
"version_major": 2,
"version_minor": 0
},
"text/plain": [
"interactive(children=(IntSlider(value=0, description='w_0', max=128, min=-128), IntSlider(value=0, description…"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"mean = model.getMean()\n",
"W = model.getEigenVectors()\n",
"\n",
"def generate_face(**args):\n",
" img = mean.copy()\n",
" for i, k in enumerate(args.keys()):\n",
" img = np.add(img, W[:,i]*(10*args[k]))\n",
" \n",
" img = np.reshape(img, img_shape)\n",
" plt.figure(figsize=(5,5))\n",
" plt.imshow(img, cmap='gray')\n",
" plt.show()\n",
" \n",
"ipywidgets.interactive(generate_face, \n",
" w_0=ipywidgets.IntSlider(min=-128, max=128),\n",
" w_1=ipywidgets.IntSlider(min=-128, max=128),\n",
" w_2=ipywidgets.IntSlider(min=-128, max=128),\n",
" w_3=ipywidgets.IntSlider(min=-128, max=128),\n",
" w_4=ipywidgets.IntSlider(min=-128, max=128),\n",
" w_5=ipywidgets.IntSlider(min=-128, max=128))"
]
},
{
"cell_type": "code",
"execution_count": 8,
"id": "9ecf9bb0",
"metadata": {},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAsUAAAFtCAYAAADvbBcGAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjQuMiwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8rg+JYAAAACXBIWXMAAAsTAAALEwEAmpwYAAEAAElEQVR4nOz9e4xt35bXh33X3vXc9Tjn/H73AaLb6Y5EkEMEsUWIFRSCjKNgTEKQELJJSINBHcuyTYyj0EaRcGSsQBQCJDGQFu0YJOQGE6TGMTJEyC0rTgAbjCEYgWhM4+6+9/7uPedXp167Xnuv/FHnM/dnjVp1fq/a557bvadUqqq915prPsYc4zu+Y8y5ur7vsymbsimbsimbsimbsimb8jO5TL7dDdiUTdmUTdmUTdmUTdmUTfl2lw0o3pRN2ZRN2ZRN2ZRN2ZSf8WUDijdlUzZlUzZlUzZlUzblZ3zZgOJN2ZRN2ZRN2ZRN2ZRN+RlfNqB4UzZlUzZlUzZlUzZlU37Glw0o3pRN2ZRN2ZRN2ZRN2ZSf8WUDij+hdF3397uu+ye+3e3YlPe7bORkUz5t2cjKpnzaspGVTfk0ZSMnT1c2oPg9KF3X/WDXdX+767pl13W/ceT7f7nruq93XXfadd2/1XXd7rehmZvybSxd1/03uq77ka7rvtl13auu6/5c13U/r1yzkZNNSdd1X+q67j/uuu5l13UnXdf9f7uu+yXlmo2sbMqgdF33v+i6ru+67rfos67rut/zRpZevvm7+3a2c1O+PeWNbFx0XXf+5ueP6LufNnKyAcXvR/nPk/zzSf5q/aLruv9Rkh9I8suT/NeS/NeT/O/eaes25X0oz5P8mSQ/L8lXk/zlJD/Clxs52RSV8yT/bJIvJ3mR5Pck+fe6rttKNrKyKQ9L13UvkvyOJH+zfPX9Sf6nSX5hkl+Q5H+c5H/5Thu3Ke9T+YV93x+++fkt+vynjZxsQPFnKF3X/cNd1/2XXdf9M09Zb9/3/2bf938hydXI19+X5If6vv+bfd9/nORfT/Ibn/L5m/K0ZR1y0vf9X+77/of6vn/V9/1tkt+X5Od1Xffhm0s2cvIdWNYkK1d93//tvu+XSboki9yD4w/eXLKRle/Asi7786b875P8n5N8q3z+fUl+b9/3P9H3/U8m+b3ZyMp7XdYsJ4+VnzZysgHFn7J0XfePJvlzSf7Fvu//nUeu+etvwpVjP3/wcz765+eeSab850m+KjC0Ke9ReYdy8kuTfL3v+5dv/t/IyXdYWbesdF3313PvaP+ZJH+k7/uP3ny1kZXvsLJOWem67hcn+UVJ/vDI12Oy8vM/f082ZZ3lHdif/+hN2tWf7rrue/T5Txs52fp2N+A7pPz3k/zmJP/zvu9/9LGL+r7/BWt49mGS1/qfv4+SvHx4+aZ8G8s7kZOu674ryb+Z5Lfp442cfGeVtctK3/e/oOu6vSS/JsmOvtrIyndWWZusdF03TfIHk/wLfd8vR9JAx2TlsOu6ru/7/rM+b1PWWtatU/4HSf5iklmS35Xk/9l13X+77/u7/DSSkw1T/OnKP5fk//M2QVtjOU9yrP/5++zb0JZNeXtZu5x0XfflJH8+yR8sTMBGTr6zyjvRKW9SKf6dJD/Qdd0vfPPxRla+s8o6ZeWfT/LX+77/i498PyYr599pQOdnSFmrTun7/j/q+/6m7/uTJL81yfcm+YfffP3TRk42oPjTlX8uyT/Udd3ve9tFXdf9Te3MrD9joalPU/5m7pPXKb8wyTcUNt+U96esVU7ebIb580n+TN/3/0b5eiMn31nlXeuU7dxvqEs2svKdVtYpK788ya95ExL/epL/XpLf23Xd//XN92OyUjfjbcr7Ud61Tulzv2ch+WkkJ5v0iU9XzpL8iiR/oeu63933/Q+MXdT3/efKoem6bif3DkqXZPtNyPPmzUaZP5bk3+667o8n+akk/9sk//bnec6mrL2sTU66rjvOfa7Yf/xIvRs5+c4q65SVfyz3uv0vJ5km+Zdyf2LJX3pzyUZWvrPKOu3Pb0yyp///dJI/leSH3vz/x5L8tq7r/mzuQdC/kuT/8jmesynrL+vUKT8/947130iyn/v0iZ9M8rfeXPLTRk42TPGnLG9CBv/DJP9k13X/+hNX/+eTzHPvpf/gm79/6Zvn/gdJ/g9J/sMk/yDJjyf5nU/8/E15orJGOfk1Sf47SX5T8ez/oTfP3cjJd1hZo6zs5j7n/GXuDdevTPJP9X3/U2+eu5GV77CyLlnp+/6k7/uv85PkJslp3/fkh/7fkvx7uQdD/78k//6bzzblPSxr1ClfTfInkpwm+XtJvifJr3pzElLy00hOuu/AlI9N2ZRN2ZRN2ZRN2ZRN2ZQnLRumeFM2ZVM2ZVM2ZVM2ZVN+xpcNKN6UTdmUTdmUTdmUTdmUn/FlLaC467pf0XXd3+667u92XTea7L0pm5JsZGVTPn3ZyMqmfJqykZNN+bRlIyubUsuT5xS/OQz87+Q+2fsnkvwnSf6Zvu//iyd90KZ8x5eNrGzKpy0bWdmUT1M2crIpn7ZsZGVTxso6jmT7xUn+bt/3fy9Juq774SS/Osmjgra1tdXv7Ow89vWDwlt3Rt6+8+Czruse/Pjzes0n1UeZTCafeA3f2fGoTsjYvVxTr/Xn/HzSfX3fD9rg+8Y+e6zdj5Wxay4uLr7V9/2XP/HmzyEr0+m0397eHjyfPj42D/587O8qB5PJpP0k93P9Njkaq5v76v1cx7iNtfuxeZxMJm+dp8fma7lcZrlcPqh7uVwO5r5e48/qPb6u67oHddW2UZf/v7u7y93d3eOLZ1g+k6xMJpN+a2ul3mib120ttHdMh/g382k5sVz4+8dkpdbve+vnY/fy/5gu8He+7zF9QVksFqNyUn/qNcy/ZeUxOVksFqPPdpvG9NHNzc3adMrW1tZn1imPlcd0xnQ6HeiUOudcX8sn6ZT6ueseK56Xt133mM2yrqhzZfng+7fJCmNc9QLrdLFYvFVmXT/l9vZ2bToF28OYuQ/VDrxNT4/ZEuuM6XT66DXT6XS03jE58b0823Ljdtf2flo5eaxYTiwD1f7wnKoXqpz4ntpP5MTrdkxOPq1OWQco/jlJ/iv9/xNJ/rv1oq7rvj/J9yfJ9vZ2fu7P/bmtw0y8jdByuRwYISsHShWaruuytbWVvb297O3tZXt7OxjKnZ2dTKfT9jMmcNPpNFaWk8kkW1tbmU6n7be/s8DRl/r/3d1d+9xAx5NmhbNcLgcCQz03Nze5u7sbXHtzc/NAMfFMCx/fLRaLLBaL3N7eDgwic1Lrq+PddV3u7u4eAJ+/+Bf/4o/X+X6kfGZZ2draynd913eNggCMz2KxaPOBgkky+HtraysGTVtbW9nd3c3+/n4ODw+zv7+fvb29TKfTzGazNrfM+87OzkBZUB8yMZlMsru7m52dncxmswdyZHlBDiowurm5Gcz31tbWYPyrshkzUElycXGR29vb9tnd3V2bc/6+urrK1dVVbm5u2ndJA65NXq6vr3N9fT1QZFtbW
zk/P8cgDWTF1yGzzNWP/diPvVU4SvlEWbGcTKfTfOUrXxmsqwpmaYvn4u7urskJ6525Y/53d3czm81yfHyc2WzWdMv+/n67fmdnJ9vb29ne3h7MbQVHyb0u2tvba/cjE9YzlmHLdpLM5/O2pmk/99T1i9wwJhiU5XKZ09PT3N3dNbmgPub+5uYm8/k8V1dXTS4Yw7u7u9zc3DQ9cnNzk6urq4Eu2traarLIcxh/66vr6+sHevLHfuzH1qpTvud7vqd9h2H1PLl9/sy2iXlEBlj/BwcHOTg4aLKys7PTfpCRnZ2dgZxaZrzut7e3mz3b2tpq44tto93L5TI7OztNTqhvPp83XYA8IqP0CeeF+Td4ZQzOzs6a/UBe0Fc3Nze5vr7OxcVFrq+v2/+WlaurqyZ7yArPT5Ld3d3M5/Omj7xWLTvX19e5vb1t/f47f+fvfEox+XSyUuXku7/7u9t3zLv1BWPB2KJTqq3Z29vL7u5uG/+9vb0mJwcHB02nICe7u7vZ3d3N3t7eADgy18lQN/g
"text/plain": [
"<Figure size 864x432 with 7 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"pro = model.getProjections()[0]\n",
"\n",
"def reconstruct_face(k):\n",
" img = mean.copy()\n",
"\n",
" for i in range(k):\n",
" img = np.add(img, W[:,i]*pro[0,i])\n",
" \n",
" return img\n",
"\n",
"plt.figure(figsize=(12,6))\n",
"for i in range(6):\n",
" k = (i+1)*10\n",
" r_face = np.reshape(reconstruct_face(k), img_shape)\n",
" j = 0 if i <= 4 else 10\n",
" plt.subplot(151+i+100)\n",
" plt.imshow(r_face, cmap='gray')\n",
" plt.title(f\"k = {k}\")\n",
" \n",
"plt.subplot(257)\n",
"plt.imshow(train_data[0], cmap='gray');\n",
"plt.title(\"original\");"
]
},
{
"cell_type": "code",
"execution_count": 9,
"id": "ec772c43",
"metadata": {},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAeYAAAJsCAYAAAAoUADsAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjQuMiwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8rg+JYAAAACXBIWXMAAAsTAAALEwEAmpwYAAEAAElEQVR4nOz9eZBs237XB35XZo1ZmTWec+45745PT3oSesKWAUM7DLaiwWG1oA3tcKsZG4yi5SHAqBnM8AdWG9kB3dE26iAQyAYktzAgBA1ujHAQdMjgsMEMpiEQQnp677537z333nNOnRqyMrOmzN1/ZH1Wfvev1s6qc++571X1q19ERVVl7r32Gn7D9zestVNVVbqjO7qjO7qjO7qjm0Gtr3YH7uiO7uiO7uiO7mhGd4b5ju7oju7oju7oBtGdYb6jO7qjO7qjO7pBdGeY7+iO7uiO7uiObhDdGeY7uqM7uqM7uqMbRHeG+Y7u6I7u6I7u6AbRnWG+o5dGKaW3U0q/7JrXVimlr/+Iz/nI997RHd1kSil9b0rpR27yc1NKP5RS+r6P+JyPfO/XEn3NGuYLIzJKKR3Zz6e+Qs/+Aymlf5xSOk8pfe9X4pl3NJ9SSr8lpfT3UkonKaUfKnz/nSmlf5pS6qeUfjKl9Ku+8r28o+vSnXzX+vPWBZhd+Gr35ZOglNK3pJT+u5TSs5TSpYM5Lsb/V1NKeymlD1JKf+Smz8XXrGG+oP9tVVVd+3n8FXru5yX9h5L+26/Q816YbjrjfgL0WNL3SfqT8YuU0quSfkTSb5e0Lul3SfqvU0oPvqI9vKMXpTv5/tqgM0k/Kum7Gr7/o5KeSHok6Vsl/auS/v2vSM8+In2tG+YapZS2Ukp/JaX09AJd/ZWU0mv2/XZK6U+llB5ffP+X7LtfkVL6hyml/ZTS/5hS+ueanlNV1Q9XVfXjkvofoY+rF+GgvQvP7XellN6172th3hg6mtfPCy/jd6eU/pGkwUXbfyE8//+RUvr+a/TzF6aU/qeL57x/gVKXwmXfkVL6wgXS/b+llFp2/2++8FD3LtDwmy82Uy9GVVX9xaqq/pKk3cLXr0nar6rqx6sp/beSBpI+80n26Y5eLt0G+b6glZTSn7uIzvyDlNI/b/34VErpL1yM4Ysppf/AvvuFF1Gfw5TShyml/+ziq7958Xv/InLwL13VgZTSn7/wLg9SSn8zpfS5cMm9lNJfv+jjf+/ymVL6povvnqeU/llK6Ts/4jxci6qq+mdVVf0JSf+k4ZJPS/rRqqqOq6r6QNJfkxTHc6PozjDXqSXpT0l6U9IbkkaS/oh9//+U1NF0UR9I+s8lKaX0L2jqaf07knYk/XFJ/01KaflFO5BSeuNC+N9ouOQ/0tQgfEbSvy7pN75A29fp56+R9MslbWrqJX57Smnz4v4FSb9a0n91jceNJf2fJd2T9C9J+qW6jFL/d5J+gaSfJ+lXSvrNF8/5lZJ+n6R/U9J9SX9L0p+55hj/6MX8lX7+0XXaKNDfk/RPU0r/RkqpnaZh7BNJH7W9O/rq0G2Qb2kqC39e0rak/1rSX0opLV4A1/+3pP+vpFc1lanvSSn96xf3fb+k76+qal1T/fCjF5//Kxe/Ny8iB//TNbr645K+QdN5+AeS/nT4/tdJ+gOayvc/5PuU0pqkv37R7wea6os/mlL65qsemFL6xXNkdz+l9Iuv0e8S/WFJvzql1EnT6Nf/RlPjfHOpqqqvyR9Jb0s6krR/8fOXCtd8q6S9i78fSZpI2ipc9wOS/kD47J9J+lev6MOPSPreF+z3FyR9u/3/3ZLetf8rSV9v//+QpO+7Tj8v5uQ3h+9/XNL/6eLvXyHpJ6+Y01/W8N33SPp/hX76OP59SX/Dnvld9l1L0lDSm6UxvmS++D5JP1T4/Lsu+OX8oi+//KvNw3c/c9fxtsr390r62/Z/S9L7kn6JpF8k6cvh+t8r6U9d/P03Jf1fJN0L17x1ITMLVzz3Rxq+27y4f+Pi/x+S9Gft+66mQPx1Sf8HSX8r3P/HJf1Hdu/3fUJr/vWSqsLnP0fS37+Q3eqiD+mrzaPzfr7WPeZfVVXV5sXPr7pAVH88pfSllNKhpoy+mVJqa8p0z6uq2iu086ak3+Ho7uL6T6LY5FOS3rH/v/QC916nn++Ee35Y0q+/+PvXa+pVXEkppc9ehAo/uJjL/1RTdO0Ux0E/3pT0/dbH55KSpl7CV5zStNL8/yrp2yQtaZqj+i9TSt/61ejPHV2bbqN8SyYXVVVNJL178aw3JX0q9OP3SXrl4vLvkvRZST+VUvq7KaVf8VEefhEV+oMppZ+9mKe3L75y+fU+Hmkqo/TxF4U+/jpJDz9KXz4uXUQZ/pqkvyhpTdMxbEn6Q1+N/lyXvtYNc6TfIekbJf2iahoOIgSUNGXEbcK6gd6R9J+YEtisqqpTVdW1wq8vSO9rqhSgGBIbahqOg1wgrtPPWNX4lyT9cymlb9HUY44hrSb6AUk/JekbLuby92k6j05xHBTnvCPp3wn9XK2q6n+86qEppT+W6pW4/tOUg7qKvlXS36yq6u9VVTWpqurvSvo7kq61NeyObgzdBvmWTC4uDMtrmsrGO5K+GPrRq6rqOySpqqqfqarq12gaQv5Dkn7sIrT8oq8Q/LWahtN/maQNTT1uqS6/3seupmF3+vjfhz52q6r69656aErpl8yR3aOU0i95wXHool9vSPojVVWdVFW1q2k64zs+QltfMbozzHXqaZp32k8pbWuaz5UkVVX1vqYh1j+apkUkiyklBPu/kPTvppR+UZrSWkrpl6eUeqWHXNy7oun8L6SUVi5Q+3XoRyX93os+vCbpt4bv/6GkX3uBer9dU+8OeqF+Xoz7WNKPaZoz+p+rqvryNfvZk3Qo6Sil9E2SSoL5uy7G8bqk3ybpz118/scuxvg5SUopbaSU/vfXeWhVVf9uVa/E9Z/Ggo+U0sLFmrQltS/WhMr0vyvpl+Ahp2nO8ZfoLsd82+g2yLck/fyU0r95wX/fo2k9w9+W9D9L6qdpgebqhYx/S0rpX7x47q9PKd2/8LL3L9qaSHp68fvrrvn83sUzdzUF+f9p4ZrvSNOc8JKmuea/XVXVO5L+iqTPppR+w8U8LKaU/sWU0s+56qFVVf2tObLbrarqb5Xuu1iTFU2jWbqY7+WLNp9J+qKkf+9Cxjc1rcu52bL71Y6lf7V+VMiHahqK+QlNc1M/rWmxR87NaIq+fljSh5L2JP1Fu/fbNVXg+5p6tX9eUq/h2T900a7//KaL7964eP4bDfd2NC2+2pf0k5pu3fEc8y/QtDqxr2nY+c/Icjrz+lmak4vPf/FFH//t686ppt7IT12M5W9J+o8l/Q92bSXpP9A0Z74r6f8uqW3f/wZJ/1hT4/6OpD8Z7n2pOWZNc2xxTb7Xvv8tmm6D6V/0+Xd8tXn47ud6vGif3Qb5/l5NgfCfu+C1/0XSzwtj+DOSPrjo4982mfsRTbcFHV3ogF9l9/3HmhrofUn/q4bn/sjF311Jf/ni+V+S9H90mbsY3
x/TtMjrSNOUwKetrW/UdKvY0wvZ/v9I+la796XmmDXLofvP2/b9t16s+56kZ5o6N698tXl03k+66Pgd3VJKKX2bpgL12hWXfpxnvKGpkX1YVdXhJ/WcO7qjO7qjO7oLZd/RFXSR4/rtmlZh3hnlO7qjO7qjT5i+1k53uqMXoIvCkQ81DWd9+1e5O3d0R3d0R18T9Il4zCmlb0/TE18+n1L6PZ/EM+5oSlVV/cQnFcauqmpQXRRNVdPCjjv6GqQ7eb6jO/rK0kvPMV9UH/60pH9N0/13f1fSr6mq6idf6oPu6I7u6BOnO3m+ozv6ytMn4TH/Qkmfr6rqC1VVnUr6s5ruibujO7qj20d38nxHd/QVpk8ix/yq6ic6vavpUXKNtLa2Vm1vb9c+SynNSsdTPJdiPhEFeJH7/B5/duzTvPZjG5I0mUyuHAPftVqt2v0pJbVaLVVVpclkUnymX8d34/G49tvH4X+nlNRut2v9nDc3pf6mlDSZTPKzvO/cG+fSx+Bja7fbub3JZFJ7DnNzfn5
"text/plain": [
"<Figure size 576x792 with 4 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"def find_face(query_id):\n",
" query_face = test_data[query_id]\n",
" query_label = test_labels[query_id]\n",
"\n",
" x = np.reshape(query_face, mean.shape)\n",
" x_coeff = np.dot(x - mean, W)\n",
"\n",
" best_face = None\n",
" best_label = None\n",
" best_dist = float('inf')\n",
"\n",
" for i, p in enumerate(model.getProjections()):\n",
" dist = np.linalg.norm(np.reshape(p, 60) - np.reshape(x_coeff, 60))\n",
"\n",
" if dist < best_dist:\n",
" best_face = train_data[i]\n",
" best_label = train_labels[i]\n",
" best_dist = dist\n",
" \n",
" return query_face, query_label, best_face, best_label\n",
"\n",
"qf_1, ql_1, bf_1, bl_1 = find_face(45)\n",
"qf_2, ql_2, bf_2, bl_2 = find_face(10)\n",
"\n",
"plt.figure(figsize=(8,11))\n",
"plt.subplot(221)\n",
"plt.imshow(qf_1, cmap='gray')\n",
"plt.title(f\"Face 1: query label = {ql_1}\")\n",
"plt.subplot(222)\n",
"plt.imshow(bf_1, cmap='gray');\n",
"plt.title(f\"Face 1: best label = {bl_1}\")\n",
"plt.subplot(223)\n",
"plt.imshow(qf_2, cmap='gray')\n",
"plt.title(f\"Face 2: query label = {ql_2}\")\n",
"plt.subplot(224)\n",
"plt.imshow(bf_2, cmap='gray');\n",
"plt.title(f\"Face 2: best label = {bl_2}\");"
]
},
{
"cell_type": "code",
"execution_count": 10,
"id": "cce2d243",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Accuracy: 97.92 %\n"
]
}
],
"source": [
"predictions = []\n",
"for test_img in test_data:\n",
" p_label, p_conf = model.predict(test_img)\n",
" predictions.append(p_label)\n",
" \n",
"print(f\"Accuracy: {sklearn.metrics.accuracy_score(test_labels, predictions) * 100:.2f} %\")"
]
},
{
"cell_type": "code",
"execution_count": 11,
"id": "43e4b303",
"metadata": {},
"outputs": [],
"source": [
"model = cv.face.FisherFaceRecognizer_create(40)\n",
"model.train(np.array(train_data), np.array(train_labels))"
]
},
{
"cell_type": "code",
"execution_count": 12,
"id": "6f05732b",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Accuracy: 91.67 %\n"
]
}
],
"source": [
"predictions = []\n",
"for test_img in test_data:\n",
" p_label, p_conf = model.predict(test_img)\n",
" predictions.append(p_label)\n",
" \n",
"print(f\"Accuracy: {sklearn.metrics.accuracy_score(test_labels, predictions) * 100:.2f} %\")"
]
},
{
"cell_type": "code",
"execution_count": 13,
"id": "0ee60d76",
"metadata": {},
"outputs": [],
"source": [
"model = cv.face.LBPHFaceRecognizer_create(radius=10, neighbors=10, grid_x=32, grid_y=32)\n",
"model.train(np.array(train_data), np.array(train_labels))"
]
},
{
"cell_type": "code",
"execution_count": 14,
"id": "51dad390",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Accuracy: 88.54 %\n"
]
}
],
"source": [
"predictions = []\n",
"for test_img in test_data:\n",
" p_label, p_conf = model.predict(test_img)\n",
" predictions.append(p_label)\n",
" \n",
"print(f\"Accuracy: {sklearn.metrics.accuracy_score(test_labels, predictions) * 100:.2f} %\")"
]
}
],
"metadata": {
"author": "Andrzej Wójtowicz",
"email": "andre@amu.edu.pl",
"kernelspec": {
"display_name": "Python 3.8.12 64-bit",
"language": "python",
"name": "python3"
},
"lang": "pl",
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.12"
},
"subtitle": "08. Rozpoznawanie twarzy [laboratoria]",
"title": "Widzenie komputerowe",
"vscode": {
"interpreter": {
"hash": "31f2aee4e71d21fbe5cf8b01ff0e069b9275f58929596ceb00d14d90e3e16cd6"
}
},
"year": "2021"
},
"nbformat": 4,
"nbformat_minor": 5
}