From 8421d2f7a7133e856f94c3584b2cb7f6ae08a4e7 Mon Sep 17 00:00:00 2001
From: Kamila Bobkowska
Date: Sun, 3 May 2020 17:43:59 +0000
Subject: [PATCH] report for individual project

---
 Report_Patryk_Krawiec.md | 42 ++++++++++++++++++++++++++++++++++++++++
 1 file changed, 42 insertions(+)
 create mode 100644 Report_Patryk_Krawiec.md

diff --git a/Report_Patryk_Krawiec.md b/Report_Patryk_Krawiec.md
new file mode 100644
index 0000000..21682b6
--- /dev/null
+++ b/Report_Patryk_Krawiec.md
@@ -0,0 +1,42 @@

# **Report - Patryk Krawiec**

## Introduction
In our project a garbage truck moves around the grid and visits houses, displayed as garbage dumps for simplicity.
We divide the houses into two groups:
* houses of people who pay for garbage collection
* houses of people who don't pay

When the truck reaches one of the dumpsters, it analyzes the unique 2-digit number of that particular household to determine whether to collect the garbage or not.
The digits of this number are randomly chosen images from the test set, so there is still a small chance of a mistake during the recognition phase.

## Implementation
I used a deep neural network as my method of recognizing the digits.
The main function of my network is L_layer_model. It uses the following functions:
* L_model_forward - propagates the input through the network and returns its output (together with the cached values needed for backpropagation)
* L_model_backward - returns the gradients of the weights and biases
* compute_cost - returns the value of the cost function, which measures how well the network is trained
* update_parameters - updates the weights using gradient descent, improving the network's performance

A minimal sketch of how these functions could fit together is included at the end of this report.

## Changes in group files
To implement house numbering I had to assign pictures of digits to each dumpster. That is why two new parameters of the Dumpster model were initialized. In the main function I randomly choose those digits and then exclude a few of them from the list of paying customers, so when the truck visits such a house no trash is collected.

## Forward Propagation
Once the weights are initialized, we can begin forward propagation. To avoid the vanishing gradient problem, I use ReLU as the activation function in every hidden layer. The sigmoid function, on the other hand, returns probability-like output, which makes it ideal for the output layer.

## Important Notice
Training this model takes some time, so in order not to recompute it every run, I save the trained network in the file "NN.npy". I highly recommend running the numbering.py file before Main.py to generate the neural network and two additional files containing pictures of numbers randomly chosen from the test set. A sketch of this caching step is also included at the end of this report.

## Libraries
The following libraries and data sets are needed to run the program:

```
from sklearn.datasets import load_digits
import matplotlib.pyplot as plt
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
import numpy as np
import os.path
import csv
import random
```
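
## Example Sketch: Training Loop
To make the Implementation and Forward Propagation sections more concrete, here is a minimal, self-contained sketch of how L_layer_model, L_model_forward, compute_cost, L_model_backward and update_parameters could fit together on the load_digits data set. The function names mirror the report, but the bodies, signatures, layer sizes and hyperparameters are simplified assumptions of mine, not the project's actual code.

```python
# A simplified stand-in for the network described in the report:
# ReLU in every hidden layer, sigmoid in the output layer, trained with
# plain batch gradient descent. Illustrative only.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

def relu(z):
    return np.maximum(0, z)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def init_parameters(layer_dims, seed=1):
    rng = np.random.default_rng(seed)
    params = {}
    for l in range(1, len(layer_dims)):
        params["W" + str(l)] = rng.standard_normal((layer_dims[l], layer_dims[l - 1])) * 0.1
        params["b" + str(l)] = np.zeros((layer_dims[l], 1))
    return params

def L_model_forward(X, params):
    """Forward pass: ReLU hidden layers, sigmoid output layer."""
    L = len(params) // 2
    A, caches = X, []
    for l in range(1, L):
        Z = params["W" + str(l)] @ A + params["b" + str(l)]
        caches.append((A, Z))            # keep layer inputs and pre-activations for backprop
        A = relu(Z)
    ZL = params["W" + str(L)] @ A + params["b" + str(L)]
    caches.append((A, ZL))
    return sigmoid(ZL), caches

def compute_cost(AL, Y):
    """Cross-entropy cost averaged over the training examples."""
    m = Y.shape[1]
    eps = 1e-8
    return -np.sum(Y * np.log(AL + eps) + (1 - Y) * np.log(1 - AL + eps)) / m

def L_model_backward(AL, Y, params, caches):
    """Gradients of the cost with respect to every weight and bias."""
    grads, L, m = {}, len(params) // 2, Y.shape[1]
    dZ = AL - Y                          # sigmoid + cross-entropy in the output layer
    for l in reversed(range(1, L + 1)):
        A_prev, _ = caches[l - 1]
        grads["dW" + str(l)] = dZ @ A_prev.T / m
        grads["db" + str(l)] = np.sum(dZ, axis=1, keepdims=True) / m
        if l > 1:
            dA_prev = params["W" + str(l)].T @ dZ
            _, Z_prev = caches[l - 2]
            dZ = dA_prev * (Z_prev > 0)  # derivative of ReLU
    return grads

def update_parameters(params, grads, learning_rate):
    """One gradient-descent step on every weight and bias."""
    for key in grads:                    # keys look like "dW1", "db1", ...
        params[key[1:]] -= learning_rate * grads[key]
    return params

def L_layer_model(X, Y, layer_dims, learning_rate=0.1, num_iterations=2000):
    params = init_parameters(layer_dims)
    for i in range(num_iterations):
        AL, caches = L_model_forward(X, params)
        cost = compute_cost(AL, Y)
        grads = L_model_backward(AL, Y, params, caches)
        params = update_parameters(params, grads, learning_rate)
        if i % 500 == 0:
            print(f"iteration {i}: cost = {cost:.4f}")
    return params

if __name__ == "__main__":
    digits = load_digits()
    X_train, X_test, y_train, y_test = train_test_split(
        digits.data, digits.target, test_size=0.2, random_state=0)
    X_train = StandardScaler().fit_transform(X_train).T   # shape (features, examples)
    Y_train = np.eye(10)[y_train].T                       # one-hot labels, shape (10, examples)
    trained = L_layer_model(X_train, Y_train, layer_dims=[64, 32, 16, 10])
```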
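
## Example Sketch: Caching the Trained Network
The Important Notice section explains that the trained network is saved to "NN.npy" so it does not have to be retrained on every run. Below is a small sketch of how such caching could look; the file name comes from the report, but the function, its signature and the surrounding logic are assumptions of mine rather than the actual contents of numbering.py.

```python
# Illustrative caching of trained parameters in "NN.npy" (file name from the report).
import os.path
import numpy as np

NN_FILE = "NN.npy"

def get_trained_network(train_fn, X_train, Y_train, layer_dims):
    """Load cached parameters if the file exists, otherwise train and save them."""
    if os.path.isfile(NN_FILE):
        # np.save stores a dict as a 0-d object array, hence allow_pickle and .item()
        return np.load(NN_FILE, allow_pickle=True).item()
    params = train_fn(X_train, Y_train, layer_dims)   # e.g. L_layer_model from the sketch above
    np.save(NN_FILE, params)
    return params
```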
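
## Example Sketch: Assigning House Numbers to Dumpsters
The Changes in group files section mentions two new parameters on the Dumpster model and the exclusion of some households from the paying list. The class and attribute names below are purely illustrative, since the group's actual Dumpster class is not shown in this report; only the general idea of attaching random test-set digit images and excluding a few households follows the description above.

```python
# Illustrative only: attach two random test-set digit images to each dumpster
# and mark a few households as non-paying.
import random
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split

digits = load_digits()
_, X_test, _, y_test = train_test_split(digits.data, digits.target,
                                        test_size=0.2, random_state=0)

def random_digit():
    """Return a random test-set image together with its true label."""
    i = random.randrange(len(y_test))
    return X_test[i], int(y_test[i])

class Dumpster:
    def __init__(self):
        # the two new parameters: one image for each digit of the house number
        self.tens_image, self.tens_digit = random_digit()
        self.ones_image, self.ones_digit = random_digit()

    @property
    def house_number(self):
        return 10 * self.tens_digit + self.ones_digit

dumpsters = [Dumpster() for _ in range(10)]
paying = set(range(len(dumpsters)))
non_paying = set(random.sample(sorted(paying), k=3))   # exclude a few households
paying -= non_paying
```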