Support Vector Machine + AdaBoost + Linear Regression + Logistic Regression + K-Nearest Neighbors + Math + Python
Download source code @ https://sites.fastspring.com/prototypeprj/instant/ai
SVM + SMO + Python (Prototype Project 01)
demo a prebuilt version of the application
code the application
go over the training data used in this app.
go over the various classes that make up the app.
quick introduction to Support Vector Machine (SVM)
quick introduction to Sequential Minimal Optimization (SMO)
code the SupportVectorMachines class
plug the equation for 'w' into the equation of a linear SVM and use the resulting equation in code
EPSILON and slack penalty C
alphas that violate the KKT conditions
selecting the index of the 2nd alpha to optimize
optimize an alpha pair and b
define and calculate w
classify method
display information tables code
handle command line entry code
plot data + decision boundary + support vectors code
explain + test run application
change C and rerun app. to show overfitting
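The pieces above can be sketched in a few lines: given multipliers that SMO would produce, recover w from the equation w = Σ αᵢ yᵢ xᵢ and classify with sign(w·x + b). The data and alpha values below are made up for illustration; they are not the course's training set or its SupportVectorMachines class.

```python
import numpy as np

# Hypothetical training points, labels, and multipliers (assumed values,
# standing in for what SMO would compute on real data).
X = np.array([[1.0, 2.0], [2.0, 3.0], [3.0, 1.0], [4.0, 0.5]])
y = np.array([1.0, 1.0, -1.0, -1.0])
alphas = np.array([0.5, 0.0, 0.5, 0.0])  # nonzero entries mark support vectors
b = 0.0

# w = sum_i alpha_i * y_i * x_i  -- the 'w' equation plugged into the linear SVM
w = (alphas * y) @ X

def classify(x):
    """Linear SVM decision rule: sign(w . x + b)."""
    return 1 if np.dot(w, x) + b >= 0 else -1

print(classify([1.0, 3.0]))  # lands on the +1 side of this toy boundary
```

Points with α > 0 are the support vectors; only they contribute to w, which is why the plot in the app highlights them.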
AdaBoost + Python (Prototype Project 01)
demo a prebuilt version of the application
code the application
go over the training data used in this application
go over the various classes that make up the application
go over boosting
go over data classification with AdaBoost
explain + code the TreeTrunk classifier class
explain + code the AdaBoost class
explain + code the DisplayHelper class
explain + code the handle_command_line entry function
code the application
test run the application
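One boosting round can be sketched as follows, with a fixed decision stump playing the role the course's TreeTrunk classifier plays: compute the stump's weighted error, derive its vote weight alpha, and re-weight the samples so the next stump focuses on the mistakes. The data and the stump threshold are assumptions for illustration.

```python
import numpy as np

# Toy 1-D data and labels (assumed, not the course's training set).
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1, 1, -1, -1, -1])
w = np.full(len(X), 1 / len(X))       # start with uniform sample weights

# Hypothetical stump: predict +1 when x < 3.5, else -1 (misclassifies x = 3).
pred = np.where(X < 3.5, 1, -1)

err = w[pred != y].sum()              # weighted error of the stump
alpha = 0.5 * np.log((1 - err) / err) # the stump's vote weight in the ensemble
w = w * np.exp(-alpha * y * pred)     # up-weight mistakes, down-weight correct
w /= w.sum()                          # renormalize to a distribution
```

After this round the misclassified point carries half the total weight, so the next stump is pulled toward getting it right.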
Linear Regression + Python + Normal Equation (Prototype Project 01)
demo a prebuilt version of the application
code the application
go over the training data used in this app.
use matplotlib to plot the data
linear regression hypothesis and normal equation
implement normal equation
throw an exception if the matrix is not invertible (determinant is zero)
estimate rent method
handle command line entry function
test run completed app.
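The steps above reduce to a few lines of NumPy: solve θ = (XᵀX)⁻¹Xᵀy and guard against a singular XᵀX before inverting. The rent figures and the `estimate_rent` helper below are assumptions for illustration, not the app's actual data.

```python
import numpy as np

# Assumed toy data: bias column + square footage, with monthly rent as target.
X = np.array([[1.0, 500], [1.0, 700], [1.0, 900]])
y = np.array([1000.0, 1300.0, 1600.0])

xtx = X.T @ X
# Mirror the app's check: refuse to invert a singular matrix.
if abs(np.linalg.det(xtx)) < 1e-10:
    raise ValueError("X^T X is singular; the normal equation has no unique solution")

theta = np.linalg.inv(xtx) @ X.T @ y  # normal equation: (X^T X)^-1 X^T y

def estimate_rent(sqft):
    """Plug a square footage into the fitted hypothesis."""
    return theta[0] + theta[1] * sqft
```

On this toy data the fit is exact (rent = 250 + 1.5·sqft), so `estimate_rent(800)` returns 1450.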
Logistic Regression + Gradient Descent + Python (Prototype Project 01)
demo a prebuilt version of the application
code the application
training data used in this app.
logistic regression hypothesis
logistic/sigmoid function
gradient of the cost function
update weights with gradient descent
implement logistic method in LogisticRegression class
implement gradient_descent method in LogisticRegression class
implement classify method in LogisticRegression class
function using matplotlib to plot candidates' scores and the decision boundary
command line entry function
test run the completed app.
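The hypothesis, gradient, and update rule above fit in a short sketch: h = sigmoid(Xw), the cost gradient is Xᵀ(h − y)/m, and each step moves w against it. The scores, labels, learning rate, and iteration count below are assumed toy values, not the course's data or its LogisticRegression class.

```python
import numpy as np

def sigmoid(z):
    """The logistic/sigmoid function: squashes any real z into (0, 1)."""
    return 1 / (1 + np.exp(-z))

# Assumed toy data: bias column + a single candidate score; label 1 = pass.
X = np.array([[1.0, 2.0], [1.0, 3.0], [1.0, 7.0], [1.0, 8.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])
w = np.zeros(2)
lr = 0.1

for _ in range(5000):
    h = sigmoid(X @ w)              # hypothesis for every sample at once
    grad = X.T @ (h - y) / len(y)   # gradient of the cross-entropy cost
    w -= lr * grad                  # batch gradient descent update

def classify(score):
    """Threshold the hypothesis at 0.5."""
    return 1 if sigmoid(w[0] + w[1] * score) >= 0.5 else 0
```

Because every iteration uses the whole batch, the path to the minimum is smooth; the stochastic variant in the next project trades that smoothness for cheaper updates.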
Logistic Regression + Stochastic Gradient Descent + Python (Prototype Project 02)
demo a prebuilt version of the application
code the app.
go over training data used in this app.
gradient descent
stochastic gradient descent
LogisticRegression class from previous tutorial modified to include stochastic gradient descent implementation
create a new LogisticRegression instance and call the stochastic gradient descent method on it
plot function implementation using matplotlib
handle command line entry function
test run completed app.
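The stochastic variant can be sketched by changing one thing in the batch version: update the weights from one shuffled sample at a time instead of averaging the gradient over the whole set. Data, learning rate, and epoch count are assumed toy values.

```python
import numpy as np

rng = np.random.default_rng(0)  # seeded so the shuffling is reproducible

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# Same assumed toy data as the batch sketch: bias column + one score.
X = np.array([[1.0, 2.0], [1.0, 3.0], [1.0, 7.0], [1.0, 8.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])
w = np.zeros(2)
lr = 0.1

for _ in range(2000):                    # epochs
    for i in rng.permutation(len(y)):    # visit samples in random order
        h = sigmoid(X[i] @ w)
        w -= lr * (h - y[i]) * X[i]      # per-sample gradient step

def classify(score):
    return 1 if sigmoid(w[0] + w[1] * score) >= 0.5 else 0
```

The weight trajectory is noisier than batch descent's, but each update costs only one sample, which is what makes the method attractive on large training sets.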
K-Nearest Neighbors + Python (Prototype Project 01)
demo a prebuilt version of the application
min-max normalization
3-nearest-neighbors Euclidean distance calculation
code the application
go over training data used in this application
min-max normalization code
Euclidean distance calculation code
Sentiment classification prediction code
handle command line entry code
test run completed application
use 3 nearest neighbors
normalize inputs and calculate Euclidean distance
use 5 nearest neighbors
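The pipeline above fits in a short sketch: min-max normalize each feature into [0, 1], compute Euclidean distances from the query to every training point, and take a majority vote among the k nearest. The feature values and sentiment labels are assumed toy data.

```python
import numpy as np
from collections import Counter

# Assumed toy training data: two features per sample, hypothetical labels.
X = np.array([[1.0, 20.0], [2.0, 30.0], [8.0, 90.0], [9.0, 80.0], [7.0, 70.0]])
labels = ["neg", "neg", "pos", "pos", "pos"]

# Min-max normalization: scale each column into [0, 1] so neither feature
# dominates the distance just because its range is larger.
mins, maxs = X.min(axis=0), X.max(axis=0)
Xn = (X - mins) / (maxs - mins)

def predict(x, k=3):
    """Majority vote among the k nearest neighbors of a normalized query."""
    xn = (np.asarray(x) - mins) / (maxs - mins)    # normalize the query too
    dists = np.sqrt(((Xn - xn) ** 2).sum(axis=1))  # Euclidean distances
    nearest = np.argsort(dists)[:k]
    return Counter(labels[i] for i in nearest).most_common(1)[0][0]
```

Changing `k` from 3 to 5 just widens the vote, which is all the "use 5 nearest neighbors" step amounts to.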
Intro to Linear Algebra + Python NumPy (Prototype Project 01)
demo a prebuilt version of the application
add 2 matrices (both must be the same size; add elements in the same position)
subtract 2 matrices (both must be the same size; subtract elements in the same position)
multiply 2 matrices (inner dimensions must match; the resulting matrix's size is obtained by dropping the middle dimension)
step-by-step example of multiplying 2 matrices
scalar add (operation applied to each element in the matrix)
scalar subtract (operation applied to each element in the matrix)
scalar multiply (operation applied to each element in the matrix)
scalar divide (operation applied to each element in the matrix)
identity matrix contains all 0s except the diagonal, which is 1s (multiplying a matrix by the identity matrix returns the original matrix)
transpose a matrix by flipping it along the diagonal: rows become columns and columns become rows
dot product of 2 vectors (element-wise multiplication, then sum the results)
NumPy setup using miniconda
code the application
matrix as 2 dimensional array
add and subtract matrices in NumPy
multiply matrices in NumPy
matrix scalar operations in NumPy
identity matrix operation in NumPy
transpose matrix operation in NumPy
dot product matrix operation in NumPy
vectors represented as matrices
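The operations in this section map one-to-one onto NumPy expressions; a quick sketch with small assumed matrices:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

add = A + B                 # element-wise addition (same shape required)
sub = A - B                 # element-wise subtraction
prod = A @ B                # matrix multiplication (inner dims must match)
scaled = A * 2              # scalar multiply hits every element; +, -, / likewise
I = np.eye(2, dtype=int)    # identity: 1s on the diagonal, 0s elsewhere
same = A @ I                # multiplying by I returns the original matrix
T = A.T                     # transpose: rows become columns

v1 = np.array([1, 2, 3])
v2 = np.array([4, 5, 6])
dp = v1 @ v2                # dot product: element-wise multiply, then sum
```

Note that `*` on two arrays is element-wise, while `@` (or `np.dot`) is true matrix multiplication; confusing the two is the most common NumPy pitfall in this material.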