Tensorflow끄적끄적
-
lec5_LogisticClassification | Tensorflow끄적끄적/Learning the Basics (Sung Kim) | 2020. 4. 27. 03:32
''' Classification
Spam Detection: Spam(1) or Ham(0)
Facebook feed: show(1) or hide(0)
Credit Card Fraudulent Transaction detection: legitimate(0)/fraud(1)
g(z) = 1/(1+e^(-z))
When y=1: H(x)=1 -> cost=0, H(x)=0 -> cost grows to infinity.
When y=0: H(x)=0 -> cost=0, H(x)=1 -> cost grows to infinity.
cost(H(x),y) = -y*log(H(x)) - (1-y)*log(1-H(x)) '''
import tensorflow as tf
x_data = [[1,2],[2,3],[3,1],[4,3],[5,3],[6,2]]
y_data = [[0],[0],[0],[1],..
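The cost cases above can be checked numerically. This is a minimal sketch in plain Python (not the post's TensorFlow code) of the sigmoid g(z) and the cross-entropy cost from the formula; the specific probe values 0.999 and 0.001 are just illustrative:

```python
import math

def sigmoid(z):
    # g(z) = 1 / (1 + e^(-z))
    return 1.0 / (1.0 + math.exp(-z))

def cost(h, y):
    # cost(H(x), y) = -y*log(H(x)) - (1-y)*log(1-H(x))
    return -y * math.log(h) - (1 - y) * math.log(1 - h)

# When y = 1: cost is near 0 for H(x) near 1, and blows up as H(x) -> 0
print(cost(0.999, 1))  # ~0.001
print(cost(0.001, 1))  # ~6.9
# When y = 0: the cases are mirrored
print(cost(0.001, 0))  # ~0.001
print(cost(0.999, 0))  # ~6.9
```

Note the symmetry: a confident wrong prediction is punished without bound, which is exactly why this cost replaces squared error for classification.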
-
lec4_multiVariableLinearRegression | Tensorflow끄적끄적/Learning the Basics (Sung Kim) | 2020. 4. 27. 03:31
''' Cost function
H(x1,x2,x3) = w1x1 + w2x2 + w3x3 + b
Matrix multiplication (dot product):
(x1,x2,x3).dot(w1,w2,w3) = w1x1 + w2x2 + w3x3
X = (x1,x2,x3), W = (w1,w2,w3), H(X) = XW
Many x instances, e.g. x1 = [73,93,89,96,73]
Hypothesis using matrix:
|x11 x12 x13|   |w1|   |x11*w1 + x12*w2 + x13*w3|
|x21 x22 x23| . |w2| = |x21*w1 + x22*w2 + x23*w3|
                |w3|
'''
import tensorflow as tf
x1_data = [73.,93.,89.,96.,73.]
x2_data = [80.,88.,91.,9..
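The matrix hypothesis H(X) = XW can be sketched in plain Python without TensorFlow. The data values below reuse the first column (73, 93, ...) from the post; the other features and the all-ones weight vector are made-up illustrative values, not learned parameters:

```python
# Hypothesis using a matrix: H(X) = XW, one row of X per instance
X = [[73., 80., 75.],
     [93., 88., 93.]]
W = [[1.0], [1.0], [1.0]]  # illustrative weights, not trained values

def matmul(A, B):
    # (m x n) . (n x 1) -> (m x 1): each output row is a dot product
    return [[sum(a * b[0] for a, b in zip(row, B))] for row in A]

H = matmul(X, W)
print(H)  # [[228.0], [274.0]]
```

With W fixed at ones, each hypothesis value is just the row sum (73+80+75 = 228), which makes the row-times-column pattern from the diagram easy to verify by hand.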
-
lec3_HowtoMinimize | Tensorflow끄적끄적/Learning the Basics (Sung Kim) | 2020. 4. 27. 03:23
''' What does cost(W) look like?
Gradient descent algorithm (an algorithm that moves downhill along the gradient)
- It is used in many minimization problems.
- For a given cost function cost(W,b), it will find the W,b that minimize the cost.
- It can be applied to more general functions.
How does it work? How would you find the lowest point?
1. Start with initial guesses
   - Start at 0,0 (or any other value)
   - Keep changing W and b a little bit to try and reduce c..
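The "start anywhere, keep nudging W downhill" procedure can be sketched in a few lines of plain Python. This assumes the simple cost(W) = (1/m) * Σ(W*x_i - y_i)² from these lectures (no bias term, to keep it one-dimensional); the starting point 5.0 and learning rate 0.1 are arbitrary choices:

```python
# Gradient descent on cost(W) = (1/m) * sum((W*x_i - y_i)^2)
x_data = [1., 2., 3.]
y_data = [1., 2., 3.]   # true relationship is y = 1*x, so W should reach 1

W = 5.0       # initial guess (any value works)
alpha = 0.1   # learning rate

for _ in range(100):
    # analytic gradient: d(cost)/dW = (2/m) * sum((W*x_i - y_i) * x_i)
    grad = 2.0 / len(x_data) * sum((W * x - y) * x for x, y in zip(x_data, y_data))
    W -= alpha * grad   # step a little bit against the gradient

print(W)  # converges to ~1.0
```

Because this cost is a convex bowl in W, every starting point slides to the same minimum, which is why the slides say "start at 0,0 or any other value".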