
All Posts (5)
How to minimize cost Gradient descent algorithm • Minimizes the cost function • Gradient descent is used in many minimization problems • For a given cost function cost(W, b), it will find the W, b that minimize cost • It can be applied to more general functions: cost(w1, w2, …) How does it work? • Start with initial guesses - Start at 0, 0 (or any other value) - Keep changing W and b a little bit to try and reduce cost(W,..
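A minimal sketch of the gradient-descent idea from this post, assuming the simplified hypothesis H(x) = W*x and a mean-squared-error cost (the variable names, learning rate, and step count are illustrative, not the post's exact code):

# Plain-Python gradient descent for cost(W) = mean((W*x - y)^2)
x_data = [1, 2, 3]
y_data = [1, 2, 3]

W = 0.0                  # start with an initial guess (e.g. 0)
learning_rate = 0.1

for step in range(100):
    # gradient of the mean squared error with respect to W:
    # d/dW mean((W*x - y)^2) = mean(2 * (W*x - y) * x)
    gradient = sum(2 * (W * x - y) * x for x, y in zip(x_data, y_data)) / len(x_data)
    W -= learning_rate * gradient   # keep changing W a little bit to reduce the cost

print(W)  # converges toward 1.0 for this data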
Linear Regression 2 and placeholders (1.X version) import tensorflow as tf # X and Y data x_train = [1, 2, 3] y_train = [1, 2, 3] W = tf.Variable(tf.random_normal([1]), name='weight') b = tf.Variable(tf.random_normal([1]), name='bias') # Our hypothesis XW+b hypothesis = x_train * W + b # cost/loss function cost = tf.reduce_mean(tf.square(hypothesis - y_train)) # Minimize optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.01) train = o..
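The preview is cut off mid-statement. As a hedged sketch (not the post's exact code) of how such a TensorFlow 1.X linear-regression script typically continues, here rewritten with placeholders as the post title suggests; the feed values and step count are illustrative:

import tensorflow as tf

# Placeholders let training data be fed in at run time instead of hard-coding it
X = tf.placeholder(tf.float32, shape=[None])
Y = tf.placeholder(tf.float32, shape=[None])

W = tf.Variable(tf.random_normal([1]), name='weight')
b = tf.Variable(tf.random_normal([1]), name='bias')

# Hypothesis XW + b and squared-error cost
hypothesis = X * W + b
cost = tf.reduce_mean(tf.square(hypothesis - Y))

optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.01)
train = optimizer.minimize(cost)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(2001):
        cost_val, W_val, b_val, _ = sess.run(
            [cost, W, b, train],
            feed_dict={X: [1, 2, 3], Y: [1, 2, 3]})
        if step % 200 == 0:
            print(step, cost_val, W_val, b_val)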
Linear Regression The reason for squaring is that it turns the differences into positive values, and larger errors stand out more~ Source - https://www.youtube.com/user/hunkims
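A small illustrative sketch (made-up values, not from the post) of why the cost uses squared differences: squaring makes every error positive, and larger errors count disproportionately more:

predictions = [1.1, 2.0, 4.0]   # hypothetical hypothesis outputs
targets     = [1.0, 2.0, 3.0]

squared_errors = [(p - t) ** 2 for p, t in zip(predictions, targets)]
print(squared_errors)           # roughly [0.01, 0.0, 1.0] -> the error of 1.0 dominates
cost = sum(squared_errors) / len(squared_errors)
print(cost)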
TensorFlow Basics (1.X version) What is a Data Flow Graph? Nodes in the graph represent mathematical operations; edges represent the multidimensional data arrays (tensors) communicated between them. Installing TensorFlow Linux, Mac OS X, Windows (sudo -H) pip install --upgrade tensorflow (sudo -H) pip install --upgrade tensorflow-gpu From source: bazel ... https://www.tensorflow.org/install/install_sources Google search/C..
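A minimal TensorFlow 1.X sketch of the data-flow-graph idea described above: each call builds a node (an operation), the edges between nodes carry tensors, and nothing is computed until a session runs the graph (the constant values are illustrative):

import tensorflow as tf

# Build the graph: each call adds an operation node
a = tf.constant(3.0)
b = tf.constant(4.0)
c = tf.add(a, b)        # the edges from a and b into c carry tensors

# Execute the graph in a session
with tf.Session() as sess:
    print(sess.run(c))  # 7.0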
Machine Learning Basics 1. Machine Learning • Limitations of explicit programming - Spam filter: many rules - Automatic driving: too many rules • Machine learning: "Field of study that gives computers the ability to learn without being explicitly programmed" Arthur Samuel (1959) 2. Supervised/Unsupervised learning • Supervised learning: - learning with labeled examples - training set • Unsupervised learning: ..