python - Nonlinear regression on TensorFlow


What activation and cost functions in tf.nn are suitable for the model below to learn the simple single-variate nonlinear relationship f(x) = x * x, which is a priori unknown?

Admittedly, this is an impractical model, used for the sole purpose of understanding tf.nn mechanics 101.

import numpy as np
import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 1])

w = tf.Variable(tf.zeros([1, 1]))
b = tf.Variable(tf.zeros([1]))
y = some_nonlinear_activation_function_here(tf.matmul(x, w) + b)

y_ = tf.placeholder(tf.float32, [None, 1])

cost = tf.reduce_mean(some_related_cost_function_here(y, y_))

learning_rate = 0.001
optimize = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)

sess = tf.Session()
sess.run(tf.initialize_all_variables())
steps = 1000
for i in range(steps):
    sess.run(optimize,
             feed_dict={x: np.array([[i]]), y_: np.array([[i*i]])})

print("prediction: %f" % sess.run(y, feed_dict={x: np.array([[1100]])}))
# expected output near 1210000

The cost I used is the squared difference:

def squared_error(y1, y2):
    return tf.square(y1 - y2)
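Incidentally, TensorFlow 1.x already ships an equivalent op, so the helper above could just as well be the built-in:

cost = tf.reduce_mean(tf.squared_difference(y, y_))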

plus an L1 or L2 penalty if you feel like it.
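As a minimal sketch of what such a penalty could look like (lambda_reg is a hypothetical hyperparameter, not from the original post):

# L2 weight penalty added to the cost; lambda_reg is a made-up hyperparameter
lambda_reg = 0.01
cost = tf.reduce_mean(squared_error(y, y_)) + lambda_reg * tf.nn.l2_loss(w)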

However, it seems to me that you need a hidden layer in the neural network if you want anything remotely interesting. Plus, if you squash the output and the target is the squared function, you probably won't be able to do much. I would do:

x = tf.placeholder(tf.float32, [None, 1])

# hidden layer with ten neurons
w1 = tf.Variable(tf.zeros([1, 10]))
b1 = tf.Variable(tf.zeros([10]))
h1 = some_nonlinear_activation_function(tf.matmul(x, w1) + b1)
w2 = tf.Variable(tf.zeros([10, 1]))
b2 = tf.Variable(tf.zeros([1]))
# I am not squashing the output
y = tf.matmul(h1, w2) + b2
cost = tf.reduce_mean(squared_error(y, y_))

Also, I would not use all-zero weights but a cleverer initialization scheme like Xavier's or He's, which essentially come down to starting with practically-zero weights that are not exactly zero, for various reasons. For the activations you might use tanh, sigmoid or ReLU, or really anything.
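Putting those pieces together, here is a minimal end-to-end sketch, assuming tanh activations, a linear output layer, squared-error cost, and small truncated-normal initial weights (in TF 1.x a proper Xavier initializer is also available via tf.contrib.layers.xavier_initializer). The input scaling to [-1, 1] and the concrete hyperparameters are my additions, not from the post; feeding raw inputs up to ~1100 makes plain gradient descent diverge easily.

import numpy as np
import tensorflow as tf

# Training data: f(x) = x^2 on inputs scaled to [-1, 1].
# The scaling is my addition; a tanh net trains poorly on raw inputs ~1000.
xs = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)
ys = xs ** 2

x = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])

# Hidden layer: small random (truncated-normal) weights instead of zeros
w1 = tf.Variable(tf.truncated_normal([1, 10], stddev=0.1))
b1 = tf.Variable(tf.zeros([10]))
h1 = tf.tanh(tf.matmul(x, w1) + b1)

# Linear (unsquashed) output layer
w2 = tf.Variable(tf.truncated_normal([10, 1], stddev=0.1))
b2 = tf.Variable(tf.zeros([1]))
y = tf.matmul(h1, w2) + b2

cost = tf.reduce_mean(tf.square(y - y_))
optimize = tf.train.GradientDescentOptimizer(0.1).minimize(cost)

sess = tf.Session()
sess.run(tf.initialize_all_variables())
for step in range(5000):
    sess.run(optimize, feed_dict={x: xs, y_: ys})

print(sess.run(y, feed_dict={x: np.array([[0.5]])}))  # should be near 0.25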

