# TensorFlow: “Hello World”

Source: Deep Learning on Medium

First, let’s understand what TensorFlow means. A “tensor” is the data that is passed between operations. In effect, a tensor is a multidimensional array: it can be zero-dimensional (a scalar value), one-dimensional (a line, or vector), two-dimensional (a matrix), and so on. When a tensor flows from one node (operation) to another through a graph to produce an output, that is TensorFlow.
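As a small sketch of these dimensionalities, here are a scalar, a vector, and a matrix created with `tf.constant` (the example values are arbitrary):

```python
import tensorflow as tf

# A tensor can have any number of dimensions:
scalar = tf.constant(3.0)              # 0-D: a single value
vector = tf.constant([1.0, 2.0, 3.0])  # 1-D: a line of values
matrix = tf.constant([[1.0, 2.0],
                      [3.0, 4.0]])     # 2-D: rows and columns

print(scalar.shape)  # ()
print(vector.shape)  # (3,)
print(matrix.shape)  # (2, 2)
```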

Data Flow Graph:

In a dataflow graph the nodes are called operations, which represent units of computation. The edges are tensors which represent the data consumed or produced by an operation.

In the diagram below, the feature matrix is a placeholder. Placeholders can be seen as “holes” in your model, meaning “holes” through which you can pass data from outside of the graph. Placeholders allow us to create our operations in the graph without needing the data up front. When we want to execute the graph, we have to feed the placeholders with our input data. This is why a placeholder must be fed before it is used.

Also in the diagram, the weight matrix is a variable. TensorFlow variables are used to share and persist values that are manipulated by the program. When we define a placeholder or a variable, TensorFlow adds an operation to the graph. Operations are the nodes that represent mathematical operations over the tensors in the graph, like add, subtract, and multiply, or even functions such as activation functions. After defining all operations, we can create a session to run the graph and perform the computations. In that graph we have MatMul, which is an operation over the tensors Weight Matrix and Feature Matrix. After that, Add is called to add the result of the previous operation and the bias. The resultant tensor then passes through a sigmoid activation function, which gives us our result.

Now let’s build our own TensorFlow graph. Start by importing the tensorflow library. If it is not installed, simply go to the terminal and type `pip install tensorflow` and you are done (note: this installs the CPU-only version of TensorFlow).

`import tensorflow as tf`

Now we build our graph:

```python
graph = tf.Graph()
with graph.as_default():
    a = tf.constant(5.0)  # example value; tf.constant needs a value
    b = tf.constant(3.0)  # example value
    sig = tf.placeholder(tf.float32)
    c = tf.add(a, b)
    d = tf.subtract(a, b)
    e = tf.nn.sigmoid(sig)  # sigmoid function
```

Running our graph:

```python
with tf.Session(graph=graph) as sess:
    result = sess.run(c)
    print('c = %s' % result)
    result = sess.run(d)
    print('d = %s' % result)
    result = sess.run(e, feed_dict={sig: 1.5})
    print('e = %s' % result)
```

OUTPUT:

```
c = 8.0
d = 2.0
e = 0.81757444
```

If we had enabled eager execution at the beginning, we would not have to build the graph at all. For example:

```python
import tensorflow as tf

tf.enable_eager_execution()
x = [[5.]]
y = tf.matmul(x, x)  # square the input
print("OUTPUT: {}".format(y))
```

OUTPUT:

`OUTPUT: [[25.]]`

And that’s how you deal with TensorFlow. In the next article, we will learn about neural networks with TensorFlow.