TensorFlow is a programming system in which you represent computations as graphs. Nodes in the graph are called *ops* (short for operations). An op takes zero or more `Tensor`s, performs some computation, and produces zero or more `Tensor`s. A `Tensor` is a typed multi-dimensional array. For example, you can represent a mini-batch of images as a 4-D array of floating point numbers with dimensions `[batch, height, width, channels]`.
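As an illustrative sketch of that shape convention (using NumPy rather than TensorFlow, since the array layout is the same), a hypothetical mini-batch of eight 32x32 RGB images would look like:

```python
import numpy as np

# A hypothetical mini-batch of 8 RGB images, each 32x32 pixels,
# laid out as [batch, height, width, channels].
batch = np.zeros((8, 32, 32, 3), dtype=np.float32)

print(batch.shape)  # → (8, 32, 32, 3)
print(batch.ndim)   # → 4
```

The concrete sizes here are made up for illustration; only the four-dimensional `[batch, height, width, channels]` layout is the point.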

A TensorFlow graph is a *description* of computations. To compute anything, a graph must be launched in a `Session`. A `Session` places the graph ops onto `Device`s, such as CPUs or GPUs, and provides methods to execute them. These methods return tensors produced by ops as numpy `ndarray` objects in Python, and as `tensorflow::Tensor` instances in C and C++.

## The computation graph

TensorFlow programs are usually structured into a construction phase, which assembles a graph, and an execution phase that uses a session to execute ops in the graph.

For example, it is common to create a graph to represent and train a neural network in the construction phase, and then repeatedly execute a set of training ops in the graph in the execution phase.

TensorFlow can be used from C, C++, and Python programs. It is presently much easier to use the Python library to assemble graphs, as it provides a large set of helper functions not available in the C and C++ libraries.

The session libraries have equivalent functionalities for the three languages.

### Building the graph

To build a graph, start with ops that do not need any input (source ops), such as `Constant`, and pass their output to other ops that do computation.

The op constructors in the Python library return objects that stand for the output of the constructed ops. You can pass these to other op constructors to use as inputs.

The TensorFlow Python library has a *default graph* to which op constructors add nodes. The default graph is sufficient for many applications. See the Graph class documentation for how to explicitly manage multiple graphs.
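A minimal sketch of explicit graph management with the `Graph` class mentioned above (the specific constant value is just an example):

```python
import tensorflow as tf

# Create a new graph rather than relying on the default graph.
g = tf.Graph()

# Ops constructed inside this context manager are added to 'g',
# not to the default graph.
with g.as_default():
    c = tf.constant(30.0)

# The op's output tensor records which graph owns it.
assert c.graph is g
```

This pattern is mainly useful when a program needs several independent graphs; for a single graph, the default graph used in the example below is simpler.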

```python
import tensorflow as tf
# Create a Constant op that produces a 1x2 matrix. The op is
# added as a node to the default graph.
#
# The value returned by the constructor represents the output
# of the Constant op.
matrix1 = tf.constant([[3., 3.]])
# Create another Constant that produces a 2x1 matrix.
matrix2 = tf.constant([[2.],[2.]])
# Create a Matmul op that takes 'matrix1' and 'matrix2' as inputs.
# The returned value, 'product', represents the result of the matrix
# multiplication.
product = tf.matmul(matrix1, matrix2)
```

The default graph now has three nodes: two `constant()` ops and one `matmul()` op. To actually multiply the matrices and get the result of the multiplication, you must launch the graph in a session.
