Hello World, a series of deep learning tutorials based on TensorFlow!
I recently came across a good deep learning resource, Stanford's CS 20SI: "TensorFlow for Deep Learning Research", which covers exactly the fundamentals of TensorFlow. It is a rich resource, and it can easily be organized into blog posts to read at any time.
Why choose TensorFlow?
Since AlexNet won the ImageNet competition in 2012, deep learning has taken off. Thanks to the rapid development of hardware, GPU parallel computing and easy-to-use APIs have allowed deep learning and neural networks to shine.
There are many deep learning frameworks. At present, PyTorch, TensorFlow and Keras are the most popular. PyTorch is well suited to academic research and personal experimentation, but less suited to industrial practice. TensorFlow has a steeper learning curve because of its long history, but it offers a complete development and deployment story and a large number of GitHub projects to learn from. Keras is a high-level API on top of TensorFlow, alongside others such as TFLearn.
To sum up: if you are a student working on papers or coursework, PyTorch is recommended; if you are a company developer who wants to apply deep learning in production, use TensorFlow directly; if you use the latest 1.12, the official examples are written with Keras; and if you download source code from GitHub to learn from, you will need to learn the corresponding version of the TensorFlow API.
To sum up, the advantages of TensorFlow are:
Ease of use: it comes with a corresponding Python API
Portability: the same code can run on a single CPU, multiple CPUs, GPUs, mobile devices, etc.
Flexibility: it can be deployed on Raspberry Pi, Android, Windows, iOS, Linux, etc.
Visualization: TensorBoard provides a visual interface that makes it easy to track training and tune parameters.
Checkpoints: experiment state can be saved and restored through checkpoint records.
Automatic differentiation: gradients are computed automatically (see the sketch after this list).
Huge community: 10,000+ developers and 3,000+ projects within a year
Lots of project code based on TensorFlow
Companies using TensorFlow include: Google, OpenAI, DeepMind, SnapChat, Airbus, eBay, etc.
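As a small illustration of the automatic differentiation point above, here is a minimal sketch using tf.gradients; the function x^2 + 3x and the point x = 2 are made up purely for the example:

import tensorflow as tf

# Minimal sketch of automatic differentiation: compute d(x^2 + 3x)/dx at x = 2.
x = tf.constant(2.0)
y = x * x + 3 * x
grad = tf.gradients(y, [x])   # builds ops that compute dy/dx

with tf.Session() as sess:
    print(sess.run(grad))     # [7.0], since dy/dx = 2x + 3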
There are many things you can do with TensorFlow, such as computer vision (CV), natural language processing (NLP), speech recognition and so on.
Basic knowledge
1. Simplified API
Let's learn the basics of TensorFlow. Besides the core API, TensorFlow also provides some simplified APIs:
TF Learn (tf.contrib.learn): an API in the scikit-learn style
TF Slim (tf.contrib.slim): a lightweight API for building TF models that fills in sensible defaults automatically, simplifying use
Keras: a higher-level, more abstract API; with Keras you assemble models like building blocks, but it hides the underlying principles rather deeply.
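For a feel of the Keras building-block style mentioned above, here is a minimal sketch; the layer sizes and input shape are arbitrary, chosen only for illustration:

from tensorflow import keras

# Minimal sketch: stack layers like building blocks into a model.
model = keras.Sequential([
    keras.layers.Dense(64, activation='relu', input_shape=(10,)),
    keras.layers.Dense(1)
])
model.compile(optimizer='sgd', loss='mse')
model.summary()   # prints the layer structure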
2. Data flow graph
If you have worked with big data or with Java 8 streams, this data flow graph should feel familiar: before the program is executed, we first build the skeleton of the computation, and only when it is executed do we feed in the data, allocate resources and perform the calculation. On the one hand this separates construction from computation; on the other hand the graph itself can be further optimized.
For example, in the data flow graph above, the structure of the whole network is defined in advance, and then the result 23 is obtained simply by passing in 5 and 3 when the graph is run.
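The original figure is not reproduced here, but as a rough sketch, assuming the pictured graph computes (a * b) + (a + b), which gives 23 for inputs 5 and 3:

import tensorflow as tf

# Hypothetical reconstruction of the graph in the figure: (a * b) + (a + b).
a = tf.placeholder(tf.int32)
b = tf.placeholder(tf.int32)
result = tf.add(tf.multiply(a, b), tf.add(a, b))

with tf.Session() as sess:
    # Feed 5 and 3 only when the graph is actually executed.
    print(sess.run(result, feed_dict={a: 5, b: 3}))   # 23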
3. Tensors
Tensor (张量 in Chinese, which sounds like the name Zhang Liang and has nothing to do with malatang) is simply a general term for multi-dimensional data. For example:
A 0-dimensional tensor is also called a scalar, or simply a number.
A 1-dimensional tensor is called a vector.
A 2-dimensional tensor is called a matrix.
So TensorFlow can be understood as Tensor + Flow, that is, a flow of tensors.
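A quick sketch of these three ranks using tf.constant (the concrete numbers are arbitrary):

import tensorflow as tf

scalar = tf.constant(3)                    # 0-dimensional tensor (a number)
vector = tf.constant([1, 2, 3])            # 1-dimensional tensor (a vector)
matrix = tf.constant([[1, 2], [3, 4]])     # 2-dimensional tensor (a matrix)

print(scalar.shape, vector.shape, matrix.shape)   # (), (3,), (2, 2)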
4. Examples of data flow graphs
import tensorflow as tf

# The first example: add two numbers.
a = tf.constant(2)
b = tf.constant(3)
x = tf.add(a, b)

with tf.Session() as sess:
    print(sess.run(x))
In the code above, we built a basic data flow graph calculation example.
Here,

a = tf.constant(2)
b = tf.constant(3)
x = tf.add(a, b)

only builds the graph. To get the value of x, you have to create a session (which allocates the resources) and call its run method (only then does the computation actually execute).
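To make the distinction concrete, here is a small sketch: printing x before running the session only shows the tensor object, not the value (the exact name printed may differ):

import tensorflow as tf

a = tf.constant(2)
b = tf.constant(3)
x = tf.add(a, b)

# At this point the graph is only defined; x has no concrete value yet.
print(x)                 # e.g. Tensor("Add:0", shape=(), dtype=int32)

with tf.Session() as sess:
    print(sess.run(x))   # 5 -- the graph is executed here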
5. Using TensorBoard
To conveniently view the graph you have built, you need to learn how to use TensorBoard. In the code above, you only need to add a TensorBoard FileWriter:
import tensorflow as tf

# The first example: add two numbers.
a = tf.constant(2)
b = tf.constant(3)
x = tf.add(a, b)

with tf.Session() as sess:
    writer = tf.summary.FileWriter('./graphs', sess.graph)
    print(sess.run(x))
writer.close()
Then run the following on the command line:

tensorboard --logdir=/Users/xingoo/PycharmProjects/xxx/graphs

Then open localhost:6006 in a browser and you will see the following.
On the left you can see a description of each node. After clicking the add node, you can see its details. Because the graph we built is very simple, just the sum of two numbers, the whole graph has only three circles, and the names are the defaults assigned to each operation.
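If you want more readable labels than the defaults in TensorBoard, you can pass an explicit name to each operation; a small sketch, with the names a, b and add chosen arbitrarily:

import tensorflow as tf

a = tf.constant(2, name="a")
b = tf.constant(3, name="b")
x = tf.add(a, b, name="add")

with tf.Session() as sess:
    writer = tf.summary.FileWriter('./graphs', sess.graph)
    print(sess.run(x))
writer.close()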
6. A slightly more complex example
Let's make the graph a bit more complex and compute two results at the same time:
import tensorflow as tf

# tf.constant(value, dtype=None, shape=None, name='Const', verify_shape=False)
a = tf.constant([1, 3], name="a")
b = tf.constant([[0, 1], [2, 3]], name="b")
x = tf.add(a, b, name="add")
y = tf.multiply(a, b, name="mul")

with tf.Session() as sess:
    writer = tf.summary.FileWriter('./graphs', sess.graph)
    x, y = sess.run([x, y])
    print(x, y)
writer.close()