

Introduction to Google Artificial Intelligence TensorFlow Chinese Edition

Introduction to TensorFlow

This guide gets you started programming in TensorFlow. Before using this guide, install TensorFlow. To get the most out of it, you should know the following:



How to program in Python.

At least a little about arrays.

Ideally, something about machine learning. However, even if you know little or nothing about machine learning, this is still the first guide you should read.

TensorFlow provides multiple APIs. The lowest-level API, TensorFlow Core, gives you complete programming control. We recommend TensorFlow Core for machine learning researchers and others who need fine control over their models. The higher-level APIs are built on top of TensorFlow Core. These higher-level APIs are typically easier to learn and use than TensorFlow Core. In addition, the higher-level APIs make repetitive tasks easier and more consistent between different users. A high-level API like tf.estimator helps you manage data sets, estimators, training, and inference.
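
As a taste of the higher-level API, here is a minimal sketch (TensorFlow 1.x style; the feature name "x" and the toy data are purely illustrative) of training a premade linear regressor with tf.estimator:

import numpy as np
import tensorflow as tf

# One numeric feature column named "x" (the name is illustrative).
feature_columns = [tf.feature_column.numeric_column("x", shape=[1])]

# A premade estimator; tf.estimator manages the training loop for us.
estimator = tf.estimator.LinearRegressor(feature_columns=feature_columns)

# Toy training data, chosen only for illustration.
x_train = np.array([1., 2., 3., 4.])
y_train = np.array([0., -1., -2., -3.])

# An input function that batches and shuffles the data.
input_fn = tf.estimator.inputs.numpy_input_fn(
    {"x": x_train}, y_train, batch_size=4, num_epochs=None, shuffle=True)

estimator.train(input_fn=input_fn, steps=1000)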



This guide begins with a tutorial on TensorFlow Core. Later, we demonstrate how to implement the same model in tf.estimator. Knowing the TensorFlow Core principles will give you a good mental model of how things work internally when you use the more compact, higher-level API.



Tensors

The central unit of data in TensorFlow is the tensor. A tensor consists of a set of primitive values shaped into an array of any number of dimensions. A tensor's rank is its number of dimensions. Here are some examples of tensors:



3  # a rank 0 tensor; a scalar with shape []
[1., 2., 3.]  # a rank 1 tensor; a vector with shape [3]
[[1., 2., 3.], [4., 5., 6.]]  # a rank 2 tensor; a matrix with shape [2, 3]
[[[1., 2., 3.]], [[7., 8., 9.]]]  # a rank 3 tensor with shape [2, 1, 3]
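
If you want to confirm these ranks and shapes yourself, one way (a minimal sketch; the variable names are only illustrative) is to wrap the values in tf.constant and inspect the resulting tensors:

import tensorflow as tf

scalar = tf.constant(3.)                                # rank 0, shape []
vector = tf.constant([1., 2., 3.])                      # rank 1, shape [3]
matrix = tf.constant([[1., 2., 3.], [4., 5., 6.]])      # rank 2, shape [2, 3]
cube = tf.constant([[[1., 2., 3.]], [[7., 8., 9.]]])    # rank 3, shape [2, 1, 3]

print(matrix.shape)      # (2, 3)
print(tf.rank(matrix))   # a Tensor; evaluates to 2 when run in a session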

TensorFlow Core Tutorial

Importing TensorFlow



The canonical import statement for TensorFlow programs is as follows:



import tensorflow as tf

This gives Python access to all of TensorFlow's classes, methods, and symbols. Most of the documentation assumes you have already done this.
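
Once imported, a quick sanity check (just a sketch; the printed value depends on your installation) is to print the library version:

import tensorflow as tf
print(tf.__version__)   # e.g. 1.4.0, depending on your install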



The Computational Graph

You might think of TensorFlow Core programs as consisting of two discrete sections:



Building the computational graph.

Running the computational graph.

A computational graph is a series of TensorFlow operations arranged into a graph of nodes. Let's build a simple computational graph. Each node takes zero or more tensors as inputs and produces a tensor as an output. One type of node is a constant. Like all TensorFlow constants, it takes no inputs and outputs a value it stores internally. We can create two floating-point tensors, node1 and node2, as follows:



node1 = tf.constant(3.0, dtype=tf.float32)
node2 = tf.constant(4.0)  # also tf.float32 implicitly
print(node1, node2)

The final print statement produces



Tensor ("Const:0", shape=(), dtype=float32) Tensor ("Const_1:0", shape=(), dtype=float32)

Notice that printing the nodes does not output the values 3.0 and 4.0 as you might expect. Instead, they are nodes that, when evaluated, would produce 3.0 and 4.0, respectively. To actually evaluate the nodes, we must run the computational graph within a session. A session encapsulates the control and state of the TensorFlow runtime.



The following code creates a Session object and then invokes its run method to run enough of the computational graph to evaluate node1 and node2. Running the computational graph in a session looks like this:



sess = tf.Session()
print(sess.run([node1, node2]))

We see the expected values of 3.0 and 4.0:



[3.0, 4.0]
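
As a side note, a session can also be used as a context manager so that its resources are released automatically when the block ends; a minimal sketch of the same evaluation (a separate name is used here so the sess created above stays usable):

with tf.Session() as scoped_sess:
    print(scoped_sess.run([node1, node2]))   # [3.0, 4.0]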

We can build more complex computations by combining Tensor nodes and operations (operations are also nodes). For example, we can add our two constant nodes and generate a new graph as follows:



from __future__ import print_function
node3 = tf.add(node1, node2)
print("node3:", node3)
print("sess.run(node3):", sess.run(node3))

The last two print statements produce



node3: Tensor("Add:0", shape=(), dtype=float32)
sess.run(node3): 7.0

TensorFlow provides a utility called TensorBoard that can display a picture of the computational graph. Here is a screenshot showing how TensorBoard visualizes the graph:



[Figure: TensorBoard visualization of the graph]
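
To produce such a picture yourself, one approach (a minimal sketch; the log directory path is just an example) is to write the graph with tf.summary.FileWriter and point TensorBoard at that directory:

# Write the current graph to a log directory (the path is illustrative).
writer = tf.summary.FileWriter("/tmp/tensorflow_logs", sess.graph)
writer.close()

# Then, from a shell: tensorboard --logdir /tmp/tensorflow_logs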

As it stands, this graph is not especially interesting because it always produces a constant result. A graph can be parameterized to accept external inputs, known as placeholders. A placeholder is a promise to provide a value later.



a = tf.placeholder(tf.float32)
b = tf.placeholder(tf.float32)
adder_node = a + b  # + provides a shortcut for tf.add(a, b)

The preceding three lines are a bit like a function or a lambda in which we define two input parameters (a and b) and then an operation on them. We can evaluate this graph with multiple inputs by using the feed_dict argument of the run method to feed concrete values to the placeholders.



print(sess.run(adder_node, {a: 3, b: 4.5}))
print(sess.run(adder_node, {a: [1, 3], b: [2, 4]}))

resulting in the output



7.5
[ 3.  7.]

In TensorBoard, the graph looks like this:

[Figure: TensorBoard visualization of the placeholder graph]



We can make the computational graph more complex by adding another operation. For example,



add_and_triple = adder_node * 3.
print(sess.run(add_and_triple, {a: 3, b: 4.5}))

producing the output



22.5

In TensorBoard, this graph looks like this:

[Figure: TensorBoard visualization of the add-and-triple graph]



In machine learning we will typically want a model that can take arbitrary inputs, such as the one above. To make the model trainable, we need to be able to modify the graph to get new outputs with the same inputs.
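
As a hint of where this leads, here is a minimal sketch (the values are chosen only for illustration) that adds trainable parameters with tf.Variable; unlike constants, variables keep state across runs and must be explicitly initialized:

# Trainable parameters; variables hold state that training can update.
W = tf.Variable([.3], dtype=tf.float32)
b = tf.Variable([-.3], dtype=tf.float32)
x = tf.placeholder(tf.float32)
linear_model = W * x + b

# Variables must be explicitly initialized before use.
init = tf.global_variables_initializer()
sess.run(init)

print(sess.run(linear_model, {x: [1, 2, 3, 4]}))   # [0.  0.3  0.6  0.9] (up to float rounding)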
