Reflections on The Three-Body Problem

When Wade stopped Cheng Xin, he may already have known something, yet he told her nothing and instead quietly handed the riddle over to her.
Not until she finally set out from Pluto did Cheng Xin uncover the riddle that had been sitting right before her eyes.
Both times that Cheng Xin became a sinner of history, Wade was the person most closely involved.
It is clear that Wade was the one who truly understood where the future was heading. Even when he did not know which path was right, his intuition was sharp and his methods were ruthless.
In the end, Cheng Xin still met with something harder to bear than death.

Difference between sparse_softmax_cross_entropy_with_logits and softmax_cross_entropy_with_logits

Answer from < Olivier Moindrot >
Having two different functions is a convenience, as they produce the same result.

The difference is simple:
For sparse_softmax_cross_entropy_with_logits, labels must have the shape [batch_size] and the dtype int32 or int64. Each label is an int in range [0, num_classes-1].
For softmax_cross_entropy_with_logits, labels must have the shape [batch_size, num_classes] and dtype float32 or float64.

Labels used in softmax_cross_entropy_with_logits are the one-hot version of the labels used in sparse_softmax_cross_entropy_with_logits.

Another tiny difference is that with sparse_softmax_cross_entropy_with_logits, you can give -1 as a label to have loss 0 on this label.
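A minimal sketch (TF 1.x API; the tensors are illustrative) showing that the two calls produce the same loss once the sparse labels are converted to one-hot:

import tensorflow as tf

logits = tf.constant([[2.0, 1.0, 0.1]])             # shape [batch_size=1, num_classes=3]
sparse_labels = tf.constant([0])                    # int32 indices, shape [batch_size]
onehot_labels = tf.one_hot(sparse_labels, depth=3)  # float32, shape [batch_size, num_classes]

loss_sparse = tf.nn.sparse_softmax_cross_entropy_with_logits(
    labels=sparse_labels, logits=logits)
loss_dense = tf.nn.softmax_cross_entropy_with_logits(
    labels=onehot_labels, logits=logits)

with tf.Session() as sess:
    # Both entries print the same cross-entropy value.
    print(sess.run([loss_sparse, loss_dense]))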

Answer from < 全意 >
In sparse_softmax_cross_entropy_with_logits, labels takes raw integer class indices,
e.g. [1], [2], [3], [4] (dtype must be int32 or int64).
In softmax_cross_entropy_with_logits, labels takes one-hot labels,
e.g. [1,0,0,0], [0,1,0,0], [0,0,1,0], [0,0,0,1] (dtype float32 or float64).


What does global_step mean in TensorFlow?

Answer from < maddin25 and martianwars >

global_step refers to the number of batches seen by the graph. Every time a batch is provided, the weights are updated in the direction that minimizes the loss. global_step just keeps track of the number of batches seen so far. When it is passed in the minimize() argument list, the variable is increased by one. Have a look at optimizer.minimize().

You can get the global_step value using tf.train.global_step(). Also handy are the utility methods tf.train.get_global_step or tf.train.get_or_create_global_step.

0 is the initial value of the global step in this context.

global_step is the number of batches the graph has seen. If the dataset contains 100 examples and batch_size is set to 10, then one epoch finishes every 10 batches.
An epoch is one full pass over the dataset.
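A minimal sketch (TF 1.x API; the toy loss is only for illustration) of how global_step is wired into minimize() and read back:

import tensorflow as tf

x = tf.Variable(5.0)
loss = tf.square(x)                                  # toy loss
global_step = tf.train.get_or_create_global_step()

# Passing global_step to minimize() makes the optimizer
# increment it by one on every training step.
train_op = tf.train.GradientDescentOptimizer(0.1).minimize(
    loss, global_step=global_step)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(3):
        sess.run(train_op)
    print(tf.train.global_step(sess, global_step))   # -> 3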

FailedPreconditionError: Attempting to use uninitialized value in TensorFlow

Web Solution

Answer from < mrry >

The FailedPreconditionError arises because the program is attempting to read a variable (named “Variable_1”) before it has been initialized. In TensorFlow, all variables must be explicitly initialized by running their “initializer” operations. For convenience, you can run all of the variable initializers in the current session by executing the following statement before your training loop.
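The usual form of that statement in TF 1.x (very old versions used tf.initialize_all_variables() instead):

init_op = tf.global_variables_initializer()
sess.run(init_op)  # run once, before the training loop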

Answer from < Salvador Dali >

This exception is most commonly raised when running an operation that reads a tf.Variable before it has been initialized.

My Case

In my case, when I wrote the __init__ function of class Model, I created more variables after initializing the saver with tf.global_variables(). It looked like this:

import tensorflow as tf

class Model(object):
    def __init__(self, hparams):
        self.hparams = hparams
        """ some variables init """
        # The saver captures only the variables that exist at this point.
        self.saver = tf.train.Saver(tf.global_variables(),
                                    max_to_keep=self.hparams.max_to_keep)
        # Variables created below are NOT in the saver's var_list.
        self.init_embeddings()

    def init_embeddings(self):
        """ some variables init """

Because its var_list was captured before init_embeddings() ran, the saver is unaware of the variables created in that function. These variables are never written to the checkpoint, so after the restore step they cannot be recovered from the ckpt files. Using them then throws a FailedPreconditionError such as tensorflow.python.framework.errors_impl.FailedPreconditionError: Attempting to use uninitialized value beta1_power, in which beta1_power is the unlucky one.
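A minimal fix, under the same assumptions as the snippet above: build the saver only after every variable has been created, so tf.global_variables() already contains them all.

class Model(object):
    def __init__(self, hparams):
        self.hparams = hparams
        """ some variables init """
        self.init_embeddings()
        # Build the saver last: tf.global_variables() now also
        # contains the variables created in init_embeddings().
        self.saver = tf.train.Saver(tf.global_variables(),
                                    max_to_keep=self.hparams.max_to_keep)

    def init_embeddings(self):
        """ some variables init """

Note that optimizer slot variables such as beta1_power are created when the optimizer itself is built, so the saver should be constructed after that step as well.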

Learning to Rank: From Pairwise Approach to Listwise Approach

ListNet

Paper

Learning to Rank: From Pairwise Approach to Listwise Approach

Open-source implementations

  1. Minorthird
    Built by CMU professor William W. Cohen and his students. Similar to Weka, it is an open-source toolkit implementing a large number of machine-learning and data-mining algorithms; its homepage on GitHub is Minorthird – Github.
  2. The other is a recent implementation by 罗磊, which uses a single-layer neural network to adjust the weights. It has been open-sourced on Google Code; the address is here. Everyone is welcome to use it and give feedback.
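For reference, a minimal sketch (TF 1.x; the function and tensor names are illustrative) of the listwise loss ListNet optimizes: the cross entropy between the top-one probability distributions of the ground-truth scores and the predicted scores for one query's document list.

import tensorflow as tf

def listnet_loss(true_scores, pred_scores):
    # Top-one probabilities: a softmax over each score list,
    # both of shape [list_size] for a single query.
    p_true = tf.nn.softmax(true_scores)
    p_pred = tf.nn.softmax(pred_scores)
    # Cross entropy between the two distributions.
    return -tf.reduce_sum(p_true * tf.log(p_pred))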

Reading List

Title                        Date     Author         Status
The Three-Body Problem       2018-12  Liu Cixin      finished
The Big Short                2018-12  Michael Lewis  finished
Finding Your Own North Star  2018-12  Martha Beck    finished
The Inevitable               2019-01  Kevin Kelly    reading (94%)