Reflections on The Three-Body Problem (三体)

When Cheng Xin was stopped by Wade, Wade probably already knew something, yet he did not tell her; instead he quietly handed the riddle over to her.
Not until she set out from Pluto at the very end did Cheng Xin finally see through the riddle that had been right before her eyes.
Both times that Cheng Xin became a sinner of history, Wade was the person most closely tied to the events.
It is clear that Wade was the one who truly understood where the future was heading. Even if he did not know which path was right, his instincts were sharp and his methods ruthless.
In the end, Cheng Xin still met with something harder to bear than death.

Difference between sparse_softmax_cross_entropy_with_logits and softmax_cross_entropy_with_logits

Answer from < Olivier Moindrot >
Having two different functions is a convenience, as they produce the same result.

The difference is simple:

- For sparse_softmax_cross_entropy_with_logits, labels must have the shape [batch_size] and dtype int32 or int64. Each label is an int in the range [0, num_classes - 1].
- For softmax_cross_entropy_with_logits, labels must have the shape [batch_size, num_classes] and dtype float32 or float64.

Labels used in softmax_cross_entropy_with_logits are the one-hot version of labels used in sparse_softmax_cross_entropy_with_logits.

Another tiny difference is that with sparse_softmax_cross_entropy_with_logits, you can give -1 as a label to have loss 0 on this label.

Answer from < 全意 >

For sparse_softmax_cross_entropy_with_logits, labels takes plain integer class indices,
e.g. [1], [2], [3], [4] (dtype must be int32 or int64).
For softmax_cross_entropy_with_logits, labels takes one-hot vectors,
e.g. [1,0,0,0], [0,1,0,0], [0,0,1,0], [0,0,0,1] (dtype float32 or float64).
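
A minimal sketch (TF 1.x; the three-class logits and batch of two are made-up values) showing that both functions return the same per-example loss when the dense labels are the one-hot encoding of the sparse ones:

```python
import numpy as np
import tensorflow as tf

# Made-up logits for a batch of 2 examples over 3 classes.
logits = tf.constant([[2.0, 1.0, 0.1],
                      [0.5, 2.5, 0.3]])

sparse_labels = tf.constant([0, 1], dtype=tf.int32)  # shape [batch_size]
onehot_labels = tf.one_hot(sparse_labels, depth=3)   # shape [batch_size, num_classes], float32

sparse_loss = tf.nn.sparse_softmax_cross_entropy_with_logits(
    labels=sparse_labels, logits=logits)
dense_loss = tf.nn.softmax_cross_entropy_with_logits(
    labels=onehot_labels, logits=logits)

with tf.Session() as sess:
    a, b = sess.run([sparse_loss, dense_loss])
    print(np.allclose(a, b))  # True: identical per-example losses
```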


What does global_step mean in TensorFlow?

Answer from < maddin25 and martianwars >

global_step refers to the number of batches seen by the graph. Every time a batch is provided, the weights are updated in the direction that minimizes the loss. global_step just keeps track of the number of batches seen so far. When it is passed in the minimize() argument list, the variable is increased by one. Have a look at optimizer.minimize().

You can get the global_step value using tf.train.global_step(). Also handy are the utility methods tf.train.get_global_step or tf.train.get_or_create_global_step.

0 is the initial value of the global step in this context.

In other words: global_step counts the batches the graph has seen. If the dataset contains 100 examples and batch_size is set to 10, one epoch completes every 10 batches; an epoch is one full pass over the dataset.
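
A minimal sketch (TF 1.x; the toy variable and quadratic loss are only for illustration) of wiring global_step into minimize() and reading it back:

```python
import tensorflow as tf

x = tf.Variable(5.0)   # toy parameter
loss = tf.square(x)    # toy loss

global_step = tf.train.get_or_create_global_step()
# minimize() increments global_step by one on every training step.
train_op = tf.train.GradientDescentOptimizer(0.1).minimize(
    loss, global_step=global_step)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(3):  # three batches
        sess.run(train_op)
    print(tf.train.global_step(sess, global_step))  # 3
```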

FailedPreconditionError: Attempting to use uninitialized value in TensorFlow

Web Solution

Answer from < mrry >

The FailedPreconditionError arises because the program is attempting to read a variable (named “Variable_1”) before it has been initialized. In TensorFlow, all variables must be explicitly initialized, by running their “initializer” operations. For convenience, you can run all of the variable initializers in the current session by executing the following statement before your training loop.
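
In TF 1.x that statement is the global variables initializer (assuming your session is named sess; older code uses the since-renamed tf.initialize_all_variables()):

```python
# Run once before the training loop: initializes every variable
# created so far in the default graph.
sess.run(tf.global_variables_initializer())
```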

Answer from < Salvador Dali >

This exception is most commonly raised when running an operation that reads a tf.Variable before it has been initialized.

My Case

In my case, when I wrote the __init__ function of class Model, I created more variables after initializing the saver with tf.global_variables(). It looked like this:

```python
import tensorflow as tf

class Model(object):
    def __init__(self, hparams):
        self.hparams = hparams
        """ some variables init """
        # The saver is built here, so it only captures the variables that
        # exist so far; the ones created in init_embeddings() are missed.
        self.saver = tf.train.Saver(tf.global_variables(),
                                    max_to_keep=self.hparams.max_to_keep)
        self.init_embeddings()

    def init_embeddings(self):
        """ some variables init """
```

When saving, the saver is unaware of the variables created later in init_embeddings, so after the restore step those variables cannot be restored from the checkpoint files. Using them then throws a FailedPreconditionError such as tensorflow.python.framework.errors_impl.FailedPreconditionError: Attempting to use uninitialized value beta1_power, where beta1_power is the unlucky variable.
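
One way to avoid this (a sketch, not the only fix) is to create every variable first and construct the saver last, so tf.global_variables() already contains them:

```python
import tensorflow as tf

class Model(object):
    def __init__(self, hparams):
        self.hparams = hparams
        """ some variables init """
        self.init_embeddings()  # create ALL variables before the saver
        # Now tf.global_variables() includes everything created above.
        self.saver = tf.train.Saver(tf.global_variables(),
                                    max_to_keep=self.hparams.max_to_keep)

    def init_embeddings(self):
        """ some variables init """
```

The same ordering applies to optimizer slot variables such as beta1_power, which Adam creates when minimize() is called: build the saver only after they exist.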

Learning to Rank: From Pairwise Approach to Listwise Approach

ListNet

Paper

Learning to Rank: From Pairwise Approach to Listwise Approach

Open-Source Implementations

  1. Minorthird, developed by CMU professor William W. Cohen and his students. Similar to Weka, it is an open-source toolkit implementing a large number of machine learning and data mining algorithms; its homepage is Minorthird – Github.
  2. Another is a recent implementation by 罗磊, which uses a single-layer neural network model to adjust the weights. It has been open-sourced on Google Code; the address is here. Feedback is welcome. (A sketch of the ListNet loss itself follows this list.)
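
For reference, the core of ListNet in the paper above is a listwise loss: the cross entropy between the top-one probability distributions induced by the ground-truth scores and by the predicted scores. A minimal TF 1.x sketch (the helper name and tensor shapes are illustrative assumptions):

```python
import tensorflow as tf

def listnet_top_one_loss(true_scores, pred_scores):
    """Per-query ListNet loss for score tensors of shape [batch_size, list_size]."""
    p_true = tf.nn.softmax(true_scores)          # top-one probs from labels
    log_p_pred = tf.nn.log_softmax(pred_scores)  # log top-one probs from model
    return -tf.reduce_sum(p_true * log_p_pred, axis=1)
```

The per-query losses can then be reduced (e.g. with tf.reduce_mean) and minimized with any optimizer.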

BMPS_LIST

# Book List

| Title | Date | Author | Status |
| --- | --- | --- | --- |
| 三体 (The Three-Body Problem) | 2018-12 | 刘慈欣 | Finished |
| 大空头 (The Big Short) | 2018-12 | Michael Lewis | Finished |
| 找到自己的北极星 (Finding Your Own North Star: Claiming the Life You Were Meant to Live) | 2018-12 | Martha Beck | Finished |
| 必然 (The Inevitable: Understanding the 12 Technological Forces That Will Shape Our Future) | 2019-01 | Kevin Kelly | Finished |
| 原则 (Principles) | 2019-03 | Ray Dalio | 30% |
| 推荐系统实践 | 2019-03 | 项亮 | Finished |
| Redis实践 (Redis in Action) | 2019-03 | Josiah L. Carlson | 5% |
| Python数据分析基础教程-NumPy学习指南 (NumPy Beginner’s Guide) | 2019-03 | Ivan Idris | 50% |
| TensorFlow 实战Google深度学习框架 2nd | 2019-05 | 郑泽宇 | Chapter 3 review done |
| Linux Shell 脚本攻略 (Linux Shell Scripting Cookbook) | 2019-05 | Cliff Flynt, Sarath Lakshman, Shantanu Tushar | Section 1.10, 34/413 |

# Movie List

| Movie | Date |
| --- | --- |
| 正义联盟 | 2019-02-25 |
| 战狼 | 2019-03-02 |
| 战狼2 | 2019-03-02 |
| 云南虫谷 | 2019-03-03 |
| 无名之辈 | 2019-03-10 |
| 银河护卫队 | 2019-03-18 |
| 银河护卫队2 | 2019-03-18 |
| 三块广告牌 | 2019-03-18 |
| 湄公河行动 | 2019-03-24 |
| 大人物 | 2019-03-30 |
| 憨豆特工 | 2019-03-30 |
| 憨豆特工3 | 2019-03-30 |
| 死侍2 | 2019-03-30 |
| 大黄蜂 | 2019-04-07 |
| 飞驰人生 | 2019-04-14 |
| 碟中谍6 | 2019-04-14 |
| 复仇者联盟4 | 2019-05-06 |

# TV Series List

| TV Series | Date | Progress |
| --- | --- | --- |
| 怒晴湘西 | 2019-02-28 | |
| 那年那兔那些事 Season 1 | 2019-03-19 | |
| 那年那兔那些事 Season 2 | 2019-03-21 | |
| 那年那兔那些事 Season 3 | 2019-03-23 | |
| 破冰行动 | 2019-05-17 | Episode 3 |

# Comic and Animation List

| Comic | Date |
| --- | --- |
| 一人之下 411 | 2019-05-17 |

| Animation | Date |
| --- | --- |
| 一人之下 | 2019-04-07 |
| 一人之下2 | 2019-04-07 |

# Paper List

If an external link is broken, please search for the paper on Google or arXiv.

| Paper | Venue / Year | Field | Summary | Status |
| --- | --- | --- | --- | --- |
| Wide & Deep Learning for Recommender Systems | Google, 2016 | Recommendation; DNN | TODO | 2019-03-17 |
| DeepFM: A Factorization-Machine based Neural Network for CTR Prediction | HIT, 2017 | Recommendation; DNN | TODO | Reading |
| Latent Cross: Making Use of Context in Recurrent Recommender Systems | Google, 2018 | Recommendation; DNN | TODO | Reading |
| Deep Interest Evolution Network for Click-Through Rate Prediction | | | | |

# Song List

Shape of You

Book wish list:
高效人士的7个习惯 (The 7 Habits of Highly Effective People)
月亮与六便士 (The Moon and Sixpence)