Reflections on The Three-Body Problem

When Wade stopped Cheng Xin, he may already have known something, yet he still did not tell her; instead he quietly pressed the riddle into her hands.
Only at the very end, as she set out from Pluto, did Cheng Xin finally see that riddle explode before her eyes.
Both times that Cheng Xin became a sinner of history, Wade was the person most closely bound up with it.
It is clear that Wade was the one who truly understood where the future was heading. Even if he did not know which path was right, his instincts were sharp and his methods ruthless.
In the end, Cheng Xin still met with something harder to bear than death.

Difference between sparse_softmax_cross_entropy_with_logits and softmax_cross_entropy_with_logits

Answer from < Olivier Moindrot >
Having two different functions is a convenience, as they produce the same result.

The difference is simple:
For sparse_softmax_cross_entropy_with_logits, labels must have the shape [batch_size] and the dtype int32 or int64. Each label is an int in range [0, num_classes-1].
For softmax_cross_entropy_with_logits, labels must have the shape [batch_size, num_classes] and dtype float32 or float64.

Labels used in softmax_cross_entropy_with_logits are the one-hot version of labels used in sparse_softmax_cross_entropy_with_logits.

Another tiny difference is that with sparse_softmax_cross_entropy_with_logits, you can give -1 as a label to have loss 0 on this label.
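
A minimal sketch of the equivalence described in this answer (assuming TensorFlow 2.x eager mode; the logits and label values below are made up for illustration):

```python
import tensorflow as tf

# Made-up logits for a batch of 3 examples and 4 classes.
logits = tf.constant([[2.0, 1.0, 0.1, 0.3],
                      [0.5, 2.5, 0.2, 0.1],
                      [1.2, 0.3, 3.1, 0.4]])

# Integer class indices in [0, num_classes - 1], shape [batch_size].
sparse_labels = tf.constant([0, 1, 2], dtype=tf.int32)

# One-hot labels, shape [batch_size, num_classes], float dtype.
dense_labels = tf.one_hot(sparse_labels, depth=4, dtype=tf.float32)

loss_sparse = tf.nn.sparse_softmax_cross_entropy_with_logits(
    labels=sparse_labels, logits=logits)
loss_dense = tf.nn.softmax_cross_entropy_with_logits(
    labels=dense_labels, logits=logits)

# Both have shape [batch_size] and hold (numerically) the same values.
print(loss_sparse.numpy())
print(loss_dense.numpy())
```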

Answer from < 全意 >
In sparse_softmax_cross_entropy_with_logits, labels takes plain integer class indices,
e.g. [1], [2], [3], [4] (dtype must be int32 or int64).
In softmax_cross_entropy_with_logits, labels takes one-hot labels,
e.g. [1,0,0,0], [0,1,0,0], [0,0,1,0], [0,0,0,1] (dtype float32 or float64).
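
To make the two label formats concrete, here is a small sketch of converting between them (the label values and depth=4 are made up) using tf.one_hot and tf.argmax:

```python
import tensorflow as tf

# Integer class indices, as expected by sparse_softmax_cross_entropy_with_logits.
int_labels = tf.constant([1, 2, 3, 0], dtype=tf.int64)

# One-hot labels, as expected by softmax_cross_entropy_with_logits.
one_hot_labels = tf.one_hot(int_labels, depth=4, dtype=tf.float32)
print(one_hot_labels.numpy())
# [[0. 1. 0. 0.]
#  [0. 0. 1. 0.]
#  [0. 0. 0. 1.]
#  [1. 0. 0. 0.]]

# Going back from one-hot to integer indices.
print(tf.argmax(one_hot_labels, axis=-1).numpy())  # [1 2 3 0]
```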