FailedPreconditionError: Attempting to use uninitialized in TensorFlow

Problem

The following exception was thrown while reloading a model:

FailedPreconditionError: Attempting to use uninitialized in Tensorflow

Solution

In my case, in the `__init__` method of my `Model` class, I declared additional variables after initializing the saver with `tf.global_variables()`. It looked like this:

import tensorflow as tf

class Model(object):
    def __init__(self, hparams):
        self.hparams = hparams
        """ some variables init """
        # BUG: the Saver is created here, before init_embeddings() runs,
        # so tf.global_variables() does not yet contain the embeddings.
        self.saver = tf.train.Saver(tf.global_variables(),
                                    max_to_keep=self.hparams.max_to_keep)
        self.init_embeddings()

    def init_embeddings(self):
        """ some variables init """

Analysis

When saving variables, the saver is unaware of the variables declared later in `init_embeddings`, because they were not yet in the corresponding collections when the saver was constructed. As a result, those variables are never written to the ckpt file and cannot be restored from it.

When those variables are later used, TensorFlow throws the error above, e.g. `tensorflow.python.framework.errors_impl.FailedPreconditionError: Attempting to use uninitialized value beta1_power`, in which `beta1_power` is the unlucky one.
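The fix is simply to create the `Saver` last, once every variable (including optimizer slots such as `beta1_power`) already exists. A minimal sketch, assuming a hypothetical `hparams` object with a `max_to_keep` field and an illustrative `embeddings` variable (the snippet uses `tf.compat.v1` so it also runs on TensorFlow 2 installs):

```python
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

class Model(object):
    def __init__(self, hparams):
        self.hparams = hparams
        # ... build the rest of the model's variables here ...
        self.init_embeddings()
        # Create the Saver LAST: tf.global_variables() now already
        # contains every variable, so all of them reach the ckpt file.
        self.saver = tf.train.Saver(
            tf.global_variables(),
            max_to_keep=self.hparams.max_to_keep)

    def init_embeddings(self):
        # Hypothetical embedding variable, just for illustration.
        self.embeddings = tf.get_variable("embeddings", shape=[10, 4])
```

Any ordering works as long as no variable is declared after the `Saver` is constructed.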

Here are some answers from Stack Overflow for reference.

Answer from < mrry >

The FailedPreconditionError arises because the program is attempting to read a variable (named "Variable_1") before it has been initialized. In TensorFlow, all variables must be explicitly initialized, by running their "initializer" operations. For convenience, you can run all of the variable initializers in the current session by running the op returned by `tf.global_variables_initializer()` before your training loop.
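That advice can be sketched as follows. `Variable_1` here is just an illustrative name matching the error message, and the snippet uses `tf.compat.v1` so it also runs on TensorFlow 2 installs:

```python
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

# A variable that would otherwise be uninitialized.
v = tf.get_variable("Variable_1", initializer=tf.zeros([2]))

with tf.Session() as sess:
    # Run every variable's initializer before the training loop;
    # reading v before this line raises FailedPreconditionError.
    sess.run(tf.global_variables_initializer())
    print(sess.run(v))
```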

Answer from < Salvador Dali >

This exception is most commonly raised when running an operation that reads a tf.Variable before it has been initialized.
