tf.GradientTape() returns None

I am trying to calculate gradients with tf.GradientTape. When I call it with the loss and Model.variables as inputs, the result it returns is an array of None. What am I doing wrong? The TensorFlow version I am using is 1.9.
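For reference, this is the basic GradientTape pattern I am trying to follow, written as a minimal standalone sketch. The variable w and the toy loss are just placeholders, not part of my model, and the tf.enable_eager_execution() call is my assumption about the setup, since GradientTape in TF 1.9 only works in eager mode:

import tensorflow as tf

tf.enable_eager_execution()

w = tf.Variable([[2.0]], name="w")

with tf.GradientTape() as tape:
    # Operations executed inside the tape context are recorded for differentiation
    y = tf.matmul(w, w)
    loss = tf.reduce_mean(y)

grads = tape.gradient(loss, [w])
print(grads)  # here I expect a non-None gradient tensor for w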

Here is the relevant part of my code:

Model = CubeValModel(TrainingCurves)

LearningRate = 0.0005
TrainOpe = tf.train.AdamOptimizer(LearningRate, name="MainTrainingOpe")

for i in range(5):
    with tf.GradientTape() as t:
        Predictions = tf.nn.softmax(Model.FinalFC, name="SoftmaxPredictions")
        Cross_entropy = tf.nn.softmax_cross_entropy_with_logits(logits=Predictions, labels=TrainingLabels, name="CrossEntropy")
        Loss = tf.reduce_mean(Cross_entropy, name="Loss")
        print(Loss)
        print(Model.variables)
        Gradients = t.gradient(Loss, Model.variables)
        print(Gradients)

Outputs:

tf.Tensor(0.84878147, shape=(), dtype=float32)

[<tf.Variable 'LayerBlock1/Weights1:0' shape=(1, 3, 1, 3) dtype=float32, numpy=

[None, None, None, None, None, None, None, None, None]
    
asked by Blunt 23.08.2018 at 23:35

0 answers