How to calculate the gradient for nce_loss in TensorFlow

I need to calculate the gradient of a TensorFlow model that is stored on disk. I can restore the graph and weights using:



    import tensorflow as tf

    sess = tf.Session()
    model1 = tf.train.import_meta_graph("models/model.meta")
    model1.restore(sess, tf.train.latest_checkpoint("models/"))
    sess.run(tf.global_variables_initializer())
    graph = tf.get_default_graph()

    # variables that were given these names in the original graph
    weights = graph.get_tensor_by_name("weights:0")
    biases = graph.get_tensor_by_name("biases:0")
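
As an aside, the tensor names I use here come from inspecting the restored graph; listing the operation names is a quick way to find them (the snippet below is only for that inspection):

    # Print every operation name in the restored graph, to locate the
    # tensors of interest (e.g. "weights", "biases", "loss", the input).
    for op in graph.get_operations():
        print(op.name)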


I have also named my loss op in the original graph, so I can restore it with either:



    loss = graph.get_operation_by_name("loss")  # for the operation
    loss = graph.get_tensor_by_name("loss:0")   # for the tensor
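
(For what it's worth, my understanding is that the ":0" suffix names the first output tensor of the op called "loss", and that tf.gradients() works on Tensor objects rather than Operations, so the second form should be the one to differentiate. A quick check, assuming the op really is named "loss" in the saved graph:)

    loss_op = graph.get_operation_by_name("loss")      # a tf.Operation
    loss_tensor = graph.get_tensor_by_name("loss:0")   # its first output, a tf.Tensor
    print(type(loss_op), type(loss_tensor))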


Basically, I want to get the gradient of the loss with respect to a certain input value using tf.gradients(...). My loss is specifically tf.nn.nce_loss (https://www.tensorflow.org/api_docs/python/tf/nn/nce_loss). I want the gradient of the loss with respect to its inputs argument: I plug in a new embedding and want the gradient of the loss at that new input. However, I can't seem to define my input successfully. If I use:



    grads = tf.gradients(loss, loss.inputs)  # here I use the tensor definition of loss


I get:



    Traceback (most recent call last):
      File "/tmp/fgsm.py", line 168, in <module>
        main(config)
      File "/tmp/fgsm.py", line 114, in main
        loss = graph.get_operation_by_name("loss:0")
      File "/tmp/venv/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 3618, in get_operation_by_name
        return self.as_graph_element(name, allow_tensor=False, allow_operation=True)
      File "/tmp/venv/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 3490, in as_graph_element
        return self._as_graph_element_locked(obj, allow_tensor, allow_operation)
      File "/tmp/venv/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 3544, in _as_graph_element_locked
        (repr(name), types_str))
    ValueError: Name 'loss:0' appears to refer to a Tensor, not a Operation.


How do I define my gradient here?
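
To make the goal concrete, the pattern I expect to need looks roughly like the sketch below; the name "inputs:0" and the array new_embedding are stand-ins for whatever my original graph actually uses:

    # Assumed names: "inputs:0" is the tensor that feeds nce_loss in my original
    # graph, and new_embedding is a NumPy array of the matching shape.
    inputs = graph.get_tensor_by_name("inputs:0")
    loss = graph.get_tensor_by_name("loss:0")   # the loss as a Tensor, not an Operation

    grads = tf.gradients(loss, inputs)          # symbolic gradient d(loss)/d(inputs)

    grad_vals = sess.run(grads, feed_dict={inputs: new_embedding})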










python tensorflow gradient-descent
  • Have you tried using loss = graph.get_operation_by_name("loss") in tf.gradients()?
    – Antonio Jurić, 4 hours ago

  • Yes, I have. I get another type of error message, complaining that loss is not a tensor but an operation (the method requires a tensor).
    – Mnemosyne, 4 hours ago

  • Can you please provide the full error trace?
    – Antonio Jurić, 4 hours ago

  • Updated the question with the whole error trace.
    – Mnemosyne, 3 hours ago