Invalid argument: No OpKernel was registered to support Op 'PyFunc' #354
Comments
I have been trying to implement this function in pure TF ops. However, I'm still stuck converting the lines marked as "Tensorflow based Code".
You could just comment out the related code: lines 68~73 of attention_seq2seq.py and the "if self.use_beam_search: **" block of basic_seq2seq.py, to avoid using beam search, since that code is mostly Python-based.
I could manage to implement that.
I can successfully export the seq2seq library based model and use it in TensorFlow Serving.
To clarify: in the working case, I had turned off beam search by supplying
FLAGS_model_params["inference.beam_search.beam_width"]=0
upon exporting. When I instead turn on beam search with
FLAGS_model_params["inference.beam_search.beam_width"]=5
upon exporting, I get the following error while loading the exported model through TensorFlow Serving.
Upon searching for a solution, I found that
py_func
operations are supported only in the Python interpreter. Since TensorFlow Serving runs in C++, the error message appears.
The possible solution is to convert the
py_func
based operations to TF operations. If so, could someone point out exactly which function I should convert to TF ops (the above error message could possibly help)?
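One way to answer "which function exactly" is to scan the exported GraphDef for PyFunc nodes before deploying; their names usually point at the offending Python code. A minimal sketch, assuming a TF 1.x-style frozen GraphDef (the export path in the usage comment is hypothetical):

```python
import tensorflow as tf

def find_pyfunc_nodes(graph_def):
    """Return the names of nodes that TensorFlow Serving's
    C++ runtime cannot execute because they are PyFunc ops."""
    return [node.name for node in graph_def.node if node.op == "PyFunc"]

# Usage sketch -- the path below is hypothetical:
# graph_def = tf.compat.v1.GraphDef()
# with tf.io.gfile.GFile("export/frozen_graph.pb", "rb") as f:
#     graph_def.ParseFromString(f.read())
# print(find_pyfunc_nodes(graph_def))
```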
Since the node
model/att_seq2seq/decode/attention_decoder/PyFunc
seems to occur within the decoder, I cannot get rid of it by pruning or similar. Or are there any alternative solutions?
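As a general illustration of the rewrite (not the actual seq2seq decoder code, which I haven't traced): a computation wrapped in tf.py_function executes in the Python interpreter, so a SavedModel containing it fails under the C++ Serving runtime, while the same math expressed as native TF ops serializes into the graph. A hedged sketch with a made-up score function:

```python
import numpy as np
import tensorflow as tf

def np_score(x):
    # Arbitrary example computation done in NumPy / plain Python.
    return (np.tanh(x) * 2.0).astype(np.float32)

x = tf.constant([0.0, 0.5, 1.0], dtype=tf.float32)

# py_function version: the body runs in the Python interpreter,
# so TensorFlow Serving (C++) cannot execute this node.
py_version = tf.py_function(np_score, [x], tf.float32)

# Pure-TF version: the same result expressed as graph ops, which
# serialize cleanly and run anywhere the TF runtime does.
tf_version = tf.tanh(x) * 2.0
```

Every py_func in the model needs an equivalent native-op rewrite like this before the export becomes servable.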
Thanks in advance.