This repository was archived by the owner on Dec 29, 2022. It is now read-only.

Invalid argument: No OpKernel was registered to support Op 'PyFunc' #354

sathyarr opened this issue Mar 13, 2019 · 4 comments

sathyarr commented Mar 13, 2019

I can successfully export a model based on the seq2seq library and use it in TensorFlow Serving.
Note that in the working case, beam search is turned off by supplying
FLAGS_model_params["inference.beam_search.beam_width"] = 0 when exporting.

When I turn beam search on with
FLAGS_model_params["inference.beam_search.beam_width"] = 5 when exporting,
I get the following error while loading the exported model through TensorFlow Serving:

2019-03-13 18:48:11.664016: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:285] SavedModel load for tags { serve }; Status: fail. Took 97350 microseconds.
2019-03-13 18:48:11.664075: E tensorflow_serving/util/retrier.cc:37] Loading servable: {name: dmodel version: 1551934425} failed: Invalid argument: No OpKernel was registered to support Op 'PyFunc' used by {{node model/att_seq2seq/decode/attention_decoder/PyFunc}}with these attrs: [_output_shapes=[[?,5]], Tin=[DT_INT32, DT_INT32], Tout=[DT_INT32], token="pyfunc_0"]
Registered devices: [CPU]
Registered kernels:
  <no registered kernels>

	 [[{{node model/att_seq2seq/decode/attention_decoder/PyFunc}}]]

Searching for a solution, I found that py_func ops are supported only inside a Python interpreter.
Since TensorFlow Serving runs in C++, this error appears.
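
To illustrate the problem, here is a minimal made-up example (the function double is not from the library) of how tf.py_func produces such a node:

import numpy as np
import tensorflow as tf

def double(x):
    return x * 2  # runs in the Python interpreter, not in the TF runtime

x = tf.placeholder(tf.int32, shape=[None])
# tf.py_func inserts a 'PyFunc' node that calls back into Python.
# That node has no C++ kernel, which is exactly why a C++-only
# runtime such as TensorFlow Serving rejects the graph.
y = tf.py_func(double, [x], tf.int32)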

A possible solution is to convert the py_func-based operations to native TF operations.
If so, could someone point out exactly which function I should convert to TF ops (the above error message may help)?

Since the node model/att_seq2seq/decode/attention_decoder/PyFunc occurs inside the decoder, I cannot get rid of it by pruning the graph.

Or are there any alternative solutions?

Thanks in advance

@sathyarr (Author)

If so, could someone point out exactly which function I should convert to TF ops (the above error message may help)?

py_func is used here

@sathyarr (Author)

I have been trying to implement this function in pure TF ops:

import numpy as np

def gather_tree_py(values, parents):
  """Walks each beam's parent pointers back from the final step."""
  beam_length = values.shape[0]
  num_beams = values.shape[1]
  res = np.zeros_like(values)
  res[-1, :] = values[-1, :]        # Point 1
  for beam_id in range(num_beams):
    parent = parents[-1][beam_id]
    for level in reversed(range(beam_length - 1)):
      res[level, beam_id] = values[level][parent]        # Point 2
      parent = parents[level][parent]
  return np.array(res).astype(values.dtype)
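
To make the semantics concrete, a small made-up example (not from the issue); each column walks its parent pointers back from the final step:

values = np.array([[2, 3],
                   [4, 5],
                   [6, 7]], dtype=np.int32)
parents = np.array([[0, 0],
                    [1, 0],
                    [0, 1]], dtype=np.int32)
print(gather_tree_py(values, parents))
# [[3 2]
#  [4 5]
#  [6 7]]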

However, I'm stuck converting the lines marked Point 1 and Point 2.
They are numpy array assignments.
Is it possible to do that in native TF ops?
If not, are there alternative solutions?
Would writing a custom C++ operation cover this scenario?

TensorFlow-based code so far:

import tensorflow as tf

beam_length = tf.shape(values)[0]
num_beams = tf.shape(values)[1]
res = tf.zeros_like(values)

beam_id = tf.constant(0)
while_condition = lambda beam_id: tf.less(beam_id, num_beams)
def body(beam_id):
    parent = parents[-1][beam_id]

    # inner while loop over levels, from beam_length - 2 down to 0
    level = tf.subtract(beam_length, 2)
    while_condition2 = lambda level, beam_id, parent: tf.greater(level, -1)
    def body2(level, beam_id, parent):
        # Point 2: item assignment is not valid on a Tensor
        res[level, beam_id] = values[level][parent]
        # .assign() only works on tf.Variable, not on a loop tensor
        parent.assign(parents[level][parent])
        return [tf.subtract(level, 1), beam_id, parent]
    tf.while_loop(while_condition2, body2, [level, beam_id, parent])
    # inner while loop

    return [tf.add(beam_id, 1)]

# run the outer loop over beams
tf.while_loop(while_condition, body, [beam_id])
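
For what it's worth, here is a minimal sketch of how the backtracking could be done in pure TF ops without item assignment, by vectorizing across beams: track a parent vector of shape [num_beams], use tf.gather at each level, and collect rows in a tf.TensorArray. The name gather_tree_tf is my own, and this assumes int32 inputs shaped [beam_length, num_beams] as in gather_tree_py:

import tensorflow as tf

def gather_tree_tf(values, parents):
    # values, parents: [beam_length, num_beams] int32 tensors
    beam_length = tf.shape(values)[0]

    # Point 1: the last row is copied unchanged.
    ta = tf.TensorArray(values.dtype, size=beam_length)
    ta = ta.write(beam_length - 1, values[-1])

    def body(level, parent, ta):
        # Point 2: for every beam at once, gather the token its
        # parent held at this level; no item assignment needed.
        ta = ta.write(level, tf.gather(values[level], parent))
        return [level - 1, tf.gather(parents[level], parent), ta]

    _, _, ta = tf.while_loop(
        lambda level, parent, ta: level >= 0,
        body,
        [beam_length - 2, parents[-1], ta])
    return ta.stack()  # [beam_length, num_beams]

Vectorizing over beams removes the outer loop entirely, and ta.stack() replaces the zeros_like buffer that item assignment would have filled.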


ghtwht commented Apr 10, 2019

You could just comment out the related code: lines 68–73 of attention_seq2seq.py and the "if self.use_beam_search: **" block of basic_seq2seq.py, to avoid using beam search, since that code is almost entirely Python-based.

@sathyarr (Author)

You could just comment out the related code: lines 68–73 of attention_seq2seq.py and the "if self.use_beam_search: **" block of basic_seq2seq.py, to avoid using beam search, since that code is almost entirely Python-based.

I managed to implement that py_func as a custom operator.
For those of you interested in the code for the custom operation, see here.
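
For anyone following along: once such a kernel is compiled into a shared library, it is typically loaded on the Python side with tf.load_op_library. The library name gather_tree.so and the op name gather_tree below are hypothetical placeholders, not the actual names from the linked code:

import tensorflow as tf

# Hypothetical library/op names; substitute whatever your build and
# REGISTER_OP declaration actually produce.
gather_tree_module = tf.load_op_library("./gather_tree.so")
outputs = gather_tree_module.gather_tree(values, parents)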
