This repository has been archived by the owner on Feb 7, 2023. It is now read-only.

Key error and segmentation fault (core dumped) with onnx-coreml 1.3.0, onnx 1.7.0, pytorch 1.5 #574

Open
lnysoftin opened this issue Jun 11, 2020 · 1 comment


@lnysoftin

Hello,
When I use convert from onnx-coreml, a segmentation fault (core dumped) arises.
My model is a PyTorch reimplementation of the paper "Generative Image Inpainting with Contextual Attention".
My code is:

import imageio
import numpy as np
from argparse import ArgumentParser

import torch

from trainer import Trainer
from utils.tools import get_config
from onnx_coreml import convert

parser = ArgumentParser()
parser.add_argument('--config', type=str, default='configs/config.yaml',
                    help="training configuration")
parser.add_argument('--image', default='./examples/places2/case2_input.png', type=str,
                    help='The filename of image to be completed.')
parser.add_argument('--mask', default='./examples/places2/case2_mask.png', type=str,
                    help='The filename of mask, value 255 indicates mask.')
parser.add_argument('--output', default='./examples/output2.png', type=str,
                    help='Where to write output.')
parser.add_argument('--model-path', default='./torch_model.p', type=str,
                    help='Path to save model')
args = parser.parse_args()


def main():
    config = get_config(args.config)
    if config['cuda']:
        device = torch.device("cuda:{}".format(config['gpu_ids'][0]))
    else:
        device = torch.device("cpu")
    trainer = Trainer(config)
    trainer.load_state_dict(load_weights(args.model_path, device), strict=False)
    trainer.eval()
    model = trainer.netG

    x = torch.rand(1, 3, 256, 256)
    mask = torch.rand(1, 1, 256, 256)
    # with torch.no_grad():
    #      result = model(x, mask)
    #      #print(result)
    # torch.save(model, './model.pt')
    torch.onnx.export(model,
                      (x, mask),
                      './model.onnx',
                      input_names=["x", "mask"],
                      output_names=["output"],
                      opset_version=11)

    mlmodel = convert(model='./model.onnx', minimum_ios_deployment_target='13')
    print('save')
    mlmodel.save('./model.mlmodel')
    # MLModel prediction
    input_dict = {'x': x.numpy().astype(np.float32), 'mask': mask.numpy().astype(np.float32)}
    pred_coreml = mlmodel.predict(input_dict, useCPUOnly=True)


def load_weights(path, device):
    model_weights = torch.load(path)
    return {
        k: v.to(device)
        for k, v in model_weights.items()
    }


def upcast(x):
    return np.clip((x + 1) * 127.5, 0, 255).astype(np.uint8)


if __name__ == '__main__':
    main()

Are there any ideas on how to avoid the segmentation fault? The output is:

/home/ubuntu/Desktop/pyc/generative-inpainting-pytorch-master/model/networks.py:354: TracerWarning: torch.from_numpy results are registered as constants in the trace. You can safely ignore this warning if you use this function to create tensors out of constant variables that would be the same every time you call this function. In any other case, this might cause the trace to be incorrect.
  flow = torch.from_numpy(flow_to_image(offsets.permute(0, 2, 3, 1).cpu().data.numpy())) / 255.
Segmentation fault (core dumped)
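
To narrow down where the crash happens, one thing that might help (a minimal sketch, assuming the ./model.onnx exported by the script above) is to run the ONNX checker and shape inference on their own, since onnx-coreml calls shape inference internally right before the step that crashes:

import onnx

# Load the exported model and validate it independently of onnx-coreml.
onnx_model = onnx.load('./model.onnx')
onnx.checker.check_model(onnx_model)

# converter.py runs shape inference just before building its graph; calling it
# here shows whether the segmentation fault originates in this step.
inferred = onnx.shape_inference.infer_shapes(onnx_model)
print('shape inference finished;', len(inferred.graph.node), 'nodes in the graph')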

My environment is like this:

python (3.6.6)
pytorch (1.5.0)
onnx (1.7.0)
onnx-coreml (1.3.0)
When I comment out line 500 in converter.py:

   # onnx_model = onnx.shape_inference.infer_shapes(onnx_model)
    graph = _prepare_onnx_graph(onnx_model.graph, transformers, onnx_model.ir_version)

a KeyError arises:

Traceback (most recent call last):
  File "allmodel.py", line 70, in <module>
    main()
  File "allmodel.py", line 49, in main
    mlmodel = convert(model='./model.onnx',minimum_ios_deployment_target='13')
  File "/home/ubuntu/.pyenv/versions/3.6.6/lib/python3.6/site-packages/onnx_coreml/converter.py", line 501, in convert
    graph = _prepare_onnx_graph(onnx_model.graph, transformers, onnx_model.ir_version)
  File "/home/ubuntu/.pyenv/versions/3.6.6/lib/python3.6/site-packages/onnx_coreml/converter.py", line 373, in _prepare_onnx_graph
    graph_ = graph_.transformed(transformers)
  File "/home/ubuntu/.pyenv/versions/3.6.6/lib/python3.6/site-packages/onnx_coreml/_graph.py", line 201, in transformed
    return _apply_graph_transformations(graph, transformers) # type: ignore
  File "/home/ubuntu/.pyenv/versions/3.6.6/lib/python3.6/site-packages/onnx_coreml/_graph.py", line 60, in _apply_graph_transformations
    graph = transformer(graph)
  File "/home/ubuntu/.pyenv/versions/3.6.6/lib/python3.6/site-packages/onnx_coreml/_transformers.py", line 758, in __call__
    ends = node.attrs['ends']
KeyError: 'ends'

Are there any ideas to avoid this fault?
Here is my onnx file. I think this error may be caused by x = F.interpolate(x, scale_factor=0.5, mode='nearest').
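
If the problem is in the Slice handling, it may be because Slice in opset 10 and later takes starts/ends/axes as inputs rather than attributes, so node.attrs['ends'] would not exist in a model exported with opset_version=11. A quick check (a sketch, again assuming the same ./model.onnx) is to list the Slice nodes and see whether they carry any attributes at all:

import onnx

# Print each Slice node's attribute names and inputs; with opset >= 10 the
# starts/ends/axes values arrive as extra inputs instead of attributes,
# which would explain the KeyError on node.attrs['ends'].
model = onnx.load('./model.onnx')
for node in model.graph.node:
    if node.op_type == 'Slice':
        print(node.name, [a.name for a in node.attribute], list(node.input))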

model.zip

@lnysoftin
Author

I think this error may be caused by torch.clamp and F.interpolate; can someone help? I tried onnx-simplifier, but it still does not work; maybe the network is too complex.
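
One workaround that might be worth trying (a sketch, not verified against this model) is to re-export at a lower opset such as 9, where Slice still stores its ends values as attributes; whether the downsampling F.interpolate exports cleanly at that opset would need to be checked. Using the model, x and mask variables from the export script above:

# Hypothetical workaround: re-export at opset 9 so Slice keeps 'ends' as an
# attribute. Assumes `model`, `x` and `mask` from the script above and that
# the rest of the graph is expressible at opset 9.
torch.onnx.export(model,
                  (x, mask),
                  './model_opset9.onnx',
                  input_names=["x", "mask"],
                  output_names=["output"],
                  opset_version=9)

mlmodel = convert(model='./model_opset9.onnx', minimum_ios_deployment_target='13')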
