[tools/onnx-subgraph] add multi subgraphs inference code #14769
seanshpark merged 3 commits into Samsung:master
Conversation
add code for multi subgraphs inference

ONE-DCO-1.0-Signed-off-by: Youxin Chen <yx113.chen@samsung.com>
| "x": np.random.rand(1, 3, 256, 256).astype(np.float32), | ||
| } | ||
| initial_input_data = prepare_initial_input_data(args.single, default_input_data) | ||
|
|
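The diff above builds a default feed dict for the model input `x` and passes it through `prepare_initial_input_data`. A minimal, self-contained sketch of how such a default feed could be constructed (the helper name `make_default_input_data` and the seeding are illustrative assumptions, not the PR's actual code; only the input name and shape mirror the diff):

```python
import numpy as np

def make_default_input_data(shape=(1, 3, 256, 256), seed=0):
    # Hypothetical helper: build a reproducible random feed for the
    # model input "x", matching the shape used in the diff above.
    rng = np.random.default_rng(seed)
    return {"x": rng.random(shape).astype(np.float32)}
```

Seeding the generator keeps runs reproducible, which matters when the same feed must be given to both the source model and the split subgraphs for accuracy comparison.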
Please remove this line; it doesn't seem related to the change.
    model_input_data = {name: input_data[name] for name in input_names}
    outputs = session.run(None, model_input_data)
    current_model_outputs = dict(zip(output_names, outputs))
    if output_names_to_collect is not None:
We can hoist this `if` line above the for loop: when `collected_outputs` is None, the return value is `{}`, so there is no need to re-check the parameter on every iteration.
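The hoisting suggestion can be sketched as follows. The function name `collect_outputs` and the `(output_names, outputs)` pairs are illustrative stand-ins for the PR's code, assuming the same intent: check the argument once before the loop and return `{}` early when nothing is requested.

```python
def collect_outputs(models_outputs, output_names_to_collect):
    # Hypothetical sketch of the review suggestion: hoist the None
    # check out of the per-model loop and return {} early.
    if output_names_to_collect is None:
        return {}
    collected = {}
    # models_outputs: iterable of (output_names, outputs) per subgraph
    for output_names, outputs in models_outputs:
        current_model_outputs = dict(zip(output_names, outputs))
        for name in output_names_to_collect:
            if name in current_model_outputs:
                collected[name] = current_model_outputs[name]
    return collected
```

Checking the parameter once keeps the loop body focused on per-model work and makes the "nothing requested" path obvious at a glance.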
The updated change doesn't reflect my comment correctly. Please read it again.
    # Perform inference using multiple split subgraph models
    output_multiple = model_inference.infer_multiple_onnx_models(
        initial_input_data, output_names_list)
    print("Multiple subgraph inference completed!")
Q) Is `infer_single_onnx_model` for the "source" ONNX model, and `infer_multiple_onnx_models` for our "target" split models?
Yes, we use this code to verify the split models' inference results: we run the split subgraphs, then compare the output data against the "source" ONNX model to evaluate accuracy. The comparison code will come in the next PR.
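The chained-inference flow described above can be sketched without onnxruntime: each subgraph consumes named tensors from a shared pool and adds its named outputs back to that pool for downstream subgraphs. The function name mirrors the PR; the `(input_names, output_names, run_fn)` triples are stand-ins for real onnxruntime sessions, so this is a sketch of the dataflow only, not the PR's implementation.

```python
import numpy as np

def infer_multiple_onnx_models(subgraphs, initial_input_data):
    # subgraphs: list of (input_names, output_names, run_fn) triples,
    # where run_fn stands in for an onnxruntime session.run call.
    available = dict(initial_input_data)  # pool of named tensors
    for input_names, output_names, run_fn in subgraphs:
        # feed each subgraph only the tensors it declares as inputs
        feed = {name: available[name] for name in input_names}
        outputs = run_fn(feed)
        # make this subgraph's outputs visible to later subgraphs
        available.update(zip(output_names, outputs))
    return available
```

Because every stage's outputs land in the shared pool, the final dict holds all intermediate and final tensors, which is convenient for comparing against the source model's outputs name by name.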
Oh, yes, I moved the parameter exception checking out of the loop now, thank you :)
update code as per the review comment
move parameter exception checking out of the loop
related issue: #14534
historical full changes PR: #14613
add code for multi subgraphs inference
ONE-DCO-1.0-Signed-off-by: Youxin Chen yx113.chen@samsung.com