
[tools/onnx-subgraph] add multi subgraphs inference code#14769

Merged
seanshpark merged 3 commits into Samsung:master from chenyx113:onnx-subgraph-0304
Mar 5, 2025

Conversation

@chenyx113
Contributor

related issue: #14534
historical full changes PR: #14613

add code for multi sub graphs inference

ONE-DCO-1.0-Signed-off-by: Youxin Chen yx113.chen@samsung.com

@chenyx113 chenyx113 marked this pull request as ready for review March 4, 2025 13:08
"x": np.random.rand(1, 3, 256, 256).astype(np.float32),
}
initial_input_data = prepare_initial_input_data(args.single, default_input_data)

Contributor

please remove this line, it doesn't seem to be related to the change.

model_input_data = {name: input_data[name] for name in input_names}
outputs = session.run(None, model_input_data)
current_model_outputs = dict(zip(output_names, outputs))
if output_names_to_collect is not None:
Contributor

we can hoist this `if` line above the for loop: when `output_names_to_collect` is None, the return is `{}`.
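The suggested refactor might look like the sketch below (the names `output_names_to_collect` and `collected_outputs` come from the diff context; the function wrapper and loop body are assumptions, not the PR's actual code):

```python
def collect_outputs(results, output_names_to_collect):
    """Collect only the requested outputs from per-model results.

    The None check is hoisted out of the loop: when nothing is
    requested we return an empty dict immediately instead of
    re-testing the condition on every iteration.
    """
    if output_names_to_collect is None:
        return {}
    collected_outputs = {}
    for name, value in results.items():
        if name in output_names_to_collect:
            collected_outputs[name] = value
    return collected_outputs
```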

Contributor

the updated change doesn't reflect my comment correctly.
please read again.

# Perform inference using multiple split subgraph models
output_multiple = model_inference.infer_multiple_onnx_models(
initial_input_data, output_names_list)
print("Multiple subgraph inference completed!")
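The call above chains the split subgraph models. A minimal sketch of that chaining, assuming each subgraph's outputs are merged into the running tensor dictionary so later subgraphs can consume them (`infer_multiple_models` and the callable-based `runners` are hypothetical stand-ins; with onnxruntime each runner would wrap `session.run` as in the PR's diff):

```python
import numpy as np

def infer_multiple_models(runners, initial_input_data):
    """Run split subgraph models in sequence.

    `runners` is a list of callables mapping {name: array} to
    {name: array}. Each subgraph's outputs are merged into the
    accumulated data so downstream subgraphs can read them.
    """
    data = dict(initial_input_data)
    for run in runners:
        outputs = run(data)
        data.update(outputs)  # expose this subgraph's tensors to the next one
    return data
```

Used with two toy "subgraphs", `{"x": [1.0]}` flows through `y = 2x` and then `z = y + 1`.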
Contributor

@seanshpark seanshpark Mar 4, 2025

Q) Is `infer_single_onnx_model` for the "source" ONNX model and `infer_multiple_onnx_models` for our "target" split models?

Contributor (Author)

Yes, we use this code to verify the split models' inference results, then compare the output data and evaluate the accuracy against the "source" ONNX model. The comparison code will come in the next PR.
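The comparison described above (which the author says lands in a follow-up PR) could be sketched like this, using `numpy.allclose` per output tensor; `compare_outputs` and its tolerance defaults are assumptions, not the project's actual implementation:

```python
import numpy as np

def compare_outputs(reference, candidate, rtol=1e-4, atol=1e-5):
    """Compare the source model's outputs against the split models' outputs.

    Returns a {name: max_abs_error} report and a flag that is True only
    when every tensor is within tolerance.
    """
    report = {}
    all_close = True
    for name, ref in reference.items():
        cand = candidate[name]
        report[name] = float(np.max(np.abs(ref - cand)))
        if not np.allclose(ref, cand, rtol=rtol, atol=atol):
            all_close = False
    return report, all_close
```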

Contributor (Author)

Oh, yes, I moved the parameter exception checking out of the loop now, thank you :)

update code per the review comment
move parameter exception checking out of the loop
Contributor

@seanshpark seanshpark left a comment

LGTM

@seanshpark seanshpark merged commit 935afd2 into Samsung:master Mar 5, 2025
5 checks passed