Automation tests for tensorflow model gRPC RHOAIENG-9052 for triton #1844
base: master
Conversation
${INFERENCE_GRPC_INPUT_TENSORFLOW}=    @tests/Resources/Files/triton/kserve-triton-tensorflow-gRPC-input.json
${TENSORFLOW_MODEL_NAME}=    inception_graphdef
${TENSORFLOW_MODEL_LABEL}=    inceptiongraphdef
${TENSORFLOW_RUNTIME_NAME}=    triton-tensorflow-grpc
Check notice (Code scanning / Robocop): Variable '{{ name }}' is assigned but not used
...odel_serving/1009__model_serving_triton_on_kserve/1009__model_serving_triton_on_kserve.robot
Fixed
@@ -152,6 +159,51 @@
...    AND
...    Delete Serving Runtime Template From CLI    displayed_name=triton-kserve-grpc

Test Tensorflow Model Grpc Inference Via UI (Triton on Kserve)    # robocop: off=too-long-test-case
Check warning (Code scanning / Robocop): Test case '{{ test_name }}' has too many keywords inside ({{ keyword_count }}/{{ max_allowed_count }})
...    token=${TRUE}
Wait For Pods To Be Ready    label_selector=serving.kserve.io/inferenceservice=${TENSORFLOW_MODEL_LABEL}
...    namespace=${PRJ_TITLE}
${EXPECTED_INFERENCE_GRPC_OUTPUT_TENSORFLOW}=    Load Json File    file_path=${EXPECTED_INFERENCE_GRPC_OUTPUT_FILE_TENSORFLOW}
Check warning (Code scanning / Robocop): Line is too long ({{ line_length }}/{{ allowed_length }})
${EXPECTED_INFERENCE_GRPC_OUTPUT_TENSORFLOW}=    Load Json File    file_path=${EXPECTED_INFERENCE_GRPC_OUTPUT_FILE_TENSORFLOW}
...    as_string=${TRUE}
${EXPECTED_INFERENCE_GRPC_OUTPUT_TENSORFLOW}=    Load Json String    ${EXPECTED_INFERENCE_GRPC_OUTPUT_TENSORFLOW}
${EXPECTED_INFERENCE_GRPC_OUTPUT_TENSORFLOW}=    Evaluate    json.dumps(${EXPECTED_INFERENCE_GRPC_OUTPUT_TENSORFLOW})
Check warning (Code scanning / Robocop): Line is too long ({{ line_length }}/{{ allowed_length }})
...odel_serving/1009__model_serving_triton_on_kserve/1009__model_serving_triton_on_kserve.robot
Fixed
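The three keywords in the snippet above round-trip the expected output through JSON parsing and `json.dumps` so that whitespace and key-formatting differences in the fixture file cannot cause a spurious mismatch against the live gRPC response. A minimal Python sketch of that normalization step (the sample payloads below are illustrative, not taken from the PR's fixture files):

```python
import json

def normalize_json(text: str) -> str:
    """Parse JSON text and re-serialize it into a canonical string,
    mirroring Load Json String followed by Evaluate json.dumps(...)."""
    return json.dumps(json.loads(text))

# Two differently formatted copies of the same (illustrative) payload
# normalize to an identical string, so a plain string comparison works:
a = normalize_json('{"outputs": [{"name": "softmax", "shape": [1, 1001]}]}')
b = normalize_json('{ "outputs":[ {"name":"softmax","shape":[1,1001]} ] }')
```

Doing the comparison on the re-serialized strings avoids depending on how the fixture file happens to be indented.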
Robot Results
@@ -39,6 +39,13 @@ ${PYTORCH_MODEL_NAME}=    resnet50
${PYTORCH_RUNTIME_NAME}=    triton-kserve-rest
${PYTORCH_RUNTIME_FILEPATH}=    ${RESOURCES_DIRPATH}/triton_onnx_rest_servingruntime.yaml
${EXPECTED_INFERENCE_REST_OUTPUT_FILE_PYTORCH}=    tests/Resources/Files/triton/kserve-triton-resnet-rest-output.json
${INFERENCE_GRPC_INPUT_TENSORFLOW}=    @tests/Resources/Files/triton/kserve-triton-tensorflow-gRPC-input.json
Maybe you could insert a reference to the model in the file name.
...odel_serving/1009__model_serving_triton_on_kserve/1009__model_serving_triton_on_kserve.robot
Outdated
${TENSORFLOW_RUNTIME_NAME}=    triton-tensorflow-grpc
${TENSORFLOW_GRPC_RUNTIME_NAME}=    triton-tensorflow-grpc
${TENSORFLOW_RUNTIME_FILEPATH}=    ${RESOURCES_DIRPATH}/triton_tensorflow_gRPC_servingruntime.yaml
${EXPECTED_INFERENCE_GRPC_OUTPUT_FILE_TENSORFLOW}=    tests/Resources/Files/triton/kserve-triton-inception_graphdef-gRPC-output.json
Check warning (Code scanning / Robocop): Line is too long ({{ line_length }}/{{ allowed_length }})
Verified with Jenkins Build 581
@@ -0,0 +1,64 @@
apiVersion: serving.kserve.io/v1alpha1
It seems the same as https://github.com/red-hat-data-services/ods-ci/pull/1843/files
Can we avoid duplication and use one file?
We are adding this file because we cannot run the tests locally without it. Once all the PRs are approved, we will remove the duplicate files.
488c84f to c4178ea (compare)
@@ -223,6 +224,12 @@
END
RETURN    ${url}

Get Model Route for gRPC Via UI
Check warning (Code scanning / Robocop): Missing documentation in '{{ name }}' keyword
Get Model Route for gRPC Via UI
    [Arguments]    ${model_name}
Check warning (Code scanning / Robocop): Trailing whitespace at the end of line
${EXPECTED_INFERENCE_GRPC_OUTPUT_TENSORFLOW}=    Evaluate    json.dumps(${EXPECTED_INFERENCE_GRPC_OUTPUT_TENSORFLOW})
Log    ${EXPECTED_INFERENCE_GRPC_OUTPUT_TENSORFLOW}
Open Model Serving Home Page
${host}=    Get Model Route for gRPC Via UI    model_name=${TENSORFLOW_MODEL_NAME}
Check warning (Code scanning / Robocop): Keyword name '{{ keyword_name }}' does not follow case convention
Check warning (Code scanning / Robocop): Trailing whitespace at the end of line
Open Model Serving Home Page
${host}=    Get Model Route for gRPC Via UI    model_name=${TENSORFLOW_MODEL_NAME}
Log    ${host}
${token}=    Get Access Token Via UI    single_model=${TRUE}    model_name=${TENSORFLOW_MODEL_NAME}    project_name=${PRJ_TITLE}
Check warning (Code scanning / Robocop): Line is too long ({{ line_length }}/{{ allowed_length }})
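After resolving the route host and bearer token via the UI, the test issues a gRPC inference call against the Triton runtime. A hedged Python sketch of assembling a KServe v2-style `ModelInfer` request body for such a call; the input tensor name, shape, and datatype used here are placeholder assumptions, not values taken from the PR's input file:

```python
import json

def build_infer_request(model_name: str, input_name: str,
                        shape: list, data: list) -> str:
    """Assemble a KServe v2-style inference request body as JSON.
    Field names follow the open inference protocol; the concrete
    tensor name, shape, and datatype are illustrative placeholders."""
    request = {
        "model_name": model_name,
        "inputs": [{
            "name": input_name,
            "shape": shape,
            "datatype": "FP32",
            "contents": {"fp32_contents": data},
        }],
    }
    return json.dumps(request)

# Hypothetical usage for the model under test:
body = build_infer_request("inception_graphdef", "input",
                           [1, 3], [0.1, 0.2, 0.3])
```

In the actual test this payload comes from the checked-in `kserve-triton-tensorflow-gRPC-input.json` file rather than being built inline.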
Quality Gate passed
Added Automation Test for RHOAIENG-13198