
Commit 53ee808

BigQuerySource Additional Steps
1 parent 6f3af0b commit 53ee808

8 files changed

Lines changed: 433 additions & 7 deletions

File tree

src/e2e-test/features/bigquery/source/BigQuerySourceError.feature

Lines changed: 24 additions & 0 deletions
@@ -55,3 +55,27 @@ Feature: BigQuery source - Validate BigQuery source plugin error scenarios
     Then Enter BigQuery source property table name
     Then Enter BigQuery property temporary bucket name "bqInvalidTemporaryBucket"
     Then Verify the BigQuery validation error message for invalid property "bucket"
+
+  @BQ_SOURCE_TEST
+  Scenario Outline: To verify error message when unsupported format is provided in Partition Start Date and Partition End Date
+    Given Open Datafusion Project to configure pipeline
+    When Expand Plugin group in the LHS plugins list: "Source"
+    When Select plugin: "BigQuery" from the plugins list as: "Source"
+    Then Navigate to the properties page of plugin: "BigQuery"
+    Then Replace input plugin property: "project" with value: "projectId"
+    Then Replace input plugin property: "dataset" with value: "dataset"
+    Then Replace input plugin property: "table" with value: "bqSourceTable"
+    Then Click on the Get Schema button
+    Then Enter BigQuery source properties partitionFrom and partitionTo
+    Then Validate BigQuery source incorrect property error for Partition Start date "<property>" value "<value>"
+    Then Validate BigQuery source incorrect property error for Partition End date "<property>" value "<value>"
+    Then Enter BigQuery source properties referenceName
+    Then Validate BigQuery source incorrect property error for reference name "<property>" value "<value>"
+    Then Enter BigQuery source properties filter
+    Examples:
+      | property      | value                      |
+      | partitionFrom | bqIncorrectFormatStartDate |
+      | partitionTo   | bqIncorrectFormatEndDate   |
+      | referenceName | bqIncorrectReferenceName   |
+      | filter        | bqIncorrectFilter          |
+
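The scenario outline above feeds deliberately malformed values (e.g. `bqIncorrectFormatStartDate`) into `partitionFrom`/`partitionTo` and expects a validation error. As an illustrative sketch only (not the plugin's actual code), the underlying check amounts to a strict date-format parse; the `PartitionDateValidator` class name and the `yyyy-MM-dd` pattern are assumptions of mine:

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;
import java.time.format.DateTimeParseException;

// Hypothetical helper: the BigQuery source expects partition dates in
// yyyy-MM-dd form; anything else should surface a validation error.
public class PartitionDateValidator {
    private static final DateTimeFormatter PARTITION_DATE =
        DateTimeFormatter.ofPattern("yyyy-MM-dd");

    /** Returns true only when the value parses as a yyyy-MM-dd date. */
    public static boolean isValidPartitionDate(String value) {
        try {
            LocalDate.parse(value, PARTITION_DATE);
            return true;
        } catch (DateTimeParseException e) {
            return false;
        }
    }
}
```

A step definition backing "Validate BigQuery source incorrect property error for Partition Start date" would assert that this check fails and that the expected error text appears in the UI.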

src/e2e-test/features/bigquery/source/BigQueryToBigQuery.feature

Lines changed: 31 additions & 0 deletions
@@ -354,3 +354,34 @@ Feature: BigQuery source - Verification of BigQuery to BigQuery successful data
     Then Open and capture logs
     Then Verify the pipeline status is "Succeeded"
     Then Validate the values of records transferred to BQ sink is equal to the values from source BigQuery table
+
+  @BQ_SOURCE_TEST @BQ_SINK_TEST
+  Scenario: Validate that the pipeline run fails when an incorrect filter value is provided, and verify the error message in the logs
+    Given Open Datafusion Project to configure pipeline
+    When Source is BigQuery
+    When Sink is BigQuery
+    Then Open BigQuery source properties
+    Then Enter BigQuery property reference name
+    Then Enter BigQuery property projectId "projectId"
+    Then Enter BigQuery property datasetProjectId "projectId"
+    Then Override Service account details if set in environment variables
+    Then Enter BigQuery property dataset "dataset"
+    Then Enter BigQuery source property table name
+    Then Enter input plugin property: "filter" with value: "incorrectFilter"
+    Then Validate output schema with expectedSchema "bqSourceSchema"
+    Then Validate "BigQuery" plugin properties
+    Then Close the BigQuery properties
+    Then Open BigQuery sink properties
+    Then Override Service account details if set in environment variables
+    Then Enter the BigQuery sink mandatory properties
+    Then Validate "BigQuery" plugin properties
+    Then Close the BigQuery properties
+    Then Connect source as "BigQuery" and sink as "BigQuery" to establish connection
+    Then Save the pipeline
+    Then Deploy the pipeline
+    Then Run the Pipeline in Runtime
+    Then Wait till pipeline is in running state
+    Then Verify the pipeline status is "Failed"
+    Then Open Pipeline logs and verify Log entries having below listed Level and Message:
+      | Level | Message                       |
+      | ERROR | errorLogsMessageInvalidFilter |
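The final step of this scenario scans the captured pipeline logs for an entry with a given level and message (the message key `errorLogsMessageInvalidFilter` resolves to the expected text in a properties file). A minimal sketch of that check, assuming a `LogVerifier` helper of my own invention rather than the actual cdap-e2e-tests API:

```java
import java.util.List;

// Illustrative sketch: scan captured log lines for one that carries
// both the expected level (e.g. "ERROR") and the expected message text.
public class LogVerifier {
    public static boolean containsLogEntry(List<String> logLines,
                                           String level, String message) {
        for (String line : logLines) {
            if (line.contains(level) && line.contains(message)) {
                return true;
            }
        }
        return false;
    }
}
```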

src/e2e-test/features/bigquery/source/BigQueryToGCS_WithMacro.feature

Lines changed: 219 additions & 0 deletions
@@ -69,3 +69,222 @@ Feature: BigQuery source - Verification of BigQuery to GCS successful data transfer
     Then Verify the pipeline status is "Succeeded"
     Then Verify data is transferred to target GCS bucket
     Then Validate the cmek key "cmekGCS" of target GCS bucket if cmek is enabled
+
+  @CMEK @BQ_SOURCE_TEST @GCS_SINK_TEST
+  Scenario: Validate successful records transfer from BigQuery to GCS with macro arguments for partition start date and partition end date
+    Given Open Datafusion Project to configure pipeline
+    When Source is BigQuery
+    When Sink is GCS
+    Then Open BigQuery source properties
+    Then Enter BigQuery property reference name
+    Then Enter BigQuery property "projectId" as macro argument "bqProjectId"
+    Then Enter BigQuery property "datasetProjectId" as macro argument "bqDatasetProjectId"
+    Then Enter BigQuery property "partitionFrom" as macro argument "bqStartDate"
+    Then Enter BigQuery property "partitionTo" as macro argument "bqEndDate"
+    Then Enter BigQuery property "serviceAccountType" as macro argument "serviceAccountType"
+    Then Enter BigQuery property "serviceAccountFilePath" as macro argument "serviceAccount"
+    Then Enter BigQuery property "serviceAccountJSON" as macro argument "serviceAccount"
+    Then Enter BigQuery property "dataset" as macro argument "bqDataset"
+    Then Enter BigQuery property "table" as macro argument "bqSourceTable"
+    Then Validate "BigQuery" plugin properties
+    Then Close the BigQuery properties
+    Then Open GCS sink properties
+    Then Enter GCS property reference name
+    Then Enter GCS property "projectId" as macro argument "gcsProjectId"
+    Then Enter GCS property "serviceAccountType" as macro argument "serviceAccountType"
+    Then Enter GCS property "serviceAccountFilePath" as macro argument "serviceAccount"
+    Then Enter GCS property "serviceAccountJSON" as macro argument "serviceAccount"
+    Then Enter GCS property "path" as macro argument "gcsSinkPath"
+    Then Enter GCS sink property "pathSuffix" as macro argument "gcsPathSuffix"
+    Then Enter GCS property "format" as macro argument "gcsFormat"
+    Then Enter GCS sink cmek property "encryptionKeyName" as macro argument "cmekGCS" if cmek is enabled
+    Then Validate "GCS" plugin properties
+    Then Close the GCS properties
+    Then Connect source as "BigQuery" and sink as "GCS" to establish connection
+    Then Save the pipeline
+    Then Preview and run the pipeline
+    Then Enter runtime argument value "projectId" for key "bqProjectId"
+    Then Enter runtime argument value "projectId" for key "bqDatasetProjectId"
+    Then Enter runtime argument value "partitionFrom" for key "bqStartDate"
+    Then Enter runtime argument value "partitionTo" for key "bqEndDate"
+    Then Enter runtime argument value "serviceAccountType" for key "serviceAccountType"
+    Then Enter runtime argument value "serviceAccount" for key "serviceAccount"
+    Then Enter runtime argument value "dataset" for key "bqDataset"
+    Then Enter runtime argument value for BigQuery source table name key "bqSourceTable"
+    Then Enter runtime argument value "projectId" for key "gcsProjectId"
+    Then Enter runtime argument value for GCS sink property path key "gcsSinkPath"
+    Then Enter runtime argument value "gcsPathDateSuffix" for key "gcsPathSuffix"
+    Then Enter runtime argument value "csvFormat" for key "gcsFormat"
+    Then Enter runtime argument value "cmekGCS" for GCS cmek property key "cmekGCS" if GCS cmek is enabled
+    Then Run the preview of pipeline with runtime arguments
+    Then Wait till pipeline preview is in running state
+    Then Open and capture pipeline preview logs
+    Then Verify the preview run status of pipeline in the logs is "succeeded"
+    Then Close the pipeline logs
+    Then Click on preview data for GCS sink
+    Then Close the preview data
+    Then Deploy the pipeline
+    Then Run the Pipeline in Runtime
+    Then Enter runtime argument value "projectId" for key "bqProjectId"
+    Then Enter runtime argument value "projectId" for key "bqDatasetProjectId"
+    Then Enter runtime argument value "partitionFrom" for key "bqStartDate"
+    Then Enter runtime argument value "partitionTo" for key "bqEndDate"
+    Then Enter runtime argument value "serviceAccountType" for key "serviceAccountType"
+    Then Enter runtime argument value "serviceAccount" for key "serviceAccount"
+    Then Enter runtime argument value "dataset" for key "bqDataset"
+    Then Enter runtime argument value for BigQuery source table name key "bqSourceTable"
+    Then Enter runtime argument value "projectId" for key "gcsProjectId"
+    Then Enter runtime argument value for GCS sink property path key "gcsSinkPath"
+    Then Enter runtime argument value "gcsPathDateSuffix" for key "gcsPathSuffix"
+    Then Enter runtime argument value "csvFormat" for key "gcsFormat"
+    Then Enter runtime argument value "cmekGCS" for GCS cmek property key "cmekGCS" if GCS cmek is enabled
+    Then Run the Pipeline in Runtime with runtime arguments
+    Then Wait till pipeline is in running state
+    Then Open and capture logs
+    Then Verify the pipeline status is "Succeeded"
+    Then Verify data is transferred to target GCS bucket
+    Then Validate the cmek key "cmekGCS" of target GCS bucket if cmek is enabled
+
+  @CMEK @BQ_SOURCE_TEST @GCS_SINK_TEST
+  Scenario: Validate successful records transfer from BigQuery to GCS with macro arguments for filter
+    Given Open Datafusion Project to configure pipeline
+    When Source is BigQuery
+    When Sink is GCS
+    Then Open BigQuery source properties
+    Then Enter BigQuery property reference name
+    Then Enter BigQuery property "projectId" as macro argument "bqProjectId"
+    Then Enter BigQuery property "datasetProjectId" as macro argument "bqDatasetProjectId"
+    Then Enter BigQuery property "filter" as macro argument "bqFilter"
+    Then Enter BigQuery property "serviceAccountType" as macro argument "serviceAccountType"
+    Then Enter BigQuery property "serviceAccountFilePath" as macro argument "serviceAccount"
+    Then Enter BigQuery property "serviceAccountJSON" as macro argument "serviceAccount"
+    Then Enter BigQuery property "dataset" as macro argument "bqDataset"
+    Then Enter BigQuery property "table" as macro argument "bqSourceTable"
+    Then Validate "BigQuery" plugin properties
+    Then Close the BigQuery properties
+    Then Open GCS sink properties
+    Then Enter GCS property reference name
+    Then Enter GCS property "projectId" as macro argument "gcsProjectId"
+    Then Enter GCS property "serviceAccountType" as macro argument "serviceAccountType"
+    Then Enter GCS property "serviceAccountFilePath" as macro argument "serviceAccount"
+    Then Enter GCS property "serviceAccountJSON" as macro argument "serviceAccount"
+    Then Enter GCS property "path" as macro argument "gcsSinkPath"
+    Then Enter GCS sink property "pathSuffix" as macro argument "gcsPathSuffix"
+    Then Enter GCS property "format" as macro argument "gcsFormat"
+    Then Enter GCS sink cmek property "encryptionKeyName" as macro argument "cmekGCS" if cmek is enabled
+    Then Validate "GCS" plugin properties
+    Then Close the GCS properties
+    Then Connect source as "BigQuery" and sink as "GCS" to establish connection
+    Then Save the pipeline
+    Then Preview and run the pipeline
+    Then Enter runtime argument value "projectId" for key "bqProjectId"
+    Then Enter runtime argument value "projectId" for key "bqDatasetProjectId"
+    Then Enter runtime argument value "filter" for key "bqFilter"
+    Then Enter runtime argument value "serviceAccountType" for key "serviceAccountType"
+    Then Enter runtime argument value "serviceAccount" for key "serviceAccount"
+    Then Enter runtime argument value "dataset" for key "bqDataset"
+    Then Enter runtime argument value for BigQuery source table name key "bqSourceTable"
+    Then Enter runtime argument value "projectId" for key "gcsProjectId"
+    Then Enter runtime argument value for GCS sink property path key "gcsSinkPath"
+    Then Enter runtime argument value "gcsPathDateSuffix" for key "gcsPathSuffix"
+    Then Enter runtime argument value "csvFormat" for key "gcsFormat"
+    Then Enter runtime argument value "cmekGCS" for GCS cmek property key "cmekGCS" if GCS cmek is enabled
+    Then Run the preview of pipeline with runtime arguments
+    Then Wait till pipeline preview is in running state
+    Then Open and capture pipeline preview logs
+    Then Verify the preview run status of pipeline in the logs is "succeeded"
+    Then Close the pipeline logs
+    Then Click on preview data for GCS sink
+    Then Close the preview data
+    Then Deploy the pipeline
+    Then Run the Pipeline in Runtime
+    Then Enter runtime argument value "projectId" for key "bqProjectId"
+    Then Enter runtime argument value "projectId" for key "bqDatasetProjectId"
+    Then Enter runtime argument value "filter" for key "bqFilter"
+    Then Enter runtime argument value "serviceAccountType" for key "serviceAccountType"
+    Then Enter runtime argument value "serviceAccount" for key "serviceAccount"
+    Then Enter runtime argument value "dataset" for key "bqDataset"
+    Then Enter runtime argument value for BigQuery source table name key "bqSourceTable"
+    Then Enter runtime argument value "projectId" for key "gcsProjectId"
+    Then Enter runtime argument value for GCS sink property path key "gcsSinkPath"
+    Then Enter runtime argument value "gcsPathDateSuffix" for key "gcsPathSuffix"
+    Then Enter runtime argument value "csvFormat" for key "gcsFormat"
+    Then Enter runtime argument value "cmekGCS" for GCS cmek property key "cmekGCS" if GCS cmek is enabled
+    Then Run the Pipeline in Runtime with runtime arguments
+    Then Wait till pipeline is in running state
+    Then Open and capture logs
+    Then Verify the pipeline status is "Succeeded"
+    Then Verify data is transferred to target GCS bucket
+    Then Validate the cmek key "cmekGCS" of target GCS bucket if cmek is enabled
+
+  @CMEK @BQ_SOURCE_TEST @GCS_SINK_TEST
+  Scenario: Validate successful records transfer from BigQuery to GCS with macro arguments for output schema
+    Given Open Datafusion Project to configure pipeline
+    When Source is BigQuery
+    When Sink is GCS
+    Then Open BigQuery source properties
+    Then Enter BigQuery property reference name
+    Then Enter BigQuery property "projectId" as macro argument "bqProjectId"
+    Then Enter BigQuery property "datasetProjectId" as macro argument "bqDatasetProjectId"
+    Then Enter BigQuery property "serviceAccountType" as macro argument "serviceAccountType"
+    Then Enter BigQuery property "serviceAccountFilePath" as macro argument "serviceAccount"
+    Then Enter BigQuery property "serviceAccountJSON" as macro argument "serviceAccount"
+    Then Enter BigQuery property "dataset" as macro argument "bqDataset"
+    Then Enter BigQuery property "table" as macro argument "bqSourceTable"
+    Then Enter BigQuery source property output schema "outputSchema" as macro argument "bqOutputSchema"
+    Then Validate "BigQuery" plugin properties
+    Then Close the BigQuery properties
+    Then Open GCS sink properties
+    Then Enter GCS property reference name
+    Then Enter GCS property "projectId" as macro argument "gcsProjectId"
+    Then Enter GCS property "serviceAccountType" as macro argument "serviceAccountType"
+    Then Enter GCS property "serviceAccountFilePath" as macro argument "serviceAccount"
+    Then Enter GCS property "serviceAccountJSON" as macro argument "serviceAccount"
+    Then Enter GCS property "path" as macro argument "gcsSinkPath"
+    Then Enter GCS sink property "pathSuffix" as macro argument "gcsPathSuffix"
+    Then Enter GCS property "format" as macro argument "gcsFormat"
+    Then Enter GCS sink cmek property "encryptionKeyName" as macro argument "cmekGCS" if cmek is enabled
+    Then Validate "GCS" plugin properties
+    Then Close the GCS properties
+    Then Connect source as "BigQuery" and sink as "GCS" to establish connection
+    Then Save the pipeline
+    Then Preview and run the pipeline
+    Then Enter runtime argument value "projectId" for key "bqProjectId"
+    Then Enter runtime argument value "projectId" for key "bqDatasetProjectId"
+    Then Enter runtime argument value "serviceAccountType" for key "serviceAccountType"
+    Then Enter runtime argument value "serviceAccount" for key "serviceAccount"
+    Then Enter runtime argument value "dataset" for key "bqDataset"
+    Then Enter runtime argument value for BigQuery source table name key "bqSourceTable"
+    Then Enter runtime argument value "OutputSchema" for key "bqOutputSchema"
+    Then Enter runtime argument value "projectId" for key "gcsProjectId"
+    Then Enter runtime argument value for GCS sink property path key "gcsSinkPath"
+    Then Enter runtime argument value "gcsPathDateSuffix" for key "gcsPathSuffix"
+    Then Enter runtime argument value "csvFormat" for key "gcsFormat"
+    Then Enter runtime argument value "cmekGCS" for GCS cmek property key "cmekGCS" if GCS cmek is enabled
+    Then Run the preview of pipeline with runtime arguments
+    Then Wait till pipeline preview is in running state
+    Then Open and capture pipeline preview logs
+    Then Verify the preview run status of pipeline in the logs is "succeeded"
+    Then Close the pipeline logs
+    Then Click on preview data for GCS sink
+    Then Close the preview data
+    Then Deploy the pipeline
+    Then Run the Pipeline in Runtime
+    Then Enter runtime argument value "projectId" for key "bqProjectId"
+    Then Enter runtime argument value "projectId" for key "bqDatasetProjectId"
+    Then Enter runtime argument value "serviceAccountType" for key "serviceAccountType"
+    Then Enter runtime argument value "serviceAccount" for key "serviceAccount"
+    Then Enter runtime argument value "dataset" for key "bqDataset"
+    Then Enter runtime argument value for BigQuery source table name key "bqSourceTable"
+    Then Enter runtime argument value "OutputSchema" for key "bqOutputSchema"
+    Then Enter runtime argument value "projectId" for key "gcsProjectId"
+    Then Enter runtime argument value for GCS sink property path key "gcsSinkPath"
+    Then Enter runtime argument value "gcsPathDateSuffix" for key "gcsPathSuffix"
+    Then Enter runtime argument value "csvFormat" for key "gcsFormat"
+    Then Enter runtime argument value "cmekGCS" for GCS cmek property key "cmekGCS" if GCS cmek is enabled
+    Then Run the Pipeline in Runtime with runtime arguments
+    Then Wait till pipeline is in running state
+    Then Open and capture logs
+    Then Verify the pipeline status is "Succeeded"
+    Then Verify data is transferred to target GCS bucket
+    Then Validate the cmek key "cmekGCS" of target GCS bucket if cmek is enabled
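These scenarios set plugin properties as macros (e.g. `${bqProjectId}`) and supply the actual values as runtime arguments before the run. Conceptually, macro evaluation is a placeholder substitution over the configured property values; the sketch below illustrates that mechanism only, and `MacroResolver` is an invented name, not the CDAP macro evaluator:

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Illustrative sketch: replace ${key} placeholders in a property value
// with runtime arguments; unknown keys are left untouched.
public class MacroResolver {
    private static final Pattern MACRO = Pattern.compile("\\$\\{([^}]+)\\}");

    public static String resolve(String value, Map<String, String> runtimeArgs) {
        Matcher m = MACRO.matcher(value);
        StringBuilder sb = new StringBuilder();
        while (m.find()) {
            // Fall back to the original ${key} text when no argument is given.
            String replacement = runtimeArgs.getOrDefault(m.group(1), m.group(0));
            m.appendReplacement(sb, Matcher.quoteReplacement(replacement));
        }
        m.appendTail(sb);
        return sb.toString();
    }
}
```

For example, with runtime arguments `bqProjectId=my-project` and `bqDataset=my_dataset`, a table reference configured as `${bqProjectId}:${bqDataset}` resolves to `my-project:my_dataset`.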
