S03 DATA FACTORY: INGESTION, DATAFLOW GEN2, AND ORCHESTRATION




Assessment • Quiz • Science • Professional Development • Medium

Created by Eladio Yovera


5 questions


1. MULTIPLE CHOICE QUESTION (30 sec • 1 pt)

What is a data pipeline?

A special folder in OneLake storage where data can be exported from a lakehouse
A sequence of activities to orchestrate a data ingestion or transformation process
A saved Power Query
None of the above

2. MULTIPLE CHOICE QUESTION (20 sec • 1 pt)

You want to use a pipeline to copy data to a folder with a specified name for each run. What should you do?

Create multiple pipelines - one for each folder name
Use a Dataflow (Gen2)
Add a parameter to the pipeline and use it to specify the folder name for each run
None of the above
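For reference, here is a minimal sketch of the parameterized approach this question describes, assuming the Fabric Job Scheduler REST endpoint for running a pipeline on demand and a pipeline that defines a folderName parameter. The endpoint shape, payload layout, IDs, and parameter name are illustrative assumptions, not part of the quiz; check the current Microsoft Fabric REST API reference before relying on them.

```python
import requests

# Illustrative placeholders; replace with real workspace/item IDs and a valid
# Microsoft Entra access token (e.g. acquired with azure-identity or MSAL).
WORKSPACE_ID = "<workspace-id>"
PIPELINE_ID = "<pipeline-item-id>"
TOKEN = "<access-token>"

# Run-on-demand job endpoint of the Fabric Job Scheduler API (assumed shape).
url = (
    f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}"
    f"/items/{PIPELINE_ID}/jobs/instances?jobType=Pipeline"
)

# The folder name for this run is passed as a pipeline parameter, so a single
# pipeline can write to a differently named folder on every execution.
payload = {"executionData": {"parameters": {"folderName": "sales/2024-06-01"}}}

response = requests.post(
    url,
    json=payload,
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
response.raise_for_status()  # the service typically answers 202 Accepted
```

Inside the pipeline, a Copy activity can then reference the value with the expression @pipeline().parameters.folderName when building the destination folder path, so one pipeline serves every run.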

3. MULTIPLE CHOICE QUESTION (30 sec • 1 pt)

You have previously run a pipeline containing multiple activities. What's the best way to check how long each individual activity took to complete?

Rerun the pipeline and observe the output, timing each activity
View the run details in the run history
View the Refreshed value for your lakehouse's default dataset
None of the above

4. MULTIPLE CHOICE QUESTION (30 sec • 1 pt)

In what scenario is Dataflows Gen2 preferred over traditional data pipelines for data ingestion and transformation?

When row-level security is a critical requirement for the data processing.
When you need to integrate data from multiple sources with complex transformations using a low-code interface.
When the goal is to replace an existing data warehouse entirely.
None of the above.

5. MULTIPLE CHOICE QUESTION (30 sec • 1 pt)

To implement an ELT process using Dataflows Gen2 and pipelines, what is the correct sequence of actions?

Extract and load data with a pipeline, then transform data with a Dataflow Gen2
Load data directly into Power BI, then transform using Dataflows Gen2
Transform data with a Dataflow Gen2, then load it using a pipeline
None of the above
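As a purely local illustration of the ELT ordering asked about above: raw data is extracted and loaded first (in Fabric, typically by a pipeline Copy activity into a lakehouse), and only then transformed (in Fabric, typically by a Dataflow Gen2). The sketch below mimics that order with pandas on local files; the folder layout and column names (order_id, order_date, amount) are made-up assumptions for the example, not Fabric APIs.

```python
import pathlib
import pandas as pd

# Hypothetical local folders standing in for a lakehouse landing zone and a
# curated output area (in Fabric these two steps would be a pipeline and a
# Dataflow Gen2, respectively).
RAW_DIR = pathlib.Path("landing/orders")          # E + L: raw files copied as-is
CURATED = pathlib.Path("curated/orders.parquet")  # T: transformed output

# Step 1 (extract and load): the raw CSVs already sit in the landing folder,
# untouched. Step 2 (transform): clean them only after they have been loaded.
frames = [pd.read_csv(path) for path in RAW_DIR.glob("*.csv")]
orders = pd.concat(frames, ignore_index=True)

# Example transformations: type casting, filtering, de-duplication.
orders["order_date"] = pd.to_datetime(orders["order_date"])
orders = orders[orders["amount"] > 0].drop_duplicates(subset="order_id")

CURATED.parent.mkdir(parents=True, exist_ok=True)
orders.to_parquet(CURATED, index=False)
```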