Problem Statement
It is possible to copy / extract data from a Power BI dataset via Azure offerings, as described in the blog Overcoming Data Size / Row Limitations of Power BI REST API for Automated Data Extraction from Dataset, by leveraging the Power BI REST API: Datasets – Execute Queries.
But unlike a Power BI dataset, a Power BI dataflow doesn't have a native API for data extraction. So in order to extract data from the dataflow, we need a different approach.
Is it possible to copy / extract data from a Power BI Dataflow, with MFA enabled on the AD accounts, in an automated way?
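For reference, the dataset-side extraction mentioned above boils down to a single POST against the Execute Queries endpoint. The snippet below is only a minimal Power Query sketch of that call, assuming you already hold a valid Azure AD access token; the dataset ID, token, and DAX query are placeholders rather than values from this walkthrough, and it is not required for the dataflow solution that follows.

```
let
    // Placeholders – replace with your own dataset ID, access token, and DAX query
    datasetId = "<dataset-guid>",
    accessToken = "<azure-ad-access-token>",
    daxQuery = "EVALUATE TOPN(100, 'DateDim')",

    // POST against the Datasets – Execute Queries endpoint
    response = Web.Contents(
        "https://api.powerbi.com/v1.0/myorg/datasets/" & datasetId & "/executeQueries",
        [
            Headers = [
                #"Content-Type" = "application/json",
                Authorization = "Bearer " & accessToken
            ],
            Content = Json.FromValue([queries = {[query = daxQuery]}])
        ]
    ),

    // The response carries the rows of the first result table as a list of records
    parsed = Json.Document(response),
    rows = parsed[results]{0}[tables]{0}[rows]
in
    Table.FromRecords(rows)
```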
Prerequisites
- MSFT Fabric Dataflow Gen 2
- Power BI Dataflow
Solution
1. Log in to https://app.fabric.microsoft.com/home?experience=data-factory and select Dataflow Gen 2.
![Data factory]()
And rename the Dataflow.
![Rename Dataflow]()
2. Click 'Get Data' >> 'More' and select Dataflows.
![Data source]()
3. Create a new connection (in case there isn't an existing one) via your organization account and click 'Next'.
![Connect to data source]()
4. The list of Dataflows to which the organization account has access will be populated under the Workspaces section, as seen below.
![Choose data]()
Filter for your Dataflow name and select the table whose data you need to extract.
![Choose data from DateDim]()
5. Apply the necessary transformations in the Power Query editor.
In this scenario, we would add a new column called AuditDate for auditing purposes.
![Custom Column]()
The final state of the source transformations:
![Audit date]()
The Advanced Editor code:
![Advanced Editor]()
let
    Source = PowerPlatform.Dataflows([]),
    #"Navigation 1" = Source{[Id = "Workspaces"]}[Data],
    #"Navigation 2" = #"Navigation 1"{[workspaceId = "0bca1820-43e9-49dc-a3e1-82e505dbdae7"]}[Data],
    #"Navigation 3" = #"Navigation 2"{[dataflowId = "bfd00ae0-cf60-4771-a041-289142951ea5"]}[Data],
    #"Navigation 4" = #"Navigation 3"{[entity = "DateDim", version = ""]}[Data],
    #"Added custom" = Table.TransformColumnTypes(
        Table.AddColumn(#"Navigation 4", "AuditDate", each DateTime.LocalNow()),
        {{"AuditDate", type date}}
    )
in
    #"Added custom"
6. Now select the sink / destination settings.
![Data Destination]()
As of today, only 4 sinks are supported:
![Azure data explorer]()
So, for our use case, we would select Azure SQL Database.
7. Similar to step 3, create a connection to the Azure SQL Database.
![Connect Azure SQL Database]()
8. You can either create a new table at run time or map the data to an existing one.
![New table]()
In our use case, we would create a new table named DateDim in the destination.
9. One can either append the data or replace it, based on the settings.
![Append]()
We would proceed with the Append data scenario.
Then click on “Save settings”.
10. Finally, click on “Publish”.
![Publish]()
11. The Dataflow begins its first execution once it's published.
Database output
![Database output]()
12. To execute it again, click on the 'Refresh now' component of the dataflow.
![Dataflow]()
As we have enabled Append data at the sink, we would have 2 distinct AuditDates.
![Distinct AuditDate]()
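To double-check the append behaviour from the destination side, you can group the landed rows by AuditDate. The sketch below assumes the table was created as dbo.DateDim; the server and database names are placeholders to be replaced with your own:

```
let
    // Connect to the destination Azure SQL database (placeholder names)
    Source = Sql.Database("<your-server>.database.windows.net", "<your-database>"),
    DateDim = Source{[Schema = "dbo", Item = "DateDim"]}[Data],

    // One row per AuditDate with the number of rows appended in that run
    CountsPerRun = Table.Group(
        DateDim,
        {"AuditDate"},
        {{"RowCount", each Table.RowCount(_), Int64.Type}}
    )
in
    CountsPerRun
```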
13. In order to schedule the data extraction, one can either set the dataflow refresh frequency via the Dataflow settings:
![Setting]()
Or schedule it via Fabric Data pipelines.