
Deep dive into composite semantic models with Direct Lake and import tables


Getting your data job done just got easier with composite semantic models, which mix Direct Lake tables with import tables, now available in public preview. Direct Lake on OneLake storage mode could already combine tables from different Fabric data sources, such as lakehouses, warehouses, SQL databases in Fabric, and mirrored databases. With this update, that flexibility extends much further: you can now add import tables from any data source, using the hundreds of connectors in Power Query Online.

The import tables you add can also come from the same source as your Direct Lake tables. For example, a small dimension table switched from Direct Lake to import can have a calculated column, or a hierarchy usable in Analyze in Excel. Semantic Link Labs has a function to convert a table in place; refer to the ‘Convert a table in Direct Lake mode to import mode’ code example.

A screenshot of Power BI web modeling showing tables in Direct Lake storage mode and import storage mode in the same composite model with options to add more tables.

Regular relationships in Direct Lake and import composite models keep report performance as expected. This is an improvement over traditional DirectQuery and import composite models, which support only limited relationships.

Create Direct Lake and import composite models using the Power BI experiences in the web. Live editing in Power BI Desktop is also available, but without the ability to choose different tables from OneLake or to use transform options on import tables.

This update also comes with more options to create a new semantic model with Direct Lake on OneLake tables.

A screenshot of the create page in the Power BI service showing the OneLake catalog tile used to create semantic models from scratch with Direct Lake tables.

  1. On the main Fabric portal page: From the ‘Create’ button in the left navigation, choose the OneLake catalog tile, select a Fabric item with delta tables, then select ‘Connect’.
  2. On the workspace page: From the ‘New item’ button, choose ‘Semantic model’, then select the OneLake catalog tile, choose a Fabric item with delta tables, and select ‘Connect’.
  3. On the Lakehouse page: In the ribbon, choose ‘New semantic model’.

These options are in addition to the existing path from Power BI Desktop:

  1. In Power BI Desktop: In the ribbon, select the ‘OneLake catalog’ then choose a Fabric item with delta tables and ‘Connect’.

These all lead to the semantic model with Direct Lake tables dialog.

A screenshot of the semantic model creation dialog when creating with Direct Lake tables.

There are several ways in web modeling to create a composite model with Direct Lake and import tables. Web modeling includes four new buttons when editing any semantic model.

A screenshot of Power BI web modeling with the Get data, OneLake catalog, Transform data, and Refresh buttons added to the ribbon.

Adding Direct Lake tables to an import model, or import tables to a Direct Lake model, is as easy as clicking a button.

  1. On an existing Power BI semantic model with tables in Direct Lake on OneLake storage mode: In the ribbon, select the ‘OneLake catalog’, then choose a Fabric item with delta tables and select ‘Connect’.
  2. On an existing Power BI semantic model with tables in import storage mode: In the ribbon, select ‘Get data’ or ‘Transform data’, then choose any source and add a table in import mode.

You can continue to use ‘Edit tables’ to change out tables from Direct Lake sources, and to use Power Query Online through ‘Transform data’ to edit and bring in new tables in import storage mode.

‘Refresh’ is also now available: it updates data in import tables, reframes Direct Lake tables, and performs a schema sync on all tables to get the latest column information from the respective data sources.

You can continue semantic modeling in the web, or you can switch to Power BI Desktop and continue to live edit the semantic model there. Even with import tables, you can now live edit a semantic model.

A screenshot of Power BI web modeling showing how to Edit in Desktop from the Editing drop down.

Semantic modeling can include these tasks and many more to get your data ready for reporting.

  1. Rename tables and columns, and add descriptions, for easier report creation.
  2. Add relationships between tables upfront.
  3. Add measures to aggregate data columns and perform data analysis such as % of total or % change.
  4. Add calculated columns and tables to extend the data already available, including a common date table.
  5. Organize the columns by hiding columns not needed for reporting, grouping in display folders, and adding hierarchies.
  6. Add row-level security roles, calculation groups, Sort by column settings, and more.

Whether in the web or live editing in Desktop, you can rely on version history snapshots to help if you need to undo a change.

And, as with any semantic model, you can use Semantic Link Labs in Fabric notebooks to make edits, and TMDL view in Power BI Desktop.

Let’s look at an example.

Goal: I want to get my deliveries data from a SQL database in Fabric using Direct Lake storage mode then add my deliveries target data from an Excel file using import storage mode.

This example works if you have a Fabric capacity or Fabric trial capacity. I’ll create a SQL database for the deliveries data and a CSV with the deliveries target data.

I created a workspace, then from ‘New item’ created a ‘SQL database’. I used this query to create and populate the ‘Deliveries’ table.

-- Step 1: Create the Deliveries table
CREATE TABLE Deliveries (
    DeliveryID INT IDENTITY(1,1) PRIMARY KEY,
    ItemID INT,
    QuantityDelivered INT,
    DeliveryDate DATE,
    SupplierName VARCHAR(100),
    DeliveryStatus VARCHAR(20),
    DeliveryReference VARCHAR(50)
);

-- Step 2: Generate 1,000 mock delivery records
DECLARE @i INT = 0;
WHILE @i < 1000
BEGIN
    INSERT INTO Deliveries (
        ItemID,
        QuantityDelivered,
        DeliveryDate,
        SupplierName,
        DeliveryStatus,
        DeliveryReference
    )
    VALUES (
        ABS(CHECKSUM(NEWID())) % 1000 + 1,  -- Assuming ItemID between 1 and 1000
        ABS(CHECKSUM(NEWID())) % 100 + 1,   -- Quantity between 1 and 100
        DATEADD(DAY, -ABS(CHECKSUM(NEWID())) % 180, GETDATE()), -- Past 6 months
        CHOOSE(ABS(CHECKSUM(NEWID())) % 5 + 1, 'Acme Inc.', 'SupplyCo', 'Warehouse World', 'BulkBuyers', 'Global Goods'),
        CHOOSE(ABS(CHECKSUM(NEWID())) % 3 + 1, 'Delivered', 'Pending', 'Delayed'),
        CONCAT('DEL-', FORMAT(@i + 1, '000000'))
    );

    SET @i = @i + 1;
END;
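If you want to sanity-check the shape of this mock data outside SQL (for example, in a Fabric notebook), the same generation logic can be sketched in a few lines of Python. This is just an illustration of the loop above using the standard library; the dictionary keys mirror the table’s column names:

```python
import random
from datetime import date, timedelta

SUPPLIERS = ["Acme Inc.", "SupplyCo", "Warehouse World", "BulkBuyers", "Global Goods"]
STATUSES = ["Delivered", "Pending", "Delayed"]

def mock_deliveries(n=1000, today=None):
    """Mirror the T-SQL loop: random item, quantity, and a date in the past ~6 months."""
    today = today or date.today()
    rows = []
    for i in range(n):
        rows.append({
            "DeliveryID": i + 1,
            "ItemID": random.randint(1, 1000),            # ItemID between 1 and 1000
            "QuantityDelivered": random.randint(1, 100),  # quantity between 1 and 100
            "DeliveryDate": today - timedelta(days=random.randint(0, 179)),
            "SupplierName": random.choice(SUPPLIERS),
            "DeliveryStatus": random.choice(STATUSES),
            "DeliveryReference": f"DEL-{i + 1:06d}",      # e.g. DEL-000001
        })
    return rows

rows = mock_deliveries()
```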

And I used this SQL query to create a Calendar table that also groups the delivery dates into months.

CREATE TABLE Calendar (
    DateID INT PRIMARY KEY,
    Date DATE NOT NULL,
    Month DATE NOT NULL
);

DECLARE @StartDate DATE = GETDATE()-180;
DECLARE @EndDate DATE = GETDATE();

WHILE @StartDate <= @EndDate
BEGIN
    INSERT INTO Calendar (DateID, Date, Month)
    VALUES (
        CAST(FORMAT(@StartDate, 'yyyyMMdd') AS INT),  -- e.g., 20250101
        @StartDate,
        DATEFROMPARTS(YEAR(@StartDate), MONTH(@StartDate), 1)
    );

    SET @StartDate = DATEADD(DAY, 1, @StartDate);
END;
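Each Calendar row follows a simple pattern: an integer yyyyMMdd key, the date itself, and the first day of that date’s month. A quick Python sketch of the same logic, for illustration:

```python
from datetime import date, timedelta

def calendar_rows(start, end):
    """One row per day: (DateID as a yyyyMMdd int, Date, first day of that month)."""
    rows = []
    d = start
    while d <= end:
        rows.append((int(d.strftime("%Y%m%d")), d, d.replace(day=1)))
        d += timedelta(days=1)
    return rows

rows = calendar_rows(date(2025, 1, 30), date(2025, 2, 2))
# four rows; DateIDs 20250130..20250202, each paired with its month start
```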

I also created a CSV file and added it to my OneDrive. Depending on when you follow this tutorial, you may need to update the months so they fall within the last 6 months.

Month        Deliveries target
1/1/2025     151
2/1/2025     163
3/1/2025     153
4/1/2025     184
5/1/2025     163
6/1/2025     199
7/1/2025     186
8/1/2025     171
9/1/2025     121
10/1/2025    103
11/1/2025    131
12/1/2025    187
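Since the targets need to overlap the generated delivery dates, a small Python sketch can rewrite the CSV so the months always cover roughly the last six months. This is a convenience helper, not part of the tutorial itself; the column names match the table above, and the six target values are just sample numbers:

```python
import csv
from datetime import date

def recent_month_starts(n=6, today=None):
    """First-of-month dates for the current month and the n-1 before it, oldest first."""
    today = today or date.today()
    y, m = today.year, today.month
    months = []
    for _ in range(n):
        months.append(date(y, m, 1))
        m -= 1
        if m == 0:
            y, m = y - 1, 12
    return list(reversed(months))

targets = [151, 163, 153, 184, 163, 199]  # sample targets, one per month

# Write the CSV in the same Month / Deliveries target layout as above.
with open("deliveries_target.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Month", "Deliveries target"])
    for month, target in zip(recent_month_starts(), targets):
        writer.writerow([month.strftime("%m/%d/%Y"), target])
```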

Now the data is ready!

I now go to ‘New item’ again in the workspace and choose ‘Semantic model’.

A screenshot of using new item in the workspace to create a new semantic model with Direct Lake or import storage mode tables.

From here I can select the OneLake catalog tile and choose the ‘Deliveries’ SQL database I created. I named my semantic model ‘Deliveries analysis’ and chose both tables.

A screenshot of the semantic model creation dialog.

In a few moments my semantic model is created and I’m in the web modeling experience. From here I can add my import table by going to ‘Get data’.

A screenshot of Power BI web modeling showing how to add an import storage mode table with the Get data button.

I choose the Text/CSV connector and navigate to where I saved my CSV file with the targets in OneDrive. Alternatively, you can choose a blank table to paste in the values, or upload the file directly.

A screenshot of Power Query online in Power BI web modeling to add the import table from different connector options.

Then you can do additional transformations, if needed. I had an extra blank row that I was able to remove.

A screenshot of the transform data experience in Power Query online in the Power BI web modeling experience.

Note: If you haven’t set up a credential for this data source before, you may be prompted to do so in the Power Query Online experience. After that, if the data load fails when saving the transformations, go to the semantic model’s scheduled refresh page and set the credentials there as well, then return to web modeling and refresh.

Now I have my import and Direct Lake tables in my model!

A screenshot of Power BI web modeling with tables in Direct Lake and import table storage modes.

Tooltips here on the Direct Lake tables will give details about the source name, source schema, source type, and the name and workspace it’s from. When adding tables from different Fabric data sources, I can quickly see where all my tables are coming from.

Now I need to create the relationships and measures. I could do that here, but instead I’m going to edit this in Power BI Desktop and give you a TMDL script to run. Go to the ‘Editing’ drop-down in the top-right corner and pick ‘Edit in Desktop’.

A screenshot of Power BI web modeling to show where to continue editing in Desktop from the Editing drop down.

After Power BI Desktop loads, I am now live editing this composite semantic model.

A screenshot of live editing a composite semantic model in Power BI Desktop with tables in both Direct Lake and import storage modes.

I can add in my relationships and measures by running this TMDL script:

createOrReplace

	relationship CalendarAndTarget
		toCardinality: many
		fromColumn: 'Deliveries target'.Month
		toColumn: Calendar.Month

	relationship CalendarAndDeliveries
		relyOnReferentialIntegrity
		fromColumn: Deliveries.DeliveryDate
		toColumn: Calendar.Date

	ref table Deliveries

		/// Calculates the average quantity delivered per delivery by dividing the total quantity delivered by the number of deliveries.
		measure 'Avg Qty per Delivery' = ```
				
				    DIVIDE([Total Quantity Delivered], [Deliveries])
				```
			formatString: #,0.00

			changedProperty = FormatString

			changedProperty = Description

		/// Counts the number of unique delivery records by calculating the distinct count of DeliveryID in the Deliveries table.
		measure Deliveries = ```
				
				    DISTINCTCOUNT('Deliveries'[DeliveryID])
				```
			formatString: #,0

			changedProperty = FormatString

			changedProperty = Description

		/// Calculates the percentage of deliveries achieved relative to the monthly target by dividing the number of deliveries by the monthly target.
		measure 'Target Attainment %' = ```
				
				    DIVIDE([Deliveries], [Monthly Target])
				```
			formatString: 0%;-0%;0%

			changedProperty = FormatString

			changedProperty = Description

		/// Calculates the difference between the number of deliveries and the monthly target to show how actual performance compares to the goal.
		measure 'Target Variance' = ```
				
				    [Deliveries] - [Monthly Target]
				```
			formatString: #,0

			changedProperty = FormatString

			changedProperty = Description

		/// Calculates the total quantity delivered by summing the 'QuantityDelivered' column in the Deliveries table.
		measure 'Total Quantity Delivered' = ```
				
				    SUM('Deliveries'[QuantityDelivered])
				```
			formatString: #,0

			changedProperty = FormatString

			changedProperty = Description

	ref table 'Deliveries target'

		/// Calculates the total deliveries target for the month by summing the 'Deliveries target' column.
		measure 'Monthly Target' = ```
				
				    SUM('Deliveries target'[Deliveries target])
				```
			formatString: #,0

A screenshot of TMDL view when live editing a composite semantic model in Power BI Desktop to add relationships and measures.
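The measure logic itself is plain arithmetic over the two tables. As a quick sanity check (in Python rather than DAX, with made-up numbers; DAX’s DIVIDE returns blank on a zero or missing denominator, modeled here as None):

```python
def divide(numerator, denominator):
    """Mimic DAX DIVIDE: return None (blank) instead of erroring on a zero denominator."""
    if not denominator:
        return None
    return numerator / denominator

deliveries, monthly_target = 180, 163  # sample values for one month

attainment = divide(deliveries, monthly_target)  # Target Attainment %
variance = deliveries - monthly_target           # Target Variance

print(f"{attainment:.0%}, {variance:+d}")  # → 110%, +17
```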

Then I right-click the Calendar table and choose ‘Mark as date table’. I also hid the base columns used in the measures.

Now I want to create a report. The report view is not included when live editing, so I can open a new instance of Desktop to create a report, or I can go back to web modeling and select ‘File’ then ‘Create new report’. If you closed the browser window from earlier, you can select the file name in Desktop to open a drop-down with a link to the semantic model in the web.

A screenshot of Power BI web modeling showing how to create a new report from the file menu.

Now I can drag and drop my fields to create some visuals. I can use Copilot as well to quickly create a report.

A screenshot of the Power BI report editing experience in the web.

I can now use the new organizational themes to apply a theme used in my organization by going to ‘View’ then ‘Theme’.

A screenshot of the Power BI report editing experience in the web to apply a theme.

And performance analyzer, previously only available in Power BI Desktop, is now available to me when editing a report in the web.

A screenshot of the Power BI report editing experience in the web to use performance analyzer.

There are other ways to create these composite models with Direct Lake and import tables: from an existing semantic model with Direct Lake on OneLake and/or import tables, you can simply add additional Direct Lake tables or import tables.

You can learn more about Direct Lake in the Direct Lake overview documentation, and try it out today!