Getting your data job done just got easier with composite semantic models that mix Direct Lake tables with import tables, now available in public preview. Direct Lake on OneLake storage mode could already combine tables from different Fabric data sources, such as lakehouses, warehouses, SQL databases in Fabric, and mirrored databases. With this update, that flexibility extends much further: you can now add import tables from any data source, using the hundreds of connectors in Power Query Online.
The import tables you add can even come from the same source as your Direct Lake tables. For example, a small dimension table switched from Direct Lake to import mode can have a calculated column, or a hierarchy usable in Analyze in Excel. Semantic Link Labs has a function to convert a table in place; refer to the 'Convert a table in Direct Lake mode to import mode' code example.

Relationships between Direct Lake and import tables in these composite models are regular relationships, so report performance stays as expected. This is an improvement over traditional DirectQuery and import composite models, which only support limited relationships.
Create Direct Lake and import composite models using the Power BI experience in the web. Power BI Desktop live editing is also available, but without the ability to choose different tables from OneLake or to use transform options on import tables.
This update also comes with more options to create a new semantic model with Direct Lake on OneLake tables.

- On the main Fabric portal page: From the ‘Create’ button in the left navigation, choose the OneLake catalog tile, select a Fabric item with delta tables, then select ‘Connect’.
- On the workspace page: From the ‘New item’ button choose ‘Semantic model’, then select the ‘OneLake catalog tile’ and a Fabric item with delta tables and ‘Connect’.
- On the Lakehouse page: In the ribbon, choose ‘New semantic model’.
These options are in addition to the existing path from Power BI Desktop.
- In Power BI Desktop: In the ribbon, select the ‘OneLake catalog’ then choose a Fabric item with delta tables and ‘Connect’.
These all lead to the semantic model with Direct Lake tables dialog.

There are several ways in web modeling to create a composite model with Direct Lake and import tables. Web modeling includes four new buttons when editing any semantic model.

Adding Direct Lake tables to import or import tables to Direct Lake tables is as easy as clicking a button.
- On an existing Power BI semantic model with tables in Direct Lake on OneLake storage mode: In the ribbon, select the ‘OneLake catalog’, then choose a Fabric item with delta tables and select ‘Connect’.
- On an existing Power BI semantic model with tables in import storage mode: In the ribbon, select ‘Get data’ or ‘Transform data’, then choose any source to add a table in import mode.
You can continue to use Edit tables to change out tables used from Direct Lake sources, and continue to use Power Query Online through ‘Transform data’ to edit and bring in new tables in import storage mode.
Refresh is also now available: it updates import table data, reframes Direct Lake tables, and performs a schema sync on all tables to get the latest column information from the respective data sources.
You can continue semantic modeling in the web, or you can switch to Power BI Desktop and continue to live edit the semantic model there. Even with import tables, you can now live edit a semantic model.

Semantic modeling can include these tasks and many more to get your data ready for reporting.
- Rename tables and columns, and add descriptions, for easier report creation.
- Add relationships between tables upfront.
- Add measures to aggregate data columns and perform data analysis such as % of total or % change.
- Add calculated columns and tables to extend the data already available, including a common date table.
- Organize the columns by hiding columns not needed for reporting, grouping in display folders, and adding hierarchies.
- Add row-level security roles, calculation groups, sort by on columns, and more.
In the web, or when live editing in Desktop, you can rely on version history snapshots to help if you need to undo a change.
And as with any semantic model, you can use Semantic Link Labs in Fabric notebooks to make edits, and the TMDL view in Power BI Desktop.
Let’s look at an example.
Goal: I want to get my deliveries data from a SQL database in Fabric using Direct Lake storage mode then add my deliveries target data from an Excel file using import storage mode.
This example will work if you have a Fabric capacity or Fabric trial capacity. I’ll create a SQL database for deliveries data and create a CSV with the deliveries target data.
I created a workspace, then from ‘New item’ created a ‘SQL database’. I used this query to create and populate the ‘Deliveries’ table.
-- Step 1: Create the Deliveries table
CREATE TABLE Deliveries (
    DeliveryID INT IDENTITY(1,1) PRIMARY KEY,
    ItemID INT,
    QuantityDelivered INT,
    DeliveryDate DATE,
    SupplierName VARCHAR(100),
    DeliveryStatus VARCHAR(20),
    DeliveryReference VARCHAR(50)
);

-- Step 2: Generate 1,000 mock delivery records
DECLARE @i INT = 0;
WHILE @i < 1000
BEGIN
    INSERT INTO Deliveries (
        ItemID, QuantityDelivered, DeliveryDate,
        SupplierName, DeliveryStatus, DeliveryReference
    )
    VALUES (
        ABS(CHECKSUM(NEWID())) % 1000 + 1, -- Assuming ItemID between 1 and 1000
        ABS(CHECKSUM(NEWID())) % 100 + 1,  -- Quantity between 1 and 100
        DATEADD(DAY, -ABS(CHECKSUM(NEWID())) % 180, GETDATE()), -- Past 6 months
        CHOOSE(ABS(CHECKSUM(NEWID())) % 5 + 1, 'Acme Inc.', 'SupplyCo', 'Warehouse World', 'BulkBuyers', 'Global Goods'),
        CHOOSE(ABS(CHECKSUM(NEWID())) % 3 + 1, 'Delivered', 'Pending', 'Delayed'),
        CONCAT('DEL-', FORMAT(@i + 1, '000000'))
    );
    SET @i = @i + 1;
END;
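If you prefer, the same shape of mock Deliveries data can be generated in Python instead of T-SQL, for example to load it another way. This is a minimal sketch mirroring the column logic above; the column names match the SQL, everything else is illustrative:

```python
import random
from datetime import date, timedelta

SUPPLIERS = ['Acme Inc.', 'SupplyCo', 'Warehouse World', 'BulkBuyers', 'Global Goods']
STATUSES = ['Delivered', 'Pending', 'Delayed']

def mock_deliveries(n=1000, today=None):
    """Generate n mock delivery rows, mirroring the T-SQL script above."""
    today = today or date.today()
    rows = []
    for i in range(1, n + 1):
        rows.append({
            'DeliveryID': i,                                  # IDENTITY(1,1)
            'ItemID': random.randint(1, 1000),                # ItemID between 1 and 1000
            'QuantityDelivered': random.randint(1, 100),      # Quantity between 1 and 100
            'DeliveryDate': today - timedelta(days=random.randint(0, 179)),  # past 6 months
            'SupplierName': random.choice(SUPPLIERS),
            'DeliveryStatus': random.choice(STATUSES),
            'DeliveryReference': f'DEL-{i:06d}',              # e.g. DEL-000001
        })
    return rows
```

From here you could write the rows out as CSV or insert them into the database with any client you like.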
And this SQL query creates a Calendar table so the delivery dates can also be grouped by month.
CREATE TABLE Calendar (
    DateID INT PRIMARY KEY,
    Date DATE NOT NULL,
    Month DATE NOT NULL
);

DECLARE @StartDate DATE = DATEADD(DAY, -180, GETDATE());
DECLARE @EndDate DATE = GETDATE();

WHILE @StartDate <= @EndDate
BEGIN
    INSERT INTO Calendar (DateID, Date, Month)
    VALUES (
        CAST(FORMAT(@StartDate, 'yyyyMMdd') AS INT), -- e.g., 20250101
        @StartDate,
        DATEFROMPARTS(YEAR(@StartDate), MONTH(@StartDate), 1)
    );
    SET @StartDate = DATEADD(DAY, 1, @StartDate);
END;
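If you want to double-check the DateID and Month derivations, the same calendar logic is easy to reproduce outside SQL. A small Python sketch mirroring the T-SQL above:

```python
from datetime import date, timedelta

def build_calendar(start, end):
    """One row per day from start to end (inclusive): DateID as a yyyyMMdd
    integer, the Date itself, and Month as the first day of that month."""
    rows = []
    d = start
    while d <= end:
        rows.append({
            'DateID': int(d.strftime('%Y%m%d')),  # e.g. 20250101
            'Date': d,
            'Month': d.replace(day=1),            # like DATEFROMPARTS(YEAR, MONTH, 1)
        })
        d += timedelta(days=1)
    return rows
```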
I also created a CSV file and added it to my OneDrive. Depending on when you follow this tutorial, you may need to update the months to be in the last 6 months.
| Month | Deliveries target |
| --- | --- |
| 1/1/2025 | 151 |
| 2/1/2025 | 163 |
| 3/1/2025 | 153 |
| 4/1/2025 | 184 |
| 5/1/2025 | 163 |
| 6/1/2025 | 199 |
| 7/1/2025 | 186 |
| 8/1/2025 | 171 |
| 9/1/2025 | 121 |
| 10/1/2025 | 103 |
| 11/1/2025 | 131 |
| 12/1/2025 | 187 |
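Since the target months need to overlap the last six months of Deliveries data, you could also generate the CSV programmatically instead of editing the dates by hand. A Python sketch; the file name and the 100-200 target range are illustrative assumptions, not part of the tutorial:

```python
import csv
import random
from datetime import date

def write_targets_csv(path, months=6, today=None, seed=0):
    """Write a 'Deliveries target' CSV whose Month values are the first day of
    each of the last `months` calendar months, so the file lines up with the
    Deliveries data no matter when you run the tutorial."""
    today = today or date.today()
    rng = random.Random(seed)       # seeded so reruns produce the same targets
    rows = []
    y, m = today.year, today.month
    for _ in range(months):
        rows.append((date(y, m, 1), rng.randint(100, 200)))
        m -= 1
        if m == 0:
            y, m = y - 1, 12
    rows.reverse()                  # oldest month first
    with open(path, 'w', newline='') as f:
        w = csv.writer(f)
        w.writerow(['Month', 'Deliveries target'])
        for d, target in rows:
            w.writerow([f'{d.month}/{d.day}/{d.year}', target])  # M/D/YYYY, as above
    return rows
```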
Now the data is ready!
I now go to the New item again in the workspace and choose semantic model.

From here I can select the OneLake catalog tile and choose the Deliveries SQL database I created. I named my semantic model ‘Deliveries analysis’ and chose both tables.

In a few moments my semantic model is created and I’m in the web modeling experience. From here I can add my import table by going to ‘Get data’.

I choose the Text/CSV connector and navigate to where I saved my CSV file with the targets in my OneDrive. Alternatively, you can choose a blank table to paste in the values, or upload the file directly.

Then you can do additional transformations, if needed. I had an extra blank row that I was able to remove.

Note: If you haven’t set up a credential for this data source before, you may be prompted to do so in the Power Query Online experience. After that, if loading the data fails when saving the transformations, go to the semantic model’s scheduled refresh page and set the credentials there as well before returning to web modeling and refreshing.
Now I have my import and Direct Lake tables in my model!

Tooltips on the Direct Lake tables give details about the source name, source schema, source type, and the name and workspace of the source item. When adding tables from different Fabric data sources, I can quickly see where all my tables are coming from.
Now I need to create the relationships and measures. I can do that here, but instead I’m going to edit this in Power BI Desktop and give you a TMDL script to run. Go to the ‘Editing’ drop-down in the top-right corner and pick ‘Edit in Desktop’.

After Power BI Desktop loads, I am now live editing this composite semantic model.

I can add in my relationships and measures by running this TMDL script:
createOrReplace

    relationship CalendarAndTarget
        toCardinality: many
        fromColumn: 'Deliveries target'.Month
        toColumn: Calendar.Month

    relationship CalendarAndDeliveries
        relyOnReferentialIntegrity
        fromColumn: Deliveries.DeliveryDate
        toColumn: Calendar.Date

    ref table Deliveries

        /// Calculates the average quantity delivered per delivery by dividing the total quantity delivered by the number of deliveries.
        measure 'Avg Qty per Delivery' = ```
                DIVIDE([Total Quantity Delivered], [Deliveries])
                ```
            formatString: #,0.00
            changedProperty = FormatString
            changedProperty = Description

        /// Counts the number of unique delivery records by calculating the distinct count of DeliveryID in the Deliveries table.
        measure Deliveries = ```
                DISTINCTCOUNT('Deliveries'[DeliveryID])
                ```
            formatString: #,0
            changedProperty = FormatString
            changedProperty = Description

        /// Calculates the percentage of deliveries achieved relative to the monthly target by dividing the number of deliveries by the monthly target.
        measure 'Target Attainment %' = ```
                DIVIDE([Deliveries], [Monthly Target])
                ```
            formatString: 0%;-0%;0%
            changedProperty = FormatString
            changedProperty = Description

        /// Calculates the difference between the number of deliveries and the monthly target to show how actual performance compares to the goal.
        measure 'Target Variance' = ```
                [Deliveries] - [Monthly Target]
                ```
            formatString: #,0
            changedProperty = FormatString
            changedProperty = Description

        /// Calculates the total quantity delivered by summing the 'QuantityDelivered' column in the Deliveries table.
        measure 'Total Quantity Delivered' = ```
                SUM('Deliveries'[QuantityDelivered])
                ```
            formatString: #,0
            changedProperty = FormatString
            changedProperty = Description

    ref table 'Deliveries target'

        /// Calculates the total deliveries target for the month by summing the 'Deliveries target' column.
        measure 'Monthly Target' = ```
                SUM('Deliveries target'[Deliveries target])
                ```
            formatString: #,0

Then right-click the Calendar table and choose ‘Mark as date table’. I also hid the base columns I used in the measures.
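If you want to sanity-check the measure logic outside the model, the same calculations can be reproduced in plain Python. This is an illustrative sketch with invented numbers, not the model's actual data:

```python
# Toy stand-ins for the model's tables; all values are invented for illustration.
deliveries = [
    {'DeliveryID': 1, 'QuantityDelivered': 10},
    {'DeliveryID': 2, 'QuantityDelivered': 20},
    {'DeliveryID': 3, 'QuantityDelivered': 30},
    {'DeliveryID': 4, 'QuantityDelivered': 40},
]
monthly_target = 5  # stand-in for SUM('Deliveries target'[Deliveries target]) in one month

n_deliveries = len({r['DeliveryID'] for r in deliveries})    # DISTINCTCOUNT -> [Deliveries]
total_qty = sum(r['QuantityDelivered'] for r in deliveries)  # SUM -> [Total Quantity Delivered]
avg_qty = total_qty / n_deliveries                           # DIVIDE -> [Avg Qty per Delivery]
attainment = n_deliveries / monthly_target                   # DIVIDE -> [Target Attainment %]
variance = n_deliveries - monthly_target                     # [Target Variance]
```

In the model these run per filter context (for example per Calendar month), while this sketch evaluates them once over the toy rows, but the arithmetic is the same.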
Now I want to create a report. The report view is not included when live editing, so I can open a new instance of Desktop to create a report, or go back to web modeling and select ‘File’, then ‘Create new report’. If you closed the browser window earlier, you can select the file name in Desktop to open a drop-down with a link to the semantic model in the web.

Now I can drag and drop my fields to create some visuals. I can use Copilot as well to quickly create a report.

I can now use the new organizational themes to apply a theme used in my organization by going to ‘View’ then ‘Theme’.

Performance Analyzer is also available when editing a report in the web; previously it was only available in Power BI Desktop.

There are other ways to create these composite models with Direct Lake and import tables. From an existing semantic model with Direct Lake on OneLake and/or import tables you can simply add in additional Direct Lake tables or import tables.
You can learn more about Direct Lake in the Direct Lake overview documentation, and try it out today!