In this section of the tutorial, you project your digital twin builder (preview) ontology into Eventhouse using Fabric notebooks. This step makes it possible to run KQL queries on your digital twin builder data for further analysis in Real-Time Intelligence.
Important
This feature is in preview.
During this tutorial section, you create shortcuts to bring your digital twin builder data from the lakehouse where it's stored into your Tutorial KQL database in Eventhouse. Then, you run a notebook code sample that generates a script to project organized views of your digital twin builder data in Eventhouse. The script creates user-defined functions in Eventhouse, one for each entity and property type in your digital twin builder ontology. Later, you can use these functions in KQL queries to access your organized digital twin builder ontology data from Eventhouse.
Prepare Eventhouse KQL database
Start by preparing your eventhouse and KQL database to access digital twin builder (preview) data.
Data from digital twin builder mappings is stored in a new lakehouse, with a name that looks like your digital twin builder item name followed by dtdm. For this tutorial, it's called BusModeldtdm. The lakehouse is located in the root folder of your workspace.
In this section, you add tables from your digital twin builder data lakehouse as external tables in the KQL database. Later, you run sample notebook code to set up an Eventhouse projection that operates on this data and organizes it.
Go to the Tutorial KQL database that you created earlier in Tutorial part 2: Get and process streaming data.
From the menu ribbon, select New > OneLake shortcut.
Under Internal sources, select Microsoft OneLake. Then, choose BusModeldtdm.
Expand the list of Tables and start selecting tables in order. You can only add 10 tables to a shortcut at once, so stop when you select 10 tables and see the warning message. Make a note of where you stopped.
Select Next and Create to create the shortcuts.
Repeat the shortcut creation steps twice more, continuing from where you left off, until all the tables are added as shortcuts.
When you're finished, you see all the external digital twin builder data tables under Shortcuts in the KQL database.
Prepare notebook and install dependencies
Next, prepare a notebook to run the sample Eventhouse projection code on the digital twin builder data in the KQL database. In this step, you import the sample notebook and link it to your digital twin builder data, then upload and install the required Python package.
Import the notebook
First, import the sample Fabric notebook. It contains code to generate the Eventhouse projection.
Download DTB_Generate_Eventhouse_Projection.ipynb from the sample folder in GitHub: digital-twin-builder.
Go to your workspace. From the menu ribbon, select Import > Notebook > From this computer.
Upload the notebook file.
The notebook is imported into your workspace. Select it from the workspace items to open it.
From the Explorer pane of the notebook, select Add data items > Existing data sources.
Select the BusModeldtdm lakehouse and select Connect.
In the Explorer pane, select ... next to the lakehouse name, and select Set as default lakehouse.
Optionally, remove the other lakehouse that was added by default to simplify the view.
Upload and install the Python package
Next, install the Python package that the notebook needs to work with digital twin builder data. In this section, you upload the package to your lakehouse and install it in the notebook.
Download dtb_samples-0.1-py3-none-any.whl from the sample folder in GitHub: digital-twin-builder.
In the Explorer pane of your open notebook, expand BusModeldtdm. Select ... next to Files, and select Upload > Upload files.
Upload the .whl file.
Close the Upload files pane and confirm that the new file appears in the Files pane for the lakehouse.
In the notebook, install the package by running the first code block.
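The install cell's exact contents come from the sample notebook, but a minimal sketch of this kind of install, assuming the .whl file sits in the default lakehouse's Files folder (mounted at /lakehouse/default/Files), looks something like this:

```python
# Minimal sketch, assuming the .whl was uploaded to the default
# lakehouse's Files folder:
%pip install /lakehouse/default/Files/dtb_samples-0.1-py3-none-any.whl
```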
After less than a minute, the package is installed and the notebook confirms the successful run status with a checkmark underneath the code.
Run Eventhouse projection code
Next, run the rest of the notebook code to generate the Eventhouse projection script. This script creates user-defined functions in Eventhouse that correspond to your digital twin builder entities and their properties.
In the second code block, there are variables for dtb_item_name and kql_db_name. Fill their values with BusModel and Tutorial, respectively (case-sensitive), and run the code block. The notebook confirms the successful run status with a checkmark underneath the code.
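The variable assignments in the sample notebook amount to something like this minimal sketch (names taken from this tutorial; both values are case-sensitive):

```python
# Fabric item names used by the projection code; case-sensitive.
dtb_item_name = "BusModel"  # your digital twin builder item
kql_db_name = "Tutorial"    # the KQL database that holds the shortcuts
```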
Scroll down to the next code block and run it. This code block completes the following operations:
- Connects to your workspace and your digital twin builder ontology
- Sets up a Spark reader to pull data from the digital twin builder database (see the sketch after this list)
- Generates a script that pushes your digital twin builder data into Eventhouse
- Automatically creates several functions based on your digital twin builder's configuration, to make this data readily accessible in Eventhouse for use in KQL queries
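The notebook performs these steps for you. As an illustration only (not the sample's actual code), pulling the digital twin builder delta tables with a Spark reader looks roughly like this, assuming BusModeldtdm is set as the default lakehouse and spark is the session that Fabric notebooks provide:

```python
# Illustrative sketch only: list the delta tables that the default
# lakehouse (BusModeldtdm) exposes, and read one into a DataFrame.
tables = spark.catalog.listTables()
print([t.name for t in tables])

# The generated projection script walks over tables like these to
# create one Eventhouse function per entity and property type.
if tables:
    df = spark.read.table(tables[0].name)
    df.printSchema()
```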
The notebook confirms the successful run status with a checkmark underneath the code, and a list of functions that were added (one for each entity and property type).
Tip
If you see a ModuleNotFoundError, rerun the first code block with the package installation. Then, rerun this code block.
Run the last code block. This Python snippet sends your script to the Fabric REST API and executes it against your KQL database. The notebook confirms the successful run status with a checkmark underneath the code, along with confirmation that it successfully created the Eventhouse ___domain projection functions.
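The sample handles authentication and submission for you. For orientation only, a hedged sketch of running a KQL management command over the Kusto REST management endpoint might look like the following; query_uri (your Eventhouse query URI) and token (a valid Microsoft Entra access token) are assumptions, not values the tutorial provides:

```python
import requests

def run_kql_command(query_uri: str, token: str, database: str, command: str):
    """Send one KQL management command to the Kusto management endpoint."""
    response = requests.post(
        f"{query_uri}/v1/rest/mgmt",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        json={"db": database, "csl": command},
    )
    response.raise_for_status()
    return response.json()

# Hypothetical usage: check the functions in the Tutorial database.
# run_kql_command(query_uri, token, "Tutorial", ".show functions")
```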
Now the projection functions are created in Eventhouse, one for each digital twin builder entity and its property types.
Verify projection functions
Verify that the functions were created successfully in your KQL database.
Go to your Tutorial KQL database and refresh the view.
Expand Functions in the Explorer pane to see a list of functions created by the notebook (the extractBusData function is also there from when you created it in Tutorial part 2: Get and process streaming data).
Select Tutorial_queryset from the Explorer pane to open the query window. Use the + above the query pane to create a new query, and enter .show functions. This command displays a list of functions in the database, which you can expand to see their details.
Run the functions to see the data projections they produce. Notice that the properties correspond to the fields that you mapped to the digital twin builder ontology in Tutorial part 3: Build the ontology.
Optionally, save the query tab as Explore functions so you can identify it later.
Now you can write other KQL queries using these user-defined functions to access data from your digital twin builder (preview) ontology. In the next tutorial section, you use these functions to write KQL queries that extract insights from your data, and display the results in a Real-Time Dashboard.