I think that Microsoft Dataverse for Teams (formerly Project Oakdale) is the most important Power Platform announcement of the year, especially if you are coming from a canvas apps background like myself. Teams is becoming more and more of a platform for the business rather than just a replacement for Skype. Now, with Microsoft Dataverse, users have a real data capability and a route to future upgrades by transferring their application on top of Common Data Service if needed. Teams and Microsoft Dataverse for Teams apps offer the simplicity needed to build those small, or even large, everyday business apps that truly matter to the users.
There are multiple posts and how-to guides for learning Microsoft Dataverse for Teams and canvas apps development, so there is no need to go deeper into those here. One thing that caught my attention during Ignite was the announcement that you can use Azure API Management with a Dataverse for Teams solution through the existing Teams licensing!
This means that your professional developers can create API services to process data and connect to almost any enterprise service. Then the citizen or IT pro developers can leverage those functionalities in their applications. Technically, these functions are published as custom connectors to the Power Platform environment related to the Dataverse in Teams.
Earlier, this meant that you needed an extra license because a custom connector is a premium-level connector, but with Dataverse for Teams environments, that is no longer needed. Let's see how to use this in real action. Again, we can use something easy enough even for non-developers and create an Azure Function with PnP PowerShell (I wrote about this earlier).
Let's keep things simple and create an application that asks a user for some data and then creates a new News page in SharePoint. During the creation process, we will also fetch some additional information from an “enterprise” service with a REST call. The idea is to ask the user for the title and the body, then fetch some data from the Bacon Ipsum service and add it to the news page.
The source code of the function can be found in my GitHub repository: PowerShellCore / CreateBaconPage. It is a lot easier to read the code from there, but I will cover the most important parts here.
When I am creating PowerShell scripts, I have a habit of using the following type of structure. I think it helps to keep the code readable and maintainable.
$newsTitle = $Request.Query.newsTitle
$newsBody = $Request.Query.newsBody
$meatParas = $Request.Query.meatParas
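#Assumption: the site URL and credentials are read from the Function App settings (the setting names here are placeholders)
$siteURL = $env:DemoSiteURL
$securePassword = ConvertTo-SecureString $env:DemoUserPassword -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential($env:DemoUserName, $securePassword)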
$spConn = Connect-PnPOnline -Url $siteURL -Credentials $credential -ReturnConnection
#Check the parameters necessary for the application
Write-Host " "
Write-Host "*Check the parameters necessary for the application"
If($spConn -and $newsTitle -and $newsBody -and $meatParas){
#Parameters are available
$haveMainParameters = $true
}
else {
#Missing some parameters
$haveMainParameters = $false
}
Write-Host " "
Write-Host "*Create the new page"
#Create basic section
Write-Host "..create basic section"
$newsPage = Add-PnPClientSidePage -Name $newsTitle -PromoteAs NewsArticle -Connection $spConn
Add-PnPClientSideText -Page $newsPage -Text $newsBody -Connection $spConn
#Connect to enterprise service
$baconText = GiveMeBacon -meatParas $meatParas #Save for later -meatType $meatType
Now, let's look at an example where we fetch some data from an enterprise service outside the Office 365 scope. As a test, let's get some random text from the Bacon Ipsum service. We will do this by adding a custom module with the necessary logic to our Azure Function.
function GiveMeBacon{
[CmdletBinding()]
param
(
[Parameter(Mandatory = $true, HelpMessage="Amount of meat")]
[string] $meatParas
)
#**Give me bacon from - baconipsum.com
Write-Host "#*#Give me bacon from - baconipsum.com"
$response = ""
try {
$queryURL = ("https://baconipsum.com/api/?type=all-meat¶s={0}&start-with-lorem=1&format=text" -f $meatParas)
$response = Invoke-RestMethod -Uri $queryURL -ContentType "application/json; charset=utf-8" -Method Post -UseBasicParsing
}
catch {
$ErrorMessage = $_.Exception.Message
Write-Host "**ERROR: #*#Give me bacon"
Write-Error $ErrorMessage
}
return $response
}
Export-ModuleMember -Function GiveMeBacon
#Get custom modules
$SP_ModulePath = $PSScriptRoot + "\CustomModules"
Import-Module "$SP_ModulePath\EnterpiseAPI.psm1" -Force
#Connect to enterprise service
$baconText = GiveMeBacon -meatParas $meatParas #Save for later -meatType $meatType
#Add related section
Write-Host "..add related section"
Add-PnPClientSidePageSection -Page $newsPage -SectionTemplate OneColumn -ZoneEmphasis 2 -Connection $spConn
Add-PnPClientSideText -Page $newsPage -Column 1 -Section 2 -Text "<h3>Related Info</h3>" -Connection $spConn
Add-PnPClientSideText -Page $newsPage -Column 1 -Section 2 -Text $baconText -Connection $spConn
#Publish the page
Write-Host "..publish the page"
Set-PnPClientSidePage -Identity $newsPage -Publish -Connection $spConn
Again, you can test the function to make sure the correct type of page gets created in SharePoint. When everything is working correctly, you can publish the function to Azure.
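For a quick smoke test before publishing, you can call the function locally from another PowerShell window. This is just a sketch: the port is the local Functions host default, and the parameter values are placeholders.
$testUrl = "http://localhost:7071/api/CreateBaconPage?newsTitle=Test%20News&newsBody=Hello%20world&meatParas=2"
Invoke-RestMethod -Uri $testUrl -Method Get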
Now, let's go to Teams and open the Power Apps app so that we can create a Dataverse for Teams application. The application is simple, with some data fields and a button. You can see the structure in the image below.
If(
EnableSendBtn And Not(IsBlank(txtNewsTitle.Value)) And Not(IsBlank(txtNewsBody.Value)) And Not(IsBlank(txtDetailsParagraph.Value)),
DisplayMode.Edit,
DisplayMode.Disabled
)
But how to call the Azure Function we made earlier?
If you have not used or created API Management before, you can start exploring the service with this simple documentation: Quickstart – Create an Azure API Management instance | Microsoft Docs. I will cover the Project Oakdale related basic settings in the next steps. I assume that you have the Azure Function you want to publish and an Azure API Management instance created.
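If you prefer scripting the instance creation, here is a hedged sketch using the Az.ApiManagement module; the resource group, names, and region are placeholders, and the Consumption tier keeps the demo inexpensive.
#Create an API Management instance (illustrative values)
New-AzApiManagement -ResourceGroupName "rg-oakdale-demo" -Name "apim-oakdale-demo" -Location "West Europe" -Organization "Contoso Demo" -AdminEmail "admin@contoso.com" -Sku Consumption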
Set(
EnableSendBtn,
false
);
//CALL AZURE API MANAGEMENT
GiveMeBaconAPI.postcreatebaconpage(
{
newsTitle: txtNewsTitle.Value,
newsBody: txtNewsBody.Value,
meatParas: Value(txtDetailsParagraph.Value),
'subscription-key': "YOUR_KEY_GOES_HERE"
}
);
Set(
EnableSendBtn,
true
);
Reset(txtNewsTitle);
Reset(txtNewsBody);
Reset(txtDetailsParagraph);
PS. While writing this post, I have seen a couple of different setting options for the subscription key during the past week. This might be because Project Oakdale is still in preview. Here I used one of the currently working methods, but I will keep watching the progress and update my post if necessary.
When everything goes as planned, a new page is created in SharePoint with some enterprise service data. You can now continue developing the Project Oakdale app and publish it to the users. In case something goes wrong, you can check the possible errors on the Power Apps side after closing the preview window. You can also debug the Azure Function by opening the monitoring view and making a test call from the Power App. At this point, you will thank yourself for writing enough Write-Host logging inside your code.
Good things are worth waiting for. It took around two years, but finally, PnP PowerShell (@PnpPowershell) for PowerShell Core, aka PnP.PowerShell, is out in preview, and GA is expected later this year. Big kudos to Erwin (@erwinvanhunen) and his team. More about the journey and roadmap: ‘Cross Platform PnP PowerShell Released’.
I am a huge fan of PowerShell and especially the PnP PowerShell library. I still use it almost daily for something, and I think scripting is something that every IT pro should master at least on some level. As we know, things are evolving quickly, and there is always something new in the IT world. One important milestone was PowerShell Core 6, announced back in 2016. PowerShell Core is cross-platform, free, and open source. For an old-school SharePoint geek (like me), the lack of modules for SharePoint Online has been the reason to keep using Windows PowerShell. Sure, you have been able to use REST before, but the PnP PowerShell module has just been more comfortable.
But now the future is finally here for everyone. You can find the source code of this post in my GitHub repository.
Installing the new PnP PowerShell is easy, as usual. Just open a PowerShell console and run the Install-Module PnP.PowerShell -AllowPrerelease command. Things are still in preview mode, but new cmdlets are added regularly through GitHub. Using the module is almost identical to using the classic version. Here's an excellent place to start if needed: PnP PowerShell Overview | Microsoft Docs
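As a minimal sketch, the install and a first connection might look like this; the site URL is a placeholder, and the -Interactive switch opens a browser login in the current preview builds.
#Install the prerelease module and make a first connection
Install-Module PnP.PowerShell -AllowPrerelease -Scope CurrentUser
Connect-PnPOnline -Url "https://contoso.sharepoint.com" -Interactive
Get-PnPWeb
Disconnect-PnPOnline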
On a high level, there are a few things to remember for those of us who have been using the earlier version.
Regarding the login, if you are not familiar with connecting to SharePoint Online using application permissions, I think this is an excellent time to check out that option. Granting the necessary permissions through an Azure AD application gives you more robust tools to control access in many scripting and integration use cases. Application permissions are also crucial to master for reliable and secure connections in automated tasks and service scenarios. Here's a basic walkthrough on how to get started.
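For reference, here is a hedged sketch of an application-permission login with the new module; the client ID, tenant, and certificate thumbprint are placeholders from your own Azure AD app registration.
#Connect with application permissions using an Azure AD app and a certificate
Connect-PnPOnline -Url "https://contoso.sharepoint.com" -ClientId "00000000-0000-0000-0000-000000000000" -Tenant "contoso.onmicrosoft.com" -Thumbprint "CERTIFICATE_THUMBPRINT"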
I still have not covered the “Here's Why” part of the title, so here it goes. For me, the major thing in the new release is that moving the PnP library on top of .NET Core means that PnP PowerShell now works natively with Azure Functions type of services. You were able to use it before, and I have, but the development experience was not good, and there was no easy way to manage the ALM process for your PowerShell functions. With the new version, all that complexity is gone, and PowerShell is starting to be a real option for service development in many cloud business cases.
As an old developer, I know this will raise a few eyebrows among many of my friends, but you should not underestimate PowerShell when creating integrations or services. Yes, by coding you can do things effectively, and there are multiple cases where PowerShell is not enough.
In many organizations, the situation is that there aren't too many people who know modern coding techniques, but there are many who know PowerShell. Now, with PnP PowerShell and the other available cmdlets and modules, people are able to build even more complex and useful programs for their organization.
At the same time, we, the consultants, are responsible for developing services that our clients can use AND maintain. In many cases, there are multiple IT pros who know PowerShell but do not have coding experience. I always try to build my solutions so that the client can take full control of them when I am gone. One of my mentors in consulting had one rule: as a consultant, do your job every day so that you may be unnecessary the next day. Therefore, I see PowerShell as a viable option in many cases. By using it, I know multiple people can maintain the solution. This is the same reason why I find Power Platform an essential platform in the Microsoft ecosystem.
As a consultant, do your job every day so that you may be unnecessary the next day.
To get back to the point, you can now use the PnP PowerShell module easily when building Azure Functions. I recommend using Visual Studio Code from the beginning so that you will adopt better ALM practices. If necessary, start your journey by creating a basic Azure Function with PowerShell. Here's a guide for that (select PowerShell as the programming language): Create your first function in Azure using Visual Studio Code | Microsoft Docs
I assume that you have a working local Azure Function development environment based on the “Create your first function in Azure using Visual Studio Code” article above. Here are the steps you need to take to have PnP PowerShell working in your function.
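The key step is telling the Functions host to load the module. A minimal sketch, assuming managed dependencies are enabled in host.json (the version pin is illustrative): declare PnP.PowerShell in the requirements.psd1 file of your function app.
#requirements.psd1 - the Functions host installs the module from the PowerShell Gallery
@{
'PnP.PowerShell' = '0.*'
}
The run.ps1 body can then read the connection details from the app settings and connect: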
using namespace System.Net

#Read the connection details from the Function App settings
$connClientId = $env:ConnClientId
$connSecret = $env:ConnSecret
$siteURL = $env:O365DemoURL
$response = ""
try {
Connect-PnPOnline -Url $siteURL -ClientId $connClientId -ClientSecret $connSecret
$ctx = Get-PnPContext
$response = $ctx.Url
Disconnect-PnPOnline
}
catch {
$ErrorMessage = $_.Exception.Message
Write-Host "**ERROR: *Basic-PnPPowerShellCore"
Write-Error $ErrorMessage
}
if ($response) {
Write-Host ("Response: " + $response)
# Site details found!
Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
StatusCode = [HttpStatusCode]::OK
Body = $response
})
}
else {
# No site details found
Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
StatusCode = [HttpStatusCode]::NoContent
})
}
Now, imagine what types of scenarios and possibilities the Azure Functions integration opens up for a user who knows PowerShell. In my next post, I will show something interesting regarding Power Platform and PowerShell, but hopefully, this post already gives you ideas for developing useful services quickly.
Is it possible to save list item attachments to the document library? The short answer is yes.
This question is really common, and as you might know, there isn't an OOTB way to do it. Still, I do understand the need, because having the attachment documents in a SharePoint document library brings many benefits: search and filtering, metadata, collaboration, etc. When I was looking for a solution for this, I found many blogs about how to upload images to a SharePoint document library with Power Apps.
At the end of the day, the solution comes down to the point I made in my Tech Days 2020 session ‘The Right Tool for the Job – Combine Power Platform and SharePoint’: know your tools and use the right tool for the job. There is no need to read the attachment blobs in Power Apps and then upload the data somehow. Everything can be done with Power Automate, SharePoint, and some smart additions that extend the user experience with UI-only customizations.
There isn't anything special you need to do to the list that has the attachments. For the document library, I like to add a field that links the documents to the item in the referring list. So, add a column called ParentID as metadata to the document library where you want to save the documents. Later, we will use this column to give the users easier access to the documents.
In this example, I have a simple list with a few columns and a canvas Power App with a simple form. The Save button uses the basic SubmitForm to save the given details. The attachment handling is done so that the end user uses the default attachment field functionality to save the attachment documents to the list. Then we use a Power Automate flow to move them to the document library. There are a few ways to build the necessary Power Automate flow.
Let's initialize the flow straight from the Power App. From the ribbon in the Power Apps studio, select Action -> Power Automate -> Create a new flow. This will open a new Power Automate flow for you that is triggered by a Power App. In the flow, save the incoming item ID into a ListItemID variable, get the item's attachments with the Get attachments action, and then, inside an Apply to each loop, read each attachment's content and create the file in the document library with the Create file action.
Finally, remember to give the flow a name and save it. I used the name ‘Save Attachments Demo.’ After this, we can go back to the Power App and connect the flow to the application. First, select the Save button and open the OnSelect property. Then, from the ribbon, select Action -> Power Automate -> and pick the flow we just created. If you had something in the OnSelect property, you might lose the existing formula, but you can get it back with undo.
We need to modify the OnSelect property of the button with the following functions.
SubmitForm(frmMyReport);
SaveAttachmentsDemo.Run(frmMyReport.LastSubmit.ID);
The forms in Power Apps have some data-state-related properties (LastSubmit, Unsaved, Updates) we can use in the process. The LastSubmit property holds all the fields and their values of the last submitted form. We send the list item ID as a parameter to the Power Automate flow. This ID is then saved in the ListItemID variable at the beginning of the flow. Test the form with a few attachments; after the submission, the flow should copy the list attachments to the document library.
At this point, we can copy the attachments to the document library, but this leaves us with two copies in different places, which is not ideal. Let's make some additions to the flow to overcome this.
The next step is to add the list item ID as metadata to the document. This can be done with the Update file properties action. The correct file is found with the dynamic Id from the Create file action. Set the ParentID field with the value in the ListItemID variable.
In the final step, we delete the attachment from the list item. This can be done with the Delete attachment action. Point the action to the list and get the item based on the ListItemID variable. The File Identifier is fetched from the Get attachments action.
Save the flow and make a test run with the app. Now we have a solution where the user can enter list details and documents with a Power Apps form. The form details are saved in the SharePoint list, and the documents are saved in a document library.
So, what is the benefit of saving the ID of the list item as metadata for the documents? With the ID, we can build an easy functionality where users can access the desired documents straight from the list. Add a new column to the list and paste the following JSON into its column formatting settings (replace the placeholder with the URL of your library):
{
"$schema": "https://developer.microsoft.com/json-schemas/sp/column-formatting.schema.json",
"elmType": "a",
"txtContent": "Show Documents",
"attributes": {
"target": "_blank",
"href": "='LINK TO YOUR LIBRARY/Forms/AllItems.aspx?FilterField1=ParentID&FilterValue1=' + [$ID]"
},
"style": {
"border": "none",
"background-color": "transparent",
"cursor": "pointer",
"color": "Blue"
}
}
Now we have an easy way to navigate to the documents straight from the list. When you click the Show Documents link in the list, you are directed to the document library, and you will see only the documents related to the list item. More information on column formatting, like using icons, can be found here: https://github.com/SharePoint/sp-dev-list-formatting
Nowadays, I find myself working with PowerShell almost as much as with Power Platform. Not to mention that I used to work as a software developer in the web stack world, with jQuery and React. I remember that my first large PowerShell project was related to SharePoint 2013 site provisioning years ago. Back then, the PnP PowerShell library was only a dream, and we had to write the modules and extensions ourselves. Today, I'm a big fan of PnP PowerShell, and I'm using it almost daily. And we should not forget the multiple other PowerShell modules available for the different services of Office 365.
Of course, there are situations when there isn't a built-in command available for the needed task. In many cases, you can use CSOM capabilities to close the gap (I'll come back to this in upcoming posts). And then there's Microsoft Graph, which you can use to do a lot of things. To get rolling, we need to authenticate against it to get an access token that we can use in HTTP calls against the Graph API. Here are three different scenarios for getting the access token with PowerShell.
You can find all the scripts in my GitHub repository: https://github.com/MikkoKoskinen/SP-Poweshell/tree/master/ConnectToGraph
The first thing you need to do is register the application and set the necessary permissions for the service you want to use. Here are the steps on how to do it: https://docs.microsoft.com/en-us/graph/auth-register-app-v2. Remember to copy the application ID and secret because we need those in the scripts.
All the scripts below need several pieces of information from the tenant and the registered application. Most importantly, we need the application ID, the tenant ID, and the created secret. With these, we can establish an application-level connection to the Graph. For delegated permissions, we also need the credentials of the user we want to use for the connection.
#*** Initialize variables
Write-Host "Initialize variables"
Write-Host " "
[string]$graphApp_AppId = "APPLICATION_ID"
[string]$graphApp_AppSecret = "APPLICATION_SECRET"
[string]$tenantId="TENANT_ID"
[string]$spAdminURL = "SPADMIN_URL"
[string]$graphVer = "v1.0"
#Credential information to be used in delegated connections
[string]$username = "USERNAME"
[string]$password = "PASSWORD"
In my example script, the needed variables are given as plain text in the script file, but for production, you might want to consider other options, like using Credential Manager or Azure Key Vault, to hide this information. At the very least, don't take a straight copy of my script, add your information, and post it to some public source.
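As one hedged alternative, the secret could be read from Azure Key Vault; this sketch assumes the Az.KeyVault module and an authenticated Az session, and the vault and secret names are placeholders.
#Read the application secret from Azure Key Vault instead of hard-coding it
$graphApp_AppSecret = Get-AzKeyVaultSecret -VaultName "my-demo-vault" -Name "GraphAppSecret" -AsPlainText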
To connect to the Graph with application permissions, we need to build a request body with the application ID and secret. We also need to pass a grant type of ‘client_credentials’ and a scope with a value of ‘https://graph.microsoft.com/.default’.
try{
#Connecting with Application Permissions
Write-Host " "
Write-Host "Connecting with Application Permissions"
$body=@{
client_id=$graphApp_AppId
client_secret=$graphApp_AppSecret
scope="https://graph.microsoft.com/.default"
grant_type="client_credentials"
}
$response = Invoke-WebRequest -Uri "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token" -ContentType "application/x-www-form-urlencoded" -Body $body -Method Post
$accessToken=$response.content | ConvertFrom-Json
$AppAccessToken = $accessToken.access_token
if($AppAccessToken){
Write-Host ("...got AppAccessToken")
}
else{
Write-Error -Message$ "Application access error" -Category AuthenticationError
}
}
catch {
$ErrorMessage = $_.Exception.Message
Write-Host "**ERROR: Connecting with Application Permissions"
Write-Error -Message $ErrorMessage
}
This body is then sent with an Invoke-WebRequest POST to the login endpoint. If it succeeds, we get a JSON response back, and we can parse the access token from it.
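To verify that the token works, you can make a simple Graph call with it. A minimal sketch; the /users endpoint assumes the app has been granted the User.Read.All application permission.
#Call the Graph /users endpoint with the acquired token
$headers = @{ Authorization = "Bearer $AppAccessToken" }
$users = Invoke-RestMethod -Uri "https://graph.microsoft.com/$graphVer/users" -Headers $headers -Method Get
$users.value | Select-Object displayName, userPrincipalName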
A delegated connection can be made in almost the same way as the connection with application permissions. Again, we need the request body with the application ID and secret. Then we need to add the username and password of the user account we want to use for the connection. This time, the grant type for the call is ‘password’.
try{
#Connecting with Delegated permissions
Write-Host " "
Write-Host "Connecting with Delegated permissions"
$body=@{
client_id=$graphApp_AppId
password= $password
username= $username
client_secret=$graphApp_AppSecret
grant_type="password"
scope="https://graph.microsoft.com/.default"
}
$response = Invoke-WebRequest -Uri "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token" -ContentType "application/x-www-form-urlencoded" -Body $body -Method Post
$accessToken=$response.content | ConvertFrom-Json
$UserAccessToken = $accessToken.access_token
if($UserAccessToken){
Write-Host ("...got UserAccessToken")
}
else{
Write-Error -Message$ "Delegated access error" -Category AuthenticationError
}
}
catch {
$ErrorMessage = $_.Exception.Message
Write-Host "**ERROR: Connecting with Delegated permissions"
Write-Error -Message $ErrorMessage
}
Again, if everything goes as planned, we should get the access token as a return value.
For the application connection, there's also a shorter version available through the PnP PowerShell module. We can call the Connect-PnPOnline cmdlet with the application ID and secret and then get the access token with a Get-PnPAccessToken call.
try{
#Connecting with PnP PowerShell
Write-Host " "
Write-Host "Connecting with PnP PowerShell"
Connect-PnPOnline -Url $spAdminURL -AppId $graphApp_AppId -AppSecret $graphApp_AppSecret
$pnpAccessToken = Get-PnPAccessToken
if($pnpAccessToken){
Write-Host ("...got pnpAccessToken")
}
else{
Write-Error -Message$ "PnP PowerShell access error" -Category AuthenticationError
}
}
catch {
$ErrorMessage = $_.Exception.Message
Write-Host "**ERROR: Connecting with Delegated permissions"
Write-Error -Message $ErrorMessage
}
As far as I can remember, customizing SharePoint forms, and cascading fields in particular, has been one of the main things organizations have asked for. I have done these customizations for classic and modern SharePoint with InfoPath, custom solutions, and other form products. Now we can use Power Apps in many cases, and there is no exception here. Many app creators are wondering how to make these customizations: how can I create cascading fields, for example?
A full walkthrough of custom forms would need a separate series of posts, so I am concentrating on cascading fields and a small form example here. Of course, with Power Apps, it's not only about SharePoint, and the following techniques can be used with other data connectors as well. Here is my typical way of implementing cascading fields in Power Apps. I've used it against small SharePoint lists and in large multi-field business scenarios.
In the custom form, the users first select the category, and then they can select a value belonging to that category from the second drop-down box.
Let's create a simple canvas app for our form. The form only has a text box for the title, two drop-down boxes for selecting the categories, and a submit button. You also need to connect the app to all three lists in SharePoint.
Even though this is a simple app, remember the naming conventions. Making a habit of renaming the controls and screens will help you in the future. You can also use grouping, as I did for the labels, to make it easier to find important elements in the tree view.
The basic setup is now done, but we are missing the cascading part. The category selection doesn't filter the value list. We need to change the Items property of the second drop-down to make this happen.
In the Items property of ddbSubCategory, we filter the items based on the selection in the other drop-down box:
Filter('Sub Category','Main Category'.Value = ddpMainCategory.Selected.Title)
After this, we have the basic cascading fields ready.
In the SharePoint list, we only have the Title and Type Value columns, and the values are now read from different SharePoint lists. Therefore, we didn't use a Power Apps form element, where we would have the SubmitForm action available. To save the details from our custom form, we need to use the Patch command. Add the following to the OnSelect event of the Save button. Again, remember to comment your code. It helps, I promise!
These functions save the details to SharePoint based on what the user selected from the drop-downs and reset the form for the next entry. The idea in the Patch command is:
//Save details to SharePoint
Patch('Important List',
Defaults('Important List'),
{
Title: txtTitle.Text,
'Type Value': ddbSubCategory.Selected.Title
}
);
//Reset form element
Reset(txtTitle);
Reset(ddpMainCategory);
Reset(ddbSubCategory);
You might notice a couple of things about this way of building cascading drop-downs. Updating the ddbSubCategory field might take some time after we change the value in ddpMainCategory, and the first item in the drop-down box is selected automatically. In many situations, we want users to see an empty value in the drop-down and take action to choose the one they need. One way to achieve this is to load the categories into collections in the App's OnStart and add an empty value as the first record:
//Main Categories to collection
ClearCollect(MainCatSelection,"");
Collect(MainCatSelection,'Main Category');
//Sub Categories to collection
ClearCollect(SubCatSelection,"");
Collect(SubCatSelection,'Sub Category');
Click the Run OnStart action of the App to populate the collections for testing, and remember to point the Items properties of the drop-downs to these collections instead of the SharePoint lists. Now we have cascading drop-downs that are fast to use and show an empty value before the user makes a selection.