Project Oakdale and Azure API Management

Microsoft Dataverse for Teams and Azure API Management

I think that Microsoft Dataverse for Teams (previously known as Project Oakdale) is the most important Power Platform announcement of the year, especially if you are coming from a canvas apps background like I am. Teams is becoming more and more a platform for the business rather than just a replacement for Skype. Now, with Microsoft Dataverse, users get a real data capability and a route to future upgrades by moving their application on top of Common Data Service if needed. Teams and Microsoft Dataverse for Teams apps offer the simplicity needed to build those small, or even large, everyday business apps that truly matter for the users.

There are multiple posts and how-to guides for learning Microsoft Dataverse for Teams and canvas apps development, so there is no need to go deeper here. One thing that got my attention during Ignite was the announcement that you can use Azure API Management with a Dataverse for Teams solution through the existing Teams licensing!

This means that your professional developers can create API services to process data and connect to almost any enterprise service. The citizen or IT pro developers can then leverage those functionalities in their applications. Technically, these functions are published as custom connectors to the Power Platform environment related to the Dataverse in Teams.

Earlier, this meant that you needed an extra license because a custom connector is a premium-level connector, but that is no longer needed with Dataverse for Teams environments. Let us see how to use this in action. Again, we can use something easy even for non-developers and create an Azure Function with PnP PowerShell (I wrote about this earlier).

Create Azure Function

Let us keep things simple and create an application that asks for some data from a user and then creates a new News page in SharePoint. We will also fetch some additional information from an “enterprise” service with a REST call during the creation process. The idea is to ask the user for the title and the body and then fetch some data from the Bacon Ipsum service and add it to the news page.

The source code of the function can be found in my GitHub repository: PowerShellCore / CreateBaconPage. It is a lot easier to read the code from there, but I will cover the most important parts here.

When I am creating PowerShell scripts, I have a habit of using the following type of structure. I think it helps to read and maintain the code (a minimal skeleton follows the list).

  1. Start by reading the Azure Function request parameters if developing a function.
  2. Set the main internal parameters used in the function, like connection-related values.
  3. Import the needed modules, if any.
  4. Then, inside the first try-catch, open the necessary connections, for example, to SharePoint.
  5. Then check that all mandatory parameters are available. I have a parameter called $haveMainParameters that I update while checking the other parameters.
  6. If everything is available and the connections are open, we can start to run the main section of the program.
  7. In the last section, close all the connections.
  8. If developing a function, push the return details so that the callers can continue their process.
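
A minimal sketch of that skeleton for an HTTP-triggered function might look like the following; the parameter and setting names here are illustrative, not the exact ones from the sample function.

using namespace System.Net

# 1. Read the Azure Function request parameters
param($Request, $TriggerMetadata)

$newsTitle = $Request.Query.newsTitle

# 2. Set the main internal parameters (illustrative application setting name)
$siteURL = $env:DemoSiteUrl
$spConn  = $null

# 3. Import the needed modules
# Import-Module "$PSScriptRoot\CustomModules\EnterpiseAPI.psm1" -Force

# 4. Open the necessary connections inside the first try-catch
try {
    # $spConn = Connect-PnPOnline -Url $siteURL -Credentials $credential -ReturnConnection
}
catch {
    Write-Error $_.Exception.Message
}

# 5. Check that all mandatory parameters are available
$haveMainParameters = [bool]($spConn -and $newsTitle)

# 6. Run the main section if everything is in place
if ($haveMainParameters) {
    try {
        # ...the actual work goes here...
    }
    catch {
        Write-Error $_.Exception.Message
    }
}

# 7. Close all the connections
# Disconnect-PnPOnline -Connection $spConn

# 8. Push the return details back to the caller
Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
    StatusCode = [HttpStatusCode]::OK
    Body       = "Done"
})
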
  1. Create a new Azure Function with Visual Studio Code.
  2. We need to fetch three parameters from the request.
    • News title
    • Body of the news
    • Paragraph amount (int value) to be used in our service call to Bacon Ipsum
$newsTitle = $Request.Query.newsTitle
$newsBody = $Request.Query.newsBody
$meatParas = $Request.Query.meatParas
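
The three lines above read from the query string only. If you also want to accept the same values in the request body, the way the default PowerShell HTTP trigger template does, a small variant could be:

#Fall back to the request body if the query string does not contain a value
$newsTitle = $Request.Query.newsTitle
if (-not $newsTitle) { $newsTitle = $Request.Body.newsTitle }

$newsBody = $Request.Query.newsBody
if (-not $newsBody) { $newsBody = $Request.Body.newsBody }

$meatParas = $Request.Query.meatParas
if (-not $meatParas) { $meatParas = $Request.Body.meatParas }
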
  1. Next, add the necessary parameters used to connect to SharePoint.
    • This time we need to authenticate against SharePoint with a user’s credentials because you cannot create pages with an app-only connection.
    • Make sure to store the credentials securely. I used the application settings in this example, but Azure Key Vault is a better option (a small sketch of this follows the connection example below).
  2. Now you can make a connection to SharePoint.
    • As a best practice, it is recommended to return the connection to a variable and use it in every PnP cmdlet call.
    • This helps you avoid a possible mixing of connection contexts that can happen when functions execute in parallel. You will see the 'The object is used in the context different from the one associated with the object.' error message when mixing happens.
$spConn = Connect-PnPOnline -Url $siteURL -Credentials $credential -ReturnConnection
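
Here is a sketch of how the credential setup described above might look, assuming the username and password are stored in application settings (the setting names below are illustrative):

#Read the connection details from the Function App application settings (illustrative names)
$siteURL  = $env:DemoSiteUrl
$userName = $env:SPUserName
$password = ConvertTo-SecureString -String $env:SPPassword -AsPlainText -Force

#Build the credential object that is passed to Connect-PnPOnline above
$credential = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $userName, $password
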
  1. Now it is time to check that we have all the necessary parameters and connections available.
#Check the parameters necessary for the application
Write-Host " "
Write-Host "*Check the parameters necessary for the application"

If($spConn -and $newsTitle -and $newsBody -and $meatParas){
    #Parameters are available
    $haveMainParameters = $true
}
else {
    #Missing some parameters
    $haveMainParameters = $false
}
  1. If the parameters are OK, we can continue building the logic.
  2. First, let us create a basic page with a title and content.
    • As you can see, I like to write messages to the PowerShell host console a lot. I think this helps you with debugging and testing.
Write-Host " "
Write-Host "*Create the new page"

#Create basic section
Write-Host "..create basic section"

$newsPage = Add-PnPClientSidePage -Name $newsTitle -PromoteAs NewsArticle -Connection $spConn
Add-PnPClientSideText -Page $newsPage -Text $newsBody -Connection $spConn

  1. At this point, remember to save the function and test the logic by hitting F5.
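
For a quick local smoke test, you can also call the function endpoint directly once the local host is running. The port below is the Azure Functions local default, and the route is an assumption based on the function name used in this example:

#Call the locally running function with test values
$localUrl = "http://localhost:7071/api/CreateBaconPage"

Invoke-RestMethod -Method Post -Uri "$($localUrl)?newsTitle=Local%20test%20news&newsBody=Hello%20from%20a%20local%20test&meatParas=2"
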

Connecting to Enterprise Service

Now, let’s look at an example where we fetch some data from an enterprise service outside the Office 365 scope. As a test, let’s get some random text from the Bacon Ipsum service. We will do this by adding a custom module with the necessary logic to our Azure Function.

  1. I like to add a separate folder for my custom modules.
    • Add a CustomModules folder and an EnterpiseAPI.psm1 file into it.
  2. Inside the module, we only need to add one function, which is then exported for use in the function logic.
function GiveMeBacon{
    [CmdletBinding()]
    param
    (
        [Parameter(Mandatory = $true, HelpMessage="Amount of meat")]
        [string] $meatParas
    )

    #**Give me bacon from - baconipsum.com
    Write-Host "#*#Give me bacon from - baconipsum.com"

    $response = ""

    try {
        $queryURL = ("https://baconipsum.com/api/?type=all-meat&paras={0}&start-with-lorem=1&format=text" -f $meatParas)
        
        $response = Invoke-RestMethod -Uri $queryURL -ContentType "application/json; charset=utf-8" -Method Post -UseBasicParsing
    }
    catch {
        $ErrorMessage = $_.Exception.Message
        Write-Host "**ERROR: #*#Give me bacon"
        Write-Error $ErrorMessage
    }

    return $response
}

Export-ModuleMember -Function GiveMeBacon
  1. Here is a quick overview of the function:
    • First, read the parameter for the paragraph amount.
    • We need to construct the URL that we are calling and make a REST call against that URL.
    • As a response, we will get a random set of text that we then return to the caller.
  2. Of course, in real life, connecting to an enterprise service is most likely a lot more complex, but this gives you an idea of how to build one.
  3. Next, let’s use this new logic in our original function. The first thing to do is to add a reference to the module.
    • Add the following lines at the beginning of your Azure Function, somewhere after the Input bindings section.
#Get custom modules
$SP_ModulePath = $PSScriptRoot + "\CustomModules"
Import-Module "$SP_ModulePath\EnterpiseAPI.psm1" -Force
  1. Now we can extend the page creation by calling the enterprise function and adding the returned text into a separate section on the page. Add the following logic after the initial page creation section.
#Connect to enterpise service
$baconText = GiveMeBacon -meatParas $meatParas #Save for later -meatType $meatType

#Add related section
Write-Host "..add related section"

Add-PnPClientSidePageSection -Page $newsPage -SectionTemplate OneColumn -ZoneEmphasis 2 -Connection $spConn
Add-PnPClientSideText -Page $newsPage -Column 1 -Section 2 -Text "<h3>Related Info</h3>" -Connection $spConn

Add-PnPClientSideText -Page $newsPage -Column 1 -Section 2 -Text $baconText -Connection $spConn

#Publish the page
Write-Host "..publish the page"
Set-PnPClientSidePage -Identity $newsPage -Publish -Connection $spConn

Again, you can test the function to make sure the correct type of page gets created in SharePoint. When everything is working correctly, you can publish the function to Azure.
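
If you prefer the command line over the Visual Studio Code publish button, one way to push the function to Azure is with the Azure Functions Core Tools; the app name below is a placeholder for your own Function App:

#Publish the local project to an existing Function App (placeholder name)
func azure functionapp publish my-baconpage-functions
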

Microsoft Dataverse for Teams Application

Now let us go to Teams and open the Power Apps application so that we can create a Dataverse for Teams application. The application is simple, with some data fields and a button. You can see the structure in the image below.

  1. I added a variable called EnableSendBtn to the OnStart setting of the App (for example, Set(EnableSendBtn, true)). We will use this variable to enable and disable the Send button so that the form details are sent only when all the necessary details are given.
  2. I added the following elements to the screen.
    • A one-line text box for the title of the news page.
    • A multiline text box for the body.
    • A number field used to give the number of paragraphs to fetch from our enterprise service.
  3. Here are a few important things to notice from my example:
    • Remember to give each element on the screens a unique name. This will help you build and maintain the logic.
    • First, we set EnableSendBtn to false to disable the button for the duration of the API call.
    • The last two sections enable the button again after the call and reset the form controls.
    • The DisplayMode setting of the Send button has the following logic:
If(
    EnableSendBtn And Not(IsBlank(txtNewsTitle.Value)) And Not(IsBlank(txtNewsBody.Value)) And Not(IsBlank(txtDetailsParagraph.Value)),
    DisplayMode.Edit,
    DisplayMode.Disabled
)

But how do we call the Azure Function we made earlier?

Configuring and Using Azure API Management

If you have not used or created API Management before, you can start exploring the service with this simple documentation: Quickstart – Create an Azure API Management instance | Microsoft Docs. I will cover the Project Oakdale related basic settings in the next steps. I assume that you have the Azure Function that you want to publish and an Azure API Management instance created.

  1. The documentation link above also has details on how to add your first API to the management instance.
  2. In this example, you need to add an Azure Function.
    • You will see a form that you can use to find the necessary API details.
    • Click Browse from the form.
  1. Next, click the “Function App.”
    • You will see a list of available Azure Function Applications.
    • Select the one that holds the function you want to publish.
  1. A list of available functions is shown, and you can select those you want to publish.
    • Select the correct one and click Select.

  1. You will see the details of the function in a form.
    • I recommend giving meaningful values for the details because it will help you find and use the API in Project Oakdale.
    • These settings can also be updated later.
    • Finally, click Create.
  2. An important thing to notice here is that every Azure API Management API is protected with a subscription key by default. This key needs to be added to the API call; otherwise, the caller gets an access denied error.
    • It is possible to turn the key usage off, but then the whole API would be public, and we don’t want that.
    • You can find the keys in the Azure API Management portal under Subscriptions.
    • Copy the necessary key, like the built-in primary key, because we will need it on the Power Apps side.
  1. In our function, there are three attributes that we need from the users. They won’t be asked for automatically unless we update the OpenAPI schema of the function and tell it what we need.

     

  1. Select the POST operation of our API and then click the edit link of the Frontend section.
    • We want to add new query parameters to the function.
    • You could write the JSON settings manually, but using the editor makes your life a lot easier.
    • Create parameters for all the values used in your function. In my case, I only needed string and integer type attributes.
    • Also, create one extra parameter for the subscription key, called ‘subscription-key’, type ‘String’.
    • Remember to save the changes (a quick test-call sketch follows this list).
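
Before exporting, you can verify the operation and the subscription key directly against the API Management gateway, for example with PowerShell. The gateway URL and API path below are placeholders for your own instance:

#Placeholder gateway URL and API path for your API Management instance
$apimUrl = "https://YOUR-APIM-INSTANCE.azure-api.net/YOUR-API-PATH/createbaconpage"
$subscriptionKey = "YOUR_KEY_GOES_HERE"

Invoke-RestMethod -Method Post -Uri "$($apimUrl)?newsTitle=APIM%20test&newsBody=Hello%20via%20APIM&meatParas=2&subscription-key=$subscriptionKey"
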
  1. Now we can export the function. In the export menu, there is an option for Power Apps and Power Automate.
    • In the export form, you see a dropdown box to select the Power Platform environment where to publish the connector.
    • Ensure that you have at least one Power App created in Teams, because otherwise you will not see the environment in the list. Also, make sure to select the correct environment (been there, done that).
    • Give the connector a meaningful name and click Export.
  1. It will take a few minutes to publish the API fully, but at this point, you can go to Teams and open the Project Oakdale application.
    • In case the app is already open, I recommend refreshing the browser.
  1. In the App, you need to add a new Data Connection.
    • Select Data from the left menu and click Add data.
    • Find the API with the name you gave during publishing and click it in the list.
    • A panel will open on the right, and you can click the Connect button on the form.
    • At least in the current preview version of Project Oakdale, you will see a warning about a Premium connection. Based on Microsoft documentation, there is no need to take any extra action on that.
  1. Now we are ready to use the API. Go to the OnSelect setting of the button in our App and add a line after the CALL AZURE API MANAGEMENT comment in our example.
    • Remember to select the POST operation of your function.
    • Then you need to give the custom parameters and associate the values with the form elements.
    • The final parameter, called ‘subscription-key’, is the subscription key copied earlier. Without this key in the query, your API call will not be processed.
  2. My final logic of OnSelect of the Send button looks like this:
Set(
    EnableSendBtn,
    false
);
//CALL AZURE API MANAGEMENT
GiveMeBaconAPI.postcreatebaconpage(
    {
        newsTitle: txtNewsTitle.Value,
        newsBody: txtNewsBody.Value,
        meatParas: Value(txtDetailsParagraph.Value),
        'subscription-key': "YOUR_KEY_GOES_HERE"
    }
);
Set(
    EnableSendBtn,
    true
);
Reset(txtNewsTitle);
Reset(txtNewsBody);
Reset(txtDetailsParagraph);

PS. While writing this post, I have seen a couple of different setting options for the subscription key during the past week. This might be because Project Oakdale is still in preview. Here I used one of the currently working methods, but I will keep watching the progress and update this post if necessary.

  1. Now, we can save the application and run a test in the preview window.

When everything goes as planned, a new page is created in SharePoint with some enterprise service data. You can now continue to further develop the Project Oakdale app and publish it to the users. In case something goes wrong, you can check the possible errors on the Power Apps side after closing the preview window. You can also debug the Azure Function by opening the monitoring view and making a test call from the Power App. At this point, you will thank yourself for writing enough messages to the host inside your code.


PnP.PowerShell Is Out and Here’s Why it Matters

Good things are worth waiting for. It took around two years, but finally PnP PowerShell for PowerShell Core, aka PnP.PowerShell (@PnpPowershell), is out in preview, and GA is expected to be released this year. Big kudos to Erwin (@erwinvanhunen) and his team. More about the journey and roadmap in ‘Cross Platform PnP PowerShell Released’.

I am a huge fan of PowerShell and especially the PnP PowerShell library. I still use it almost daily for something, and I think scripting is something that every IT Pro should master at least on some level. As we know, things are evolving quickly, and there is always something new in the IT world. One important thing around PowerShell was PowerShell Core 6, announced back in 2016. PowerShell Core is cross-platform, free, and open source. For an old-school SharePoint geek (like me), the lack of APIs and modules for SharePoint Online has been the reason to keep using Windows PowerShell. Sure, you have been able to use REST before, but the PnP PowerShell module has just been more comfortable.

But now the future is finally here for everyone. You can find the source code for this post in my GitHub repository.

Things to Consider on PnP PowerShell

Installing the new PnP PowerShell is easy, as usual. Just open a PowerShell console and run the Install-Module PnP.PowerShell -AllowPrerelease command. Things are still in preview mode, but new cmdlets are added regularly to GitHub. Using the module is almost identical to using the classic version. Here’s an excellent place to start if needed: PnP PowerShell Overview | Microsoft Docs
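
For example, installing the preview module and checking which cmdlets are available could look like this:

#Install the preview version of the new cross-platform module
Install-Module PnP.PowerShell -AllowPrerelease

#List the cmdlets the module currently ships with
Get-Command -Module PnP.PowerShell
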

On a high level, the things to remember for those of us who have been using the earlier version are:

  • The roadmap
    • Once the new PnP PowerShell (v4) goes GA (scheduled for the end of 2020), development of the old v3 version will stop.
    • The plan is to archive the v3 repo during Q1 2021.
    • Start to plan the migration NOW! I just did.
  • PnP PowerShell will be available for SharePoint Online only in the future!
  • PnP PowerShell runs on top of .NET Core 3.1 / .NET Framework 4.6.1. This means that you need the new cross-platform version of PowerShell to use it. You can find the installation instructions here: Installing PowerShell – PowerShell | Microsoft Docs
  • All *-PnPProvisioningTemplate cmdlets have been renamed to *-PnPSiteTemplate. This means that Get-PnPProvisioningTemplate, for instance, is now called Get-PnPSiteTemplate (see the example after this list).
  • Classic credential-based authentication has changed.
    • More info from GitHub (link above)
    • WebLogin functionality is not available anymore.
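
As a concrete example of the renaming, assuming you extract a site template to a file, the old and new cmdlets look like this:

#Old module (SharePointPnPPowerShellOnline, v3)
Get-PnPProvisioningTemplate -Out .\site-template.xml

#New module (PnP.PowerShell, v4)
Get-PnPSiteTemplate -Out .\site-template.xml
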

Regarding the login, if you are not familiar with connecting to SharePoint Online using application permissions, I think this is an excellent time to check out that option. Granting the necessary permissions through an Azure AD application will give you more robust tools to control access for many scripting and integration use cases. Application permissions are also crucial to master for reliable and secure connections in automated tasks and service scenarios. Here’s a basic walkthrough on how to get started.

PnP PowerShell with Azure Functions

I still have not covered the “Here’s Why” part of the title, so here it goes. I think the major thing (for me) about the new release is that moving the PnP library on top of .NET Core means that PnP PowerShell now works natively with Azure Functions type of services. You were able to use it before (and I have), but the development experience was not good, and there was no easy way to manage the ALM process for your PowerShell functions. With the new version, all that complexity is gone, and PowerShell is starting to be a real option for service development in many cloud business cases.

As an old developer, I know this will make the hair of many of my friends stand on end, but you should not underestimate PowerShell when creating integrations or services. Yes, by coding you can do things effectively, and there are multiple cases where PowerShell is not enough.

In many organizations, the situation is that there aren’t too many people who know modern coding techniques, but there are many who know PowerShell. Now, with PnP PowerShell and the other available cmdlets and modules, people are able to build even more complex and useful programs for their organization.

At the same time, we, the consultants, are responsible for developing services that our clients can use AND maintain. In many cases, there are multiple IT Pros who know PowerShell but do not have coding experience. I always try to build my solutions so that the client can take full control of them when I am gone. One of my mentors in consulting had one rule: as a consultant, do your job every day so that you may be unnecessary the next day. Therefore, I see PowerShell as a viable option in many cases. By using it, I know multiple people can maintain the solution. This is the same reason I find Power Platform an essential part of the Microsoft ecosystem.

As a consultant, do your job every day so that you may be unnecessary the next day.

To get back to the point, you can now use the PnP PowerShell module easily when building Azure Functions. I recommend using Visual Studio Code from the beginning so that you will adopt better ALM practices. If necessary, start your journey by creating a basic Azure Function with PowerShell. Here’s a guide for that (select PowerShell as the programming language): Create your first function in Azure using Visual Studio Code | Microsoft Docs

  • You should also go through the Application Permission section mentioned above to connect to SharePoint Online in a managed way.
    • Maybe the easiest way to connect to SharePoint Online from an automated process is to use Connect-PnPOnline -Url YOUR_URL -ClientId "ClientId" -ClientSecret "Secret".
    • Remember not to add the client id, secret, cert details, etc. straight into your code.
    • Use the application settings of the function to store these values during development time.
    • In production, you should store these values in Azure Key Vault and use Managed Identity to access them, but that goes beyond this post’s topic.
  • For this example, I saved the connection values to the application settings of the Function App.
  • A valuable thing to learn for Azure Function development is Dependency Management.

I assume that you have a working local Azure Function development environment based on the “Create your first function in Azure using Visual Studio Code” article above. Here are the steps you need to take to have PnP PowerShell working in your function.

Using PnP PowerShell with Azure Function

  1. Create a new PowerShell-based function with Visual Studio Code.
    1. I named mine Basic-PnPPowerShellCore.
  2. Then, and this is the most enjoyable part, you need to load the PnP.PowerShell module into your function’s environment.
    1. Open the requirements.psd1 file and add PnP.PowerShell as a dependency to the project (a sketch of the entry follows this list).
    2. When writing this post, the version was 0.xxx, but you should check the current one from GitHub (link above).
    3. AND THAT’S IT. No more uploading the module manually or anything like that. Things just work natively.
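
A sketch of what the requirements.psd1 entry could look like; the version string below is a placeholder, so check the current one from the PowerShell Gallery or GitHub:

#requirements.psd1 - modules the Azure Functions host restores automatically on first run
@{
    #Replace the placeholder with the current PnP.PowerShell version
    'PnP.PowerShell' = '0.x.x'
}
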
  1. Create a new Azure AD application for your target environment.
    1. Give the necessary permissions to the application. I gave full control to all sites.
    2. Also, create a secret that can be used when authenticating against the application with PnP PowerShell.
  2. I prefer creating the Azure Function App before I publish the function so that I can make the necessary settings, but you can also choose to create the app while publishing the function.
    1. Save the application id, aka client id, and the secret to your Function App application settings (note what was said before about production).
  3. Download the Application Settings to your local development environment.
  4. For a test function, let us do a simple service that returns the site’s URL to the caller.
  5. Open the run.ps1 of the function created earlier.
  6. Add the necessary parameters.
    1. $env: refers to the Azure Function application settings, which hold the necessary details to connect to SharePoint Online.
    2. Of course, you can put these values straight into your code, but in production, I recommend using other options.
$connClientId = $env:ConnClientId
$connSecret = $env:ConnSecret
$siteURL = $env:O365DemoURL
  1. My simple function looks like this:
$response = ""

try {
    Connect-PnPOnline -Url $siteURL -ClientId $connClientId -ClientSecret $connSecret

    $ctx = Get-PnPContext

    $response = $ctx.Url

    Disconnect-PnPOnline
}
catch {
    $ErrorMessage = $_.Exception.Message
    Write-Host "**ERROR: *Basic-PnPPowerShellCore"
    Write-Error $ErrorMessage
}

if ($response) {
    Write-Host ("Response: " + $response)

    # Site details found!
    Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
        StatusCode = [HttpStatusCode]::OK
        Body = $response
    })
}
else {
    # No site details found
    Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
        StatusCode = [HttpStatusCode]::NoContent
    })
}
  1. Next, you can test the function by pressing F5.
    1. Because we added a new module as a dependency, the first start will take a few minutes while the module is loaded into the local environment.
  2. After the loading is done, you can test the function by calling the URL of the function.
  1. When everything goes as planned, you will see the URL of the test site in the browser.
  2. After this, you can publish the function to the Azure Function App from the Visual Studio Code extension.
  1. I recommend going into the Azure Portal and opening the Function App where you deployed the function.
    1. From the app, select Functions, and then the function you just created.
    2. Run a Test against the function to make sure it is working.
    3. Running the test will also load the dependencies for the first time, so the calls after the initial call are quicker.
  1. Finally, you could push your code to a git repository or any source control system your organization uses.

Now, imagine what type of scenarios and possibilities the integration with Azure Functions gives to a user who knows PowerShell. In my next post, I will show something interesting regarding Power Platform and PowerShell, but hopefully this post already gives you ideas for developing useful services quickly.

Creating a Feedback Form Part 2 – Connecting the Flow

In my last post, I started to create a Feedback form using the new possibility to use PowerApps for SharePoint Online list form customization. The first part shows how to create separate forms for the View, Edit, and New actions. You should check that out first because we will extend that functionality in this post.

PowerApp Custom Forms and Flow – Creating a Feedback Form Part 1

In the old days, you probably created a workflow that sends an email when something is added to the list. This type of solution is completely valid, and you can still use traditional workflows even in Office 365. But the modern, better way to do similar things is to use Flow. For our Feedback form, I wanted to turn the given feedback items into tasks for our internal team that is building our intranet.

Let’s add a Flow to our custom form and create a new to-do item in Planner, which comes as a default service with each new Office 365 Group.

  1. From the Feedback list, click PowerApps -> Customize forms from the list’s action ribbon.
    1. This will open the custom forms application we created in Part 1 of this series.
  1. From the ribbon select Flows.
    1. This will open the Associated Flows panel.
  2. Select Create a new Flow.
  4. The Flow application will open in a new tab, and a new flow is created automatically.
    1. You can see that the flow is associated with PowerApps.
  1. What we want to do is add a new task to Planner, so let’s add an action to do that.
    1. Click New step -> Add an action.
    2. In the opened form, search for all Planner related actions by writing Planner in the search box.
    3. Select the first action, Planner – Create a task.
    4. This will add a new action to your flow. This action is used to create a new task in Planner.
  1. Now you need to connect to the correct plan and bucket where you want to add the tasks.
    1. For Plan Id, open the drop-down menu by clicking the down arrow in the field.
    2. This will open a form showing all available Planner plans.
    3. Select the one you want to use.
    4. Do the same thing for Bucket Id, except this time you will see all available buckets found in the plan you selected.
  2. For the task Title, let’s get the value from the user through PowerApps.
    1. Select the Title field and select dynamic content -> Ask in PowerApps.
    2. This will create a new PowerApps parameter that we need to populate in our custom form. We will come back to this later.
  1. You could give values for other fields too if you want.
    1. One option is to add the current date and time to the Start Date-Time field.
    2. Select the Start Date field and select dynamic content.
    3. Open the Expression tab and scroll until you see utcNow().
    4. Click utcNow and click OK.
  2. Now we have a flow that creates a new task in Planner and uses the title the user entered as the task title.

But we are also asking the user for some more information, so let’s add the value from the Description field to the task as well.

  1. Add a new step after the first action above and add a new action.
  2. From the action selection form, search for Planner again, but this time select Update task details.
  1. To update the new task we just created, select the Task Id field and Add dynamic content.
    1. From the list select Id.
    2. The Create a task action returns the details of the created task, and we can now use that info to find it and update the details.
  2. Next, select the Description.
    1. From the dynamic content menu, select Ask in PowerApps to get the value from our custom form.
  1. Finally, click the default name next to Flow name at the top of the form.
    1. Give the flow the name you want to use.
    2. Click Create flow.
  2. Then you can click Done, and our flow is complete.

At this point, we have a custom form and a flow that does the task creation. But we still need to combine the two so that right after a new feedback item is created, a new task is created automatically. Of course, we could attach the flow to the New Item Added event on the list, but for this example, we will add the flow straight to our custom form.

  1. Go back to the list and open the custom form PowerApp.
  2. Open the NewFormScreen we created in Part 1.
  3. From the screen, select CustomNewForm.
    1. Expand the Title and Description card details. You need to see the names of the fields on the cards later on.
  1. While the form is selected, choose the OnSuccess event from the attribute drop-down.
    1. This event is run every time a new item is added successfully to the list.
    2. By default, it includes two actions. One for clearing the form and another one to close the panel.
    3. Copy the current OnSuccess value and save it for later use.
  2. From the ribbon select Action -> Flows
    1. You should see the flow you created earlier.
    2. Select the flow, and it will be added as a task to the OnSuccess event.
  1. Now we need to provide the two parameters, Createatask_Title and Updatetaskdetails_Description, that we decided to ask from the app.
    1. We will connect the form fields to the Flow call to get the text the user has given.
  1. The field value reference can be done based on the field name on the card.
    1. The names depend on your environment, but in this example, they are DataCardValue2_1 for Description and DataCardValue1_1 for Title.
    2. With the name, you can refer to the Text value and use that in the Flow call.
  1. Finally, add the default actions back to the OnSuccess event so that the form is reset and closed when everything is done.
    1. Here’s the whole value used in the example: NewIntranetFeedback.Run(DataCardValue1_1.Text, DataCardValue2_1.Text); ResetForm(CustomNewForm); RequestHide()
  2. Now save and publish the app to SharePoint.

Navigate back to your list and add a new item. After saving, check from Planner (https://tasks.office.com/) that a new task has been added.

 

PowerApp Custom Forms and Flow – Creating a Feedback Form Part 1

Last month Microsoft announced a long-awaited functionality: you can now use PowerApps to customize the modern list forms in Office 365. Customizing the OOTB SharePoint list forms is something we have done for years already, and everyone has done those customizations in multiple different ways.

But when the new modern lists were announced, all of those old ways became useless. I do understand the shouts and anger that came out of this. Starting last month, First Release tenants have been able to use PowerApps and Flow to create custom forms. We can finally do something real with list forms.

Here are a few articles related to custom forms with PowerApps.

 

Over time, I have created many types of feedback forms in my SharePoint projects. The simplest solution is to use a SharePoint list and let the users send feedback with the OOTB form. But maybe there has been a need to give some instructions to the users or to notify someone when new feedback is created. To do this, you need to do some level of customization. Sometimes even some coding.

The best part of the new features in Office 365 is that changes can now be done without coding (almost, in many cases 😉 ). Even power users can create customizations with PowerApps and Flow. This is exactly what I’m about to show in this series.

My example form has the following requirements.

  1. Give a possibility to show some text and instructions to the users when they are giving feedback.
  2. Show the necessary fields for feedback gathering.
  3. Save the details to SharePoint list.
  4. Create a new task on a backlog list in Planner.
  5. Give a quick view and search for existing feedback items.

Custom List for Feedbacks

  1. Step one is the easiest one. Just create a new list on your SharePoint Online site.
    • Make sure that the list type is modern. I created the list on the new modern team site that comes along with an Office 365 Group.

Now add two new columns to the list. The quickest way to do this is by clicking the + sign in the view.

  1. Create the following fields:
    • Description – Multiple lines of text
    • Feature Type – Choice with values; New Feature, Question, Update, and Bug
  2. And now you have a simple feedback form ready to use! (If you prefer scripting, a PnP PowerShell sketch of the same setup follows below.)
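
As a side note, if you prefer scripting over clicking, a rough PnP PowerShell equivalent of the steps above could look like this (the site URL and internal field names are just examples):

#Connect to the target site and create the Feedback list with the two extra columns
Connect-PnPOnline -Url "https://yourtenant.sharepoint.com/sites/feedback" -Credentials (Get-Credential)

New-PnPList -Title "Feedback" -Template GenericList

Add-PnPField -List "Feedback" -DisplayName "Description" -InternalName "FeedbackDescription" -Type Note -AddToDefaultView
Add-PnPField -List "Feedback" -DisplayName "Feature Type" -InternalName "FeatureType" -Type Choice -Choices "New Feature","Question","Update","Bug" -AddToDefaultView
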

PowerApp for Custom Forms

Now, let’s start customizing the form. Open the PowerApps menu from the list toolbar and select Customize forms. This will open and create the PowerApps application for your list’s forms. Let’s take a look at a few things before we make any changes.

  1. Click the SharePointIntegration link on the left and select the Advanced tab on the right.
    • You can see the basic settings of the SharePoint integration from this screen.
    • DataSource is pointing to the list you created, and Action selection is pointing to custom forms.
  2. Notice that all Action events are pointing to only one form.
    • The different action types handle the data mode automatically, so calling your form with the ViewForm action will show the form in read-only mode.
    • One thing to remember here is that OOTB, only one form is used. If you make any changes, those changes will reflect on all view types.
  3. To solve this issue, we need to create a new screen and form for our custom form.

  4. I ended up creating only two screens, seen in the image above. One screen is for View and Edit, the other is for New item creation. You still have to update the values for each action setting in the SharePointIntegration. My settings are as follows (remember to use the names you set on your screens).
    • OnNew – Set(SharePointFormMode, "CreateForm"); NewForm(CustomNewForm); Navigate(NewFormScreen, ScreenTransition.None)
    • OnEdit – Set(SharePointFormMode, "EditForm"); EditForm(CustomViewForm); Navigate(ViewFormScreen, ScreenTransition.None)
    • OnView – Set(SharePointFormMode, "ShowForm"); ViewForm(CustomViewForm); Navigate(ViewFormScreen, ScreenTransition.None)
    • OnSave – If(SharePointFormMode="CreateForm", SubmitForm(CustomNewForm), If(SharePointFormMode="EditForm", SubmitForm(CustomViewForm)))
    • OnCancel – If(SharePointFormMode="CreateForm", ResetForm(CustomNewForm), If(SharePointFormMode="EditForm", ResetForm(CustomViewForm)))
  5. Now let’s make some changes to the New form. Delete the fields that are unnecessary, like the date fields.
    • Select the Title_DataCard from the CustomNewForm table and increase the height.
    • Lower the Title field and add two new Label controls at the beginning of the form.
    • Add the instructions in these new labels and make any other change you like.
    • You also have to change the mode of the Description box (DataCardValue2_1) to Multiline.
    • Now save and publish the forms.

Navigate to the Feedback list and try to create a new item. You should now see our custom form in use. Create one item and open it, and you should see a different form in use.


In the next part, we will add a Flow to send our feedback to Planner.

Part 2: https://mikkokoskinen.com/2017/12/06/creating-a-feedback-form-part-2-connecting-the-flow/