Project Oakdale and Azure API Management

Microsoft Dataverse for Teams and Azure API Management

I think that Microsoft Dataverse for Teams (previously known as Project Oakdale) is the most important Power Platform announcement of the year, especially if you are coming from a canvas apps background like me. Teams is becoming more and more a platform for the business rather than just a replacement for Skype. Now, with Microsoft Dataverse for Teams, users have a real data capability and a route to future upgrades by transferring their application on top of Common Data Service if needed. Teams and Microsoft Dataverse for Teams apps offer the needed simplicity to build those small or even large everyday business apps that truly matter to the users.

There are multiple posts and how-to guides for learning Microsoft Dataverse for Teams and canvas apps development, so there is no need to go deeper into that here. One thing that got my attention during Ignite was the announcement that Azure API Management can be used with Dataverse for Teams solutions through the existing Teams licensing!

This means that your professional developers can create API services to process data and connect to almost any enterprise service. The citizen or IT pro developers can then leverage those functionalities in their applications. Technically, these functions are published as custom connectors to the Power Platform environment related to Dataverse for Teams.

Earlier, this meant that you needed an extra license because a custom connector is a premium-level connector, but with Dataverse for Teams environments that is no longer needed. Let us see how to use this in action. Again, we can use something that is easy even for non-developers and create an Azure Function with PnP PowerShell (I wrote about this earlier).

Create Azure Function

Let us keep things simple and create an application that asks for some data from a user and then creates a new news page in SharePoint. We will also fetch some additional information from an “enterprise” service with a REST call during the creation process. The idea is to ask the user for the title and the body, fetch some data from the Bacon Ipsum service, and add it all to the news page.

The source code of the function can be found in my GitHub repository: PowerShellCore / CreateBaconPage. It is a lot easier to read the code from there, but I will cover the most important parts here.

When I am creating PowerShell scripts, I have a habit of using the following type of structure. I think it makes the code easier to read and maintain. A minimal skeleton of the structure is shown right after the list.

  1. Start by reading the Azure Function request parameters if developing a function.
  2. Set the main internal parameters used in the function, like the connection-related ones.
  3. Import the needed modules, if any.
  4. Then, inside the first try-catch, open the necessary connections, for example, to SharePoint.
  5. Then check that all mandatory parameters are available. I have a parameter called $haveMainParameters that I update while checking the other parameters.
  6. If everything is available and the connections are open, we can run the main section of the program.
  7. In the last section, I close all the connections.
  8. If developing a function, I push the return details so that the caller can continue its process.
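To make the structure concrete, here is a minimal skeleton of how these steps typically look in one of my HTTP-triggered functions. This is only a sketch: the application setting names (SP_SITE_URL, SP_USER, SP_PASSWORD) and the parameter names are placeholders, not something the final solution requires.

param($Request, $TriggerMetadata)

# 1. Read the Azure Function request parameters
$newsTitle = $Request.Query.newsTitle

# 2. Set the main internal parameters, like the connection details
$siteURL = $env:SP_SITE_URL   # hypothetical application setting

# 3. Import custom modules if needed
# Import-Module "$PSScriptRoot\CustomModules\EnterpiseAPI.psm1" -Force

# 4. Open the necessary connections inside the first try-catch
$spConn = $null
try {
    $securePass = ConvertTo-SecureString $env:SP_PASSWORD -AsPlainText -Force
    $credential = New-Object System.Management.Automation.PSCredential($env:SP_USER, $securePass)
    $spConn = Connect-PnPOnline -Url $siteURL -Credentials $credential -ReturnConnection
}
catch {
    Write-Error $_.Exception.Message
}

# 5. Check that all mandatory parameters and connections are available
$haveMainParameters = [bool]($spConn -and $newsTitle)

# 6. Run the main section only when everything is available
if ($haveMainParameters) {
    # ...main logic goes here...
}

# 7. Close the connections
if ($spConn) { Disconnect-PnPOnline -Connection $spConn }

# 8. Push the return details back to the caller (a concrete example is shown at the end of this walkthrough)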
  1. Create a new Azure Function with Visual Studio Code.
  2. We need to fetch three parameters from the request.
    • News title
    • Body of the news
    • Paragraph amount (int value) to be used in our service call to Bacon Ipsum
$newsTitle = $Request.Query.newsTitle
$newsBody = $Request.Query.newsBody
$meatParas = $Request.Query.meatParas
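If you also want the same values to work when they are posted in a JSON body (handy when testing locally with F5 or with Postman-style calls), you can fall back to the request body the same way the default HTTP trigger template does. A small sketch:

# Fall back to the request body when the query string does not contain a value
if (-not $newsTitle) { $newsTitle = $Request.Body.newsTitle }
if (-not $newsBody)  { $newsBody  = $Request.Body.newsBody }
if (-not $meatParas) { $meatParas = $Request.Body.meatParas }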
  3. Next, add the necessary parameters used to connect to SharePoint.
    • This time we need to authenticate against SharePoint with user credentials because you cannot create pages with an app-only connection.
    • Make sure to store the credentials securely. I used the application settings in this example, but Azure Key Vault is a better option.
  4. Now you can make a connection to SharePoint.
    • As a best practice, it is recommended to return the connection to a variable and use it in every PnP function call.
    • This helps you avoid mixing up connection contexts, which can happen when functions execute in parallel. When the contexts get mixed, you will see the error message ‘The object is used in the context different from the one associated with the object.’
$spConn = Connect-PnPOnline -Url $siteURL -Credentials $credential -ReturnConnection
  5. Now it is time to check that we have all the necessary parameters and connections available.
#Check the parameters necessary for the application
Write-Host " "
Write-Host "*Check the parameters necessary for the application"

If($spConn -and $newsTitle -and $newsBody -and $meatParas){
    #Parameters are available
    $haveMainParameters = $true
}
else {
    #Missing some parameters
    $haveMainParameters = $false
}
  6. If the parameters are OK, we can continue building the logic.
  7. First, let us create a basic page with a title and content.
    • As you can see, I like to write messages to the PowerShell host console a lot. I think this helps you with debugging and testing.
Write-Host " "
Write-Host "*Create the new page"

#Create basic section
Write-Host "..create basic section"

$newsPage = Add-PnPClientSidePage -Name $newsTitle -PromoteAs NewsArticle -Connection $spConn
Add-PnPClientSideText -Page $newsPage -Text $newsBody -Connection $spConn

#Connect to enterprise service
$baconText = GiveMeBacon -meatParas $meatParas #Save for later -meatType $meatType
  8. At this point, remember to save the function and test the logic by hitting F5.
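The last step of my script structure, pushing the return details, can be as simple as returning a status code and a small JSON payload that the caller (later, our Power App) can use. A sketch, assuming the default Response output binding of the HTTP trigger:

# Return the result so that the caller can continue its process
Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
    StatusCode = [System.Net.HttpStatusCode]::OK
    Body       = (@{ pageCreated = $true; newsTitle = $newsTitle } | ConvertTo-Json)
})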

Connecting to Enterprise Service

Now, let’s look at an example where we fetch some data from an enterprise service outside of Office 365 scope. As a test, let’s get some random text from the Bacon Ipsum service. We will do this by adding a custom module with the necessary logic into our Azure Function.

  1. I like to add a separate folder for my custom modules.
    • Add a CustomModules folder and an EnterpiseAPI.psm1 file into it.
  2. Inside the module, we only need to add one function, which is then exported for the function logic to use.
function GiveMeBacon{
    [CmdletBinding()]
    param
    (
        [Parameter(Mandatory = $true, HelpMessage="Amount of meat")]
        [string] $meatParas
    )

    #**Give me bacon from - baconipsum.com
    Write-Host "#*#Give me bacon from - baconipsum.com"

    $response = ""

    try {
        $queryURL = ("https://baconipsum.com/api/?type=all-meat&paras={0}&start-with-lorem=1&format=text" -f $meatParas)
        
        $response = Invoke-RestMethod -Uri $queryURL -ContentType "application/json; charset=utf-8" -Method Post -UseBasicParsing
    }
    catch {
        $ErrorMessage = $_.Exception.Message
        Write-Host "**ERROR: #*#Give me bacon"
        Write-Error $ErrorMessage
    }

    return $response
}

Export-ModuleMember -Function GiveMeBacon
  3. Here is a quick overview of the function:
    • First, read the parameter for the paragraph amount.
    • Then construct the URL we are calling and make a REST call against it.
    • As a response, we get a set of random text that we then return to the caller.
  4. Of course, in real life, connecting to an enterprise service is most likely a lot more complex, but this gives you an idea of how to build one.
  5. Next, let’s use this new logic in our original function. The first thing to do is to add a reference to the module.
    • Add the following lines at the beginning of your Azure Function somewhere after the input binding section.
#Get custom modules
$SP_ModulePath = $PSScriptRoot + "\CustomModules"
Import-Module "$SP_ModulePath\EnterpiseAPI.psm1" -Force
  6. Now we can extend the page creation by calling the enterprise function and adding the returned text into a separate section on the page. Add the following logic after the initial page creation section.
#Connect to enterprise service
$baconText = GiveMeBacon -meatParas $meatParas #Save for later -meatType $meatType

#Add related section
Write-Host "..add related section"

Add-PnPClientSidePageSection -Page $newsPage -SectionTemplate OneColumn -ZoneEmphasis 2 -Connection $spConn
Add-PnPClientSideText -Page $newsPage -Column 1 -Section 2 -Text "<h3>Related Info</h3>" -Connection $spConn

Add-PnPClientSideText -Page $newsPage -Column 1 -Section 2 -Text $baconText -Connection $spConn

#Publish the page
Write-Host "..publish the page"
Set-PnPClientSidePage -Identity $newsPage -Publish -Connection $spConn

Again, you can test the function to make sure the correct type of page gets created in SharePoint. When everything is working correctly, you can publish the function to Azure.

Microsoft Dataverse for Teams Application

Now let us go to Teams and open the Power Apps application so that we can create a Dataverse for Teams application. The application is simple, with some data fields and a button. You can see the structure in the image below.

  1. I added a variable called EnableSendBtn to the OnStart setting of the App (a one-line example is shown after the DisplayMode formula below). We will use this variable to enable and disable the Send button so that the form details are sent only when all the necessary details are given.
  2. I added the following elements to the screen.
    • A one-line text box for the title of the news page.
    • A multiline text box for the body.
    • A number field used to give the number of paragraphs fetched from our enterprise service.
  3. Here are a few important things to notice from my example:
    • Remember to give a unique name for each element on the screens. This will help you to build and maintain the logic.
    • First, we set EnableSendBtn to false to disable the button for the duration of the API call.
    • The last two sections enable the button again after the call and reset the form controls.
    • The DisplayMode setting of the Send button has the following logic:
If(
    EnableSendBtn And Not(IsBlank(txtNewsTitle.Value)) And Not(IsBlank(txtNewsBody.Value)) And Not(IsBlank(txtDetailsParagraph.Value)),
    DisplayMode.Edit,
    DisplayMode.Disabled
)
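
For reference, the OnStart setting mentioned in step 1 only needs to initialize the variable; for example:

Set(EnableSendBtn, true)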

But how to call the Azure Function we made earlier?

Configuring and Using Azure API Management

If you have not used or created API Management before, you can start exploring the service with this simple documentation: Quickstart – Create an Azure API Management instance | Microsoft Docs. I will cover the Project Oakdale related basic settings in the next steps. I assume that you already have the Azure Function you want to publish and an Azure API Management instance created.

  1. The documentation link above also has details on how to add your first API to the management instance.
  2. In this example, you need to add an Azure Function.
    • You will see a form that you can use to find the necessary API details.
    • Click Browse from the form.
  3. Next, click “Function App.”
    • You will see a list of available Azure Function Apps.
    • Select the one that holds the function you want to publish.
  4. A list of available functions is shown, and you can select those you want to publish.
    • Select the correct one and click Select.

  5. You will see the details of the function in a form.
    • I recommend giving meaningful names in the details because they will help you find and use the API in Project Oakdale.
    • Of course, these settings can also be updated later.
    • Finally, click Create.
  6. An important thing to notice here is that every Azure API Management API is protected with a subscription key by default. This key needs to be added to the API call, or otherwise the caller gets an access denied error.
    • It is possible to turn the key usage off, but then the whole API would be public, and we don’t want that.
    • You can find the keys from the Azure API Management portal under Subscriptions.
    • Copy the necessary key, like the built-in primary key, because we will need it on the Power Apps side. A quick way to test the API with the key is shown right after this step.
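Before wiring the API into Power Apps, you can verify the operation and the key with a quick test call, for example from PowerShell. This is only a sketch; the gateway URL and the operation path below are placeholders that you need to replace with the values of your own API Management instance:

# Test the published API; the subscription key goes into the default APIM header
$headers = @{ "Ocp-Apim-Subscription-Key" = "YOUR_KEY_GOES_HERE" }
$testUrl = "https://YOUR-APIM-NAME.azure-api.net/YOUR-API-PATH/CreateBaconPage?newsTitle=Test&newsBody=Hello&meatParas=2"

Invoke-RestMethod -Method Post -Uri $testUrl -Headers $headers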
  7. In our function, there are three attributes that we need from the users. Those won’t be asked for automatically unless we update the OpenAPI schema of the function and tell it what we need.
  8. Select the POST operation of our API and then click the edit link of the Frontend section.
    • We want to add new query parameters to the function.
    • You could write the OpenAPI settings manually, but using the editor makes your life a lot easier (a rough sketch of the result is shown after this step).
    • Create parameters for all the necessary attributes used in your function. In my case, I only needed string and integer type attributes.
    • Also, create one extra parameter for the subscription key, called ‘subscription-key’, type ‘String’.
    • Remember to save the changes.
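After saving, the added query parameters end up in the OpenAPI (Swagger) definition of the POST operation roughly like this. This is only a sketch with the parameter names of my function; yours may differ:

"parameters": [
  { "name": "newsTitle", "in": "query", "type": "string" },
  { "name": "newsBody", "in": "query", "type": "string" },
  { "name": "meatParas", "in": "query", "type": "integer" },
  { "name": "subscription-key", "in": "query", "type": "string" }
]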
  9. Now we can export the API. In the export, there is an option for Power Apps and Power Automate.
    • In the export form, you see a dropdown box to select the Power Platform environment where to publish the connector.
    • Ensure that you have at least one Power App created in Teams, because otherwise you will not see the environment in the list. Also, make sure to select the correct environment (been there, done that).
    • Give the connector a meaningful name and click Export.
  10. It will take a few minutes to publish the API fully, but at this point, you can go to Teams and open the Project Oakdale application.
    • If the app is already open, I recommend refreshing the browser.
  11. In the app, you need to add a new data connection.
    • Select Data from the left menu and click Add data.
    • Find the API with the name you gave during publishing and click it in the list.
    • A panel opens on the right, and you can click the Connect button on it.
    • At least in the current preview version of Project Oakdale, you will see a warning about the premium connection. Based on the Microsoft documentation, there is no need to take any extra action because of it.
  12. Now we are ready to use the API. Go to the OnSelect setting of the button in our app and start adding lines after the CALL AZURE API MANAGEMENT comment in our example.
    • Remember to select the POST operation of your function.
    • Then you need to give the custom parameters and associate the values with the form elements.
    • The final parameter, called ‘subscription-key’, is the subscription key copied earlier. Without this key in the query, your API call will not be processed.
  13. The final OnSelect logic of the Send button looks like this:
Set(
    EnableSendBtn,
    false
);
//CALL AZURE API MANAGEMENT
GiveMeBaconAPI.postcreatebaconpage(
    {
        newsTitle: txtNewsTitle.Value,
        newsBody: txtNewsBody.Value,
        meatParas: Value(txtDetailsParagraph.Value),
        'subscription-key': "YOUR_KEY_GOES_HERE"
    }
);
Set(
    EnableSendBtn,
    true
);
Reset(txtNewsTitle);
Reset(txtNewsBody);
Reset(txtDetailsParagraph);

PS. While writing this post, I have seen a couple of different setting options for the subscription key during the past week. This might be because Project Oakdale is still in preview. Here I used one of the currently working methods, but I will keep watching the progress and update my post if necessary.

  14. Now, we can save the application and test it in the preview window.

When everything goes as planned, a new page is created in SharePoint with some enterprise service data. You can now continue developing the Project Oakdale app further and publish it to the users. In case something goes wrong, you can check the possible errors on the Power Apps side after closing the preview window. You can also debug the Azure Function by opening the monitoring and making a test call from the Power App. At this point, you will thank yourself for writing enough messages to the host inside your code.

Save List Attachments to SharePoint Document Library

Is it possible to save list item attachments to the document library? The short answer is yes.

This question is really common, and as you might know, there isn’t an OOTB way to do it. Still, I do understand the need, because having the attachment documents in a SharePoint document library brings many benefits: search and filtering, metadata, collaboration, etc. I remember that when I was looking for a solution for this, I found many blogs about how to upload images to a SharePoint document library with Power Apps.

At the end of the day, the solution comes down to the point I made in my Tech Days 2020 session ‘The Right Tool for the Job – Combine Power Platform and SharePoint.’ Know your tools and use the right tool for the job. There is no need to read the attachment blobs in Power Apps and then upload the data somehow. Everything can be done with Power Automate, SharePoint, and some smart additions that extend the user experience with UI-only customizations.

Prepare the Library

There isn’t anything special you need on the list that has the attachments. For the document library, I like to add a field that links the documents to the item in the referring list. So, add a column called ParentID as metadata to the document library where you want to save the documents. Later, we will use this column to give the users easier access to the documents.
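
If you prefer scripting over clicking, the same column can also be added with a single PnP PowerShell call. A sketch, assuming you are already connected to the site and the library is called Documents:

# Library name "Documents" is an assumption, use your own target library
Add-PnPField -List "Documents" -DisplayName "ParentID" -InternalName "ParentID" -Type Text -AddToDefaultView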

Copy the Attachments

In this example, I have a simple list with a few columns and a canvas Power App with a simple form. The Save button uses the basic SubmitForm to save the given details. The attachment handling is done so that the end-user uses the default attachment field functionality to save the attachment documents to the list. Then we use a Power Automate flow to move them to the document library. There are a few ways to build the necessary Power Automate flow.

  1. You can build a flow that is triggered by the SharePoint When an item is created trigger.
    1. In this method, the user who is using the form will need edit permissions on the list
    2. The end-user doesn’t necessarily need permissions on the document library because that connection is handled with the account used to create the flow
    3. But in many cases, we want the end-users to have access to both the list and the document library
  2. You can initialize the flow from the Power App
    1. In this case, the user will need permissions to both the list and the document library

Let’s use option two here and initialize the flow straight from the Power App. From the ribbon in the Power Apps studio, select Action -> Power Automate -> Create a new flow. This will open a new Power Automate flow for you that is triggered by Power Apps.

  1. Add an Initialize variable action in the beginning
    1. Name can be, for example, ListItemID
    2. Type is Integer
    3. For the value, open the dynamic value panel and select Ask in PowerApps
    4. I also like to rename the action for easier recognition in later steps
  2. Then we can select a SharePoint action related to attachments
    1. There are multiple ones available, and we need to use Get attachments
    2. Connect the action to the correct SharePoint list
    3. The id parameter is coming from the variable initialized in the beginning
  3. Next, add an Apply to each action under the Control category
    1. For the output, select the Body from the Get attachments step
  4. Then add a Create file action under the SharePoint category
    1. Connect the step to the correct document library
    2. Set File name from the dynamic value panel and select the DisplayName coming from the Get attachments step
    3. File Content can also be set from the dynamic values, and you can use the AbsoluteUri parameter from the Get attachments step

Finally, remember to give a name to the flow and save it. I used the name ‘Save Attachments Demo.’ After this, we can go back to the Power App and connect the flow to the application. First, select the Save button and open the OnSelect property. Then from the ribbon, select Action -> Power Automate -> select the flow we just created. If you had something in the OnSelect property, you might lose what you had written, but with undo, you can get it back.

We need to modify the OnSelect property of the button with the following functions.

SubmitForm(frmMyReport);

SaveAttachmentsDemo.Run(frmMyReport.LastSubmit.ID);

  • SubmitForm is used to send the form details to the SharePoint list
  • Then we call the flow and give the necessary details to copy the attachments

The forms in Power Apps have some data state-related properties (LastSubmit, Unsaved, Updates) that we can use in the process. The LastSubmit property holds all the fields and their values of the last submitted form. We send the list item ID parameter to Power Automate. This ID is then saved in the ListItemID variable at the beginning of the flow. Test the form with a few attachments, and after the submission, the flow should copy the list attachments to the document library.

Finetuning

At this point, we can copy the attachments to the document library, but this leaves us with two copies in different places, which is not ideal. Let us make some additions to the flow to overcome this.

The next step is to add the list item ID as metadata to the document. This can be done with the Update file properties action. The correct file is found with the dynamic ID fetched from the Create file action. Set the ParentID to the value in the ListItemID variable.

In the final step, we delete the attachment from the list item. This can be done with the Delete attachment action. Point the action to the list and get the item based on the ListItemID variable. The File Identifier is fetched from the Get attachments action.

Save the flow and make a test run with the app. Now we have a solution where the user can give list details and documents with a Power Apps form. The form details are saved in the SharePoint list, and the documents are saved in a document library.

Linking the Details

So, what is the benefit of saving the id of the list item as metadata for the documents? With the id, we can build an easy functionality where users can access the desired documents straight from the list.

  1. Add a new text column to the list
    1. I like to name the column as Link to Documents
  2. Go to the document library, filter the list based on the ParentID field.
  3. Next, open the column formatting setting of the Link to Documents field
  4. Paste the following JSON to the column formatting field.

{
  "$schema": "https://developer.microsoft.com/json-schemas/sp/column-formatting.schema.json",
  "elmType": "a",
  "txtContent": "Show Documents",
  "attributes": {
    "target": "_blank",
    "href": "='LINK TO YOUR LIBRARY/Forms/AllItems.aspx?FilterField1=ParentID&FilterValue1=' + [$ID]"
  },
  "style": {
    "border": "none",
    "background-color": "transparent",
    "cursor": "pointer",
    "color": "Blue"
  }
}

  5. Replace ‘LINK TO YOUR LIBRARY’ with a URL pointing to your document library.
    1. This link in the JSON is pointing to a filtered view of the document library
    2. The FilterValue1 is set dynamically based on the ID of each list row ([$ID])
  6. Save the formatting and close the formatting panel

Now we have an easy way to navigate to the documents straight from the list. When you click the Show Documents link in the list, you will be directed to the document library, and you will see only the documents related to the list item. More information on column formatting, like using icons, can be found here: https://github.com/SharePoint/sp-dev-list-formatting

Cascading Fields in Power Apps Forms – Basics and Few Tips

As far as I remember, customizing SharePoint forms and cascading fields have been among the main things organizations have asked for. I have done these customizations for classic and modern SharePoint with InfoPath, custom solutions, and other form solutions. Now we can use Power Apps in many cases, and this is no exception. Many app creators are wondering how to make these customizations, for example, how to create cascading fields.

A full walkthrough of custom forms would need a separate series of posts, so I am concentrating on cascading fields and a small form example here. Of course, with Power Apps, it’s not about SharePoint only, and the following techniques can also be used against other data connectors. Here is my typical way of implementing cascading fields in Power Apps. I’ve used this approach against small SharePoint lists and in large multi-field business scenarios.

Main Requirements

  1. We have a SharePoint list that has a Type field
  2. We want to be able to select the type from a range of values that can be categorized into multiple different categories
  3. We want to give the administrators an easy way to maintain these values, so they should be placed in SharePoint lists
    • There are two SharePoint lists: Main Category and Sub Category
    • Sub Category has a lookup column pointing to the Main Category

In the custom form, the users first select the category, and then they can select a value belonging to this category from the second drop-down box.

Basic Form With Cascading Fields

Let’s create a simple canvas app for our form. The form only has a text box for the title, two drop-down boxes for selecting the categories, and a submit button. You also need to connect the app to all three lists in SharePoint.

Even though this is a simple form, remember the naming conventions. Making a habit of renaming the controls and screens will help you in the future. You can also use grouping, as I did for the labels, to make it easier to find important elements in the tree view.

  1. Add the necessary labels, one text input, and two drop-down boxes
  2. Select the Items property of the first drop-down (ddpMainCategory in the example)
  3. Connect the items to the Main Category list
  4. Make sure the Value property is set to Title
  5. Do the same thing for the other drop-down (ddbSubCategory) but connect it to the Sub Category list

The basic setup is now done, but we are missing the cascading part. The category selection doesn’t filter the value list. To make this happen, we need to change the Items setting of the second drop-down.

In the Items setting of ddbSubCategory, we need to filter the items based on the selection in the other drop-down box:

Filter('Sub Category','Main Category'.Value = ddpMainCategory.Selected.Title) 

After this, we have the basic cascading fields ready.

Saving the Details

In the SharePoint list, we only have the Title and Type Value columns, and the type values are now read from a different SharePoint list. Therefore, we didn’t use a Power Apps form element, where we would have the SubmitForm action available. To save the details from our custom form, we need to use the Patch function. Add the following commands to the OnSelect event of the Save button. Again, remember to comment your code. It helps, I promise.

These functions save the details to SharePoint based on what the user selected from the drop-downs and reset the form for the next entry. The idea of the Patch command is:

//Save details to SharePoint
Patch('Important List',
Defaults('Important List'),
    {
        Title: txtTitle.Text,
        'Type Value': ddbSubCategory.Selected.Title
    }
);

//Reset form element
Reset(txtTitle);
Reset(ddpMainCategory);
Reset(ddbSubCategory);
  • First, ‘Important List’ is the name of the data source against which we are running the Patch command
  • The Defaults function tells Patch to create a new item in the list
  • Next, we are adding the values from the form into the SharePoint columns of the new item
  • The value from the second drop-down box can be read from its Selected property
  • More info from the Patch – function: https://docs.microsoft.com/en-us/powerapps/maker/canvas-apps/functions/function-patch

Cascading Fields – Next Version

You might notice a couple of things in this way of building cascading drop-downs. Updating the ddbSubCategory field might take some time after we change the value in ddpMainCategory, and the first item in the drop-down box is selected automatically. In many situations, we want users to see an empty value in the drop-down and take action to choose the one they need.

  1. To tackle these issues, we will read the drop-down selection values into collections at app start
  2. Open the App OnStart setting and add the following functions to save the selection items from SharePoint into two collections
//Main Categories to collection
ClearCollect(MainCatSelection,"");
Collect(MainCatSelection,'Main Category');

//Sub Categories to collection
ClearCollect(SubCatSelection,"");
Collect(SubCatSelection,'Sub Category');
  3. First, we clear the possible old collections and add one empty row into them
  4. Then we collect the SharePoint rows from the lists and save them to the collections
  5. Next, we need to update the Items settings of the drop-downs
    1. ddpMainCategory = MainCatSelection
    2. ddbSubCategory = Filter(SubCatSelection, IsBlank(Title) Or 'Main Category'.Value = ddpMainCategory.Selected.Title)
  6. In ddbSubCategory, we filter the collection so that the first blank item is included together with all the rows that have the selected category value in the ‘Main Category’ field

Click the Run OnStart action for the App to populate the collections for the test. Now we have cascading drop-downs that are fast to use and show an empty value before the user makes a selection.

PowerApp Custom Forms and Flow – Creating a Feedback Form Part 1

Last month Microsoft announced a long-awaited functionality where you can use PowerApps to customize the modern list forms in Office 365. Customizing the OOTB SharePoint list forms is something we have done for years already, and everyone has done the customizations in multiple different ways.

But when the new modern lists were announced, all of those old ways became useless. I do understand the shouts and anger that came out of this. Starting from last month, First Release tenants have been able to use PowerApps and Flow to create custom forms. We can finally do something real with list forms.

Here are a few articles related to custom forms with PowerApps.

 

Over time, I have created many types of feedback forms in my SharePoint projects. The simplest solution is to use a SharePoint list and let the users send feedback with the OOTB form. But maybe there has been a need to give some instructions to the users or to notify someone when new feedback is created. To do this, you need some level of customization. Sometimes even some coding.

The best part of the new features in Office 365 is that the changes can now be done without coding (well, almost, in many cases 😉 ). Even power users can create customizations with PowerApps and Flow. This is exactly what I’m about to show in this series.

My example form has the following requirements.

  1. Give the possibility to show some text and instructions to the users when they are giving feedback.
  2. Show the necessary fields for feedback gathering.
  3. Save the details to a SharePoint list.
  4. Create a new task on a backlog list in Planner.
  5. Give a quick view and search for existing feedback items.

Custom List for Feedbacks

  1. Step one is the easiest one. Just create a new list on your SharePoint Online site.
    • Make sure that the list type is modern. I created the list on a new modern team site that comes along with an Office 365 Group.

Now add two new columns to the list. The quickest way to do this is by clicking the + sign on the view.

  2. Create the following fields
    • Description – Multiple lines of text
    • Feature Type – Choice with the values New Feature, Question, Update, and Bug
  3. And now you have a simple feedback form ready to use! If you prefer scripting the columns, see the note right after this list.
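
As a side note, the same columns can also be created with PnP PowerShell if you want to script the list setup. A sketch, assuming the list is called Feedback and you are already connected to the site:

# List name "Feedback" and the internal names are assumptions, use your own values
Add-PnPField -List "Feedback" -DisplayName "Description" -InternalName "FeedbackDescription" -Type Note -AddToDefaultView
Add-PnPField -List "Feedback" -DisplayName "Feature Type" -InternalName "FeatureType" -Type Choice -Choices "New Feature","Question","Update","Bug" -AddToDefaultView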

PowerApp for Custom Forms

Now, let’s start customizing the form. Open the PowerApps menu from the list toolbar and select Customize forms. This will open and create the PowerApps application for your list’s forms. Let’s take a look at a few things before we make any changes.

  1. Click the SharePointIntegration link on the left and select the Advanced tab on the right.
    • You can see the basic settings of the SharePoint integration on this screen.
    • DataSource is pointing to the list you created, and the Action selections are pointing to custom forms.
  2. Notice that all Action events are pointing to only one form.
    • The different action types handle the data mode automatically, so calling your form with the ViewForm action will show the form in read-only mode.
    • One thing to remember here is that out of the box, only one form is used. If you make any changes, these changes will be reflected in all view types.
  3. To solve this issue, we need to create a new screen and form for our custom form.

  4. I ended up creating only two screens, seen in the image above. One screen is for View and Edit, and the other is for new item creation. You still have to update the values for each action setting in the SharePointIntegration. My settings are as follows (remember to use the names you set on your screens).
    • OnNew – Set(SharePointFormMode, "CreateForm"); NewForm(CustomNewForm); Navigate(NewFormScreen, ScreenTransition.None)
    • OnEdit – Set(SharePointFormMode, "EditForm"); EditForm(CustomViewForm); Navigate(ViewFormScreen, ScreenTransition.None)
    • OnView – Set(SharePointFormMode, "ShowForm"); ViewForm(CustomViewForm); Navigate(ViewFormScreen, ScreenTransition.None)
    • OnSave – If(SharePointFormMode="CreateForm", SubmitForm(CustomNewForm), If(SharePointFormMode="EditForm", SubmitForm(CustomViewForm)))
    • OnCancel – If(SharePointFormMode="CreateForm", ResetForm(CustomNewForm), If(SharePointFormMode="EditForm", ResetForm(CustomViewForm)))
  5. Now let’s make some changes to the New form. Delete the unnecessary fields, like the date fields.
    • Select the Title_DataCard from the CustomNewForm table and increase the height.
    • Lower the Title field and add two new Label controls at the beginning of the form.
    • Add the instructions in these new labels and make any other change you like.
    • You also have to change the mode of the Description box (DataCardValue2_1) to Multiline.
    • Now save and publish the forms.

Navigate to the Feedback list and try to create a new item. You should now see our custom form in use. Create one item, open it, and you should see a different form in use.

In the next part, we will add a Flow to send our feedback to Planner.

Part 2: https://mikkokoskinen.com/2017/12/06/creating-a-feedback-form-part-2-connecting-the-flow/