
PCF Dataset Paging in Model vs Canvas Apps


One of the recent additions to PCF for Canvas Apps is the ability to bind dataset PCF controls to datasets in a Canvas App. A challenge that faces all PCF developers is whether their control should support both Model AND Canvas – with this in mind, you need to be aware of the differences in the way that data is paged.

This post demonstrates how the paging API works in Model and Canvas and highlights the differences. In my tests, I used an entity that had 59 records and spanned 3 pages of 25 records per page.

loadNextPage/loadPreviousPage

There are two ways of paging through your data:

  1. Incrementally load the data using loadNextPage
  2. Page the data explicitly using loadExactPage

In Model Apps, when you call loadNextPage, the next page of data is added on top of the existing dataset.sortedRecordIds – whereas in Canvas, you get a reset set of records that contains just the page you have loaded.

This is important if your control aims to load all records incrementally or uses some kind of infinite scrolling mechanism.

This is how nextPage/previousPage works in Canvas Apps

This is how nextPage/previousPage works in Model Apps

Notice how the totalRecordsLoaded increases with each page for Model, but for Canvas it shows only the number of records on that page.
You might think that this approach would be more efficient because it uses the fetchXml paging cookie - but from what I can see it doesn't appear to be any different from simply specifying the page in the fetchXml, and it has the same performance as loadExactPage...
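
To illustrate the difference, here is a minimal sketch of incrementally loading every record from updateView. It assumes a dataset property named dataset and a class field allRecordIds initialised to an empty array (both assumptions for illustration); on Canvas you have to accumulate the record ids yourself because sortedRecordIds is reset with each page:

public updateView(context: ComponentFramework.Context<IInputs>): void {
    const dataset = context.parameters.dataset;
    if (dataset.loading) return;

    // Model Apps accumulate records in sortedRecordIds as pages load,
    // but Canvas Apps only expose the current page - so keep our own list
    for (const id of dataset.sortedRecordIds) {
        if (this.allRecordIds.indexOf(id) === -1) {
            this.allRecordIds.push(id);
        }
    }

    if (dataset.paging.hasNextPage) {
        // Request the next page - updateView will be called again when it arrives
        dataset.paging.loadNextPage();
    }
}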

loadExactPage

When you want to show a specific page – jumping over other pages without loading them, you can use ‘loadExactPage’. This method is not currently documented – but it is mentioned by the PCF team in the forums
This method will load the records for the specific page and so dataset.sortedRecordIds will only contain that page – this is the same on both Canvas and Model!

Notice that if you load a specific page, hasNextPage and hasPreviousPage are updated to indicate whether you can move backwards or forwards. This would only help when using loadExactPage in Model Apps, because when using loadNextPage in Model Apps you will never get hasPreviousPage == true – you are loading all the records incrementally rather than a specific page.

This is how loadExactPage works in Canvas Apps

This is how loadExactPage works in Model Apps

Notice total records loaded shows only the number of records in that page.
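
As a rough sketch of how a pager could use this – currentPage is a class field the control keeps itself, and because loadExactPage is undocumented the cast below guards against it being missing from your typings:

public goToPage(dataset: ComponentFramework.PropertyTypes.DataSet, page: number): void {
    // loadExactPage is not (yet) documented, so cast in case your typings do not include it
    const paging = dataset.paging as unknown as {
        hasNextPage: boolean;
        hasPreviousPage: boolean;
        loadExactPage(pageNumber: number): void;
    };
    if (page > this.currentPage && !paging.hasNextPage) return;
    if (page < this.currentPage && !paging.hasPreviousPage) return;
    paging.loadExactPage(page);
    this.currentPage = page;
}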

totalResultCount

This property should give you the total number of records in the current dataset – however, in Canvas it only gives you the number of records that have been loaded via the paging methods. If you look at the comparisons above, you’ll see that the Canvas totalResultCount goes up with each page, but in Model, it remains the total record count.
Interestingly, this property is not actually documented – however it’s in the TypeScript definitions.
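
In practice this means a 'records loaded of total' label has to treat the two hosts differently – a small sketch, assuming dataset is the bound dataset property:

const loadedCount = dataset.sortedRecordIds.length;
// Model: the total record count across all pages. Canvas: only the records loaded so far.
const totalCount = dataset.paging.totalResultCount;
const label = `${loadedCount} of ${totalCount} records`;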

The Future

It’s not clear if we will see a completely unified experience between Canvas and Model with PCF controls – but I’ll update this post if anything changes!


PCF DateTimes – the saga continues!


It's been over a year since I last blogged about DateTimes and nearly a decade since my first post on the subject, CRM DateTimes – so it’s well overdue that I update you on how DateTimes work with PCF.

My last post on the subject was when the ‘Timezone independent’ and ‘Date Only’ behaviours were introduced – DateTimes - It’s never the last word.

This made the time zone handling of dates much easier if you needed to store absolute date/times – however, there are always times where you need to store a date that is dependent on the user’s time zone (e.g. the date/time a task is completed, etc.)

In PCF, it would have been nice if the time zone element of the date was handled for us – but unfortunately not!

There are 3 places where we have to consider datetime behaviours in PCF:

  • Field Controls

    • Inbound dates - When PCF calls updateView()

    • Outbound dates - When PCF calls getOutputs()

  • Dataset Controls - Inbound dates

Field Controls - Inbound dates

When PCF passes our component a date as a bound property via the updateView method, the date is provided as a formatted date string and also as a raw Date object.

I have a record with the dateAndTimeField property bound to a DateTime field that has the User Local DateTime behaviour.

I can get the two values as follows:

  • Raw - parameters.dateAndTimeField.raw

  • Formatted - parameters.dateAndTimeField.formatted

There are two time zones I can vary: the CDS User Settings time zone (I have it set to GMT+8) and my local browser time zone. In the following table, I vary the browser time zone and keep the CDS time zone constant.

The formatted date is formatted using my CDS user settings – YYYY/MM/DD HH:mm

Local Time Zone   GMT                             GMT-3                           GMT+8
CDS UTC           2020-05-10T04:30:00Z            2020-05-10T04:30:00Z            2020-05-10T04:30:00Z
Raw               2020-05-10 05:30:00 GMT+0100    2020-05-10 02:30:00 GMT-0200    2020-05-10 12:30:00 GMT+0800
Formatted         2020/05/10 12:30                2020/05/10 12:30                2020/05/10 12:30

You’ll notice that the formatted time is still 12:30 because it’s showing the CDS UTC+8 date – changing my local time zone shouldn’t change this. However, the Raw date changes with the browser time zone because it is converted to my local browser time zone, and what makes it more complex is that daylight savings is also added depending on the date in the year. JavaScript dates are awkward like this. Although the date is set to the UTC date by PCF – it is provided in the local time zone.

So why not use the formatted date?

To work with the date value (bind it to a calendar control etc.) we need it in the user’s CDS local time zone – the value shown by the formatted date. If we are just showing the date and not editing it, then the formatted string is the way to go. However, if we want to edit the date, then we need to convert it to a Date object. This could be done by parsing the formatted date, but that would require us to understand all the possible date formats that CDS has in the user settings. Instead we can simply apply the following logic:

  1. Convert to UTC to remove the browser time zone offset:

// Usage: const utcDate = this.getUtcDate(rawDate);
getUtcDate(localDate: Date): Date {
    return new Date(
        localDate.getUTCFullYear(),
        localDate.getUTCMonth(),
        localDate.getUTCDate(),
        localDate.getUTCHours(),
        localDate.getUTCMinutes(),
    );
}
 
  2. Apply the user’s time zone offset. This requires access to the user’s time zone settings - luckily they are loaded for us in the PCF context:

convertDate(value: Date): Date {
    const offsetMinutes = this.context.userSettings.getTimeZoneOffsetMinutes(value);
    const localDate = this.addMinutes(value, offsetMinutes);
    return this.getUtcDate(localDate);
}
addMinutes(date: Date, minutes: number): Date {
    return new Date(date.getTime() + minutes * 60000);
}
 

This will now give us a Date that represents the correct Datetime in the browser local time zone - and can be used as a normal date!

Because some dates can be set as time zone independent, we can conditionally run this logic depending on the metadata provided:

convertToLocalDate(dateProperty: ComponentFramework.PropertyTypes.DateTimeProperty) {
    if (dateProperty.attributes?.Behavior == DateBehavior.UserLocal) {
        return this.convertDate(dateProperty.raw);
    } else {
        return this.getUtcDate(dateProperty.raw);
    }
}
 

We still need to convert to UTC even if the date is time zone independent - this is to remove the correction for the browser timezone.

Field Controls - Outbound dates

Now we have a date time that is corrected for our local browser time zone, we can simply return the Date object from inside the getOutputs().
So if we wanted to set 12:30 - and our browser timezone is set to GMT-3 (Greenland) - then the date will actually be: 12:30:00 GMT-0200 (West Greenland Summer Time)
PCF ignores the timezone part of the date and then converts the date to UTC for us.

NOTE: It does seem odd that we have to convert to local inbound - but not back to UTC outbound.
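
For example, a getOutputs for the field above could be as simple as the following sketch, where this.selectedDate holds whatever local Date the control is currently editing (an assumption – your control will track this however it likes):

public getOutputs(): IOutputs {
    // Return the local Date - PCF ignores the time zone part and converts back to UTC
    return {
        dateAndTimeField: this.selectedDate,
    };
}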

Dataset controls - inbound dates

There are two notable differences when binding datasets to tables in PCF compared to the inbound values in their field counterparts.

  1. Dates that are provided by a dataset control binding are similar in that they are provided in the browser timezone - however they are strings and not Date objects.
  2. There is no information on the UserLocal/Timezone independent behaviour - and so we need to know about this in advance.

So as before, when binding to a datagrid, it’s easiest to use the formatted value:
item.getFormattedValue("dateAndTimeField")

If you need the Date object to edit the value - then you’ll need to convert to the local date as before - but with the added step of converting to a Date object:

const dateValue = item.getValue("dateAndTimeField") as string;
const localDate = this.convertDate(new Date(dateValue));
 

This isn’t going to be the last I write on this subject, I’m sure of it! Anything that involves time zones is always tricky!
@ScottDurow

PCF DetailsList Layout with Fluent UI and Sticky


One of the challenges with PCF controls is getting them to reflow so that they stretch to fill the available space. Doing this using standard HTML involves using flexbox. The really nice aspect of the Fluent UI React library is that it comes with an abstraction of the flexbox called the ‘Stack’.

The aim of this post is to layout a dataset PCF as follows:

  • Left Panel - A fixed width vertical stack panel that fills 100% of the available space
  • Top Bar - A fixed height top bar that can contain a command bar etc.
  • Footer - A centre aligned footer that can contain status messages etc.
  • Grid - a DetailsList with a sticky header that occupies 100% of the middle area.

The main challenges of this exercise are:

  1. Expanding the areas to use 100% of the container space - this is done using a combination of verticalFill and height:100%
  2. Ensure that the DetailsList header row is always visible when scrolling - this is done using the onRenderDetailsHeader event of the DetailsList in combination with Sticky and ScrollablePane
  3. Ensure that the view selector and other command bar overlays appear on top of the sticky header.
    This requires a bit of a ‘hack’ in that we have to apply a z-index css rule to the Model-Driven overlays for the ViewSelector and the Command Bar flyoutRootNode. If this is not applied then flyout menus will show behind the sticky header:

Here is the React component for the layout:

/* eslint-disable @typescript-eslint/no-non-null-assertion */
/* eslint-disable @typescript-eslint/explicit-function-return-type */
import * as React from "react";
import {
  Stack,
  ScrollablePane,
  DetailsList,
  TooltipHost,
  IRenderFunction,
  IDetailsColumnRenderTooltipProps,
  IDetailsHeaderProps,
  StickyPositionType,
  Sticky,
  ScrollbarVisibility,
} from "office-ui-fabric-react";

export class DatasetLayout extends React.Component {
  private onRenderDetailsHeader: IRenderFunction<IDetailsHeaderProps> = (props, defaultRender) => {
    if (!props) {
      return null;
    }
    const onRenderColumnHeaderTooltip: IRenderFunction<IDetailsColumnRenderTooltipProps> = tooltipHostProps => (
      <TooltipHost {...tooltipHostProps} />
    );
    return (
      <Sticky stickyPosition={StickyPositionType.Header} isScrollSynced>
        {defaultRender!({
          ...props,
          onRenderColumnHeaderTooltip,
        })}
      </Sticky>
    );
  };
  private columns = [
    {
      key: "name",
      name: "Name",
      isResizable: true,
      minWidth: 100,
      onRender: (item: string) => {
        return <span>{item}</span>;
      },
    },
  ];
  render() {
    return (
      <>
        <Stack horizontal styles={{ root: { height: "100%" } }}>
          <Stack.Item>
            {/*Left column*/}
            <Stack verticalFill>
              <Stack.Item
                verticalFill
                styles={{
                  root: {
                    textAlign: "left",
                    width: "150px",
                    paddingLeft: "8px",
                    paddingRight: "8px",
                    overflowY: "auto",
                    overflowX: "hidden",
                    height: "100%",
                    background: "#DBADB1",
                  },
                }}
              >
                <Stack>
                  <Stack.Item>Left Item 1</Stack.Item>
                  <Stack.Item>Left Item 2</Stack.Item>
                </Stack>
              </Stack.Item>
            </Stack>
          </Stack.Item>
          <Stack.Item styles={{ root: { width: "100%" } }}>
            {/*Right column*/}
            <Stack
              grow
              styles={{
                root: {
                  width: "100%",
                  height: "100%",
                },
              }}
            >
              <Stack.Item verticalFill>
                <Stack
                  grow
                  styles={{
                    root: {
                      height: "100%",
                      width: "100%",
                      background: "#65A3DB",
                    },
                  }}
                >
                  <Stack.Item>Top Bar</Stack.Item>
                  <Stack.Item
                    verticalFill
                    styles={{
                      root: {
                        height: "100%",
                        overflowY: "auto",
                        overflowX: "auto",
                      },
                    }}
                  >
                    <div style={{ position: "relative", height: "100%" }}>
                      <ScrollablePane scrollbarVisibility={ScrollbarVisibility.auto}>
                        <DetailsList
                          onRenderDetailsHeader={this.onRenderDetailsHeader}
                          compact={true}
                          items={[...Array(200)].map((_, i) => `Item ${i + 1}`)}
                          columns={this.columns}
                        />
                      </ScrollablePane>
                    </div>
                  </Stack.Item>
                  <Stack.Item align="center">Footer</Stack.Item>
                </Stack>
              </Stack.Item>
            </Stack>
          </Stack.Item>
        </Stack>
      </>
    );
  }
}

Here is the css:

div[id^="ViewSelector"]{
    z-index: 20;
}
#__flyoutRootNode .flexbox {
    z-index: 20;
}

Hope this helps!

@ScottDurow

Always be linting your TypeScript!


Linters have been around for ages - it all started back in 1978 apparently - but they have now become a mainstay of modern JavaScript and TypeScript programming.

Writing code without a linter is like writing an essay without using a spell checker! Sure, there may be some superhumans who can write their code perfectly without linting - but I’m not one of them!

Much has been written about linting since 1978 and there are plenty of opinions! For me there are two parts:

  1. Enforcing semantic code rules such as not using var in TypeScript or using let when it could be const because the value doesn’t change. These rules are designed to help you trap bugs as early as possible and enforce best practices.
  2. Formatting rules - such as not mixing tabs and spaces and adding spaces before and after keywords.

For TypeScript, we can enforce rules using eslint - and automatically format our code using prettier.
There are a whole raft of style rules that then can be applied for different libraries such as react.

This post shows you how to setup linting quickly and easily for a TypeScript PCF project that uses React.

Create your PCF project

Create your PCF project using your CLI/IDE of choice. I use:

pac pcf init --namespace dev1 --name pcflint --template field
npm install react react-dom @fluentui/react
yo pcf --force

Install ESLint, Prettier and the plugins

Prettier is great for formatting your code, but doesn’t really do any of the semantic code checks. So the configuration we are going to create uses prettier as a plugin from within eslint. This means that when you run eslint, not only will it warn about and attempt to fix semantic issues, it’ll also tidy up the formatting for you using prettier.

npm install eslint --save-dev

You can use the bootstrapper if you want - but this can lead to a configuration that you don’t really want:

npx eslint --init
  2. Next up is installing prettier (https://prettier.io/docs/en/install.html):
npm install --save-dev --save-exact prettier

We use the --save-exact as recommended by the project because sometimes formatting rules can change slightly and you don’t suddenly want your source control diffs to include formatting differences.

  3. Now install the plugins and configurations needed for our rules:
npm install --save-dev @typescript-eslint/eslint-plugin @typescript-eslint/parser eslint-plugin-react eslint-config-prettier eslint-plugin-prettier
  4. Next we configure eslint to call prettier when it is run (https://prettier.io/docs/en/integrating-with-linters.html) - this uses eslint-plugin-prettier.
    Create a file named .eslintrc.json:
{
    "parser": "@typescript-eslint/parser",
    "env": {
        "browser": true,
        "commonjs": true,
        "es6": true,
        "jest": true,
        "jasmine": true
    },
    "extends": [
        "plugin:@typescript-eslint/recommended",
        "plugin:prettier/recommended",
        "plugin:react/recommended",
        "prettier",
        "prettier/@typescript-eslint",
        "prettier/react"
    ],
    "parserOptions": {
        "project": "./tsconfig.json"
    },
    "settings": {
        "react": {
          "pragma": "React",
          "version": "detect"
        }
      },
    "plugins": [
        "@typescript-eslint",
        "prettier"
    ],
    "rules": {
        "prettier/prettier": "error"
    },
    "overrides": [
        {
          "files": ["*.ts"],
          "rules": {
            "camelcase": [2, { "properties": "never" }]
          }
        }
      ]
}

Note:

  1. There is an override rule to allow non-camelcase property names since we often use Pascal-cased SchemaNames from CDS.
  2. There is support for jest and jasmine tests.

Now configure the prettier rules by creating a file called .prettierrc.json

{
  "semi": true,
  "trailingComma": "all",
  "singleQuote": false,
  "printWidth": 120,
  "tabWidth": 2,
  "endOfLine":"auto"
}
 

Let the magic happen!

There are two ways to get eslint to do its job:

  1. Run from the command line
  2. Use a VSCode extension.

Note: Both approaches require you to have set up eslint and prettier already.

Run from the command line:

  1. You will need to install eslint globally:
npm install -g eslint
  2. After that you can add a script to your package.json:
"scripts": {
  ...
  "lint": "eslint ./**/*.ts --fix"
},
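
With that script in place, you can lint (and auto-fix) the whole project from the command line:

npm run lint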

Run from inside VSCode

This is my day-to-day use of eslint.

  1. Install the eslint VSCode extension - https://github.com/Microsoft/vscode-eslint
  2. lint issues will show up via a code-lens - the details show up using Ctrl-.
  3. You can auto-format your code using Alt-SHIFT-P

I really recommend getting linting into your workflow early on – because you don’t want to enable it later and then find you have 1000’s of issues to wade through!
@ScottDurow

 

Pets vs. Cattle – How to manage your Power App Environments


A situation I see very frequently is where there is a ‘special’ PowerApps environment that holds the master unmanaged customizations. This environment is carefully looked after for fear of losing the ability to deploy updates to production, since with managed solutions you can’t re-create your unmanaged development environment. Sometimes, a new partner starts working with a customer only to find that they have managed solutions in production with no corresponding unmanaged development environment.

I’m not getting into the managed/unmanaged debate – but let’s assume that you are following the best practices outlined by the PowerApps team themselves: “Managed solutions are used to deploy to any environment that isn't a development environment for that solution” [1][2]

There is a phrase that I often use (adapted from its original use [3]):

“Treat your environments like cattle, not pets”

This really resonates with the new PowerApps environment management licensing where you pay for storage and not per-environment. You can create and delete environments (provided you are not over DB storage capacity) with ease.

If you store your master unmanaged solution in an environment – and only there – then you will start to treat it like a pet. You’ll stroke it and tend to its every need. Soon you’ll spend so much time on pet-care that you’ll be completely reliant on it, but it’ll also be holding you back.

There is another principle I am very vocal about:

“Everything as code”

This is the next logical step from “Infrastructure as code” [4]

In the ‘everything as code’ world, every single piece of the configuration of your development environment is stored as code in source control, such that you can check-out and build a fully functioning unmanaged development environment that includes:

  1. Solution Customisations as XML
  2. Canvas Apps as JSON
  3. Flows as JSON
  4. Workflows as XAML
  5. Plugins as C#
  6. JS Web resources as TypeScript
  7. Configuration data as Xml/JSON/CSV
  8. Package Deployer Code
  9. Test Proxy Stub Code for external integrations
  10. Scripts to deploy incremental updates from an older version
  11. Scripts to extract a solution into its respective parts to be committed to source control
  12. Scripts to set up a new development environment
    1. Deploy Test Proxy Stub Services
    2. Build, Pack and deploy a solution to a new environment
    3. Deploy Reference Data
    4. Configure Environment Variables for the new environment

There are still areas of this story that need more investment by the PowerApps teams such as connector connection management and noisy diffs – but even if there are manual steps, the key is that everything is there in source control that is needed. If you lose an environment, it’s not a disaster – it’s not like you have lost your beloved pet.
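
As an illustration of the kind of ‘extract a solution’ script in that list, here is a minimal node script sketch – the solution name, paths, and choice of pac CLI plus SolutionPackager are all assumptions for illustration, not a prescribed implementation:

// extract-solution.ts - export the unmanaged solution and unpack it for source control
import { execSync } from "child_process";

const solutionName = "MySolution";          // assumption - your solution's unique name
const zipFile = `out/${solutionName}.zip`;  // assumption - working folder for the export
const sourceFolder = "solution_package";    // assumption - folder committed to source control

// Export the unmanaged solution from the development environment (pac CLI must already be authenticated)
execSync(`pac solution export --name ${solutionName} --path ${zipFile}`, { stdio: "inherit" });

// Unpack the zip into its respective parts so that every change shows up in the changeset
execSync(`SolutionPackager /action:Extract /zipfile:${zipFile} /folder:${sourceFolder}`, { stdio: "inherit" });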

The advantages of combining these two principles are that every single time you make a change to any aspect of an environment, it is visible in the changeset and Pull Request.

If you are working on a new feature, the steps you’d take would be:

  1. Create a new branch for the Issue/Bug/User Story
  2. Checkout the branch locally
  3. Create a new development PowerApps environment and deploy to it using the build scripts
  4. Develop the new feature
  5. Use the scripts to extract and unpack the changes
  6. Check that your changeset only contains the changes you are interested in
  7. Commit the changes
  8. Merge your branch into the development/master branch (depending on the branching strategy you are using)
  9. Delete your development environment

Using this workflow, you can even be working on multiple branches in parallel provided there won’t be any significant merge conflicts when you come to combine the work. Here is an example of a branching strategy for a hotfix and two parallel feature development branches:

The most common scenarios I see for merge conflicts are RibbonXml, FormXml, and ViewXml – the editing of these elements is now supported – so you can manage merge conflicts inside your code editor! Canvas Apps and Flows are another story – there really isn’t an attractive merge story at this time, so I only allow a single development branch to work on Canvas Apps, Flows, and Workflows at any one time.

If you think you have pet environments, you can still keep them around until you feel comfortable letting go, but I really recommend starting to herd your environments and get everything extracted as code. You’ll not look back.

@ScottDurow

References:

[1] ALM Basics - https://docs.microsoft.com/en-us/power-platform/alm/basics-alm

[2] Solution Concepts - https://docs.microsoft.com/en-us/power-platform/alm/solution-concepts-alm

[3] Pets vs Cattle - http://cloudscaling.com/blog/cloud-computing/the-history-of-pets-vs-cattle/

[4] Infrastructure as Code - https://en.wikipedia.org/wiki/Infrastructure_as_code

[5]  ALM with the PowerPlatform - https://docs.microsoft.com/en-us/power-platform/alm/

[6] ALM for Developers - https://docs.microsoft.com/en-us/power-platform/alm/alm-for-developers

[7] Supported Customization Xml Edits - https://docs.microsoft.com/en-us/power-platform/alm/when-edit-customization-file

[9] Healthy ALM - https://docs.microsoft.com/en-us/power-platform/alm/implement-healthy-alm

Environment Variables in Smart Buttons


A very common request I've had for the Ribbon Workbench Smart Button solution is to be able to configure the WebHook/FlowUrl using an Environment Variable. Environment Variables are small pieces of information that can vary between environments without needing a customization update. This way you can have different endpoints for each environment without making customization changes.

As of version 1.2.435.1 you can now put an environment variable (or a combination of them) into the FlowUrl smart button parameter:

This screenshot assumes you have added an environment variable to your solution with the schema name dev1_FlowUrl

The Url is in the format {%schemaname%}. Adding the environment variable to the solution would look like:
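
For example, with the dev1_FlowUrl schema name above, the FlowUrl parameter would simply contain:

{%dev1_FlowUrl%}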

The really awesome part of environment variables is that you are prompted to update them when you import to a new environment inside the new Solution Import experience that came with Wave 1 2020.

If you have any feedback or suggestions for Smart Buttons, please head over to the Github project page.

@ScottDurow

Public Preview of GitHub Actions for Power Platform ALM


There is a new kid in town! Not long after the PowerApps Build Tools for Azure DevOps were released out of beta under the new name of Power Platform Build Tools (https://marketplace.visualstudio.com/items?itemName=microsoft-IsvExpTools.PowerPlatform-BuildTools), the new set of GitHub actions for Power Platform ALM has been released in public preview (https://powerapps.microsoft.com/en-us/blog/github-actions-for-the-power-platform-now-available-in-preview/). They can be used in your workflows today and will be available in the GitHub Marketplace later in the year.

Since Microsoft acquired GitHub for $7.5 Billion back in 2018 there has been a growing amount of investment – it seems that parity with Azure DevOps is inevitable before long. The CI/CD story in the open-source world has been served by products such as Octopus Deploy for a long time, but one of the investments Microsoft has made is in the area of GitHub Actions (https://github.blog/2019-08-08-github-actions-now-supports-ci-cd/)

GitHub Actions for Power Platform ALM

Actions and Workflows give you a YAML build pipeline with a set of hosted build agents. This provides a significant step towards some degree of parity with Azure Pipelines.

With the public preview of the Power Platform GitHub actions, we can go some way towards moving our CI/CD pipeline to GitHub. At this time, not all of the Azure DevOps Power Platform Build Tools are supported – with the most notable omissions being the Solution Checker and the environment management tasks.

Power Platform Build Tools                GitHub Power Platform Actions
WhoAmI                                    who-am-i
Power Platform Checker                    ---
Power Platform Import Solution            import-solution
Power Platform Export Solution            export-solution
Power Platform Unpack Solution            unpack-solution
Power Platform Pack Solution              pack-solution
Power Platform Publish Customizations     ---
Power Platform Set Solution Version       ---
Power Platform Deploy Package             ---
Power Platform Create Environment         ---
Power Platform Delete Environment         ---
Power Platform Backup Environment         ---
Power Platform Copy Environment           ---
---                                       branch-solution
---                                       clone-solution


An interesting addition to the GitHub actions is the branch-solution action, which I think is intended to be used when you want a new pro-code or low-code environment to match a GitHub branch so that you can ‘harvest’ the solution xml from any changes automatically. I look forward to seeing documentation on the best practices surrounding this action.

There are two missing features that I would really like to see in the actions:

  1. Client Secret Authentication
  2. Cross-Platform Support

When do we move from Azure Dev Ops then?

Not yet! Personally, I feel the biggest gap in actions is the maturity around the release management in GitHub actions. Azure Dev Ops allows you to create multi-stage deployments with approval gates that can be driven from the output of a specific build, whereas GitHub actions require you to manage this using release tags and branch merging or external integrations.

Example

You can see an example of the new GitHub actions at work in my NetworkView PCF control repo (https://github.com/scottdurow/NetworkViewPCF)

Each time a pull request is merged into the master branch, the PCF control is built, the solution packaged and a release created.

Since the solution contains more than just the PCF control (forms too!), I have a folder called solution_package that contains the solution as unpacked by the Solution Packager. After the PCF control is built, a script is then used to copy the bundle.js into the solution package and update the version of the artefacts. Then the solution is built using the microsoft/powerplatform-actions/pack-solution@latest action. I chose to use a node script rather than PowerShell/PowerShell Core so that eventually it will be easier to be cross-platform once the Power Platform tools are also cross-platform.
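
As an illustration of that copy-and-version step, a stripped-down sketch of such a node script might look like the following – the control and folder names and the version-bump logic are assumptions for illustration (see the repo for the real script):

// prepare-solution.ts - copy the built PCF bundle into the unpacked solution and bump the version
import * as fs from "fs";

const bundleSource = "out/controls/NetworkView/bundle.js";                    // assumed build output path
const bundleTarget = "solution_package/Controls/dev1_NetworkView/bundle.js";  // assumed unpacked solution path
const solutionXml = "solution_package/Other/Solution.xml";

fs.copyFileSync(bundleSource, bundleTarget);

// Bump the last part of the solution version, e.g. 1.0.0.12 -> 1.0.0.13
const xml = fs.readFileSync(solutionXml, "utf8");
const updated = xml.replace(
    /<Version>(\d+)\.(\d+)\.(\d+)\.(\d+)<\/Version>/,
    (_m: string, a: string, b: string, c: string, d: string) => `<Version>${a}.${b}.${c}.${Number(d) + 1}</Version>`,
);
fs.writeFileSync(solutionXml, updated);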

You can take a look at the build yaml here - https://github.com/scottdurow/NetworkViewPCF/blob/dev/.github/workflows/build.yml 

@ScottDurow

New Smart Button – Custom Dialogs in Model Driven Apps using Canvas Apps


One of the most requested features of Model-Driven Apps ‘back in the day’ was to edit the popup dialog boxes that do actions such as Closing Opportunities or Cases. These were ‘special’ dialogs that had a fixed user interface.

There were a few workarounds that involved either using dialogs (now deprecated) or a custom HTML web resource.

More recently, the ability to customize the Opportunity Close dialog was introduced (https://docs.microsoft.com/en-us/dynamics365/sales-enterprise/customize-opportunity-close-experience) however this is very limited in what you can actually do.

Canvas Apps are a great way of creating tailored specific purpose user interfaces and are great for this kind of popup dialog type action. If only there was a way to easily open a Canvas App from a Model-Driven Command Bar. Well, now there is!

Open Dialog

Open Dialog Smart Button

I’ve added a new smart button that allows you to easily provide the URL to the Canvas App to use as a dialog and pass the current record or selected record in a grid.

Step 1. Create a Canvas App Dialog

Your Canvas App will be responsible for performing the logic that your users need. The information that is passed to it is in the form of the record Id and logical name parameters. You can grab these values in the Canvas App startup script and then load the record that you need:

Set(varRecordId, If(
    IsBlank(Param("recordId")),
    GUID("780bb51e-961e-ea11-a810-000d3ab39933"),
    GUID(Param("recordId"))
));

Set(varRecordLogicalName, Param("recordLogicalName"));

Set(varSelectedRecord, LookUp(Accounts, Account = varRecordId))

Replace the GUID with the id of a record you want to use as a test when running inside the Canvas App Studio.

Any buttons that perform actions on the data or a cancel button that just closes the dialog, simply use the Exit() function:

// Do some stuff
Patch(Accounts, varSelectedRecord, {
    'Invoice Date': dpkInvoiceDate.SelectedDate
});
Exit();

The smart button listens for the result of the Exit() function to close the dialog.

One of the challenges of adding a Canvas App to a Model-Driven app is styling it to look like the out of the box Model-Driven App dialogs. I have created a sample app that you can import and then use as a template - https://github.com/scottdurow/RibbonWorkbench/blob/master/SmartButtonsUCI/SampleDialogSolution_unmanaged.zip

Step 2. Publish and grab the App Url.

Publish your Canvas App in a solution, and then grab the App Url from the details. Select the … from the Canvas App and then select ‘Details’

Get App Url

Then copy just the Url of the App that is displayed:

You could create an environment variable to hold this, similar to the WebHook smart button - http://develop1.net/public/post/2020/09/11/environment-variables-in-smart-buttons - because the URL of the Canvas App will be different in each environment you deploy to.

Note: Make sure you share your Canvas App with the users that are going to be using your Model-Driven App! (https://docs.microsoft.com/en-us/powerapps/maker/model-driven-apps/share-embedded-canvas-app)

Step 3. Install the Smart Buttons solution

You will need the latest smart buttons solution – https://github.com/scottdurow/RibbonWorkbench/releases

Step 4. Open the Ribbon Workbench and add the buttons

When you open the Ribbon Workbench for the environment that the Smart Button solution and Canvas App is installed into, you can then drop the ‘Open Dialog’ button on either a Form, SubGrid, or Home Grid.

The properties for the Smart Button might look something like:

Note: I've used an environment variable reference in the Dialog Url parameter - but equally, you could just paste the URL of your canvas app in there if you aren't deploying to multiple environments where the app URL would be different.

And that's it!

It’s really that simple. Now you will have a dialog that allows you to take actions on records from forms or grids using a Canvas App. The data is then refreshed after the dialog is closed.

Mobile App Support

At this time, due to cross-domain restrictions inside the Power Apps Mobile App, this technique will not work. The user will simply be presented with a login message, but the button will not do anything. If you would like to unblock this scenario – please vote this suggestion up!  https://powerusers.microsoft.com/t5/Power-Apps-Ideas/Support-Canvas-App-modal-popup-inside-Model-Driven-Mobile-App/idi-p/704962#M31952

Let me know how you get on over on GitHub - https://github.com/scottdurow/RibbonWorkbench/issues 

@ScottDurow


Dataverse for Teams vs Canvas Apps - Part 1 - Checked vs Default Control Properties


If you were thinking that Power Apps Canvas Apps and Dataverse for Teams Canvas Apps are just the same – but with a different license and container – well, whilst that is mostly true, there is one very big difference:
Dataverse for Teams uses a completely different set of Out of the Box controls. They are based on the Fluent UI library.
This post will hopefully save someone the time that I’ve spent investigating why a very common UI design pattern doesn’t work in Dataverse for Teams.

The Toggle Pattern

A common pattern in Canvas Apps is to bind a variable to the Default property of a Toggle Button, and then use the OnChange event to fire some code when it is changed. This is a very common solution to the problem that components cannot raise events at the time of writing.
Imagine a scenario where you have a Component that renders a button that, when selected, should raise an event on the hosting screen.
The common pattern is to toggle an output property from a custom component, and then bind the output to a variable – that is in turn bound to a toggle button. When the variable is toggled, it then raises the OnChecked event on the toggle button so you can perform the logic you need. This does seem like a hack – but it is the only mechanism I know of to respond to events from inside components.

I hope that at some point we will see custom events being able to be defined inside components – but for now, the workaround remains.
So, the app looks something like this:

Fluent UI Controls not only look different - they behave differently!

The problem is that inside Dataverse for Teams, the standard controls have been replaced with the new Fluent UI based controls, and with that, there is a subtle difference.

The Default property has been replaced by a new set of properties that are control specific (e.g. Checked, Value, Text, etc). With this change, the change events are only fired when the user initiates the change – and not when the app changes the value.

So in Dataverse for Teams, the App looks very similar, but with the Checked property rather than Default:

This results in the OnChecked event not being fired and as such, the pattern no longer works.

If you look carefully, you'll see that in Dataverse for Teams the label counter only increments when the toggle button is checked directly, but not when the component's button is clicked. This is because the OnChecked event is not triggered by the varToggle variable being changed by the component.

I really love the Fluent UI controls in Dataverse for Teams - especially with the awesome responsive layout controls - but this drawback is very limiting if you are used to writing Power Apps Canvas Apps. I hope that we will see an update soon that will remove this limitation from Dataverse for Teams Apps.

Work Around

Update 2021-02-10: There is a workaround to this - you can enable 'classic' controls - this then gives you the choice between using the Fluent UI OR the classic Toggle control. By using the classic control you then get the OnChecked event being raised!

 

Fiddler2: The tool that gives you Superpowers - Part 2


This post is the second post in the series 'Fiddler – the tool that gives you superpowers!'

Invisibility

This time it's the superpower of Invisibility! Wow I hear you say!

Fiddler is a web debugger that sits between you and the server, and so is in the unique position of being able to listen for requests for a specific file and, rather than returning the version on the server, return a version from your local disk instead. This is called an 'AutoResponder' and sounds like a super-hero itself – or perhaps a transformer (robots in disguise).

If you are supporting a production system then the chances are that at some point your users have found an issue that you can't reproduce in Development/Test environments. AutoResponders can help by allowing us to update any web resource (html/JavaScript/Silverlight) locally and then test it against the production server without actually deploying it. The AutoResponder sees the request from the browser for the specific web resource and, rather than returning the currently deployed version, it gives the browser your local updated version so you can test it works before other users are affected.

Here are the steps to add an auto responder:

1) Install Fiddler (if you've not already!) and ensure decrypt HTTPS traffic is checked in Options->HTTPS.

2) Switch to the 'Auto Responders' tab and check the two checkboxes 'Enable automatic responses' and 'Unmatched requests pass-through'

3) To ensure that the browser requests a version of the web resource rather than a cached version from the server you'll need to clear the browser cache using the convenient 'Clear Cache' button on the tool bar.

4) You can ensure that no versions get subsequently cached by selecting Rules-> Performance-> Disable Caching.

5) You can now use 'Add Rule' to add an auto responder rule. Enter a regular expression to match the web resource name

regex:(?insx).+/<Web Resource Name>([?a-z0-9-=&]+\.)*

then enter the file location of the corresponding webresource in your Visual Studio Developer Toolkit project.

You are now good to go so that when you refresh your browser the version of your web resource will be loaded into the browser directly from your Visual Studio project. No need to publish a file to the server and affect other users.

There is one caveat to this – if the script that you are debugging updates data then this approach is probably not a good idea until you have fully tested the script in a non-production environment. Only once you have QAed it and are ready to deploy can it be used against the production environment to check that the specific user's issue is fixed before you commit to deploying it to all users.

UPDATE: This is now the recommended technique by Microsoft for debugging scripts - https://docs.microsoft.com/en-us/powerapps/developer/model-driven-apps/streamline-javascript-development-fiddler-autoresponder

Read the next post on how to be faster than a speeding bullet!

@ScottDurow

 

Fiddler2: The tool that gives you Superpowers – Part 3


This post is the third post in the series 'Fiddler – the tool that gives you superpowers!'

Faster than a Speeding Bullet

If you have done any development of Web Resources with Dynamics CRM then I'm certain that you'll have become impatient whilst waiting to first deploy your solution and then publish it before you can test any changes. Every time you need to make a change you need to go round this loop, which can slow down the development process considerably. Using the AutoResponders I described in my previous post (Invisibility) you can drastically speed up this development process by using Fiddler to ensure changes you make to a local file in Visual Studio are reflected inside Dynamics CRM without waiting for deploying and publishing. You make the changes inside Visual Studio, simply save and refresh your browser and voilà!

Here are some rough calculations on the time it could save you on a small project:

Time to Deploy                                      15    seconds
Time to Publish                                     15    seconds
Debug iterations                                    20
Number of web resources                             30
Development Savings                                 5     hours

Time to reproduce live data in test/development     1     hour
Number of issues to debug in live                   10
Testing Savings                                     10    hours

Total Savings for a small project                   15    hours

 

What is perhaps more important about this technique is that it saves the frustration caused by having to constantly wait for web resource deployment, and ensures that you stay in the development zone rather than being distracted by the latest cute kitten pictures posted on Facebook!

Furthermore, using this technique allows you to use source-maps without ever deploying them to the server!

Do remember to deploy and publish your changes once you've finished your development. It seems obvious but it is easily forgotten and you're left wondering why your latest widget works on your machine but not for others!

More information can be found on this at the following locations:

@ScottDurow

Create a command bar button for your custom activities


When creating a new custom activity entity you are presented with a great many checkboxes to choose from. One of these checkboxes is 'Display in Activities Menus', which ensures that the activity type is included in the Activities menu on records.

The Custom Activities flyout menu is a dynamically generated menu, so if you load up an entity in the Ribbon Workbench and expand this button you won't see any menu items. This has been a subject of confusion for users who are trying to move this button out of the sub menu and put it on the main Command Bar as a shortcut for users. Since the menu is created dynamically at run time there is no button to customise and move. This post shows you how to create a custom activity button on the opportunity form.

1. Determine the Entity Type Code of your custom activity.
Each custom entity has an Entity Type code. This integer value is different to the entity logical name and could be different in each organization you deploy your solution to. We need to know this entity type code in order to create a new record of a specific entity type.

To find the value, create a new record for your custom entity in Internet Explorer and press Ctrl-N (opens a new window with Url visible) and copy the URL. This should look something like:

http://server/Org/main.aspx?etc=10005...

You need to note the etc number for later.

2. Install Ribbon Workbench

You'll need to install the Ribbon Workbench as described by 'Getting started with the Ribbon Workbench'

3. Create and open a solution containing the Opportunity Entity

The Ribbon Workbench requires a solution to load that contains the entities that you wish to work on. Since we are adding the new button to the opportunity entity, add it to the solution – only add the entities you need to speed up the solution load/publish time. When you first open the Ribbon Workbench it will allow you to select your new solution.

4. Click on the 'Ribbon' tab and change drop down in the top right to 'Form'

We need a template command to use and so we select the 'Ribbon' view rather than the Command Bar view so to locate the 'Task' button on the Form ribbon.

5. Select the 'Add' tab and then right click on the 'Task' button and select 'Customise Command'

The Task button is located on the 'Add' tab. Using 'Customise Command' creates a copy of the add task command that we can change to reference our custom activity.



6. Expand the Commands node to see the customised command and select 'Mscrm.AddTaskToPrimaryRecord'.
7. Rename the command to be something like 'Mscrm.AddCustomActivityToPrimaryRecord'

By renaming the command we are creating a new command specifically for our new button, rather than customising the existing one.

8. Expand the Command and JavaScript command and change the Int value to be the Entity Type Code of your custom activity

Remember that the entity type code is unique to your custom entity – but it could change between deployments.

9. Right click on the command and select 'Edit Display Rules' and Remove the 'HideOnCommandBar' rule

Since the Task button only shows on the legacy ribbon, we need to change the command so it also shows on the command bar by removing this display rule.

10. Select the 'Command Bar' tab and drag a button onto the Form command bar

11. Set the Command to be your new custom command (e.g. Mscrm.AddCustomActivityToPrimaryRecord)

12. Set the image16 to be /_imgs/ribbon/AddActivity_16.png (or a custom image you have for your entity)

13. Expand the 'Display Rules' and set the 'IsCore' property to 'True' on each Display Rule.

14. Expand the 'Enable Rules' and set the 'IsCore' property to 'True' on each Enable Rule.

By setting the 'IsCore' property to true, we only reference them rather than redefining them.

15. Publish your customisations

16. Test the new button on the opportunity form!

Sparkle XRM Code Snippets


If you are doing Sparkle XRM development then you'll find these code snippets I've created for VS2012 very useful. You can grab them from the master repository:
https://github.com/scottdurow/SparkleXrm/tree/master/Snippets
To start using these snippets you simply need to copy the contents of the Snippets directory and paste it into your profile directory at Users\<username>\Documents\Visual Studio 2012\Code Snippets

Using the snippets in Visual Studio is easy – When you are creating a new view HTML page just select all the default code and then type 'sparkle-view-page' followed by TAB. This will add the page snippet and allow you to TAB around the highlighted variable parts of the template which in this case are the name of your Client library (default Client.js) and the name of your View Class:


Once you've finished updating the parameters press Escape to exit the snippet edit.

You can then move the cursor to the line marked <!--TODO--> and add in your form by typing 'sparkle-view-form'. This allows you to enter the name of your view model instance and the title of your form section. Now you're ready to move to the TODO line and add the fields by typing the name of the field snippet and again pressing TAB and filling in the parameters.

Here is a list of all the snippets included:

  • sparkle-view-page – Adds the standard HTML View template
  • sparkle-view-form – Adds the standard Sparkle XRM Form scaffolding
  • sparkle-view-grid – Adds a Sparkle XRM grid
  • sparkle-view-text – Adds a text field
  • sparkle-view-numeric – Adds a numeric field and allows setting the max/min to a constant or view model field.
  • sparkle-view-optionset – Adds an optionset field and allows specifying the entity and attribute to grab the optionset metadata from.
  • sparkle-view-datetime – Adds a datetime field
  • sparkle-view-lookup –Adds a lookup field and allows specifying the search command to return the available records to select. The sparkle-viewmodel-searchcommand can be used to create the search command in the ViewModel Script# code.

There are also some snippets to add a View and ViewModel:

  • sparkle-view-class – Used to create a template view class that is referenced by your html view page.
  • sparkle-viewmodel-searchcommand – Used to create a command that is used to bind to a sparkle-view-lookup to be used when searching within the lookup field.

You'll find that using these snippets will speed up writing your code and reduce errors. Hope it helps!

@ScottDurow

And then there were seven!


If you've updated to the CRM2013 Spring '14 Wave (Service Pack 1) I think you'll agree that it contains some pretty awesome features.

You can read a good roundup of the developer features in the SDK (http://msdn.microsoft.com/en-us/library/gg309589.aspx#BKMK_NewInSpring2014) but have you noticed that there is a little less white space across the top of your forms? This is because the Command Bar now shows 7 rather than 5 buttons before buttons are added to the ellipses overflow.

It is worth noting that the number of buttons that are displayed before the overflow is not (yet?) configurable and nor can you revert back to displaying only 5 (pre SP1). That said, I really appreciate this little change that is a result of the product team listening to feedback from users. What's more it has no impact on the Ribbon Xml or Ribbon Workbench.

You might like to read more about the Command Bar in my post from way back when CRM2013 was first released.

@ScottDurow

My top 3 fixes in CRM2013 SP1


There are a great many new features shipped in CRM2013 SP1, but let's not forget there are some eagerly awaited fixes as well (as described by http://support.microsoft.com/default.aspx?kbid=2941390). Since there has already been plenty of coverage of the new features, I thought I would pick out my top 3 fixes that I've been particularly waiting for… Queue up the charts count-down soundtrack!

In at Number 3: Matching Connection Roles

Connections are a great way of providing users with an overview of a contact's involvement with all areas of your business. Right from within a contact record users can see the roles that the contact has across different record types such as accounts, cases, and opportunities. Roles are specified on each side of the connection and in many cases you need to have the same role on both sides. In CRM2011 this was possible but strangely with CRM2013 the same role could not be specified as the matching role with the issue being described by the KB article as "In Microsoft Dynamics CRM 2013, users are unable to set a Matching Connection Role to the same Connection Role." Up until now the work around was to create a different role but with the same name but I've tested this in SP1 and I'm pleased to report that it now works the same as it did in CRM2011.

New at Number 2: Importing solution fails when plugins attached to custom actions

The addition of custom actions to CRM2013 was fantastic for creating custom server logic that is called from JavaScript. Unfortunately when you attached a plugin to a custom action message and imported a solution containing that plugin it would fail (See the Microsoft Connect article for this issue). I have held back from using this feature up until now due to the pain it created during deployments but now that it is fixed in SP1 I'm going to be making a lot more use of Custom Action plugins!

UPDATE: Actually, the connect item for this issue is not entirely correct - this will be fixed in one of the next Update Rollups of CRM 2013 SP1

…and this Service Pack's Number 1 Fix is: "Found more than one RibbonDiff entity"

This message is likely a very familiar message to anyone who has made customisations to the Ribbon with CRM2011 (as described by http://support.microsoft.com/kb/2503029). The message was shown when importing customisations and was usually down to there being more than one element in the ribbon xml that had been given the same ID. When CRM2013 was released the message started popping up far more frequently. Initially I was worried it was due to an issue with Ribbon Workbench but eventually I tracked it down to the fact that elements were being duplicated when a solution was exported from CRM2013. The Ribbon Workbench resolves duplicate IDs automatically so the issue only caused problems when transferring a solution between environments through an export/import but what was more confusing is that the issue would only happen on the second import. There is a line in the KB article for SP1 that describes the issue as:

"When we import, the import logic creates two values for each entry within the RibbonDifBase. But When we export the application ribbon we do not have any check for this and we export 2 values directly from the DB to the XML. This if imported to a new org will create 4 values in the table RibbonDiffBase. If the solution is imported again to the same org causes an error"

The Microsoft Connect item for this issue is still marked as active but I've tried to reproduce the issue in SP1 and so far so good! Well done Dynamics CRM Product team!

Poptastic!

@ScottDurow


Monitor, Monitor, Monitor


I once heard someone say that "the great thing about Dynamics CRM is that it just looks after itself" Whilst CRM2013 is certainly very good at performing maintenance tasks automatically, if you have a customised system it is important to Monitor, Monitor, Monitor! There are some advanced ways of setting up monitoring using tools such as System Center but just some regular simple monitoring tasks will go a long way for very little investment on your part:

1) Plugin Execution Monitoring

There is a super little entity called 'Plugin-in Type Statistics' that often seems to be overlooked in the long list of advanced find entities. This entity is invaluable for tracing down issues before they cause problems for your users and as defined by the SDK it is "used by the Microsoft Dynamics CRM 2011 and Microsoft Dynamics CRM Online platforms to record execution statistics for plug-ins registered in the sandbox (isolation mode)."

The key here is that it only records statistics for your sandboxed plugins. Unless there is a good reason not to (security access etc.) I would recommend that all of your plugins be registered in sandbox isolation. Of course Dynamics CRM online only allows sandboxed plugins anyway so you don't want to put up barriers not to move to the cloud.

To monitor this you can use advanced find to show a list sorted by execution time or failure count descending:

If you spot any issues you can then proactively investigate them before they become a problem. In the screen shot above there are a few plugins that are taking more than 1000ms (1 second) to execute, but their execution count is low. I look for plugins that have a high execution count and high execution time, or those that have a high failure percent.

2) Workflow & Asynchronous Job Execution Monitoring

We all know workflows often can start failing for various reasons. Because of their asynchronous nature these failures can go unnoticed by users until it's too late and you have thousands of issues to correct. To proactively monitor this you can create a view (and even add to a dashboard) of System Jobs filtered by Status = Failed or Waiting and where the Message contains data. The Message attribute contains the full error description and stack trace, but the Friendly message just contains the information that is displayed at the top of the workflow form in the notification box.

3) Client Latency & Bandwidth Monitoring

Now that you've got the server-side under control you should also look at the client connectivity of your users. There is a special diagnostics hidden page that can be accessed by using a URL of the format:

http://<YourCRMServerURL>/tools/diagnostics/diag.aspx

As described by the implementation guide topic, "Microsoft Dynamics CRM is designed to work best over networks that have the following elements:

  • Bandwidth greater than 50 KB/sec
  • Latency under 150 ms"

After you click 'Run' on this test page you will get results similar to that shown below. You can see that this user is just above these requirements!

You can read more about the Diagnostic Page in Dynamics CRM. You can also monitor the client side using the techniques I describe in my series on Fiddler:

If you take these simple steps to proactively monitor your Dynamics CRM solution then you are much less likely to have a problem that goes un-noticed until you get 'that call'!

@ScottDurow

Multi Entity Search for CRM2013


I've just published an update to my Multi-Entity Search Solution (after being encouraged by my friend and fellow Dynamics CRM MVP Gus Gonzalez!).

Features:

  1. Search across multiple entities at once.
  2. Uses the same configuration as the mobile client 'Quick Find' (Settings->General ->Set Up Quick Find). This allows you to select which entities you would like to search across.
  3. Virtual Scrolling with new records loaded as you scroll rather than all loaded at once.
  4. Shows the primary entity image of returned records (if there is one) in the search results.

 

In the new version you'll find:

  1. A search button added to the top navigation bar* rather than using a Command Bar button.
  2. Auto searching as you type the search term
  3. Mouse Wheel horizontal scrolling support

*Since there is no supported way of doing this, I've had to do a little DOM shenanigans to get this to work the way Gus wanted!

To try it out you'll need to install the following 2 managed solutions:

If you like this, you might also like to check out my Start Menu Navigation for CRM2013!

@ScottDurow

Creating a Contact Card List Web Resource with Images


Sparkle XRM provides a great way of creating grids and forms that look and work similar to Dynamics CRM but sometimes you need to create a responsive user interface that is a little different. Luckily there is a wealth of jQuery plugins out there that provide a great starting point. For this post I'm going to show you how to use FreeWall to create a dynamic grid layout of contact cards but the same approach would apply for any other plugin that isn't included in the core Sparkle XRM dependencies library. This sample creates an HTML web resource that lays out your contacts in a responsive grid that resizes depending on the size available.

Create the Script# Import

Whenever we need to use an external library from Sparkle XRM, the external API must be defined as imported classes. For FreeWall we must import the jQuery plugin and the configuration options class. Notice that [ScriptName("Object")] is used on the configuration options so that although our Script# class uses a strongly typed class, the compiled JavaScript uses an anonymous type so as not to create unnecessary class prototypes.

Create the View Model

It is a good idea to start with the View Model since this defines all the functionality that the View must expose as well as communicating with the server. For this sample we have a simple view model that simply loads a list of contacts into a list so that FreeWall can be bound to it to display our contact cards. It also provides a function to get the Image Url of the contact and return a placeholder image if no image is defined. The contact image can be found by returning the attribute with logical name 'entityimage_url'

Include the library JS

Once you've selected the library you want to use, you'll need to include it in your Crm Solution project under the js folder, and give it a UniqueName and Display Name similar to dev1_/js/freewall.js

Create the HTML View

The HTML View should be added to the html folder and must contain the scaffolding to hook the jQuery library into and initialise the View code. The data binding is done using Knockout's built in templating engine using the 'template' binding.

Create the View Class

The view class's job is to instantiate the View Model and initialise the binding.

Notice the OnAfterRender callback – this is called every time a new contact card is rendered because of the binding afterRender: Client.InlineSubGrids.Views.ContactCardView.onAfterRender in the HTML. If this is not done then the FreeWall grid will not lay out until the window is resized.

The result is a nice and responsive layout that optimises to fill the available space and has variable height blocks.

@ScottDurow

'Error while copying content to a stream' when pushing to Git


Whilst pushing a recent commit on SparkleXRM I received the following error:

An error was raised by libgit2. Category = Net (Error). Error while copying content to a stream.

I turned it off and on again (rebooted), and used Fiddler to trace what was going on, with no luck. In a last ditch attempt I changed the networking on my virtual machine from NAT to Bridged and hey presto it worked again.

So it seems that Git doesn't like NATed connections from Virtual Machines.

Hope this helps someone else!

 

Polymorphic Workflow Activity Input Arguments


I often find myself creating 'utility' custom workflow activities that can be used on many different types of entity. One of the challenges with writing this kind of workflow activity is that InArguments can only accept a single type of entity reference (unlike an activity's regarding object field).

The following code works well for accepting a reference to an account but if you want to accept account, contact or lead you'd need to create 3 input arguments. If you wanted to make the parameter accept a custom entity type that you don't know about when writing the workflow activity then you're stuck!

[Input("Entity Reference")]
[ReferenceTarget("account")]
public InArgument<EntityReference> EntityReference { get; set; }

There are a number of workarounds to this that I've tried over the years, such as starting a child workflow and using the workflow activity context, or creating an activity and using its regarding object field – but I'd like to share with you the best approach I've found.

Dynamics CRM workflows and dialogs have a neat feature of being able to add hyperlinks to records into emails/dialog responses etc., which is driven by a special attribute called 'Record Url(Dynamic)'

This field can be used also to provide all the information we need to pass an Entity Reference.

The sample I've provided is a simple Workflow Activity that accepts the Record Url and returns the Guid of the record as a string and the Entity Logical Name – this isn't much use on its own, but you'll be able to use the DynamicUrlParser.cs class in your own Workflow Activities.

[Input("Record Dynamic Url")]
[RequiredArgument]
public InArgument<string> RecordUrl { get; set; }

The DynamicUrlParser class can then be used as follows:

var entityReference = new DynamicUrlParser(RecordUrl.Get<string>(executionContext));
RecordGuid.Set(executionContext, entityReference.Id.ToString());
EntityLogicalName.Set(executionContext, entityReference.GetEntityLogicalName(service));

 

The full sample can be found in my MSDN Code Gallery.

@ScottDurow
