Speaking and Hackathon at Digital Workplace Conference Australia - Melbourne


In a little less than a month, on August 15-16, I'll be presenting in Melbourne at the Digital Workplace Conference Australia 2018.


(I still fondly remember this conference as the formerly annual Australian SharePoint Conference - but all things evolve.  SharePoint became SharePoint Online and then Office 365, and the workplace evolved from portals to intranets to social platforms to conversational platforms.)

Flow and Functions: level up our Serverless Toolkit in Office 365

I will be presenting an amped-up talk covering implementation and automation with Microsoft Flow and Azure Functions.  This is one of several sessions on Microsoft Flow at this conference, so I will be covering implementation and mastery at level 200+.  (But because it's Flow, it'll still look deceptively simple!)

John joins forces with Paul Culmsee and Ashlee at the #DWCAU pre-conference hackathon

I also want to mention that I'm helping out with Paul Culmsee and Ashlee's PowerApps and Flow Workshop/Hackathon on 14 August 2018.


I honestly don't know how much raw PowerApps and Flow potential will be in that room at the hackathon.  I really want to find out, and I hope you do too.  Space is limited - please consider this one-day pre-conference workshop.

"Learn PowerApps and Flow from experts and build an app that you need," says John.

"Two MVPs for the price of one," says Paul.  I'll just leave this here.

Whether at the conference, the hackathon, or both, I hope to see you in Melbourne.

Betting on 2018 - level up our Serverless in Azure

A recent conversation got me thinking about making some predictions for 2018.  This isn't so much a "ha, look, I was right" post for 2019.  This is more about internalizing and verbalizing my choices, and I think there's value in sharing all this thinking.

So here it is, all of it: notes, wishlist, observations, what other people are doing, what we should be doing.  All in one overview blog post.  Happy 2018.




Bet on Serverless

You can't look sideways without seeing "Serverless".  It's a silly term, but I need to start with a definition of "Serverless" by Serverless experts:

  • Use a compute service to execute code on demand
  • Write single-purpose stateless functions
  • Design push-based event-driven pipelines
  • Create thicker front-ends
  • Embrace third-party services

I started on this path in 2016 and I can't look back.  Being able to run your code anytime, in the cloud, is a life-changing experience for many of us - it abstracts away the operations part of hosting code in the cloud, and lets us get back quickly to code.

Applications for this technique are far and wide.  From simple services to augment the endless front-end applications we were building in 2016, to finally having a great way to handle remote events or permission escalation.  And look beyond to the bot-framework.  A little blog post I wrote in 2016 about Serverless site provisioning is now officially best practice in SharePoint's Site Design - I'm a little glad it was useful :-)  At times it feels like I just hack and cobble things together and behold, wow people do like this.


So, what's next?


Serverless Orchestration

Invest in Serverless orchestration.  Azure Functions is not the right place to do our orchestration.  Yes, Durable Functions will help a lot.  But the products we should be looking at are Azure Logic Apps and Microsoft Flow.

As far as I'm concerned, these are the same product - the differences boil down to:

Logic Apps

  • UX, with a JSON editor, is targeted at developers
  • Consumption-based pricing - per action used, perfect for multiple small requests
    • So we end up compressing multiple actions into unreadable mess to save costs
  • Integration Services (biztalk scenarios)
  • Better for multi-tenant solutions.

Microsoft Flow

  • UX tries really hard to remain Power User friendly and hide JSON complexity
  • Per Flow execution pricing, with free buckets per tier
    • So we end up putting way too many steps inside a single Flow to save costs
  • Premium connectors as part of higher tier plans
  • Free licenses as part of Office 365 / Dynamics 365 plans, making this cheaper for single-tenant solutions.

What can you do with Logic Apps/Flow?

  • Leverage connectors - (remember, Embrace third-party services is a Serverless principle).  There are hundreds of connectors, implemented directly by the various product teams themselves.  So they know what they are doing (most of the time)
  • You can do delay and wait easily in Flow
  • You can do loops easily in Flow (in Functions it's tricky without potentially hitting timeout).
  • You can do for-each loops in Flow and easily turn it into parallel execution (fan-out) with fan-in just part of the package
  • You can define repeat/retry policies with gradual back-off in Flow
  • You can follow the next-page token in a Flow HTTP request for REST paging
  • You can handle fallback behaviour as a scoped set of actions - if any action in the scope fails, you can orchestrate the recovery
  • You can include human workflows with human approvals and send nice templated emails with attachments from Flow
  • A Function shouldn't do more than one thing.  Use Flow to chain them.
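To make the orchestration idea concrete, here is a minimal sketch of a Logic Apps workflow-definition action with a retry policy - the action name and URI are illustrative, not from a real deployment:

```
{
  "actions": {
    "Call_conversion_function": {
      "type": "Http",
      "inputs": {
        "method": "POST",
        "uri": "https://example.azurewebsites.net/api/convert",
        "retryPolicy": {
          "type": "exponential",
          "count": 4,
          "interval": "PT10S"
        }
      },
      "runAfter": {}
    }
  }
}
```

Flow builds this same JSON for you under the covers - which is exactly why the two products feel like one once you peek behind the designer.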


Serverless API end points

As we build out a constellation (I stole this word from https://www.slideshare.net/HeitorLessa1/serverless-best-practices-plus-design-principles-20m-version) of functions, we need to clean up all the microservice APIs with a unified API front.  There are two products for this:

Azure Functions Proxy

  • Simpler - can transform query/post messages
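For the simpler option, a hypothetical proxies.json sketch - the route and backend URI are made up - that fronts a single conversion function would look like:

```
{
  "$schema": "http://json.schemastore.org/proxies",
  "proxies": {
    "speech": {
      "matchCondition": {
        "methods": [ "POST" ],
        "route": "/api/speech"
      },
      "backendUri": "https://example.azurewebsites.net/api/mp3-to-wave"
    }
  }
}
```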

Azure API Management Service

  • More extensive - can transform REST to XML
  • Better Open API definitions


Serverless Websites / thicker Front-Ends

A serverless website is basically a CDN plus FaaS.  You don't scale an Azure VM or even Azure WebJobs.  Build your entire website with your favourite JavaScript library (I like and recommend Angular - but you should use what your team uses), then bundle with Webpack into a couple of minified JS files for the CDN.

Do your compute in the client.  And do your server compute with Azure Functions.
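As a sketch of what "server compute with Azure Functions" means from the client side - the host name and key below are placeholders, not a real endpoint:

```javascript
// Build the URL of an HTTP-triggered Azure Function; the function key is
// passed as the standard "code" query parameter.
function functionUrl(host, name, code) {
  return "https://" + host + "/api/" + name + "?code=" + encodeURIComponent(code);
}

// In the browser, server compute is then just a fetch away:
// fetch(functionUrl("example.azurewebsites.net", "mp3-to-wave", "FUNCTION_KEY"), {
//   method: "POST",
//   body: mp3Blob
// }).then(function (r) { return r.blob(); });
```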

I'll even add here that a low-code solution such as PowerApps is extremely good at getting a proof of concept up and running quickly.  PowerApps supports offline capabilities and will happily call your Serverless APIs via a Swagger/OpenAPI file and treats them all as first class functions.


As part of the Azure services upgrade email (what a peculiar way to announce new features), the upgrade to the latest Windows Server means that Azure Functions, as part of Azure App Service, will gain the ability to work with HTTP/2.

It means we can get our entire website in one HTTP GET request, with our Function (or possibly our Logic App/Flow) sending multiple resources in one response.


Serverless Database

Let me first define what a Serverless Database is.  Essentially, you have a database in the cloud.

  • You want to pay for storage.
  • You want to pay for compute, on consumption-based plans.
  • Pay nothing if it's not doing anything; it scales automatically as necessary.
  • The problem we are trying to fix is simple.  We want to start an application, pick a database, and have it scale with us.  We don't want to put SQL Azure on the cheapest free VM and have it run like crap.

This is an area where Azure is somewhat lacking.  My choices are:

  • Azure Storage Table
  • SharePoint Lists (only because I'm an Office 365 person and I've got Office 365 tenants everywhere I look)


I'll boldly predict that Azure will bring out a Serverless CosmosDB offering in 2018, and it will be what everyone in the Microsoft ecosystem uses from then onwards.

Otherwise, look towards the competition:

  • Google Cloud Platform has Firebase - event driven, consumption based database, linked to Google Cloud Functions
  • Amazon Web Services has Aurora Serverless - in late 2017, AWS announced they've separated Aurora's cost model into Compute and Storage.


Serverless Event Aggregator

Azure Event Grid is a very interesting service.  I see the possibility of a unified way to manage all events in a system.

This is best explained with a parallel analogy.  In browser applications, we catch and handle events in the DOM all the time.

In the beginning, we do:

$(".filter").click(func)

This has all sorts of problems - how do you route?  How do you de-allocate?  How do you attach new events as new resources come online?  A few years later, we end up with this:

$(global).on("click", ".filter", func)

We attach events via one top level resource, ALL our event handlers are attached there.  And then we let events bubble to the root, apply the filter, then call the handler.

Azure Event Grid has the potential to be this solution.  In 2019, if we are attaching event handling directly to a resource or a container, then we have stuffed up.  We should attach all our events to Event Grid, filter within the Event Grid, and only then invoke the functions that fit the filter.
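Just like the jQuery delegation above, an Event Grid subscription declares its filter centrally.  A sketch of what a subscription looks like - the endpoint URL and container name here are illustrative:

```
{
  "properties": {
    "destination": {
      "endpointType": "WebHook",
      "properties": { "endpointUrl": "https://example.azurewebsites.net/api/handler" }
    },
    "filter": {
      "subjectBeginsWith": "/blobServices/default/containers/audio",
      "includedEventTypes": [ "Microsoft.Storage.BlobCreated" ]
    }
  }
}
```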


I'm hopeful that if we can map Microsoft Graph events into Azure Event Grid - then we'd have something super magical.


Serverless Visualization

I want to end on this one because I don't have a great solution, but I think we need a great solution.


As we build out our constellation of functions and orchestrations, there's a need to visualize that design so we can both review the designs and see where the bottlenecks are.

If a set of microservices is buggy, this would be the place to pinpoint it and switch the Functions back to the previous deployment slots.

With Application Insights, we can get detailed logging for Functions and Flow/Logic Apps, so perhaps this is something that needs to be layered on top of the logging.




Serverless connect-the-dots: MP3 to WAV via ffmpeg.exe in AzureFunctions, for PowerApps and Flow

There's no good title for this post - there are just too many pieces we are connecting.

This problem sat on my todo list for a while - I'll try to describe it quickly, then get into the solution.

  • PowerApps Microphone control records MP3 files
  • Cognitive Speech to Text wants to turn WAV files into JSON
  • Even with Flow, we can't convert the audio file formats.
  • We need an Azure Function to glue this one step
  • After a bit of research, it looks like FFmpeg, a popular free utility, can be used to do the conversion

Azure Functions and FFMPEG

So my initial thought was: well, I'll just run this utility exe through PowerShell.  But then I remembered that PowerShell doesn't handle binary input and output that well.  A quick search nets me several implementations in C#; one of them catches my eye:

Jordan Knight is one of our Australian DX Microsofties - so of course I start with his code.

It actually was really quick to get going, but because Jordan's code is triggered from blob storage - and the Azure Functions binding for blob storage has a waiting time that I want to shrink - I rewrote the input and output bindings to turn the whole conversion function into an input/output HTTP request.


#r "Microsoft.WindowsAzure.Storage"

using Microsoft.WindowsAzure.Storage.Blob;
using System.Diagnostics;
using System.IO;
using System.Net;
using System.Net.Http.Headers;

public static HttpResponseMessage Run(Stream req, TraceWriter log)
{
    var temp = Path.GetTempFileName() + ".mp3";
    var tempOut = Path.GetTempFileName() + ".wav";

    // write the incoming MP3 request body to a temp file for ffmpeg
    using (var file = File.Create(temp))
    {
        req.CopyTo(file);
    }

    var bs = File.ReadAllBytes(temp);
    log.Info($"Input Length: {bs.Length}");

    var psi = new ProcessStartInfo();
    psi.FileName = @"D:\home\site\wwwroot\mp3-to-wave\ffmpeg.exe";
    psi.Arguments = $"-i \"{temp}\" \"{tempOut}\"";
    psi.RedirectStandardOutput = true;
    psi.RedirectStandardError = true;
    psi.UseShellExecute = false;
    log.Info($"Args: {psi.Arguments}");

    var process = Process.Start(psi);
    process.WaitForExit();  // block until ffmpeg finishes the conversion

    var bytes = File.ReadAllBytes(tempOut);
    log.Info($"Output Length: {bytes.Length}");

    var response = new HttpResponseMessage(HttpStatusCode.OK);
    response.Content = new StreamContent(new MemoryStream(bytes));
    response.Content.Headers.ContentType = new MediaTypeHeaderValue("audio/wav");

    // clean up the temp files
    File.Delete(temp);
    File.Delete(tempOut);

    return response;
}
Trick: You can upload ffmpeg.exe and run it inside an Azure Function


{
  "bindings": [
    {
      "type": "httpTrigger",
      "name": "req",
      "authLevel": "function",
      "methods": [ "post" ],
      "direction": "in"
    },
    {
      "type": "http",
      "name": "$return",
      "direction": "out"
    }
  ],
  "disabled": false
}

Azure Functions Custom Binding

Ling Toh (of the Azure Functions team) reached out and told me I could try the new Azure Functions custom bindings for Cognitive Services directly.  But I wanted to try this with Flow, so I'll need to come back to custom bindings in the future.


Set up Cognitive Services - Speech

In the Azure Portal, create a Cognitive Services resource for Speech.

Copy one of the keys for later.


Take the binary multipart body sent to this Flow and pass it to the Azure Function.


Take the binary returned from the Function and send that to the Bing Speech API.

Flow returns the result from Speech to Text, which I force into JSON.
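The Speech to Text result is a small JSON payload - the values below are illustrative, but the shape matches the response schema in the Swagger file further down:

```
{
  "RecognitionStatus": "Success",
  "DisplayText": "hello world",
  "Offset": 100000,
  "Duration": 12500000
}
```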


Try it:


We need this for PowerApps to call Flow.
I despise Swagger so much I don't even want to talk about it (the Swagger file took 4 hours - the most problematic part of the whole exercise).

{
  "swagger": "2.0",
  "info": {
    "description": "speech to text",
    "version": "1.0.0",
    "title": "speech-api"
  },
  "host": "prod-03.australiasoutheast.logic.azure.com",
  "basePath": "/workflows",
  "schemes": [ "https" ],
  "paths": {
    "/xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx/triggers/manual/paths/invoke": {
      "post": {
        "summary": "speech to text",
        "description": "speech to text",
        "operationId": "Speech-to-text",
        "consumes": [ "multipart/form-data" ],
        "parameters": [
          {
            "name": "api-version",
            "in": "query",
            "default": "2016-06-01",
            "required": true,
            "type": "string",
            "x-ms-visibility": "internal"
          },
          {
            "name": "sp",
            "in": "query",
            "default": "/triggers/manual/run",
            "required": true,
            "type": "string",
            "x-ms-visibility": "internal"
          },
          {
            "name": "sv",
            "in": "query",
            "default": "1.0",
            "required": true,
            "type": "string",
            "x-ms-visibility": "internal"
          },
          {
            "name": "sig",
            "in": "query",
            "default": "4h5rHrIm1VyQhwFYtbTDSM_EtcHLyWC2OMLqPkZ31tc",
            "required": true,
            "type": "string",
            "x-ms-visibility": "internal"
          },
          {
            "name": "file",
            "in": "formData",
            "description": "file to upload",
            "required": true,
            "type": "file"
          }
        ],
        "produces": [
          "application/json; charset=utf-8"
        ],
        "responses": {
          "200": {
            "description": "OK",
            "schema": {
              "description": "",
              "type": "object",
              "properties": {
                "RecognitionStatus": {
                  "type": "string"
                },
                "DisplayText": {
                  "type": "string"
                },
                "Offset": {
                  "type": "number"
                },
                "Duration": {
                  "type": "number"
                }
              },
              "required": [
                "RecognitionStatus",
                "DisplayText",
                "Offset",
                "Duration"
              ]
            }
          }
        }
      }
    }
  }
}

PowerApps



I expect a few outcomes from this blog post.

  1. ffmpeg.exe is very powerful and can convert many audio and video formats.  I'm pretty certain we will be using it a lot more for many purposes.
  2. The Cognitive Speech API doesn't have a Flow action yet.  I'm sure we will see one soon.
  3. PowerApps or Flow may need a native way of converting audio file formats.  Until such an action is available, we will need to rely on ffmpeg within an Azure Function.
  4. The problem of converting MP3 to WAV was raised by Paul Culmsee - the rest of the blog post just connects the dots and makes sure it works.  I was also blocked on an error in the output of my original Swagger file, which I fixed only after Paul sent me a working Swagger file he used for another service - thank you!



Taking a picture with PowerApps and sending to SharePoint with just Flow

Less than one day after I wrote about Taking a picture with PowerApps and sending to SharePoint with help of Azure Functions, I was looking at Flow to do another thing with recurring calendar events, and reading about how Logic Apps' Workflow Definition Language can be used in Flow.  Then as I scrolled down - I saw this: dataUriToBinary

This was the heart of the problem in converting PowerApps' camera image (Data URI) for a SharePoint file upload (Binary) - the part I had solved with an Azure Function.

And here it is, again, staring at me: dataUriToBinary()
And I knew I'd have to write this new post.

Create the Flow from Template

Using Advanced Formula from Logic Apps Functions in Flow

https://docs.microsoft.com/en-au/azure/logic-apps/logic-apps-workflow-definition-language#functions lists the Logic Apps functions available to Flow.  There are some tricks to make the syntax work - but they are all consistent, so practice makes perfect.  Also, there are a LOT of functions.  So it should be fun.
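A few examples of the expression syntax, as you'd type them in the Flow expression editor - the first two are generic samples, the last is the one this post is about:

```
concat('Hello ', triggerBody()?['name'])
formatDateTime(utcNow(), 'yyyy-MM-dd')
dataUriToBinary(triggerBody()['Createfile_FileContent'])
```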


Add Compose Action

Add "@dataUriToBinary(  ...  )" and drag in Createfile_FileContent.  It'll look OK at first, but if you try to Update flow, you'll get an error.

The template validation failed: 'The template action 'Compose' at line '1' and column '1947' is not valid: "The template language expression 'dataUriToBinary(@{triggerBody()['Createfile_FileContent']})' is not valid: the string character '@' at position '16' is not expected.".'.

Note 2018: the Flow designer has been changed since 2017, and the way to write this expression has changed.

  • Create a Compose action

  • In the dynamic content panel that pops up on the right, select the expression editor

  • Type in dataUriToBinary(triggerBody()['Createfile_FileContent'])

  • Note, without the prefix @

  • Hit OK to write the expression into the Compose

Note: Once you save and come back, it won't show the quotes anymore, and it isn't updateable.




So that's all - DataURI to Binary conversion for PowerApps camera to go to SharePoint file.
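Conceptually, all dataUriToBinary() does is strip off the data-URI header and base64-decode the remainder.  A rough Node.js equivalent - my own sketch, not the Flow implementation:

```javascript
// Rough equivalent of Flow's dataUriToBinary().
// "data:image/jpeg;base64,/9j/4AAQ..." -> Buffer of raw bytes
function dataUriToBinary(uri) {
  // everything after the first comma is the base64 payload
  var base64 = uri.substring(uri.indexOf(",") + 1);
  return Buffer.from(base64, "base64");
}
```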


In a way, I'm glad - even in my previous post I argued that data conversion should be native, and shouldn't require a developer.  So this is kind of my wish come true.


Taking a picture with PowerApps and sending to SharePoint with help of Azure Functions


Sometimes, after having written a selfie app in Silverlight, in JavaScript, and even as an Add-in (SharePoint Online), you want to do it again with PowerApps.  This is that article.  I think it's really fun.  And I think it's funny I'm solving the world's problems one AzureFunction at a time.  And I think I need help.

Read More