Scheduled Item Publishing in Modern SharePoint Site Pages

For those of us used to the rich content publishing features in the old “Classic” SharePoint Publishing model, the new Modern experience takes a little getting used to. For example, until very recently there wasn’t much at all in the way of content approval, to say nothing of more advanced features like scheduled item publishing. This month (August 2018) Microsoft announced a new solution for page approvals, using Flow under the covers, which promises to enable the sort of functionality we were used to in Classic publishing sites.

One of the Classic features my clients use all the time is scheduled item publishing. In Classic Publishing libraries, this was a setting on the library that allowed content approvers to set a date and time when the page would be published, and it worked pretty well. This feature is missing in Modern, but with the new Flow capability, we can bring it back.

Enabling the Approval Flow

On the Modern Site Pages library, in the Flow dropdown, we now have the option “Configure Page Approval Flow”.

Configure Page Approval

If we choose this option, we get a slide-out panel that allows us to set up our list of approvers.

List of Approvers

List Approvers

Flow Created

Now, once we’ve done this, we get the option to “Submit For Approval”. Clicking this option opens an initiation form where we can kick off the approval process. The users specified in the flow configuration will get the Approval email, and on approval the page will get published.

Updating the Flow to support scheduled item publishing

To enable scheduled item publishing we need to do two things. First, we need a way to specify the date on which we want to publish. An easy way to do this is to add a Date field to the Site Pages library. Better yet, use a custom content type that inherits from Site Page and add the field there.
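If you have the PnP PowerShell module handy, a quick sketch of the field setup looks something like this; the site URL and field names are just examples, and for simplicity it adds the field straight to the library rather than to a content type.

# Connect to the site that contains the Site Pages library (URL is an example)
Connect-PnPOnline -Url https://yourtenant.sharepoint.com/sites/yoursite -Credentials (Get-Credential)

# Add a date/time field to Site Pages; "PublishDate" is just an example name
Add-PnPField -List "Site Pages" -DisplayName "Publish Date" -InternalName "PublishDate" -Type DateTime -AddToDefaultView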

The second thing we need to do is modify the Flow to add a “Delay Until” action, using the Date field we added to our content type. We’ll put this inside the “Yes” branch of the condition that follows the approval result. It looks something like this:

Delay Until

Now, when we submit a page for approval, we can see the Flow waiting until the publish date and time before proceeding on to publish the page.

Delaying

Some things to be aware of

All in all, this process works pretty well, but the whole Approval Flow business has some rough edges, and some things that don’t quite work as well as they should.

There’s no way to see configured Approval Flows for a library.

SharePoint will happily allow you to configure many approval Flows on a single library, because the UI has no way to show you the Flow(s) that have already been configured. Unless there’s a way I’m not aware of, the only way to see your approval Flow is to go directly to Flow and pick your Flow from the list. Which can be a problem, especially considering…

Flows don’t scale well.

If you have 200 sites, you’ll need to configure 200 separate instances of the Approval Flow. Obviously this is a governance and maintenance nightmare, and woe be upon the person who has to do all this grunt work, because Flow creation is not easily automated. One possibility might be to create a master Flow that all the other Flows call via HTTP request, and simply update each Approval Flow to launch that.

You’re stuck naming individual people as approvers, no SharePoint or AD groups

The UI doesn’t allow for this, and the underlying approval email actions don’t support groups either, which I suspect is the cause of the limitation. We can do a little more work inside the Flow to fetch the members of a SharePoint group, but that’s a topic for another time.

Flows crap out after 30 days

If 30 days pass after a Flow run starts, the run simply stops working. This seems to be a hard limit, so keep it in mind when submitting pages whose publish dates are more than 30 days out.

Wrapping it up

SharePoint’s Modern initiative replaced a mature, battle-hardened system in Classic Publishing, and naturally there are some functionality gaps to close. Microsoft is working quickly to address them, and it seems reasonable to expect further developments along this path. For now, at least, we have a way to deliver scheduled item publishing to our clients in Modern SharePoint.


I’ll be speaking at the SE Michigan PowerApps/Flow User Group Sept 10

I’ll be speaking at the Southeast Michigan PowerApps/Flow user group, September 10 at 5:00, at the Rightpoint offices in Royal Oak.

I’ll be giving a brief introduction to connectors, triggers, and actions in Flow, then talk about how to create your own integrations using raw HTTP actions and convert those into custom actions and connectors. I’ll also demo some real-world implementations of things I’ve done in Flow using this pattern.

Hope to see you there!

Meetup:

SE Michigan PowerApps/Flow User Group meet up

Monday, Sep 10, 2018, 5:00 PM

Rightpoint
909 South Main St, Royal Oak, MI


Venue:
https://binged.it/2wrIH1K

Copy Link in Modern SharePoint – non-obvious security implications you should know about

Recently I encountered a strange issue in a client’s Intranet during the content buildout phase. They’d given read-only access to a group of pilot users, and loaded up their site with pages and links to documents. Then they began to notice that these pilot users appeared to have the ability to delete the documents, and logged a bug with us.

We discovered that the document library had hundreds of files with broken permission inheritance, and that the Everyone principal had been granted Contribute permission on each one of these documents, meaning they could edit and even delete the documents.

Thinking that some rogue user had inadvertently (or “advertently”) shared those documents in error, we ran a script that looped through all the documents in each library and restored the permission inheritance on each one. Then we discovered that the several hundred or so hyperlinks to the documents throughout the system began returning 404s.
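For reference, that cleanup looks roughly like the PnP PowerShell sketch below (the site URL and library name are examples). As we learned the hard way, restoring inheritance this way also invalidates any sharing links that had been generated against those items.

# Connect to the affected site (URL is an example)
Connect-PnPOnline -Url https://yourtenant.sharepoint.com/sites/intranet -Credentials (Get-Credential)

# Re-inherit permissions on every item in the library, removing any unique permissions
Get-PnPListItem -List "Documents" -PageSize 500 | ForEach-Object {
    Set-PnPListItemPermission -List "Documents" -Identity $_.Id -InheritPermissions
}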

Eventually we tracked the issue down to a “feature” of the Copy Link action bar item in Modern SharePoint document libraries. We discovered that Copy Link does a bit more than merely return a link to the document to the user’s clipboard.

The Document Action Bar

My client had been using SharePoint’s Copy Link functionality to create those links, just as we had taught them to. But what we didn’t realize was that clicking Copy Link was actually breaking the security inheritance on the document and sharing it with the entire company. This was because the tenant settings that drive this functionality had been left at their defaults, which are inexplicably the most permissive, and therefore the most insecure, option.

Check out what happens when you click the button:

Copy Link Dialog

Once you see this dialog, permission inheritance has already been broken and the permission “Anyone with the link can edit” has already been applied. If you select another option, the permission will update – even to the point of reinstating permission inheritance if “People With Existing Access” is selected. Also, the link regenerates, and previously generated links become stale and return 404s.

Copy Link Options

The link structure will tell the sharing story

If a Copy Link operation results in broken inheritance, the generated link will look different from one that doesn’t.

A Sharing Link looks like this:
https://m365x692092.sharepoint.com/:w:/g/Ea90HDWefS1BnLLgtVkMNJgBdpUI6LiBC7Kw4pj0g-CIAQ?e=troe2t

..while a non-shared link will look like this:
https://m365x692092.sharepoint.com/:w:/r/Shared%20Documents/CAS/Marketing%20Strategy%20Future.docx?d=w351c74af7d9e412d9cb2e0b5590c3498&csf=1&e=TWfoVC

Note that a sharing link shows the tenant followed by a long string of crap, and the non-sharing link, while also containing its share of trailing junk, also seems to incorporate a physical path as part of its structure. So using this pattern you should be able to tell if a Copy Link resulted in broken inheritance.
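If you want to check a batch of links, a crude heuristic based on the two examples above is to look for the /g/ segment after the document-type token (a sharing link) versus /r/ followed by a library path (a direct link). A rough PowerShell sketch, treating the result as a hint rather than proof:

# Crude check based on the URL patterns shown above
$link = "https://m365x692092.sharepoint.com/:w:/g/Ea90HDWefS1BnLLgtVkMNJgBdpUI6LiBC7Kw4pj0g-CIAQ?e=troe2t"
if ($link -match '/:\w:/g/') {
    "Looks like a sharing link; inheritance may have been broken"
}
else {
    "Looks like a direct link"
}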

My thoughts on this

You have some options for setting the default behavior of this function, but like I said the default default is the most permissive. The decision to have it behave this way vexes me somewhat. In previous versions of SharePoint it’s been difficult and tedious to break permission inheritance through the UI, and I think it ought to be that way. Breaking inheritance should only be done with serious consideration as it’s difficult to support and also has performance implications – a Microsoft employee once told me that breaking inheritance “makes SQL cry”. Maybe in the cloud we care less about performance implications because all that stuff is abstracted away. But it’s still there and I’d have to believe Microsoft cares about its servers. Anyway…

Know your tenant settings

We can manage the tenant-wide default behavior for Copy Link by navigating directly to https://yourtenant-admin.sharepoint.com/_layouts/15/online/ExternalSharing.aspx (substituting your own tenant name).

There are a number of settings related to Sharing on this page but the ones we care about are under the headings “Default Link Type” and “Default Link Permission”. The defaults look like this.

Tenant Settings

Note that in the Copy Link dialog we had four options for how to share the link, and the tenant setting only allows for three, excluding, maddeningly, the “People with Existing Access” option, the one I think should be the default. If we select the “Direct – specific people” option, though, and simply don’t specify any people, the result will be the same.

The “Use shorter links” option only substitutes the “guestaccess.aspx” URL with the cryptic sharing URL we saw earlier, so nothing much to see there. The Default Link Permission setting, if set to Read, will at least limit the damage done if files are inadvertently shared to the general population.

Manipulating the settings using PowerShell

Of course these settings can also be managed using PowerShell, at both the Tenant and Site Collection level. The Site Collection level settings will override the Tenant level settings for the site in question. Check out the documentation for Set-SPOTenant and Set-SPOSite. The options you want to look into, on both commands, are DefaultSharingLinkType and DefaultLinkPermission. Make sure to check out the other settings related to sharing just to get a feel for how they work.
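As a rough sketch (the URLs are examples, and you'll need the SharePoint Online Management Shell), setting more conservative defaults might look like this:

# Connect to the admin center first (tenant URL is an example)
Connect-SPOService -Url https://yourtenant-admin.sharepoint.com

# Tenant-wide defaults: direct links that grant read-only access
Set-SPOTenant -DefaultSharingLinkType Direct -DefaultLinkPermission View

# Override the defaults for a single site collection
Set-SPOSite -Identity https://yourtenant.sharepoint.com/sites/intranet -DefaultSharingLinkType Direct -DefaultLinkPermission View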

 

The SharePoint Modernization Scanner

While spelunking through GitHub this week I came across a useful tool in the PnP Tools repo that can generate some pretty interesting data about your Office 365 tenant.

It’s called the SharePoint Modernization Scanner, and it claims to grease the skids for your movement to Modern and Group-ification of your existing sites.

The complete source code is there but they’ve also included a direct link to the executable if you’re not interested in building it and just want to run the darn thing, which is what I did against a few of my tenants.

Running the darn thing

The default configuration for the tool uses a Client ID and Secret for a tenant-scoped App to authenticate into the tenant, which is pretty smart because it’s not guaranteed that admin user accounts will have access to all sites, even with policies in place to enforce it. (It’s the real world, things happen)  So, before you can run this you’ll want to make sure you have such an app and have the client ID and secret. You can also use normal credentials, just be aware of the access issue.

In order to make it work you’ll need to grab a file called webpartmapping.xml from the source code and drop it into the same directory where you’ve downloaded the executable.  Then open a PowerShell session and CD into that directory and run something like this:

./SharePoint.Modernization.Scanner.exe -t tenantname -i {client_id} -s {client_secret}

(Documentation)

The process will run for a while, depending on how much stuff is in your tenant. On one of my tenants, with 400 site collections, it took about 15 minutes, and when it finished I had a nice collection of CSV files:

scannerResults

With this data we can see every site, its template, the deployed custom actions, and detailed information about every page and web part in the tenant.

Remote Event Receivers – you’re all doing it wrong

Remote Event Receivers are a powerful way to integrate custom code into your SharePoint Online environment.  Essentially a Remote Event Receiver is a hook that allows you to execute your code in response to an event that occurs in SharePoint.  There are several techniques for responding to events in SharePoint Online, but the Remote Event Receiver is the most powerful. It offers dozens of different events to attach to and lets you run them synchronously or asynchronously. It is also very easy to attach, develop, deploy, test, and maintain. It is also very misunderstood.

Remote Event Receivers were introduced along with SharePoint 2013 and the arrival of the App model. Microsoft provided tooling with Visual Studio to create Remote Event Receivers, but unfortunately the only way to expose this tooling was in the context of a Provider-Hosted App.

This is unfortunate because Remote Event Receivers have nothing to do with Provider-Hosted Apps. In order to develop a Remote Event Receiver using the Microsoft-provided tooling, a developer had to create a Provider-Hosted App project and deploy their RER along with it. This added a great deal of complexity to the development effort, and made packaging and deployment a painful and tedious experience.

To further complicate things, Microsoft decided to wire up a WCF service as the endpoint in its RER tooling. This was sheer lunacy, even back in 2013. A Web API project would have been simpler and more in line with the direction of Microsoft’s development tooling. The opacity and complexity of WCF made RER development even more cumbersome. I suspect they chose a WCF service so the development experience would be similar to that of old-school Event Receivers, with the ability to use a deserialized Event Properties object.

Building a Remote Event Receiver is Easy

In truth, it is remarkably simple to configure, develop, deploy, and maintain a Remote Event Receiver, but in order to do so you must completely abandon the Microsoft tooling and just set up the pieces yourself. Luckily there are really only two components to a Remote Event Receiver:

  • The endpoint
  • The registration

The registration is where you tell SharePoint, “call this endpoint every time this event occurs”. The CSOM provides mechanisms for adding Remote Event Receivers, but the details depend somewhat on the type of event receiver being deployed. The  PnP PowerShell library provides the ability to register a Remote Event Receiver in a one-liner. For example, to set up an RER that is invoked every time an item is updated on a list, execute:

Add-PnPEventReceiver -List "Tasks" -Name "TasksRER" -Url https://my-rer.azurewebsites.net/Service1.svc -EventReceiverType ItemUpdated -Synchronization Asynchronous

More about registering Remote Event Receivers

The endpoint is just a web service listening at a certain URL, and you have lots of options for this. A Web API project would work great for this. Azure Functions are also a very compelling option. You are also free to write services in Java, Node or whatever other technology you can think of. In the example below, we’ll use the canonical WCF Service you’d get with the Visual Studio item template, but we’re going to sidestep the template and wire things up ourselves. It’s actually easier this way.

 

Creating the Remote Event Receiver shell

In Visual Studio,

  1. Create an empty ASP.NET web application.
  2. Add the NuGet package “AppForSharePointOnlineWebToolkit”.
  3. Add a new item of type WCF Service to the web app.
  4. Get rid of the IService reference and set your service to implement IRemoteEventService, which lives in the Microsoft.SharePoint.Client.EventReceivers namespace. This namespace came into the project with the NuGet package we added earlier. Resolve the squiggly to implement the interface stubs.

Your service class should look something like this:

RER-stub

F5 your project and navigate to the service to make sure it’s accepting requests. Take note of the port number.  We’ll need that to set up our proxy for local debugging.

Locally debugging your remote event receiver

To test this event receiver locally, we’ll use ngrok. According to its documentation, “ngrok is a reverse proxy that creates a secure tunnel from a public endpoint to a locally running web service.”  We will use it to map an Internet endpoint to our local machine so we can intercept and debug requests coming from SharePoint Online.

Assuming you have installed Node.JS and ngrok, create your proxy by executing the following. 56754 is the port number hosting my local WCF service.

ngrok-1
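For reference, the command takes the form ngrok http <port>, so for the port above it's roughly:

# Tunnel public traffic to the local WCF service running on port 56754
ngrok http 56754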

Once it connects it’ll output some data, including the public URL of our proxy connection:

ngrok-2

Next, open up a browser and navigate to the ngrok URL, append the service endpoint, and you should be able to see that the ngrok URL is returning your service.

ngrok-3

 

Next we’ll attach our event receiver to a SharePoint list. Using the PnP PowerShell cmdlet shown above, we’ll add a Remote Event Receiver to our site. Make sure to open a new PowerShell session for this and leave the ngrok session running in its window. The proxy will be released when the window is closed.
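The registration against the ngrok URL looks roughly like this (the site URL and the ngrok subdomain are placeholders; yours will differ):

# Connect to the site that hosts the Tasks list (URL is an example)
Connect-PnPOnline -Url https://yourtenant.sharepoint.com/sites/dev -Credentials (Get-Credential)

# Register the receiver against the ngrok endpoint; the subdomain is a placeholder
Add-PnPEventReceiver -List "Tasks" -Name "TasksRER" -Url https://abc123.ngrok.io/Service1.svc -EventReceiverType ItemUpdated -Synchronization Asynchronous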

Now, set a breakpoint in your ProcessOneWayEvent method, add a task to the list and make an edit to it.  If all goes well your local web service will be called and the breakpoint will hit:

rer-2

Make sure you closely inspect the properties object in the debugger, and get a feel for all the data that’s in there. For this particular event we’ll want to check out ItemEventProperties and ItemEventProperties.AfterProperties for some useful metadata that gets passed into the service.

Deploying your Remote Event Receiver to Azure

When you are ready to deploy to the Internet you can deploy to Azure just like a normal web application. You’ll want to run Add-PnPEventReceiver using the Internet URL to register your event receiver for real, of course.

 

Create a Yammer Group with Microsoft Flow

Microsoft Flow is a fantastic enterprise tool and comes with hundreds of default actions, which allow you to easily perform integrations to different services, including Yammer.

Flow gives us several actions out of the box that we can use to perform integration activities against Yammer:

1

These actions pretty much revolve around fetching and creating messages. True, this is the core thing we do in Yammer, but sometimes our requirements force us to step outside the default capabilities of our platforms and think of creative ways to solve complex problems.

We do a lot of Yammer integrations in our solutions at Rightpoint, and creating a Yammer group is something we do all the time. As we’ve seen, Flow doesn’t give us an action to create Yammer groups, but there is a Yammer API that will do this, and it can be very easily executed via Flow. (Technically speaking, the API to create Yammer groups is an undocumented API. We’ll discuss what that means a little later.)

We’re going to achieve this by POST-ing the necessary data to a Yammer API endpoint, using Flow’s HTTP action.  The HTTP action is, in my opinion, the most powerful and flexible Flow component. It’s so powerful because we can literally do anything we want.  At a low enough level of abstraction, every action is an HTTP action anyway.

Authenticating to Yammer

The Yammer API uses OAuth tokens to authenticate, so before we start our Flow we’ll need to create an app in Yammer. Navigate to https://www.yammer.com/YOUR_ORG_NAME/client_applications, and click the green “Register New App” button.

On the “Register New App” screen, fill out the required fields. Absolutely none of the fields in this form have any bearing whatsoever on what we’re doing. If you’re planning to build a web app and publish it in the global Yammer app marketplace, these fields will be needed, but 99.9% of the time they’re pointless. But they’re required fields, so put in whatever will allow the validation to pass.

2

What we’re really after is the Oauth token. When we save our app, we’ll be redirected to a configuration page where we’ll see the Client ID and Client Secret. By clicking the “Generate a developer token” link, we will expose the app’s token. Don’t worry about the text claiming that this is for testing purposes; they’re still assuming you’re building web apps for the global store. Auth tokens in that case are generated on the client side for each user.

3

Won’t this Auth token expire?

Yammer app tokens last a long time, although Yammer won’t disclose exactly what the expiration terms are. I can say that I’ve had Yammer integrations authenticating in this fashion since about 2014, and I’ve never seen one expire. They can be revoked, however, and the account that created the app has the ability to do this.

Yammer Apps – other considerations

There are a few things you’ll want to know as you develop Yammer apps in this fashion:

  • Anyone can create a Yammer App. You don’t need to be an administrator to do this.
  • Yammer Apps execute under the security context of the creating user account. Content created by the app will appear to have been created by the user that created the app. Think of it this way: through the auth token the app author is delegating their access to anyone who holds it.
  • If your token is compromised you can invalidate it by navigating to yammer.com/YOUR_ORG_NAME/account/applications and clicking the “Revoke Access” link next to your app.
  • Consider using a dedicated “Service Account” for creation of Yammer Apps. Not only does this protect your user account should the token get compromised, it ensures the token continues to work should your account get disabled – for example, if you leave the organization.

Understanding the Yammer group creation API

If you do any work with the Yammer API, you’ll want to check out the Yammer API documentation.

There you’ll find more detailed information about using the API, and you’ll see a listing of all the supported endpoints exposed via the API.

Notice that there is no listed endpoint to create groups. This is because the group creation endpoint is undocumented. Something to consider when working with undocumented endpoints is that Yammer is not bound to provide support for it, nor will they feel obliged to maintain backward compatibility should they ever decide to update their APIs. So there’s an element of risk to working with these endpoints, and you should be prepared to accept that one morning you might wake up to find that all your stuff is broken.  I would counter that by saying Office 365 often imposes breaking changes even on supported stuff, and these APIs have been stable for a number of years running.

The endpoint we use to create groups in Yammer looks like this:

https://www.yammer.com/api/v1/groups.json?name=GROUP_NAME&private=false&show_in_directory=true

We will POST to this URL, set a Content-Type of application/json, add the Yammer auth token as a bearer token, and leave the body empty. The parameters should be self-evident; the last two are optional and default to a public, listed group.
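In script form, the request might look roughly like this (the token value is a placeholder and the group name is just an example):

# Developer token from the Yammer app registration (placeholder value)
$token = "YOUR_YAMMER_DEVELOPER_TOKEN"

# Create a public, listed group named "Flow Demo Group"
$url = "https://www.yammer.com/api/v1/groups.json?name=Flow%20Demo%20Group&private=false&show_in_directory=true"

Invoke-RestMethod -Method Post -Uri $url -ContentType "application/json" -Headers @{ Authorization = "Bearer $token" }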

Let’s test this out. We can use Fiddler or Postman for this, but I’ve recently discovered the Visual Studio Code REST Client extension, and that’s what I’ll be using here. You can read more about it here.

Set your request to look like this:

4

It’s as simple as that. If you did everything right, you should get a 202 response back and your new group will be shown in Yammer, along with a notification sent to All Company:

5

Remember, you’ll want to use a service identity, which I didn’t do in this case.

Creating our Flow

Now that we’ve figured out how to post to Yammer using the raw API, let’s incorporate that into a Flow. In my Flow I’m going to use an HTTP trigger, so I can call this as a service from other applications or even from other Flows. We’re going to pass three parameters into our Flow trigger: groupName, isPrivate, and showInDirectory. We’ll use the sample JSON option to generate the request body our trigger will be expecting:

6

Next we’ll create a variable to construct our REST API URL. Its configuration, which uses the same URL structure we discussed before and pulls the parameters from the trigger body JSON, will look like this:

7.png

Now we can create and configure our HTTP action:

8

Now, assuming we’ve hooked everything up properly, we can call our Flow. I’ll be using the VS Code extension again just to keep things simple:

9
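If you'd rather call the Flow from a script, a rough PowerShell equivalent looks like this (the trigger URL is a placeholder; copy the real one from the HTTP trigger in your Flow):

# The HTTP trigger URL is a placeholder; copy the real one from the Flow designer
$flowUrl = "https://prod-00.westus.logic.azure.com/workflows/your-workflow-id/triggers/manual/paths/invoke?api-version=2016-06-01"

# The same three parameters the trigger expects
$body = @{
    groupName       = "Flow Demo Group"
    isPrivate       = $true
    showInDirectory = $false
} | ConvertTo-Json

Invoke-RestMethod -Method Post -Uri $flowUrl -ContentType "application/json" -Body $body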

If all goes well we’ll get a 201 response and our group will be present in Yammer:

10

Note that I passed in parameters to create a private, unlisted group, and you can see this group is private and won’t be listed in the Groups list on Yammer. Also, it doesn’t create the notification message in All Company. Note that creating a public, unlisted group is unsupported and will return an error response.

Wrapping it up

In this post I showed a technique for integrating the Yammer API with Microsoft Flow, and used it to create groups in Yammer.  Using Flow’s HTTP actions we can do just about anything that can be done over HTTP.  For more info on the Yammer REST APIs, check out their official documentation.

Microsoft Flow: First Impressions

Over the last several weeks I’ve had my first experiences using Microsoft Flow in a real-world application. The client has dozens of old 2010-style SharePoint Designer workflows touching a number of business functions: Sales, Procurement, Change Management, and Human Resources, and they were looking for a way to modernize their development process and eliminate some of the quirks and irksome bugs that have been plaguing their users.  Hearing that the client was looking down the road at moving from on-premises SharePoint Server 2013 to the cloud, I recommended re-writing a number of these processes in Flow instead of SPD.

flow

Flow is a part of Microsoft’s new cloud-based platform for process modeling, for lack of a better phrase. The idea is that non-developers can use Flow’s intuitive user interfaces to build robust integrations between their line of business applications with no code anywhere to be found.

At first glance, Flow seems to be a huge improvement over the experience of building workflows in SharePoint Designer. For starters, it’s web-based, so there’s nothing to install. Flow comes with an impressive array of standard integration points (“connectors”), a handful of entry points (“triggers”) and hundreds of pre-defined activities you can configure (“actions”).  By dragging and dropping widgets onto the control surface and setting up some basic properties, power users can create powerful applications without having to rely on developers or IT to set it up for them.

Here are a few quick takeaways from my experiences so far.

Low expectations

SharePoint Designer workflows come with so much baggage that it’s difficult to imagine preferring them to any succeeding technology that comes along. So the bar here is low.

Wide range of capability

Flow’s range of out-of-the-box integration points is very impressive, and they keep adding new connectors and actions all the time. There’s even an extension model where you can create your own and submit them for inclusion in the platform.

More of a consumer focus

Many of the integration points, though, don’t seem to make a lot of sense in most enterprise scenarios. Twitter, Facebook, and Gmail are some such connectors. And many of the starter templates are more in the personal productivity realm. For example,

  • Text me when I get an email from my boss
  • Email me when a new item shows up in a SharePoint list
  • Start a simple approval process on a document when it’s posted
  • Save Tweets to an Excel file
  • Send me an email reminder every day

 

Easy to extend

It’s really simple to create extension points in Flow. Suppose you have a need to do something that isn’t supported by a Flow action. If you can code, you can write an API to do what you need, and call it via an HTTP action.  Azure Functions work really well for this. In fact, the HTTP action is the most powerful thing in Flow. You can even use it to trigger other Flows from within a Flow.

Approvals are not fully baked

If you’re building approval workflows and are expecting the way SharePoint Designer works, you’ll be disappointed. An Approval in Flow consists of an email and two buttons, nothing more. There is no concept of setting a status on an item, no functionality for logging (unless you roll it yourself), and no notion of tasks. It changes the way you think about approvals in general, because the old model just doesn’t apply here.

The Designer does not scale

For a simple two- or three-step flow, the designer works great. Add a couple of nested if/else blocks (‘conditions’), or more than a half-dozen or so actions, and you’ll find that  the design surface is totally unsuited to the task. Scroll bars are in difficult-to-find places and it’s often next to impossible to maintain your context when trying to move around within a Flow.

Sometimes saving a Flow will trigger a phantom validation error, and you’ll have to expand every one of your actions until you find the offending statement, because the Flow team have not seen fit to provide any sort of feedback on where the failure occurred. In addition, sometimes, especially when working with variables, the validation will fail even though the variable is properly configured.

No Code view

As clunky as the designer gets, if you’re a developer you might be more comfortable just coding your Flow the old fashioned way – after all, it’s just JSON under the hood. But alas, code view is not available in Flow. The design view is all you have.

Another implication of this: If you have places in your Flows where there are large blocks of similar functionality, you have no option to copy blocks of code and modify to suit. You’re stuck having to re-create those similar blocks of functionality, manually, in the designer, every single time. Believe me, this gets old really fast.

No versioning

If you make a change to your Flow and somehow break it, well, that’s tough, you’d better figure it out because there’s no rolling back.

Clearly Flow is not the magic bullet in the Enterprise process modeling world. It has its quirks and its pitfalls. But remember, the bar is low due to the legacy application it replaces. SharePoint Designer workflows share many of the same deficiencies as Flow: a clumsy design experience (check), an inability to edit code directly (for all practical purposes), and no rollback model (technically possible in SPD via version history but janky as hell).

Given that SPD has had its ten-plus years in the limelight, and Flow is a brand-new V1 product with an engaged product team, I’d say the future looks bright for Flow.