Category: Azure

This post will try to explain some relevant parameters from the ADFS side. I’m not saying the defaults aren’t good, that’s something you’ve got to decide for yourself.


WS-Fed/SAML protocol requirement: all time is expressed in UTC. ADFS ignores the local time zone and always uses UTC.

Dates in SAML

A Security Assertion Markup Language (SAML) assertion might contain many attributes that contain dates. For example, the following Conditions element of a SAML assertion has two date attributes:
<saml:Conditions NotOnOrAfter="2013-07-29T23:49:40.051Z" NotBefore="2013-07-29T23:39:40.051Z">

Time in SAML elements is expressed in xs:dateTime format (xs is the XML Schema namespace prefix). The Z at the end indicates that the time zone is Coordinated Universal Time (UTC), also known as Zulu time. SAMLCore explicitly references the “W3C XML Schema Datatypes specification [Schema2]”, which in turn references ISO 8601.
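Under the hood this is plain ISO 8601 parsing. Here is a minimal Python sketch (standard library only, `parse_saml_instant` is my own helper name) that parses the timestamps from the Conditions element above:

```python
# Parse the xs:dateTime values from the SAML Conditions element above.
from datetime import datetime, timezone

def parse_saml_instant(value):
    """Parse an xs:dateTime string such as 2013-07-29T23:49:40.051Z into an aware UTC datetime."""
    return datetime.strptime(value, "%Y-%m-%dT%H:%M:%S.%fZ").replace(tzinfo=timezone.utc)

not_before = parse_saml_instant("2013-07-29T23:39:40.051Z")
not_on_or_after = parse_saml_instant("2013-07-29T23:49:40.051Z")

# The assertion above is valid for a 10-minute window:
print((not_on_or_after - not_before).total_seconds() / 60)  # 10.0
```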

Dates in OpenTokens

An OpenToken contains multiple dates. For example the following OpenToken contains three dates:


The OpenToken format also uses ISO 8601 dates, expressed in UTC.
See also: Microsoft KB 884804 – How to convert UTC time to local time

WebSSOLifetime (Default 480 minutes = 8 hours)

This parameter is server-wide: if you configure it, it applies to all of the ADFS relying parties. Whenever a user requests a token for a given RP, he first has to authenticate to the ADFS service. Upon communicating with the ADFS service he receives two tokens: a token which proves who he is (let’s call that the ADFS token) and a token for the RP (let’s say the RP token). All in all this looks very much like the TGT and TGS tickets of Kerberos.

The WebSSOLifetime timeout determines how long the ADFS token can be used to request new RP tokens without re-authenticating. In other words, a user can request new tokens for this RP, or for other RPs, without having to prove who he is again until the ADFS token expires after WebSSOLifetime minutes.

TokenLifetime (Default 0, which means 10 hours)

The TokenLifetime is now easy to explain. This parameter is configurable per RP. Whenever a user receives an RP token, it expires at some point. At that time the user has to go to the ADFS server again and request a new RP token. Depending on whether the ADFS token is still valid, he may not have to re-authenticate.

One argument for lowering the TokenLifetime is that you may want claims to be refreshed faster. With the default, whenever some of the attribute store information is modified, it can take up to 10 hours before the change reaches the user’s claims.
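The interplay between the two lifetimes can be sketched in a few lines of Python. This is purely illustrative: the helper names and decision logic are mine, not an ADFS API, and the values mirror the defaults described above.

```python
# Illustrative sketch of the two ADFS timeouts (not an ADFS API):
# - the ADFS token expires WebSSOLifetime minutes after authentication,
# - each RP token expires TokenLifetime minutes after it is issued.
from datetime import datetime, timedelta, timezone

WEBSSO_LIFETIME = timedelta(minutes=480)  # server-wide default: 8 hours
TOKEN_LIFETIME = timedelta(minutes=600)   # per-RP default (0 = 10 hours, per the text above)

def needs_new_rp_token(rp_token_issued_at, now):
    """The RP token must be renewed once TokenLifetime has elapsed."""
    return now >= rp_token_issued_at + TOKEN_LIFETIME

def needs_reauthentication(authenticated_at, now):
    """Renewing an RP token only forces re-authentication once the ADFS token has expired."""
    return now >= authenticated_at + WEBSSO_LIFETIME

auth_time = datetime(2013, 7, 29, 8, 0, tzinfo=timezone.utc)
later = auth_time + timedelta(hours=9)
# After 9 hours the ADFS token (8h) is gone, but an RP token issued at
# authentication time (10h) would still be valid:
print(needs_reauthentication(auth_time, later), needs_new_rp_token(auth_time, later))  # True False
```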

NotBefore and NotOnOrAfter, NotBeforeSkew

Setting up federated trusts with third-party vendors to provide users with single sign-on (SSO) is very common, and SAML2 is the preferred method for SSO authentication.  One issue with this method is ensuring the SAML tokens have a valid lifespan: basically, when does a token become valid and when is it no longer valid?  The SAML specification defines a <saml:Conditions> element, which contains two attributes: NotBefore and NotOnOrAfter.  The NotBefore attribute contains the date and time value that specifies when the assertion becomes valid.  The NotOnOrAfter attribute contains the date and time value that specifies when the SAML assertion is no longer valid.  Both must be UTC datetimes, without a time zone offset.  As long as the SAML token is used between the NotBefore and NotOnOrAfter times, the assertion will be valid.
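The validity rule can be expressed directly. A small illustrative sketch (the helper is hypothetical, not part of any SAML library):

```python
# Validity check for a SAML assertion's Conditions window:
# valid when NotBefore <= now < NotOnOrAfter (hence "NotOnOrAfter").
from datetime import datetime, timezone

def is_assertion_valid(now, not_before, not_on_or_after):
    return not_before <= now < not_on_or_after

not_before = datetime(2013, 7, 29, 23, 39, 40, tzinfo=timezone.utc)
not_on_or_after = datetime(2013, 7, 29, 23, 49, 40, tzinfo=timezone.utc)

inside = datetime(2013, 7, 29, 23, 45, 0, tzinfo=timezone.utc)
print(is_assertion_valid(inside, not_before, not_on_or_after))           # True
print(is_assertion_valid(not_on_or_after, not_before, not_on_or_after))  # False: "on or after" is excluded
```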

But what happens when the IdP server time and the third-party server time are off by a few seconds, or even a couple of minutes?  Simple: authentication may fail, because the third-party server may see the SAML token as not yet valid.

Luckily, ADFS 3 (Windows Server 2012 R2) offers a simple solution: a time skew value can be added to the relying party trust on the ADFS server.  This property is called NotBeforeSkew.  It contains the number of minutes by which to adjust the NotBefore value.  Setting NotBeforeSkew to a value of 5 will shift NotBefore by -5 minutes.

The following PowerShell command can be used to set the NotBeforeSkew value.

Set-ADFSRelyingPartyTrust -TargetIdentifier "<relying party identifier>" -NotBeforeSkew 5
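To see what the skew does to the token, here is an illustrative sketch (Python, not ADFS code; the helper and parameter names are mine): with NotBeforeSkew set to 5, the stamped NotBefore moves five minutes into the past, so a relying party whose clock lags the IdP slightly still accepts the assertion.

```python
# Illustrative computation of the Conditions window with NotBeforeSkew = 5.
from datetime import datetime, timedelta, timezone

def conditions_window(issue_instant, token_lifetime_minutes=60, not_before_skew_minutes=5):
    """Return (NotBefore, NotOnOrAfter) as the IdP would stamp them (sketch only)."""
    not_before = issue_instant - timedelta(minutes=not_before_skew_minutes)
    not_on_or_after = issue_instant + timedelta(minutes=token_lifetime_minutes)
    return not_before, not_on_or_after

issued = datetime(2013, 7, 29, 23, 44, 40, tzinfo=timezone.utc)
nb, na = conditions_window(issued)

# An SP whose clock lags the IdP by 2 minutes still sees the token as already valid,
# because NotBefore was backdated by 5 minutes:
sp_clock = issued - timedelta(minutes=2)
print(nb <= sp_clock < na)  # True
```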



What’s new in ADFS 2016?

  • Eliminate Passwords from the Extranet
  • Sign in with Azure Multi-factor Authentication
  • Password-less Access from Compliant Devices
  • Sign in with Microsoft Passport
  • Secure Access to Applications
  • Better Sign in experience
  • Manageability and Operational Enhancements

You can upgrade an AD FS 2012 R2 farm using the “mixed farm” process described here. It works for WID or SQL farms, though the document shows only the WID scenario. Here is another upgrade procedure:

  1. Active Directory schema update using ‘ADPrep’ with the Windows Server 2016 additions
  2. Build Windows Server 2016 servers with ADFS and install into the existing farm and add the servers to the Azure load balancer
  3. Promote one of the ADFS 2016 servers as “primary” of the farm, and point all other secondary servers to the new “primary”
  4. Build Windows Server 2016 servers with WAP and add the servers to the Azure load balancer
  5. Remove the WAP 2012 servers from the Azure load balancer
  6. Remove the ADFSv3 servers from the Azure load balancer
  7. Raise the Farm Behavior Level feature (FBL) to ‘2016’
  8. Remove the WAP servers from the cluster
  9. Upgrade the WebApplicationProxyConfiguration version to ‘2016’
  10. Configure ADFS 2016 to support Azure MFA and complete remaining configuration

Other links:

ADFS 2016 operations

ADFS 2016 deployment

ADFS 2016 design

Office 365 and Project/Visio ?

Regarding the deployment of the binaries:

Regarding running Project/Visio alongside Office 365:

Latest version: 1.8 update 1 – Azure ATP and ATA v1.9 planned for Q1 2019

News from Ignite event 2017:   

Azure ATP:

Technet resource:

ATA 1.8 simulation playbook:

ATA powershell module:

(copied under \\microsoft\microsoft ATA\)

News from pentesters:


What’s new in ATA version 1.8

Suspicious activity guide:

New & updated detections

  • NEW! Abnormal modification of sensitive groups – As part of the privilege escalation phase, attackers modify groups with high privileges to gain access to sensitive resources. ATA now detects when there’s an abnormal change in an elevated group.
  • NEW! Suspicious authentication failures (Behavioral brute force) – Attackers attempt to brute force credentials to compromise accounts. ATA now raises an alert when an abnormal failed authentication behavior is detected.
  • NEW! Remote execution attempt – WMI exec – Attackers can attempt to control your network by running code remotely on your domain controller. ATA added detection for remote execution leveraging WMI methods to run code remotely.
  • Reconnaissance using directory services queries – In ATA 1.8, a learning algorithm was added to this detection, allowing ATA to detect reconnaissance attempts against a single sensitive entity and improving the results for generic reconnaissance.
  • Kerberos Golden Ticket activity – ATA 1.8 includes an additional technique to detect golden ticket attacks, detecting time anomalies for Kerberos tickets.
  • Enhancements to some detections, to remove known false positives:
    • Privilege escalation detection (forged PAC)
    • Encryption downgrade activity (Skeleton Key)
    • Unusual protocol implementation
    • Broken trust


  • NEW! More actions can be taken on suspicious activities during the triage process.
    • Exclude some entities from raising future suspicious activities. Prevent ATA from alerting when it detects benign true positives (e.g. an admin running remote code or using nslookup) or known false positives (don’t open a Pass-The-Ticket alert on a specific IP).
    • Suppress a recurring suspicious activity from alerting.
    • Delete suspicious activities from the timeline.
  • A more efficient triage – The suspicious activities timeline has gone through a major redesign. In 1.8, many more suspicious activities are visible at the same time, with better information for triage and investigation purposes.


  • NEW! Summary report. An option to see all the summarized data from ATA, including suspicious activities, health issues and more. It’s possible to define a recurring report.
  • NEW! Modifications to sensitive groups report, to see all the changes made in sensitive groups during a certain period.


  • Lightweight Gateways can now read events locally, without configuring event forwarding
  • Feature flags were added for all detections, periodic tasks and monitoring alerts
  • Accessibility ramp-up – ATA now stands with Microsoft in providing an accessible product, for everyone.
  • E-mail configuration for monitoring alerts and for suspicious activities is now separate


  • NEW! Single sign on for ATA management.
    • Gateway and Lightweight gateway silent installation scripts will use the logged on user’s context, without the need to provide credentials.
  • Local System privileges removed from Gateway process
    • You can now use virtual accounts (available on stand-alone GWs only), managed service accounts and group managed service accounts to run the ATA Gateway process.
  • Auditing logs for ATA Center and Gateways were added and all actions are now logged in the event viewer.
  • Added support for KSP certificates


Version: 1.7

Reference articles:

ATA on Technet:

ATA events:

ATA deployment demo:


Additional resources:

Powershell windows forensics:




Today AD FS is made highly available by setting up an AD FS farm. Some organizations would like a way to have a single server AD FS deployment, eliminating the need for multiple AD FS servers and network load balancing infrastructure, while still having some assurance that service can be restored quickly if there is a problem. The new AD FS Rapid Restore tool provides a way to restore AD FS data without requiring a full backup and restore of the operating system or system state. You can use the new tool to export AD FS configuration either to Azure or to an on-premises location. Then you can apply the exported data to a fresh AD FS installation, re-creating or duplicating the AD FS environment.


The AD FS Rapid Restore tool can be used in the following scenarios:
1. Quickly restore AD FS functionality after a problem
  • Use the tool to create a cold standby installation of AD FS that can be quickly deployed in place of the online AD FS server

2. Deploy identical test and production environments
  • Use the tool to quickly create an accurate copy of the production AD FS in a test environment, or to quickly deploy a validated test configuration to production

What is backed up

The tool backs up the following AD FS configuration:
  • AD FS configuration database (SQL or WID)
  • Configuration file (located in the AD FS folder)
  • Automatically generated token signing and decrypting certificates and private keys (from the Active Directory DKM container)
  • SSL certificate and any externally enrolled certificates (token signing, token decryption and service communication) and corresponding private keys (note: private keys must be exportable and the user running the script must have permissions to access them)
  • A list of the custom authentication providers, attribute stores, and local claims provider trusts that are installed

Download and usage


Problem description, security issue?

When I log on to my ADFS link below

It shows two of my relying parties asking me to sign in.

I have up to 8 other applications I am federating, but they are not showing up on this link.

Is this normal? If it’s not, how can I remove it? Is this something my relying partner has to fix?



Do not disable the /adfs/ls endpoint from ADFS management snap-in.

Ask the application provider (SP) to use WS-Fed instead of SAML; RPs using WS-Fed (like the MS Office 365 RP) will not show up in the list of RP trusts.

As long as your Relying Party trust has a SAML Assertion Consumer Endpoint, it will show up in the list of RP available for IDP initiated logon.

You can check if you have such endpoints in the graphical interface, or in PowerShell:
(Get-AdfsRelyingPartyTrust -Identifier "<ID of your RP>").SamlEndpoints

In Windows Server 2012 R2, you cannot avoid disclosing this information. You can use JavaScript to hide it from the users, but the info will still be present in the code (if the users are curious and look at the HTML source, they will see it). If this is acceptable for you, you can go ahead and create a custom JavaScript for that; samples are available on the Internet.

Note that ADFS on Windows Server 2016 changed that behavior and the IdpInitiatedSignon page is not enabled by default. Although once enabled, you still need the JavaScript to hide the list or a part of the list.

This is normal. The Relying Party Trusts showing up are the ones using the SAML Federation Protocol since that protocol has a ‘feature’ called IdP Initiated Sign On where the user can first be authenticated by your ADFS and then choose which of these Relying Party Trusts/Service Providers they want to access (by having ADFS issue them a SAML Token) and POST/Redirect the browser to that Relying Party Trust/Service Provider.

Do note that just because a Relying Party Trust/Service Provider is listed doesn’t automatically mean that they actually DO support IdP Initiated Sign In. Some Service Providers using the SAML Protocol might only accept Service Provider Initiated Sign In.

I’ve hidden this list on my ADFS 2.0 Proxies for un-authenticated users (but not on our ADFS 3.0 WAPs yet).

In ADFS 2.0 edit C:\inetpub\adfs\ls\IdpInitiatedSignOn by adding SetRpListState(null, null);

You can’t disable the page on Windows Server 2012 R2. You can hide the list with JavaScript in onload.js:
var checkidp_OtherRpPanel = document.getElementById('idp_OtherRpPanel');
if (checkidp_OtherRpPanel) { checkidp_OtherRpPanel.style.display = 'none'; }

You’ll find the guidance on how to modify the default JavaScript of the page there:

Customizing the ADFS 3.0 Sign-in page:


Here are resources about Azure and Office365,

let me summarize:

Office365: is an offer of MS services and hosted applications – SaaS; in short, you pay for a service (SharePoint, Exchange, Office…) and you don’t manage the infrastructure behind it (CPU, RAM, storage, security).

Azure: is a cloud (private/public) offer – PaaS/IaaS; compared to Office365, MS provides just the plumbing (Hyper-V, storage, CPU, RAM, network) and you manage the applications, the operating system, the security and patches; in short “it is like Lego or Meccano!”, and with Azure you can mix your on-premises IT infrastructure with Azure in the cloud (and vice versa).

Web resources for Azure  / Office 365:

Office 365 for business get started:


Productivity library (scenarios):

Technical decks:

Technical references:




Azure AD Blog:

Azure Powershell:

Azure RMS blog:

‘In the Cloud’:

Office blog:    and

Intune blog:

Azure training kit:

FAQ and enhancement suggestions:

portal and management:

main:     pricing calculator:      white papers:     FR blog:


To go deeper:     Forum:     channel9:     Dashboard/SLAB:


Prerequisites before using Azure:

Prepare your environment:

Need certificates:

How to use CSUpload?

How do you get CSUPLOAD?

CSUPLOAD is part of the Windows Azure SDK. After installing all components, you will find csupload under the following path:
“C:\Program Files\Microsoft SDKs\Windows Azure\.NET SDK\v2.0\bin\csupload.exe”
How does CSUPLOAD work?

CSUPLOAD is a command-line console program that uploads VHDs to a BLOB storage account and authenticates to the Azure cloud with client certificates.
Overall, with Visual Studio it is very simple and fast to create the appropriate certificates and distribute them to the appropriate locations through the
“Publish to Azure” function, though that requires developer know-how or experience with Visual Studio.

CSUpload syntax reference:

Managing disks and images:

How to:

the article above refers to:


CSUPLOAD how to?

# Create exportable certificate for Azure (use -pe to be exportable)
makecert -r -pe -n "CN=My Azure IaaS Cert2048" -a sha1 -ss My -len 2048 -sy 24 -b 07/08/2013 -e 07/08/2014

then open MMC, load the Certificates snap-in for the current user, go to Personal,
select the certificate, and export it
to D:\Contoso

upload the certificate, from the Azure portal, settings, certificates management

get the thumbprint: 4D15540AFD7182964651826BE133FB3C868BA4D1

Now with csupload:

"C:\Program Files\Microsoft SDKs\Windows Azure\.NET SDK\v2.0\bin\csupload" Set-Connection "SubscriptionId=eaea9c22-cc5a-4da2-8dd2-d89837f042b7;CertificateThumbprint=4D15540AFD7182964651826BE133FB3C868BA4D1;ServiceManagementEndpoint="

# just for fun

D:\Contoso>"C:\Program Files\Microsoft SDKs\Windows Azure\.NET SDK\v2.0\bin\csupload" get-Connection
Windows(R) Azure(TM) Upload Tool version
for Microsoft(R) .NET Framework 3.5
Copyright (c) Microsoft Corporation. All rights reserved.

Warning: CSUpload.exe will be deprecated in a future release. Use the Windows Azure PowerShell cmdlets instead:
ConnectionString          : SubscriptionId=eaea9c22-cc5a-4da2-8dd2-d89837f042b7;CertificateThumbprint=4D15540AFD7182964651826BE133FB3C868BA4D1;ServiceManagementEndpoint=
SubscriptionId            : eaea9c22-cc5a-4da2-8dd2-d89837f042b7
CertificateSubjectName    : CN=Amadeus Azure IaaS Cert2048
CertificateThumbprint     : 4D15540AFD7182964651826BE133FB3C868BA4D1
ServiceManagementEndpoint :

D:\Contoso>"C:\Program Files\Microsoft SDKs\Windows Azure\.NET SDK\v2.0\bin\csupload" get-location
Windows(R) Azure(TM) Upload Tool version
for Microsoft(R) .NET Framework 3.5
Copyright (c) Microsoft Corporation. All rights reserved.

Warning: CSUpload.exe will be deprecated in a future release. Use the Windows Azure PowerShell cmdlets instead:
Using the saved connection string…
Location : West US

Location : East US

Location : East Asia

Location : Southeast Asia

Location : North Europe

Location : West Europe

A total of 6 record(s) were found.

D:\Contoso>"C:\Program Files\Microsoft SDKs\Windows Azure\.NET SDK\v2.0\bin\csupload" get-hostedservice
Windows(R) Azure(TM) Upload Tool version
for Microsoft(R) .NET Framework 3.5
Copyright (c) Microsoft Corporation. All rights reserved.

Warning: CSUpload.exe will be deprecated in a future release. Use the Windows Azure PowerShell cmdlets instead:
Using the saved connection string…
Name          : amazure
Label         : amazure
Location      : North Europe

A total of 1 record(s) were found.

D:\Contoso>"C:\Program Files\Microsoft SDKs\Windows Azure\.NET SDK\v2.0\bin\csupload" get-disk
Windows(R) Azure(TM) Upload Tool version
for Microsoft(R) .NET Framework 3.5
Copyright (c) Microsoft Corporation. All rights reserved.

Warning: CSUpload.exe will be deprecated in a future release. Use the Windows Azure PowerShell cmdlets instead:
Using the saved connection string…
Name                : Contoso-Contoso-0-201308011545510947
Location            : North Europe
OS                  : Windows
LogicalDiskSizeInGB : 128
MediaLink           :
SourceImageName     :

A total of 1 record(s) were found.

Upload a disk (vhd) to Azure:

You can use the Add-Disk command of the CSUpload Command-Line Tool to upload a .vhd file and register it in Windows Azure as either an operating system disk or a data disk.
An image is a VHD that has been generalized and is used to create an operating system disk. An operating system disk is a VHD that contains specific settings for a virtual machine.

Specifies a VHD file to be uploaded as a disk. A VHD file that has been uploaded as a disk can be used to create a virtual machine if the file contains an operating system or it can be used to create a data disk that can be attached to a virtual machine.
  • -Connection <string> – (Optional if the Set-Connection command has been run) Specifies the connection string that is used to connect to Windows Azure. The connection string contains the identifier of your Windows Azure subscription and the thumbprint of the management certificate that you created to enable API access to the subscription. The connection string is provided in the following format: "SubscriptionID=subscription-id;CertificateThumbprint=cert-thumbprint;ServiceManagementEndpoint=". You can find the subscription identifier and certificate thumbprint in the Management Portal.
  • -Destination <string> – Specifies the blob storage account where the VHD file is stored. The destination includes the endpoint of the account, the container in the account where the file is stored, and the name of the VHD file.
  • -Label <string> – Specifies the identifier that is used for the disk in the Management Portal.
  • -LiteralPath <string> – Specifies the location and name of the VHD file to upload as a disk.
  • -Name <string> – (Optional) Specifies the name to be used for the VHD file that is being uploaded.
  • -OS <string> – (Optional) If the VHD file that is being uploaded contains an operating system to be used with a virtual machine, you must include this parameter with the value Windows or Linux, depending on the type of operating system that is installed.
  • -Overwrite – (Optional) Indicates that you intend to overwrite an existing VHD file with a new file.

"C:\Program Files\Microsoft SDKs\Windows Azure\.NET SDK\v2.0\bin\csupload" add-disk -destination -label SP2010 -literalpath d:\contoso\contoso1.vhd -name contoso1.vhd -os Windows

"C:\Program Files\Microsoft SDKs\Windows Azure\.NET SDK\v2.0\bin\csupload" add-disk -destination -label EX2010 -literalpath d:\contoso\contoso2.vhd -name contoso2.vhd -os Windows

How to move Office 365 data to another Office 365 tenant?

This includes:

  • Exchange Online mailboxes
  • SharePoint Online data
  • OneDrive data

Read those articles:

Microsoft does not currently provide a way to transfer mailboxes, SharePoint and OneDrive data in an automated manner between Office365 tenants.

It is not possible to connect the Office365 servers to each other and send data from one to another. However, with the help of third-party tools such as MigrationWiz from BitTitan, Quest Migration Manager from Dell, Sharegate (created especially for SharePoint) and others, you can prepare and go through the migration process.

Third-party tools can only copy the data, not synchronize it!

You need to prepare customers for the fact that when using third-party tools to migrate Office365 between tenants, the contents of mailboxes and SharePoint/OneDrive data are copied from one location to another. For example, if a mail message has been copied to the destination mailbox and is later removed from the source mailbox, it will still exist in the destination.

• OneDrive for Business (or OD4B)

In Office365’s OneDrive, each user gets their own separate data space on the SharePoint Online server. At the time of writing this article it is 1 TB. The administrators’ problem is that by default this is designed as private user storage, so until permissions are changed they cannot access it and thus cannot migrate the data. To copy it across Office365 tenants, administrator permissions need to be added to the OneDrive sites. You can achieve this via a Powershell script for all OneDrive sites:

$Creds = Get-Credential
Connect-SPOService -Url -credential $Creds
$Users = Get-SPOUser -Site -limit all | Where-Object {$_.LoginName -like '*DOMAINNAME.DOMAIN*'}
# strip the @DOMAINNAME.DOMAIN suffix (-replace is safer than TrimEnd, which trims a character set)
$Users = $Users.LoginName | ForEach-Object { $_ -replace '@DOMAINNAME\.DOMAIN$', '' }
$Users | ForEach-Object {Set-SPOUser -Site "$_"_DOMAINNAME_DOMAIN/ -LoginName -IsSiteCollectionAdmin $true}

or only for selected OneDrive’s if you prepare list beforehand (here it is named O4bUsers.csv)

$Creds = Get-Credential
Connect-SPOService -Url -credential $Creds
$Users = import-csv ./O4bUsers.csv
$Users | ForEach-Object { Set-SPOUser -Site $_.url -LoginName -IsSiteCollectionAdmin $true }

The second issue related to OneDrive migration is the fact that when you move its data to a new tenant, you need to pre-provision the O4B sites first – they are not automatically created when you assign a license to an Office365 user.

Here Powershell comes to the rescue again; however, it requires a complex script, and you need to prepare a list of the accounts to be created beforehand.

You can get relevant information here:

During preparation of migration batches, be careful when entering account parameters.

OneDrive and SharePoint links change according to the domain connected to the user UPN: when the UPN (login name) changes, the OneDrive URL changes too. For example:

can change to:

Because of this, you must set the correct source and destination addresses in the migration tools. Be careful not to migrate data in the wrong direction!
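The renaming can be sketched as follows. This is an illustration with a hypothetical tenant (“contoso”) and hypothetical UPNs; the underscore convention (non-alphanumeric characters of the UPN replaced by "_" under /personal/) is the usual SharePoint Online pattern.

```python
# Sketch of how a OneDrive for Business URL follows the UPN
# (hypothetical tenant "contoso"; the underscore convention is SharePoint Online's).
import re

def onedrive_url(upn, tenant="contoso"):
    # Every non-alphanumeric character of the UPN becomes an underscore.
    personal = re.sub(r"[^0-9a-zA-Z]", "_", upn)
    return f"https://{tenant}-my.sharepoint.com/personal/{personal}"

print(onedrive_url("john.doe@olddomain.com"))
print(onedrive_url("john.doe@newdomain.com"))  # the URL changes when the UPN's domain changes
```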

• SharePoint world

Although it is integrated with other Office365 services, SharePoint is a separate environment governed by its own laws. It can have its own set of users, permissions, and services you need to keep in mind during the migration.

Microsoft has no out of the box solution to transfer data, configuration and structure between Office365 tenants, however, there are third-party tools which can help you with the migration.

The complexity of all the SharePoint features means there is no application that can mirror every environment in the new place. Every tool on the market has limitations. You need to check the documentation for the list of functions and properties that can be included in the migration, and for the exceptions that simply cannot be migrated.

Even if you choose the best tool, it is useful to have a SharePoint specialist on board during the planning phase, during the migration and just after it, to help solve emerging problems.

• Switching domain-downtime in the mail delivery

To move organization data from one Office365 tenancy to another, one of the steps you need to perform is a domain migration. You cannot assign the same domain to two Office 365 tenants: you need to remove it from the existing one first, then add and verify it in the second. This is only possible once you remove any domain aliases assigned to Office365 objects (mailboxes, groups, contacts). This step is critical, because removing the domain stops mail flow directed to it.

If you do not use additional servers that can take over the mail traffic for the duration of the domain switch (switching consists of removing, adding and verifying the domain, changing MX records, and waiting for DNS replication), for example a hybrid on-premises Exchange Server or a Linux server, there will be downtime in the mail service for the selected domain.

You have to take this into account during the planning stage and when preparing migration steps for the customer.

It is even more important when there are many (sometimes several hundred) SMTP (mail) domains to migrate between the Office365 tenants, while you can only obtain verification codes for up to 50 domains at a time.

You may also encounter unplanned obstacles, for example in the form of damaged objects: in my case I could not remove aliases and needed to remove the object completely.

Some migration solutions:

for sharepoint:

read the article:

for onedrive:

All in one suite:



Office 365 objects:

How do you recognize objects created directly in Office 365? You can check the date of their synchronization with AD. Cloud users, groups and contacts which have never been synchronized with the directory service will have the LastDirSyncTime parameter set to null ($null).

Powershell commands to check cloud identities:

Get-MsolUser -All | Where-Object {$_.LastDirSyncTime -eq $null}

Get-MsolGroup -All | Where-Object {$_.LastDirSyncTime -eq $null}

Get-MsolContact -All | Where-Object {$_.LastDirSyncTime -eq $null}


On Office 365 landing page:

How to Disable OneDrive for Business in Office 365

On Office 2016 package – by GPO:


Two ways to integrate/federate applications with Azure AD:

Azure marketplace:

check if the application exists:

The Microsoft Azure Marketplace is an online store that offers applications and services either built on or designed to integrate with Microsoft’s Azure public cloud.

Otherwise, here is how to configure single sign-on to applications that are not in the Azure Active Directory application gallery:

Authentication scenarios for Azure AD:

Applications integrated with Azure: the access via

Integrating applications with Azure AD:


Identity hybrid ports and protocols:


Azure app gallery:

If not pre-packaged in Azure marketplace: Any app that supports SAML 2.0 can be integrated directly with an Azure AD tenant using these instructions to add a custom application.

Provide credentials for a test tenant or account with your application that can be used by the Azure AD team to test the integration.

Provide the SAML Sign-On URL, Issuer URL (entity ID), and Reply URL (assertion consumer service) values for your application, as described here. If you typically provide these values as part of a SAML metadata file, then please send that as well.

Provide a brief description of how to configure Azure AD as an identity provider in your application using SAML 2.0. If your application supports configuring Azure AD as an identity provider through a self-service administrative portal, then please ensure the credentials provided above include the ability to set this up.

Configuring single sign-on to applications that are not in the Azure Active Directory application gallery:


Sign On URL (SP-initiated only) – Where the user goes to sign in to this application. If the application is configured to perform service provider-initiated single sign-on, then when a user navigates to this URL, the service provider will do the necessary redirection to Azure AD to authenticate and sign the user in. If this field is populated, then Azure AD will use this URL to launch the application from Office 365 and the Azure AD Access Panel. If this field is omitted, then Azure AD will instead perform identity provider-initiated sign-on when the app is launched from Office 365, the Azure AD Access Panel, or from the Azure AD single sign-on URL (copiable from the Dashboard tab).

Issuer URL – The issuer URL should uniquely identify the application for which single sign-on is being configured. This is the value that Azure AD sends back to the application as the Audience parameter of the SAML token, and the application is expected to validate it. This value also appears as the Entity ID in any SAML metadata provided by the application. Check the application’s SAML documentation for details on what its Entity ID or Audience value is. Below is an example of how the Audience URL appears in the SAML token returned to the application.
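The example referred to above is missing from this copy; a representative fragment, with a hypothetical application URL, would look like this inside the token’s <saml:Conditions> element:

```xml
<saml:Conditions NotBefore="2013-07-29T23:39:40.051Z" NotOnOrAfter="2013-07-29T23:49:40.051Z">
  <saml:AudienceRestriction>
    <saml:Audience>https://myapp.example.com</saml:Audience>
  </saml:AudienceRestriction>
</saml:Conditions>
```

The value inside <saml:Audience> is what the application should compare against its configured Entity ID.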

Reply URL – The reply URL is where the application expects to receive the SAML token. This is also referred to as the Assertion Consumer Service (ACS) URL.