Prevent accidental deletes with Azure AD and AAD Connect

Hi everyone,

So you are using Azure AD Connect to sync your on-premises directory to Azure Active Directory (a.k.a. AAD), and everything works just fine.

Now a security audit takes place in your corporation, and it identifies many inactive or disabled users in your on-premises AD that are no longer in use.

So you go to your on-premises AD and delete all those users. If you delete more than 500 users at once from your on-premises AD, Azure AD Connect will refuse to sync those deletions to Azure AD.

You may receive an email that looks like this:


At Sunday, 18 October 2015 12:11:40 GMT the Identity synchronization service detected that the number of deletions exceeded the configured deletion threshold for Contoso Corporation []. A total of 800 objects were sent for deletion in this Identity synchronization run. This met or exceeded the configured deletion threshold value of 500 objects.
We need you to provide confirmation that these deletions should be processed before we will proceed.

Please see Preventing Accidental Deletions for more information about the error listed in this email message.
Thank you,

The Azure Active Directory Team

Why?

When you install Azure AD Connect, a feature called prevent accidental deletes is enabled by default with a threshold of 500 objects. This simply means that the sync engine will not export more than 500 deletes at once from the metaverse to Azure AD.

The objective of this feature is to protect you from accidental configuration changes, or changes to your on-premises directory, that may affect a large number of objects.

The default value of 500 objects is configurable: run Enable-ADSyncExportDeletionThreshold with a value that fits your organization's size and requirements.

What to do?

If all the deletes are legitimate, then do the following:

  1. To temporarily disable this feature and authorize those deletions, run the PowerShell cmdlet: Disable-ADSyncExportDeletionThreshold
  2. With the Azure Active Directory connector still selected in the AAD Connect sync tool, select the action Run, then select Export.
  3. To re-enable the protection, run the PowerShell cmdlet: Enable-ADSyncExportDeletionThreshold
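The whole sequence can also be run from PowerShell on the AAD Connect server. This is a sketch: the cmdlets come with the ADSync module that ships with AAD Connect, and the threshold of 1000 at the end is only an example value; pick one that fits your organization.

```powershell
# Run on the AAD Connect server (the ADSync module ships with the tool)
Import-Module ADSync

# 1. Temporarily lift the protection so the pending deletes can export
Disable-ADSyncExportDeletionThreshold

# 2. Trigger a sync cycle so the export runs (you can also do this from
#    the Synchronization Service Manager UI: connector -> Run -> Export)
Start-ADSyncSyncCycle -PolicyType Delta

# 3. Re-enable the protection; 1000 is only an example threshold
Enable-ADSyncExportDeletionThreshold -DeletionThreshold 1000
```

Note that Disable-ADSyncExportDeletionThreshold will prompt for Azure AD credentials, so this is an interactive operation rather than something to schedule.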

Azure Sync Case – Conflict between UPN and SMTP address

I have worked heavily with the AD sync tool in all its versions, including the AAD Connect tool, and I came across an issue that may cause you trouble sooner or later. In this blog post I will share my experience with you.


My environment consists of one AD domain with Exchange 2010 on-premises and a single sign-on experience. The AD domain and the SMTP domain are the same.

I have a manager called John Smith:

  • UPN :
  • SMTP address :

This manager came to me asking to add a secondary SMTP address for him. Since that SMTP address was available, I agreed.


Life goes on. John is a big manager, and he used his nice clean new SMTP address in all his communications. He printed the new email address on his business cards.

So far, all our mailboxes are on-premises and we do not have any Office 365 implementation.

The Problem

Ten years later, the corporation decided to start using Azure services, beginning with AD sync the next month.

Meanwhile, a new employee named John William is hired. The IT department assigned him the following:

  • UPN :
  • SamAccountName : Contoso\John
  • SMTP Address :

Everything is fine so far, and there is not a single conflict.


Now, when we started to sync users to Azure AD, the John Smith user stopped synchronizing to Azure AD, and the AD sync tool gave an error for that user:

“Unable to update this object because the following attributes associated with this object have values that may already be associated with another object in your local directory services:[ProxyAddresses smtp:].Correct or remove the duplicate values in your local directory.”

After opening a case with Microsoft, we reviewed our AD sync configuration, and after opening the AAD Connect sync tool we confirmed that we are using ObjectGUID as the anchor attribute, not the email address.

Microsoft confirmed that when an on-premises mail-enabled user is synced to Azure AD, he is assigned a secondary SMTP address in the form of his UPN. So in this case, when John William is synced to Azure AD, Azure tries to stamp him with a secondary SMTP address in the form of his UPN, which conflicts with the John Smith user.

So although there is no conflict on-premises, Azure introduces this type of conflict and throws a sync error. Microsoft answered us: "THIS IS BY DESIGN, DEAL WITH IT."

How big is this problem?

Now, we contacted John Smith and asked him to give up his lovely SMTP address. What do you think his reaction was? He had been using that email address for years and could not give it up. Furthermore, if he agreed to give it up, and Azure then assigned it to the John William user, people sending emails to that address would reach John William instead, which is by itself a security and privacy issue.

So we went to John William and asked him to change his UPN and SamAccountName. This solved the problem for a while, until a new employee came along named John Robert, and IT found that Contoso\John was not in use in the enterprise, so they assigned it to John Robert.

Now the sync tool suddenly throws a sync error, and the same loop happens all over again.

Imagine you have many users with clean SMTP addresses like this, and you have to go through them one by one 🙂

Compare your on Premise AD users and Azure users – One by One!

If you are synchronizing your on premise Active Directory with Azure Active Directory, then you know for sure that maintaining a healthy synchronization is not an easy thing to do.

I want to share my experience synchronizing a single-domain AD to Azure AD, using ObjectGuid as the anchor attribute.

The Challenge

I am using Azure AD Connect to synchronize AD objects to Azure AD. The Azure AD Connect tool is nice: it shows you the health of the synchronization process and gives nice error messages if there is a problem synchronizing one of your on-premises AD objects. I used the tool for a month, and everything was fine. No errors, and everything looked clean.

I then noticed that a couple of users were having problems with Office 365, and I discovered that they had never synced to Azure AD in the first place. I ran Get-MsolUser against them and no results were returned. I had to go to Azure AD Connect and force a Full Import to sort out the issue.

I became so worried about this that I started comparing the number of AD users and Azure AD users, to at least ensure matching counts. But counting AD users and Azure AD users and comparing the totals is not enough, as there is no guarantee that the same objects are behind those numbers.

I then came up with an idea: take each AD user, collect his ObjectGuid, compute his ImmutableID, go to Azure AD, search for that ImmutableID, and finally link the AD user with his Azure AD copy. Do that for all AD users, and you can identify AD users that have no Azure copy, and Azure AD users that are not mapped to any AD user.

This guarantees that each of your AD users is mapped to an Azure AD user. Along the way, the script generates a couple of pieces of information:

  • AD users not in Azure
  • Azure users not in AD
  • Azure users with sync errors
  • AD users filtered out of synchronization
  • Total number of AD users
  • Total number of Azure users
  • Last AD sync time
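The mapping idea above can be sketched as a small function. This is a simplified sketch: the property names match what Get-ADUser (ObjectGUID) and Get-MsolUser (ImmutableId) return, and the ImmutableID is simply the base64 encoding of the objectGUID's bytes.

```powershell
# Compare AD users and Azure AD users by ImmutableID (base64 of objectGUID)
function Compare-ADAzureUsers {
    param($ADUsers, $AzureUsers)

    # Lookup: ImmutableID -> AD user
    $adById = @{}
    foreach ($u in $ADUsers) {
        $id = [System.Convert]::ToBase64String(([Guid]$u.ObjectGUID).ToByteArray())
        $adById[$id] = $u
    }

    # Lookup of all ImmutableIDs present in Azure AD
    $azureIds = @{}
    foreach ($a in $AzureUsers) {
        if ($a.ImmutableId) { $azureIds[$a.ImmutableId] = $true }
    }

    [pscustomobject]@{
        # AD users with no synced Azure copy
        ADNotInAzure = @($ADUsers | Where-Object {
            -not $azureIds.ContainsKey(
                [System.Convert]::ToBase64String(([Guid]$_.ObjectGUID).ToByteArray())) })
        # Azure users whose ImmutableID maps to no AD user
        AzureNotInAD = @($AzureUsers | Where-Object {
            $_.ImmutableId -and -not $adById.ContainsKey($_.ImmutableId) })
    }
}

# In production you would feed it live data, for example:
# $result = Compare-ADAzureUsers -ADUsers (Get-ADUser -Filter * -Properties ObjectGUID) `
#                                -AzureUsers (Get-MsolUser -All)
```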


Conditions to use the script

  1. This script assumes you are using ObjectGuid as the anchor attribute.
  2. If you have multiple domains or forests, or you filter AD users by OU or attribute, then you must write your own script block to populate the $ADusers_Raw variable; search inside the script for the (Raw data collection from AD) region, where you can find examples. By default, the script runs Get-ADUser to retrieve all users.


Script Visuals

The script provides many visuals to help you see what is going on:

  • The script code is divided into regions that you can expand separately for better script browsing.
  • Progress bars appear while the script runs to give you a sense of what is going on.
  • Summary information is displayed on the PowerShell console window with summary statistics.
  • Results are written at the end of the script in two locations:
    • The PowerShell console window.
    • Couple of files that are generated on the script running directory.


The script connects to Active Directory and gets the list of users (Get-ADUser), then connects to Azure Active Directory (AAD) and gets all Azure users (Get-MsolUser). A comparison is then performed by mapping each AD user to his Azure user, identifying unsynchronized AD users that have no mapped/synced Azure copy.

This script assumes that ObjectGUID is used as the anchor attribute to link/map AD user and Azure AAD user. If you are using any other attribute, then this script is not for your case.

The script needs the list of AD users that you are synchronizing to Azure AD. The Microsoft sync tool lets you filter by OU or by attribute, and another tricky case is when you are synchronizing multiple domains, or perhaps multiple forests. For all those cases, it is your job to populate the variable ($ADusers_Raw) located under the (Raw data collection from AD) region in this script.
By default, the script runs Get-ADUser to populate this variable. In your case, you may need to write your own script block to collect all the AD user objects that you are syncing to Azure.
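For example, if you only synchronize a single OU, the script block could look like this. This is a sketch: the OU path and property list are hypothetical, so adjust them to match your own sync scope.

```powershell
# region Raw data collection from AD
# Hypothetical example: only users under the "Corp Users" OU are synced
$ADusers_Raw = Get-ADUser -Filter * `
    -SearchBase "OU=Corp Users,DC=contoso,DC=com" `
    -Properties ObjectGUID, UserPrincipalName

# Multi-domain sketch: collect users from every domain in the forest
# $ADusers_Raw = foreach ($d in (Get-ADForest).Domains) {
#     Get-ADUser -Filter * -Server $d -Properties ObjectGUID, UserPrincipalName
# }
# endregion
```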

Script Parameters

.PARAMETER ScriptFilesPath
Path to store the script output files. For example C:\, or '.\' for the current directory.

.PARAMETER Proxy
As the script needs to connect to Azure, internet connectivity is required. Use this option if you have an internet proxy that prompts for credentials.

.PARAMETER CreateGlobalObject
When this switch is used, the script generates an extra CSV file that contains a unified view of each user, with properties from both on-premises AD and Azure AD.


.\Compare-CorpAzureIDs.ps1 -ScriptFilesPath .\

.\Compare-CorpAzureIDs.ps1 -ScriptFilesPath .\ -Proxy:$true

.\Compare-CorpAzureIDs.ps1 -ScriptFilesPath .\ -CreateGlobalObject

Download the script

You can download the script from here.

Azure GUID to ImmutableID and vice versa Desktop App

When you work with Azure AD and deploy one of the Azure AD synchronization tools like AD Connect, objectGUID is used as your ImmutableID by default. When objects are synced to Azure AD, the cloud objects carry a base64-encoded version of the objectGUID, called the ImmutableID.

Let us say a user called John exists in your AD, and his objectGUID is something like this:


The user's objectGUID is converted to base64 and stored in the AD sync tool metaverse as (sourceAnchor), and in Azure AD as the ImmutableID:


So sometimes you want a tool that converts from objectGUID to ImmutableID, and the other way around.
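The conversion itself is just a base64 round-trip over the GUID's 16 bytes, so you can also do it in a few lines of PowerShell (the GUID below is a made-up example):

```powershell
# objectGUID -> ImmutableID (base64 of the GUID's byte array)
$guid = [Guid]"c9d9b656-1c3e-4b0e-8e4e-2a9f3e1d5b7a"   # made-up example GUID
$immutableId = [System.Convert]::ToBase64String($guid.ToByteArray())

# ImmutableID -> objectGUID (the reverse direction)
$bytes = [System.Convert]::FromBase64String($immutableId)
$guidBack = New-Object -TypeName Guid -ArgumentList @(,$bytes)
```

The ImmutableID of a GUID is always 24 base64 characters, since 16 bytes encode to 24 characters including padding.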

So I created a simple desktop application that you can click and use to easily convert between an Azure ImmutableID and an AD objectGUID.

Azure Identity Converter Desktop App

The application is very small (about 500 KB), as you can see below:


Just double click it and the app will open:


Now you can simply enter an AD GUID and it will compute the ImmutableID (Azure ID for that GUID)


Or you can enter an Azure ImmutableID and it will compute the object GUID in your AD:


Download the APP

You can download the APP from here. For those who are afraid to run untrusted applications, rest assured the tool does not install anything; it only prompts for values, and no admin rights are needed :)

Get the owner/creator of Active Directory Objects


I had a quick need to get the owner of a couple of accounts in Active Directory, in order to find out who created them.

I used the Quest PowerShell extension to find this information. For example, to find out who created (owns) an account called contoso\ServiceAccount1, I type:

Get-QADObjectSecurity contoso\ServiceAccount1 -owner
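If you do not have the Quest snap-in, the same information can be read with the built-in ActiveDirectory module by pulling the object's security descriptor. This is a sketch using the same example account name:

```powershell
Import-Module ActiveDirectory

# The owner in the security descriptor is usually the user (or group,
# e.g. Domain Admins) that created the object
$obj = Get-ADUser -Identity ServiceAccount1 -Properties nTSecurityDescriptor
$obj.nTSecurityDescriptor.Owner
```

Note that if the creator was a member of Domain Admins, the owner is typically stamped as the Domain Admins group rather than the individual account.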

Active Directory AutoRecovery from Dirty Shutdown

The Problem

I found a couple of events on a Windows 2012 domain controller indicating a problem with DFS replication. The event looks like this:


It turned out that Microsoft had published a hotfix that disables automatic recovery after a dirty shutdown. A domain admin will not always notice when one of his domain controllers suffers a power failure, which leaves DFS broken and SYSVOL replication stopped.

The bad news is that Microsoft made this the default behavior in Windows 2012!

How to solve this

Find out which drive holds your DFS folders. If it is the C drive, type the following to get the volume ID:

Mountvol C: /L

Then :

wmic /namespace:\\root\microsoftdfs path dfsrVolumeConfig where volumeGuid="{Volume-ID}" call ResumeReplication

(do not include the "{" and "}" when replacing the Volume-ID)

Finally, run :

wmic /namespace:\\root\microsoftdfs path dfsrmachineconfig set StopReplicationOnAutoRecovery=FALSE
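The steps above can be combined into a small PowerShell sketch that looks up the volume GUID for a given drive and resumes replication. This assumes the DFS folders live on C: and that you run it from an elevated prompt:

```powershell
# Find the volume GUID for the drive that holds the DFS folders (C: here);
# mountvol prints something like \\?\Volume{xxxxxxxx-xxxx-...-xxxx}\
$volPath = (mountvol C: /L | Out-String).Trim()
$volGuid = $volPath -replace '.*Volume\{(.+)\}.*', '$1'

# Resume DFSR replication on that volume (note: no braces around the GUID)
wmic /namespace:\\root\microsoftdfs path dfsrVolumeConfig `
    where "volumeGuid='$volGuid'" call ResumeReplication

# Restore auto-recovery after dirty shutdown as the default behavior
wmic /namespace:\\root\microsoftdfs path dfsrmachineconfig `
    set StopReplicationOnAutoRecovery=FALSE
```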


Shaking BitLocker – Backup keys to AD and play around

I have come across many scenarios where people keep their BitLocker recovery information in AD, and then various funny situations happened along the way that I want to talk about in this blog post.


Case 1 : What happens if you rejoin a BitLocker-protected computer to the domain

Case 2 : Renaming a computer that has BitLocker

Case 3 : A computer was used by user1; user1 resigned, so you reset his computer account in AD, reformatted the machine, joined it to the domain, and re-enabled BitLocker on it

Case 4 : Deleting a computer that has BitLocker from AD

Case 5 : Enabling BitLocker before joining the machine to the domain

Case 6 : Divergence happened: you have a domain-joined machine with BitLocker enabled, but in AD you have no recovery information for that computer

Moving to a New Blog Platform

This post has moved to my new blog platform. To continue reading this blog post, please click here.

AD Backup – PowerShell Script

Mission:

I do not want to use third-party software to back up my Active Directory (the domain controllers' system state), because such third-party tools usually require high privileges and rights on the domain controllers.

Instead, I will set up a protected file share server and use a script on the domain controller to back up its system state to that protected share. Then I will use my favorite third-party solution to back up the file share server itself.


Prepare the File Server

Since the file share will host your AD backup, it is important to protect and restrict access to that file server. I would recommend installing a VM with a C drive for the OS and a D drive to host the system state backups.

Also, make sure that the local Administrators group on that file server is restricted to domain admins only. Do not install any other server roles on it, and do not host any other shares on it.

Now, on the D drive, create a hidden share with Full Control given to the (Domain Controllers) group in both the share and NTFS permissions. (Domain Controllers) is a built-in security group that exists in your AD by default.

Prepare the Domain Controller

Nothing to prepare here, really. You just need to schedule the script below on one of your domain controllers. That's it.

Script Breakdown

The script should be scheduled to run on any domain controller, under the built-in (System) security context. This gives it the right to back up your AD without any additional rights 🙂

The script starts by importing the Server Manager module: Import-Module ServerManager.

Then we get the current date: [string]$date = Get-Date -f 'yyyy-MM-dd'.

Following that, we define the target folder on the remote file share: $TargetUNC = "\\FileServer\ADBackup$\AD-$date".

This assumes that the remote file server is named (FileServer) and that the hidden share we created is called ADBackup$.

Notice that we assume backups are taken into a folder structure where the folder name contains the date on which the backup was taken.

So we first check whether a folder with today's date already exists, and if it does, we delete it. This means we will not keep two backups taken on the same date. This is just my own approach; you can do yours.

Because the script will create folders on the remote share, the (Domain Controllers) group needs write access on that share:

If ( Test-Path $TargetUNC) { Remove-Item -Path $TargetUNC -Recurse -Force }

New-Item -ItemType Directory -Force -Path $TargetUNC

Finally, we start the backup using the WBADMIN command. To back up to a remote file share, this command requires a username and password, so create a user (e.g. ServiceADBackup) and give it share and NTFS permissions to write to the remote file share.

$WBadmin_cmd = "wbadmin.exe START BACKUP -backupTarget:$TargetUNC -systemState -noverify -vssCopy -quiet -user:MyUser -password:MyPassword"

Invoke-Expression   $WBadmin_cmd
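Putting the pieces above together, the whole script is roughly this. It is a sketch: FileServer, ADBackup$, MyUser, and MyPassword are placeholders for your own file server, hidden share, and backup account.

```powershell
Import-Module ServerManager

# Today's date becomes part of the backup folder name
[string]$date = Get-Date -f 'yyyy-MM-dd'

# Target folder on the protected hidden share (placeholder names)
$TargetUNC = "\\FileServer\ADBackup$\AD-$date"

# Keep only one backup per date: remove any existing folder for today
if (Test-Path $TargetUNC) { Remove-Item -Path $TargetUNC -Recurse -Force }
New-Item -ItemType Directory -Force -Path $TargetUNC | Out-Null

# System state backup to the remote share; wbadmin requires credentials
# for a remote target (MyUser/MyPassword are placeholders)
$WBadmin_cmd = "wbadmin.exe START BACKUP -backupTarget:$TargetUNC " +
               "-systemState -noverify -vssCopy -quiet " +
               "-user:MyUser -password:MyPassword"
Invoke-Expression $WBadmin_cmd
```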

Schedule Script

In order to schedule the script on your DC, open Task Scheduler, create a basic task with your preferred schedule, and when you reach the Action window, put (C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe) in the Program/script field and the full path of your script in the (Add arguments (optional)) field.


When you are done, open the task again, change the security context the task runs under to (System), and check (Run with highest privileges).


Download Script

You can download the script from here.

Final Note

When taking backups this way, the logs for each backup are stored on the DC under (C:\Windows\Logs\WindowsServerBackup). Over time, those logs consume significant space, so I would run another script to delete old log files from this directory.
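A minimal sketch of such a cleanup could look like this. The 30-day retention is an arbitrary choice; on the DC you would point it at C:\Windows\Logs\WindowsServerBackup.

```powershell
# Delete backup log files older than a retention window (30 days by default)
function Remove-OldBackupLogs {
    param(
        [string]$Path,
        [int]$RetentionDays = 30
    )
    $cutoff = (Get-Date).AddDays(-$RetentionDays)
    Get-ChildItem -Path $Path -File |
        Where-Object { $_.LastWriteTime -lt $cutoff } |
        Remove-Item -Force
}

# On the DC, for example:
# Remove-OldBackupLogs -Path 'C:\Windows\Logs\WindowsServerBackup'
```

You could schedule this the same way as the backup script itself, running under the (System) account.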