Secure Terminal Services (RDP) using Azure Multi-Factor Authentication (MFA) – Part 2 – Azure Dummies

Hello Everyone,

In the first article of this series, we discussed the general concept of Azure Multi-Factor Authentication and how it works.

In the second part of this series, we went deeper into the technical aspects of implementing Azure MFA, taking the example of securing your remote desktop connections with Azure Multi-Factor Authentication, and we prepared the Azure tenant and the Azure MFA provider.

In this part, we will continue our demo of integrating remote desktop connections (RDP) with Azure MFA by installing the Azure MFA server on the same server we need to secure.

In this demo we have a domain-joined server called Secure-Server running Windows Server 2008 R2; we will secure remote desktop connections to it by installing the MFA server on it.

In this demo, I assume the MFA provider is already prepared; for more information, read our previous article.

So let’s start the MFA server installation:

As a reminder, the three snapshots below show how to download the MFA setup and generate the credentials, as explained in the previous article:

After the MFA download completes, double-click the setup file, choose the installation path, and click Next:

Wait a few seconds for the installation to complete:

Once the installation finishes, click Finish:

A new wizard will appear as below; click Next:

Now enter the email and password credentials we obtained earlier from the MFA provider. If you forget how to obtain them, please read our previous post; if the credentials have expired, you can regenerate them. Once you fill in the required information, click Next:

Now the MFA server will try to communicate with the Azure MFA provider as below:

Oops, we received an error message as shown below: “Unable to communicate with the Multi-Factor Authentication POP. The Multi-Factor Authentication server could not be activated …”. This error is normal if you use a proxy to access the internet; in that case you must verify three things:

1- Your proxy is set correctly on the server (in the IE browser).
2- Run CMD as administrator and run the following command:
netsh winhttp import proxy source=ie
3- The MFA server must be able to communicate outbound on port 443 to the following (a quick way to verify the proxy import and the outbound connectivity is sketched after the IP range tables below):

If outbound firewalls are restricted on port 443, the following IP address ranges will need to be opened:

[Table: IP Subnet / Netmask / IP Range]

If you are not using the Azure Multi-Factor Authentication Event Confirmation features, and users are not authenticating with the Multi-Factor Auth mobile apps from devices on the corporate network, the IP ranges can be reduced to the following:

[Table: IP Subnet / Netmask / IP Range]
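
Before retrying the activation, you can quickly verify both the WinHTTP proxy import and the outbound 443 connectivity. The sketch below is only illustrative; in particular, the target host name is a placeholder for whichever Azure MFA endpoint your activation needs to reach:

    # Show the WinHTTP proxy that was imported from the IE settings
    netsh winhttp show proxy

    # Test outbound TCP 443 from the MFA server (works on the PowerShell 2.0 that ships with 2008 R2).
    # Replace the host below with the real Azure MFA endpoint - this value is only an example.
    $targetHost = 'pfd.phonefactor.net'
    $client = New-Object System.Net.Sockets.TcpClient
    try {
        $client.Connect($targetHost, 443)
        "Outbound 443 to {0}: connected = {1}" -f $targetHost, $client.Connected
    }
    catch {
        "Outbound 443 to {0} failed: {1}" -f $targetHost, $_.Exception.Message
    }
    finally {
        $client.Close()
    }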

After we set the proxy rules, I tried again to activate the MFA as below:

Finally, it verified successfully. Now the wizard will ask you to create a new MFA group or choose an existing one. Since this is the first MFA server to be deployed, give the new group any meaningful name as below. Note that the group is used to manage multiple MFA servers and to enable replication between them if needed. Click Next:

Uncheck the “Enable replication between servers” option and click Next:

Now you can select which application to integrate with Azure MFA; the last option is remote desktop. You could select it and click Next, but in our demo we will configure remote desktop from the MFA console instead, so click Cancel.

Now go to the Start menu and click the Multi-Factor Authentication Server icon:

The Azure MFA server is loading as below:

After a while the console appears. This is the MFA server console from which you manage the MFA setup. The Status option shows that the server Secure-Server.demo.lab is online; it is both the server whose RDP connection we need to secure and the MFA server at the same time:

Also, if you go to the Azure MFA provider management page and click the Server Status option, you will see the server is online as below:

Now, back in the MFA server console, go to Windows Authentication, check the “Enable Windows Authentication” option as below, then click the Add button:

Choose the server name and Terminal Services as the application, and check the “Enable” option. If you want all AD users to be required to use MFA, check the “Require Multi-Factor Authentication user match” option; otherwise leave it unchecked as below. Click OK:

MFA is now configured to secure RDP on that server. A message notes that the server needs to be restarted for the change to take effect; click OK, but don’t restart yet so we can continue the configuration:

As shown below, the server appears in the console:

Now go to the Users icon to add the users you want MFA applied to, and click the Import from Active Directory button as below:


Choose the users you need and click Import as below:


The users are imported successfully as below; click OK:


The new users appear in the console with a warning icon beside each one. The warning is there because each user must be enabled for MFA manually; by default, imported users are not enabled for MFA automatically. Double-click any user:


Fill in the required information as below:
- Country code.
- Choose the MFA method: phone call, text message, mobile app, etc. For this demo we will choose the phone call option.
- Check the Enabled option.
Finally, click Apply:


Note that after we check the Enabled option, the warning icon disappears. Do the same for all the users you need:


After I prepared all the users, they appear in the console without the warning icon. To test the configuration, choose any user and click the Test button:


Provide the password and click Test:


Wait a while:


Now the user should receive a call. If they end the call, the authentication is refused, because it is assumed that someone else is trying to use their credentials. If the user presses (#), they confirm that they are the one trying to access the server:


After I pressed (#), the test completed as below:


Now, after restarting the machine for the change to take effect, I tried to access the server remotely as below:


I tried to log in with the administrator user:


Now the welcome screen starts:


During the login, while on the welcome screen, I received a call from Microsoft MFA. I answered the call and ended it immediately:


Because I ended the call and didn’t press the (#) key, the login failed as below:

I tried to log in again with the same user:

I received another call from Microsoft MFA, but this time I pressed the (#) key:


Because I pressed the (#) key, I confirmed to Microsoft that I am the person trying to log in, so I successfully logged in to the server as below:


So in this article we demonstrated how to install the MFA server and integrate it with remote desktop connections (RDP).

In the next article we will show you how to customize the MFA setup: using fraud alerts, changing the voice of the call, generating reports, etc.

Stay tuned

Source: Secure terminal Services (RDP) using Azure Multi-factor Authentication (MFA) – Part 2 – Azure Dummies

Secure Terminal Services (RDP) using Azure Multi-Factor Authentication (MFA) – Part 1 – Azure Dummies

Hello Everyone,

In the first article of this series, we discussed the general concept of Azure Multi-Factor Authentication and how MFA helps secure your on-premises environment, and your hybrid one if it exists.

In this article we will go into more technical detail about how to use Azure Multi-Factor Authentication, using a real example.

One of my customers has a server that contains highly sensitive data, and only around six users have remote desktop access to it; the customer needs to add another security layer for accessing this server.

I suggested that the customer use Azure MFA, since it adds a strong security layer to remote desktop access to the server, in addition to the low cost of the service.

So let’s start the technical steps; remember that we need to integrate remote desktop protocol (RDP) access with Azure MFA.

In this part we will prepare the Azure MFA provider and download the MFA server setup files; in the next part we will deploy and configure the MFA server to secure RDP.

First of all, let’s summarize the requirements for implementing this scenario:

1- We need an Azure account (Azure tenant) to configure and install the Azure setup. If you don’t have an account, you can sign up for a one-month trial; for more info, follow this link:

2- Integrating the RDP protocol with Azure MFA is not supported on Windows Server 2012 R2 (as of the date of this article), which means that if you need to integrate RDP with Azure MFA, you need to install Windows Server 2012 or earlier, such as Windows Server 2008 R2.

3- To secure the remote desktop protocol (RDP) with Azure Multi-Factor Authentication, you must install the Azure MFA server on the same RDP server. In other words, if you have a server called “SRV1”, then you should install the MFA setup on “SRV1”. Looking back at point #2, you can conclude that you cannot secure RDP on Windows Server 2012 R2 (as of the date of this article).

This deployment is called a standalone MFA server, since everything is deployed on-premises and there is no integration between the local AD and Azure AD.

Now, log in to your Azure tenant and go to the Active Directory tab in the left pane:


Now choose the MULTI-FACTOR AUTH PROVIDERS option from the options along the top,


Click New:


A MULTI-FACTOR AUTH PROVIDER is used to obtain the MFA server setup files; the provider is also responsible for the usage calculations, and you can customize your setup from the provider, for example with fraud alerts.

Now choose App Services -> Active Directory -> MULTI-FACTOR AUTH PROVIDERS -> Quick Create.


Name: choose any meaningful name for your provider.
Usage Model: you have two options here, Per Enabled User and Per Authentication. This setting cannot be changed later; if you need to change it, you must create a new provider. The difference between the two models is how Microsoft charges you: with Per Enabled User you are charged per user enabled for MFA, regardless of how many authentications actually occur; with Per Authentication you are charged every time a user authenticates with Azure MFA.
Directory: choose “Don’t link a directory”, since we will install the standalone MFA server without integration with Azure AD.

After you fill in the required information, click Create:


In less than a minute, the new provider will be available in your tenant as shown below:


Click the provider you just created, then click the MANAGE button at the bottom of the portal page:


The MFA management page will appear; click the Downloads button as below:


The download server page lists the supported OS versions for the MFA server, including Windows Server 2012 R2. This does not contradict what I said before: the RDP feature is not supported on Windows Server 2012 R2, but many other features do work on it. Now click the Generate Activation Credentials button to generate the credentials that will be used to register your server with the MFA provider during setup.


An email and password credential pair will be generated; these credentials are only valid for 10 minutes. If you take longer than 10 minutes to start the setup, you can regenerate new credentials.


Now click the download link to start downloading the MFA setup:


After the download completes, copy the setup file to the server whose RDP you need to secure, and double-click the setup to start the installation.

In the next part we will continue our demo by installing the Multi-Factor Authentication server and configuring it to secure remote desktop access.

So stay tuned

Source: Secure terminal Services (RDP) using Azure Multi-factor Authentication (MFA) – Part 1 – Azure Dummies

Configuring internal load balancer using Azure Resource Manager

A follow-up to the SQL listener in Azure: Click



Availability Group Listener in Windows Azure Now Supported! (And Scripts for Cloud-Only Configuration)

If you haven’t noticed, AlwaysOn Availability Groups in Windows Azure now support availability group (AG) listeners. Configuring the AG listener in Windows Azure is a multi-step process that includes:

Creating a load-balanced endpoint for the Windows Azure VMs that host AG replicas.
Ensuring that hotfix KB2854082 is installed on all cluster nodes (including any node that does not host an AG replica).
Opening the probe port in the cluster nodes that host AG replicas.
Manually creating a client access point in the cluster and configuring the IP address parameters to use the cloud service IP address, the probe port, etc.
Setting the listener port in SQL Server.
If you are looking for an easier way to configure the listener in Windows Azure, I’ve published a script at the Script Center. This script currently has limited applications, but hopefully I can expand the scenarios as time goes on – and if you shout in my ear. If your scenario fits all the requirements, then I hope this script can help simplify the process of listener creation. If you don’t fit all the requirements, you may still be able to “scriptify” most of the steps. So just read on.

Now back to the requirements for this script, the biggest limitations are as follows:

All AG nodes are running in Windows Azure and in the same subnet – Simply put, if the same listener requires multiple IPs, you can’t use this script. This means the script excludes all multi-subnet scenarios, such as hybrid IT.
All cluster VMs were created with PowerShell Remoting enabled – This part gets a little hairy, so get ready. If after GA (4/16) you created your cluster VMs using PowerShell, then they are all PowerShell Remoting enabled. If however, you created your VMs using the portal, you had a choice until very recently to enable PowerShell Remoting by means of a small check box. If you didn’t check that box, I won’t say you lose, but you definitely can’t use this script, at least not without manually enabling PowerShell Remoting on your VMs and tweaking the script. My personal opinion is: not worth the trouble.

Now, the Azure team made a small tweak on 7/16 that enables PowerShell Remoting for all portal-created VMs without giving you the option. So if you created your VM after 7/16, then you win!

So enough for the $winners, now for the -not($winners) – those who can’t use the script because of the above limitations. I’d like to provide some PowerShell snippets that you can run and that hopefully can make things simpler as well. Mainly, there are three scripts: one to run on your local client from which you normally administer your Windows Azure deployment, one to run on all your cluster VMs, and one to run on the primary replica VM. Understand that these scripts are much more “quick and dirty” than foolproof, so don’t expect the validation and error checking that you find in the downloadable script. Also, you should have created a working AG in Windows Azure before using these steps. So now, without further ado, here are steps:

Windows Azure PowerShell June 2013 or later installed on the local client. Download at
On your local client, copy and paste the following script into a Windows Azure PowerShell session to configure LB endpoints with DSR for each AG node (not necessarily each WSFC node):
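
A rough sketch of such a script, using the classic (service management) Azure PowerShell module; the cloud service name, VM names, endpoint names, SQL port and probe port below are all placeholders you would replace with your own values:

    # Classic Azure PowerShell (June 2013 or later); assumes Add-AzureAccount / Select-AzureSubscription already done.
    $serviceName = 'MyCloudService'            # placeholder cloud service name
    $agNodes     = 'SQLVM1', 'SQLVM2'          # placeholder AG replica VM names
    $sqlPort     = 1433
    $probePort   = 59999                       # placeholder probe port; must match the cluster/listener probe port

    foreach ($vmName in $agNodes) {
        Get-AzureVM -ServiceName $serviceName -Name $vmName |
            Add-AzureEndpoint -Name 'SqlListenerEP' -Protocol tcp `
                -LocalPort $sqlPort -PublicPort $sqlPort `
                -LBSetName 'SqlListenerLB' -ProbeProtocol tcp -ProbePort $probePort `
                -DirectServerReturn $true |
            Update-AzureVM
    }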


Connect to an RDP session on each WSFC node and download hotfix KB2854082 to a local directory.
In the RDP session on each WSFC node, copy and paste the following script into an elevated PowerShell session to install hotfix KB2854082 and open the probe port in the firewall if the node is an availability group node. Be careful to run the script to completion on each node before moving on to the next node.
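
A simplified sketch of what that per-node script might look like; the hotfix file path and probe port are placeholders, and the firewall rule should only be added on nodes that host an AG replica:

    # Run in an elevated PowerShell session on each WSFC node, one node at a time.
    $hotfixPath = 'C:\Temp\Windows8-RT-KB2854082-x64.msu'   # placeholder path to the downloaded hotfix
    $probePort  = 59999                                     # placeholder; must match the LB endpoint probe port
    $isAgNode   = $true                                     # set to $false on cluster nodes without an AG replica

    # Install the hotfix silently and wait for it to finish before moving on to the next node
    Start-Process wusa.exe -ArgumentList "`"$hotfixPath`" /quiet /norestart" -Wait

    if ($isAgNode) {
        # Open the probe port so the load balancer probe can reach this replica
        netsh advfirewall firewall add rule name="SQL AG Listener Probe" dir=in action=allow protocol=TCP localport=$probePort
    }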


Test the connection to the listener from a domain-joined VM that is not in the same cloud service (DSR is not supported from within the same cloud service). Use a longer login timeout, since network messages traverse the VM’s public endpoint. You can use sqlcmd or SSMS.
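
A hedged example of such a test with sqlcmd, where the listener name, port and login timeout are placeholders:

    # Run from a domain-joined VM outside the cloud service that hosts the AG replicas.
    # 'AGListener' and the 30-second login timeout are placeholders.
    sqlcmd -S "AGListener,1433" -d master -Q "SELECT @@SERVERNAME" -l 30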


Fail over the AG and test the listener connection again. The query above should succeed and return a different server name.


Azure IaaS Toolkit

I was at the Azure Global Bootcamp last month where we built a number of Azure VMs and networks. We also configured load balancing and several other settings. When I came home, I thought that it would be much easier if we could manage endpoints, load balancers and network security groups via a GUI. The end result is available on the TechNet Gallery. Please take a look at it, and if you have any feedback, please let me know.

Available on Technet Gallery

When you start the tool, it discovers an already connected Azure subscription; if none is found, you can add one by using Add Azure Account. You can then browse through your cloud services and the VMs inside them. There is an overview of the selected VM where you can quickly connect, stop, start or delete the VM. Be careful with delete: the confirmation doesn’t seem to work somehow, so it’s not included in the script; only the last VM in the service will give a warning.

On the Networking / LB tab you have the option to create, edit and remove Endpoints and ACLs on an endpoint. When you click directly on the edit button you see the default endpoints Azure created for you.

If you want to add a new one, close the edit screen and fill out the form in the endpoints section. When you check Configure Load Balancing Set, it lets you create a set that you can use to load-balance multiple machines on the same port. Click Add to add the endpoint / LB set.

Now when you click Edit again, you see the endpoint listed. You can then manage the ACL for that endpoint, for example to allow only a specific IP range to access your web site:

*Only for load-balanced sets do you have to manually refresh the view; it looks like Azure uses a background task, so the newly created ACL does not show up directly in the grid view. For normal endpoints the view is updated immediately.

On the Network Security Groups tab you can create, edit and remove NSGs. If you have an existing NSG, you can manage its ACL by selecting the NSG and clicking Edit.

When you click on edit you get a new view with all the inbound and outbound ACLs applied to that NSG. You can create, edit and remove ACL rules here.

When you have created the NSG you can then choose to add it to a VM or add it to a Network:

I would encourage you to download the tool and play with it. If you have any feedback, features you would like to see please let me know.

Source: Azure IaaS Toolkit

AD FS Windows 2012 R2: adfssrv hangs in starting mode

Note that if you just set the Microsoft Key Distribution Service to Automatic, it will end up being set to Automatic (Trigger Start).  The default trigger has created problems for AD FS startup (for me and maybe others too).  You can query the trigger configuration by running the sc qtriggerinfo kdssvc command.  The default for the Microsoft Key Distribution Service is using an RPC trigger which will start the service when a request is received on the interface.  In my testing, I still run into trouble with the AD FS service starting up.

The workaround that I found to be consistent is changing the trigger configuration so that it relies on a different trigger.  The command to use is sc triggerinfo kdssvc start/networkon which starts the service when the network is on (typically very early in the boot cycle).
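
Putting the commands from the last two paragraphs together, run the following in an elevated prompt on the affected server:

    # Show the current trigger configuration for the Microsoft Key Distribution Service
    sc.exe qtriggerinfo kdssvc

    # Replace the default RPC trigger with a network-on trigger so the service starts early in the boot cycle
    sc.exe triggerinfo kdssvc start/networkon

    # Confirm the new trigger
    sc.exe qtriggerinfo kdssvc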

I also tested removing the trigger completely but that wasn’t effective at all.

If you have AD FS installed on a DC running 2012 R2 and use a gMSA for the service account, set kdssvc to Automatic (instead of the manual trigger start) and restart the DC. You’ll find this fixes the issue, and you should no longer see a recurrence.

Please report back if you do

Source: AD FS Windows 2012 R2: adfssrv hangs in starting mode

Changing DISKID | Symantec Connect Community

Backup Exec (BE) recognises each disk by its unique disk ID. Under some circumstances, two disks can have the same disk ID. This presents a problem when they are used for B2D folders. Below is how to fix this problem.


All our external USB SATA backup drives are recognized as normal Disk storage and I managed to add most of these to Backup Exec 2012 and put them in a Device Pool. Some disks from the same manufacturer batch were recognized as the same disk in BE Storage (e.g. disk 10 and 11 both showed up as disk 10) and I was not able to add a new disk device.

Then I thought about what might distinguish one disk from the other: the disk signature ID in sector 0. This is where Mark Russinovich, our SysInternals guru, came to the rescue, because I found this article:


  1. Attach the first disk to your server and use DISKPART to find the signature:
    SELECT DISK #, where # is the disk number in your system (use Disk Management to find the disk number), then UNIQUEID DISK to display the signature.
    This will give you a 4-byte disk ID, e.g. “e9eb3aa5”.
  2. Now attach the second (or third, etc.) disk and do the same; find the unique ID, which will show as the same ID as the first. With the same DISKPART tool you can overwrite the ID with a new one; try incrementing the number by 1 for each new disk (that is what I did). A scripted version is sketched after this list.
    Use the following command in DISKPART: UNIQUEID DISK ID=e9eb3aa6, where the ID is the one you choose; this is just an example (see also the article above).
  3. After the ID is changed, I deleted all BE folders from that disk (/BEControl and /BEdata), disconnected the USB drive and reconnected it.
  4. This time BE Storage does not find the disk in the list, and you can add it as a new disk.
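
If you prefer to script the signature change rather than type the DISKPART commands interactively, a minimal sketch looks like this; the disk number (3) and the new ID (e9eb3aa6) are just the example values used above:

    # Build a small DISKPART script and run it from an elevated prompt.
    # Disk 3 and the new ID e9eb3aa6 are only examples - pick your own disk number and a unique ID.
    $dpCommands = 'SELECT DISK 3',
                  'UNIQUEID DISK',
                  'UNIQUEID DISK ID=e9eb3aa6'
    $dpCommands | Out-File "$env:TEMP\fix-diskid.txt" -Encoding ascii
    diskpart /s "$env:TEMP\fix-diskid.txt"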

Office 365 Shared Mailbox Sent Items Limitation

Office 365 shared mailboxes make it easy for a specific group of people to monitor and send email from a common email address. When a person in the group replies to a message sent to the shared mailbox, the email appears to come from the shared mailbox, not from the individual user. The shared mailbox sent items limitation, which caused compliance/regulatory issues for many customers using shared mailboxes, is now fixed. Microsoft has fixed this issue for all customers, and customers can now fully leverage shared mailboxes in Office 365 and the on-premises version of Exchange. Before we look at the resolution, let’s have a look at the shared mailbox sent items issue.

When a user sends an email using the shared mailbox alias, the email goes to the sender’s own Sent Items instead of the shared mailbox’s Sent Items. This causes issues for organizations using shared mailboxes. To meet compliance requirements, organizations need control over shared mailbox sent items, so that admins don’t have to run mailbox audit logging to nail down the sender of an email.

Shared mailboxes are a great way to handle customer email queries because several people in your organization can share the responsibility of monitoring the mailbox and responding to queries. Your customer queries get quicker answers and related emails are stored in one mailbox. A shared mailbox doesn’t have its own user name and password.

There was no built-in control over shared mailbox sent items in Exchange 2013 and Exchange Online, and as a workaround customers had to make registry changes on end users’ machines. This caused a lot of administrative overhead for customers and IT admins. Several of my customers did not consider upgrading to Exchange 2013 or migrating to Office 365 due to this sent items limitation.

The registry fix doesn’t work for users running Outlook in online mode; it is only applicable when the user is running Outlook in cached mode.

With Exchange 2013 CU9, email sent by a user as a shared mailbox is delivered to the Sent Items folder of the shared mailbox as well as to the user’s own Sent Items. Microsoft has already started rolling out this new feature to customers using Office 365 shared mailboxes.

Enable Office 365 Shared Mailbox Sent Items Feature

By default, this feature is disabled for all customers, and an administrator needs to run the following PowerShell cmdlets.

For emails Sent As the shared mailbox: Set-mailbox <mailbox name> -MessageCopyForSentAsEnabled $True

For emails Sent On Behalf of the shared mailbox: Set-mailbox <mailbox name> -MessageCopyForSendOnBehalfEnabled $True

Disable Office 365 Shared Mailbox Sent Items Feature

To disable this feature, run the following PowerShell cmdlets:

For emails Sent As the shared mailbox: Set-mailbox <mailbox name> -MessageCopyForSentAsEnabled $False

For emails Sent On Behalf of the shared mailbox: Set-mailbox <mailbox name> -MessageCopyForSendOnBehalfEnabled $False

To enable this feature for Exchange Online, we first need to connect to Exchange Online PowerShell, as shown below.
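
A sketch of that connection, followed by the cmdlets from above; the mailbox name “Support” is only an example:

    # Connect to Exchange Online remote PowerShell (the classic remote PowerShell method)
    $cred    = Get-Credential
    $session = New-PSSession -ConfigurationName Microsoft.Exchange `
        -ConnectionUri 'https://outlook.office365.com/powershell-liveid/' `
        -Credential $cred -Authentication Basic -AllowRedirection
    Import-PSSession $session

    # Example: enable both sent-items copy behaviours for a shared mailbox called "Support" (name is illustrative)
    Set-Mailbox 'Support' -MessageCopyForSentAsEnabled $true -MessageCopyForSendOnBehalfEnabled $true

    # Clean up the remote session when finished
    Remove-PSSession $session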

More information on the new enhancements in Exchange 2013 CU9 can be found in the Microsoft Knowledge Base.

Changing the Certificate on ADFS 3.0 and Web Application Proxy (WAP)

As with all systems using certificates for security, there comes a time when the certificate is about to expire and needs to be replaced. Here’s the procedure for ADFS 3.0 and WAP:

Starting with the ADFS server:

  1. Log onto the ADFS server.
  2. Add the new certificate to the server. Make sure this is added to the personal certificate store for the computer account. I usually do this using the certificates snap-in in MMC.
  3. Find the thumbprint for the new certificate. This can be found by looking at the details for the certificate; the thumbprint is usually at/near the bottom of the list of details for the certificate and consists of 40 hexadecimal characters. Take a copy of the thumbprint and ensure that the spaces are removed, so it’s a 40 character string; you’ll need this in a few moments.
  4. Grant the service account that is running the ‘Active Directory Federation Services’ service read access to the private key. To do this, follow these steps:
    1. Within the certificates snap-in of MMC, right click the certificate, select ‘All Tasks’ and then select ‘Manage Private Keys…’:
      Manage private keys
    2. Click ‘Add…’ to add the user account running the ADFS service on the server and grant read access to that user. Click OK on the permissions dialog to close it.
  5. Launch AD FS Management, expand ‘Service’ within the left pane and click ‘Certificates’:
    AD FS Manager Certificates
  6. Click ‘Set Service Communications Certificate…’ from the actions panel at the right of the screen:
    Set Services Communication Cert
  7. A dialog is shown presenting the available certificates on the server. Select the new certificate that is to be used. If you are unsure of the correct certificate, select each certificate in turn and click the ‘Click here to view certificate properties’ link which is shown and compare the thumbprint with that recorded earlier. Click OK on the dialog once the correct certificate is selected.
  8. If at this point you restart the server or ADFS service and make a connection to ADFS, you will still be presented with the original certificate. The change in the GUI changes the configuration in the ADFS configuration database, but not the certificate bound to HTTP.sys.
  9. To complete the configuration change, the following PowerShell command must be run:
    Set-AdfsSslCertificate –Thumbprint 00112233445566778899aabbccddeeff00112233
    Where 00112233445566778899aabbccddeeff00112233 should be replaced with the thumbprint you found earlier.
  10. Restart the server, or the ADFS service on the server to complete the configuration change.
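
Steps 9 and 10 can be combined into one small snippet; the thumbprint below is the same illustrative value used throughout this section:

    # Run in an elevated PowerShell session on the ADFS server.
    # Replace the thumbprint with the value you copied from the new certificate.
    $thumbprint = '00112233445566778899aabbccddeeff00112233'
    Set-AdfsSslCertificate -Thumbprint $thumbprint

    # Restart the AD FS service so HTTP.sys picks up the new binding
    Restart-Service adfssrv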

Additional configuration is required on the WAP server:

  1. Log onto the WAP server.
  2. Add the new certificate to the server. Make sure this is added to the personal certificate store for the computer account.
  3. Run the following PowerShell command to change the certificate:
    Set-WebApplicationProxySslCertificate –Thumbprint 00112233445566778899aabbccddeeff00112233
    Where 00112233445566778899aabbccddeeff00112233 should be replaced with the thumbprint you found earlier.
  4. All of the publishing rules need to be updated with the thumbprint of the new certificate (you created these originally using PowerShell, right?). This can be done by either deleting the old rules and recreating them with the new certificate thumbprint specified, or the rules can be updated with the new thumbprint (a consolidated sketch follows this list), for example:
    Get-WebApplicationProxyApplication –Name “WebAppPublishingRuleName” | Set-WebApplicationProxyApplication –ExternalCertificateThumbprint “00112233445566778899aabbccddeeff00112233”
    Where (you guessed it!) 00112233445566778899aabbccddeeff00112233 should be replaced with the thumbprint you found earlier and ‘WebAppPublishingRuleName’ should be replaced with the name of the rule as it is shown in the Remote Access Console.
    I expected the federation publishing rule that was created automatically when WAP was originally configured to be updated for me, but had to manually switch the certificate on that one.
  5. Restart the server, or the ADFS and Web Application Proxy services to complete the configuration.
  6. Test that all of the previously published rules function correctly and provide the new certificate to the computer from which you are making a connection. If you need to check the certificate assigned to a specific publishing rule, the following PowerShell will show all of the properties for the publishing rule:
    Get-WebApplicationProxyApplication –Name “WebAppPublishingRuleName” | fl
    Note that the other parameters shown in the list generated by the above can also be changed (with a few exceptions) using the Set-WebApplicationProxyApplication cmdlet.
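
To avoid editing each publishing rule by hand, the following sketch switches the proxy SSL certificate and then updates every rule that still references a different thumbprint (again using the illustrative thumbprint from this article):

    # Run in an elevated PowerShell session on the WAP server.
    $newThumbprint = '00112233445566778899aabbccddeeff00112233'   # replace with your certificate thumbprint

    # Bind the new certificate for the proxy itself
    Set-WebApplicationProxySslCertificate -Thumbprint $newThumbprint

    # Update every publishing rule that still references a different certificate
    Get-WebApplicationProxyApplication |
        Where-Object { $_.ExternalCertificateThumbprint -ne $newThumbprint } |
        ForEach-Object {
            Set-WebApplicationProxyApplication -ID $_.ID -ExternalCertificateThumbprint $newThumbprint
        }

    # Restart the Web Application Proxy and AD FS services to complete the change
    Restart-Service appproxysvc, adfssrv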


Source: Changing the Certificate on ADFS 3.0 and Web Application Proxy (WAP)

Windows IoT – SetupBoard

Get Started

Learn how to set up the Raspberry Pi 2 and connect it to your computer. Note that this requires a PC running Windows 10 Technical Preview.

Steps: 1. Select Your Device  2. Set up your Raspberry Pi 2 (30 min)  3. Set up your PC  4. Develop

What you need

- A PC running Windows 10 Insider Preview.
- Raspberry Pi 2.
- 5V micro USB power supply with at least 1.0A current. NOTE: You may want to use a higher-current power supply (>2.0A) if you plan on using several USB peripherals or high-current devices.
- An 8 GB Class 10 (or better) micro SD card. If you don’t have an SD card, we suggest this one or this one.
- HDMI cable (if a display is desired).
- Ethernet cable.

Put the Windows 10 IoT Core Insider Preview image on your SD card

We have provided a utility to provision your SD card with the Windows 10 IoT Core Insider Preview. The following steps can only be executed on a system running Windows 10 (build 10069 or higher). Follow these instructions to configure your SD card. NOTE: you will need to follow these instructions on a physical Windows machine (not a VM) because you need access to the SD card reader.

1. Configure your connect account here. Note that if your account was already configured, you’ll see a blank page.
2. Make a local copy of the flash.ffu found here.
3. Insert an SD card into your SD card reader.
4. Open an administrator command prompt and navigate to the folder containing your local flash.ffu.
5. Find the disk number of your SD card on your computer; this is used when the image is applied in the next step. You can use the diskpart utility: run diskpart, then list disk, then exit.
6. Using the administrator command prompt, apply the image to your SD card with the following command (replace PhysicalDriveN with the value you found in the previous step; for example, if your SD card is disk number 3, use /ApplyDrive:\\.\PhysicalDrive3):
   dism.exe /Apply-Image /ImageFile:flash.ffu /ApplyDrive:\\.\PhysicalDriveN /SkipPlatformCheck
7. Click the “Safely Remove Hardware” icon in your task tray and select your USB SD card reader to safely remove it from the system. Failing to do this can corrupt the image.

Hook up your board

1. Insert the micro SD card you prepared in the section above (the slot is on the opposite side of the board shown below).
2. Connect a network cable to the Ethernet port on the board.
3. Connect an HDMI monitor to the HDMI port on the board.
4. Connect the power supply to the micro USB port on the board.

Boot Windows 10 IoT Core Insider Preview

Windows 10 IoT Core Insider Preview will boot automatically after you connect the power supply. On the very first boot, Windows IoT Core performs some first-boot configuration and displays a default blue-colored application while this is happening. Wait a few minutes and the board will restart automatically. This happens only once, after which DefaultApp should come up, displaying the IP address of the Raspberry Pi 2. Follow the instructions here to use PowerShell to connect to your running device. It is highly recommended that you update the default password for the Administrator account; please follow the instructions found in the PowerShell documentation. Remote Debugger will launch automatically when your Raspberry Pi 2 boots.
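
If you would rather run the disk lookup and the image-apply steps from a script than type them interactively, a minimal sketch is shown below; disk number 3 is only the example value used above, and flash.ffu is assumed to be in the current directory:

    # Run from an elevated prompt on a physical Windows 10 PC, in the folder that contains flash.ffu.

    # 1. List the disks so you can identify the SD card's disk number
    'LIST DISK' | Out-File "$env:TEMP\list-disk.txt" -Encoding ascii
    diskpart /s "$env:TEMP\list-disk.txt"

    # 2. Apply the image - replace PhysicalDrive3 with the disk number you found above
    dism.exe /Apply-Image /ImageFile:flash.ffu /ApplyDrive:\\.\PhysicalDrive3 /SkipPlatformCheck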

Source: Windows IoT – SetupBoard