Shawn Weisfeld

I find when I talk to myself nobody listens. - Shawn Weisfeld


The views expressed in this blog are mine and mine alone, not those of my employer, Microsoft, or anyone else. No warranty is given for the quality of any material on this site.



Friday, March 31, 2017

Working with Closed Captioning and the Azure Media Services Media Player

First, let me commend the Azure Media Services team for the awesome job they did with the Azure Media Player. Not only is it a great tool to help debug and test your streaming media, it is also a great platform to use in your production application.

First, let's encode and publish your video file. In production you will likely write code to integrate this workflow into your application, but for this exercise I will use the Azure Media Services Explorer (AMSE).

  • Using the Azure Portal, create an Azure Media Services account.
  • Connect AMSE to your Azure Media Services account.
  • Using AMSE, upload your video asset (look for the Upload command in the Asset menu).
  • When the upload finishes, right-click the file on the Asset tab and encode it.
  • Once the encoding is finished, right-click the new entry on the Asset tab and select Publish & Create a locator.

Great, now your asset is ready for streaming. Grab the publish URL and test it out!

Right-click the publication endpoint on the Asset tab, select Publish, and copy the publish URL to the clipboard. This gives you a URL that ends in ".ism/manifest"; paste it into the URL box of the Azure Media Player, press the Update Player button, and your video should start playing.

Now that we can see our video, let's add the closed captioning. Typically this is done with a WebVTT file. You can write one by hand, or you can let the Azure Media Services Indexer do it for you.

Back in AMSE, right-click your source video file again, select Media Analytics and then Index Assets. This will enqueue a new job that strips the audio out of the video, converts it to text, and creates, among other things, our WebVTT file. When it is done, it will dump all of these assets out to Azure Blob storage. Now you have a choice of where you want to host these files.

The three most common places are on AMS in your publishing endpoint, on Azure Blob Storage, or on your website. The trick with all three of these is having the proper cross domain access policy in place.

If you choose to host the WebVTT file on AMS, we have already done this work for you: the cross-domain policy is already in place on the streaming endpoint. To publish your WebVTT, you just need to move it from the storage account where the indexer dropped it to your publish endpoint. You can do this in AMSE: double-click the asset folder with the output of the indexer job, flip over to the Asset Files tab in the dialog that opens, and download the .vtt file. Once you have it downloaded, go back to the main screen in AMSE, double-click the published video file, flip over to the Asset Files tab, and upload it. The URL to the file will be the same as the URL to the ".ism/manifest": just remove the "/manifest" part and replace the .ism file name with the file name of your .vtt file.

If you choose to host the WebVTT file on your own website, you will likely need to set up the cross-domain policy yourself. The easiest way to do this is to download the policy we use and upload it to the root of your website. Our policy is pretty wide open, so you will likely want to lock it down a bit more.

Finally, if you choose to host the WebVTT file on Azure Blob storage, you will need to configure the cross-domain policy there. The easiest way to do this is via the Azure Storage Explorer: find the storage account in question, right-click the Blob Containers node, and select Configure CORS.
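For reference, a Blob service CORS rule (the kind the Configure CORS dialog writes into the service properties) looks roughly like this; the origin below is a placeholder you would replace with the domain hosting your player:

```xml
<Cors>
  <CorsRule>
    <!-- Allow only the site that serves your player to fetch the .vtt file -->
    <AllowedOrigins>https://yourplayer.example.com</AllowedOrigins>
    <AllowedMethods>GET,HEAD</AllowedMethods>
    <AllowedHeaders>*</AllowedHeaders>
    <ExposedHeaders>*</ExposedHeaders>
    <MaxAgeInSeconds>3600</MaxAgeInSeconds>
  </CorsRule>
</Cors>
```

While testing you can allow all origins with `*`; lock it down before production.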

Now that you have your cross domain policy configured, drop the URL to the VTT file into the Advanced Options section of the Azure Media Player, under tracks. After you press the Update Player button, you should be able to select your captions track and see them in the player!

If you didn't configure CORS correctly, it will not work. You can confirm this by using the web debugging tools in your favorite browser. First, check the network tool to ensure that your .vtt file is being downloaded (200 OK). If it is not downloading, check that you have the path correct. If the file is downloading and you still don't see the captions, check the console window; you will see an error there if your CORS policy is not set correctly.

Hope this helps you get closed captioning working in your media application. Big thanks to Saili from the AMS team for working with me to get this all ironed out.

posted @ Friday, March 31, 2017 6:04 AM | Feedback (2) |

Thursday, October 6, 2016

How to change the replication of an Azure storage account

The Azure documentation does a great job of explaining the different replication options for Azure Storage.

My general rule of thumb: I like LRS for my VM hard drives, since it provides higher throughput, it is the least expensive, and I typically have workload-specific backup/HA/DR for them. I like RA-GRS for my backups, as it lets me read from the secondary and run DR drills any time I need, without touching anything in the primary Azure region.

Step #1: open your storage account in the portal and identify which replication option is currently enabled. In this screen shot you can see I am using LRS.

Step #2: go to configuration, select the new replication level, press save.

Step #3: Done! Verify the change. Depending on the amount of data in your account it might take a bit of time to get everything synced up.
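If you prefer to script the change, here is a sketch using the AzureRM PowerShell cmdlets of this era (the resource group and account names are placeholders, and later module versions replace -Type with -SkuName):

```powershell
# Inspect the storage account's current replication setting
Get-AzureRmStorageAccount -ResourceGroupName "MyResourceGroup" -Name "mystorageaccount"

# Switch the account from LRS to RA-GRS
Set-AzureRmStorageAccount -ResourceGroupName "MyResourceGroup" -Name "mystorageaccount" -Type "Standard_RAGRS"
```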

posted @ Thursday, October 6, 2016 3:12 AM | Feedback (2) |

Using Azure Active Directory Groups for SQL Azure Authentication

Setting database permissions based on Active Directory credentials and groups has been a tried-and-true technique for on-premises authentication for years. With the addition of AAD support to SQL Azure, we can now use this technique in the cloud. However, I could not find a good step-by-step on how to use AAD groups in this scenario, so I put this post together; hopefully it will help you.

The first step is to confirm which AAD is "trusted" by the Azure subscription you want to deploy the database into. Navigate to the settings tab; here you will see your subscriptions listed, with the directory each is tied to on the right. NOTE: if you need to change this, see the details in the official documentation.

With that housekeeping out of the way, let's set up some accounts to test this with. I will create two: a user account and an admin account. Navigate to your directory, select Add User, and go through the wizard. In this screen shot you can see the two users I created. NOTE: after I create new accounts in AAD, I always log in once to trigger a password reset. I do this in a new InPrivate browser window by going to the Azure Portal. After logging in it should say "No Subscriptions Found"; this is expected.

Now that we have some accounts, let's go ahead and create two groups, one for admins and another for users, and put each account in its group. This can be done on the Groups tab in AAD: click Add Group at the bottom and go through the wizard. After you add the group, don't forget to put the proper users in it. Here are some screen shots from my portal after I completed the process for the admin group.

Ok, one more piece of setup: let's create a SQL Azure DB. During this process you will have to assign it a non-AAD admin account. I used the Adventure Works sample for this demo so it would have some data.

Ok, now we get to the fun part. The first thing we want to do is assign the admin group as admin on our entire SQL Database server, which we can do easily in the Azure Portal. From the database blade (screen shot above), click the server name in the upper right to bring up the server blade; there you can select Active Directory admin from the options list, and then Set admin. This pops up a blade for you to search for the users or groups you want to assign. As you can see in the screen print, I did a search for "SQL" and can see the two users and two groups we created. I am going to select SQL Admin and then press the Select button.

Now you must save your changes. The save button is hidden under the more ellipsis.

Great, now our admins can log into our database server with SSMS. NOTE: I said server; the admin group permission lives in the master DB. Also, notice that I am logging in as the user, but we assigned the permissions to the group. Now, when users are added to or removed from the group, the permissions are reflected automatically in SQL Azure.

For our users, we don't want them in master, just in the database. So let's give our users access to just the database. We can do this with a bit of T-SQL while logged in as the admin. NOTE: I set the database in the drop-down before executing the T-SQL. Also, I am giving the entire group access, not just my user, for the same reasons we did it at the server level.

And let's give our users the db_owner SQL role.
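The T-SQL itself only appears in the post's screen shots, so here is a sketch of what it looks like, assuming a group named "SQL Users" (run it against the target database, not master):

```sql
-- Create a contained database user mapped to the AAD group
CREATE USER [SQL Users] FROM EXTERNAL PROVIDER;

-- Grant the whole group the db_owner role in this database
ALTER ROLE db_owner ADD MEMBER [SQL Users];
```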

Now log in with the user account the same way we did before; however, you must set the database under "Connection Properties" in the options dialog, or you will get this error.

And now our users can get into just this database and do what they need to, all with the security of Azure Active Directory and the convenience of using AAD groups for maintenance.

posted @ Thursday, October 6, 2016 1:20 AM | Feedback (0) |

Monday, September 19, 2016

AT&T U-verse Gigapower & Skype/Skype for Business

Like many, I was super excited to get AT&T U-verse Gigapower to my home. Can you say FAST? No, really, it is FAST!

However, I have had some problems with the first few seconds of my Skype/Skype for Business phone calls being unbearably bad. I was chatting with a colleague (thanks, Ahmed) and he had the fix: two small tweaks to the default firewall settings on the AT&T router.

  1. Connect to your local network at home.
  2. Log into your router via the browser. You can find the router's address by looking at the default gateway in the ipconfig output on your PC.
  3. Navigate to "Firewall" and then "Firewall Advanced"
  4. Switch "Flood Limit" and "SIP ALG" to OFF
  5. Save the settings
  6. Call your mom and tell her that you love her.

posted @ Monday, September 19, 2016 6:45 AM | Feedback (0) |

Tuesday, May 24, 2016

Authenticating a service principal with Azure Resource Manager – via a password file

The Azure documentation has a great article on authenticating a service principal with Azure Resource Manager. It does a good job of outlining the steps needed to automate login via PowerShell.

However, it assumes that you can store the password in an Azure Key Vault. In some scenarios this is not ideal; in my case I wanted to store my password securely in a file on the file system. We will leverage the .NET SecureString to protect the password. This means that only the user that created the file can decrypt and use it, so when saving this value, use the same account that the script or service will use.

The one-time setup encrypts the password as a .NET SecureString and writes the result out to a file; the file contains a single long string of encrypted text.

Now I can use the file that I created whenever I need to log in, and no longer do I have to type in my password!
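A sketch of both halves in PowerShell (the file path, application ID, and tenant ID are placeholders; the AzureRM cmdlet names may differ in newer modules):

```powershell
# --- One-time setup: run as the account the script/service will use ---
# DPAPI-encrypt the password and save it; only this user can decrypt the file
Read-Host -Prompt "Service principal password" -AsSecureString |
    ConvertFrom-SecureString |
    Out-File "C:\secure\sp-password.txt"

# --- At login time: rebuild the credential from the file and sign in ---
$secure = Get-Content "C:\secure\sp-password.txt" | ConvertTo-SecureString
$cred = New-Object System.Management.Automation.PSCredential("<application-id>", $secure)
Add-AzureRmAccount -ServicePrincipal -Credential $cred -TenantId "<tenant-id>"
```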

posted @ Tuesday, May 24, 2016 11:31 AM | Feedback (1) |

Sunday, April 5, 2015

Speakers Notes: The tools

Outside of my day job, I am a frequent speaker at community events. I always get asked about the process I use to prepare. Today I would like to talk a bit about the tools I use.

Source Control

As a software developer, I use source control for all my code, and since many of my presentations are code heavy, it wasn't a huge leap to use source control for my presentations too. I even put my PowerPoint deck in source control. This has a few advantages. First, if I make a mistake and want to "go back in time," I can do that. Second, it gives me a bit of a backup: while it hasn't happened in years, if I have technical difficulties on my device and need to present from someone else's computer, I have a full backup of everything online. Third, it makes it easier to share my content with the attendees, as I can just point them at the repository, where they also have the opportunity to see how the talk has evolved over time. I have even seen some presenters replay commits or jump between branches to "Julia Child" their presentations.

There are many source control tools available, and while I am a huge fan of one service for most of my development projects, I have selected another for this task. I do this mainly because I keep all my private work on the former, and the unlimited free public repo pricing model of the latter makes it quite attractive for this purpose.

Talk Feedback

While IMHO it is important to give attendees the ability to download the code, it is also important for me to get feedback about each of my presentations; I use it to better myself over time. I have been logging my presentations with a free online service since the end of 2010. It has allowed me to collect valuable feedback every time I present, and it has given me a log of every talk I have done in a single list, which has a huge amount of value as an online resume of speaking engagements.

Use whatever tools work best for you; my biggest advice is to pick a process and stick to it. This will allow you to focus on building great content for your presentations and create an online history of the great talks you have done.

posted @ Sunday, April 5, 2015 1:19 AM | Feedback (0) |

Wednesday, March 11, 2015

Moving blob storage between Azure Commercial and Azure Government

Got a question from a customer on how to move their blob storage assets from Azure Commercial to Azure Government. The easiest way is via AzCopy, and you have two options.

Technique #1: Server-side copy

Even though the public cloud and the government cloud are on separate networks, the server-side copy routine works like a champ. Server-side copy schedules a job on the Azure backbone that moves the files using "spare bandwidth capacity".

AzCopy.exe /Source:<source-container-URL> /Dest:<destination-container-URL> /SourceKey:MySourceKey /DestKey:MyDestKey /Pattern:MyBlob.txt

More technical details on how server-side copy works under the covers can be found in the Azure Storage team's documentation.

Technique #2: Jump VM

If you need more control, consider setting up a "jump" VM. Using this technique you can "reserve" capacity by purchasing VMs to do the copy. While this requires a bit more work, it allows you to scale your copy server(s) to whatever level you need: an IaaS VM running AzCopy would work great for this, or you can write something more complex using PaaS worker roles. The basic idea is to copy the files down to the local disk on the VM and then back up to blob storage on the other end.
BTW: if you need extra local disk to do the transfer, try setting up Storage Spaces.

Regardless of which technique you select, you will be responsible for any egress charges for moving data between two different datacenters.

posted @ Wednesday, March 11, 2015 1:44 AM | Feedback (0) |

Thursday, February 26, 2015

Redis Timeout errors with the StackExchange Redis Client

While I am not a big fan of writing large chunks of data to Redis, nor of writing to Redis synchronously, I offer this information in case it helps you.

When writing large values to Redis, a customer observed the following timeout error using the StackExchange client:

{"Timeout performing SET MyName, inst: 0, mgr: Inactive, queue: 2, qu=1, qs=1, qc=0, wr=1/1, in=0/0"}

I was writing the value to Redis synchronously, something like this:

db.StringSet("MyName", foo);

I found that I could resolve the issue by writing the string asynchronously. Of course, by tacking Wait onto the end of the line I get none of the async benefits, so if you are going to do this in your app, you probably want to make the entire method async.

db.StringSetAsync("MyName", foo).Wait();

Another option is to bump up the SyncTimeout. Don't just set it to the max value; that is not a best practice, and you will want to pick a value that makes sense for your application.
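A sketch of bumping SyncTimeout via the StackExchange.Redis ConfigurationOptions (the connection string and the 30-second value are placeholders, not recommendations):

```csharp
using StackExchange.Redis;

var options = ConfigurationOptions.Parse(
    "<your-cache>.redis.cache.windows.net:6380,ssl=true,password=<key>");
options.SyncTimeout = 30000; // milliseconds; pick a value that fits your workload

ConnectionMultiplexer muxer = ConnectionMultiplexer.Connect(options);
IDatabase db = muxer.GetDatabase();
db.StringSet("MyName", "foo"); // the sync call may now take up to 30s before timing out
```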

The StackExchange.Redis documentation has a good article that explains timeout errors in more detail.

posted @ Thursday, February 26, 2015 3:15 AM | Feedback (0) |

Wednesday, February 11, 2015

Azure WebRole OnStart Trace.Write Azure SDK 2.5

With earlier versions of the Azure SDK, when you wanted to "hook up" the DiagnosticMonitorTraceListener to log information using Trace.Write in the OnStart method of an Azure WebRole, you could simply make the following call, and the Azure trace listener would collect all the Trace statements from your OnStart code.

System.Diagnostics.Trace.Listeners.Add(new Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener());

However, in Azure SDK 2.5 the DiagnosticMonitorTraceListener no longer inherits from System.Diagnostics.TraceListener, so you must wrap it in a TraceListener in order to inject it.
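The wrapper code is not reproduced in this copy of the post, so here is a reconstruction: a TraceListener subclass that delegates to the SDK listener, assuming the 2.5 DiagnosticMonitorTraceListener still exposes Write/WriteLine methods (MyTraceListener is the name the post uses below):

```csharp
using System.Diagnostics;
using Microsoft.WindowsAzure.Diagnostics;

// Adapts the SDK 2.5 listener back into a System.Diagnostics.TraceListener
// so it can be added to Trace.Listeners.
public class MyTraceListener : TraceListener
{
    private readonly DiagnosticMonitorTraceListener inner =
        new DiagnosticMonitorTraceListener();

    public override void Write(string message) { inner.Write(message); }
    public override void WriteLine(string message) { inner.WriteLine(message); }
}
```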

Now you can inject your TraceListener into the pipeline before you need to log anything, and you should start seeing your logs.

Trace.Listeners.Add(new MyTraceListener());

NOTE: this has been tested with the 2.5 version of the Azure SDK.

NOTE: you will get the following error after you add the above code: "The type 'Microsoft.Cis.Eventing.Listeners.RDEventMonitoringAgentListener' is defined in an assembly that is not referenced. You must add a reference to assembly 'MonAgentListener, Version=, Culture=neutral, PublicKeyToken=31bf3856ad364e35'." To resolve it, add a reference to the MonAgentListener DLL via the "Add References" dialog; you will find the DLL under the "Extensions" tab.

posted @ Wednesday, February 11, 2015 4:46 AM | Feedback (0) |

Monday, March 11, 2013

Upload files to Azure Blobs FAST

CAUTION: This worked for me...Use it at your own risk.

I was given a task to upload billions of small (10k) files to Azure; yes, that is billions with a B. Like everyone else, I started by reading the MSDN docs, which pointed me to toBlob.UploadFile and toBlob.UploadFromStream. Both proved to be a bit slow in my case; I can only assume the issue is the overhead of opening and closing the connections.

My first iteration was to just wrap everything with a Parallel.ForEach, and while this drastically improved things, it still was not good enough.

My second iteration was to bump up the "Default Connection Limit" (see line one in the example below); again, things got better, but not good enough.

My third iteration was to use Task.Factory.FromAsync to wrap the async calls that the API exposes in a Task; finally, things are much better now. (See the complete code sample below.)
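The complete sample did not survive in this copy of the post, so here is a reconstruction sketch using the classic Azure Storage client's Begin/End pair; the class name, connection-limit value, and method shape are illustrative, not the original code:

```csharp
using System.IO;
using System.Net;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Blob;

public static class FastUploader
{
    static FastUploader()
    {
        // "Line one": raise the default outbound connection limit so many
        // uploads can run in parallel against blob storage.
        ServicePointManager.DefaultConnectionLimit = 100;
    }

    // Wrap the APM BeginUploadFromStream/EndUploadFromStream pair in a Task
    // so thousands of uploads can be in flight without blocking threads.
    public static Task UploadAsync(CloudBlockBlob toBlob, Stream data)
    {
        return Task.Factory.FromAsync(
            toBlob.BeginUploadFromStream,
            toBlob.EndUploadFromStream,
            data,
            null);
    }
}
```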

posted @ Monday, March 11, 2013 9:18 AM | Feedback (0) |
