Shawn Weisfeld

I find when I talk to myself nobody listens. - Shawn Weisfeld


The views expressed in this blog are mine and mine alone, not those of my employer, Microsoft, or anyone else. No warranty is given for the quality of any material on this site.



Monday, September 19, 2016

AT&T U-verse Gigapower & Skype/Skype for Business

Like many, I was super excited to get AT&T U-verse Gigapower at my home. Can you say FAST? No, really, it is FAST!

However, I have had some problems with the first few seconds of my Skype/Skype for Business calls being unbearably bad. I was chatting with a colleague (thanks, Ahmed) and he had the fix: two small tweaks to the default firewall settings on the AT&T router.

  1. Connect to your local network at home.
  2. Log into your router via the browser. You can find the router's address by looking at the default gateway in the ipconfig output on your PC.
  3. Navigate to "Firewall" and then "Firewall Advanced"
  4. Switch "Flood Limit" and "SIP ALG" to OFF
  5. Save the settings
  6. Call your mom and tell her that you love her.

posted @ Monday, September 19, 2016 6:45 AM | Feedback (0) |

Tuesday, May 24, 2016

Authenticating a service principal with Azure Resource Manager – via a password file

The Azure documentation has a great article on authenticating a service principal with Azure Resource Manager. It does a good job of outlining the steps needed to automate an Azure login via PowerShell.

However, it assumes that you can store the password in Azure Key Vault. In some scenarios this is not ideal; in my case I wanted to store my password securely in a file on the file system. We will leverage the .NET SecureString to create the password file. Only the user that created the file can decrypt and use it, so when saving this value, use the same account that the script or service will use.

The secure file will look something like this:

Here is the one-time setup:
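The original sample is missing from this archive, so here is a minimal sketch of that one-time setup; the path C:\secure\password.txt is just a placeholder:

    # Run this once, logged in as the same account the script or service will use.
    $password = Read-Host "Enter password" -AsSecureString
    $password | ConvertFrom-SecureString | Out-File "C:\secure\password.txt"

ConvertFrom-SecureString encrypts with the Windows Data Protection API under the current user's key, which is what ties the file to the account that created it; the resulting "secure file" is just a single long encrypted hex string.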

Now I can use the file that I created when I need to log in, and I no longer have to type in my password!
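The login sample is also missing; a sketch using the AzureRM cmdlets of that era, with placeholder path and IDs, would look something like this:

    $securePassword = Get-Content "C:\secure\password.txt" | ConvertTo-SecureString
    $credential = New-Object System.Management.Automation.PSCredential("<application-id>", $securePassword)
    Add-AzureRmAccount -ServicePrincipal -Credential $credential -TenantId "<tenant-id>"

With no -Key parameter, ConvertTo-SecureString decrypts using the same per-user DPAPI key, so this only works for the account that wrote the file.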

posted @ Tuesday, May 24, 2016 11:31 AM | Feedback (1) |

Sunday, April 5, 2015

Speakers Notes: The tools

Outside of my day job, I am a frequent speaker at community events. I always get asked about the process I use to prepare. Today I would like to talk a bit about the tools I use.

Source Control

As a software developer I use source control for all my code. Since many of my presentations are code-heavy, it wasn't a huge leap to use source control for my presentations as well; I even put my PowerPoint deck in source control. This has a few advantages. First, if I make a mistake and want to "go back in time" I can do that. Second, it gives me a bit of a backup: while it hasn't happened in years, if I have technical difficulties on my device and need to present from someone else's computer, I have a full backup of everything online. Third, it makes it easier to share my content with attendees, since I can just point them at the repository, and they also have the opportunity to see how the talk has evolved over time. I have even seen some presenters replay commits or jump between branches to "Julia Child" their presentations.

There are many source control tools available, and while I am a huge fan of one for most of my development projects, I have selected another for this task. I do this mainly because I keep all my private work on VS, and an unlimited free public repo pricing model makes the alternative quite attractive for this purpose.

Talk Feedback

While IMHO it is important to give attendees the ability to download the code, it is just as important for me to get feedback on each of my presentations; I use it to better myself over time. I have been logging my presentations since the end of 2010. This free service has allowed me to collect valuable feedback every time I present. It has also provided me a log of every talk I have done, in a single list, which has a huge amount of value as an online resume of speaking engagements.

Use whatever tools work best for you; the biggest advice I have is to pick a process and stick to it. This will let you focus on building great content for your presentations and create an online history of the great talks you have done.

posted @ Sunday, April 5, 2015 1:19 AM | Feedback (0) |

Wednesday, March 11, 2015

Moving blob storage between Azure Commercial and Azure Government

I got a question from a customer on how to move their blob storage assets from Azure Commercial to Azure Government. The easiest way is via AzCopy, and you have two options.

Technique #1: Server-side copy

Even though the public cloud and the government cloud are on separate networks, the server-side copy routine works like a champ. Server-side copy schedules a job on the Azure backbone that moves the files using "spare bandwidth capacity".

AzCopy.exe /Source: /Dest: /SourceKey:MySourceKey /DestKey:MyDestKey /Pattern:MyBlob.txt

More technical detail on how that works under the covers can be found here:

Technique #2: Jump VM

If you need more control, consider setting up a "jump" VM. Using this technique you can "reserve" capacity by purchasing VMs to do the copy. While this requires a bit more work, it lets you scale your copy server(s) to whatever level you need: an IaaS VM running AzCopy works great for this, or you can write something more complex using PaaS worker roles. The basic idea is to copy the files down to the local disk on the VM, then back up to blob storage on the other end.
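Sketched with the same AzCopy syntax as above (the endpoints, container names, and keys are placeholders; Azure Government blob storage uses the usgovcloudapi.net domain):

    AzCopy.exe /Source:https://mysource.blob.core.windows.net/data /Dest:D:\staging /SourceKey:MySourceKey /S
    AzCopy.exe /Source:D:\staging /Dest:https://mydest.blob.core.usgovcloudapi.net/data /DestKey:MyDestKey /S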
BTW: If you need extra local disk to do the transfer try setting up storage spaces, more info here:

Regardless of which technique you select, you will be responsible for any egress charges incurred moving data between the two datacenters.

posted @ Wednesday, March 11, 2015 1:44 AM | Feedback (0) |

Thursday, February 26, 2015

Redis Timeout errors with the StackExchange Redis Client

While I am not a big fan of writing large chunks of data to Redis, and not a big fan of writing to Redis synchronously, I offer this information in case it helps you.

When writing large values to Redis, a customer of mine observed the following timeout error using the StackExchange client:

{"Timeout performing SET MyName, inst: 0, mgr: Inactive, queue: 2, qu=1, qs=1, qc=0, wr=1/1, in=0/0"}

I was writing the value to Redis synchronously, something like this:

db.StringSet("MyName", foo);

I found that I could resolve the issue by writing the string asynchronously. Of course, by tacking .Wait() onto the end of the line I get none of the async benefits, so if you are going to do this in your app you probably want to make the entire method async.

db.StringSetAsync("MyName", foo).Wait();

Another option is to bump up the SyncTimeout; here is an example of how that is done. Note that I am setting the timeout to the max value, and this is not a best practice: you will want to pick a value that makes sense for your application.
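The original sample is missing from this archive, so this is a reconstruction from the StackExchange.Redis ConfigurationOptions API; the connection string is a placeholder:

    var options = ConfigurationOptions.Parse("mycache.redis.cache.windows.net:6380,ssl=true,password=...");
    options.SyncTimeout = int.MaxValue; // max value only to illustrate; pick a sensible number
    var muxer = ConnectionMultiplexer.Connect(options);
    var db = muxer.GetDatabase();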

Here is a good article that explains timeout errors in more detail:

posted @ Thursday, February 26, 2015 3:15 AM | Feedback (0) |

Wednesday, February 11, 2015

Azure WebRole OnStart Trace.Write Azure SDK 2.5

With earlier versions of the Azure SDK, when you wanted to "hook up" the DiagnosticMonitorTraceListener to log information using Trace.Write in the OnStart method of an Azure WebRole, you could simply make the following call and the Azure trace listener would collect all the Trace statements from your OnStart code.

System.Diagnostics.Trace.Listeners.Add(new Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener());

However, in Azure SDK 2.5 the DiagnosticMonitorTraceListener no longer inherits from System.Diagnostics.TraceListener, so you must wrap it in order to inject it. Wrapping it is done simply with the following code.
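The original wrapper sample is missing from this archive; a minimal reconstruction, assuming the 2.5 listener still exposes Write and WriteLine, looks like this:

    public class MyTraceListener : System.Diagnostics.TraceListener
    {
        private readonly Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener _inner =
            new Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener();

        // Forward the abstract TraceListener members to the SDK listener.
        public override void Write(string message) { _inner.Write(message); }
        public override void WriteLine(string message) { _inner.WriteLine(message); }
    }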

Now you can inject your TraceListener into the pipeline before you need to log anything, and you should start seeing your logs.

Trace.Listeners.Add(new MyTraceListener());

NOTE: this has been tested with the 2.5 version of the Azure SDK.

NOTE: you will get the following error after you add the above code: "The type 'Microsoft.Cis.Eventing.Listeners.RDEventMonitoringAgentListener' is defined in an assembly that is not referenced. You must add a reference to assembly 'MonAgentListener, Version=, Culture=neutral, PublicKeyToken=31bf3856ad364e35'." To resolve this, simply add a reference to the MonAgentListener DLL; you will find it under the "Extensions" tab in the "Add References" dialog.

posted @ Wednesday, February 11, 2015 4:46 AM | Feedback (0) |

Monday, March 11, 2013

Upload files to Azure Blobs FAST

CAUTION: This worked for me...Use it at your own risk.

I was given a task to upload billions of small (10k) files to Azure, yes, billions with a B. Like everyone else I started by reading the MSDN docs, which showed me toBlob.UploadFile and toBlob.UploadFromStream. Both proved to be a bit slow in my case; I can only assume the issue is the overhead of opening and closing the connections.

My first iteration was to just wrap everything with a Parallel.ForEach, and while this drastically improved things, it still was not good enough.

My second iteration was to bump up the "Default Connection Limit" (see line one in the example below); again things got better, but not good enough.

My third iteration was to use Task.Factory.FromAsync to wrap the async calls that the API exposes in a Task; finally, things are much better. (see the complete code sample below)
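The complete sample is missing from this archive, so this is a sketch of the second and third iterations combined; the container variable, the file list, and the use of the old StorageClient Begin/End methods are assumptions:

    // Iteration 2: raise the connection cap ("line one" mentioned above).
    ServicePointManager.DefaultConnectionLimit = 64;

    // Iteration 3: wrap each Begin/End pair in a Task instead of blocking a thread per upload.
    var uploads = files.Select(path =>
    {
        var blob = container.GetBlobReference(Path.GetFileName(path));
        var stream = File.OpenRead(path);
        return Task.Factory.FromAsync(blob.BeginUploadFromStream, blob.EndUploadFromStream, stream, null)
                   .ContinueWith(t => stream.Dispose());
    });
    Task.WaitAll(uploads.ToArray());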

posted @ Monday, March 11, 2013 9:18 AM | Feedback (0) |

Wednesday, January 23, 2013

Zebra Strip a jqGrid


Zebra striping, greenbar, odd/even row highlighting: whatever you call it, your boss asked you for it, and now you are here trying to figure out how to implement it. When I was attempting to do this I saw many examples using JavaScript, but why do in JavaScript what you can do in CSS?

If you look for a solution for a standard html table (of which there are millions on the web) they will tell you to do something like this:
    #grdResults tr:nth-child(even) {
            background-color:rgb(226, 228, 240);
    }

While this works great for normal tables, for a jqGrid it will not work. Why? Because the cells themselves are overriding your rule. Easy enough to solve: just color the background of each cell. (note the td at the end of the first line)
        #grdResults tr:nth-child(even) td {
            background-color:rgb(226, 228, 240);
        }

Wammo! A zebra-striped jqGrid with no additional JavaScript overhead.

posted @ Wednesday, January 23, 2013 9:05 PM | Feedback (0) |

Tuesday, January 22, 2013

Use ASP.NET MVC as a template engine

Get the code here:

Got a question today from a colleague. He had a database full of records, let's say customers, and needed to generate a static HTML document for each customer. I did not ask him why he needed the static HTML, but let's assume he was going to use it as the body of an email message. After a bunch of thinking I was disappointed that I could not use the power of the MVC templating engine, Razor. Or could I? What if I wrote a view that pulled a customer from the database and rendered it in the normal manner (see the Home\CustomerInfo action and view)? That was easy; boy do I love Razor. Now to the hard part: rendering the HTML for all 100 of our customers. If we can use WebClient to download any page off the internet, why can't we use it to download a page from our own website (see the Home\ProcessCustomers action)? So with a few lines of code we can now render any HTML we want. YEA!
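The rendering loop can be sketched like this (the route, host, and output folder are placeholders, not the post's actual code):

    using (var client = new WebClient())
    {
        foreach (var id in customerIds)
        {
            // Let MVC and Razor render the view exactly as a browser would see it...
            var html = client.DownloadString("http://localhost/Home/CustomerInfo/" + id);
            // ...then persist the result as a static file, e.g. for use as an email body.
            File.WriteAllText(Path.Combine(outputFolder, id + ".html"), html);
        }
    }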

While I have not played with it, you could avoid the network traffic with something like the RazorEngine (available from @

posted @ Tuesday, January 22, 2013 8:34 PM | Feedback (0) |

Thursday, January 17, 2013

Entity Framework insert performance

While I am a big believer in Entity Framework (EF), there are some cases where dropping down to raw ADO.NET makes sense. In an application I am working on I need to insert thousands of records into the database, and while the change-tracking mechanism and other EF features provide a great service, sometimes their overhead outweighs the benefits.

I threw together a quick little sample app (you can download it from GitHub here) that shows the problem. The app simply creates a bunch of random customer objects and saves them to SQL Server. Here are the run times for sets of 1,000 and 10,000 customers. As you can see, raw ADO.NET is about 35% faster with record sets of 1,000, and with the 10,000-record set I get a 70% performance increase. Additionally, performance with raw ADO.NET appears to scale linearly, while EF appears to perform worse as the record set grows.


(times in milliseconds)

Run        1,000 Records          10,000 Records
           EF       DbHelper      EF       DbHelper
1          4,359    2,281         74,010   22,088
2          3,389    2,233         70,250   20,621
3          3,370    2,266         71,181   22,470
4          3,383    2,302         74,096   22,495
5          3,392    2,288         76,465   20,759
6          3,367    2,245         70,030   19,778
7          3,450    2,249         69,176   20,485
8          3,369    2,252         69,833   19,680
9          3,368    2,268         71,728   20,751
10         3,390    2,298         74,084   22,821
Average    3,484    2,268         72,085   21,195
DbHelper advantage: 34.9% (1,000 records), 70.6% (10,000 records)


BTW: SqlBulkCopy is about 100x faster than either of the above if raw speed is important to you. (I have included an example of that in the code on GitHub)
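For reference, a SqlBulkCopy sketch (the table name, connection string, and DataTable are assumptions; the real example is in the GitHub repo):

    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        using (var bulk = new SqlBulkCopy(connection))
        {
            bulk.DestinationTableName = "dbo.Customers";
            bulk.WriteToServer(customerTable); // one streamed operation instead of row-by-row inserts
        }
    }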

Please note this is all anecdotal information from my observations of the system, and I could be woefully wrong; please let me know if I am.

posted @ Thursday, January 17, 2013 12:28 PM | Feedback (0) | Filed Under [ SQL ]
