Shawn Weisfeld

I find when I talk to myself nobody listens. - Shawn Weisfeld
The views expressed in this blog are mine and mine alone, not those of my employer, Microsoft, or anyone else. No warranty is given for the quality of any material on this site.


Sunday, April 5, 2015

Speaker's Notes: The Tools

Outside of my day job, I am a frequent speaker at community events. I always get asked about the process I use to prepare. Today I would like to talk a bit about the tools I use.


Source Control

As a software developer I use source control for all my code, and since many of my presentations are code heavy it wasn't a huge leap to put my presentations, including the PowerPoint deck, in source control too. This has a few advantages. First, if I make a mistake and want to "go back in time," I can. Second, it gives me a backup: while it hasn't happened in years, if I have technical difficulties on my device and need to present from someone else's computer, I have a full copy of everything online. Third, it makes it easier to share my content with attendees, since I can just point them at the repository, and they also get to see how the talk has evolved over time. I have even seen some presenters replay commits or jump between branches to "Julia Child" their presentations.

There are many source control tools available, and while I am a huge fan of http://www.VisualStudio.com for most of my development projects, I have selected http://www.GitHub.com for this task. I do this mainly because I keep all my private work on Visual Studio Online, and GitHub's unlimited free public repositories make it quite attractive for this purpose.


Talk Feedback

While it is important (IMHO) to give attendees the ability to download the code, it is just as important for me to get feedback about each of my presentations, which I use to better myself over time. I have been logging my presentations at http://www.speakerrate.com since the end of 2010. This free service has let me collect valuable feedback every time I present, and it also gives me a log of every talk I have done in a single list, which is hugely valuable as an online resume of speaking engagements.


Use whatever tools work best for you; the biggest piece of advice I have is to pick a process and stick to it. This will let you focus on building great content for your presentations and create an online history of the great talks you have given.

posted @ Sunday, April 5, 2015 1:19 AM | Feedback (0) |

Wednesday, March 11, 2015

Moving blob storage between Azure Commercial and Azure Government

Got a question from a customer on how to move their blob storage assets from Azure Commercial to Azure Government. The easiest way is via AzCopy (http://aka.ms/AzCopy), and you have two options.

Technique #1: Server-side copy

Even though the public cloud and the government cloud are on separate networks, the server-side copy routine works like a champ. Server-side copy schedules a job on the Azure backbone that moves the files using "spare bandwidth capacity".

AzCopy.exe /Source:https://source.blob.core.windows.net/container /Dest:https://dest.blob.core.usgovcloudapi.net/container /SourceKey:MySourceKey /DestKey:MyDestKey /Pattern:MyBlob.txt

More technical detail on how that works under the covers can be found here: http://blogs.msdn.com/b/windowsazurestorage/archive/2012/06/12/introducing-asynchronous-cross-account-copy-blob.aspx

Technique #2: Jump VM

If you need more control, consider setting up a "jump" VM. With this technique you can "reserve" capacity by purchasing VMs to do the copy. While this requires a bit more work, it lets you scale your copy server(s) to whatever level you need: an IaaS VM with AzCopy works great for this, or you can write something more complex using PaaS worker roles. The basic idea is to copy the files down to the local disk on the VM and then back up to blob storage on the other end, as in the sketch below.
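A minimal sketch of the two hops, reusing the AzCopy syntax from Technique #1 (the account names, container names, local folder, and keys are placeholders; /S copies recursively):

AzCopy.exe /Source:https://source.blob.core.windows.net/container /Dest:D:\CopyTemp /SourceKey:MySourceKey /S
AzCopy.exe /Source:D:\CopyTemp /Dest:https://dest.blob.core.usgovcloudapi.net/container /DestKey:MyDestKey /S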
BTW: if you need extra local disk for the transfer, try setting up Storage Spaces; more info here: http://blogs.technet.com/b/keithmayer/archive/2013/01/24/step-by-step-building-a-windows-server-2012-storage-server-in-the-cloud-with-windows-azure.aspx

Regardless of which technique you select, you will be responsible for any egress charges for moving data between the two datacenters.

posted @ Wednesday, March 11, 2015 1:44 AM | Feedback (0) |

Thursday, February 26, 2015

Redis Timeout errors with the StackExchange Redis Client

While I am not a big fan of writing large chunks of data to Redis, nor of writing to Redis synchronously, I offer this information in case it helps you.

Using the StackExchange client, I had a customer observe the following timeout error when writing large values to Redis:

{"Timeout performing SET MyName, inst: 0, mgr: Inactive, queue: 2, qu=1, qs=1, qc=0, wr=1/1, in=0/0"}

I was writing the value to Redis synchronously, something like this:

db.StringSet("MyName", foo);

I found that I could resolve the issue by writing the string asynchronously. Of course, by tacking Wait onto the end of the line I get none of the async benefits, so if you are going to do this in your app you probably want to make the entire method async.

db.StringSetAsync("MyName", foo).Wait();
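For completeness, a minimal sketch of what a fully async version might look like (the method and parameter names here are mine):

// using StackExchange.Redis;
// using System.Threading.Tasks;
public async Task SaveNameAsync(IDatabase db, string foo)
{
    // Await the write instead of blocking on .Wait().
    await db.StringSetAsync("MyName", foo);
}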

Another option is to bump up the SyncTimeout. Here is an example of how that is done; note I am setting the timeout to the max value, which is not a best practice, so you will want to pick a value that makes sense for your application.
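A sketch of bumping the timeout when you create the connection (the cache address and password are placeholders; SyncTimeout is in milliseconds):

// using StackExchange.Redis;
var options = ConfigurationOptions.Parse("mycache.redis.cache.windows.net:6380,password=...,ssl=true");
options.SyncTimeout = int.MaxValue; // max value for illustration only; pick a sensible number
var connection = ConnectionMultiplexer.Connect(options);
var db = connection.GetDatabase();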

Here is a good article that explains in more detail about timeout errors: http://azure.microsoft.com/blog/2015/02/10/investigating-timeout-exceptions-in-stackexchange-redis-for-azure-redis-cache/

posted @ Thursday, February 26, 2015 3:15 AM | Feedback (0) |

Wednesday, February 11, 2015

Azure WebRole OnStart Trace.Write Azure SDK 2.5

With earlier versions of the Azure SDK, when you wanted to hook up the DiagnosticMonitorTraceListener to log information using Trace.Write in the OnStart method of an Azure WebRole, you could simply make the following call and the Azure trace listener would collect all the Trace statements from your OnStart code.

System.Diagnostics.Trace.Listeners.Add(new Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener());


However, in Azure SDK 2.5 DiagnosticMonitorTraceListener no longer inherits from System.Diagnostics.TraceListener, so you must wrap it in order to inject it.
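A minimal sketch of such a wrapper, assuming the 2.5 listener still exposes Write and WriteLine (adjust the overrides if its surface differs):

public class MyTraceListener : System.Diagnostics.TraceListener
{
    // Forward everything to the SDK 2.5 listener.
    private readonly Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener inner =
        new Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener();

    public override void Write(string message)
    {
        inner.Write(message);
    }

    public override void WriteLine(string message)
    {
        inner.WriteLine(message);
    }
}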

Now you can inject your TraceListener into the pipeline before you need to log anything, and you should start seeing your logs.

Trace.Listeners.Add(new MyTraceListener());

NOTE: this has been tested with the 2.5 version of the Azure SDK.

NOTE: you will get the following error after you add the above code. "The type 'Microsoft.Cis.Eventing.Listeners.RDEventMonitoringAgentListener' is defined in an assembly that is not referenced. You must add a reference to assembly 'MonAgentListener, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35'." To resolve this error, simply add a reference to the MonAgentListener DLL via the "Add References" dialog; you will find it under the "Extensions" tab.

posted @ Wednesday, February 11, 2015 4:46 AM | Feedback (0) |

Monday, March 11, 2013

Upload files to Azure Blobs FAST

CAUTION: This worked for me...Use it at your own risk.

I was given a task to upload billions of small (10k) files to Azure. Yes, that is billions with a B. Like everyone else I started by reading the MSDN docs, which pointed me at toBlob.UploadFile and toBlob.UploadFromStream. Both proved to be a bit slow in my case; I can only assume the issue is the overhead of opening and closing the connections.

My first iteration was to just wrap everything in a Parallel.ForEach, and while this drastically improved things it still was not good enough.

My second iteration was to bump up the "Default Connection Limit" (ServicePointManager.DefaultConnectionLimit; see the sketch below), and again things got better but not good enough.

My third iteration was to use Task.Factory.FromAsync to wrap the async calls that the API exposes in a Task; finally things are much better now. A sketch of the whole approach follows.
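A minimal sketch, assuming the 2.x storage client library (Microsoft.WindowsAzure.Storage); the connection string, container name, and file list are placeholders:

using System;
using System.IO;
using System.Linq;
using System.Net;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static class FastUploader
{
    public static void UploadAll(string connectionString, string[] files)
    {
        // Iteration #2: raise the default connection limit so many HTTP
        // connections to blob storage can be open at once.
        ServicePointManager.DefaultConnectionLimit = 100;

        var container = CloudStorageAccount.Parse(connectionString)
            .CreateCloudBlobClient()
            .GetContainerReference("files");

        // Iteration #3: wrap the Begin/End pair in Tasks so uploads overlap
        // instead of tying up a thread per file.
        var uploads = files.Select(path =>
        {
            var toBlob = container.GetBlockBlobReference(Path.GetFileName(path));
            var stream = File.OpenRead(path);
            return Task.Factory
                .FromAsync(toBlob.BeginUploadFromStream, toBlob.EndUploadFromStream, stream, null)
                .ContinueWith(_ => stream.Dispose());
        }).ToArray();

        Task.WaitAll(uploads);
    }
}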

posted @ Monday, March 11, 2013 9:18 AM | Feedback (0) |

Wednesday, January 23, 2013

Zebra Stripe a jqGrid

 

Zebra striping, greenbar, odd/even row highlighting: whatever you call it, your boss asked you for it, and now you are here trying to figure out how to implement it. When I was attempting to do this I saw many examples using JavaScript, but why do in JavaScript what you can do in CSS?

If you look for a solution for a standard HTML table (of which there are millions on the web), it will tell you to do something like this:
    #grdResults tr:nth-child(even) {
        background-color: rgb(226, 228, 240);
    }

While this works great for normal tables, it will not work for a jqGrid. Why? Because the cells themselves are overriding your rule. Easy enough to solve: just color the background of each cell (note the td at the end of the first line).
    #grdResults tr:nth-child(even) td {
        background-color: rgb(226, 228, 240);
    }

Wammo! A zebra-striped jqGrid with no additional JavaScript overhead.

posted @ Wednesday, January 23, 2013 9:05 PM | Feedback (0) |

Tuesday, January 22, 2013

Use ASP.NET MVC as a template engine

Get the code here: https://github.com/shawnweisfeld/MvcRenderAndSaveFile

Got a question today from a colleague. He had a database full of records, let's say customers, and needed to generate a static HTML document for each customer. I did not ask him why he needed the static HTML, but let's assume he was going to use it as the body of an email message. After doing a bunch of thinking I was disappointed that I could not use the power of the MVC templating engine, Razor. Or could I? What if I wrote a view that pulled a customer from the database and created a view in the normal manner (see the Home\CustomerInfo action and view)? That was easy; boy do I love Razor. Now to the hard part: rendering the HTML for all 100 of our customers. If we can use WebClient to download any page off the internet, why can't we use it to download a page from our own website (see the Home\ProcessCustomers action)? So with a few lines of code we can now render any HTML we want. YEA!
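The gist of the ProcessCustomers action is something like this sketch (the route, port, and output path are placeholders; the full version is in the GitHub repo linked above):

// using System.Net;
// using System.Web.Mvc;
public ActionResult ProcessCustomers()
{
    using (var client = new WebClient())
    {
        for (int id = 1; id <= 100; id++)
        {
            // Render the CustomerInfo view by requesting it from our own site...
            string html = client.DownloadString(
                string.Format("http://localhost:1234/Home/CustomerInfo/{0}", id));

            // ...and save the result as a static file.
            System.IO.File.WriteAllText(
                Server.MapPath(string.Format("~/App_Data/customer-{0}.html", id)), html);
        }
    }

    return Content("done");
}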

While I have not played with it, you could avoid the network traffic with something like RazorEngine (available from NuGet @ http://nuget.org/packages/RazorEngine/).

posted @ Tuesday, January 22, 2013 8:34 PM | Feedback (0) |

Thursday, January 17, 2013

Entity Framework insert performance

While I am a big believer in Entity Framework (EF), there are some cases where dropping down to raw ADO.NET makes sense. In an application I am working on I need to insert thousands of records into the database, and while the change tracking mechanism and other features in EF provide a great service, sometimes the benefits don't outweigh the overhead.

I threw together a quick little sample app (you can download it from GitHub here) that shows the problem. The app simply creates a bunch of random customer objects and saves them to SQL Server. Here are the run times for sets of 1,000 and 10,000 customers. As you can see, raw ADO.NET is about 35% faster with record sets of 1,000, and with the 10,000-record set I get a 70% performance increase. Additionally, raw ADO.NET appears to scale linearly, while EF appears to perform worse as the record set grows.

 

(times in milliseconds)

Run       EF (1,000)   DbHelper (1,000)   EF (10,000)   DbHelper (10,000)
1         4,359        2,281              74,010        22,088
2         3,389        2,233              70,250        20,621
3         3,370        2,266              71,181        22,470
4         3,383        2,302              74,096        22,495
5         3,392        2,288              76,465        20,759
6         3,367        2,245              70,030        19,778
7         3,450        2,249              69,176        20,485
8         3,369        2,252              69,833        19,680
9         3,368        2,268              71,728        20,751
10        3,390        2,298              74,084        22,821
Average   3,484        2,268              72,085        21,195

DbHelper saved 34.9% versus EF on the 1,000-record runs and 70.6% on the 10,000-record runs.
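For reference, a rough sketch of the two approaches being timed (the context, entity, and table names are assumed; the real benchmark app is in the GitHub repo):

// using System.Data;
// using System.Data.SqlClient;

// EF: add the entities to the context and call SaveChanges once.
using (var ctx = new CustomerContext())
{
    foreach (var c in customers)
        ctx.Customers.Add(c);
    ctx.SaveChanges();
}

// Raw ADO.NET ("DbHelper"): one parameterized INSERT reused for every row.
using (var con = new SqlConnection(connectionString))
using (var cmd = con.CreateCommand())
{
    con.Open();
    cmd.CommandText = "INSERT INTO Customers (FirstName, LastName, Age) VALUES (@fn, @ln, @age)";
    cmd.Parameters.Add("@fn", SqlDbType.NVarChar, 100);
    cmd.Parameters.Add("@ln", SqlDbType.NVarChar, 100);
    cmd.Parameters.Add("@age", SqlDbType.Int);

    foreach (var c in customers)
    {
        cmd.Parameters["@fn"].Value = c.FirstName;
        cmd.Parameters["@ln"].Value = c.LastName;
        cmd.Parameters["@age"].Value = c.Age;
        cmd.ExecuteNonQuery();
    }
}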

 

BTW: SqlBulkCopy is about 100x faster than either of the above if raw speed is important to you (I have included an example of that in the code on GitHub).
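The SqlBulkCopy version looks roughly like this (again, the table and column names are assumed; the repo has the real example):

// using System.Data;
// using System.Data.SqlClient;
var table = new DataTable();
table.Columns.Add("FirstName", typeof(string));
table.Columns.Add("LastName", typeof(string));
table.Columns.Add("Age", typeof(int));

// Stage the rows in memory, then push them to the server in one bulk operation.
foreach (var c in customers)
    table.Rows.Add(c.FirstName, c.LastName, c.Age);

using (var bulk = new SqlBulkCopy(connectionString))
{
    bulk.DestinationTableName = "Customers";
    bulk.WriteToServer(table);
}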

Please note this is all anecdotal information gathered from my observations of the system, and I could be woefully wrong; please let me know if I am. . .

posted @ Thursday, January 17, 2013 12:28 PM | Feedback (0) | Filed Under [ SQL ]

Thursday, March 22, 2012

Display a list of data as a set of columns not rows

Typically when asked to display a list of data, let's say customers, we create a first-name column and a last-name column and then iterate over the rows to produce a table. Something like this:

[Screenshot: the customer list rendered the usual way, one row per customer]

But recently I was asked to produce a pivot of this data, putting each customer in their own column instead of their own row. After doing the obligatory internet searching I found no solution that I really liked, so I wrote my own…

Let's start by creating our customer object:

public class Customer
{
    public int Id { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public int Age { get; set; }
}
 

Now that we have him, let's mock up some data in our controller:

public ActionResult Index()
{
    var customers = new List<Customer>();

    for (int i = 0; i < 100; i++)
    {
        customers.Add(new Customer()
                            {
                                Id = i,
                                FirstName = string.Format("First {0}", i),
                                LastName = string.Format("Last {0}", i),
                                Age = i
                            });
    }

    ViewBag.Customers = customers;

    return View();
}
 

Finally, the meat: using a bit of reflection magic, let's get a "list" of all the properties on our customer, iterate over each of them to generate our rows, then iterate over each customer and get the "value" for that row.

<div style="width: 600px; overflow: auto">
    <table style="white-space: nowrap;">
        <thead>
            <tr>
                <th></th>
                @foreach (var customer in ViewBag.Customers)
                {
                    <th>
                        Customer @customer.Id
                    </th>
                }
            </tr>
        </thead>
        <tbody>
            @foreach (var prop in typeof(Customer).GetProperties())
            {
                <tr>
                    <td>
                        @prop.Name
                    </td>
                    @foreach (var customer in ViewBag.Customers)
                    {
                        <td>
                            @prop.GetValue(customer, null)
                        </td>   
                    }
                </tr>
            }
        </tbody>
    </table>
</div>

 

And ta-da, we have a table with dynamically generated columns based on the number of "customers" in our collection.

[Screenshot: the pivoted table, one column per customer]

posted @ Thursday, March 22, 2012 9:14 AM | Feedback (0) |

Saturday, March 17, 2012

Convert flattened data in Excel to a hierarchical XML file

Ever had an Excel file that looked like this?

[Screenshot: flattened product data in Excel, one row per ProductGroup/ProductSubGroup/ProductNumber combination]

and need an xml file that looks like this?

[Screenshot: the same data as nested ProductGroup/ProductSubGroup XML]

 

Converting takes two steps:
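Both steps use a trivial Product class; a minimal sketch:

public class Product
{
    public string ProductGroup { get; set; }
    public string ProductSubGroup { get; set; }
    public string ProductNumber { get; set; }
}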

Step 1: Read the Excel file into objects in memory

    var products = new List<Product>();

    // NOTE: .xlsx files need the "Excel 12.0 Xml" extended property with the ACE provider.
    using (var con = new OleDbConnection("Provider=Microsoft.ACE.OLEDB.12.0;Data Source=data.xlsx;Extended Properties=\"Excel 12.0 Xml;HDR=YES;\""))
    using (var cmd = con.CreateCommand())
    {
        con.Open();
        cmd.CommandText = "SELECT * FROM [Sheet1$]";
        var dr = cmd.ExecuteReader();

        while (dr.Read())
        {
            products.Add(new Product()
            {
                ProductGroup = dr[0].ToString(),
                ProductSubGroup = dr[1].ToString(),
                ProductNumber = dr[2].ToString()
            });
        }
    }

 

Step 2: Use LINQ to group the objects into a hierarchy and convert them into XML

    var xml = new XElement("ProductGroups",
        products.GroupBy(x => x.ProductGroup).Select(x => new XElement("ProductGroup",
            new XElement("name", x.Key),
            new XElement("ProductSubGroups",
                x.GroupBy(y => y.ProductSubGroup).Select(y => new XElement("ProductSubGroup",
                    new XElement("name", y.Key),
                    new XElement("Product", y.Select(z => new XElement("number", z.ProductNumber)))))))));

    xml.Save("test.xml");

posted @ Saturday, March 17, 2012 3:01 PM | Feedback (0) |
