All posts by Juan Pablo Garcia Dalolla


Windows Azure Storage for Node.js

I got really excited after watching the Introduction to Node.js video by Ryan Dahl, the creator of Node.js. Right after that I installed Node on my computer and started playing a little bit with JavaScript, which is one of the languages I love.

A few days ago I announced the release of the first version (0.0.1) of the waz-storage-js library I wrote in Node.js. The code is based on the waz-storage Ruby library written by my friend and colleague Johnny Halife, who let me contribute the Tables service support and the dm-waztables-adapter for DataMapper.

This version only lets you deal with Blobs, but my plan is to also include Tables and Queues in the near future.

Remember that you can always fork the code hosted on GitHub and contribute to speed things up!

Let’s play with it and install the package

The library is available as an npm package. To install, run the following command:

npm install waz-storage-js

Configure your account information

var waz = require('waz-storage-js');

waz.establishConnection({ accountName: 'your_account_name'
                        , accountKey: 'your_key'
                        , useSsl: false });

And finally, read and write data from your blobs

// Creating a new container
waz.blobs.container.create('myContainer', function(err, result){
});

// Listing existing containers
waz.blobs.container.list(function(err, result){
});

// Finding a container
waz.blobs.container.find('myContainer', function(err, container){

        // Getting the container's metadata
        container.metadata(function(err, metadata){
        });

        // Adding properties to a container
        container.putProperties({'x-ms-custom' : 'MyValue'}, function(err, result){
        });

        // Getting the container's ACL
        container.getAcl(function(err, result){
        });

        // Setting the container's ACL (null, 'blob', 'container')
        container.setAcl('container', function(err, result){
        });

        // Listing blobs in a container
        container.blobs(function(err, result){

            // Getting a blob's information
            result.getBlob('myfolder/my file.txt', function(err, blob){

                // Getting the blob's contents
            });
        });

        // Uploading a new blob (the method name was lost in this post's
        // formatting; store mirrors the waz-storage Ruby gem's API)
        container.store('folder/my file.xml', 'content', 'text/xml', {'x-ms-MyProperty': 'value'}, function(err, result){
        });
});

// Deleting containers
waz.blobs.container.delete('myContainer', function(err){
});

Microsoft is working to bring Node.js to both Windows and Azure, and I hope my contribution (and why not yours?) helps developers in future projects.

Stay tuned! I will keep you posted.

Using TwitPic API 2.0 (OAuth Echo) from a C# client to upload pictures

Last week we enjoyed a remote pair-programming session with my friend Johnny. The challenge was to upload a picture from a console application using the new TwitPic API, which takes advantage of the OAuth Echo authentication mechanism provided by Twitter.

The scenario we wanted to cover wasn’t a web application, so we couldn’t handle the full OAuth process, which forced us to use the out-of-band (oob) authentication method described here:

"For applications that really can’t handle the full OAuth process Twitter provides the out-of-band/PIN code authentication mode, also known as oob.
This authentication flow is almost identical to full OAuth except instead of being directed back to your website the user is presented with a PIN code. The user is then asked to type this PIN code into your application which will then complete the token exchange."

Below you will find every step we followed to get this running.

Getting a token and a token secret from Twitter

  1. Register your application on Twitter to get your Consumer Key and Consumer Secret
  2. Create a new console application and reference the DotNetOpenAuth (download it from here), System.Web and System.Windows.Forms assemblies
  3. Create a service descriptor providing Twitter endpoints:
    var descriptor = new ServiceProviderDescription
    {
        RequestTokenEndpoint = new MessageReceivingEndpoint("https://twitter.com/oauth/request_token", HttpDeliveryMethods.GetRequest | HttpDeliveryMethods.AuthorizationHeaderRequest),
        UserAuthorizationEndpoint = new MessageReceivingEndpoint("https://twitter.com/oauth/authorize", HttpDeliveryMethods.GetRequest | HttpDeliveryMethods.AuthorizationHeaderRequest),
        AccessTokenEndpoint = new MessageReceivingEndpoint("https://twitter.com/oauth/access_token", HttpDeliveryMethods.GetRequest | HttpDeliveryMethods.AuthorizationHeaderRequest),
        TamperProtectionElements = new ITamperProtectionChannelBindingElement[] { new HmacSha1SigningBindingElement() }
    };

  4. Now we must create a token manager that implements the IConsumerTokenManager and IOpenIdOAuthTokenManager interfaces. In this case we used an in-memory implementation provided as part of the DotNetOpenAuth samples. You can find the code in the DotNetOpenAuth.ApplicationBlock sample, or you can copy and paste it from here:
    namespace TwitpicOAuthClient
    {
        using System;
        using System.Collections.Generic;
        using DotNetOpenAuth.OAuth.ChannelElements;
        using DotNetOpenAuth.OAuth.Messages;
        using DotNetOpenAuth.OpenId.Extensions.OAuth;

        internal class InMemoryTokenManager : IConsumerTokenManager, IOpenIdOAuthTokenManager
        {
            private readonly Dictionary<string, string> tokensAndSecrets = new Dictionary<string, string>();

            public InMemoryTokenManager(string consumerKey, string consumerSecret)
            {
                if (String.IsNullOrEmpty(consumerKey))
                {
                    throw new ArgumentNullException("consumerKey");
                }

                this.ConsumerKey = consumerKey;
                this.ConsumerSecret = consumerSecret;
            }

            public string ConsumerKey { get; private set; }

            public string ConsumerSecret { get; private set; }

            public string GetTokenSecret(string token)
            {
                return this.tokensAndSecrets[token];
            }

            public void StoreNewRequestToken(UnauthorizedTokenRequest request, ITokenSecretContainingMessage response)
            {
                this.tokensAndSecrets[response.Token] = response.TokenSecret;
            }

            public void ExpireRequestTokenAndStoreNewAccessToken(string consumerKey, string requestToken, string accessToken, string accessTokenSecret)
            {
                this.tokensAndSecrets[accessToken] = accessTokenSecret;
            }

            public TokenType GetTokenType(string token)
            {
                throw new NotImplementedException();
            }

            public void StoreOpenIdAuthorizedRequestToken(string consumerKey, AuthorizationApprovedResponse authorization)
            {
                this.tokensAndSecrets[authorization.RequestToken] = String.Empty;
            }
        }
    }
  5. Provide the consumer information you got when you registered your application on Twitter. At this point you should be able to get a token URL from Twitter and let the user know where they can authenticate to get the PIN code.
    string requestToken;

    var twitterConsumerKey = "twitter_consumer_key";
    var twitterConsumerSecret = "twitter_consumer_key_secret";

    var consumer = new DesktopConsumer(descriptor, new InMemoryTokenManager(twitterConsumerKey, twitterConsumerSecret));
    var requestTokenUrl = consumer.RequestUserAuthorization(null, null, out requestToken);

    Console.WriteLine("Please copy the following Url to your favorite web browser (it's already on your clipboard), and write the PIN below:");
    Console.WriteLine(requestTokenUrl.AbsoluteUri);
    Clipboard.SetText(requestTokenUrl.AbsoluteUri);

    Console.Write("PIN: ");
    var pin = Console.ReadLine();

  6. Decorate the Main method with the [STAThread] attribute to avoid a ThreadStateException while copying the URL to the user’s clipboard:

    [STAThread]
    public static void Main(string[] args)


  7. Get the access_token and access_token_secret values provided by Twitter:
    var response = consumer.ProcessUserAuthorization(requestToken, pin);

    var token = response.AccessToken;

    var tokenSecret = ((ITokenSecretContainingMessage)response).TokenSecret;

Using TwitPic API 2.0

  1. First of all, we must get an API key from TwitPic


  2. Further on in this guide we will need to sign TwitPic’s requests; to do that, you have to include the OAuthBase.cs helper class in your project
  3. Provide your Twitpic API key and define the following variables just after where you got the token and the token secret from Twitter:
    var twitpicApiKey = "twitpic_api_key";

    var oauthSignaturePattern = "OAuth realm=\"{0}\", oauth_consumer_key=\"{1}\", oauth_signature_method=\"HMAC-SHA1\", oauth_token=\"{2}\", oauth_timestamp=\"{3}\", oauth_nonce=\"{4}\", oauth_version=\"1.0\", oauth_signature=\"{5}\"";

    var authenticationRealm = "http://api.twitter.com/";

    var twitpicUploadApiUrl = "http://api.twitpic.com/2/upload.xml";

    var twitterVerifyCredentialsApiUrl = "https://api.twitter.com/1/account/verify_credentials.json";

    var contentEncoding = "iso-8859-1";

  4. Use the OAuthBase library to generate the signature:
    var oauth = new OAuthBase();

    string normalizedString, normalizedParameters;

    var timestamp = oauth.GenerateTimeStamp();

    var nounce = oauth.GenerateNonce();

    var signature = oauth.GenerateSignature(
                        new Uri(twitterVerifyCredentialsApiUrl),
                        twitterConsumerKey,
                        twitterConsumerSecret,
                        token,
                        tokenSecret,
                        "GET",
                        timestamp,
                        nounce,
                        out normalizedString,
                        out normalizedParameters);


    signature = HttpUtility.UrlEncode(signature);
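Under the hood, what OAuthBase.GenerateSignature produces for HMAC-SHA1 is just a keyed hash over the normalized base string. For the curious, here is a rough Ruby sketch of that final step (the method name is mine, and CGI.escape only approximates the RFC 3986 percent-encoding OAuth actually mandates):

```ruby
require "openssl"
require "base64"
require "cgi"

# Hedged sketch of the signing primitive behind OAuthBase.GenerateSignature:
# HMAC-SHA1 over the signature base string, keyed by
# "<escaped consumer secret>&<escaped token secret>".
def hmac_sha1_signature(base_string, consumer_secret, token_secret)
  key = "#{CGI.escape(consumer_secret)}&#{CGI.escape(token_secret)}"
  digest = OpenSSL::HMAC.digest(OpenSSL::Digest.new("sha1"), key, base_string)
  Base64.strict_encode64(digest)
end
```

The result is deterministic for a given base string and secrets, which is what lets Twitter recompute and verify it on its side.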

  5. Create a multipart/form-data HttpWebRequest and set the required X-Verify-Credentials-Authorization and X-Auth-Service-Provider headers, as described in TwitPic’s API documentation:
    var boundary = Guid.NewGuid().ToString();

    var request = (HttpWebRequest)WebRequest.Create(twitpicUploadApiUrl);


    request.PreAuthenticate = true;

    request.AllowWriteStreamBuffering = true;

    request.ContentType = string.Format("multipart/form-data; boundary={0}", boundary);


    request.Headers.Add("X-Auth-Service-Provider", twitterVerifyCredentialsApiUrl);


    var authorizationHeader = string.Format(
        oauthSignaturePattern,
        authenticationRealm,
        twitterConsumerKey,
        token,
        timestamp,
        nounce,
        signature);

    request.Headers.Add("X-Verify-Credentials-Authorization", authorizationHeader);


    request.Method = "POST";

  6. Create the payload for the request, which includes the TwitPic message and image contents:
    var header = string.Format("--{0}", boundary);
    var footer = string.Format("--{0}--", boundary);

    var contents = new StringBuilder();

    string fileContentType = "image/png";
    string fileHeader = string.Format("Content-Disposition: file; name=\"{0}\"; filename=\"{1}\"", "media", "sample_image.png");
    string fileData = Encoding.GetEncoding(contentEncoding).GetString(File.ReadAllBytes(@"c:\sample_image.png"));

    contents.AppendLine(header);
    contents.AppendLine(fileHeader);
    contents.AppendLine(string.Format("Content-Type: {0}", fileContentType));
    contents.AppendLine();
    contents.AppendLine(fileData);

    contents.AppendLine(header);
    contents.AppendLine(string.Format("Content-Disposition: form-data; name=\"{0}\"", "key"));
    contents.AppendLine();
    contents.AppendLine(twitpicApiKey);

    contents.AppendLine(header);
    contents.AppendLine(String.Format("Content-Disposition: form-data; name=\"{0}\"", "message"));
    contents.AppendLine();
    contents.AppendLine("Testing Twitpic API" + Path.GetTempFileName()); // GetTempFileName is to avoid duplicate prevention.

    contents.AppendLine(footer);

    byte[] bytes = Encoding.GetEncoding(contentEncoding).GetBytes(contents.ToString());
    request.ContentLength = bytes.Length;

  7. And finally perform the request:
    using (var requestStream = request.GetRequestStream())
    {
        requestStream.Write(bytes, 0, bytes.Length);

        using (var twitpicResponse = (HttpWebResponse)request.GetResponse())
        {
            using (var reader = new StreamReader(twitpicResponse.GetResponseStream()))
            {
                Console.WriteLine(twitpicResponse.StatusCode + ": " + reader.ReadToEnd());
            }
        }
    }
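If the StringBuilder choreography in step 6 looks opaque, here is the same multipart/form-data layout sketched in Ruby; the helper name is mine, and the part names mirror the ones used above:

```ruby
# Hedged sketch of the multipart/form-data body built in step 6: each part is
# "--boundary", its headers, a blank line, then the value; "--boundary--" ends it.
def multipart_body(boundary, fields, file_name, file_bytes, content_type)
  body = ""
  fields.each do |name, value|
    body << "--#{boundary}\r\n"
    body << "Content-Disposition: form-data; name=\"#{name}\"\r\n\r\n"
    body << "#{value}\r\n"
  end
  body << "--#{boundary}\r\n"
  body << "Content-Disposition: file; name=\"media\"; filename=\"#{file_name}\"\r\n"
  body << "Content-Type: #{content_type}\r\n\r\n"
  body << "#{file_bytes}\r\n"
  body << "--#{boundary}--\r\n"
  body
end
```

Seeing the body as plain text makes it obvious why the boundary must match the one declared in the Content-Type header of the request.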





Running the Sample

  1. Run the console application; the token URL will be copied to your clipboard
  2. Open a web browser and paste the URL to obtain the PIN code:


  3. Enter the PIN code in the console; if there are no errors, you will be able to view the new image on TwitPic:


AD FS 2.0 – No certificate with thumbprint “…….” found

During the last week I was working on an identity project related to the new U-Prove CTP version of Active Directory Federation Services 2.0.

As you know, when working with new technologies it is very common to run into blocking issues like this one, which is why I want to share my experience with you.


The AD FS certificates could not be changed from either the Management Console or the PowerShell cmdlets. You may receive an error message like No certificate with thumbprint “…….” found.


  1. Open the Microsoft Management Console and add a new Certificates snap-in for the Computer Account
    • Go to the Personal / Certificates node and open the new certificate you are going to use by double-clicking on it
    • Select the Details tab and copy the Thumbprint value
  2. Open SQL Server Management Studio
    • Select the AdFsConfiguration database. Note: if you are using the Windows Internal Database, you can connect using \\.\pipe\mssql$microsoft##ssee\sql\query
    • Open the IdentityServerPolicy.ServiceSettings table and copy the ServiceSettingsData field value (XML) to Notepad
    • Find the missing Thumbprint values you got in the AD FS error message
    • Replace the found values with the new certificate’s Thumbprint, without empty spaces
    • Update the ServiceSettingsData field with the new XML configuration
      Note: the XML contents must not be tidied
  3. Go back to the AD FS 2.0 Management Console and refresh the Certificates node
  4. At this point you should see the new certificate listed
  5. If you are changing the Service Communications certificate, open the Internet Information Services (IIS) Manager
    • Select the Default Website
    • Click on the Bindings… action, go to the https row and click on Edit…
    • Select the new certificate from the SSL certificate combo-box and click OK (Note: if you see an error message, click OK)
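The search-and-replace in step 2 boils down to a couple of lines. This is only an illustration of the manual edit (the helper is hypothetical, not an AD FS tool):

```ruby
# Hypothetical helper mirroring the manual fix: every occurrence of the missing
# thumbprint inside ServiceSettingsData is swapped for the new certificate's
# thumbprint, with the spaces shown in the certificate's Details tab stripped.
def replace_thumbprint(service_settings_xml, missing_thumbprint, new_thumbprint)
  service_settings_xml.gsub(missing_thumbprint, new_thumbprint.delete(" ").upcase)
end
```

Stripping the spaces matters: the Details tab shows the thumbprint in space-separated byte pairs, while the configuration XML stores it as one continuous string.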

Windows Azure Tables adapter for DataMapper

The Past

Last Friday we shipped the first major version (v1.0.0) of the Windows Azure Storage API gem for Ruby, started a few months ago by my friend Johnny Halife. As it is an open-source project, I had the opportunity to contribute:

  • Support for the Table service to query, get_one, insert, update, merge and delete entities.
  • Support for running against the Storage Development Fabric shipped with the Microsoft SDK.
  • Signature support for the Tables service.
  • Support to enumerate, create, and delete tables on a given storage account.
  • Feedback to improve the support for stacked connection management.

This release of waz-storage for Ruby includes numerous features collected from 0.5.6 through 1.0.0. For more information you can visit the project page, where you will find all the gem documentation; if you would like to read the source code, contribute, or give us feedback, you can get it from GitHub.

The Present

One of the objectives of having Tables support in the gem was to have an interface for interacting with tables and entities that we could consume from an adapter, as we usually do with our favorite ORM written in Ruby: DataMapper.

This is why this weekend was pretty much about making that dream come true, creating a new project on GitHub called dm-waztables-adapter and spitting out some lines of code.

Writing the adapter

As with everything in the wonderful Ruby world, it was really easy to get a first version running with the features provided by DataMapper.

It took me a few hours to write 85 lines of code covering the whole adapter (Create, Read, Update and Delete methods).

Sorry, I’m forgetting the additional 30 minutes I spent writing 32 more lines to cover the migrations stuff, so you won’t have to worry about creating the tables when you design your models. (As Windows Azure doesn’t support schemas inside tables, migrations exist just to make sure the tables are there; they won’t modify attributes of existing data.)

Below you will find some code samples. I hope you like it.

Getting started

sudo gem install dm-waztables-adapter


require 'dm-waztables-adapter'

# set up DataMapper with your Windows Azure account
DataMapper.setup(:default, { :adapter => 'WAZTables',
                             :account_name => 'name',
                             :access_key => 'your_access_key' })

# define a new model
class Guitarist
    include DataMapper::Resource

    property :id, String, :key => true
    property :name, String
    property :age, Integer
end

# set up database table on Windows Azure for a specific model
Guitarist.auto_migrate! # (destructive)
Guitarist.auto_upgrade! # (safe)

# set up database table on Windows Azure for all defined models
DataMapper.auto_migrate! # (destructive)
DataMapper.auto_upgrade! # (safe)

# play with DataMapper as usual
Guitarist.create(:id => '1', :name => 'Ritchie Blackmore', :age => 65)

yngwie = Guitarist.new(:id => '2', :name => 'Yngwio Malmsteen', :age => 46)
yngwie.name = "Yngwie Malmsteen"
yngwie.save

# retrieving a unique record by its id
ritchie = Guitarist.get('1')
ritchie.age # => 65

# updating records
ritchie.age = 66
ritchie.save

# retrieving all guitarists
Guitarist.all.length # => 2

# performing queries
older_guitar_players = Guitarist.all( { :age.gte => 50 } )

# deleting records
ritchie.destroy

  • Allow users to define the model partition key by using the :partition_key => true option on a property.
  • Allow users to set the partition key as an additional attribute of the model, with a lambda as its default value.
  • Allow users to set the partition key as a method on the model.
  • Implement the “in” operator in queries.
  • Implement the “order” query option.
  • Retrieve more than 1000 entities using the Windows Azure continuation token.

Known Issues

  • Like statements are not working, since the Microsoft service API throws a NotImplemented exception when
    using startswith and endswith filters (more information here)
  • There’s no way to tell through the entity which is its partition key, so there’s no out-of-the-box load-balancing support (for more info on the tables model, take a look at Microsoft’s documentation)

Reaching the Azure Development Fabric from a remote machine

Are you following me on Twitter? If so, you may know that I forked the waz-storage project to write the Tables and Entities operations exposed by the Windows Azure Table API.

One of the things I wanted while writing code in my Ruby environment was to run functional tests against a local environment. So today I’m going to talk about the Azure Development Fabric and how you can access this service from outside your localhost.

The problem

The Development Fabric is configured by default to listen on the following sockets:

  • Blob: 127.0.0.1:10000
  • Queue: 127.0.0.1:10001
  • Table: 127.0.0.1:10002

That said, we might imagine that http://{Your_LAN_IPAddress}:10000/devstoreaccount1/container/myblob would allow us to get that blob, but it will never happen: at this point, you can consume the services only from your localhost.

The solution

You can replace the default IP address with the one assigned to your network interface in the configuration file C:\Program Files\Windows Azure SDK\v1.0\bin\devstore\DSService.exe.config:

  <service name="Blob" url="http://{Your_LAN_IPAddress}:10000/"/>
  <service name="Queue" url="http://{Your_LAN_IPAddress}:10001/"/>
  <service name="Table" url="http://{Your_LAN_IPAddress}:10002/"/>

Shut down and start the Development Fabric again to apply those changes, and that’s it: tests passing from TextMate against a VM running the Storage service locally.
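The whole edit amounts to replacing the loopback address in those three service entries. Sketched in Ruby, purely as an illustration (run it against a copy of DSService.exe.config):

```ruby
# Illustrative only: point the devstore Blob/Queue/Table endpoints at your LAN
# address instead of 127.0.0.1 by rewriting the url attributes in the config XML.
def retarget_devstore(config_xml, lan_ip)
  config_xml.gsub("127.0.0.1", lan_ip)
end
```

A plain substitution is enough here because 127.0.0.1 only appears in those three url attributes.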



Hope you like it!

Dropzone extension leveraging Ruby waz-storage gem

It’s been a long time since my last post, so today I want to share my experience using the waz-storage gem, created by my friend Johnny Halife. If you are not aware of his amazing job, you should check this post to get more context about what I’m going to show you.

What can I do?

After reviewing Johnny’s code I was very excited about creating something and starting to play with that toy, but obviously the question was: WHAT?

The answer came to my mind while reading a blog about a Mac OS X desktop application called Dropzone. This is an excerpt taken from Dropzone’s website:

“Drag a file onto the dock icon and your fully customizable grid of destinations flies smoothly out using core animation. Drop the file onto a destination and Dropzone will take care of the rest. Whether you’re installing an app, uploading a file to an FTP server or sharing your photos on Flickr.”

There is a section about how to extend Dropzone’s features and how to contribute by creating plugins: Dropzone can easily be extended using simple Ruby scripts.

So I thought about writing a script that allows you to easily drag and drop files from your computer and store them as blobs on the Windows Azure storage services.

Coding for fun!

I started reading the Dropzone Scripting API documentation and was surprised by how easy it was: there are only two methods to implement, dragged and clicked, and that’s it.

Beyond the simplicity of Dropzone’s API, I had the joy of coding in Ruby and the ease of the waz-storage gem.
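To give an idea of the shape of such a script, here is a minimal sketch of the two entry points. Real Dropzone scripts receive the dropped paths and report progress through the $dz helper; the waz-storage call is commented out so the skeleton stays self-contained, and the resulting URL format is an assumption based on the Blob service’s public-container convention:

```ruby
# Minimal sketch of a Dropzone action script: Dropzone calls dragged with the
# dropped file paths and clicked when the destination icon is clicked.
CONTAINER = "dropzone"
ACCOUNT   = "your_account_name"

def dragged(items)
  items.map do |path|
    blob_name = File.basename(path)
    # The real script uploads here, roughly:
    # WAZ::Blobs::Container.find(CONTAINER).store(blob_name, File.read(path), "application/octet-stream")
    "http://#{ACCOUNT}.blob.core.windows.net/#{CONTAINER}/#{blob_name}"
  end
end

def clicked
  "http://#{ACCOUNT}.blob.core.windows.net/#{CONTAINER}"
end
```

Because the container is public, the URL returned for each dropped file can be pasted straight into a browser once the upload finishes.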

You can find the source code on GitHub.

How can you try it?

Installation and configuration:

  1. Download the Dropzone program from here
  2. Install the waz-storage gem if the gem isn’t installed yet
    sudo gem install waz-storage --version ">= 0.5.4"
  3. Download the dropzone extension that I created from the github repositories. The file is called WAZBlobs.dropzone
  4. Open the WAZBlobs.dropzone file and provide your Windows Azure Services credentials as depicted below:

The script and the functionality are very simple:

  1. Drag and drop your files to the Azure’s icon on the Dropzone panel
  2. The files will be uploaded to a public container called dropzone.
  3. The following picture shows that Picture 3, which I dragged & dropped above, is already on the Windows Azure blob servers. So I will play a little bit with the console to show you that the blob is really there 🙂

On my next post I will show you a new application I’m developing on Heroku that uses the same gem to manage the Blobs via Web. Stay tuned!

Webcast for Latin American Community about HPC with WCF

Spanish Version

On December 11, 2008, my friend and mentor Johnny Halife and I gave a webcast for the Latin American community about how to develop distributed applications using Windows HPC Server 2008.

The objective of the talk was to explain the platform that Windows HPC Server 2008 provides for building distributed applications with an SOA architecture using Windows Communication Foundation (WCF).

In addition, we had the opportunity to do a small demo, creating a simple application on our lab deployed at Southworks.

For those who couldn’t attend this presentation, you can download it or watch it at the following URL:

Enjoy it!

Visual Studio 2008 templates compliant with Microsoft StyleCop


Since Microsoft launched StyleCop, we have been running this tool in all of Southworks’ projects. From our Engineering Excellence department we’re promoting its use because it gives us the source-code consistency and homogeneity we want for the developers and customers who read the code.

If you’re using this tool, you have surely realized that some Visual Studio templates are not compliant with some of the StyleCop rules: using directives inside the namespaces, regions, one class per file, among others. This is quite annoying, because each class, interface or test you add to your project has to be restyled to meet those rules.

Project templates like the ASP.NET MVC Web Application (Preview 5) produce around ~120 warnings, even ignoring the documentation rules.

The purpose of this post is to give you a workaround to avoid this unnecessary work.


The way Visual Studio provides these templates is through a series of compressed .zip files with the base source code inside.

There are two folders inside "%ProgramFiles%\Microsoft Visual Studio 9.0\Common7\IDE" with these templates: one for items (classes, interfaces, tests, etc.) and one for projects (Class Library, Console Application, MVC Web Application, etc.).


The workaround is pretty straightforward; all you have to do is:

  1. Extract the default Visual Studio template files
  2. Modify them to be compliant
  3. Compress them again
  4. Overwrite the original files
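As an example of step 2, the most common fix, moving the using directives inside the namespace block, can even be automated with a small string transform like this hypothetical one:

```ruby
# Hypothetical sketch: pull the top-level using directives out of a template's
# Class.cs and re-emit them indented inside the namespace block, which is the
# placement StyleCop expects.
def move_usings_inside_namespace(source)
  usings = source.scan(/^using .*;$/)
  body = source.gsub(/^using .*;\n/, "")
  body.sub(/^(namespace .*\n\{\n)/) { $1 + usings.map { |u| "    #{u}\n" }.join }
end
```

Template placeholders like $safeprojectname$ survive this transform untouched, since only the using lines are moved.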

What I did this weekend was do that work for you for the files and projects I use most, including the Microsoft ASP.NET MVC Preview 5 project template :). Below you’ll find a table with the zip files to download and the folder locations where you should overwrite them.

Once you have copied the files, make sure all your Visual Studio instances are closed and run the following command from an elevated console to refresh Visual Studio’s template cache:

"%ProgramFiles%\Microsoft Visual Studio 9.0\Common7\IDE\devenv.exe" /setup

Template Files

(*) rootPath = "%ProgramFiles%\Microsoft Visual Studio 9.0\Common7\IDE"

You can also download all templates in a single .zip file:

One year and four months later…

Spanish Version

Last night, after drinking a couple of beers with some of my Southworks colleagues, I returned home and, lying on my bed, started to think about this last year and four months since I began working at Southworks.

While I was going back over time, I remembered the projects and customers we worked for, among them the “Microsoft Architecture Strategy Team”, the “Microsoft Developer & Platform Evangelism Team”, the “Microsoft Connected Services Framework Team”, the “Microsoft SQL Server Team” and “Grupo Sancor Seguros”, and the important people I met in person, like Eugenio Pace and Gianpaolo Carraro.

Words and technical acronyms came to my mind about the things I have acquired and incorporated during this time. At that moment I started to imagine something like a mental “Tag Cloud”, and that’s when I decided to write this post, with the objective of leaving it as a log of my experience, to compare in the future against the new words that will surely be added.

Besides all the tags I am listing in this post, I want to thank all the Southies who helped me fill my mind with all this knowledge, and especially my two mentors, Johnny Halife and Matias Woloski, whom I still admire and respect today, and with whom I have the most fun when computers aren’t close to us.

Here is my "Mind Tag Cloud":

  Refactoring    Code Analysis    Retrospective    TDD    WCF    WSDL    Continuous Integration    Patterns    Cluster Server    ISO    Virtualization    SOA    Singleton    Cyclomatic Complexity    WPF    Model View Controller    REST    Linq to XML    Mocks    Paravirtualization    Sprint    Hyper-V    Lambda Expressions    Repository    SAN    Synchronization Framework    iSCSI    LUN    Powershell    SCRUM    Ssds    Agile    Spike    NAS    Iteration Planning    Dependency Injection    Factory    Linq to SQL    Code Coverage    Subversion    Security Token Service    CMMi    StyleCop    Model View Presenter    Strategy  FxCop    Serialization    Apache    Prototype    Datamember    Composite Application Block    Build Server    RSS    DIT    S+S    Backlog    Commitment    Inversion Of Control    Scaffolding    Abstract Factory    Reflection    LCOM    Iteration Review    Software As A Service    DataContract    TFS    Code Query Language    SOAP    Dynamic Language Runtime    Lightweight Directory Services