Global Azure Bootcamp 2015

On Saturday, April 25, 2015 we are going to be part of the Global Azure Bootcamp 2015 at Microsoft Argentina offices!


More Azure Management Portal updates for Azure Media Services

Last week, the Azure Media Services team published more updates in the Azure Management Portal that include some new features. In this post you will find a quick summary of what’s new.

 

Added support for JSON Web Token (JWT) format

Last month, the Azure Media Services team added support for JSON Web Token (JWT) format to restrict delivery of content keys; before this update, only Simple Web Token (SWT) format was supported.

With the latest Azure Portal updates, there is a new Token Type option in the Content Protection tab for both the AES and PlayReady authorization policies to let you set the format: JSON Web Token (default) or Simple Web Token.

Please take into account that:

  • After applying changes to the authorization policies, it can take up to a few minutes for them to take effect.
  • The Content Protection tab only lets you configure the authorization policies that are used for the encryption features in the Azure Portal. If you have a custom authorization policy, you will need to configure it programmatically using the SDK/REST API.
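For reference, configuring a JWT token restriction programmatically looks roughly like the sketch below. This is a minimal sketch, assuming the ContentKeyAuthorization types from the Media Services .NET SDK; the issuer and audience values are placeholders:

[code language="csharp"]
// A minimal sketch, assuming the Media Services .NET SDK; the issuer and
// audience values below are placeholders, not real endpoints.
using System;
using Microsoft.WindowsAzure.MediaServices.Client.ContentKeyAuthorization;

class JwtRestrictionSample
{
    static string CreateJwtRestrictionRequirements()
    {
        var template = new TokenRestrictionTemplate(TokenType.JWT)
        {
            // A parameterless SymmetricVerificationKey generates a random key.
            PrimaryVerificationKey = new SymmetricVerificationKey(),
            Issuer = new Uri("urn:myissuer"),
            Audience = new Uri("urn:myaudience")
        };

        // The serialized template is used as the Requirements value of a
        // token-restricted ContentKeyAuthorizationPolicyRestriction.
        return TokenRestrictionTemplateSerializer.Serialize(template);
    }
}
[/code]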

To learn more about how to use the JWT format with dynamic encryption, you can read these blog posts by @GEOTRIF:

Token Type option in Content Protection tab

 

Disabled Play command when the asset has custom delivery policies

The Play command in the Content tab is now disabled for assets that have a custom delivery policy created programmatically using the SDK/REST API (i.e. not with the Encryption command).

The reason behind this update is that the Azure Portal relies on the delivery policies created with the Encryption command. If you create a custom one, it might not cover the delivery protocols used by the Azure Portal. Therefore, to avoid any potential playback issues that could be misleading (users might think that there is an issue in the dynamic encryption configuration), the Play command is now disabled in this scenario.

Play command disabled with custom delivery policies

 

Added support for Streaming Endpoints advanced configuration

The Configure sub-tab for streaming endpoints now has three new sections:

  • Cross Domain Policy: This section lets you specify the cross domain access policy for Adobe Flash clients. For more information, see Cross-domain policy file specification.
  • Client Access Policy: This section lets you specify the client access policy for Microsoft Silverlight clients. For more information, see Making a Service Available Across Domain Boundaries.
  • Custom Host Names: This section lets you configure a streaming endpoint to accept traffic directed to a custom host name. This allows for easier traffic management configuration through a Global Traffic Manager (GTM), and also allows branded domain names to be used as the streaming endpoint URL. To perform CRUD operations on the custom host names, click the Configure button to open the Manage Custom Host Names dialog.

Please take into account that these sections (along with the rest) are only enabled for editing when the streaming endpoint has at least one streaming unit.

Streaming Endpoint advanced configuration

Manage streaming endpoint custom host names 

 

Enjoy!

Connecting to Microsoft Azure Media Services REST API

If you are a .NET/C# developer, you can consume all the Azure Media Services features using the .NET SDK. It is available as a NuGet package which you can easily get from Visual Studio, and its source code can be found on GitHub. There is also an Extensions .NET SDK that you can use to simplify the implementation of common media workflows (also available as a NuGet package and on GitHub).

If you are working on a platform/language for which there is no official SDK currently available, you can still leverage Azure Media Services by implementing your own client for the Azure Media Services REST API. Luckily, this is not a complex task: the REST API is straightforward in terms of design, supports JSON/XML formats and is very well documented on MSDN, so you only need a standard HTTP client and you are good to go.

In the latter case, you will need to perform some connection steps before accessing the Azure Media Services REST API:

  1. Get the access token (SWT format) from the ACS instance by providing the Media Services account name and key. You can find more details here.
  2. Send an authenticated GET request (with the bearer access token) to https://media.windows.net/ and you will receive a 301 Moved Permanently response with the REST API endpoint in the Location header. You can find more details here.
  3. Use the bearer access token and the new endpoint for sending all your subsequent requests to the REST API.
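If you are curious what step #1 looks like in code, here is a minimal C# sketch, assuming the documented global ACS endpoint and scope for Media Services, and Json.NET for parsing the token response:

[code language="csharp"]
// A minimal sketch of step #1, assuming the documented Media Services ACS
// endpoint/scope and Json.NET for parsing the token response.
using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;
using Newtonsoft.Json.Linq;

class AcsTokenProvider
{
    public static async Task<string> GetAccessTokenAsync(string accountName, string accountKey)
    {
        using (var client = new HttpClient())
        {
            var body = new FormUrlEncodedContent(new Dictionary<string, string>
            {
                { "grant_type", "client_credentials" },
                { "client_id", accountName },
                { "client_secret", accountKey },
                { "scope", "urn:WindowsAzureMediaServices" }
            });

            var response = await client.PostAsync(
                "https://wamsprodglobal001acs.accesscontrol.windows.net/v2/OAuth2-13", body);
            response.EnsureSuccessStatusCode();

            // The response body is JSON; access_token holds the SWT token.
            var json = JObject.Parse(await response.Content.ReadAsStringAsync());
            return (string)json["access_token"];
        }
    }
}
[/code]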

A common mistake when performing step #2 is not disabling the auto-redirect logic in the HTTP client used to send the request. This feature is likely to be enabled by default for GET requests in a standard HTTP client, so it would follow the redirection automatically, send a second request to the new location (similar to https://wamsbayclus001rest-hs.cloudapp.net/api/) and get a second response without the Location header and a body as follows:

{  
   "d":{  
      "EntitySets":[  
         "AccessPolicies",
         "Locators",
         "ContentKeys",
         "ContentKeyAuthorizationPolicyOptions",
         "ContentKeyAuthorizationPolicies",
         "Files",
         "Assets",
         "AssetDeliveryPolicies",
         "IngestManifestFiles",
         "IngestManifestAssets",
         "IngestManifests",
         "StorageAccounts",
         "Tasks",
         "NotificationEndPoints",
         "Jobs",
         "TaskTemplates",
         "JobTemplates",
         "MediaProcessors",
         "EncodingReservedUnitTypes",
         "Operations",
         "StreamingEndpoints",
         "Channels",
         "Programs"
      ]
   }
}

Of course, this error will prevent you from continuing with step #3 to actually access the Azure Media Services resources you want. Therefore, in summary: if you are building your own Azure Media Services client, disable the auto-redirect logic in your HTTP client configuration when connecting to the REST API.
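To make this concrete, here is a minimal sketch of step #2 in C# with the auto-redirect logic explicitly disabled; the class and method names are just illustrative:

[code language="csharp"]
// A minimal sketch of step #2: the key detail is AllowAutoRedirect = false,
// so the 301 response (and its Location header) is not swallowed.
using System;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class ApiEndpointResolver
{
    public static async Task<Uri> GetApiEndpointAsync(string accessToken)
    {
        var handler = new HttpClientHandler { AllowAutoRedirect = false };
        using (var client = new HttpClient(handler))
        {
            client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", accessToken);

            var response = await client.GetAsync("https://media.windows.net/");
            if (response.StatusCode != HttpStatusCode.MovedPermanently)
            {
                // A 200 here usually means a redirect was followed automatically.
                throw new InvalidOperationException(
                    "Expected 301 Moved Permanently but got " + response.StatusCode);
            }

            // Use this endpoint (plus the bearer token) for all subsequent requests.
            return response.Headers.Location;
        }
    }
}
[/code]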

Happy coding!

Microsoft Azure Management Portal updates for Azure Media Services

Last week, the Azure Media Services team published some updates in the Azure Management Portal that include several improvements and new features. In this post you will find a quick summary of what has changed and what is new.

 

Content Protection tab

As part of the General Availability announcement of the Content Protection (AES and DRM) service, the following changes were applied in the Content Protection tab:

  • The Preview tag next to the Content Protection tab was removed since the service is now in General Availability. This also means that you no longer need to sign up for the PlayReady license service via the Azure Preview features page; this feature is now active in all the Media Services accounts.
  • Completing the Branding Reporting table at the bottom is now optional, since the PlayReady license service is now always enabled.

Content Protection tab

 

Content tab

These are the improvements and new features that were added in the Content tab:

  • The Encode command was renamed Process, since you can now select another media processor (not just the Azure Media Encoder).
  • The Process dialog (formerly Encode) was refactored as follows:
    • The encoding configuration for Azure Media Encoder was reorganized into these categories to make it clearer for users:
      • Presets for Adaptive Streaming (dynamic packaging): produces multi-bitrate MP4 files ready for dynamic packaging
      • Presets for Progressive Download: produces a single MP4 file ready for progressive download
      • Legacy Presets for Adaptive Streaming: produces Smooth Streaming files ready for adaptive streaming
      • Other (common workflows).
    • Following the General Availability announcement of the Azure Media Indexer back in September 2014, you can now select the Azure Media Indexer processor to submit an indexing job. You can also enter the optional title and description metadata fields for the task configuration preset, and choose whether to copy the generated TTML caption file back to the input asset. For more information about the Azure Media Indexer, you can check this blog post.
  • The Publish and Play commands now notify the user if the Media Services account does not have any streaming units when the content is dynamically encrypted (AES or PlayReady). Remember that the dynamic packaging and dynamic encryption features will only work in streaming endpoints that have at least one streaming unit.
  • The Publish command now creates a SAS locator (for progressive download) instead of an Origin locator (for adaptive streaming) when the asset contains a single MP4 file. This is the case, for example, when you generate an asset by submitting a job using any of the H264 Broadband * encoding configuration presets.

Content tab

 

Channels and Programs tabs

With the AES encryption and PlayReady encryption announcements for the Live Streaming service back in November/December 2014, the Azure Media Services team added support for configuring these features from the portal:

  • The Start Streaming command in the Channels tab now includes three sub-commands:
    • Unencrypted: creates a channel program with default settings, publishes the associated asset with an Origin locator and starts the program.
    • AES Encrypted: same as Unencrypted command, but it also adds an asset delivery policy for AES dynamic encryption.
    • PlayReady Encrypted: same as Unencrypted command, but it also adds an asset delivery policy for PlayReady dynamic encryption.
  • The Create a new program dialog now includes the Streaming Encryption Type setting:
    • Unencrypted: creates a program with custom settings.
    • AES Encrypted: same as Unencrypted command, but it also adds an asset delivery policy for AES dynamic encryption.
    • PlayReady Encrypted: same as Unencrypted command, but it also adds an asset delivery policy for PlayReady dynamic encryption.
  • The Programs table now has a new Encryption column specifying the streaming encryption type.

Channels and Programs tabs

 

Enjoy!


Azure Event Hubs, the Thing and the Internet

I have things, and I have Internet. Now I want to Internet my things…

I recently started playing with Azure Event Hubs and wanted to do a basic IoT sample with a real device and learn lots of new stuff. If you haven’t heard of Azure Event Hubs before, it is a “highly scalable publish-subscribe ingestor that can intake millions of events per second so that you can process and analyze the massive amounts of data produced by your connected devices and applications.”

In this post, the objective is to analyze Kinect sensor readings in real time using Azure Event Hubs and monitor meaningful events in a website.

The scenario

I want to know in real time(ish) when a security breach occurs. In my fictitious scenario, a security breach occurs when someone gets too close to a person (have you seen The Island?) or a guarded object (if movies teach us anything, it’s that a laser grid alone won’t cut it; I don’t have access to a laser grid, either). OK, so basically, I need to walk around with a Kinect strapped to my chest, or strap a Kinect to a Fabergé egg.

I set out to do the following:

  • Dump the Kinect’s depth sensor readings in an event hub.
  • Analyze the stream of events using a worker role in real time (using the EventProcessorHost approach) and produce a proximity alert whenever I get too close to the sensor (say, 50 cm).
  • Display the proximity alerts in a monitoring website as they occur.

You can check out the progress here. You will need a Kinect, the Kinect SDK (1.8) and a Microsoft Azure account.

The big picture

(Architecture diagram: Kinect depth frames → Event Hub → EventsConsumer worker → Service Bus alerts queue → AlertsListener worker → SignalR → monitoring web site)

 

The Sample Solution

I’ll give a brief overview of the projects in the sample.

The DepthRecorder project: It is actually the Depth-Basics sample of the Kinect SDK with a little twist: every time a frame is received, I obtain the distance of the closest pixel in the frame, prepare a new KinectEvent object and send it to the event hub. At roughly 30 fps, it’s fun to watch how quickly events start to accumulate. (Check out the Service Bus Explorer tool, too!)
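As a rough illustration (this is not the actual sample code), the per-frame logic could look something like the sketch below; the KinectEvent shape, the device id and the way the depth values are extracted from the frame are all assumptions:

[code language="csharp"]
// A rough sketch of the per-frame publishing idea; KinectEvent, the device id
// and the depth extraction are assumptions, not the actual sample code.
using System;
using System.Text;
using Microsoft.ServiceBus.Messaging;
using Newtonsoft.Json;

public class KinectEvent
{
    public string DeviceId { get; set; }
    public int MinDepthMillimeters { get; set; }
    public DateTime TimestampUtc { get; set; }
}

public class DepthPublisher
{
    private readonly EventHubClient client;

    public DepthPublisher(string connectionString, string eventHubName)
    {
        this.client = EventHubClient.CreateFromConnectionString(connectionString, eventHubName);
    }

    // Called once per depth frame (~30 fps) with the depth values in millimeters
    // (simplified; the sample extracts them from the Kinect depth frame).
    public void PublishFrame(int[] depthsInMillimeters)
    {
        var minDepth = int.MaxValue;
        foreach (var depth in depthsInMillimeters)
        {
            if (depth > 0 && depth < minDepth)
            {
                minDepth = depth;
            }
        }

        var kinectEvent = new KinectEvent
        {
            DeviceId = "kinect-01",
            MinDepthMillimeters = minDepth,
            TimestampUtc = DateTime.UtcNow
        };

        var payload = Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(kinectEvent));
        this.client.Send(new EventData(payload));
    }
}
[/code]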

The Cloud project: This project contains two worker roles. The EventsConsumer worker is responsible for analyzing the stream of events and producing a proximity alert whenever an event reports a depth reading of 50 cm or less. As alerts appear, the worker role dumps them into a Service Bus queue. The AlertsListener worker is responsible for reading the alerts queue and pushing alert messages down to the monitoring website (using SignalR).
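Again as a rough illustration, an IEventProcessor (the interface EventProcessorHost drives) along these lines could do the alert detection; the queue name, connection string and KinectEvent type are assumptions:

[code language="csharp"]
// A rough sketch of the EventsConsumer idea: an IEventProcessor that raises
// an alert when a reading is 50 cm or less. The queue name, connection string
// and KinectEvent type are assumptions, not the actual sample code.
using System.Collections.Generic;
using System.Text;
using System.Threading.Tasks;
using Microsoft.ServiceBus.Messaging;
using Newtonsoft.Json;

public class ProximityEventProcessor : IEventProcessor
{
    private readonly QueueClient alertsQueue = QueueClient.CreateFromConnectionString(
        "<service bus connection string>", "alerts");

    public Task OpenAsync(PartitionContext context)
    {
        return Task.FromResult<object>(null);
    }

    public async Task ProcessEventsAsync(PartitionContext context, IEnumerable<EventData> messages)
    {
        foreach (var message in messages)
        {
            var json = Encoding.UTF8.GetString(message.GetBytes());
            var kinectEvent = JsonConvert.DeserializeObject<KinectEvent>(json);

            // Proximity alert: something is 50 cm (500 mm) or closer to the sensor.
            if (kinectEvent.MinDepthMillimeters <= 500)
            {
                await this.alertsQueue.SendAsync(new BrokeredMessage(json));
            }
        }

        // Checkpoint so processing resumes from here after a restart.
        await context.CheckpointAsync();
    }

    public Task CloseAsync(PartitionContext context, CloseReason reason)
    {
        return Task.FromResult<object>(null);
    }
}
[/code]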

The Web project: It’s an MVC project that displays the alerts on screen as they occur: whenever it receives a SignalR message, it shows a big red alert. For the time being, it won’t distinguish between alert sources, but we’ll work on that :)
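For the SignalR part, the AlertsListener worker could push alerts to the site with the SignalR .NET client along these lines (the site URL, hub and method names are assumptions):

[code language="csharp"]
// A rough sketch of pushing alerts from the AlertsListener worker to the web
// site with the SignalR .NET client; the URL, hub and method names are
// assumptions, not the actual sample code.
using System;
using Microsoft.AspNet.SignalR.Client;

class AlertsPusher
{
    static void Main()
    {
        var connection = new HubConnection("http://my-monitoring-site.azurewebsites.net/");
        var hub = connection.CreateHubProxy("AlertsHub");
        connection.Start().Wait();

        // Invoke a hub method that relays the alert to all connected browsers.
        hub.Invoke("BroadcastAlert", "Proximity alert: object at 42 cm").Wait();
    }
}
[/code]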

 

Work in progress

This is probably not the simplest approach, but I wanted to separate and decouple components as much as possible, because I wanted to use queues and because it’s fun :). For simplicity’s sake, we could probably get rid of the alerts queue and the alerts listener worker, and push alerts directly to the website.

 

Further reading

Check out the following links:

Cloud Cover Episode 160

ScottGu’s blog post with links to more information

Clemens Vasters’ session at TechEd

 

So there you go. Happy Learning!

 

Azure Nights Talk: First Steps with Azure Websites

What is Azure Nights?

Azure Nights is the first series of talks about Microsoft Azure technology. This time we will cover “First Steps with Azure Websites”, an introduction to Microsoft’s cloud platform and how we can use it to develop and publish our web applications.

When?

This first talk will take place on Thursday, September 4 at 7 PM at the Southworks offices – Perú 375, 1st Floor. Come enjoy the talk and some snacks, and share a few beers on a night dedicated to Azure.

I want to register!

Use the following link to register for the event: Registration on EventBrite

We hope to see you there!

Enabling SSL Client Certificates in ASP.NET Web API

You can get an excellent description of what client certificates are and how they work in this article – if you really want to understand this post, take a minute to read it. In a nutshell, client certificates allow a web app to authenticate users by having the client provide a certificate before the HTTP connection is established (during the SSL handshake). It’s an alternative to providing a username/password. As the article explicitly mentions, client certificates have nothing to do with HTTPS certificates – that means you can have HTTPS communication without client certificates. However, you cannot have client certificates work without enabling HTTPS on your site.

ASP.NET Web API can take advantage of client certificates to authenticate clients as well. In this post, I’ll walk you through the steps of configuring client certificates in IIS and testing them on a Web API. Please notice that these steps should only be executed in a development environment, as a production environment might require a more rigorous approach.

1. Let’s first create the necessary certificates (as explained here). To do this, open the Visual Studio Developer Command Prompt and run the following command, typing the certificate password when prompted. This first command will create a ‘Root Certification Authority’ certificate. If you want more details, you can read about how these commands work in this MSDN article.

[code language="powershell"]
makecert.exe -n "CN=Development CA" -r -sv DevelopmentCA.pvk DevelopmentCA.cer
[/code]

2. Install the DevelopmentCA.cer certificate in your Trusted Root Certification Authorities for the Local Machine store using MMC (right-click over the Trusted Root Certification Authorities folder | All Tasks | Import).


Note: For production scenarios, this certificate will obviously not be valid. You will need to get an SSL certificate from a Certificate Authority (CA). Learn more about this here.

3. Now let’s create an SSL certificate in a .pfx format, signed by the CA created above, using the following 2 commands. The first command will create the certificate and the second one will convert the .pvk certificate containing the private key to .pfx. This certificate will be used as the SSL certificate.

[code language="powershell"]
makecert -pe -n "CN=localhost" -a sha1 -sky exchange -eku 1.3.6.1.5.5.7.3.1
-ic DevelopmentCA.cer -iv developmentCA.pvk -sv SSLCert.pvk SSLCert.cer

pvk2pfx -pvk SSLCert.pvk -spc SSLCert.cer -pfx SSLCert.pfx -po 123456
[/code]

4. Install the SSLCert.pfx certificate in the Personal store of Local Computer using MMC. Notice that the certificate shows it was issued by the Development CA.


5. Run this third command to create a private-key certificate signed by the CA certificate created above. The certificate will be automatically installed into the Personal store of Current User, as shown in the figure below. Notice also that the Intended Purpose shows Client Authentication.

[code language="powershell"]
makecert.exe -pe -ss My -sr CurrentUser -a sha1 -sky exchange -n "CN=ClientCertificatesTest"
-eku 1.3.6.1.5.5.7.3.2 -sk SignedByCA -ic DevelopmentCA.cer -iv DevelopmentCA.pvk
[/code]


6. Now let’s get into the code. A good place in ASP.NET Web API 2 to validate the client certificate is an ActionFilterAttribute, by calling GetClientCertificate on the request message (see some examples here). An action filter is an attribute that you can apply to a controller action — or an entire controller — that modifies the way in which the action is executed.

[code language="csharp" highlight="5"]
public override void OnActionExecuting(HttpActionContext actionContext)
{
    var request = actionContext.Request;

    if (!this.AuthorizeRequest(request.GetClientCertificate()))
    {
        throw new HttpResponseException(HttpStatusCode.Forbidden);
    }
}
[/code]
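The AuthorizeRequest method is where your own validation logic goes. A minimal sketch could look like the following; thumbprint pinning is just one possible check, and the thumbprint value is a placeholder (X509Certificate2 comes from System.Security.Cryptography.X509Certificates):

[code language="csharp"]
// A minimal sketch of a possible AuthorizeRequest implementation for the
// filter above; pinning the thumbprint is just one option, and the value
// below is a placeholder.
private bool AuthorizeRequest(X509Certificate2 certificate)
{
    if (certificate == null)
    {
        // No client certificate was sent (possible when SSL Settings is 'Accept').
        return false;
    }

    const string expectedThumbprint = "<your client certificate thumbprint>";
    return string.Equals(
        certificate.Thumbprint, expectedThumbprint, StringComparison.OrdinalIgnoreCase);
}
[/code]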

7. Use your local IIS to host your Web API code. Under the site configuration, open the site bindings and configure HTTPS using the SSL certificate created above. Select the ‘localhost’ certificate created in step 3.


8. Open the SSL Settings under your web site in IIS and select Accept. The options available are:

  • Ignore: Will not request any certificate (default)
  • Accept: IIS will accept a certificate from the client, but does not require one
  • Require: Require a client certificate – to enable this option, you must also select “Require SSL”


Changing this value will add the following section in the ApplicationHost.config file (by default located under C:\Windows\System32\inetsrv\config). The SslNegotiateCert value corresponds to the Accept option we selected before.

[code language="xml" highlight="5"]
<configuration>
  <location path="subscriptions">
    <system.webServer>
      <security>
        <access sslFlags="SslNegotiateCert" />
      </security>
    </system.webServer>
  </location>
</configuration>
[/code]

Note: If you want to enable this from Web.config instead of using ApplicationHost.config, notice that the <access> element is not allowed to be overridden from Web.config by default. To enable overriding the value from Web.config, change the overrideModeDefault value of the <access> section like this: <section name="access" overrideModeDefault="Allow" />. Please notice this is not recommended for production servers, as it implies changing the behavior for the entire IIS server.

9. Now when browsing to the site using HTTPS in a browser like Internet Explorer you should get prompted for a client certificate. Select the ClientCertificatesTest client certificate you’ve created. As we’ve only selected ‘Accept’ in IIS SSL Settings, if you click Cancel, you should be able to browse to the site all the same, even if you didn’t provide a client certificate.

Also, notice that you are no longer shown an untrusted certificate warning, because you’ve installed the Development CA cert as a Trusted Root Certification Authority.

image

Finally, if you want to know how to perform a request programmatically using client certificates, you can check this Gist.
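If you just want the basic idea, here is a minimal sketch of such a request, assuming the ClientCertificatesTest certificate created in step 5 and a Web API hosted at https://localhost:

[code language="csharp"]
// A minimal sketch of sending a request with a client certificate attached,
// assuming the ClientCertificatesTest certificate from step 5 and a Web API
// at https://localhost (both are assumptions of this sketch).
using System;
using System.Net.Http;
using System.Security.Cryptography.X509Certificates;

class ClientCertificateRequestSample
{
    static void Main()
    {
        var store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
        store.Open(OpenFlags.ReadOnly);
        var certificate = store.Certificates.Find(
            X509FindType.FindBySubjectName, "ClientCertificatesTest", validOnly: false)[0];
        store.Close();

        // WebRequestHandler (System.Net.Http.WebRequest assembly) exposes the
        // ClientCertificates collection used during the SSL handshake.
        var handler = new WebRequestHandler();
        handler.ClientCertificates.Add(certificate);

        using (var client = new HttpClient(handler))
        {
            var response = client.GetAsync("https://localhost/api/values").Result;
            Console.WriteLine(response.StatusCode);
        }
    }
}
[/code]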

Note: I’m actually not an expert in security; this post is mostly the result of a couple of battles, some of them won, some of them lost – so feel free to provide feedback!

Updating the workaround to use Regions in DataTemplates to Prism 5

Hi All,

These last few days there have been some questions regarding the use of Region definitions inside DataTemplates when working with Prism 5.

Many of you may know already about Damian Cherubini’s workaround for this feature about using a Region Behavior when working in Prism 4. You may find the related blog post in the following site:

However, the sample solution provided in Damian’s post is implemented in Silverlight (although it would also work on WPF). So I took some time to migrate this well-designed workaround to make it available for Prism 5.

Compatibility for Prism 5

For the sample solution to be compatible with Prism 5, it had to be migrated to WPF, as Silverlight is not supported by this new version of Prism.

Therefore, the first update consisted of migrating the sample from Silverlight to WPF, and then upgrading it from Prism 4.1 to Prism 5.

The major modification was related to the Region Manager binding inside the DataTemplate when migrating to WPF. I’ve updated the solution in order to get the Region Manager on each View from its ViewModel, so the ObservableRegionManager would not be necessary anymore:

ViewWithTemplateAndScopedRegion.xaml

<DataTemplate>
     <StackPanel Background="Orange" Orientation="Vertical">
          <TextBlock Text="Scoped region inside data template:" />
          <Border BorderThickness="2" BorderBrush="Black">
               <ScrollViewer MaxHeight="110" VerticalScrollBarVisibility="Auto">
                    <ItemsControl MinWidth="400" MinHeight="100" prism:RegionManager.RegionName="MainRegion" prism:RegionManager.RegionManager="{Binding ElementName=ViewWithScopedRegion, Path=DataContext.RegionManager}"/>
               </ScrollViewer>
          </Border>
     </StackPanel>
</DataTemplate>
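For context, here is a minimal sketch of the ViewModel side that the binding above assumes; the interface and class names are illustrative of the RegionManagerAware approach used in the sample, not its exact code:

[code language="csharp"]
// A minimal sketch of the ViewModel side assumed by the binding above; the
// interface/class names are illustrative, not the sample's exact code.
using Microsoft.Practices.Prism.Regions;

public interface IRegionManagerAware
{
    IRegionManager RegionManager { get; set; }
}

public class ViewWithTemplateAndScopedRegionViewModel : IRegionManagerAware
{
    // Set by the region behavior with the scoped RegionManager, so the region
    // defined inside the DataTemplate registers against the right scope.
    public IRegionManager RegionManager { get; set; }
}
[/code]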

Sample Code

You can find a sample with this updated workaround stored here with the name of RegionManagerAwareBehavior-Prism5.

(This code is provided “AS IS” with no warranties and confers no rights.)

I hope that you find this useful,

Please read! Blog moved.

After more than 6 years working at Southworks I have decided to start a new adventure. I’ll provide more information about this soon in my new blog, where I’ll continue to write about code stuff.

The new blog uses the magic of Jekyll + Github Pages and is hosted at http://dschenkelman.github.io/ and the RSS feed is http://dschenkelman.github.io/feed.xml.

Building End-To-End Video Experiences with Azure Media Services @ //build/ 2014

Microsoft Azure Media Services @ //build/ 2014

Last month, Mingfei Yan (Program Manager for Azure Media Services) invited me to be co-presenter of the “Building End-To-End Video Experiences with Azure Media Services” session at //build/ 2014. In our presentation we briefly introduced Microsoft Azure Media Services, explaining the steps involved in a Video On-Demand workflow and showing some demos on how to use the Azure Management Portal and the Extensions SDK for .NET. We also presented some new features, including HLS version 3 support, more Secure Delivery options, Live Streaming and fast sub-clipping.

In this post, you will find a summary of the content shown in the session along with several useful resources. The session recording will be available soon on Channel9 at http://channel9.msdn.com/Events/Build/2014/3-610, and the slides can be found at http://www.slideshare.net/mingfeiy/build-2014-building-endtoend-video-experience-with-azure-media-services

Media Services speakers @ //build/ 2014

 

Media Services Introduction

Mingfei started the session by explaining the current Media industry challenges of building a solution that provides high quality videos for any device anywhere and at any time, and how Microsoft Azure Media Services can help us to solve them. She also went through the Media Services architecture built on top of the Microsoft Azure platform.

Microsoft Azure Media Services Architecture

 

Video On-Demand (VOD)

Mingfei continued by breaking down the steps involved in a Video On-Demand (VOD) workflow using Azure Media Services: Ingest, Encode, Package, Encrypt and Deliver. She also highlighted the Dynamic Packaging feature and the brand new support for Apple HTTP Live Streaming (HLS) version 3. We also showed two demos for the Video On-Demand (VOD) services:

1. Azure Management Portal.

Mingfei showed how you can use the portal to list your assets, ingest a new mezzanine file and then transcode it to an adaptive bitrate MP4 set asset.

2. Extensions SDK for .NET.

I showed how to write a very simple VOD workflow using the Extensions SDK for .NET that publishes an asset for adaptive streaming by creating an Origin locator and generating the Smooth Streaming, HLS and MPEG-DASH dynamic packaging URLs. To demonstrate this, I used a players Web Site that leverages different Display Modes to choose the appropriate dynamic packaging URL for each device:

VODPlayersSite

You can find the source code of the VOD demos in the azuremediaservices-build2014 GitHub repository at https://github.com/mconverti/azuremediaservices-build2014/tree/master/VODDemos.
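If you just want the core of the demo, the publishing flow with the Extensions SDK looks roughly like this (a minimal sketch; the account credentials and asset lookup are placeholders, and the full demo is in the repository above):

[code language="csharp"]
// A minimal sketch of the VOD publishing flow shown in the demo, using the
// Extensions SDK for .NET; credentials and the asset lookup are placeholders.
using System;
using System.Linq;
using Microsoft.WindowsAzure.MediaServices.Client;

class VodPublishingSample
{
    static void Main()
    {
        var context = new CloudMediaContext("<account name>", "<account key>");

        // Assume the asset was already ingested and encoded to multi-bitrate MP4.
        var asset = context.Assets.Where(a => a.Name == "MyMultibitrateAsset").First();

        // Create a 30-day Origin locator to enable adaptive streaming.
        context.Locators.Create(
            LocatorType.OnDemandOrigin, asset, AccessPermissions.Read, TimeSpan.FromDays(30));

        // Dynamic packaging URLs for each streaming protocol.
        Console.WriteLine(asset.GetSmoothStreamingUri());
        Console.WriteLine(asset.GetHlsUri());
        Console.WriteLine(asset.GetMpegDashUri());
    }
}
[/code]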

 

Secure Delivery options (private preview feature)

After that, Mingfei explained the new options available in private preview to secure the media delivery to serve both encrypted HLS and Smooth Streaming to client devices:

  • Advanced Encryption Standard (AES) 128 clear key dynamic encryption
  • Microsoft PlayReady Digital Rights Management (DRM) content access and protection technology

She also showed a demo on how to dynamically encrypt Smooth Streaming content with AES and play it back in a Windows Store application. For more details about the new Security Delivery options, you can check out her blog post: Announcing PlayReady as a service and AES dynamic encryption with Azure Media Services.

New Secure Delivery Options

 

Live Streaming and Fast Sub-Clipping (private preview features)

Finally, we ended the session by showing the Live Streaming private preview feature. Mingfei briefly explained the new Live Streaming concepts and entities and how they work. Then, I ran a demo leading the audience through the following steps:

1. Manage Channels and Programs from the Azure Management Portal.

I showed the new Channels tab in the Azure Management Portal that lets you create, remove, update and list the channels and programs in the live-enabled account. The portal works on top of the Live Services REST API, so this means that all the operations available there (and even more) can be easily automated/scheduled in a Live Streaming workflow.

2. Start a live stream.

To do this, I used encoding software running on my machine to encode the stream from my camera to Smooth Streaming format and send it to the Ingest URL of the channel. We needed an encoder because the Live Encoding feature is not yet ready, so we had to encode the original camera stream into multiple bitrates before sending it to the channel. To verify that the channel was receiving my input, I used the Preview URL player in the Azure Management Portal (the Preview URL is a tech check that lets us know whether the channel is receiving the output from the encoder). Everything worked as expected, and I started a “real” live stream through the Origin servers by clicking the Start Streaming button in the portal. This triggers the following operations behind the scenes:

  1. Create a new asset
  2. Create a new program and associate it with the asset
  3. Start the program
  4. Create an origin locator for the program asset to enable adaptive streaming through the Origin servers

3. Dynamic Packaging with Live Streams

To demonstrate that dynamic packaging also works with live stream assets, I used a players Web Site that leverages different Display Modes to choose the appropriate dynamic packaging URL for each device (very similar to the one I used for the VOD demo).

4. Fast Sub-Clipping

I showed the new fast sub-clipping feature that allows you to create a highlight from an asset while it is still a live stream. To do this, I prepared a Microsoft Media Platform Video Editor (formerly RCE) deployment with custom providers to get assets from my live-enabled Media Services account. I used this tool to easily choose the start and end positions and submit the sub-clip generation to Media Services (this only took a few seconds). You can go back in the live stream playback to choose the sub-clip positions thanks to the Program DVR Window, which is the amount of viewable content available before the live position.

5. Live to VOD transition

Finally, I stopped the program from the Azure Management Portal to show the seamless transition between Live and VOD; the full live archive is immediately available as VOD using the same dynamic packaging URLs after the live event ends. This also applies to the sub-clips created while the asset was still a live stream.

For more information about the new Media Services features, you can check out the NAB 2014 Announcements from Media Services blog post by John Deutscher (Principal Program Manager Lead for Azure Media Services).

Enjoy!