All posts by Mariano Converti

More Azure Management Portal updates for Azure Media Services

Last week, the Azure Media Services team published more updates in the Azure Management Portal that include some new features. In this post you will find a quick summary of what’s new.

 

Added support for JSON Web Token (JWT) format

Last month, the Azure Media Services team added support for JSON Web Token (JWT) format to restrict delivery of content keys; before this update, only Simple Web Token (SWT) format was supported.

With the latest Azure Portal updates, there is a new Token Type option in the Content Protection tab for both the AES and PlayReady authorization policies to let you set the format: JSON Web Token (default) or Simple Web Token.

Please take into account that:

  • After applying changes to the authorization policies, it can take a few minutes for them to take effect.
  • The Content Protection tab only lets you configure the authorization policies that are used for the encryption features in the Azure Portal. If you have a custom authorization policy, you will need to configure it programmatically using the SDK/REST API (see the sketch after this list).
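
For reference, this is a minimal, hedged sketch of how a JWT token restriction might be configured programmatically with the .NET SDK (assuming a CloudMediaContext instance named context). The issuer and audience values are hypothetical, and the Issuer/Audience property types (Uri vs. string) vary between SDK versions, so treat this as an illustration rather than a drop-in snippet:

// Minimal sketch (not the exact portal behavior): a JWT token restriction for an
// AES (clear key) content key authorization policy option using the .NET SDK.
TokenRestrictionTemplate template = new TokenRestrictionTemplate(TokenType.JWT)
{
    PrimaryVerificationKey = new SymmetricVerificationKey(),    // auto-generated verification key
    Issuer = new Uri("https://sts.contoso.com"),                // hypothetical issuer
    Audience = new Uri("urn:contoso")                           // hypothetical audience
};

ContentKeyAuthorizationPolicyRestriction restriction = new ContentKeyAuthorizationPolicyRestriction
{
    Name = "JWT Token Restriction",
    KeyRestrictionType = (int)ContentKeyRestrictionType.TokenRestricted,
    Requirements = TokenRestrictionTemplateSerializer.Serialize(template)
};

IContentKeyAuthorizationPolicyOption policyOption = context.ContentKeyAuthorizationPolicyOptions.Create(
    "JWT policy option for AES clear key delivery",
    ContentKeyDeliveryType.BaselineHttp,
    new List<ContentKeyAuthorizationPolicyRestriction> { restriction },
    null);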

To learn more about how to use the JWT format with dynamic encryption, you can read these blog posts by @GEOTRIF:

Token Type option in Content Protection tab

 

Disabled Play command when the asset has custom delivery policies

The Play command in the Content tab is now disabled for assets that have a custom delivery policy created programmatically using the SDK/REST API (i.e. not with the Encryption command).

The reason behind this update is that the Azure Portal relies on the delivery policies created with the Encryption command. If you create a custom one, it might not cover the delivery protocols used by the Azure Portal. Therefore, to avoid any potential playback issues that could be misleading (users might think that there is an issue in the dynamic encryption configuration), the Play command is now disabled in this scenario.
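
For context, the following minimal sketch shows roughly what a custom delivery policy created programmatically looks like. It is hedged: it assumes a CloudMediaContext named context, an asset variable, and an AES content key named contentKey already linked to the asset, and the protocol choice is illustrative (covering only Smooth Streaming is precisely the kind of policy that would not match the protocols the portal uses):

// Minimal sketch of a custom asset delivery policy created with the .NET SDK.
Uri keyAcquisitionUri = contentKey.GetKeyDeliveryUrl(ContentKeyDeliveryType.BaselineHttp);

var configuration = new Dictionary<AssetDeliveryPolicyConfigurationKey, string>
{
    { AssetDeliveryPolicyConfigurationKey.EnvelopeKeyAcquisitionUrl, keyAcquisitionUri.ToString() }
};

IAssetDeliveryPolicy policy = context.AssetDeliveryPolicies.Create(
    "Custom AES delivery policy (Smooth Streaming only)",
    AssetDeliveryPolicyType.DynamicEnvelopeEncryption,
    AssetDeliveryProtocol.SmoothStreaming,   // does not cover all protocols used by the portal
    configuration);

asset.DeliveryPolicies.Add(policy);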

Play command disabled with custom delivery policies

 

Added support for Streaming Endpoints advanced configuration

The Configure sub-tab for streaming endpoints now has three new sections:

  • Cross Domain Policy: This section lets you specify the cross domain access policy for Adobe Flash clients. For more information, see Cross-domain policy file specification.
  • Client Access Policy: This section lets you specify the client access policy for Microsoft Silverlight clients. For more information, see Making a Service Available Across Domain Boundaries.
  • Custom Host Names: This section lets you configure a streaming endpoint to accept traffic directed to a custom host name. This allows for easier traffic management configuration through a Global Traffic Manager (GTM), and also allows branded domain names to be used as the streaming host name. To perform CRUD operations on the custom host names, click the Configure button to open the Manage Custom Host Names dialog (see the sketch below).

Please take into account that these sections (along with the rest) are only enabled for editing when the streaming endpoint has at least one streaming unit.
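
If you prefer to automate this, a minimal, hedged sketch with the .NET SDK might look as follows (the endpoint name and host name are hypothetical placeholders, and it assumes a CloudMediaContext named context):

// Minimal sketch: adding a custom host name to a streaming endpoint with the .NET SDK.
IStreamingEndpoint streamingEndpoint = context.StreamingEndpoints
    .ToList()
    .FirstOrDefault(se => se.Name == "myendpoint");

if (streamingEndpoint != null && streamingEndpoint.ScaleUnits > 0)   // needs at least one streaming unit
{
    streamingEndpoint.CustomHostNames.Add("media.contoso.com");
    streamingEndpoint.Update();
}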

Streaming Endpoint advanced configuration

Manage streaming endpoint custom host names 

 

Enjoy!

Connecting to Microsoft Azure Media Services REST API

If you are a .NET/C# developer, you can consume all the Azure Media Services features using the .NET SDK. It is available as a NuGet package which you can easily get from Visual Studio, and its source code can be found on GitHub. There is also an Extensions .NET SDK that you can use to simplify the implementation of common media workflows (also available as a NuGet package and on GitHub).

If you are working on a platform/language where there is no official SDK currently available, you can leverage Azure Media Services by implementing your own client for the Azure Media Services REST API. Luckily, this is not a complex task: the REST API is straightforward in terms of design, supports JSON and XML formats, and is very well documented on MSDN, so you only need a standard HTTP client and you are good to go.

In the latter case, you will need to perform a few connection steps before accessing the Azure Media Services REST API (a C# sketch follows the list):

  1. Get the access token (SWT format) from the ACS instance by providing the Media Services account name and key. You can find more details here.
  2. Send an authenticated GET request (with the bearer access token) to https://media.windows.net/ and you will receive a 301 Moved Permanently response with the REST API endpoint in the Location header. You can find more details here.
  3. Use the bearer access token and the new endpoint for sending all your subsequent requests to the REST API.
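
The following is a minimal sketch of these connection steps in C# using HttpClient. Treat the ACS endpoint, scope, header names and API version as assumptions taken from the MSDN documentation at the time of writing, and the ParseAccessToken helper as hypothetical (use any JSON library to extract the access_token field):

// Minimal sketch of the REST API connection steps (endpoints, headers and versions are assumptions).
var handler = new HttpClientHandler { AllowAutoRedirect = false };   // key point: disable auto-redirect
var client = new HttpClient(handler);

// 1. Get the ACS access token (SWT format) using the Media Services account name and key.
var tokenRequestBody = new FormUrlEncodedContent(new Dictionary<string, string>
{
    { "grant_type", "client_credentials" },
    { "client_id", "%accountName%" },
    { "client_secret", "%accountKey%" },
    { "scope", "urn:WindowsAzureMediaServices" }
});
HttpResponseMessage tokenResponse = client.PostAsync(
    "https://wamsprodglobal001acs.accesscontrol.windows.net/v2/OAuth2-13", tokenRequestBody).Result;
string accessToken = ParseAccessToken(tokenResponse.Content.ReadAsStringAsync().Result);   // hypothetical helper

// 2. Send an authenticated GET request to https://media.windows.net/ and read the REST API
//    endpoint from the Location header of the 301 Moved Permanently response.
client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);
client.DefaultRequestHeaders.Add("x-ms-version", "2.9");              // or the current REST API version
client.DefaultRequestHeaders.Add("DataServiceVersion", "3.0");
client.DefaultRequestHeaders.Add("MaxDataServiceVersion", "3.0");
client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));

HttpResponseMessage redirectResponse = client.GetAsync("https://media.windows.net/").Result;
Uri apiEndpoint = redirectResponse.Headers.Location;   // e.g. https://wamsbayclus001rest-hs.cloudapp.net/api/

// 3. Use the bearer access token and apiEndpoint for all subsequent requests.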

A common mistake when performing step #2 is not disabling the auto-redirect logic in the HTTP client used to send the request. Auto-redirect is usually enabled by default for GET requests in a standard HTTP client, so the client follows the redirection automatically, sends a second request to the new location (similar to https://wamsbayclus001rest-hs.cloudapp.net/api/) and gets a second response without the Location header and with a body like the following:

{  
   "d":{  
      "EntitySets":[  
         "AccessPolicies",
         "Locators",
         "ContentKeys",
         "ContentKeyAuthorizationPolicyOptions",
         "ContentKeyAuthorizationPolicies",
         "Files",
         "Assets",
         "AssetDeliveryPolicies",
         "IngestManifestFiles",
         "IngestManifestAssets",
         "IngestManifests",
         "StorageAccounts",
         "Tasks",
         "NotificationEndPoints",
         "Jobs",
         "TaskTemplates",
         "JobTemplates",
         "MediaProcessors",
         "EncodingReservedUnitTypes",
         "Operations",
         "StreamingEndpoints",
         "Channels",
         "Programs"
      ]
   }
}

Of course, this error will prevent you from continuing with step #3 and actually accessing the Azure Media Services resources you want. In summary, if you are building your own Azure Media Services client, disable auto-redirection in your HTTP client configuration when connecting to the REST API.

Happy coding!

Microsoft Azure Management Portal updates for Azure Media Services

Last week, the Azure Media Services team published some updates in the Azure Management Portal that include several improvements and new features. In this post you will find a quick summary of what has changed and what is new.

 

Content Protection tab

As part of the General Availability announcement of the Content Protection (AES and DRM) service, the following changes were applied in the Content Protection tab:

  • The Preview tag next to the Content Protection tab was removed since the service is now in General Availability. This also means that you no longer need to sign up for the PlayReady license service via the Azure Preview features page; this feature is now active in all the Media Services accounts.
  • The Branding Reporting table at the bottom is now optional; this means that the PlayReady license service is always enabled.

Content Protection tab

 

Content tab

These are the improvements and new features that were added in the Content tab:

  • The Encode command was renamed to Process, since you can now select a different media processor (not just the Azure Media Encoder).
  • The Process dialog (formerly Encode) was refactored as follows:
    • The encoding configuration for Azure Media Encoder was reorganized into these categories to make it clearer for users:
      • Presets for Adaptive Streaming (dynamic packaging): produces multi-bitrate MP4 files ready for dynamic packaging
      • Presets for Progressive Download: produces a single MP4 file ready for progressive download
      • Legacy Presets for Adaptive Streaming: produces Smooth Streaming files ready for adaptive streaming
      • Other (common workflows).
    • Following the General Availability announcement of the Azure Media Indexer back in September 2014, you can now select the Azure Media Indexer processor to submit an indexing job (see the sketch after this list). You can also enter the optional title and description metadata fields for the task configuration preset, and choose whether to copy the generated TTML caption file back to the input asset. For more information about the Azure Media Indexer, you can check this blog post.
  • The Publish and Play commands now notify the user if the Media Services account does not have any streaming units when the content is dynamically encrypted (AES or PlayReady). Remember that the dynamic packaging and dynamic encryption features will only work in streaming endpoints that have at least one streaming unit.
  • The Publish command now creates a SAS locator (for progressive download) instead of an Origin locator (for adaptive streaming) when the asset contains a single MP4 file. This is the case, for example, when you generate an asset by submitting a job using any of the H264 Broadband * encoding configuration presets.
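
For those who prefer to automate this instead of using the Process dialog, here is a minimal, hedged sketch of submitting an indexing job with the Extensions SDK (the "Azure Media Indexer" processor name string, the configuration file name and the inputAsset variable are assumptions; the actual Indexer task configuration preset is an XML document described in the Indexer documentation):

// Minimal sketch: submitting an Azure Media Indexer job with the Extensions SDK.
string indexerConfiguration = File.ReadAllText("AzureMediaIndexerConfig.xml");   // hypothetical preset file

IJob indexingJob = context.Jobs.CreateWithSingleTask(
    "Azure Media Indexer",
    indexerConfiguration,
    inputAsset,
    "Indexing output",
    AssetCreationOptions.None);

indexingJob.Submit();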

Content tab

 

Channels and Programs tabs

With the AES encryption and PlayReady encryption announcements for the Live Streaming service back in November/December 2014, the Azure Media Services team added support for configuring these features from the portal:

  • The Start Streaming command in the Channels tab now includes three sub-commands:
    • Unencrypted: creates a channel program with default settings, publishes the associated asset with an Origin locator and starts the program.
    • AES Encrypted: same as Unencrypted command, but it also adds an asset delivery policy for AES dynamic encryption.
    • PlayReady Encrypted: same as Unencrypted command, but it also adds an asset delivery policy for PlayReady dynamic encryption.
  • The Create a new program dialog now includes the Streaming Encryption Type setting:
    • Unencrypted: creates a program with custom settings.
    • AES Encrypted: same as Unencrypted command, but it also adds an asset delivery policy for AES dynamic encryption.
    • PlayReady Encrypted: same as Unencrypted command, but it also adds an asset delivery policy for PlayReady dynamic encryption.
  • The Programs table now has a new Encryption column specifying the streaming encryption type.

Channels and Programs tabs

 

Enjoy!

Building End-To-End Video Experiences with Azure Media Services @ //build/ 2014

Microsoft Azure Media Services @ //build/ 2014

Last month, Mingfei Yan (Program Manager for Azure Media Services) invited me to co-present the “Building End-To-End Video Experiences with Azure Media Services” session at //build/ 2014. In our presentation, we briefly introduced Microsoft Azure Media Services, explaining the steps involved in a Video On-Demand workflow and showing some demos on how to use the Azure Management Portal and the Extensions SDK for .NET. We also presented some new features, including HLS version 3 support, more Secure Delivery options, Live Streaming and fast sub-clipping.

In this post, you will find a summary of the content shown in the session along with several useful resources. The session recording will be available soon on Channel9 at http://channel9.msdn.com/Events/Build/2014/3-610, and the slides can be found at http://www.slideshare.net/mingfeiy/build-2014-building-endtoend-video-experience-with-azure-media-services.

Media Services speakers @ //build/ 2014

 

Media Services Introduction

Mingfei started the session by explaining the media industry's current challenges in building a solution that delivers high-quality video to any device, anywhere, at any time, and how Microsoft Azure Media Services can help solve them. She also went through the Media Services architecture, built on top of the Microsoft Azure platform.

Microsoft Azure Media Services Architecture

 

Video On-Demand (VOD)

Mingfei continued by breaking down the steps involved in a Video On-Demand (VOD) workflow using Azure Media Services: Ingest, Encode, Package, Encrypt and Deliver. She also highlighted the Dynamic Packaging feature and the brand new support for Apple HTTP Live Streaming (HLS) version 3. We also showed two demos for the Video On-Demand (VOD) services:

1. Azure Management Portal.

Mingfei showed how you can use the portal to list your assets, ingest a new mezzanine file and then transcode it to an adaptive bitrate MP4 set asset.

2. Extensions SDK for .NET.

I showed how to write a very simple VOD workflow using the Extensions SDK for .NET that publishes an asset for adaptive streaming by creating an Origin locator and generating the Smooth Streaming, HLS and MPEG-DASH dynamic packaging URLs (a short sketch appears below). To demonstrate this, I used a player website that leverages different Display Modes to choose the appropriate dynamic packaging URL for each device:

VODPlayersSite

You can find the source code of the VOD demos in the azuremediaservices-build2014 GitHub repository at https://github.com/mconverti/azuremediaservices-build2014/tree/master/VODDemos.
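
The core of that demo, sketched with the Extensions SDK (assuming a CloudMediaContext named context and a multi-bitrate MP4 asset variable named asset; the locator duration is illustrative):

// Minimal sketch: publish an asset for adaptive streaming and get the dynamic packaging URLs.
context.Locators.Create(
    LocatorType.OnDemandOrigin,
    asset,
    AccessPermissions.Read,
    TimeSpan.FromDays(30));

Uri smoothStreamingUri = asset.GetSmoothStreamingUri();
Uri hlsUri = asset.GetHlsUri();
Uri mpegDashUri = asset.GetMpegDashUri();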

 

Secure Delivery options (private preview feature)

After that, Mingfei explained the new options available in private preview for securing media delivery, serving both encrypted HLS and Smooth Streaming to client devices:

  • Advanced Encryption Standard (AES) 128 clear key dynamic encryption
  • Microsoft PlayReady Digital Rights Management (DRM) content access and protection technology

She also showed a demo on how to dynamically encrypt Smooth Streaming content with AES and play it back in a Windows Store application. For more details about the new Secure Delivery options, you can check out her blog post: Announcing PlayReady as a service and AES dynamic encryption with Azure Media Services.

New Secure Delivery Options

 

Live Streaming and Fast Sub-Clipping (private preview features)

Finally, we ended the session by showing the Live Streaming private preview feature. Mingfei briefly explained the new Live Streaming concepts and entities and how they work. Then, I ran a demo leading the audience through the following steps:

1. Manage Channels and Programs from the Azure Management Portal.

I showed the new Channels tab in the Azure Management Portal that lets you create, remove, update and list the channels and programs in the live-enabled account. The portal works on top of the Live Services REST API, so all the operations available there (and even more) can be easily automated/scheduled in a Live Streaming workflow.

2. Start a live stream.

To do this, I used encoding software running on my machine to encode the stream from my camera to Smooth Streaming format and send it to the Ingest URL of the channel. We needed an encoder because the Live Encoding feature is not yet ready, so we had to encode the original camera stream into multiple bitrates before sending it to the channel. To verify that the channel was receiving my input, I used the Preview URL player in the Azure Management Portal (the Preview URL is a tech check that lets us know whether the channel is receiving the output from the encoder). Everything worked as expected, so I started a “real” live stream through the Origin servers by clicking the Start Streaming button in the portal. This triggers the following operations behind the scenes (sketched in code after the list):

  1. Create a new asset
  2. Create a new program and associate it with the asset
  3. Start the program
  4. Create an origin locator for the program asset to enable adaptive streaming through the Origin servers
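
A minimal, hedged sketch of these operations with the .NET SDK (assuming a started IChannel named channel and a CloudMediaContext named context; the names, archive window length and locator duration are illustrative, and the live API surface may vary by SDK version):

// Minimal sketch of the operations triggered by the Start Streaming command.
IAsset programAsset = context.Assets.Create("MyLiveArchive", AssetCreationOptions.None);

IProgram program = channel.Programs.Create(
    "MyProgram",
    TimeSpan.FromHours(4),      // archive (DVR) window length
    programAsset.Id);

program.Start();

context.Locators.Create(
    LocatorType.OnDemandOrigin,
    programAsset,
    AccessPermissions.Read,
    TimeSpan.FromDays(30));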

3. Dynamic Packaging with Live Streams

To demonstrate that dynamic packaging also works with live stream assets, I used a player website that leverages different Display Modes to choose the appropriate dynamic packaging URL for each device (very similar to the one I used for the VOD demo).

4. Fast Sub-Clipping

I showed the new fast sub-clipping feature that allows you to create a highlight from an asset while it is still a live stream. To do this, I prepared a Microsoft Media Platform Video Editor (formerly RCE) deployment with custom providers to get assets from my live-enabled Media Services account. I used this tool to easily choose the begin and end positions and submit the sub-clip generation to Media Services (this only took a few seconds). You can go back in the live stream playback to choose the sub-clip positions thanks to the Program DVR Window, which is the amount of viewable content available before the live position.

5. Live to VOD transition

Finally, I stopped the program from the Azure Management Portal to show the seamless transition between Live and VOD; the full live archive is immediately available as VOD using the same dynamic packaging URLs after the live event ends. This also applies to the sub-clips created while the asset was still a live stream.

For more information about the new Media Services features, you can check out the NAB 2014 Announcements from Media Services blog post by John Deutscher (Principal Program Manager Lead for Azure Media Services).

Enjoy!

WindowsAzure.MediaServices.Extensions NuGet package 2.0.0 released!

Last week, the Windows Azure Media Services (WAMS) team published a new release of the WindowsAzure.MediaServices.Extensions NuGet package (2.0.0). Starting with this new version, the WAMS team will be taking ownership of the NuGet package and the Extensions source code to make sure that they are aligned and consistent with the Windows Azure Media Services .NET SDK. The Extensions source code was moved to a new GitHub repository under the WindowsAzure organization: https://github.com/WindowsAzure/azure-sdk-for-media-services-extensions (the previous GitHub repository is now deprecated).

The following are some of the highlights for this new NuGet package release:

  • It targets the WindowsAzure.MediaServices NuGet package 3.0.0 (or higher), which is the latest version of the Windows Azure Media Services .NET SDK.
  • It does not add source code files to your project anymore. Now, the extensions are shipped in the Microsoft.WindowsAzure.MediaServices.Client.Extensions assembly that is added as a reference to your project.
  • It contains several breaking changes compared with the previous extensions (1.0.6). Most of them are related to reorganizing the extension methods (to avoid defining all of them on the CloudMediaContext class) and to renaming.
  • It contains new extensions for creating assets by automatically selecting one of the Storage accounts attached to the Media Services account based on different strategies.

Breaking Changes

The following are the breaking changes with respect to the previous NuGet package version (1.0.6). For more information, you can check the README documentation.

Create Asset from single local file

// Original extension (there are additional overloads)

public static IAsset CreateAssetFromFile(

  this CloudMediaContext context, string filePath, AssetCreationOptions options)

 

// New extension (there are additional overloads)

public static IAsset CreateFromFile(

  this AssetBaseCollection assets, string filePath, AssetCreationOptions options)

Create Asset from local folder

// Original extension (there are additional overloads)

public static IAsset CreateAssetFromFolder(

  this CloudMediaContext context, string folderPath, AssetCreationOptions options)

 

// New extension (there are additional overloads)

public static IAsset CreateFromFolder(

  this AssetBaseCollection assets, string folderPath, AssetCreationOptions options)

Generate Asset Files from Blob storage

// Original extension

public static void CreateAssetFiles(this CloudMediaContext context, IAsset asset)

 

// New extension

public static void GenerateFromStorage(this IAsset asset)

Download Asset Files to local folder

// Original extension (there are additional overloads)

public static void DownloadAssetFilesToFolder(this CloudMediaContext context, IAsset asset, string folderPath)

 

// New extension (there are additional overloads)

public static void DownloadToFolder(this IAsset asset, string folderPath)

Create Locator

// Original extension (there are additional overloads)

public static ILocator CreateLocator(

  this CloudMediaContext context, IAsset asset, LocatorType locatorType, AccessPermissions permissions, TimeSpan duration)

 

// New extension (there are additional overloads)

public static ILocator Create(

  this LocatorBaseCollection locators, LocatorType locatorType, IAsset asset, AccessPermissions permissions, TimeSpan duration)

Create a Job with a single Task

// Original extension (there is an additional overload)

public static IJob PrepareJobWithSingleTask(

  this MediaContextBase context, string mediaProcessorName, string taskConfiguration, IAsset inputAsset, string outputAssetName, AssetCreationOptions outputAssetOptions)

 

// New extension (there is an additional overload)

public static IJob CreateWithSingleTask(

  this JobBaseCollection jobs, string mediaProcessorName, string taskConfiguration, IAsset inputAsset, string outputAssetName, AssetCreationOptions outputAssetOptions)

Start Job execution progress task

// Original extension (there is an additional overload)

public static Task<IJob> StartExecutionProgressTask(

  this MediaContextBase context, IJob job, Action<IJob> executionProgressChangedCallback, CancellationToken cancellationToken)

 

// New extension (there is an additional overload)

public static Task<IJob> StartExecutionProgressTask(

  this IJob job, Action<IJob> executionProgressChangedCallback, CancellationToken cancellationToken)

Removed extensions

// These extensions were removed because they were not related to Media Services.

public static void Save(this Uri uri, string filePath)

public static void Save(this string url, string filePath)

Sample Workflow using the Extensions

The following sample code shows a basic media workflow that uses the new Windows Azure Media Services .NET SDK Extensions NuGet package.

try

{

    MediaServicesCredentials credentials = new MediaServicesCredentials("%accountName%", "%accountKey%");

    CloudMediaContext context = new CloudMediaContext(credentials);

 

    Console.WriteLine("Creating new asset from local file...");

 

    // 1. Create a new asset by uploading a mezzanine file from a local path.

    IAsset inputAsset = context.Assets.CreateFromFile(

        "smallwmv1.wmv",

        AssetCreationOptions.None,

        (af, p) =>

        {

            Console.WriteLine("Uploading '{0}' - Progress: {1:0.##}%", af.Name, p.Progress);

        });

 

    Console.WriteLine("Asset created.");

 

    // 2. Prepare a job with a single task to transcode the previous mezzanine asset

    //    into a multi-bitrate asset.

    IJob job = context.Jobs.CreateWithSingleTask(

        MediaProcessorNames.WindowsAzureMediaEncoder,

        MediaEncoderTaskPresetStrings.H264AdaptiveBitrateMP4Set720p,

        inputAsset,

        "Sample Adaptive Bitrate MP4",

        AssetCreationOptions.None);

 

    Console.WriteLine("Submitting transcoding job...");

 

    // 3. Submit the job and wait until it is completed.

    job.Submit();

    job = job.StartExecutionProgressTask(

        j =>

        {

            Console.WriteLine("Job state: {0}", j.State);

            Console.WriteLine("Job progress: {0:0.##}%", j.GetOverallProgress());

        },

        CancellationToken.None).Result;

 

    Console.WriteLine("Transcoding job finished.");

 

    IAsset outputAsset = job.OutputMediaAssets[0];

 

    Console.WriteLine("Publishing output asset...");

 

    // 4. Publish the output asset by creating an Origin locator for adaptive streaming, 

    // and a SAS locator for progressive download.

    context.Locators.Create(

        LocatorType.OnDemandOrigin,

        outputAsset,

        AccessPermissions.Read,

        TimeSpan.FromDays(30));

    context.Locators.Create(

        LocatorType.Sas,

        outputAsset,

        AccessPermissions.Read,

        TimeSpan.FromDays(30));

 

    IEnumerable<IAssetFile> mp4AssetFiles = outputAsset

            .AssetFiles

            .ToList()

            .Where(af => af.Name.EndsWith(".mp4", StringComparison.OrdinalIgnoreCase));

 

    // 5. Generate the Smooth Streaming, HLS and MPEG-DASH URLs for adaptive streaming, 

    // and the Progressive Download URL.

    Uri smoothStreamingUri = outputAsset.GetSmoothStreamingUri();

    Uri hlsUri = outputAsset.GetHlsUri();

    Uri mpegDashUri = outputAsset.GetMpegDashUri();

    List<Uri> mp4ProgressiveDownloadUris = mp4AssetFiles.Select(af => af.GetSasUri()).ToList();

 

    // 6. Get the asset URLs.

    Console.WriteLine(smoothStreamingUri);

    Console.WriteLine(hlsUri);

    Console.WriteLine(mpegDashUri);

    mp4ProgressiveDownloadUris.ForEach(uri => Console.WriteLine(uri));

 

    Console.WriteLine("Output asset available for adaptive streaming and progressive download.");

 

    string outputFolder = "job-output";

    if (!Directory.Exists(outputFolder))

    {

        Directory.CreateDirectory(outputFolder);

    }

 

    Console.WriteLine("Downloading output asset files to local folder...");

 

    // 7. Download the output asset to a local folder.

    outputAsset.DownloadToFolder(

        outputFolder,

        (af, p) =>

        {

            Console.WriteLine("Downloading '{0}' - Progress: {1:0.##}%", af.Name, p.Progress);

        });

 

    Console.WriteLine("Output asset files available at '{0}'.", Path.GetFullPath(outputFolder));

 

    Console.WriteLine("VOD workflow finished.");

}

catch (Exception exception)

{

    // Parse the XML error message in the Media Services response and create a new 

    // exception with its content.

    exception = MediaServicesExceptionParser.Parse(exception);

 

    Console.Error.WriteLine(exception.Message);

}

finally

{

    Console.ReadLine();

}

As usual, if you want to report a bug, send feedback or propose new extensions, you can open a new issue in the azure-sdk-for-media-services-extensions GitHub repository.

Enjoy!

[Spanish] Webcast: Fácil Async para Windows Store Apps en Microsoft Visual C# y Microsoft Visual Basic

For those who don’t read Spanish, this blog post provides details about an upcoming Spanish-language webcast for the MSDN Latin American community.


This Thursday, December 5, Jonathan Cisneros and I will present a webcast reviewing the asynchronous programming model in the context of Windows Store apps, using the async and await keywords.

During the event we will explore the principles of asynchrony in a sample Windows Store app, explaining how to coordinate concurrent activities, how to run a CPU-intensive task in the background to free up the UI thread, and how to enable cancellation of asynchronous tasks.
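
To give a flavor of the content, here is a minimal, hedged sketch of the pattern we will walk through (the UI element and method names are illustrative): CPU-intensive work moved off the UI thread with Task.Run, awaited with async/await, and made cancelable with a CancellationToken.

// Minimal sketch: offload CPU-bound work from the UI thread and support cancellation.
private CancellationTokenSource cts = new CancellationTokenSource();

private async Task ProcessDataAsync()
{
    try
    {
        // Run the CPU-intensive work on a thread-pool thread so the UI stays responsive.
        int result = await Task.Run(() => CountItems(cts.Token), cts.Token);
        ResultText.Text = result.ToString();   // back on the UI thread after the await (ResultText is a hypothetical TextBlock)
    }
    catch (OperationCanceledException)
    {
        ResultText.Text = "Canceled";
    }
}

private int CountItems(CancellationToken token)
{
    int count = 0;
    for (int i = 0; i < 1000000; i++)
    {
        token.ThrowIfCancellationRequested();   // cooperative cancellation
        count++;
    }
    return count;
}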

The event details are as follows:

If you have any questions about the event, don’t hesitate to contact us (@cisne and @mconverti). We hope to see you there!

Smooth Streaming Manifest Generator library and MMP Video Editor (formerly RCE) 2.1 released!

Last month, two new Media-related releases were published in the MSDN code samples gallery:

Smooth Streaming Manifest Generator release

The Smooth Streaming Manifest Generator is a lightweight Portable Class Library (PCL) that allows you to easily parse and generate Smooth Streaming manifests, such as ISM (server manifest), ISMC (client manifest) and CSM (composite stream manifest). This is really useful for building workflows to generate sub-clips out of Smooth Streaming media sources. The following sample code shows how easy it is to generate a CSM with a single clip using this library.

// Manifest URL of the Smooth Streaming video used as the source.

Uri manifestUri = new Uri("http://mediadl.microsoft.com/mediadl/iisnet/smoothmedia/Experience/BigBuckBunny_720p.ism/Manifest");

 

// Download the Smooth Streaming client manifest (ISMC) and parse it.

DownloaderManager downloaderManager = new DownloaderManager();

var manifestStream = downloaderManager.DownloadManifest(manifestUri, true);

SmoothStreamingManifestParser parser = new SmoothStreamingManifestParser(manifestStream);

 

// Create a Smooth Streaming Manifest writer, setting the onlyAudioAndVideoTracks parameter to true.

SmoothStreamingManifestWriter writer = new SmoothStreamingManifestWriter(true);

 

// Set the begin and end positions of your clip.

ulong clipBeginPosition =  20000000;

ulong clipEndPosition   = 220000000;

 

// Create a Composite Stream Manifest and add a clip from the original Smooth Streaming source.

CompositeManifestInfo compositeManifestInfo = new CompositeManifestInfo(parser.ManifestInfo.MajorVersion, parser.ManifestInfo.MinorVersion);

compositeManifestInfo.AddClip(manifestUri, clipBeginPosition, clipEndPosition, null, parser.ManifestInfo);

 

// Write the Composite Stream Manifest to a string variable.

string compositeStreamManifest = writer.GenerateCompositeManifest(compositeManifestInfo, false, false);

For more information related to Smooth Streaming manifests, refer to the following articles:

Microsoft Media Platform Video Editor 2.1 release

The Silverlight Microsoft Media Platform (MMP) Video Editor (formerly known as Rough Cut Editor – RCE) simplifies the editing and publishing process, enabling real-time, browser-based video editing, and providing the ability for publishers to improve collaboration, manage dynamic metadata and deliver exciting content to the Web.

The MMP Video Editor 2.1 release includes the following new features:

  • Upgraded Visual Studio 2012 solution
  • Support for fade in and fade out transitions
  • Support for loading assets in XML format from a JavaScript bridge, so you can query your assets directly from a CMS web page and send them back to the MMP Video Editor
  • New mode to snap mark in and mark out positions to the nearest fragment boundary

If you want to download legacy releases of the MMP Video Editor, they are still available on the MSDN Archive site at: http://archive.msdn.microsoft.com/VideoEditor.

Microsoft Media Platform Video Editor

Enjoy!

New Windows Azure Media Services (WAMS) Asset Replicator release published on CodePlex

Last week, a new release of the Windows Azure Media Services (WAMS) Asset Replicator Tool was published on CodePlex. This September 2013 release includes the following changes:

  • Code upgraded to use the latest Windows Azure Storage Client library (v2.1.0.0)
  • Code upgraded to use the latest Windows Azure Media Services .NET SDK (v2.4.0.0)
  • C# projects upgraded to target .NET Framework 4.5
  • Cloud Service project upgraded to Windows Azure Tools v2.0
  • NuGet package dependencies updated to their latest versions
  • Support added to compare, replicate and verify FragBlob assets
  • New approach to auto-replicate and auto-ignore assets based on metadata in the IAsset.AlternateId property using JSON format

As you can see, the most important changes are the last two items, which I describe in more detail below.

FragBlob support

FragBlob is a new storage format that will be used in an upcoming Windows Azure Media Services feature that is not yet available. In this new format, each Smooth Streaming fragment is written to storage as a separate blob in the asset’s container instead of grouping them together into Smooth Streaming PIFF files (ISMVs/ISMAs). Therefore, the Replicator has been updated to identify, compare, copy and verify this new FragBlob asset type.

FragBlob asset container

Replicator metadata in IAsset.AlternateId property using JSON format

To decide whether or not an asset should be automatically replicated or ignored, the Replicator Tool needs to get some metadata from your assets. Currently the IAsset interface does not have a property to store custom metadata, so as a workaround the Replicator now uses the IAsset.AlternateId string property to store this metadata with a specific JSON format described below:

{

   "alternateId":"my-custom-alternate-id",

   "replicate":"No",

   "data":"optional custom metadata"

} 

The following are the expected fields in the JSON format:

  • alternateId: this is the actual Alternate Id value for the asset that is used to identify and track assets in both data centers.
  • replicate: this is a three-state flag that the replicator will use to determine whether or not it should take automatic action for the asset. The possible values are:
    • No: the asset will be automatically ignored
    • Auto: the asset will be automatically replicated
    • Manual: no automatic action will be taken for this asset

    Important: If the replicate field is not included in the IAsset.AlternateId (or if this property is not set at all – null value), the default value is No (asset automatically ignored).

  • data: this is an optional field that you can use to store additional custom metadata for the asset.

The Replicator uses some extension methods for the IAsset interface to easily retrieve and set these values without having to deal with the JSON format. These extensions can be found in the Replicator.Core\Extensions\AssetUtilities.cs source code file.

IAsset.AlternateId extensions for Replicator JSON format

Using these IAsset extension methods for the IAsset.AlternateId property, you can easily set the Replicator metadata in your media workflows as explained below:

How to set the IAsset.AlternateId metadata for automatic replication

Once your asset is ready (for instance, after ingestion or a transcoding job) and you have successfully created an Origin locator for it, you need to set the alternateId and replicate fields as follows:

// Set the alternateId to track the asset in both WAMS accounts.

string alternateId = "my-custom-id";

asset.SetAlternateId(alternateId);

 

// Set the replicate flag to 'Auto' for automatic replication.

asset.SetReplicateFlag(ReplicateFlag.Auto);

 

// Update the asset to save the changes in the IAsset.AlternateId property.

asset.Update(); 

By setting the replicate field to ‘Auto‘, the Replicator will un-ignore the asset and automatically start copying it to the other WAMS account. When the copy operation is complete, both assets will be marked as verified if everything measures up OK; otherwise, it will report the differences/errors and the user will have to take manual action from the Replicator Dashboard (like manually forcing the copy again).

How to set the IAsset.AlternateId metadata for manual replication

Once your asset is ready and you have successfully created an Origin locator for it, you need to set the alternateId and replicate fields as follows:

// Set the alternateId to track the asset in both WAMS accounts.

// Make sure to use the same alternateId for the asset in the other WAMS account that you want to compare.

string alternateId = "my-custom-id";

asset.SetAlternateId(alternateId);

 

// Set the replicate flag to 'Manual' for manual replication.

asset.SetReplicateFlag(ReplicateFlag.Manual);

 

// Update the asset to save the changes in the IAsset.AlternateId property.

asset.Update(); 

By setting the replicate field to ‘Manual‘, the Replicator will un-ignore the asset and check if there is an asset in the other WAMS account with the same alternateId field. If the Replicator finds one, it will compare both assets and mark them as verified if everything checks out OK; otherwise, it will report the differences and the user will have to take manual action from the Replicator Dashboard (like deleting one and forcing a copy of the other). This scenario is useful when comparing assets living in different WAMS accounts that were generated from the same source.

Enjoy!

More helpers added to WindowsAzure.MediaServices.Extensions NuGet package

This week I published a new version of the WindowsAzure.MediaServices.Extensions NuGet package (v1.0.5), which includes some great contributions from Martin Cabral, who has also been working extensively with Windows Azure Media Services here at Southworks. He put together all the available Media Processor names and Task Preset Strings for the Windows Azure Media Encoder as a series of string constants. This way, you don’t need to search the Web for these special string values when you want to submit a job. Instead, you can use IntelliSense right from your Visual Studio coding window to list them along with their descriptions (a short usage sketch follows the screenshots below).

Media Processor names available as string constants

Media Encoder Task Preset Strings available as string constants
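
As a quick illustration, this is a minimal sketch of how these constants can be used when submitting an encoding job with this version of the extensions (the context and inputAsset variables and the output asset name are placeholders):

// Minimal sketch: using the Media Processor and Task Preset string constants.
IJob job = context.PrepareJobWithSingleTask(
    MediaProcessorNames.WindowsAzureMediaEncoder,
    MediaEncoderTaskPresetStrings.H264AdaptiveBitrateMP4Set720p,
    inputAsset,
    "Adaptive Bitrate MP4 Output",
    AssetCreationOptions.None);

job.Submit();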

Additionally, I added a sample Console application showing how easy it is to write a VOD workflow leveraging the Extensions for Windows Azure Media Services .NET SDK NuGet package. You can find the SampleWorkflowUsingExtensions sample source code in the azure-sdk-for-media-services-extensions GitHub repository.

SampleWorkflowUsingExtensions sample

If you have any feedback, suggestions or new extensions/helpers to propose, please feel free to open an issue in the azure-sdk-for-media-services-extensions GitHub repository.

Enjoy!

[Spanish] Material del webcast: Creando aplicaciones Media con Windows Azure Media Services

For those who don’t read Spanish, this blog post provides details about a Spanish-language webcast that Ezequiel Jadib and I presented last week for the MSDN Latin American community.


Last week, Ezequiel Jadib and I presented a webcast on how to build media applications using Windows Azure Media Services. During the event we talked about the current landscape of the media industry and its challenges, then focused on how Windows Azure Media Services can help with those challenges, and finally showed a series of demos on how to implement a VOD workflow that takes advantage of dynamic packaging (also known as dynamic muxing) to reach multiple platforms very easily.

We want to thank everyone who attended the event, and remind you that you can contact us (@ejadib and @mconverti) if you have any questions about the topics we presented. For those who could not attend, below are the event details and the links to download the recording or watch it online:

The material we used for the presentation is also available for download from our GitHub repository:

Webcast materials: 'Creando aplicaciones Media con Windows Azure Media Services'

Finally, here are some links with additional resources:

Enjoy!