Tag Archives: Microsoft Azure

New Azure Media Services .NET SDK Extensions Release

Yesterday, the Azure Media Services team released a new version of the Azure Media Services .NET SDK Extensions NuGet package (v3.7.0.1) that contains fixes, updates, and new features. The previous package version (v3.5.2) is a broken build and has been deprecated/unlisted, so please make sure to update your projects to the new, fixed package version: v3.7.0.1.

In this post, I will share the change log by describing all the changes that were included in this release.

Change Log

Upgraded to latest Azure Media Services .NET SDK (v3.7.0.1).

Marked the following processors as obsolete in the MediaProcessorNames class: Windows Azure Media Encoder, Azure Media Encoder, Windows Azure Media Packager, and Windows Azure Media Encryptor.

public const string WindowsAzureMediaEncoder = "Windows Azure Media Encoder";

public const string AzureMediaEncoder = "Azure Media Encoder";

public const string WindowsAzureMediaPackager = "Windows Azure Media Packager";

public const string WindowsAzureMediaEncryptor = "Windows Azure Media Encryptor";

Marked as obsolete the task preset strings for the Azure Media Encoder processor in the MediaEncoderTaskPresetStrings class.

public static class MediaEncoderTaskPresetStrings

Added names for the new Media Analytics processors in the MediaProcessorNames class.

public const string AzureMediaFaceDetector = "Azure Media Face Detector";

public const string AzureMediaHyperlapse = "Azure Media Hyperlapse";

public const string AzureMediaIndexer = "Azure Media Indexer";

public const string AzureMediaIndexer2Preview = "Azure Media Indexer 2 Preview";

public const string AzureMediaMotionDetector = "Azure Media Motion Detector";

public const string AzureMediaOCR = "Azure Media OCR";

public const string AzureMediaStabilizer = "Azure Media Stabilizer";

public const string AzureMediaVideoThumbnails = "Azure Media Video Thumbnails";

Added task preset strings for the Media Encoder Standard processor in the MediaEncoderStandardTaskPresetStrings class.

public static class MediaEncoderStandardTaskPresetStrings
// H264 Multiple Bitrate Presets
public const string H264MultipleBitrate1080pAudio51 = "H264 Multiple Bitrate 1080p Audio 5.1";
public const string H264MultipleBitrate1080p = "H264 Multiple Bitrate 1080p";
public const string H264MultipleBitrate16x9foriOS = "H264 Multiple Bitrate 16x9 for iOS";
public const string H264MultipleBitrate16x9SDAudio51 = "H264 Multiple Bitrate 16x9 SD Audio 5.1";
public const string H264MultipleBitrate16x9SD = "H264 Multiple Bitrate 16x9 SD";
public const string H264MultipleBitrate4KAudio51 = "H264 Multiple Bitrate 4K Audio 5.1";
public const string H264MultipleBitrate4K = "H264 Multiple Bitrate 4K";
public const string H264MultipleBitrate4x3foriOS = "H264 Multiple Bitrate 4x3 for iOS";
public const string H264MultipleBitrate4x3SDAudio51 = "H264 Multiple Bitrate 4x3 SD Audio 5.1";
public const string H264MultipleBitrate4x3SD = "H264 Multiple Bitrate 4x3 SD";
public const string H264MultipleBitrate720pAudio51 = "H264 Multiple Bitrate 720p Audio 5.1";
public const string H264MultipleBitrate720p = "H264 Multiple Bitrate 720p";

// H264 Single Bitrate Presets
public const string H264SingleBitrate1080pAudio51 = "H264 Single Bitrate 1080p Audio 5.1";
public const string H264SingleBitrate1080p = "H264 Single Bitrate 1080p";
public const string H264SingleBitrate4KAudio51 = "H264 Single Bitrate 4K Audio 5.1";
public const string H264SingleBitrate4K = "H264 Single Bitrate 4K";
public const string H264SingleBitrate4x3SDAudio51 = "H264 Single Bitrate 4x3 SD Audio 5.1";
public const string H264SingleBitrate4x3SD = "H264 Single Bitrate 4x3 SD";
public const string H264SingleBitrate16x9SDAudio51 = "H264 Single Bitrate 16x9 SD Audio 5.1";
public const string H264SingleBitrate16x9SD = "H264 Single Bitrate 16x9 SD";
public const string H264SingleBitrate720pAudio51 = "H264 Single Bitrate 720p Audio 5.1";
public const string H264SingleBitrate720pforAndroid = "H264 Single Bitrate 720p for Android";
public const string H264SingleBitrate720p = "H264 Single Bitrate 720p";
public const string H264SingleBitrateHighQualitySDforAndroid = "H264 Single Bitrate High Quality SD for Android";
public const string H264SingleBitrateLowQualitySDforAndroid = "H264 Single Bitrate Low Quality SD for Android";

Added new CreateFromBlobAsync / CreateFromBlob extension methods for the AssetBaseCollection class to create a new asset by copying a source blob. These extensions work with a source blob belonging to any Storage account (not necessarily the one bound to the Media Services account, and even across different datacenters).

CloudMediaContext context = new CloudMediaContext("%accountName%", "%accountKey%");
StorageCredentials storageCredentials = new StorageCredentials("%storageAccountName%", "%storageAccountKey%");

// Get a reference to the source blob that will be copied into the new asset.
CloudBlockBlob sourceBlob = null; // replace with your source blob reference

// Create a new asset and copy the sourceBlob into it using a single extension method.
IAsset asset = context.Assets.CreateFromBlob(sourceBlob, storageCredentials, AssetCreationOptions.None);

Added new CopyAsync / Copy extension methods for the IAsset interface to copy all files from the source asset into the destination asset. These extensions work with regular assets, live archive assets (FragBlob format), and source/destination assets belonging to different Media Services accounts (even across different datacenters).

CloudMediaContext context = new CloudMediaContext("%accountName%", "%accountKey%");

// Get a reference to the source asset.
string sourceAssetId = "%sourceAssetId%";
IAsset sourceAsset = context.Assets.Where(a => a.Id == sourceAssetId).First();

// Create an empty destination asset where the source asset files are going to be copied.
IAsset destinationAsset = context.Assets.Create("Asset Copy", AssetCreationOptions.None);
StorageCredentials destinationStorageCredentials = new StorageCredentials("%storageAccountName%", "%storageAccountKey%");

// Copy the files in the 'sourceAsset' instance into the 'destinationAsset' instance.
sourceAsset.Copy(destinationAsset, destinationStorageCredentials);

Added a new CopyBlobHelpers static class with helper methods for copying blobs.

/// <summary>
/// Returns a <see cref="System.Threading.Tasks.Task"/> instance for the copy blobs operation from <paramref name="sourceContainer"/> to <paramref name="destinationContainer"/>.
/// </summary>
/// <param name="sourceContainer">The <see cref="Microsoft.WindowsAzure.Storage.Blob.CloudBlobContainer"/> instance that contains the blobs to be copied into <paramref name="destinationContainer"/>.</param>
/// <param name="destinationContainer">The <see cref="Microsoft.WindowsAzure.Storage.Blob.CloudBlobContainer"/> instance where the blobs from <paramref name="sourceContainer"/> will be copied.</param>
/// <param name="options">The <see cref="Microsoft.WindowsAzure.Storage.Blob.BlobRequestOptions"/>.</param>
/// <param name="cancellationToken">The <see cref="System.Threading.CancellationToken"/> instance used for cancellation.</param>
/// <returns>A <see cref="System.Threading.Tasks.Task"/> instance for the copy blobs operation from <paramref name="sourceContainer"/> to <paramref name="destinationContainer"/>.</returns>
public static async Task CopyBlobsAsync(CloudBlobContainer sourceContainer, CloudBlobContainer destinationContainer, BlobRequestOptions options, CancellationToken cancellationToken);

/// <summary>
/// Returns a <see cref="System.Threading.Tasks.Task"/> instance for the copy blob operation from <paramref name="sourceBlob"/> to <paramref name="destinationBlob"/>.
/// </summary>
/// <param name="sourceBlob">The <see cref="Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob"/> instance to be copied to <paramref name="destinationBlob"/>.</param>
/// <param name="destinationBlob">The <see cref="Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob"/> instance where <paramref name="sourceBlob"/> will be copied.</param>
/// <param name="options">The <see cref="Microsoft.WindowsAzure.Storage.Blob.BlobRequestOptions"/>.</param>
/// <param name="cancellationToken">The <see cref="System.Threading.CancellationToken"/> instance used for cancellation.</param>
/// <returns>A <see cref="System.Threading.Tasks.Task"/> instance for the copy blob operation from <paramref name="sourceBlob"/> to <paramref name="destinationBlob"/>.</returns>
public static async Task CopyBlobAsync(CloudBlockBlob sourceBlob, CloudBlockBlob destinationBlob, BlobRequestOptions options, CancellationToken cancellationToken);


As usual, feedback and contributions in the GitHub project are always welcome.



New Microsoft Azure Media Services SDKs for Java and PHP released with FairPlay Streaming support

Over the last few days, the Azure SDK team published new releases of the Azure SDK for Java and Azure SDK for PHP packages that contain updates and new Content Protection features for Microsoft Azure Media Services. In particular, both SDKs now support Apple FairPlay Streaming (FPS) DRM dynamic encryption configuration and include improvements to the Widevine DRM dynamic encryption configuration.

To take advantage of these features in your Java Maven projects, you need to use the latest Azure Media Services Java SDK (v0.9.4) by adding the azure-media dependency to your pom.xml file.
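The original XML snippet was lost in this page's formatting; assuming the standard Maven coordinates for the azure-media artifact (inferred from the artifact name and version mentioned above, not taken from the original snippet), the dependency would look like this:

```xml
<dependency>
  <groupId>com.microsoft.azure</groupId>
  <artifactId>azure-media</artifactId>
  <version>0.9.4</version>
</dependency>
```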



For PHP Composer projects, you need to use the latest Azure Media Services PHP SDK (v0.4.4) by adding the following microsoft/windowsazure dependency to your composer.json file; make sure to call require_once('vendor/autoload.php'); in your PHP files.

{
    "name": "my/sample",
    "license": "Apache-2.0",
    "require": {
        "microsoft/windowsazure": "^0.4"
    }
}


FairPlay Streaming DRM support

With this new release, you can now use the Azure Media Services REST API operations and entities to configure DRM dynamic encryption with Apple FairPlay Streaming (FPS) in both Java and PHP. Below you can find a sample VOD workflow that shows how to enable FairPlay Streaming:

For more details about Apple FairPlay Streaming (FPS) support in Azure Media Services, you can read the official general availability announcement by @mingfeiy: https://azure.microsoft.com/blog/apple-fairplay-streaming-for-azure-media-services-generally-available/.


Widevine DRM updates

When you create a Widevine Asset Delivery Policy, you have to specify the license acquisition URL for the common encryption Content Key assigned to the Asset. This value can be obtained by calling the GetKeyDeliveryUrl REST API operation on the Content Key instance. The URL returned by this call, however, contains the Content Key ID as a query string parameter: ?KID=<Guid>. This means that if you use this value “as is” in the configuration, the resulting Widevine Asset Delivery Policy will only be valid for that single Content Key instance.

With the new release, you can now reuse the same Widevine Asset Delivery Policy among multiple Assets. To do this, you just need to remove the query string from the Content Key license acquisition URL and then set it in the Asset Delivery Policy configuration using the new “Widevine Base License Acquisition Url” option. Below you can find some sample code snippets showing how to do this in both Java and PHP.

PHP sample code snippet

// $restProxy: Azure Media Services client context
// $contentKey: a common encryption content key

$widevineUrl = $restProxy->getKeyDeliveryUrl($contentKey, ContentKeyDeliveryType::WIDEVINE);

// Remove the ?KID=<Guid> query string
if (strpos($widevineUrl, '?') !== false) {
    $widevineUrl = substr($widevineUrl, 0, strrpos($widevineUrl, '?'));
}

// Generate the AssetDeliveryPolicy configuration
$config = [AssetDeliveryPolicyConfigurationKey::WIDEVINE_BASE_LICENSE_ACQUISITION_URL => $widevineUrl];
$configuration = AssetDeliveryPolicyConfigurationKey::stringifyAssetDeliveryPolicyConfiguartionKey($config);

// Create a reusable AssetDeliveryPolicy for Widevine
// (a complete policy also needs setAssetDeliveryProtocol(...) and setAssetDeliveryPolicyType(...))
$adpolicy = new AssetDeliveryPolicy();
$adpolicy->setName('Reusable Widevine Delivery Policy');
$adpolicy->setAssetDeliveryConfiguration($configuration);
$adpolicy = $restProxy->createAssetDeliveryPolicy($adpolicy);

Java sample code snippet

// mediaService: Azure Media Services client context
// contentKey: a common encryption content key

String widevineUrl = mediaService.create(ContentKey.getKeyDeliveryUrl(contentKey.getId(), ContentKeyDeliveryType.Widevine));

// Remove the ?KID=<Guid> query string
if (widevineUrl.contains("?")) {
    widevineUrl = widevineUrl.substring(0, widevineUrl.indexOf("?"));
}

// Generate the AssetDeliveryPolicy configuration
Map<AssetDeliveryPolicyConfigurationKey, String> configuration = new HashMap<AssetDeliveryPolicyConfigurationKey, String>();
configuration.put(AssetDeliveryPolicyConfigurationKey.WidevineBaseLicenseAcquisitionUrl, widevineUrl);

// Create a reusable AssetDeliveryPolicy for Widevine
// (a complete policy also sets the delivery protocol and policy type)
AssetDeliveryPolicyInfo assetDeliveryPolicy = mediaService.create(AssetDeliveryPolicy.create()
    .setName("Reusable Widevine Delivery Policy")
    .setAssetDeliveryConfiguration(configuration));



Microsoft Azure Media Services SDK for Java v0.9.1 released with support for Widevine dynamic encryption


Last Friday, the Azure SDK team published a new release of the Azure SDK for Java Maven packages; you can find the full list at http://search.maven.org/#search|ga|1|com.microsoft.azure. In particular, there was a minor new release (v0.9.1) of the Azure Media Services SDK for Java that adds support for Widevine (DRM) Dynamic Common Encryption and License Delivery Service; the change log is listed below.

If you want to use the Java SDK in your Maven project, you just need to add the “azure-media” dependency in your pom.xml file as follows:
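The XML snippet was lost in this page's formatting; assuming the same Maven coordinates as the other azure-media releases (an assumption based on the artifact name and version mentioned above), it would be:

```xml
<dependency>
  <groupId>com.microsoft.azure</groupId>
  <artifactId>azure-media</artifactId>
  <version>0.9.1</version>
</dependency>
```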



To demonstrate the new Java SDK features, I created the azure-media-dynamic-encryption-playready-widevine sample Java console application that contains a VOD end-to-end workflow using PlayReady and Widevine (DRM) Dynamic Common Encryption and the License Delivery Service for playback. It is based on the .NET sample explained in this documentation article: https://azure.microsoft.com/documentation/articles/media-services-protect-with-drm/.

You can access the full source code of this sample at: https://github.com/southworkscom/azure-sdk-for-media-services-java-samples/tree/master/azure-media-dynamic-encryption-playready-widevine.

Media Services SDK for Java sample projects in Eclipse


v0.9.1 Change Log



New Microsoft Azure Media Services SDK for PHP release available with new features and samples

Azure Media Services SDK for PHP

Last week the Azure SDK team published a new release of the Azure SDK for PHP package that contains updates and new features for Microsoft Azure Media Services. In particular, the Azure Media Services SDK for PHP now supports the latest Content Protection features (AES and DRM – both PlayReady and Widevine – dynamic encryption with and without Token restriction), and listing/scaling Encoding Units. This release also includes three new PHP samples that show how to use these new features; below you can find the full change log with all the details about these updates.

In this post, I'll focus on explaining how to use one of these new features: implementing a VOD workflow that applies PlayReady and Widevine (DRM systems) with Dynamic Common Encryption (CENC) using Token restriction for the license.

  1. Make sure you have PEAR and Composer properly installed and configured (php.ini) in your local development box.
  2. Add the following dependencies in the composer.json file in the root of your project.
    {
        "repositories": [
            {
                "type": "pear",
                "url": "http://pear.php.net",
                "vendor-alias": "pear-pear2.php.net"
            }
        ],
        "require": {
            "pear-pear.php.net/HTTP_Request2": "0.4.0",
            "pear-pear.php.net/mail_mime": "*",
            "pear-pear.php.net/mail_mimedecode": "*",
            "firebase/php-jwt": "^3.0",
            "microsoft/windowsazure": "dev-master"
        }
    }

  3. In your index.php main file, include the autoload.php file generated by Composer to load all the dependencies, and add use statements for the required namespaces.
    require_once 'vendor/autoload.php';

    use WindowsAzure\Common\ServicesBuilder;
    use WindowsAzure\Common\Internal\MediaServicesSettings;
    use WindowsAzure\Common\Internal\Utilities;
    use WindowsAzure\MediaServices\Models\Asset;
    use WindowsAzure\MediaServices\Models\AccessPolicy;
    use WindowsAzure\MediaServices\Models\Locator;
    use WindowsAzure\MediaServices\Models\Task;
    use WindowsAzure\MediaServices\Models\Job;
    use WindowsAzure\MediaServices\Models\TaskOptions;
    use WindowsAzure\MediaServices\Models\ContentKey;
    use WindowsAzure\MediaServices\Models\ProtectionKeyTypes;
    use WindowsAzure\MediaServices\Models\ContentKeyTypes;
    use WindowsAzure\MediaServices\Models\ContentKeyAuthorizationPolicy;
    use WindowsAzure\MediaServices\Models\ContentKeyAuthorizationPolicyOption;
    use WindowsAzure\MediaServices\Models\ContentKeyAuthorizationPolicyRestriction;
    use WindowsAzure\MediaServices\Models\ContentKeyDeliveryType;
    use WindowsAzure\MediaServices\Models\ContentKeyRestrictionType;
    use WindowsAzure\MediaServices\Models\AssetDeliveryPolicy;
    use WindowsAzure\MediaServices\Models\AssetDeliveryProtocol;
    use WindowsAzure\MediaServices\Models\AssetDeliveryPolicyType;
    use WindowsAzure\MediaServices\Models\AssetDeliveryPolicyConfigurationKey;
    use WindowsAzure\MediaServices\Templates\PlayReadyLicenseResponseTemplate;
    use WindowsAzure\MediaServices\Templates\PlayReadyLicenseTemplate;
    use WindowsAzure\MediaServices\Templates\PlayReadyLicenseType;
    use WindowsAzure\MediaServices\Templates\MediaServicesLicenseTemplateSerializer;
    use WindowsAzure\MediaServices\Templates\WidevineMessage;
    use WindowsAzure\MediaServices\Templates\AllowedTrackTypes;
    use WindowsAzure\MediaServices\Templates\ContentKeySpecs;
    use WindowsAzure\MediaServices\Templates\RequiredOutputProtection;
    use WindowsAzure\MediaServices\Templates\Hdcp;
    use WindowsAzure\MediaServices\Templates\TokenRestrictionTemplateSerializer;
    use WindowsAzure\MediaServices\Templates\TokenRestrictionTemplate;
    use WindowsAzure\MediaServices\Templates\SymmetricVerificationKey;
    use WindowsAzure\MediaServices\Templates\TokenClaim;
    use WindowsAzure\MediaServices\Templates\TokenType;
    use WindowsAzure\MediaServices\Templates\WidevineMessageSerializer;

  4. Create a rest proxy instance for the Azure Media Services REST API.
    // Replace the placeholders with your Media Services credentials
    $restProxy = ServicesBuilder::getInstance()->createMediaServicesService(new MediaServicesSettings("%account-name%", "%account-key%"));

  5. Create a new asset using your mezzanine source file.
    // Replace the placeholder with your mezzanine file name and path
    $sourceAsset = uploadFileAndCreateAsset($restProxy, "%source-mezzanine-file.mp4%");

    function uploadFileAndCreateAsset($restProxy, $mezzanineFileName) {
        // Create an empty "Asset" by specifying the name
        $asset = new Asset(Asset::OPTIONS_NONE);
        $asset->setName("Mezzanine " . $mezzanineFileName);
        $asset = $restProxy->createAsset($asset);
        $assetId = $asset->getId();

        print "Asset created: name=" . $asset->getName() . " id=" . $assetId . "\r\n";

        // Create an Access Policy with Write permissions
        $accessPolicy = new AccessPolicy('UploadAccessPolicy');
        $accessPolicy->setDurationInMinutes(30);
        $accessPolicy->setPermissions(AccessPolicy::PERMISSIONS_WRITE);
        $accessPolicy = $restProxy->createAccessPolicy($accessPolicy);

        // Create a SAS Locator for the Asset
        $sasLocator = new Locator($asset, $accessPolicy, Locator::TYPE_SAS);
        $sasLocator->setStartTime(new \DateTime('now -5 minutes'));
        $sasLocator = $restProxy->createLocator($sasLocator);

        // Get the mezzanine file content
        $fileContent = file_get_contents($mezzanineFileName);

        print "Uploading...\r\n";

        // Perform a multi-part upload using the Block Blobs REST API storage operations
        $restProxy->uploadAssetFile($sasLocator, $mezzanineFileName, $fileContent);

        // Generate the asset files metadata
        $restProxy->createFileInfos($asset);

        print "File uploaded: size=" . strlen($fileContent) . "\r\n";

        // Delete the SAS Locator (and Access Policy) for the Asset
        $restProxy->deleteLocator($sasLocator);
        $restProxy->deleteAccessPolicy($accessPolicy);

        return $asset;
    }

  6. Submit a transcoding job for the source asset to generate a multi-bitrate output asset suitable for adaptive streaming.
    $encodedAsset = encodeToAdaptiveBitrateMP4Set($restProxy, $sourceAsset);

    function encodeToAdaptiveBitrateMP4Set($restProxy, $asset) {
        // Retrieve the latest 'Media Encoder Standard' processor version
        $mediaProcessor = $restProxy->getLatestMediaProcessor('Media Encoder Standard');

        print "Using Media Processor: {$mediaProcessor->getName()} version {$mediaProcessor->getVersion()}\r\n";

        // Create the Job; this automatically schedules and runs it
        $outputAssetName = "Encoded " . $asset->getName();
        $outputAssetCreationOption = Asset::OPTIONS_NONE;
        $taskBody = '<?xml version="1.0" encoding="utf-8"?><taskBody><inputAsset>JobInputAsset(0)</inputAsset><outputAsset assetCreationOptions="' . $outputAssetCreationOption . '" assetName="' . $outputAssetName . '">JobOutputAsset(0)</outputAsset></taskBody>';

        $task = new Task($taskBody, $mediaProcessor->getId(), TaskOptions::NONE);
        $task->setConfiguration('H264 Multiple Bitrate 720p');

        $job = new Job();
        $job->setName('Encoding Job');

        $job = $restProxy->createJob($job, array($asset), array($task));

        print "Created Job with Id: {$job->getId()}\r\n";

        // Poll until the Job has completed
        $result = $restProxy->getJobStatus($job);

        $jobStatusMap = array('Queued', 'Scheduled', 'Processing', 'Finished', 'Error', 'Canceled', 'Canceling');

        while ($result != Job::STATE_FINISHED && $result != Job::STATE_ERROR && $result != Job::STATE_CANCELED) {
            print "Job status: {$jobStatusMap[$result]}\r\n";
            sleep(5);
            $result = $restProxy->getJobStatus($job);
        }

        if ($result != Job::STATE_FINISHED) {
            print "The job finished with a wrong status: {$jobStatusMap[$result]}\r\n";
            exit(-1);
        }

        print "Job Finished!\r\n";

        // Get the output asset
        $outputAssets = $restProxy->getJobOutputMediaAssets($job);
        $encodedAsset = $outputAssets[0];

        print "Asset encoded: name={$encodedAsset->getName()} id={$encodedAsset->getId()}\r\n";

        return $encodedAsset;
    }

  7. Create a new Common Encryption content key and link it to the multi-bitrate output asset.
    $contentKey = createCommonTypeContentKey($restProxy, $encodedAsset);

    function createCommonTypeContentKey($restProxy, $encodedAsset) {
        // Generate a new content key value
        $keyValue = Utilities::generateCryptoKey(16);

        // Get the protection key for the content key
        $protectionKeyId = $restProxy->getProtectionKeyId(ContentKeyTypes::COMMON_ENCRYPTION);
        $protectionKey = $restProxy->getProtectionKey($protectionKeyId);

        $contentKey = new ContentKey();
        $contentKey->setContentKey($keyValue, $protectionKey);
        $contentKey->setContentKeyType(ContentKeyTypes::COMMON_ENCRYPTION);
        $contentKey->setProtectionKeyId($protectionKeyId);
        $contentKey->setProtectionKeyType(ProtectionKeyTypes::X509_CERTIFICATE_THUMBPRINT);

        // Create the ContentKey
        $contentKey = $restProxy->createContentKey($contentKey);

        print "Content Key id={$contentKey->getId()}\r\n";

        // Associate the content key with the asset
        $restProxy->linkContentKeyToAsset($encodedAsset, $contentKey);

        return $contentKey;
    }

  8. Create a new content key authorization policy with PlayReady and Widevine options using Token restriction, and link it to the content key.
    // You can also use TokenType::SWT 
    $tokenTemplateString = addTokenRestrictedAuthorizationPolicy($restProxy, $contentKey, TokenType::JWT);

    function addTokenRestrictedAuthorizationPolicy($restProxy, $contentKey, $tokenType) {
        // Create content key authorization policy restriction (Token)
        $tokenRestriction = generateTokenRequirements($tokenType);
        $restriction = new ContentKeyAuthorizationPolicyRestriction();
        $restriction->setName('Content Key Authorization Policy Restriction');
        $restriction->setKeyRestrictionType(ContentKeyRestrictionType::TOKEN_RESTRICTED);
        $restriction->setRequirements($tokenRestriction);

        // Configure PlayReady and Widevine license templates
        $playReadyLicenseTemplate = configurePlayReadyLicenseTemplate();
        $widevineLicenseTemplate = configureWidevineLicenseTemplate();

        // Create content key authorization policy option (PlayReady)
        $playReadyOption = new ContentKeyAuthorizationPolicyOption();
        $playReadyOption->setName('PlayReady Authorization Policy Option');
        $playReadyOption->setKeyDeliveryType(ContentKeyDeliveryType::PLAYREADY_LICENSE);
        $playReadyOption->setRestrictions(array($restriction));
        $playReadyOption->setKeyDeliveryConfiguration($playReadyLicenseTemplate);
        $playReadyOption = $restProxy->createContentKeyAuthorizationPolicyOption($playReadyOption);

        // Create content key authorization policy option (Widevine)
        $widevineOption = new ContentKeyAuthorizationPolicyOption();
        $widevineOption->setName('Widevine Authorization Policy Option');
        $widevineOption->setKeyDeliveryType(ContentKeyDeliveryType::WIDEVINE);
        $widevineOption->setRestrictions(array($restriction));
        $widevineOption->setKeyDeliveryConfiguration($widevineLicenseTemplate);
        $widevineOption = $restProxy->createContentKeyAuthorizationPolicyOption($widevineOption);

        // Create content key authorization policy
        $ckapolicy = new ContentKeyAuthorizationPolicy();
        $ckapolicy->setName('Content Key Authorization Policy');
        $ckapolicy = $restProxy->createContentKeyAuthorizationPolicy($ckapolicy);

        // Link the PlayReady and Widevine options to the content key authorization policy
        $restProxy->linkOptionToContentKeyAuthorizationPolicy($playReadyOption, $ckapolicy);
        $restProxy->linkOptionToContentKeyAuthorizationPolicy($widevineOption, $ckapolicy);

        // Associate the authorization policy with the content key
        $contentKey->setAuthorizationPolicyId($ckapolicy->getId());
        $restProxy->updateContentKey($contentKey);

        print "Added Content Key Authorization Policy: name={$ckapolicy->getName()} id={$ckapolicy->getId()}\r\n";
        return $tokenRestriction;
    }

    function generateTokenRequirements($tokenType) {
        $template = new TokenRestrictionTemplate($tokenType);

        $template->setPrimaryVerificationKey(new SymmetricVerificationKey());
        // Sample audience/issuer values; replace with your own
        $template->setAudience('urn:contoso');
        $template->setIssuer('https://sts.contoso.com');
        $claims = array();
        $claims[] = new TokenClaim(TokenClaim::CONTENT_KEY_ID_CLAIM_TYPE);
        $template->setRequiredClaims($claims);

        return TokenRestrictionTemplateSerializer::serialize($template);
    }

    function configurePlayReadyLicenseTemplate() {
        $responseTemplate = new PlayReadyLicenseResponseTemplate();

        $licenseTemplate = new PlayReadyLicenseTemplate();
        $licenseTemplate->setLicenseType(PlayReadyLicenseType::NONPERSISTENT);
        $licenseTemplate->setAllowTestDevices(true);
        $responseTemplate->setLicenseTemplates(array($licenseTemplate));

        return MediaServicesLicenseTemplateSerializer::serialize($responseTemplate);
    }

    function configureWidevineLicenseTemplate() {
        $template = new WidevineMessage();
        $template->allowed_track_types = AllowedTrackTypes::SD_HD;

        $contentKeySpecs = new ContentKeySpecs();
        $contentKeySpecs->required_output_protection = new RequiredOutputProtection();
        $contentKeySpecs->required_output_protection->hdcp = Hdcp::HDCP_NONE;
        $contentKeySpecs->security_level = 1;
        $contentKeySpecs->track_type = "SD";
        $template->content_key_specs = array($contentKeySpecs);

        $policyOverrides = new \stdClass();
        $policyOverrides->can_play = true;
        $policyOverrides->can_persist = true;
        $policyOverrides->can_renew = false;
        $template->policy_overrides = $policyOverrides;

        return WidevineMessageSerializer::serialize($template);
    }

  9. Create a new asset delivery policy for PlayReady and Widevine dynamic common encryption for the MPEG-DASH streaming protocol, and link it to the multi-bitrate output asset.
    createAssetDeliveryPolicy($restProxy, $encodedAsset, $contentKey);

    function createAssetDeliveryPolicy($restProxy, $encodedAsset, $contentKey) {
        // Get the license acquisition URLs
        $playReadyUrl = $restProxy->getKeyDeliveryUrl($contentKey, ContentKeyDeliveryType::PLAYREADY_LICENSE);
        $widevineUrl = $restProxy->getKeyDeliveryUrl($contentKey, ContentKeyDeliveryType::WIDEVINE);

        // Generate the asset delivery policy configuration
        $configuration = [AssetDeliveryPolicyConfigurationKey::PLAYREADY_LICENSE_ACQUISITION_URL => $playReadyUrl,
                          AssetDeliveryPolicyConfigurationKey::WIDEVINE_LICENSE_ACQUISITION_URL => $widevineUrl];
        $confJson = AssetDeliveryPolicyConfigurationKey::stringifyAssetDeliveryPolicyConfiguartionKey($configuration);

        // Create the asset delivery policy for MPEG-DASH with Dynamic Common Encryption
        $adpolicy = new AssetDeliveryPolicy();
        $adpolicy->setName('Asset Delivery Policy');
        $adpolicy->setAssetDeliveryProtocol(AssetDeliveryProtocol::DASH);
        $adpolicy->setAssetDeliveryPolicyType(AssetDeliveryPolicyType::DYNAMIC_COMMON_ENCRYPTION);
        $adpolicy->setAssetDeliveryConfiguration($confJson);

        $adpolicy = $restProxy->createAssetDeliveryPolicy($adpolicy);

        // Link the delivery policy to the asset
        $restProxy->linkDeliveryPolicyToAsset($encodedAsset, $adpolicy->getId());

        print "Added Asset Delivery Policy: name={$adpolicy->getName()} id={$adpolicy->getId()}\r\n";
    }

  10. Publish the multi-bitrate output asset with an origin locator to generate the base streaming URL.
    publishEncodedAsset($restProxy, $encodedAsset);

    function publishEncodedAsset($restProxy, $encodedAsset) {
        // Get the .ism asset file
        $files = $restProxy->getAssetAssetFileList($encodedAsset);
        $manifestFile = null;

        foreach ($files as $file) {
            if (endsWith(strtolower($file->getName()), '.ism')) {
                $manifestFile = $file;
            }
        }

        if ($manifestFile == null) {
            print "Unable to find the manifest file\r\n";
            exit(-1);
        }

        // Create a 30-day read-only access policy
        $access = new AccessPolicy("Streaming Access Policy");
        $access->setDurationInMinutes(60 * 24 * 30);
        $access->setPermissions(AccessPolicy::PERMISSIONS_READ);
        $access = $restProxy->createAccessPolicy($access);

        // Create an origin locator for the asset
        $locator = new Locator($encodedAsset, $access, Locator::TYPE_ON_DEMAND_ORIGIN);
        $locator->setName("Streaming Locator");
        $locator = $restProxy->createLocator($locator);

        // Create the base streaming URL for dynamic packaging
        $streamingUrl = $locator->getPath() . $manifestFile->getName() . "/manifest";

        print "Base Streaming URL: {$streamingUrl}\r\n";
    }

    function endsWith($haystack, $needle) {
        $length = strlen($needle);
        if ($length == 0) {
            return true;
        }

        return (substr($haystack, -$length) === $needle);
    }

  11. Generate a test Token to retrieve the PlayReady/Widevine license and enable playback in Azure Media Player.
    generateTestToken($tokenTemplateString, $contentKey);

    function generateTestToken($tokenTemplateString, $contentKey) {
        $template = TokenRestrictionTemplateSerializer::deserialize($tokenTemplateString);
        $contentKeyUUID = substr($contentKey->getId(), strlen("nb:kid:UUID:"));
        $expiration = strtotime("+12 hour");
        $token = TokenRestrictionTemplateSerializer::generateTestToken($template, null, $contentKeyUUID, $expiration);

        print "Token Type {$template->getTokenType()}\r\nBearer={$token}\r\n";
    }

  12. Run the code using the following PHP command, and make sure to copy the Base Streaming URL and Token values displayed in the console.
    php -d display_errors=1 index.php

  13. Try the Base Streaming URL and Token values in the Azure Media Player demo site: http://amsplayer.azurewebsites.net/. Make sure to use the Advanced Options form to set the Protection value to DRM (PlayReady and Widevine) and paste the token.


For more coding details about enabling PlayReady and Widevine dynamic common encryption, you can check the vodworkflow_drm_playready_widevine.php sample.


Change Log



Microsoft Azure Media Services SDK for Java v0.8.0 released and new samples available

Azure Media Services SDK for Java

This week the Azure SDK team published new releases of the Azure SDK for Java packages that contain updates and support for more Microsoft Azure Platform Services features. You can find the full list of packages at http://search.maven.org/#search|ga|1|g:"com.microsoft.azure".

In particular, there was a new release (v0.8.0) of the Azure Media Services SDK for Java that contains lots of new features, such as Content Protection (Dynamic Encryption) and support for more entities/operations (StreamingEndpoint, EncodingReservedUnitType, StorageAccount, etc.). Below you can find the full change log for this release.

Here at Southworks I’ve been working with Emanuel Vecchio on preparing some Java console sample applications showing how to use the new features recently added to the Azure Media Services SDK for Java. As a result, we created the azure-sdk-for-media-services-java-samples GitHub repository containing the following samples:

Media Services SDK for Java sample projects in Eclipse


v0.8.0 Change Log



Twitter Sentiment Analysis

> Note: If you are not familiar with machine learning you can start with this post which explains the basic concepts of Machine Learning and the Azure Machine Learning service.

The purpose of this post is to explain how to build an experiment for sentiment analysis using Azure Machine Learning and then publish it as a public API that can be consumed by any application that needs this feature for a particular business scenario (e.g., gathering users’ opinions about a product or brand). Since there is already a Text Analytics API for English in the Azure Marketplace, we decided to create one in Spanish. And to simplify things, we used the sample Twitter Sentiment analysis experiment available in the Azure Machine Learning Gallery.

Creating a custom dataset

This was our greatest challenge: creating a valid dataset with Spanish content. There is an existing dataset used in the sample experiment we are going to use as a basis for our work, which you can find here. This experiment is based on an original dataset of 1,600,000 tweets classified as negative or positive. The Azure ML Studio sample dataset contains only 10% of this data (160,000 records). In supervised learning, the more training data you have, the more accurate your trained model will be, which is why the first thing we want is a dataset with a considerable amount of data.

As this dataset is in English, the predictive model will learn to process English text. But since we want to create a service using the Spanish language, our data needs to be in Spanish.

To get the data in Spanish we could use Spanish tweets and manually classify them (which would take a long time) or use the original dataset translated to Spanish. In the latter option, the hard work of classifying the data is already done and we could use an automatic translation tool to do the work for us. Although automatic translation is not 100% accurate, the keywords will be there, so we’re going to go with this approach to make sure we have a good quantity of training data.

For this reason we created a very simple console application that uses the Bing Translate API to translate our dataset and return it in the correct format.
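The console application itself is straightforward. As a minimal sketch of the same idea in Python — with the actual translation call stubbed out, since the real app used the Bing Translate API and its client setup is omitted here — it boils down to reading the labeled TSV, translating the text column, and writing the rows back out in the same two-column format:

```python
import csv

def translate_to_spanish(text):
    """Placeholder for the real translation call (e.g. the Bing Translate API).
    Here it just returns the input so the script can run offline."""
    return text

def translate_dataset(input_path, output_path):
    """Read the labeled tweets, translate the text column, and write a new
    TSV with the same two-column format (sentiment_label, tweet_text)."""
    with open(input_path, newline="", encoding="utf-8") as src, \
         open(output_path, "w", newline="", encoding="utf-8") as dst:
        reader = csv.reader(src, delimiter="\t")
        writer = csv.writer(dst, delimiter="\t")
        for label, text in reader:
            writer.writerow([label, translate_to_spanish(text)])
```

Swapping `translate_to_spanish` for a real translation client is the only change needed to produce the Spanish dataset.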

Once we have the dataset ready, the next step is to upload it to Azure ML studio so it is available to use in the experiments.

To upload the recently created dataset, in the Azure ML portal click NEW, select DATASET, and then click FROM LOCAL FILE. In the dialog box that opens, select the file you want to upload, type a name, and select the dataset type (this is usually inferred automatically). In our case, it is a TAB separated values file (.tsv).

uploading a new dataset

The dataset contains only 2 columns: sentiment_label, which is 0 for a negative sentiment and 4 for a positive one, and tweet_text, which contains the text of the tweet.

Sample input data

Once the dataset is created, we will take advantage of the existing sample experiment of the Machine Learning Gallery, available here.

Open the experiment by clicking Open in Studio as shown below.

sample experiment

Then, you will be prompted to copy the experiment from the Gallery to your workspace.

copying from gallery

At this point let’s remove the Reader module from the experiment and add the custom dataset we created. Connect the dataset to the Execute R Script module.

Run the experiment.

Running the experiment

Pre-processing the data

This experiment uses several modules to pre-process the data before analyzing its content (like removing punctuation marks or special characters, or adjusting the data to fit the algorithm used). For more information about the data preprocessing, you can read the information available in the experiment page in the Gallery.

Scoring the model

After running the predictive experiment, let’s create the scoring model. To do this, point to SET UP WEB SERVICE and select Predictive Web Service [Recommended].

setting up the web service

Once the Predictive Experiment is created, we need to update this experiment to make it work as expected. First delete the Filter Based Feature Selection Module and reconnect the Feature Hashing module to the Score Model module.

Delete the connection between the Score Model module and the Web Service Output module by right-clicking it and clicking Delete.

deleting a connection

Between those two modules, add a Project Columns module, and then an Execute R Script module. Connect them in sequence and also with the Web Service Output module. The resulting experiment will resemble the following image.

resulting experiment

Now let’s configure the Project Columns module. Select it and in the Properties pane, click Launch column selector. In the dialog box that opens, in the row with the Include dropdown, go to the text field and add the four available columns (sentiment_label, tweet_text, Scored Labels, and Scored Probabilities).

projecting columns

Lastly, select the Execute R Script to configure it. Click inside the R Script text box and replace the existing script with the following:

# Map 1-based optional input ports to variables
dataset1 <- maml.mapInputPort(1) # class: data.frame

# Set thresholds for classification
threshold1 <- 0.60
threshold2 <- 0.45
positives <- which(dataset1["Scored Probabilities"] > threshold1)
negatives <- which(dataset1["Scored Probabilities"] < threshold2)
neutrals <- which(dataset1["Scored Probabilities"] <= threshold1 &
dataset1["Scored Probabilities"] >= threshold2)

new.labels <- matrix(nrow=nrow(dataset1), ncol=1)
new.labels[positives] <- "positive"
new.labels[negatives] <- "negative"
new.labels[neutrals] <- "neutral"

data.set <- data.frame(assigned=new.labels,
confidence=dataset1["Scored Probabilities"])
colnames(data.set) <- c('Sentiment', 'Score')

# Select the data.frame to be sent to the output Dataset port
maml.mapOutputPort("data.set")

This will return two columns as the output of the service: Sentiment and Score.

The Sentiment column will be returned as positive, neutral or negative, and the Score column will contain the scored probability. The classification is based on the defined thresholds and falls into the following 3 categories:

  • Less than 0.45: Negative
  • Between 0.45 and 0.60: Neutral
  • Above 0.60: Positive
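The same thresholding logic can be sketched in Python for clarity (the threshold values match the R script above):

```python
def classify(scored_probability, lower=0.45, upper=0.60):
    """Map a scored probability to a sentiment label using the same
    thresholds as the R script: <0.45 negative, >0.60 positive,
    anything in between (inclusive) neutral."""
    if scored_probability > upper:
        return "positive"
    if scored_probability < lower:
        return "negative"
    return "neutral"
```

For example, `classify(0.7)` yields `"positive"`, `classify(0.3)` yields `"negative"`, and any score between 0.45 and 0.60 yields `"neutral"`.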

Now that everything is set up, we can run the experiment.

Publishing and Testing the Web Service

Once the predictive experiment finishes, click Deploy Web Service. The deployed service screen will appear. Click Test.

published web service

In the Enter data to predict dialog box, enter some text in Spanish in the TWEET_TEXT parameter and click the check mark button.

entering data to predict

Wait for the web service to predict the results, which will be shown as an alert.

prediction results

We built the following test page, which consumes the generated API to test the service.

testing app
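For reference, a client like that test page calls the published Request/Response endpoint over plain HTTPS. The sketch below shows the general shape of such a call in Python; the URL and API key are placeholders you get from the service dashboard, and the payload shape follows the sample request that Azure ML Studio generates for a published web service (treat the exact column list as an assumption to verify against your own service's sample request):

```python
import json
import urllib.request

def build_request(tweet_text, sentiment_label="0"):
    """Build the JSON payload expected by an Azure ML Studio
    Request/Response endpoint for this experiment. The sentiment_label
    input is required by the schema but ignored at scoring time."""
    return {
        "Inputs": {
            "input1": {
                "ColumnNames": ["sentiment_label", "tweet_text"],
                "Values": [[sentiment_label, tweet_text]],
            }
        },
        "GlobalParameters": {},
    }

def score(tweet_text, url, api_key):
    """POST the payload to the scoring endpoint and return the parsed
    JSON response (this performs a network call)."""
    body = json.dumps(build_request(tweet_text)).encode("utf-8")
    request = urllib.request.Request(url, body, {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + api_key,
    })
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read().decode("utf-8"))
```

The response contains the Sentiment and Score columns produced by the Execute R Script module.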

Next Steps

We tested the resulting API with some sample text, and we are pleased with the outcome (the model learned how to classify Spanish texts quite well). Nevertheless, there are some ways to improve the model we have created, such as:

  • Trying other training algorithms and comparing their performance
  • Improving the input dataset, either by building a brand new dataset with manually classified content in Spanish or by using common keywords to obtain pre-classified results.

Given that this is a proof of concept, we consider this to be a successful experiment.

Ways to create a Docker VM in Microsoft Azure


In 2014 at Microsoft’s Redmond campus, Microsoft and Docker announced a strategic partnership to enable developers and organizations to create Docker containers with the same ecosystem of users and applications on both Windows Server and Linux. This partnership lets developers build exciting new business scenarios in which Microsoft’s server and cloud platform customers benefit from the work of the Docker open source community.

If you are not familiar with the non-Windows world, it might come as a surprise that Microsoft has been working with Linux for several years now, which is why Linux customers can now use a number of Microsoft products. Working with Linux is not new at Microsoft, but it is getting more attention. If you want to know more, read Microsoft loves Linux.

You have probably heard about Docker and how it works, but if not, please review the Intro to Docker blog post or read the Understanding Docker documentation. In this post we will show you the options for creating a Docker VM for Linux in Azure.

Ways to create the Docker VM in Azure

Microsoft Azure provides the Azure Docker Extension, built by both Microsoft and Docker, to enable features (security, runtime, debugging and more) that you can take advantage of to increase your productivity with Azure VMs. You can see the source code for the Azure Docker Extension on GitHub. When Azure creates a Docker VM, it first creates the Linux virtual machine and then installs the Docker extension. This means that you can always create a plain Linux VM and then install Docker as you would install it anywhere else (e.g., using apt-get).
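For example, on an Ubuntu VM you could skip the extension entirely and install Docker with the distribution's package manager (a sketch; the package name `docker.io` is the one used by Ubuntu 14.04, and may differ on other releases):

```shell
# Install Docker from the Ubuntu archive and verify it runs
sudo apt-get update
sudo apt-get install -y docker.io
sudo docker run hello-world
```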

Additionally, we can use Microsoft Azure to create a Docker VM that can host any number of containers for our applications in Azure. There are different options when it comes to creating a Docker VM. Below you will find a quick description of each option so you can select which would work best for you.

These are the options that we will describe in this post:

  1. Using the Azure Marketplace
  2. Adding the Docker VM Extension to a VM within the Azure Portal
  3. Using the Azure Command-line interface
  4. Using the Visual Studio 2015 Tools for Docker

Using the Azure Marketplace

This is the fastest way to create a VM using Docker on an Ubuntu Server. It creates an Ubuntu Server and automatically installs the Docker VM Extension. For further information see here.

To create a VM with Docker, log in to the Azure Portal Preview and click New | Marketplace and filter by Docker.

Creating a Docker VM in Azure

Adding the Docker VM Extension to a VM within the Azure Portal

Use the Azure Portal Preview to create a VM using the Ubuntu Server 14.04 LTS image from the Azure Image Gallery, and then configure the VM by adding the Docker VM Extension. For further information see here.

Using the Docker VM Extension

Using the Azure Command-line interface

Use the Azure Command-Line Interface (Azure CLI) to create the Docker VM. The benefit of this approach is that it works on any platform, because the Azure CLI is a cross-platform command-line tool. More details here.

Using the Azure CLI
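As a hedged sketch (the VM name, image name, and credentials below are placeholders you must replace with your own values), the cross-platform CLI can create the VM and install the Docker extension in a single command:

```shell
# Create an Ubuntu VM with the Docker extension pre-installed.
# <ubuntu-image-name> is an image from the Azure Image Gallery,
# e.g. an Ubuntu Server 14.04 LTS image listed by `azure vm image list`.
azure vm docker create my-docker-vm \
    <ubuntu-image-name> azureuser <password> \
    --location "East US" --vm-size Small
```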

Using the Visual Studio 2015 Tools for Docker

This approach uses the Visual Studio 2015 RC Tools for Docker – Preview tool that enables developers to build and publish an ASP.NET 5 Web or console application to a Docker container running on a Linux VM.

Using the Visual Studio Tools for Docker


Further Reading

Reusing Azure Media Services Locators to avoid facing the "5 Shared Access Policy" limitation

If you have developed VOD or Live workflows with Azure Media Services, you might have faced the following error when creating multiple Asset Locators: “Server does not support setting more than 5 shared access policy identifiers on a single container”.

<?xml version="1.0" encoding="utf-8"?>
<m:error xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata">
  <m:code />
  <m:message xml:lang="en-US">Server does not support setting more than 5 shared access policy identifiers on a single container.</m:message>
</m:error>

To understand the reason behind this error (and how to avoid it), first let me clarify the differences among Azure Storage Stored Access Policies, Media Services Access Policies and Media Services Locators:

  • Stored Access Policies. This is an Azure Storage Services REST API feature that provides an additional level of control over shared access signatures (SAS) on the server-side for containers, queues, or tables. This feature has the limitation that you can include up to 5 Stored Access Policies for each container, queue or table.
  • Access Policies. This is an Azure Media Services REST API entity that is just used as a “template” for creating Locators. There is no mapping between a Media Services Access Policy and a Storage Services Stored Access Policy and, therefore, there is no explicit limitation on the number of Access Policies you can create.
  • Locators. This is an Azure Media Services REST API entity that provides an entry point to access the files contained in an Asset container. An Access Policy is used to define the permissions and the duration for which a client has access to a given Asset. When a Locator is created, the Media Services REST API creates a Stored Access Policy in the container associated with the Asset. Therefore, the same Stored Access Policy limitation also applies to Locators: you can create up to 5 Locators for a given Asset.

As you can see, the error is thrown by an Azure Storage Services limitation on the number of Stored Access Policies for a container, and the same limitation is inherited by the number of Media Services Locators for an Asset.

There are different options to avoid getting this error (depending on your scenario):

  1. Delete the asset locators after you are done using them. For example, if you need to upload a new asset file, you have to create a SAS locator with Write permissions. Once the operation is complete, you no longer need the locator, so it is safe to delete it. Take into account that this approach does not apply to some scenarios: if you want to publish an asset for adaptive streaming (On-Demand Origin locator) or progressive download (SAS locator), the locator must persist; otherwise, deleting the locator will “unpublish” the asset.
  2. Reuse the locators that are available in the asset. Instead of creating a new locator every time, check if the asset already contains one that matches the type and access policy permissions you need. If you find one, make sure it is not expired (or near expiration) before using it; otherwise, create a new locator.
  3. Leverage the Azure Media Services Content Protection feature. If you are trying to get granular control over your content by creating a different Locator for each client, there is a better way now: you can dynamically encrypt your content with AES or PlayReady, set a token authorization policy for the content key or license, and make the token expire after a short period of time (long enough for the player to retrieve the content key or license and start the playback). This way, you will be using a single long-lived Locator for all your clients. For more details, you can check this blog post: Announcing public availability of Azure Media Services Content Protection Services.


In this post, I will focus on Option #2 and show you how to implement a helper extension method to let you reuse your Locators and also update the duration if it happens to be expired (or near expiration). Below, you can find a proposed implementation that takes care of this.

Note: This code uses the Windows Azure Media Services .NET SDK Extensions NuGet package.

namespace Microsoft.WindowsAzure.MediaServices.Client
{
    using System;
    using System.Linq;
    using System.Threading.Tasks;

    public static class LocatorCollectionExtensions
    {
        public static readonly TimeSpan DefaultExpirationTimeThreshold = TimeSpan.FromMinutes(5);

        public static async Task<ILocator> GetOrCreateAsync(this LocatorBaseCollection locators, LocatorType locatorType, IAsset asset, AccessPermissions permissions, TimeSpan duration, DateTime? startTime = null, TimeSpan? expirationThreshold = null)
        {
            MediaContextBase context = locators.MediaContext;

            ILocator assetLocator = context
                .Locators
                .Where(l => (l.AssetId == asset.Id) && (l.Type == locatorType))
                .OrderByDescending(l => l.ExpirationDateTime)
                .ToList()
                .Where(l => (l.AccessPolicy.Permissions & permissions) == permissions)
                .FirstOrDefault();

            if (assetLocator == null)
            {
                // If there is no locator in the asset matching the type and permissions, then a new locator is created.
                assetLocator = await context.Locators.CreateAsync(locatorType, asset, permissions, duration, startTime).ConfigureAwait(false);
            }
            else if (assetLocator.ExpirationDateTime <= DateTime.UtcNow.Add(expirationThreshold ?? DefaultExpirationTimeThreshold))
            {
                // If there is a locator in the asset matching the type and permissions but it is expired (or near expiration), then the locator is updated.
                await assetLocator.UpdateAsync(startTime, DateTime.UtcNow.Add(duration)).ConfigureAwait(false);
            }

            return assetLocator;
        }

        public static ILocator GetOrCreate(this LocatorBaseCollection locators, LocatorType locatorType, IAsset asset, AccessPermissions permissions, TimeSpan duration, DateTime? startTime = null, TimeSpan? expirationThreshold = null)
        {
            using (Task<ILocator> task = locators.GetOrCreateAsync(locatorType, asset, permissions, duration, startTime, expirationThreshold))
            {
                return task.Result;
            }
        }
    }
}

Every time you need a Locator for an Asset, you can use the GetOrCreate method as follows. Of course, if you call the GetOrCreate method multiple times using different parameter combinations (locator type and access policy permissions), you might end up facing the “5 shared access policy” limitation. That’s why it is also important to delete the locators that are not needed as explained in Option #1.

var myContext = new CloudMediaContext("%accountName%", "%accountKey%");
var myAsset = myContext.Assets.Where(a => a.Id == "%assetId%").First();
var myLocator = myContext.Locators.GetOrCreate(LocatorType.Sas, myAsset, AccessPermissions.Read, TimeSpan.FromDays(30));



[Spanish] Construyendo aplicaciones Media con Microsoft Azure Media Services @ Global Azure Bootcamp 2015 Buenos Aires, Argentina

For those who don’t read Spanish, this blog post provides details about a Spanish speaking session at the Global Azure Bootcamp 2015 Buenos Aires, Argentina.

Global Azure Bootcamp 2015 Buenos Aires, Argentina

Last Saturday, together with Mariano Vazquez, I presented Microsoft Azure Media Services at the Global Azure Bootcamp 2015 Buenos Aires, Argentina. The talk lasted approximately 90 minutes and covered the following topics:

  • Introduction to general media concepts, such as Progressive Download vs. Adaptive Streaming, available Adaptive Streaming protocols, transcoding vs. transmuxing, etc.
  • Microsoft Azure Media Services (PaaS) architecture
  • Main platform features:
    • Video-on-Demand (VOD)
    • Live Streaming
    • Dynamic Packaging
    • Dynamic Encryption (content protection)
  • Demos:
  • Recently announced new features:

We want to thank everyone who attended the event, and we would like to remind you that you can contact us (@marianodvazquez and @mconverti) if you have any questions about the topics presented. The material we used for the presentation is already available for download from our GitHub repository @ https://github.com/mconverti/bootcamp2015-aplicaciones-media-con-azure-media-services.

Construyendo aplicaciones Media con Microsoft Azure Media Services

Finally, here are some links with additional resources related to this talk:



Global Azure Bootcamp 2015

On Saturday, April 25, 2015 we are going to be part of the Global Azure Bootcamp 2015 at Microsoft Argentina offices!

Read More