Using Facebook’s QuickReplies with PromptDialogs

In my last post I showed how easy it is to send Facebook’s quick replies using Microsoft Bot Framework. Using what we’ve learned, I’m now going to show you how you can use quick replies with a PromptDialog.

 

Some of the PromptDialogs that come out of the box with the BotBuilder library (like PromptChoice or PromptConfirm) expect a set of options to be presented to the user (for example, the choices to pick from). By default, these dialogs use a PromptStyler that, based on the PromptStyle selected when creating the dialog, renders the options in different ways: as cards with buttons, with all the options inline, or with one option per line, to mention some of them.

The good news is that you can create your own PromptStyler and so change the way the options are rendered.

 

At this point, you might already have spotted what I will be doing as part of this post. The whole idea is to create a custom PromptStyler that will take the provided options and render them as quick replies.

 

The code
The full sample is on GitHub. To run the sample, publish your bot (for example, to Azure) or use ngrok to expose your local bot to the cloud.

The code is extremely simple and takes advantage of the models I put together in the previous post.

[Serializable]
public class FacebookQuickRepliesPromptStyler : PromptStyler
{
    public override void Apply<T>(ref IMessageActivity message, string prompt, IList<T> options)
    {
        if (message.ChannelId.Equals("facebook", StringComparison.InvariantCultureIgnoreCase) && this.PromptStyle == PromptStyle.Auto && options != null && options.Any())
        {
            var channelData = new FacebookChannelData();

            var quickReplies = new List<FacebookQuickReply>();

            foreach (var option in options)
            {
                var quickReply = option as FacebookQuickReply;

                if (quickReply == null)
                {
                    quickReply = new FacebookTextQuickReply(option.ToString(), option.ToString());
                }

                quickReplies.Add(quickReply);
            }

            channelData.QuickReplies = quickReplies.ToArray();

            message.Text = prompt;
            message.ChannelData = channelData;
        }
        else
        {
            base.Apply<T>(ref message, prompt, options);
        }
    }
}

 

Using it in your dialogs is a no-brainer. Just provide an instance of the new PromptStyler when defining the prompt options and that’s it.

var promptOptions = new PromptOptions<string>(
    "Please select your age range:",
    options: new[] { "20-35", "36-46", "47-57", "58-65", "65+" },
    promptStyler: new FacebookQuickRepliesPromptStyler());

PromptDialog.Choice(context, this.ResumeAfterSelection, promptOptions);
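
For completeness, here is a minimal sketch of the resume handler referenced above (it is not part of the original sample, so treat the names as illustrative):

private async Task ResumeAfterSelection(IDialogContext context, IAwaitable<string> result)
{
    // The choice prompt resumes with the selected option as a string.
    var selection = await result;

    await context.PostAsync($"You selected: {selection}");

    context.Wait(this.MessageReceivedAsync);
}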

 

One caveat that I’m not addressing in depth in this post is how to access the payload of the quick replies from the PromptDialog. In order to access the payload, you will likely have to extend the PromptDialog (PromptConfirm is sealed, but you can inherit from PromptChoice), override MessageReceivedAsync and/or TryParse, and include the logic related to extracting the payload. Please refer to my previous post for the logic on how to extract the payload.
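
To give you an idea of what that extension could look like, below is a rough, untested sketch (not part of the sample). It assumes the BotBuilder version you are using exposes PromptDialog.PromptChoice<T> with an overridable TryParse, and that the Facebook channel data arrives as JSON containing a message.quick_reply.payload property, as described in the previous post.

// Requires Newtonsoft.Json.Linq for JObject.
[Serializable]
public class FacebookQuickRepliesPromptChoice : PromptDialog.PromptChoice<string>
{
    public FacebookQuickRepliesPromptChoice(PromptOptions<string> promptOptions)
        : base(promptOptions)
    {
    }

    protected override bool TryParse(IMessageActivity message, out string result)
    {
        // ChannelData typically arrives as a JSON object (JObject) on incoming Facebook messages.
        var channelData = message.ChannelData as JObject;
        var payload = channelData?.SelectToken("message.quick_reply.payload");

        if (payload != null)
        {
            result = payload.ToString();
            return true;
        }

        // Fall back to the default choice parsing when no quick reply payload is present.
        return base.TryParse(message, out result);
    }
}

You would then hand an instance of this class to context.Call instead of using the PromptDialog.Choice helper.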

 

The outcome


Enjoy!

Sending Facebook’s Quick replies using Microsoft Bot Framework

Facebook’s quick replies are a great way to present buttons to the users. Per Facebook’s Quick Replies documentation:

Quick Replies provide a new way to present buttons to the user. Quick Replies appear prominently above the composer, with the keyboard less prominent. When a quick reply is tapped, the message is sent in the conversation with developer-defined metadata in the callback. Also, the buttons are dismissed preventing the issue where users could tap on buttons attached to old messages in a conversation.

Taking advantage of this feature when using Microsoft Bot Framework is really simple, even though this functionality is very specific to Facebook.

If you want to use special features or concepts for a channel, the Bot Framework provides a way to send native metadata to that channel, giving you much deeper control over how your bot interacts on it. The way Bot Framework enables this is through the ChannelData property in the C# SDK and the sourceEvent property in Node.js.

Not every capability provided by a channel must go through the ChannelData property; it basically depends on whether the functionality can be consistent across channels. When that’s the case, it’s very likely the functionality will be addressed in the core API, as with rich card attachments.
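
As a point of comparison, here is a minimal sketch (not part of this post’s sample) of a rich card built through the core API; since rich cards are consistent across channels, no channel-specific metadata is needed:

// Requires the Microsoft.Bot.Connector namespace for HeroCard, CardAction and Attachment.
var heroCard = new HeroCard
{
    Title = "Pick a color",
    Buttons = new List<CardAction>
    {
        new CardAction(ActionTypes.ImBack, "Blue", value: "Blue"),
        new CardAction(ActionTypes.ImBack, "Green", value: "Green"),
        new CardAction(ActionTypes.ImBack, "Red", value: "Red")
    }
};

reply.Attachments = new List<Attachment> { heroCard.ToAttachment() };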

 

The code

The full sample is on GitHub. To run the sample, publish your bot (for example, to Azure) or use ngrok to expose your local bot to the cloud.

The quick replies schema is pretty straightforward. Below is the portion of the code that creates some text quick replies (two of them with images) and assigns them to the activity’s ChannelData.

if (reply.ChannelId.Equals("facebook", StringComparison.InvariantCultureIgnoreCase))
{
    var channelData = JObject.FromObject(new
    {
        quick_replies = new dynamic[]
        {
            new
            {
                content_type = "text",
                title = "Blue",
                payload = "DEFINED_PAYLOAD_FOR_PICKING_BLUE",
                image_url = "https://cdn3.iconfinder.com/data/icons/developperss/PNG/Blue%20Ball.png"
            },
            new
            {
                content_type = "text",
                title = "Green",
                payload = "DEFINED_PAYLOAD_FOR_PICKING_GREEN",
                image_url = "https://cdn3.iconfinder.com/data/icons/developperss/PNG/Green%20Ball.png"
            },
            new
            {
                content_type = "text",
                title = "Red",
                payload = "DEFINED_PAYLOAD_FOR_PICKING_RED",
            }
        }
    });

    reply.ChannelData = channelData;
}

Note that in the code I’m checking for the channel so quick replies are only sent in outgoing messages to a Facebook channel.


ProTip: If you are planning to use quick replies in many places or in many bots and you don’t want to remember the required schema, I would recommend creating some helper classes to ease the work. I included those as part of the sample, and now the code looks like this:

var channelData = new FacebookChannelData
{
    QuickReplies = new[]
    {
        new FacebookTextQuickReply("Blue", "DEFINED_PAYLOAD_FOR_PICKING_BLUE", "https://cdn3.iconfinder.com/data/icons/developperss/PNG/Blue%20Ball.png"),
        new FacebookTextQuickReply("Green", "DEFINED_PAYLOAD_FOR_PICKING_GREEN", "https://cdn3.iconfinder.com/data/icons/developperss/PNG/Green%20Ball.png"),
        new FacebookTextQuickReply("Red", "DEFINED_PAYLOAD_FOR_PICKING_RED")
    }
};

When a Quick Reply is tapped, a text message will be sent to your webhook Message Received Callback. The text of the message will correspond to the title of the Quick Reply. The message object will also contain the payload custom data defined on the Quick Reply.

Accessing the payload information only requires using the ChannelData of the message received after the user tapped the Quick Reply button. Optionally, you can create a typed model of the response and deserialize the channel data.

private async Task OnColorPicked(IDialogContext context, IAwaitable<IMessageActivity> result)
{
    var colorMessage = await result;

    var message = $"Color picked: {colorMessage.Text}.";

    if (colorMessage.ChannelId.Equals("facebook", StringComparison.InvariantCultureIgnoreCase))
    {
        var quickReplyResponse = colorMessage.ChannelData.message.quick_reply;

        if (quickReplyResponse != null)
        {
            message += $" The payload for the quick reply clicked is: {quickReplyResponse.payload}";
        }
        else
        {
            message += " It seems you didn't click on a quick reply and you just typed the color.";
        }
    }

    await context.PostAsync(message);

    context.Wait(this.MessageReceivedAsync);
}
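
If you prefer the typed approach mentioned above, a minimal sketch could look like the following (the model classes are hypothetical and not part of the sample; they assume the channel data arrives as a JObject and use Newtonsoft.Json for the attributes):

public class FacebookQuickReplyResponse
{
    [JsonProperty("message")]
    public FacebookResponseMessage Message { get; set; }
}

public class FacebookResponseMessage
{
    [JsonProperty("quick_reply")]
    public FacebookQuickReplyPayload QuickReply { get; set; }
}

public class FacebookQuickReplyPayload
{
    [JsonProperty("payload")]
    public string Payload { get; set; }
}

// Usage: deserialize the channel data instead of navigating it dynamically.
var channelData = colorMessage.ChannelData as JObject;
var payload = channelData?.ToObject<FacebookQuickReplyResponse>()?.Message?.QuickReply?.Payload;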

 

The outcome

As soon as you send a message to the Bot, it will respond with a question and the three quick replies. Intentionally, I’m showing quick replies with and without images.


Once you tap on the quick reply button, the Bot will respond with the selected color and will also show the defined payload for the quick reply button.


Quick replies are just hints; you can still type something different, in which case the sample will detect that you didn’t tap on a quick reply button and it won’t look for the defined payload.


 

As you can see, quick replies are powerful and really easy to use right from your Bot Framework-based bot. If Facebook is one of the channels your bot supports, I would strongly recommend using them. To learn more about Facebook features with Bot Framework, please read here.

New Microsoft Azure Media Services SDKs for Java and PHP released with FairPlay Streaming support

Over the last few days, the Azure SDK team published new releases of the Azure SDK for Java and Azure SDK for PHP packages that contain updates and new features related to Content Protection in Microsoft Azure Media Services. In particular, both SDKs now support Apple FairPlay Streaming (FPS) DRM dynamic encryption configuration and include improvements for Widevine DRM dynamic encryption configuration.

To take advantage of these features in your Java Maven projects, you need to use the latest Azure Media Services Java SDK (v0.9.4) by adding the following azure-media dependency in your pom.xml file.

<dependency>
    <groupId>com.microsoft.azure</groupId>
    <artifactId>azure-media</artifactId>
    <version>0.9.4</version>
</dependency>

 

For PHP Composer projects, you need to use the latest Azure Media Services PHP SDK (v0.4.4) by adding the following microsoft/windowsazure dependency in the composer.json file – make sure to call require_once(‘vendor/autoload.php’); in your PHP files.

{
    "name": "my/sample",
    "license": "Apache-2.0",
    "require": {
        "microsoft/windowsazure": "^0.4"
    }
}

 

FairPlay Streaming DRM support

With this new release, you can now use the Azure Media Services REST API operations and entities to configure DRM dynamic encryption with Apple FairPlay Streaming (FPS) in both Java and PHP. Below you can find a sample VOD workflow that shows how to enable FairPlay Streaming:

For more details about Apple FairPlay Streaming (FPS) support in Azure Media Services, you can read the official general availability announcement by @mingfeiy: https://azure.microsoft.com/blog/apple-fairplay-streaming-for-azure-media-services-generally-available/.

 

Widevine DRM updates

When you create a Widevine Asset Delivery Policy, you have to specify the license acquisition URL for the common encryption Content Key assigned to the Asset. This value can be obtained by calling the GetKeyDeliveryUrl REST API operation on the Content Key instance. The URL returned by this call, however, contains the Content Key ID as a query string parameter: ?KID=<Guid>. This means that if you use this value “as is” in the configuration, the resulting Widevine Asset Delivery Policy will only be valid for that single Content Key instance.

With the new release, you can now reuse the same Widevine Asset Delivery Policy among multiple Assets. To do this, you just need to remove the query string from the Content Key license acquisition URL and then set it in the Asset Delivery Policy configuration using the new “Widevine Base License Acquisition Url” option. Below you can find some sample code snippets showing how to do this in both Java and PHP.

PHP sample code snippet

// $restProxy: Azure Media Services client context
// $contentKey: a common encryption content key

$widevineUrl = $restProxy->getKeyDeliveryUrl($contentKey, ContentKeyDeliveryType::WIDEVINE);

// Remove query string
if (strpos($widevineUrl, '?') !== false) {
    $widevineUrl = substr($widevineUrl, 0, strrpos($widevineUrl, "?"));
}

// Generate the AssetDeliveryPolicy configuration
$config = [AssetDeliveryPolicyConfigurationKey::WIDEVINE_BASE_LICENSE_ACQUISITION_URL => $widevineUrl];
$configuration = AssetDeliveryPolicyConfigurationKey::stringifyAssetDeliveryPolicyConfiguartionKey($config);

// Create a reusable AssetDeliveryPolicy for Widevine
$adpolicy = new AssetDeliveryPolicy();
$adpolicy->setName('Reusable Widevine Delivery Policy');
$adpolicy->setAssetDeliveryConfiguration($configuration);
$adpolicy->setAssetDeliveryProtocol(AssetDeliveryProtocol::DASH);
$adpolicy->setAssetDeliveryPolicyType(AssetDeliveryPolicyType::DYNAMIC_COMMON_ENCRYPTION);
$adpolicy = $restProxy->createAssetDeliveryPolicy($adpolicy);

Java sample code snippet

// mediaService: Azure Media Services client context
// contentKey: a common encryption content key

String widevineUrl = mediaService.create(ContentKey.getKeyDeliveryUrl(contentKey.getId(), ContentKeyDeliveryType.Widevine));

// Remove query string
if (widevineUrl.contains("?")) {
    widevineUrl = widevineUrl.substring(0, widevineUrl.indexOf("?"));
}


// Generate the AssetDeliveryPolicy configuration
Map<AssetDeliveryPolicyConfigurationKey, String> configuration = new HashMap<AssetDeliveryPolicyConfigurationKey, String>();
configuration.put(AssetDeliveryPolicyConfigurationKey.WidevineBaseLicenseAcquisitionUrl, widevineUrl);

AssetDeliveryPolicyInfo assetDeliveryPolicy = mediaService.create(AssetDeliveryPolicy.create()
    .setName("Reusable Widevine Delivery Policy")
    .setAssetDeliveryConfiguration(configuration)
    .setAssetDeliveryPolicyType(AssetDeliveryPolicyType.DynamicCommonEncryption)
    .setAssetDeliveryProtocol(EnumSet.of(AssetDeliveryProtocol.Dash)));

 

Enjoy!

Azure Media Services updates in new Azure portal

Azure Portal

Last month, the Azure Media Services team announced the availability of Azure Media Services functionalities through portal.azure.com (codename Ibiza) in public preview; you can check all the details in this blog post by @mingfeiy. After this announcement, there were some updates to the new portal with more features, enhancements and several bug fixes.

In this post you will find a quick summary of what is new and what has changed in the new portal for Azure Media Services.

 

Add FairPlay DRM protection onto both VOD and live stream

You can now configure the default content key authorization policy for FairPlay DRM from the Content Protection blade by providing the App Certificate (a .pfx file containing the private key along with its password) and the Application Secret Key (ASK) received when you generate the certificate using the Apple Developer portal.

Content Protection blades for FairPlay DRM

After configuring the default content key authorization policy for FairPlay DRM, two new encryption options become available for both VOD and live stream assets.

  • PlayReady and Widevine with MPEG-DASH + FairPlay with HLS
  • FairPlay only with HLS

Adding FairPlay delivery policy in a VOD asset

Adding FairPlay delivery policy in a live stream asset

For more details about FairPlay Streaming (FPS) support, you can check this blog post: Stream premium content to Apple TV with Azure Media Services.

 

Enable/Disable CDN in Streaming Endpoint

After creating a streaming endpoint, you can now enable/disable the CDN feature from the Streaming Endpoint Details blade. Remember that, in order to enable the CDN feature, the streaming endpoint must be in Stopped state and have at least one streaming unit.

Streaming Endpoint Details blade

 

Improve delete Channel experience

In order to delete a channel, the Azure Media Services REST API validates that:

  • The channel is in Stopped state 
  • The channel does not contain any programs

The original implementation of the Delete command was only enabled for channels satisfying both conditions. To make it easier for Azure portal users, it is now always enabled and takes care of performing these operations (when necessary):

  • Stop all the programs in the channel
  • Delete all the programs in the channel
  • Stop the channel
  • Delete the channel
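
In other words, the portal now performs roughly the sequence you would otherwise script yourself. As an illustration only (this is not from the portal implementation, and the account placeholders and entity names are assumptions), a rough sketch of the equivalent clean-up using the Media Services .NET SDK could look like this:

// Requires the windowsazure.mediaservices NuGet package (Microsoft.WindowsAzure.MediaServices.Client) and System.Linq.
var context = new CloudMediaContext(new MediaServicesCredentials("%account-name%", "%account-key%"));
var channel = context.Channels.Where(c => c.Name == "MyChannel").First();

// Stop and delete all the programs in the channel.
foreach (var program in channel.Programs.ToList())
{
    if (program.State == ProgramState.Running)
    {
        program.Stop();
    }

    program.Delete();
}

// Stop the channel and then delete it.
if (channel.State == ChannelState.Running)
{
    channel.Stop();
}

channel.Delete();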

Channel blade

 

Show ‘Account ID’ property in the Summary and Properties blades

The Media Services ‘Account ID’ is now available in both the Summary and Properties blades. This value is useful, for example, when you want to submit a support request through the Azure portal and unambiguously identify your account.

Summary and Properties blades

 

Bug Fixes

  • Create Media Services Account blade: Account Name availability validation fails when the user has a disabled subscription
  • Create Media Services Account blade: Location dropdown is empty for some subscriptions
  • Asset blade: Encrypt command does not get automatically enabled when the asset type changes after an encoding job finishes
  • Publish the Asset blade: Notification error message when scaling streaming endpoint while creating a streaming locator
  • Asset Media Analytics blade: Remove frame limit for Azure Media Hyperlapse Media Processor
  • Create a new Channel wizard (Custom Create): Channel creation fails when ingest streaming protocol is set to RTP/MPEG-2
  • Create a new Channel wizard (Custom Create): Wrong ingest streaming protocol set when creating multiple channels
  • Create a Live Event blade: Binding issue in Archive Window field
  • Streaming Endpoint Details blade: Streaming units max limit is 5 but it should be 10
  • Streaming Endpoint Settings blade: Setting an entry in the Streaming Allowed IP Addresses table breaks media streaming
  • Media Services blades fail to load in Safari for Mac
  • Delete operations do not work after performing an update on the same entity

 

Enjoy!

Microsoft Azure Media Services SDK for Java v0.9.1 released with support for Widevine dynamic encryption

microsoft-azure-java-sdk-for-media-services

Last Friday, the Azure SDK team published a new release of the Azure SDK for Java Maven packages; you can find the full list at http://search.maven.org/#search|ga|1|com.microsoft.azure. In particular, there was a minor new release (v0.9.1) of the Azure Media Services SDK for Java that adds support for Widevine (DRM) Dynamic Common Encryption and License Delivery Service; below I’m listing the change log.

If you want to use the Java SDK in your Maven project, you just need to add the “azure-media” dependency in your pom.xml file as follows:

<dependency>
    <groupId>com.microsoft.azure</groupId>
    <artifactId>azure-media</artifactId>
    <version>0.9.1</version>
</dependency>

 

To demonstrate the new Java SDK features, I created the azure-media-dynamic-encryption-playready-widevine sample console Java application that contains a VOD end-to-end workflow that uses PlayReady and Widevine (DRM) Dynamic Common Encryption and the License Delivery Service for playback. It is based on the .NET sample explained in this documentation article: https://azure.microsoft.com/documentation/articles/media-services-protect-with-drm/.

You can access the full source code of this sample at: https://github.com/southworkscom/azure-sdk-for-media-services-java-samples/tree/master/azure-media-dynamic-encryption-playready-widevine.

Media Services SDK for Java sample projects in Eclipse

 

v0.9.1 Change Log

 

Enjoy!

New Microsoft Azure Media Services SDK for PHP release available with new features and samples

Azure Media Services SDK for PHP

Last week the Azure SDK team published a new release of the Azure SDK for PHP package that contains updates and new features for Microsoft Azure Media Services. In particular, the Azure Media Services SDK for PHP now supports the latest Content Protection features (AES and DRM – both PlayReady and Widevine – dynamic encryption with and without Token restriction), and listing/scaling Encoding Units. This release also includes three new PHP samples that show how to use these new features; below you can find the full change log with all the details about these updates.

In this post, I’ll focus on explaining how to use one of these new features: implementing a VOD workflow that applies PlayReady and Widevine (DRM systems) with Dynamic Common Encryption (CENC) using Token restriction for the license.

  1. Make sure you have PEAR and Composer properly installed and configured (php.ini) in your local development box.
  2. Add the following dependencies in the composer.json file in the root of your project.
    "repositories": [
    {
    "type": "pear",
    "url": "http://pear.php.net",
    "vendor-alias": "pear-pear2.php.net"
    }
    ],
    "require": {
    "pear-pear.php.net/HTTP_Request2": "0.4.0",
    "pear-pear.php.net/mail_mime": "*",
    "pear-pear.php.net/mail_mimedecode": "*",
    "firebase/php-jwt": "^3.0",
    "microsoft/windowsazure": "dev-master"
    }

  3. In your index.php main file include the autoload.php file generated by Composer to load all the dependencies, and add the use statements for the required namespaces.
    require_once 'vendor/autoload.php';

    use WindowsAzure\Common\ServicesBuilder;
    use WindowsAzure\Common\Internal\MediaServicesSettings;
    use WindowsAzure\Common\Internal\Utilities;
    use WindowsAzure\MediaServices\Models\Asset;
    use WindowsAzure\MediaServices\Models\AccessPolicy;
    use WindowsAzure\MediaServices\Models\Locator;
    use WindowsAzure\MediaServices\Models\Task;
    use WindowsAzure\MediaServices\Models\Job;
    use WindowsAzure\MediaServices\Models\TaskOptions;
    use WindowsAzure\MediaServices\Models\ContentKey;
    use WindowsAzure\MediaServices\Models\ProtectionKeyTypes;
    use WindowsAzure\MediaServices\Models\ContentKeyTypes;
    use WindowsAzure\MediaServices\Models\ContentKeyAuthorizationPolicy;
    use WindowsAzure\MediaServices\Models\ContentKeyAuthorizationPolicyOption;
    use WindowsAzure\MediaServices\Models\ContentKeyAuthorizationPolicyRestriction;
    use WindowsAzure\MediaServices\Models\ContentKeyDeliveryType;
    use WindowsAzure\MediaServices\Models\ContentKeyRestrictionType;
    use WindowsAzure\MediaServices\Models\AssetDeliveryPolicy;
    use WindowsAzure\MediaServices\Models\AssetDeliveryProtocol;
    use WindowsAzure\MediaServices\Models\AssetDeliveryPolicyType;
    use WindowsAzure\MediaServices\Models\AssetDeliveryPolicyConfigurationKey;
    use WindowsAzure\MediaServices\Templates\PlayReadyLicenseResponseTemplate;
    use WindowsAzure\MediaServices\Templates\PlayReadyLicenseTemplate;
    use WindowsAzure\MediaServices\Templates\PlayReadyLicenseType;
    use WindowsAzure\MediaServices\Templates\MediaServicesLicenseTemplateSerializer;
    use WindowsAzure\MediaServices\Templates\WidevineMessage;
    use WindowsAzure\MediaServices\Templates\AllowedTrackTypes;
    use WindowsAzure\MediaServices\Templates\ContentKeySpecs;
    use WindowsAzure\MediaServices\Templates\RequiredOutputProtection;
    use WindowsAzure\MediaServices\Templates\Hdcp;
    use WindowsAzure\MediaServices\Templates\TokenRestrictionTemplateSerializer;
    use WindowsAzure\MediaServices\Templates\TokenRestrictionTemplate;
    use WindowsAzure\MediaServices\Templates\SymmetricVerificationKey;
    use WindowsAzure\MediaServices\Templates\TokenClaim;
    use WindowsAzure\MediaServices\Templates\TokenType;
    use WindowsAzure\MediaServices\Templates\WidevineMessageSerializer;

  4. Create a rest proxy instance for the Azure Media Services REST API.
    // Replace the placeholders with your Media Services credentials
    $restProxy = ServicesBuilder::getInstance()->createMediaServicesService(new MediaServicesSettings("%account-name%", "%account-key%"));

  5. Create a new asset using your mezzanine source file.
    // Replace the placeholder with your mezzanine file name and path
    $sourceAsset = uploadFileAndCreateAsset($restProxy, "%source-mezzanine-file.mp4%");

    function uploadFileAndCreateAsset($restProxy, $mezzanineFileName) {
    // Create an empty "Asset" by specifying the name
    $asset = new Asset(Asset::OPTIONS_NONE);
    $asset->setName("Mezzanine " . $mezzanineFileName);
    $asset = $restProxy->createAsset($asset);
    $assetId = $asset->getId();

    print "Asset created: name=" . $asset->getName() . " id=" . $assetId . "\r\n";

    // Create an Access Policy with Write permissions
    $accessPolicy = new AccessPolicy('UploadAccessPolicy');
    $accessPolicy->setDurationInMinutes(60.0);
    $accessPolicy->setPermissions(AccessPolicy::PERMISSIONS_WRITE);
    $accessPolicy = $restProxy->createAccessPolicy($accessPolicy);

    // Create a SAS Locator for the Asset
    $sasLocator = new Locator($asset, $accessPolicy, Locator::TYPE_SAS);
    $sasLocator->setStartTime(new \DateTime('now -5 minutes'));
    $sasLocator = $restProxy->createLocator($sasLocator);

    // Get the mezzanine file content
    $fileContent = file_get_contents($mezzanineFileName);

    print "Uploading...\r\n";

    // Perform a multi-part upload using the Block Blobs REST API storage operations
    $restProxy->uploadAssetFile($sasLocator, $mezzanineFileName, $fileContent);

    // Generate the asset files metadata
    $restProxy->createFileInfos($asset);

    print "File uploaded: size=" . strlen($fileContent) . "\r\n";

    // Delete the SAS Locator (and Access Policy) for the Asset
    $restProxy->deleteLocator($sasLocator);
    $restProxy->deleteAccessPolicy($accessPolicy);
    return $asset;
    }

  6. Submit a transcoding job for the source asset to generate a multi-bitrate output asset suitable for adaptive streaming.
    $encodedAsset = encodeToAdaptiveBitrateMP4Set($restProxy, $sourceAsset);

    function encodeToAdaptiveBitrateMP4Set($restProxy, $asset) {
    // Retrieve the latest 'Media Encoder Standard' processor version
    $mediaProcessor = $restProxy->getLatestMediaProcessor('Media Encoder Standard');

    print "Using Media Processor: {$mediaProcessor->getName()} version {$mediaProcessor->getVersion()}\r\n";

    // Create the Job; this automatically schedules and runs it
    $outputAssetName = "Encoded " . $asset->getName();
    $outputAssetCreationOption = Asset::OPTIONS_NONE;
    $taskBody = '<?xml version="1.0" encoding="utf-8"?><taskBody><inputAsset>JobInputAsset(0)</inputAsset><outputAsset assetCreationOptions="' . $outputAssetCreationOption . '" assetName="' . $outputAssetName . '">JobOutputAsset(0)</outputAsset></taskBody>';

    $task = new Task($taskBody, $mediaProcessor->getId(), TaskOptions::NONE);
    $task->setConfiguration('H264 Multiple Bitrate 720p');

    $job = new Job();
    $job->setName('Encoding Job');

    $job = $restProxy->createJob($job, array($asset), array($task));

    print "Created Job with Id: {$job->getId()}\r\n";

    // Check to see if the Job has completed
    $result = $restProxy->getJobStatus($job);

    $jobStatusMap = array('Queued', 'Scheduled', 'Processing', 'Finished', 'Error', 'Canceled', 'Canceling');

    while($result != Job::STATE_FINISHED && $result != Job::STATE_ERROR && $result != Job::STATE_CANCELED) {
    print "Job status: {$jobStatusMap[$result]}\r\n";
    sleep(5);
    $result = $restProxy->getJobStatus($job);
    }

    if ($result != Job::STATE_FINISHED) {
    print "The job has finished with a wrong status: {$jobStatusMap[$result]}\r\n";
    exit(-1);
    }

    print "Job Finished!\r\n";

    // Get output asset
    $outputAssets = $restProxy->getJobOutputMediaAssets($job);
    $encodedAsset = $outputAssets[0];

    print "Asset encoded: name={$encodedAsset->getName()} id={$encodedAsset->getId()}\r\n";

    return $encodedAsset;
    }

  7. Create a new Common Encryption content key and link it to the multi-bitrate output asset.
    $contentKey = createCommonTypeContentKey($restProxy, $encodedAsset);

    function createCommonTypeContentKey($restProxy, $encodedAsset) {
    // Generate a new content key
    $keyValue = Utilities::generateCryptoKey(16);

    // Get the protection key id for content key
    $protectionKeyId = $restProxy->getProtectionKeyId(ContentKeyTypes::COMMON_ENCRYPTION);
    $protectionKey = $restProxy->getProtectionKey($protectionKeyId);

    $contentKey = new ContentKey();
    $contentKey->setContentKey($keyValue, $protectionKey);
    $contentKey->setProtectionKeyId($protectionKeyId);
    $contentKey->setProtectionKeyType(ProtectionKeyTypes::X509_CERTIFICATE_THUMBPRINT);
    $contentKey->setContentKeyType(ContentKeyTypes::COMMON_ENCRYPTION);

    // 3.3 Create the ContentKey
    $contentKey = $restProxy->createContentKey($contentKey);

    print "Content Key id={$contentKey->getId()}\r\n";

    // Associate the content key with the asset
    $restProxy->linkContentKeyToAsset($encodedAsset, $contentKey);

    return $contentKey;
    }

  8. Create a new content key authorization policy with PlayReady and Widevine options using Token restriction, and link it to the content key.
    // You can also use TokenType::SWT 
    $tokenTemplateString = addTokenRestrictedAuthorizationPolicy($restProxy, $contentKey, TokenType::JWT);

    function addTokenRestrictedAuthorizationPolicy($restProxy, $contentKey, $tokenType) {
    // Create content key authorization policy restriction (Token)
    $tokenRestriction = generateTokenRequirements($tokenType);
    $restriction = new ContentKeyAuthorizationPolicyRestriction();
    $restriction->setName('Content Key Authorization Policy Restriction');
    $restriction->setKeyRestrictionType(ContentKeyRestrictionType::TOKEN_RESTRICTED);
    $restriction->setRequirements($tokenRestriction);

    // Configure PlayReady and Widevine license templates.
    $playReadyLicenseTemplate = configurePlayReadyLicenseTemplate();
    $widevineLicenseTemplate = configureWidevineLicenseTemplate();

    // Create content key authorization policy option (PlayReady)
    $playReadyOption = new ContentKeyAuthorizationPolicyOption();
    $playReadyOption->setName('PlayReady Authorization Policy Option');
    $playReadyOption->setKeyDeliveryType(ContentKeyDeliveryType::PLAYREADY_LICENSE);
    $playReadyOption->setRestrictions(array($restriction));
    $playReadyOption->setKeyDeliveryConfiguration($playReadyLicenseTemplate);
    $playReadyOption = $restProxy->createContentKeyAuthorizationPolicyOption($playReadyOption);

    // Create content key authorization policy option (Widevine)
    $widevineOption = new ContentKeyAuthorizationPolicyOption();
    $widevineOption->setName('Widevine Authorization Policy Option');
    $widevineOption->setKeyDeliveryType(ContentKeyDeliveryType::WIDEVINE);
    $widevineOption->setRestrictions(array($restriction));
    $widevineOption->setKeyDeliveryConfiguration($widevineLicenseTemplate);
    $widevineOption = $restProxy->createContentKeyAuthorizationPolicyOption($widevineOption);

    // Create content key authorization policy
    $ckapolicy = new ContentKeyAuthorizationPolicy();
    $ckapolicy->setName('Content Key Authorization Policy');
    $ckapolicy = $restProxy->createContentKeyAuthorizationPolicy($ckapolicy);

    // Link the PlayReady and Widevine options to the content key authorization policy
    $restProxy->linkOptionToContentKeyAuthorizationPolicy($playReadyOption, $ckapolicy);
    $restProxy->linkOptionToContentKeyAuthorizationPolicy($widevineOption, $ckapolicy);

    // Associate the authorization policy with the content key
    $contentKey->setAuthorizationPolicyId($ckapolicy->getId());
    $restProxy->updateContentKey($contentKey);

    print "Added Content Key Authorization Policy: name={$ckapolicy->getName()} id={$ckapolicy->getId()}\r\n";
    return $tokenRestriction;
    }

    function generateTokenRequirements($tokenType) {
    $template = new TokenRestrictionTemplate($tokenType);

    $template->setPrimaryVerificationKey(new SymmetricVerificationKey());
    $template->setAudience("urn:contoso");
    $template->setIssuer("https://sts.contoso.com");
    $claims = array();
    $claims[] = new TokenClaim(TokenClaim::CONTENT_KEY_ID_CLAIM_TYPE);
    $template->setRequiredClaims($claims);

    return TokenRestrictionTemplateSerializer::serialize($template);
    }

    function configurePlayReadyLicenseTemplate() {
    $responseTemplate = new PlayReadyLicenseResponseTemplate();

    $licenseTemplate = new PlayReadyLicenseTemplate();
    $licenseTemplate->setLicenseType(PlayReadyLicenseType::NON_PERSISTENT);
    $licenseTemplate->setAllowTestDevices(true);
    $responseTemplate->setLicenseTemplates(array($licenseTemplate));

    return MediaServicesLicenseTemplateSerializer::serialize($responseTemplate);
    }

    function configureWidevineLicenseTemplate() {
    $template = new WidevineMessage();
    $template->allowed_track_types = AllowedTrackTypes::SD_HD;

    $contentKeySpecs = new ContentKeySpecs();
    $contentKeySpecs->required_output_protection = new RequiredOutputProtection();
    $contentKeySpecs->required_output_protection->hdcp = Hdcp::HDCP_NONE;
    $contentKeySpecs->security_level = 1;
    $contentKeySpecs->track_type = "SD";
    $template->content_key_specs = array($contentKeySpecs);

    $policyOverrides = new \stdClass();
    $policyOverrides->can_play = true;
    $policyOverrides->can_persist = true;
    $policyOverrides->can_renew = false;
    $template->policy_overrides = $policyOverrides;

    return WidevineMessageSerializer::serialize($template);
    }

  9. Create a new asset delivery policy for PlayReady and Widevine dynamic common encryption for the MPEG-DASH streaming protocol, and link it to the multi-bitrate output asset.
    createAssetDeliveryPolicy($restProxy, $encodedAsset, $contentKey);

    function createAssetDeliveryPolicy($restProxy, $encodedAsset, $contentKey) {
    // Get the license acquisition URLs
    $playReadyUrl = $restProxy->getKeyDeliveryUrl($contentKey, ContentKeyDeliveryType::PLAYREADY_LICENSE);
    $widevineURl = $restProxy->getKeyDeliveryUrl($contentKey, ContentKeyDeliveryType::WIDEVINE);

    // Generate the asset delivery policy configuration
    $configuration = [AssetDeliveryPolicyConfigurationKey::PLAYREADY_LICENSE_ACQUISITION_URL => $playReadyUrl,
    AssetDeliveryPolicyConfigurationKey::WIDEVINE_LICENSE_ACQUISITION_URL => $widevineURl];
    $confJson = AssetDeliveryPolicyConfigurationKey::stringifyAssetDeliveryPolicyConfiguartionKey($configuration);

    // Create the asset delivery policy
    $adpolicy = new AssetDeliveryPolicy();
    $adpolicy->setName('Asset Delivery Policy');
    $adpolicy->setAssetDeliveryProtocol(AssetDeliveryProtocol::DASH);
    $adpolicy->setAssetDeliveryPolicyType(AssetDeliveryPolicyType::DYNAMIC_COMMON_ENCRYPTION);
    $adpolicy->setAssetDeliveryConfiguration($confJson);

    $adpolicy = $restProxy->createAssetDeliveryPolicy($adpolicy);

    // Link the delivery policy to the asset
    $restProxy->linkDeliveryPolicyToAsset($encodedAsset, $adpolicy->getId());

    print "Added Asset Delivery Policy: name={$adpolicy->getName()} id={$adpolicy->getId()}\r\n";
    }

  10. Publish the multi-bitrate output asset with an origin locator to generate the base streaming URL.
    publishEncodedAsset($restProxy, $encodedAsset);

    function publishEncodedAsset($restProxy, $encodedAsset) {
    // Get the .ISM asset file
    $files = $restProxy->getAssetAssetFileList($encodedAsset);
    $manifestFile = null;

    foreach($files as $file) {
    if (endsWith(strtolower($file->getName()), '.ism')) {
    $manifestFile = $file;
    }
    }

    if ($manifestFile == null) {
    print "Unable to find the manifest file\r\n";
    exit(-1);
    }

    // Create a 30-day read-only access policy
    $access = new AccessPolicy("Streaming Access Policy");
    $access->setDurationInMinutes(60 * 24 * 30);
    $access->setPermissions(AccessPolicy::PERMISSIONS_READ);
    $access = $restProxy->createAccessPolicy($access);

    // Create an origin locator for the asset
    $locator = new Locator($encodedAsset, $access, Locator::TYPE_ON_DEMAND_ORIGIN);
    $locator->setName("Streaming Locator");
    $locator = $restProxy->createLocator($locator);

    // Create the base streaming URL for dynamic packaging
    $stremingUrl = $locator->getPath() . $manifestFile->getName() . "/manifest";

    print "Base Streaming URL: {$stremingUrl}\r\n";
    }

    function endsWith($haystack, $needle) {
    $length = strlen($needle);
    if ($length == 0) {
    return true;
    }

    return (substr($haystack, -$length) === $needle);
    }

  11. Generate a test Token to retrieve the PlayReady/Widevine license and enable playback in Azure Media Player.
    generateTestToken($tokenTemplateString, $contentKey);

    function generateTestToken($tokenTemplateString, $contentKey) {
    $template = TokenRestrictionTemplateSerializer::deserialize($tokenTemplateString);
    $contentKeyUUID = substr($contentKey->getId(), strlen("nb:kid:UUID:"));
    $expiration = strtotime("+12 hour");
    $token = TokenRestrictionTemplateSerializer::generateTestToken($template, null, $contentKeyUUID, $expiration);

    print "Token Type {$template->getTokenType()}\r\nBearer={$token}\r\n";
    }

  12. Run the code using the following PHP command and make sure to copy the Base Streaming URL and Token values displayed in the console.
    php -d display_errors=1 index.php

  13. Try the Base Streaming URL and Token values in the Azure Media Player demo site: http://amsplayer.azurewebsites.net/. Make sure to use the Advanced Options form to set the Protection value to DRM (PlayReady and Widevine) and paste the token.

 

For more coding details about enabling PlayReady and Widevine dynamic common encryption, you can check the vodworkflow_drm_playready_widevine.php sample.

 

Change Log

 

Enjoy!

.NET Core: Cross Platform – Ubuntu Linux

In the previous posts we briefly described how .NET Core is making .NET cross-platform, available for Windows, OSX and now Linux!

If you search a little bit you’ll find different instructions to install .NET Core tooling (DNVM, DNX and DNU) on Ubuntu Linux. In this blog post you’ll find the bare minimum steps to get a Hello World sample running in Ubuntu 14.04.

For instance, the docs specify a Mono dependency to make DNU work on Linux. Apparently, this is no longer required :)

Remember, the .NET team is currently working on it and the CoreCLR is work in progress and open source!

.NET Version Manager (DNVM)

The .NET Version Manager helps you retrieve versions of DNX using NuGet and lets you switch between versions when you have multiple installed on your machine. DNVM is simply a way to manage and download NuGet packages; it is a set of command line utilities to update and configure which .NET Runtime to use.

We’ll use a little script to download and install DNVM on the machine. This script uses curl and unzip, which can be installed with the following command:

sudo apt-get install unzip curl

 

After installing the prerequisites we are ready to download and install DNVM:

curl -sSL https://raw.githubusercontent.com/aspnet/Home/dev/dnvminstall.sh | DNX_BRANCH=dev sh && source ~/.dnx/dnvm/dnvm.sh

 
DNVM_install
After the script is executed you should be ready to go and install DNX. In my case, I had to execute one additional command to source the dnvm.sh and make it available to the console. Helpfully, this hint was provided by the script itself:

Type 'source /home/user/.dnx/dnvm/dnvm.sh' to start using dnvm

 

DNVM

Once this step is complete you should be able to run DNVM and see some help text.

Again, as I said in the OS X post, from this point this post is mostly a copy of my previous .NET Core: Cross platform – Windows post. Which is nice :)

 

.NET Execution Environment (DNX)

The .NET Execution Environment (DNX) contains the code required to bootstrap and run an application. This includes things like the compilation system, SDK tools, and the native CLR hosts.

DNX provides a consistent development and execution environment across multiple platforms (Windows, Mac and Linux) and across different .NET flavors (.NET Framework, .NET Core and Mono).

It is easy to install the .NET Core version of DNX, using the DNVM install command:

dnvm install -r coreclr latest

 

You can then use dnvm to list and select the active DNX version in your system (In my case, the latest version of DNX CoreCLR is 1.0.0-beta8)

dnvm use 1.0.0-beta8 -r coreclr

dnvm list

 
DNVM_list
 

Important: This is not strictly required at this point but before you can DNX run your application you must install some general purpose dependencies. You can install those using the following command:

sudo apt-get install libunwind8 libssl-dev

 

Hello world

A DNX project is simply a folder with a project.json file. The name of the project is the folder name.

Let’s first create a folder and set it as our current directory in the command line:

mkdir HelloWorld && cd HelloWorld

 

Now create a new C# file HelloWorld.cs, and paste in the code below:

using System;

public class Program {
    public static void Main(string[] args){
        Console.WriteLine("Hello World from Core CLR!");
        Console.Read();
    }
}

 

Next, we need to provide the project settings DNX will use. Create a new project.json file in the same folder, and edit it to match the listing shown here:

{
    "version": "1.0.0-*",
    "dependencies": {
    },
    "frameworks" : {
        "dnx451" : { },
        "dnxcore50" : {
            "dependencies": {
                "System.Console": "4.0.0-beta-*"
            }
        }
    }
}

 

.NET Development Utility (DNU)

DNU is a command line tool that helps with the development of applications using DNX. You can use DNU to build, package and publish DNX projects. Or, as in the following example, you can use DNU to install a new package into an existing project or to restore all package dependencies for an existing project.

 

Important: The DNU utility has some dependencies of its own. Before you can dnu restore a project’s packages you must install the dependency using the following command:

sudo apt-get install libcurl3-dev

 

The project.json file defines the app dependencies and target frameworks in addition to various metadata properties about the app. See Working with DNX Projects for more details.

Because .NET Core is completely factored, we need to explicitly pull in the libraries that our project depends on. We’ll run the following command to download and install all packages that are listed in the project.json:

dnu restore

 

You’ll notice that even though our project only required System.Console, several dependent libraries have been downloaded and installed as a result.

 
DNU_restore
 

DNX run the app

At this point, we’re ready to run the app. You can do this by simply entering dnx run from the command prompt. You should see a result like this one:

dnx run

 
DNX_run
 

Further reading

https://dotnet.github.io/core/getting-started/

http://docs.asp.net/en/latest/getting-started/installing-on-linux.html#installing-on-debian-ubuntu-and-derivatives

http://www.mono-project.com/docs/getting-started/install/linux/#debian-ubuntu-and-derivatives

Intro to .NET Core

.NET Core: Cross Platform – Windows

.NET Core: Cross Platform – OS X

 

Summary

IHaveNoIdeaWhatImDoing

.NET Core: Cross Platform – OS X

.NET Core is making .NET cross-platform and available for Windows, Ubuntu and OSX with a little help from our friends: DNVM, DNU and DNX.

The .NET Execution Environment (DNX) is a software development kit (SDK) and runtime environment that has everything you need to build and run .NET applications for Windows, Mac and Linux.

.NET Version Manager (DNVM)

The .NET Version Manager helps you retrieve versions of DNX using NuGet and lets you switch between versions when you have multiple installed on your machine. DNVM is simply a way to manage and download NuGet packages; it is a set of command line utilities to update and configure which .NET Runtime to use.

While it is pretty easy to install DNVM on a Windows box, on OSX the easiest way to get DNVM is to use Homebrew.

Homebrew: Bootstrapping the Bootstrapper

Package managers are a key piece that has completely changed the face of modern software development. Homebrew is an OSX package manager that makes it easy to install, upgrade and remove software packages.

In a nutshell, it uses GitHub repositories to download software (such as scripts), stores it in its own location, and then creates symlinks providing easy access to it.

 

If you don’t have Homebrew installed then follow the Homebrew installation instructions. If you have the Developer tools installed you can use the following script in a terminal window:

ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"

 

Once you have Homebrew up and running you can use the following commands to install DNVM:

brew tap aspnet/dnx
brew update
brew install dnvm

source dnvm.sh

 

Under the hood, Homebrew cloned the aspnet/homebrew-dnx repository and installed dnvm using the dnvm.rb script. The dnvm.sh script is a bag of functions that need to be sourced, hence the last command executed. Also, it is recommended to check for and upgrade to the latest version of DNVM using the following command:

dnvm upgrade

 

DNVM

 

.NET Execution Environment (DNX)

From this point, this post is mostly a copy of my previous .NET Core: Cross platform – Windows post. Which is nice :)

DNX provides a consistent development and execution environment across multiple platforms (Windows, Mac and Linux) and across different .NET flavors (.NET Framework, .NET Core and Mono).

 

It is easy to install the .NET Core version of DNX, using the DNVM install command:

dnvm install -r coreclr latest

 

You can then use dnvm to list and select the active DNX version in your system (In my case, the latest version of DNX CoreCLR is 1.0.0-beta8)

dnvm use 1.0.0-beta8 -r coreclr

dnvm list

DNVM_use_list

 

Hello world

A DNX project is simply a folder with a project.json file. The name of the project is the folder name.

Let’s first create a folder and set it as our current directory in the command line:

mkdir HelloWorld && cd HelloWorld

 

Now create a new C# file HelloWorld.cs, and paste in the code below:

using System;

public class Program {
    public static void Main(string[] args){
        Console.WriteLine("Hello World from Core CLR!");
        Console.Read();
    }
}

 

Next, we need to provide the project settings DNX will use. Create a new project.json file in the same folder, and edit it to match the listing shown here:

{
    "version": "1.0.0-*",
    "dependencies": {
    },
    "frameworks" : {
        "dnx451" : { },
        "dnxcore50" : {
            "dependencies": {
                "System.Console": "4.0.0-beta-*"
            }
        }
    }
}

.NET Development Utility (DNU)

DNU is a command line tool that helps with the development of applications using DNX. You can use DNU to build, package and publish DNX projects. Or, as in the following example, you can use DNU to install a new package into an existing project or to restore all package dependencies for an existing project.

DNU

Important: If you are not seeing the output above you might be facing an issue with your OSX installation. At the time I’m writing this, there is a known issue regarding a missing dependency on ICU4C. In my case, I fixed it by brewing the dependency and everything worked like a charm again.

brew install icu4c

 

The project.json file defines the app dependencies and target frameworks in addition to various metadata properties about the app. See Working with DNX Projects for more details.

Because .NET Core is completely factored, we need to explicitly pull in the libraries that our project depends on. We’ll run the following command to download and install all packages that are listed in the project.json:

dnu restore

 

You’ll notice that even though our project only required System.Console, several dependent libraries have been downloaded and installed as a result.
DNU_restore

 

DNX run the app

At this point, we’re ready to run the app. You can do this by simply entering dnx run from the command prompt. You should see a result like this one:

dnx run

DNX_run

 

Further reading

https://dotnet.github.io/core/getting-started/

http://brew.sh/

https://github.com/aspnet/homebrew-dnx

https://github.com/aspnet/dnx/issues/2875

Intro to .NET Core

.NET Core: Cross Platform – Windows

 

Summary

WhatIfIToldYouOSX

.NET Core: Cross Platform – Windows

The .NET Execution Environment (DNX) is a software development kit (SDK) and runtime environment that has everything you need to build and run .NET applications for Windows, Mac and Linux.

However, package managers are a key piece that has completely changed the face of modern software development, and they are closely tied to the main .NET Core tools: DNVM, DNU and DNX.

.NET Version Manager (DNVM)

The .NET Version Manager helps you retrieve versions of DNX using NuGet and lets you switch between versions when you have multiple installed on your machine. DNVM is simply a way to manage and download NuGet packages; it is a set of command line utilities to update and configure which .NET Runtime to use.

You can use the PowerShell script below to install DNVM. It just downloads and executes the dnvminstall.ps1 script from the aspnet/Home repo.

@powershell -NoProfile -ExecutionPolicy unrestricted -Command "&{$Branch='dev';iex ((new-object net.webclient).DownloadString('https://raw.githubusercontent.com/aspnet/Home/dev/dnvminstall.ps1'))}"

It just installs the dnvm.ps1 command line tool and adds it to the %PATH% environment variable. No version of DNX will be installed at this point. Also, it is recommended to check for and upgrade to the latest version of DNVM using the following command:

dnvm upgrade

 

DNVM

DNVM solves the bootstrapping problem of getting and selecting the correct version of the DNX to run. You’ll find that the NuGet gallery hosts cross platform versions of DNX:

.NET Execution Environment (DNX)

The .NET Execution Environment (DNX) contains the code required to bootstrap and run an application. This includes things like the compilation system, SDK tools, and the native CLR hosts.

DNX provides a consistent development and execution environment across multiple platforms (Windows, Mac and Linux) and across different .NET flavors (.NET Framework, .NET Core and Mono).

It is easy to install the .NET Core version of DNX, using the DNVM install command:

dnvm install -r coreclr latest

 
DNX

You can then use dnvm to list and select the active DNX version in your system (In my case, the latest version of DNX CoreCLR is 1.0.0-beta8)

dnvm use 1.0.0-beta8 -r coreclr

dnvm list

 

Hello world

A DNX project is simply a folder with a project.json file. The name of the project is the folder name.

Let’s first create a folder and set it as our current directory in the command line:

mkdir HelloWorld && cd HelloWorld

 

Now create a new C# file HelloWorld.cs, and paste in the code below:

using System;
 
public class Program {
    public static void Main(string[] args){
        Console.WriteLine("Hello World from Core CLR!");
        Console.Read();
    }
}

 

Next, we need to provide the project settings DNX will use. Create a new project.json file in the same folder, and edit it to match the listing shown here:

 

{
    "version": "1.0.0-*",
    "dependencies": {
    },
    "frameworks" : {
        "dnx451" : { },
        "dnxcore50" : {
            "dependencies": {
                "System.Console": "4.0.0-beta-*"
            }
        }
    }
}

.NET Development Utility (DNU)

DNU is a command line tool that helps with the development of applications using DNX. You can use DNU to build, package and publish DNX projects. Or, as in the following example, you can use DNU to install a new package into an existing project or to restore all package dependencies for an existing project.
DNU

 

The project.json file defines the app dependencies and target frameworks in addition to various metadata properties about the app. See Working with DNX Projects for more details.

Because .NET Core is completely factored, we need to explicitly pull in the libraries that our project depends on. We’ll run the following command to download and install all packages that are listed in the project.json:

dnu restore

 

You’ll notice that even though our project only required System.Console, several dependent libraries have been downloaded and installed as a result.
DNU_restore

DNX run the app

At this point, we’re ready to run the app. You can do this by simply entering dnx run from the command prompt. You should see a result like this one:

dnx run

Further reading

https://dotnet.github.io/core/getting-started/

https://github.com/aspnet/Home/wiki/Version-Manager

http://docs.asp.net/en/latest/dnx/overview.html

http://docs.asp.net/en/latest/dnx/console.html

Intro to .NET Core

.NET Core: Cross Platform – OS X

Intro to .NET Core

.NET is a general purpose development platform. It has several key features that are attractive to many developers, including automatic memory management and modern programming languages, that make it easier to efficiently build high-quality apps. Multiple implementations of .NET are available, based on open .NET Standards that specify the fundamentals of the platform.

.NET Implementations

There are various implementations of .NET, some coming from Microsoft, some coming from other companies and groups:

  • The .NET Framework is the premier implementation of the .NET Platform available for Windows server and client developers.

    There are additional stacks built on top of the .NET Framework, for example Windows Forms and Windows Presentation Foundation (WPF) for UI, Windows Communication Foundation (WCF) for middleware services and ASP.NET as a web framework.

  • Mono is an open source implementation of Microsoft’s .NET Framework based on the ECMA standards for C# and the Common Language Runtime.
  • .NET Native is the set of tools used to build .NET Universal Windows Platform (UWP) applications. .NET Native compiles C# to native machine code that performs like C++.

I’ll explain a little bit more about what .NET Core is below. But first, let’s take a look at the .NET Ecosystem.

.NET Ecosystem

The .NET Ecosystem is undergoing a major shift and restructuring in 2015. There are a lot of “moving pieces” that need to be tied together in order for this new ecosystem and all of the recommended scenarios to work. As you can see, this is a very vibrant and diverse ecosystem.

You might not know it, but most of these projects are currently open source and fostered by the .NET Foundation, an independent organization. Yes! These projects are open source!

 

10kft_view

A wild .NET implementation appeared!

.NET Core is a cross-platform implementation of .NET that is primarily being driven by ASP.NET 5 workloads, but also by the need and desire to have a modern runtime that is modular and whose features and libraries can be cherry picked based on the application’s needs.

It includes a small runtime that is built from the same codebase as the .NET Framework CLR. The .NET Core runtime includes the same GC and JIT (RyuJIT), but doesn’t include features like Application Domains or Code Access Security.

There are several characteristics of .NET Core:

  • Cross-platform support is the first important feature. For applications, it is important to use the platforms that will provide the best environment for their execution. Thus, having an application platform that enables the app to be run on different operating systems with minimal or no changes is a significant boon.
  • Open Source because it is proven to be a great way to enable a larger set of platforms, supported by community contribution.
  • Better packaging story – the framework is distributed as a set of packages that developers can pick and choose from, rather than a single, monolithic platform. .NET Core is the first implementation of .NET Platform that is distributed via NuGet package manager.
  • Better application isolation – one of the scenarios for .NET Core is to enable applications to “take” the runtime they need for their execution and deploy it with the application, without depending on shared components on the target machine. This plays well with the current trends of developing software and using container technologies like Docker for consolidation.
  • Modular – .NET Core is a set of runtime, library and compiler components. Microsoft uses these components in various configurations for device and cloud workloads.

NuGet as a 1st class delivery vehicle

In contrast to the .NET Framework, the .NET Core platform will be delivered as a set of NuGet packages.

Using NuGet allows for much more agile usage of the individual libraries that comprise .NET Core. It also means that an application can list a collection of NuGet packages (and associated version information) and this will comprise both system/framework as well as third-party dependencies required. Further, third-party dependencies can now also express their specific dependencies on framework features, making it much easier to ensure the proper packages and versions are pulled together during the development and build process.

If, for example, you need to use immutable collections, you can install the System.Collections.Immutable package via NuGet. The NuGet version will also align with the assembly version, and will use semantic versioning.
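
As a small illustrative example (not from the original post), once the System.Collections.Immutable package is referenced you can use the immutable collection types directly:

using System;
using System.Collections.Immutable;

public class Program
{
    public static void Main(string[] args)
    {
        var numbers = ImmutableList.Create(1, 2, 3);

        // Add returns a new list; the original instance is never mutated.
        var more = numbers.Add(4);

        Console.WriteLine(numbers.Count); // 3
        Console.WriteLine(more.Count);    // 4
    }
}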

 


Open Source and Cross-Platform

Last year, the .NET Core main repositories were made open source: CoreFX (framework libraries) and CoreCLR (runtime) are public on GitHub. The main reason for this is to leverage a stronger ecosystem and lay the foundation for a cross-platform .NET, and it is a natural progression of the .NET Foundation’s current open source efforts.

However, as of only a few months ago (April), you can install .NET Core on Windows, Linux and OSX. This makes code written for it portable across application stacks (such as Mono) and platforms, making it feasible to move applications across different environments with ease.

Windows_Linux_OSX

Further reading

http://blogs.msdn.com/b/dotnet/archive/2014/12/04/introducing-net-core.aspx

http://blogs.msdn.com/b/dotnet/archive/2014/11/12/net-core-is-open-source.aspx

http://dotnet.github.io/core/

http://dotnet.github.io/core/about/

http://dotnet.readthedocs.org/en/latest/getting-started/overview.html

http://dotnet.github.io/core/about/overview.html

.NET Core: Cross Platform – Windows

.NET Core: Cross Platform – OS X