Rosner Technologies

Effective Solutions for Productive Business

Latest Business and Technology News


Introducing ASP.NET 5

Published Mon, 23 Feb 2015 20:41:56 GMT

The first preview release of ASP.NET 1.0 came out almost 15 years ago.  Since then millions of developers have used it to build and run great web applications, and over the years we have added and evolved many, many capabilities to it. 

I'm excited today to post about a new release of ASP.NET that we are working on, called ASP.NET 5.  This new release is one of the most significant architectural updates we've done to ASP.NET.  As part of this release we are making ASP.NET leaner, more modular, cross-platform, and cloud optimized.  ASP.NET 5 is now available as a preview release, and you can start using it today by downloading the latest CTP of Visual Studio 2015, which we just made available.

ASP.NET 5 is an open source web framework for building modern web applications that can be developed and run on Windows, Linux and the Mac. It includes the MVC 6 framework, which now combines the features of MVC and Web API into a single web programming framework.  ASP.NET 5 will also be the basis for SignalR 3 - enabling you to add real-time functionality to cloud-connected applications. ASP.NET 5 is built on the .NET Core runtime, but it can also be run on the full .NET Framework for maximum compatibility.

With ASP.NET 5 we are making a number of architectural changes that make the core web framework much leaner (it no longer requires System.Web.dll) and more modular (almost all features are now implemented as NuGet packages - allowing you to optimize your app to have just what you need).  With ASP.NET 5 you gain the following foundational improvements:

  • Build and run cross-platform ASP.NET apps on Windows, Mac and Linux
  • Built on .NET Core, which supports true side-by-side app versioning
  • New tooling that simplifies modern Web development
  • Single aligned web stack for Web UI and Web APIs
  • Cloud-ready environment-based configuration
  • Integrated support for creating and using NuGet packages
  • Built-in support for dependency injection
  • Ability to host on IIS or self-host in your own process

The end result is an ASP.NET that you'll feel very familiar with, and which is also now even more tuned for modern web development.

Flexible, Cross-Platform Runtime

ASP.NET 5 works with two runtime environments to give you greater flexibility when hosting your app. The two runtime choices are:

.NET Core – a new, modular, cross-platform runtime with a smaller footprint.  When you target .NET Core, you’ll be able to take advantage of some exciting new benefits:

1) You can deploy the .NET Core runtime with your app which means your app will run with this deployed version of the runtime rather than the version of the runtime that is installed on the host operating system. Your version of the runtime runs side-by-side with versions for other apps. You can update that runtime, if needed, without affecting other apps, or you can continue running on the same version even though other apps on the system have been updated.  This makes app deployment and framework updates much easier and less impactful to other apps running on a system.

2) Your app is only dependent on features it really needs. Therefore, you are never prompted to update/service the runtime for features that are not relevant to your app. You will spend less time testing and deploying updates that are perhaps unrelated to the functionality of your app.

3) Your app can now be run cross-platform. We will provide a cross-platform version of .NET Core for Windows, Linux and Mac OS X systems.  Regardless of which operating system you use for development or which operating system you target for deployment, you will be able to use .NET. The cross-platform version of the runtime has not been released yet, but we are working on it on GitHub and plan to have an official preview of it out soon.

.NET Framework – The API for .NET Core is currently more limited than the full .NET Framework, so you may need to modify existing apps to target .NET Core. If you don't want to have to update your app you can instead run ASP.NET 5 applications on the full .NET Framework (version 4.5.2 and above).  When doing this you have access to the complete set of .NET Framework APIs. Your existing applications and libraries will work without modification on this runtime.

MVC 6 - a unified programming model

MVC, Web API and Web Pages provide complementary functionality and are frequently used together when developing a solution. However, in past ASP.NET releases, these programming frameworks were implemented separately and therefore contained some duplication and inconsistencies. With MVC 6, we are merging those models into a single programming model. Now, you can create a single web application that handles the Web UI and data services without needing to reconcile differences in these programming frameworks. You will also be able to seamlessly transition a simple site first developed with Web Pages into a more robust MVC application.

You can now return Razor views and content-negotiated data from the same controller, using the same MVC filter pipeline.
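
For example, a single MVC 6 controller can return both a Razor view and a content-negotiated result. Here is a minimal sketch - the Product type and its sample data are made up purely for illustration, and the namespaces reflect the preview-era Microsoft.AspNet.* packages:

using System.Collections.Generic;
using Microsoft.AspNet.Mvc;

public class ProductsController : Controller
{
    // Hypothetical in-memory data, just for illustration.
    private static readonly List<Product> _products = new List<Product>
    {
        new Product { Id = 1, Name = "Widget" },
        new Product { Id = 2, Name = "Gadget" }
    };

    // Returns a Razor view for browser clients.
    public IActionResult Index()
    {
        return View(_products);
    }

    // Returns a content-negotiated result (for example JSON) for API clients.
    public IEnumerable<Product> GetAll()
    {
        return _products;
    }
}

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

Both actions run through the same routing, filter, and dependency injection infrastructure.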

In addition to unifying the existing frameworks we are also adding new features to make server-side Web development easier, like the new tag helpers feature. Tag helpers let you use HTML helpers in your views by simply extending the semantics of tags in your markup.

So instead of writing this:

@Html.ValidationSummary(true, "", new { @class = "text-danger" })
<div class="form-group">
    @Html.LabelFor(m => m.UserName, new { @class = "col-md-2 control-label" })
    <div class="col-md-10">
        @Html.TextBoxFor(m => m.UserName, new { @class = "form-control" })
        @Html.ValidationMessageFor(m => m.UserName, "", new { @class = "text-danger" })
    </div>
</div>

You can instead write this:

<div asp-validation-summary="ModelOnly" class="text-danger"></div>
<div class="form-group">
    <label asp-for="UserName" class="col-md-2 control-label"></label>
    <div class="col-md-10">
        <input asp-for="UserName" class="form-control" />
        <span asp-validation-for="UserName" class="text-danger"></span>
    </div>
</div>

Tag helpers make authoring your views more natural and readable. They also simplify customizing the output of HTML helpers with additional markup while letting you take full advantage of the HTML editor.

For more examples of creating MVC 6 apps, see these tutorials.

Modern web development

This week's ASP.NET 5 preview also includes a number of other great development features that enable you to build even better web applications:

Dynamic Development

In Visual Studio 2015, we take advantage of dynamic compilation to provide a streamlined developer experience. You no longer have to compile your application every time you want to see a change. Instead, just (1) edit the code, (2) save your changes, (3) refresh the browser, and then (4) see your change automatically appear.

[Image: edit, save, and browser-refresh workflow in Visual Studio 2015]

You enjoy a development experience that is similar to working with an interpreted language without sacrificing the benefits of a compiled language.

You can also optionally use other code editors to work on your ASP.NET 5 projects. Every function within the Visual Studio user interface is matched with cross-platform command-line operations.

Integration with Popular Web Development Tools (Bower, Grunt and Gulp)

Another exciting feature in Visual Studio 2015 is built-in support for Bower, Grunt, and Gulp - popular open source tools that we think should be in every Web developer’s toolkit.

  • Bower is a package manager for client-side libraries, including both JavaScript and CSS libraries.
  • Grunt and Gulp are task runners, which help you to automate your web development workflow. You can use Grunt or Gulp for tasks like compiling LESS, CoffeeScript, or TypeScript files, running JSLint, or minifying JavaScript files.

Bower: To add a JavaScript library to your ASP.NET project, add it directly to the bower.json config file:

[Screenshot: bower.json in Visual Studio with IntelliSense listing available packages]

Notice that Visual Studio gives you IntelliSense with a list of available packages. The next time you open the solution, Visual Studio automatically restores any missing packages, so you don’t need to check the packages into source control.

For server-side packages, you’ll still use NuGet Package Manager.

Grunt: In modern web development, you can find yourself managing a lot of tasks, just to build your app: Compiling LESS, TypeScript, or CoffeeScript files, linting, JavaScript minification, running JS unit tests, and so on. Every team will have its own set of requirements, depending on the particular tools that you use. Task runners make it easier to manage and coordinate these tasks. Visual Studio 2015 will support two popular task runners, Grunt and Gulp.

For example, let’s say you want to use Grunt to compile LESS files. Just go into package.json and add the grunt-contrib-less package, which is a third-party Grunt plugin.

[Screenshot: adding the grunt-contrib-less package to package.json]

Use the new Task Runner Explorer in Visual Studio 2015 to bind the task to a build step (pre-build, post-build, clean, or when the solution is opened).

[Screenshot: Task Runner Explorer binding a Grunt task to a build step]

This makes it incredibly easy to automate common tasks within your projects - and have them work both for you and across a team-wide project.

Simplified dependency management

In ASP.NET 5 you manage dependencies by adding NuGet packages. You can use the NuGet Package Manager or simply edit the JSON file (project.json) that lists the NuGet packages and versions used in your project. The project.json file is easy to work with and you can edit it with any text editor, which enables you to update dependencies even when the app has been deployed to the cloud.

The project.json file looks like:

[Screenshot: a sample project.json file]

In Visual Studio 2015, IntelliSense assists you with finding the available NuGet packages that you can add as dependencies.

[Screenshot: IntelliSense listing available NuGet packages in project.json]

And IntelliSense can even help you with the available versions:

[Screenshot: IntelliSense listing available package versions]

Cloud-ready configuration

In ASP.NET 5, we eliminated the need to use the Web.config file for configuration values. We wanted to make it easier for you to deploy your app to the cloud and have the app automatically read the correct configuration values for that environment. The new system enables you to request named values from a variety of sources (such as JSON, XML, or environment variables). You can decide which formats work best in your situation.

In the Startup.cs file, you can now add or remove the sources for configuration values.

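Something along these lines - a sketch only, since the configuration API changed between preview builds; the class and method names below come from the preview-era Microsoft.Framework.ConfigurationModel package and should be treated as approximate:

using Microsoft.Framework.ConfigurationModel;

public class Startup
{
    public IConfiguration Configuration { get; private set; }

    public Startup()
    {
        // Read settings from config.json first, then let environment variables
        // (for example, values set in the Azure portal) override them.
        Configuration = new Configuration()
            .AddJsonFile("config.json")
            .AddEnvironmentVariables();
    }

    // ...
}

Individual values can then be read back using colon-delimited keys, for example Configuration.Get("Data:DefaultConnection:ConnectionString").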

The above code snippet shows a project that is set up to retrieve configuration values from a JSON file and environment variables. You can change this code if you need to specify other sources. In the specified config.json file, you could provide the values.

[Screenshot: configuration values in config.json]

In your host environment, such as Azure, you can set the environment variables and those values are automatically used instead of local configuration values after the application is deployed. You can deploy your application without worrying about publishing test values.

Dependency injection (DI)

Dependency Injection (DI) is supported in existing ASP.NET frameworks, like MVC, Web API and SignalR, but not in a consistent and holistic way. ASP.NET 5 provides a built-in DI abstraction that is available in a consistent way throughout the entire web stack. You can access services at startup, in middleware, in filters, in controllers, in model binding and virtually any part of the pipeline where you want to use your services. ASP.NET 5 includes a minimalistic DI container to bootstrap the system, but you can easily replace the default container with your container of choice (Autofac, Ninject, etc). Services can be singleton, scoped to the request or transient.
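
The different lifetimes are registered in the ConfigureServices method of your Startup class along these lines (a minimal sketch - the IEmailSender, ICartService and IRequestLogger abstractions and their implementations are hypothetical placeholders for your own services):

public void ConfigureServices(IServiceCollection services)
{
    // One shared instance for the lifetime of the application.
    services.AddSingleton<IEmailSender, SmtpEmailSender>();

    // One instance per HTTP request.
    services.AddScoped<ICartService, CartService>();

    // A new instance every time the service is requested.
    services.AddTransient<IRequestLogger, RequestLogger>();
}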

For example, to see how to use constructor injection with ASP.NET MVC 6, create a new ASP.NET 5 Starter Web project and add a simple time service:

using System;

namespace WebApplication1
{
    public class TimeService
    {
        public TimeService()
        {
            Ticks = DateTime.Now.Ticks.ToString();
        }

        public String Ticks { get; set; }
    }
}

The simple service class sets the current Ticks when the constructor is called.

Next, register the time service as a transient service in the ConfigureServices method of the Startup class:

public void ConfigureServices(IServiceCollection services)
{
    services.AddMvc();
    services.AddTransient<TimeService>();
}

Then, update the HomeController to use constructor injection and to write the Ticks when the TimeService object was created.

public class HomeController : Controller
{
    public TimeService TimeService { get; set; }

    public HomeController(TimeService timeService)
    {
        TimeService = timeService;
    }

    public IActionResult About()
    {
        ViewBag.Message = TimeService.Ticks + " From Controller";
        System.Threading.Thread.Sleep(1);
        return View();
    }

    // Code removed for brevity
}

Notice the controller doesn't create a TimeService. It's injected when the controller is instantiated.

In MVC 6 you can use the [Activate] attribute to inject services via properties. You can use [Activate] not just on controllers but also on filters and view components. This means you can simplify your controller code like this:

public class HomeController : Controller
{
    [Activate]
    public TimeService TimeService { get; set; }

    // Code removed for brevity
}

MVC 6 also supports DI into Razor views via the @inject keyword. In the code below, I’ve injected the time service into the about view directly and defined a TimeSvc property by which it can be accessed:

@using WebApplication1
@inject TimeService TimeSvc

<h3>@ViewBag.Message</h3>

<h3>
    @TimeSvc.Ticks From Razor
</h3>

When you run the app, you can see different ticks values from the controller and the view.

[Screenshot: different ticks values shown from the controller and the view]

Fast HTTP performance

ASP.NET 5 introduces a new HTTP request pipeline that is modular so you can add only the components that you need. The pipeline is also no longer dependent on System.Web. By reducing the overhead in the pipeline, your app can experience better throughput and a more tuned HTTP stack. The new pipeline is based on many of the learnings from the Katana project and also supports OWIN.

To customize which components are used in the pipeline, use the Configure method in your Startup class. The Configure method is used to specify which middleware you want to “use” in your request pipeline. ASP.NET 5 already includes ported versions of many of the middleware from the Katana project, like middleware for static files, authentication and diagnostics. The following code shows some of the middleware you can add to or remove from the pipeline for your project.

public void Configure(IApplicationBuilder app)
{
    // Add static files to the request pipeline.
    app.UseStaticFiles();

    // Add cookie-based authentication to the request pipeline.
    app.UseIdentity();

    // Add MVC and routing to the request pipeline.
    app.UseMvc(routes =>
    {
        routes.MapRoute(
            name: "default",
            template: "{controller}/{action}/{id?}",
            defaults: new { controller = "Home", action = "Index" });
    });
}

You can also write your own middleware components and add them to the pipeline.
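
A middleware component is simply a class with a constructor that takes the next RequestDelegate in the pipeline and an Invoke method that processes the HttpContext. Here is a rough sketch - the timing logic is illustrative only, and the namespaces reflect the preview-era Microsoft.AspNet.* packages, so treat them as approximate:

using System.Diagnostics;
using System.Threading.Tasks;
using Microsoft.AspNet.Builder;
using Microsoft.AspNet.Http;

public class RequestTimingMiddleware
{
    private readonly RequestDelegate _next;

    public RequestTimingMiddleware(RequestDelegate next)
    {
        _next = next;
    }

    public async Task Invoke(HttpContext context)
    {
        var watch = Stopwatch.StartNew();

        // Call the next component in the pipeline.
        await _next(context);

        watch.Stop();
        Debug.WriteLine("Handled {0} in {1} ms", context.Request.Path, watch.ElapsedMilliseconds);
    }
}

You would then register it in the Configure method shown above, for example with app.UseMiddleware<RequestTimingMiddleware>(), ahead of the components you want to time.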

Open source

We are developing ASP.NET 5 as an open source project on GitHub. You can view the code, see when changes were made, download the code, and submit changes. We believe making ASP.NET 5 open source will make it easier for you to understand the code, understand our intended direction, and contribute to the project.

[Screenshot: the ASP.NET 5 source repository on GitHub]

Docs and tutorials

To get started with ASP.NET 5 you can find docs and tutorials on the ASP.NET site at http://asp.net/vnext. The following tutorials will guide you through the steps of creating your first ASP.NET 5 project.

Also read this article for even more ASP.NET and Web Development improvements coming this week.

Hope this helps,

Scott



Azure: Machine Learning Service, Hadoop Storm, Cluster Scaling, Linux Support, Site Recovery and More

Published Wed, 18 Feb 2015 16:06:22 GMT

Today we released a number of great enhancements to Microsoft Azure. These include:

  • Machine Learning: General Availability of the Azure Machine Learning Service
  • Hadoop: General Availability of Apache Storm Support, Hadoop 2.6 support, Cluster Scaling, Node Size Selection and preview of next Linux OS support
  • Site Recovery: General Availability of DR capabilities with SAN arrays

I've also included details in this blog post of other great Azure features that went live earlier this month:

  • SQL Database: General Availability of SQL Database (V12)
  • Web Sites: Support for Slot Settings
  • API Management: New Premium Tier
  • DocumentDB: New Asia and US Regions, SQL Parameterization and Increased Account Limits
  • Search: Portal Enhancements, Suggestions & Scoring, New Regions
  • Media: General Availability of Content Protection Service for Azure Media Services
  • Management: General Availability of the Azure Resource Manager

All of these improvements are now available to use immediately (note that some features are still in preview).  Below are more details about them:

Machine Learning: General Availability of Azure ML Service

Today, I’m excited to announce the General Availability of our Azure Machine Learning service.  The Azure Machine Learning Service is a powerful cloud-based predictive analytics service that makes it possible to quickly create analytics solutions.  It is a fully managed service - which means you do not need to buy any hardware or manage VMs manually.

Data Scientists and Developers can use our innovative browser-based machine learning IDE to quickly create and automate machine learning workflows.  You can literally drag/drop hundreds of existing ML libraries to jump-start your predictive analytics solutions, and then optionally add your own custom R and Python scripts to extend them.  Our Machine Learning IDE works in any browser and enables you to rapidly develop and iterate on solutions:

[Screenshot: the browser-based Azure Machine Learning IDE]

With today's General Availability release you can easily discover and create web services, train/retrain your models through APIs, manage endpoints and scale web services on a per customer basis, and configure diagnostics for service monitoring and debugging. Additional new capabilities with today's release include:

  • The ability to create a configurable custom R module, incorporate your own train/predict R scripts, and add Python scripts using a large ecosystem of libraries such as numpy, scipy, pandas, and scikit-learn. You can now train on terabytes of data using “Learning with Counts”, use PCA or one-class SVM for anomaly detection, and easily modify, filter, and clean data using familiar SQLite.
  • Azure ML Community Gallery that allows you to discover and learn from experiments, and share them through Twitter and LinkedIn. You can purchase marketplace apps through an Azure subscription and consume finished web services for Recommendation, Text Analytics, and Anomaly Detection directly from the Azure Marketplace.
  • A step-by-step guide for the Data Science journey from raw data to a consumable web service to ease the path for cloud-based data science. We have added the ability to use popular tools such as iPython Notebook and Python Tools for Visual Studio along with Azure ML.

Get Started

You can learn the basics of predictive analytics and machine learning using our step-by-step data science guide and tutorials.  No sign-up or credit card is required to get started using Azure Machine Learning (you can use the machine learning IDE and try experiments for free):

[Screenshot: getting started with Azure Machine Learning for free]

Also browse our machine learning gallery to run existing machine learning experiments others have already built - and optionally publish your own experiments for others to learn from:

[Screenshot: the Azure Machine Learning gallery]

Machine Learning and predictive analytics will fundamentally change the way all applications are built in the future.  The new Azure Machine Learning service provides an incredibly powerful and easy way to achieve this.  Start using it for production apps today!

HDInsight: General Availability of Apache Storm, Cluster Scaling, Hadoop 2.6, Node Sizes, and Preview of HDInsight on Linux

Today I’m happy to also announce several major enhancements to HDInsight, our managed Hadoop service for powering Big Data workloads in Azure.

General Availability of Apache Storm support

With today's release, we are making it easy for you to do real-time streaming analytics using Hadoop by providing Apache Storm as a fully managed Service and making it generally available on HDInsight. This makes it incredibly easy to stand up and manage Storm clusters. As part of the Storm service on HDInsight we have improved productivity by enabling some key features:

  • Integration with our Azure Event Hubs service - which allows you to easily process any data that is collected via Event Hubs
  • First class .NET experience on top of Apache Storm giving you the option to use both Java and .NET with it
  • A library of spouts and bolts that lets you easily integrate other Azure services like SQL, HBase and DocumentDB
  • Visual Studio integration that makes it easy for developers to do full project management from within the Visual Studio environment

Creating a Storm cluster and running a sample topology

You can easily spin up a new Storm cluster from the Azure management portal. The Storm Dashboard allows you to either upload an existing Storm topology or pick one of the sample topologies from the dropdown.  Topologies can be authored in code, or higher level programming models like Trident can be used. You can also monitor and manage all the topologies that are currently on your cluster via the Storm Dashboard.

[Screenshot: the Storm Dashboard for an HDInsight cluster]

.NET Topologies and a Visual Studio Experience

One of the big improvements we have made on top of Storm is to enable developers to write Storm topologies in .NET. One of the things I am particularly excited about with the Storm release is the Visual Studio experience that we have enabled for Storm on HDInsight. With the latest version of the Azure SDK, you will get Storm project templates under HDInsight. These will quickly get you started with writing Storm topologies without having to worry about setting up the right references or writing the skeleton code that is needed for every Storm topology.

Since Storm is available as part of the HDInsight service, all HDInsight features also apply to Storm clusters. For example, you can easily scale up or scale down a Storm cluster with no impact to the existing running topologies. This enables you to easily grow or shrink Storm clusters depending on data ingest speed and latency requirements, with no impact on the data that is being processed.  At the time of cluster creation you can pick from a long list of available VM sizes to use for your Storm cluster on HDInsight.

HDInsight 3.2 Support

I’m pleased to announce the availability of the next major version of Hadoop in HDInsight clusters for Windows and Linux. This includes Hadoop 2.6, Hive 0.14, and substantial updates to all of the components in the stack.  Hive 0.14 contains work to improve performance and scalability through Tez, adds a powerful cost based optimizer, and introduces capabilities for handling UPDATE, INSERT and DELETE SQL statements, temporary tables which live for the duration of a development session and more. You can find more details on the Hive 0.14 release here.   Pig 0.14 adds support for ORC, allowing a single high performance format to be leveraged across Pig and Hive.  Additionally Pig can now target Tez instead of Map/Reduce, resulting in substantial performance improvements by changing the execution engine. Details on the Pig 0.14 release are here.  These bring the latest improvements in the open source ecosystem to HDInsight. 

To get started with a 3.2 cluster, use the Azure Management portal or the command-line. In addition to the VS tools for Storm, we've also updated the VS tools to include Hive query authoring.  We've also added improved statement completion, local validation, access in Visual Studio to the YARN task logs, and support for HDInsight clusters on Linux. In order to get these, you just need to install the Azure SDK for Visual Studio which contains the latest HDInsight tooling.

Cluster Scaling

Many of our customers have asked for the ability to change HDInsight cluster sizes on the fly.  This capability is now accessible in both the Azure portal, as well as through the command line and SDK's.  You can grow or shrink a Hadoop cluster to fit your workload by simply dragging the sizing slider.  We'll add more nodes to your cluster while it is processing and when your larger jobs are done, you can reduce the size of the cluster.  If you need more cores available in your subscription, you can open a Billing support ticket to request a larger quota. 

Node Size Selection

Finally, you can also now specify the VM sizes for the nodes within your HDInsight cluster.  This lets you optimize your cluster's resources to fit your workload.  We've made the entire A and D series of VM sizes available.  For each of the different types of roles within a cluster, we'll let you specify the machine type.  This allows you to tune the amount of CPU, RAM and SSD available to your jobs. 

HDInsight on Linux

Today we are also releasing a preview version of our HDInsight service that allows you to deploy HDInsight clusters using Ubuntu Linux containers.  This expands the operating system options you can use when running managed Hadoop workloads on Azure (previously HDInsight only supported Windows Server containers).

The new Linux support enables you to easily use familiar tools like SSH and Ambari to build Big Data workloads in Azure.  HDInsight on Linux clusters are built on the same Hadoop distribution as the Windows clusters, are fully integrated with Azure storage, and make it easy for customers leveraging Hadoop to take advantage of the SLA, management and support that HDInsight offers.  To get started, sign up for the preview here.  You can then easily create Linux clusters using the Azure Management Portal or via our command-line interfaces.

SSH connectivity to your HDInsight clusters is enabled by default for all HDInsight on Linux clusters. You can use an SSH client of your choice to connect to the cluster.  Additionally, SSH tunneling can be leveraged for forwarding traffic from your browser to all of the Hadoop web applications.

Learn More

For more information about Azure HDInsight, check out the following resources:

Site Recovery: General Availability of Enterprise DR with SANs

With today’s Azure release, we are also adding another significant capability to Azure Site Recovery’s disaster recovery and replication portfolio. Enterprises that seek to leverage their Storage Area Network (SAN) Arrays to enable high performance synchronous and asynchronous replication across their on-premises Hyper-V private clouds can now orchestrate end-to-end storage array-based replication and disaster recovery with Azure Site Recovery and System Center Virtual Machine Manager (SCVMM).

The addition of SAN as a replication channel enables key scenarios such as Synchronous Replication, Multi-VM Consistency, and support for Guest Clusters with Azure Site Recovery. With support for Shared VHDX and iSCSI Target LUNs, ASR will now be able to better meet the needs of enterprise-class applications such as SQL Server, SharePoint, and SAP.

To enable SAN Replication, in the Azure Management Portal select SAN when configuring SCVMM clouds in ASR. ASR in turn validates that the cloud being configured has host clusters that have been correctly zoned to a Storage Array, either via Fibre Channel or iSCSI. Once the cloud configuration is complete and the storage pools have been mapped, Replication Groups (group of storage LUNs that replicate together and thereby enable multi-VM replication consistency) can be enabled for replication. ASR automates the creation of target LUNs, target Replication Groups, and starts the array-based replication. 

Here’s an example of a Recovery Plan that can failover a SQL Guest Cluster deployed on a Replication Group:

[Screenshot: a Recovery Plan that fails over a SQL Guest Cluster deployed on a Replication Group]

Learn More

Visit the Azure Site Recovery forum on MSDN for additional information.

Getting started with Azure Site Recovery is easy - all you need is to simply sign up for a free Microsoft Azure trial.

SQL Database: General Availability of SQL Database (V12)

Earlier this month we released the general availability of our SQL Database (V12) service.  We introduced a preview of this new release last December, and it includes a ton of new capabilities. These include:

  • Better management of large databases. We now support heavier database workload management with parallel queries, table partitioning, online indexing, worry-free large index rebuilds with the previous 2GB size limit removed, and more alter database commands.

  • Support for more programmability capabilities: You can now build even more robust applications with CLR, T-SQL Windows functions, XML index, and change tracking support.

  • Up to 100x performance improvements with support for In-memory columnstore queries for data mart and analytic workloads.

  • Improved monitoring and troubleshooting: Extended Events (XEvents) and visibility into over 100 new table views via an expanded set of Database Management Views (DMVs).

  • New S3 performance level: Today's preview introduces a new pricing option for SQL Databases. The new "S3" performance tier delivers 100 DTU of performance (twice the DTU level of the existing S2 tier) and all of the features available in the Standard tier. It enables an even more cost effective way to run applications with higher performance needs.

You can now take advantage of all of these features in general availability - with all databases backed by an enterprise grade SLA.

Upcoming Security Features

I'm also excited to announce a number of new security features that will start rolling out this month and this spring.  These features will help customers better protect their cloud data and help further meet corporate and industry compliance policies. These security enhancements include:

  • Row-Level Security
  • Dynamic Data Masking
  • Transparent Data Encryption

Available in preview today, customers can now implement Row-Level Security on databases to enable fine-grained access control over rows in a database table, giving greater control over which users can access which data.

Coming soon, SQL Database will introduce Dynamic Data Masking which is a policy-based security feature that helps limit the exposure of data in a database by returning masked data to non-privileged users who run queries over designated database fields, like credit card numbers, without changing data on the database. Finally, Transparent Data Encryption is coming soon to SQL Database V12 databases for encryption at rest on all databases.

Stay tuned over the coming months for details as we continue to rollout the V12 service general availability and upcoming security features.

Web Sites: Support for Slot Settings

The Azure Web Sites service has always provided the ability to store application settings and connection strings as a part of your Web Site’s metadata.  Those settings become available at runtime via environment variables and, if you use .NET, the standard configuration manager API.  This feature has now been updated to work better with another Web Sites feature: deployment slots. 

Deployment slots provide an easy way for you to safely deploy and test new releases of your web applications prior to swapping them live into production.  Let’s say you have a website called mysite.azurewebsites.net with a deployment slot at mysite-staging.azurewebsites.net.  You can swap these slots at any given time, and with no downtime. This provides a nice infrastructure for upgrading your website. Until now, when you swapped the staging slot with the production site, all settings and connection strings would swap as well. Sometimes that’s exactly what you want and it works great. 

But what if, for testing purposes, your site uses a database and you explicitly want each slot to have its own database (e.g. a production database and a testing database)?  Prior to this month's release that would have been difficult to automate since the swap operation would move the staging connection string to the production site and vice versa. You would have to do something unnatural like going to the staging slot and manually updating the settings to the production values before performing the swap operation. Then, you would execute the swap, and finally manually update the staging settings to point to the staging database. That workflow is very complicated and error prone.  

New Slot Settings Support

Slot specific settings are the solution to this problem.  Simply go to the Azure Preview Portal, navigate to your Web Site’s Settings page, and you’ll see a new checkbox next to each app setting and connection string.  Check the boxes next to each app setting and/or connection string that should not participate in swap operations.  Each deployment slot has its own version of this settings page where you can go and enter the slot specific setting values.  You now have a lot more flexibility when it comes to managing deployment slots and flowing configuration between them during swaps:

[Screenshot: slot-specific app settings and connection strings in the Azure Preview Portal]

API Management: New Premium Tier

Earlier this month we released a preview of our new Premium Tier for our API Management Service.  The Azure API Management Service provides a great offering that helps customers expose web-based APIs to their consumers - and provides support for API protection via rate-limiting, quotas and keys, detailed analytics, easy developer on-boarding and much more.

As the strategic value of APIs increases, customers are demanding even more performance, higher availability and more enterprise-grade features. And in response we're delighted to introduce a new Premium tier of API Management which will offer a 99.95% SLA after preview and includes a number of key new features:

Multiple Geography Deployment

Until now each API Management service resided in a single region selected when the service is created. I’m pleased to announce the introduction of a new multi-region deployment feature that allows API publishers to easily distribute a single API Management service across any number of Azure regions. Customers who want to reduce latency for distributed API consumers and require extremely high availability can now enable multi-geo with minimal configuration.

[Screenshot: multi-region deployment options for an API Management service]

Premium tier customers will now see an updated capacity section on the scale tab of the Azure Management portal. Additional units and regions can be added with a few clicks of the relevant dropdown controls and API Management will provision additional proxies beyond the primary region in a matter of minutes.

Multi-geo is particularly effective when combined with the API Management caching policy, which can provide a CDN-like capability for your mission critical and performance sensitive APIs. For more information on multiple-geography deployment, check out the documentation.

Azure Virtual Network / VPN integration

Many customers are already managing their on-premises APIs using API Management's mutual certificate authentication to secure their backend. The new Premium offering introduces a great new capability for organizations that prefer to use a VPN solution or want to leverage their Azure ExpressRoute connection. Available in the Premium Tier, VPN connectivity settings are available on the configure tab of the Azure Management Portal and can even be combined with multi-geo, with a separate VPN for each region. More information is available in the documentation.

[Screenshot: VPN connectivity settings for API Management]

Active Directory Integration

Prior to today’s release, API Management's developer portal allowed developers to self-serve sign up using a custom account created with their e-mail address or using popular social identity providers like Facebook, Twitter, Google and Microsoft account. Sometimes businesses and enterprises want more control and would like to restrict sign in options, often preferring Azure Active Directory.

With our latest release, we now allow you to configure Azure Active Directory as an identity provider for Azure API Management. Administrators can disable all other identity providers and restrict access to APIs and documentation based on AD group membership. What's more, access can be extended to allow multiple AAD tenants to access your developer portal, making it even easier to share your APIs with business partners.

[Screenshot: configuring Azure Active Directory as an identity provider for API Management]

Learning More

Check out the Azure Active Directory documentation for more information on the integration, and the pricing page for more information on the new premium tier.

DocumentDB: New Asia and US Regions, SQL Parameterization and Increased Account Limits

Earlier this month we released the following new features and capabilities in our Azure DocumentDB service - which provides a fully managed NoSQL JSON database service:

  • New regional availability
  • Larger accounts and documents: Increased the number of capacity units per account, and doubled the supported document size
  • SQL parameterization: Support for handling and escaping user input, preventing accidental exposure of data

New Regions

We have added new support for provisioning DocumentDB accounts in the East Asia, Southeast Asia, and US East Azure regions (in addition to our existing US West, East Europe and West Europe regions). We’ll continue to invest in regional expansion in order to give you the flexibility and choice you need when deciding where to locate your DocumentDB data.

Larger Accounts and Documents

Throughout the preview process we’ve steadily increased the maximum document and database sizes.  With this month's release we've increased the maximum size of an individual document from 256 KB to 512 KB. The Capacity Unit (CU) limit per DocumentDB Account has also been raised from 5 to 50 which means you can now scale a single DocumentDB account to 500GB of storage and 100,000 Request Units of provisioned throughput. As always, our preview quotas can be adjusted on a per account basis - contact us if you have a need for increased capacity.

SQL Parameterization

Instead of inventing a new query language, DocumentDB supports querying documents using SQL (Structured Query Language) over hierarchical JSON documents. We are pleased to announce that we have extended our SQL query capabilities by adding support for parameterized SQL queries in the Azure DocumentDB REST API and SDKs. Using this feature, you can now write parameterized SQL queries. Parameterized SQL provides robust handling and escaping of user input, preventing accidental exposure of data through “SQL injection”.

Let’s take a look at a sample using the .NET SDK. In addition to plain SQL strings and LINQ expressions, we’ve added a new SqlQuerySpec class that can be used to build parameterized queries.  Here’s a sample that queries a “Books” collection with a single user supplied parameter for author name:

IQueryable<Book> queryable = client.CreateDocumentQuery<Book>(
    collectionSelfLink,
    new SqlQuerySpec
    {
        QueryText = "SELECT * FROM books b WHERE (b.Author.Name = @name)",
        Parameters = new SqlParameterCollection()
        {
            new SqlParameter("@name", "Herman Melville")
        }
    });

Note:

  • SQL parameters in DocumentDB use the familiar @ notation borrowed from T-SQL
  • Parameter values can be any valid JSON (strings, numbers, Booleans, null, even arrays or nested JSON)
  • Since DocumentDB is schema-less, parameters are not validated against any type
  • You could just as easily supply additional parameters by adding additional SqlParameters to the SqlParameterCollection
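
Enumerating the returned IQueryable executes the query. For example (a minimal usage sketch that reuses the Book type from the sample above):

foreach (Book book in queryable)
{
    Console.WriteLine(book.Author.Name);
}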

The DocumentDB REST API also natively supports parameterization. The .NET sample shown above translates to the following REST API call. To use parameterized queries, you need to specify the Content-Type Header as application/query+json and the query as JSON in the body, as shown below.

POST https://contosomarketing.documents.azure.com/dbs/XP0mAA==/colls/XP0mAJ3H-AA=/docs HTTP/1.1
x-ms-documentdb-isquery: True
x-ms-date: Mon, 18 Aug 2014 13:05:49 GMT
authorization: type%3dmaster%26ver%3d1.0%26sig%3dkOU%2bBn2vkvIlHypfE8AA5fulpn8zKjLwdrxBqyg0YGQ%3d
x-ms-version: 2014-08-21
Accept: application/json
Content-Type: application/query+json
Host: contosomarketing.documents.azure.com
Content-Length: 50

{
    "query": "SELECT * FROM books b WHERE (b.Author.Name = @name)",
    "parameters": [
        { "name": "@name", "value": "Herman Melville" }
    ]
}

Queries can be issued against document collections, as well as system metadata collections like Databases, DocumentCollections, and Attachments using the approach shown above. To try this out, download the latest build of the DocumentDB SDK on any of the supported platforms (.NET, Java, Node.js, JavaScript, or Python).

As always, we’d love to hear from you about the DocumentDB features and experiences you would find most valuable. Submit your suggestions on the Microsoft Azure DocumentDB feedback forum.

Search: Portal Enhancements, Suggestions & Scoring, New Regions

Earlier this month we released a bunch of great enhancements to our Azure Search service.  Azure Search provides developers with all of the features needed to build out search experiences for web and mobile applications without having to deal with the typical complexities that come with managing, tuning and scaling a large search service.

Azure Portal Enhancements

Last month we added the ability to create and manage your search indexes from the Azure Preview Portal. Since then, you have told us that this has really helped to speed up development as it greatly reduced the amount of code required, but we also heard that you needed more. As a result, we extended the portal by adding the ability to add Scoring Profiles as well as configure Cross Origin Resource Sharing from the portal.

Portal Support of Scoring Profiles

Scoring Profiles boost items up in the search results based on different factors that you control. For example, below, I have a hotels index and, all other things being equal, I want highly rated hotels close to the user’s current location to appear at the top of the search results. To do this, in the Azure Preview Portal, choose Add Scoring Profile and provide a name for it. In this case I am going to call it “closeToUser”. You can create one or more scoring profiles and name them as needed in the search request, allowing you to provide different search results based on different use cases.

[Screenshot: adding the closeToUser scoring profile in the Azure Preview Portal]

Once closeToUser has been created, I can start adding weights and functions. For example, in this scoring profile, I chose to add:

  • Weighting: Use hotelName as a weighted field, such that if the search term is found in the hotelName, it gets a weighted boost
  • Distance: Leverage the spatial capabilities of Azure Search to boost a hotel if it is found to be closer to the user’s specified location
  • Magnitude: Provide a boost to the hotels that have higher ratings

All of these functions and weights are then combined into a final score that is used to rank documents.

[Screenshot: weights and functions configured for the closeToUser scoring profile]

Scoring profiles can often be tricky, and scoring logic tends to get mixed in with the rest of the query. With Azure Search, the scoring profile experience has been simplified: profiles are separated from search queries, so the scoring model stays outside of application code and can be updated independently. In addition, these scoring profiles are modeled as a set of high-level scoring functions combined with a way to do the typical field weights, making editing and maintenance of scoring much simpler.

As demonstrated above, this user experience requires no coding and you can simply choose the fields that are important and apply the function or weight that makes the most sense. It is important to note that scoring profiles are a method of boosting the relevance of a document and should not be confused with sorting. There are a number of other functions available, which you can learn more about in the MSDN documentation.

Cross Origin Resource Sharing (CORS)

Web Browsers commonly apply a same-origin restriction policy to network requests, preventing client-side web applications from issuing requests to another domain for security reasons. For example, JavaScript code that came from http://www.contoso.com could not issue a request to another domain such as http://www.northwindtraders.com. For Azure Search developers, this is important in cases where all the data is already publicly accessible and they want to save on latency by going straight to the search index from mobile devices or a browser.

CORS is a method that allows you to relax this restriction in a controlled way so you don’t compromise security. Azure Search uses CORS to allow JavaScript code inside browsers to make search requests directly to the Azure Search service and eliminate the need to proxy all requests through the originating server. We now offer the ability to configure CORS from the Azure Preview Portal, allowing you to easily enable cross-domain access and limit it to specific origins. This can be done from the index management portion of your search service as shown below.

[Screenshot: configuring CORS from the index management portion of the search service]

Tag Boosting

As discussed with Scoring Profiles, there are many examples of where you may want to boost certain relevant items. To this end, we have also introduced a new and highly requested function to our set of scoring profile functions called Tag Boosting. This feature is currently part of our experimental API version, made available to you so you can test and provide feedback on these potential new features.

Tag Boosting allows you to boost documents that have tags in common with the search query. The tags for the search query are provided as a scoring parameter in each search request, and any document that contains these terms gets a boost. This capability can not only be helpful for search result customization, but could also be used for cases where you have specific items you want to promote. As an example, during a sporting event, a retailer might want to promote items that are related to the teams participating in that sporting event.

Improved Suggestions

Suggestions (auto-complete) is a feature that allows you to provide type-ahead suggestions as the user types. Just like scoring profiles, this is a great way to allow your users to find the content they are looking for quickly. When we first implemented search suggestions in Azure Search, we heard a number of requests to extend the capabilities of this feature to better suit your requirements. As a result, we have an entirely new implementation of suggestions to address these items. In particular, it will do infix matching for suggestions and if fuzzy matching is enabled, it’ll show more flexibility for spelling mistakes. It also allows up to 100 suggestions per result, has no limit in length other than field limits and doesn’t have the 3-character minimum length.

This enhancement is still under the experimental API version as we are continuing to gather feedback. For more information on this and to see a more detailed example of suggestions, please see the post on the Suggestions in the Azure Blog.

New Regions

As a final note, I wanted to point out that we are continuing to expand the global footprint of Azure Search. With the addition of East Asia and West Europe you can now provision Azure Search services in 8 regions across the globe.

Media: General Availability of Content Protection Service

Earlier this month we released the general availability of our new Content Protection service for Azure Media Services. This is backed by an enterprise grade SLA for all customers.

We understand the importance of protecting your premium media content, and our robust new DRM offering features both static and dynamic encryption with first party PlayReady license delivery and an AES 128-bit key delivery service. You can either dynamically encrypt during delivery of your media or statically encrypt during the content processing workflow, and our content protection options are available for both live and on-demand workflows.

For more information on functionality and pricing, visit the Media Services Content Protection blog post, the Media Services Pricing webpage, or this Securing Media article.

Management: General Availability of the Azure Resource Manager

Earlier this month we reached general availability of the new Azure Resource Manager, and we now provide a worldwide SLA for the service. The Azure Resource Manager provides a core set of management capabilities that are fundamental to the Microsoft Azure Platform and form the basis of our new deployment and management model for all Azure services.  You can use the Azure Resource Manager to deploy and manage your Azure solutions at no cost.

The Azure Resource Manager provides a simple, and customizable experience to manage your applications running in Azure along with enterprise grade authentication and authorization capabilities. Benefits include:

Application Lifecycle Boundaries: Azure Resource Manager provides a deployment container called a Resource Group that serves as the lifecycle boundary of resources/services deployed in it - making it easy for you to deploy, manage and visualize services that are contained within it. You no longer have to deploy parts of your application à la carte and then stitch them together manually. A Resource Group container supports one-click deployment and tear down of the entire application in a single operation.

Enterprise Grade Access Control: OAuth and Role-Based Access Control (RBAC) are now natively integrated into Azure Management and consistently apply to all services supported by the Resource Manager. Access and operations performed on these services are also logged automatically to enable you to audit them later. You can now use a rich set of platform and resource specific roles that can be applied at the subscription, resource group, or resource level - giving you granular control over who has access to what operation within your organization.

Rich Tagging and Categorization: The Azure Resource Manager supports metadata tagging of resource groups and contained resources, and you can use this tagging support to group objects in ways suitable to your own needs such as management, billing or monitoring. For example, you could mark certain resources or resource groups as being "Dev/Test" and use that to help filter your resources or charge back their bills differently to internal groups in your organization.  This provides the power needed to manage and monitor departmental applications, subscriptions, and billing data in a more streamlined fashion, especially for larger organizations.

Declarative Deployment Templates: The new Azure Resource Manager supports both an imperative API as well as a declarative template model that you can use to deploy rich multi-tier applications on Azure.  These applications can be composed from multiple Azure services (including both IaaS and PaaS based services) and support the ability for you to pass parameters and connection-strings across them.  For example, you could declaratively create a SQL DB, Web Site and VM using a single template and automatically wire up the connection-string details between them.

[Screenshot: a declarative Azure Resource Manager deployment template]

Learn More

Check out the following resources to learn more about the Azure Resource Manager, and start using it today:

Summary

Today’s Microsoft Azure release enables a ton of great new scenarios, and makes building applications hosted in the cloud even easier.

If you don’t already have an Azure account, you can sign-up for a free trial and start using all of the above features today.  Then visit the Microsoft Azure Developer Center to learn more about how to build apps with it.

Hope this helps,

Scott

P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu


ScottGu Azure event in London on March 2nd

Published Mon, 16 Feb 2015 18:16:51 GMT

On March 2nd I'm doing an Azure event in London that you can attend for free.  I'll be speaking for about 2.5 hours and will do an end-to-end walkthrough of Microsoft Azure, show off a bunch of demos of great new features/capabilities, and talk about some of the improvements coming out over the next few months.


You can sign-up and attend the event for free (while tickets last - they are going fast).  If you are interested, sign up now.  The event is being held at the Mermaid Conference & Events Centre in Blackfriars, London:

[Photo: the Mermaid Conference & Events Centre]

Hope to see some of you there!

Scott



Azure: Premium Storage, RemoteApp, SQL Database Update, Live Media Streaming, Search and More

Published Thu, 11 Dec 2014 19:14:56 GMT

Today we released a number of great enhancements to Microsoft Azure. These include:

  • Premium Storage: New Premium high-performance Storage for Azure Virtual Machine workloads
  • RemoteApp: General Availability of Azure RemoteApp service
  • SQL Database: Enhancements to Azure SQL Databases
  • Media Services: General Availability of Live Channels for Media Streaming
  • Azure Search: Enhanced management experience, multi-language support and more
  • DocumentDB: Support for Bulk Add Documents and Query Syntax Highlighting
  • Site Recovery: General Availability of disaster recovery to Azure for branch offices and SMB customers
  • Azure Active Directory: General Availability of Azure Active Directory application proxy and password write back support

All of these improvements are now available to use immediately (note that some features are still in preview).  Below are more details about them:

Premium Storage: High-performance Storage for Virtual Machines

I’m excited to announce the public preview of our new Azure Premium Storage offering. With the introduction of the new Premium Storage option, Azure now offers two types of durable storage: Premium Storage and Standard Storage. Premium Storage stores data durably on Solid State Drives (SSDs) and provides high performance, low latency, disk storage with consistent performance delivery guarantees.

[Image: Azure Premium Storage]

Premium Storage is ideal for I/O-sensitive workloads - and is great for database workloads hosted within Virtual Machines.  You can optionally attach several premium storage disks to a single VM, with support for up to 32 TB of disk storage per Virtual Machine and more than 50,000 IOPS per VM at less than 1 millisecond latency for read operations. This provides a wickedly fast storage option that enables you to run even more workloads in the cloud.

Using Premium Storage, Azure now offers the ability to "lift-and-shift" more demanding enterprise applications to the cloud - including SQL Server, Dynamics AX, Dynamics CRM, Exchange Server, MySQL, Oracle Database, IBM DB2, and SAP Business Suite solutions.

Premium Storage is now available in public preview starting today. To sign up to use the Azure Premium Storage preview, visit the Azure Preview page.

Disk Sizes and Performance

Premium Storage disks provide up to 5,000 IOPS and 200 MB/sec throughput depending on the disk size. When you create a new premium storage disk you get the option to select the disk size and performance characteristics you want based on your application performance and storage capacity needs.  For the public preview, we are offering three Premium Storage disk configurations:

Disk Type              P10          P20          P30
Disk Size              128 GB       512 GB       1 TB
IOPS per Disk          500          2300         5000
Throughput per Disk    100 MB/sec   150 MB/sec   200 MB/sec

You can maximize the performance of your VMs by attaching multiple Premium Storage disks to them (up to the network bandwidth limit available to the VM for disk traffic). To learn the disk bandwidth available for each VM size, see the Virtual Machine and Cloud Service Sizes for Azure documentation.

Durability

Durability of data is of utmost importance for any persistent storage option. Azure customers have critical applications that depend on the persistence of their data and high tolerance against failures. Premium Storage keeps three replicas of data within the same region. In addition, you can also optionally create snapshots of your disks and copy those snapshots to a Standard GRS storage account - which enables you to maintain a geo-redundant snapshot of your data that is stored at least 400 miles away from your primary Azure region.

Learn More

You can learn more about Premium Storage disks here.  To sign up to use Premium Storage, go to the Azure Preview page, and sign up for Microsoft Azure Premium Storage service using your subscription.

RemoteApp: General Availability of Azure RemoteApp

I’m excited to announce the general availability of Azure RemoteApp. Using Azure RemoteApp, you can deploy Windows desktop applications in the cloud and provide your users with an intuitive, high-fidelity, WAN-ready remote application experience.  Users can run the cloud-hosted Windows applications you enable on their phones, tablets, or PCs - including Windows, Mac, iOS and Android based devices.  We are delivering RemoteApp at a super competitive price - you can host your users' applications in the cloud for just $10/user/month.  With today’s release, Azure RemoteApp is backed by an SLA and supported by Microsoft Support, offering the full scalability and security of the Azure cloud.

Getting Started

Setting up RemoteApp is easy. In the Azure Management Portal, select NEW -> App Services -> RemoteApp -> Quick Create. Pick a name and region, select the scale configuration you want to use, pick one of the standard template images, and click OK. When you do this for the first time, your 30-day free trial will also start. This is a fully featured trial, available to all Azure customers.


A RemoteApp instance is an elastic, automatically scaled collection of Windows Server VMs that run the Remote Desktop Session Host role and host your applications. The VMs are all created from the template image you provide. You can provide your own template image containing your custom apps, or use one of the standard template images we provide. One of these is for Office 365 ProPlus, which you can use if you have a subscription that contains the Office 365 ProPlus service.


Once enabled, your users can quickly and easily connect to the applications you host in Azure.  They can use Windows, Mac, iOS and Android devices to connect to the RemoteApp service - enabling you to use Azure to run your Windows desktop applications anywhere in the world, on any device.

Enabling Hybrid Applications

Many business-critical Windows applications rely on on-premises infrastructure such as identity and machine management, and require access to on-premises databases and resources. Azure RemoteApp provides a hybrid deployment model that supports all of these scenarios.

  • Hybrid Management: In a hybrid RemoteApp collection, the VMs which host your applications are joined to your AD domain. Therefore, you can use on-premises management tools like Group Policy, System Center, and many other enterprise management tools that rely on AD membership.

  • Federated Identity: You can use Azure Active Directory integrated with your on-premises AD, and your users can log on with their familiar corporate identities. When a Windows application starts, it runs in a fully domain-joined session, with the usual integrated authentication capabilities of a Windows domain.
  • Hybrid Networking: Windows applications in a hybrid RemoteApp collection can seamlessly access on-premises data and resources. This capability is built on Azure Virtual Networking with site-to-site VPN, providing cloud-to-premises virtual network connectivity. In the future, RemoteApp collections will support the full range of Azure Networking capabilities, including ExpressRoute.

Performance and Scale Configurations

With today's general availability release, we are offering two scale configurations: BASIC and STANDARD.

  • BASIC is intended for lighter, task-worker use cases, such as a single productivity application or a data-entry frontend to a line of business system.
  • STANDARD is intended for typical productivity use cases, such as using Outlook, Word, Excel, and other knowledge-worker line-of-business and productivity applications.

You can select the scale configuration for your RemoteApp collection while creating it. If you want to change it later, you can do so using the SCALE tab. Your applications and settings and your user data remain intact through this change.


Pricing

We are making the RemoteApp service available at a very attractive, affordable price.  You can host a line of business Windows application for as little as $10/user per month using the BASIC configuration.

At the STANDARD level, you can host your users’ entire productivity workspace for just $15/user per month.

Learn More

A variety of resources are available on the RemoteApp overview page. You can also download the client for your device and take a test drive. Finally, the RDV Team blog discusses today’s new features in more detail.

SQL Databases: Now with SQL 2014 Features and Compatibility

Today we are making available a preview of the next-generation release of our Azure SQL Database service.  We announced some of the preview's new features earlier in November.  Today's release delivers near-complete SQL Server 2014 engine compatibility and even better performance.

Our internal benchmark tests (using over 600 million rows of data) show query performance improvements of around 5x with today's preview relative to our existing Premium Tier SQL Database offering, and up to 100x improvements when using the new in-memory columnstore technology that the preview also supports.
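
If you want to experiment with the columnstore gains yourself, the sketch below shows one way to do it from C# with ADO.NET against a preview database. The dbo.FactSales table, its column types, and the connection string are placeholders, and the performance you see will of course depend on your data and service tier.

    using System;
    using System.Data.SqlClient;

    class ColumnstoreSketch
    {
        static void Main()
        {
            // Placeholder connection string for an Azure SQL Database running the preview.
            const string connectionString =
                "Server=tcp:<server>.database.windows.net;Database=<db>;User ID=<user>;Password=<password>;Encrypt=True;";

            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();

                // Convert a (hypothetical) fact table to a clustered columnstore layout.
                using (var create = new SqlCommand(
                    "CREATE CLUSTERED COLUMNSTORE INDEX cci_FactSales ON dbo.FactSales;", connection))
                {
                    create.ExecuteNonQuery();
                }

                // A typical analytic aggregation that benefits from columnstore processing.
                using (var query = new SqlCommand(
                    "SELECT ProductId, SUM(SalesAmount) FROM dbo.FactSales GROUP BY ProductId;", connection))
                using (SqlDataReader reader = query.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        Console.WriteLine("{0}: {1}", reader.GetInt32(0), reader.GetDecimal(1));
                    }
                }
            }
        }
    }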

Lots of great new features and improvements

Key highlights of today's preview include:

  • Better management of large databases: We now support heavier database workloads with parallel queries, table partitioning, online indexing, worry-free large index rebuilds (the previous 2 GB size limit has been removed), and additional ALTER DATABASE commands.

  • Support for more programmability capabilities: You can now build even more robust applications with CLR, T-SQL window functions, XML indexes, and change tracking support.

  • Up to 100x performance improvements with support for In-memory columnstore queries for data mart and analytic workloads.

  • Improved monitoring and troubleshooting: Extended Events (XEvents) and visibility into over 100 new table views via an expanded set of Dynamic Management Views (DMVs).

  • New S3 performance level: Today's preview introduces a new pricing option for SQL Databases. The new "S3" performance tier delivers 100 DTUs of performance (twice the DTU level of the existing S2 tier) and all of the features available in the Standard tier. It enables an even more cost-effective way to run applications with higher performance needs.

Learn More and Get Started

You can read more about the enhancements in today's preview on the preview getting started page.  To use today's preview, navigate to the SETTINGS section of the SQL Database blade in the Azure Preview Portal and upgrade to use the preview.


Try out the new features and give us your feedback!

Media Services: General Availability of Live Media Streaming

I’m very excited to announce the General Availability of Live Channels Media Streaming support.  Live Channels with Azure Media Services is the live-streaming backbone that broadcasters such as NBC Sports have used to deliver live online events such as English Premier League matches, NHL hockey, and Sunday Night Football.  A dozen international broadcasters also used it to seamlessly deliver live streaming coverage of the 2014 Sochi Winter Olympics and the 2014 FIFA World Cup.

You can now use Live Channels to stream events of your own - and scale to literally millions of users watching them.  Today's general availability release is backed by an enterprise-grade Service-Level Agreement (SLA) for all customers. 


Learn More

For more information on functionality and pricing, visit the Getting Started with Live Streaming blog post, the Media Services webpage or Media Services Pricing webpage, or the Live Streaming MSDN section.

Search: Portal Management, Multi-language support

I am happy to announce a number of highly requested features available today in Azure Search.  Azure Search provides developers with all of the features needed to build out a search experience for web and mobile applications without having to deal with the typical complexities that come with managing, tuning and scaling a real-world search service.

Azure Portal Enhancements

Helping developers set up and manage their services quickly and easily is a key goal of the Azure Management Portal. Today's release adds several new capabilities to the Azure Search support in the portal that make it even easier to get started with Azure Search and reduce the need to write code.

For example, you can now easily create a new index. In the portal, you can name the search index, set all of the fields, and assign the properties for each of them - all without writing any code.


Once you create the index, you can also drill into usage details like document counts and storage size, and see all of the fields associated with the index.


Index tuning is another enhancement now supported in the portal user interface. Boosting relevant items not only provides a better search experience, it also helps you achieve business objectives. For example, if you are searching a product index, you might want to boost documents where the result was found in the product name, as opposed to another document where the result was found in the product description. Or you may wish to use a scoring function that allows you to boost items that have high star ratings or that provide higher margins.

The task of tuning an index was previously only available through the API. Starting today, using the Azure Preview portal you can create or alter scoring profiles, instantly tuning the results of your search queries without having to write a line of code.
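
If you prefer to script the same kind of tuning, here is a hedged C# sketch that defines a scoring profile as part of an index using the Azure Search REST API. The service name, admin key, index name, field names, weights, and the api-version string are all placeholders or assumptions - adjust them to your own service and the API version documented for your release.

    using System;
    using System.Net.Http;
    using System.Text;

    class ScoringProfileSketch
    {
        static void Main()
        {
            // Placeholders: your search service name and an admin api-key.
            const string service  = "<your-search-service>";
            const string adminKey = "<admin-api-key>";

            // The api-version value is illustrative; use the version documented for your release.
            string url = "https://" + service + ".search.windows.net/indexes/products?api-version=2015-02-28-Preview";

            // Index definition with a scoring profile that boosts matches found in the
            // product name over matches found in the description.
            const string indexJson = @"{
              ""name"": ""products"",
              ""fields"": [
                { ""name"": ""id"",          ""type"": ""Edm.String"", ""key"": true },
                { ""name"": ""name"",        ""type"": ""Edm.String"", ""searchable"": true },
                { ""name"": ""description"", ""type"": ""Edm.String"", ""searchable"": true }
              ],
              ""scoringProfiles"": [
                {
                  ""name"": ""boostName"",
                  ""text"": { ""weights"": { ""name"": 5, ""description"": 1 } }
                }
              ]
            }";

            using (var client = new HttpClient())
            {
                client.DefaultRequestHeaders.Add("api-key", adminKey);
                var response = client.PutAsync(url,
                    new StringContent(indexJson, Encoding.UTF8, "application/json")).Result;
                Console.WriteLine(response.StatusCode);
            }
        }
    }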


Multi-language Support across 27 Languages

With today’s release, Azure Search now has support for 27 languages. This allows Azure Search to accommodate the unique characteristics of a given language, enabling word-breaking and text normalization to work as expected for each language. Part of this enhancement includes support for stemming in the relevant languages, reducing words to their word stems. For example, you can search for the word “runs” and find documents that say “run” or “running”.

When creating an index you can choose to include content from multiple languages, allowing you to search and return results based on the chosen language of your user. For more information, you can visit the Language Support page. Over time, we will continue to enhance multi-language support to include additional languages.

API features

Last but not least, we’ve introduced a new Azure Search Management REST API that allows you to perform common administrative tasks, such as creating new services, and scaling services to allow for additional storage or higher query-per-second rates. You can see a sample of how to use this Management API at CodePlex.
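
The management API follows the standard Azure Resource Manager pattern. The C# sketch below shows the general shape of a call that creates a new search service; the api-version value and the request body schema are assumptions included only for illustration - use the CodePlex sample mentioned above as the authoritative reference.

    using System;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Text;

    class SearchManagementSketch
    {
        static void Main()
        {
            // Placeholders: subscription, resource group, service name, and an Azure AD bearer token.
            const string subscriptionId = "<subscription-id>";
            const string resourceGroup  = "<resource-group>";
            const string serviceName    = "<new-search-service>";
            const string bearerToken    = "<azure-ad-access-token>";

            // ARM resource path for Azure Search; the api-version value is illustrative.
            string url = "https://management.azure.com/subscriptions/" + subscriptionId +
                         "/resourceGroups/" + resourceGroup +
                         "/providers/Microsoft.Search/searchServices/" + serviceName +
                         "?api-version=2015-02-28";

            // Illustrative body - consult the Management API reference for the exact schema.
            const string body = @"{
              ""location"": ""West US"",
              ""properties"": { ""sku"": { ""name"": ""standard"" }, ""replicaCount"": 1, ""partitionCount"": 1 }
            }";

            using (var client = new HttpClient())
            {
                client.DefaultRequestHeaders.Authorization =
                    new AuthenticationHeaderValue("Bearer", bearerToken);
                var response = client.PutAsync(url,
                    new StringContent(body, Encoding.UTF8, "application/json")).Result;
                Console.WriteLine(response.StatusCode);
            }
        }
    }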

DocumentDB: Bulk Add Documents and Query Syntax Highlighting

DocumentDB is a NoSQL document database service designed for scalable and high performance modern applications.  DocumentDB is delivered as a fully managed service (meaning you don’t have to manage any infrastructure or VMs yourself) with an enterprise grade SLA.

We now support some nice new capabilities for DocumentDB in the Azure Preview Portal:

  • Add Documents: Upload existing JSON documents via Document Explorer
  • Query syntax highlighting: Syntax highlighting for DocumentDB SQL queries in Query Explorer

These features make it even easier to get started and explore DocumentDB.

Add Documents Support within the Azure Portal

The DocumentDB Document Explorer within the Azure Preview Portal now supports uploading existing JSON documents - which makes it easy to import and start using data you already have stored in JSON files. Simply open Document Explorer and click the Add Document command.


In the new blade that opens, click the browse button to open a file explorer and select one or more JSON documents to upload. Note that Document Explorer currently supports up to 100 JSON document files in a single upload operation.


Once you’re satisfied with your selection, click the upload button. The documents will automatically be added to the Document Explorer grid and aggregate results will be displayed as the upload operation is in progress.


Once the operation has completed, you can select up to another 100 documents to upload without having to close the Add Document blade.  This makes it easy to import data into your DocumentDB databases.
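
If you want to go beyond the portal's per-batch limit, or automate the import, you can do the same thing programmatically. Below is a minimal sketch using the DocumentDB .NET SDK; the account endpoint, key, collection link, and folder path are placeholders.

    using System;
    using System.IO;
    using System.Threading.Tasks;
    using Microsoft.Azure.Documents.Client;
    using Newtonsoft.Json;

    class BulkAddSketch
    {
        static void Main()
        {
            ImportFolderAsync(@"C:\data\json-docs").Wait();
        }

        static async Task ImportFolderAsync(string folder)
        {
            // Placeholders: your DocumentDB account endpoint and authorization key.
            var client = new DocumentClient(new Uri("https://<account>.documents.azure.com:443/"), "<auth-key>");

            // Placeholder collection link - use your collection's self-link
            // (or an ID-based link if your SDK version supports it).
            const string collectionLink = "dbs/<database-id>/colls/<collection-id>";

            foreach (string file in Directory.GetFiles(folder, "*.json"))
            {
                // Each file is assumed to contain a single JSON document.
                object document = JsonConvert.DeserializeObject(File.ReadAllText(file));
                await client.CreateDocumentAsync(collectionLink, document);
                Console.WriteLine("Uploaded " + Path.GetFileName(file));
            }
        }
    }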

Query Explorer – Syntax Highlighting

We’ve also enabled basic keyword and value highlighting within Query Explorer.


This makes it even easier to experiment and test queries against your NoSQL databases.
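
The same queries you run in Query Explorer can also be issued from code. Here is a short sketch using the DocumentDB .NET SDK; the collection link, document properties, and filter values are placeholders.

    using System;
    using Microsoft.Azure.Documents.Client;

    class QuerySketch
    {
        static void Main()
        {
            // Placeholders: your DocumentDB account endpoint, key, and collection link.
            var client = new DocumentClient(new Uri("https://<account>.documents.azure.com:443/"), "<auth-key>");
            const string collectionLink = "dbs/<database-id>/colls/<collection-id>";

            // DocumentDB SQL query - the same syntax Query Explorer highlights.
            var results = client.CreateDocumentQuery<dynamic>(collectionLink,
                "SELECT c.id, c.title FROM c WHERE c.category = 'books'");

            foreach (var item in results)
            {
                Console.WriteLine(item);
            }
        }
    }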

Please send us your feedback and suggestions on the Microsoft Azure DocumentDB feedback forum. If you haven’t tried DocumentDB yet, you can learn more about how to get started here.

Disaster Recovery: GA of DR for Branch Offices & SMB Customers

I’m excited to announce the General Availability of the Disaster Recovery (DR) to Azure for Branch offices and SMB feature in our Azure Site Recovery (ASR) service.  Today's new support enables consistent replication, protection, and recovery of Virtual Machines directly in Microsoft Azure.  With this new support we have extended the Azure Site Recovery service to become a simple, reliable, and cost-effective DR solution for enabling Virtual Machine replication and recovery between Windows Server 2012 R2 and Microsoft Azure, without having to deploy System Center Virtual Machine Manager on your primary site.

These features build on top of the Hyper-V Replica technology available in Windows Server 2012 R2 and Microsoft Azure to provide remote health monitoring, no-impact recovery plan testing, and single-click orchestrated recovery - all backed by an enterprise-grade SLA.

Verify DR Plans with Confidence

The Test Failover feature within Azure Site Recovery allows you to test your disaster recovery plans without impacting your production workloads, ensuring that you can perform periodic DR drills to meet your compliance objectives. You can connect to the virtual machine running in Azure via RDP after enabling the appropriate endpoints for it.

A Planned Failover shuts down your on-premises virtual machine, transfers the latest changes inside the virtual machine to Azure, and then brings up an instance of the VM in Azure without any data loss. An Unplanned Failover is usually triggered when your on-premises site has been hit by an unexpected disaster.

If you need to fail over a multi-virtual-machine application, you can do so using the one-click orchestration provided by the Recovery Plans feature in Azure Site Recovery. Recovery plans make failover and failback from Azure easy and help ensure that you meet your organization's Recovery Time Objective (RTO) goals.

Check out additional pricing and product information about Azure Site Recovery, then sign up for a free Azure trial and start using it today.

Active Directory: GA of Application Proxy and Password Writeback support

Today's Azure update includes some great updates to Azure Active Directory.

Azure Active Directory Application Proxy

The Azure Active Directory Application Proxy allows you to make your on-premises web applications securely accessible to users who want to use them from the cloud - and enables you to authenticate access to them using Azure AD.

You can do this without changing your applications and without having to change your DMZ configuration. Just install a lightweight connector anywhere on your network and configure access to the application in your Azure Active Directory, and you can make your SharePoint, Outlook Web Access (or any other Web application that relies on Kerberos) available to users outside your corporate network.


With today's release we added support for Kerberos Constrained Delegation. Now, once a user has authenticated to Azure Active Directory, the Azure Active Directory Application Proxy can automatically authenticate users to your on-premises application.

Password Writeback for Azure Active Directory Premium Customers

With the new Password Writeback support in Azure AD Sync, you can now configure your Active Directory system so that any time a user or administrator changes a password in Azure AD, the new password is also written back to your on-premises Active Directory. So, for example, when a user forgets the password for your on-premises AD, they can reset it using the Azure AD password reset service we provide in the cloud, and then use the new password to sign on to your on-premises AD.  This makes it easier for organizations to offer a variety of self-service IT and password reset features to their employees and partners.

Preview of security questions for password reset

With today's release we’re also introducing preview support that enables you to configure security questions for password reset scenarios. This enables users to register their answers to secret questions, and then use those answers to prove who they are when they go to reset a forgotten password.

Add your own password SSO for SaaS applications

With today's release we are introducing the preview of functionality that lets you configure password-based single sign-on for any web application that has an HTML sign-in page, even for applications that are not in the Azure AD Application Gallery. You can also add any links to your users’ Azure AD Access Panel, such as deep links to specific SharePoint pages, or to web apps that use Active Directory Federation Services.

More Ways to Get AD Premium

We now support the ability to purchase Azure Active Directory Premium online at the Office 365 commerce catalogue, where you can purchase Azure AD Premium licenses for as many users as you want.  You can then easily manage your Azure Active Directory by navigating to http://aka.ms/accessAAD or through the Office administration portal.

To learn more about these new capabilities and how you can start using them, read Alex Simons’ post on the Active Directory Team Blog.

Summary

Today’s Microsoft Azure release enables a ton of great new scenarios, and makes building applications hosted in the cloud even easier.

If you don’t already have an Azure account, you can sign up for a free trial and start using all of the above features today.  Then visit the Microsoft Azure Developer Center to learn more about how to build apps with it.

Hope this helps,

Scott

P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu


Let us work with your organization to leverage our years of management consulting and software development experience to improve productivity and implement your business initiatives.

Our Boston-area staff of Microsoft Certified Professionals - MBAs, software engineers, and architects - specializes in the latest Microsoft technologies and business methodologies to develop solutions for the desktop, the enterprise, and the Internet.

We will work with you to assess, analyze, plan, and deploy initiatives and technologies to maximize their effectiveness and the return on your investment. All development and testing is done locally.

Our technical core competencies include Web Services, SQL Server, ASP.NET Web Forms, MVC, and Silverlight development using state-of-the-art agile methodologies.

The following is a sampling of the many applications we have developed:
  • Academic Organizations - scholarship tracking, society membership back office
  • Health Insurers
  • Retail Distributors
  • Construction Cost Management
  • Hospital Capital Management
  • IT Portfolio Management
  • CRM
  • Help Desk Support

For inquiries about how we might work together to meet your business needs, please contact us at info@rosnertech.com.

Thank you for visiting our Web site. The next step is yours. Contact us to explore how we can help your organization.