Channel: Data Platform

Amway sees 100 percent availability with hybrid data-recovery solution


With the release-to-manufacturing announcement of SQL Server 2014, you will start to see more customer stories showcasing hybrid features that span the cloud and on-premises environments.

One such customer is the global direct seller Amway.  Its data centers support about 34 terabytes of information spread across 100 instances of Microsoft SQL Server software, with the data load growing at an annual rate of about 15 percent.  The company faces the same challenges as almost any sizable organization in maximizing data availability and ensuring disaster recovery.  As Amway has grown, it has created additional secondary data centers as disaster-recovery sites. All this additional infrastructure has, inevitably, introduced more complexity and cost into the Amway data environment.

Previously, Amway had been concerned about cloud configurations that were not under its control. But with Windows Azure Infrastructure as a Service, the company could create its own virtual machine configuration image and install it on Windows Azure, addressing that concern. Amway uses the same virtual-machine media for instances on Windows Azure and in its own data centers, ensuring that installations are consistent across both environments.

Amway conducted a pilot test of a prerelease version of Microsoft SQL Server 2014 software, focusing on the software’s AlwaysOn Availability Groups for high availability and disaster recovery. That feature is based on multisite data clustering with failover to databases hosted both on-premises and in Windows Azure. The pilot test focused on a CRM application. The test architecture consisted of three nodes in a hybrid on-premises/cloud configuration: a primary replica and a secondary replica, both located on-premises and operating synchronously to support high availability through automatic failover, plus a secondary replica located in Windows Azure, operating in asynchronous mode to provide disaster recovery through manual failover.

Amway found that the test of SQL Server AlwaysOn Availability Groups with Windows Azure replicas delivered 100 percent uptime and failover took place in 10 seconds or less, compared to the 45 seconds Amway experienced with traditional SQL Server Failover Clusters. Amway is looking forward to an even bigger reduction in the time required to recover from a complete data center failure. Instead of the two-hour, three-person process required with database mirroring, Amway will be able to restore a data center with just 30 seconds of one DBA’s time.

You can learn more about the Amway solution by reading the more detailed case study here.  

For those who want access to the upcoming SQL Server 2014 release as soon as possible, please sign up to be notified once the release is available.  Also, please join us on April 15 for the Accelerate Your Insights event to learn about our data platform strategy and how more customers are gaining significant value with SQL Server 2014.  There will also be additional launch events worldwide, so check with your local Microsoft representatives or your local PASS chapter for more information on SQL Server readiness opportunities.


Visit Microsoft at the Gartner Business Intelligence and Analytics Summit 2014 in Las Vegas


Microsoft will be at the Gartner Business Intelligence and Analytics Summit, being held in Las Vegas from March 30th through April 2nd, as a premier sponsor. We’re looking forward to sharing our vision of how we’re making big data real through the familiar tools you already use – SharePoint, Excel and SQL Server – as well as new ones such as Power BI for Office 365 and Windows Azure HDInsight.

Over the last few years, we’ve all had to deal with an explosion in data types and the velocity at which we need to react to that data. In this world of big data, Microsoft has refined our data toolkit – adding performance and scaling capabilities on commodity hardware in SQL Server 2014, as well as the ability to store, process and analyze large volumes of data through Windows Azure HDInsight, our 100% compatible implementation of Apache Hadoop.

Just as we’ve added capabilities to our data platform, we’ve continued to focus on making it as easy as possible to get rich insights from your stored data – whether it’s in SQL Server, Windows Azure HDInsight or a third-party data provider such as a LOB system. We’ve built powerful visualizations right into Excel with Power View and have added geospatial mapping capabilities through Power Map. It’s also now possible to query your data with natural language through Q&A in Power BI.

Our focus at Gartner will be on showcasing how all of these innovations are coming together to enable all of your users to find, analyze and use the information they need quickly and easily.

We’d love to speak to you if you’ll be there. Stop by our booth; attend our session on April 2nd from 10:45 AM – 11:45 AM; or schedule an individual meeting with Microsoft through the Gartner concierge. We’re also co-hosting a learning lab on the show floor with our partner SAP where you can learn about how Power BI connects to SAP BusinessObjects BI Universes, both through small group sessions and hands-on demonstrations.

We hope to see you. If you haven’t yet registered, you may use code BISP7 to get a $300 discount on registration.

For those who want access to the upcoming SQL Server 2014 release as soon as possible, please sign up to be notified once the release is available.  Also, please join us on April 15 for the Accelerate Your Insights event to learn about our data platform strategy and how more customers are gaining significant value with SQL Server 2014.  There will also be additional launch events worldwide, so check with your local Microsoft representatives or your local PASS chapter for more information on SQL Server readiness opportunities.

SQL Server Data Tools for SQL Server 2014 is available


We’re pleased to announce the availability of the March 2014 release of SSDT, with support for SQL Server 2014 databases. This version will be fully compatible with SQL Server 2014 when it is released.

Today we are releasing a web download for Visual Studio 2012. In the next few days, the Visual Studio 2013 download will appear through the Visual Studio update channel (Tools -> Extensions and Updates -> Updates), and the update will also be offered in Visual Studio 2012 through the “SQL -> Check for Updates” tool.

Please note that this release is for the SQL Server database tooling in Visual Studio, not for SSDT-BI. SSDT-BI is a distinct toolset from SSDT or the SQL Server database tooling in Visual Studio and an update for this product will be released at a later date.

Get it here: http://msdn.microsoft.com/en-us/data/hh297027

Contact Us

If you have any questions or feedback, please visit our forum or Microsoft Connect page.
We look forward to hearing from you.

 

What’s New

The most important new feature is support for SQL Server 2014. We also have a number of new features for users of any SQL Server release, including Windows Azure SQL Database, plus plenty of bug fixes for issues raised by customers.

Filtering Data

When choosing to “View Data” on a table in SQL Server Object Explorer, it can sometimes be hard to locate specific information. The Filter and Sort dialog box lets you:

  • Hide columns you don't want to see.
  • Filter on column values.
  • Add an alias for a column.
  • Sort on one or more columns.

[Screenshot: the Filter and Sort dialog box]

Custom Static Code Analysis rules

You can now write custom rules to detect problems not covered by the built-in validation and static code analysis rules provided in database tooling. We will be adding an updated walkthrough to our help documentation and have updated our samples project. The samples project includes the code from the walkthrough plus some additional sample rules that may prove useful.
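As an illustration, a custom rule is a class that derives from SqlCodeAnalysisRule and is exported with the ExportCodeAnalysisRule attribute. The following minimal sketch follows the pattern used in the samples project; the rule ID, category, and naming check are hypothetical.

using System.Collections.Generic;
using System.Linq;
using Microsoft.SqlServer.Dac.CodeAnalysis;
using Microsoft.SqlServer.Dac.Model;

// Hypothetical rule: flag table names that use a "tbl" prefix.
[ExportCodeAnalysisRule(
    "Contoso.SR1001",                              // illustrative rule ID
    "Table names should not use the tbl prefix",
    Category = "Naming",
    RuleScope = SqlRuleScope.Element)]
public sealed class AvoidTblPrefixRule : SqlCodeAnalysisRule
{
    public AvoidTblPrefixRule()
    {
        // Restrict analysis to table elements in the model.
        SupportedElementTypes = new[] { ModelSchema.Table };
    }

    public override IList<SqlRuleProblem> Analyze(SqlRuleExecutionContext context)
    {
        var problems = new List<SqlRuleProblem>();
        string name = context.ModelElement.Name.Parts.LastOrDefault();
        if (name != null && name.StartsWith("tbl", System.StringComparison.OrdinalIgnoreCase))
        {
            problems.Add(new SqlRuleProblem(
                string.Format("Table '{0}' uses the discouraged 'tbl' prefix.", name),
                context.ModelElement));
        }
        return problems;
    }
}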

Azure Integration

Support for discovering and navigating to Windows Azure SQL Database nodes was included in Visual Studio 2013 at release. For those of you not familiar with this, the current implementation takes the pain out of discovering and adding Azure databases into the SQL Server Object Explorer:

[Screenshot: discovering Windows Azure SQL Databases in SQL Server Object Explorer]

With this feature you no longer need to figure out the server name for your database or configure a firewall rule for your machine in the Azure Management Portal. This simplifies keeping your Windows Azure SQL Databases in sync with your development environment.

In this release you can navigate straight from Visual Studio to your database in the Management Portal:

[Screenshot: navigating from Visual Studio to the database in the Management Portal]

Transact-SQL Editor

The Connection menu option now has suboptions for Disconnect All Queries and Change Connection. This makes the editor more consistent with the experience in SQL Server Management Studio.

Change Connection lets you change the Transact-SQL connection to a different server in a single step. Previously you had to disconnect and then connect to the new server.

Disconnect All Queries allows you to disconnect all open Transact-SQL query windows.

[Screenshot: the Connection menu showing Disconnect All Queries and Change Connection]

Data Compare

Saving Data Compare settings is now supported. Clicking Save creates a .dcmp file with the required connection information and name.

Changes to Database Tools Extensibility

New Extension Installation Directory

With the new release of SQL Server Data Tools the core DacFx components are now installed directly inside Visual Studio. This has the benefit of supporting multiple Visual Studio versions (e.g. VS 2010, 2012 and 2013) side by side on the same machine without requiring every version to be compatible with the same DacFx binaries.

This does have an effect on any extensions such as Build/Deployment contributors you may have written. These should now be installed in the \Common7\IDE\Extensions\Microsoft\SQLDB\DAC\120\Extensions directory. You may also need to recompile your extensions against the latest version of DacFx as the major version has been incremented, and this may break binary compatibility.

A new help topic, “How to: Install and Manage Feature Extensions,” will cover installation of all supported extension types.

Future Direction of WCF Data Services


WCF Data Services

Microsoft initially released WCF Data Services as an easy way to implement an OData service over queryable data sources. This made it very easy to expose a model backed by the Entity Framework as an OData Service, and included a data provider model for plugging in other types of queryable data sources. WCF Data Services abstracted away all of the details of HTTP, OData, and the various payload formats, and the data provider was responsible for executing the underlying queries and returning the raw results to be serialized in the appropriate format.

This framework worked well for exposing a core set of OData features over data sources that supported fairly rich query functionality, but it had two major limitations. First, it required that the underlying data source support fairly rich query semantics. This made it difficult to implement over diverse sources of data, such as data returned from a function invocation or collected from other services. Even for queryable sources, computing the best query to perform on the underlying store was often difficult because the query passed to the data provider lacked a lot of the context of the initial request. Second, because WCF Data Services was somewhat of a monolithic "black box", services had no way to expose new features until support for each feature was added to the WCF Data Services stack. Adding support to WCF Data Services often in turn required defining new interfaces that each data provider would then have to implement before the functionality was available to the service.

Web API OData Support

In order to give developers more control, we decided to invest in a better-factored approach to building OData services using the Web API OData libraries. The Web API OData libraries, built on the popular model-view-controller (MVC) pattern, give the service developer much more control over how each request is processed by allowing the developer to implement custom controllers for each "route".  The methods exposed by these controllers have all the necessary context to satisfy the request, regardless of where and how the data is obtained. And the better factoring of Web API makes it easier to implement new functionality without waiting for one or more external components to add support for the feature.

Giving the developer the ability to implement custom controllers does mean more code for developers. Most of this code, though, is fairly boilerplate, and things like scaffolding in Web API controllers make it easy to generate default code for handling these requests.
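As a sketch of what this looks like, the following minimal Web API OData service assumes the Web API 2.2 OData v4 packages (System.Web.OData); the Product type and the in-memory data are illustrative.

using System.Linq;
using System.Web.Http;
using System.Web.OData;
using System.Web.OData.Builder;
using System.Web.OData.Extensions;

public class Product
{
    public int ID { get; set; }        // the entity key, by convention
    public string Name { get; set; }
    public double Price { get; set; }
}

// One controller per route; requests to /odata/Products land here,
// and the controller is free to obtain the data however it likes.
public class ProductsController : ODataController
{
    private static readonly IQueryable<Product> Products = new[]
    {
        new Product { ID = 1, Name = "Bread", Price = 2.5 },
        new Product { ID = 2, Name = "Milk",  Price = 1.2 }
    }.AsQueryable();

    // [EnableQuery] lets the framework apply $filter, $orderby, $top, etc.
    [EnableQuery]
    public IQueryable<Product> Get()
    {
        return Products;
    }
}

public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        var builder = new ODataConventionModelBuilder();
        builder.EntitySet<Product>("Products");
        config.MapODataServiceRoute("odata", "odata", builder.GetEdmModel());
    }
}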

Supporting OData Version 4.0

In implementing support for the new OData version 4.0 standard, we started by building OData V4-compliant versions of the low-level libraries used by both WCF Data Services and Web API for parsing URLs and serializing/deserializing payloads. At the same time we released a common OData Client for consuming OData version 4.0 compliant services.

In order to give developers the most flexibility to build on a core set of features, we next prioritized adding core OData version 4.0 support to Web API.

We didn't make this decision lightly. Microsoft has a lot of services internally that are implemented using WCF Data Services. We are investing time in helping internal teams move to using Web API to expose their OData services. Through that process we are finding that the amount of work necessary varies depending on the level of investment the team has in WCF Data Services; teams that built on top of Entity Framework have the least amount of work to do, and teams that have already implemented custom data providers have incremental work to expose their queries through controller methods. So far, though, the migrations have gone well, with the teams enjoying the benefits of more control over how requests are processed and over the features they expose through their service.

So what becomes of WCF Data Services?

We have actually done the bulk of the work to implement the existing WCF Data Services functionality according to the version 4.0 protocol, but because of the monolithic nature of WCF Data Services it still wouldn't be possible to take advantage of the many compelling new features of OData version 4.0. Most of the new features would require additional work across WCF Data Services, as well as each individual data provider, before they could be used by a service built with WCF Data Services.

Rather than be a bottleneck to the features that WCF Data Service developers want to expose in their services, we are looking at providing an OData 4.0 compatible version of WCF Data Services as Open Source. This option allows developers to extend and adapt the code as appropriate for their individual use, and leverage the community to evolve the functionality much faster than we could ourselves.

We would love feedback on this approach. How far should we take the WCF Data Services code before releasing it as Open Source? Is it better to release it sooner and have the Open Source community finish the work to make the basic functionality fully compliant to the OData 4.0 protocol specification, or should we wait until we've had a chance to clean up the code and make it more compliant with the new standard?

Thoughts and feedback genuinely appreciated.

Thanks,

Michael Pizzo
Principal Architect, Microsoft OData Team

Join us on April 15 and Accelerate Your Insights


Mark your calendar now to join us online on April 15 for the Accelerate your insights event, streaming live from San Francisco, California at 10:00 AM PDT.

Wondering what your data can do for you? Join us online to find out how to drive your business in real time, from apps to insights. You’ll hear from several of Microsoft’s top executives, including Chief Executive Officer Satya Nadella, Chief Operating Officer Kevin Turner, and Corporate Vice President of the Data Platform Group, Quentin Clark.

Save the date to watch the keynotes streamed live on 4/15:

  • Mark your calendar
  • RSVP on Facebook

Join us as we share Microsoft’s data platform vision, and how the latest advancements in our data technologies can help you transform data into action.

See you there.

Unlocking insights from Big Data can be simple with the right tools


Data enthusiasts & Microsoft talk about Big Data in the enterprise at Gigaom Structure Data 2014

Last week in New York, our data solutions experts spent a few days with more than 900 big data practitioners, technologists and executives at Structure Data for a conversation about how big data can drive business success.

The rich conversations with attendees at the event were inspiring, and the broad range of speakers was impressive.  Our discussions over the two days in New York centered on what the Big Data solution looks like inside an enterprise and the challenges around accessing and processing big data to make better data-driven decisions. 

Structure Data attendees want to combine data from multiple sources to do a couple of key things -- to gain deeper insights, and to ask new questions and get answers.  However, without the right technology to support that, this can be very challenging.  That's where Microsoft comes in -- and where we continued the dialog with attendees as our data experts used huge Microsoft touchscreens to show how easy it can be to transform big data into insights using simple front-end tools (like Excel and Power BI for Office 365) and back-end technology for scale, power and speed (like Windows Azure HDInsight and SQL Server).

Microsoft Research Distinguished Scientist John Platt also spoke at Structure Data and shared the latest on our work in machine learning, which is pervasive throughout many Microsoft products. If you missed it, take a moment to watch the short chat here.

Our data experts also gave attendees an insider’s view of how Microsoft’s Cybercrime Center is using data to fight worldwide organized crime and botnets. (See the video below for more.)

Take the first step and learn more about Microsoft Big Data solutions.

Or, connect with us on Facebook.com/sqlserver and Twitter @SQLServer and learn how Microsoft’s approach to data, big and small, helps employees, IT professionals and data scientists quickly transform data into insight and action.

Pie in the Sky (March 28th, 2014)


It's been sort of a crazy busy week, so not a large number of links. But maybe you (like me) need a little time off the computer this weekend. :)

Client/Mobile

JavaScript

  • ES7 async functions: ES6 hasn't really arrived yet and we're already looking at (cool) things in ES7.

Node.js

Misc.

Enjoy!

-Larry

Developing Enterprise SQL Server Features in Visual Studio database tooling


SQL Server 2014 introduced a number of new enterprise features. The most notable of these are memory optimized tables and natively compiled stored procedures. When developing databases with enterprise features in Visual Studio there are a few best practices to be aware of.

Debugging enterprise features

We encourage users to apply multiple levels of validation during database development: live syntax error highlighting, build time validation, F5 debug deployment, and unit testing for comprehensive validation of actual behavior. Modify, build, deploy, and test.

F5 debug deployment is an important step in this validation loop, but the default LocalDB debug target may not support all the features you use in your database. To validate databases with these features, we encourage you to use an alternative SQL Server instance with full support. Common approaches include obtaining a license for Developer Edition and installing a server on each developer’s machine, or using one server for the development team and applying a standard naming scheme to distinguish each developer’s test databases from each other.

Regardless of which approach you choose you will need to change the debug target for your database projects. At present this is a manual step for each project in your solution.

Changing the debug target inside Visual Studio

For each project:

  • Right-click on the project and choose Properties
  • Select the Debug tab
  • Click the Edit button under Target Connection String to change the debug target

[Screenshot: the Debug tab with the Target Connection String settings]

  • Enter the name of your development server and a database name that will avoid conflicts with other developers / databases in other solutions.
    • Note that to auto-generate the name for a project you can set $(USERNAME)_$(Name) as the database name in the connection properties. If you log into a machine as UserA, and the project is called Project1, this will deploy to a database named UserA_Project1.

[Screenshot: connection properties using $(USERNAME)_$(Name) as the database name]

  • Click OK, then save the Properties page and close it
  • Hit F5 and verify that the expected debug database has been created

Changing the debug target by editing the user file

If you are updating multiple projects, it might be easier to edit the user file. For a project Project1, this will be a file called Project1.sqlproj.user located in the root folder of your project. Edit this file in Notepad and add the following entries to specify the server name and an auto-generated database name to deploy to. Note that MyServerName should be replaced with the server you wish to connect to.

<TargetDatabase>$(USERNAME)_$(Name)</TargetDatabase>

<TargetConnectionString>Data Source=MyServerName;Initial Catalog=$(USERNAME)_$(Name);Integrated Security=True;Pooling=False</TargetConnectionString>

 

What features are not supported by LocalDB? 

Features not supported by LocalDB include, but are not limited to:

  • Memory-optimized tables
  • Memory-optimized filegroups
  • Natively compiled stored procedures
  • Full-text indexes
  • FILESTREAM
  • Partitioning

Validation of SQL Server 2014 features

We strongly believe in the benefit of build-time validation. At build time, most issues that would block database deployment are marked as errors and block the project build. It is important to know that some new SQL Server 2014 features do not have this level of validation. Memory optimized tables and natively compiled stored procedures introduce a number of restrictions on TSQL syntax that will not be enforced at build time. The best practice when implementing these features is to use F5 debug deployment to catch any deploy-time issues.

If you want to catch these issues earlier you could also add custom Static Code Analysis rules to catch these at build time. There are a number of examples of rules that check for SQL Server 2014-specific issues in our samples project. These are planned for inclusion in future releases of the database tooling as either validation rules that run automatically during build or as standard Static Code Analysis rules. Suggestions for additional rules are welcome and can be sent via our forum or Microsoft Connect page.

Useful In-Memory OLTP (In-Memory Optimization) links

  • Supported SQL Server Features
  • Supported Data Types
  • In-Memory OLTP (In-Memory Optimization) main help page


SQL Server 2014 now Generally Available


Microsoft today released Microsoft SQL Server 2014, the foundation of our cloud-first data platform. SQL Server 2014 delivers breakthrough performance with new and enhanced in-memory technologies to help customers accelerate their business and enable new, transformational scenarios. In addition, SQL Server 2014 enables new hybrid cloud solutions to take advantage of the benefits of cloud computing with scenarios such as cloud backup and cloud disaster recovery for on-premises SQL Server installations.  SQL Server 2014 continues to offer industry-leading business intelligence capabilities through integration with familiar tools like Excel and Power BI for Office 365 for faster insights. 

We are also making generally available the SQL Server Backup to Microsoft Azure Tool, a free tool that allows customers to back up databases from older versions of SQL Server to Azure storage.

Try the SQL Server 2014 release today

Download and try the generally available release of SQL Server 2014 today on premises, or get up and running in minutes in the cloud. And, please be sure to save the date for the live stream of our April 15 Accelerate Your Insights event to hear more about our data platform strategy from CEO Satya Nadella, COO Kevin Turner and CVP of Data Platform Quentin Clark.

Thanks.

Eron Kelly

General Manager

Data Platform Group

Pie in the Sky (April 4th, 2014)


It's the weekend I wanted to go see a movie, so of course I now have a cold. Well, at least there are lots of //build videos to watch. Here's a list of some of the more interesting links I found this week.

Client/Mobile

.NET

  • The .NET Foundation: Microsoft announced the .NET foundation this week, which will provide stewardship of a bunch of open source .NET technologies. Hit the page for a list of all the OSS .NET projects they are working with.

Java

Node.js

Ruby

Misc.

Enjoy!

-Larry

SQL Server 2014 brings on-premises and cloud database together to improve data availability and disaster recovery


With the recently announced general availability of SQL Server 2014, Microsoft brings to market new hybrid scenarios, enabling customers to take advantage of Microsoft Azure in conjunction with on-premises SQL Server.

SQL Server 2014 helps customers protect their data and make it more highly available using Azure. SQL Server Backup to Microsoft Azure builds on functionality first introduced in SQL Server 2012, adding a UI for easily configuring backup to Azure from SQL Server Management Studio (SSMS). Backups are encrypted and compressed, enabling fast and secure cloud backup storage. Setup requires only Azure credentials and an Azure storage account. For help getting started, this step-by-step guide will get you going with the easy, three-step process.
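As a rough sketch of the underlying commands, the following C# snippet issues a backup to a URL from a client application. The storage account, container, access key, and database names are placeholders, and the credential only needs to be created once.

using System.Data.SqlClient;

class BackupToAzureSample
{
    static void Main()
    {
        const string sql = @"
CREATE CREDENTIAL AzureBackupCredential
    WITH IDENTITY = 'mystorageaccount',   -- storage account name (placeholder)
    SECRET = '<storage-access-key>';      -- storage access key (placeholder)

BACKUP DATABASE MyDatabase
TO URL = 'https://mystorageaccount.blob.core.windows.net/backups/MyDatabase.bak'
WITH CREDENTIAL = 'AzureBackupCredential', COMPRESSION, FORMAT;";

        using (var connection = new SqlConnection("Server=.;Integrated Security=true"))
        using (var command = new SqlCommand(sql, connection))
        {
            command.CommandTimeout = 0; // backups can run for a long time
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}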

Storing backup data in Azure is cost-effective, secure, and inherently offsite, making it a useful component in business continuity planning. A March 2014 study about cloud backup and disaster recovery, conducted by Forrester Consulting on Microsoft's behalf, found that saving money on storage is the top benefit of cloud database backup, cited by 61% of respondents, followed closely by the 50% who said savings on administrative cost was a top reason for backing up to the cloud. Backups stored in Azure also benefit from Azure's built-in geo-redundancy and high service levels, and can be restored to an Azure VM for fast recovery from onsite outages.

In addition to the SQL Server 2014 functionality for backing up to Azure, we have now made generally available a free standalone SQL Server Backup to Microsoft Azure Tool that can encrypt and compress backup files for all supported versions of SQL Server and store them in Azure—enabling a consistent backup-to-cloud strategy across your SQL Server environments. This fast, easy-to-configure tool enables you to quickly create rules that direct a set of backups to Azure rather than local storage, as well as select encryption and compression settings.

Another new business continuity planning scenario enabled by SQL Server 2014 is disaster recovery (DR) in the cloud. Customers are now able to set up an asynchronous replica in Azure as part of an AlwaysOn high availability solution. A new SSMS wizard simplifies the deployment of replicas on-premises and to Azure. As soon as a transaction is committed on-premises, it is sent asynchronously to the cloud replica. We still recommend you keep your synchronous replica on-premises, but by having the additional replicas in Azure you gain improved DR and can reduce the CAPEX and OPEX costs of physically maintaining additional hardware in additional data centers.

Another benefit of keeping an asynchronous replica in Azure is that the replica can be efficiently utilized for read workloads like BI reporting, or for taking backups, speeding up the backup-to-Azure process since the secondary is already in Azure.

But the greatest value to customers of an AlwaysOn replica in Azure is the speed to recovery. Customers are finding that their recovery point objectives (RPO) can be reduced to limit data loss, and their recovery time objectives (RTO) can be measured in seconds:

  • Lufthansa Systems is a full-spectrum IT consulting and services organization that serves airlines, financial services firms, healthcare systems, and many more businesses. To better anticipate customer needs for high-availability and disaster-recovery solutions, Lufthansa Systems piloted a solution on SQL Server 2014 and Azure that led to faster and more robust data recovery, reduced costs, and the potential for a vastly increased focus on customer service and solutions. They expect to deploy the solution on a rolling basis starting in 2014.
  • Amway is a global direct seller. Amway conducted a pilot test of AlwaysOn Availability Groups for high availability and disaster recovery. With multisite data clustering with failover to databases hosted both on-premises and in Azure, Amway found that the test of SQL Server AlwaysOn with Azure replicas delivered 100 percent uptime and failover took place in 10 seconds or less. The company is now planning how best to deploy the solution.

Finally, SQL Server 2014 enables you to move your database files to Azure while keeping your applications on-premises for bottomless storage in the cloud and greater availability. The SQL Server Data Files in Microsoft Azure configuration also provides an alternative storage location for archival data, with cost effective storage and easy access.

If you're ready to evaluate how SQL Server 2014 can benefit your database environment, download a trial here. For greater flexibility deploying SQL Server on-premises and in the cloud, sign up for a free Azure evaluation. And, to get started backing up older versions of SQL Server to Azure, try our free standalone backup tool. Also, don't forget to save the date for the live stream of our April 15 Accelerate Your Insights event to hear more about our data platform strategy from CEO Satya Nadella, COO Kevin Turner and CVP of Data Platform Quentin Clark.

SQL Server 2014 Launch – A Community Affair


 Guest blog post by: PASS President Thomas LaRock – a SQL Server MVP, MCM, and Head Geek at Solarwinds – is a seasoned IT professional with over a decade of technical and management experience. Author of DBA Survivor: Become a Rock Star DBA, he holds an MS degree in Mathematics from Washington State University and is a Microsoft Certified Trainer and a VMware vExpert. You can read his blog at thomaslarock.com and follow him on Twitter at @SQLRockstar.

*     *     *     *     *

April opened with the general availability of SQL Server 2014. But well before we could wrap our hands around the final bits of the new release, the SQL Server community had been getting an early taste of its exciting performance, availability, manageability, and cloud features, thanks to a grassroots launch and readiness program that has spread around the globe.

The Professional Association for SQL Server (PASS) and hundreds of our volunteers around the world have joined with Microsoft to host free SQL Server 2014 launch events and technical sessions that focus on what matters most to data pros. These sessions explain the new features in the release, how the features work, and how we can use them to benefit our companies.

From user group meetings and PASS SQLSaturday sessions to the ongoing SQL Server 2014 Countdown webinars with PASS Virtual Chapters, the launch of SQL Server 2014 has truly been a community affair – and we're just getting started. Whether you're already on the path to early adoption, preparing to take advantage of the new release soon, or gathering information for the future, here's how you can get involved and get the details you need to make smart decisions for your organization:

  • Connect with fellow SQL Server pros: Microsoft Data Platform Group GM Eron Kelly noted that for Community Technology Preview 2, there were nearly 200K evaluations of SQL Server 2014, including 20K evaluations with the new release running in a Microsoft Azure Virtual Machine. That's a lot of folks who now have first-hand knowledge about SQL Server 2014. Check out those blogging and speaking about their experiences and sharing at chapter meetings, and then get to know them and what they know.
  • Share your questions, issues, and solutions: Have you tried out SQL Server's new built-in in-memory OLTP features? How about the enhanced mission-critical and availability capabilities? Have questions about implementing a hybrid data solution that bridges on-premises and cloud technologies? And how and when should you use the new delayed durability setting or clustered columnstore indexes? Share your experiences – and what you don't know or need more information about – and help the community build up resources that enable us all to work better, smarter, and faster.
  • Learn how to get the most from your data: Go inside the new release with experts on the SQL Server product team at upcoming live SQL Server 2014 Countdown webinars and watch on-demand replays of those you missed. You can also learn more about SQL Server 2014 and Microsoft's data platform strategy at the Accelerate Your Insights online launch event April 15 with Microsoft CEO Satya Nadella, COO Kevin Turner, and Data Platform Group Corporate Vice President Quentin Clark. And remember to check with your local PASS chapter, Virtual Chapter, or nearby SQLSaturday event for more SQL Server 2014 launch and learning events happening worldwide.

I'm grateful to be part of one of the most passionate technology communities in the world and excited to participate in a SQL Server 2014 launch program that, at its core, is about empowering SQL Server professionals and their organizations to be successful.

Thanks to everyone who is helping connect, share, and learn about SQL Server 2014.
Thomas

Client Property Tracking for PATCH


 

In OData Client for .NET 6.2.0, we enabled top-level property tracking on the client side when sending a PATCH. This feature allows the client to send only the updated properties, instead of the entire object, to the server.

Top-level property tracking means the client will only track the highest-level properties in the class hierarchy. In other words, if a customer updates a property under a complex-type property, or an item in a collection property under an entity, the client will send the entire complex property (including its unchanged properties) or the entire collection (including unchanged items) to the server.

To use property tracking, we start by creating a DataServiceCollection instance. DataServiceCollection handles tracking all the properties. For more information about DataServiceCollection, see “Binding Data to Controls (WCF Data Services)”.

Let’s walk through a real example to see how to do it.

Service EDM model

http://services.odata.org/V4/(S(readwrite))/OData/OData.svc/

The following sample code is based on this model.

Client Code

You can read “How to use OData Client Code Generator to generate client-side proxy class” for details on generating the client-side proxy classes.

Now, we will show you how to enable property tracking when doing a PATCH.

1. Create a DataServiceCollection, which will handle the property tracking.

DataServiceCollection<Product> products = new DataServiceCollection<Product>(dataServiceContext.Products.Where(p => p.ID == 0));

2. Update the property

products[0].Price = 3.0;

3. Save changes

The HTTP method will be “PATCH” when updating with SaveChangesOptions.None. This step sends only the updated top-level properties.

dataServiceContext.SaveChanges();

or

dataServiceContext.SaveChanges(SaveChangesOptions.None);

Sample Code

Here I will provide several samples with their payloads.

1. Update top-level primitive property. Only the updated primitive property will be sent.

Sample Code:
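A minimal sketch, reusing the DataServiceCollection from the steps above:

DataServiceCollection<Product> products = new DataServiceCollection<Product>(
    dataServiceContext.Products.Where(p => p.ID == 0));
products[0].Price = 3.0;
dataServiceContext.SaveChanges();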

Payload:
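Assuming the demo model above, the PATCH body would look roughly like this; only the changed primitive property is included:

{
  "@odata.type": "#ODataDemo.Product",
  "Price": 3.0
}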

 

2. Update a property within a top-level complex property. The entire complex property (“Address” in the example) will be sent to the server, rather than just the updated property (“City” in the example).

Sample Code:
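A sketch assuming the demo model’s Supplier entity, which has an Address complex property:

DataServiceCollection<Supplier> suppliers = new DataServiceCollection<Supplier>(
    dataServiceContext.Suppliers.Where(s => s.ID == 0));
suppliers[0].Address.City = "Redmond";
dataServiceContext.SaveChanges();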

Payload for complex type:
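The PATCH body carries the whole Address value, not just City (property values illustrative):

{
  "@odata.type": "#ODataDemo.Supplier",
  "Address": {
    "Street": "One Contoso Way",
    "City": "Redmond",
    "State": "WA",
    "ZipCode": "98052",
    "Country": "USA"
  }
}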

 

3. Update a navigation entity. Only the updated properties of that entity will be sent.

Sample code:
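A sketch that expands a product’s Supplier navigation property and updates one of the supplier’s properties:

DataServiceCollection<Product> products = new DataServiceCollection<Product>(
    dataServiceContext.Products.Expand("Supplier").Where(p => p.ID == 0));
products[0].Supplier.Name = "Contoso Supplies";
dataServiceContext.SaveChanges();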

Payload:
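The PATCH is sent to the supplier’s own URL and again carries only the changed property:

{
  "@odata.type": "#ODataDemo.Supplier",
  "Name": "Contoso Supplies"
}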

Note

If you don’t use DataServiceCollection, the client will not support property tracking. For example:
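A sketch of that case, using the same proxy types as above:

List<Product> products = dataServiceContext.Products.Where(p => p.ID == 0).ToList();
products[0].Price = 3.0;
dataServiceContext.SaveChanges(); // no request is sent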

Although the client tracks the entity after ToList() is called, SaveChanges() will not send any request, because the client cannot detect the property change.

If you instead use the following approach to update the entity, the whole entity will be sent, rather than only the updated properties.
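A sketch using UpdateObject, which marks the whole entity as modified:

List<Product> products = dataServiceContext.Products.Where(p => p.ID == 0).ToList();
products[0].Price = 3.0;
dataServiceContext.UpdateObject(products[0]); // the entire entity is now considered changed
dataServiceContext.SaveChanges();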

The payload will look like the following.
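A sketch, with every property of the entity serialized (values illustrative):

{
  "@odata.type": "#ODataDemo.Product",
  "ID": 0,
  "Name": "Bread",
  "Description": "Whole grain bread",
  "ReleaseDate": "1992-01-01T00:00:00Z",
  "DiscontinuedDate": null,
  "Rating": 4,
  "Price": 3.0
}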

Pie in the Sky (April 11th, 2014)


It's been a pretty busy week, what with security vulnerabilities among other things. Here are a few links (including a few on security vulnerabilities) for your weekend reading.

Cloud

Client/mobile

Wearable

Node.js

  • Scaling Node.js applications: A discussion of ways you can scale a Node.js app. Or you can just find a cloud provider that does all that for you.

  • Express 4: The new version is out, with some changes, and some things to pay attention to when migrating.

  • npm and Heartbleed: npm doesn't appear to have been compromised, but it is regenerating SSL keys anyway.

  • Is Node.js affected by Heartbleed?: Short answer is no for recent versions. Long answer is "are you using other stuff that might be?"

PHP

Ruby

Misc.

Enjoy!

-Larry

ODataLib 6.2 release


We are happy to announce that ODataLib 6.2 is released and available on NuGet, along with the source code on CodePlex (please read the git history for the v6.2 code info and all previous versions). Detailed release notes are listed below.

Bug Fixes

  • Fixed a bug in parsing $it in the UriParser.
  • Improved JSON serialization performance for the unindented format.

New Features

  • Model Enhancement: ODataLib & EdmLib now support complex type inheritance. A complex type can now inherit from another complex type by specifying the baseType (see the sketch after this list).
  • Model Enhancement: ODataLib & EdmLib now support open complex types. This feature allows clients to add properties dynamically to instances of such a type. The added properties can be as simple as a primitive type, or as complex as a complex type or an open collection type.
  • Client Enhancement: OData Client now supports property-level change tracking for PATCH. If you turn on this feature, the client will only include the updated properties in the PATCH payload. If you are building a client that performs massive updates, you will find this feature very useful.
  • Client Enhancement: OData Client supports overriding the metadata property names in proxy classes; e.g., you can use a lower-camel naming convention in your client while talking to a server that uses a Pascal naming convention.
  • New APIs: ODataLib supports generating a service document from an EdmModel directly via a new extension method, GenerateServiceDocument().
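For the complex type inheritance enhancement, here is a minimal EdmLib sketch (the namespace, type and property names are illustrative):

using Microsoft.OData.Edm;
using Microsoft.OData.Edm.Library;

class ComplexInheritanceSample
{
    static void Main()
    {
        var model = new EdmModel();

        var address = new EdmComplexType("NS", "Address");
        address.AddStructuralProperty("Street", EdmPrimitiveTypeKind.String);
        model.AddElement(address);

        // HomeAddress inherits Street from Address by passing it as the baseType.
        var homeAddress = new EdmComplexType("NS", "HomeAddress", address, false);
        homeAddress.AddStructuralProperty("FamilyName", EdmPrimitiveTypeKind.String);
        model.AddElement(homeAddress);
    }
}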

Known Issues

  • Type casting for complex type in $filter and $select is not supported.
  • Reading an individual property of a derived complex type directly with the OData Client is not supported.

Call to Action

You and your team are welcome to try out this new version if you are interested in the new features and fixes above. For any feature request, issue or idea, please feel free to reach out to us.

Thanks,

The OData Team


Tune in tomorrow and accelerate your insights


Tomorrow’s the day! Tune in to hear from Microsoft CEO Satya Nadella, COO Kevin Turner, and Data Platform Group CVP Quentin Clark about Microsoft’s approach to data, and how the latest advancements in technology can help you transform data into action.

Who should watch?

Join us tomorrow morning at 10AM PDT if you like data or want to learn more about it. If you store it, you manage it, you explore it, you slice and dice it, you analyze it, you visualize it, you present it, or if you make decisions based on it. If you’re architecting data solutions or deciding on the best data technology for your business. If you’re a DBA, business analyst, data scientist, or even just a data geek on the side, join the live stream.

What will I hear about?

Data infrastructure. Data tools. And ultimately, the power of data. From finding the connections that could cure cancer, to predicting the success of advertising campaigns, data can do incredible things. Join us online and get inspired. You’ll see how your peers are putting their data, big and small, to work.

From a product perspective, we’ll celebrate the latest advancements in SQL Server 2014, Power BI for Office 365, SQL Server Parallel Data Warehouse, and Microsoft Azure HDInsight. And ultimately, we’ll explore how these offerings can help you organize, analyze, and make sense of your data – no matter the size, type, or location.

Where do I sign up?

Mark your calendar now or RSVP on Facebook so you’re ready to go tomorrow. When streaming goes live, you can join us here for all the action live from San Francisco.

When do things get started?

Tomorrow, April 15, at 10AM PDT. Be there.

See you tomorrow!

The data platform for a new era


Earlier today, Microsoft hosted a customer event in San Francisco where I joined CEO Satya Nadella and COO Kevin Turner to share our perspective on the role of data in business. Satya outlined his vision of a platform built for an era of ambient intelligence. He also stressed the importance of a “data culture” that encourages curiosity, action and experimentation – one that is supported by technology solutions that put data within reach of everyone and every organization. 

Kevin shared how customers like Beth Israel Deaconess Medical Center, Condé Nast, Edgenet, KUKA systems, NASDAQ, telent, Virginia Tech and Xerox are putting Microsoft’s platform to work and driving real business results. He highlighted an IDC study on the tremendous opportunity for organizations to realize an additional $1.6 trillion dividend over the next four years by taking a comprehensive approach to data. According to the research, businesses that pull together multiple data sources, use new types of analytics tools and push insights to more people across their organizations at the right time, stand to dramatically increase their top-line revenues, cut costs and improve productivity. 

A platform centered on people, data and analytics
In my keynote, I talked about the platform required to achieve the data culture and realize the returns on the data dividend – a platform for data, analytics and people. 

It’s people asking questions about data that’s the starting point -- Power BI for Office 365 and Excel’s business intelligence features help get them there. Data is key – data from all kinds of sources, including SQL Server, Azure and the accessibility of the world’s data from Excel. Analytics brings order and sets up insights from broad data – analytics from SQL Server and Power BI for Office 365, and Azure HDInsight for running Hadoop in the cloud.

A platform that solves for people, data, and analytics accelerates with in-memory. We created the platform as customers increasingly need the technology to scale with big data and accelerate their insights at the speed of modern business.

Having in-memory across the whole data platform creates speed that is revolutionary on its own, and with SQL Server we built it into the product that customers already know and have widely deployed. At the event we celebrated the launch of SQL Server 2014. With this version we now have in-memory capabilities across all data workloads delivering breakthrough performance for applications in throughput and latency. Our relational database in SQL Server has been handling data warehouse workloads in the terabytes to petabyte scale using in-memory columnar data management. With the release of SQL Server 2014, we have added in-memory Online Transaction Processing. In-memory technology has been allowing users to manipulate millions of records at the speed of thought, and scaling analytics solutions to billions of records in SQL Server Analysis Services. 

The platform for people, data and analytics needs to be where the data and the people are. Our on-premises and cloud solutions provide endpoints for a continuum of how the realities of business manage data and experiences – making hybrid a part of every customer’s capability. Today we announced that our Analytics Platform System is generally available – this is the evolution of the Parallel Data Warehouse product that now supports the ability to query across the traditional relational data warehouse and data stored in a Hadoop region – either in the appliance or in a separate Hadoop cluster. SQL Server has seamless integration with VMs in Azure to provide secondaries for high availability and disaster recovery. The data people access in the business intelligence experience comes through Excel from their own data and partner data – and Power BI provides accessibility to wherever the data resides.

The platform for people, data and analytics needs to have full reach. The natural language search query Q&A feature in Power BI for Office 365 is significant in that it provides data insights to anyone who is curious enough to ask a question. We have changed who is able to reach insights by not demanding that everyone learn the vernacular of schemas and chart types. With SQL Server, the most widely deployed database on the planet, we have many people who already have the skills to take advantage of all the capabilities of the platform. With a billion people who know how to use Excel, people have the skills to get engaged with the data.

Looking forward, we will be very busy. Satya mentioned some work we are doing in the machine learning space, and today we also announced a preview of Intelligent Systems Service – just a couple of the things we are working on to deliver a platform for the era of ambient intelligence. The machine learning work originates in what it takes to run services at Microsoft like Bing. We had to transform ML from a deep vertical domain into an engineering capability, and in doing so learned what it would take to democratize ML for our customers. Stay tuned.

The Internet of Things (IoT) space is very clearly one of the most important trends in data today. Not only do we envision the data from IoT solutions being well served by the data platform, but we need to ensure the end-to-end solution can be realized by any customer. To that end, Intelligent Systems Service (ISS) is an Internet of Things offering built on Azure, which makes it easier to securely connect, manage, capture and transform machine-generated data regardless of the operating system platform.

It takes a data platform built for the era of ambient intelligence with data, analytics and people to let companies get the most value from their data and realize a data culture. I believe Microsoft is uniquely positioned to provide this platform – through the speed of in-memory, our cloud and our reach. Built on the world’s most widely-deployed database, connected to the cloud through Azure, delivering insights to billions through Office and understanding the world through our new IoT service – it is truly a data platform for a new era. When you put it all together only Microsoft is bringing that comprehensive a platform and that much value to our customers.

 

Quentin Clark
Corporate Vice President
Data Platform Group

SQL Server 2014 and HP Set Two World Records for Data Warehousing, Leading in Both Performance and Price/Performance


Yesterday we talked about how we are delivering real-time performance to customers in every part of the platform.  I’m excited to announce another example of where we are delivering this to customers, in conjunction with one of our partners. Microsoft and Hewlett-Packard broke two world records in the TPC-H 10 Terabyte and 3 Terabyte benchmarks for non-clustered configurations, demonstrating superior data warehousing performance and price/performance. Each of the world records showed SQL Server beating the previous record held by Oracle/SPARC on both performance and price/performance[1] by significant margins.

10TB: Running on an HP ProLiant DL580 Gen8 server with SQL Server 2014 Enterprise Edition and Windows Server 2012 R2 Standard Edition, the configuration achieved a world-record non-clustered performance of 404,005 queries per hour (QphH), topping the previous record of 377,594 QphH held by Oracle/SPARC[1].  The SQL Server configuration also shattered the price/performance metric at $2.34 USD per query-per-hour ($/QphH), topping Oracle’s $4.65/QphH[1].

3TB: Running on an HP ProLiant DL580 Gen8 server with SQL Server 2014 Enterprise Edition and Windows Server 2012 R2 Standard Edition, the configuration achieved a world-record non-clustered performance of 461,837 queries per hour (QphH), topping the previous record of 409,721 QphH held by Oracle/SPARC[1].  The SQL Server configuration also shattered the price/performance metric at $2.04 USD per query-per-hour ($/QphH), topping Oracle’s $3.94/QphH[1].

Breaking the world records for both performance and price/performance validates how SQL Server 2014 delivers leading in-memory performance at exceptional value. It also validates SQL Server’s leadership in data warehousing.

The TPC Benchmark™ H (TPC-H) is an industry-standard decision support benchmark that consists of a suite of business-oriented ad-hoc queries and concurrent data modifications. The queries and the data populating the database have been chosen to have broad industry-wide relevance. The benchmark illustrates decision support systems that examine large volumes of data, execute queries with a high degree of complexity, and give answers to critical business questions.  The performance metric is the TPC-H Composite Query-per-Hour Rating (QphH), and the price/performance metric is cost per QphH ($/QphH). More information can be found at http://www.tpc.org.

Eron Kelly,
General Manager
SQL Server

 

For more information:

 

[1] As of April 15, 2014.

SQL Server 2014 HP 10TB TPC-H Result: http://www.tpc.org/tpch/results/tpch_last_ten_results.asp

SQL Server 2014 HP 3TB TPC-H Result: http://www.tpc.org/tpch/results/tpch_last_ten_results.asp

Oracle 10TB TPC-H Result: http://www.tpc.org/tpch/results/tpch_result_detail.asp?id=113112501&layout=

Oracle 3TB TPC-H Result: http://www.tpc.org/tpch/results/tpch_result_detail.asp?id=113060701&layout=

Customers using Microsoft technologies to accelerate their insights


At yesterday’s Accelerate your insights event in San Francisco, we heard from CEO Satya Nadella, COO Kevin Turner and CVP Quentin Clark about how building a data culture in your company is critical to success. By combining data-driven DNA with the right analytics tools, anyone can transform data into action.

Many companies, and many of our customers, are already experiencing the power of data - taking advantage of the fastest performance for their critical apps, and revealing insights from all their data, big and small.

Since SQL Server 2014 was released to manufacturing in April, we’ve seen many stories featuring the new technical innovations in the product.  In-memory transaction processing (In-Memory OLTP) speeds up an already very fast experience, typically delivering performance improvements of up to 30x.  Korean entertainment giant CJ E&M is using In-Memory OLTP to attract more customers for its games by holding online giveaway events for digital accessories like character costumes and decorations soon after each game is released.  When it ran tests in an actual operational environment for one of its most popular games, SQL Server 2014 delivered 35-times-faster performance than the 2012 version in both batch requests per second and I/O throughput.

SQL Server 2014 also delivers enhanced performance to data warehouse storage and query performance – NASDAQ OMX is using the In-Memory Columnstore for a particular system which handles billions of transactions per day, multiple petabytes of online data and has single tables with quintillions of records of business transactions.  They have seen storage reduced by 50% and some query times reduced from days to minutes. 

Lufthansa Systems is using the hybrid features of SQL Server 2014 to anticipate customer needs for high-availability and disaster-recovery solutions.  Its pilot combining Microsoft SQL Server 2014 and Windows Azure has led to faster and fuller data recovery, reduced costs, and the potential for a vastly increased focus on customer service and solutions, compared with the company’s current solutions.

Growth in data volumes provides multiple challenges and opportunities.  For executives and researchers at Oslo University Hospital, ease of access to data is important.  Using Power BI for Office 365, they can analyze data in hours rather than months, collaborate with colleagues around the country, and avoid traditional BI costs.  For Virginia Tech, the data deluge presents challenges for researchers in the life sciences, where new types of unstructured data from gene sequencing machines are generating petabytes of data.  They are using the power of the cloud with Microsoft Azure HDInsight not only to analyze data faster, but to analyze it more intelligently, which may in the future help provide cures for cancer.  For The Royal Bank of Scotland, the need to handle multiple terabytes of data and an unprecedented level of query complexity more efficiently led them to the Analytics Platform System (formerly Parallel Data Warehouse).  As a result, it gained near-real-time insight into customers’ business needs as well as emerging economic trends, cut a typical four-hour query to less than 15 seconds, and simplified deployment.

Many customers are getting benefits from individual technologies, but Warner Brothers Games is using multiple BI technologies together to provide a true global enterprise vision into metrics for executive management.  They have used SQL Server to analyze structured data from finance and sales, HDInsight to analyze large amounts of unstructured data such as social data and player trends, and SharePoint and the Power BI tools in Excel to surface the data to executive management. The organization has gained new insights that help drive new business strategies – you can watch the video here.

Whether you’ve already built a data culture in your organization, or if you’re new to exploring how you can turn insights into action, try the latest enhancements to these various technologies: SQL Server 2014, Power BI for Office 365, Microsoft Azure HDInsight, and the Microsoft Analytics Platform System.

Pie in the Sky (April 18th, 2014)


It's pretty quiet this week, so not a lot of links. Maybe everyone is off at the beach for spring break? Here are a few links for those of us stuck in front of computers this week.

Cloud

Client/mobile

.NET

Node.js

Misc.

Enjoy!

-Larry
