Top 10 Reasons to Deploy Microsoft Applications on EMC Infrastructure. Inside The Partnership (EMC/MSFT) Episode #10 + EMC is Gold at SQLPass and SharePoint Conference!!

by Sam Marraccini (@EMCMSFT)

Well, I’m fresh back from vacation, refreshed, recharged and wondering… “What happened to August?” Before vacation, I posted my latest ITP (Inside the Partnership) Episode #10, the top 10 reasons to deploy Microsoft applications on EMC infrastructure. I’m more than happy with the number of views so far!!! As you can tell, I had a great time making the video. I’m especially happy with the debate the video started. I’m hearing things like “Sam, how can you not include RSA?” or “You could do a top 10 for backup and recovery alone.” My reply: “Exactly!!!” Look for more top 10s in the future. And subscribe to the “Inside the Partnership” YouTube channel for all the latest videos.

As August turns to September, my calendar jumps to October, specifically to two conferences where EMC will be a Gold Sponsor. Cloud meets Big Data will be all over both Microsoft’s SQLPass Summit (October 3-9) and the Microsoft SharePoint Conference (October 11-14). Look for the EMC Theater and stop by for your EMC T-shirt. As at Microsoft TechEd, EMC will be giving away gift cards if you are spotted with your T-shirt on.

Follow me on Twitter @EMCMSFT for the latest developments around #SQLPass and #MSP

Here is a quick summary of the first 10 episodes of Inside the Partnership. Let me know what you like, and what you don’t like.

#ITP01 - MTC (Microsoft Technology Center)
#ITP02 - MMS 2011 (Microsoft Management Summit)
#ITP03 - DPA (Data Protection Advisor)
#ITP04 - EMC World in Conference Update
#ITP05 - ESI @ EMC World
#ITP06 - Microsoft TechEd 2011
#ITP07 - SQL Fast Track 3.0
#ITP08 - Hyper-V & VPLEX
#ITP09 - Big Data: Isilon
#ITP10 - Top 10 Reasons for MS Apps on EMC

Have an idea for a future video, or interested in a specific topic of EMC/MSFT integration? Let me know; I’m always looking for ideas.

Is SQL Server 2008 on VMware ESXi 4.1 supported? Find out using Microsoft’s SVVP Wizards

For DBAs who have concerns about support for their SQL Server environments on virtualization technologies other than Hyper-V™ and Virtual Server, Microsoft provides the Server Virtualization Validation Program (SVVP).

This article shows the simple steps required to complete the SVVP Support Policy Wizard to check support of your configuration.

  • Step 3 Select Virtualization Technology, Guest OS and Guest Architecture

  • Step 4 Review the Summary Support Statement

Thanks to Mike Morris for the blog post idea…

Adversary Case Assessment: Putting Your ESI To Good Use

In eDiscovery, we tend to focus most of our attention internally, on our own electronically stored information (ESI).  This makes sense because the data is under our control, and if we cannot get this work done properly, we significantly raise the risk (and cost) of handling eDiscovery.

But what about the other side – what should we do when the other parties in litigation produce their ESI to us?  This is an issue that seems to be discussed very little.  Most companies just have their outside litigation counsel handle this data – but that’s what most of us did just a few years ago with our own ESI.  For companies using an eDiscovery solution for in-house collection and early case assessment, shouldn’t there be a matching process for the data received from other parties?

ACA – Adversary Case Assessment

There’s a lot of value that can be derived from analyzing the other side’s ESI, especially when it is juxtaposed against our own data. If you plan ahead in your eDiscovery process, you can ensure that you’re able to “view” the data in a few different groupings: your data, their data (by party, if there’s more than one), and both together. Let’s look at some of the leverage we can get from using our in-house solution in this manner.

File types. How many different ESI file types did the other side produce? In most cases, you should expect a good mix of email, “productivity” files such as Microsoft Word, Excel and PowerPoint documents, image files (e.g. jpg/gif) and maybe even various log files, possibly in text form (.txt, .log, etc.). You might probe a little more deeply: did they produce any NSF or PST files (the local caches of email that many users keep on their desktops or fileshares)?

If you didn’t receive at least a few items representing these file types – why not? There may be good reasons – you may have agreed to limit eDiscovery, or maybe none of those file types contained relevant information. But ask the question – first of yourself, and then, if necessary, of the other side. Parties frequently focus on email, largely ignoring laptops, fileshares and other repositories of relevant information. Also, because these files are frequently produced as attachments to emails, it may give the appearance that these repositories were searched. Thus, run another filter check: are the non-email items just attachments to emails, or were they produced on their own?
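As a rough sketch (not any particular eDiscovery product’s API), the file-type and attachment-only checks above could look like the following. The record fields `path` and `parent_email_id` are assumptions made for this example:

```python
from collections import Counter
from pathlib import PurePath

def file_type_summary(items):
    """Tally produced items by extension and flag non-email file
    types that only ever appear as attachments to emails."""
    counts = Counter(PurePath(i["path"]).suffix.lower() for i in items)
    # Extensions seen at least once outside an email attachment.
    standalone = {PurePath(i["path"]).suffix.lower()
                  for i in items if i["parent_email_id"] is None}
    attachment_only = sorted(ext for ext in counts
                             if ext not in standalone and ext != ".msg")
    return counts, attachment_only

# A tiny, hypothetical production.
production = [
    {"path": "mail/0001.msg", "parent_email_id": None},
    {"path": "docs/budget.xlsx", "parent_email_id": "0001"},
    {"path": "docs/plan.docx", "parent_email_id": "0001"},
    {"path": "share/plan_v2.docx", "parent_email_id": None},
]
counts, attachment_only = file_type_summary(production)
# Here .xlsx shows up only as an email attachment, which invites the
# question of whether fileshares were actually searched.
```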

Volume. Overall, does it seem like a fair amount of ESI has been produced, i.e. does the number of items seem right? Again, this will vary greatly from case to case, but you should have a good idea of how much “stuff” you are receiving. Back in the paper days, we might question the other side if we produced a warehouse of boxes and they sent us a slim manila folder. How does their production compare to yours? Better yet, start to filter the produced ESI by custodian. Is there a significant amount of information produced from key players? How does it compare to your key people? Interactive charts and graphs can go a long way here in helping you understand what you’re seeing.

Date ranges. Take a look at the dates of the information and see how the volume varies over time. Email will normally be grouped by its sent date, but files could be grouped by creation, modification or last-access date. Is there a high volume of information during the time you would expect to be most relevant? What items, in each file type category, are the oldest and most recent by date – and does that fit announced data retention policies and the scope of eDiscovery? Do the dates and volumes fit your understanding of the case? Do this work first with filters that exclude your own data, then include it for a second review. How much does that change the picture, if at all? Does the other side seem to think that a different range of dates is more important than you did?
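The month-by-month volume check, with and without our own data, could be sketched like this; the `date` and `producing_party` fields are hypothetical stand-ins for whatever metadata your solution exposes:

```python
from collections import Counter
from datetime import date

def volume_by_month(items, include_ours=True):
    """Count items per calendar month, optionally filtering out our
    own production so the other side's timeline stands alone."""
    return Counter(
        (i["date"].year, i["date"].month)
        for i in items
        if include_ours or i["producing_party"] != "ours"
    )

items = [
    {"date": date(2011, 3, 10), "producing_party": "ours"},
    {"date": date(2011, 3, 22), "producing_party": "theirs"},
    {"date": date(2011, 6, 1),  "producing_party": "theirs"},
]
theirs_only = volume_by_month(items, include_ours=False)
combined = volume_by_month(items)
# Comparing the two Counters shows how much adding your own data
# changes the picture for any given month.
```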

Email Domains. Look at all of the email domains represented in the production as either senders or recipients. Are there any “new” companies of interest? Maybe there’s a third party shown in email that could have important information available by subpoena. Did the other side include any information sent to or from their law firm? If not, was every item really privileged – and did they produce a privilege log?
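Pulling the set of domains out of a production is simple in principle; a minimal sketch, assuming each message record carries `from` and `to` address fields:

```python
def production_domains(messages):
    """Collect every domain seen as a sender or recipient, so new
    third parties (or missing law-firm traffic) stand out."""
    domains = set()
    for msg in messages:
        for addr in [msg["from"], *msg["to"]]:
            # Take everything after the final "@", case-folded.
            domains.add(addr.rsplit("@", 1)[-1].lower())
    return sorted(domains)

# Hypothetical messages from the other side's production.
messages = [
    {"from": "cfo@theirco.example",
     "to": ["vp@theirco.example", "adviser@thirdparty.example"]},
    {"from": "vp@theirco.example", "to": ["cfo@theirco.example"]},
]
domains = production_domains(messages)
# thirdparty.example would be a "new" company worth a closer look.
```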

Email Threading. Because of its nature, email can be “threaded” into conversations so that you can view a nicely ordered chain of emails that went back and forth between parties. Even one- or two-message “side conversations” become very noticeable when a group of emails has been properly threaded. Using your own key email messages as a starting point, thread the messages to include the other side’s production. Are there new “back channel” or side conversations that the other side held internally, which you never saw? Were key messages re-forwarded well after the fact – say, weeks or months later, as “reasonable anticipation of litigation” began to occur? Did you receive another copy of emails representing conversations with the other party (which you already produced), or did they not produce those messages (and if not, why not)?
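Real eDiscovery tools thread on message IDs and references, but a naive subject-line sketch conveys the idea of merging both sides’ messages into ordered conversations; the `subject`, `sent` and `source` fields are assumptions for the example:

```python
import re
from collections import defaultdict
from datetime import datetime

def thread_by_subject(messages):
    """Naive threading: strip reply/forward prefixes, group by the
    normalised subject, and sort each thread chronologically."""
    threads = defaultdict(list)
    for msg in messages:
        subject = re.sub(r"^(?:(?:re|fw|fwd):\s*)+", "",
                         msg["subject"].strip(), flags=re.I)
        threads[subject.lower()].append(msg)
    for msgs in threads.values():
        msgs.sort(key=lambda m: m["sent"])
    return dict(threads)

messages = [
    {"subject": "Pricing", "sent": datetime(2011, 1, 3), "source": "ours"},
    {"subject": "RE: Pricing", "sent": datetime(2011, 1, 4), "source": "theirs"},
    {"subject": "FW: Pricing", "sent": datetime(2011, 4, 2), "source": "theirs"},
]
threads = thread_by_subject(messages)
# The April forward, months after the original exchange, is exactly
# the kind of late re-forwarding worth examining.
```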

Wrapping Up

These are just a few very basic ideas of how you can begin to evaluate the other side’s ESI production.  Leveraged properly, in-house eDiscovery solutions can be another powerful tool for corporate (and law firm) counsel to rapidly get their arms around a case and begin to evaluate the other side’s production, too.  Happy ACA-ing!

The Fear, The Doubt, The Concerns of Cloud


Mike Kiersey
EMEA Practice Manager
EMC Corporation

Nephophobia is the fear, or phobia, of clouds. That’s fine if you are thinking about the fluffy white objects in the sky, but it is not so harmless for most people responsible or accountable for all or some IT services within their organisation.

In my mind, “cloud computing” can be defined in three key areas:

• Private Cloud: within the four walls of your own datacentre
• Public Cloud: leveraging IaaS, PaaS and SaaS from the large providers, e.g. Amazon and Microsoft
• Hybrid Cloud: leveraging existing on-premise services and extending out to the public cloud for others

Everyone wants the “business use case” of the public cloud: agility, price, hassle-free and simple consumption. In the real world, however, some customers are still on a transformation path, consolidating and rationalising applications to reduce cost and complexity, and the mere notion of relinquishing control of their critical applications for a public cloud provider to manage, store and protect is a step too far for them to sign up to: fear of losing control, and lack of readiness.

Let’s take a look at the uncertainty and doubt that people have about cloud. Doubt ranges from the personal (“will I still have a job?”), to the operational (“how can I be sure they can restore my application data?”), to the strategic (“will I be locked into a cloud service?”). We need to embrace a new way of thinking to balance the long-standing IT practices we are conditioned to; the doubt is only a problem because some of us don’t have a clear idea of what can be achieved with cloud services.

Thinking of people’s roles: as an end user, I adopt and embrace cloud services because I have never had, or needed, control of the application; I’m just a consumer. But the moment I need permission to “do something”, that’s when I complain. Loss of control!

What about my data? From a home user’s pictures to mission-critical databases and messaging services: is my data safe, e.g. backed up every day as per the SLA? Who is managing my data? Who can see my data? What happens if they misplace or delete it? At the end of the day, it’s information that makes the world go round, from gossip in the workplace to the secrets of a merger or acquisition. All data about my organisation must be secure!

Security, for some, is the most important requirement in a cloud environment. It must be secure, but at what cost? Not all data in your organisation should be classified the same way. Organisations take reasonable actions to secure data from unwanted access, to achieve compliance and demonstrate assurance to external bodies. Everyone should be concerned about security and make reasonable investments in protection. In truth, cloud shouldn’t increase the risk to my organisation, provided adequate security measures and auditing capabilities are applied.

People, control, data and security are the big areas. The rest are the common fears we have always had about emerging technology: the introduction of complexity, the learning curve, lock-in to cloud providers and, in some cases, losing our jobs.

But the biggest doubt IT people have about the cloud is failing to understand its true business value. Cloud right now is a technology buzzword, but to make the biggest impact, its power must be expressed in business terms. As the benefits of cloud technology become more profound and increasingly mainstream, fear and doubt will become a smaller barrier.

But right now, “Cloud” is here and here to stay. Embrace it with diligence and professionalism.

How to Build a Data Warehouse with EMC, Cisco, and Microsoft

The Microsoft SQL Server Fast Track Data Warehouse 3.0 reference configurations were built, designed, and tested by Cisco, EMC and Microsoft to provide:

  • Architectural guidance for customers, partners and resellers who are evaluating, planning, or deploying Microsoft SQL Server-based data warehouse solutions
  • Performance and capacity guidance for selecting server, storage and connectivity solutions where out-of-the-box performance and the ability for rapid deployment are important

Reference Configurations

These reference configurations use Cisco UCS C-Series rack-mount servers and EMC VNX5300™ storage systems connected through a Cisco Nexus 5500 Series switch using the Fibre Channel over Ethernet (FCoE) protocol.

Two reference configurations are introduced. The medium and large enterprise configurations are designed to meet a broad range of data warehouse requirements, scaling from 8 up to 40 terabytes using the compression capabilities of SQL Server 2008 R2 Enterprise.

  • The medium enterprise configuration consists of a Cisco UCS C250 M2 Extended-Memory Rack-Mount Server equipped with two Intel® Xeon® X5680 processors (3.33 GHz, 12 MB L3 cache, 130 W), 96 GB of memory and two Cisco UCS P81E Virtual Interface Cards. The storage system consists of an EMC VNX5300 connected through a Cisco Nexus 5548 switch. See Table 1 for the configuration details, Table 2 for the benchmark results, and Table 3 for the bill of materials.
  • The large enterprise configuration consists of a Cisco UCS C460 M1 High-Performance Rack-Mount Server equipped with four Intel® Xeon® X7560 processors (2.26 GHz, 24 MB cache, 130 W), 256 GB of memory and eight QLogic QLE8152 Dual Port 10 Gb Converged Network Adapters. The storage consists of two EMC VNX5300 storage systems connected through a Cisco Nexus 5548 switch. See Table 4 for the configuration details, Table 5 for the benchmark results, and Table 6 for the bill of materials.

You can download the entire paper here.