Friday, May 4, 2012

Windows Server 2012 Data Deduplication Part 5

The five key questions
It is very important to plan before migrating or upgrading any part of the infrastructure. I always follow this practice: you must be crystal clear on what you need to implement, so that you can justify why you are doing it. This article is going to be longer than the previous few on data deduplication, because I believe this area deserves extra effort. Then again, best practice is not mandatory; it is simply a practice that helps you avoid disappointment or failure. To get started, I will run through the five key questions with you.

Where - are we today? 
As mentioned in my introduction to Storage Spaces, the biggest challenge we face is data growth, and each year we spend thousands of dollars or more to sustain that data. We can still follow the traditional process of housekeeping the data or putting quotas on the volumes to make sure they do not grow, but is that really a good approach? Highly important data, and data that has not even been classified yet, all end up receiving the same treatment. To cope with this, organizations keep planning for storage expansion.
 
Why - are we doing it?
I am not sure whether the fellow readers of my blog face the same situation as the majority. Usually the ultimate goal for an organization is to spend less and achieve a good ROI, but how can that happen when storage must keep expanding year after year? Data deduplication is part of the solution, and it is now a free feature in Windows Server 2012 (formerly Windows Server 8 Beta). It is a new evolution of the technology, and the offer is hard to resist.
 
How - are you going to start?
This could be a long exercise, but I will keep it short. You must first identify what you have in your corporate network, including the types of applications, any cluster configurations, and so on. To use data deduplication, a volume must meet the following requirements:
  1. It must not be a system or boot volume. Deduplication is not supported on operating system volumes.
  2. Volumes may be partitioned MBR or GPT and must be formatted with the NTFS file system.
  3. Volumes may reside on shared storage, such as a Fibre Channel or SAS array or an iSCSI SAN, and Microsoft Failover Clustering is fully supported.
  4. Cluster Shared Volumes (CSVs) are not supported.
  5. Microsoft's new Resilient File System (ReFS) is not supported.
  6. Volumes must be exposed to Windows as non-removable drives. Remotely mapped drives are not supported.
  7. Files that are encrypted are skipped by deduplication.
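Once you have a volume that meets the requirements above, enabling the feature is straightforward with PowerShell. Below is a minimal sketch; the drive letter E: is only an example, and on your system the volume must of course be a data volume, not the system or boot volume.

```powershell
# Install the Data Deduplication role service (part of File and Storage Services).
Install-WindowsFeature -Name FS-Data-Deduplication

# Enable deduplication on the target data volume
# (E: is an example - it cannot be a system or boot volume).
Enable-DedupVolume -Volume E:

# Optionally start an optimization job right away instead of
# waiting for the background schedule.
Start-DedupJob -Volume E: -Type Optimization

# Check progress and the savings achieved so far.
Get-DedupStatus -Volume E:
```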
Which - are the areas that need to change?
If your organization meets the requirements above, then you are good to go. Generally, you should identify your targets first and then plan for them gradually; the recommendation is always to do it in phases.

Below are some of the workloads where you can consider putting the dedup feature in place.
  1. General file shares: Group content publication/sharing, user home folders and profile redirection (offline files) 
  2. Software deployment shares: Software binaries, images, and updates 
  3. VHDs: VHD file storage for provisioning to hypervisors
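Before committing to any of these workloads, you can estimate how much space deduplication would actually save by using the DDPEval.exe evaluation tool that ships with Windows Server 2012 (in \Windows\System32). A quick sketch, assuming hypothetical example paths:

```powershell
# Estimate deduplication savings for a local folder of user home directories.
& "$env:SystemRoot\System32\ddpeval.exe" D:\Shares\Home

# The tool can also evaluate a remote file share before migration.
& "$env:SystemRoot\System32\ddpeval.exe" \\FileServer01\Software
```

Running the evaluation on a representative share gives you real numbers to justify the phased rollout mentioned above.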

When - should I do it?
If you ask me when you should do it, I would probably say after Microsoft releases the first service pack, as the system will have had time to stabilize. Don't get me wrong, I am not saying Microsoft is unstable; it is simply standard practice to wait until a product matures.

Reference
Part 5: Planning for data deduplication

Summary 
Let us know what you think, what you learned, and what you hope to see in the next articles! Connect with us on GOOGLE+, TWITTER and FACEBOOK.
