3 Ways Companies can Significantly Reduce Data Storage Requirements

Posted on May 26, 2011 by singleimage

Although the cost of storage media is declining, demand is still soaring: a major company can generate gigabytes of customer and transaction data in a matter of hours. Trimming corporate data not only reduces hardware, software, space, and power costs, but also eases the pressure on backup systems and networks. However, choosing the right technique may not be easy. First, consider how your company uses its data and whether the drop in performance is worth the cost savings of reduced storage.

The technique you choose doesn't necessarily depend on the industry your company is in or the type of data you have. Here are three useful ways to minimize your data volume:

Deduplication:

Deduplication is the process of finding and, where appropriate, removing duplicates in your data storage, which can significantly reduce storage requirements. With properly configured deduplication, a single copy of an attachment file can serve hundreds of employees. Deduplication is especially valuable for backup and archiving, where speed is a secondary priority; some firms have achieved a 72:1 reduction on backup data. Data can be deduplicated at the block or file level. In many cases, systems with finer-grained matching achieve greater space savings, but at two costs: slower data access and a longer deduplication process.
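The block-level approach can be sketched as follows: split data into fixed-size chunks, hash each chunk, and store each unique chunk only once. The chunk size, storage layout, and function names here are illustrative assumptions, not details from the article.

```python
# Minimal sketch of block-level deduplication (illustrative only).
import hashlib

CHUNK_SIZE = 4096  # hypothetical block size

def dedupe_store(data: bytes, store: dict) -> list:
    """Split data into chunks; store each unique chunk once.
    Returns the list of chunk hashes (the 'recipe') needed to rebuild the data."""
    recipe = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # duplicate chunks are stored only once
        recipe.append(digest)
    return recipe

def restore(recipe: list, store: dict) -> bytes:
    """Rebuild the original data from its recipe."""
    return b"".join(store[h] for h in recipe)

store = {}
payload = b"A" * CHUNK_SIZE * 100  # highly redundant data
recipe = dedupe_store(payload, store)
print(len(recipe), "chunks referenced,", len(store), "actually stored")  # 100 vs 1
```

A finer-grained (smaller) chunk size finds more duplicates but means more hashes to compute and look up, which is exactly the access-speed trade-off described above.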

Deduplication can be performed inline (preprocessing), while the data is being transferred to the target, or as postprocessing, after the data is already in place. Postprocessing should be used when it's important to keep backups synchronized with fast-moving data; preprocessing is more appropriate when you have time to spare and want to minimize costs. Although preprocessing can reduce data volume by as much as 20:1, it can hurt performance and doesn't scale well, which may mean more servers are needed for inline deduplication to work properly. Postprocessing deduplication doesn't achieve comparable savings because some space is consumed as a buffer. Deduplication matters to many organizations because fifteen copies of the same file may be in use across customer relationship management, ERP, and data mining systems. Using a common, enterprise-wide deduplication system ensures better compatibility.

The two methods can also be combined: for example, preprocessing mode can be used initially to save more space, then switched to postprocessing once a performance degradation threshold is reached. The IT department can also decide which files undergo preprocessing and which undergo postprocessing, based on their importance and size. It may take some time to strike the right balance between size reduction and performance.
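The trade-off between the two modes can be illustrated with a small sketch: inline deduplication does the hashing work on every write, while postprocessing ingests raw chunks quickly but holds them in a buffer until a batch pass runs. The class and method names are hypothetical.

```python
# Illustrative sketch of inline vs. postprocessing deduplication modes.
import hashlib

class BackupTarget:
    def __init__(self, inline: bool):
        self.inline = inline
        self.store = {}    # hash -> unique chunk
        self.staging = []  # raw chunks awaiting postprocess deduplication

    def write(self, chunk: bytes) -> None:
        if self.inline:
            # Extra hashing on the write path: slower ingest, no buffer needed.
            self.store.setdefault(hashlib.sha256(chunk).digest(), chunk)
        else:
            # Fast ingest, but buffer space is consumed until postprocessing runs.
            self.staging.append(chunk)

    def postprocess(self) -> None:
        """Batch pass: deduplicate everything staged, then release the buffer."""
        for chunk in self.staging:
            self.store.setdefault(hashlib.sha256(chunk).digest(), chunk)
        self.staging.clear()
```

The hybrid policy described above would simply switch the `inline` flag (or route individual files) once a performance threshold is crossed.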

Compression:

Compression is the most popular data reduction technology. It works well with documents, email, and databases, but is less effective for photos and videos. Some storage systems incorporate a form of compression, but you can also use stand-alone compression applications, and some vendors combine compression with deduplication. Experts believe it is more efficient to compress files at the OS level and deduplicate them at the storage layer.

The decision to combine compression and deduplication, or to use them separately, depends on factors such as how easily the data can be returned to a usable form. For example, online transaction processing and online database applications can be significantly affected by real-time compression, which slows performance and delays access. However, modern (and expensive) multi-core server processors may make server-based compression tolerable.

Compression yields different results depending on file type: on SQL databases it can achieve a 6:1 ratio, while on some other file types the ratio may be only 2:1 or lower. In many cases, a combination of deduplication and compression can shrink data requirements by 80%.
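The file-type dependence is easy to demonstrate with a standard-library compressor: repetitive text compresses dramatically, while dense binary data (standing in here for photos or videos, which are usually already compressed) barely shrinks at all. The sample data is made up for illustration.

```python
# Demonstrating how compression ratio depends on data type.
import os
import zlib

text = b"quarterly revenue report for the northeast region " * 1000
binary = os.urandom(len(text))  # stands in for already-dense photo/video data

ratios = {}
for label, data in (("text", text), ("photo-like", binary)):
    ratios[label] = len(data) / len(zlib.compress(data))
    print(f"{label}: {ratios[label]:.1f}:1")
```

Repetitive text like this reaches far better than the article's 6:1 database figure, while the random bytes hover around 1:1, which is why compressing media files is rarely worth the CPU time.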

In general, use deduplication when data redundancy is significant and you want to reclaim substantial space; use compression to save space without excessively sacrificing transfer rates and performance.
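One common way to combine the two techniques, consistent with the expert advice above, is to deduplicate first and compress only the unique chunks. This is a sketch under assumed names and data, not a description of any particular product.

```python
# Sketch of a combined pipeline: deduplicate first, compress unique chunks.
import hashlib
import zlib

def store_chunk(chunk: bytes, store: dict) -> str:
    """Deduplicate by hash; compress only chunks not already stored."""
    digest = hashlib.sha256(chunk).hexdigest()
    if digest not in store:
        store[digest] = zlib.compress(chunk)
    return digest

store = {}
# 15 identical chunks (the article's fifteen-copies scenario) plus one unique chunk.
chunks = [b"invoice header " * 200] * 15 + [b"unique payload " * 200]
recipes = [store_chunk(c, store) for c in chunks]

raw = sum(len(c) for c in chunks)
kept = sum(len(v) for v in store.values())
print(f"raw: {raw} bytes, stored: {kept} bytes "
      f"({100 * (1 - kept / raw):.0f}% reduction)")
```

Deduplication removes the fourteen redundant copies, and compression then shrinks the two chunks that remain, which is how combined reductions in the 80%-and-above range become achievable on redundant data.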

Policy-Based Tiering:

For this method, data is classified into tiers based on how quickly files must be available, how often they are accessed, and their age. Tiering doesn't directly reduce storage requirements, but lower-priority data can be moved to slower, less expensive media. As a result, you free up room for high-priority data.
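A tiering policy built on those three criteria might look like the toy rule set below. The tier names and thresholds are illustrative assumptions, not values from the article; real policies are tuned to the organization's access patterns.

```python
# Toy policy-based tiering rules (thresholds and tier names are illustrative).
def assign_tier(age_days: int, accesses_per_month: int,
                needs_fast_access: bool) -> str:
    """Classify a file by required speed, access frequency, and age."""
    if needs_fast_access or accesses_per_month > 30:
        return "hot"   # fast, expensive media (e.g. SSD / primary storage)
    if age_days < 90 and accesses_per_month >= 1:
        return "warm"  # mid-range disk
    return "cold"      # slow, cheap media (e.g. archive or tape)

print(assign_tier(5, 120, False))   # frequently accessed -> hot
print(assign_tier(400, 0, False))   # old and idle -> cold
```

A scheduled job would periodically re-evaluate files against these rules and migrate those whose tier has changed, freeing fast media for high-priority data.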

About: This article was written by Raja, a web hosting industry watcher who writes regularly on Dedicated Hosting Reviews and Reseller Hosting Reviews.
