
February 11th, 2016 04:00

Cloud Storage Sizing for CloudBoost

Hello,

I'd like to size the ECS storage necessary for the following NetWorker CloudBoost scenario:

Total data: 20 TB, consisting of

VM files: 5 TB

Database: 3 TB

File data: 12 TB

The connection to ECS is over the Internet at, let's say, 40 Mbit/s.

The customer wants to send a clone to the cloud every month.

The retention time is one year, which means we have 12 monthly clones.

Is there a formula to calculate the necessary space on the ECS?

Kind regards

Richard

23 Posts

March 17th, 2016 20:00

Thanks Richard!

I've added this as an RFE.

9 Posts

February 17th, 2016 14:00

Thanks for your question. I'm getting a support agent that can answer this question for you.

23 Posts

February 17th, 2016 16:00

Good question, rkwidzinski.

 

There are a lot of factors we would need to consider to provide an accurate estimate of the capacity that should be provisioned on ECS, such as the change ratio of those files and the dedupe potential (higher for VMs and databases, lower for typical user files).

 

Here are some rough examples:

  • 20 TB × 50% monthly change × 12 clones = 120 TB in one year (with 0% dedupe).
  • 120 TB × 75% dedupe = 30 TB in one year.
  • Maximum for a year at a near 100% change rate = 240 TB with 0% dedupe.
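Under the assumptions in these bullets, the yearly capacity estimate can be sketched in Python (a rough model, not an official CloudBoost sizing formula; the function name is illustrative):

```python
# Rough yearly cloud-capacity sketch based on the figures in this thread.
# All inputs are assumptions, not CloudBoost-verified values.

def yearly_capacity_tb(full_tb, monthly_change, dedupe_savings, clones_per_year=12):
    """Data landing in the cloud per year: monthly changed data times the
    number of clones, reduced by the assumed dedupe savings (0.75 = 75% removed)."""
    raw = full_tb * monthly_change * clones_per_year
    return raw * (1 - dedupe_savings)

print(yearly_capacity_tb(20, 0.50, 0.0))   # 120.0 TB, no dedupe
print(yearly_capacity_tb(20, 0.50, 0.75))  # 30.0 TB, 75% dedupe
print(yearly_capacity_tb(20, 1.00, 0.0))   # 240.0 TB, worst case
```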

 

Regarding the overall speed and time it would take: at 40 Mbit/s, the initial 20 TB upload would take about 46 days, with subsequent transfers dependent on the change rate. In a realistic use case, the professional services teams will suggest a local cache.
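The transfer-time figure can be sketched in Python (a rough estimate assuming decimal TB and no protocol overhead; the function name is illustrative):

```python
# Transfer-time sketch for seeding over a fixed WAN link.
# Assumes decimal units (1 TB = 1e12 bytes) and ignores protocol overhead.

def transfer_days(size_tb, mbit_per_s):
    bits = size_tb * 1e12 * 8            # TB -> bits
    seconds = bits / (mbit_per_s * 1e6)  # link speed in bit/s
    return seconds / 86400               # seconds -> days

print(round(transfer_days(20, 40), 1))   # ~46.3 days for the initial 20 TB
```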

12 Posts

February 18th, 2016 02:00

Hello Dillon,

let's assume the following to get an idea of the necessary storage.

20 TB Full Backup consisting of:

12 TB file: daily change rate 5%

5 TB VM: daily change rate 5%

3 TB database: daily change rate 20%

The customer runs his normal backups with NetWorker: 1 full and 6 incrementals per week.

Retention Time 6 weeks

Every month he needs a clone into the cloud with CloudBoost.

Looking at the daily change rates, I can estimate the following monthly change rates:

12 TB file: 60% change rate against the last clone

5 TB VM: 60% change rate against the last clone

3 TB database: 100% change rate against the last clone

CloudBoost does its own compression and its own dedupe, which means we don't have to factor in NetWorker when calculating cloud storage.

I would calculate in the following way:

Dedupe for the first clone is 1:1.5 for all data

Compression is 1:2

Bandwidth is 40 Mbit/s

Seeding: 20 TB / 1.5 / 2 ≈ 6.67 TB

Seeding window: 6.67 TB × 8 / 40 Mbit/s / 3600 s ≈ 370 hrs ≈ 15 days

For the following clones we get a higher dedupe ratio, which is applied to the data that has changed:

Dedupe on the subsequent clones is 1:2 for all data

Compression is 1:2

12 TB file: 60% change rate against the last clone = 7.2 TB

5 TB VM: 60% change rate against the last clone = 3 TB

3 TB database: 100% change rate against the last clone = 3 TB

In sum we have for each subsequent clone: 13.2 TB / 2 / 2 = 3.3 TB

The copy window would be: 3.3 TB × 8 / 40 Mbit/s ≈ 183 hrs, or about 8 days

If the dedupe ratio doesn't increase for the subsequent clones and there is no data growth, I can calculate the necessary disk space in the cloud as follows:

Cloud space: 6.67 TB + 12 × 3.3 TB ≈ 46 TB
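The sizing model above can be carried through in a short Python sketch. The dedupe and compression ratios and the change rates are the assumptions stated in this post, not measured CloudBoost values, and the function names are illustrative:

```python
# Sketch of the monthly-clone sizing model with the arithmetic carried through.
# Ratios (1:1.5 / 1:2 dedupe, 1:2 compression) and change rates are assumptions.

TB = 1e12
MBIT = 1e6

def reduced_tb(raw_tb, dedupe, compression):
    """Apply dedupe and compression ratios to raw data."""
    return raw_tb / dedupe / compression

def window_hours(size_tb, mbit_per_s):
    """Transfer time for a reduced data set over the WAN link."""
    return size_tb * TB * 8 / (mbit_per_s * MBIT) / 3600

seed = reduced_tb(20, 1.5, 2)            # first clone: ~6.67 TB
changed = 12 * 0.6 + 5 * 0.6 + 3 * 1.0   # 13.2 TB changed per month
clone = reduced_tb(changed, 2, 2)        # each subsequent clone: 3.3 TB
cloud_space = seed + 12 * clone          # one year of monthly clones

print(round(seed, 2), round(window_hours(seed, 40)))    # 6.67 TB, ~370 h
print(round(clone, 2), round(window_hours(clone, 40)))  # 3.3 TB, ~183 h
print(round(cloud_space, 1))                            # ~46.3 TB
```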

As you can see, I made a lot of assumptions. It would be more accurate if I had realistic dedupe figures.

For Data Domain, and also in the Commvault calculation, the dedupe ratios for the first full, subsequent fulls, and incrementals are specified.

I am missing these values for CloudBoost.

Kind regards

Richard

23 Posts

March 1st, 2016 11:00

Hi Richard (rkwidzinski),

 

Sorry for the delayed response. Can you please provide an example from Data Domain and Commvault? This will help me file an exact RFE based on your information.

 

Thanks,

Dillon

12 Posts

March 2nd, 2016 02:00

Hello Dillon,

here you can see the Commvault calculation.

In discussions with Commvault, they told me I should apply a factor of 1.5 to 2 to the calculated size of 22.2 TB.

The Data Domain calculation shows similar values, but the use case cannot really be reproduced.

See the following screenshot.

Kind regards

Richard
