saspkp

Avamar's limit on Directory depth and File count

Hi Experts,

I'm backing up a data server with more than 1 TB of data. Every time I run the backup, it gets to about 200-300 GB and times out. At the same time, no data seems to actually be transmitted: I see the progress byte count grow, but I wasn't able to restore anything that had supposedly been backed up before the timeout. That's how I came to the conclusion that no data was being transmitted.

The funny thing is that if I select individual directories and back them up, it works. That's why I'm wondering whether Avamar has any limits on directory depth or number of files.

I have about 55 GB of free space on the root drive, so there should be enough room to cache large files.

I'm using Avamar v5.0.3.29 for both the server and the client.

Any help will be greatly appreciated. Thank you!

Kevin

5 Replies

Re: Avamar's limit on Directory depth and File count

Hi Kevin,

It seems the timeout is being reached on the client side. In the client properties, change the Overtime option to "Always allow Overtime".

Regards.

Luis Rogerio

saspkp

Re: Avamar's limit on Directory depth and File count

Is the default allowed backup time 24 hours?

I can give this a try.

Thanks!

Kevin


Re: Avamar's limit on Directory depth and File count

Hello Kevin,

I think the problem could be related to tuning on the client side.

I would suggest first increasing the "--hashcachemax" value from the default of "-16" to "--hashcachemax=-8".
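For reference (and please double-check this against the Avamar client documentation, since I'm going from memory), a negative --hashcachemax value is interpreted as a fraction of physical RAM: -16 means RAM/16, -8 means RAM/8. So going from -16 to -8 roughly doubles the hash cache ceiling. A quick back-of-the-envelope sketch on a hypothetical 8 GB server:

```python
# Rough hash cache sizing, assuming a negative --hashcachemax value
# means "physical RAM divided by |value|" (an assumption -- verify
# against the Avamar client documentation for your version).

def hash_cache_mb(ram_gb: float, hashcachemax: int) -> float:
    """Approximate hash cache ceiling in MB for a negative setting."""
    assert hashcachemax < 0, "only the fraction-of-RAM form is modeled"
    return ram_gb * 1024 / abs(hashcachemax)

# Hypothetical 8 GB file server:
print(hash_cache_mb(8, -16))  # default setting  -> 512.0 MB
print(hash_cache_mb(8, -8))   # suggested value  -> 1024.0 MB
```

The point is just that -8 gives the client twice as much room for hash cache entries as the default, which matters when the server holds millions of files.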

Questions:
  - How many files are on this file server? 1 million, 2 million, or more?
  - How much physical memory does this server have, in GB?

Can you provide us with more details, please?

regards,
-r


Re: Avamar's limit on Directory depth and File count

We should avoid making any changes to the configuration until we have investigated the problem and determined what actions may improve the situation.

The first thing to do would be to take a look at the client log to see what's really happening.

If the caches are filling up, it will be quite clear. If there's some other error, we'll have something to work with. If there's a resource bottleneck involved, we will usually get clues from the status entries, which report how busy the client is and how many files it has backed up over time.


Re: Avamar's limit on Directory depth and File count

Hello Kevin:

I think your problem is that the blackout window is killing your backup session; the backup may be taking more than 24 hours.

You can try a few different steps:

1. Do smaller manual backups: if your data set is 1 TB, try doing 200 GB backups until you have covered the full 1 TB, then try the full 1 TB backup.

2. Disable the maintenance window (if the Avamar capacity is less than 50% full) and let the backup run with override until it finishes.

3. It is important to read about the tuning best practices; backing up millions of small files is not the same as backing up a few big files.

I hope this helps a little.

Oscar
