2 Intern


2K Posts

March 19th, 2018 08:00

I see. We do still have to "stat" the files on that volume to check if they match the include or exclude patterns. You may be better off excluding that volume and running it as a separate job in that case.

2 Intern


498 Posts

March 19th, 2018 08:00

That is what I have done:

Source Data: All Local Macintosh filesystems

Exclusions:  /Volume/boot image/
             /Volume/OS/
             /Volume/Big Guy/

Inclusions:  /Volume/Big Guy/MyStuff/Want this/

The job ran 15 hours before we ran out of time, but it spent those 15 hours looking at stuff in Big Guy,

line after line:

avtar info 8688 status date time 647,242 files, 126.41 directories, 30.83 GB (231 files.......
    /Volumes/Big Guy/Backups/

The number of files and directories and the 30.83 GB stay the same; just the file name changes.

I'm getting a line about every 15 minutes.

I don't want it to look at Big Guy over and over every run; it will never finish.

Would it be better if I made a job JUST to get that one subdir?

2 Intern


2K Posts

March 19th, 2018 08:00

I think the easiest way to do this is with an include.

Configure the dataset to back up all the volume(s) where there is data you want to back up or use the option to back up all volumes (i.e. make sure /Volumes/Big Guy is covered by the dataset).

Configure an exclude for the specific volume you don't want backed up.

Configure an include for the subdirectory within the excluded volume that you want backed up.

The logic is like this:

Back up every file and directory in the dataset,

unless it's on the exclude list,

and it's not on the include list.

The exclude list overrides the dataset. The include list overrides the exclude list.
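The precedence described above can be written as a tiny predicate. This is a sketch of the stated rules, not Avamar's implementation:

```python
def should_back_up(in_dataset: bool, on_exclude_list: bool, on_include_list: bool) -> bool:
    """Back up every file in the dataset, unless it's on the
    exclude list and not on the include list: the exclude list
    overrides the dataset, and the include list overrides the
    exclude list."""
    return in_dataset and (on_include_list or not on_exclude_list)
```

So with /Volumes/Big Guy excluded but a subdirectory of it on the include list, files under that subdirectory are selected again while the rest of the volume stays excluded.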
