9 Legend

20.4K Posts

February 18th, 2014 08:00

I am not aware of such a command. There is a way to get into the DD file system (it requires even higher privilege than SE mode), but even then, how would you know what to delete? Yes, you can list files by access/modified date, but how would you reconcile what you deleted manually with what you deleted through your backup application? This can turn really bad: you would end up with stale records in your backup application's database. I would tackle this from the backup application itself.

3 Posts

February 18th, 2014 09:00

How are you writing the data to the DD?

The best way to delete data on a DD is from the tool that wrote it, so that the tool also records the deletion and you avoid conflicts.

You can also check how much cleanable space the file system reports and see whether running a clean frees up enough space.
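
For example, you can check that from the DD OS CLI with the standard space command (no SE mode needed); its output includes a cleanable-space column:

    # show total, used, available, and cleanable space per file system resource
    filesys show space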

February 18th, 2014 09:00

Thank you for the replies. I was hoping there would be a command to list the files by date so I could just delete them based on age, much the same way I run a command on the DD to list the old VTL tapes and then relabel them via the backup application.
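
That VTL listing looks something like this (a sketch only; "Default" is a placeholder pool name, and the exact arguments can vary by DD OS release):

    # list the tapes in a VTL pool so the old ones can be identified
    vtl tape show pool Default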

The cleanable space currently shows 39 TB, but the last clean failed midway through, so I had to start it again. Maybe that 39 TB includes the data that is over a year old. The clean runs for days, so I probably won't know until the weekend.
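
For reference, the running clean can be checked on with the standard CLI commands:

    # report whether a clean is running and its current phase/progress
    filesys clean status
    # follow the clean's progress as it runs
    filesys clean watch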

Thanks!

1 Rookie

116 Posts

February 18th, 2014 10:00

Hi Katmandu,

My name is Patrick Betts and I am a Data Domain TSE. The command you are looking for is sfs_dump, but it is a BASH-level command that must be run by a Data Domain TSE. It generates an output file that lists files by folder path, size, and age. The file will need to be uploaded to us; we have a script that we run against it to parse the data into a CSV file, which we then send back to you. From there you can sort it however you wish. Please email me directly for more information.
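
Once you have the CSV back, you can sort it yourself from any Unix shell. A minimal sketch, assuming the parsed file is comma-separated with age in the third column (the filename is a placeholder; adjust the field number to match the actual layout):

    # sort by the age column, numerically, oldest entries first
    sort -t, -k3,3nr parsed_sfs_dump.csv > files_by_age.csv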

Best Regards,
Patrick

February 18th, 2014 10:00

Thank you, I sent an email with the serial number.

4 Operator

14.4K Posts

February 18th, 2014 10:00

I agree, you should focus on why this cleanup fails (I suspect it may have been failing for some time now). Do you use replication? If yes, is it possible to suspend replication while you run the cleanup?
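
If suspending it is an option, a minimal sketch of the sequence (replication disable/enable are standard DD OS commands; confirm the right scope for your contexts before running them):

    # check the state of all replication contexts
    replication status
    # pause replication on all contexts while the clean runs
    replication disable all
    # resume replication once the clean completes
    replication enable all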

1 Rookie

116 Posts

February 18th, 2014 10:00

Katmandu,

Hrvoje is correct in thinking this may be a replication lag issue.  Please email me the serial number of the affected system and I will look into your AutoSupports and open a support case for you if needed.

patrick.betts@emc.com

Best Regards,
Patrick

3 Posts

February 18th, 2014 10:00

39 TB is a lot of space to free up.

Check the reason behind the cleaning failures; that is where you need to look to get the space back.
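
A couple of standard places to look for the failure reason from the CLI (a sketch; exact alert and log contents vary by DD OS release):

    # check alerts raised around the time the clean failed
    alerts show history
    # page through the system messages log
    log view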

1 Rookie

116 Posts

February 18th, 2014 11:00

Katmandu,

Thank you. Per our email discussion, this is a replication lag issue, as Hrvoje suggested. When replication is disabled, the system still holds onto the snapshots until replication is re-enabled. In this instance, you needed to run "replication break all" and then run "filesys clean".
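
For anyone who finds this thread later, the sequence looks like this. Note that "replication break" permanently removes the replication contexts, so only use it if you intend to re-create the pairs afterwards:

    # break all replication contexts, releasing the snapshots they hold
    replication break all
    # kick off a new clean to reclaim the freed space
    filesys clean start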

Best Regards,
Patrick
