Phukon
August 24th, 2012 22:00
DX (DiskXtender) and EX (EmailXtender) are some of the archiving software products you can look at. If you are looking for dedicated archiving hardware, Centera/Atmos can be a good option.
LouisLu
August 27th, 2012 01:00
Data Domain also has an appliance dedicated to archiving: the DD Archiver, whose capacity scales up to 28.5 PB logical (note that this is logical capacity, i.e. assuming deduplication, not raw physical capacity). Is the 300 TB of data coming from a physical environment or a virtualized one? I believe Atmos/Data Domain/Centera could be a good option.
PawelO1
August 27th, 2012 02:00
If these 300 TB are files/email/SharePoint, then you can use EMC SourceOne (S1); the DX/EX products mentioned above no longer exist and have been replaced by S1.
EMC S1 can archive data to many hardware storage/archive platforms:
* disk/LUNs attached directly from an array (e.g., a large volume built from SATA drives)
* a NAS share
* Centera/Atmos/Data Domain.
If these 300 TB are used by a specific application, you can check whether that application is certified as EMC Centera compliant (there are hundreds that are). The data could then be archived directly from your application to a Centera/Atmos solution.
Centera is EMC's top hardware archive storage platform; it complies with many US and international regulations and has been around for more than 8 years. Atmos is the next-generation archive storage platform; it now fits mainly web application use cases, but it also supports the Centera API for storing archive data. If you do not have very strict regulatory requirements, I would definitely go for Atmos, which can also serve as a multisite cloud platform for user data.
I hope this helps.
asimmons1
August 27th, 2012 10:00
The data is proprietary file data. I can't give specifics, but suffice it to say it is not something that SourceOne will have a module for.
The solution should be able to crawl the data for changes within a reasonably short period of time, which I believe rules out a tool that runs from a single server or appliance. I am looking for something that can scale its performance when monitoring the archive for changes.
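To make the scaling requirement concrete, here is a minimal sketch of the kind of parallel change-crawl I mean. It is an illustration only, not a product: ARCHIVE_ROOT, the manifest path, and the (size, mtime) comparison are all assumptions.

```python
# Minimal sketch of a parallel change-crawl: partition the archive by
# top-level directory and scan the partitions concurrently, comparing
# each file's (size, mtime) against a manifest saved by the previous run.
# ARCHIVE_ROOT and MANIFEST are hypothetical placeholders.
import json
import os
from concurrent.futures import ProcessPoolExecutor

ARCHIVE_ROOT = "/archive"            # hypothetical mount point
MANIFEST = "/var/tmp/manifest.json"  # hypothetical previous-scan state

def scan_partition(top):
    """Walk one top-level directory and record (size, mtime) per file."""
    seen = {}
    for dirpath, _, filenames in os.walk(top):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # file vanished mid-scan
            seen[path] = (st.st_size, int(st.st_mtime))
    return seen

def crawl():
    # Load the manifest from the previous run (empty on first run).
    try:
        with open(MANIFEST) as f:
            previous = {k: tuple(v) for k, v in json.load(f).items()}
    except FileNotFoundError:
        previous = {}

    tops = [e.path for e in os.scandir(ARCHIVE_ROOT) if e.is_dir()]
    current = {}
    # Scan partitions in parallel; scaling beyond one server would mean
    # distributing these partitions across machines instead of processes.
    with ProcessPoolExecutor() as pool:
        for part in pool.map(scan_partition, tops):
            current.update(part)

    added = current.keys() - previous.keys()
    removed = previous.keys() - current.keys()
    modified = {p for p in current.keys() & previous.keys()
                if current[p] != previous[p]}
    print(f"{len(added)} added, {len(removed)} removed, "
          f"{len(modified)} modified")

    with open(MANIFEST, "w") as f:
        json.dump(current, f)

if __name__ == "__main__":
    crawl()
```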
keran_lee
December 8th, 2024 04:32
I'd suggest looking into Amazon Web Services S3 object storage. It is very easy to scale up or down and the pricing is reasonable. AWS also backs S3 with a published availability SLA (S3 Standard is designed for 99.99% availability).
Also, AWS S3 has good APIs for accessing the content programmatically.
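For illustration, a minimal boto3 sketch of that programmatic access; the bucket name and key prefix are placeholders of mine, not anything from this thread:

```python
# Minimal boto3 sketch: upload, list, and retrieve archive objects.
# "my-archive-bucket" and the "archive/..." keys are hypothetical.
import boto3

s3 = boto3.client("s3")
BUCKET = "my-archive-bucket"

# Upload a file into the archive.
s3.upload_file("report.dat", BUCKET, "archive/2024/report.dat")

# List what is stored under a prefix.
resp = s3.list_objects_v2(Bucket=BUCKET, Prefix="archive/2024/")
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])

# Retrieve an object back out of the archive.
s3.download_file(BUCKET, "archive/2024/report.dat", "restored.dat")
```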
user_0556ac
January 17th, 2026 15:26
For datasets around 300TB, most teams I’ve seen lean toward enterprise-grade solutions like object storage–based archiving with tiering rather than traditional archive tools.
Performance usually depends more on parallel ingestion, indexing, and network throughput than the raw size itself, so tools that support scalable, cloud or hybrid architectures tend to handle this much better.
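As a concrete illustration of tiering on object storage, here is a minimal sketch of an S3 lifecycle rule that moves archive objects to colder storage classes over time; the bucket name, prefix, and day thresholds are assumptions, not a recommendation for any specific dataset:

```python
# Minimal sketch: S3 lifecycle rule that tiers archive objects to
# colder storage classes as they age. Bucket name, prefix, and the
# 30/180-day thresholds are hypothetical.
import boto3

s3 = boto3.client("s3")
s3.put_bucket_lifecycle_configuration(
    Bucket="my-archive-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-archive-data",
                "Filter": {"Prefix": "archive/"},
                "Status": "Enabled",
                "Transitions": [
                    # Infrequent-access tier after a month...
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    # ...and deep archive after six months.
                    {"Days": 180, "StorageClass": "DEEP_ARCHIVE"},
                ],
            }
        ]
    },
)
```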