
ECS 3.6.2 Data Access Guide


Integrate a simple Hadoop cluster with ECS HDFS

You can configure a Hadoop distribution to use the ECS storage infrastructure with ECS HDFS.

To perform this integration procedure, you must have:

  • A working knowledge of your Hadoop distribution and its associated tools.
  • The Hadoop credentials that allow you to log in to Hadoop nodes, to modify Hadoop system files, and to start and stop Hadoop services.

The following steps must be performed:

  1. Install Hortonworks HDP using Ambari.
  2. Create a bucket for HDFS using the ECS Portal.
  3. Plan the ECS HDFS and Hadoop integration.
  4. Obtain the ECS HDFS installation and support package.
  5. Deploy the ECS HDFS Client Library (not required if you have used Ambari Hortonworks for ECS).
  6. Configure ECS client properties.
  7. Verify Hadoop access to ECS.
  8. Relocate the default file system from HDFS to an ECS bucket.
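Step 8 above usually amounts to changing the Hadoop `fs.defaultFS` property in `core-site.xml` so that it points at the ECS bucket. A hedged sketch of what that property might look like; the bucket, namespace, and federation names here are placeholders, not values from this guide:

```xml
<!-- core-site.xml: point the Hadoop default file system at an ECS bucket.
     mybucket, mynamespace, and myfederation are hypothetical names. -->
<property>
  <name>fs.defaultFS</name>
  <value>viprfs://mybucket.mynamespace.myfederation/</value>
</property>
```

Consult your Hadoop distribution's documentation for how to propagate this change across the cluster (for example, via Ambari) and restart the affected services.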

Once the configuration is complete, files in the default file system of the Hadoop cluster map to files in ECS buckets. For example, /foo/bar on the default file system maps to viprfs://<bucket_name>.<namespace>.<federation_name>/foo/bar.
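The mapping described above is mechanical: the bucket, namespace, and federation names form the authority of the viprfs URI, and the path is appended unchanged. A small illustrative sketch (the helper function and all names are hypothetical, not part of any ECS API):

```python
# Hypothetical helper illustrating how a path on the Hadoop default file
# system maps to its ECS viprfs URI, as described in the text above.
def to_viprfs_uri(path: str, bucket: str, namespace: str, federation: str) -> str:
    """Build the viprfs URI for a default-file-system path."""
    return f"viprfs://{bucket}.{namespace}.{federation}{path}"

# /foo/bar on the default file system maps to the corresponding ECS URI.
print(to_viprfs_uri("/foo/bar", "mybucket", "mynamespace", "myfederation"))
# viprfs://mybucket.mynamespace.myfederation/foo/bar
```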

