
Big Data Admin at eTeki, Inc.

At eTeki, top freelance Big Data Administrators assess peers being considered for similar technical roles, treating them with the respect and courtesy of a face-to-face conversation. Your feedback helps recruiters and hiring managers focus their resources on the most qualified professionals in the hiring process.

With your proficiency in Hadoop technology, you can make a difference for our clients across the globe. You'll confirm interest on a per-job basis and availability on a per-candidate basis using our platform, with the support of a world-class team of product specialists. If you enjoy talking about technology and want to raise hiring standards in your profession (while making extra money), eTeki’s the right side gig for you.

Requirements:

  • Expertise in setting up fully distributed, multi-node Hadoop clusters using Apache and Cloudera distributions.
  • Experience installing, configuring, and using ecosystem components such as MapReduce, Oozie, Hive, Sqoop, Pig, Flume, ZooKeeper, and Kafka, and in managing NameNode recovery and HDFS High Availability, using Cloudera Manager and Ambari.
  • Good knowledge of importing and exporting structured and unstructured data between HDFS and sources such as RDBMSs, event logs, and message queues, using tools such as Sqoop and Flume.
  • Experience with MapReduce (MRv1), YARN (MRv2), and Spark.
  • Experience with CDH components such as HDFS, Sqoop, Sqoop2, Pig, Hive, ZooKeeper, HBase, Oozie, Impala, and Hue.
  • Strong understanding of the Hadoop schedulers: FIFO, Fair (including the DRF policy), and Capacity.
  • Experience with monitoring tools such as Hue or Nagios.
  • Knowledge of Hadoop security tools: Kerberos, Knox, and Ranger.
  • Experience migrating clusters from Cloudera to Hortonworks or vice versa.
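To give a concrete flavor of the scheduler knowledge listed above: YARN's Fair Scheduler is driven by an allocation file. The following is a minimal illustrative sketch only — the queue names, weights, and limits are hypothetical, not taken from any real cluster.

```xml
<?xml version="1.0"?>
<!-- Hypothetical fair-scheduler.xml: queue names, weights, and limits
     are illustrative examples, not a real cluster's configuration. -->
<allocations>
  <queue name="etl">
    <weight>2.0</weight>
    <!-- DRF = Dominant Resource Fairness, balancing memory and vcores -->
    <schedulingPolicy>drf</schedulingPolicy>
    <maxRunningApps>20</maxRunningApps>
  </queue>
  <queue name="adhoc">
    <weight>1.0</weight>
    <schedulingPolicy>fair</schedulingPolicy>
  </queue>
  <queuePlacementPolicy>
    <!-- Use the queue named at submission time, else fall back to adhoc -->
    <rule name="specified"/>
    <rule name="default" queue="adhoc"/>
  </queuePlacementPolicy>
</allocations>
```

An interviewer in this role might ask a candidate to explain a file like this, or to reason about how the Fair, FIFO, and Capacity schedulers would divide resources among these queues.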

Here at eTeki, our independent Tech Interview Experts keep their profile data accurate so their roles and skills can be matched to clients’ needs, and respond promptly to notifications about upcoming interview assignments. With your mid-to-senior-level, first-hand exposure to the tools and technologies involved, you’ll assess candidates’ skills and probe their responses. Interviews are recorded for playback by the hiring team and for quality-control evaluation by eTeki’s technical leaders.

Responsibilities:

  • Setting up fully distributed, multi-node Hadoop clusters using Apache and Cloudera distributions.
  • Administering Hadoop clusters, handling performance-tuning requests based on available sample datasets, and planning cluster capacity from those datasets.
  • Performing Linux administration to tune nodes according to the behavior of the application jobs that end users run.
  • Resolving complex technical issues, such as recovering nodes and maintaining Hadoop configuration files across cluster nodes.
  • Setting up and configuring multi-node Hadoop clusters on various Linux platforms.
  • Integrating different Hadoop distributions such as CDH, Hortonworks, and Apache Hadoop.
  • Performing cluster maintenance: adding, removing, and rebalancing nodes using cluster management tools such as Cloudera Manager and Apache Hadoop.
  • Configuring High Availability (HA) using Cloudera Manager, including HA for other CDH components.
  • Upgrading clusters via Cloudera parcels (required); experience upgrading through package repositories is optional.
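The HA responsibility above is usually driven through Cloudera Manager, but under the hood it manages standard HDFS properties. As a hedged sketch of what NameNode HA configuration looks like (the nameservice ID and hostnames below are hypothetical), an `hdfs-site.xml` excerpt might contain:

```xml
<!-- Hypothetical hdfs-site.xml excerpt for NameNode HA.
     The nameservice ID "mycluster" and the example.com hostnames
     are placeholders, not a real deployment. -->
<configuration>
  <property>
    <name>dfs.nameservices</name>
    <value>mycluster</value>
  </property>
  <property>
    <!-- Two NameNodes participate in the HA pair -->
    <name>dfs.ha.namenodes.mycluster</name>
    <value>nn1,nn2</value>
  </property>
  <property>
    <name>dfs.namenode.rpc-address.mycluster.nn1</name>
    <value>namenode1.example.com:8020</value>
  </property>
  <property>
    <name>dfs.namenode.rpc-address.mycluster.nn2</name>
    <value>namenode2.example.com:8020</value>
  </property>
  <property>
    <!-- Let ZooKeeper-based failover controllers handle failover -->
    <name>dfs.ha.automatic-failover.enabled</name>
    <value>true</value>
  </property>
</configuration>
```

A candidate for this role should be able to walk through properties like these and explain how automatic failover and fencing behave when the active NameNode goes down.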

eTeki, a technical-interviews-as-a-service platform, helps every organization, big or small, hire top-notch technical talent by matching technical interview experts with clients who need third-party screening expertise. More information is available at www.eteki.com/freelancers
