W-2 Jobs Portal

  • W-2 open positions need to be filled immediately. The consultant must be on our company payroll; Corp-to-Corp (C2C) is not allowed.
Candidates are encouraged to apply directly through this portal. We do not accept resumes from other companies or third-party recruiters.

Job Overview

  • Job ID:

    J37550

  • Specialized Area:

    Hadoop & Big Data

  • Job Title:

    Hadoop Administrator

  • Location:

    The Woodlands, TX

  • Duration:

    7 Months

  • Domain Exposure:

    Government, Education, IT/Software

  • Work Authorization:

    US Citizen, Green Card, OPT-EAD, CPT, H-1B,
    H4-EAD, L2-EAD, GC-EAD

  • Client:

    To Be Discussed Later

  • Employment Type:

    W-2 (Consultant must be on our company payroll. C2C is not allowed)




Job Description

Hi,

We have a full-time opportunity for a Hadoop Engineer / Administrator located in The Woodlands, TX.

Title: Hadoop Engineer / Administrator.

Duration: Full-time / Contract

Location: The Woodlands, TX.

Those authorized to work in the United States without sponsorship are encouraged to apply.

Responsibilities:

Manage Hadoop and Spark cluster environments, including service allocation and configuration for the cluster, capacity planning, performance tuning, and ongoing monitoring

Work with data engineering related groups in the support of deployment of Hadoop and Spark jobs

Responsible for monitoring the Linux, Hadoop, and Spark communities and reporting defects, feature changes, and/or enhancements to the team

Deploy and maintain Hadoop clusters, add and remove nodes using cluster monitoring tools, and keep track of all the running Hadoop jobs

Implement, manage and administer overall Hadoop infrastructure

Responsible for capacity planning and estimating the requirements for lowering or increasing the capacity of the Hadoop cluster

Size the Hadoop cluster based on the data to be stored in HDFS

Performance tuning of Hadoop clusters and Hadoop MapReduce routines

Monitor the cluster connectivity and performance

Manage and review Hadoop log files

Monitor and manage Backup and Disaster Recovery processes

Requirements:

5-7 years of hands-on experience supporting Linux/UNIX production environments

3-5 years of hands-on experience administering Hadoop and/or Spark ecosystem technologies in production

3-5 years of hands-on experience scripting in Bash, Perl, Ruby, or Python

2-4 years of hands-on development/administration experience with Kafka, HBase, Solr, and Hue

Experience designing and implementing Hadoop clusters

Advanced ability to analyze problems, correlate data from multiple sources and communicate pertinent information to the appropriate support teams

Strong interpersonal, verbal, and written communication skills.


Apply Now
Equal Opportunity Employer

QUANTUM TECHNOLOGIES LLC is an equal opportunity employer inclusive of females, minorities, individuals with disabilities, and veterans (M/F/D/V). Hiring, promotion, transfer, compensation, benefits, discipline, termination, and all other employment decisions are made without regard to race, color, religion, sex, sexual orientation, gender identity, age, disability, national origin, citizenship/immigration status, veteran status, or any other protected status. QUANTUM TECHNOLOGIES LLC will not make any posting or employment decision that does not comply with applicable laws relating to labor and employment, equal opportunity, employment eligibility requirements, or related matters. Nor will QUANTUM TECHNOLOGIES LLC require in a posting or otherwise U.S. citizenship or lawful permanent residency in the U.S. as a condition of employment except as necessary to comply with law, regulation, executive order, or federal, state, or local government contract.