W-2 Jobs Portal

  • W-2 open positions need to be filled immediately. The consultant must be on our company payroll; Corp-to-Corp (C2C) is not allowed.
Candidates are encouraged to apply directly through this portal. We do not accept resumes from other companies or third-party recruiters.

Job Overview

  • Job ID:

    J36993

  • Specialized Area:

    Hadoop & Big Data

  • Job Title:

    Hadoop Admin

  • Location:

    Cary, NC

  • Duration:

    10 Months

  • Domain Exposure:

    Insurance, Retail, Telecom

  • Work Authorization:

    US Citizen, Green Card, OPT-EAD, CPT, H-1B, H4-EAD, L2-EAD, GC-EAD

  • Client:

    To Be Discussed Later

  • Employment Type:

    W-2 (Consultant must be on our company payroll. C2C is not allowed)




Job Description

5+ years of IT experience, including 3+ years in Big Data/Hadoop administration; MongoDB and Splunk skills are a plus.

  • 5+ years of IT experience, with a minimum of 3+ years in Big Data administration (Hadoop, MongoDB & Splunk)
  • Good hands-on experience as a Hadoop admin
  • Expertise in cluster maintenance using tools such as IBM BigInsights, Hortonworks Ambari, Ganglia, etc.
  • Skilled in performance tuning of Hadoop clusters and Hadoop MapReduce routines.
  • Screen Hadoop cluster job performance and perform capacity planning
  • Monitor Hadoop cluster health, connectivity, and security
  • Experience with troubleshooting, backup, and recovery.
  • Manage and review Hadoop log files.
  • File system management and monitoring.
  • HDFS support and maintenance.
  • Familiar with HDFS, MR, and YARN commands (CLI) and utilities.
  • Experience with Hadoop schedulers, job scheduling, and monitoring.
  • Experience with Hive, HBase, Sqoop, RDBMS, and the Hadoop ecosystem.
  • Experience troubleshooting issues in Apache HBase and Solr.
  • Experience handling platforms secured with Ranger and Kerberos.
  • Experience creating new MongoDB databases, instances, and database objects/views.
  • Install, configure, and test the Disaster Recovery (DR) strategy; monitor system performance after go-live
  • Support, maintain, and expand Splunk infrastructure in a highly resilient configuration
  • Standardize Splunk agent deployment, configuration, and maintenance across a variety of UNIX and Windows platforms
  • Troubleshoot Splunk server and agent problems and issues
  • Assist internal users of Splunk in designing and maintaining production-quality dashboards
  • Should possess basic knowledge of Docker and containers
  • Diligently team with the infrastructure, network, database, application, and business intelligence teams to guarantee high data quality and availability.
  • Collaborate with application teams to install operating system and Hadoop updates, patches, and version upgrades when required.
  • Candidate should have excellent communication, presentation, and customer-handling skills

Apply Now
Equal Opportunity Employer

QUANTUM TECHNOLOGIES LLC is an equal opportunity employer inclusive of female, minority, disability and veterans (M/F/D/V). Hiring, promotion, transfer, compensation, benefits, discipline, termination and all other employment decisions are made without regard to race, color, religion, sex, sexual orientation, gender identity, age, disability, national origin, citizenship/immigration status, veteran status or any other protected status. QUANTUM TECHNOLOGIES LLC will not make any posting or employment decision that does not comply with applicable laws relating to labor and employment, equal opportunity, employment eligibility requirements or related matters. Nor will QUANTUM TECHNOLOGIES LLC require in a posting or otherwise U.S. citizenship or lawful permanent residency in the U.S. as a condition of employment except as necessary to comply with law, regulation, executive order, or federal, state, or local government contract.