W-2 Jobs Portal

  • W-2 open positions need to be filled immediately. The consultant must be on our company payroll; Corp-to-Corp (C2C) is not allowed.
Candidates are encouraged to apply directly using this portal. We do not accept resumes from other companies or third-party recruiters.

Job Overview

  • Job ID:

    J36993

  • Specialized Area:

    Hadoop

  • Job Title:

    Hadoop Developer

  • Location:

    San Ramon, CA

  • Duration:

    12 Months

  • Domain Exposure:

    Insurance, Government, IT/Software

  • Work Authorization:

    US Citizen, Green Card, OPT-EAD, CPT, H-1B,
    H4-EAD, L2-EAD, GC-EAD

  • Client:

    To Be Discussed Later

  • Employment Type:

    W-2 (Consultant must be on our company payroll. C2C is not allowed)




Job Description

Responsibilities:

  • Design and develop scalable, reliable real-time stream processing solutions using the Hortonworks DataFlow (HDF) product suite (NiFi/Kafka/Spark)
  • Provide expertise and hands-on experience working with Kafka brokers and Kafka connectors
  • Create topics, set up redundant clusters, deploy monitoring tools and alerts, and apply knowledge of best practices
  • Work directly with business partners to translate complex functional and technical requirements for streaming data ingestion solutions into detailed design and implementation plans
  • Work closely with EA and cross-functional technical resources to devise and recommend solutions based on the understood requirements
  • Work closely with the Platform Engineering team to analyze complex distributed production deployments and make recommendations to optimize performance
  • Provide input for capacity planning and sizing of the streaming environment (Kafka/NiFi)
  • Implement application development lifecycle management using industry-standard frameworks
  • Write and produce technical documentation and knowledge-base articles
  • Work with Production Support to troubleshoot service stability and message topic or delivery issues, and perform data-related benchmarking, performance analysis, and tuning
  • Perform design and code reviews

Required Skills:


  • Solid Java development experience, with 1-2 years of experience working in the Hadoop ecosystem. A great opportunity to move from Java to Hadoop.
  • Basic understanding of Hadoop distributions (e.g., Cloudera 5.x, Hortonworks)
  • Hands-on working experience with Hadoop technologies (HDFS, Hive, Impala, Sqoop, UDFs, Oozie, MapReduce, the Spark framework, etc.)
  • Good understanding of different file formats (e.g., Avro, ORC, Parquet) and of data sources for moving data into and out of HDFS
  • Experience with streaming technologies (Kafka, Spark Streaming, etc.)
  • Strong skills in programming languages (Python, shell scripting, Java)
  • Strong experience with data structures, algorithms, various RDBMSs, data types, and primary/foreign key constraints
  • Hadoop Developer Certification (CDH 5.x) preferred

Apply Now
Equal Opportunity Employer

QUANTUM TECHNOLOGIES LLC is an equal opportunity employer inclusive of females, minorities, individuals with disabilities, and veterans (M/F/D/V). Hiring, promotion, transfer, compensation, benefits, discipline, termination, and all other employment decisions are made without regard to race, color, religion, sex, sexual orientation, gender identity, age, disability, national origin, citizenship/immigration status, veteran status, or any other protected status. QUANTUM TECHNOLOGIES LLC will not make any posting or employment decision that does not comply with applicable laws relating to labor and employment, equal opportunity, employment eligibility requirements, or related matters. Nor will QUANTUM TECHNOLOGIES LLC require, in a posting or otherwise, U.S. citizenship or lawful permanent residency in the U.S. as a condition of employment except as necessary to comply with law, regulation, executive order, or federal, state, or local government contract.