W-2 Jobs Portal

  • W-2 open positions need to be filled immediately. The consultant must be on our company payroll; Corp-to-Corp (C2C) is not allowed.
Candidates are encouraged to apply directly through this portal. We do not accept resumes from other companies or third-party recruiters.

Job Overview

  • Job ID:

    J37250

  • Specialized Area:

    Hadoop

  • Job Title:

    Hadoop Software Engineer

  • Location:

    Nashville, TN

  • Duration:

    12 Months

  • Domain Exposure:

    Healthcare, Pharmaceuticals, Banking & Finance, Insurance, Education, IT/Software, Media/Entertainment

  • Work Authorization:

    US Citizen, Green Card, OPT-EAD, CPT, H-1B,
    H4-EAD, L2-EAD, GC-EAD

  • Client:

    To Be Discussed Later

  • Employment Type:

    W-2 (Consultant must be on our company payroll. C2C is not allowed)




Job Description

The developer will bring a sourced dataset into the EDW for data analysis purposes.

 

  • Responsible for creating an ingestion route for a sourced dataset into the EDW

  • Creating star-schema data models, performing ETL, and validating results with business representatives

  • Experience writing simple to complex queries

  • Experience working on data migration and data movement efforts

  • Experience extracting data from tables, flat files in various formats, and queues such as MQ

  • Experience populating tables, writing data to flat files in various formats, and streaming data

  • Experience working with both batch and real-time data

  • Data analytics knowledge and experience extracting data for analytic consumers

  • Strong AWS work experience

  • Experience creating and reading Parquet files using a programming language such as Clojure, Scala, Java, or Python

  • Experience interacting with Hive and Impala tables to extract data

  • Strong Core Java experience with concepts such as I/O, multithreading, exceptions, RegEx, collections, data structures, and serialization

  • At least 2 years of experience handling messaging services with Apache Kafka (development of producers and consumers)

  • At least 2 years of experience with Kafka, Spark, MapReduce, HDFS, Hive, Hive UDFs, Pig, Sqoop, Oozie, and HBase

  • Good experience working with cloud environments such as Amazon Web Services (AWS) EC2 and S3

  • Good understanding of partitioning and bucketing concepts in Hive

 

Must Have Skills:

AWS, Hadoop, Java, Python, Spark, Spark SQL, Scala, C++, Kafka, Git, Maven

 

Nice to have skills:

Hive, HBase


Equal Opportunity Employer

QUANTUM TECHNOLOGIES LLC is an equal opportunity employer inclusive of females, minorities, individuals with disabilities, and veterans (M/F/D/V). Hiring, promotion, transfer, compensation, benefits, discipline, termination, and all other employment decisions are made without regard to race, color, religion, sex, sexual orientation, gender identity, age, disability, national origin, citizenship/immigration status, veteran status, or any other protected status. QUANTUM TECHNOLOGIES LLC will not make any posting or employment decision that does not comply with applicable laws relating to labor and employment, equal opportunity, employment eligibility requirements, or related matters. Nor will QUANTUM TECHNOLOGIES LLC require, in a posting or otherwise, U.S. citizenship or lawful permanent residency in the U.S. as a condition of employment, except as necessary to comply with law, regulation, executive order, or federal, state, or local government contract.