We are looking for a Full-Stack Software Engineer (Big Data / Hadoop Developer) for a very important client.
- Manage individual project priorities, deadlines, and deliverables.
- Adapt to changes and setbacks in order to manage pressures and meet company requirements.
- Demonstrate and apply principled software engineering practices.
- Be versatile and passionate about tackling new problems and learning new technologies.
- Sharp problem-solving skills and the ability to tackle new problems effectively while hiding complexity.
- Have a strong sense of ownership and drive.
- Experience with enterprise application development using Java/J2EE.
- Must have programming experience with at least one of the major cloud services (AWS/Azure/GCP), preferably AWS.
- Must have programming experience with Hadoop MapReduce or Spark using Java.
- Deep understanding of the MapReduce framework and related big-data ecosystem components such as Hive and HBase.
- Must be proficient with at least one cloud automation framework, such as Terraform or CloudFormation.
- Experience developing with web/application containers such as Tomcat or Jetty.
- Excellent written and verbal communication skills.
- Practical expertise in performance tuning, optimization, and bottleneck analysis.
- Knowledge of Ant, Gradle, and SVN.
Nice to have:
- Knowledge of Kerberos authentication.
- Knowledge of setting up security mechanisms such as AWS IAM roles and policies.
- Knowledge of setting up network isolation with VPCs, security groups, and NACLs.
- Knowledge of encryption/decryption technologies.
- Knowledge of ACL management in Hadoop eco-system.