Lead Big Data Engineer, Tential, Rockville, MD



Lead Big Data Engineer

Job description

Lead Big Data Developer

Are you passionate about data? Do you like working in a challenging environment where massive volumes of data are ingested and processed every day? Are you a continuous learner who wants to pick up new tools and technologies in the evolving big data and data science space? Do you have a passion for detecting data anomalies in large datasets? Do you expect the best from yourself and those around you?

We are looking for a lead big data developer for an ETL project in our enterprise Transparency Services group. The big data developer will design, ingest, store, validate, transform, and disseminate data in a consumable format so that business intelligence teams and data analysts can draw deeper business insight from it.

Job Responsibilities

- Understand complex business requirements
- Design and develop ETL pipelines for collecting, validating, and transforming data according to specification
- Develop automated unit tests, functional tests, and performance tests
- Maintain an optimal data pipeline architecture
- Design ETL jobs for optimal execution in the AWS cloud environment
- Reduce the processing time and cost of ETL workloads
- Lead peer reviews and design/code review meetings
- Support the production operations team
- Implement data quality checks
- Identify areas where machine learning can be used to detect data anomalies

Experience & Qualifications

- 7+ years of experience programming in Java or Scala
- 7+ years of experience on ETL projects
- 5+ years of experience on big data projects
- 3+ years of experience with API development (REST APIs)
- Believes in Scrum/Agile and has deep experience delivering software on teams that use Scrum/Agile methodology
- Strong, creative analytical and problem-solving skills

Required Technical Skills & Knowledge

- Strong experience in Java or Scala
- Strong experience with big data technologies such as AWS EMR, AWS EKS, and Apache Spark
- Strong experience with serverless technologies such as Amazon DynamoDB and AWS Lambda
- Strong experience processing JSON and CSV files
- Ability to write complex SQL queries
- Experience in performance tuning and optimization
- Familiarity with columnar storage formats (ORC, Parquet) and various compression techniques
- Experience writing Unix shell scripts
- Unit testing with JUnit or ScalaTest
- Git/Maven/Gradle
- Code reviews
- Experience with CI/CD pipelines
- Agile

The following skills are a plus:

- AWS Cloud
- BPM / AWS Step Functions
- Python scripting
- Performance testing tools such as Gatling or JMeter



Full-time · Posted 2024-06-26