Big Data Feature Lead - Core Technology Infrastructure - #495180

Bank of America

Date: 10/14/2021, 1:31 PM

City: Pennington, New Jersey

Contract type: Full Time

Work schedule: Full Day

Job Description:

Data Platform Services' OMNI initiative is looking for a Big Data Feature Lead who will be responsible for development of the OMNI Big Data Platform. The ideal candidate has hands-on experience developing in a Hadoop/Cloudera data environment.

The candidate will work with agile teams and be responsible for all aspects of the SDLC.

  • Exhibits strong knowledge of data structures and big data modeling
  • Experience with Hadoop/HDFS/Spark concepts and ability to write Spark Dataset, DataFrame, and HiveQL jobs
  • Proven understanding of Hadoop, Spark, and Hive, and ability to write shell scripts
  • Familiarity with data-loading tools like Sqoop, Flume, and Kafka; knowledge of workflow schedulers like Oozie
  • Good aptitude for multi-threading and concurrency concepts; experience loading data from disparate data sources
  • Hands-on experience with NoSQL databases; ability to analyze and identify issues with an existing cluster and suggest architectural design changes
  • Ability to implement data management and governance on the Hadoop platform
  • Experience in Apache Hadoop/Spark development
  • Well versed in the Linux environment
  • Extensive experience in application development
  • Excellent skills in analysis, business process flows, design, and diagramming
  • Strong collaboration and team skills
  • Proven history of delivering against agreed objectives
  • Demonstrated problem-solving skills
  • Ability to pick up new concepts and apply them
  • Ability to coordinate competing priorities
  • Ability to work in diverse team environments, both local and remote
  • Strong communication skills (verbal and written)
  • Ability to work with minimal supervision

Required Skills:
Five-plus years of hands-on coding experience with the following Hadoop ecosystem components:
  • Hadoop
  • Spark
  • Spark-SQL
  • Hive on Tez, Hive 3.x
  • Impala
  • Oozie
  • Job scheduler (e.g., Autosys)
  • Shell scripting

Desired Skills:
  • Bachelor's degree from an accredited college or university
  • Java, Python, Scala, PySpark
  • Kafka
  • RESTful services
  • Machine learning/predictive analytics
  • Spark Streaming
  • Teradata
  • SingleStore, Azure Synapse

Certifications such as Cloudera Developer (CCA175) or Spark and Hive/administrator certifications are an added advantage.

Shift:
1st shift (United States of America)
