Basic Purpose:

Develop technical solutions for data acquisition, data integration, and data sharing with Navy Federal’s digital platforms and fraud business in near real-time and batch. Responsible for engineering, designing, building, and integrating data from batch, streaming, and edge applications into high-performing operational hubs. Solves highly complex problems; takes a broad perspective to identify solutions.

Responsibilities:

  • Design and build highly scalable data pipelines for near real-time and batch data ingestion, processing, and integration.
  • Provide technical leadership and guidance; educate team members and coworkers on the development and operation of streaming and event-driven applications.
  • Recognize potential issues and risks during the project implementation and suggest mitigation strategies.
  • Own and communicate the process of manipulating and merging large datasets.
  • Work directly with business leadership to understand data requirements; propose and develop solutions that enable effective decision-making and drive business objectives.
  • Perform other duties as assigned.

Qualifications and Education Requirements:

  • Degree in Information Systems, Computer Science, Engineering, or related field, or the equivalent combination of education, training, and experience
  • Working knowledge of message-oriented middleware and streaming data technologies such as Kafka, MQ, Azure Event Hubs, Spark, and Spark Streaming
  • Strong programming skills and experience in C#/.NET and Azure Logic Apps
  • Strong programming skills and experience developing Azure Functions using various protocols and triggers
  • Hands-on experience configuring Azure Event Hubs, Event Grid, Stream Analytics, Logic Apps/Function Apps, and working with JSON
  • Expert-level skills in Python, Databricks, Azure Data Factory
  • Experience with ETL tools and techniques, and knowledge of CI/CD practices
  • Expertise in cloud NoSQL databases, ideally Azure Cosmos DB or equivalent