Basic Purpose:
Develop technical solutions for data acquisition, data integration, and data sharing with client omni-channel and digital platforms, in near-real-time and batch modes. Responsible for engineering, designing, building, and integrating data from various batch, streaming, and edge applications into high-performing operational hubs.

Develop complex event-driven applications to optimize the performance of the client's messaging and event-driven data ecosystem. Recognized as an expert with a specialized depth and/or breadth of expertise in the discipline. Solves highly complex problems and takes a broad perspective to identify solutions. Leads functional teams or projects. Works independently.

Responsibilities:

 Design and build highly scalable data pipelines for near-real-time and batch data ingestion, processing, and integration
 Provide technical leadership and guidance; educate team members and coworkers on the development and operation of streaming and event-driven applications
 Be a hands-on mentor and advocate to ensure successful adoption of new tools, processes, and best practices across the organization
 Recognize potential issues and risks during the project implementation and suggest mitigation strategies
 Own and communicate the process of manipulating and merging large datasets
 Serve as the expert and key point of contact between the API teams and the project/functional leads
 Work directly with business leadership to understand data requirements; propose and develop solutions that enable effective decision-making and drive business objectives
 Prepare advanced project implementation plans which highlight major milestones and deliverables, leveraging standard methods and work planning tools
 Lead the preparation of high-quality project deliverables that are valued by the business, and present them so they are easily understood by project stakeholders
 Perform other duties as assigned

Qualifications and Education Requirements:
 Degree in Information Systems, Computer Science, Engineering, or related field, or the equivalent combination of education, training and experience
 Working knowledge of message-oriented middleware and streaming data technologies such as Kafka, NiFi, MQ, and Azure Event Hubs
 Must have strong programming skills and experience in C#/.NET and Azure Logic Apps
 Must have strong programming skills and experience in Azure Functions (using various protocols/triggers) and Git/GitHub
 Hands-on experience configuring Azure Event Hubs, Event Grid, Stream Analytics, and Logic/Function Apps, and working with JSON
 Expert level skills in Python, Databricks, Azure Data Factory
 Experienced in the use of ETL tools and techniques, with knowledge of CI/CD
 Experience and expertise in cloud NoSQL databases, ideally Azure/Azure Data Services/Cosmos DB or equivalent
 Knowledge of, and experience scaling, one or more of the popular data streaming and processing technologies such as Kafka, Spark, and Spark Streaming
 General knowledge and experience with configuration, load-balancing, auto-scaling, monitoring, networking, and problem-solving in a cloud environment
 Demonstrated change management and excellent communication skills