Job Description:
Each day, U.S. Customs and Border Protection (CBP) oversees the massive flow of people, capital, and products that enter and depart the United States via air, land, sea, and cyberspace. The volume and complexity of both physical and virtual border crossings require the application of “big data” solutions to promote efficient trade and travel. Further, effective “big data” solutions help CBP ensure the movement of people, capital, and products is legal, safe, and secure. In response to this challenge, CBP seeks capable, qualified, and versatile Senior Big Data Leads to facilitate data-driven decision making in response to national security threats. The Senior Cloud Engineer will: 1) support the development of enterprise applications; 2) design and implement microservices architectures; 3) oversee, work with, and assist other Big Data Engineers; and 4) develop performance dashboards and artifacts to inform resource allocation, justify requirements, and support statutory reporting mandates. The strongest applicants will have a proven track record of delivering production-ready decision support tools and applications employed in the field and by mission-support entities. Within three to six months of joining the project, the Senior Cloud Engineer will be expected to:
- Lead and perform development and maintenance of end-user focused, object-oriented, data-driven analytic applications using DevOps and Agile development principles and technologies to support CBP threat analysis and targeting.
- Independently identify technical solutions for business problems, directly contribute to conceptual design, and routinely collaborate with Enterprise/Application Architects, Database Architects, Data Scientists, and mission stakeholders to build and transition on-premises applications to a hybrid cloud environment on Amazon Web Services (AWS).
- Develop new code, modify existing application code, conduct unit and system testing, and engage in rigorous documentation of developed and delivered application use cases, data flows, and functional operations.
- Demonstrate a strong practical understanding of application-relevant cargo and passenger data and the databases used to support analytic application development, application functionality, and targeting end-user (officer) operations.
- Actively participate in formal and informal design reviews, solution sessions, and project milestone meetings as well as contribute to project document artifacts for presentation to both technical and non-technical audiences.
- Integrate with, and materially contribute to, project portfolio teams as a matrixed resource to provide development and issue resolution expertise in collaboration with data scientists, intelligence analysts, developers, and other participants at the direction of a project manager.
- Mentor junior developers and actively contribute to the development of a community of practice focused on enhancing technical skills, sharing best practices, and promoting enhanced mission domain knowledge.

Applicants should note that while some formal training exists to support building knowledge of missions and the implementation of data-driven solutions, much of this knowledge must be gained through on-the-job training and individual initiative. Technical knowledge alone is insufficient to establish credibility in this operating environment; expertise must be enhanced with rapid acquisition of mission knowledge and a demonstrated commitment to positive mission outcomes. We therefore seek self-starters who are capable of independent exploration and knowledge acquisition, willing to develop informal support networks, and who appreciate the immense challenge of safeguarding the Nation’s borders.
Required Knowledge and Experience:
- Demonstrated expertise in Java and object-oriented design and development principles, including experience with structured data formats (e.g., XML, JSON).
- Recent hands-on experience with Java 8+ and its newer language features.
- Extensive hands-on experience using Spring to develop microservices and/or REST APIs
- Hands-on experience with Oracle SQL and/or PL/SQL in the development of web applications
- Experience with the Hadoop ecosystem, including HDFS, YARN, Hive, and Pig, as well as batch-oriented and streaming distributed processing frameworks such as Spark, Kafka, or Storm
- Experience delivering solutions using Amazon Web Services (AWS EC2, RDS, S3)
- Experience with distributed search engines like Elasticsearch or Solr
- Hands-on experience developing and delivering applications via microservices and/or REST APIs
- Experience with one or more build tools such as Maven, Gradle, etc.
- Experience developing and sustaining applications within source code control systems (SCCS) such as Git, Subversion, Bitbucket, etc.
- Hands-on experience with Agile lifecycle management and engineering tools, including Atlassian Jira and Confluence
- Experience with development tools, including the Eclipse IDE and notebook environments such as Jupyter Notebook and Apache Zeppelin
- Familiarity with Unix/Linux and shell scripting
- High level of self-motivation, desire to deliver stellar solutions and willingness to work in a distributed team environment.
- Ability to lead and mentor junior to mid-level developers.
Desired Knowledge and Experience:
- NoSQL database systems such as Cassandra, MongoDB, DynamoDB, etc.
- Familiarity with Atlassian tools such as Jira, BitBucket
- Continuous integration with Jenkins or Bamboo.
- Experience with Agile Scrum, ideally including leading a Scrum team as a Scrum Master or equivalent
Education: Bachelor’s degree in computer science or a related field, with 15+ years of technical experience