Description
Health Mission Solutions is seeking a Senior Developer, contingent upon contract award.
The Senior Developer will lead a small team of developers and work closely with the Program Manager, Data Architect, Cloud Operations Lead, data engineers, and data analysts to support multiple public health project teams across the full data management cycle, including data ingestion, cleansing, transformation, security, exploration, and visualization, using Microsoft Azure tools and technologies. These include Databricks, Spark Streaming, Azure SQL, Delta Lake, Azure Data Factory, HDInsight, and Notebooks. This position requires hands-on, data-centric development experience in an Azure cloud environment.
Specific roles & responsibilities for the Senior Developer position include:
- Advise, support, and coach project teams in ingesting data, creating data pipelines, selecting the appropriate Azure services, optimizing data storage, cataloguing data, enforcing technical & architectural standards, and troubleshooting development & production issues.
- Design and implement data security measures to ensure PII/PHI data is protected from unauthorized access.
- Create real-time dashboards on rapid development timelines.
- Incorporate data governance into the solution design, including policies, procedures, and standards for managing and using data.
- Continuously optimize the performance of data pipelines in Databricks and Azure Data Factory (ADF).
- Investigate and recommend new technologies to modernize the data pipeline process. Stay current on the latest advancements in data technologies.
- Collaborate with customer SMEs on data projects to develop data pipeline architectures and strategies.
- Mentor project teams and data engineers on best practices and new technologies.
- Collaborate with data engineers, business analysts, and testers to drive the agile development team in implementing the data architecture.
- Actively lead/participate in the discovery/validation/verification process throughout the development life cycle.
- Guide and lead other developers and actively engage in process improvement initiatives.
- Identify, evaluate, and demonstrate solutions to complex system problems.
- Design and develop documentation including procedures, process flow diagrams, work instructions, and protocols for processes.
Required skills and experience
- Bachelor's degree from an accredited college in a related discipline, or equivalent experience/combined education, with 10+ years of professional experience; or 8+ years of professional experience with a related Master's degree.
- Proven software development experience on a large-scale Azure Data Lake platform.
- Experience onboarding and managing multiple data pipelines of high complexity and processing millions of records per day
- Experience working simultaneously with multiple data sources and entities submitting data daily to the data lake, and delivering technical assistance to ensure successful operations.
- Experience building Azure cloud-based ETL processes and data pipelines to automate data workflows on a rapid timeframe for emergency response.
- Experience implementing automated processes to QC data products and pipelines before data release, including de-duplication of data.
- Experience implementing Databricks Unity Catalog for Lakehouse projects.
- Experience in handling and delivering big data analytics for daily users.
- Strong prior experience with and expert knowledge of Databricks, Delta Lake, Spark Streaming, Azure Synapse, Jupyter Notebooks, microservices, Azure Functions, Event Hubs, Logic Apps, Azure Kubernetes Service, Confluent Kafka, HDInsight, and Azure Data Factory.
- Prior experience integrating applications with AI/ML technologies including chatbots.
- Ability to collaborate with and influence customer leadership and external teams on data initiative strategies.
- Ability to develop enterprise standards for reference and master data management, data quality, data integration, and data security.
- Ability to present complex ideas and subject matter to stakeholders and customer leadership.
- Proven experience working in a development environment following agile practices and processes.
- Experience developing documentation including specifications, procedures, process flow diagrams, work instructions, and protocols for processes.
- Proven experience with supporting highly critical customer missions.
- Prior proven leadership experience.
- Excellent verbal and written communication skills, including experience working directly with customers to discuss their requirements and objectives.
- Proven experience in multi-tasking and managing efforts to the schedule.
- Ability to learn and support new systems and applications.
Desired skills and experience
- Working experience at the CDC or other federal agencies.
- Experience with Azure DevOps and CI/CD pipelines.
- Azure Data Engineer certification, Databricks Certified Data Engineer Associate certification, or similar certifications
- Experience onboarding and managing 100+ data pipelines of high complexity and processing volumes of greater than 5M records per day.
- Experience performing data linkage on terabytes of data using Privacy Preserving Record Linkage (PPRL).
- Experience implementing and operationalizing real-time dashboards (DevOps, program analytics) using enterprise BI tools including Power BI, Tableau, and R Shiny.
- Experience working with SAS Viya, Palantir Foundry, R, and/or Python.
- Experience with transition-in to take over a large-scale, Azure-based data lake platform.
- Experience with agile development processes.
Original Posting Date:
2024-10-31
While subject to change based on business needs, Leidos reasonably anticipates that this job requisition will remain open for at least 3 days with an anticipated close date of no earlier than 3 days after the original posting date as listed above.
Pay Range:
$101,400.00 - $183,300.00
The Leidos pay range for this job level is a general guideline only and not a guarantee of compensation or salary. Additional factors considered in extending an offer include (but are not limited to) responsibilities of the job, education, experience, knowledge, skills, and abilities, as well as internal equity, alignment with market data, applicable bargaining agreement (if any), or other law.
#Remote