Job Description
FULLY REMOTE OPPORTUNITY! NOTE: This position pays $123K.
Explore Our Comprehensive Benefits Package:
Your Health - Choice of four medical plans with varying coverage levels
- Tricare supplement insurance
- Two levels of dental coverage
- Comprehensive vision coverage
- Additional coverage options including Accident, Critical Illness, and Hospital Indemnity plans
Paid Time Off - Generous paid time off program, including federal holidays
Financial Security - 401(k) Retirement Savings Plan with employer match
- Flexible Spending Accounts for health and dependent care expenses
- Life/AD&D Insurance and Disability Coverage options
Additional Perks - Life Assistance Program
- Transit and Parking Spending Accounts
- Local business and partner discount programs
- Petcare discount program
- Identity protection services
- Employee health and wellness initiatives
- Educational assistance program
- Employee recognition program
Position Objective A key DMI objective is to "expand foundational infrastructure to provide scalable, flexible services for timely and appropriate access to actionable data in the public health ecosystem." Currently, public health programs operating across CDC maintain myriad investments in divergent and overlapping systems to collect, process, and analyze data in support of public health decision making and administrative functions. These systems vary in age, complexity, and quality, which creates a burden for public health partners to provide and use data, for programs to use their own data, and for CDC to share data securely with its partners and de-identified data with the public.
EDAV helps alleviate this problem by designing, developing, and operating shared, enterprise data services that help programs modernize and integrate these services with their existing and planned systems. However, EDAV needs to expand the quantity and quality of these services and assist programs in integrating their systems with EDAV to create new public health data products on the shared EDAV platform. Data products include data collections, storage, reports, dashboards, metadata collection, analytics (including artificial intelligence [AI]/machine learning [ML]), public use data, indicators, measures, and decision-making systems.
CDC’s Center for Forecasting Outbreaks and Analytics (CFA) is tasked with collaborating with internal and external partners to track disease outbreaks and other public health events and to forecast their trajectories. To do this, CFA needs to extend EDAV’s capabilities into cloud spaces where it can collaborate with CDC and non-CDC groups to share data, develop machine learning models, exchange models and algorithms, and jointly author analytics and visualization products. Without these capabilities, CFA cannot perform its mission.
Duties And Responsibilities - Work with a multi-disciplinary team of scientists, data engineers, developers, and data consumers in a fast-paced, Agile environment
- Monitor and optimize data pipelines for performance, scalability, and cost-effectiveness
- Sharpen skills in analytical exploration and data examination while supporting the assessment, design, development, and maintenance of scalable platforms for clients
Basic Qualifications - Bachelor’s Degree required
- 3+ years of experience with extract, transform, load (ETL) operations with a focus on Azure technologies
- 2+ years of experience with source control and collaboration software, including Git or Atlassian tools
- Knowledge of Azure Batch and its application in processing large data sets
- Experience with SQL and relational databases (e.g., Azure SQL Database, SQL Server)
- Experience with Python or R including experience with data manipulation libraries (e.g., Pandas, NumPy, Polars, Tidyverse)
- Strong problem-solving skills and ability to work independently and in a team environment
- Proficiency in Azure Data Factory and its components
Minimum Qualifications - Experience developing pipelines using Azure Batch and Azure Data Factory
- Familiarity with Apache Airflow or similar workflow orchestration tools
- Experience with Azure Synapse Analytics, Azure Databricks, or Azure Blob Storage
- Familiarity with cloud security best practices and data governance
- Ability to quickly learn technical concepts and communicate with multiple functional groups
- This job description is not designed to cover or contain a comprehensive listing of activities, duties or responsibilities that are required by this position.
To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed above are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
GAP Solutions provides reasonable accommodation for qualified individuals with disabilities. If you need accommodation to apply for a job, email us at recruiting@gapsi.com. You will need to reference the requisition number of the position in which you are interested. Your message will be routed to the appropriate recruiter who will assist you. Please note, this email address is only to be used for those individuals who need accommodation to apply for a job. Emails for any other reason or those that do not include a requisition number will not be returned.
GAP Solutions is an Equal Opportunity/Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to their race, ethnicity, ancestry, color, sex, religion, creed, age, national origin, citizenship status, disability, medical condition, military and veteran status, marital status, sexual orientation or perceived sexual orientation, gender, gender identity, and gender expression, familial status, political affiliation, genetic information, or any other legally protected status or characteristics.