Azure DW Developer

  • Location: Harrisburg, PA
  • Type: Contract
  • Job #34617
  • Salary: $57.50 - $59.50 Per Hour
  • Hybrid

Azure DW Developer
*Pennsylvania candidates only, please
*This is a hybrid position with 1 day/month onsite, subject to additional days at the manager's discretion
*We are not considering Corp-to-Corp (C2C) arrangements for this position

Description:

  • Support of a Data Modernization Initiative with the vision that all public health policies and interventions are driven by data, and the mission to provide all internal and external public health decision makers with accessible, timely, reliable, and meaningful data to drive policies and interventions.
  • The Enterprise Data Warehouse (EDW) is responding to the client's need for centralized data and state-of-the-art data analysis services by modernizing its data portfolio, architecture, and statistical analysis capabilities, aimed at improving public health surveillance, interventions, future outbreak prevention outcomes, and research.
  • The Architect/Azure DW Developer position will support both the existing business and reporting requirements of individual client / DDAP systems and program areas and the construction of a modern data warehouse that will serve the client / DDAP from an enterprise perspective.
  • The primary objective of this engagement is for the selected candidate to serve as the data warehouse developer supporting the analysis and reporting needs of the client / DDAP and the design and construction of a modern EDW in Azure.
  • This position's scope includes: modernization of operations; planning, coordinating, and responding to data reporting needs; setting standards and defining frameworks; assisting with large-volume data processing and statistical analysis of large datasets; revamping the EDW into Microsoft's Azure Cloud utilizing Azure Databricks, Delta Lake, and Synapse, including compute, storage, and application fabric, as well as services for infrastructure as a service (IaaS), platform as a service (PaaS), software as a service (SaaS), and serverless technologies; creating a centralized data model; and supporting the client's projects such as ELC Enhanced Detection Expansion, Data Modernization Initiative, PA NEDSS NextGen, PA LIMS Replacement, Reporting Hub, Verato UMPI, COVID-19 response, and onboarding additional systems into the EDW.
  • The Architect is a senior-level resource with advanced, specialized knowledge and experience in data warehousing, database, and programming concepts and technology.
  • The selected contractor must have proven experience in the development, maintenance, and testing of Azure production systems and projects. This position designs, develops, tests, and implements data lakes, databases, extract-load-transform programs, applications, and reports.
  • This position will work with business analysts, application developers, DBAs, and network and system staff to achieve project objectives – delivery dates, cost objectives, quality objectives, and program area customer satisfaction objectives.

Responsibilities:

  • Manage assignments and track progress against agreed upon timelines.
  • Plan, organize, prioritize and manage work efforts coordinating with the EDW and other teams.
  • Participate in status reviews, process reviews, deliverable reviews, and software quality assurance work product reviews with the appropriate stakeholders.
  • Participate in business and technical requirements gathering.
  • Perform research on potential solutions and provide recommendations to the EDW and the client.
  • Develop and implement solutions that meet business and technical requirements.
  • Participate in testing of implemented solutions.
  • Build and maintain relationships with key stakeholders and customer representatives.
  • Give presentations for the EDW and other offices and agencies involved with this project.
  • Develop and maintain processes and procedural documentation.
  • Ensure project compliance with relative federal and commonwealth standards and procedures.
  • Conduct training and transfer of knowledge sessions for system and code maintenance.
  • Complete weekly timesheet reporting in PeopleFluent/VectorVMS by COB each Friday.
  • Complete weekly project status updates in Daptiv if necessary; this depends on the project being entered in Daptiv.
  • Provide weekly personal status reporting by COB Friday, submitted on SharePoint.
  • Utilize a SharePoint site for project and operational documentation, and review existing documentation.

Skills/Knowledge/Experience:

  • The Architect can design, develop, and implement data and ELT application infrastructure in Azure to provide reliable and scalable applications and systems that meet the organization's objectives and requirements.
  • The Architect is familiar with a variety of application and database technologies, environments, concepts, methodologies, practices, and procedures.
  • The candidate must have significant hands-on technical experience and expertise with Azure, Azure Delta Lake, Azure Databricks, Azure Data Factory pipelines, Apache Spark, and Python.
  • Significant hands-on technical experience and expertise with the design, implementation, and maintenance of business intelligence and data warehouse solutions, with expertise in using SQL Server and Azure Synapse.
  • Experience producing ETL/ELT using SQL Server Integration Services and other tools.
  • Experience with SQL Server, T-SQL, scripts, and queries.
  • Experience as an Azure DevOps CI/CD Pipeline Release Manager who can design, implement, and maintain robust and scalable CI/CD pipelines; automate the build, test, and deployment processes for various applications and services; troubleshoot and resolve pipeline issues and bottlenecks; and has experience with Monorepo-based CI/CD pipelines.
  • Experience with data formatting, capture, search, retrieval, extraction, classification, quality control, cleansing, and information filtering techniques.
  • Experience with data mining architecture, modeling standards, reporting, and data analysis methodologies.
  • Experience with data engineering, database file systems optimization, APIs, and analytics as a service.
  • Ability to analyze and translate business requirements and use cases into optimized designs and develop sound solutions.
  • Advanced knowledge of relational databases, dimensional databases, entity relationships, data warehousing, facts, dimensions, and star schema concepts and terminology.
  • Creates and maintains technical documentation, diagrams, flowcharts, instructions, manuals, test plans, and test cases. Follows established SDLC best practices, documents code, and participates in peer code reviews.
  • Ability to balance work between multiple projects with minimal or no direct supervision; good organizational skills.
  • Demonstrated ability to communicate and document clearly and concisely.
  • Ability to work collaboratively and effectively with colleagues as a member of a team.
  • Ability to present complex technical concepts and data to a varied audience effectively.
  • More than 5 years of relevant experience.
  • 4-year college degree in computer science or related field with advanced study preferred.

Preferred Skills/Knowledge/Experience:

  • Experience working in the public health or healthcare industry with various health data sets.

Required Skills/Knowledge/Experience:

  • Technical experience and expertise with Azure, Azure Delta Lake, Azure Databricks, Azure Data Factory, Pipelines, Apache Spark, and Python, Required 5 Years
  • Design, implementation, and maintenance of business intelligence and data warehouse solutions, with expertise in using SQL Server/Azure Synapse, Required 5 Years
  • Experience producing ETL/ELT using SQL Server Integration Services and other tools, Required 5 Years
  • Experience with SQL Server, T-SQL, scripts, queries, Required 5 Years
  • Experience as an Azure DevOps CI/CD Pipeline Release Manager who can design, implement, and maintain robust and scalable CI/CD pipelines, Required 5 Years
  • Experience with data formatting, capture, search, retrieval, extraction, classification, quality control, cleansing, and information filtering, Required 5 Years
  • Experience with data engineering, database file systems optimization, APIs, and analytics as a service, Required 5 Years
  • Experience with data mining architecture, modeling standards, reporting and data analysis methodologies, Required 5 Years
  • 4-year college degree in computer science or related field with advanced study preferred, Required

Legitimate email communication will only be sent to and from @astyra.com email addresses. Please verify that you are communicating with approved Astyra recruiters by checking this when receiving offers and messages from us, and keep communication within these guidelines and channels for the quickest possible interview consideration.
 
#AC
 
