Azure Databricks Engineer

  • Location: Raleigh, NC
  • Type: Contract
  • Job #33262

Azure Databricks Engineer
*Local to the Triangle Region of NC only, please
*This is a hybrid position with 1-2 days/month onsite

Description:

  • The client’s Transportation Web Systems Team seeks an Azure Databricks Engineer who will work with existing staff to plan and design ETL pipelines and product solutions using Azure Databricks.
  • The person filling this role will create resilient processes to ingest data from a variety of on-prem and cloud transactional databases and APIs (a minimal ingestion sketch follows this list).
  • Responsibilities will also include developing business requirements, facilitating change management documentation, and actively collaborating with stakeholders.
  • This individual will work closely with a development technical lead and discuss all aspects of the design and planning with the development team.
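
As a rough, illustrative sketch only (not part of the client's stated requirements): one common Databricks pattern for the ingestion work described above is to read a transactional table over JDBC and land it as a Delta table. The server, database, table, and secret names below are hypothetical placeholders, and the snippet assumes it runs in a Databricks notebook where spark and dbutils are already defined.

```python
# Hypothetical connection details -- the host, database, table, and secret
# scope/key names are placeholders, not taken from the posting.
jdbc_url = "jdbc:sqlserver://onprem-sql.example.net:1433;databaseName=transportation"
connection_props = {
    "user": dbutils.secrets.get(scope="transportation", key="sql-user"),
    "password": dbutils.secrets.get(scope="transportation", key="sql-password"),
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
}

# Read one transactional table over JDBC and land it as a Delta table
# so analysts and downstream pipelines can work from the lakehouse copy.
work_orders = spark.read.jdbc(url=jdbc_url, table="dbo.work_orders", properties=connection_props)
work_orders.write.format("delta").mode("overwrite").saveAsTable("bronze.work_orders")
```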

Responsibilities:

  • Research and engineer repeatable and resilient ETL workflows using Databricks notebooks and Delta Live Tables for both batch and stream processing (see the Delta Live Tables sketch after this list)
  • Collaborate with business users to develop data products that align with business domain expectations
  • Work with DBAs to ingest data from cloud and on-prem transactional databases
  • Contribute to the development of the Data Architecture for DIT – Transportation:
  • By following practices for keeping sensitive data secure
  • By streamlining the development of data products for use by data analysts and data scientists
  • By developing and maintaining documentation for data engineering processes
  • By ensuring data quality through testing and validation
  • By sharing insights and experiences with stakeholders and engineers throughout DIT – Transportation
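
As a hedged illustration of the Delta Live Tables work mentioned in the first responsibility above (again, not part of the posting itself): the sketch below defines a streaming bronze table fed by Auto Loader and a validated silver table built from it. The storage path, table names, and data-quality expectation are hypothetical, and the code assumes it runs inside a Databricks Delta Live Tables pipeline where spark is predefined.

```python
import dlt
from pyspark.sql import functions as F

# Streaming bronze table: ingest raw JSON files incrementally with Auto Loader.
# The source path is a hypothetical placeholder.
@dlt.table(comment="Raw trip events ingested incrementally from cloud storage")
def trips_bronze():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/Volumes/main/transportation/raw_trips")
    )

# Silver table: validated records ready for downstream data products.
# The expectation drops rows that fail the data-quality check.
@dlt.table(comment="Validated trip records")
@dlt.expect_or_drop("valid_trip_id", "trip_id IS NOT NULL")
def trips_silver():
    return (
        dlt.read_stream("trips_bronze")
        .withColumn("ingested_at", F.current_timestamp())
    )
```

Delta Live Tables can run a definition like this either continuously or as a triggered update, which is one way to cover both the streaming and batch processing called out above.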

Required Skills/Knowledge/Experience:

  • Excellent interpersonal skills as well as written and verbal communication skills, Required 5 Years
  • Able to write clean, easy-to-follow Databricks notebook code, Required 2 Years
  • Deep knowledge of data engineering best practices, data warehouses, data lakes, and the Delta Lake architecture, Required 2 Years
  • Good knowledge of Spark and Databricks SQL/PySpark, Required 2 Years
  • Technical experience with Azure Databricks and cloud providers like AWS, Google Cloud, or Azure, Required 2 Years
  • In-depth knowledge of OLTP and OLAP systems, Apache Spark, and streaming products like Azure Service Bus, Required 2 Years
  • Good practical experience with Databricks Delta Live Tables, Required 2 Years
  • Knowledge of object-oriented languages like C#, Java, or Python, Desired 7 Years

Proper email communication will only be done to and from @astyra.com email addresses. Please ensure you are communicating with approved Astyra recruiters by checking the sender's address when receiving offers and messages from us. Communicating within these guidelines and through proper channels will ensure the quickest possible interview consideration!
