
Lead ETL Data Engineer

Ref: 79676

  • €85,000–95,000
  • 22 Aug 2023
  • Limerick
  • Permanent

Job Summary:

We are currently seeking a motivated Data Engineer to join our client's growing data engineering team. The ideal candidate will have a strong foundation in Python, SQL, cloud technologies, advanced warehouses such as Snowflake/Redshift/Athena, Kubernetes/microservices architecture, Kafka, and big data technologies such as Hadoop/Spark or their cloud equivalents. They will be responsible for architecting, developing, maintaining, and optimizing data solutions, data pipelines, and ETL processes to support our cloud-based initiatives. The candidate should have sufficient experience to understand the overall architecture and complex intricacies of our product and direct the team accordingly in development and migration activities. This is an excellent opportunity for a data enthusiast to gain hands-on experience in a dynamic, data-driven environment.

Key Responsibilities:

  • Own the technical delivery of part of a project
  • Take ownership of deliverables and direct the team in choosing the best possible approach
  • Engage with different cross-functional agile teams and stakeholders to understand business and technical requirements
  • Lead technical teams of 5–7 members and mentor junior members
  • Define robust architectures using Cloud services and direct the team accordingly
  • Act as a subject matter expert for relevant data engineering tools and technologies
  • Independently perform technical responsibilities and oversee the work of other team members
  • Develop, construct, test, and maintain solutions architecture
  • Identify and implement ways to improve data reliability, efficiency, and quality
  • Build and automate data systems and pipelines
  • Participate in code reviews, peer feedback sessions, and continuous improvement initiatives
  • Assist in the development and maintenance of data process documentation, including data flow diagrams, data mapping documents, and test cases
  • Continuously learn and stay current with industry trends, emerging technologies, and best practices in data engineering and cloud-based data warehousing

Requirements:

  • Bachelor's, Master's, or PhD degree required, in data science, data analytics, artificial intelligence, cloud computing, computer science, physics, mathematics, or a related quantitative discipline
  • Excellent Python and SQL development skills
  • 5+ years of experience with big data tools such as Hadoop/Spark or their cloud equivalents
  • 5+ years of experience with cloud technologies such as AWS or GCP; cloud certifications, especially in GCP or AWS, preferred
  • Experience with advanced data warehouses such as Snowflake, Athena, and/or Redshift; Snowflake certification preferred
  • Experience with Kubernetes and Microservices architecture
  • Experience with Kafka producer and consumer services
  • Familiarity with Jira and Agile frameworks such as SAFe is a plus
  • Experience in owning the technical delivery of part of a project
  • Ability to deliver to short timelines and under pressure
  • Desire to become a technical SME
  • 8+ years of industry experience
  • Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams
  • Eagerness to learn and stay current with industry trends, emerging technologies, and best practices in data engineering and cloud-based data warehousing

Contact

For a confidential and discreet conversation to understand more about this Technology job, please contact John Howe on +353 1 592 7868 or email