Data Engineer with streaming focus - Porto, Portugal - Metyis

    Description

    What we offer

  • Develop your professional career working with one of the major brands in the fashion industry.
  • Interact with senior stakeholders at our clients on a regular basis to drive their business towards impactful change.
  • Work with our business departments to develop solutions for operational and management information needs in the areas of Business Intelligence, Reporting, Planning, Data Warehousing, and Advanced Analytics.
  • Become part of a fast-growing international and diverse team.
    What you will do

  • Design, develop, test, and deploy data pipelines using various streaming technologies as part of the data engineering team.
  • Work on a company-wide streaming data processing framework that is built and maintained as a software application in its own right.
  • Optimize existing code for performance, reliability, and scalability.
  • Debug and troubleshoot issues and provide technical support as needed.
  • Follow best practices and standards for coding, documentation, testing, and security.
  • Mentor and provide technical guidance to other data professionals, fostering a culture of knowledge sharing and continuous learning.
  • Be proactive and be willing to work closely with Data Science, DevOps/MLOps, Cloud Infrastructure, or IT-Security colleagues.
  • Research and evaluate new technologies and trends to improve existing software or create new solutions.
    What you'll bring

  • Academic degree in computer science, software engineering, machine learning engineering, or related field.
  • At least 3 years of professional software development experience with languages such as Python, SQL, Scala, or Rust, and frameworks such as Spark.
  • Strong knowledge of Apache Kafka and its ecosystem, including Kafka Connect and REST Proxy.
  • Strong knowledge of stream processing frameworks such as Confluent KSQL, Kafka Streams, Faust, Spark Structured Streaming, Apache Flink, or Samza.
  • Strong knowledge of Azure Event Hub.
  • Strong knowledge of Databricks Delta Live Tables.
  • Strong knowledge of data formats, schemas, and serialization techniques such as JSON or Avro.
  • Strong knowledge of software design patterns and principles, functional programming, and object-oriented programming.
  • Strong ability to write clean, maintainable, and scalable code.
  • Knowledge of best practices in data streaming, such as data quality, latency, reliability, and security.
  • Experience with SQL and NoSQL databases such as Azure SQL or MongoDB.
  • Experience with cloud platforms such as Microsoft Azure or GCP.
  • Experience in data engineering or analytics is preferred.
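To give candidates a concrete feel for the concepts named above (deserialization, schemas, and windowed stream processing), here is a minimal, framework-free Python sketch of a tumbling-window event counter. It mimics, in plain code, the kind of aggregation a Kafka Streams or Faust pipeline would perform; all names, field layouts, and the 60-second window size are illustrative, not part of the role.

```python
import json
from collections import defaultdict

WINDOW_SIZE = 60  # seconds; tumbling-window length (illustrative choice)

def tumbling_window_counts(raw_events):
    """Deserialize JSON events and count them per (key, window) bucket.

    Each event is assigned to the window containing its timestamp, and
    counts are keyed by (event key, window start) -- the same shape a
    windowed aggregation produces in Kafka Streams or Faust.
    """
    counts = defaultdict(int)
    for raw in raw_events:
        event = json.loads(raw)  # deserialization step (JSON here; often Avro in production)
        window_start = (event["ts"] // WINDOW_SIZE) * WINDOW_SIZE
        counts[(event["key"], window_start)] += 1
    return dict(counts)

# A tiny simulated stream of serialized events:
stream = [
    '{"key": "clicks", "ts": 5}',
    '{"key": "clicks", "ts": 42}',
    '{"key": "clicks", "ts": 61}',  # falls into the next 60s window
    '{"key": "views",  "ts": 10}',
]
print(tumbling_window_counts(stream))
# → {('clicks', 0): 2, ('clicks', 60): 1, ('views', 0): 1}
```

A production pipeline adds the pieces the posting asks about on top of this core logic: schema validation, exactly-once delivery, and state stores for fault-tolerant aggregation.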