
Data Engineer - Strategic Data Infrastructure and Institutional Support

1 Vacancy · Closes in 22 days · Be one of the first to apply
  • Full-time
  • Engineer
  • Región Metropolitana de Santiago
  • Santiago

Summary:

We are seeking a skilled and proactive Data Engineer to design, implement, and maintain the technical infrastructure that ensures the availability, quality, traceability, and scalability of institutional and scientific data. This role is critical in enabling robust analytical workflows and providing efficient, secure, and reproducible data flows across strategic projects within the Universidad Católica's Data Science Initiative.

The Data Engineer will work closely with Data Scientists, Analysts, and other technical professionals to deliver scalable data solutions that meet analytical and research needs. The position involves building and maintaining automated pipelines, ensuring data integration from heterogeneous sources, and applying modern DevOps and DataOps practices to deploy production-grade infrastructure aligned with data governance standards and FAIR principles.

Primary Responsibilities:

  • Design and maintain automated data pipelines (DataOps) to support scalable and efficient data ingestion, processing, and storage.

  • Develop robust ETL/ELT workflows that process data from a variety of sources, including sensors, APIs, databases, and flat files.

  • Integrate and harmonize data from diverse sources, ensuring consistency, quality, and alignment with strategic needs.

  • Implement continuous integration and deployment (CI/CD) practices to validate and deploy updates to data pipelines and infrastructure components.

  • Manage relational and non-relational databases, ensuring their performance, security, and reliability.

  • Apply data security measures to protect data in transit and at rest.

  • Document and version technical workflows and data processes to ensure reproducibility and traceability.

  • Support institutional data governance by implementing technical standards aligned with FAIR principles and internal policies.

  • Collaborate closely with Data Scientists, Analysts, and Engineers to understand needs and deliver tailored infrastructure solutions.

  • Monitor the operational health of data pipelines, anticipating and resolving failures proactively.
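For illustration only, the kind of ETL workflow described in the responsibilities above might be sketched in plain Python as follows. The record fields (`sensor_id`, `temp_c`) and the in-memory source are invented stand-ins; a real pipeline would pull from the APIs, sensors, and databases the posting mentions.

```python
import sqlite3

def extract():
    # Hypothetical extract step: stands in for a REST API call or sensor read.
    return [
        {"sensor_id": "s1", "temp_c": "21.4"},
        {"sensor_id": "s2", "temp_c": "19.8"},
    ]

def transform(records):
    # Enforce types so downstream analytics can rely on them.
    return [(r["sensor_id"], float(r["temp_c"])) for r in records]

def load(rows, conn):
    # Idempotent upsert into a relational store (SQLite here for self-containment).
    conn.execute(
        "CREATE TABLE IF NOT EXISTS readings (sensor_id TEXT PRIMARY KEY, temp_c REAL)"
    )
    conn.executemany(
        "INSERT INTO readings VALUES (?, ?) "
        "ON CONFLICT(sensor_id) DO UPDATE SET temp_c = excluded.temp_c",
        rows,
    )
    conn.commit()

def run_pipeline(conn):
    load(transform(extract()), conn)

conn = sqlite3.connect(":memory:")
run_pipeline(conn)
print(conn.execute("SELECT COUNT(*) FROM readings").fetchone()[0])  # 2
```

The upsert makes the load step safe to re-run, which is one common way pipelines of this kind stay reproducible when re-executed after a failure.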

Join us in driving innovation and advancing interdisciplinary research through the power of data engineering. This role presents a unique opportunity to collaborate with leading researchers and contribute to groundbreaking projects that shape the future of academia. We encourage you to apply if you're passionate about leveraging data to drive research excellence and innovation.

Moreover, at Universidad Católica, we foster a supportive and collaborative culture that encourages professional growth and development. You'll have the opportunity to work alongside talented individuals, including our Senior Data Scientist, and contribute to cutting-edge research initiatives that have a meaningful impact. Apply now and be part of our journey towards excellence in data-driven research and innovation.

Desired profile


Qualifications:

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.

  • 3+ years of hands-on experience in building and maintaining data pipelines and infrastructure.

  • Deep knowledge of ETL/ELT processes, data modeling, and data integration across systems.

  • Familiarity with DevOps practices (CI/CD) in a data context.

  • Experience in working with large datasets and ensuring high availability and integrity of data systems.

  • Excellent collaboration and communication skills in multidisciplinary environments.

  • Experience applying data governance practices in production data environments.

  • Strong problem-solving skills.

Technical Competencies:

  • Expertise in orchestration tools such as Azure Data Factory, Apache Airflow, Power Query, or similar platforms.

  • Strong proficiency in SQL, Python, and REST API integration.

  • Solid understanding of data architecture, version control (Git), and container technologies (e.g., Docker).

  • Experience with cloud-based data platforms (e.g., Microsoft Fabric, Azure, AWS).

  • Emphasis on automation, robustness, and scalability of data systems.

  • Excellent communication and collaboration skills, with the ability to work effectively in interdisciplinary teams.
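As a minimal sketch of the CI/CD-style validation mentioned in the competencies above (the field names and rules are invented for the example), an automated check that a pipeline's output batch is well-formed might look like:

```python
def validate(rows, required_fields=("sensor_id", "temp_c")):
    """Return a list of human-readable issues; an empty list means the batch passes."""
    issues = []
    for i, row in enumerate(rows):
        for field in required_fields:
            # Flag missing keys and empty values so CI can fail the deployment.
            if field not in row or row[field] in (None, ""):
                issues.append(f"row {i}: missing {field}")
    return issues

batch = [{"sensor_id": "s1", "temp_c": 21.4}, {"sensor_id": "", "temp_c": 19.8}]
print(validate(batch))  # ['row 1: missing sensor_id']
```

A CI job would run checks like this against sample or staging data and block deployment of a pipeline update when the list is non-empty.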

Inclusive employment

  • Experience: 3+ years
  • Minimum education: postgraduate degree
  • Graduated

Job location

The job location shown is approximate

