
Data Engineer (AI Platforms), LATAM

  • Remote
    • Brazil, São Paulo, Brazil
    • Chile, Región Metropolitana de Santiago, Chile
    • Colombia, Cundinamarca, Colombia
    • Mexico, México, Mexico
    • Buenos Aires, Buenos Aires, Argentina
    • Costa Rica, Alajuela, Costa Rica
    • Peru, Amarumayu, Peru
    • Dominican Republic, Cibao Nordeste, Dominican Republic
    • Panama, Bocas del Toro, Panama
    • Ecuador, Azuay, Ecuador
  • BI

Job description

Are you passionate about building the data foundations for a new generation of AI? We're looking for a skilled Data Engineer to be a major contributor to our company's intelligent future. You won't just be maintaining systems; you'll be at the heart of building, scaling, and deploying the data and AI platforms that will redefine how we deliver data solutions.

This is an opportunity to make a significant impact by transforming our data landscape and enabling cutting-edge AI and agentic workflows.

We are a small, close-knit team, and we care deeply about you:

  • Competitive pay rates

  • Fully remote work environments

  • Self-managed time off

Important:

  • This is a full-time remote job, and it will be a long-term B2B contract.

  • The locations for this role, as listed in the job description on our website, are: Argentina, Brazil, Chile, Colombia, Costa Rica, Dominican Republic, Ecuador, Mexico, Panama, Peru.

What You'll Do

  • Design, build, and optimize robust, scalable data pipelines, leading the migration from legacy systems to our modern, AI-centric platform.

  • Evolve our data models and schemas to better support complex analytics, AI training, and fine-tuning workloads.

  • Collaborate with AI/ML teams to productionize models, streamline training data delivery, and support the development of sophisticated agentic systems.

  • Empower the organization by partnering with BI developers and analysts to design highly efficient queries and unlock new insights.

  • Champion data governance and compliance, ensuring our data handling practices remain secure and trustworthy as we innovate.

Challenges You'll Help Us Tackle

  • Modernize Our Data Backbone: Lead the charge in migrating our historical data flows to cutting-edge, AI-driven workflows.

  • Shape the Future of our AI: Redesign our datasets and schemas to be well aligned for training and fine-tuning next-generation models.

  • Build the Brains of the Operation: Play an important role in building the infrastructure that supports powerful, data-driven AI agents.

  • Scale with Intelligence: Help us build a data ecosystem that is not only powerful today but is ready for the demands of tomorrow's AI.

Job requirements

  • Proven experience (4+ years) in a data engineering role, with a track record of building and managing complex data systems.

  • Deep expertise in SQL and query optimization.

  • Hands-on experience with cloud data warehouses and databases, specifically Google BigQuery and Cloud SQL (PostgreSQL).

  • Programming experience with Python or Java.

  • A proactive, self-motivated, and self-managed mindset, well suited to a fully remote environment with a high degree of autonomy.

  • Excellent communication and documentation skills; you can clearly articulate complex technical concepts to diverse audiences.

  • The ability to work a flexible schedule and the readiness to respond to occasional off-hours emergencies.

Bonus Points For

  • AI/ML Tooling: Experience with Google's Vertex AI platform.

  • Programming Languages: Proficiency in Go.

  • Data Engineering Tools: Familiarity with dbt and Airflow.

  • Streaming Data: Familiarity with event-streaming platforms like Apache Kafka.

  • Streaming Analytics: Experience with real-time streaming analytics.

  • DevOps & Infrastructure: Experience with containerization (Docker) and serverless compute (Google Cloud Run).

  • Legacy Systems: Experience with Perl or PHP is a plus.
