Data Engineer (AI Platforms)

  • Remote
    • Seattle, Washington, United States
    • Virginia, United States
    • Salt Lake City, Utah, United States
    • Austin, Texas, United States
    • San Francisco, California, United States
    • Pleasanton, California, United States
    • Salem, Oregon, United States
    • Ohio, United States
    • New York, New York, United States
    • New Jersey, United States
    • Carson City, Nevada, United States
    • Minnesota, United States
    • Michigan, United States
    • Maryland, United States
    • Los Angeles, California, United States
    • Denver, Colorado, United States
    • California, United States
    • Phoenix, Arizona, United States

Job description

Are you passionate about building the data foundations for a new generation of AI? We're looking for a skilled Data Engineer to be a major contributor to our company's intelligent future. You won't just be maintaining systems; you'll be at the heart of building, scaling, and deploying the data and AI platforms that will redefine how we deliver data solutions.

This is an opportunity to make a significant impact by transforming our data landscape and enabling cutting-edge AI and agentic workflows.

What You'll Do

  • Design, build, and optimize robust, scalable data pipelines, leading the migration from legacy systems to our modern, AI-centric platform.

  • Evolve our data models and schemas to better support complex analytics, AI training, and fine-tuning workloads.

  • Collaborate with AI/ML teams to productionize models, streamline training data delivery, and support the development of sophisticated agentic systems.

  • Empower the organization by partnering with BI developers and analysts to design highly efficient queries and unlock new insights.

  • Champion data governance and compliance, ensuring our data handling practices remain secure and trustworthy as we innovate.

Challenges You'll Help Us Tackle

  • Modernize Our Data Backbone: Lead the charge in migrating our historical data flows to cutting-edge, AI-driven workflows.

  • Shape the Future of Our AI: Redesign our datasets and schemas so they are well suited to training and fine-tuning next-generation models.

  • Build the Brains of the Operation: Play a key role in building the infrastructure that supports powerful, data-driven agentic systems.

  • Scale with Intelligence: Help us build a data ecosystem that is not only powerful today but is ready for the demands of tomorrow's AI.


We are a small, close-knit team, and our philosophy is simple: we care deeply about our people.

What We Offer

  • Competitive compensation that reflects your skills and contributions (compensation for this position will be tailored to qualifications and experience; candidates offered the position can expect a starting base salary between $130,000 and $140,000 per year)

  • Generous time off opportunities to recharge and enjoy life outside of work

  • Flexible scheduling to support work‑life balance

  • Fully remote work environment — work from anywhere you’re most productive

  • Global collaboration with talented colleagues in Silicon Valley and around the world

Job requirements

  • Proven experience (4+ years) in a data engineering role, with a track record of building and managing complex data systems.

  • Deep expertise in SQL and query optimization.

  • Hands-on experience with cloud data warehouses and databases, specifically Google BigQuery and Cloud SQL (PostgreSQL).

  • Programming experience with Python or Java.

  • A proactive, self-motivated, and self-managed mindset, well suited to a fully remote environment with a high degree of autonomy.

  • Excellent communication and documentation skills; you can clearly articulate complex technical concepts to diverse audiences.

  • The ability to work a flexible schedule and the readiness to respond to occasional off-hours emergencies.

Bonus Points For

  • AI/ML Tooling: Experience with Google's Vertex AI platform.

  • Programming Languages: Proficiency in Go.

  • Data Engineering Tools: Familiarity with dbt and Apache Airflow.

  • Streaming Data: Familiarity with event-streaming platforms like Apache Kafka.

  • Streaming Analytics: Experience building real-time analytics on streaming data.

  • DevOps & Infrastructure: Experience with containerization (Docker) and serverless compute (Google Cloud Run).

  • Legacy Systems: Experience with Perl or PHP is a plus.
