Company: WORKHUB AGENCY
Job Location: Mexico City, Mexico City, Mexico
Job Type: Full-time (Hybrid)
Job Salary: 0 USD per month.
Date Posted: April 06, 2025
About Arkham
Arkham is a Data & AI Platform: a suite of powerful tools designed to help you unify your data and use the best Machine Learning and Generative AI models to solve your most complex operational challenges. Today, industry leaders like Circle K, Mexico Infrastructure Partners, and Televisa Editorial rely on our platform to simplify access to data and insights, automate complex processes, and optimize operations. With our platform and implementation service, our customers save time, reduce costs, and build a strong foundation for lasting Data and AI transformation.

About the Role
We are looking for a Senior Data Engineer to own our high-performance Data Platform, built on the Lakehouse architecture. In this role, you will work with cutting-edge technologies such as Apache Spark, Trino, and Delta Lake, ensuring data governance and interoperability across platforms. You'll play a key role in shaping our data infrastructure, working across the entire data lifecycle, from ingestion to transformation and activation.
Key Responsibilities
- Lead the next phase of our Data Platform: develop and enhance Arkham's Data Platform, following Lakehouse architecture principles and ensuring data governance.
- Data Ingestion Pipelines: design and implement pipelines to extract data from structured, semi-structured, and unstructured sources.
- Data Pipeline Orchestration: create, monitor, and optimize multiple data extraction and transformation pipelines.
- Data Catalog Integration: ensure interoperability between data catalogs and various query engines.
- Cluster Management & Observability: oversee cluster performance and implement observability solutions to keep data pipelines executing optimally.
- End-to-End Data Lifecycle Management: maintain high data quality and usability across the integration, transformation, and activation stages.

Qualifications
- Experience: 5+ years in data engineering, data architecture, or a related field.
- Technical Expertise: proficiency in Apache Spark, Delta Lake, and Trino.
- Programming Skills: strong experience with Python for scripting and automation.
- Cloud Knowledge: hands-on experience with AWS services, including Glue, S3, and EMR.
- Big Data: understanding of distributed data systems and query engines.
- Problem-Solving: excellent analytical and debugging skills.
Application Deadline: April 06, 2025
Apply Now