Senior Data Engineer
At Netomi AI, we are on a mission to create artificial intelligence that builds customer love for the world’s largest global brands.
Some of the world's largest brands are already using Netomi AI’s platform to solve mission-critical problems. You will work with top-tier clients at a senior level and build your professional network.
Backed by leading investors such as Y Combinator, Index Ventures, Jeffrey Katzenberg (co-founder of DreamWorks), and Greg Brockman (co-founder and President of OpenAI), you will join an elite group of visionaries who are defining the future of AI for customer experience. We are building a dynamic, fast-growing team that values innovation, creativity, and hard work. You will have the chance to significantly impact the company’s success while developing your skills and career in AI.
Want to become a key part of the Generative AI revolution? We should talk.
We are looking for a Senior Data Engineer with a passion for using data to discover and solve real-world problems. You will work with rich data sets and modern business intelligence technology, and see your insights drive features for our customers. You will also have the opportunity to help shape the policies, processes, and tools that address product quality challenges, in collaboration with cross-functional teams.
What You’ll Do
Architect and implement scalable, secure, and reliable data pipelines using modern data platforms (e.g., Spark, Databricks, Airflow, Snowflake).
Develop ETL/ELT processes to ingest data from various structured and unstructured sources.
Perform Exploratory Data Analysis (EDA) to uncover trends, validate data integrity, and derive insights that inform data product development and business decisions.
Collaborate closely with data scientists, analysts, and software engineers to design data models that support high-quality analytics and real-time insights.
Lead data infrastructure projects including management of data on cloud platforms (AWS/Azure), data lake/warehouse implementations, and data quality frameworks.
Ensure data governance, security, and compliance best practices are followed.
Monitor and optimize the performance of data systems, addressing any issues proactively.
Mentor junior data engineers and contribute to establishing best practices in data engineering standards, tooling, and development workflows.
Stay current with emerging technologies and trends in data engineering and recommend improvements as needed.
Required Qualifications
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
8+ years of hands-on experience in data engineering or backend software development roles.
Proficiency with Python, SQL, and at least one data pipeline orchestration tool (e.g., Apache Airflow, Luigi, Prefect).
Strong experience with cloud-based data platforms (e.g., AWS Redshift, GCP BigQuery, Snowflake, Databricks).
Deep understanding of data modeling, data warehousing, and distributed systems.
Experience with big data technologies such as Apache Spark, Kafka, and Hadoop.
Familiarity with DevOps practices (CI/CD, infrastructure as code, containerization with Docker/Kubernetes).
Preferred Qualifications
Experience working with real-time data processing and streaming data architectures.
Knowledge of data security and privacy regulations (e.g., GDPR, HIPAA).
Exposure to machine learning pipelines or supporting data science workflows.
Familiarity with prompt engineering and how LLM-based systems interact with data.
Experience working in cross-functional teams and with stakeholders from non-technical domains.
Netomi is an equal opportunity employer committed to diversity in the workplace. We evaluate qualified applicants without regard to race, color, religion, sex, sexual orientation, disability, veteran status, and other protected characteristics.