Data Architect - Platform
More about Enterra Solutions, LLC:
Enterra leverages its history across government, commercial, and academic domains to help the world’s leading brands and organizations unlock growth and profit by delivering unique insights at unprecedented speeds and with verifiable accuracy. Our breakthrough Autonomous Decision Science® (ADS®) platform closes critical market gaps. By combining human-like reasoning with transparent mathematics and real-world optimization capabilities, business solutions built on our platform uncover previously unrealizable value across our clients’ value chain and orchestrate enterprise-wide optimization and decision-making. Our current business solution areas are focused within and across consumer insights, revenue growth and supply chain. By combining our proprietary technology with our clients’ knowledge and practices, Enterra anticipates market changes systematically and at market speed—transforming clients into Autonomous Intelligent Enterprises.
What you will do:
The successful candidate will join a diverse team to:
Build unique, high-impact business solutions utilizing advanced technologies for use by world-class clients.
Design and maintain the underlying data architecture for the end-to-end solution offerings.
Design and maintain data structures for machine learning and other analytics.
Guide the data technology stack used to build Enterra’s solution offerings.
Combine machine learning, artificial intelligence (ontologies, inference engines and rules) and natural language processing under a holistic vision to scale and transform businesses — across multiple functions and processes.
Responsibilities Include:
Work with other Enterra personnel to develop and enhance commercial-quality solution offerings in the consumer goods and retail industry within the Revenue Growth capability area.
Understand complex business requirements and take the lead in proposing elegant, simplified enterprise information architecture solutions.
Design and facilitate enterprise information/data architecture for structured and unstructured data for use across multiple Enterra solution offerings and multiple clients. Note: there is no standard form of client data, so every client's data varies.
Design and facilitate the architecture to assemble large, complex data sets to meet analytical requirements (analytics tables, feature engineering, etc.).
Specify logical data integration (ETL) strategies for data flows between disparate source/target systems (i.e., client systems) and the enterprise information repositories.
Design and facilitate optimal data pipeline architecture, incorporating data wrangling and Extract-Transform-Load (ETL) flows.
Design and implement data solutions using Master Data Management principles and tools.
Design and implement data governance and quality initiatives that ensure consistent translation and usage of data.
Partner with leaders and team members across the business to design and institute practices that drive the appropriate levels of rigor and quality in enterprise information architecture.
Design, develop, and maintain controls on data quality, interoperability, and sources to effectively manage corporate risk.
Design in-depth data analysis, data modeling, and data design approaches for complicated datasets with potentially complex data integration scenarios.
Define processes for the effective, integrated introduction of new data.
Establish and contribute to standards for ensuring consistent usage of our information platforms.
Ensure speed of data delivery without compromising data quality measures.
Work with a team of Data Engineers to implement the designs.
Think through multiple alternatives and select the best possible solutions to meet tactical and strategic business needs.
Evaluate new technology for use within Enterra.
Requirements:
Master’s degree in Computer Science or another STEM (Science, Technology, Engineering, or Math) field is required.
Minimum of 5 years of hands-on experience in data architecture.
Experience in the consumer goods and retail industry, especially with sales data, is not required but is strongly desired.
Minimum of 3 years of experience in an analytics/data science environment.
Minimum of 3 years of experience within a big data environment.
Demonstrable knowledge of data warehousing, business intelligence, and application data integration solutions.
Demonstrable experience developing applications and services that run on a cloud infrastructure (Azure preferred; AWS or GCP also acceptable).
Demonstrable experience with dimensional modeling techniques and the creation of logical and physical data models (entity-relationship modeling, Erwin diagrams, etc.).
Excellent problem-solving and communication skills in English.
Ability to thrive in a fast-paced, remote environment.
Comfortable with ambiguity, with the ability to build structure and take a proactive approach to drive results.
Attention to detail – quality and accuracy in deliverables.
Strong interpersonal skills, including the ability to advocate for data management best practices and standards.
The following additional skills would be beneficial:
Knowledge of one or more of the following technologies: Data Science, Machine Learning, Natural Language Processing, Business Intelligence, and Data Visualization.
Knowledge of statistics and experience using statistical or BI packages for analyzing large datasets (Excel, R, Python, Power BI, Tableau, etc.).
Experience with at least one of the following: Databricks, Spark, Hadoop, or Kafka.