Solutions Architect - Data Engineering & Snowpark
Enterprises are modernizing their data platforms and processes at an accelerating rate to meet the demands of their customers. Much of this journey requires not just technical expertise but also the ability to drive predictability and manage complexity. Snowflake’s Professional Services Organization offers our customers a market-leading set of technical capabilities as well as best practices for modernization grounded in experienced leadership. Our portfolio of modernization solutions spans data migrations, validation, application development, data sharing, and data science.
As a Solutions Architect in our team, you will be responsible for delivering exceptional outcomes for our teams and customers during our modernization projects. You will engage with customers to migrate from Scala environments into Snowpark. You will act as the technical lead and expert for our customers throughout this process.
In addition to customer engagements, you will work with our internal team to provide requirements for our Snowconvert utility, based on project experiences. This ensures that our tooling is continuously improved based on our implementation experience. This role will report to the Director of Data Engineering and Snowpark within the Workload Solutions team in the PS&T organization at Snowflake.
KEY RESPONSIBILITIES:
Delivery
Lead migrations of applications, code, and data onto cloud platforms, and design the subsequent services on Snowflake
Outline the architecture of Spark and Scala environments
Guide customers on architecting and building data engineering pipelines on Snowflake
Run workshops and design sessions with stakeholders and customers
Troubleshoot migration issues
Create repeatable processes and documentation as a result of customer engagement
Script ETL workflows using Python and shell scripts
Develop best practices, including ensuring knowledge transfer so that customers are properly enabled and are able to extend the capabilities of Snowflake on their own
Provide guidance on how to resolve customer-specific technical challenges
Outline a testing strategy and plan
Optimize Snowflake for performance and cost
Product Strategy
Communicate requirements for Snowpark conversion capabilities covering Scala- and Spark-based back-end software modules
Communicate requirements for the design and development of back-end big data frameworks as enhancements to our Snowpark platform
Weigh in on and develop frameworks involving distributed computing, Apache Spark, PySpark, Python, HBase, Kafka, REST-based APIs, and machine learning as part of our tools development (Snowconvert) and overall modernization processes
OUR IDEAL SOLUTIONS ARCHITECT WILL HAVE:
Bachelor's degree in a technical discipline or equivalent practical experience
6+ years of experience in a customer-facing technical role delivering complex technical implementation projects, with a proven track record of results on multi-party, multi-year digital transformation engagements
Experience in data warehousing, business intelligence, AI/ML, application modernization, or cloud projects, including building real-time and batch data pipelines using Spark and Scala
Ability to deliver outcomes for customers in a new arena and with a new product set
Ability to learn new technology and build repeatable solutions/processes
Ability to anticipate project roadblocks and have mitigation plans in hand
Proven ability to communicate and translate effectively across multiple groups from design and engineering to client executives and technical leaders
Outstanding skills presenting to both technical and executive audiences, whether impromptu at a whiteboard or with prepared presentations
Ability and flexibility to travel to work with customers on-site as needed
Every Snowflake employee is expected to follow the company’s confidentiality and security standards for handling sensitive data. Snowflake employees must abide by the company’s data security plan as an essential part of their duties. It is every employee's duty to keep customer information secure and confidential.
The application window is expected to be open until October 6, 2024. Depending on business needs, the posting may close earlier or remain open later than that date.