Tiger Analytics is a global leader in AI and advanced analytics consulting, empowering Fortune 1000 companies to solve their toughest business challenges. We are on a mission to push the boundaries of what AI can do, providing data-driven certainty for a better tomorrow. Our diverse team of over 6,000 technologists and consultants operates across five continents, building cutting-edge ML and data solutions at scale. Join us to do great work and shape the future of enterprise AI.
We are seeking a highly experienced and client-facing Solution Architect to serve as the senior technical leader for a critical Enterprise Data Platform (EDP) initiative. This role is responsible for driving the successful end-to-end migration of complex source system data into our new, highly governed platform built on AWS and Snowflake.
Key Responsibilities
- Lead the technical discovery, design, and planning phases, translating complex client business requirements into detailed architectural blueprints and robust technical specifications.
- Design and document the end-to-end data flow and pipeline orchestration, from source system ingestion patterns (batch and streaming) through to the final curated Gold Layer models, primarily utilizing Amazon Redshift.
- Define the canonical data models for the Silver Layer and establish the optimal dimensional models for the Gold Layer, leveraging data modeling tools such as Erwin.
- Provide clear technical leadership and oversight to the engineering and development teams, ensuring alignment with the established architectural vision.
Requirements
- 15+ years of progressive experience in data engineering, data warehousing, and cloud architecture roles.
- Minimum of 10 years specifically in a client-facing or leadership role designing and implementing large-scale data migration projects.
- Candidates must demonstrate deep, hands-on experience and architectural proficiency across the following technologies:
- AWS Services: Amazon EMR (Elastic MapReduce), Amazon S3, AWS Glue, and Amazon Redshift.
- Data Processing: Expert knowledge in PySpark for large-scale data transformation and processing.
- Data Warehousing: Proven experience designing and managing scalable data warehouse solutions, specifically using Redshift.
- Data Streaming: Kafka and/or Amazon Kinesis.
- Cloud Data Platform: Hands-on experience with Snowflake.
- Data Governance & Quality: Experience with tools like DataHub and data quality frameworks such as Great Expectations.
- Modeling: Advanced data modeling techniques and methodologies.
- Participate in fast iteration cycles, adapting to evolving project requirements.
- Collaborate as part of a cross-functional Agile team to create and enhance software that enables state-of-the-art big data and ML applications.
- Collaborate with data scientists, software engineers, data engineers, and other stakeholders to develop and implement MLOps best practices, including CI/CD pipelines, version control, model versioning, monitoring, alerting, and automated model deployment.
- Work effectively with a global team, playing a key role in communicating problem context to remote teams.
Significant career development opportunities exist as the company grows. The position offers a unique opportunity to be part of a small, fast-growing, entrepreneurial environment that provides challenging work and a high degree of individual responsibility.
Tiger Analytics provides equal employment opportunities to applicants and employees without regard to race, color, religion, age, sex, sexual orientation, gender identity/expression, pregnancy, national origin, ancestry, marital status, protected veteran status, disability status, or any other basis as protected by federal, state, or local law.