Title: Analytics Engineer
Location: Budapest, HU
Type: Full-time
Workplace: Hybrid remote
Job Description:
About SEON
Are you a technically skilled, hands-on engineer who's passionate about data engineering and analytics? Join SEON and become part of a world-class research and development organization that is equally passionate about creating great data experiences for our software engineers. Those great experiences ensure that we can deliver amazing solutions to the customers we are obsessed with!
SEON provides an API-first solution that helps our customers (many of the world's leading providers of digital experiences in financial services, insurance, entertainment, etc.) defend their customers from fraud and financial crime. With over 250 Fraud Fighters across four global offices (Austin, Budapest, London, and Jakarta), our goal remains unwavering: to make the internet a safe place for businesses and customers to transact. Our achievements, including a record-breaking Series B funding round and coverage in TechCrunch, have earned us recognition as the world's fastest-growing fraud prevention company. We take pride in our rapid growth and our mission to democratize fraud-fighting while empowering the best online businesses. Join us in our journey to make the internet a safer space for everyone.
Your Role
As an Analytics Engineer, you will bridge the gap between data engineering and data analytics, empowering our revenue, success services, product, marketing, and fraud intelligence teams with reliable data. You will work closely with analysts, data engineers, and business stakeholders to design and maintain scalable data pipelines, models, and reports that support fraud prevention insights, operational analytics, and business intelligence.
What you'll do
Use data to make us the best fraud fighters worldwide and make e-commerce safe for everyone!
Develop and maintain data models (e.g., dbt, SQL transformations) to enable self-service analytics.
Build and optimize ETL/ELT pipelines to extract, transform, and load data efficiently.
Manage and optimize relational and NoSQL databases for analytics and operational needs.
Ensure data quality by implementing testing frameworks, validation rules, and monitoring processes.
Collaborate with data engineers to ensure scalable and efficient data architecture.
Work closely with analysts and business teams to understand data needs and deliver well-structured datasets.
Translate business requirements into data models, workflows, and engineering tasks.
Provide expertise to data power users, helping them translate business needs into well-designed tools, techniques, metrics, and reports.
Manage and improve data warehouse performance (e.g., Snowflake, BigQuery, Redshift).
Implement and maintain documentation for data models, pipelines, and analytics processes.
Advocate for best practices in analytics engineering, including version control (e.g., Git) and CI/CD for data workflows.
What you'll bring
Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
3+ years of experience in analytics engineering, data engineering, or similar roles.
Proficiency in SQL and experience with data modeling techniques (star schema, normalization, etc.).
Proficiency in Python for data transformation and automation.
Demonstrated success in building and maintaining scalable data pipelines.
Experience with dbt (Data Build Tool) for data transformation and modeling.
Hands-on experience with modern data warehouses (BigQuery, Snowflake, Redshift, etc.).
Familiarity with ETL/ELT tools (Rivery, Airflow, Fivetran, Stitch, etc.).
Strong understanding of data governance, data quality, and testing.
Experience using Git for version control and workflow automation.
Ability to work with large datasets and solve complex data challenges.
Strong problem-solving and critical-thinking abilities.
Excellent collaboration and communication skills for working across teams, along with confident stakeholder management.
Fluency in English.
Nice to have
Familiarity with BI tools (Looker, Tableau, Power BI) for data visualization.
Experience with real-time data processing systems and event-driven architectures.
Expertise with cloud platforms (AWS, Azure, GCP) and services such as S3, Lambda, and Dataflow.
Familiarity with machine learning pipelines and model deployment.
What we offer
Employee stock ownership plan (ESOP)
Hybrid working environment (3 days in the office per week)
Flexible hours
Generous holiday allowance
Access to significant opportunities for learning and development
Private health insurance covering dependants, with employee assistance and mental health support
Complimentary weekly language courses
Enhanced parental leave
Monthly company breakfast and weekly Lunch allowance
'Work from anywhere': 60 days of remote work