GR8 Tech is a leading B2B provider of iGaming solutions that empowers operators to grow, lead, and win. We deliver high-impact, full-cycle tech solutions designed to scale. From seamless integration and expert consulting to long-term operational support, our platform powers millions of active players and drives real business growth. It’s more than just a product — it’s the iGaming Platform for Champions, built for those who play to lead. We know the game and how to take it to the next level. With 1000+ talented professionals on board, we don't just build tech — we build success stories for iGaming operators all over the world. Our ambition drives us, our people make it real. Join us and be part of building champion-level success!
What You’ll Be Driving:
Working with large datasets (100+ TB) that are updated at least hourly;
Developing and supporting ETL/ELT processes across multiple data sources;
Building and maintaining data warehouses and data marts on AWS (S3, Athena, Redshift), GCP (Cloud Storage, BigQuery), and PostgreSQL;
Designing and implementing RESTful APIs (Aiohttp, Flask, FastAPI) for internal and external data consumption;
Collecting and processing data from Kafka, Google Analytics, Firebase, Appsflyer, Cloudflare, and other third-party applications;
Designing and maintaining a centralized data catalog with well-validated and documented data models;
Automating data quality and integrity tests to ensure high data reliability;
Developing and integrating semantic layers to standardize data access across teams;
Creating and maintaining comprehensive project documentation;
Collaborating with cross-functional teams to understand business requirements and translate them into scalable data solutions;
Driving continuous improvement in data engineering processes, tools, and best practices;
Monitoring data pipelines, resolving incidents, and optimizing performance.
What Makes You a GR8 Fit:
2+ years of experience in Python / Data Engineering;
Hands-on experience with ETL, Data Warehousing, and relational databases (PostgreSQL, Microsoft SQL Server, etc.);
Experience with job scheduling and task queues;
Familiarity with cloud providers: AWS (S3, Athena, Redshift), GCP (Cloud Storage, BigQuery), or similar;
Proficiency in Linux and containerization (Docker);
Experience with BDD, TDD, or unit testing frameworks;
Extensive knowledge of software design best practices and design patterns;
Solid Computer Science fundamentals and database theory (database types and their trade-offs);
Experience tuning the performance of ETL jobs and SQL queries, including partitioning and indexing;
Hands-on experience with version control systems (git) and CI/CD pipelines;
Familiarity with web/mobile application data sources is a plus;
Knowledge of or experience with Kubernetes, Apache Airflow, dbt, NoSQL databases (MongoDB, Elasticsearch, Redis), the Kafka ecosystem, IaC tools (Terraform, Ansible), Salesforce platforms, and graph databases (Neo4j, AgensGraph) is a plus;
Experience with near real-time data processing and data visualization tools (Tableau, Power BI, Metabase, Grafana, Kibana, Apache Superset) is a plus;
Strong problem-solving, analytical, and data-driven decision-making skills;
Excellent communication skills, with the ability to collaborate across technical and non-technical teams;
Intermediate English or higher.
🏖️ Benefits:
Benefits Cafeteria with a fixed annual budget for activities, wellness, and learning
Medical insurance and wellness services
Mental health support through therapy or coaching
Home office setup with ergonomic furniture, gadgets, and tools
Language courses for skill improvement
Parental support with paid maternity/paternity leave and monthly childcare allowance
20+ vacation days, unlimited sick leave, and emergency time off
Remote-first setup with full tech support and coworking compensation
Regular team events online, offline, and offsite
Internal courses, career development programs, and growth opportunities