Position : GCP Data Engineer
Experience : 7+ Years
Job Description :
- Overall, 5 to 8 years of data engineering experience.
- Participate in Requirements Gathering: work with key business partner groups and other Data Engineering personnel to understand business-unit-wise data requirements for the analytics platform.
- Design Data Pipelines: work with other Data Engineering personnel on an overall design for flowing data from various internal and external sources into the analytics platform.
- Build Data Pipelines from scratch with Python and SQL: leverage the standard toolset and develop ETL/ELT code to move data from various internal and external sources into the analytics platform (a minimal sketch follows this list).
- Develop data migration, conversion, cleansing, and retrieval tools and processes (ETL).
- Experience developing metadata-driven data pipelines.
- Minimum of 3 years' expertise with Google Cloud Storage, Data Fusion, Dataflow, Pub/Sub, BigQuery, Bigtable, Cloud Functions, Cloud Composer, Airflow, App Engine, Cloud Dataprep, Cloud Spanner, and Cloud Dataproc.
- Experience building large-scale batch and streaming data pipelines in enterprise data warehouse and data lake environments.
- Well versed in processing data file formats: Avro, Parquet, JSON, and ORC.
- Strong command of SQL, including advanced SQL.
- Exposure to Looker & Data Studio.
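To give a concrete sense of the metadata-driven, GCP-centred ETL work described above, here is a minimal Python sketch that loads Parquet and Avro files from Cloud Storage into BigQuery and then runs a SQL transform. It uses only the google-cloud-bigquery client library; every project, bucket, dataset, and table name is a hypothetical placeholder, and the hard-coded metadata list stands in for whatever configuration store a real pipeline would use.

```python
# Minimal metadata-driven ETL sketch: load files from Cloud Storage into
# BigQuery, then run a SQL transform. All names below are placeholders.
from google.cloud import bigquery

# Hypothetical pipeline metadata; in practice this would come from a
# configuration table or file rather than being hard-coded.
PIPELINES = [
    {
        "source_uri": "gs://example-bucket/sales/*.parquet",  # placeholder bucket
        "target_table": "example-project.raw.sales",          # placeholder table
        "source_format": bigquery.SourceFormat.PARQUET,
    },
    {
        "source_uri": "gs://example-bucket/customers/*.avro",
        "target_table": "example-project.raw.customers",
        "source_format": bigquery.SourceFormat.AVRO,
    },
]

# ELT transform run inside BigQuery; the window function is a small
# example of the "advanced SQL" the role calls for.
TRANSFORM_SQL = """
CREATE OR REPLACE TABLE `example-project.analytics.daily_sales` AS
SELECT
  customer_id,
  DATE(order_ts) AS order_date,
  SUM(amount)    AS total_amount,
  RANK() OVER (PARTITION BY DATE(order_ts)
               ORDER BY SUM(amount) DESC) AS daily_rank
FROM `example-project.raw.sales`
GROUP BY customer_id, DATE(order_ts)
"""

def run_pipelines(client: bigquery.Client) -> None:
    """Load each configured source, then apply the SQL transform."""
    for spec in PIPELINES:
        job_config = bigquery.LoadJobConfig(
            source_format=spec["source_format"],
            write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
        )
        load_job = client.load_table_from_uri(
            spec["source_uri"], spec["target_table"], job_config=job_config
        )
        load_job.result()  # block until the load completes
        print(f"Loaded {spec['target_table']}")

    client.query(TRANSFORM_SQL).result()  # run the transform in BigQuery

if __name__ == "__main__":
    run_pipelines(bigquery.Client(project="example-project"))
```

In production, each load and the transform would typically run as separate Airflow tasks under Cloud Composer rather than as a single script, so failures can be retried per source.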
Rounds : 3 Technical Rounds followed by an HR Round.
(ref:hirist.com)