
Job Description

Overview:
Overview:
- Create and maintain optimal data pipeline architecture on GCP.
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and redesigning infrastructure for greater scalability.
- Build and maintain the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and GCP data warehousing technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
- Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across national boundaries through data centers and GCP regions.
- Work with data and analytics experts to strive for greater functionality in the data systems.
Qualifications:
- Advanced working SQL knowledge and experience working with relational databases, including query authoring (SQL) and working familiarity with a variety of databases.
- Experience building and optimizing data warehousing pipelines (ELT and ETL), architectures, and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Ability to build processes supporting data transformation, data structures, metadata, dependency, and workload management.
- A successful history of manipulating, processing, and extracting value from large, disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable "big data" stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.

Technical Skillset:
- Experience with data warehouse tools: GCP BigQuery, GCP BigData, Dataflow, Teradata, etc.
- Experience with relational SQL and NoSQL databases, including PostgreSQL and MongoDB.
- Experience with data pipeline and workflow management tools: Data Build Tool (DBT), Airflow, Google Cloud Composer, Google Cloud Pub/Sub, etc.
- Experience with GCP cloud services: primarily BigQuery, Kubernetes, Cloud Functions, Cloud Composer, Pub/Sub, etc.
- Experience with object-oriented/object-function scripting languages: Python, Java, Terraform, etc.
- Experience with CI/CD pipeline and workflow management tools: GitHub Enterprise, Cloud Build, Codefresh, etc.
- Experience with data analytics and visualization tools: Tableau BI Tool (on-prem and SaaS), Data Analytics Workbench (DAW), Visual Data Studio, etc.
- GCP Data Engineer certification is mandatory.

Deadline: 13-07-2024


