GCP Data Architect - Hadoop (15+ years of experience)
Company: Cyberthink, Inc
Location: Dearborn
Posted on: May 25, 2023
Job Description:
- GCP certification preferred (either GCP Data Engineer or GCP Cloud Architect)
- 15+ years of experience architecting data projects, with knowledge of multiple Hadoop/Hive/Spark/ML implementations
- 5+ years of experience in data modeling and data warehouse/data lake implementation
- Working experience implementing Hadoop-to-GCS and Hive-to-BigQuery migration projects
- Ability to identify and gather requirements, define a solution to be built and operated on GCP, and perform high-level and low-level design for the GCP platform
- Ability to implement and provide GCP operations and deployment guidance and best practices throughout the lifecycle of a project
- Experience with GCP technology areas including Datastore, BigQuery, Cloud Storage, Persistent Disk, IAM, roles, projects, and organizations
- Databases including Bigtable, Cloud SQL, Cloud Spanner, and Memorystore; data analytics services including Dataflow, Dataproc, and Cloud Pub/Sub; Kubernetes, Docker, container management, container autoscaling, and container security
- Experience in the design, deployment, configuration, and integration of application infrastructure resources, including GKE clusters, Anthos, Apigee, and DevOps platforms
- Knowledge of application development concepts and technologies (e.g., CI/CD, Java, Python)