Data Engineer (GCP) Michigan
Company: Stefanini
Location: Dearborn
Posted on: September 15, 2023
Job Description:
Stefanini Group is hiring!
Stefanini is looking for a Data Engineer (GCP), Location: Remote
To apply quickly, please reach out to Jayki Gupta at 248-728-2636 /
Email: jaykiprasad.gupta@stefanini.com
Open to W2 candidates only!
Project Description:
We're seeking an experienced GCP Data Engineer who can build a cloud
analytics platform to meet ever-expanding business requirements
with speed and quality using lean Agile practices.
You will analyze and manipulate large datasets supporting the
enterprise, activating data assets to support Enabling Platforms
and Analytics in the Google Cloud Platform (GCP).
You will be responsible for designing the transformation and
modernization on GCP, as well as landing data from source
applications into GCP. Experience with large-scale solutions and
operationalization of data warehouses, data lakes, and analytics
platforms on Google Cloud Platform or another cloud environment is
a must.
We are looking for candidates who have a broad set of technology
skills across these areas and who can demonstrate an ability to
design the right solutions with an appropriate combination of GCP
and third-party technologies for deployment on Google Cloud Platform.
Work in a collaborative environment, including pairing and mobbing
with other cross-functional engineers.
Work on a small agile team to deliver working, tested software.
Work effectively with fellow data engineers, product owners, data
champions and other technical experts.
Demonstrate technical knowledge/leadership skills and advocate for
technical excellence.
Develop exceptional analytics data products using streaming and
batch ingestion patterns in the Google Cloud Platform with solid
data warehouse principles.
Be the Subject Matter Expert in Data Engineering and GCP tool
technologies.
Required:
Bachelor's degree in computer science, IT, or a related scientific
field.
In-depth understanding of Google Cloud's product technology (or
another cloud platform) and underlying architectures.
5+ years of analytics application development experience
required.
5+ years of SQL development experience
3+ years of cloud experience (GCP preferred) with solutions designed
and implemented at production scale.
Experience working in GCP-based big data deployments
(batch/real-time) leveraging Terraform, BigQuery, Bigtable,
Google Cloud Storage, Pub/Sub, Data Fusion, Dataflow, Dataproc,
Cloud Build, Airflow, Cloud Composer, etc.
2+ years of professional development experience in Java or Python,
and Apache Beam.
Experience developing with microservice architectures on a
container orchestration framework.
Experience extracting, loading, transforming, cleaning, and
validating data.
Experience designing pipelines and architectures for data processing.
1+ year of designing and building Tekton pipelines.
Experience working on an implementation team from concept to
operations, providing deep technical subject matter expertise for
successful deployment.
Experience implementing methods to automate all parts of the
pipeline to minimize labor in development and production.
Experience analyzing complex data, organizing raw data, and
integrating massive datasets from multiple data sources to build
subject areas and reusable data products.
Experience working with architects to evaluate and productionize
appropriate GCP tools for data ingestion, integration, presentation,
and reporting.
Experience working with all stakeholders to formulate business
problems as technical data requirements, and to identify and
implement technical solutions while ensuring key business drivers
are captured in collaboration with product management.
Proficiency in machine learning model architecture, data pipeline
interaction, and metrics interpretation, including designing and
deploying a pipeline with automated data lineage.
Identify, develop, evaluate, and summarize proofs of concept to
prove out solutions.
Test and compare competing solutions and report a point of view on
the best solution.
Experience with integration between GCP Data Catalog and
Informatica EDC.
Design and build production data engineering solutions to deliver
pipeline patterns using Google Cloud Platform (GCP) services:
BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc,
Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and
App Engine.
Preferred:
GCP Professional Data Engineer certification
Master's degree in computer science or related field
2+ years mentoring engineers.
In-depth software engineering knowledge.
Strong drive for results and ability to multitask and work
independently.
Self-starter with proven innovation skills.
Ability to communicate and work with cross-functional teams and all
levels of management.
Demonstrated commitment to quality and project timing.
Demonstrated ability to document complex systems.
Experience in creating and executing detailed test plans.
Experience building machine learning solutions using TensorFlow,
BigQuery ML, AutoML, and Vertex AI.
Experience building solution architectures, provisioning
infrastructure, and delivering secure and reliable data-centric
services and applications in GCP.
Experience with Dataplex or Informatica EDC.
Experience with development ecosystems such as Git, Jenkins, and
CI/CD.
Exceptional problem solving and communication skills.
Experience working with dbt/Dataform.
Experience working with Agile and Lean methodologies.
Performance tuning experience
***Listed salary ranges may vary based on experience,
qualifications, and local market. Also, some positions may include
bonuses or other incentives***
Stefanini takes pride in hiring top talent and developing
relationships with our future employees. Our talent acquisition
teams will never make an offer of employment without having a phone
conversation with you. Those conversations will involve a
description of the job for which you have applied. We will also
speak with you about the process, including interviews and job
offers.
About Stefanini Group
The Stefanini Group is a global provider of offshore, onshore, and
nearshore outsourcing, IT digital consulting, systems integration,
application, and strategic staffing services to Fortune 1000
enterprises around the world. We have a presence across the
Americas, Europe, Africa, and Asia, and serve more than four hundred
clients across a broad spectrum of markets, including financial
services, manufacturing, telecommunications, chemical services,
technology, the public sector, and utilities. Stefanini is a CMM
Level 5 IT consulting company with a global presence.
Click here to apply!