Big Data Engineer (C2C)

Location: Phoenix, AZ / NY / CA (Day One Onsite)

Big Data and GCP experience is a must.

We're seeking a skilled Big Data Engineer with hands-on experience in Google Cloud Platform (GCP) to design, develop, and implement scalable data solutions. The ideal candidate will have expertise in handling large datasets, building data pipelines, and leveraging GCP services.

Key Responsibilities:

1. Design and implement data pipelines using GCP services (e.g., Dataflow, BigQuery, Pub/Sub); see the sketch after this list.

2. Develop and maintain large-scale data architectures.

3. Work with cross-functional teams to integrate data solutions.

4. Ensure data quality, security, and compliance.

5. Troubleshoot and optimize data systems.
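
For illustration, a streaming pipeline of the kind described in item 1 might look like the following minimal Apache Beam (Python) sketch. The project, subscription, and table names are hypothetical placeholders, and the Dataflow runner would be selected via pipeline options at launch.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def run():
    # Streaming mode; pass --runner=DataflowRunner and project flags at launch.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # Read raw messages from a (hypothetical) Pub/Sub subscription.
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            # Decode each message's bytes and parse them as JSON.
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            # Append rows to an existing (hypothetical) BigQuery table.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
        )

if __name__ == "__main__":
    run()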


Requirements:


1. Experience with GCP services (e.g., BigQuery, Dataflow, Pub/Sub).

2. Strong background in big data technologies (e.g., Hadoop, Spark).

3. Proficiency in programming languages (e.g., Java, Python).

4. Experience with data warehousing and ETL processes; a short batch example follows this list.

5. Strong problem-solving and analytical skills.
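
As a complement to item 4, a simple batch ETL job in PySpark might resemble the sketch below; the bucket paths, column names, and app name are assumptions for illustration only.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-etl").getOrCreate()

# Extract: read raw CSV files landed in a (hypothetical) storage bucket.
orders = spark.read.option("header", True).csv("gs://my-bucket/raw/orders/")

# Transform: cast types, drop bad rows, aggregate spend per customer.
daily_spend = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount").isNotNull())
    .groupBy("customer_id")
    .agg(F.sum("amount").alias("total_spend"))
)

# Load: write the curated layer as Parquet for the warehouse.
daily_spend.write.mode("overwrite").parquet("gs://my-bucket/curated/daily_spend/")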


Nice to Have:


1. Experience with data analytics tools (e.g., Data Studio, Tableau).

2. Knowledge of machine learning and AI technologies.

3. Certification in GCP or big data technologies.


To apply for this job, email your details to md.faisal@signinsol.com