Experience - 5 to 7 years
Skills required –
- Big Data, Airflow, AWS Data Engineering services
- Proficiency with the AWS cloud platform, as well as Python and Spark
Job Description -
- Design and implement data pipelines focused on efficient data ingestion from multiple data sources.
- Work with the AWS stack to manage and integrate data into data lakes.
- Ensure the robustness and efficiency of solutions in large-scale environments.
- Work closely with our field teams to unblock customers and partners during onboarding as a technical subject matter expert.
- Deploy and maintain the connectors on the Metadata Marketplace.