Lead the integration and onboarding of data using custom connectors such as FTP, API, and JDBC.
Ensure seamless data ingestion into the data lake using AWS Glue, AppFlow, and Lake Formation.
Testing and Quality Assurance
Write unit and data tests to validate the accuracy and integrity of the onboarding process.
Monitor and improve the quality of data pipelines.

AWS Infrastructure Expertise
Leverage strong knowledge of AWS infrastructure to build and manage data integration solutions.

Requirements

Python and Data Engineering
Proven experience in building data pipelines using Python.
Familiarity with common Python libraries for data manipulation (e.g., pandas, awswrangler).
Solid understanding of AWS infrastructure and tools (AWS certification is a plus).
Experience with Infrastructure as Code (IaC) using AWS CDK.
CI/CD and Version Control
Strong knowledge of Git and CI/CD processes to streamline development and deployment workflows.

API and REST Knowledge
Practical understanding of REST APIs, particularly from the consumer perspective.

What we offer

B2B Contract
Employment based on a B2B contract.

Stable and Dynamic International Firm
Opportunity to work in a stable, dynamically developing international company.

Engaging Projects and Latest IT
Chance to participate in interesting projects and work with the latest information technologies.

Competitive Rates
Attractive remuneration rates offered.

Renowned International Projects
Involvement in the most prestigious international projects.

Multisport and Private Medical Care
Access to Multisport benefits and private healthcare services.

Nice to have

PySpark
Experience with PySpark for large-scale data processing is a valuable advantage.
Familiarity with ML concepts and tools, such as Kubeflow, is an added benefit.