Key responsibilities:
Architect and implement scalable, reliable ETL data pipelines to extract, transform, and load data from various sources, including Extensiv 3PL Central and other logistics platforms
Develop and maintain platform integrations to enable seamless data synchronization and process automation across our logistics ecosystem
Design and optimize data models and data transformation processes to ensure data integrity, consistency, and performance
Build and maintain a comprehensive business intelligence platform, including dashboards, reports, and data visualizations, to support data-driven decision-making
Collaborate with cross-functional teams, including operations, finance, and management, to align data solutions with business requirements and provide valuable insights
Monitor and troubleshoot data pipeline and platform performance to ensure high availability, reliability, and data accuracy
Continuously explore and implement improvements to our data infrastructure, leveraging industry best practices and emerging technologies
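As a minimal sketch of the extract-transform-load pattern the responsibilities above describe: the extract step would in practice call a logistics API such as 3PL Central, and the load target would be a warehouse such as Snowflake or Redshift; here an in-memory sample and SQLite stand in for both, and all names are illustrative.

```python
import sqlite3

# Extract: in production this would page through a logistics API;
# a small hard-coded sample of order records stands in here.
def extract():
    return [
        {"order_id": "A-100", "qty": "3", "warehouse": "zg"},
        {"order_id": "A-101", "qty": "5", "warehouse": "st"},
    ]

# Transform: normalize types and codes so downstream BI reports
# see consistent integers and upper-case warehouse codes.
def transform(rows):
    return [
        (r["order_id"], int(r["qty"]), r["warehouse"].upper())
        for r in rows
    ]

# Load: idempotent upsert keyed on order_id, so re-running the
# pipeline does not duplicate rows (SQLite stands in for the warehouse).
def load(rows, conn):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id TEXT PRIMARY KEY, qty INTEGER, warehouse TEXT)"
    )
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
print(conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0])  # prints 2
```

The upsert-by-primary-key load step is one common way to keep such a pipeline safe to re-run after a partial failure, which matters for the availability and data-accuracy monitoring mentioned above.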
Required skills and experience:
5+ years of experience in data engineering, with a strong focus on ETL data pipelines, platform integration, and business intelligence in the logistics or supply chain domain
Expert-level proficiency in data modeling, data transformation, and SQL programming
Deep understanding of data warehousing concepts, data lake architectures, and ETL best practices
Hands-on experience with leading ETL tools (e.g., Talend, Informatica, AWS Glue) and data warehousing platforms (e.g., Snowflake, Redshift, BigQuery)
Proficiency in building and maintaining business intelligence solutions using tools such as Tableau, Power BI, or Looker
Strong problem-solving and debugging skills to ensure the stability and reliability of data pipelines and platforms
Excellent communication and collaboration abilities to work effectively with various stakeholders and translate business requirements into technical solutions
This is a critical role that requires a deep understanding of logistics data, ETL processes, platform integration, and business intelligence. The ideal candidate is comfortable working with large datasets, has a keen eye for detail, and is driven to deliver high-quality solutions that improve operational efficiency and support data-driven decision-making in our logistics business.
Hourly Range: $50.00-$100.00
Posted On: April 29, 2024 18:24 UTC
Category: Full Stack Development
Skills: Django, Flask, API, API Integration, Python, Data Engineering, Data Science, Business Intelligence
Country: Croatia