Data Engineer
What you'll do
The Manufacturing & Supply Chain hub team looks after the data and analytics needs of business areas across M&SC, largely through engagement with teams embedded within business functions. Within the hub, the D&A Delivery team is responsible for delivering data and analytical products from a prioritised backlog of use-cases, then handing them over to our business units to manage.
- Design and construct high-performance robust data pipelines.
- Develop and maintain scalable data processing systems.
- Ensure data quality and integrity in pipelines and databases.
- Optimize data retrieval and develop dashboards for data visualization.
- Implement data security and privacy policies.
- Collaborate with business stakeholders, transformation leads, data governance leads, and architects to gather requirements and deliver analytical solutions.
- Be a vocal source of positivity within D&A, supporting and encouraging the strategy and transformation.
- Proactively communicate the D&A vision and mission through informal channels.
- Embody the Volvo values and D&A cultural principles.
- Stay current with industry trends and technologies.
What you'll bring
- Bachelor's or Master's degree in information management, data science, computer science, or equivalent experience. 2-5 years of hands-on experience in data engineering.
- Proficiency in SQL, including query authoring and relational databases, as well as working familiarity with a variety of database technologies.
- Ability to articulate data concepts to stakeholders at all levels and foster a culture of data-driven decision-making across the organization.
- Experience with object-oriented or functional programming languages: Python, Java, C++, Scala, etc.
- Knowledge of data ingestion methods (CDC, API, streaming) and formats (JSON, Parquet, Iceberg).
- Experience with version control systems such as Git.
- Strong analytical skills and problem-solving aptitude.
- Excellent communication skills in English and Mandarin.
- Strong teamwork skills, ownership mentality, and a can-do attitude.
- Experience with dbt, data processing pipelines, and data warehousing is beneficial.
- Experience working in agile teams (Scrum/SAFe); knowledge of data mesh concepts.
- Experience with knowledge graphs and semantic modelling.
- Experience with big data tools (Hadoop, Spark, Kafka, etc.) and data pipeline and workflow management tools.
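As an illustration of the pipeline and data-quality work described above, here is a minimal sketch in Python using an in-memory SQLite database. All record, table, and column names (orders, order_id, plant, qty) are hypothetical, chosen only to show the pattern of loading records while enforcing basic quality rules:

```python
import sqlite3

# Hypothetical raw records, e.g. as received from an API ingestion step.
raw_records = [
    {"order_id": 1, "plant": "Chengdu", "qty": 40},
    {"order_id": 2, "plant": "Gent", "qty": None},   # violates non-null rule
    {"order_id": 1, "plant": "Chengdu", "qty": 40},  # duplicate order_id
]

def run_pipeline(records, conn):
    """Load records into the orders table, enforcing two quality rules:
    qty must be present, and order_id must be unique."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders ("
        "order_id INTEGER PRIMARY KEY, plant TEXT, qty INTEGER NOT NULL)"
    )
    loaded, rejected = 0, 0
    for rec in records:
        if rec["qty"] is None:
            rejected += 1          # quality rule: qty must be non-null
            continue
        try:
            conn.execute(
                "INSERT INTO orders (order_id, plant, qty) VALUES (?, ?, ?)",
                (rec["order_id"], rec["plant"], rec["qty"]),
            )
            loaded += 1
        except sqlite3.IntegrityError:
            rejected += 1          # quality rule: order_id must be unique
    conn.commit()
    return loaded, rejected

conn = sqlite3.connect(":memory:")
loaded, rejected = run_pipeline(raw_records, conn)
print(loaded, rejected)  # 1 2
```

In production this pattern would typically run against a warehouse via an orchestrated workflow, with rejected rows routed to a quarantine table rather than silently dropped.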
Chengdu, CN, 610105