Published: 24 September 2020
Expires: Never
Location: Sweden
Job Type:
City: Södertälje

Description

Our Data Platform is continuously being developed. It is currently implemented on premise and is on its way to a hybrid/inter-cloud implementation. The team needs to grow with another Data Engineer colleague, well versed in DevOps thinking, to help with this mission and to support our stakeholder!

Your mission and personal development

You will be the one making sure the Data Platform is utilised to its full potential in the work we do for our stakeholder. You will provide data for different analytical and operational purposes, e.g. traditional analytics (DW/BI), advanced analytics (near real-time solutions, ML) and application integration (web services, APIs).

You will also develop automated data pipelines covering data ingestion, data integration and security, and you will handle ad hoc manual data wrangling to support specific needs.
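To give a flavour of the pipeline work described above, here is a minimal illustrative sketch of a batch ingestion step written in PySpark (Python and Spark are both listed among the skills below). The dataset, file paths and column names are made up for illustration only; they are not part of the role description.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Start a Spark session for the (hypothetical) ingestion job.
spark = SparkSession.builder.appName("ingest_orders").getOrCreate()

# Read raw CSV files from a landing zone (hypothetical path).
raw = spark.read.option("header", True).csv("/data/landing/orders/")

# Basic cleansing: de-duplicate on the key and derive typed columns.
curated = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
)

# Write the curated data as partitioned Parquet (hypothetical path).
curated.write.mode("overwrite").partitionBy("order_date").parquet("/data/curated/orders/")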

If you want to develop your leadership skills, this role could be even more interesting, since we are considering establishing a small offshore team that you would lead!
The team(s) you will be a part of

You will have the exciting opportunity to be a member of two teams! One is the Data Platform group within Scania IT and the other is a project team at one of our internal stakeholders. We share successes and failures within the team for continuous improvement of our skills.

We are using the Cloudera platform to handle the hybrid scenario with one coherent Data Platform (on premise, AWS, Azure).
Your skills

You are a developer and data wrangler who thinks ahead and, once a working pipeline is in place, makes sure it is future-proof, reusable and scalable. Since we work agile, our priorities may change rapidly, and if you know how to shift focus in these situations you will do just fine with us! You like to improve ways of working and to help us set best practices both for our work in the Data Platform and for how we handle business challenges.

You also have the following experience:

At least 5 years of experience with languages such as Python, R, SQL, Spark or Scala.
Building, optimizing, deploying, maintaining and operating data pipelines and data sets in a DevOps/CI/CD environment.
The Hadoop ecosystem, e.g. Hive, HBase, Impala, HDFS, Kafka, etc.
Advanced working SQL knowledge and experience working with relational databases.
Working with cloud solutions/architecture.
Agile working methods.
BSc or MSc in Computer Science or related field (or equivalent experience).
Ask us for more information

Rolf Nordin (Group Manager) +46 855-35 10 67 or Johanna Ingels (Recruitment Specialist) +4670 081 29 90
Send us your application

Attach your CV and a cover letter motivating your interest in this position. We handle selection and interviews during the application period and may take the advertisement down before the last application date, which is October 23.
Keywords: Data Pipeline Developer
