The job "Senior Platform Engineer to evolve our data lake" has expired.

Are you an experienced developer and data enthusiast who’s passionate about developing platform solutions that enable the use of data?

Join us and become a Senior Platform Engineer in Data & Advanced Analytics, where you’ll be a central part of the core team that evolves, develops and maintains the common infrastructure, libraries and tooling of the Ørsted Data Lake, ensuring that the platform facilitates the work of data engineers and data scientists.

Ørsted Data Lake is a modern cloud-based data platform collecting data from all business units and making it available to analysts and data scientists. Ørsted Data Lake is evolving at a fast pace, and as a Senior Platform Engineer, you’ll have a high degree of influence on the direction of this evolution.

The Data & Advanced Analytics team is one of five teams in Architecture, Data & Technology in the Group’s IT department and consists of 25 professionals within the areas of data management, data engineering, data science and platform engineering. Our team is committed to delivering on the Group’s Digital Strategy, which is anchored in the Executive Committee, by making data available, extracting insights and value from data, and applying domain expertise, tools, technology and contemporary analytics.

Your key tasks will be to

  • develop and continuously improve shared Python libraries and other tooling to support data engineers and data scientists
  • ensure that the data lake platform is secure, stable and scalable
  • ensure the best possible balance between cost and performance of the data lake platform
  • continuously improve, monitor, maintain and extend the individual elements of the data lake platform infrastructure, both on-premises and in the cloud (e.g. Apache NiFi installations, Databricks workspaces, workflow orchestration systems, cloud storage, queues and databases)
  • participate in PoCs and evaluate new tools and technologies.

Furthermore, you’ll investigate automated alerts and incidents reported by data engineers and data scientists, and you’ll collaborate with architects, data managers, data engineers and data scientists to set guidelines and standards for the data lake platform.

Your competences include that you

  • are very experienced with software development in Python
  • are very experienced with DevOps, unit testing and CI/CD (e.g. using Azure DevOps)
  • are experienced with SQL, RESTful APIs, common database technologies, and a language such as C# or Java
  • are experienced with version control (e.g. Git) and cloud environments (storage, queues, serverless functions, events, containers)
  • are experienced with monitoring complex platforms (e.g. using Azure Monitor or Application Insights) and with one or more of the following: Spark, Databricks, Apache NiFi, data engineering, ETL, workflow management systems, streaming, configuration tools.

Additionally, you hold an MSc-level degree in a relevant field or possess equivalent experience.

You’re a team player, willing to learn new technologies and methods and happy to drive progress, and you can engage and motivate team members around you while delivering high-quality solutions. You speak and write English fluently and are familiar with IT disciplines like Agile, Scrum and SAFe.

Would you like to help shape the renewable technologies of the future?
Send your application to us as soon as possible and no later than 13 December 2019, as we’ll be conducting interviews on a continuous basis.

Please don’t hesitate to contact Jacob Hedegaard-Blaaberg, Head of Data & Advanced Analytics, by telephone on +45 9955 4081 if you’d like to know more about the position.

You should expect some travelling in relation to your work.

Please write in your application that you've seen the job at Jobfinder.