The Alfred Wegener Institute Helmholtz Centre for Polar and Marine Research (AWI) is a member of the Helmholtz Association (HGF) and is funded by the federal and state governments. AWI focuses on polar and marine research in a variety of disciplines such as biology, oceanography, geology, geochemistry and geophysics, thus allowing multidisciplinary approaches to scientific goals.
Background
The AWI Computer Centre (RZ) develops and maintains the IT components of the research activities at the Alfred Wegener Institute. It is a leader in the development of strategies and solutions for the management of large data flows, e.g. in the field of high-resolution monitoring and underway data. Within the Helmholtz Data Federation, we develop modern IT architectures that cover the entire spectrum from data acquisition, data storage, processing and analysis, and modelling and simulation on the latest generation of high-performance computers, to the distribution and visualisation of data and data products in a cloud environment.
Working context
Science and research today generate enormous amounts of data, opening up entirely new perspectives for gaining knowledge and information. Processing and analysing these complex and ever-growing volumes of data is one of the greatest challenges facing the science system as a whole.
The Helmholtz Association of German Research Centres (HGF) connects leading German marine research institutions and aims to strengthen the sustainable use of the oceans and seas through research, infrastructures and knowledge transfer.
In this Google Impact Challenge project, carried out in cooperation with PRU-AWI, a collaborative approach is being taken to tap into artificial intelligence (AI) methods for permafrost research. Rapid changes in the ice sheets and thawing permafrost pose a societal challenge that needs to be quantified and understood. To this end, AI algorithms, analysis functions, data flows and processing chains will be developed and implemented at the computer centre, running on the RZ computing infrastructure and on networked IT infrastructures (Google Earth Engine), in close cooperation with the scientific use cases in the AWI Permafrost Research Section and with the external collaboration partners, Google and other scientific institutions.
In a creative team, we use the latest technology to develop forward-looking research data infrastructures that are essential for future generations of national and international research in the field of Earth and environment.
Tasks
With us, you have the opportunity to contribute significantly to the design and development of future-oriented international infrastructures. In this context, you will take on tasks in all phases of development, from requirements analysis and conception, through design, implementation and testing, to handover to operation, including to external collaboration partners. This includes, among other things:
- Design and development of data workflows and interfaces for the provision of compute, analysis and storage resources for the integration and processing of large scientific data sets, especially data from satellite remote sensing and long-term observation.
- Design and development of data workflows and storage solutions for the automatic acquisition and generation of metadata and data integration into information systems.
- Design and development of software components, interface solutions and data integration tools for the provision of scientific data and for its further processing by the scientific community.
- Conception, development and operation of virtual working and development environments (virtual machines, container solutions, Jupyter notebooks).
- Communication with cooperation and research partners, and implementation of existing workflows in the IT ecosystem of the AWI computer centre and, in part, on the IT infrastructures of the cooperation partners.
Requirements
- Master's degree in remote sensing, geosciences, computer science, a comparable degree programme or a similar relevant qualification.
- Sound experience in mass data processing, storage and archiving (data workflows), as well as in methods and tools for data analysis.
- Sound knowledge of UNIX/Linux and command-line work in HPC, container and VM environments (shell scripting).
- Sound experience in software development, e.g. with Python, C/C++ or Fortran, as well as with tools such as R or Matlab, for algorithms and methods for processing large amounts of data.
- Experience in satellite data processing, geographic information systems and GDAL/GMT programming.
- Creative, independent and problem-solving way of working.