


Internal Research Fellow (PostDoc) in xAI and Decision Intelligence for EO Resilience

Position
Internal Research Fellow (PostDoc) in xAI and Decision Intelligence for EO Resilience

Employer
European Space Agency

Homepage: https://www.esa.int/


Location
Italy

Sector
Government

Relevant division
Earth and Space Science Informatics (ESSI)

Type
Full time

Level
Entry level

Salary
Open

Preferred education
PhD

Application deadline
4 March 2026

Posted
12 February 2026

Job description

The European Space Agency is seeking a Research Fellow to advance innovative resilience and security applications that leverage the combined strengths of Earth observation (EO) and explainable artificial intelligence (xAI) for decision intelligence. In an era of rapidly evolving geopolitical dynamics and increasing complexity in the security landscape, the ability to transform EO data into reliable, transparent and actionable intelligence is critical. You will explore novel approaches for exploiting multi-sensor satellite data — such as optical, SAR, thermal, and hyperspectral — to support early warning, strategic situational awareness and the monitoring of emerging risks. A key focus of the fellowship will be the development of explainable AI architectures capable not only of detecting and characterising threats, anomalies and changes on Earth, but also of presenting their outputs in ways that are interpretable and trustworthy for security analysts and mission operators.


Working within ESA’s multidisciplinary research environment, you will investigate how explainability can enhance user confidence, operational uptake and the safe integration of AI-driven systems in security-oriented workflows. This includes designing methodologies that ensure traceability, robustness and bias mitigation in AI models, as well as creating prototype tools that demonstrate how transparent machine reasoning can support evidence-based decision-making. The role offers a unique opportunity to contribute to Europe’s technological sovereignty by shaping next-generation EO+xAI capabilities for resilience and by engaging with key stakeholders across the European space and security ecosystem.

Main research topics:

  • EO-driven threat detection and situational awareness: develop innovative methods to exploit multi-sensor EO data (e.g. optical, SAR, thermal or hyperspectral) for detecting anomalies, emerging risks and defence-relevant activities. Emphasis will be placed on transparent, interpretable models that provide evidence-based reasoning for mission-critical decisions.
  • Explainable AI for mission-critical decision support: design interpretable machine learning architectures capable of offering clear, auditable explanations of assessments and predictions, ensuring analysts and operators understand model outputs in time-sensitive security scenarios.
  • Real-time intelligence fusion and rapid response: investigate methods to fuse EO data with auxiliary information — including environmental simulations and open-source intelligence (OSINT) sources — to generate timely, reliable intelligence for crisis monitoring and threat evolution analysis.
  • Resilience and responsible use: develop techniques to embed safeguards into EO+xAI and decision intelligence systems, addressing bias, uncertainty quantification and operational risks, and ensuring outputs meet ethical and security requirements within European contexts.
  • Prototype development and strategic integration: build demonstrators, validation pipelines and simulation testbeds to evaluate the performance, explainability and physical consistency of EO+xAI+decision intelligence systems, contributing to ESA’s strategic roadmap for secure, resilient and trustworthy space-based intelligence.