orbital_OLIVER (MiRAGE)

Objectives of the Product

Significant inefficiencies result from the human-centric approach to satellite operations: latency, short communication windows, and costly downlinks all reduce mission effectiveness.

To address these problems, AIKO has developed orbital_OLIVER, an onboard automation software that augments spacecraft performance and reduces mission operations costs, opening up new opportunities in the use of space. 

orbital_OLIVER analyses data from the satellite and its operational environment to devise and execute a dynamic schedule of tasks. These autonomy-enabled capabilities allow satellites to perceive and react to unexpected events, lowering operating costs and improving service quality.


Customers and their Needs

 The customers of orbital_OLIVER are spacecraft manufacturers and operators. Their needs include: 

  • Optimising the use of in-space and on-ground resources 
  • Increasing the quality of services or products to become more competitive 
  • Reducing operations’ costs 
  • Overcoming bottlenecks from human-in-the-loop approaches 
  • Responding rapidly to unexpected events 
  • Increasing the satellite lifespan. 

Targeted customer/users countries

AIKO targets satellite manufacturers and operators across the world. 


Product description

orbital_OLIVER is software that enables in-space mission autonomy for satellites. 

orbital_OLIVER uses event detection and pattern recognition technologies applied to payload and telemetry data to make decisions independently. 

The software abstracts a simple cognitive architecture from complex space systems, providing satellites with the ability (as sketched after this list): 

  • To sense the environment and its own status (through onboard data processing); 
  • To plan tasks according to acquired or inferred knowledge and update the mission schedule (operations planning); 
  • To execute tasks according to the updated mission schedule (dispatching). 
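
As a purely conceptual illustration of this sense-plan-dispatch cycle (not AIKO's implementation; the Task structure, event names, and decision rules below are invented for the example), a minimal loop could look like this:

```python
# Conceptual sketch of a sense-plan-dispatch autonomy loop.
# Not AIKO's implementation: all names and the simple event logic
# are invented purely to illustrate the pattern described above.
import time
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Task:
    name: str
    start_time: float

def sense(telemetry: Dict[str, float], payload_data: Dict[str, float]) -> List[str]:
    """Onboard data processing: turn raw data into detected events."""
    events = []
    if payload_data.get("cloud_cover", 0.0) > 0.8:
        events.append("target_obscured")
    if telemetry.get("battery_level", 1.0) < 0.2:
        events.append("low_power")
    return events

def plan(schedule: List[Task], events: List[str]) -> List[Task]:
    """Operations planning: update the mission schedule from inferred knowledge."""
    if "target_obscured" in events:
        schedule = [t for t in schedule if t.name != "acquire_image"]
    if "low_power" in events:
        schedule.insert(0, Task("enter_safe_mode", time.time()))
    return schedule

def dispatch(schedule: List[Task]) -> None:
    """Dispatching: execute tasks whose start time has arrived."""
    now = time.time()
    for task in [t for t in schedule if t.start_time <= now]:
        print(f"executing {task.name}")
        schedule.remove(task)

# one iteration of the autonomy loop
schedule = [Task("acquire_image", time.time())]
events = sense({"battery_level": 0.15}, {"cloud_cover": 0.9})
schedule = plan(schedule, events)
dispatch(schedule)
```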

orbital_OLIVER has been successfully tested on x86-64 and ARM computing architectures. Moreover, its processing modules are compatible with a wide range of hardware accelerators (including Intel Myriad, Google Coral, and Nvidia Jetson), resulting in several key advantages: 

  • Reduced inference time on the deep learning model 
  • Reduced workload on the CPU 
  • Optimised power consumption 

orbital_OLIVER White Paper

orbital_OLIVER Brochure


Added Value

orbital_OLIVER enables autonomous satellite operations, overcoming the limitations of human-centric spacecraft operations. The independence from ground control paves the way for benefits such as reduced mission operating costs, increased mission lifespan, and optimised use of resources. orbital_OLIVER will also become the pillar for the logistical scalability of novel constellation architectures, in which hundreds or thousands of satellites will operate collaboratively to reach mission goals. 


Current Status

The InCubed activity officially began in May 2021 and concluded in July 2023. In this timeframe, AIKO improved the technical soundness of orbital_OLIVER by testing it in real operational scenarios and gaining flight hours. 

AIKO ran the Early Adopters Program (EAP) as part of this endeavour. This program granted selected partners early technological access to orbital_OLIVER, ensuring the product's compatibility with potential customers' needs. In the context of the EAP, AIKO acknowledges the support from Tyvak International, the UK branch of D-Orbit, and UNIBAP. 

The culmination of the EAP was a series of in-orbit demonstrations carried out onboard one of D-Orbit's ION Satellite Carrier OTVs. These experiments validated orbital_OLIVER's functionalities in an operational scenario, demonstrating its added value in improving mission efficiency and optimising the usage of onboard resources. 

AIKO leveraged the InCubed development resources: 

  • To complete the product development roadmap and prepare the product for commercial exploitation 
  • To conduct required qualification and testing campaigns 
  • To acquire flight hours and training data in diverse scenarios to optimise the machine learning models 
  • To verify and validate the product on the use cases identified during the EAP. 

AIKO has recently started commercialising orbital_OLIVER, rebranding it from its original name, MiRAGE (Mission Replanning through Autonomous Goal gEneration), and updating the product description. 

Cube4All

Objectives of the Product

Cube4All aims to provide a commercial Earth data service leveraging the advantages of open, standards-based datacube analytics. It makes complex EO tasks simple, keeps simple tasks simple, and unleashes EO analytics and fusion for non-EO/non-IT experts while increasing the productivity of experts.

The resulting service will stand out through:

  • genuine datacube services on several public Copernicus and other data offerings, including the DIASs, CODE-DE, and further large-scale data archives, accessed in a seamless manner.
  • allowing customers to rent these services, rent their own pre-configured datacube service (private or public) on their own data, or any combination of the two.
  • a more user-friendly approach, convenient for IT/EO experts and non-experts alike, from extraction to analytics without any programming.
  • a seamless integration of Copernicus and INSPIRE data.
  • providing configurable access control for data offered by customers.
  • offering a particularly flexible, fair, and attractive billing model.
  • APIs strictly based on the OGC/ISO/INSPIRE standards WCS, WCPS, WMS, and OAPI-Coverages.
  • being operated on completely open-source software.

Customers will be able to access existing archives in a massively simplified way as curated datacubes, without any programming skills. This allows users without IT and EO coding expertise to unleash the potential of the mass of EO data assets.

From a user perspective, the services stand out by simplifying access to petabytes of Sentinel time series, climate variables, DEMs, and thematic products for location-transparent mix-and-match without programming; proven real-time performance; a wide range of prefabricated and continuously extended common tasks; versatile analytics, from standing queries to exploratory ones (any query, any time); fully customisable workflows; and transparent, competitive pricing. Users remain in the comfort zone of well-known clients.


Customers and their Needs
  • Customers who need analytics of EO data in space and time, without having their own raster data.
  • Customers who need analytics of EO data in space and time, mixing owned and external raster data.
  • Customers who need their own raster data to be easily accessible (in-house or externally), but lack the skills or time to maintain a server.
  • Customers who need to apply proprietary algorithms, including proprietary data, to Big EO Data, under demanding and changing customer requirements.

Targeted customer/users’ countries

Worldwide


Product description

Approach: the existing DIAS, CODE-DE, and other public rasdaman datacube services established in research projects (e.g. https://processing.code-de.org/rasdaman), with their existing WMS/WCS/WCPS capabilities, will be made more user-friendly and e-commerce-ready for rasdaman-as-a-service.
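
As an illustration of the WCS/WCPS interfaces mentioned above, the sketch below shows how a client could submit a WCPS query over HTTP. The endpoint URL and coverage name are placeholders rather than Cube4All specifics; the request/query parameter style follows the rasdaman WCS processing interface.

```python
# Minimal sketch of a WCPS request to a rasdaman-style datacube endpoint.
# The endpoint and coverage name are placeholders, not part of the Cube4All offering.
import requests

ENDPOINT = "https://example.org/rasdaman/ows"   # placeholder datacube endpoint

# average a (hypothetical) NDVI datacube over one month for a small bounding box
wcps_query = """
for $c in (NDVI_CUBE)
return avg($c[ansi("2023-06-01":"2023-06-30"),
              Lat(46.0:46.5), Long(7.0:7.5)])
"""

response = requests.get(
    ENDPOINT,
    params={
        "service": "WCS",
        "version": "2.0.1",
        "request": "ProcessCoverages",
        "query": wcps_query,
    },
    timeout=60,
)
response.raise_for_status()
print(response.text)   # scalar result returned by the server
```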

Features to be added include a prototype of a novel, interactive data analytics technique, Query-by-Dialog (QBD), which aims at enabling users to do analytics without coding, without even writing high-level queries.


Added Value

Cube4All makes complex tasks simple while keeping simple tasks simple. It unleashes EO analytics and fusion for non-EO/non-IT experts and increases the productivity of experts.


Current Status

Cube4All has substantially helped the rasdaman SME to shape and extend its business model, and has thereby improved its chances of success in the EO market and beyond. As of August 2024, this activity is now completed.

EO-WIDGET

Objectives of the Product

EO-WIDGET is a modern, fully cloud-based, highly automated offering of web-based information services that uses high-frequency satellite Earth observation (EO) data supplies. CAP Paying Agencies can integrate it into their Integrated Administration and Control System (IACS) set-ups to achieve individual implementations of Checks by Monitoring (CbM) and the Area Monitoring System (AMS), customized to local needs and operated under Service Level Agreements (SLAs).

The EO-WIDGET project carries Sen4CAP prototyping efforts and algorithm developments, based on Copernicus Sentinel EO data, forward into operational service provisioning via provider-managed Application Programming Interfaces (APIs). Through the APIs, a well-defined portfolio of ready-made monitoring data products is made accessible. For the consumers, this makes managing very large EO data repositories and setting up complicated processing chains a thing of the past.
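
Purely as an illustration of this API-centred consumption model (the endpoint, parameters, and product name below are hypothetical and do not describe the actual EO-WIDGET API), a client request for a ready-made monitoring product might look like this:

```python
# Hypothetical sketch of fetching a ready-made monitoring product via an API.
# Endpoint, parameters, and product identifier are invented for illustration.
import requests

API_URL = "https://example.org/eo-widget/api/v1/monitoring-products"  # placeholder

params = {
    "parcel_id": "AT-2024-0001234",     # hypothetical LPIS/GSAA parcel reference
    "product": "mowing_detection",      # hypothetical ready-made monitoring product
    "season": "2024",
}

response = requests.get(API_URL, params=params, timeout=30)
response.raise_for_status()

# the Paying Agency's IACS back end consumes the ready-made product directly,
# instead of hosting EO archives and processing chains itself
for marker in response.json().get("markers", []):
    print(marker)
```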

The monitoring data products are specified according to processing levels and are intended to become de facto standards within the domain, covering the EO-derived input for all relevant use cases included in CAP and environmental monitoring practices.

Furthermore, interactive graphical user interface elements, so-called “widgets”, are offered as open-source software intended to be bundled, customized, and configured by service providers into customer-facing web applications (apps). Expert judgement apps (Parcel Explorer; Quality Assessment Tool) are provided as white-label software templates for enhancing the productivity of value-adders (Paying Agency contractors, value-adding resellers, farm advisors).

Also, a cloud-based data management platform is established as a service for the secure, interoperable exchange and staging of the LPIS and GSAA data required for generating the monitoring data products and for their visualization in the widgets. Additionally, EO data stemming from commercial sources (Planet Fusion products) are managed on the platform and selectively assimilated into the monitoring data processing (optimizing cost to customers), e.g. for coping with the “small parcel issue”.


Customers and their Needs

The EO-WIDGET initiative targets three main Customer Segments clearly differentiated according to their use of the EO-WIDGET System:

  • Customer Segment 1, CAP Paying Agencies (PAs): in the near future, PAs have to phase in information management tasks of unprecedented complexity, involving Earth observation (EO) satellite data volumes multiple orders of magnitude larger than before, to fulfill their reporting obligations (e.g. under the CAP). This clearly positions the PAs as the main Customer Segment for EO-WIDGET services.
  • Customer Segment 2, Other Agencies: a number of governmental organizations other than the PAs have similar and increasing needs for EO services for monitoring and reporting purposes: Environmental Agencies (EAs), public sectoral departments (Water, Forest, Agriculture, Urban Affairs, etc.), and the EU Court of Auditors.
  • Customer Segment 3, Advisors: Farm Advisors assist farmers in improving agronomic performance while reducing fertilizer costs and environmental impact. They also provide support in coping with the increasing digital requirements imposed on farmers by Farm Information Systems (FIS) and PA reporting systems.

The largest impact is expected with European Paying Agencies, given the window of opportunity arising from legislation that requires each Member State to set up a system to continuously monitor all agricultural parcels by 2023.


Targeted customer/users countries

Global


Product description

The consortium operates and offers a set of EO-based monitoring products, compliant with the latest IACS and CAP regulations, together with rich quality assurance support, within an SLA-based IT cloud environment. The products are visualized in a versatile graphical user interface and are customized to match the local requirements of each customer.

A cost-efficient integration of VHR data and sound methods leads to a significant decrease in so-called inconclusive results (marked yellow within the commonly used traffic-light system) compared with previous solutions.

Data as a Service: On-demand Managed Services are provided for:

  • EO satellite data discovery & ingestion
  • pre-processing
  • generation of signal-based monitoring products
  • wall-to-wall, whole season, coverage

Widgets: mini-applications are available for:

  • visualization of monitoring products (expert judgment)
  • quality assessments
  • building of Web Apps
  • re-use (open source software) and customization

Hosting: Protected cloud workspace is offered – individualized per Paying Agency for:

  • deployment of Apps & tools
  • storing of declaration data; configurations; and monitoring products

Small parcel analysis: cost-efficient VHR integration, integrated into the product workflow:

  • based on Planet Fusion data
  • dedicated pricing scheme



Added Value

Currently, only standalone solutions, which are neither scalable nor open to new developments, are available on the market. These solutions exclusively target end users directly. So far, no supplier has focused on the needs of the ICT industry that already serves public end users, nor on the related provision of a ‘user-centred’ service concept. In EO-WIDGET, available services are integrated, keeping the value chain open to new solutions and offering flexibility and scalability. The EO-WIDGET initiative provides, for the first time in the EO industry, a service concept for specialized ICT providers, through which end users can access the benefits of operational EO data services within simple-to-embed widgets.

EO-WIDGET offers the effective implementation, integration, and operation of Checks by Monitoring and the Area Monitoring System as a contractor-operated service to European Paying Agencies and their incumbent contractors.

The EO-WIDGET consortium has addressed all technical challenges and is now ready to serve Paying Agencies throughout Europe with a clearly defined set of tested and validated monitoring products and corresponding quality reports, in an IT cloud setup that matches all required security standards while remaining flexible enough to match local needs.


Current Status

The activity closed in December 2022. It now continues as an operational service, expanding into the market and participating in relevant public tenders.

Deep Property

Objectives of the Product

Re/insurance and risk modelling companies suffer a serious lack of high-quality per-building data. Nowadays, they build their analysis upon aggregated and statistical datasets, which are characterized by low quality, obsolescence, and coarse resolution.
Lack of high-quality property data implies poor risk estimation, which may translate into large, unexpected losses when a disaster happens. Moreover, uncertainties in risk estimation are frequently offset by increasing the policy premiums to final customers, and thus decreasing the insurers’ competitiveness in the market.
The effects of climate change contribute to exacerbating the problem, and statistics have indeed shown an increasing trend in losses at the global scale. The number of events per year ramped up from 249 in 1980 to 820 in 2019. Since 1980, the total losses due to natural disasters have reached USD 5,200 billion.
Deep Property tackles this issue by providing detailed, on-demand, high-quality building data at the global scale. It applies proprietary AI models to geospatial datasets, such as satellite and street-level images.
A SaaS model is offered, leveraging an API-based infrastructure. It provides a fair pay-per-use scheme and, most importantly, a smooth integration into the customer’s environments. The entire service runs on the cloud.


Customers and their Needs

The key customer segments targeted by our product are insurance, reinsurance, and risk modelling companies. All share a high-level need, namely high-quality building-scale property data, while using the purchased data for different applications. 

Insurance companies need property data for three main applications: identification of a fair and competitive price, policy renewal, and claim management. Reinsurance companies need property data to better assess the risk of a large portfolio of buildings. Risk modelling companies use detailed data to improve the quality of their models, and thus increase the value they deliver to their customers. 

Property data include several different pieces of information. Among these, there are: area, roof type, construction year, overhanging trees, greenness index, solar panels, presence of swimming pools, flood barrier, number of floors, material, occupancy, building type, first-floor elevation, basement, maintenance status, etc. The relative importance of each element changes in accordance with the addressed geographical region, e.g. in hurricane-prone areas, information about roof type is highly valuable. 

The main stakeholders involved in the activity will validate the product from both technical and business points of view. More specifically, a set of analyses will be carried out to assess the added value provided by Deep Property. 


Targeted customer/users countries

Deep Property can provide data about individual buildings anywhere, at a global scale. However, the main regions of interest for identified potential customers are European countries and the US. 


Product description

Deep Property is a service able to derive property features by applying proprietary AI-based techniques over several geospatial datasets including satellite data, street-level images, smartphone pictures, etc. 

Deep Property is offered through a pay-per-use SaaS scheme. The data is requested through an API scheme, in which final users submit the latitude and longitude of the property location of interest (or its physical address), together with a list of desired features. Once the request is received, Deep Property retrieves the relevant geospatial datasets, applies the AI-based models to extract the requested pieces of information, and finally delivers the analysis results to the customer. It should be remarked that the API-based scheme automatically updates information in the customer’s database, thus making the integration of Deep Property much smoother. 

This is key to the effectiveness of Deep Property, because the system operates in a transparent manner and the customer does not need to learn new platforms and operations. The guiding idea is to let our customers work as usual but with better data. 
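
A hypothetical client sketch of this pay-per-use API workflow is shown below; the endpoint, authentication scheme, and field names are assumptions made for illustration and are not the actual Deep Property API.

```python
# Hypothetical client sketch of the request/response workflow described above.
# Endpoint, credential handling, and field names are invented for illustration.
import requests

API_URL = "https://api.example.com/v1/property-features"   # placeholder
API_KEY = "YOUR_API_KEY"                                    # placeholder credential

request_body = {
    # location of the building of interest (an address could be sent instead)
    "latitude": 45.0703,
    "longitude": 7.6869,
    # subset of features the customer wants to pay for
    "features": ["footprint_area", "roof_type", "number_of_floors", "solar_panels"],
}

response = requests.post(
    API_URL,
    json=request_body,
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=30,
)
response.raise_for_status()

# the returned attributes can then be written straight into the customer's
# own database, which is what keeps the integration transparent
print(response.json())
```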

An overview of the processing workflow is available in the figure below: 

Finally, a fully cloud-based solution makes it easy to scale up when dealing with large numbers of requests. 


Added Value

The lack of property data is a problem to which many companies worldwide try and offer a solution. Most of them, however, focus mainly on satellite images and drones as their sources of raw data. 

Satellites provide a clear overview of buildings from a nadir observation point, and are thus indeed useful for retrieving some physical features of the buildings (e.g. footprint, roof type, etc.). Yet, satellite images are insufficient to retrieve all risk-related features. 

Drones provide a wide range of vantage points, but their coverage is extremely limited, and new, specific acquisitions are generally required for any new building mapping operation. 

In our case, unlike our competitors, we use satellite images combined with street-level images. Whereas satellite images provide data on footprint, area and roof type, street level images cover several additional exposure-related characteristics. The combination of the two data sources guarantees a far more complete portfolio of data for our customers. Many data providers are acquiring street-level data for road sign mapping and self-driving car applications, thus guaranteeing good coverage in many urban areas as a by-product; such data can be re-used for our service. 


Current Status

The activity officially kicked off on June 30th, 2021, and concluded on April 30th, 2023. 

Through the utilization of proprietary AI models, the service has the capability to analyze geospatial data and provide more than 10 building features in nearly real-time. The service operates entirely on a cloud-based framework and manages requests through an API-based interface. Moreover, new features can be integrated easily using a plug-and-play approach. 

The Deep Property service focuses on various markets where property data is valuable. These markets include insurance, real estate, energy, and ESG rating assessments. 


Resilient Europe 2.0

Objectives of the Product

Mayday.ai is a centralised and artificial intelligence-based platform providing real-time and near real-time disaster and risk information services. We provide early warning and two-way communication services within the same ecosystem by leveraging Satellite Imagery (Geostationary, Polar), Camera Imagery, Audio, as well as Social Media Sentiment analysis.


Customers and their Needs
  • Government agencies/organisations dealing with emergencies, humanitarian and development aid
  • (Re)insurance industry
  • Utility
  • Citizens

Targeted customer/users’ countries

Global


Product description

A platform powered by a data-agnostic AI fusion engine that provides actionable insight for disaster and risk management in real and near real-time.


Added Value
  • Real- and near real-time insights, allowing proactivity and ultra-fast reactions (sometimes hours and days before disasters are reported through traditional means)
  • Centralised disaster management (information is consolidated, standardised and localised, from decision-making to execution levels)
  • Several disaster types and management phases covered
  • Dynamic risk management information services (risk profiles for locations are dynamically adjusted through an influx of data and machine learning)

Current Status

The project has been successfully completed.

HubCAP

Objectives of the Product

HubCAP is an ‘Application Platform’ for the agriculture and environmental sectors. It is based on Copernicus data and the EO toolkit built by Compass Informatics Limited, making it a highly flexible system. It can be accessed directly through the platform or through a comprehensive API for monitoring via direct integration with existing administration systems.

HubCAP was envisaged as an application that would: 

  • Provide information for land use and land cover management in an accessible and repeatable manner.
  • Build upon ESA-generated satellite imagery and applications to lower the cost of analysis and application ownership.
  • Provide reliable and transparent workflows for decision making.
  • Deliver an intuitive application that serves the business expert, rather than relying on the technical specialist.
  • Enable non-technical users to initiate and view land use and land use change analyses in a wide variety of scenarios using satellite imagery.
  • Aid the process of validating financial claims and reviewing land use and land use change more efficiently.

HubCAP demonstrates the ability to combine Compass Informatics’ expertise in EO, location technologies, and GIS to deliver a user-focused application.


Customers and their Needs

HubCAP is a robust and simple platform targeting users from government agencies (CAP and non-CAP Paying Agencies, and Environmental Agencies) and commercial clients. Users can access the benefits of Sentinel data in a fully supported, legally recorded, and compliant manner. These user types are in constant need of custom land monitoring, as Europe has a wide range of land types and therefore different monitoring needs.


Targeted customer/users countries

All European countries


Product description

The HubCAP service consists of two modules, each with a differing start point:

1. EO Module: based upon the results of other activities (Copernicus Data Space Ecosystem, agricultural monitoring needs), with an innovative EO algorithm for grazing and bare-soil detection. The EO Module orchestrates the download and processing of Sentinel-1 and Sentinel-2 imagery and the execution of markers against that data for specific land parcels (an illustrative marker sketch follows the module descriptions). Processing is optimised for data reuse and stability.

2. Bureau Module: based on a validated user concept and actions derived from the EO Module outputs. This includes a viewer to visualise land use and land use change, with several interacting data widgets for visualising results. The Bureau Module also allows users to configure processing runs in an intuitive manner, gives users transparency on the progress of processing, and allows them to configure how results are displayed.
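
As referenced in the EO Module description, the sketch below is a purely illustrative example of what a simple parcel-level marker could look like; it is not the HubCAP algorithm, and the NDVI threshold and window are invented.

```python
# Illustrative sketch only (not the HubCAP algorithm): flag a bare-soil signal
# for a parcel when its NDVI time series stays below a threshold for several
# consecutive observations. Threshold and window are invented for the example.
from datetime import date
from typing import List, Tuple

def bare_soil_marker(
    ndvi_series: List[Tuple[date, float]],
    threshold: float = 0.25,
    min_consecutive: int = 3,
) -> bool:
    """Return True if the parcel shows a sustained bare-soil signal."""
    run = 0
    for _, ndvi in sorted(ndvi_series):
        run = run + 1 if ndvi < threshold else 0
        if run >= min_consecutive:
            return True
    return False

# hypothetical Sentinel-2 derived NDVI observations for one parcel
observations = [
    (date(2024, 3, 1), 0.41), (date(2024, 3, 11), 0.22),
    (date(2024, 3, 21), 0.18), (date(2024, 3, 31), 0.19),
]
print(bare_soil_marker(observations))  # True: three consecutive low-NDVI dates
```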

Overall, HubCAP is a scalable application for commercial engagement with public and private sector clients.


Added Value

The HubCAP service is:

Simple – has an intuitive UI that allows non-expert users to initiate and schedule their assessments, supported by a comprehensive API for monitoring via direct integration with existing administration systems.

Wide – has an advanced dashboard for conducting bespoke ad-hoc local analyses, with the advanced features below:

  • Upload and draw polygon functionality for individual and batch processing
  • Map-centric view showing national RAG status
  • Syncing of map and data lists, for easy interpretation
  • Visualisation of parcel-level signal timeseries and imagettes views in sync
  • User scenario analysis (stored outputs as “Layer”)
  • Configurable business logic to reflect national/EU schemes
  • Administrative access for user management, permissions and configuration
  • API Data services for integration into downstream systems
  • Audit history and metadata for process transparency and end user communication
  • High-performance and scalable processing, with reliable cost models

Supported – Is fully supported and customer focused.

Transaction Certified – each classification ‘transaction’ is fully recorded and certified

Built for a mature ICT organisation


Current Status

The project has been successfully completed.

MultiSpectral Companion Mission IOD (MSCM)

Objectives of the Product
  • Develop and market innovative multispectral data and global variable products.
  • Fully or partially commercialise the ground and user segment.
  • Position Aerospacelab competitively in the satellite data market.
  • Refine offerings to meet the growing industry demand for accuracy, reliability, and regulatory compliance.

Customers and their Needs
  • Environmental Agencies: Seek precise multispectral data for effective environmental monitoring and reporting.
  • Agricultural Enterprises: Demand high-quality data to enhance precision agriculture and optimise yield management.
  • Urban Planners: Rely on trustworthy spatial data to inform urban development, infrastructure planning, and policymaking decisions.

Targeted customer/users’ countries

Three primary user groups will benefit from MSCM data products:

1. European institutions

2. Private commercial companies, including:

  • Value-Added Services providers
  • Major commodity traders/producers and financial services firms

3. Additional Sentinel-2 users, including non-governmental organisations, academic institutions, and research centers.


Product description

In the latest development of Aerospacelab’s MSCM Phase 2, the company is set to offer an extensive range of multispectral imaging (MSI) data and global variables, tailored to accommodate the diverse needs of its clientele. Customers have the flexibility to purchase these data products either in full or in part, ensuring that they can select the options that best align with their specific requirements. This customisable approach enhances accessibility, making it easier for users to integrate MSI data into their projects effectively.


Added Value
  • Customisable Access: Customers have the option to choose between complete or limited access to MSI data and associated services.
  • Comprehensive Data: Integrating MSI data with global variables delivers critical insights for various applications.
  • Intuitive Design: The ground and user segment are crafted for straightforward access and ease of use with the data.
  • Enhanced Competitive Edge: These features enable Aerospacelab to maintain its competitive stance in the satellite data market.

Current Status

The MSCM Phase 2, which commenced in 2022, is projected to be completed by September 2025, with the Test Readiness Review anticipated to occur by May 2025.

Deepview

Objectives of the Product

Deepview is an actionable agricultural commodity supply chain risk service, based on Copernicus and contextual supply chain data, that assists organizations to:

  • Monitor progress towards commitments such as zero deforestation, and make its supply chain more transparent;
  • Act proactively on risk in the supply chain to reduce PR and financial risks;
  • Make better investment and sourcing decisions, towards more sustainable supply chains.

The service developed as part of the activity shows how deforestation risk propagates through the whole value chain, including mapping the complex relationships between producers, traders, and consumer goods manufacturers. With a risk analysis of this granularity, companies can pinpoint the high-risk organizations in their supply chain and engage them in a more targeted way, while also being more transparent about what is happening and where.

The solution can be easily scaled to other commodities. The main focus of Deepview is on deforestation risk in the palm oil supply chain of the larger traders and consumer goods companies.


Deepview dashboard showing deforestation free status of a supply chain (visualizing dummy data)

Customers and their Needs

The targeted users (Consumer Goods Companies and large traders) have the following needs:

  • Need for more timely updates to improve engagement, and mitigate potential PR risks
  • Need for solutions at scale to assess their entire supply chain
  • Need for solutions which are easy-to-use and easy-to-understand
  • Need for prioritized information to better plan investments and prevent information overload
  • Need for transparent commodity supply chains to make better investment decisions and improve engagement

Targeted customer/users countries

Deepview is a service that covers global agricultural commodity supply chains.

During the activity, the main focus was on deforestation risk in the palm oil supply chain of the larger traders and consumer goods companies. Demonstrations were carried out for the palm oil supply chain in Indonesia and Malaysia.

Deepview is now in the commercial phase with global coverage for the palm oil supply chain and will be expanded to a large number of commodities like cocoa and soy.

The coverage of Deepview as of January 2022 is shown below.


Product description

Satelligence developed, and is continuously improving (as market needs expand and evolve), its Smart Forest and Commodity Analytics service to address sustainability-related challenges of organizations in commodity value chains:

  • Where is deforestation happening in my area right now?
  • Which of my investment  & sourcing areas are at risk?
  • What is the performance of my plantations?

With Deepview, Satelligence wants to add more value than only focusing on the risk and performance of the production areas of soft commodities: we model and map how that risk and performance propagates through and affects soft commodity value chains (typically producers, traders, consumer goods manufacturers, and retailers).

The goal is to carry out a supplier/mill/concession supply chain risk analysis for any organization active in the supply chain, based on approaches common in the industry. To get there, three major parts are needed (a simplified sketch of the propagation step follows the list):

  1. Real time deforestation data (based on Copernicus data and science-based algorithms)
  2. Supply chain linkage data model (geographic locations and attributes of supply chain entities like mills, traders, manufacturers, retail and their linkage; to be collected)
  3. A risk propagation model that a) connects deforestation data to the supply chain model and b) calculates how that risk is propagated through the supply chain.
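
The sketch below illustrates the propagation step (part 3) in its simplest form; it is an illustrative example only, not Satelligence's model, and the entities, links, and volume shares are invented.

```python
# Illustrative sketch only (not Satelligence's risk model): propagate a
# deforestation risk score from plantations through mills and traders to a
# consumer goods manufacturer, weighting each link by its sourcing share.
from collections import defaultdict

# supply chain links: (supplier, buyer, share of the buyer's volume from that supplier)
links = [
    ("plantation_A", "mill_1", 0.6), ("plantation_B", "mill_1", 0.4),
    ("plantation_C", "mill_2", 1.0),
    ("mill_1", "trader_X", 0.7), ("mill_2", "trader_X", 0.3),
    ("trader_X", "manufacturer_Y", 1.0),
]

# base risk at production level, e.g. derived from satellite deforestation alerts (0..1)
base_risk = {"plantation_A": 0.9, "plantation_B": 0.1, "plantation_C": 0.0}

def propagate(links, base_risk):
    """Volume-weighted average of supplier risk, computed level by level."""
    risk = dict(base_risk)
    incoming = defaultdict(list)
    for supplier, buyer, share in links:
        incoming[buyer].append((supplier, share))
    # iterate until every buyer has a score (the chain here is acyclic and shallow)
    while any(buyer not in risk for buyer in incoming):
        for buyer, suppliers in incoming.items():
            if buyer not in risk and all(s in risk for s, _ in suppliers):
                risk[buyer] = sum(risk[s] * share for s, share in suppliers)
    return risk

scores = propagate(links, base_risk)
print(scores["manufacturer_Y"])  # 1.0 * (0.7*(0.6*0.9 + 0.4*0.1) + 0.3*0.0) = 0.406
```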

“With  our database of concessions and farms, mills and refineries, and trade linkages (Traceability to Plantation (TTP), exports), we can profile supplier risk and supplier performance to the farm or aggregate level (group, district, cooperative) and tell you if and how a deforestation alert is linked to a supply chain.”


Added Value

Companies want to assess the risks in their supply chain and find out who is responsible. With a mill / supplier / concession risk analysis companies can pinpoint high risk organizations in their supply chain and are able to have more targeted engagements, while also being more transparent on what is happening and where.


Current Status

The following activities have been undertaken as part of the activity:

  • Outreach to target users
  • Service design (UI/UX)
  • Start of development of the web application
  • Feedback sessions on the web application with target users
  • Iterative improvement of the web application
  • Launch of the Deepview MVP
  • Signing of the first Deepview contract
  • Development of marketing materials
  • Investigation and start of implementation of Deepview for cocoa and soy
  • Launch of the Deepview commercial service.

The activity was successfully concluded in December 2021.

FloodSENS

Objectives of the Product

Floods are one of the most devastating natural disasters, accounting for the highest insured and uninsured losses annually, as well as costing many lives. With climate change possibly intensifying the hydrological cycle, the frequency and magnitude of extreme hydro-meteorological events, and therefore the risk of floods, are projected to continue to increase. This will have devastating consequences, putting greater strain on humanitarian response efforts and increasing the future financial risk of the global (re)insurance market.

Earth Observation data-based solutions currently provide a more advanced alternative to traditional ground-based flood monitoring methods or computer models, namely the ability to cover wider areas, frequent revisit times, abundant open access data and long historic image archives. However, there are still important challenges left unaddressed that compromise the quality and reliability of the data, such as the persistent cloud cover during floods, latency issues and the problem of getting abundant high-definition images under less favorable weather conditions and at night.

FloodSENS overcomes these issues by developing a flood mapping application that is capable of integrating a wider range of EO datasets and derivative data from digital elevation models using Machine Learning. This novel application, being developed to market readiness, seeks to efficiently reconstruct flooded areas under partial cloud cover in satellite images, thus enabling far more reliable flood risk assessments and flood mapping during emergencies.


Customers and their Needs

FloodSENS is especially important for disaster response agencies at regional, national, and international level, who are keen to utilize the proliferation of open satellite data for flood mapping during emergencies. Additionally, in the insurance and re-insurance markets, stakeholders are interested in EO data to map the flood hazard of high-impact events and on a historical basis to understand risk exposure and the changing nature of it.

In the case of the flood disaster response markets, customers often have to deal with optical satellite imagery that is partly covered by clouds during flood events, data resolution that is too low (>30 m pixels) to allow for local-scale flood analysis, and a frequent lack of resources to deal with complex EO image analysis. This inevitably compromises humanitarian relief efforts, as it leads to incomplete estimation of flooded areas and thus misrepresentation of the real impact of the flood. The (re)insurance market struggles mainly with assessing the extent of flooding for high-impact events, as well as understanding the potential flood risk exposure at a local scale. This is due to many factors, including those referred to above, along with the high cost of conducting on-site inspections, the use of incomplete EO data archives to build historical records and prediction models, and a frequent lack of EO specialists.


Targeted customer/users countries

FloodSENS is targeting both the humanitarian and disaster relief organizations, as well as the global (re)insurance market, with the aim of having the application work in diverse environments worldwide.

At an initial stage, as representatives of their respective customer markets, FloodSENS will have as partners and testing customers the United Nations World Food Programme (UN WFP), the National Disasters Management Institute of Mozambique (INGC) and Willis Re (re)insurance broker through Willis Towers Watson (WTW).


Product description

FloodSENS consists of a fully automated Machine Learning-based flood mapping algorithm, whose main characteristics include:

  • Ability to map flooding in many different biomes, making global transferability easier to achieve
  • Ability to reconstruct flooding below clouds in optical satellite images of floods

The schematic below illustrates the FloodSENS algorithm structure, the overall architecture, and the key submodules.


Added Value

The added value of FloodSENS centers around two major innovations:

  1. The ability to reconstruct flood area under clouds in optical satellite images. This allows to valorize flood images with high cloud cover and can identify potentially missing flood areas. It also builds a more accurate and reliable historical record of flooded areas.
  2. Add custom map features via agile development with specific customers/users. This can include flood depth mapping capability, explicit map uncertainty representation, customer-led map creation and visualization., so map data are also much easier to interpret by non-experts.

In fact, for both Europe and the wider world, future Earth Observation-based and ML-powered apps would add considerable value to the existing products of the free Copernicus Emergency Management Service (EMS) and beyond. RSS-Hydro’s FloodSENS will place itself at the intersection of these two fields (EO technologies and AI/ML application tools) being at the forefront of future EO-enabled innovative solutions, to make a difference in allowing a much more effective disaster response.

The first half of 2019 was a devastating period for many countries in southeast Africa. After Cyclone Idai destroyed many places at the start of the year, particularly the port city of Beira, Cyclone Kenneth ravaged northern Mozambique. Entire villages were destroyed and almost one million people were at risk in the area. This partially cloud-free subset of a Sentinel-2 image of May 3, 2019 shows large areas under water in Pemba, regional capital of Cabo Delgado province, which experienced more than 2 m of rain and flooding. FloodSENS will render more optical imagery like this usable during floods by reconstructing flooded areas under cloudy skies.

Notable Outcomes

The ML model is based on the well-known U-Net architecture and uses Sentinel-2 (S-2) flood images and derivative layers from digital elevation models, relating to topography and water flow, to map flooding even below partial cloud cover. The algorithm further employs a squeeze-and-excitation network to extract information about the importance of the different input layers. During the project, FloodSENS was trained on a large number of expertly labelled S-2 flood images across different biomes, events, and locations to ensure acceptable transferability, which will become an important part of RSS-Hydro’s FloodSENS IPR. Internal application testing and validation shows varying degrees of performance and accuracy. Overall, on average, FloodSENS performs at least as well as any robust, calibrated traditional band-ratio index (>90% correct prediction), in some cases outperforms them, and even maps below low cloud cover and correctly includes flood-impact areas in dried-out areas by following debris lines.
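
For readers unfamiliar with the squeeze-and-excitation idea mentioned above, the sketch below shows a generic SE block of this kind in PyTorch. It illustrates the general mechanism only; it is not RSS-Hydro's code, and the channel count is arbitrary.

```python
# Illustrative sketch (not RSS-Hydro's code): a generic squeeze-and-excitation
# (SE) block, which reweights input channels (e.g. spectral bands and
# DEM-derived layers) according to their learned importance.
import torch
import torch.nn as nn

class SqueezeExcitation(nn.Module):
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)          # "squeeze": global average per channel
        self.fc = nn.Sequential(                     # "excitation": learn channel weights
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w                                 # rescale each input layer by its weight

# Example: a batch of 8 tiles with 12 input layers (spectral bands + terrain derivatives)
tiles = torch.randn(8, 12, 256, 256)
se = SqueezeExcitation(channels=12)
print(se(tiles).shape)  # torch.Size([8, 12, 256, 256])
```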

For humanitarian stakeholders, it is clear that FloodSENS is an application they would start using, since consistent mapping across different biomes, given good transferability of the model, will lower the number of missed flooded areas. For the financial risk industry, FloodSENS is appealing because the ML approach allows different trained models to be made available for the specific geographic areas they are interested in, such as the US, Europe, Australia, and India, because of the high number of insured assets there. During the project, the FloodSENS app was successfully deployed and demonstrated on the WASDI cloud processing platform. Now that the R&D part is closed, both stakeholder segments will start the customer-led external validation on WASDI, where they will test-run FloodSENS and provide valuable feedback. RSS-Hydro will also organize a customer-oriented workshop with WTW to gather additional feedback to complete the best possible license subscription model.

Post-activity new steps/highlights

RSS-Hydro continues to improve FloodSENS and has some ongoing amazing new collaborations, namely:

  • the award of a Google Gift to help Google validate flooded areas around the world. 
  • we used various geospatial datasets with NVIDIA’s scene-generation platform Omniverse to build realistic 3D visualizations of disaster impact that help increase resilience of vulnerable communities. #thisisnotavideogame

Link: Visualizing FloodSENS and Copernicus data with Omniverse

INFOSEQUIA-4CAST

Objectives of the Product

InfoSequia-4CAST aims to meet the needs of water management authorities and humanitarian-aid agencies by providing actionable, seasonal-scale outlooks of drought-induced crop yield and water supply failures, with the required level of accuracy, reliability, and location-specificity.

Water and food security are at risk in many places around the world, at present and even more so in the future, with significant economic and humanitarian consequences. Risk managers and decision-makers (e.g. water management authorities and humanitarian-aid agencies) can more effectively prevent harmful drought impacts if timely information is available on how the system is affected, and the probability of a system failure.

InfoSequia-4CAST combines historical and up-to-date observations of satellite-based meteorological and agricultural drought indices with climate variability indices, to generate seasonal outlooks of water supply and crop yield failure alerts. These impact-based indicators are computed using a simple, robust and easily understandable statistical forecasting-modelling framework. By making use of multi-sensor, state-of-the art satellite data fully integrated with predictive models, InfoSequia-4CAST provides locally-specific, 3-6 month outlooks and warnings of crop yield and water supply failures to end users through a simple, intuitive user interface.

The product is tailored to the needs of water managers who are looking to alleviate and mitigate impacts of forthcoming drought periods by taking strategic water management decisions, and humanitarian NGOs aiming to trigger ex-ante cash transfers with policyholders and farmer communities.


Customers and their Needs

InfoSequia-4CAST focuses on the two aforementioned customer groups.

Water managers currently face too great a delay in detection of water demand-supply imbalances to trigger strategic actions. Humanitarian-aid agencies and NGOs lack actionable information on crop yield failures at the agricultural district level, which impedes them in determining the cost-effectiveness of cash transfer programmes and activating ex-ante payments. Both customer groups deal with a very weak local specificity and reliability of seasonal climate outlooks included in current Drought Early Warning Systems, which make use of complex dynamical forecasting models. Existing satellite-based drought monitoring systems, on the other hand, are location-specific but do not provide any information on expected conditions. Forecasts on the seasonal scale cannot be provided with sufficient accuracy by current numerical weather models.


Targeted customer/users countries

Global


Product description

The proposed development is incorporated into InfoSequia, an existing toolbox for providing Drought Early Warning Systems.

InfoSequia is a modular and flexible toolbox for the operational assessment of drought patterns and drought severity. Prior to the activity, the InfoSequia toolbox provided a comprehensive picture of historical and current drought status and impacts through its InfoSequia-MONITOR module, based mainly on Earth Observation data. The additional module InfoSequia-4CAST, is a major extension of current InfoSequia capabilities, responding to needs that have been identified in several previous applications.

InfoSequia-4CAST provides the user with timely, future outlooks of drought impacts on crop yield and water supply. These forecasts are provided on the seasonal scale (i.e. 3-6 months ahead). Seasonal outlooks are computed by a novel state-of-the-art Machine Learning technique. This technique has already been tested for applications related to crop production forecasting and agricultural drought risk financing.

The Fast-and-Frugal-Tree (FFT) algorithm uses predictor datasets (a range of climate variability indices alongside other climatic and vegetative indices) to generate FFTs predicting a binary outcome such as crop yields or water supply-demand balance above or below a given threshold (i.e. failure: yes/no). The activity includes collaboration with stakeholders in Spain, Colombia and Mozambique, in order to establish user requirements, inform system design, and achieve pilot implementation of the system in the second project year. Generic machine learning procedures for training the required FFTs are developed, and configured for these pilot areas. An intuitive user interface is developed for disseminating the output information to the end users. In addition to development of the forecasting functionality, InfoSequia-MONITOR is upgraded by integrating state-of-the art ESA satellite data and creating multi-sensor blended drought indices.
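
For illustration, a fast-and-frugal tree can be pictured as an ordered list of cues, each with a threshold and an early exit, ending in a binary failure/no-failure decision. The sketch below is a minimal generic example, not the InfoSequia-4CAST implementation; the cue names and thresholds are invented.

```python
# Minimal generic illustration of a fast-and-frugal tree (FFT): an ordered
# list of cues, each with a threshold and an early exit, ending in a binary
# "failure: yes/no" decision. Not the InfoSequia-4CAST implementation;
# cue names and thresholds below are invented for the example.
from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class Cue:
    name: str
    test: Callable[[dict], bool]     # True if the cue "fires" for this case
    exit_on_fire: Optional[bool]     # decision if the cue fires (None = continue)
    exit_on_miss: Optional[bool]     # decision if the cue does not fire (None = continue)

def fft_predict(case: dict, cues: List[Cue]) -> bool:
    for cue in cues:
        fired = cue.test(case)
        decision = cue.exit_on_fire if fired else cue.exit_on_miss
        if decision is not None:
            return decision
    return False  # fallback if no cue exits (a well-formed FFT always exits)

# Hypothetical seasonal-outlook cues: drought index, reservoir storage, ENSO state
cues = [
    Cue("SPI3 below -1.0",    lambda c: c["spi3"] < -1.0,    exit_on_fire=None, exit_on_miss=False),
    Cue("storage below 40%",  lambda c: c["storage"] < 0.40, exit_on_fire=True, exit_on_miss=None),
    Cue("El Nino conditions", lambda c: c["oni"] > 0.5,      exit_on_fire=True, exit_on_miss=False),
]

print(fft_predict({"spi3": -1.4, "storage": 0.35, "oni": 0.1}, cues))  # True -> failure alert
```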


Added Value

Key areas of innovation concern the integration of the following features:

  • Seasonal outlooks of water supply and crop yield failure-alerts (impact-based forecasting), updated monthly
  • Contextualised, actionable indicators based on the combination of robust multi-sensor drought indices and large patterns of climate variability
  • Higher forecasting accuracy and simplicity using simple and intuitive decision trees
  • User-friendly and interactive web mapping interface and data platform
  • Long-term solution with regular maintenance, technical support, and upgrades.


Current Status

The Activity started in March 2021. A fully operational version of the InfoSequia system was developed and piloted in close collaboration with end users. Validation activities have taken place in two locations, with active and frequent contributions from stakeholders. In Mozambique, in collaboration with the World Food Programme (WFP), monthly bulletins were sent to provide updates on the seasonal outlook regarding the probability of crop yield failures, in support of early response actions. In Spain, in partnership with the Segura River Basin Management Authority (CHS) and the Regional Board of Irrigators of Campo de Cartagena (CRCC), seasonal forecasts of water supply failures were provided, focusing on expected water levels in the most important reservoirs. A new and flexible InfoSequia front-end module was developed to disseminate system outputs to a wide range of end users. The Final Review meeting of the InCubed Activity took place on March 7, 2024. Upon closure of the InCubed Activity, FutureWater is focusing on implementing the InfoSequia commercial rollout strategy with key stakeholders across the globe.