Peregrine.ai in Gaia-X 4 Advanced Mobility Services: Building Edge Intelligence for a Sovereign Mobility Ecosystem

Peregrine.ai & Gaia-X 4 Advanced Mobility Services (AMS)


From 2021 to 2025, Peregrine.ai took part in Gaia-X 4 Advanced Mobility Services (AMS), a European research programme within the Gaia-X 4 Future Mobility family, funded by the German Federal Ministry for Economic Affairs and Climate Action (BMWK).

The goal of Gaia-X 4 AMS was to develop the foundations of an open, federated data ecosystem for mobility — one that allows vehicles, infrastructure, and service providers to exchange information securely and under full data sovereignty.


Peregrine led Sub-project 4: Safe Coordination of Autonomous Vehicles, focusing on the visual-intelligence and edge-processing layer that links real-world sensor data to the Gaia-X network.


Engineering challenge


At the start of the project no European solution existed that could combine edge-level AI inference, on-device anonymisation, and standardised interfaces for data-space integration.

Our task was to build that capability from the ground up: designing hardware that could process video in real time, creating algorithms that would run locally instead of in the cloud, and defining data structures that could interoperate with Gaia-X standards.


Hardware development


To meet these needs we designed Peregrine One, our own edge camera platform built around a Qualcomm SoC.

The unit integrates an RGB sensor, IMU, GPS, modem, and local storage in a compact enclosure capable of sustained inference at the edge. Every stage — from mechanical design to firmware tuning — was tested in real conditions for thermal stability, vibration resistance, and data integrity.


The Peregrine One platform became both a proof of concept and a reference design for future deployments of embedded visual AI in fleets and infrastructure. It demonstrated that high-performance, privacy-compliant vision systems can be built entirely within Europe’s supply and regulatory environment.


Algorithm research and optimisation


In parallel, the Labs team re-engineered Peregrine’s computer-vision models to run efficiently on constrained hardware.

We adapted modern convolutional architectures such as MobileNet and CenterNet, applied quantisation and pruning to reduce compute load, and carried out systematic tests of inference speed, power draw, and stability.

All processing happens on the device itself, ensuring real-time performance and GDPR compliance without reliance on external cloud resources.
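The two compression techniques mentioned above, magnitude pruning and int8 quantisation, can be sketched in a few lines. This is an illustrative toy example on a small weight matrix, not Peregrine’s actual pipeline; a real deployment would use a framework toolchain such as PyTorch or TFLite.

```python
def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction of weights (magnitude pruning)."""
    flat = sorted(abs(w) for row in weights for w in row)
    k = int(len(flat) * sparsity)
    threshold = flat[k - 1] if k > 0 else -1.0
    return [[0.0 if abs(w) <= threshold else w for w in row] for row in weights]

def quantize_int8(weights):
    """Symmetric per-tensor int8 quantisation: w is approximated by scale * q."""
    max_abs = max(abs(w) for row in weights for w in row) or 1.0
    scale = max_abs / 127.0
    q = [[round(w / scale) for w in row] for row in weights]
    return q, scale

# Toy 2x3 weight matrix standing in for one layer of a vision model.
weights = [[0.80, -0.05, 0.40], [0.01, -0.60, 0.10]]
pruned = magnitude_prune(weights, sparsity=0.5)   # drop the smallest ~50%
q, scale = quantize_int8(pruned)                  # int8 values + one float scale
```

Pruning reduces the number of multiply-accumulates the device has to perform, while quantisation shrinks both model size and memory bandwidth, which is usually the bottleneck for sustained inference on an embedded SoC.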


These experiments produced a portable perception stack capable of detecting and classifying road damage, traffic signs, and environmental context directly at the edge.


Data modelling and integration


Autonomous systems need a shared language for describing the environments in which they can safely operate — the Operational Design Domain (ODD).

Peregrine developed an ODD-compatible data structure that connects sensor output to real-world operational data (OD).

The model covers object categories, location coordinates, timestamps, and condition metadata, making road and signage information machine-readable and ready for automated routing or mapping.
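A record in such a model might look like the following sketch. The field names and values here are hypothetical illustrations of the categories the text describes (object category, coordinates, timestamp, condition metadata), not the project’s actual schema.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class Observation:
    category: str    # e.g. "road_damage", "traffic_sign" (illustrative labels)
    lat: float       # WGS 84 latitude
    lon: float       # WGS 84 longitude
    timestamp: str   # ISO 8601, UTC
    condition: dict  # free-form condition metadata

obs = Observation(
    category="road_damage",
    lat=52.5200, lon=13.4050,
    timestamp="2024-05-14T09:30:00Z",
    condition={"severity": "medium", "surface": "asphalt"},
)
# Serialised as JSON, the record is machine-readable and can feed
# automated routing or mapping pipelines downstream.
payload = json.dumps(asdict(obs))
```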


Data was collected in multiple German cities, including Berlin, Hamburg, Frankfurt, and Munich, through partnerships with municipal and fleet operators such as HVV.

All datasets were formatted for use in Gaia-X-compliant environments including Pontus-X and the Mobility Data Space, where they can be discovered and reused through federated connectors.


Collaboration and ecosystem work


As lead of Sub-project 4, Peregrine coordinated the interfaces between partners including Fraunhofer IVI, Consider IT, OECON, DLR, Bernard Group, and DeltaDAO.

Joint development covered ODD modelling, routing, reaction planning, and integration into live demonstrations — among them a public showcase at Hannover Messe 2024.

Beyond the technical contributions, Peregrine also helped shape requirements for the Eclipse Dataspace Components (EDC) stack, ensuring that features like MQTT-based data streams and local connectors would support edge scenarios with low latency.
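A low-latency edge stream of the kind described above typically boils down to a per-device topic and a compact JSON payload. The sketch below shows one plausible convention; the topic scheme, device ID, and payload fields are assumptions for illustration, not the EDC project’s actual interface.

```python
import json

def make_edge_message(device_id, detection):
    """Build an MQTT topic and JSON payload for one batch of edge detections."""
    topic = f"edge/{device_id}/detections"  # hypothetical per-device stream topic
    payload = json.dumps({"ts": detection["ts"], "objects": detection["objects"]})
    return topic, payload

topic, payload = make_edge_message(
    "peregrine-one-042",
    {"ts": 1715672400, "objects": [{"class": "pothole", "conf": 0.91}]},
)
# With a real MQTT client, this would be published via e.g.
# paho-mqtt's client.publish(topic, payload, qos=1).
```

Keeping the payload small and publishing directly from the device, rather than relaying through a cloud backend, is what makes the low-latency edge scenario feasible.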


Results


The project delivered a complete chain from perception hardware to federated data provisioning.

Peregrine One provided the physical platform, the optimised algorithms delivered reliable on-device vision, and the new ODD/OD schema linked these results into Gaia-X data spaces.

Together they form a working demonstration of how edge-generated mobility data can be shared securely and interoperably across Europe.


These outcomes now inform Peregrine’s ongoing work in geospatial analytics, telematics integration, and infrastructure monitoring.

The same architecture is being adapted for new hardware generations and for collaborations with leading mapping and telematics partners.


Why it matters


Gaia-X 4 AMS shows that real-time perception, privacy, and interoperability are not conflicting goals.

By merging embedded intelligence with open European data standards, Peregrine helped establish a blueprint for how future mobility systems can remain connected without depending on external platforms.

It is a step toward a digital infrastructure where data stays sovereign and technology remains accountable.


Outlook


The knowledge gained through this collaboration feeds directly into Peregrine Labs, our applied-AI engineering division.

Labs continues to refine the edge-vision stack developed in Gaia-X 4 AMS for deployment across mobility, smart-city, and industrial environments.

The same core technology that ran inside Peregrine One is now being adapted for drones, stationary sensors, and next-generation fleet systems.


For a detailed technical summary, the full Gaia-X 4 AMS Final Report is available through the TIB Hannover open-access repository:
Read the report


About Peregrine Labs


Peregrine Labs is the engineering unit of Peregrine.ai.

Its focus is on designing, building, and deploying visual-intelligence systems that operate efficiently at the edge — from vehicles and drones to city infrastructure.

Labs bridges applied research and field deployment, helping organisations bring intelligent perception into real-world environments.


More information: peregrine.ai/labs
