Envisioning System-Level Continuous Verification for Edge Offloading Systems
Continuous Verification Framework for Edge Offloading Systems based on Formal Methods (23.04.2025)
Förderjahr 2023 / Stipendien Call #18 / ProjektID: 6851 / Projekt: Reliability of Edge Offloadings

In this blog post, we describe how formal runtime verification can enable federated continuous verification of edge offloading systems across all phases of the software development lifecycle.

In our previous blog post, we defined the critical requirements for integrating continuous monitoring and runtime verification into edge offloading systems. The motivation behind this endeavor is to assure high-performance and correct application behavior in edge offloading amid fast-changing environmental factors and workload demands, which can disrupt, postpone, or prolong offloading. Offloading decisions are made based on monitored resource data and performance metrics, which can become outdated quickly, leading to suboptimal and unreliable offloading decisions. Moreover, edge-native applications are often sensitive and critical: underperformance or incorrect behavior can lead to economic losses (e.g., mobile payments), life-threatening situations (e.g., autonomous driving), or damage to business reputation (e.g., e-commerce websites). Our goal is to integrate continuous monitoring and runtime verification into the edge offloading ecosystem to make offloading more efficient and reliable; in the rest of this text we refer to this as continuous verification (CV). We envision CV for edge offloading systems based on the concepts described below, which fulfill the previously defined critical requirements of lightweight, low-latency, automated, and environment-agnostic execution.

Formal Runtime Verification Foundations

For implementing CV, we opt for a formal runtime verification method based on temporal logics. This method assures that verification follows mathematical rigor: target system requirements are encoded as formal specification properties and verified against actual event traces. An event trace reflects the target system's actual behavior as a sequence of states or actions, while the specification properties describe the desired behavior. The outputs of verification are called verdicts, which indicate satisfaction or violation of the formal specification properties.
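
To make this concrete, the following minimal sketch (our own illustrative Python, not a specific verification tool) monitors a bounded-response temporal property over an event trace: every offloading request must be answered within a deadline, otherwise the monitor emits a violation verdict.

from enum import Enum

class Verdict(Enum):
    SATISFIED = "satisfied"
    VIOLATED = "violated"
    INCONCLUSIVE = "inconclusive"   # not enough events observed yet

def monitor_bounded_response(trace, deadline=3):
    """Check the temporal property: every 'request' event is followed
    by a 'response' event within `deadline` subsequent events."""
    pending = []  # indices of requests still awaiting a response
    for i, event in enumerate(trace):
        if event == "request":
            pending.append(i)
        elif event == "response" and pending:
            pending.pop(0)  # oldest pending request is answered first
        # a pending request older than the deadline violates the property
        if pending and i - pending[0] > deadline:
            return Verdict.VIOLATED
    return Verdict.SATISFIED if not pending else Verdict.INCONCLUSIVE

print(monitor_bounded_response(
    ["request", "tick", "response", "request", "tick", "tick", "tick", "tick"]))
# -> Verdict.VIOLATED (the second request missed its deadline)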

Specification development has to follow the software development structure of the target system itself. Therefore, we introduce spec-as-code, where formal specifications are treated like any other regular software source code. This brings familiarity to software developers and makes specifications easy to integrate into their existing development workflows. Furthermore, once specification properties are written, no coding effort is required to build a custom verification solution: an automated verifier synthesis procedure generates verifiers directly from the specification properties, which is possible because the temporal logic framework mathematically proves the correctness of the synthesis procedure. Finally, the system requirements we target are usually written in informal, unstructured natural language with a high degree of vagueness and ambiguity. This can cause a detachment between the verification logic and the intended behavior we want to verify (e.g., Google Cloud defines availability in terms of uptime while Microsoft Azure defines it in terms of downtime, or infrastructure reliability is expressed in terms of outages while application reliability is expressed in terms of error rates). A temporal logic specification language eliminates such vagueness and ambiguity through its mathematical rigor and strict logical rules. This fulfills the critical requirement of automation.
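
As an illustration of spec-as-code, a property can live in the repository as a declarative object from which a verifier is generated automatically. The mini-language and the synthesis function below are our own hypothetical sketch, standing in for a formally proven synthesis procedure in a real temporal logic framework.

# A hypothetical spec-as-code snippet: the property is plain source code,
# versioned and reviewed like any other file in the repository.
SPEC = {
    "name": "offload_latency_bound",
    # intuitively: G (offload_started -> F<=200ms offload_completed)
    "trigger": "offload_started",
    "expect": "offload_completed",
    "within_ms": 200,
}

def synthesize_verifier(spec):
    """Derive a monitor function from the declarative spec, standing in
    for an automated, provably correct verifier synthesis procedure."""
    def verifier(timed_trace):  # [(timestamp_ms, event), ...]
        for t, event in timed_trace:
            if event == spec["trigger"]:
                answered = any(
                    e == spec["expect"] and t < t2 <= t + spec["within_ms"]
                    for t2, e in timed_trace)
                if not answered:
                    return "violated"
        return "satisfied"
    return verifier

check = synthesize_verifier(SPEC)
print(check([(0, "offload_started"), (150, "offload_completed")]))  # satisfied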

Thanks to the spec-as-code concept, a verification-as-a-service concept can be introduced, where verification logic is developed and deployed independently from application logic. This enables faster software delivery without compromising verification integrity. It also enables better verification scalability, since verifiers do not necessarily have to be replicated alongside each application component and can thus be used more efficiently. Deployed runtime verifiers are exposed through APIs, which enables reuse of the same specification properties by different application clients. This fulfills the critical requirement of lightweight execution.
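
A minimal sketch of what such a service boundary could look like, using only Python's standard library; the endpoint name and payload format are our own assumptions, not a prescribed interface.

import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def check_latency_bound(trace, within_ms=200):
    """Toy verifier: every 'offload_started' must be followed by
    'offload_completed' within the time bound."""
    for t, event in trace:
        if event == "offload_started":
            if not any(e == "offload_completed" and t < t2 <= t + within_ms
                       for t2, e in trace):
                return "violated"
    return "satisfied"

class VerifierHandler(BaseHTTPRequestHandler):
    """Exposes the verifier over HTTP so that multiple application
    clients can reuse the same specification property."""
    def do_POST(self):
        if self.path != "/verify/offload_latency_bound":  # hypothetical endpoint
            self.send_error(404)
            return
        length = int(self.headers["Content-Length"])
        trace = json.loads(self.rfile.read(length))  # [[timestamp_ms, event], ...]
        verdict = check_latency_bound([tuple(x) for x in trace])
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(json.dumps({"verdict": verdict}).encode())

if __name__ == "__main__":
    HTTPServer(("", 8080), VerifierHandler).serve_forever()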

Developing specification properties can be a cumbersome task due to their abstractness, so specification design patterns can be used to make specification development more approachable and understandable to developers. Specification design patterns are common solutions to problems that recur across different application domains. Instead of encoding every variable and logical condition explicitly, the developer only specifies the inputs and outputs of such specification sub-properties, which speeds up specification development and enhances the clarity and maintainability of specification properties.
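
For example, the well-known response pattern ("every P is eventually followed by Q") can be packaged as a reusable building block. The helper below is our own sketch of that idea; the developer supplies only the trigger and reaction events.

def response_pattern(trigger, reaction):
    """Reusable 'response' specification pattern: globally, every
    `trigger` event is eventually followed by a `reaction` event."""
    def monitor(trace):
        outstanding = 0
        for event in trace:
            if event == trigger:
                outstanding += 1
            elif event == reaction and outstanding:
                outstanding -= 1
        return "satisfied" if outstanding == 0 else "inconclusive"
    return monitor

# Instantiating the same pattern for two different requirements:
offload_ack = response_pattern("offload_request", "offload_ack")
migration_done = response_pattern("migration_start", "migration_finished")
print(offload_ack(["offload_request", "offload_ack"]))  # satisfied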

Federated Continuous Verification

The previously described concepts relate to the design and development time of software. CV also extends to deployment time, i.e., software delivery. CV can be integrated into the continuous integration/continuous deployment (CI/CD) pipeline, the most standardized delivery workflow in the software industry. Through the CI/CD pipeline, the software is deployed into multiple runtime environments, from testing and staging to live production. Thanks to the formal nature of temporal logics, runtime verifiers are completely independent of the execution environment and, combined with virtualization techniques like containerization (e.g., Docker and Kubernetes), can be deployed across all of these environments. Runtime verifiers act as core indicators of whether the software actually behaves as it should before proceeding to the subsequent deployment stage. Here the critical requirement of environment-agnostic execution is fulfilled.
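
One way such a verification gate could look in practice is a small script, run as a pipeline step, that queries the deployed verifier and fails the stage on a violation. The URL and trace below are illustrative assumptions carried over from the earlier sketches.

import json
import sys
import urllib.request

VERIFIER_URL = "http://verifier.staging.internal:8080/verify/offload_latency_bound"

def gate_deployment(trace):
    """CI/CD gate: POST the staging trace to the runtime verifier and
    block promotion to production unless the property is satisfied."""
    req = urllib.request.Request(
        VERIFIER_URL,
        data=json.dumps(trace).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        verdict = json.load(resp)["verdict"]
    print(f"verdict: {verdict}")
    sys.exit(0 if verdict == "satisfied" else 1)  # non-zero exit fails the stage

if __name__ == "__main__":
    gate_deployment([[0, "offload_started"], [150, "offload_completed"]])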

Lastly, edge offloading systems are usually geo-distributed and operate in an asynchronous, event-driven fashion for low-latency response. Similarly, CV has to be structured to leverage the full potential of edge computing and to follow the fast-paced dynamics of edge offloading. CV therefore needs a federated architecture that enables local verification, partial verdict propagation, and global aggregation. This federated architecture enables composability, where low-level raw events (e.g., CPU consumption) can be mapped to mid-level objectives (e.g., resource utilization) and further translated into meaningful high-level system requirements (e.g., resource overload or overutilization over a specified time period). Local runtime verifiers are deployed close to application components and indicate the correctness of sub-behaviors. Mid-level runtime verifiers aggregate the verdicts of multiple local runtime verifiers and propagate them further to global runtime verifiers, which capture high-level requirements. This federated CV fulfils the environment-agnostic, lightweight, and low-latency critical requirements.
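
The three-level structure can be sketched as a verdict-aggregation tree; the thresholds and layer names below are illustrative assumptions, not part of a concrete system.

def local_verifier(cpu_samples, limit=0.9):
    """Local level: verify raw events (CPU consumption stays below a limit)."""
    return "satisfied" if all(s < limit for s in cpu_samples) else "violated"

def mid_level_verifier(local_verdicts):
    """Mid level: aggregate local verdicts into a resource-utilization objective."""
    return "satisfied" if all(v == "satisfied" for v in local_verdicts) else "violated"

def global_verifier(mid_verdicts, tolerance=1):
    """Global level: the high-level requirement tolerates at most
    `tolerance` violated mid-level objectives over the observation period."""
    violations = sum(v == "violated" for v in mid_verdicts)
    return "satisfied" if violations <= tolerance else "violated"

# Local verifiers run next to the application components...
node_a = local_verifier([0.4, 0.6, 0.5])
node_b = local_verifier([0.95, 0.7])        # overload on one edge node
# ...and their partial verdicts are propagated and aggregated upward.
region = mid_level_verifier([node_a, node_b])
print(global_verifier([region, "satisfied", "satisfied"]))  # satisfied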

Josip Zilic


Skills: Formal Methods, edge computing, Offloading, Supervised Machine Learning, Markov Processes