Vehicle: Neural Network Controller Safety

AI Safety · Autonomous Systems

End-to-end neural network verification with the Vehicle DSL and ImandraX: proving that a vehicle remains on the road for any neural network controller satisfying a safety contract, over arbitrarily long sequences of bounded wind gusts and sensor errors.


Overview

As machine learning models are deployed in safety-critical systems — autonomous vehicles, medical devices, industrial robotics — the need for formal guarantees about their behavior becomes acute. Traditional testing can never cover the infinite space of possible inputs.

Vehicle is a domain-specific language for specifying properties of neural networks. It compiles high-level NN safety specifications into IML modules with opaque controller declarations and safety axioms. These modules are then imported into hand-written IML proofs that establish system-level correctness — for any neural network satisfying the safety contract.
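As a point of reference, a specification of this shape might be written in Vehicle's surface syntax roughly as follows. This is a hedged sketch only: the identifiers, the omitted safeOutput body, and the exact layout are illustrative, not the actual .vcl file from the repository.

type InputVector = Vector Rat 2

currentSensor = 0
previousSensor = 1

@network
controller : InputVector -> Vector Rat 1

safeInput : InputVector -> Bool
safeInput x = forall i . - 13/4 <= x ! i and x ! i <= 13/4

-- safeOutput elided; it constrains the controller's output
-- in terms of the current and previous sensor readings.

@property
safe : Bool
safe = forall x . safeInput x => safeOutput x

The @network annotation marks controller as an external neural network, which is exactly what becomes the opaque declaration in the compiled IML module.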

Open Source

Vehicle is an open-source project developed in collaboration with researchers at the University of Cambridge and other institutions. See imandra-ai/imandra-vehicle on GitHub.


Workflow

  1. Specify safety properties in Vehicle's high-level notation
  2. Compile to an IML module with vehicle compile itp -t Imandra
  3. Import the generated module into a proof file
  4. Verify system-level properties using ImandraX

The key insight: the generated IML module declares the controller as opaque — the proof works for any neural network that satisfies the safety axioms, not just a specific one.


The Model: Wind Controller Safety

The full development is available at imandra-ai/imandra-vehicle on GitHub. Below are the key components.

Vehicle-Generated Module

Vehicle compiles a .vcl specification into an IML module. The controller is declared opaque — its implementation is hidden, only its safety contract is known:

module Spec = struct
  type input_vector = real Tensor.tensor
  type output_vector = real Tensor.tensor
 
  let controller : input_vector -> output_vector = () [@@opaque]
 
  let normalise (x : input_vector) : input_vector = ...
 
  let safe_input (x : input_vector) : bool =
    (forall_index 2
      (fun (i : int) ->
        leq_tensor_reduced_real
          (tensor_cdot (-1.0)
            (flextensor_from_vec [] [ Real.(13.0 /. 4.0) ]))
          (flex_subtensor x i)
        && leq_tensor_reduced_real
            (flex_subtensor x i)
            (flextensor_from_vec [] [ Real.(13.0 /. 4.0) ])))
 
  let safe_output (x : input_vector) : bool =
    let y = flex_subtensor (controller (normalise x)) velocity in
    lt_tensor_reduced_real
      (tensor_cdot (-1.0)
        (flextensor_from_vec [] [ Real.(5.0 /. 4.0) ]))
      (tensor_plus_real (tensor_plus_real y
        (hadamard_prod_real
          (flextensor_from_vec [] [ Real.(2.0) ])
          (flex_subtensor x current_sensor)))
        (tensor_cdot (-1.0) (flex_subtensor x previous_sensor)))
    && ...
 
  axiom safe x =
    safe_input x ==> safe_output x
end

The axiom safe is the contract: if the input is within sensor bounds, the controller output keeps the vehicle safe. This axiom is established by an external neural network verifier; ImandraX uses it to prove system-level properties.

Scalar Safety Equivalence

For the proof, the tensor-based safety predicates generated by Vehicle are restated in equivalent scalar form:

let road_width : real = 3.0
let max_wind_shift : real = 1.0
let max_sensor_error : real = 0.25
 
let controller_fun p1 p2 =
  tensor_head (Spec.controller
    (Spec.normalise (Tensor.tensor_from_vec [2] [p1; p2])))
 
let safe_input (x : real) (y : real) : bool =
  Real.abs x <=. road_width +. max_sensor_error
  && Real.abs y <=. road_width +. max_sensor_error
 
let safe_output (x : real) (y : real) (out : real) : bool =
  Real.abs (out +. 2.0 *. x -. y)
    <. road_width -. max_wind_shift -. 3.0 *. max_sensor_error
 
lemma safe_input_tensor_equiv x y =
  Spec.safe_input (Tensor.tensor_from_vec [2] [x; y])
    = safe_input x y
[@@by auto] [@@rw]
 
lemma controller_safety_scalar x y =
  safe_input x y ==> safe_output x y (controller_fun x y)
[@@by [%use Spec.safe (Tensor.tensor_from_vec [2] [x; y])]
   @> [%expand controller_fun x y] @> auto]
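The rational constants in the generated module line up exactly with these scalar bounds: 13/4 is road_width + max_sensor_error, and 5/4 is road_width - max_wind_shift - 3 * max_sensor_error. A standalone OCaml check of this correspondence (plain floats, outside ImandraX; all values involved are exactly representable):

(* Sanity check: the tensor-level bounds in the generated Spec
   module match the scalar constants used in the proof. *)
let road_width = 3.0
let max_wind_shift = 1.0
let max_sensor_error = 0.25

let () =
  (* 13/4 = 3.25: the input bound in Spec.safe_input *)
  assert (road_width +. max_sensor_error = 13.0 /. 4.0);
  (* 5/4 = 1.25: the output bound in Spec.safe_output *)
  assert (road_width -. max_wind_shift -. 3.0 *. max_sensor_error
          = 5.0 /. 4.0)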

State and Transitions

type state = {
  wind_speed : real;
  position : real;
  velocity_state : real;
  sensor : real;
}
 
type observation = {
  wind_shift : real;
  sensor_error : real;
}
 
let initial_state : state = {
  wind_speed = 0.0;
  position = 0.0;
  velocity_state = 0.0;
  sensor = 0.0;
}
 
let on_road (s : state) : bool =
  Real.abs s.position <=. road_width
 
let valid_observation (obs : observation) : bool =
  Real.abs obs.sensor_error <=. max_sensor_error
  && Real.abs obs.wind_shift <=. max_wind_shift
 
let next_state (obs : observation) (s : state) : state =
  let new_wind_speed = s.wind_speed +. obs.wind_shift in
  let new_position = s.position +. s.velocity_state +. new_wind_speed in
  let new_sensor = new_position +. obs.sensor_error in
  let new_velocity = s.velocity_state +.
    controller_fun new_sensor s.sensor in
  { wind_speed = new_wind_speed;
    position = new_position;
    velocity_state = new_velocity;
    sensor = new_sensor; }
 
let rec final_state (observations : observation list) : state =
  match observations with
  | [] -> initial_state
  | obs :: rest -> next_state obs (final_state rest)
 
let rec all_valid_observations (observations : observation list) : bool =
  match observations with
  | [] -> true
  | obs :: rest -> valid_observation obs && all_valid_observations rest
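The transition system above can be exercised concretely in plain OCaml. The sketch below swaps in a hypothetical controller, out = previous - 2 * current, which makes the safe_output expression (out + 2x - y) exactly zero; it is an illustration of the dynamics, not the verified network. Driving it with worst-case observations at every step, the vehicle stays on the road:

(* Executable sketch of the transition system, with a hypothetical
   controller standing in for the opaque neural network. *)
type state = {
  wind_speed : float; position : float;
  velocity_state : float; sensor : float;
}

let road_width = 3.0

(* Hypothetical controller: drives out + 2x - y to zero. *)
let controller_fun current previous = previous -. 2.0 *. current

let next_state ~wind_shift ~sensor_error s =
  let new_wind_speed = s.wind_speed +. wind_shift in
  let new_position = s.position +. s.velocity_state +. new_wind_speed in
  let new_sensor = new_position +. sensor_error in
  { wind_speed = new_wind_speed;
    position = new_position;
    velocity_state =
      s.velocity_state +. controller_fun new_sensor s.sensor;
    sensor = new_sensor }

let on_road s = Float.abs s.position <= road_width

let () =
  (* Worst-case gusts and sensor noise at every step. *)
  let s = ref { wind_speed = 0.0; position = 0.0;
                velocity_state = 0.0; sensor = 0.0 } in
  for _ = 1 to 100 do
    s := next_state ~wind_shift:1.0 ~sensor_error:0.25 !s;
    assert (on_road !s)
  done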

Verification

Safety Invariant

The combined safety predicate conjoins four conditions: a safe predicted distance from the road edge, an accurate sensor reading, a sensor reading within the road bounds, and the vehicle being on the road:

let safe_distance_from_edge (s : state) : bool =
  Real.abs (s.position +. s.velocity_state +. s.wind_speed)
    <. road_width -. max_wind_shift
 
let accurate_sensor_reading (s : state) : bool =
  Real.abs (s.position -. s.sensor) <=. max_sensor_error
 
let safe_and_on_road (s : state) : bool =
  safe_distance_from_edge s
  && accurate_sensor_reading s
  && sensor_reading_not_off_road s
  && on_road s
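The base case of the induction is easy to confirm concretely: every conjunct holds at initial_state, where all fields are zero. The check below assumes a plausible reading of sensor_reading_not_off_road (its definition is not shown above) as |sensor| <= road_width + max_sensor_error:

let road_width = 3.0
let max_wind_shift = 1.0
let max_sensor_error = 0.25

(* initial_state: all fields zero *)
let position, velocity_state, wind_speed, sensor = 0.0, 0.0, 0.0, 0.0

let () =
  (* safe_distance_from_edge *)
  assert (Float.abs (position +. velocity_state +. wind_speed)
          < road_width -. max_wind_shift);
  (* accurate_sensor_reading *)
  assert (Float.abs (position -. sensor) <= max_sensor_error);
  (* assumed reading of sensor_reading_not_off_road *)
  assert (Float.abs sensor <= road_width +. max_sensor_error);
  (* on_road *)
  assert (Float.abs position <= road_width)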

Preservation Theorem

Safety is preserved by state transitions — using the controller's safety axiom via the scalar equivalence lemma:

lemma preservation obs s =
  safe_and_on_road s && valid_observation obs
  ==>
  safe_and_on_road (next_state obs s)
[@@by [%use controller_safety_scalar
    (new_sensor_value obs s) s.sensor] @> auto]
[@@disable controller_fun]

Inductive Safety

By induction over observation sequences:

lemma final_state_safe_and_on_road observations =
  all_valid_observations observations
  ==> safe_and_on_road (final_state observations)
[@@by induction ~vars:["observations"] () @>|
  [auto;
   [%use controller_safety_scalar
     (new_sensor_value (List.hd observations)
       (final_state (List.tl observations)))
     (final_state (List.tl observations)).sensor]
   @> auto]]
[@@disable controller_fun]

Main Theorem

For any neural network controller satisfying the safety contract, if all observations are valid, the vehicle remains on the road:

theorem final_state_on_road observations =
  all_valid_observations observations
  ==> on_road (final_state observations)
[@@by [%use final_state_safe_and_on_road observations] @> auto]

This theorem holds for every neural network that satisfies Spec.safe — the controller is opaque throughout the proof.


Results

The Vehicle-ImandraX integration demonstrates:

  • Specification reuse — the same Vehicle property specification can be verified against different network architectures
  • Controller opacity — the proof works for any controller satisfying the safety contract, not just a specific network
  • Compositional verification — Vehicle generates the NN safety contract; ImandraX proves system-level properties using it
  • Real arithmetic — the proof involves real-valued positions, velocities, wind speeds, and sensor errors

Any Safe Controller

The proof works for any neural network satisfying the safety contract — the controller is opaque throughout.

End-to-End Verified

From Vehicle specification to ImandraX system-level proof — no gap between what's specified and what's verified.

Practical AI Safety

Bridges the gap between ML training and deployment safety requirements, with real arithmetic over positions, velocities, and sensor noise.

