Hacker News

I'm a little disappointed in the lack of substantive discussion here! There are many interesting technical issues raised by this post ripe for discussion. For instance:

> As you can see from the source files, that required adding many specifications (around 400 preconditions and 500 postconditions) and ghost code (around 150 loop invariants, 400 assertions, 300 ghost entities), and the daily proof takes 1.5 hours on a Linux server with 36 cores.

So 40 units with only a few hundred pre/post conditions and invariants seem to take an enormous amount of compute time to verify.
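For readers who haven't seen what those annotations look like, here is a minimal hypothetical sketch (not code from the runtime in question) of a SPARK subprogram with a postcondition and the loop invariant the prover needs to discharge it:

```ada
--  Hypothetical sketch: sum an array, proved free of overflow and
--  with a bound on the result. Types are invented for illustration.

type Index is range 1 .. 100;
type Int_Array is array (Index) of Integer range 0 .. 1000;

function Sum (A : Int_Array) return Natural with
  Post => Sum'Result <= 100_000   --  postcondition on the result
is
   Total : Natural := 0;
begin
   for I in A'Range loop
      --  Without this invariant the prover cannot show the addition
      --  below stays in range, nor establish the postcondition.
      pragma Loop_Invariant (Total <= Natural (I - A'First) * 1_000);
      Total := Total + A (I);
   end loop;
   return Total;
end Sum;
```

Each such annotation becomes one or more verification conditions handed to the provers, which is where the compute time goes.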

I expect that the verifier has already been fairly well optimized, since SPARK has been around for a while. Is this the case?

For verification in general, is the expense of verification in this case because of the model needed to verify Ada? For instance, perhaps a language that makes different choices might have a model checker that could scale better.

This has important implications for scaling formal methods to more software.



Here, we are targeting full functional specification, which is hard. So there is a lot of ghost code to support the proof (the loop invariants, assertions and ghost entities). That's what's driving the verification time up. I don't find this enormous at all, given the guarantees provided.

Most code in SPARK is proved to a much more modest level, where the proof is much easier, and that's where you get scaling.
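To illustrate what "ghost entities" means here (a hypothetical sketch, not code from the actual runtime): ghost functions and variables exist only to state and carry the proof, and the compiler erases them from the final binary:

```ada
--  Hypothetical sketch: a ghost function used only in contracts and
--  assertions; it generates no code in the compiled program.
--  Int_Array is an invented array type for illustration.

function Is_Sorted (A : Int_Array) return Boolean is
  (for all I in A'First .. A'Last - 1 => A (I) <= A (I + 1))
with Ghost;

procedure Insert (A : in out Int_Array; X : Integer) with
  Pre  => Is_Sorted (A),
  Post => Is_Sorted (A);
```

Full functional specification means writing enough of this kind of scaffolding that the contracts pin down exactly what the code computes, not just that it doesn't crash.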

And yes, we're using the best available provers for the task under the hood (currently Alt-Ergo, CVC4, Z3 and COLIBRI), and on some of the checks in that proof of the light runtime, a call to a single prover on a single check may take minutes.
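To give a flavor of what actually reaches those solvers: each check is translated into a verification condition, and the solver is asked to show the negation of the check is unsatisfiable. A heavily simplified, hypothetical example in SMT-LIB form:

```
; Hypothetical, heavily simplified VC: prove no overflow in
; Total + A(I) by showing the negated check is unsatisfiable.
(set-logic QF_LIA)
(declare-const total Int)
(declare-const a_i Int)
(assert (and (>= total 0) (<= total 99000)))  ; from a loop invariant
(assert (and (>= a_i 0) (<= a_i 1000)))       ; from the element subtype
(assert (> (+ total a_i) 2147483647))         ; negated overflow check
(check-sat)                                   ; expect: unsat
```

Real VCs from a full functional proof are far larger, with quantifiers and arrays, which is why a single check can keep a solver busy for minutes.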


> For verification in general, is the expense of verification in this case because of the model needed to verify Ada? For instance, perhaps a language that makes different choices might have a model checker that could scale better.

I don't think so. The SPARK subset has been chosen to aid verification. The problem of proof is inherently computationally hard, and most of the gains you can expect will come from advances in solver technology, both algorithmic and in scaling to multiple cores or eventually GPUs.

Just my opinion :) But I work at AdaCore (not on SPARK) so have some familiarity with the subject.



