-- ChrisDutchyn - 24 Oct 2005

Review from Chris Dutchyn

Problem

Program specifications for large systems, especially in Hoare-logic formulations, tend to aggregate large contextual descriptions, making them unscalable and effectively unusable.

Contributions

Lam et al. recognize that this buildup of specification context can be modularized in a way analogous to static aspects. In turn, they contend that these specification modules are amenable to automatic analysis, and they provide case studies to reinforce this position. In particular, they identify three constructs:

* formats -- subobjects that act as building blocks for objects, but can be connected to invariants

* scopes -- collections of modules that control visibility, provide inter-module invariants, and provide endpoints for analyses.

* defaults -- invariants that the checker propagates throughout the application, avoiding tedious duplication. Together, these constructs provide the building blocks of their modular decomposition.

They implement this specification language and checker in the Hob pluggable analysis framework. They apply it to two small programs: Minesweeper and Water.

Weaknesses

The paper starts with a lengthy introduction that has no grounding in an example. As a result, I felt lost from the beginning and had to go back and re-read the introduction (after section 4) to understand the goals and motivations of the paper.

Modules play an important role in their system -- they are broken into impl, spec, and abst parts, each populated with different pieces of their language. But they give only a brief description of this crucial element of the modular decomposition.
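
For concreteness, here is a hypothetical sketch of that three-part split for List, assembled from the syntax fragments quoted below; the remove signature, the clause layout, and the inline -- annotations are my guesses, not the paper's exact syntax:

   impl module List uses Node {
     ...                                -- executable code and concrete fields
   }
   spec module List uses Node {
     proc remove(n : Elem)              -- set-based pre/postconditions
       requires ready & card(n)=1 & not (n in Content)
       ensures Content' = Content - n;
   }
   abst module List use Elem {
     ...                                -- abstraction relating fields to the set Content
   }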

There's a lot of syntax in their system, much of it apparently inconsistent:

* (!ready => root=null) && (ready => root != null)
  vs. requires ready & card(n)=1 & not (n in Content)

* impl/spec module List uses Node vs. abst module List use Elem

* this^Elem.next = {prev} vs. "rootn"

* data next:Elem vs. pointer prev:Elem

or unexplained:

* Content' = Content - n -- does ' have a specific meaning?

* Iter' = Iter - n -- where does Iter come from? where is it used?

Their primary motivation is to make large programs amenable to precise static analysis by overcoming the "writing a huge spec" problem. And their decomposition, especially defaults, seems promising. But they don't actually try it on large programs: the size of their Minesweeper is not reported, but I found one Java applet version at 542 lines (total). The Water benchmark is 2000 lines of code, with another 500 lines of specification.

Questions

Formats appear limited to contributing parts to one "abstract data type". How are they different from (non-interfering) mixins?

Is there any value in having more than just opaque and transparent scopes? Alternate presentations of the same item (similar to Views in SML/Haskell) appear to be possible using multiple modules and invariants that keep them synchronized, but the overhead is unacceptable.

They make some apparently contradictory claims. For example, they are attempting to make multiple unscalable analyses scalable. How is this possible?

How large are programs that people do this level of checking on? How large are programs people would like to do this level of checking on?

Why insist that the developer identify reentrant call sites, when they perform a link-time check to ensure the developer has identified them all?

Belief

I find the idea of decomposing into modules with separate specifications appealing. Orthogonal specs would be optimal, but they do provide mechanisms to keep the leakage from one spec into another down. I find the presentation/implementation less than compelling.


-- ThomasFritz - 24 Oct 2005

Review from Thomas Fritz

Problem

Similar to crosscutting concerns in programming, properties of program specifications tend to crosscut the specifications of the whole system. Pre- and postconditions of one procedure typically contain specification properties of the procedures it invokes -- the so-called specification aggregation phenomenon -- so specification properties get scattered and duplicated across many places, making the approach unscalable.

Contributions

To deal with the identified scalability and crosscutting issues of program specifications, they propose an aspect-oriented approach for data structure consistency properties. Basically they provide an extension of their Hob program implementation, specification, and analysis tool that consists of three major new constructs: formats, scopes, and defaults.

Similar to static crosscutting (i.e., introduction), format declarations are used by modules to specify the fields of objects (encapsulated data structures) that are relevant to the module. ("The complete set of fields of a given object is the union of all of the fields in all of the format declarations that involve that object.")
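
To illustrate, here is a hypothetical sketch of two modules contributing separate fields to the same Elem objects; the format keyword and block layout are assumptions built around the field fragments data next:Elem and pointer prev:Elem quoted in the first review:

   impl module List uses Elem {
     format Elem { data next : Elem; }      -- List's slice of Elem
   }
   impl module Iterator uses Elem {
     format Elem { pointer prev : Elem; }   -- Iterator's slice of Elem
   }
   -- the Elem the analysis sees is the union of all slices:
   --   Elem = { next : Elem; prev : Elem }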

Scopes allow the declaration of invariants, and of their visibility, for multiple data structures encapsulated in multiple modules. A module can also be in multiple scopes. Thus they provide a modular means to state invariants that crosscut several modules.
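
As a hypothetical illustration of the flavor (every keyword and the subset notation here are guesses, not the paper's concrete syntax), a scope tying a list and its iterator together might look like:

   scope ListIter {
     modules List, Iterator;          -- the modules this scope governs
     invariant Iter <= Content;       -- cross-module invariant: everything
                                      -- under iteration is in the list
     exports List.add, List.remove;   -- boundary at which the invariant
                                      -- must be re-established
   }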

The default construct makes it possible to specify properties for a set of procedure pre- and postconditions, where the relevant procedures are selected by a pointcut.
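
As a hypothetical sketch (the default keyword and the pointcut pattern syntax are assumed), a default that conjoins "n is not under iteration" onto every List procedure with a parameter named n:

   default notIterated :
     pointcut List.*(n : Elem)        -- every List procedure with a parameter n
     requires not (n in Iter)         -- added to each matching precondition
     ensures  not (n in Iter');       -- and to each matching postcondition

A procedure for which the property must not hold would then have to suspend the default explicitly; the questions below return to both the suspension burden and the matching by parameter name.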

Furthermore, Lam et al. provide a prototype implementation of their extension to Hob and present two small examples in which they use their prototype to verify several properties.

Weaknesses

  • They state that their approach overcomes the scalability issues of former approaches and that they enable "automatic verification of arbitrarily precise and sophisticated data structure consistency properties". However, they never apply their approach to a larger example to support that claim, but only use two small examples.

  • They introduce a lot of syntax, e.g. for module specifications, invariant specifications, and other properties, but they never define it precisely (which I think is essential for an approach whose contribution is language constructs), making the article difficult to understand. (E.g., they also never introduce the second-order logic language.)

  • The structure of the paper is somewhat confusing. First they introduce their three new constructs in the introduction, then they revisit them in the example section, which lacks a subsection for formats; I had to go back and forth between the sections to understand their concepts.

Questions

  • They state that their approach overcomes the scalability issues of automatic verification; however, they use the existing automatic verification tools as the underlying technique. Does combining those "unscalable" techniques with new specification constructs actually overcome the scalability issues?

  • What is Scope Call Check?

  • Do formats help that much, or could you not just use different interfaces for an object to achieve the same kind of specification? (It might be useful to be able to specify the places a format applies to using a pointcut language.)

  • Is the default construct not too confusing, given that you have to keep track of where defaults apply and suspend them where you do not want them to hold?

  • In the default construct you can specify parameters; however, the matching depends on the parameter name. Is that not too restrictive, especially as a default may apply to many procedures in different modules, written by different programmers who may have used different names for the same parameter?

  • Hob avoids descending into procedures by relying on procedure specifications (last paragraph of section 2). What happens if you do not provide specifications for all procedures?

Belief

The basic idea of providing modular constructs for crosscutting specification properties, and thus avoiding specification aggregation, seems interesting and fits well with the case for aspect-oriented techniques in programming. By providing such aspect-oriented specification constructs, the approach presented in the paper helps to reduce the duplication and scattering of specifications. However, it does not seem easy to bear in mind all the defaults and scopes that might apply and, for example, need to be suspended; it might even be easier just to duplicate the specification. Furthermore, in my opinion, what their approach mostly addresses is the duplication of properties, especially in pre- and postconditions; the claim to overcome the scalability issues of former approaches is not really supported.
