2016-10-30 18:29:00 UTC
unknown: The Standard appears to distinguish two kinds of uninitialized
objects: those which hold Unspecified Values, which are guaranteed to hold
bit patterns that are not trap representations, and those which hold
Indeterminate Value, which lack such a guarantee. The Standard is
rather murky about what, if anything, is guaranteed about objects which
hold Indeterminate Value in cases where both of the following conditions
apply:
1. Every possible combination of bit values would represent a valid
value for the type.
2. The object is not of automatic duration, or else has had its address
taken.
From what I've seen, some people would like to say that Unspecified Value
and Indeterminate Value have essentially the same meaning when the above
conditions apply. Others would like to treat Indeterminate Value as a
symbolic concept such that if x holds Indeterminate value, almost any
expression involving x--including expressions like "x & 15"--would itself
yield Indeterminate Value.
I would regard the first reading as being overly restrictive on the kinds
of optimizations compilers could perform, but the second would be overly
restrictive on the kinds of optimizations programmers could safely *let*
compilers perform. I would propose that it would be more helpful to
define a middle ground which would grant compilers considerable freedom,
but also offer enough behavioral guarantees that programmers could exploit
algorithms which don't require any particular initial state.
To allow maximum compatibility with existing code, I would suggest that
there should be four behavioral models, specified via some kind of #pragma.
Any implementation which satisfies an earlier model would also satisfy
the requirements of all later models, so all but the first model would be
optional [if code requested a model an implementation didn't provide for,
it could substitute an earlier one]. It would be highly recommended that
compilers support the third, since it offers more optimization opportunities
than the first two without losing any semantic expressiveness, and that new
programs be written to be compatible with it for the same reasons.
1. Simple unspecified bits: An object holding Indeterminate Value will
behave as though the underlying storage holds some bit pattern
which--once observed--will not change unless the object is written.
2. Randomly-changing bits, read deterministically: each read of an
object holding Indeterminate Value will behave as though it held
some combination of bits, but the bit pattern may change arbitrarily
between reads.
3. Constrained non-deterministic model: Certain operations are defined
as yielding a definite combination of bits (as with #2) but others
may, at a compiler's leisure, yield non-deterministic sets of
possibilities, as described later.
4. Completely-loose model: Nothing useful can be said about the result
of working with Indeterminate Values.
The non-deterministic model can be described either in behavioral terms or
code-transformation terms; the former makes it easier to reason about what
a given source text might do, while the latter makes it easier to reason
about what a compiler would be allowed to do with a given source text.
I'll describe the models for #3 in a later post, but first I'll ask:
do people agree with my assessment of the present state of affairs and
the lack of clarity in the Standard, and the desirability of having a
more concrete spec for behavior which allows optimizations but does not
require programmers to spend time initializing objects in cases where
any possible initial bit pattern would allow a program to meet its
requirements?