Post by Kaz Kylheku
Post by Keith Thompson
That's not how I read it. The bug was suspended "until we can figure
out what the language is supposed to mean" (referring to the common
initial sequence rule); that was in 2004. The most recent comment,
posted in 2009, suggests that visibility of the union is sufficient.
It's disappointing that no action has been taken since then, but I don't
see a statement that gcc's current behavior is correct. If that were
the intent, I presume the bug would be closed, not suspended.
If a language construct has two possible interpretations A and B
such that under interpretation A, some set Sa of programs
may undergo a change of behavior under optimization,
and under interpretation B, a greater set, Sb >= Sa (superset)
undergoes a change of behavior under optimization,
and otherwise the interpretations are not in any conflict,
it goes without saying that until the controversy is settled, the
compiler should not change the behavior of programs in Sb\Sa (set
difference).
Following the dangerous interpretation B *is* the bug.
It *should* go without saying. For quality compilers, it probably would
go without saying even if nobody said it. Given that nothing in the
Standard can reasonably be construed as discouraging compilers from using
command-line options or #pragma directives to support features which would
otherwise be non-conforming, I am at a loss to figure out why a compiler
which by default honors a conservative interpretation of what is required,
but has options to waive it, would not be considered superior to one that
defines behavior for a smaller subset of programs unless the programmer
either foregoes a large class of optimizations, most of which would not be
problematic, or uses non-standard directives to force it to recognize
aliasing between the types in question.
Given that gcc has non-standard directives which can force it to recognize
aliasing between specific types, having complete visible union declarations
achieve the same effect in the absence of directives to waive that behavior
should not be even remotely difficult. While I generally try to avoid reading
too much into people's motives, I am unable to figure out any reason for
gcc's behavior which would be consistent with authors who are more interested
in producing a useful compiler than in denigrating existing C code.
Post by Kaz Kylheku
The behavior can be changed to A, and the bug can be closed.
TASK: (re-)introduce certain optimizations when the A-B controversy
is clearly settled in favor of B.
Now this new item can languish in the database for 20 years for all
anyone cares. Going with B should require strong community sign-off.
Alternatively, add a new command-line option or pragma to re-enable the
optimization. There's nothing wrong in the slightest with having a
compiler support non-conforming modes, and there are many cases where a
non-conforming mode could offer optimization opportunities for many
programs which would be far more useful than anything that can be achieved
by pushing aggressive readings of ambiguous parts of the Standard.
As a simple example, I would suggest that it would be useful to have a
mode which, *if explicitly enabled*, would waive the special "character
type" aliasing rules in cases where code accessed an lvalue twice using
the same non-character type without any *intervening* action that would
indicate aliasing is likely. Such an option would be non-conforming,
but if the compiler is even remotely cautious in its judgment of what
actions might suggest aliasing, it would probably break a lot less code
than gcc's current approach while offering many more useful optimizations.
Post by Kaz Kylheku
Imagine if some electronic designer says, "Gee, it's not clear from
the datasheet what current this part can handle: is it 1A or 2A?
Oh, I'm going to design for 2A and keep cranking out mass production
units that way until the data sheet is cleared up."
Actually, the scenario more applicable to this situation would be a
semiconductor manufacturer producing parts with registers which won't
work reliably unless their inputs remain valid for a while after their
outputs have changed. Data sheet values for registers often don't
guarantee that the outputs won't change until after the inputs have
been reliably captured, so nothing would prohibit a manufacturer from
offering devices whose outputs changed faster than their inputs could
capture things, but requiring circuit designers to allow for that would
almost double the amount of circuitry required for many applications.