I have a habit of using the following syntax in compile-time flags:
#if (defined(a) & defined(b))
It has been suggested that I use && instead, as follows:
#if (defined(a) && defined(b))
I know the difference between the two operators, and that in normal code && short-circuits. However, the above is handled by the preprocessor. Does it matter which one I use? Could it affect compile time by some infinitesimal amount, because the second defined() wouldn't be evaluated?
Since defined(something) yields 0 or 1, you are guaranteed 0 or 1 on both sides, so it makes no technical difference whether you use & or &&.
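As a minimal sketch (the macro names FEATURE_A and FEATURE_B are invented for this example), both spellings select the same branch, because each defined(...) has already been replaced by 0 or 1 before the operator is applied:

#include <cassert>

// Hypothetical feature macros, defined only for this illustration.
#define FEATURE_A 1
#define FEATURE_B 1

#if (defined(FEATURE_A) & defined(FEATURE_B))
const bool with_bitwise_and = true;
#else
const bool with_bitwise_and = false;
#endif

#if (defined(FEATURE_A) && defined(FEATURE_B))
const bool with_logical_and = true;
#else
const bool with_logical_and = false;
#endif

int main()
{
    // Same outcome either way.
    assert(with_bitwise_and == with_logical_and);
}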
It's a matter of habits (using & here may carry over to a situation where it is wrong) and of writing code that is easy to grasp by simple pattern matching. The & in there causes a millisecond pause while one considers whether it might possibly be a bit-level thing.
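To make the "carries over" risk concrete, here is a hedged sketch in ordinary code (the function and its logic are invented for this example): with &&, the null check guards the dereference; with &, both operands would be evaluated and the dereference would happen even for a null pointer.

// Hypothetical helper, for illustration only.
bool is_positive(const int* p)
{
    // && short-circuits: when p is null, *p is never evaluated.
    // Writing & here would evaluate both sides and dereference
    // a null pointer, which is undefined behavior.
    return (p != nullptr) && (*p > 0);
}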
On the third hand, you can't use the keyword and, which you ¹can use in ordinary C++ code.
Notes:
¹ In Visual C++ you can use and via a forced include of <iso646.h>.
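For completeness, a small sketch of the alternative token in ordinary code (the function name is invented; on older Visual C++ you would add the forced include of <iso646.h>, for example with the /FI compiler option, as the note describes):

#include <iso646.h>   // no-op on conforming C++ compilers; supplies "and" on old MSVC

bool both_set(bool a, bool b)
{
    // "and" is an alternative spelling of && in ordinary C++ code.
    return a and b;
}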