
I was just wondering how disastrous integer overflow really is. Take the following example program:

#include <iostream>

int main()
{
    int a = 46341;
    int b = a * a;  // 46341 * 46341 = 2147488281 > INT_MAX (2147483647),
                    // so this signed multiplication overflows a 32-bit int
    std::cout << "hello world\n";
}

Since a * a overflows on 32-bit platforms, and integer overflow triggers undefined behavior, do I have any guarantees at all that "hello world" will actually appear on my screen?


I removed the "signed" part from my question based on the following standard quotes:

(§5/5 C++03, §5/4 C++11) If during the evaluation of an expression, the result is not mathematically defined or not in the range of representable values for its type, the behavior is undefined.

(§3.9.1/4) Unsigned integers, declared unsigned, shall obey the laws of arithmetic modulo 2^n where n is the number of bits in the value representation of that particular size of integer. This implies that unsigned arithmetic does not overflow because a result that cannot be represented by the resulting unsigned integer type is reduced modulo the number that is one greater than the largest value that can be represented by the resulting unsigned integer type.
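
For contrast, here is a minimal sketch (my own, not from the standard) of the well-defined unsigned case, assuming a 32-bit unsigned int:

#include <iostream>

int main()
{
    unsigned int a = 4294967295u; // UINT_MAX when unsigned int is 32 bits
    unsigned int b = a + 1;       // well-defined: reduced modulo 2^32, so b == 0
    std::cout << b << '\n';       // prints 0; no undefined behavior here
}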

  • I've never seen an overflow in C++ cause an issue. The number will just happily wrap around and set the overflow flag on the processor. Commented Jan 26, 2012 at 20:28
  • @Brain2000 They mostly cause optimization issues, such as (a + 1 > a) being treated as always true despite overflow (see the sketch after these comments). Commented Jan 26, 2012 at 20:31
  • possible duplicate of GCC Fail? Or Undefined Behavior? Commented Jan 26, 2012 at 20:33
  • @Xeo, not really a duplicate, just an example where overflow had an unexpected result. Commented Jan 26, 2012 at 20:37
  • @LokiAstari: The C++ standard says "If during the evaluation of an expression, the result is not mathematically defined or not in the range of representable values for its type, the behavior is undefined." Commented Jan 26, 2012 at 22:31
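
A minimal sketch (my own illustration, not from the question) of the optimization issue mentioned in the comments, assuming a typical optimizing compiler such as GCC or Clang at -O2:

#include <iostream>
#include <limits>

// An optimizer may fold this comparison to `true`: if a + 1 overflowed,
// that would be undefined behavior, so it is assumed not to happen.
bool always_true(int a)
{
    return a + 1 > a;
}

int main()
{
    int big = std::numeric_limits<int>::max();
    std::cout << always_true(big) << '\n'; // with optimizations this often
                                           // prints 1, even though INT_MAX + 1
                                           // wraps to INT_MIN on x86 hardware
}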

3 Answers


As pointed out by @Xeo in the comments (I actually brought it up in the C++ chat first):
Undefined behavior really means undefined: it can hit you when you least expect it.

The best example of this is here: Why does integer overflow on x86 with GCC cause an infinite loop?

On x86, signed integer overflow is just a simple wrap-around at the instruction level. So normally, you'd expect the same thing to happen in C or C++. However, the compiler can intervene and use the undefined behavior as an opportunity to optimize.

In the example taken from that question:

#include <iostream>
using namespace std;

int main(){
    int i = 0x10000000;  // 2^28: the third doubling reaches 2^31, past INT_MAX

    int c = 0;
    do{
        c++;
        i += i;          // doubles i; the third doubling overflows a 32-bit int
        cout << i << endl;
    }while (i > 0);      // UB lets GCC assume i > 0 stays true after i += i

    cout << c << endl;
    return 0;
}

When compiled with optimizations, GCC removes the loop test: since signed overflow is undefined, it may assume that i > 0 remains true after i += i, which turns this into an infinite loop.
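
Conceptually (this is an illustration of the licensed transformation, not GCC's literal output), the optimizer is allowed to treat the program as if it were written like this:

#include <iostream>
using namespace std;

int main(){
    int i = 0x10000000;
    int c = 0;
    do{
        c++;
        i += i;             // still executes and wraps on real hardware
        cout << i << endl;
    }while (true);          // the i > 0 test has been folded away

    cout << c << endl;      // never reached
    return 0;
}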


1 Comment

Another example of this happening is here -> stackoverflow.com/questions/7124058/…

You may trigger some hardware safety feature. So no, you don't have any guarantee.

Edit: Note that gcc has the -ftrapv option (but it doesn't seem to work for me).
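
A minimal sketch of what -ftrapv is meant to do (assuming a GCC build where the option actually works; the volatile read is my addition, there to keep the compiler from constant-folding the overflow away):

// compile with: g++ -ftrapv trap.cpp
#include <iostream>
#include <limits>

int main()
{
    volatile int a = std::numeric_limits<int>::max();
    int b = a + 1;            // with -ftrapv, this signed overflow is meant to
                              // call abort() instead of silently wrapping
    std::cout << b << '\n';   // not reached if the trap fires
}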

2 Comments

-ftrapv only does anything if you register a handler for the signal. It won't cause a crash or any other observable behavior by itself.
Instead of -ftrapv, are you thinking of the -fwrapv extension, which handles signed overflow like unsigned overflow?

There are two views about undefined behavior. One view holds that it is there to cater for strange hardware and other special cases, but that code should usually behave sanely. The other holds that anything can happen. And depending on the source of the UB, people hold different opinions.

The UB around overflow was probably introduced to account for hardware that traps or saturates on overflow, and for differences in results between integer representations, so one can argue for the first view in this case. But people writing optimizers hold very dearly to the second view: if the standard doesn't guarantee something, really anything can happen, and they use every bit of that liberty to generate machine code that runs faster, even if the result no longer makes sense.

So when you see undefined behavior, assume that anything can happen, however reasonable a given behavior may seem.

2 Comments

Having integer overflow be Undefined Behavior allows many platforms to make substantial optimizations that would otherwise not be possible. For example, if the only way integers x and y could be negative would be via overflow, a compiler may compute x/y using unsigned arithmetic (which could mean the difference between an instruction and a function call). It's too bad there's no unsigned type where overflow would be UB, since some optimizations would become possible there too. (See the sketch after these comments.)
Continuing that comment, I should mention that most useful optimizations don't require integer overflow to ever have any consequence beyond yielding a nonsense value that may behave strangely (e.g. may appear to be simultaneously greater than zero and less than zero). The C Standard has no way to describe such things other than UB, but a guarantee that computations will have no side effects is generally very cheap to provide and extremely valuable. It's too bad compilers no longer consistently provide it.
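
A related, well-known example of the principle in the first comment above (my own sketch; the function names are hypothetical): because signed overflow is UB, a compiler may assume it never happens and simplify arithmetic accordingly.

int folds(int x)
{
    return (x * 2) / 2;   // may be folded to just x: if x * 2 overflowed,
                          // that would be UB, so it is assumed not to happen
}

unsigned does_not_fold(unsigned x)
{
    return (x * 2u) / 2u; // cannot be folded to x: unsigned wrap-around is
                          // well-defined, e.g. x = 0x80000001u must yield 1u
}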
