The issue is somehow related to the usage of the printf call in the presence of the <iostream> header (part of the bits/stdc++.h include).
For example, if you add fflush(stdout); at the end of your main function, you will get a correct solution regardless of whether you include bits/stdc++.h, <iostream>, or <stdio.h>.
The overall recommendation would be to use cin and cout in C++ programs, since cin and cout perform type-safe I/O, whereas scanf and printf rely on format strings. scanf and printf (along with stdio.h) are remnants of the early days of C programming. Unless you are pursuing embedded programming as a career, it makes sense to stick with C++ paradigms. Furthermore, if you consider competitive programming a preparation for job hunting and interviews, the cin/cout approach definitely beats scanf/printf. Just compare:
scanf("%lld", &a[i]);
printf("%lld\n", ans);
vs
cin >> a[i];
cout << ans << "\n";
If you need faster IO with cin and cout, you can add

ios_base::sync_with_stdio(false);
cin.tie(NULL);

at the beginning of your main function. The relative performance of reading data is as follows (the measurements were done on HackerRank servers, but CodeChef servers are similar):
Reading 4,000,000 positive integer values, each less than 10^7:
1.55 sec - simple cin
0.51 sec - scanf
0.41 sec - cin with sync_with_stdio(false) and cin.tie(NULL)
There are ways to reduce it even further (to sub 0.1 sec levels) but it becomes somewhat esoteric in implementation.
Still, C-style I/O is a legitimate part of C++, and it is not supposed to misbehave. And the mere presence of a header changing program behavior does not seem right.
Unfortunately, we cannot debug the issue to the full extent. It seems to be related to the output buffer, which is why some fiddling with flushing helps. For a complete analysis we would also need to know how the CodeChef judge is organized with regard to input/output streams.
I was not able to replicate the situation on my local environment - gcc (Ubuntu 5.4.0-6ubuntu1~16.04.10) 5.4.0 20160609. The problem is only reproducible when running on CodeChef judge.
What's observable is that the CodeChef judge is definitely more robust with cin/cout IO.
Although fflush(stdout) does seem to make it work, I would not say for certain that buffering is the problem, since even after disabling buffering with setbuf(stdout, NULL) the behavior does not change (it works with fflush but not without).
Looks like including the header causes initialization of the standard C++ streams and their synchronization with the standard C streams. Adding ios_base::sync_with_stdio(false) corrects the issue.
Seeee, it turned out to be undefined behavior after all XD.
@l_returns - With undefined behavior, you really cannot say. Even adding an int aabcdef=1; might have worked and given AC for that case. Perhaps it all depends on what value was in the memory accessed out of bounds? I guess that part had a good value.
But it should be undefined forever, shouldn't it… like a random function… if it really depends on the memory value…
You predicted it correctly… @vijju123
Why is it defined for one library and not defined for the other one?
It's not so. The thing is, anything can happen with undefined behavior. Even with the same library, things like print statements, new variables, etc. can change stuff. It all depends on what the compiler does with your code while compiling it. It applies some optimizations, does some reordering, etc., to speed up execution.
@l_returns Didn't really use anything fancy. Simply went through the code and added some asserts to check the indexing (something I'm very fond of doing even in my own code). At first I thought that maybe there was a problem with the factorial, but none of those asserts triggered, ruling out any obvious problems with it. Then I saw the strangely written while-loop, and it was pretty clear what the problem was.