D-Query Segmentation Fault Bug

I tried solving D-Query (http://www.spoj.com/problems/DQUERY/) using Mo's algorithm, but my code keeps giving a segmentation fault. I have been trying for the past hour and still can't debug it. I would appreciate it if someone could help. Thanks. Here is my code:
http://ideone.com/o9qLUW

One observation I made:

`query[i].se` is of type `ll`,

but you are indexing `ans1[query[i].se]`. If `query[i].se` can range over the full `ll` range, that index won't be accessible, since `ans1`'s indices only go up to 200004 (as you declared initially).

Try this and let me know.

Sadly that doesn't work; the number of queries never goes into the `ll` range to begin with.

Okay, thanks for checking; meanwhile I'll look at other factors.

Suppose the first query is `1 3`. Initially your `curL = 0`, and you access `cnt[a[curL]]`, but you are reading the input as 1-based, so `a[0]` would hold some garbage value. You declared the `cnt` array with size 1000005; if `a[0]` is greater than 1000005, you go out of bounds, which results in a segmentation fault.

BTW, here is my accepted solution; if you have any doubts, feel free to comment: https://pastebin.com/mRqp9ha1

It still gives a segmentation fault after I initialized `a[0]` to 0. Also, since I declared it globally, it should already have been 0 to begin with.

Try comparing your code with mine.

1. C++ requires the compare function passed to `sort` to be a strict weak ordering (in particular, irreflexive and transitive) for `sort` to work correctly (see the `std::sort` reference). Try replacing `<=` with `<` in your second code block; this will remove the seg fault.
2. Though that removes the segfault, your solution will TLE because your compare function is expensive. For example, you are calculating `sqrt(n)` on every comparison, which adds an extra Θ(log N) factor to the sort. Try caching that into a variable.
3. Lastly, try passing `const pair<ii, ll>&` instead, so that you don't pass the big pairs by value during the sort. There are Stack Overflow answers explaining why constant references are faster here.