DIVQUERY shows TLE even after using a fastread function with getchar_unlocked, please help

I submitted a problem in the Cook-Off, but it shows time limit exceeded even though I used a fastread function with getchar_unlocked.
Please tell me how I can optimise it further.

Problem: DIVQUERY

Here is my code:

#include <stdio.h>
#include <stdlib.h>   /* for malloc/free */

/* fast integer input using getchar_unlocked */
static inline void fastread(int *a)
{
    register char c = 0;

    /* skip leading whitespace (ASCII 32 and below) */
    while (c < 33)
        c = getchar_unlocked();

    /* accumulate decimal digits until the next whitespace */
    *a = 0;
    while (c > 33) {
        *a = *a * 10 + c - '0';
        c = getchar_unlocked();
    }
}

int main(void)
{
    int t, i, n, p1, p2, k;
    int *a;

    fastread(&n);
    fastread(&t);
    a = malloc(sizeof(int) * n);
    for (i = 0; i < n; i++)
        fastread(&a[i]);

    while (t--) {
        int count = 0;
        fastread(&p1);
        fastread(&p2);
        fastread(&k);
        /* naive scan of the whole range: O(p2 - p1) per query */
        for (i = p1; i <= p2; i++) {
            if (a[i - 1] % k == 0)
                count++;
        }
        printf("%d\n", count);
    }

    free(a);
    return 0;
}

There might be as many as 10^5 queries and 10^5 numbers in the initial array. Your naive algorithm takes O(N) time per query, so O(NQ) time in total, which can be as much as 10^5 * 10^5 = 10^10 operations. That is not going to pass even in 2-3 seconds.

Check out the editorial here for a faster algorithm that solves it in time. :)


Same here… please help.

The time limit given for this problem is 1 sec, but C++ solutions taking up to 9.59 sec are accepted.
Moreover, there is no accepted solution in C. Can't it be done in C? I don't know even a bit of C++.

@nishant_25: The time limit applies to one test file. There are multiple test files, and the time shown is the sum of the times taken on all the test files.

But in all my previous submissions to practice problems, I always got a time less than the given time limit.

@nishant_25 As vineet said, the time limit applies to one test file, and the time that is shown is the sum of the times taken over all the test files. You did pass all your previous submissions within the time limit, but in those problems either the constraints were very low or an O(1) or O(n) solution existed; that is why even the sum of the times over all files stayed below the limit. The limit itself, however, applies to each single test file.

And remember: a test file can contain anywhere from 1 to 100000 test cases and can take from 0.00 to 1.00 seconds.

PS: Read editorials for better algorithms.


Got it, thanks a lot… :)

But that 9.59 s is the sum of the times over all input files. The 1 s limit is for one file.

Could I have a C solution to this problem, since I don't know even a little C++? Or can anyone tell me how to optimise my approach above so it doesn't get TLE? I am unable to understand the editorial, and the solution given there is also in C++.

You cannot optimise a slow algorithm using just language features. If there is something unclear in the editorial, you can always ask. ;)

Can you tell me the algorithm for this statement in the DIVQUERY editorial, or point me to any resource where I can learn it?

"Given a single integer a[i], it would be easy to tell which k's a[i] is divisible by (by factorizing a[i])."

Because your sum was coincidentally less than that…

Hi @nishant_25

Editorials are made for every problem, and it is good practice to ask questions there, because all doubts related to one problem are then listed and discussed in one place. Imagine 100 people who get TLE each posting a separate question.

Any doubts about editorials should be posted there.

Coming to your doubt:

a[i] is divisible by k <=> k is a factor of a[i]. For example, for a[i] = 12 the valid k's are exactly 1, 2, 3, 4, 6 and 12.

So you need to find all factors of a[i]. All factors of a number N can be found in O(sqrt(N)) time, because factors pair up as (i, N/i) and one member of each pair is at most sqrt(N).


for (int i = 1; i * i <= n; i++) {
    if (n % i == 0) {
        // i is a factor of n
        if (i != n / i) {
            // n / i is also a factor of n
        }
    }
}
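
To actually store the factors, which is what you will need here, the same loop can be wrapped in a small compilable function. This is just my own illustration (the name collect_factors and the bound 128 are my choices, not from the editorial; 128 works because no number up to 10^5 has more than 128 factors):

#include <stdio.h>

/* collect every factor of n into out[] (unsorted) and return how many */
int collect_factors(int n, int out[])
{
    int m = 0;
    for (int i = 1; i * i <= n; i++) {
        if (n % i == 0) {
            out[m++] = i;          /* i divides n */
            if (i != n / i)
                out[m++] = n / i;  /* so does its partner n / i */
        }
    }
    return m;
}

int main(void)
{
    int f[128];
    int m = collect_factors(36, f);
    for (int j = 0; j < m; j++)
        printf("%d ", f[j]);       /* prints: 1 36 2 18 3 12 4 9 6 */
    printf("\n");
    return 0;
}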

According to this explanation, you mean to say that first I should store all the factors of a[i] in an array, and then look whether k is in that array or not; if it is, then k is a factor of a[i], else not?
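
Exactly. But note that checking each a[i] in the range against k one by one is still O(N) per query, so on its own that does not beat your original loop. The editorial's trick, as I read it, is to go offline: sweep the array once from left to right, fold each a[i]'s factors into a count table cnt[] (cnt[d] = how many elements seen so far are divisible by d), and answer each query (l, r, k) as cnt[k] after prefix r minus cnt[k] after prefix l-1. Here is a rough self-contained C sketch of that idea, assuming N, Q, the a[i]'s and the k's are all at most 10^5; the variable names and the event bucketing are my own illustration, not the editorial's code:

#include <stdio.h>
#include <stdlib.h>

#define MAXV 100001   /* assumed upper bound on the a[i]'s and the k's */

static int cnt[MAXV]; /* cnt[d] = how many of a[1..p] are divisible by d */

int main(void)
{
    int n, q;
    if (scanf("%d %d", &n, &q) != 2)
        return 0;

    int *a = malloc((n + 1) * sizeof *a);
    for (int i = 1; i <= n; i++)
        scanf("%d", &a[i]);

    /* each query is bucketed twice: at prefix l-1 (to subtract cnt[k])
       and at prefix r (to add cnt[k]), via simple linked lists */
    int *qk    = malloc(q * sizeof *qk);
    int *ans   = calloc(q, sizeof *ans);
    int *headL = malloc((n + 1) * sizeof *headL);
    int *headR = malloc((n + 1) * sizeof *headR);
    int *nextL = malloc(q * sizeof *nextL);
    int *nextR = malloc(q * sizeof *nextR);
    for (int p = 0; p <= n; p++)
        headL[p] = headR[p] = -1;

    for (int j = 0; j < q; j++) {
        int l, r;
        scanf("%d %d %d", &l, &r, &qk[j]);
        nextL[j] = headL[l - 1]; headL[l - 1] = j;
        nextR[j] = headR[r];     headR[r]     = j;
    }

    /* sweep prefixes p = 0, 1, ..., n */
    for (int p = 0; p <= n; p++) {
        if (p > 0) {
            /* fold all factors of a[p] into cnt[], O(sqrt(a[p])) */
            for (int d = 1; d * d <= a[p]; d++) {
                if (a[p] % d == 0) {
                    cnt[d]++;
                    if (d != a[p] / d)
                        cnt[a[p] / d]++;
                }
            }
        }
        for (int j = headL[p]; j != -1; j = nextL[j])
            ans[j] -= cnt[qk[j]];   /* cnt[k] at prefix l-1 */
        for (int j = headR[p]; j != -1; j = nextR[j])
            ans[j] += cnt[qk[j]];   /* cnt[k] at prefix r */
    }

    for (int j = 0; j < q; j++)
        printf("%d\n", ans[j]);
    return 0;
}

The total work is roughly O(N * sqrt(max a[i]) + N + Q), which should fit the limit comfortably.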

Thanks a lot to all those who answered my queries in this thread. Because of this thread, I learned a whole lot of new things.