We have two algorithms for such problems:
For smaller y (assuming y <= k): DP solution, where dp[x][y] represents the required sum a[x] + a[x+y] + a[x+2y] + ....
dp[x][y] = dp[x+y][y] + a[x].
This solution can only be used for smaller y due to memory constraints: the table stores O(n*k) values.
Preprocessing complexity is O(n*k).
Query complexity is O(1).
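The DP preprocessing can be sketched as follows (a minimal sketch with a 0-indexed array; the name `build` and the storage order dp[y][x], chosen so each y is a contiguous row, are my own choices, not from the original problem):

```cpp
#include <vector>
using std::vector;

// dp[y][x] = a[x] + a[x+y] + a[x+2y] + ...  for 1 <= y <= K (a is 0-indexed)
vector<vector<long long>> build(const vector<long long>& a, int K) {
    int n = (int)a.size();
    vector<vector<long long>> dp(K + 1, vector<long long>(n, 0));
    for (int y = 1; y <= K; ++y)
        for (int x = n - 1; x >= 0; --x)
            // the recurrence from the text: dp[x][y] = dp[x+y][y] + a[x]
            dp[y][x] = a[x] + (x + y < n ? dp[y][x + y] : 0);
    return dp;
}
```

Each query with y <= k is then a single table lookup, dp[y][x].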
For larger y (y > k): direct simulation of what is needed, i.e. an iterative approach:
for(i = x; i <= n; i += y) sum += a[i];
Each query costs O(n/y), which is O(n/k) in the worst case.
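The simulation loop can be wrapped in a small function (a sketch with a 0-indexed array; `query_big` is an illustrative name):

```cpp
#include <vector>

// Sum a[x] + a[x+y] + a[x+2y] + ... by direct simulation; O(n/y) per call.
long long query_big(const std::vector<long long>& a, int x, int y) {
    long long s = 0;
    for (int i = x; i < (int)a.size(); i += y)
        s += a[i];
    return s;
}
```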
Worst-case complexity if we use both algorithms with a fixed value of k:
O(n*k) + O(q1) + O(q2*n/k), where q1 is the number of queries with y <= k and q2 the number with y > k; in the worst case this is O(n*k + q*n/k).
The idea is to choose k wisely so that the combined worst-case complexity of both algorithms fits within the time limit.
Ex: for n = q = 300000 and k = 100:
O(300000*100) + O(300000) + O(300000*300000/100) = O(3*10^7 + 3*10^5 + 9*10^8) ≈ O(9*10^8)
For k = 200:
O(300000*200) + O(300000) + O(300000*300000/200) = O(6*10^7 + 3*10^5 + 4.5*10^8) ≈ O(5*10^8)
This should pass within the given time limit.
The total complexity is minimized at k = sqrt(n). [Since n == q in the constraints, the two terms n*k and q*n/k = n^2/k are equal exactly at k = sqrt(n); below that the simulation term dominates, and above it the preprocessing term does.]
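Putting the two halves together with k = sqrt(n) might look like this (a sketch under the assumptions above; the struct name `JumpSum` and the 0-indexing are my own choices):

```cpp
#include <algorithm>
#include <cmath>
#include <utility>
#include <vector>
using std::vector;

struct JumpSum {
    int K;                        // threshold k ~= sqrt(n)
    vector<long long> a;
    vector<vector<long long>> dp; // dp[y][x] for 1 <= y <= K

    explicit JumpSum(vector<long long> v) : a(std::move(v)) {
        int n = (int)a.size();
        K = std::max(1, (int)std::sqrt((double)n));
        dp.assign(K + 1, vector<long long>(n, 0));
        for (int y = 1; y <= K; ++y)
            for (int x = n - 1; x >= 0; --x)
                dp[y][x] = a[x] + (x + y < n ? dp[y][x + y] : 0);
    }

    // Sum of a[x], a[x+y], a[x+2y], ... (0-indexed).
    long long query(int x, int y) const {
        if (y <= K) return dp[y][x];       // small step: O(1) lookup
        long long s = 0;                   // large step: O(n/y) simulation
        for (int i = x; i < (int)a.size(); i += y) s += a[i];
        return s;
    }
};
```

Construction is O(n*sqrt(n)), and each query is O(1) or O(sqrt(n)) depending on which side of the threshold y falls.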