Well, I do face this problem sometimes. When I declare a variable as int, my solution is well within the time limit, but if I change the data type to long long int it gives me a TLE! How does changing the data type affect the time complexity of my program? I faced this issue in this Lunchtime too.
http://www.codechef.com/LTIME24/problems/MDIST/
There is nothing magical behind it. I'll try to explain it in a simple way (while making some simplifications, so it is not a precise model, but it should give you an idea). You have a 32-bit machine, and an int is 32 bits, so it fits in one 32-bit "cell". Types like short, char, etc. also fit, so they have approximately the same speed. A long long is 64 bits; it does not fit within those 32 bits, so you have to use two "cells" for a single variable. Every operation on your variable turns into two simple operations on these "cells", so it is natural to expect your code to become about two times slower. The complexity of your code didn't change asymptotically, but you now have a higher hidden constant.
Thank you very much, lebron
Try This
```c++
#include <bits/stdc++.h>
#define FOR(i,a,b) for(int i=a;i<b;i++)
#define FORD(i,a,b) for(int i=a;i>=b;i--)
#define REP(i,n) FOR(i,0,n)
#define PB push_back
#define ITER(i,a) for( typeof(a.begin()) i=a.begin();i!=a.end();i++)
#define MAXN 1000010
#define MP make_pair
#define INF mod
using namespace std;

int main()
{
    int n,i,x,q,y,L,R,j;
    long long maxx;
    char c;
    cin>>n;
    pair<int,int> arr[n+10];
    for(i=0;i<n;i++)
    {
        cin>>x>>y;
        arr[i]=make_pair(x,y);
    }
    cin>>q;
    while(q--)
    {
        cin>>c;
        if(c=='Q')
        {
            cin>>L>>R;
            maxx=-1;
            for(i=L;i<=R;i++)
            {
                for(j=i+1;j<=R;j++)
                {
                    //cout<<abs(arr[i].first-arr[j].first)+abs(arr[i].second-arr[j].second)<<endl;
                    if((0LL+abs(arr[i].first-arr[j].first)+abs(arr[i].second-arr[j].second))>maxx)
                        maxx = 0LL+abs(arr[i].first-arr[j].first)+abs(arr[i].second-arr[j].second);
                }
            }
            if(L==R)
                cout<<0<<endl;
            else
                cout<<maxx<<endl;
        }
        else
        {
            cin>>i>>x>>y;
            arr[i]=make_pair(x,y);
        }
    }
}
```
long is 32 bits here, which definitely doesn't look like a 64-bit compiler.
In the 64-bit case (an LP64 data model, typical on 64-bit Linux), both long and long long give us 64 bits by default, and both have the same speed; here, long is faster than long long but gives only 32 bits.
Please provide both of your submitted solutions where the difference occurs because of the long long data type.
Yes! I think it will not affect the CodeChef TLE, as @lebron said; its effect would be minimal and can be neglected!