If we multiply n^2 by a very, very small constant, then big O of both n^2 and (very small constant)*n^2 is n^2 — why and how?
That’s literally how big-O notation is defined: constant factors don’t matter for asymptotic behavior. By definition, f(n) is O(n^2) if f(n) ≤ C·n^2 for some constant C and all sufficiently large n — so c·n^2 qualifies trivially, with C = c. Wikipedia defines the term pretty well: https://en.wikipedia.org/wiki/Big_O_notation
Big O is more about “how much slower the code gets as the input grows” than about how slow it is in absolute terms.
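A quick numeric sketch of that point: no matter what constant you multiply by, doubling the input always quadruples c·n^2, so the *growth rate* is identical. (The function `f` and the constants below are just made up for illustration.)

```python
def f(n, c):
    # f(n) = c * n^2 for some constant factor c
    return c * n * n

# Doubling n multiplies f by 4, regardless of how tiny (or huge) c is:
for c in (1e-9, 1.0, 1e9):
    ratio = f(2_000_000, c) / f(1_000_000, c)
    print(f"c = {c}: f(2n)/f(n) = {ratio}")  # always 4.0
```

The constant only shifts where the curve starts; it never changes the shape, and big O only cares about the shape.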