
# Big-omega, Little-omega, Theta, Big-oh, Little-oh, Uh-oh!

Anyone here familiar with the big-oh, little-oh, big-omega, little-omega and theta concepts?
Yup.

Big-O: the asymptotic worst-case performance of an algorithm. The function g(n) in O(g(n)) is the slowest-growing function that, up to a constant factor, will always have a higher value than the actual running time of the algorithm. [Constant factors are ignored because they are meaningless as n approaches infinity.]

Big-Omega: the opposite of Big-O. The asymptotic best-case performance of an algorithm. The function g(n) in Omega(g(n)) is the fastest-growing function that, up to a constant factor, will always have a lower value than the actual running time of the algorithm. [Constant factors are ignored for the same reason.]

Big-Theta: the algorithm is so nicely behaved that a single function g(n) describes both its upper and lower bounds, within constant factors. An algorithm could then have something like this: BigTheta(g(n)), with O(c1*g(n)) as the upper bound and BigOmega(c2*g(n)) as the lower bound, where g(n) is the same function throughout.

Little-o is like Big-O but sloppy. A tight Big-O bound and the actual algorithm performance grow at essentially the same rate as you head out to infinity; little-o is just some function that grows strictly faster than the actual performance. Example: o(n^7) is a valid little-o for a function that actually has linear, O(n), performance.

Little-omega is just the opposite: some function that grows strictly slower. w(1) [constant time] would be a valid little-omega for that same function, which might actually exhibit BigOmega(n) performance.
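Those definitions can be sanity-checked numerically. A minimal sketch (Python; the sample functions are made up for illustration): the ratio f(n)/g(n) stays bounded for Big-O, heads to 0 for little-o, and heads to infinity for little-omega.

```python
# Sample the ratio f(n)/g(n) at growing n to illustrate the definitions.
def ratio(f, g, ns):
    """f(n)/g(n) sampled at each n in ns."""
    return [f(n) / g(n) for n in ns]

ns = [2 ** k for k in range(4, 16)]

# f(n) = 5n is O(n): the ratio stays below a fixed constant (here, 5).
big_o = ratio(lambda n: 5 * n, lambda n: n, ns)

# f(n) = n is o(n^7): the ratio heads to 0 as n grows.
little_o = ratio(lambda n: n, lambda n: n ** 7, ns)

# f(n) = n is little-omega(1): the ratio heads to infinity.
little_w = ratio(lambda n: n, lambda n: 1, ns)

print(big_o[-1], little_o[-1], little_w[-1])
```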
.
.
Ok... so if k >= 1 and c > 1, would the following table be correct about whether A is O, o, big-omega, little-omega, or theta of B?

Column B functions seem to grow faster than A in all cases.

Code:
`A        B        O    o    big-omega  little-omega  theta
2n       2^(n/2)  yes  yes  no         no            no
n^(1/2)  n        yes  yes  no         no            no
lg(n!)   lg(n^n)  yes  yes  no         no            no
n^k      c^n      yes  yes  no         no            no`
For starters, the second line is off. A is an exponential function there, one where n is in the exponent. Exponentials are only surpassed by n! and n^n, and once n > 2 they begin growing insanely fast. The second line also differs from your first case only by a constant 2 in the linear case, and at the asymptote the constants are always insignificant when determining the order of an algorithm. So the second line really just contains the functions we would use to evaluate the first line.

The rest is not quite applicable, at least not directly. We don't directly compare the characteristics of equations; we want to compare the (potentially) different equations that describe the operation of an algorithm under differing circumstances. Put another way, we don't look for a description of the order of an equation, we look for an equation (or equations) to describe the repetitive nature of our code.

Sometimes best and worst case are described by a constant multiple of one equation, sometimes best and worst cases are different equations. Because of this, if you provide a bare function, the function is its own O AND Omega AND Theta, because you can adjust it up or down by a constant factor to bracket the actual function curve. O(n), Omega(n) and Theta(n) only become interesting when you are evaluating an algorithm and attempting to find equations that describe its performance under best- and worst-case conditions.
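That last point can be sketched in a couple of lines (Python, with a made-up f): scaling f by constants on either side of 1 trivially brackets f itself.

```python
# Sketch of the point above (f is a made-up example): any function
# brackets itself with constants c1 <= 1 <= c2, so it is trivially
# its own Big-O, Big-Omega, and Theta.

def f(n):
    return 3 * n + 5

c1, c2 = 0.5, 2.0  # any constants with c1 <= 1 <= c2 work

ok = all(c1 * f(n) <= f(n) <= c2 * f(n) for n in range(1, 1000))
print(ok)
```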
.
.

... what is all of this exactly? programming of some sort? high level math?
Quote:
Originally posted by Hiro
For starters, the second line is off. A is an exponential function there, one where n is in the exponent. Exponentials are only surpassed by n! and n^n, and once n > 2 they begin growing insanely fast. The second line also differs from your first case only by a constant 2 in the linear case, and at the asymptote the constants are always insignificant when determining the order of an algorithm. So the second line really just contains the functions we would use to evaluate the first line.

Yeah, just realized my mistake... it should read n^(1/2), the square root of n, not 2^(n/2) as I first wrote down.

Quote:

The rest is not quite applicable, at least not directly. We don't directly compare the characteristics of equations; we want to compare the (potentially) different equations that describe the operation of an algorithm under differing circumstances. Put another way, we don't look for a description of the order of an equation, we look for an equation (or equations) to describe the repetitive nature of our code.

Sometimes best and worst case are described by a constant multiple of one equation, sometimes best and worst cases are different equations. Because of this, if you provide a bare function, the function is its own O AND Omega AND Theta, because you can adjust it up or down by a constant factor to bracket the actual function curve. O(n), Omega(n) and Theta(n) only become interesting when you are evaluating an algorithm and attempting to find equations that describe its performance under best- and worst-case conditions.
Whaa?
The devils that drive us do not discriminate
Quote:
Originally posted by Omega
Whaa?

Trust me...I don't know what your big brother and little brother are about either.

I only have a slight understanding of these concepts. I'm still trying to understand a couple of Hiro's explanations.
Really, isn't this what they've started adding to margarine and yoghurt?
Hey guys, I think I'll stick with addition for a while longer.
Quote:
Originally posted by kim kap sol
Trust me...I don't know what your big brother and little brother are about either.

I only have a slight understanding of these concepts. I'm still trying to understand a couple of Hiro's explanations.

OK how about an example diagram?

Big-O describes worst-case performance as n approaches infinity; little-o is any function that grows strictly faster.

Big-Omega describes best-case performance as n approaches infinity; little-omega is any function that grows strictly slower.

Big-O and Big-Omega may cross for small values of n, because the order of the algorithm really describes how the algorithm behaves as n gets large.

When Big-O and Big-Omega have exactly the same shape and only shift up/down by some constant coefficient, we drop the coefficient from the equation and call that Big-Theta [ O( c1*f(n) ) && Omega( c2*f(n) ) then Theta( f(n) ) ].
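A quick sketch of that sandwich rule (Python; f and the constants are made up for illustration): f(n) = 2n^2 + 3n is bounded below by 2*n^2 and above by 5*n^2 for n >= 1, so f is Theta(n^2).

```python
# Sketch of the sandwich above (made-up f and constants):
# f(n) = 2n^2 + 3n satisfies 2*n^2 <= f(n) <= 5*n^2 for all n >= 1,
# so f(n) is Theta(n^2) with c2 = 2, c1 = 5, n0 = 1.

def f(n):
    return 2 * n ** 2 + 3 * n

c1, c2, n0 = 5, 2, 1

sandwiched = all(c2 * n ** 2 <= f(n) <= c1 * n ** 2
                 for n in range(n0, 10_000))
print(sandwiched)
```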

This help?
.
.
Quote:
Originally posted by Hiro

When Big-O and Big-Omega have exactly the same shape and only shift up/down by some constant coefficient, we drop the coefficient from the equation and call that Big-Theta [ O( c1*f(n) ) && Omega( c2*f(n) ) then Theta( f(n) ) ].

This help?

I think so...and I suppose I'd be wrong about log(n!) and log(n^n) then...

The shape is similar... and a constant multiple of log(n^n) can and does shift it up and down to bound log(n!)... so I suppose log(n!) is theta(log(n^n)) (as well as big-omega).
Quote:
Originally posted by kim kap sol
I think so...and I suppose I'd be wrong about log(n!) and log(n^n) then...

The shape is similar... and a constant multiple of log(n^n) can and does shift it up and down to bound log(n!)... so I suppose log(n!) is theta(log(n^n)) (as well as big-omega).

Yes. (n^n) grows faster than (n!), so no matter what constant you apply to make (k * n^n) smaller, someplace on the way to infinity it will cross above any (c * n!). Taking the log of both just squishes them somewhat: by Stirling's approximation lg(n!) is roughly n*lg(n) - n*lg(e), which stays within a constant factor of lg(n^n) = n*lg(n), so the theta relationship holds.
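That squishing can be checked numerically. A sketch (Python, using math.lgamma for ln(n!)): the ratio lg(n!)/lg(n^n) climbs toward 1, consistent with lg(n!) being Theta(lg(n^n)).

```python
import math

# math.lgamma(n + 1) gives ln(n!), so this computes ln(n!) / ln(n^n).
def log_ratio(n):
    return math.lgamma(n + 1) / (n * math.log(n))

print(log_ratio(10))       # roughly 0.66
print(log_ratio(10 ** 6))  # about 0.93, still climbing toward 1
```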
.
.
theta: angle
little omega: angular velocity, constant time
big omega: angular velocity, discrete time

Quote:
Originally posted by ShawnJ

... what is all of this exactly? programming of some sort? high level math?

Dummied-down statistics for code-jockeys.
Cat: the other white meat
Quote:
Originally posted by Splinemodel

That's the one I have been waiting for!
.
.
I can't believe I didn't think of that, but alright.
Quote:
Originally Posted by kim kap sol

Anyone here familiar with the big-oh, little-oh, big-omega, little-omega and theta concepts?

what is the difference between big "O" and small "o"?
IFRA
Quote:
Originally Posted by ifra

what is the difference between big "O" and small "o"?

Look up. Second post, next to last paragraph.
.
.
Quote:
Originally Posted by Hiro

Look up. Second post, next to last paragraph.

Whaa?
The devils that drive us do not discriminate
Quote:
Originally Posted by Omega

Whaa?

QFT.

.
.