
Big-omega, Little-omega, Theta, Big-oh, Little-oh, Uh-oh!

post #1 of 20
Thread Starter 
Anyone here familiar with the big-oh, little-oh, big-omega, little-omega and theta concepts?
post #2 of 20
Yup.

Short answers:

Big-O: an asymptotic upper bound on an algorithm's running time, usually quoted for the worst case. Saying the running time is O(f(n)) means some constant multiple of f(n) always sits above the actual running time once n gets large enough. [Constant factors are ignored because they become meaningless as n heads to infinity.]

Big-Omega: the mirror image of Big-O, an asymptotic lower bound, usually quoted for the best case. Saying the running time is Omega(f(n)) means some constant multiple of f(n) always sits below the actual running time once n gets large enough. [Same deal with constant factors.]

Big-Theta: the algorithm is so nicely behaved that a single function f(n) describes both the upper and lower bounds, up to constant factors. That is, c1*f(n) <= running time <= c2*f(n) once n is large. An algorithm could then be Theta(n), with c2*n as the upper bound and c1*n as the lower bound, using the same f(n) = n throughout.

Little-o is like Big-O but deliberately loose. A tight Big-O bound grows at essentially the same rate as the actual running time as you head out to infinity; a little-o bound grows strictly faster, so the ratio of the actual running time to the bound shrinks to zero. Example: o(n^7) is a valid little-o for a function that actually has linear, O(n), performance.

Little-omega is just the opposite: a bound that grows strictly slower than the actual running time. w(1) [constant time] would be a valid little-omega for that same function, which actually exhibits Omega(n) performance.
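
If it helps to see the ratios move, here is a quick Python sketch (my own toy example, nothing more): for f(n) = 5n, the ratio f(n)/n^7 collapses to zero, which is the little-o relationship, while f(n)/1 blows up, which is the little-omega relationship.

Code:

# Ratio test: f is o(g) when f(n)/g(n) -> 0 as n grows,
# and f is w(g) when f(n)/g(n) -> infinity.
def f(n):
    return 5 * n  # a linear running time, f(n) = 5n

for n in (10, 100, 1000, 10000):
    print(n, f(n) / n**7, f(n) / 1)
# First ratio shrinks toward 0    => f is o(n^7).
# Second ratio grows without bound => f is w(1).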
post #3 of 20
Thread Starter 
OK...so if k >= 1 and c > 1, would the following table be correct about whether A is O, o, big-omega, little-omega, or theta of B?

Column B functions seem to grow faster than A in all cases.

Code:

A          B          O    o    big-omega  little-omega  theta
2n         2^(n/2)    yes  yes  no         no            no
n^(1/2)    n          yes  yes  no         no            no
lg(n!)     lg(n^n)    yes  yes  no         no            no
n^k        c^n        yes  yes  no         no            no
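
To eyeball these I threw together a rough Python check (math.lgamma gives ln(n!) without building the giant factorial; for the last row I just picked k = 2 and c = 2 as examples):

Code:

import math

# Print A(n)/B(n) for each row. A ratio heading to 0 suggests A is o(B);
# one settling near a nonzero constant suggests Theta instead.
rows = [
    ("2n vs 2^(n/2)",     lambda n: 2 * n,              lambda n: 2 ** (n / 2)),
    ("n^(1/2) vs n",      lambda n: n ** 0.5,           lambda n: float(n)),
    ("lg(n!) vs lg(n^n)", lambda n: math.lgamma(n + 1), lambda n: n * math.log(n)),
    ("n^2 vs 2^n",        lambda n: n ** 2,             lambda n: 2.0 ** n),
]
for name, a, b in rows:
    print(name, [round(a(n) / b(n), 6) for n in (10, 50, 100, 500)])

Ratios that head to 0 line up with a "yes" in the o column; anything settling near a nonzero constant would point at theta instead.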
post #4 of 20
For starters, the second line is off. A is an exponential function, one where n is in the exponent; those are only surpassed by the likes of n! and n^n, and once n > 2 it starts growing insanely fast. The second line also differs from your first only by a constant factor of 2 on the linear function, and at the asymptote the constants are always insignificant when determining the order of an algorithm. So the second line really just contains the functions we would use to evaluate the first line.

The rest is not quite applicable, or at least not directly. We don't directly compare the characteristics of equations; we want to compare the (potentially different) equations that describe the operation of an algorithm under differing circumstances. Put another way, we don't look for a description of the order of an equation, we look for equations that describe the repetitive nature of our code.

Sometimes best and worst case are described by constant multiples of one equation, sometimes best and worst cases are different equations. Because of this, if you just hand over a bare function, that function is trivially its own Big-O AND Big-Omega AND Big-Theta, because you can adjust it up or down by a constant factor to sandwich the actual curve. Big-O, Big-Omega, and Big-Theta only become interesting when you are evaluating an algorithm and trying to find equations that describe its performance under best- and worst-case conditions.
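
Insertion sort is the textbook case of best and worst being different equations. A rough sketch, counting comparisons as a stand-in for running time:

Code:

def insertion_sort(a):
    # Sort a in place; return the number of key comparisons made.
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1
            if a[j] > key:
                a[j + 1] = a[j]  # shift the larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return comparisons

n = 200
print(insertion_sort(list(range(n))))         # already sorted: ~n comparisons, the Omega(n) side
print(insertion_sort(list(range(n, 0, -1))))  # reversed: ~n^2/2 comparisons, the O(n^2) side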
post #5 of 20


... what is all of this exactly? programming of some sort? high level math?
post #6 of 20
Thread Starter 
Quote:
Originally posted by Hiro
For starters, the second line is off. A is an exponential function, one where n is in the exponent; those are only surpassed by the likes of n! and n^n, and once n > 2 it starts growing insanely fast. The second line also differs from your first only by a constant factor of 2 on the linear function, and at the asymptote the constants are always insignificant when determining the order of an algorithm. So the second line really just contains the functions we would use to evaluate the first line.

Yeah, just realized my mistake...it should read n^(1/2), the square root of n, not 2^(n/2) as I had first written.

Quote:

The rest is not quite applicable, or at least not directly. We don't directly compare the characteristics of equations; we want to compare the (potentially different) equations that describe the operation of an algorithm under differing circumstances. Put another way, we don't look for a description of the order of an equation, we look for equations that describe the repetitive nature of our code.

Sometimes best and worst case are described by constant multiples of one equation, sometimes best and worst cases are different equations. Because of this, if you just hand over a bare function, that function is trivially its own Big-O AND Big-Omega AND Big-Theta, because you can adjust it up or down by a constant factor to sandwich the actual curve. Big-O, Big-Omega, and Big-Theta only become interesting when you are evaluating an algorithm and trying to find equations that describe its performance under best- and worst-case conditions.
post #7 of 20
Whaa?
post #8 of 20
Thread Starter 
Quote:
Originally posted by Omega
Whaa?

Trust me...I don't know what your big brother and little brother are about either.

I only have a slight understanding of these concepts. I'm still trying to understand a couple of Hiro's explanations.
post #9 of 20
Really, isn't this what they've started adding to margarine and yoghurt?
post #10 of 20
Hey guys, I think I'll stick with addition for a while longer.
post #11 of 20
Quote:
Originally posted by kim kap sol
Trust me...I don't know what your big brother and little brother are about either.

I only have a slight understanding of these concepts. I'm still trying to understand a couple of Hiro's explanations.

OK how about an example diagram?

[diagram: the Big-O curve runs above the algorithm's actual running time, the Big-Omega curve runs below it]
Big-O describes worst-case performance as n approaches infinity; little-o is any bound that grows strictly faster than the actual running time.

Big-Omega describes best-case performance as n approaches infinity; little-omega is any bound that grows strictly slower.

Big-O and Big-Omega may cross for small values of n because the order of the algorithm really describes how the algorithm behaves as n gets large.

When Big-O and Big-Omega have exactly the same shape and differ only by constant coefficients, we drop the coefficients and call that Big-Theta [if the running time is O(c1*f(n)) and Omega(c2*f(n)), then it is Theta(f(n))].
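
A tiny numeric version of that sandwich (the cost function, the constants c1, c2, and the cutoff n0 below are all just hand-picked for the toy case):

Code:

# Pretend an algorithm's cost works out to T(n) = 2n + 10.
# Theta(n) says we can sandwich it: c1*n <= T(n) <= c2*n once n >= n0.
def T(n):
    return 2 * n + 10

c1, c2, n0 = 2, 3, 10  # hand-picked; 3n >= 2n + 10 exactly when n >= 10

assert all(c1 * n <= T(n) <= c2 * n for n in range(n0, 100000))
print("2n + 10 is Theta(n): sandwiched for all n >=", n0)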

This help?
post #12 of 20
Thread Starter 
Quote:
Originally posted by Hiro

When Big-O and Big-Omega have exactly the same shape and differ only by constant coefficients, we drop the coefficients and call that Big-Theta [if the running time is O(c1*f(n)) and Omega(c2*f(n)), then it is Theta(f(n))].

This help?

I think so...and I suppose I'd be wrong about log(n!) and log(n^n) then...

The shape is similar...and scaling log(n^n) by a constant shifts it up or down enough to bound log(n!) from either side...so I suppose log(n!) is theta(log(n^n)) (as well as big-omega).
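
Checking that numerically (c1 = 1/2, c2 = 1, and the cutoff n >= 8 are just constants I found by trial):

Code:

import math

def lg_fact(n):
    return math.lgamma(n + 1) / math.log(2)  # lg(n!) without the huge factorial

def lg_pow(n):
    return n * math.log2(n)                  # lg(n^n) = n * lg(n)

c1, c2, n0 = 0.5, 1.0, 8  # trial-and-error constants
assert all(c1 * lg_pow(n) <= lg_fact(n) <= c2 * lg_pow(n) for n in range(n0, 5000))
print("lg(n!) is Theta(lg(n^n)): sandwiched for all n >=", n0)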
post #13 of 20
Quote:
Originally posted by kim kap sol
I think so...and I suppose I'd be wrong about log(n!) and log(n^n) then...

The shape is similar...and scaling log(n^n) by a constant shifts it up or down enough to bound log(n!) from either side...so I suppose log(n!) is theta(log(n^n)) (as well as big-omega).

Yes. (n^n) grows faster than (n!), so no matter what constant k you apply to make k*(n^n) smaller, someplace on the way to infinity it will cross above any c*(n!). That means n! is O(n^n) but not Theta(n^n). Taking the log of both squishes them enough that they land in the same class after all: lg(n^n) = n*lg(n), and by Stirling's approximation lg(n!) is roughly n*lg(n) - n*lg(e), so lg(n!) is indeed Theta(lg(n^n)). The raw curves still cross the same way and at the same value of n, since taking logs doesn't change where one curve passes the other.
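
To put a number on that crossing (a quick sketch; k and c are arbitrary picks, and I compare in log space so nothing overflows):

Code:

import math

k, c = 1e-3, 1e3  # shrink n^n a thousandfold, inflate n! a thousandfold
n = 2
# Find the first n where k*n^n exceeds c*n!, comparing ln(k*n^n) vs ln(c*n!).
while math.log(k) + n * math.log(n) <= math.log(c) + math.lgamma(n + 1):
    n += 1
print("k*n^n overtakes c*n! at n =", n)  # happens quickly: prints n = 17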
post #14 of 20
theta: angle
little omega: angular velocity, constant time
big omega: angular velocity, discrete time
big O: that website the lady advertises on TV.

Quote:
Originally posted by ShawnJ


... what is all of this exactly? programming of some sort? high level math?

Dumbed-down statistics for code-jockeys.
post #15 of 20
Quote:
Originally posted by Splinemodel

big O: that website the lady advertises on TV.

That's the one I have been waiting for!
post #16 of 20
I can't believe I didn't think of that, but alright.
post #17 of 20
Quote:
Originally Posted by kim kap sol View Post

Anyone here familiar with the big-oh, little-oh, big-omega, little-omega and theta concepts?

what is the difference between big "O" and small "o"?
post #18 of 20
Quote:
Originally Posted by ifra View Post

what is the difference between big "O" and small "o"?

Look up. Second post, the little-o paragraph.
post #19 of 20
Quote:
Originally Posted by Hiro View Post

Look up. Second post, the little-o paragraph.

Whaa?
post #20 of 20
Quote:
Originally Posted by Omega View Post

Whaa?

QFT.
