Big-omega, Little-omega, Theta, Big-oh, Little-oh, Uh-oh!

Posted in AppleOutsider, edited January 2014
Anyone here familiar with the big-oh, little-oh, big-omega, little-omega and theta concepts?

Comments

  • Reply 1 of 19
hiro Posts: 2,663 member
Yup.

Short answers:

Big-O: the asymptotic worst-case performance of an algorithm. The bounding function f(n) is, loosely, the slowest-growing function that still ends up above the algorithm's actual running time. [Constant factors are ignored because they are meaningless as n reaches infinity.]

Big-Omega: the opposite of Big-O, the asymptotic best-case performance of an algorithm. The bounding function f(n) is, loosely, the fastest-growing function that still ends up below the algorithm's actual running time. [Constant factors are ignored for the same reason.]

Big-Theta: the algorithm is so nicely behaved that one function f(n) describes both its upper and lower bounds, each within a constant factor. An algorithm with running time T(n) could then satisfy Theta(n), with T(n) <= c1*n as the upper bound and T(n) >= c2*n as the lower bound, using the same shape n throughout.

Little-o is like Big-O but sloppy. A tight Big-O and the actual performance grow at essentially the same rate as you head out to infinity; little-o is just some function that grows strictly faster than the actual performance. Example: o(n^7) is a valid little-o for a function that actually has linear, O(n), performance.

Little-omega is just the opposite: a lower bound that grows strictly slower. w(1) [constant time] would be a valid little-omega for that same function, which might actually exhibit Big-Omega(n) performance.
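Here's a rough numeric sketch of all five in Python. The cost function t(n) = 3n + 5 is made up purely for illustration, not taken from any real algorithm:

Code:

    def t(n):
        # Hypothetical exact running time, invented for this example.
        return 3 * n + 5

    # Big-O witness: t(n) <= 4n for all n >= 5, so t is O(n).
    # Big-Omega witness: t(n) >= 3n for all n >= 1, so t is Omega(n).
    # Same shape f(n) = n on both sides, so t is Theta(n).
    for n in (5, 100, 10_000):
        assert 3 * n <= t(n) <= 4 * n

    # little-o: n**7 grows strictly faster, so t(n) is o(n**7);
    # the ratio t(n) / n**7 heads to zero.
    # little-omega: t(n) is w(1); t(n) / 1 heads to infinity.
    for n in (10, 100, 1000):
        print(n, t(n) / n**7, t(n))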
  • Reply 2 of 19
kim kap sol Posts: 2,987 member
Ok... so if k >= 1 and c > 1, would the following table be correct about whether A is O, o, big-omega, little-omega, or theta of B?

Column B functions seem to grow faster than A in all cases.
Code:

    A         B         O    o    big-omega  little-omega  theta
    2n        2^(n/2)   yes  yes  no         no            no
    n^(1/2)   n         yes  yes  no         no            no
    lg(n!)    lg(n^n)   yes  yes  no         no            no
    n^k       c^n       yes  yes  no         no            no
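One quick way to sanity-check these guesses is to watch the ratio B(n)/A(n) at a few points. This is a spot check, not a proof, and note how the lg(n!) row behaves differently from the others:

Code:

    import math

    k, c = 2, 3  # arbitrary picks satisfying k >= 1 and c > 1
    pairs = [
        ("2n vs 2^(n/2)",     lambda n: 2 * n,              lambda n: 2 ** (n / 2)),
        ("n^(1/2) vs n",      lambda n: n ** 0.5,           lambda n: float(n)),
        ("lg(n!) vs lg(n^n)", lambda n: math.lgamma(n + 1), lambda n: n * math.log(n)),
        ("n^k vs c^n",        lambda n: float(n ** k),      lambda n: float(c ** n)),
    ]

    # If A is o(B), the ratio B(n)/A(n) should blow up as n grows.
    # For the lg row it drifts toward 1 instead -- a hint that the
    # "no" entries on that line deserve a second look.
    # (lgamma(n + 1) = ln(n!); the log base cancels out in the ratio.)
    for name, A, B in pairs:
        print(name, [round(B(n) / A(n), 3) for n in (10, 50, 100)])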





  • Reply 3 of 19
hiro Posts: 2,663 member
For starters, the second line is off: its A is an exponential function, one where n is in the exponent. Exponential functions are surpassed only by n! and n^n, and once n > 2 they start growing insanely fast. The second line also differs from your first case only by a constant factor of 2 in the linear function, and at the asymptote the constants are always insignificant when determining the order of an algorithm. So the second line really just contains the same functions we would use to evaluate the first line.



The rest is not quite applicable, or at least not directly. We don't compare the characteristics of equations in the abstract; we want to compare the (potentially different) equations that describe the operation of an algorithm under differing circumstances. Put another way, we don't look for a description of the order of an equation, we look for equations that describe the repetitive behavior of our code.



Sometimes best and worst case are described by constant multiples of one equation; sometimes best and worst case are different equations entirely. Because of this, a bare function is trivially its own Big-O AND Big-Omega AND Big-Theta, since you can adjust it up or down by a constant factor to sandwich the actual curve. O, Omega, and Theta only become interesting when you are evaluating an algorithm and trying to find the equations that describe its performance under best-case and worst-case conditions.
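For a concrete case where best and worst case really are different equations, here's a quick sketch (a toy counter, nothing rigorous) counting the comparisons insertion sort makes:

Code:

    def insertion_sort_comparisons(a):
        """Count the comparisons insertion sort makes on list a."""
        a = list(a)
        comparisons = 0
        for i in range(1, len(a)):
            j = i
            while j > 0:
                comparisons += 1              # compare a[j-1] with a[j]
                if a[j - 1] <= a[j]:
                    break                     # element already in place
                a[j - 1], a[j] = a[j], a[j - 1]
                j -= 1
        return comparisons

    # Sorted input: about n comparisons, so best case is Omega(n).
    # Reversed input: n(n-1)/2 comparisons, so worst case is O(n^2).
    for n in (10, 100, 1000):
        best = insertion_sort_comparisons(range(n))
        worst = insertion_sort_comparisons(range(n, 0, -1))
        print(n, best, worst)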
  • Reply 4 of 19
shawnj Posts: 6,656 member
... what is all of this exactly? programming of some sort? high level math?
  • Reply 5 of 19
kim kap sol Posts: 2,987 member
    Quote:

    Originally posted by Hiro

For starters, the second line is off: its A is an exponential function, one where n is in the exponent. Exponential functions are surpassed only by n! and n^n, and once n > 2 they start growing insanely fast. The second line also differs from your first case only by a constant factor of 2 in the linear function, and at the asymptote the constants are always insignificant when determining the order of an algorithm. So the second line really just contains the same functions we would use to evaluate the first line.





Yeah, just realized my mistake... it should read n^(1/2), the square root of n, not 2^(n/2) as I first wrote it down.



    Quote:



The rest is not quite applicable, or at least not directly. We don't compare the characteristics of equations in the abstract; we want to compare the (potentially different) equations that describe the operation of an algorithm under differing circumstances. Put another way, we don't look for a description of the order of an equation, we look for equations that describe the repetitive behavior of our code.



Sometimes best and worst case are described by constant multiples of one equation; sometimes best and worst case are different equations entirely. Because of this, a bare function is trivially its own Big-O AND Big-Omega AND Big-Theta, since you can adjust it up or down by a constant factor to sandwich the actual curve. O, Omega, and Theta only become interesting when you are evaluating an algorithm and trying to find the equations that describe its performance under best-case and worst-case conditions.




  • Reply 6 of 19
omega Posts: 427 member
    Whaa?
  • Reply 7 of 19
kim kap sol Posts: 2,987 member
    Quote:

    Originally posted by Omega

    Whaa?



    Trust me...I don't know what your big brother and little brother are about either.



    I only have a slight understanding of these concepts. I'm still trying to understand a couple of Hiro's explanations.
  • Reply 8 of 19
marcuk Posts: 4,442 member
Really, isn't this what they've started adding to margarine and yoghurt?
  • Reply 9 of 19
placebo Posts: 5,767 member
Hey guys, I think I'll stick with addition for a while longer.
  • Reply 10 of 19
hiro Posts: 2,663 member
    Quote:

    Originally posted by kim kap sol

    Trust me...I don't know what your big brother and little brother are about either.



    I only have a slight understanding of these concepts. I'm still trying to understand a couple of Hiro's explanations.




OK, how about an example diagram?

[diagram: a Big-O curve above and a Big-Omega curve below an algorithm's actual running time]
Big-O describes worst-case performance as n approaches infinity; little-o is any bound that grows strictly faster than the actual performance.



Big-Omega describes best-case performance as n approaches infinity; little-omega is any bound that grows strictly slower.



The Big-O and Big-Omega curves may cross the actual running time for small values of n, because the order of an algorithm really describes how it behaves as n gets large.



When Big-O and Big-Omega have exactly the same shape and differ only by constant coefficients, we drop the coefficients and call that Big-Theta: if T(n) is O(c1*f(n)) and Omega(c2*f(n)), then T(n) is Theta(f(n)).



    This help?
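To see the crossing numerically, take a made-up running time T(n) = n^2 + 10n; the candidate bounds here are invented for illustration:

Code:

    def T(n):                  # hypothetical running time
        return n * n + 10 * n

    for n in (2, 5, 10, 20, 100):
        lower = n * n          # candidate Omega(n^2) bound, c = 1
        upper = 2 * n * n      # candidate O(n^2) bound, c = 2
        print(n, lower <= T(n), T(n) <= upper)
    # The upper-bound check prints False for n < 10 and True from
    # n = 10 on -- the bound only has to hold for sufficiently large n.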
  • Reply 11 of 19
kim kap sol Posts: 2,987 member
    Quote:

    Originally posted by Hiro



When Big-O and Big-Omega have exactly the same shape and differ only by constant coefficients, we drop the coefficients and call that Big-Theta: if T(n) is O(c1*f(n)) and Omega(c2*f(n)), then T(n) is Theta(f(n)).



    This help?




    I think so...and I suppose I'd be wrong about log(n!) and log(n^n) then...



The shape is similar... and a constant multiple of log(n^n) can shift it up or down enough to bound log(n!) from both sides... so I suppose log(n!) is Theta(log(n^n)) (as well as Big-Omega).
  • Reply 12 of 19
hiro Posts: 2,663 member
    Quote:

    Originally posted by kim kap sol

    I think so...and I suppose I'd be wrong about log(n!) and log(n^n) then...



The shape is similar... and a constant multiple of log(n^n) can shift it up or down enough to bound log(n!) from both sides... so I suppose log(n!) is Theta(log(n^n)) (as well as Big-Omega).




Yes. (n^n) grows strictly faster than (n!), so no matter what constant you apply to make (k * n^n) smaller, someplace on the way to infinity it will cross any (c * n!). But taking the log of both changes the picture: by Stirling's approximation, log(n!) is roughly n*log(n) - n, while log(n^n) is exactly n*log(n), so the two logs grow at the same rate, and those constant factors turn into additive terms that vanish as n heads to infinity. That's why log(n!) is Theta(log(n^n)) even though n! is not Theta(n^n).
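A quick numeric check of that (math.lgamma(n + 1) is ln(n!), and the log base cancels in the ratio):

Code:

    import math

    # lg(n!) / lg(n^n) should drift toward 1 if the two are Theta of
    # each other. (It does: roughly 0.66, 0.79, 0.89, 0.93 below.)
    for n in (10, 100, 10_000, 1_000_000):
        print(n, round(math.lgamma(n + 1) / (n * math.log(n)), 4))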
  • Reply 13 of 19
splinemodel Posts: 7,311 member
    theta: angle

little omega: angular velocity, continuous time

    big omega: angular velocity, discrete time

    big O: that website the lady advertises on TV.



    Quote:

    Originally posted by ShawnJ





    ... what is all of this exactly? programming of some sort? high level math?




Dumbed-down statistics for code-jockeys.
  • Reply 14 of 19
hiro Posts: 2,663 member
    Quote:

    Originally posted by Splinemodel



    big O: that website the lady advertises on TV.





    That's the one I have been waiting for!
  • Reply 15 of 19
placebo Posts: 5,767 member
    I can't believe I didn't think of that, but alright.
  • Reply 16 of 19
ifra
Quote:
    Originally Posted by kim kap sol View Post


    Anyone here familiar with the big-oh, little-oh, big-omega, little-omega and theta concepts?



    what is the difference between big "O" and small "o"?
  • Reply 17 of 19
hiro Posts: 2,663 member
    Quote:
    Originally Posted by ifra View Post


    what is the difference between big "O" and small "o"?



    Look up. Second post, next to last paragraph.
  • Reply 18 of 19
omega Posts: 427 member
    Quote:
    Originally Posted by Hiro View Post


    Look up. Second post, next to last paragraph.



    Whaa?
  • Reply 19 of 19
hiro Posts: 2,663 member
    Quote:
    Originally Posted by Omega View Post


    Whaa?



    QFT.


