Need help with Computer Science

Posted in General Discussion, edited January 2014
Hi all CS experts out there!



I need a little help with computer science:



Are there any resources out there that explain CS at an easily comprehensible level? The "O"-notation describing an algorithm's efficiency, for example.

Comments

  • Reply 1 of 6
try google. that damned O thing bugged the hell out of me.
  • Reply 2 of 6
    wmf Posts: 1,164 member
    Depends on what you mean by "easily comprehensible"; undergraduate CS textbooks/courses have already made it as easy as it's going to get.



    Check out MIT's online CS courses, especially 6.046J:

    http://ocw.mit.edu/OcwWeb/Electrical...ence/index.htm

    http://ocw.mit.edu/OcwWeb/Electrical...Home/index.htm



    And the standard algorithms textbook:

    http://isbn.nu/0070131430/
  • Reply 3 of 6
    Quote:

    Originally posted by RolandG

    Hi all CS experts out there!



    I need a little help with computer science:



    Are there any resources out there that explain CS at an easily comprehensible level? The "O"-notation describing an algorithm's efficiency, for example.




    I had problems with that at first, but one day it just clicked. Trust me, it will click. And once it does, you'll think to yourself, "Why did I ever think that was hard?"





    This is O(n):

    Code:


    int i = 0;
    for (i = 0; i < n; i++) {
        printf("loop number = %d\n", i);
    }







    This is O(n*n):

    Code:


    int i = 0, j = 0;
    for (i = 0; i < n; i++) {
        for (j = 0; j < i; j++) {
            printf("loop number = %d, %d\n", i, j);
        }
    }







    This is O(1):

    Code:




    int i = 0;
    printf("%d\n", i);









    You'll get the hang of it. As I said, once it clicks, you'll simply skim through a piece of code and find its order of complexity straight away. m.
  • Reply 4 of 6
    yevgeny Posts: 1,148 member
    The big white algorithms book (called "Introduction to Algorithms") by Cormen, Leiserson, and Rivest explains O-notation well, and is the standard bill of fare for algorithms books. Best of all, you can usually buy it at a Borders bookstore or any other bookstore with a decent CS section. It obviously covers more than O-notation.
  • Reply 5 of 6
    rolandg Posts: 632 member
    Quote:

    Originally posted by Yevgeny

    The big white algorithms book (called "Introduction to Algorithms") by Cormen, Leiserson, and Rivest explains O-notation well, and is the standard bill of fare for algorithms books. Best of all, you can usually buy it at a Borders bookstore or any other bookstore with a decent CS section. It obviously covers more than O-notation.



    I already bought that book, but found it rather annoying that you cannot get the solutions for the exercises...



    My problem is the following: it took me some time to get to studying CS, after what would be equivalent to the first three or four years of college (minus CS classes, obviously), and now I sometimes have a hard time understanding the rather terse mathematical notation used by our lecturer and throughout CS.



    By easily comprehensible I meant that the math is thoroughly explained and that one can check one's exercises against the "real" solutions.
  • Reply 6 of 6
    yevgeny Posts: 1,148 member
    Quote:

    Originally posted by RolandG

    I already bought that book, but found it rather annoying that you cannot get the solutions for the exercises...



    Proof left as an exercise to the reader.



    Quote:

    My problem is the following: it took me some time to get to studying CS, after what would be equivalent to the first three or four years of college (minus CS classes, obviously), and now I sometimes have a hard time understanding the rather terse mathematical notation used by our lecturer and throughout CS.



    O-notation is basically the idea that there is a math function that describes the worst-case running time. This function is, well, a function of n, the number of things you are working on. To sequentially search through a list means that I must look at n things, so we say that the running time is O(n). To figure out the O-notation, you need to ask yourself two questions: "how many things does this piece of code look at?" and "how many times is that piece of code called?"
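

    For example, a plain sequential search over an array of n ints might look like this (just a rough sketch; the array a and its length n are assumed to be given):

    Code:

    /* Look at each of the n elements until we find the target.
       In the worst case we examine all n of them, so this is O(n). */
    int linear_search(const int *a, int n, int target) {
        int i;
        for (i = 0; i < n; i++) {
            if (a[i] == target) {
                return i;   /* found at index i */
            }
        }
        return -1;          /* not found after n comparisons */
    }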



    For example, let's say that you have an object that holds a list of integers. You are given all of these integers up front and you store them in your object (inside its linked list). Then you want to query the object to find out whether it holds a particular number. There are n numbers in the object, and finding a particular one requires a search proportional to the number of items in the list. Searching the linked list takes O(n) time because, in the worst case, you have to look at all n items to find the one you want. We now know that the answer to the first question is that the object's lookup runs in O(n) time.
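

    A rough sketch of that kind of object's lookup in C (the struct and function names here are made up just for illustration):

    Code:

    /* A singly linked list node holding one integer. */
    struct node {
        int value;
        struct node *next;
    };

    /* Returns 1 if the list contains 'target', 0 otherwise.
       In the worst case we walk all n nodes, so this is O(n). */
    int list_contains(const struct node *head, int target) {
        const struct node *cur;
        for (cur = head; cur != NULL; cur = cur->next) {
            if (cur->value == target) {
                return 1;
            }
        }
        return 0;
    }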



    The second question is: how will we call this object? If we call it in a loop, asking whether it holds each of many numbers, then we are calling it m times. So the code that calls the object runs in O(m) time, because it makes m calls to the object that holds the numbers. Now the overall running time can be considered O(m) * O(n), meaning that we run in O(m*n) time. If m is about the same as n (a reasonable assumption), then we have O(n^2) time, which is very, very slow for any large value of n. If n is large, then you would have to do about n*n comparisons to find all the numbers you wanted from your object.
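

    The caller might look something like this (reusing the made-up list_contains from the sketch above; queries is an array holding the m numbers we want to look up):

    Code:

    /* m calls, each one O(n) in the worst case, so the loop as a whole
       is O(m*n).  If m is roughly n, that's O(n^2). */
    int count_found(const struct node *head, const int *queries, int m) {
        int i, found = 0;
        for (i = 0; i < m; i++) {
            if (list_contains(head, queries[i])) {
                found++;
            }
        }
        return found;
    }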



    This slow running time would be a good reason to consider a different implementation of your object. For example, you could replace the linked list with a sorted array and binary search it for the number. Binary search runs in O(log(n)) time. So the caller runs in O(n) time and the object runs in O(log(n)) time; multiplying the two together gives you O(n*log(n)) time, which is MUCH faster than O(n^2) time. Better still, you might be able to use a decent hash table or a sparse array inside the object, which runs in O(1) time, so the overall time would be O(n) * O(1) = O(n).
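

    A sketch of the binary search version (assuming the numbers have been copied into a sorted array a of length n):

    Code:

    /* Each comparison halves the remaining range, so we do about
       log2(n) comparisons -- O(log(n)). */
    int sorted_contains(const int *a, int n, int target) {
        int lo = 0, hi = n - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;
            if (a[mid] == target) {
                return 1;
            } else if (a[mid] < target) {
                lo = mid + 1;
            } else {
                hi = mid - 1;
            }
        }
        return 0;
    }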



    Quote:

    By easily comprehensible I meant that the math is thoroughly explained and that one can check one's exercises against the "real" solutions.



    How many times is a piece of code run? Multiply that by how many operations the code performs each time it runs. This gives you the O-notation.
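

    As a tiny worked example of that counting (nothing fancy, just illustration):

    Code:

    /* The loop body does a constant amount of work and runs n times,
       so the total is roughly c*n operations for some constant c.
       The constant gets dropped, leaving O(n). */
    int sum_array(const int *a, int n) {
        int i, sum = 0;
        for (i = 0; i < n; i++) {
            sum += a[i];
        }
        return sum;
    }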





    Several words of advice from a real-world programmer:



    1. O-notation overlooks constants. "How much can a constant hurt when you are dealing with large values of n?" Well, you would be surprised! O-notation doesn't consider the costs of allocating memory, freeing memory, hitting the database, etc. Constants do matter (there is a small sketch of this at the end of this post).



    2. Not all code must be algorithmically efficient. Some code can run in O(n^2) time so long as you know that n will be small.



    3. More algorithmically efficient code takes more time to write. Choose your targets wisely.



    4. Most programmers do not live in the land of large n, but it is good to think about how your code deals with large values of n. I am definitely a programmer who lives in the land of large values of n. Here's a sample conversation between me and the head of software development at my company:



    Head of software development (HSD): What size of database can you manipulate and transform with this operation?



    Me: So far we have tried databases that are about 1.5 million records, about two gigabytes.



    HSD: How well does it scale to forty to sixty million records (the entire road database for the continental US)?



    Me: It will top out at 10 million rows due to lack of virtual memory on a 32-bit architecture.



    HSD: You need to redesign your logic for larger databases.



    Me: (in my mind) it sucks to live in the land of large n.
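

    To illustrate point 1 above about constants: both of the functions below are O(n), but the second pays a malloc/free on every iteration and will be noticeably slower in practice (a contrived sketch just to make the point; error checking omitted):

    Code:

    #include <stdlib.h>

    /* O(n) with a tiny constant: one addition per element. */
    long sum_fast(const int *a, int n) {
        long sum = 0;
        int i;
        for (i = 0; i < n; i++) {
            sum += a[i];
        }
        return sum;
    }

    /* Still O(n), but every iteration allocates and frees a buffer.
       Same big-O, much larger constant factor. */
    long sum_slow(const int *a, int n) {
        long sum = 0;
        int i;
        for (i = 0; i < n; i++) {
            int *tmp = malloc(sizeof(int));
            *tmp = a[i];
            sum += *tmp;
            free(tmp);
        }
        return sum;
    }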