Well, it may not need a gig, but it is a reasonable investment in a new computer based on price points. One should go for as much memory as is reasonable, cost-wise, on the platform.
As to what Linux needs, I do understand its needs. For the longest time I ran on 128 MB, at least up until kernel issues and Red Hat bloat forced me to upgrade. One should not underestimate just how much that extra RAM improves performance. Anyone with a fast PC should try running that machine with a modest amount of RAM to see how it performs. Sure, you can get by, but if you can get a gig for $70 more, why give up the capability? It is a bit like saying that you only need a 5 GB hard disk: sure, you can get by, but you are giving up capability.
As to Longhorn, I'm not sure what the quoted material was saying. It may not be that Longhorn will need 2 GB; rather, that may be what they expect the standard PC to contain a few years from now. Frankly, I think they are right. Further, companies like Apple that are 64-bit aware will be driving large-memory systems for competitive advantage. I don't see much of a future for 32-bit systems on the desktop; there is too much potential in that additional address range to ignore 64-bit.
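Just to put rough back-of-the-envelope numbers on that address range (plain C, nothing platform-specific):
Code:
#include <stdio.h>

int main(void)
{
    /* A 32-bit pointer can address 2^32 bytes, which is exactly 4 GB. */
    printf("32-bit ceiling: %llu GB\n", (1ULL << 32) / (1ULL << 30));

    /* A 64-bit pointer can address 2^64 bytes = 2^34 GB,
       i.e. 16 exabytes; four billion times the 32-bit ceiling. */
    printf("64-bit ceiling: %llu GB\n", 1ULL << 34);
    return 0;
}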
Thanks
Dave
Quote:
Originally posted by 1337_5L4Xx0R
Wizard 69: I too run linux on my PBG4 w/ 1024MB ram, but linux doesn't _need_ a gig... I'm sure I could 'get by' with a mere 256. Longhorn expecting 2 gigs is ridiculous.
Smircle: I was led to believe that one of the best features of C# is 'garbage collection' which I took to mean freeing of memory (automatically, ie: w/out express instructions from the programmer). Wouldn't this necessarily mean better use of ram?
Y'all ever heard of Mono?
So don't sweat it too much.
Quote:
Originally posted by 1337_5L4Xx0R
Smircle: I was led to believe that one of the best features of C# is 'garbage collection' which I took to mean freeing of memory (automatically, ie: w/out express instructions from the programmer). Wouldn't this necessarily mean better use of ram?
Unless you are comparing to very sloppy programming, the usual answer is "no". Automated garbage collection is one of the features that made Java so popular among some and hated among others. It means that objects are tagged so the runtime knows whether they are still used by some other object. After the last referencing object dereferences it, the object can be safely disposed. A background process constantly looks for objects with a zero refcount and discards them.
This takes away a lot of headache from the programmer, because keeping track of your objects is error-prone and bothersome. But if you do the refcounting yourself (see the sketch after this list), you can:
- destroy objects the moment they are no longer needed
- be less likely (depending on your coding scrutiny) to have a pool of still-referenced but unneeded objects lying around.
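To make that concrete, here is a minimal sketch of doing the refcounting yourself, in classic (manual retain/release) Cocoa Objective-C. The Widget class is invented purely for illustration; retain, release, and dealloc are the standard NSObject methods:
Code:
#import <Foundation/Foundation.h>

// Widget is a made-up class; retain/release/dealloc are the
// standard (manual) NSObject refcounting methods.
@interface Widget : NSObject
@end

@implementation Widget
- (void)dealloc
{
    // Runs the instant the last owner calls release;
    // deterministic, unlike a collector's background sweep.
    NSLog(@"Widget freed immediately");
    [super dealloc];
}
@end

int main(void)
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    Widget *w = [[Widget alloc] init];  // refcount 1
    [w retain];                         // a second owner: refcount 2
    [w release];                        // first owner done: refcount 1
    [w release];                        // last owner done: dealloc fires here

    [pool release];
    return 0;
}
The same idea underlies COM's AddRef/Release and Cocoa's whole ownership convention; the upside is the immediate, predictable teardown shown above, the downside is that every retain has to be balanced by hand.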
Quote:
Originally posted by User Tron
???? Where did you get that from? The .net runtime is a package that is 23mb big.
First, by resources, I was talking about CPU, RAM, and HD usage - and at least on my machine, a running .NET app uses vastly more than 23 MB.
Second, complex .NET apps do tax the CPU quite heavily. I have worked with stuff that pushed a 700 MHz P3 to its limits. OK, this is not a modern machine by any measure, but the same app, written in C++, would have been easier on the processor. Now, if MS moves large parts of their OS from C++ to C#/.NET, they will rightfully want a CPU with a lot more bang so Windows does not seem too slow (remember how Apple fell into this trap when Moto was not able to scale the G4 and OS X suddenly had the "slow and unresponsive" label attached).
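If you want to see what a process really occupies, as opposed to the size of the package it shipped in, the OS will tell you. A quick sketch using the standard getrusage() call; the 40 MB allocation is just a stand-in for a real app's working data:
Code:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/resource.h>

int main(void)
{
    /* Touch ~40 MB so the resident size is visibly larger
       than the binary sitting on disk. */
    size_t size = 40 * 1024 * 1024;
    char *buf = malloc(size);
    memset(buf, 1, size);

    struct rusage ru;
    getrusage(RUSAGE_SELF, &ru);
    /* Note: ru_maxrss is reported in bytes on Mac OS X but in
       kilobytes on Linux; check your man page. */
    printf("peak resident size: %ld\n", ru.ru_maxrss);

    free(buf);
    return 0;
}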
Quote:
Originally posted by OverToasty
... "vastly easier than objC" ...?!?!
feh
Anybody from the Omni Group, Aqua Minds, Stone Design, or the plethora of other small dev shops that are competing with major software houses wanna handle this one?
Or maybe we should ask that computer company that put the best happy face on Unix there is, and puts a major release out every year, what they think?
I'll take this one
A lot of the syntax is a lot different than "normal" programming languages. Really, getting used to the OO side of it is the hard part, unless you're a software engineer and know OO programming inside and out. After getting used to OO (if you aren't already) and the differences in syntax, it is SOOO easy and powerful. C# is pretty easy, though; Objective-C is also (from a software engineer's point of view).
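If you haven't seen it, here is a tiny made-up example of what "a lot different" looks like: Objective-C's bracketed, named-argument message sends next to the dot-and-parens call every C-family language uses. The Greeter class is invented; only the syntax matters here:
Code:
#import <Foundation/Foundation.h>

// Greeter is invented for this example.
@interface Greeter : NSObject
- (NSString *)greet:(NSString *)name withPunctuation:(NSString *)mark;
@end

@implementation Greeter
- (NSString *)greet:(NSString *)name withPunctuation:(NSString *)mark
{
    return [NSString stringWithFormat:@"Hello, %@%@", name, mark];
}
@end

int main(void)
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    Greeter *g = [[Greeter alloc] init];

    // Objective-C: each argument gets a label, inside square brackets...
    NSLog(@"%@", [g greet:@"World" withPunctuation:@"!"]);
    // ...where C# or Java would write: g.Greet("World", "!");

    [g release];
    [pool release];
    return 0;
}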