
Future Hardware: An 'Average' of 'Dual Core'?

post #1 of 24
Thread Starter 
Quote:
Microsoft is expected to recommend that the "average" Longhorn PC feature a dual-core CPU running at 4 to 6GHz; a minimum of 2 gigs of RAM; up to a terabyte of storage; a 1 Gbit, built-in, Ethernet-wired port and an 802.11g wireless link; and a graphics processor that runs three times faster than those on the market today.

http://www.microsoft-watch.com/artic...1581842,00.asp

Okay, in light of the G3s running 'X' like a Hamster on its spinny wheel...and 'X' only just getting a bit of 'snap' running on G5s...

...will the Mac OS X that goes up against 'Longhorn' need this level of spec on 'average'?



Lemon Bon Bon

PS. I can't imagine something running 3 times faster than the ATI x800!?!



PPS. Can you imagine the radioactive fallout and heat shielding needed for a dual core Prescott running at 5 gig?

PPPS. A terabyte of storage? 2 gigs of ram...on 'average'. Chain pulling?

er...



(Muses...a dual core 9xx running at 4.5 gig with Altivec II (der-rool with 3D enhancements and something to wipe up the drool...), 4 gigs of ram, 500 gig hard drive...something three times as powerful as x800. Running on...Mac OS X Lynx 10.8 (by the time Long Horn ships?!)
We do it because Steve Jobs is the supreme defender of the Macintosh faith, someone who led Apple back from the brink of extinction just four years ago. And we do it because his annual keynote is...
post #2 of 24
I suspect a great deal of exaggeration here. But one never knows, as Long porn *cough* horn is not expected before 2006. I think there was recently some statement by B. Gates himself saying something like that.
post #3 of 24
I think back to my LC II with a 20MB hard drive and, what, 2MB of RAM, and onboard graphics with (what was it?) 512K....

So you put Longhorn on the shelves somewhere around October 2009!! (on a 14-month OS X.# cycle)

I only know 10 people that get the binary joke
post #4 of 24
Those specs don't sound completely unreasonable. Longhorn will be built on C#, which is a vastly easier language to program in than C, C++, or Obj-C, but gobbles up vastly more resources, too. It is much like Java in this respect.

Most likely, it will still run on a single 3GHz P4 - but the same way OS X runs on my 400MHz TiBook: slow as molasses.

The advanced GPU will be necessary since MS is trying to leapfrog Quartz Extreme - if Apple advances QE to something like Quartz G3, they will require similar graphics power.
post #5 of 24
Quote:
Originally posted by Smircle
Those specs don't sound completely unreasonable. Longhorn will be built on C#, which is a vastly easier language to program in than C, C++, or Obj-C ...

... "vastly easier than Obj-C" ...?!?!

feh

Anybody from the Omni Group, Aqua Minds, Stone Design, or the plethora of other small dev shops that are competing with major software houses wanna handle this one?

Or maybe we should ask that computer company that put the best happy face on Unix there is, and a major release out every year, what they think?
In life, as in chess, the moves that hurt the most are the ones you didn't see ...
post #6 of 24
Quote:
Originally posted by Smircle
if Apple advances QE to something like Quartz G3, they will require similar graphic power.

Now, where does this "G3" specifier come from? Did you pick it at random, or does it mean something?
post #7 of 24
Quote:
Originally posted by PB
Now, where does this "G3" specifier come from? Did you pick it at random, or does it mean something?

Quarts = Quarts G1
Quarts Extreme = Quarts G2
Quarts Super Duper Advance Beat LongHorn Extreme in 10.x = Quarts G3.
"There's no bigot like a religious bigot and there's no religion more fanatical than that espoused by Macintosh zealots." ~Martin Veitch, IT Week [31-01-2003]
Reply
"There's no bigot like a religious bigot and there's no religion more fanatical than that espoused by Macintosh zealots." ~Martin Veitch, IT Week [31-01-2003]
Reply
post #8 of 24
Quote:
Originally posted by T'hain Esh Kelch
Quarts = Quarts G1
Quarts Extreme = Quarts G2
Quarts Super Duper Advance Beat LongHorn Extreme in 10.x = Quarts G3.

It is ok, let me supply you with some "z"'s.


z z z z z z z z z z z z z z z

Quartz

Quarts almost = Liters
post #9 of 24
Quote:
Originally posted by oldmacfan
It is ok, let me supply you with some "z"'s.


z z z z z z z z z z z z z z z

Quartz

Quarts almost = Liters

Oh well...
"There's no bigot like a religious bigot and there's no religion more fanatical than that espoused by Macintosh zealots." ~Martin Veitch, IT Week [31-01-2003]
Reply
"There's no bigot like a religious bigot and there's no religion more fanatical than that espoused by Macintosh zealots." ~Martin Veitch, IT Week [31-01-2003]
Reply
post #10 of 24
Sounds very reasonable to me.

First, everyone must realize by now that modern operating systems cry out for SMP hardware. The only question in my mind is how many logical processors would be required. Don't forget, Apple is very likely to be first to market with dual-core machines, and for all intents and purposes its pro line is SMP-only; the single-chip 1.6 doesn't sell enough to be counted. So references to SMP are not a stretch considering present-day realities. SMP is very much a requirement for many professional uses. The actual speed of the processors is not important; the speeds mentioned appear to be only projections.
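To make the SMP point concrete, here is a rough C++ sketch of my own (nothing from MS's spec sheet, and the numbers are made up) of why a multithreaded workload wants a second processor: split a job in half and the halves run at the same time on two cores, but merely take turns on one.

#include <cstdio>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    // One embarrassingly parallel job: sum ten million ints.
    std::vector<int> data(10000000, 1);
    long long left = 0, right = 0;
    const size_t mid = data.size() / 2;

    // On an SMP or dual-core box the OS puts one thread on each
    // processor; on a single CPU the two threads just time-slice.
    std::thread t1([&] { left = std::accumulate(data.begin(), data.begin() + mid, 0LL); });
    std::thread t2([&] { right = std::accumulate(data.begin() + mid, data.end(), 0LL); });
    t1.join();
    t2.join();

    std::printf("total = %lld\n", left + right);
    return 0;
}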

As to memory, I run 1GB on the Linux machine I use every day. Memory is one of the best investments you can throw at a single-user multitasking machine. Again, the numbers quoted seem to be nothing more than projections based on where technology will be at the time. Just as older SDRAM is no longer cost-effective, what we commonly use today will soon age away.

As far as video displays go, I'm not satisfied with anything I've seen recently in a price range I'd want to pay. Effectively driving a large-screen display at very high resolution requires a fast GPU. Don't forget, we have only recently left the world of benchmarking video systems at low resolution, and 1280x1024 is not exactly high resolution. When the LCD manufacturers start making large panels (18"+) at resolutions well beyond 1280x1024, you will see that current video hardware is not all it is cracked up to be.

Screen size is probably the biggest missing parameter on this spec list. The move to ever-larger LCD panels has to go hand in hand with higher pixel densities than what is mainstream today. Have you ever noticed in the Sunday fliers that horizontal and vertical resolutions are no longer specified on mainstream panels, regardless of size? The problem is they are all the same resolution. That makes the larger panels great for people with bad eyesight, but they don't convey any more information to the average user. I suspect we will soon see a jump to much higher resolutions in the larger mainstream screens. Greater than 2,000 horizontal pixels would be a good start, with vertical resolution varying by aspect ratio.

Thanks
Dave


Quote:
Originally posted by Lemon Bon Bon
http://www.microsoft-watch.com/artic...1581842,00.asp

Okay, in light of the G3s running 'X' like a Hamster on its spinny wheel...and 'X' only just getting a bit of 'snap' running on G5s...

...will the Mac OS X that goes up against 'Longhorn' need this level of spec on 'average'?



Lemon Bon Bon

PS. I can't imagine something running 3 times faster than the ATI x800!?!



PPS. Can you imagine the radioactive fallout and heat shielding needed for a dual core Prescott running at 5 gig?

PPPS. A terabyte of storage? 2 gigs of ram...on 'average'. Chain pulling?

er...



(Muses...a dual core 9xx running at 4.5 gig with Altivec II (der-rool with 3D enhancements and something to wipe up the drool...), 4 gigs of ram, 500 gig hard drive...something three times as powerful as x800. Running on...Mac OS X Lynx 10.8 (by the time Long Horn ships?!)
post #11 of 24
Quote:
Originally posted by OverToasty
... "vastly easier than Obj-C" ...?!?!

feh

Anybody from the Omni Group, Aqua Minds, Stone Design, or the plethora of other small dev shops that are competing with major software houses wanna handle this one?

Or maybe we should ask that computer company that put the best happy face on Unix there is, and a major release out every year, what they think?


"Handle" it in what sense? He's right, C# is superiour to Obj-C (hardly surprising for a language that is approximately 20 years younger). I would choose C# over Obj-C any day, and I hope that Apple sees the light and adopts it as a supported language (MS has already provided the CLI runtime as an academic project for MacOS X). Adopting the .NET framework might be interesting as well, although this is far less clearly superiour to Cocoa... but it is much more complete.
Providing grist for the rumour mill since 2001.
post #12 of 24
Quote:
Originally posted by Programmer
Adopting the .NET framework might be interesting as well, although it is far less clearly superior to Cocoa... but it is much more complete.

"Interesting" in what sense. I understand the inroads they could make, but to serve what purpose? I think Apple's direction seems to be leading to more open standards. FWIW, I cannot get the theory made a while ago about "Apple going down the x86 road when people are just dissatisfied enough" out of my head.

I thought of a song, about the many, many viruses and money lost because of it, to the tune of Rudolph the Red Nose Reindeer the other night and posted it to my website....(click www to read it, if not interested, simply ignore).

Seriously, I think they may adopt more cooperation with MS, but not into a closed environment.
...we have assumed control
post #13 of 24
My interpretation of MS's heady hardware requirements for 2006 Longhorn is

1. Longhorn is do or die for them (otherwise Linux, OS X, et al. win), and

2. The only sure way they can get customers to buy their OS is without their choice --- i.e., by bundling it with new computers.

Hence, spec out ballsy computers, and make Longhorn require such hardware to run.

I also read that they're patenting Longhorn up the wazoo, with 10+ patent applications per day. Since s/w patents are just an artificial business defense instituted by lawyers (USPTO's Bruce Lehmann et al.) to feed lawyers, this strategy explains how MS plans to spend their $50B (monopoly spoils) to defend their monopoly. Makes sense, but it seems pretty desperate. Frankly, I'm a little surprised, since MS was (surprisingly enough) good about not abusing the patent system for years, and now they're suddenly (?) doing so all too aggressively.

I wouldn't short their stock just yet, but Longhorn may be the beginning of the end, especially if Linux runs just fine on all those retiring not-quite-Longhorn-capable computers.

Just 2c.
post #14 of 24
Quote:
Originally posted by Rhumgod
"Interesting" in what sense. I understand the inroads they could make, but to serve what purpose? I think Apple's direction seems to be leading to more open standards. FWIW, I cannot get the theory made a while ago about "Apple going down the x86 road when people are just dissatisfied enough" out of my head.

Interesting because all of a sudden a lot of software would run on the Mac, and some really good development tools could be used on the Mac... taking away an advantage that Windows currently has. To make this happen, Apple & Microsoft would need to come to an agreement -- cloning .NET wouldn't do for Apple (i.e., Mono isn't a commercial product). Doing that would establish it as a standard, and might even allow the Mono project to make much better headway. For that reason alone (among many), I don't think this will ever happen. That's a pity, though, because Microsoft has finally gotten something right.

It has nothing to do with Apple using x86. Microsoft's common language runtime is ISA-independent and can run on PPC just as well as it runs on x86. It is based on a Java-like virtual machine and just-in-time compiler.

As for open standards... unfortunately there are no good open standards of the scope and depth of MS' .NET framework. Cocoa would be the logical starting place for Apple, but it is just as closed as .NET.
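To show what "ISA independent" means in practice, here is a toy stack machine in C++ -- my own illustration, nothing like the real CLR internals. Programs ship as portable opcodes; only the small interpreter below (or, in the real thing, the JIT) has to be ported to x86, PPC, or whatever comes next.

#include <cstdio>
#include <vector>

enum Op { PUSH, ADD, MUL, PRINT, HALT };

// Interpret a portable bytecode program on whatever CPU this
// interpreter was compiled for -- the bytecode itself never changes.
void run(const std::vector<int>& code) {
    std::vector<int> stack;
    size_t pc = 0;
    while (pc < code.size()) {
        switch (code[pc++]) {
        case PUSH: stack.push_back(code[pc++]); break;
        case ADD:  { int b = stack.back(); stack.pop_back(); stack.back() += b; break; }
        case MUL:  { int b = stack.back(); stack.pop_back(); stack.back() *= b; break; }
        case PRINT: std::printf("%d\n", stack.back()); break;
        case HALT: return;
        }
    }
}

int main() {
    // (2 + 3) * 4 -- the same "binary" runs unmodified on any host CPU.
    run({PUSH, 2, PUSH, 3, ADD, PUSH, 4, MUL, PRINT, HALT});
    return 0;
}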
Providing grist for the rumour mill since 2001.
post #15 of 24
Quote:
Originally posted by oldmacfan
It is ok, let me supply you with some "z"'s.


z z z z z z z z z z z z z z z

Quartz

Quarts almost = Liters

now let me help you out.
Quarts almost = litre s

G
never underestimate the predictability of stupidity
post #16 of 24
Quote:
Originally posted by g::masta
now let me help you out.
Quarts almost = litre s

Actually, liters and litres are both correct, same as meter and metre; they are commonly accepted variants of one another. Whether one came out of American English and one out of proper English I have no idea, but they're both in the dictionary.
"When I was a kid, my favourite relative was Uncle Caveman. After school, wed all go play in his cave, and every once and awhile, hed eat one of us. It wasnt until later that I discovered Uncle...
Reply
"When I was a kid, my favourite relative was Uncle Caveman. After school, wed all go play in his cave, and every once and awhile, hed eat one of us. It wasnt until later that I discovered Uncle...
Reply
post #17 of 24
Quote:
Originally posted by Telomar
Whether one came out of American English and one out of proper English I have no idea, but they're both in the dictionary.

According to Wordreference, meter is the US spelling of metre; same for liter.
post #18 of 24
Quote:
Originally posted by Lemon Bon Bon
...Mac OS X Lynx 10.8 (by the time Long Horn ships?!)


Except that by the time Longhorn actually ships, we will be at OS XII
Apple Computer, Inc.

AKA the Microsoft R&D Department
post #19 of 24
Don't forget theater and theatre...

;^p
Late 2009 Unibody MacBook (modified)
2.26GHz Core 2 Duo CPU/8GB RAM/60GB SSD/500GB HDD
SuperDrive delete
post #20 of 24
Quote:
Originally posted by Smircle
Those specs don't sound completely unreasonable. Longhorn will be built on C#, which is a vastly easier language to program in than C, C++, or Obj-C, but gobbles up vastly more resources, too. It is much like Java in this respect.


???? Where did you get that from? The .NET runtime is a 23MB package. Note: this includes almost all classes (System, Web, etc.) and the compilers. The two-step compilation is a smart move and builds fast code, which is normally only a few percent slower than C++. If you compare, for example, .NET calls with COM calls, you gain a lot of speed with .NET. IMO, C# is the best thing MS has made so far. I've been using C# in combination with unmanaged code for 3 years, and believe me, once you've used C# for a while you will never want to go back to unmanaged C++. Even less so when it only costs you a few percent in performance.

End of Line
post #21 of 24
Wizard 69: I too run Linux on my PBG4 with 1024MB of RAM, but Linux doesn't _need_ a gig... I'm sure I could 'get by' with a mere 256. Longhorn expecting 2 gigs is ridiculous.

Smircle: I was led to believe that one of the best features of C# is 'garbage collection', which I took to mean freeing memory automatically (i.e., without express instructions from the programmer). Wouldn't that necessarily mean better use of RAM?

Y'all ever heard of Mono?

Quote:
The Mono Project is a community initiative to develop an open source, Linux-based version of the Microsoft.NET development platform. The goal is to add the C# language to the arsenal of open-source development tools and allow the creation of operating-system-independent .NET programs.

So don't sweat it too much.
post #22 of 24
Well, it may not need a gig, but it is a reasonable investment in a new computer based on price points. One should go for as much memory as is cost-reasonable on the platform.

As to what Linux needs, I do understand its needs. For the longest time I ran on 128MB, at least up until kernel issues and Red Hat bloat forced me to upgrade. One should not underestimate just how much that extra RAM improves performance. Anyone with a fast PC should try running that machine with a modest amount of RAM to see how it performs. Sure, you can get by, but if you can get a gig for $70 more, why give up the capability? It is a bit like saying you only need a 5GB hard disk: sure, you can get by, but you are giving up capability.

As to Longhorn, I'm not sure what the quoted material was saying. It may not be that Longhorn will need 2GB; rather, that may be what they expect the standard PC to contain a few years from now. Frankly, I think they are right. Further, companies like Apple that are 64-bit aware will be driving large-memory systems for competitive advantage. I don't see much of a future for 32-bit systems on the desktop; there is too much potential in that additional address range to ignore 64-bit.

Thanks
Dave


Quote:
Originally posted by 1337_5L4Xx0R
Wizard 69: I too run Linux on my PBG4 with 1024MB of RAM, but Linux doesn't _need_ a gig... I'm sure I could 'get by' with a mere 256. Longhorn expecting 2 gigs is ridiculous.

Smircle: I was led to believe that one of the best features of C# is 'garbage collection', which I took to mean freeing memory automatically (i.e., without express instructions from the programmer). Wouldn't that necessarily mean better use of RAM?

Y'all ever heard of Mono?



So don't sweat it too much.
post #23 of 24
Quote:
Originally posted by 1337_5L4Xx0R
Smircle: I was led to believe that one of the best features of C# is 'garbage collection', which I took to mean freeing memory automatically (i.e., without express instructions from the programmer). Wouldn't that necessarily mean better use of RAM?

Unless you are comparing to very sloppy programming, the usual answer is "no". Automatic garbage collection is one of the features that made Java so popular among some and hated among others. It means that objects are tagged so the runtime knows whether they are still used by some other object. After the last referencing object dereferences it, the object can be safely disposed of. A background process runs constantly, looking for objects with a zero refcount and discarding them.
This takes a lot of headache away from the programmer, because keeping track of your objects is error-prone and bothersome, but if you do the refcounting yourself, you can (see the sketch below):
- destruct objects the moment they are no longer needed
- have a lower likelihood (depending on your coding scrutiny) of a pool of still-referenced but unneeded objects lying around.
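Roughly, in code -- a hand-rolled C++ sketch of my own (not from any poster's app or any real framework) of the retain/release bookkeeping a collector automates for you:

#include <cstdio>

struct Buffer {
    int refcount = 1;              // the creator holds the first reference
    void retain()  { ++refcount; } // a new owner? count it
    void release() {               // an owner is done with us
        if (--refcount == 0) {
            std::printf("freed immediately\n");
            delete this;           // deterministic, unlike a GC sweep
        }
    }
};

int main() {
    Buffer* b = new Buffer;        // refcount == 1
    b->retain();                   // second owner, refcount == 2
    b->release();                  // refcount == 1, still alive
    b->release();                  // refcount == 0, freed right here
    return 0;
}

Call release() one time too many and you crash; forget one and you leak -- which is exactly the headache the collector takes away, at the cost of the background scan and less predictable teardown.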


Quote:
Originally posted by User Tron
???? Where did you get that from? The .NET runtime is a 23MB package.

First, by resources I was talking about CPU, RAM, and HD usage - and at least on my machine, a running .NET app uses vastly more than 23MB.

Second, complex .NET apps do tax the CPU quite heavily. I have worked with stuff that pushed a 700MHz P3 to its limits. OK, this is not a modern machine by any measure, but the same app, written in C++, would have been easier on the processor. Now, if MS moves large parts of their OS from C++ to C#/.NET, they will rightfully want a CPU with a lot more bang so Windows does not seem too slow (remember how Apple fell into this trap when Moto was not able to scale the G4 and OS X suddenly had the "slow and unresponsive" label attached).
post #24 of 24
Quote:
Originally posted by OverToasty
... "vastly easier than Obj-C" ...?!?!

feh

Anybody from the Omni Group, Aqua Minds, Stone Design, or the plethora of other small dev shops that are competing with major software houses wanna handle this one?

Or maybe we should ask that computer company that put the best happy face on Unix there is, and a major release out every year, what they think?

I'll take this one.

A lot of the syntax is quite different from "normal" programming languages. Really, getting used to the OO side of it is the hard part, unless you're a software engineer and know OO programming inside and out. After getting used to OO (if you aren't already) and the differences in syntax, it is SOOO easy and powerful. C# is pretty easy, though; Objective-C is too (from a software engineer's point of view).

Quote:
The reason why they are analysts is because they failed at running businesses.