Apple's new MacBook Air dubbed world’s thinnest notebook - Page 10

post #361 of 400
Quote:
Originally Posted by TenoBell View Post

If I remember the keynote correctly, Steve said the MBA battery life is 5 hours with wireless networking on, since the emphasis of the machine is on wireless. With wireless off, the MBA battery should last much longer.

In all fairness, the MBA does have a more powerful CPU and integrated video card than these ultra-portables, but I figure it should give you about an extra hour.
Dick Applebaum on whether the iPad is a personal computer: "BTW, I am posting this from my iPad pc while sitting on the throne... personal enough for you?"
post #362 of 400
Quote:
In all fairness, the MBA does have a more powerful CPU and integrated video card than these ultra-portables, but I figure it should give you about an extra hour.

It's interesting that the MBA has a more powerful CPU and GPU than these other ultra-portables and is cheaper than many of them, yet so many people still complain.
post #363 of 400
Quote:
Originally Posted by vinea View Post

Er...I'd be using a RAID array on a NAS even with a Pro. Oh wait, I do, since we didn't go FC.

Dude...the $800 Precision T3400 is a 1.8GHz C2D. Yah think the 2.8GHz quad Mac Pro might be a tad faster? To get to the quad at 2.66GHz with 2GB, the cost goes up to $1600, with no option to go octo...although it's not confirmed that you can upgrade the single-CPU Pro, is it?

I wouldn't be giving my guys a 1.8GHz C2D and thinking I'm being fiscally responsible. I'd be thinking I was being penny-wise and pound-foolish.

And your T3400 specs are for the 3.16GHz Wolfdale machine, and it's not $800. Heck, it's not even on Dell's website yet. So you're comparing a 2006 Yonah iMac to a 2008 Wolfdale Dell.

Simply genius.

I posted the wrong link, my bad. This is the right one for the Dell Precision T3400 E6850: http://www.spec.org/cpu2006/results/...1210-02856.pdf and that one is $1228 with display. Still a lot of power for the money. There are about a dozen links for Dell models using the T3400 model number; this link I got from Dell's website, but it's easy to get confused if you don't pay attention to the E-numbers.

At a score of 33.7, it's still not that far off. I gave the iMac link as a comparison to the only iMac that has published results. I know it's not representative of what they are now, which is why I added the Xserve, with a similar-sized system bus, memory configuration and enabled processors. Like I said, the quad-core came in at 39.9. So let's see: Dell T3400 at $1228, or Mac Pro at $2299 with no monitor? Not a hard purchase decision for the minimal performance improvement that the Mac Pro would provide.
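For what it's worth, the price/performance gap being argued here can be put in numbers. This is just back-of-the-envelope arithmetic using the scores and prices quoted in the post above (taken at face value, not re-verified):

```python
# Back-of-the-envelope price/performance using the figures quoted in the post.
# Scores are the SPEC results cited in the thread; prices in USD as quoted.
systems = {
    "Dell Precision T3400": {"score": 33.7, "price": 1228},  # with display
    "Mac Pro quad 2.8GHz":  {"score": 39.9, "price": 2299},  # no display
}

for name, s in systems.items():
    print(f"{name}: {s['score'] / s['price'] * 1000:.1f} points per $1000")

# Relative gain versus relative cost:
gain = 39.9 / 33.7   # about 1.18x the performance
cost = 2299 / 1228   # about 1.87x the price
print(f"{gain:.2f}x the performance for {cost:.2f}x the price")
```

Whether that trade is worth it is exactly the judgment call being debated; the arithmetic only frames it.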
post #364 of 400
Quote:
Originally Posted by TenoBell View Post

It's interesting that the MBA has a more powerful CPU and GPU than these other ultra-portables and is cheaper than many of them, yet so many people still complain.

And a glass, LED-backlit display, a backlit full-sized keyboard and a full aluminium case, instead of a cheap plastic display, a cheap mini-keyboard, and a cheap plastic case.

I especially love the bitching over the cost of the SSD. That is easily quantifiable with a simple Google search, which yields no results for a 64GB SSD under $1,400, yet Apple only charges $999.

I withdrew my order for the MBA. I want to see the read/write and battery differences between the two storage types before making my decision.
post #365 of 400
Quote:
Originally Posted by solipsism View Post

In all fairness, the MBA does have a more powerful CPU and integrated video card than these ultra-portables, but I figure it should give you about an extra hour.

The comparisons to ultralights aren't very apt anyway: it's a 1.06/1.2GHz CPU against a 1.6/1.8GHz unit, a 950 vs. a 3xxx GPU, a small XGA screen against a larger, higher-resolution screen, and non-full-spacing keyboards against full-spacing keyboards. I don't think any of the competing keyboards are backlit. Many of the comparisons conveniently leave these points out in trying to paint the Air as a sub-par machine, claiming that the Air somehow fails miserably at hard drive space compared to other ultralights, when no ultralight I've found is available for sale with a bigger than 80GB hard drive. At any rate, these feature hecklers are only using a selective set of features to compare. Toughbooks are nice, but the fact that it takes an expedition to find one to order is not a good thing. Many models are preorder-only, and some of those store sites are defective; one had a non-functional preorder link.

Apple's ultralight uses a CPU from the notebook class above it, and that's pretty impressive.
post #366 of 400
Quote:
Originally Posted by TenoBell View Post

It's interesting that the MBA has a more powerful CPU and GPU than these other ultra-portables and is cheaper than many of them, yet so many people still complain.

As we know from previous experience here, some people will only be happy with a new Apple product if it is *EXACTLY* what they want.

The more it deviates from that, the more they will bitterly complain about how "bad" it is.
post #367 of 400
Quote:
Originally Posted by JeffDM View Post

Apple's ultralight uses a CPU from the notebook class above it, and that's pretty impressive.

In case it was missed, AnandTech offers some insights into this "special" chip being used in the MBA.
Apple's MacBook Air: Uncovering Intel's Custom CPU for Apple
post #368 of 400
Quote:
Originally Posted by melgross View Post

You're misunderstanding what the tests say.

SPEC was devised to measure CPU performance directly in one very small area. It measures nothing else. It doesn't tell us anything about real integer performance, which is what most programs use.

It doesn't tell us anything about vector processing capabilities, which all processors now have, to a greater or lesser extent.


Considering that OS X and Windows both push most floating-point calculations off onto the vector hardware, the numbers are less than useful. They are deceiving. This has been a criticism of SPEC testing for years now.

As far as the Macworld tests go, they DO make a comparison between all the machines. The numbers themselves aren't important. What IS important is the difference between the numbers for the various machines under test.

How do you figure? From the SPEC.org website:

CPU2006

Designed to provide performance measurements that can be used to compare compute-intensive workloads on different computer systems, SPEC CPU2006 contains two benchmark suites: CINT2006 for measuring and comparing compute-intensive integer performance, and CFP2006 for measuring and comparing compute-intensive floating point performance.

But I get you: unless it's test results you post, they don't count. Gotcha.

Macworld typically only posts results for Macs and doesn't compare them to Windows-based PCs, so again their tests are meaningless for comparison in this debate. Not that there is anything wrong with that; they are a Mac magazine, after all. If you had PC World test results that compared the iMac to the Dells (which PC World does often, but I couldn't find anything specific to these two machines), then that would be more credible.

Post some links about the credibility of the SPEC tests; I'm curious to see what people have said. I did a Google search and couldn't find any negative comments. I highly doubt companies would continue to fund and pay for these tests if they weren't credible and indicative of the actual performance of their hardware. It's not like they paid some magazine or for-profit trade group to do these tests. SPEC was specifically set up as a non-profit to provide independent, unbiased, standardized testing of computer hardware.
post #369 of 400
I'm officially fed up with this thread!

"MacBook Air: The World's Thinnest Notebook
Hated by the World's Thickest People"
post #370 of 400
Quote:
Originally Posted by trboyden View Post

How do you figure? From the SPEC.org website:

CPU2006

Designed to provide performance measurements that can be used to compare compute-intensive workloads on different computer systems, SPEC CPU2006 contains two benchmark suites: CINT2006 for measuring and comparing compute-intensive integer performance, and CFP2006 for measuring and comparing compute-intensive floating point performance.

But I get you: unless it's test results you post, they don't count. Gotcha.

Macworld typically only posts results for Macs and doesn't compare them to Windows-based PCs, so again their tests are meaningless for comparison in this debate. Not that there is anything wrong with that; they are a Mac magazine, after all. If you had PC World test results that compared the iMac to the Dells (which PC World does often, but I couldn't find anything specific to these two machines), then that would be more credible.

Post some links about the credibility of the SPEC tests; I'm curious to see what people have said. I did a Google search and couldn't find any negative comments. I highly doubt companies would continue to fund and pay for these tests if they weren't credible and indicative of the actual performance of their hardware. It's not like they paid some magazine or for-profit trade group to do these tests. SPEC was specifically set up as a non-profit to provide independent, unbiased, standardized testing of computer hardware.

The SPEC site can be expected to rationalize its own importance, wouldn't you say?

Here's an article that puts a bit of perspective on it. I'll quote a bit first. But this is just the first one I've found.

Quote:
Processor versus Real Performance
The advent of TPC testing allowed vendors and users alike to abandon MIPS as a measure of relative system performance. MIPS has now been effectively replaced by the SPEC suite of tests, but like MIPS, SPEC tests are primarily a measure of processor complex performance. There is more to transaction processing than just the processor, because the I/O and disk subsystems can substantially influence performance.
To illustrate this, let's compare an 8-way SPARCserver 1000 and HP 9000 Series 800 H70. Table 1 compares the tpmC result with that of SPEC's SPECrate_int92 metric.

Table 1

System                    SPECrate_int92    tpmC       $/tpmC
SPARCserver 1000 8-way    10,113            1,079.4    $1,032
HP 9000 H70               3,757             1,290.9    $961
The processor-based SPEC test implies the Sun system has a significant performance superiority, nearly 2.7 times better. However, the TPC-C test shows that in multi-user transaction testing the systems are much more evenly matched.

TPC testing provides a platform to look at the entire computing workload. Just using a processor benchmark can be very misleading.

Sorry, the charts never make it through in good form.
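The point of Table 1 is easier to see as ratios. A quick sketch using the numbers exactly as the quoted article gives them (not re-verified):

```python
# Ratios from Table 1 in the quoted article.
spec = {"SPARCserver 1000 8-way": 10113, "HP 9000 H70": 3757}    # SPECrate_int92
tpmc = {"SPARCserver 1000 8-way": 1079.4, "HP 9000 H70": 1290.9}  # TPC-C tpmC

spec_ratio = spec["SPARCserver 1000 8-way"] / spec["HP 9000 H70"]
tpmc_ratio = tpmc["SPARCserver 1000 8-way"] / tpmc["HP 9000 H70"]

print(f"SPEC says the Sun box is {spec_ratio:.1f}x faster")
print(f"TPC-C says it delivers {tpmc_ratio:.2f}x the transactions, i.e. slightly fewer")
```

So the processor benchmark implies a nearly 2.7x advantage, while the transaction benchmark shows the Sun box actually coming in a bit behind, which is the article's whole point.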

Quote:
Software and Performance
It is very easy to get carried away with the performance of the hardware. However, often we forget the influence the software can have on OLTP performance, be it the operating system or the database. For example, Digital has tested a system with the same database, but with different operating systems, DEC OSF/1 in one case and OpenVMS in the other. The OpenVMS system offered 16.5% more performance than the DEC OSF/1 system.

http://www.tpc.org/information/other...les/TopTen.asp

Another:

Quote:
Yen said Sun is working to try to create measurements that better represent its chips' performance. Clock speed just isn't sufficient, he said, and neither are the current processor benchmarks from the Standard Performance Evaluation Corporation (SPEC) -- of which Sun is a member.

Intel trumpeted high SPECint2000 and SPECfp2000 measurements for its coming Itanium 2 processor.

But SPEC's tests no longer represent real-world performance, Yen said. In part, that's because the entire test software package now can fit into a chip's high-speed cache memory, rendering obsolete the need to fetch data from slower main memory. In real life, though, servers often have to use main memory.

Customers "are not going to get the kind of performance they expected from reading the SPEC" benchmarks, Yen said.

http://news.zdnet.co.uk/hardware/0,1...2117959,00.htm
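Yen's cache argument is easy to demonstrate with a toy test: do the same number of reads against a working set that fits in cache and one that spills to main memory. A rough sketch (sizes are arbitrary, and Python's interpreter overhead damps the effect considerably compared to a C version, but the memory-bound pass is typically still measurably slower):

```python
import random
import time

def random_sum(buf, idx):
    """Sum buf at the given pre-computed random indices."""
    s = 0
    for i in idx:
        s += buf[i]
    return s

N_READS = 1_000_000
small = list(range(1_000))        # a few KB of pointers: cache-resident
large = list(range(10_000_000))   # tens of MB: spills to main memory

# Pre-compute random indices so index generation isn't part of the timing.
idx_small = [random.randrange(len(small)) for _ in range(N_READS)]
idx_large = [random.randrange(len(large)) for _ in range(N_READS)]

t0 = time.perf_counter(); random_sum(small, idx_small); t_s = time.perf_counter() - t0
t0 = time.perf_counter(); random_sum(large, idx_large); t_l = time.perf_counter() - t0
print(f"cache-resident: {t_s:.3f}s  memory-bound: {t_l:.3f}s")
```

Same arithmetic, same read count; any gap between the two timings is the memory hierarchy, which is exactly what a benchmark that fits entirely in cache never exercises.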

I'm trying to find a more general article, but so far, all are too technical.
post #371 of 400
I'd like to know what the preorder numbers on this have been. My guess is that they're pretty high.

 

Your = the possessive of you, as in, "Your name is Tom, right?" or "What is your name?"

 

You're = a contraction of YOU + ARE as in, "You are right" --> "You're right."

 

 

post #372 of 400
Quote:
Originally Posted by melgross View Post

The SPEC site can be expected to rationalize its own importance, wouldn't you say?

Here's an article that puts a bit of perspective on it. I'll quote a bit first. But this is just the first one I've found.



Sorry, the charts never make it in good form.



http://www.tpc.org/information/other...les/TopTen.asp

Another:



http://news.zdnet.co.uk/hardware/0,1...2117959,00.htm

I'm trying to find a more general article, but so far,all are too technical.

Those are good links, but they are also from 2002, and the SPEC tests have been updated since, presumably to address those concerns. Sun is still a member of SPEC, so while I'm sure there is no all-inclusive test that can give 100% accurate results, I'm sure what we do get out of them is at least pertinent and representative of overall performance. Unfortunately, Apple is not a member of TPC, so we have no idea what those tests would show. The validity of TPC's results is questionable based on this line from your link to their website:

Quote:
The TPC is presently in a transition phase. The General Implementation Guidelines have been implemented in all three current benchmarks in order to address concerns raised about benchmark specials. The TPC-A and B benchmarks are into the first phase of their ultimate obsolescence. It is hoped that the TPC-C benchmark will fill the breach left by TPC-A and TPC-B, but the initial take up of results has been slower than expected. The TPC's next benchmark, TPC-D, is in its final review stages before being officially sanctioned, and is being eagerly awaited by many. All of this change brings with it uncertainty, and in some cases people are questioning the worth of TPC testing in general.

It also appears TPC's main focus is on server hardware; I didn't see any desktop hardware listed in their results. TPC is also a competing testing provider, so I'm not surprised to see them disclaiming SPEC's results. Their focus is different as well: they look at transactional data transfer rather than pure hardware performance. SPEC's tests are run in C at a low system level, while TPC's tests use higher-level application interfaces such as Windows COM and the WebSphere app server. That's a lot of system overhead in the way of getting accurate results.
post #373 of 400
I decided to go to the SPEC organization's site and see what they really have to say.

Interestingly enough, even to them, SPEC is just a secondary method of seeing how your machine rates.

From their site, two quotes:

Quote:
Ideally, the best comparison test for systems would be your own application with your own workload. Unfortunately, it is often very difficult to get a wide base of reliable, repeatable and comparable measurements for comparisons of different systems on your own application with your own workload.

and:

Quote:
Note: It is not intended that the SPEC benchmark suites be used as a replacement for the benchmarking of actual customer applications to determine vendor or product selection.

This is why various tech sites and organizations have come up with test suites that use actual programs to test machines, rather than the highly misleading SPEC and other synthetic test suites, which rarely correlate to real-world performance.
post #374 of 400
Quote:
Originally Posted by trboyden View Post

Those are good links, but they are also from 2002, and the SPEC tests have been updated since, presumably to address those concerns. Sun is still a member of SPEC, so while I'm sure there is no all-inclusive test that can give 100% accurate results, I'm sure what we do get out of them is at least pertinent and representative of overall performance. Unfortunately, Apple is not a member of TPC, so we have no idea what those tests would show. The validity of TPC's results is questionable based on this line from your link to their website:

Those companies have some say in how the SPEC standard is written. Otherwise, it doesn't mean much.

Quote:
It also appears TPC's main focus is on server hardware; I didn't see any desktop hardware listed in their results. TPC is also a competing testing provider, so I'm not surprised to see them disclaiming SPEC's results. Their focus is different as well: they look at transactional data transfer rather than pure hardware performance. SPEC's tests are run in C at a low system level, while TPC's tests use higher-level application interfaces such as Windows COM and the WebSphere app server. That's a lot of system overhead in the way of getting accurate results.

As I said, it's difficult to find a link for more mainstream users, but it doesn't matter, because the concept is the same.

The point is that different OSes, different software, and different hardware designs all have a great influence on performance. SPEC is not considered to be more than a test of raw CPU and close-in memory subsystem performance.

One of the main problems is that real-world workloads rarely use the computer in the way SPEC tests it. In that sense, SPEC is no better than any other synthetic benchmark, and there are many.

Go to any site that tests systems for games (the easiest to find), and you will see SPEC, Futuremark, and sometimes others being used. But then they go to the game tests themselves, and those don't usually correlate to the synthetic test suites. They always make note of that on the sites, but say that people like to see them, so they use them. They also say they use them as a point of interest, BECAUSE they rarely correlate, as a warning about synthetic testing.
post #375 of 400
Quote:
Originally Posted by trboyden View Post

How do you figure? From the SPEC.org website:

CPU2006

Designed to provide performance measurements that can be used to compare compute-intensive workloads on different computer systems, SPEC CPU2006 contains two benchmark suites: CINT2006 for measuring and comparing compute-intensive integer performance, and CFP2006 for measuring and comparing compute-intensive floating point performance.

But I get you: unless it's test results you post, they don't count. Gotcha.

It's still a benchmark, though. It can use compilers, settings and tweaks that most developers don't use. It even involves languages that most developers don't use; there's a lot of FORTRAN in there. FORTRAN is only used for specialized programs. I don't think it's used for any media applications.

Quote:
Post some links about the credibility of the SPEC tests; I'm curious to see what people have said. I did a Google search and couldn't find any negative comments.

It's not that the tests aren't credible; you need to make sure the benchmark you choose reflects what you're actually going to do with the machine. That's the core problem.

http://www.gcocco.com/bench/cpu2k.html#should_i_buy
post #376 of 400
Quote:
Originally Posted by melgross View Post

I decided to go to the SPEC organizations site, and see what they really have to say.

Interestingly enough, even to them, SPEC is just a secondary method to see how your machine rates.

From their site, two quotes:



and:



This is why various tech sites, and organizations, have come up with test suites that use actual programs to test machines, rather than use the highly misleading SPEC, and other synthetic test suites, which rarely correlate to real world performance.

When I read that, I interpret it as saying YMMV. It's normal for people to give disclaimers; it's called covering your a$$. SPEC came up with a test suite that tries to take into consideration as many processes and calculations as can be compared equally among hardware systems. Is it perfect? Probably not. But like they said, even though testing your application in your environment is the only way to be sure it will run as you expect, that type of testing is subjective at best and can't be used to compare against other systems without actually acquiring them and running the same tests on that different hardware. You'll also run into differences in how the various OSes handle similar functions. That is why it is important to have tests like SPEC that test at the hardware level to eliminate those differences.

Calling SPEC's tests misleading without data to refute them is just ignorance.

All SPEC is saying is: don't base your purchasing decision on just their data; they are not responsible if you do and it doesn't work out for you; test it for yourself with your apps and your data. That's just common-sense advice. I wouldn't buy something without testing it out first or getting a demo system to play with; not doing that would just be stupid.
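The "test it yourself with your apps and your data" advice amounts to a tiny benchmarking harness: time a representative task, repeat it, and take the best run so one-off OS interruptions don't skew the number. A minimal sketch, with a stand-in workload where your real task would go:

```python
import time

def benchmark(task, repeats=5):
    """Run `task` several times and return the best wall-clock time.

    Best-of-N filters out one-off noise from other processes; it is not
    a substitute for a proper statistical treatment, just a quick check.
    """
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        task()
        best = min(best, time.perf_counter() - t0)
    return best

# Stand-in workload: replace with the operation you actually care about
# (an export, a filter, a compile, whatever your real app does all day).
def workload():
    sorted(range(200_000, 0, -1))

print(f"best of 5: {benchmark(workload):.4f}s")
```

Run the same harness with the same workload on each candidate machine and the comparison is directly in terms of the work you actually do, which is the thing no synthetic score can promise.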
post #377 of 400
Quote:
Originally Posted by trboyden View Post

When I read that, I interpret it as saying YMMV. It's normal for people to give disclaimers; it's called covering your a$$.

Now who's interpreting?

Quote:
SPEC came up with a test suite that tries to take into consideration as many processes and calculations as can be compared equally among hardware systems. Is it perfect? Probably not. But like they said, even though testing your application in your environment is the only way to be sure it will run as you expect, that type of testing is subjective at best and can't be used to compare against other systems without actually acquiring them and running the same tests on that different hardware. You'll also run into differences in how the various OSes handle similar functions. That is why it is important to have tests like SPEC that test at the hardware level to eliminate those differences.

Calling SPEC's tests misleading without data to refute them is just ignorance.

All SPEC is saying is: don't base your purchasing decision on just their data; they are not responsible if you do and it doesn't work out for you; test it for yourself with your apps and your data. That's just common-sense advice. I wouldn't buy something without testing it out first or getting a demo system to play with; not doing that would just be stupid.


You're stuck on this, aren't you?

I provided TWO highly respected sites that said what I said, but YOU choose to disregard them, because what they say is not what YOU want to read! And then you say I didn't provide any data, even though they make those statements, and I provided the Macworld DATA, which you refuse to consider.

You refuse to look at tests with programs you might use, but love tests that show nothing about how those programs, or even a computer, will actually perform in your daily use.

I guess there's no point in continuing, if you won't even listen to what the SPEC organization says.

You simply want to justify your choices, and you don't want to see anything that disabuses you of your idea about it.
post #378 of 400
Quote:
Originally Posted by JeffDM View Post

It's still a benchmark, though. It can use compilers, settings and tweaks that most developers don't use. It even involves languages that most developers don't use; there's a lot of FORTRAN in there. FORTRAN is only used for specialized programs. I don't think it's used for any media applications.



It's not that the tests aren't credible; you need to make sure the benchmark you choose reflects what you're actually going to do with the machine. That's the core problem.

http://www.gcocco.com/bench/cpu2k.html#should_i_buy

I don't see any FORTRAN in the CPU2006 tests; it says it's all C or C++, and everything uses those for performance-critical applications: http://www.spec.org/auto/cpu2006/Doc...ibquantum.html

And like I said in my other post, I do agree your best reference is to test the application on your own hardware, just like SPEC recommends. These results are simply for a pure HARDWARE performance comparison between various systems. They are valid and relevant when taking into account other system configuration options like the front-side bus, which is very important for getting all that speed to your memory and hard drive and video card, etc. The front-side bus is probably one of the more critical factors in the "real-world speed" of a computer. This was very apparent in the various models of G4s and G5s with different front-side buses. The one with the faster front-side bus always seemed twice as fast. There was a big difference in speed feel between the 800MHz-bus G5 and the 1GHz-bus G5s. Some of it was processor, but most of it was front-side bus.
post #379 of 400
The one main thing I think is missing from the MBA is internal 3G support. Since this is designed primarily to be a wireless device, that seems like an obvious oversight. Perhaps Apple will bring 3G to the MBA when they are placing large orders for the 3G iPhone.
post #380 of 400
Quote:
Originally Posted by melgross View Post

Now who's interpreting?

SPEC came up with a test suite that tries to take into consideration as many processes and calculations as can be compared equally among hardware systems. Is it perfect? Probably not. But like they said, even though testing your application in your environment is the only way to be sure it will run as you expect, that type of testing is subjective at best and can't be used to compare against other systems without actually acquiring them and running the same tests on that different hardware. You'll also run into differences in how the various OSes handle similar functions. That is why it is important to have tests like SPEC that test at the hardware level to eliminate those differences.

Calling SPEC's tests misleading without data to refute them is just ignorance.

All SPEC is saying is: don't base your purchasing decision on just their data; they are not responsible if you do and it doesn't work out for you; test it for yourself with your apps and your data. That's just common-sense advice. I wouldn't buy something without testing it out first or getting a demo system to play with; not doing that would just be stupid.


You're stuck on this, aren't you?

You refuse to look at tests with programs you might use, but love tests that show nothing about how those programs, or even a computer, will actually perform in your daily use.

I guess there's no point in continuing, if you won't even listen to what the SPEC organization says.

Who's stuck on this? You're trying to compare application testing with hardware testing, and trying to argue that application testing gives more accurate results, when it's really comparing apples to oranges!

Macworld's type of testing, while relevant for the systems they purchased, on the OS they are running, with the applications they are testing, is only relevant within those test parameters. It has absolutely no bearing on how Adobe CS will run on Windows on a Dell with similar hardware specs. There are too many operating-system differences to have an accurate comparison. That's why there are SPEC HARDWARE tests, to make a relevant comparison between similar hardware configurations.

Just because Adobe CS may run well on an iMac you have doesn't mean it will run the same on my identical iMac, taking into consideration software configuration and hardware manufacturing differences.

Again, who is stuck on this???
post #381 of 400
Quote:
Originally Posted by TenoBell View Post

The one main thing I think is missing from the MBA is internal 3G support. Since this is designed primarily to be a wireless device, that seems like an obvious oversight. Perhaps Apple will bring 3G to the MBA when they are placing large orders for the 3G iPhone.

I am wanting this feature too.
post #382 of 400
Quote:
Originally Posted by trboyden View Post

Who's stuck on this? You're trying to compare application testing with hardware testing, and trying to argue that application testing gives more accurate results, when it's really comparing apples to oranges!

Macworld's type of testing, while relevant for the systems they purchased, on the OS they are running, with the applications they are testing, is only relevant within those test parameters. It has absolutely no bearing on how Adobe CS will run on Windows on a Dell with similar hardware specs. There are too many operating-system differences to have an accurate comparison. That's why there are SPEC HARDWARE tests, to make a relevant comparison between similar hardware configurations.

Just because Adobe CS may run well on an iMac you have doesn't mean it will run the same on my identical iMac, taking into consideration software configuration and hardware manufacturing differences.

Again, who is stuck on this???

The application testing is for the purpose of judging which machines are faster at using those applications. What do you think it's measuring?

So go to a site that tests Windows machines using Photoshop and other programs you use, and compare their results to what you saw here. That will give you some idea of how they perform. It isn't perfect, but SPEC only tells you how a small subsystem of the machine performs, when that's not important to overall performance.
post #383 of 400
Quote:
Originally Posted by melgross View Post

The application testing is for the purpose of judging which machines are faster at using those applications. What do you think it's measuring?

So go to a site that tests Windows machines using Photoshop and other programs you use, and compare their results to what you saw here. That will give you some idea of how they perform. It isn't perfect, but SPEC only tells you how a small subsystem of the machine performs, when that's not important to overall performance.

Application testing is a measure of how fast an application runs on THAT machine.

From that information and based on the testing of the same application on another machine with the same OS and software configuration, you can surmise that the second machine's hardware causes the application to be slower or faster. You use a hardware test like SPEC to verify those assumptions and isolate which hardware component creates the bottleneck that slows that computer down or offers an advantage in speed.

Like I said, SPEC is not all-inclusive; you will need to use other tests to get the big picture, or to narrow down to a specific measurement to sharpen the analysis.

A great example of this is when the first tests of Windows were done on Intel Macs. Those first Intel Macs, with their newer Intel hardware, ran Windows better than other machines available during that period. Apple still uses that in its marketing today. SPEC tests showed that the Intel hardware in those Macs was capable of running Windows faster than other PCs available at that time. Did you need the SPEC test to reach that conclusion? Maybe not, but you were able to verify your conclusion and then claim it as fact.
post #384 of 400
Quote:
Originally Posted by trboyden View Post

Application testing is a measure of how fast an application runs on THAT machine.

Yes, that's exactly what I've been saying. That's the point of the whole thing. You want to know how fast your apps will run on a particular machine.

Quote:
From that information, and from testing the same application on another machine with the same OS and software configuration, you can surmise that the second machine's hardware makes the application slower or faster. You then use a hardware test like SPEC to verify those assumptions and isolate which hardware component creates the bottleneck that slows that computer down, or gives it an advantage in speed.

The problem with SPEC, as those on the web sites I provided said, is that SPEC DOESN'T tell you how fast your apps will run.

SPEC doesn't test the machine; it only tests part of what the CPU and, sometimes, the memory bus can do.

SPEC doesn't test AltiVec on the PPC, it doesn't test the various SSE extensions on x86, and that's just the beginning.

SPEC is like measuring how much horsepower your car has: it doesn't tell you how fast the car can go, because it tests nothing other than the engine.

Quote:
Like I said, SPEC is not all-inclusive; you will need to use other tests to get the big picture, or to narrow down to a specific measurement to sharpen the analysis.

A great example of this is when the first tests of Windows were done on Intel Macs. Those first Intel Macs, with their newer Intel hardware, ran Windows better than other machines available during that period. Apple still uses that in its marketing today. SPEC tests showed that the Intel hardware in those Macs was capable of running Windows faster than other PCs available at that time. Did you need the SPEC test to reach that conclusion? Maybe not, but you were able to verify your conclusion and then claim it as fact.

Well, we can agree on part of our argument then; that's progress.

But you can see some of what we're saying with the Win-on-Intel tests. You don't need SPEC to measure what you've already measured. SPEC may tell you something that agrees with your OS and software tests, but it may not.

If the OS and software make heavy use of SSE, then SPEC tests won't help you very much. The same is true of the bus, the HDD, and the video card, which Apple uses extensively for the OS these days (as do other software developers), as you know.

Also, consider that Vista has been reliably reported across the Windows sites to be considerably slower at running programs than XP. SP1 for Vista has also been shown not to give a noticeable speed increase, despite MS's statements. Some estimates are that Vista runs many programs at HALF the speed of XP on equivalent machines.

This is something else that SPEC won't tell you.
post #385 of 400
Quote:
Originally Posted by solipsism View Post

I am wanting this feature too.

Then people will bitch because it won't be free. Don't you have to pay for 3G cellular internet access?
Follow me on Twitter.
post #386 of 400
Quote:
Originally Posted by His Dudeness View Post

Then people will bitch because it won't be free. Don't you have to pay for 3G cellular internet access?

Sure do. But Apple would have had to make it compatible with both US services and with the varying frequencies used around the world. That would add weight and cost without benefiting most users. I wouldn't have added it either. I will be getting a 3G USB dongle for my carrier once I get my Air.
Dick Applebaum on whether the iPad is a personal computer: "BTW, I am posting this from my iPad pc while sitting on the throne... personal enough for you?"
post #387 of 400
I hope to be ordering an MBA in a day or two.
post #388 of 400
This is a great machine. I hope the basic design lasts through several generations.

That said, I still need a Pro, though not by as much as some others here. I would guess that a second-generation Air, with Penryn and 4 GB, would seal the deal for me. The 4 GB is because of Parallels. Yes, I must keep some Windows apps around, even Linux. I've got databases, Java apps, and other memory hogs to deal with.

But I'm a so-called "power user". For my dad or my wife this thing is plenty powerful and worth every penny. My only nit is that it should have a second USB port. Not because you need more than one (get a hub), but because ports are prone to wear and tear, so if the one port breaks or starts making poor contact, you lose too much functionality. Now surely some will reply that you should just service the machine, but it would be much better to avoid that in the middle of a college semester or some other situation where giving up the machine would be a hassle.
post #389 of 400
Quote:
Originally Posted by melgross View Post

Quote:
Originally Posted by trboyden View Post

Application testing is a measure of how fast an application runs on THAT machine.

Yes, that's exactly what I've been saying. That's the point of the whole thing. You want to know how fast your apps will run on a particular machine.



The problem with SPEC, as those on the web sites I provided said, is that SPEC DOESN'T tell you how fast your apps will run.

SPEC doesn't test the machine; it only tests part of what the CPU and, sometimes, the memory bus can do.

SPEC doesn't test AltiVec on the PPC, it doesn't test the various SSE extensions on x86, and that's just the beginning.

SPEC is like measuring how much horsepower your car has: it doesn't tell you how fast the car can go, because it tests nothing other than the engine.



Well, we can agree on part of our argument then; that's progress.

But you can see some of what we're saying with the Win-on-Intel tests. You don't need SPEC to measure what you've already measured. SPEC may tell you something that agrees with your OS and software tests, but it may not.

If the OS and software make heavy use of SSE, then SPEC tests won't help you very much. The same is true of the bus, the HDD, and the video card, which Apple uses extensively for the OS these days (as do other software developers), as you know.

Also, consider that Vista has been reliably reported across the Windows sites to be considerably slower at running programs than XP. SP1 for Vista has also been shown not to give a noticeable speed increase, despite MS's statements. Some estimates are that Vista runs many programs at HALF the speed of XP on equivalent machines.

This is something else that SPEC won't tell you.

I guess we'll just have to agree to disagree on what SPEC tests. Here is their description of what they are testing:

Quote:
There are several different ways to measure computer performance. One way is to measure how fast the computer completes a single task; this is a speed measure. Another way is to measure how many tasks a computer can accomplish in a certain amount of time; this is called a throughput, capacity or rate measure.

The SPEC speed metrics (e.g., SPECint2006) are used for comparing the ability of a computer to complete single tasks.
The SPEC rate metrics (e.g., SPECint_rate2006) measure the throughput or rate of a machine carrying out a number of tasks.

If you look at the programs they use for testing, they are all very CPU-, front-side-bus-, and memory-intensive applications, so the score you get is an overall system speed score.

If more of the manufacturers ran the throughput test (Apple does run the Xserve through it), you'd have a better idea of graphics performance from the POV-Ray test. That one test is very applicable to graphics and video applications.

No, you're right, it doesn't test AltiVec or SSE features, but those are application programming extensions that don't reflect the base-level performance of a system, which is the focus of the SPEC tests. AltiVec and SSE only apply if an application takes advantage of them; same deal with high-end video cards. Does that make a difference in how an application performs? Of course, which is why you do the application testing at all.

I realize the SPEC tests don't come right out and give you a nice summary of what the tester feels the results show, but that doesn't mean they don't give you valuable information about how system-intensive apps run on a given system. You just have to read what the test does and review what the researcher was looking for in the results. I guess I just hung around the CS department at MIT too long. The results make sense to me, and it is perfectly clear (to me, anyway) that a machine that runs a batch of standardized tests faster than another machine must be faster.
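As an aside on how those composite numbers come about: SPEC CPU composite scores are the geometric mean of per-benchmark ratios (reference machine's time divided by the tested machine's time), so no single benchmark can dominate the score. A minimal sketch of that arithmetic; the three ratios below are invented for illustration, not real SPEC results:

```shell
# Illustrative only: a SPECint-style composite is the geometric mean of
# per-benchmark ratios (reference time / measured time). Ratios are made up.
geomean() {
  awk 'BEGIN {
    p = 1
    for (i = 1; i < ARGC; i++) p *= ARGV[i]   # multiply all ratios together
    printf "%.2f\n", p ^ (1 / (ARGC - 1))     # take the n-th root
  }' "$@"
}

geomean 10.0 12.5 8.0   # → 10.00
```

Note how the geometric mean rewards balanced results: a machine twice as fast on one benchmark but half as fast on another scores the same as one that is even across the board.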
post #390 of 400
Quote:
Originally Posted by trboyden View Post

Application testing is a measure of how fast an application runs on THAT machine.

I guess we'll just have to agree to disagree on what SPEC tests. Here is their description of what they are testing:



If you look at the programs they use for testing, they are all very CPU-, front-side-bus-, and memory-intensive applications, so the score you get is an overall system speed score.

If more of the manufacturers ran the throughput test (Apple does run the Xserve through it), you'd have a better idea of graphics performance from the POV-Ray test. That one test is very applicable to graphics and video applications.

No, you're right, it doesn't test AltiVec or SSE features, but those are application programming extensions that don't reflect the base-level performance of a system, which is the focus of the SPEC tests. AltiVec and SSE only apply if an application takes advantage of them; same deal with high-end video cards. Does that make a difference in how an application performs? Of course, which is why you do the application testing at all.

I realize the SPEC tests don't come right out and give you a nice summary of what the tester feels the results show, but that doesn't mean they don't give you valuable information about how system-intensive apps run on a given system. You just have to read what the test does and review what the researcher was looking for in the results. I guess I just hung around the CS department at MIT too long. The results make sense to me, and it is perfectly clear (to me, anyway) that a machine that runs a batch of standardized tests faster than another machine must be faster.

I'll cut this short, so that I don't make the post too long again.

SPEC doesn't test the system. That's the problem. As I've been saying, it tests the CPU, the memory controller, and sometimes, but only sometimes, the RAM.

It tests nothing else.

The problem with SPEC, and believe me, I'm not the only one to say this besides those in the links I provided earlier, is that it tests a subset of what determines how fast a computer will run any piece of software.

It's too far behind the curve, as it doesn't recognize the methodologies that newer machines and OSes use. Therefore its usefulness is limited to testing the relative performance of one CPU against another, and even that determines less than 50% of a system's performance.

When an OS such as OS X, and now, to a lesser extent, Vista, offloads part of the work to the GPU, how does SPEC account for that? It doesn't. It happily runs those routines through the CPU, even though that will never happen in real-world use.

Those are some of the objections that I, and others, have to SPEC.
post #391 of 400
@ trboyden & melgross,

I don't think you two are ever going to see eye to eye on this. Though I do commend you both for the longest, most thorough back-and-forth in this thread that didn't devolve into some sort of juvenile fight.
post #392 of 400
I think what this thread needs is more car analogies.

The way I see it, the SPEC test is like putting a car on a treadmill and testing how fast it will run. It is nice and standardized so you can test all cars the exact same way.
Its downfall is that it leaves out suspension, handling, braking, and the like, as it is testing muscle first and foremost.

The Macworld tests Mel references are more like actual road tests, with cornering, acceleration, and curbs.
Except that I suspect Trboyden might hold that different models of cars are being tested on different stretches of road.


There. I hope I cleared everything up.
Progress is a comfortable disease
--e.e.c.
post #393 of 400
Quote:
Originally Posted by Bageljoey View Post

I think what this thread needs is more car analogies.

The way I see it, the SPEC test is like putting a car on a treadmill and testing how fast it will run. It is nice and standardized so you can test all cars the exact same way.
Its downfall is that it leaves out suspension, handling, braking, and the like, as it is testing muscle first and foremost.

The Macworld tests Mel references are more like actual road tests, with cornering, acceleration, and curbs.
Except that I suspect Trboyden might hold that different models of cars are being tested on different stretches of road.


There. I hope I cleared everything up.

Yeah. Great. Thanks.
post #394 of 400
Quote:
Originally Posted by melgross View Post

I'll cut this short, so that I don't make the post too long again.

It doesn't really matter. Anyone who's going to switch from OS X to XP to save $1,000-$1,200 per machine for a less capable box (2 more cores? who needs 2 more cores? Quad core over dual core is just Intel marketing) isn't really thinking clearly about impacts to workflow and TCO anyway. Even given that Adobe CS is a significant part of that workflow, since, at least for the CS3 beta, OS X outperformed XP on the same hardware (XP under Boot Camp).

I can certainly see not wanting to take the hit of going from XP to OS X when your infrastructure to support XP is already in place. At my company we have a small handful of Mac guys to cover all our Macs. We have scads of PC IT guys. The ratio of support staff to the number of machines they can support favors the Mac pretty heavily.

Let him move to the Dells and XP. His competition will thank him. I doubt his staff will.

If he's that worried about capital expenses I wouldn't even get the bottom-end Precisions and save more money.

On topic: he'd be better off getting them MacBook Airs and small Wacom tablets and letting his staff work from the local coffee joint with free wireless access. Well, no, not really (an MBP WOULD be better), but pretty danged close.

You can increase (or decrease) productivity a lot quicker with environmental changes than with PC selection. Note that the OS is part of that environment...
post #395 of 400
Quote:
Originally Posted by Bageljoey View Post

I think what this thread needs is more car analogies.

I thought we already had car analogies for the Air?
post #396 of 400
No we didn't.
post #397 of 400
Quote:
Sure do. But Apple would have had to make it compatible with both US services and with the varying frequencies used around the world. That would add weight and cost without benefiting most users. I wouldn't have added it either. I will be getting a 3G USB dongle for my carrier once I get my Air.

Very true. I've never really looked into 3G for my laptop, because I need another wireless bill like I need another hole in the head. I didn't realize they made USB dongles.
post #398 of 400
Quote:

Caldercay, now that is funny. May 2007? Never saw one...
"Run faster. History is a constant race between invention and catastrophe. Education helps but it is never enough. You must also run." Leto Atreides II
post #399 of 400
Quote:
Originally Posted by Jeremy Brown View Post

Amazing. Remote Disc is a stroke of genius, I reckon.

I've been doing this for a year already. Since a strap on my bag broke and my PowerBook dropped onto the pavement, its optical drive has been out of order. So when I want to install software from CD, I put the CD in my iMac at home, then access the iMac over the network and mount the CD. Works like a charm.

Kayce
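For anyone wanting to do what kayce describes by hand, the gist is just mounting the other Mac's shared volume over the network. A rough sketch, assuming the iMac is sharing the disc's volume over AFP; the helper function and the host/volume names here are placeholders, not an Apple-documented tool:

```shell
# Hypothetical helper: builds the AFP URL for a volume shared by another Mac.
# "imac.local" and "InstallDisc" are placeholder names, not real shares.
remote_cd_url() {
  printf 'afp://%s/%s\n' "$1" "$2"
}

# Typical use (needs a reachable Mac actually sharing the disc, so not run here):
#   mkdir -p /Volumes/RemoteCD
#   mount_afp "$(remote_cd_url imac.local InstallDisc)" /Volumes/RemoteCD
#   ...install from /Volumes/RemoteCD, then:
#   umount /Volumes/RemoteCD

remote_cd_url imac.local InstallDisc   # → afp://imac.local/InstallDisc
```

In practice the Finder's "Connect to Server" dialog does the same thing without the terminal; the point is only that a mounted network share and a mounted local disc look identical to the installer.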
post #400 of 400
Quote:
Originally Posted by kayce View Post

I've been doing this for a year already. Since a strap on my bag broke and my PowerBook dropped onto the pavement, its optical drive has been out of order. So when I want to install software from CD, I put the CD in my iMac at home, then access the iMac over the network and mount the CD. Works like a charm.

Kayce

I used to master DVDs on my PB and then burn them on my Power Mac via Toast, without having to move the files manually. I thought that was pretty slick.

Optical disc sharing isn't new, but being able to boot from a remote disc is.