
What's left for the Macintosh in a Post-PC iOS World? - Page 5

post #161 of 253
Quote:
Originally Posted by wizard69 View Post

Right off the bat you spit out bull crap: the new Pro has exactly nothing in common with the G4 Cube. To offer up this comment just shows a complete lack of depth when it comes to technology. In other words, you don't know what you are talking about.

 

 

Let me paint a clear picture since this is SOOOOO difficult to grasp.

 

 

 

http://gizmodo.com/the-brilliant-insanity-behind-the-new-mac-pros-design-512574427

 

“Apple® today announced that it will suspend production of the Power Mac™ G4 Cube indefinitely. The company said there is a small chance it will reintroduce an upgraded model of the unique computer in the future, but that there are no plans to do so at this time."

 

"The concept of the new Mac Pro is very similar to that of the old Cube: A powerhouse PC that is very small and externally upgradable. That concept was not viable in 2000, when all we had for I/O was FireWire 400 and USB 1.1. Fast forward to 2013, and the technology has caught up to the archetype. We now have Thunderbolt 2, 802.11ac, and USB 3, not to mention cloud storage options. The expandability limitations are gone."

 

 

post #162 of 253
Quote:
Originally Posted by hmm View Post

That may not be as likely as you think. NVidia has held the majority of the professional graphics market for many years. Part of it is that developers often implement NVidia's research projects rather than their own. As long as that is the case, do not look forward to things that run well on AMD hardware.

This is a bit misleading. OpenCL has been very successful considering its late start against CUDA. CUDA's tie to NVidia's hardware will be its downfall, especially now that OpenCL-accelerated Apple apps run so beautifully on Intel and AMD hardware. Intel's Iris Pro can actually whip a large number of GPUs running OpenCL code, by a large margin.

I'm not saying CUDA is dead today, just that it is being marginalized very quickly in favor of open-standards solutions. People who remain tied to CUDA will be facing competition that can more effectively use the common hardware. In other words, when most systems shipped are equipped with either Intel or AMD integrated GPUs, the developers who target OpenCL will be at a significant advantage in the marketplace. Given these SoC computers, I see developers dropping CUDA like a hot potato.
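To make the portability point concrete, here is a minimal OpenCL host-code sketch (illustrative only, not from the original post; it assumes an installed OpenCL runtime). The same source compiles and runs unchanged against NVidia, AMD, or Intel devices:

/* Minimal OpenCL device enumeration (illustrative sketch).
 * Lists every platform/device pair the runtime can see,
 * whether the GPU is NVidia, AMD, or Intel. */
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

int main(void) {
    cl_platform_id platforms[8];
    cl_uint nplat = 0;
    clGetPlatformIDs(8, platforms, &nplat);

    for (cl_uint p = 0; p < nplat; p++) {
        cl_device_id devs[8];
        cl_uint ndev = 0;
        clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL, 8, devs, &ndev);
        for (cl_uint d = 0; d < ndev; d++) {
            char name[256];
            /* CL_DEVICE_NAME returns the vendor's device string */
            clGetDeviceInfo(devs[d], CL_DEVICE_NAME, sizeof(name), name, NULL);
            printf("platform %u / device %u: %s\n", p, d, name);
        }
    }
    return 0;
}

Compile with, e.g., cc devices.c -framework OpenCL on OS X, or cc devices.c -lOpenCL elsewhere. The vendor-specific part lives entirely in the driver, which is the whole point of the argument above.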
post #163 of 253
Originally Posted by z3r0 View Post
Simple. You made remarks belittling my intelligence because I wrote something negative about the new Mac Pro. So you can detect a bit of bias.

 

No, because you lied about the new Mac Pro.

 
 You do love assumptions, but I'll bite: non-disclosure agreement.

 

Yeah, I know what an NDA is, thanks. You blatantly don’t have one, otherwise you wouldn’t be talking about it at all. And if you do, you don’t deserve it, since you don’t seem to comprehend the new Mac Pro.

 
You really didn't pick up on the telephone analogy, did you?

 

I simply didn’t find it valid, because it isn’t. You’re not daisy chaining computers, nor was that my implication. If you had cared about what you wrote in the first place, you would know we weren’t discussing that.

 
It's really not that hard. The bigger the box, the more space there is for air to travel and cool a system down.

 

It should also really not be that hard to provide a link of an HP computer that has a case volume equal to that of the new Mac Pro. Unless, of course, they don’t have one.

 
We? Who's we? Since when do you speak for Pro users? 

 

Shut up. You have more than a toddler’s grasp of pronouns; you know exactly to whom ‘we’ is referring. Either respond or don’t bother.

 

Originally Posted by z3r0 View Post
Let me paint a clear picture since this is SOOOOO difficult to grasp.

 

Thanks for resizing two images and photoshopping them together. When you’ve an actual argument, feel free to post it.

Originally Posted by Marvin

The only thing more insecure than Android’s OS is its userbase.
Reply

post #164 of 253
Quote:
Originally Posted by z3r0 View Post


Interesting: I quoted a response here and the forum came back with nothing. It must realize there is no substance in your argument.

Beyond that, quoting Gizmodo is a complete failure to harvest rational intelligence from the Internet.
post #165 of 253
Quote:
Originally Posted by z3r0 View Post

The bigger the box, the more space there is for air to travel and cool a system down.

It depends on what you put inside it. You wouldn't, for example, sit a motherboard in an empty room with a giant wall-sized fan blowing over it and say that's the optimal solution. The old Mac Pro had as many as 9 fans inside.



When you have a large volume of air to shift, that's necessary. When you have a smaller volume, it takes less effort to pull air through it. The unified thermal core is just one fan plus one heatsink shared between all the processors, and it'll cool the PSU too. It's not that it's essential; it's just more efficient and should run quieter than the old one. It's also more versatile than ever.
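A rough sanity check on the physics here (a standard convection relation, not a figure from the thread): the heat a stream of air can carry away is

\[ Q = \dot{m} \, c_p \, \Delta T \]

where \(Q\) is heat removed per second, \(\dot{m}\) is the mass flow of air through the heatsink, \(c_p\) is the specific heat of air, and \(\Delta T\) is how much the air warms on the way through. Case volume never appears in that equation; what matters is how much air actually crosses the fins. A compact duct that forces the entire flow over a shared heatsink can therefore match a much larger case whose air mostly circulates idly.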

post #166 of 253

Quote:

Originally Posted by z3r0 View Post


I'm talking about hardware performance and expandability. For example, with the new Mac Pro it looks like you are stuck with AMD/ATI FirePros, meaning that if you are using any software that requires NVIDIA CUDA (like the majority of pro 3D, video, and compositing software) you are screwed.

 

nVidia CUDA? Let me ask that again: nVidia CUDA? For what? Reducing the time taken to complete a video composite by a whole 3 seconds? Sorry, but I'd rather have something that does more basic computations better, you know, like OpenCL.

 

Quote:
Pro Mac users who rely on the Mac Pro did NOT want a different case. All Apple had to do was add new processors (24 physical cores; I've seen 64), better video card options (Quad SLI/Crossfire/Tesla), make SSD drives standard internally, add Thunderbolt 2 PCI expansion card options for those who need them (not that many devices out there, BTW), and add dual redundant power supplies with lights-out management for those who need servers (the Xserve is missed).

You are again counting hardware specs. What you fail to realize is that the new Mac Pro is already twice as fast as the old one. Expandability is lost, yes, but the new form factor more than makes up for it. And what about Thunderbolt 2? A cable that allows you to create a whole hub? Isn't that a bright idea for expandability?

 

Quote:
If Apple really did want to change the case they should have just gone bigger with more expansion slots, not smaller, and definitely not a can!

Please stop with that. Bigger, louder, more "badass": that's something a teenager wants. Apple has always pushed the boundaries by making things smaller and more efficient. Besides, six Thunderbolt ports can give you expandability no Mac Pro or any other Pro desktop can offer.

 

Quote:

I'm sure if you deliberately arrange it like this, it is going to look weird. But that said, it is no more of a mess than having that same hardware inside a case.

 

Quote:

It's a can! The rest is marketing BS! The "thermal core" is a non-factor. HP can cool 24 cores without one just fine. The thermal core makes it impractical to add internal expansion like the Z820 or the previous Mac Pro. If Apple wanted better cooling they would have gone with the Sandia cooler: http://www.youtube.com/watch?v=JWQZNXEKkaU

Stop with the "can". You of all people know it's not a can, or about a can.

Marketing BS? The unified thermal core is a great idea. Use a single fan, and use it not only to cool the CPU and GPUs but also the RAM, storage, chipsets, and the whole motherboard, while making extremely little noise and using less power. By the way, I'm not sure how a heatsink spinning in magnetic equilibrium at 2,000 rpm is supposed to stay error-free when you mount it horizontally, or while moving your computer around when it is on.

 

Quote:
Jony is just obsessed with the G4 Cube, which was a major failure. It didn't take off, just like the new Mac Pro won't. There is a total disconnect between what Pros need and want and what Apple thinks they need and want.

Macintosh - It's nothing. GUI ShmiUI; it's not a match for hardcore programming.

iMac - It's bogus. How is that even a desktop?

iPod - In a world of Sony Walkman and CD players and cassette players? Apple must be crazy.

iPhone - It's idiotic. A phone so expensive that doesn't even record video?

iPad - Who in their right mind wants to go buy this large iPhone when they already have a notebook?

 

Apple has stubbornly continued to innovate and lead the industry. The new Mac Pro is the way ahead. It solves the problem of space wastage, improves performance, is silent and looks cool.

post #167 of 253
Quote:
Originally Posted by z3r0 View Post
 

It's a consumer product, not a Pro product now.

 

The writing is on the wall for Pro users on the Mac. Apple just doesn't care about, nor want to listen to, its most passionate customers in the pro sector.

 

I hope you're right. I hope the new Mac Pro is a monumental flop and they announce they'll never make another one. "Pro" users clamoring for the next Mac Pro or arguing about what it should be might be the most annoying people on this forum; see the last 30 posts. Good riddance.

2014 27" Retina iMac i5, 2012 27" iMac i7, 2011 Mac Mini i5
iPad Air 2, iPad Mini Retina, iPhone 6, iPhone 5S, iPod Touch 5
Time Capsule 5, (3) AirPort Express 2, (2) Apple TV 3

Reply

2014 27" Retina iMac i5, 2012 27" iMac i7, 2011 Mac Mini i5
iPad Air 2, iPad Mini Retina, iPhone 6, iPhone 5S, iPod Touch 5
Time Capsule 5, (3) AirPort Express 2, (2) Apple TV 3

Reply
post #168 of 253
Quote:
Originally Posted by Marvin View Post

It depends on what you put inside it. You wouldn't, for example, sit a motherboard in an empty room with a giant wall-sized fan blowing over it and say that's the optimal solution. The old Mac Pro had as many as 9 fans inside.
This highlights that, for a given amount of power, a bigger box is much harder to cool. The old Mac Pro was actually a pretty poor design, a holdover from the PowerPC days when big heat sinks or a radiator were a requirement.

In any event, I don't understand how somebody can't grasp the efficiency of Apple's new arrangement. On other platforms each of those GPUs would have one or two fans, the CPU would have its fan, the power supply its fans, and then a few case fans. The Mac Pro has one fan to address all of that. Like it or not, that is innovation.
Quote:


When you have a large volume of air to shift, that's necessary. When you have a smaller volume, it takes less effort to pull air through it. The unified thermal core is just one fan plus one heatsink shared between all the processors, and it'll cool the PSU too. It's not that it's essential; it's just more efficient and should run quieter than the old one. It's also more versatile than ever.


It is a very well-engineered product that throws out decades of computer design. Gone is the world of ATX chassis with hot spots and enough fans to lift a Harrier jet. Obviously nobody has stress-tested a new Mac Pro yet, but I don't see a huge problem myself.
post #169 of 253
Quote:
Originally Posted by Tallest Skil View Post
 

No, because you lied about the new Mac Pro.

 

I lied? Wow, those are some pretty strong accusations there, buddy. You seem a bit emotionally charged. Yep, definitely biased. Seems to me that the only one that lied is Apple. They promised a Mac Pro and didn't deliver. So what don't you like about the new Mac Pro?

 

Yeah, I know what an NDA is, thanks. You blatantly don’t have one, otherwise you wouldn’t be talking about it at all. And if you do, you don’t deserve it, since you don’t seem to comprehend the new Mac Pro.

 

I haven't said anything that would break it.

 

I simply didn’t find it valid, because it isn’t. You’re not daisy chaining computers, nor was that my implication. If you had cared about what you wrote in the first place, you would know we weren’t discussing that.

 

Well, I'm not sure how useful it would be to have six if you aren't using them together for rendering... unless they were used as servers. Oh wait, you don't consider using a Mac Pro as a server a valid use case for it.

 

It should also really not be that hard to provide a link of an HP computer that has a case volume equal to that of the new Mac Pro. Unless, of course, they don’t have one.

 

Why would you want an HP or any computer at the volume of the new Mac Pro? It does not have the volume to be considered a proper workstation.

 

Shut up. You have more than a toddler’s grasp of pronouns; you know exactly to whom ‘we’ is referring. Either respond or don’t bother.

 

So touchy... Not very logical.

 

Thanks for resizing two images and photoshopping them together. When you’ve an actual argument, feel free to post it.

 

You didn't even bother to open the link and read the article. If you had, you would have noticed that the image does not belong to me, nor did I create it. I guess you are just afraid to be wrong.

post #170 of 253
Originally Posted by z3r0 View Post
Well, I'm not sure how useful it would be to have six if you aren't using them together for rendering...

 

Conversation’s over. When you decide to stop being intentionally dense and actually respond to what is being said, we’ll pick it back up.

Originally Posted by Marvin

The only thing more insecure than Android’s OS is its userbase.
Reply

post #171 of 253
Quote:
Originally Posted by Tallest Skil View Post

The former doesn’t do a quarter of what the latter does. It’s old tech and you know it. USB as a format–as a physical construct–has been around since ’96. “It’s faster” doesn’t make it new tech.
The difference is that people actually use USB and actually know what it is. The fact that TB technically does more is immaterial to the fact that USB does what 99% of end-users want an external connection bus to do, and it's exceedingly hard to sell them on anything else, even if it's technically better (as FireWire was, not that it ever mattered), especially if it costs orders of magnitude more.

Plus, it's not like PCI Express as a format hasn't been around for almost a decade itself. Sure, TB has a different connector, which is what I assume you mean by "physical construct", but somehow I doubt you'd suddenly change your opinion on USB being "new tech" if the new version introduced an incompatible new connector type à la FW800.
Quote:
Quote:
Originally Posted by Durandal1707 View Post

Thunderbolt also lacks future-proofness, because:

1. Apple removed ExpressCard slots because, supposedly, only a small percentage of users were using it. However, I'd be willing to wager that the number of people using ExpressCard was many times larger than the number of people who are using Thunderbolt for anything other than displays. No one is using Thunderbolt — and we know what Apple does to technologies that no one is using.

2. Thunderbolt is tied to Intel, and Apple will probably eventually start making ARM-based Macs at some point.

1. Having what to do with Thunderbolt at all? Thanks for proving you know what everyone (rather “no one”) is doing¡
Apple has a history of dumping connections that are perceived as not being widely used. Thunderbolt certainly fits that bill.
Quote:
2. Yeah, that co-development sure wouldn’t let Apple do anything with the format, huh¡
The co-development, as I understand it, is rumor only. Officially TB is Intel's, and it's quite possible that they won't allow it to be ported to non-Intel processors. Of course, it's also unlikely that makers of computers using non-Intel processors would care.
Quote:
Prepare to never be surprised.
We'll see. If Thunderbolt is removed at some point, at least from the lower-range consumer models, I won't be surprised one bit. Apple's got a history of trying to champion something for a while, and then giving up.
Quote:
Originally Posted by wizard69 View Post

Actually, TB adoption has been rather quick in my mind. Frankly, it is also being hooked up to monitors that act as hubs or docking stations for laptops, so his argument makes absolutely no sense. There is no telling what is hooked up to those monitors / docking stations. Beyond that, I believe Apple got 99% of what it wanted out of TB as a docking port.
How many people do you know who use the Thunderbolt port for anything other than a monitor? The only Thunderbolt accessory that even approaches being reasonably priced is the Apple Thunderbolt Display, and that costs $1000. The cheapest Thunderbolt docking station is $250, and the cheapest one that includes another Thunderbolt port so you're not blocked from attaching a DisplayPort monitor costs $300. In contrast, a USB 3.0 hub, which, again, does what 99% of users would do with those Thunderbolt docks anyway — plug stuff into the USB ports — can be had for around $20. Sure, the Thunderbolt hub is cooler, but the sticker shock from the price tag usually kills the deal in favor of something USB-related. This of course leads to a vicious cycle — USB gets more usage because it's cheaper, USB gets cheaper due to economies of scale from its large user base, USB gets even more usage because it's even cheaper. Thunderbolt can't really compete with this.
Quote:
Intel has as much as said other companies can do whatever to build TB parts.
Link please.
Quote:
If the PC world isn't, it most likely is because they lost their way
Whether they've "lost their way" or not is immaterial. If the PC world doesn't adopt Thunderbolt as a technology, it will be dead within a few years. This is what happened to FireWire, and it had a much larger adoption in the PC world than Thunderbolt does. Apple couldn't keep these things alive on their own even back when they were still giving the Mac line their full attention. Now that they're focused on iDevices, which incidentally don't support Thunderbolt, there's no chance of them pulling it off solo.
Quote:
Originally Posted by Tallest Skil View Post

In some of the same way that the Mac Mini was step 2 of the G4 Cube design, Thunderbolt is really just step 2 of ADC. Think about it: power, video, audio, USB, FireWire (Ethernet, everything else anyone could want…), all in one cable.
YES, EXACTLY. And what happened to ADC? After a few years, it disappeared in favor of the technology it was supposed to replace — DVI.
Quote:
The difference here is this one is a piece of cake to work with and will be available on everything from everyone.
Heh.
Quote:
Intel really needs to say, “Look, if you want to make boards for our chips, you’ll put Thunderbolt ports on them.”
If they had done this immediately when TB came out, it might have had a chance. But for some reason, they haven't, and likely won't in the future either. Meanwhile USB 3.0 has stolen its thunder, so to speak, and the advent of ARM-based PCs like the Surface is lessening Intel's hold on the PC market anyway.
Edited by Durandal1707 - 9/30/13 at 12:03pm
post #172 of 253
Quote:
Originally Posted by wizard69 View Post


For whom? How many real people would put such a configuration on their desktop?

 

Video editors, 3D artists, visual designers, scientists (big data, simulations, etc.), people running multiple OSes, render farms, servers, etc.

 

First off, nobody in their right mind is going to hook up a GPU card via TB.

 

Apple is encouraging it with the design. An external chassis will be the only way to go if you need NVIDIA, unless Apple is planning to offer another configuration option.

 

It isn't even a valid argument. Second, AMD has been doing much better with OpenCL than NVidia. Buying software reliant upon a sole-source technology like CUDA is just stupid.

 

Many would disagree. Look at benchmarks and heavy data crunching. Look up NVIDIA Tesla.

 

However, anybody familiar with technology would have realized where the industry is going. The new Mac Pro reflects that, and the technologically literate acknowledge it.

 

Assumption

 

If you want to argue that Apple should build a server then I'm with you! Sometimes you need a product that isn't mass production to fill real niches.

 

Well, the current model still lacks dual redundant PSUs and LOM, but Apple does sell it as a workstation-class server.

 

That would result in a machine with even fewer sales than the current model.

 

They don't necessarily need to offer only one model.

 

As for your derisive labeling of the new Pro as a spaghetti monster, you leave out all of those breakout boxes required to support reasonable amounts of I/O. Further, you don't seem to realize that there is value in putting the conversion hardware near the source material.

 

Cables can be accidentally pulled, leading to problems. Internal components are preferred, to protect them and avoid the mess.

 

On the contrary, it is a big factor, as it puts a lot of performance into a small volume. That is huge.

 

Huge according to Apple marketing. In the end, workstations stay in one spot; if they didn't, they would be called laptops.

 

You can say that, but have you really looked into what is required to keep a server room cool?

 

Yes, I have. I have also built and maintained several datacenters.

 

Apple wanted a better design overall, and better cooling helped them get there. You can nitpick all you want, but the reality is: they are innovating in a stale industry here.

 

Apple is shifting focus to the consumer and not the Pro.

 

Consider this: you, as a so-called Pro, are a very tiny minority in the overall pool of Pros. I'd be willing to say right now that the Mac Pro will be a huge hit if (this is a big if) Apple prices it right. Frankly, I suspect that the whole point of the design is to allow Apple to be a bit aggressive in pricing.

By the way, the G4 Cube failed for one simple reason: it was priced grossly out of range compared to what the hardware was capable of. Frankly, that was a similar problem with the old Mac Pro, which wasn't competitive at all unless you were buying a high-end machine. You seem to think the old Mac Pro was a huge winner for Apple, yet everything indicates that it was in rapid decline with few serious nibbles. Frankly, the old Mac Pro is a T-Rex of a machine that users, real pros in this case, lost interest in.

 

The G4 Cube's price point was less than that of the Power Mac. What killed it was its lack of internal expandability.

post #173 of 253
Originally Posted by Durandal1707 View Post
The difference is that people actually use USB and actually know what it is.

 

Did they in ’96? I don’t remember that. I do remember USB being “horrible” and the lack of legacy support a “deal breaker”.

 
The fact that USB technically does more is immaterial to the fact that ADB does what 99% of end-users want an external connection bus to do, and it's exceedingly hard to sell them on anything else, even if it's technically better, especially if it costs orders of magnitude more. 

 

You, when the first iMac was released.

 

Plus, it's not like PCI Express as a format hasn't been around for almost a decade itself. Sure, TB has a different connector, which is what I assume you mean by "physical construct", but somehow I doubt you'd suddenly change your opinion on USB being "new tech" if the new version introduced an incompatible new connector type à la FW800.

 

If said new connector was hot swappable where the previous one wasn’t (oh, say, like PS/2 and ADB vs USB), yeah, I’d be ALL over that. And look at this! Thunderbolt is hot swappable where PCIe hasn’t been!

 
Apple has a history of dumping connections that are perceived as not being widely used. Thunderbolt certainly fits that bill.

 

No, they don’t. Why would you even say something so dumb? They have a history of dumping connectors that are OLD or WORSE than newer solutions. That’s why VGA was dropped as soon as DVI came out, despite VGA still being on every #^$#^%*! motherboard to this date. That’s why ADB and SCSI were dropped despite them (PS/2 instead of ADB) still being on every motherboard until the mid-noughties!

 

If there is a new tech and Apple sees it as better, in every way, than the old, they will adopt it and kill off the old. Nothing at all to do with “not being widely used”. Jeez, I have to say it again: WHY would you say something so dumb? They purposely introduced MiniVGA, MiniDVI, and MicroDVI, for heaven’s sake! They weren’t being used at all until Apple created them!

 
We'll see.

 

Already have.

 
If Thunderbolt is removed at some point, at least from the lower-range consumer models, I won't be surprised one bit.

 

Because it will be replaced by a better technology in the future, just as ADB was by USB. Would you have said “see, I told you ADB was bad; Apple got rid of it!”? You’d have been laughed out of the… what, I guess we had forums back then… of the chat room.

 
Apple's got a history of trying to champion something for a while, and then giving up.

 

They also have a history of championing things and then getting them to not only stick but become the only way things are done. Do I have to say it a third time?

Keyboard and screen. Commercial GUI. Commercial mouse input. Keyboard moved to the back of the case. USB. No floppies. FireWire (tell a pro he can’t have FireWire anymore). 802.11 lineup-wide. No optical drive. And now Thunderbolt.

 
 In contrast, a USB 3.0 hub… …can be had for around $20.

 

Wow, a technology 17 years old is less expensive than a technology 3 years old. Forgive me if I wet myself in shock.

 
YES, EXACTLY. And what happened to ADC? After a few years, it disappeared in favor of the technology it was supposed to replace — DVI.

 

Thanks for latching on to the stupidest answer. 

 
Heh.

 

So I guess a plug half the size of a USB port is harder to work with than one four times its size that you have to screw into a PCI card, huh¡

 
…the advent of ARM-based PCs like the Surface is lessening Intel’s hold on the PC market anyway.

 

Not for about five years.

Originally Posted by Marvin

The only thing more insecure than Android’s OS is its userbase.
Reply

post #174 of 253
Everyone forgets that PCs are still the most used devices, whether they're the most sold or not.
And those are where people do their work, not on their iDevices or Androids. So, basically, a computer is a lot more useful for someone who wants to do work, and until iDevices can replace the working function of a computer, we won't see the death of it any time soon.
post #175 of 253
Quote:
Originally Posted by AppleInsider View Post

[...] there's a lot of desperate attempts to euphemize this trend under the softer language of "tablets," as if $49 White Box tablets have subsumed the personal computing business just because vendors of all stripes are shipping them.

Usage data paints a different picture: iPads are simply eating up PC sales while White Box tablets are doing their best impersonation of the 2008 netbook, reclined on a retail inventory shelf whispering to as-yet unconvinced buyers how super cheap they are and how they’re ready to do anything.

 

 

Apparently Kevin Bostic's information comes from a different source:

 

Android overtakes Apple's iPad in tablet marketshare, approaches in revenue earned

 

Quote:

"Apple's iPad and iPad mini are still the most popular tablets in the world, but Android-powered tablets are grabbing an increasing share of the market and are collectively approaching the iPad in terms of revenue, according to a new study from ABI Research.


"During the second quarter of 2013, Android-powered tablets as a collective overtook the iPad and iPad mini in terms of market share [...] The figures mirror previous findings from other market research firms. [...] tablets running Android are finally approaching Apple's offerings in terms of revenue generated."

 

How did you come to the conclusion that other tablets are not selling? They are perhaps not doing as much surfing, but that doesn't mean they're not selling. The latter cannot be inferred from the former.

 

BTW, I didn't make it through the entire piece. I got as far as your sarcastic flaming and had to bail. I don't know why you do that. I used to think you were better than that.

post #176 of 253
Is this really the best you can do?
Quote:
Originally Posted by Tallest Skil View Post

Did they in ’96? I don’t remember that. I do remember USB being “horrible” and the lack of legacy support a “deal breaker”.
Apple gets credit for popularizing that one, for sure. However, Intel's baking it onto every motherboard, the fact that it filled a space in the market that didn't really exist up to that point, and, most crucially, the fact that it was cheap even when it was new all helped a lot.
Quote:
Quote:
The fact that USB technically does more is immaterial to the fact that ADB does what 99% of end-users want an external connection bus to do, and it's exceedingly hard to sell them on anything else, even if it's technically better, especially if it costs orders of magnitude more.
You, when the first iMac was released.
ADB, seriously? More like PS/2, the RS-232 serial port, the parallel port, the DA-15 game port, the Mini-DIN 8 serial port, ADB, SCSI, and probably others I'm forgetting about. There was no ubiquitous, monolithic serial port like USB in 1998 to compete against; there was only an alphanumerabet soup of various different ports, which had just about no name recognition with the general public, and in fact was very confusing to most of them. Nothing like USB existed on the market in 1998.

USB also had definite user experience improvements that actually meant something to the average end user, in that it was 1) hot-swappable, 2) available on all platforms, and 3) had only one port type to worry about for a slew of devices ranging from keyboards to modems to printers to hard drives. The ability to drop a grand on a PCIe chassis doesn't really have the same resonance with mom-type users.

Finally, USB was nowhere near as expensive in 1998-1999 as Thunderbolt is today, and it certainly wasn't an order of magnitude more than existing peripherals. An old Logitech mouse cost $60 (Computer Reseller News, March 11, 1996, 125); the same mouse for USB cost... $50 (Macworld, June 1999, 68-78). An IOMega Zip Plus Drive for SCSI cost $200 (Macworld, Feb. 1998, 51), whereas the upgraded Zip 250 for USB cost... $200 (Macworld, April 1999, 56). A Global Village TelePort 56K modem for the Mini-DIN 8 serial port cost $169 (Macworld, Apr. 1998, 87-91); a Global Village TelePort 56K USB modem cost $139 (Macworld, Oct. 1999, p36). SCSI CD-RW drives generally fell in the $400-$600 range (Macworld, Sept. 1998, 91-95), whereas USB ones a year later ranged from $300-$400 (Macworld, Sept. 1999, p38). I could go on, but USB did not have anywhere near the price problem that Thunderbolt has.
Quote:
If said new connector was hot swappable where the previous one wasn’t (oh, say, like PS/2 and ADB vs USB), yeah, I’d be ALL over that. And look at this! Thunderbolt is hot swappable where PCIe hasn’t been!
Exactly! Except USB is hot-swappable, and Thunderbolt doesn't offer any improvement that translates to a benefit for the average home user.
Quote:
Quote:
Apple has a history of dumping connections that are perceived as not being widely used. Thunderbolt certainly fits that bill.
No, they don’t. Why would you even say something so dumb?
Someone's having a hard time coming up with cogent arguments, if you're resorting to this.
Quote:
They have a history of dumping connectors that are OLD or WORSE than newer solutions. That’s why VGA was dropped as soon as DVI came out, despite VGA still being on every #^$#^%*! motherboard to this date. That’s why ADB and SCSI were dropped despite them (PS/2 instead of ADB) still being on every motherboard until the mid naughties!
Apple HDI-45 connector — introduced in 1994, combined video, audio, and power — replaced with plain old DB-15 in 1995.

PCI / PCIe — gradually removed from the entire desktop lineup since 1998. The usual reason given is that most users don't use PCI slots.

ADC — introduced in 2000, combined video (DVI), USB, and power — replaced with... drum roll... plain DVI in 2004, causing a huge PITA for anyone who'd bought an ADC monitor.

FireWire — meant to be used instead of USB for high-speed devices — removed from the consumer notebook lineup in favor of the slower, less capable USB in 2008 (with a brief backpedal in early 2009 when they brought back the older MacBook design for a short-lived period of time)

ExpressCard — basically was to USB and FireWire what Thunderbolt is to USB — replaced by... an SD card slot (?!?!?!?!) in 2009.

User-replaceable RAM slots — removed in the MacBook Air and Retina MBP, replaced by nothing. Usual reason given is that most users don't open up their laptops.

User-replaceable storage — see above.

Apple's got a long history of removing things they think people aren't using, regardless of whether or not they're superior to what people are using.

Edit: Note that this is just a list of hardware examples. If one were to list the amount of times Apple has removed software features that were deemed unpopular... well, this would be a very long list.
Quote:
If there is a new tech and Apple sees it as better, in every way, than the old, they will adopt it and kill off the old. Nothing at all to do with “not being widely used”. Jeez, I have to say it again: WHY would you say something so dumb?
http://www.youtube.com/watch?v=jhVVRSY7z0I#t=03m09s (the link isn't jumping to the right spot sometimes, so go to 3 minutes 9 seconds in)
Quote:
They purposely introduced MiniVGA, MiniDVI, and MicroDVI, for heaven’s sake! They weren’t being used at all until Apple created them!
VGA and DVI were extremely widely used connector types.
Quote:
Because it will be replaced by a better technology in the future, just as ADB was by USB. Would you have said “see, I told you ADB was bad; Apple got rid of it!”? You’d have been laughed out of the… what, I guess we had forums back then… of the chat room.
http://en.wikipedia.org/wiki/Straw_man_argument
Quote:
Thanks for latching on to the stupidest answer. 
Here we go again with the personal attacks. Classy.
Quote:
So I guess a plug half the size of a USB port is harder to work with than one four times its size that you have to screw into a PCI card, huh¡
I was referring more to the "available on everything from everyone" comment.
Edited by Durandal1707 - 9/30/13 at 3:42pm
post #177 of 253
Quote:
Originally Posted by Durandal1707 View Post

The difference is that people actually use USB and actually know what it is. The fact that TB technically does more is immaterial to the fact that USB does what 99% of end-users want an external connection bus to do, and it's exceedingly hard to sell them on anything else, even if it's technically better (as FireWire was, not that it ever mattered), especially if it costs orders of magnitude more.
USB works for people looking for a low-cost port that works well enough. However, there are plenty of situations where USB simply doesn't work well at all, especially on other platforms. Just try doing anything real-time over USB and see what happens.
Quote:
Plus, it's not like PCI Express as a format hasn't been around for almost a decade itself. Sure, TB has a different connector, which is what I assume you mean by "physical construct", but somehow I doubt you'd suddenly change your opinion on USB being "new tech" if the new version introduced an incompatible new connector type à la FW800.
You are saying what here?
Quote:
Apple has a history of dumping connections that are perceived as not being widely used. Thunderbolt certainly fits that bill.
Not exactly. They drop ports when they no longer serve a purpose or have been eclipsed by newer technology. TB fills an important role for Apple; as such, it won't be replaced anytime soon.
Quote:
The co-development, as I understand it, is rumor only. Officially TB is Intel's, and it's quite possible that they won't allow it to be ported to non-Intel processors. Of course, it's also unlikely that makers of computers using non-Intel processors would care.
Apple and Intel were working together on some level here, as Intel's development hardware was an Apple Mac Pro.
Quote:
We'll see. If Thunderbolt is removed at some point, at least from the lower-range consumer models, I won't be surprised one bit. Apple's got a history of trying to champion something for a while, and then giving up.
Actually, TB is at its best on low-end consumer hardware; it gives things like the Air incredible expansion capabilities and the ability to handle apps beyond what the laptop's size would imply.
Quote:
How many people do you know who use the Thunderbolt port for anything other than a monitor? The only Thunderbolt accessory that even approaches being reasonably priced is the Apple Thunderbolt Display, and that costs $1000. The cheapest Thunderbolt docking station is $250, and the cheapest one that includes another Thunderbolt port so you're not blocked from attaching a DisplayPort monitor costs $300. In contrast, a USB 3.0 hub, which again, does what 99% of users would do with those Thunderbolt docks anyway — plug stuff into the USB ports — can be had for around $20.
That makes a whole lot of sense, comparing a passive hub with the capabilities of any of those docks. The two pieces of hardware aren't even in the same ballpark. One is hardware for the pee-wee leagues and the other is suitable for pro users.
Quote:
Sure, the Thunderbolt hub is cooler, but the sticker shock from the price tag usually kills the deal in favor of something USB-related. Which of course leads to a vicious cycle — USB gets more usage because it's cheaper, USB gets cheaper due to economies of scale from its large user base, USB gets even more usage because it's even cheaper. Thunderbolt can't really compete with this.
Your fundamental problem is made obvious right here: Thunderbolt is not designed to compete with USB. It is a fundamental mistake to try to put them into competition with each other. If Apple had intended TB as a USB replacement they would have deleted the USB ports; instead, they supported USB 3 as soon as Intel got its act together.
Quote:
Link please.
Do a little footwork yourself.
Quote:
Whether they've "lost their way" or not is immaterial. If the PC world doesn't adopt Thunderbolt as a technology, it will be dead within a few years. This is what happened to FireWire, and it had a much larger adoption in the PC world than Thunderbolt does. Apple couldn't keep these things alive on their own even back when they were still giving the Mac line their full attention. Now that they're focused on iDevices, which incidentally don't support Thunderbolt, there's no chance of them pulling it off solo.
Apple has gotten everything they have wanted out of TB, they won't be dropping it anytime soon.
post #178 of 253
Quote:
Originally Posted by wizard69 View Post


This is a bit misleading. OpenCL has been very successful considering its late start against CUDA. CUDA's tie to NVidia's hardware will be its downfall, especially now that OpenCL-accelerated Apple apps run so beautifully on Intel and AMD hardware. Intel's Iris Pro can actually whip a large number of GPUs running OpenCL code, by a large margin.

I'm not saying CUDA is dead today, just that it is being marginalized very quickly in favor of open-standards solutions. People who remain tied to CUDA will be facing competition that can more effectively use the common hardware. In other words, when most systems shipped are equipped with either Intel or AMD integrated GPUs, the developers who target OpenCL will be at a significant advantage in the marketplace. Given these SoC computers, I see developers dropping CUDA like a hot potato.

I wasn't referring solely to OpenCL. I was pointing out that tools provided by NVidia are not likely to be completely optimized for AMD hardware. It goes beyond CUDA; they experiment in all kinds of stuff. The raytracer Adobe uses in After Effects, which has come up a couple of times (yeah, yeah, I know computers do more than edit movies, but this example has come up), is designed to use CUDA and is very slow running on the CPU. They basically implemented work done by NVidia. If the initiative had begun with Adobe, it's not as likely that it would have been CUDA-based. Speaking of OpenCL, have you seen its 2.0 specification? I have nothing against it. I merely wanted to point out that it isn't the only thing that has driven NVidia's relative dominance in that area up to this point.

post #179 of 253
The post-PC world... We all see the clouds, but a cloud is just a hard drive. Computers worked fine without them. To be honest, I think 64KB of RAM would be fine if we hadn't been taught to like bells and whistles.

Once wireless bandwidth is reliable and plentiful, why do we need so much hardware? I think we'll be happy with mainframes. I think mainframes will make a comeback: industrialized computing rather than our cottage-industry computing.

Reliability, speed, uptime: all become moot. We buy computers that slowly and surely become less useful as necessary software outpaces them. Not that MS Word has really become more useful in 20 years.

Why buy junk and be wasteful? We should be paying for what we use, as a recurring fee.

Hell, we buy a PC for its value today, and all I can do is amortize the clunker for taxes? That's the real bullshit story, man.

I'm all for the return of the terminal and the death of this whole personal-hardware conspiracy. Sure, we will need hardware when we don't have a connection, but really, how often do we leave our comfort zones?
post #180 of 253

"... reclined on a retail inventory shelf whispering to as-yet unconvinced buyers how super cheap they are and how they’re ready to do anything ..."

 

Ooooh!  Dirty!   

post #181 of 253
Quote:
Originally Posted by wizard69 View Post

USB works for people looking for a low-cost port that works well enough. However, there are plenty of situations where USB simply doesn't work well at all, especially on other platforms. Just try doing anything real-time over USB and see what happens.
USB 2.0, true. USB 3.0, maybe not. 3.0 gets rid of the polling and is fully bidirectional, unlike USB 2.0, and that should fix some of the biggest problems USB 2.0 had in that department.

Regardless, 99% of MacBook Air buyers don't know or care what USB's real-time performance is.
Quote:
You are saying what here?
http://amzn.to/1g13Wsp
Quote:
Not exactly they drop ports when they no longer serve a purpose or have been eclipsed by newer technology. TB fills an important role for Apple as such it won't be replaced anytime soon. C
So ADC, FireWire (on the USB-only, pre-Thunderbolt machines), and ExpressCard served no purpose? What technology were they eclipsed by at the time they were replaced?
Quote:
Apple and Intel was working together on some level here as Intels development hardware was an Apple Mac Pro.
I've written software using Xcode on a MacBook Pro. Can I list Apple in the credits as co-developer?
Quote:
Actually TB is at its best on the low end consumer hardware, it gives things like the AIR incredible expansion capabilities and the ability to handle apps beyond what the laptops size would imply.
That's true. And the vast majority of MacBook Air buyers have no idea about those expansion capabilities, don't need them, and don't care about them. Of the few that do know and care, most still don't buy Thunderbolt peripherals, because they can't afford them. The Thunderbolt controller on the MBA is, 99.99% of the time, a dead weight.
Quote:
That makes a whole lot of sense comparing a passive hub with the capabilities of any of those docks. The two pieces of hardware aren't even in the same ball field. One is hardware for the pee wee leagues and the other suitable for pro users.
Hey, I wasn't the one who was equating Thunderbolt : USB :: USB : ADB.
Quote:
Your fundamental problem is made obvious right here, Thunderbolt is not designed to compete with USB. It is a fundamental mistake to try to put them into competition with each other. If Apple had such an option of TB they would have deleted the USB ports, instead they have supported USB3 as soon as Intel got its act together.
They didn't really have a choice, since Intel put USB 3.0 support right in the chipset. Back in the FireWire vs. USB 2.0 days, when Apple was creating its own motherboards, they tried to delay putting USB 2.0 on as long as possible.

The bottom line is, for most things that Thunderbolt can do, USB 3.0 is good enough. The exceptions are all tiny niches.
Quote:
Do a little foot work yourself.
That's what I thought. You couldn't find any.
Quote:
Apple has gotten everything they have wanted out of TB, they won't be dropping it anytime soon.
They've gotten one display model and some adapters to other non-Thunderbolt protocols. Ooh.
post #182 of 253
Originally Posted by PedroCst View Post
So, basically, a computer is a lot more useful for someone who wants to do work, and until iDevices can replace the working function of a computer, we won't see the death of it any time soon.

 

Okay, so… 2010, then.

 

Originally Posted by Durandal1707 View Post
Is this really the best you can do?

 

I’d have said the same for you three posts ago.

 

Nothing like USB existed on the market in 1998.

 

And somehow this excuses morons for not moving to the future now?

 

Thunderbolt doesn't offer any improvement that translates to a benefit for the average home user.

 

Thanks for that; when you have any proof, feel free to let us know. Your personal inability to think of any benefit ≠ “no benefit”.

 

Someone's having a hard time coming up with cogent arguments, if you're resorting to this.

 

Yeah, you get one more reply before you forfeit your right to argue. I’ll say it again: Your personal inability to come up with any argument ≠ my argument is bad.

 

 

So maybe try reading the links you post before posting them?

 
Here we go again with the personal attacks. Classy.

 

Maybe just stop pretending you’re an idiot, then. Sounds like a better idea, doesn’t it? Then we don’t have to listen to your dreck and the presented arguments on both sides start making sense.

 

Originally Posted by Durandal1707 View Post
Of the few that do know and care, most still don't buy Thunderbolt peripherals, because they can't afford them.

 

Thanks for pretending that someone buying a $1,200-$2,500 computer can’t afford Thunderbolt peripherals.

 

The Thunderbolt controller on the MBA is, 99.99% of the time, a dead weight.

 

[citation needed, but will never be provided, because you don’t have the first clue what you’re on about, nor ever will, it seems]

 
The bottom line is, for most things that Thunderbolt can do, USB 3.0 is good enough. The exceptions are all tiny niches.

 

For most things that USB can do, ADB (PS/2) was good enough. The exceptions are all tiny niches.

Originally Posted by Marvin

The only thing more insecure than Android’s OS is its userbase.
Reply

post #183 of 253
Quote:
Originally Posted by xSerenityx View Post
 

 

Coming out of lurkdom for a second time to highlight this. This is what I think: yes, the junk right now that hints at convergence is just that, junk. It's clumsy, not well thought out, and a pain to use. But that's not how Apple makes things. I think there is a future for convergence, and if Apple does it, it will be a breath of fresh air compared to what is happening now. All the talk about the 64-bit ARM chip in the 5s: I think this may be the other side of it. That's some serious processing power. The way I see it happening, as quoted above, is that it turns into a desktop OS when hooked up to a monitor, and back to iOS when it's not.

From The Globe and Mail article, 'Inside the fall of BlackBerry: How the smartphone inventor failed to adapt':

 

Mike Lazaridis was at home on his treadmill and watching television when he first saw the Apple iPhone in early 2007. There were a few things he didn’t understand about the product. So, that summer, he pried one open to look inside and was shocked. It was like Apple had stuffed a Mac computer into a cellphone, he thought. ...“If that thing catches on, we’re competing with a Mac, not a Nokia,” he recalled telling his staff.


Edited by AweWyld - 10/2/13 at 2:47pm
post #184 of 253
Quote:
Originally Posted by crysisftw View Post
 

Quote:

 

What you fail to realize is that the new Mac Pro is already twice as fast as the old one.

 

The aggregate magnitude of the new Mac Pro's performance as compared to the current Mac Pro remains to be seen. It will depend upon the final build and spec options. Preliminary benchmarks showed up on Geekbench in June and September, as recently reported by MacRumors and other media outlets.

 

Apple on their website states up to 2x faster floating-point performance, up to 2x higher memory bandwidth, over 2x faster GPU performance, and up to 2.5x faster storage performance. To be fair, Apple offers this caveat: "Performance claims are based on technical specifications of preproduction Mac Pro hardware as of June 2013 and are subject to change."

post #185 of 253
Quote:
Originally Posted by AweWyld View Post
 

 

The aggregate magnitude of the new Mac Pro's performance as compared to the current Mac Pro remains to be seen. It will depend upon the final build and spec options. Preliminary benchmarks showed up on Geekbench in June and September, as recently reported by MacRumors and other media outlets.

 

Apple on their website states up to 2x faster floating-point performance, up to 2x higher memory bandwidth, over 2x faster GPU performance, and up to 2.5x faster storage performance. To be fair, Apple offers this caveat: "Performance claims are based on technical specifications of preproduction Mac Pro hardware as of June 2013 and are subject to change."

 

It could have been much faster had Apple gone with the previous form factor. The new Mac Pro has only one CPU with 12 cores, versus two CPUs with the possibility of up to 24 cores. Quad video cards could have been feasible with the extra space gained from removing the optical drives, as could the addition of dual redundant power supplies and lights-out management.

post #186 of 253
Originally Posted by z3r0 View Post

It could have been much faster had Apple gone with the previous form factor. The new Mac Pro has only one CPU with 12 cores, versus two CPUs with the possibility of up to 24 cores. Quad video cards could have been feasible with the extra space gained from removing the optical drives, as could the addition of dual redundant power supplies and lights-out management.

 

So go buy one of those and stick with the old way of doing things. When Apple succeeds, you don’t even have to care about it.

Originally Posted by Marvin

The only thing more insecure than Android’s OS is its userbase.
Reply

post #187 of 253
Quote:
Originally Posted by z3r0 View Post

The new Mac Pro has only one CPU with 12 cores, versus two CPUs with the possibility of up to 24 cores. Quad video cards could have been feasible with the extra space gained from removing the optical drives, as could the addition of dual redundant power supplies and lights-out management.

Why is only having 4 GPUs acceptable when you could have 8? Why 2 CPUs when you could have 4? People always settle for something just above what Apple offers. Apple used to offer a machine with 4 slots, and you couldn't put two high-powered GPUs in it anyway because it had a 300W power limit across the slots. They could have changed this, but they build machines based on what people are likely to do with them. If most of their customers used 1 slot, then why give everyone 4, and why bundle an oversized PSU for people who don't use it? At least TB ports don't compromise the form factor.

If the reason for multiple slots is for multiple GPUs then they might as well bundle the machine with two so everyone gets two. They could have offered 2 CPUs but they could have offered dual E5-2687Ws last year and didn't. What's the point in having two sockets if they don't offer the highest CPU options?

If the choice was either nothing or the new Mac Pro, the new one is clearly better. If it was between a drop-in upgrade for the old one and the new one, they both have their respective advantages. With the new one, everyone gets dual GPUs, everyone gets PCIe storage, everyone gets a compact enclosure, everyone gets Thunderbolt and 4K display support. With the old one, you'd get SATA storage; you could have 4 slots, but with a power limit, so it doesn't matter; you'd get two GPUs at most anyway, and likely no SLI support if you plan to use NVidia. You likely wouldn't get Thunderbolt on top, as it complicates things with multiple GPUs.

The old one being updated would have sold in low volumes. The new one will sell in low volumes. It's a low volume market. The new one has had far more press than a refresh would have ever had so it will likely sell better than a refresh.

There's nothing wrong with people fantasising about some $15,000 specced-up computer, but Apple has to take things to market and they have to sell them year after year. Someone who drops a lot of money on a machine has no reason to upgrade regularly, and if they do, they buy their own GPUs, so Apple gets no money on the upgrades. This way, Apple can better secure their revenue for that line of computers, and that's better for the long term. The handful of people who want a 24-core machine with quad or more GPUs and 64GB+ RAM have the option to buy from HP or Dell, same if they want server hardware. Apple is just choosing not to compromise the form factor for everyone to benefit that handful of people, because it's not worth it.
post #188 of 253
Quote:
Originally Posted by Marvin View Post



There's nothing wrong with people fantasising about some $15000 specced up computer but Apple has to take things to market and they have to sell them year after year. Someone who drops a lot of money on a machine has no reason to upgrade regularly and if they do, they buy their own GPUs so Apple gets no money on the upgrades. This way, Apple can better secure their revenue for that line of computers and that's better for the long term. The handful of people who want a 24-core machine with quad or more GPUs and 64GB+ RAM have the option to buy from HP or Dell, same if they want server hardware. Apple is just choosing not to compromise the form factor for everyone to benefit that handful of people because it's not worth it.

 

Actually, those are the kinds of customers who would have been with HP, Dell, or whatever other OEM already. There are use cases that either demand or benefit from insane specs, often running some version of Linux. There's no guarantee you won't see some kind of after-market unofficial upgrades. As for CPU swaps, they won't be any different here than they were with past models, aside from the fact that sockets change after Ivy Bridge-E. There are some complaints that are reasonable, in that some configurations would be more expensive with what is available port-wise on the new ones. I'm skeptical whether many of these people would have actually purchased a 24-core machine if one were available at Apple's typical pricing schedule, not that Dell or HP is cheap when it comes to dual-socket workstations. I don't mourn the lack of one, simply because I wouldn't have purchased that model anyway. It is funny when people occasionally suggest that CPUs will be soldered, when no current Xeon EP options support it.

post #189 of 253
Quote:
Originally Posted by Tallest Skil View Post
 

Thanks for pretending that someone buying a $1,200-$2,500 computer can’t afford Thunderbolt peripherals.

 

Maybe not "unable," but perhaps "unwilling." I just bought a new RAID, and as much as I wanted TB, I just couldn't justify the cost of a Pegasus when I could get so much more value for money from a USB 3 LaCie. TB priced itself out of the market for me.


Edited by v5v - 10/2/13 at 7:31pm
post #190 of 253
Quote:
Originally Posted by Tallest Skil View Post

I’d have said the same for you three posts ago.
Highly Intellectual Tallest Skil Point #1.
Quote:
And somehow this excuses morons from not moving to the future now?
Highly Intellectual Tallest Skil Point #2: "People who don't spend $500 on a thing to give you extra USB ports are morons."
Quote:
Thanks for that; when you have any proof, feel free to let us know. Your personal inability to think of any benefit ≠ “no benefit”.
Highly Intellectual Tallest Skil Point #3: "I can't think of any benefit to the average user, but nevertheless the onus is on you to prove a negative. Until then, we just assume that one exists. Somewhere."
Quote:
Yeah, you get one more reply before you forfeit your right to argue.
Irony.
Quote:
I’ll say it again: Your personal inability to come up with any argument ≠ my argument is bad.
Highly Intellectual Tallest Skil Point #4: Actually making points based on market reality, with actual research to boot, doesn't constitute a valid argument, but "You're dumb" does.
Quote:
So maybe try reading the links you post before posting them?
Highly Intellectual Tallest Skil Point #5: I still don't get what a straw man argument is, apparently, even though you linked me to the definition.
Quote:
Maybe just stop pretending you’re an idiot, then. Sounds like a better idea, doesn’t it? Then we don’t have to listen to your dreck and the presented arguments on both sides start making sense.
Highly Intellectual Tallest Skil Point #6: You're an idiot.
Quote:
Thanks for pretending that someone buying a $1,200-$2,500 computer can’t afford Thunderbolt peripherals.
Highly Intellectual Tallest Skil Point #7: Someone who just blew their entire discretionary budget on a computer instead of half of it can certainly keep spending even more.
Quote:
[citation needed, but will never be provided, because you don’t have the first clue what you’re on about, nor ever will, it seems]
Highly Intellectual Tallest Skil Point #8: I haven't been reading anything in your entire posts, apparently.
Quote:
For most things that USB can do, ADB (PS/2) was good enough. The exceptions are all tiny niches.
Highly Intellectual Tallest Skil Point #9: Let me just repeat a vapid statement I made earlier, ignoring that it's already been thoroughly destroyed.

Congratulations. You've written a post with absolutely no content. A truly Zen-like achievement, for sure. There's nothing really to reply to here, so basically, I'm done with this thread. Have a nice day.
Edited by Durandal1707 - 10/2/13 at 10:35pm
post #191 of 253
Quote:
Originally Posted by v5v View Post

Maybe not "unable," but perhaps "unwilling." I just bought a new RAID, and as much as I wanted TB, I just couldn't justify the cost of a Pegasus when I could get so much more value for money from a USB 3 LaCie. TB priced itself out of the market for me.

LaCie's Thunderbolt products are not much more expensive than the USB 3 ones:

http://www.lacie.com/us/products/product.htm?id=10607

The 12TB USB 3 RAID is $1099; the 10TB Thunderbolt one is also $1099. The TB one goes up to 785MB/s, while the USB 3 one tops out at 245MB/s. There's only a $100 difference on their dual-drive units, and the TB ones are faster.
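
To put those numbers in perspective, here's a rough back-of-the-envelope comparison of transfer times (a minimal sketch; the rates are LaCie's quoted peaks, so real-world sustained throughput will be lower, and the 500GB payload is just an example):

```python
# Rough transfer-time comparison using LaCie's quoted peak rates.
# Real-world sustained throughput will be lower on both interfaces.
PAYLOAD_GB = 500        # example payload, e.g. a video project folder
TB_RATE_MBS = 785       # LaCie Thunderbolt RAID, quoted peak
USB3_RATE_MBS = 245     # LaCie USB 3 RAID, quoted peak

def transfer_minutes(size_gb, rate_mbs):
    """Minutes to move size_gb at rate_mbs (using 1GB = 1000MB)."""
    return size_gb * 1000.0 / rate_mbs / 60.0

print("Thunderbolt: %.1f min" % transfer_minutes(PAYLOAD_GB, TB_RATE_MBS))   # ~10.6
print("USB 3:       %.1f min" % transfer_minutes(PAYLOAD_GB, USB3_RATE_MBS)) # ~34.0
```

A bit over three times faster at the same price point, if the quoted peaks hold up.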
post #192 of 253
Originally Posted by Durandal1707 View Post
I can’t think of any benefit to the average user…

 

Sorry, gonna have to stop your idiocy there. I never said anything of the sort, nor could that be implied from any of the posts already made. In fact, I’ve stated multiple benefits already. You, on the other hand, refuse to even acknowledge that, yes, the responsibility is yours to take five or so seconds to think of something Thunderbolt does better than USB/PCIe.

 

Actually making points based on market reality, with actual research to boot, doesn't constitute a valid argument

 

Only because you didn’t.

 
Someone who just blew their entire discretionary budget on a computer

 

Sounds stupid to me. I guess having blown the entire budget they’d be capable of buying PCIe and USB accessor… oh, right, they don’t have any money anymore! My mistake.

 
…so basically, I’m done with this thread.

 

Good; thanks for giving up your pointless tirade against reality.

Originally Posted by Marvin

The only thing more insecure than Android’s OS is its userbase.
post #193 of 253
Quote:
Originally Posted by Marvin View Post
The handful of people who want a 24-core machine with quad or more GPUs and 64GB+ of RAM have the option to buy from HP or Dell; the same goes if they want server hardware. Apple is just choosing not to compromise the form factor for everyone to benefit that handful, because it's not worth it.

 

The problem with going with HP or Dell is not being able to run Mac OS X legally. The h4ckintosh route is possible, but you won't get much support.

post #194 of 253
Quote:
Originally Posted by Tallest Skil View Post
 

 

So go buy one of those and stick with the old way of doing things. When Apple succeeds, you don’t even have to care about it.

 

Old? Really? How so? The HP Z820 runs newer, faster processors; can use all the latest NVIDIA and AMD GPUs, SSDs, Thunderbolt 2, etc.; and holds more of them!

 

The only problem is that the OS is not Mac OS X.

 

Hell, even Woz loves them: http://www.macworld.co.uk/mac-creative/news/?newsid=3468413


Edited by z3r0 - 10/3/13 at 11:01am
post #195 of 253
Originally Posted by z3r0 View Post

Old? Really? How so? The HP Z820 runs newer, faster processors; can use all the latest NVIDIA and AMD GPUs, SSDs, Thunderbolt 2, etc.; and holds more of them!

 

Please, at the VERY least, respond to what I’m saying, not what you want to pretend I’m saying.

Originally Posted by Marvin

The only thing more insecure than Android’s OS is its userbase.
post #196 of 253
Quote:
Originally Posted by Marvin View Post


LaCie's Thunderbolt products are not much more expensive than the USB 3 ones:

http://www.lacie.com/us/products/product.htm?id=10607

The 12TB USB 3 RAID is $1099; the 10TB Thunderbolt one is also $1099. The TB one goes up to 785MB/s, while the USB 3 one tops out at 245MB/s. There's only a $100 difference on their dual-drive units, and the TB ones are faster.

 

Good point, but there's a reason LaCie's TB RAIDs are cheaper: no multimode controller. They're RAID 0 only; no RAID 10 or any other redundant mode. According to the reply I got from an inquiry to LaCie, their TB RAIDs were designed purely as speed screamers.
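
For anyone unclear on what that trade-off costs you in practice, here's a minimal sketch of the capacity and redundancy math (the four-drive, 3TB-per-drive configuration below is made up for illustration):

```python
# RAID 0 vs RAID 10 trade-off for n equal drives of size_tb each.
def raid0(n, size_tb):
    # Striping only: all capacity is usable, but any single drive
    # failure takes out the whole array.
    return {"usable_tb": n * size_tb, "drive_failures_survived": 0}

def raid10(n, size_tb):
    # Striped mirrors: half the capacity, guaranteed to survive any
    # one drive failure (more if failures land in different pairs).
    assert n % 2 == 0, "RAID 10 needs an even number of drives"
    return {"usable_tb": n * size_tb / 2, "drive_failures_survived": 1}

print(raid0(4, 3.0))   # {'usable_tb': 12.0, 'drive_failures_survived': 0}
print(raid10(4, 3.0))  # {'usable_tb': 6.0, 'drive_failures_survived': 1}
```

So the headline dollars-per-TB on a striped-only box looks great right up until a drive dies.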

post #197 of 253
Ok,

"Old" like connecting remotely via dumb terminals to huge, powerful servers that do all the processing and rendering? Oh wait, I just described cloud computing. Isn't that supposed to be the future?

Why wait? The future is now. Apple isn't releasing any new hardware that others don't already have. If anything, they are giving you less and calling it innovative because you expand externally with wires instead of internally.

That's not innovative. That's called unnecessary clutter.

Quote:
Originally Posted by Tallest Skil View Post

 

Please, at the VERY least, respond to what I’m saying, not what you want to pretend I’m saying.

 

post #198 of 253

Quote:

 

Originally Posted by z3r0 View Post

Pros didn't ask for a smaller case.

 

Originally Posted by Tallest Skil View Post
[Insert Henry Ford quote that every intelligent person already knows here]

 

Any customer can have a Mac Pro any colour that he wants so long as it is black? ;-)

 

The quote "Any customer can have a car painted any colour that he wants so long as it is black" can be found in the book 'My Life and Work', written by Henry Ford and Samuel Crowther.

 

A popular quote often attributed to Henry Ford, "If I'd asked people what they wanted, they'd have asked for a faster horse," has never been sourced to any primary material.

post #199 of 253
Quote:
Originally Posted by AweWyld View Post
 

 

Any customer can have a Mac Pro any colour that he wants so long as it is black? ;-)

 

The quote "Any customer can have a car painted any colour that he wants so long as it is black" can be found in the book 'My Life and Work', written by Henry Ford and Samuel Crowther.

 

A popular quote often attributed to Henry Ford, "If I'd asked people what they wanted, they'd have asked for a faster horse," has never been sourced to any primary material.

 

Yes, I'm familiar with the quotes. They are applicable to John Doe consumers who really don't know what they want. Pro users are a completely different breed. They have specific needs and know what they want and need (often down to the minutest detail) in order to accomplish whatever they set out to do.

 

In the end they will use whatever tool is best for the job and least cumbersome to use.

 

Excessive wires cause clutter, are more prone to failure (yes, Thunderbolt cables fail often), and are easily unplugged. Internal expansion stays out of the way and is better protected in a neat package.

post #200 of 253
Quote:
Originally Posted by hmm View Post

I wasn't referring solely to OpenCL. I was pointing out that tools provided by NVidia are not likely to be fully optimized for AMD hardware. It goes beyond CUDA; they experiment in all kinds of areas. The ray tracer Adobe uses in After Effects (yeah, yeah, I know computers do more than edit movies, but this example has come up a couple of times) is designed to use CUDA and is very slow running on the CPU.
Nothing wrong with that, but ray tracing software isn't a product that only NVidia supplies.
Quote:
They basically implemented work done by NVidia. If the initiative began with Adobe, it's not as likely that it would have been CUDA based.
My concern is that NVidia doesn't have much of a future as a company once the real need for discrete GPUs practically evaporates. There will always be a high-end market, but the low volume there will mean drastic price restructuring just to keep development going. I could see discrete GPU cards starting at $1000 in the near future, and the ultra-high-performance compute class more than doubling in price, just to keep NVidia above water. So my resistance to CUDA really has little to do with its technical merits; I just see NVidia doing a Kodak in a couple of years.
Quote:
Speaking of OpenCL, have you seen its 2.0 specification? I have nothing against it. I merely wanted to point out that it isn't the only thing that has driven NVidia's relative dominance in that area up to this point.

NVidia's dominance comes from being first to market with a solution that attracted the attention of early adopters. However, the word "dominance" is aging quickly, as many people have abandoned CUDA. That doesn't mean they have abandoned NVidia yet, but there are better-performing solutions across the board.

As for the OpenCL 2.0 spec, I haven't looked at it in depth, but it looks like it is evolving along with hardware capabilities. This is another thing to consider: nobody has really gotten to the point of delivering hardware for the type of heterogeneous systems everyone is reaching for. Haswell comes very close and has some really impressive compute scores. Both AMD and NVidia are working hard at advancing their GPUs to handle that heterogeneous future better. However, that hardware and software really needs to integrate well with Apple's OS.

I mentioned this a few weeks ago, but I'm still wondering (hoping) that the big delay for the new Mac Pro is in part GPU-related. It is my understanding that new pro-class GPU hardware will come from AMD late this year. A new generation of GPUs would make the Mac look that much better.
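
To make the portability argument concrete: the same OpenCL kernel source runs unchanged on Intel, AMD, or NVidia devices, which is exactly the escape from vendor lock-in being described. Here's a minimal vector-add sketch, assuming the third-party pyopencl bindings are installed (the kernel and host code are illustrative, not from any shipping product):

```python
import numpy as np
import pyopencl as cl

# The same kernel source compiles for any vendor's OpenCL device:
# Intel integrated graphics, AMD or NVidia discrete cards, even CPUs.
KERNEL_SRC = """
__kernel void vec_add(__global const float *a,
                      __global const float *b,
                      __global float *out) {
    int i = get_global_id(0);
    out[i] = a[i] + b[i];
}
"""

a = np.random.rand(1000000).astype(np.float32)
b = np.random.rand(1000000).astype(np.float32)
out = np.empty_like(a)

ctx = cl.create_some_context()   # picks whatever OpenCL device is available
queue = cl.CommandQueue(ctx)
mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, out.nbytes)

program = cl.Program(ctx, KERNEL_SRC).build()
program.vec_add(queue, a.shape, None, a_buf, b_buf, out_buf)
cl.enqueue_copy(queue, out, out_buf)

assert np.allclose(out, a + b)   # same result on any vendor's hardware
```

A CUDA version of the same thing only builds and runs on NVidia hardware, which is the lock-in risk in a nutshell.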