What's left for the Macintosh in a Post-PC iOS World?


Comments

  • Reply 161 of 255
    z3r0z3r0 Posts: 238member
    Quote:
    Originally Posted by Tallest Skil View Post

     

     

    Answer my question first, please. You don’t get to ask another until then.

     

    Simple. You made remarks belittling my intelligence because I wrote something negative about the new Mac Pro. So you can detect a bit of bias.

     

     

    Never Done Anything; yes, we know you have no experience in this regard.

     

You do love assumptions, but I'll bite: a non-disclosure agreement.

     

    When you feel like making sense again, feel free.

     

You really didn't pick up on the telephone analogy, did you?

     

    Great. So maybe answer the question I asked you.

     

It's really not that hard. The bigger the box, the more space there is for air to travel and cool a system down.

     

    Your entire argument is logical fallacies. Why are you afraid to answer the questions we are asking you? It’s really embarrassing.

     

    We? Who's we? Since when do you speak for Pro users?

  • Reply 162 of 255
    wizard69wizard69 Posts: 13,377member
    hmm wrote: »
    Quote:
    Some of those markets update hardware very slowly due to interdependence and sometimes the need to wait for smaller vendors to catch up. I wouldn't say they're all shortsighted if they don't purchase new machines right away. What I find a little silly is the can description. I don't care about the aesthetics of a tool. To comment on such things only because they dislike other aspects is simply disingenuous.
    I guess my problem here is either the lack of honesty or simply the denial of history. The audio recording world has changed dramatically over the years and will continue to change in the future. The thought that the industry will thus stay with PCI Express cards forever is faulty even if TB never catches on. TB however is a great way to interface I/O devices for this market, especially with the availability of optical cable solutions.

I wouldn't just go by what people did in the past. Some of Wacom's tablets are designed to be used at an angle, much like a drafting table. Notebooks are some of the worst offenders, as you have to deal with something front-facing that is typically positioned below an ergonomic viewing position and above an ergonomic typing position. I really hate that. If phones were powerful enough, there would be some potential for thin clients driven by your phone.

It has always been possible to outspec a Mac Pro with one of the other PC OEM offerings. Anyone who required a similar alignment of specs wouldn't have purchased the older Mac Pro design, just as they won't purchase the new one. OEMs like configurations such as the one you listed, as the margins are insane, yet the number of customers who buy those is tiny. Generally machines configured like that are configured for very specific use, and the actual configuration involves more than just selecting the most expensive option on each line. If you do that, good luck getting a purchase order approved.
There is also the issue of comparing what Apple wants to be a mass-production machine with a niche machine. Hell, we could put a Mac Pro up against a Cray and declare it a failure, but it doesn't make sense to do that. The interesting thing here, though, is that I'm willing to bet more than a few Cray users are salivating over the new Mac Pro and its potential for post-processing and visualizing data. It always amazes me when the A/V professionals come on this site with the attitude that they are the only pro customers Apple sells to. Sorry guys and gals, but the "pro" market is far bigger than that.
  • Reply 163 of 255
    z3r0z3r0 Posts: 238member
    Quote:
    Originally Posted by wizard69 View Post



Right off the bat you spit out bull crap; the new Pro has exactly nothing in common with the G4 Cube. To offer up this comment just shows a complete lack of depth when it comes to technology. In other words you don't know what you are talking about.

     

     

    Let me paint a clear picture since this is SOOOOO difficult to grasp.

     

     

     

    http://gizmodo.com/the-brilliant-insanity-behind-the-new-mac-pros-design-512574427

     

    “Apple® today announced that it will suspend production of the Power Mac™ G4 Cube indefinitely. The company said there is a small chance it will reintroduce an upgraded model of the unique computer in the future, but that there are no plans to do so at this time."

     

    "The concept of the new Mac Pro is very similar to that of the old Cube: A powerhouse PC that is very small and externally upgradable. That concept was not viable in 2000, when all we had for I/O was FireWire 400 and USB 1.1. Fast forward to 2013, and the technology has caught up to the archetype. We now have Thunderbolt 2, 802.11ac, and USB 3, not to mention cloud storage options. The expandability limitations are gone."

     

     

  • Reply 164 of 255
    wizard69wizard69 Posts: 13,377member
    hmm wrote: »
    That may not be as likely as you think. NVidia has held the majority of the professional graphics market for many years. Part of it is that developers often implement NVidia's research projects rather than their own. As long as that is the case, do not look forward to things that run well on AMD hardware.

This is a bit misleading; OpenCL has been very successful considering its late start against CUDA. CUDA's tie to NVidia's hardware will be its downfall, especially now that OpenCL-accelerated apps run so beautifully on Intel and AMD hardware. Intel's Iris Pro can actually whip a large number of discrete GPUs running OpenCL code, by a large margin.

    I'm not saying CUDA is dead today just that it is being marginalized very fast in favor of open standards solutions. People that remain tied to CUDA will be facing competition that can more effectively use the common hardware. In other words when most systems shipped are equipped with either Intel or AMD integrated GPUs, the developers that target OpenCL will be at a significant advantage in the marketplace. Due to these SoC computers I see developers dropping CUDA like a hot potato.
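
To make the portability point concrete, here's a minimal OpenCL vector-add in C (a sketch assuming a 2013-era OpenCL 1.x SDK, with error checking elided for brevity). The kernel source is compiled at runtime for whichever device the host finds, Intel, AMD, or NVidia alike, which is exactly the vendor neutrality CUDA can't offer:

[CODE]
/* Portable OpenCL vector add: the same source runs on any conformant device. */
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

static const char *src =
    "__kernel void vadd(__global const float *a,\n"
    "                   __global const float *b,\n"
    "                   __global float *c) {\n"
    "    size_t i = get_global_id(0);\n"
    "    c[i] = a[i] + b[i];\n"
    "}\n";

int main(void) {
    enum { N = 1024 };
    float a[N], b[N], c[N];
    for (int i = 0; i < N; i++) { a[i] = (float)i; b[i] = 2.0f * i; }

    cl_platform_id plat;
    cl_device_id dev;
    cl_int err;
    clGetPlatformIDs(1, &plat, NULL);
    /* CL_DEVICE_TYPE_DEFAULT picks whatever is present: an Intel iGPU,
       an AMD FirePro, an NVidia card, or the CPU itself. */
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_DEFAULT, 1, &dev, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, &err);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, &err);

    /* The kernel is built at runtime for the device actually found. */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, &err);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "vadd", &err);

    cl_mem A = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof a, a, &err);
    cl_mem B = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof b, b, &err);
    cl_mem C = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof c, NULL, &err);
    clSetKernelArg(k, 0, sizeof A, &A);
    clSetKernelArg(k, 1, sizeof B, &B);
    clSetKernelArg(k, 2, sizeof C, &C);

    size_t global = N;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, C, CL_TRUE, 0, sizeof c, c, 0, NULL, NULL);

    printf("c[10] = %.1f (expect 30.0)\n", c[10]);
    return 0;
}
[/CODE]

The CUDA equivalent is tied to one vendor's driver stack at both compile time and run time; this builds and runs anywhere an OpenCL driver exists.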
  • Reply 165 of 255
    Originally Posted by z3r0 View Post

    Simple. You made remarks belittling my intelligence because I wrote something negative about the new Mac Pro. So you can detect a bit of bias.

     

    No, because you lied about the new Mac Pro.

     

     You do love assumptions, but I'll bite; Non-disclosure agreement.


     

    Yeah, I know what an NDA is, thanks. You blatantly don’t have one, otherwise you wouldn’t be talking about it at all. And if you do, you don’t deserve it, since you don’t seem to comprehend the new Mac Pro.

     

You really didn't pick up on the telephone analogy, did you?


     

    I simply didn’t find it valid, because it isn’t. You’re not daisy chaining computers, nor was that my implication. If you had cared about what you wrote in the first place, you would know we weren’t discussing that.

     

It's really not that hard. The bigger the box, the more space there is for air to travel and cool a system down.


     

    It should also really not be that hard to provide a link of an HP computer that has a case volume equal to that of the new Mac Pro. Unless, of course, they don’t have one.

     

    We? Who's we? Since when do you speak for Pro users? 


     

    Shut up. You have more than a toddler’s grasp of pronouns; you know exactly to whom ‘we’ is referring. Either respond or don’t bother.

     

    Originally Posted by z3r0 View Post

    Let me paint a clear picture since this is SOOOOO difficult to grasp.

     

    Thanks for resizing two images and photoshopping them together. When you’ve an actual argument, feel free to post it.

  • Reply 166 of 255
    wizard69wizard69 Posts: 13,377member
    z3r0 wrote: »

Interesting: I quoted a response here and the forum came back with nothing. It must realize there is no substance in your argument.

Beyond that, quoting Gizmodo is a complete failure to harvest rational intelligence from the Internet.
  • Reply 167 of 255
    MarvinMarvin Posts: 15,322moderator
    z3r0 wrote: »
    The bigger the box, the more space there is for air to travel and cool a system down.

It depends on what you put inside it. You wouldn't, for example, sit a motherboard in an empty room with a giant wall-sized fan blowing over it and say that's the optimal solution. The old Mac Pro had as many as 9 fans inside:


    [VIDEO]


When you have a large volume of air to shift, that's necessary. When you have a smaller volume, it takes less effort to pull air through it. The unified thermal core is just one fan + one heatsink shared between all the processors, and it'll cool the PSU too. It's not that it's essential; it's just more efficient and should run quieter than the old one. It's also more versatile than ever:


    [VIDEO]
  • Reply 168 of 255

    Quote:


    Originally Posted by z3r0 View Post





    I'm talking about hardware performance and expandability. For example with the new Mac Pro it looks like you are stuck with AMD/ATI Firepros meaning if you are using any software that requires NVIDIA CUDA (like the majority of pro 3D, video, and compositing software) you are screwed.

     

nVidia CUDA? Let me ask that again, nVidia CUDA? For what? Reducing the time taken to complete a video composite by a whole 3 seconds? Sorry, but I'd rather have something that does more basic computations better, you know, like OpenCL.

     

    Quote:
Pro Mac users who rely on the Mac Pro did NOT want a different case. All Apple had to do was add new processors (24 physical cores; I've seen 64), better video card options (quad SLI/CrossFire/Tesla), make SSD drives standard internally, add Thunderbolt 2 PCI expansion card options for those who need it (not that many devices out there, BTW), and offer dual redundant power supplies with lights-out management for those who need servers (the Xserve is missed)

You are again counting hardware specs. What you fail to realize is that the new Mac Pro is already twice as fast as the old one. Expandability is lost, yes, but the new form factor more than makes up for it. And what about Thunderbolt 2? A cable that allows you to create a whole hub? Isn't that a bright idea for expandability?

     

    Quote:
If Apple really did want to change the case they should have just gone bigger with more expansion slots, not smaller and definitely not a can!

    Please stop with that. Bigger, louder, more "badass", that's something a teenager wants. Apple has always pushed the boundaries by making things smaller and more efficient. Besides, 6 Thunderbolt cables can give you expandability no Mac Pro or any other Pro desktop can offer.

     

    Quote:



I'm sure if you deliberately arrange it like this, it is going to look weird. But that said, it is no more of a mess than having that same hardware inside a case.

     

    Quote:


It's a can! The rest is marketing BS! The "thermal core" is a non-factor. HP can cool 24 cores without one just fine. The thermal core makes it impractical to add internal expansion like the Z820 or the previous Mac Pro. If Apple wanted better cooling they would have gone with the Sandia cooler http://www.youtube.com/watch?v=JWQZNXEKkaU

    Stop with the "can". You of all people know it's not a can, or about a can.

Marketing BS? The unified thermal core is a great idea. Use a single fan, and use it not only to cool the CPU and GPUs, but also the RAM, storage, chipset, and the whole motherboard, all while making extremely little noise and using less power. By the way, I'm not sure how a heatsink held in magnetic equilibrium and spinning at 2000 rpm is supposed to stay error-free when you orient it horizontally or move your computer around while it is on.

     

    Quote:

    Jony is just obsessed with G4 cube which was a major failure. It didn't take off just like the new Mac Pro won't. There is a total disconnect between what Pro's need and want to what Apple thinks they need and want.

    Apple II - It's nothing. GUI ShmiUI, it's not a match for hardcore programming.

    iMac - It's bogus. How is that even a desktop?

    iPod - In a world of Sony Walkman and CD players and cassette players? Apple must be crazy.

    iPhone - It's idiotic. A phone so expensive that doesn't even record video?

iPad - Who in their right mind wants to go buy this large iPhone when they already have a notebook?

     

Apple has stubbornly continued to innovate and lead the industry. The new Mac Pro is the way ahead. It solves the problem of wasted space, improves performance, runs silently, and looks cool.

  • Reply 169 of 255
    andysolandysol Posts: 2,506member
    Quote:

    Originally Posted by z3r0 View Post

     

    Its a consumer product, not a Pro product now.

     

    The writing is on the wall for Pro users on the Mac. Apple just doesn't care nor want to listen to its most passionate customers in the pro sector.


     

I hope you're right. I hope the new Mac Pro is a monumental flop and they announce they'll never make another one. "Pro" users clamoring for the next Mac Pro or arguing what it should be might be the most annoying people on this forum; see the last 30 posts. Good riddance.

  • Reply 170 of 255
    wizard69wizard69 Posts: 13,377member
    Marvin wrote: »
    It depends on what you put inside it. You wouldn't for example sit a motherboard in an empty room with a giant wall-sized fan blowing over it and say that's the optimal solution. The old Mac Pro had as many as 9 fans inside:
This highlights that for a given amount of power, a bigger box is much harder to cool. The old Mac Pro was actually a pretty poor design, a holdover from the PowerPC days when big heat sinks and radiators were a requirement.
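
As a rough sanity check (a back-of-the-envelope sketch; the 450 W load and the 20 K air temperature rise are assumed figures, not Apple's numbers), the airflow needed to remove a heat load depends on the wattage and the allowed temperature rise of the air, not on the size of the box:

[CODE]
/* Required airflow to remove P watts: mdot = P / (cp * dT), Vdot = mdot / rho. */
#include <stdio.h>

int main(void) {
    const double power_w   = 450.0;   /* assumed worst-case system dissipation, W */
    const double delta_t_k = 20.0;    /* allowed air temperature rise, K */
    const double rho       = 1.2;     /* density of air, kg/m^3 */
    const double cp        = 1005.0;  /* specific heat of air, J/(kg*K) */

    double mass_flow = power_w / (cp * delta_t_k);  /* kg of air per second */
    double vol_flow  = mass_flow / rho;             /* m^3/s */
    double cfm       = vol_flow * 2118.88;          /* cubic feet per minute */

    printf("Required airflow: %.3f m^3/s (about %.0f CFM)\n", vol_flow, cfm);
    return 0;
}
[/CODE]

That works out to roughly 40 CFM, well within reach of a single large, slow fan. A bigger case doesn't lower that number; it just adds dead air that extra fans have to keep moving.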

In any event I don't understand how somebody can't grasp the efficiency of Apple's new arrangement. On other platforms each of those GPUs would have one or two fans, the CPU would have its fan, the power supply its fans, and then a few case fans. The Mac Pro has one fan to address all of that. Like it or not, that is innovation.


    When you have a large volume of air to shift, that's necessary. When you have a smaller volume, it takes less effort to pull air through it. The unified thermal core is just one fan + one heatsink shared between all the processors and it'll cool the PSU too. It's not that it's essential, it's just more efficient and should run quieter than the old one. It's also more versatile than ever:


It is a very well-engineered product that throws out decades of computer design. Gone is the world of ATX chassis with hot spots and enough fans to lift a Harrier jet. Obviously nobody has stress-tested a new Mac Pro yet, but I don't see a huge problem myself.
  • Reply 171 of 255
    z3r0z3r0 Posts: 238member
    Quote:
    Originally Posted by Tallest Skil View Post

     

    No, because you lied about the new Mac Pro.

     

I lied? Wow, those are some pretty strong accusations there, buddy. You seem a bit emotionally charged. Yep, definitely biased. Seems to me that the only one that lied is Apple. They promised a Mac Pro and didn't deliver. So what don't you like about the new Mac Pro?

     

    Yeah, I know what an NDA is, thanks. You blatantly don’t have one, otherwise you wouldn’t be talking about it at all. And if you do, you don’t deserve it, since you don’t seem to comprehend the new Mac Pro.

     

    I haven't said anything that would break it.

     

    I simply didn’t find it valid, because it isn’t. You’re not daisy chaining computers, nor was that my implication. If you had cared about what you wrote in the first place, you would know we weren’t discussing that.

     

Well, not sure how useful it would be to have six if you aren't using them together for rendering... unless they were used as servers. Oh, wait, you don't consider using the Mac Pro as a server a valid use case for it.

     

    It should also really not be that hard to provide a link of an HP computer that has a case volume equal to that of the new Mac Pro. Unless, of course, they don’t have one.

     

Why would you want an HP, or any computer, at the volume of the new Mac Pro? It does not have the internal volume to be considered a workstation.

     

    Shut up. You have more than a toddler’s grasp of pronouns; you know exactly to whom ‘we’ is referring. Either respond or don’t bother.

     

    So touchy... Not very logical.

     

    Thanks for resizing two images and photoshopping them together. When you’ve an actual argument, feel free to post it.

     

You didn't even bother to open the link and read the article. If you had, you would have noticed that the image does not belong to me, nor did I create it. I guess you are just afraid to be wrong.

  • Reply 172 of 255
    Originally Posted by z3r0 View Post

Well, not sure how useful it would be to have six if you aren't using them together for rendering...

     

    Conversation’s over. When you decide to stop being intentionally dense and actually respond to what is being said, we’ll pick it back up.

  • Reply 173 of 255
The former doesn’t do a quarter of what the latter does. It’s old tech and you know it. USB as a format–as a physical construct–has been around since ’96. “It’s faster” doesn’t make it new tech.
    The difference is that people actually use USB and actually know what it is. The fact that TB technically does more is immaterial to the fact that USB does what 99% of end-users want an external connection bus to do, and it's exceedingly hard to sell them on anything else, even if it's technically better (as FireWire was, not that it ever mattered), especially if it costs orders of magnitude more.
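
To be fair about what "technically better" buys you, here is a quick idealized comparison (assuming nominal signaling rates of 5 Gbit/s for USB 3.0 and 20 Gbit/s for Thunderbolt 2, and ignoring protocol overhead):

[CODE]
/* Idealized time to move 100 GB at nominal link rates. */
#include <stdio.h>

int main(void) {
    const double size_gbit = 100.0 * 8.0;  /* 100 gigabytes expressed in gigabits */
    const double usb3_gbps = 5.0;          /* USB 3.0 nominal signaling rate */
    const double tb2_gbps  = 20.0;         /* Thunderbolt 2 nominal signaling rate */

    printf("USB 3.0:       ~%.0f s\n", size_gbit / usb3_gbps);  /* ~160 s */
    printf("Thunderbolt 2: ~%.0f s\n", size_gbit / tb2_gbps);   /* ~40 s */
    return 0;
}
[/CODE]

Nobody disputes that gap; the point is that the 99% rarely move enough data at once for it to justify an order-of-magnitude price difference on the peripheral.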

    Plus, it's not like PCI Express as a format hasn't been around for almost a decade itself. Sure, TB has a different connector, which is what I assume you mean by "physical construct", but somehow I doubt you'd suddenly change your opinion on USB being "new tech" if the new version introduced an incompatible new connector type à la FW800.
Thunderbolt also isn't future-proof, because:

    1. Apple removed ExpressCard slots because, supposedly, only a small percentage of users were using it. However, I'd be willing to wager that the number of people using ExpressCard was many times larger than the number of people who are using Thunderbolt for anything other than displays. No one is using Thunderbolt — and we know what Apple does to technologies that no one is using.

    2. Thunderbolt is tied to Intel, and Apple will probably eventually start making ARM-based Macs at some point.

    1. Having what to do with Thunderbolt at all? Thanks for proving you know what everyone (rather “no one”) is doing¡
    Apple has a history of dumping connections that are perceived as not being widely used. Thunderbolt certainly fits that bill.
    2. Yeah, that co-development sure wouldn’t let Apple do anything with the format, huh¡
    The co-development, as I understand it, is rumor only. Officially TB is Intel's, and it's quite possible that they won't allow it to be ported to non-Intel processors. Of course, it's also unlikely that makers of computers using non-Intel processors would care.
    Prepare to never be surprised.
    We'll see. If Thunderbolt is removed at some point, at least from the lower-range consumer models, I won't be surprised one bit. Apple's got a history of trying to champion something for a while, and then giving up.
    wizard69 wrote: »
    Actually TB adoption has been rather quick in my mind. Frankly it is also hooking to monitors that act as hubs or docking stations for laptops so his argument makes absolutely no sense. There is no telling what is hooked up to those monitors / docking stations. Beyond that I believe Apple got 99% of what it wanted out of TB as a docking port.
    How many people do you know who use the Thunderbolt port for anything other than a monitor? The only Thunderbolt accessory that even approaches being reasonably priced is the Apple Thunderbolt Display, and that costs $1000. The cheapest Thunderbolt docking station is $250, and the cheapest one that includes another Thunderbolt port so you're not blocked from attaching a DisplayPort monitor costs $300. In contrast, a USB 3.0 hub, which again, does what 99% of users would do with those Thunderbolt docks anyway — plug stuff into the USB ports — can be had for around $20. Sure, the Thunderbolt hub is cooler, but the sticker shock from the price tag usually kills the deal in favor of something USB-related. Which of course leads to a vicious cycle — USB gets more usage because it's cheaper, USB gets cheaper due to economies of scale from its large user base, USB gets even more usage because it's even cheaper. Thunderbolt can't really compete with this.
    Intel has as much as said other companies can do whatever to build TB parts.
    Link please.
If the PC world isn't, it most likely is because they lost their way
    Whether they've "lost their way" or not is immaterial. If the PC world doesn't adopt Thunderbolt as a technology, it will be dead within a few years. This is what happened to FireWire, and it had a much larger adoption in the PC world than Thunderbolt does. Apple couldn't keep these things alive on their own even back when they were still giving the Mac line their full attention. Now that they're focused on iDevices, which incidentally don't support Thunderbolt, there's no chance of them pulling it off solo.
In much the same way that the Mac Mini was step 2 of the G4 Cube design, Thunderbolt is really just step 2 of ADC. Think about it: power, video, audio, USB, FireWire (Ethernet, everything else anyone could want…), all in one cable.
    YES, EXACTLY. And what happened to ADC? After a few years, it disappeared in favor of the technology it was supposed to replace — DVI.
    The difference here is this one is a piece of cake to work with and will be available on everything from everyone.
    Heh.
    Intel really needs to say, “Look, if you want to make boards for our chips, you’ll put Thunderbolt ports on them.”
If they did this immediately when TB came out, it might have had a chance. But for some reason, they haven't, and likely won't in the future either. Meanwhile USB 3.0 has stolen its thunder, so to speak, and the advent of ARM-based PCs like the Surface is lessening Intel's hold on the PC market anyway.
  • Reply 174 of 255
    z3r0z3r0 Posts: 238member
    Quote:
    Originally Posted by wizard69 View Post





    For whom? How many real people would put such a configuration on their desktop?

     

Video editors, 3D artists, visual designers, scientists (big data, simulations, etc.), people running multiple OSes, render farms, servers, and so on.

     

First off, nobody in their right mind is going to hook up a GPU card via TB.

     

Apple is encouraging it with the design. An external chassis will be the only way to go if you need NVIDIA, unless Apple is planning to offer another configuration option.

     

It isn't even a valid argument. Second, AMD has been doing much better with OpenCL than NVidia. Buying software reliant upon sole-source software like CUDA is just stupid.

     

Many would disagree. Look at benchmarks for heavy data crunching. Look up NVIDIA Tesla.

     

    However anybody familiar with technology would have realized where the industry is going. The new Mac Pro reflects that and the technologically literate acknowledge that.

     

    Assumption

     

    If you want to argue that Apple should build a server then I'm with you! Sometimes you need a product that isn't mass production to fill real niches.

     

Well, the current model still lacks dual redundant PSUs and LOM, but Apple does sell it as a workstation-class server.

     

    That would result in a machine with even less sales than the current model.

     

They don't necessarily need to offer only one model.

     

As for your derisive labeling of the new Pro as a spaghetti monster, you leave out all of those break-out boxes required to support reasonable amounts of I/O. Further, you don't seem to realize that there is value in putting the conversion hardware near the source materials.

     

Cables can be accidentally pulled, leading to problems. Internal components are preferred, to protect them and avoid the mess.

     

On the contrary, it is a big factor, as it puts a lot of performance into a small volume. That is huge.

     

Huge according to Apple marketing. In the end, workstations stay in one spot. If not, they would be called laptops.

     

You can say that, but have you really looked into what is required to keep a server room cool?

     

    Yes, I have. I have also built and maintained several datacenters.

     

Apple wanted a better design overall, and better cooling helped them get there. You can nitpick all you want, but the reality is: they are innovating in a stale industry here.

     

    Apple is shifting focus to the consumer and not the Pro.

     

Consider this: you, as a so-called Pro, are a very tiny minority in the overall pool of Pros. I'd be willing to say right now that the Mac Pro will be a huge hit if (this is a big if) Apple prices it right. Frankly I suspect that the whole point of the design is to allow Apple to be a bit aggressive in pricing.



By the way, the G4 Cube failed for one simple reason: it was priced grossly out of range compared to what the hardware was capable of. Frankly that was a similar problem with the old Mac Pro, which wasn't competitive at all unless you were buying a high-end machine. You seem to think the old Mac Pro was a huge winner for Apple, yet everything indicates that it was in rapid decline with few serious nibbles. Frankly the old Mac Pro is a T-Rex of a machine that users, real pros in this case, lost interest in.

     

The G4 Cube's price point was less than that of the Power Mac. What killed it was lack of internal expandability.

  • Reply 175 of 255
    Originally Posted by Durandal1707 View Post

    The difference is that people actually use USB and actually know what it is.

     

    Did they in ’96? I don’t remember that. I do remember USB being “horrible” and the lack of legacy support a “deal breaker”.

     

    The fact that USB technically does more is immaterial to the fact that ADB does what 99% of end-users want an external connection bus to do, and it's exceedingly hard to sell them on anything else, even if it's technically better, especially if it costs orders of magnitude more. 


     

    You, when the first iMac was released.

     



    Plus, it's not like PCI Express as a format hasn't been around for almost a decade itself. Sure, TB has a different connector, which is what I assume you mean by "physical construct", but somehow I doubt you'd suddenly change your opinion on USB being "new tech" if the new version introduced an incompatible new connector type à la FW800.


     

If said new connector was hot-swappable where the previous one wasn’t (oh, say, like PS/2 and ADB vs USB), yeah, I’d be ALL over that. And look at this! Thunderbolt is hot-swappable where PCIe hasn’t been!

     
    Apple has a history of dumping connections that are perceived as not being widely used. Thunderbolt certainly fits that bill.

     

No, they don’t. Why would you even say something so dumb? They have a history of dumping connectors that are OLD or WORSE than newer solutions. That’s why VGA was dropped as soon as DVI came out, despite VGA still being on every #^$#^%*! motherboard to this day. That’s why ADB and SCSI were dropped despite them (PS/2 instead of ADB) still being on every motherboard until the mid-naughties!

     

    If there is a new tech and Apple sees it as better, in every way, than the old, they will adopt it and kill off the old. Nothing at all to do with “not being widely used”. Jeez, I have to say it again: WHY would you say something so dumb? They purposely introduced MiniVGA, MiniDVI, and MicroDVI, for heaven’s sake! They weren’t being used at all until Apple created them!

     


    We'll see.



     

    Already have.

     
    If Thunderbolt is removed at some point, at least from the lower-range consumer models, I won't be surprised one bit.

     

    Because it will be replaced by a better technology in the future. Just like ADB was for USB. Would you have said “see, I told you ADB was bad; Apple got rid of it!”? You’d have been laughed out of the… what, I guess we had forums back then… of the chat room. 

     

    Apple's got a history of trying to champion something for a while, and then giving up.


     

    They also have a history of championing things and then getting them to not only stick but become the only way things are done. Do I have to say it a third time?



    Keyboard and screen. Commercial GUI. Commercial mouse input. Keyboard moved to the back of the case. USB. No floppies. FireWire (tell a pro he can’t have FireWire anymore). 802.11 lineup-wide. No optical drive. And now Thunderbolt.

     

     In contrast, a USB 3.0 hub… …can be had for around $20.


     

    Wow, a technology 17 years old is less expensive than a technology 3 years old. Forgive me if I wet myself in shock.

     


    YES, EXACTLY. And what happened to ADC? After a few years, it disappeared in favor of the technology it was supposed to replace — DVI.



     

    Thanks for latching on to the stupidest answer. 

     

    Heh.


     

    So I guess a plug half the size of a USB port is harder to work with than one four times its size that you have to screw into a PCI card, huh¡

     

…the advent of ARM-based PCs like the Surface is lessening Intel’s hold on the PC market anyway.


     

    Not for about five years.

  • Reply 176 of 255
    Everyone forgets that PCs are still the most used devices, whether they're the most sold or not.
And those are where people do their work, not on their iDevices or Androids. So, basically, a computer is a lot more useful for someone who wants to do work, and until iDevices can replace the working function of a computer, we won't see the death of it any time soon.
  • Reply 177 of 255
    v5vv5v Posts: 1,357member
    Quote:
    Originally Posted by AppleInsider View Post



    [...] there's a lot of desperate attempts to euphemize this trend under the softer language of "tablets," as if $49 White Box tablets have subsumed the personal computing business just because vendors of all stripes are shipping them.



Usage data paints a different picture: iPads are simply eating up PC sales while White Box tablets are doing their best impersonation of the 2008 netbook, reclined on a retail inventory shelf whispering to as-yet unconvinced buyers how super cheap they are and how they're ready to do anything.

     

     

    Apparently Kevin Bostic's information comes from a different source:

     

    Android overtakes Apple's iPad in tablet marketshare, approaches in revenue earned

     

    Quote:

    "Apple's iPad and iPad mini are still the most popular tablets in the world, but Android-powered tablets are grabbing an increasing share of the market and are collectively approaching the iPad in terms of revenue, according to a new study from ABI Research.



    "During the second quarter of 2013, Android-powered tablets as a collective overtook the iPad and iPad mini in terms of market share [...] The figures mirror previous findings from other market research firms. [...] tablets running Android are finally approaching Apple's offerings in terms of revenue generated."



     

    How did you come to the conclusion that other tablets are not selling? They are perhaps not doing as much surfing, but that doesn't mean they're not selling. The latter cannot be inferred from the former.

     

    BTW, I didn't make it through the entire piece. I got as far as your sarcastic flaming and had to bail. I don't know why you do that. I used to think you were better than that.

  • Reply 178 of 255
    Is this really the best you can do?
    Did they in ’96? I don’t remember that. I do remember USB being “horrible” and the lack of legacy support a “deal breaker”.
Apple gets credit for popularizing that one, for sure. However, Intel's baking it onto every motherboard, the fact that it filled a space in the market that didn't really exist up to that point, and, most crucially, the fact that it was cheap even when it was new all helped a lot.
    The fact that USB technically does more is immaterial to the fact that ADB does what 99% of end-users want an external connection bus to do, and it's exceedingly hard to sell them on anything else, even if it's technically better, especially if it costs orders of magnitude more.
    You, when the first iMac was released.
    ADB, seriously? :lol: More like PS/2, the RS-232 serial port, the parallel port, the DA-15 game port, the Mini-DIN 8 serial port, ADB, SCSI, and probably others I'm forgetting about. There was no ubiquitous, monolithic serial port like USB in 1998 to compete against; there was only an alphanumerabet soup of various different ports, which had just about no name recognition with the general public, and in fact was very confusing to most of them. Nothing like USB existed on the market in 1998.

    USB also had definite user experience improvements that actually meant something to the average end user, in that it was 1) hot-swappable, 2) available on all platforms, and 3) had only one port type to worry about for a slew of devices ranging from keyboards to modems to printers to hard drives. The ability to drop a grand on a PCIe chassis doesn't really have the same resonance with mom-type users.

    Finally, USB was nowhere near as expensive in 1998-1999 as Thunderbolt is today, and it certainly wasn't an order of magnitude more than existing peripherals. An old Logitech mouse cost $60 (Computer Reseller News, March 11, 1996, 125); the same mouse for USB cost... $50 (Macworld, June 1999, 68-78). An IOMega Zip Plus Drive for SCSI cost $200 (Macworld, Feb. 1998, 51), whereas the upgraded Zip 250 for USB cost... $200 (Macworld, April 1999, 56). A Global Village TelePort 56K modem for the Mini-DIN 8 serial port cost $169 (Macworld, Apr. 1998, 87-91); a Global Village TelePort 56K USB modem cost $139 (Macworld, Oct. 1999, p36). SCSI CD-RW drives generally fell in the $400-$600 range (Macworld, Sept. 1998, 91-95), whereas USB ones a year later ranged from $300-$400 (Macworld, Sept. 1999, p38). I could go on, but USB did not have anywhere near the price problem that Thunderbolt has.
    If said new connector was hot swappable where the previous one wasn’t (oh, say, like PS/2 and ADB vs USB), yeah, I’d be ALL over that. And look at this! Thunderbolt is hot swappable where PCIe hasn’t been!
    Exactly! Except USB is hot-swappable, and Thunderbolt doesn't offer any improvement that translates to a benefit for the average home user.
    Apple has a history of dumping connections that are perceived as not being widely used. Thunderbolt certainly fits that bill.
    No, they don’t. Why would you even say something so dumb?
    :lol: Someone's having a hard time coming up with cogent arguments, if you're resorting to this.
They have a history of dumping connectors that are OLD or WORSE than newer solutions. That’s why VGA was dropped as soon as DVI came out, despite VGA still being on every #^$#^%*! motherboard to this day. That’s why ADB and SCSI were dropped despite them (PS/2 instead of ADB) still being on every motherboard until the mid-naughties!
    Apple HDI-45 connector — introduced in 1994, combined video, audio, and power — replaced with plain old DB-15 in 1995.

    PCI / PCIe — gradually removed from the entire desktop lineup since 1998. The usual reason given is that most users don't use PCI slots.

ADC — introduced in 2000 to replace DVI, combined video, USB, and power — replaced with... drum roll... DVI in 2004, causing a huge PITA for anyone who'd bought an ADC monitor.

    FireWire — meant to be used instead of USB for high-speed devices — removed from the consumer notebook lineup in favor of the slower, less capable USB in 2008 (with a brief backpedal in early 2009 when they brought back the older MacBook design for a short-lived period of time)

    ExpressCard — basically was to USB and FireWire what Thunderbolt is to USB — replaced by... an SD card slot (?!?!?!?!) in 2009.

    User-replaceable RAM slots — removed in the MacBook Air and Retina MBP, replaced by nothing. Usual reason given is that most users don't open up their laptops.

    User-replaceable storage — see above.

    Apple's got a long history of removing things they think people aren't using, regardless of whether or not they're superior to what people are using.

    Edit: Note that this is just a list of hardware examples. If one were to list the amount of times Apple has removed software features that were deemed unpopular... well, this would be a very long list.
    If there is a new tech and Apple sees it as better, in every way, than the old, they will adopt it and kill off the old. Nothing at all to do with “not being widely used”. Jeez, I have to say it again: WHY would you say something so dumb?
[VIDEO] (the link isn't jumping to the right spot sometimes, so go to 3 minutes 9 seconds in)
    They purposely introduced MiniVGA, MiniDVI, and MicroDVI, for heaven’s sake! They weren’t being used at all until Apple created them!
    VGA and DVI were extremely widely used connector types.
    Because it will be replaced by a better technology in the future. Just like ADB was for USB. Would you have said “see, I told you ADB was bad; Apple got rid of it!”? You’d have been laughed out of the… what, I guess we had forums back then… of the chat room. 
    http://en.wikipedia.org/wiki/Straw_man_argument
    Thanks for latching on to the stupidest answer. 
    Here we go again with the personal attacks. Classy.
    So I guess a plug half the size of a USB port is harder to work with than one four times its size that you have to screw into a PCI card, huh¡
    I was referring more to the "available on everything from everyone" comment.
  • Reply 179 of 255
    wizard69wizard69 Posts: 13,377member
    The difference is that people actually use USB and actually know what it is. The fact that TB technically does more is immaterial to the fact that USB does what 99% of end-users want an external connection bus to do, and it's exceedingly hard to sell them on anything else, even if it's technically better (as FireWire was, not that it ever mattered), especially if it costs orders of magnitude more.
USB works for the people looking for a low-cost port that works well enough. However, there are plenty of situations where USB simply doesn't work well at all, especially on other platforms. Just try doing anything real-time over USB and see what happens.
    Plus, it's not like PCI Express as a format hasn't been around for almost a decade itself. Sure, TB has a different connector, which is what I assume you mean by "physical construct", but somehow I doubt you'd suddenly change your opinion on USB being "new tech" if the new version introduced an incompatible new connector type à la FW800.
    You are saying what here?
    Apple has a history of dumping connections that are perceived as not being widely used. Thunderbolt certainly fits that bill.
Not exactly; they drop ports when they no longer serve a purpose or have been eclipsed by newer technology. TB fills an important role for Apple; as such, it won't be replaced anytime soon.
    The co-development, as I understand it, is rumor only. Officially TB is Intel's, and it's quite possible that they won't allow it to be ported to non-Intel processors. Of course, it's also unlikely that makers of computers using non-Intel processors would care.
Apple and Intel were working together on some level here, as Intel's development hardware was an Apple Mac Pro.
    We'll see. If Thunderbolt is removed at some point, at least from the lower-range consumer models, I won't be surprised one bit. Apple's got a history of trying to champion something for a while, and then giving up.
Actually, TB is at its best on low-end consumer hardware; it gives things like the Air incredible expansion capabilities and the ability to handle apps beyond what the laptop's size would imply.
    How many people do you know who use the Thunderbolt port for anything other than a monitor? The only Thunderbolt accessory that even approaches being reasonably priced is the Apple Thunderbolt Display, and that costs $1000. The cheapest Thunderbolt docking station is $250, and the cheapest one that includes another Thunderbolt port so you're not blocked from attaching a DisplayPort monitor costs $300. In contrast, a USB 3.0 hub, which again, does what 99% of users would do with those Thunderbolt docks anyway — plug stuff into the USB ports — can be had for around $20.
That makes a whole lot of sense, comparing a passive hub with the capabilities of any of those docks. The two pieces of hardware aren't even in the same ball field. One is hardware for the pee-wee leagues and the other is suitable for pro users.
    Sure, the Thunderbolt hub is cooler, but the sticker shock from the price tag usually kills the deal in favor of something USB-related. Which of course leads to a vicious cycle — USB gets more usage because it's cheaper, USB gets cheaper due to economies of scale from its large user base, USB gets even more usage because it's even cheaper. Thunderbolt can't really compete with this.
Your fundamental problem is made obvious right here: Thunderbolt is not designed to compete with USB. It is a fundamental mistake to try to put them into competition with each other. If Apple had seen TB as such an option, they would have deleted the USB ports; instead, they supported USB 3 as soon as Intel got its act together.
    Link please.
Do a little footwork yourself.
    Whether they've "lost their way" or not is immaterial. If the PC world doesn't adopt Thunderbolt as a technology, it will be dead within a few years. This is what happened to FireWire, and it had a much larger adoption in the PC world than Thunderbolt does. Apple couldn't keep these things alive on their own even back when they were still giving the Mac line their full attention. Now that they're focused on iDevices, which incidentally don't support Thunderbolt, there's no chance of them pulling it off solo.
    Apple has gotten everything they have wanted out of TB, they won't be dropping it anytime soon.
    YES, EXACTLY. And what happened to ADC? After a few years, it disappeared in favor of the technology it was supposed to replace — DVI.
    Heh.
If they did this immediately when TB came out, it might have had a chance. But for some reason, they haven't, and likely won't in the future either. Meanwhile USB 3.0 has stolen its thunder, so to speak, and the advent of ARM-based PCs like the Surface is lessening Intel's hold on the PC market anyway.
  • Reply 180 of 255
    hmmhmm Posts: 3,405member
    Quote:

    Originally Posted by wizard69 View Post





This is a bit misleading; OpenCL has been very successful considering its late start against CUDA. CUDA's tie to NVidia's hardware will be its downfall, especially now that OpenCL-accelerated apps run so beautifully on Intel and AMD hardware. Intel's Iris Pro can actually whip a large number of discrete GPUs running OpenCL code, by a large margin.



    I'm not saying CUDA is dead today just that it is being marginalized very fast in favor of open standards solutions. People that remain tied to CUDA will be facing competition that can more effectively use the common hardware. In other words when most systems shipped are equipped with either Intel or AMD integrated GPUs, the developers that target OpenCL will be at a significant advantage in the marketplace. Due to these SoC computers I see developers dropping CUDA like a hot potato.

I wasn't referring solely to OpenCL. I was pointing out that tools that are provided by NVidia are not likely to be completely optimized for AMD hardware. It goes beyond CUDA. They experiment in all kinds of stuff. The raytracer Adobe uses in After Effects, which has come up a couple of times (yeah, yeah, I know computers do more than edit movies, but this example has come up), is designed to use CUDA and is very slow running on the CPU. They basically implemented work done by NVidia. If the initiative had begun with Adobe, it's not as likely that it would have been CUDA-based. Speaking of OpenCL, have you seen its 2.0 specification? I have nothing against it. I merely wanted to point out that it isn't the only thing that has driven NVidia's relative dominance in that area up to this point.
