
Transcend's 32GB RAM modules for Apple's Mac Pro double max memory to 128GB [u]

post #1 of 50
Thread Starter 
Memory manufacturer Transcend on Friday announced availability of high-density DDR3 RDIMM RAM modules that bring the Mac Pro's total memory up to 128GB, doubling the max configuration advertised on the Apple Store.

Mac Pro


With the all-new Mac Pro, Apple uses cutting-edge components and a unique design to net huge gains in performance. The additional speed on tap is especially apparent when editing 4K video in applications like Final Cut Pro.

While processing power has been boosted substantially, one gripe some professional users have with the new system is memory expandability. Currently, Apple lists maximum RAM at four 16GB sticks for 64GB, but Transcend recently released compatible 32GB modules that boost memory to 128GB.

"The new Mac Pro 2013 is advertised to support up to 64GB of memory, and we understand that pro users running applications that place high demands on RAM have a need to meet and most likely exceed this threshold," said Transcend's Director of Research and Development Angus Wu. "For this reason, we have developed and fully tested higher density modules to give users the option of raising their Mac Pro system memory to the advertised 64GB right up to 128GB."

For its new 16GB DDR3-1866 and 32GB DDR3-1333 RDIMM modules, Transcend guarantees "100% compatibility" with Apple's new Mac Pro. The company also offers lifetime warranties on the products.

Pricing has yet to be announced, though registered DIMMs are usually more expensive than their unregistered counterparts.

Update: Transcend has released pricing information for its DIMM sets. The suggested retail price for the 64GB kit is $980 and the 128GB kit will cost $2480.

Meanwhile, MacMall let us know that they already have 32GB DDR3-1333 ECC RDIMMs for the new Mac Pro in stock from manufacturer Axiom. They currently retail for $619.99 apiece.
post #2 of 50

I haven't been able to find any mention of price. But I'm glad to see such a product offering. I do a lot of work with The Foundry Modo which can benefit from multiple processors and lots of RAM, and I'm anxious to hear about its optimization for the new Mac Pro with its GPUs.

Daniel Swanson
post #3 of 50
Quote:
Originally Posted by DanielSW View Post
 

I haven't been able to find any mention of price....

 

I doubt you'd want to.

 

Holy cow, 128 GB of RAM. I wonder how many years it will be before I look back on this comment and laugh.

 

I've already had my laugh at some very very old 'wow' comments on 16 GB of RAM.

post #4 of 50
Quote:
Originally Posted by pmz View Post
 

 

I doubt you'd want to.

 

Holy cow, 128 GB of RAM. I wonder how many years it will be before I look back on this comment and laugh.

 

I've already had my laugh at some very very old 'wow' comments on 16 GB of RAM.

Of course I want to. I'm not a dilettante spectator. I've spent a lot of money on memory over the past three decades, making far more money with the resulting enhanced performance of my computers than the cost of the memory. Companies like this realize this value and will probably enjoy good sales to people like me.

Daniel Swanson
post #5 of 50
Remember 128K of RAM in a Mac? Probably not, but it was a lot for the day.
post #6 of 50
Quote:
Originally Posted by GTBuzz View Post

Remember 128K of RAM in a Mac? Probably not, but it was a lot for the day.

Heck, I felt spoiled with 128K and no need for a language card to add an additional 16K as with the Apple ][.

I had to get a new Mac Pro in 2013, so I had to settle for what was shipping early, a six-core. I really would like more RAM than I was able to get stock, so this is great news. Now I just want to see the same for the SSDs from third parties: larger and less expensive but 100% compatible... (I hope)
Edited by digitalclips - 3/14/14 at 3:17pm
From Apple ][ - to new Mac Pro I've used them all.
Long on AAPL so biased
Google Motto "You're not the customer. You're the product."
post #7 of 50
Quote:
Originally Posted by DanielSW View Post
 

Of course I want to. I'm not a dilettante spectator like you. I've spent a lot of money on memory over the past three decades, making far more money with the resulting enhanced performance of my computers than the cost of the memory. Companies like this realize this value and will probably enjoy good sales to people like me.

 

So I guess you've found a way to monetize that RAM that doesn't involve dealing with clients, huh? I can't imagine that kind of attitude going over very well in any setting other than belly-bumping on the internet.

 

Besides, how much RAM can you cram into that iMac of yours anyway?

Lorin Schultz (formerly V5V)
Audio Engineer
V5V Digital Media, Vancouver, BC Canada
post #8 of 50
I want a new Mac Pro so I can need these new RAM modules.
post #9 of 50
Quote:
Originally Posted by DanielSW View Post

Of course I want to. I'm not a dilettante spectator like you. I've spent a lot of money on memory over the past three decades, making far more money with the resulting enhanced performance of my computers than the cost of the memory. Companies like this realize this value and will probably enjoy good sales to people like me.

Cool it! He's not attacking you but noting that even those who need 64GiB are very few and far between. If you need 128GiB then so be it, and it's certainly possible, but I'm curious as to what you need that much RAM for, especially considering that the RAM is slower. I implore you to paint us a picture of your usage needs.
Edited by SolipsismX - 3/14/14 at 9:20pm

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply
post #10 of 50
Quote:
Originally Posted by SolipsismX View Post

Cool it! He's not attacking you but noting that even those who need 64GiB are very few and far between. If you need 128GiB then so be it, and it's certainly possible, but I'm curious as to what you need that much RAM for, especially RAM that is slower. I implore you to paint us a picture of your usage needs.

To be fair, he is correct if not too tactful, although not if the RAM is slower; I missed that part! While working full time I had every Mac I ever owned maxed out, such as the IIfx, which had 8 MB, LOL; that thing, fully loaded, cost more than the current 12-core Mac Pro! One of the companies I ran was in high-end scanner software and always needed as much RAM as possible. Today someone editing 4K might seriously benefit from tons of RAM, and no doubt 8K is on the horizon. It's horses for courses; most folks don't need much. These days I'd be happy with 32 GB, but since my nMP came with 16 I have to think about how to upgrade. I suspect I'm painted into a corner, damn it!
From Apple ][ - to new Mac Pro I've used them all.
Long on AAPL so biased
Google Motto "You're not the customer. You're the product."
post #11 of 50
Quote:
Originally Posted by digitalclips View Post

To be fair, he is correct if not too tactful, although not if the RAM is slower; I missed that part! While working full time I had every Mac I ever owned maxed out, such as the IIfx, which had 8 MB, LOL; that thing, fully loaded, cost more than the current 12-core Mac Pro! One of the companies I ran was in high-end scanner software and always needed as much RAM as possible. Today someone editing 4K might seriously benefit from tons of RAM, and no doubt 8K is on the horizon. It's horses for courses; most folks don't need much. These days I'd be happy with 32 GB, but since my nMP came with 16 I have to think about how to upgrade. I suspect I'm painted into a corner, damn it!

As a general rule "more RAM is better" but there are many factors to consider here, especially when talking about a production machine. If I was editing 4K video for my Pixar-funded project I would first see if I would need more than 64GiB, then see if the apps I use can handle 128GiB and, finally, I'd want to do testing to see if this slower (as well as brand-new, untested, 3rd-party) RAM helps me be more productive than opting for 64GiB of RAM from Apple.

On the surface I'd say that slightly slower RAM is worth it for double the capacity, but at these capacities and speeds we're not talking about basic users, so empirical testing is needed to see if this really is a benefit for production.
Edited by SolipsismX - 3/15/14 at 9:31am

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply
post #12 of 50

Irony: since Intel doesn't implement the DDR3 spec correctly, the Mac Pro has to house 4 additional RAM slots just to get to 64GB, when if Intel did it to spec the 8 slots could theoretically house 256GB of DDR3 RAM.

 

AMD designs to spec, and it's the reason there are only 4 slots for DDR3 DIMMs on 32/64GB AMD FM2+ and AM3+ boards.

 

I would love to see Intel be forced to license Thunderbolt to AMD.

 

It would give Apple the option of Excavator APUs: the future of CPU designs.

 

On March 26th AMD is announcing the next-generation AMD FirePro GPGPUs. I hope the D series gets an upgrade for Mac Pros.

post #13 of 50
Quote:
Originally Posted by SolipsismX View Post

Cool it! He's not attacking you but noting that even those who need 64GiB are very few and far between. If you need 128GiB then so be it, and it's certainly possible, but I'm curious as to what you need that much RAM for, especially RAM that is slower. I implore you to paint us a picture of your usage needs.

Remember the days when people would ask 'why do you need 8GB of RAM?!' This is the same thing. Musicians, graphic designers, photographers, 3D artists: all these guys would be happy with as much RAM as they can handle. Photoshop files that use 10GB of RAM each, software instruments that need to load up tens of gigabytes of samples. No one NEEDS that much RAM, but it sure saves a shitload of time when you're on a deadline. No need to close After Effects to reopen Photoshop. Keep Logic Pro open with a bunch of sample-heavy AU instruments while you work in Final Cut with 4K uncompressed video. Comments like that are so closed-minded. It's unbelievable you could still be 'curious' about why 128GB+ of RAM is useful.
post #14 of 50
Quote:
Originally Posted by frxntier View Post

Remember the days when people would ask 'why do you need 8GB of RAM?!' This is the same thing. Musicians, graphic designers, photographers, 3D artists: all these guys would be happy with as much RAM as they can handle. Photoshop files that use 10GB of RAM each, software instruments that need to load up tens of gigabytes of samples. No one NEEDS that much RAM, but it sure saves a shitload of time when you're on a deadline. No need to close After Effects to reopen Photoshop. Keep Logic Pro open with a bunch of sample-heavy AU instruments while you work in Final Cut with 4K uncompressed video. Comments like that are so closed-minded. It's unbelievable you could still be 'curious' about why 128GB+ of RAM is useful.

1) Someone saying they need more RAM is not a justification for actually needing more RAM, or even evidence of having apps that can handle the additional RAM, so wondering what professional app(s) it is for is a valid question.

2) As previously noted, we're talking about professional-level usage with a Mac Pro with 128GB of RAM, so the question is doubly important when you consider someone wanting this RAM despite it having had zero testing in the market and being slower than the DDR4 RAM that the Mac Pro ships with. If you're jumping in blindly because of the higher capacity without any consideration for quality, support, or performance, then you're not someone I want working for me.

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply
post #15 of 50
It's only this latest version of OS X that can use more than 96GB of RAM. I'm sure glad Apple finally lifted that cap!
post #16 of 50
Quote:
Originally Posted by SolipsismX View Post

1) Someone saying they need more RAM is not a justification for actually needing more RAM, or even evidence of having apps that can handle the additional RAM, so wondering what professional app(s) it is for is a valid question.

2) As previously noted, we're talking about professional-level usage with a Mac Pro with 128GB of RAM, so the question is doubly important when you consider someone wanting this RAM despite it having had zero testing in the market and being slower than the DDR4 RAM that the Mac Pro ships with. If you're jumping in blindly because of the higher capacity without any consideration for quality, support, or performance, then you're not someone I want working for me.
I was addressing the question 'why you need that much RAM.' The question was asked without any consideration for why people might need it. I wasn't saying that this setup is the way to go (it really probably isn't, but it's worth investigating.) I was just showing that there are always reasons to have more RAM, even RAM that is slower than DDR4. It's still faster than the PCIe SSDs. I didn't jump in blindly; I just tried to address the annoying 'why would you need this much RAM'. Just being a bit slower on paper is not any reason to immediately discount the idea.
post #17 of 50
Quote:
Originally Posted by TeaEarleGreyHot View Post


Do professional OS X apps even allow for 128GB minus OS resources to be allocated by a single app?
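One rough way to probe that empirically is just to try it. A minimal sketch, assuming a 64-bit process on OS X (hw.memsize is the sysctl that reports installed RAM; the allocation size below is kept small on purpose so it's safe to run, and per-process limits will vary by OS version and app):

import ctypes
import ctypes.util

libc = ctypes.CDLL(ctypes.util.find_library("c"))

def physical_ram_bytes():
    # Ask the kernel how much RAM is installed (OS X sysctl "hw.memsize")
    size = ctypes.c_uint64(0)
    length = ctypes.c_size_t(ctypes.sizeof(size))
    libc.sysctlbyname(b"hw.memsize", ctypes.byref(size),
                      ctypes.byref(length), None, ctypes.c_size_t(0))
    return size.value

def try_allocate(gigabytes):
    # bytearray() zero-fills, so the pages really get committed, not just reserved
    return bytearray(gigabytes * 1024**3)

print("Installed RAM: %.1f GB" % (physical_ram_bytes() / 1024**3))
block = try_allocate(1)   # raise this to probe how far a single process can go
print("Held %.1f GB in one allocation" % (len(block) / 1024**3))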

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply
post #18 of 50
Quote:
Originally Posted by frxntier View Post

I was addressing the question 'why you need that much RAM.' The question was asked without any consideration for why people might need it. I wasn't saying that this setup is the way to go (it really probably isn't, but it's worth investigating.) I was just showing that there are always reasons to have more RAM, even RAM that is slower than DDR4. It's still faster than the PCIe SSDs. I didn't jump in blindly; I just tried to address the annoying 'why would you need this much RAM'. Just being a bit slower on paper is not any reason to immediately discount the idea.

Why do you have a problem with the question, "Why would you need this much RAM?" It's not as if I made the statement, "No one needs that much RAM!"

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply
post #19 of 50
Quote:
Originally Posted by SolipsismX View Post

Why do you have a problem with the question, "Why would you need this much RAM?" It's not as if I made the statement, "No one needs that much RAM!"
It's the implication that you know the answer, because you qualified it with 'especially ram that is slower'. I read the former statement with the implication of the latter. Apologies for that if it's not what you meant. Although you did ask to have a picture painted for you. And I attempted to do that.
post #20 of 50
Quote:
Originally Posted by frxntier View Post

It's the implication that you know the answer, because you qualified it with 'especially ram that is slower'. I read the former statement with the implication of the latter. Apologies for that if it's not what you meant. Although you did ask to have a picture painted for you. And I attempted to do that.

I asked the question because the user didn't express their use case. I noted the RAM being slower because, while additional RAM for a non-portable device is never a negative (except on cost), being slow is a negative. I even stated that I think slightly slower RAM is worth doubling the amount if your needs call for it. I also talked about testing the RAM to see if it truly will be a benefit to one's production. Finally, I used the words curious and implore to try to draw out an in-depth answer as to why 64GiB isn't enough.

I don't know what else I could have done to make my queries sound more sincere.


PS: A little over a year ago I received my 2012 iMac and *premium* RAM with a lower CAS latency that I had purchased from Newegg. I ran artificial tests to see if it made a difference. It didn't, so I returned it.
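For anyone curious what such an artificial test can look like, here is a crude sketch (it assumes NumPy is installed and only measures coarse streaming bandwidth, which is exactly the kind of test where CAS-latency differences tend to vanish):

import time
import numpy as np

# Stream a 256 MB array through memory repeatedly and report effective GB/s.
a = np.ones(256 * 1024**2 // 8)    # 256 MB of float64
start = time.time()
for _ in range(20):
    a.sum()                        # forces a full pass over the array
elapsed = time.time() - start
print("~%.1f GB/s effective read bandwidth" % (20 * a.nbytes / elapsed / 1e9))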

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply
post #21 of 50
Quote:
Originally Posted by daveinpublic View Post


Don't forget, you also said..... "Cool it!"

So obviously you had a problem or something. Also, for everyone else jumping on the first commenter - my apologies for this forum, some people don't realize comments don't come across well through text sometimes... It'll all be okay.

That's fine, I'm not worried. Just voicing my opinions about the prospects of having that much RAM, especially recalling those very early days in 1984 on a 128K Macintosh. Very sweet times, these. Can't wait to buy my Mac Pro and fit it out with that much RAM!

Daniel Swanson
post #22 of 50
Quote:
Originally Posted by frxntier View Post
 
I was addressing the question 'why you need that much RAM.' The question was asked without any consideration for why people might need it.

 

You lost me... are you saying the question was asked without any attempt to answer the question himself...? That doesn't make any sense.

 

"Consideration of why people might need it" is exactly why he asked why people might need it, yes?

 

I thought it was a perfectly reasonable question. Before parting with that kind of coin I'd wanna be REALLY sure that there's enough benefit in productivity to justify the big ol' block of billable hours that stack of RAM represents. Hence, I would ask "What is the benefit of having that much RAM? Will it really benefit our actual work or is it just a theoretical advantage? Is there any tangible improvement going from 64 to 128? How will that improvement offset the cost?" In other words, "Why do you need that much RAM?"

Lorin Schultz (formerly V5V)
Audio Engineer
V5V Digital Media, Vancouver, BC Canada
post #23 of 50
Quote:
Originally Posted by SolipsismX View Post

If I was editing 4K video for my Pixar-funded project I would first see if I would need more than 64GiB

Pixar's and Weta's render farms use 96GB RAM per machine:

http://www.theverge.com/2013/6/21/4446606/how-pixar-changed-the-way-light-works-for-monsters-university

They started using raytracing for more accurate lighting. They mentioned that each frame used 20GB of memory and each machine would do 4 frames at a time. I don't know what target resolution they had but it could just be 2K. The Great Gatsby (2013) was shot at 5K so I'd guess it was mastered to 4K and it used photorealistic backdrops rendered with Pixar's software:



https://www.fxguide.com/featured/the-state-of-rendering/

"For Gatsby the team used Animal’s own pipeline with a path tracer within PRman (RenderMan).

A common solution, especially for exteriors, started with a combination of an environment light and a key light. Estela explains that the environment light provides realism, soft shadows and overall tonality but it is also the most expensive light to use. “It is a bit like a dome light in Mental Ray,” he adds.

The use of this approach produced great results but it did not come without some effort. The two key problems were render times to get renders with satisfactory low noise, and memory use. “Our old system could render using 8 gig, our new system used 64 gigs of RAM, and we could easily put ‘em into swap.” Still the team managed very successfully to migrate to the new approach of physical shaders and lighting and used IBL for both technical and artistic success."

The same renderfarm was used for The Matrix, 300, Walking With Dinosaurs, Happy Feet, Moulin Rouge, The Lego Movie. Those machines are also servers and not running other processes like a desktop would. It would be unlikely that an individual desktop machine would run a single process that would be used for photoreal 4K imagery like in a feature film but if it was an image for print work or short commercial, the quality can be pushed up. There are a couple of profiles here, these smaller companies would use raytracing much more frequently than movie studios as the resolution is lower but that lowers the memory requirement:

http://www.itsartmag.com/features/studio-profile-lumiere-studio/
http://www.itsartmag.com/features/interview-anselm-von-seherr-thoss/3/

The 2nd one mentions a custom desktop rig with 48GB RAM. To max out 64GB, you'd need more than 6 apps all using 10GB at the same time, which is unlikely. Of course you can force it to do that by just pasting as many things around and some apps will fill it with their cache. After Effects likes to fill up memory with uncompressed frames as it goes so it can easily fill up over 10GB and then double it to export (it's not smart enough to flush the cache when it runs out).
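To put numbers on how quickly uncompressed frames add up, a back-of-the-envelope sketch (it assumes 8-bit RGBA, i.e. 4 bytes per pixel; higher bit depths scale proportionally):

def frame_bytes(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel

for name, (w, h) in (("1080p", (1920, 1080)), ("UHD 4K", (3840, 2160))):
    per_frame = frame_bytes(w, h)
    frames = (10 * 1024**3) // per_frame   # how many frames fit in a 10 GB cache
    print("%s: %.1f MB/frame, ~%d frames (%.1f s at 30 fps) per 10 GB"
          % (name, per_frame / 1024**2, frames, frames / 30))

At 4K that works out to roughly 32 MB per frame, so a 10 GB cache holds only about ten seconds of footage.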

As for price, 128GB for the current Mac Pro is about $1600 so I'd guess that's probably what this will be at a minimum.
Quote:
Originally Posted by SolipsismX 
being slower than the DDR4 RAM that the Mac Pro ships with

The Mac Pro ships with DDR3. Next year it will be DDR4, which will offer 128GB too. I'd say 128GB is more RAM than anyone will ever need. Once this amount of RAM becomes inexpensive (say $500 for 128GB), Apple can solder the RAM in.
post #24 of 50
Quote:
Originally Posted by GTBuzz View Post

Remember 128K of RAM in a Mac? Probably not, but it was a lot for the day.

When the first Macintosh came out I had as much RAM in my Apple //e.

Some time after the 128K Mac came out, a friend started to change them into Fat Macs even before the 512K came out - the owners were willing to take the risk as they were so in need of more RAM.

 

So no, it was not.

But it was very expensive to have more and for several years RAM was one of the most expensive upgrades you could do.

post #25 of 50
And you can use even more RAM than that for scientific computing (biology, astronomy, ...). RAM is in general useful when working on lots of data, because the OS uses it to cache the disk files. So if you have a lot of small files that you need to access frequently, it will speed things up a lot.
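The caching effect is easy to see for yourself. A rough sketch (the file name and size are arbitrary; note that right after writing, the file may already be in the cache, so for a genuinely cold first read you'd drop the cache first, e.g. with the purge tool on OS X):

import os
import time

path = "cache_test.bin"
with open(path, "wb") as f:
    f.write(os.urandom(256 * 1024**2))    # 256 MB of throwaway data

def timed_read(p):
    start = time.time()
    with open(p, "rb") as f:
        while f.read(8 * 1024**2):        # read in 8 MB chunks
            pass
    return time.time() - start

print("first read:  %.2f s" % timed_read(path))
print("second read: %.2f s" % timed_read(path))   # usually served from RAM
os.remove(path)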
post #26 of 50
Quote:
Originally Posted by Marvin View Post

Pixar's and Weta's render farms use 96GB RAM per machine:

http://www.theverge.com/2013/6/21/4446606/how-pixar-changed-the-way-light-works-for-monsters-university

They started using raytracing for more accurate lighting. They mentioned that each frame used 20GB of memory and each machine would do 4 frames at a time. I don't know what target resolution they had but it could just be 2K. The Great Gatsby (2013) was shot at 5K so I'd guess it was mastered to 4K and it used photorealistic backdrops rendered with Pixar's software:



https://www.fxguide.com/featured/the-state-of-rendering/

"For Gatsby the team used Animal’s own pipeline with a path tracer within PRman (RenderMan).

A common solution, especially for exteriors, started with a combination of an environment light and a key light. Estela explains that the environment light provides realism, soft shadows and overall tonality but it is also the most expensive light to use. “It is a bit like a dome light in Mental Ray,” he adds.

The use of this approach produced great results but it did not come without some effort. The two key problems were render times to get renders with satisfactory low noise, and memory use. “Our old system could render using 8 gig, our new system used 64 gigs of RAM, and we could easily put ‘em into swap.” Still the team managed very successfully to migrate to the new approach of physical shaders and lighting and used IBL for both technical and artistic success."

The same renderfarm was used for The Matrix, 300, Walking With Dinosaurs, Happy Feet, Moulin Rouge, The Lego Movie. Those machines are also servers and not running other processes like a desktop would. It would be unlikely that an individual desktop machine would run a single process that would be used for photoreal 4K imagery like in a feature film but if it was an image for print work or short commercial, the quality can be pushed up. There are a couple of profiles here, these smaller companies would use raytracing much more frequently than movie studios as the resolution is lower but that lowers the memory requirement:

http://www.itsartmag.com/features/studio-profile-lumiere-studio/
http://www.itsartmag.com/features/interview-anselm-von-seherr-thoss/3/

The 2nd one mentions a custom desktop rig with 48GB RAM. To max out 64GB, you'd need more than 6 apps all using 10GB at the same time, which is unlikely. Of course you can force it to do that by just pasting as many things around and some apps will fill it with their cache. After Effects likes to fill up memory with uncompressed frames as it goes so it can easily fill up over 10GB and then double it to export (it's not smart enough to flush the cache when it runs out).

As for price, 128GB for the current Mac Pro is about $1600 so I'd guess that's probably what this will be at a minimum.
The Mac Pro ships with DDR3. Next year it will be DDR4, which will offer 128GB too. I'd say 128GB is more RAM than anyone will ever need. Once this amount of RAM becomes inexpensive (say $500 for 128GB), Apple can solder the RAM in.

Loved your post, however I am certain OS X will allow and improve upon applications' abilities to max out far more RAM very soon, making it a must in these industries. But... "I'd say 128GB is more RAM than anyone will ever need." ... you serious? When most Macs had 512K at most, and I loaded up my IIfx with 8 MB, I was told I was crazy and that no one would ever need that much RAM. I am sure we will be looking at the same discussion over TBs one day. However, I assume you mean "...I'd say 128GB is more RAM than anyone will ever need in the nMP for the next few years."

I bet we will see 512GB and 1TB within the next few years in Mac Pros.

Meanwhile I wish Apple and ATI would work on the dual-GPU access at the core level (Central Dispatch, Grand Central... how about King's Cross Station as a name?) even if a rewritten application like FCPro is the better route. At least let all the other 64-bit applications that would benefit get an increase ASAP. It is sad to see an i7 iMac besting a 4-core nMP in Geekbench tests that have no idea what to do with the new architecture (or so it seems to me... not my area of expertise; if this is not the case I am all ears).

EDIT: Reason ... I first typed 8 GIGs in my IIfx ... hahahahaah ... oh how quickly we forget!
Edited by digitalclips - 3/15/14 at 6:59am
From Apple ][ - to new Mac Pro I've used them all.
Long on AAPL so biased
Google Motto "You're not the customer. You're the product."
post #27 of 50
Quote:
Originally Posted by digitalclips View Post

" I'd say 128GB is more RAM than anyone will ever need." ... you serous? When most Macs had 512K at most, and I loaded up my IIfx with 8 MBs I was told I was crazy and no one would ever need that much RAM. I am sure we will be looking at the same discussion over TBs one day. However, I assume you mean " ...I'd say 128GB is more RAM than anyone will ever need in the nMP for the next few years." 1biggrin.gif

I bet we will see 512GB and 1TB within the next few years in Mac Pros.

Technology doesn't all keep increasing like that. Think about display resolution, it wouldn't have been right to say 640x480 was enough all those years ago but 'Retina' quality is enough (300ppi at arm's length). That means 4K or 8K is the limit for monitors and media.

The only place that I see RAM getting bigger is if it becomes non-volatile and replaces storage but it won't be down to RAM running out:

http://www.extremetech.com/computing/163058-reram-the-new-memory-tech-that-will-eventually-replace-nand-flash-finally-comes-to-market

Storage is another thing that doesn't need to keep getting all that much bigger. There was a link from there saying Microsoft found 16TB is as much as SSD will manage:

http://www.extremetech.com/computing/118909-current-solid-state-drive-technology-is-doomed-researchers-say

but 16TB is a pretty decent size for personal storage. The issue is cost, you can buy as many 2.5" drives as you want.

The goal with software is not to use more memory but less. If the target resolution for media sticks at 4K/8K then the memory demands aren't going to go higher. I'm going to stand by the statement that 128GB would be enough for any task that is needed by a personal computer. I don't think that even in 50 years people will be running processes that max out 128GB. It ultimately comes down to content. What is the content being generated or stored that uses up the space? If you can generate photoreal 4K CGI on machines with 96GB of RAM then where do you go next?

Other things that have limits are display colors and audio quality. Human senses have limits and that's where the demand comes from. If there's no perceptible improvement in a technology change then the demand goes away. We should have moved to higher quality colors (16-bpp or 12-bpp minimum) but the demand just isn't there, people are content with 8-bpp.

Even if we get to a point where photoreal imagery is generated on the fly in virtual reality, we have a limited viewpoint. Our field of vision is limited as well as visual acuity so content can dynamically load in and out of memory and again, there's a limit in the human effort to build content that rich to experience.
post #28 of 50
@DanielSW

I'm still waiting for an answer as to why 64GiB RAM simply isn't enough for your needs.

Quote:
Originally Posted by Marvin View Post

Technology doesn't all keep increasing like that. Think about display resolution, it wouldn't have been right to say 640x480 was enough all those years ago but 'Retina' quality is enough (300ppi at arm's length). That means 4K or 8K is the limit for monitors and media.

The only place that I see RAM getting bigger is if it becomes non-volatile and replaces storage but it won't be down to RAM running out:

http://www.extremetech.com/computing/163058-reram-the-new-memory-tech-that-will-eventually-replace-nand-flash-finally-comes-to-market

Storage is another thing that doesn't need to keep getting all that much bigger. There was a link from there saying Microsoft found 16TB is as much as SSD will manage:

http://www.extremetech.com/computing/118909-current-solid-state-drive-technology-is-doomed-researchers-say

but 16TB is a pretty decent size for personal storage. The issue is cost, you can buy as many 2.5" drives as you want.

The goal with software is not to use more memory but less. If the target resolution for media sticks at 4K/8K then the memory demands aren't going to go higher. I'm going to stand by the statement that 128GB would be enough for any task that is needed by a personal computer. I don't think that even in 50 years people will be running processes that max out 128GB. It ultimately comes down to content. What is the content being generated or stored that uses up the space? If you can generate photoreal 4K CGI on machines with 96GB of RAM then where do you go next?

Other things that have limits are display colors and audio quality. Human senses have limits and that's where the demand comes from. If there's no perceptible improvement in a technology change then the demand goes away. We should have moved to higher quality colors (16-bpp or 12-bpp minimum) but the demand just isn't there, people are content with 8-bpp.

Even if we get to a point where photoreal imagery is generated on the fly in virtual reality, we have a limited viewpoint. Our field of vision is limited as well as visual acuity so content can dynamically load in and out of memory and again, there's a limit in the human effort to build content that rich to experience.

Sure, there are upper limits in technology but it's unlikely we'll know what they are. If we're already seeing 128GB of RAM for a machine a consumer could afford, and that is supported today in a consumer OS, why wouldn't one expect the prices to come down and the need to go up in 10, 20, or even 50 years?

As for display resolutions, that is clearly a much slower-moving area than RAM but it has been growing. 4K displays are coming and I see no reason why 8K displays won't be next. I think eventually TVs will get so big that 8K won't begin to cut it, but by then they won't be TVs in the way we see them now. What if a TV in 50 years is similar to how it's represented in Fahrenheit 451? IOW, you have one or more walls of TV with exceptional resolution. So we're talking, say, 500 PPI so even up close you can't see pixels. How much RAM would be needed for this, including the room sensors that can respond to your actions in real time? Wouldn't more than 128GiB be needed for 500 billion pixels?

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply
post #29 of 50
Quote:
Originally Posted by Marvin View Post

Technology doesn't all keep increasing like that. Think about display resolution, it wouldn't have been right to say 640x480 was enough all those years ago but 'Retina' quality is enough (300ppi at arm's length). That means 4K or 8K is the limit for monitors and media.

The only place that I see RAM getting bigger is if it becomes non-volatile and replaces storage but it won't be down to RAM running out:

http://www.extremetech.com/computing/163058-reram-the-new-memory-tech-that-will-eventually-replace-nand-flash-finally-comes-to-market

Storage is another thing that doesn't need to keep getting all that much bigger. There was a link from there saying Microsoft found 16TB is as much as SSD will manage:

http://www.extremetech.com/computing/118909-current-solid-state-drive-technology-is-doomed-researchers-say

but 16TB is a pretty decent size for personal storage. The issue is cost, you can buy as many 2.5" drives as you want.

The goal with software is not to use more memory but less. If the target resolution for media sticks at 4K/8K then the memory demands aren't going to go higher. I'm going to stand by the statement that 128GB would be enough for any task that is needed by a personal computer. I don't think that even in 50 years people will be running processes that max out 128GB. It ultimately comes down to content. What is the content being generated or stored that uses up the space? If you can generate photoreal 4K CGI on machines with 96GB of RAM then where do you go next?

Other things that have limits are display colors and audio quality. Human senses have limits and that's where the demand comes from. If there's no perceptible improvement in a technology change then the demand goes away. We should have moved to higher quality colors (16-bpp or 12-bpp minimum) but the demand just isn't there, people are content with 8-bpp.

Even if we get to a point where photoreal imagery is generated on the fly in virtual reality, we have a limited viewpoint. Our field of vision is limited as well as visual acuity so content can dynamically load in and out of memory and again, there's a limit in the human effort to build content that rich to experience.

All good points, but I still would bet we see a continued increase in RAM usage, not to mention... I'm still banking on the Holodeck becoming a reality one day, and boy will a load of RAM help that.

On a slightly more serious note... cough... when we start being able to 'jack in' and 'download' a mind, we will see the RAM needs escalate... (see Richard K. Morgan ... trilogy). Brilliant novels. Takeshi Kovacs is coming!
From Apple ][ - to new Mac Pro I've used them all.
Long on AAPL so biased
Google Motto "You're not the customer. You're the product."
post #30 of 50
Quote:
Originally Posted by SolipsismX View Post

Sure, there are upper limits in technology but it's unlikely we'll know what they are.

We base it on our own limits. The limit of audio quality is what the human ear can distinguish. The limit in color accuracy and resolution is what the eye can distinguish. The amount of memory needed is based on the limited content that needs to be processed.
Quote:
Originally Posted by SolipsismX View Post

If we're already seeing 128GB of RAM for a machine a consumer could afford, and that is supported today in a consumer OS, why wouldn't one expect the prices to come down and the need to go up in 10, 20, or even 50 years?

Need doesn't go up if price comes down, people just buy what they need for less but RAM will become inexpensive to the point that 16GB can go into entry-level machines. DDR4 will probably allow all of Apple's machines to have an 8GB entry point. Retailers will only offer amounts that people will buy. The reason that high amounts exist now is for servers because they virtualize operating systems so individual machines can be split to run 50VMs with 2GB of dedicated memory each.
Quote:
Originally Posted by SolipsismX View Post

Wouldn't more than 128GiB be needed for 500 billion pixels?

It's not practical. You need to film content at this resolution first of all and the clarity of the imagery is limited by the lenses:

http://en.wikipedia.org/wiki/Optical_resolution#Lens_resolution
http://petapixel.com/2012/12/17/perceptual-megapixel-mtf-charts-boiled-down-to-a-single-number/

Then there's the bandwidth to transfer and processing to compress that much data in real-time from the sensor to storage. Memory bandwidth is only 25GB/s so per frame (at 30fps), that's 830MB divided by 3 bytes per pixel = 277 million pixels per frame. If they put faster memory in there (future GPU memory will go up to 1TB/s) and compressed at a ratio of 10:1, you'd still be talking about 500 billion x 3 bytes x 30 / 10 = 4.5 terabytes per second or 36 terabits/s (1800x faster than Thunderbolt 2).
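Spelling that arithmetic out (same assumptions as above: 3 bytes per pixel, 30 fps, 10:1 compression, Thunderbolt 2 at 20 Gbit/s):

mem_bandwidth = 25e9                     # bytes/s of main-memory bandwidth
print("Pixels per frame at 30 fps: %.0f million"
      % (mem_bandwidth / 30 / 3 / 1e6))  # ~277 million

pixels = 500e9                           # the hypothetical wall of pixels
raw = pixels * 3 * 30                    # uncompressed bytes/s
compressed = raw / 10                    # assume 10:1 compression
tb2 = 20e9 / 8                           # Thunderbolt 2 in bytes/s
print("Needed: %.1f TB/s, about %.0fx Thunderbolt 2"
      % (compressed / 1e12, compressed / tb2))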

I don't think you'd need 500 billion pixels for a wall though. 3840 x 2160 is fine on 50" so maybe 24 of those = 200 million pixels for a wall but like I say, you'd need the camera tech to capture at that quality and then once you have your 24k x 13.5k resolution video, you'd need to find a way to transmit 480Mbit/s of compressed data to the consumer. Maybe that's what they intend these for:

http://www.pcworld.com/article/2106260/sony-panasonic-develop-300gb-optical-discs-for-enterprise-storage.html
Quote:
Originally Posted by digitalclips 
when we start being able to 'jack in' and 'download' a mind, we will see the RAM needs escalate

That could be done on an SSD (10TB or so). Brains are slow at processing so it's really storage you'd be looking at. You'd lose all 10TB of data once you rebooted if it was in RAM.
post #31 of 50
Quote:
Originally Posted by Marvin View Post

We base it on our own limits. The limit of audio quality is what the human ear can distinguish. The limit in color accuracy and resolution is what the eye can distinguish. The amount of memory needed is based on the limited content that needs to be processed.

RAM has nothing to do with our own limits. If 32GiB sticks will be available in 2014 I can't imagine going the rest of my life — hopefully past 2100 CE — and never seeing that exceeded, especially when I consider how much it's increased in just the past 25 years.
Quote:
Need doesn't go up if price comes down, people just buy what they need for less but RAM will become inexpensive to the point that 16GB can go into entry-level machines. DDR4 will probably allow all of Apple's machines to have an 8GB entry point. Retailers will only offer amounts that people will buy. The reason that high amounts exist now is for servers because they virtualize operating systems so individual machines can be split to run 50VMs with 2GB of dedicated memory each.

Actually it does. As a resource becomes more or less available in a culture we typically find a way to become more or less dependent on it. If a large amount of cheap, low-power RAM becomes so abundant that developers don't have to worry about making applications more RAM-efficient, then they won't, and as a result we'll need to use more RAM.
Quote:
It's not practical.

There is a history of technological changes where people in the previous generation said something wouldn't ever exist because it's not practical. We'll see 4K video being streamed as the standard to our homes even though not too long ago it didn't exist at all.

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply
post #32 of 50
Quote:
Originally Posted by reroll View Post

And you can use even more RAM than that for scientific computing (biology, astronomy, ...). RAM is in general useful when working on lots of data, because the OS uses it to cache the disk files. So if you have a lot of small files that you need to access frequently, it will speed things up a lot.

 

Yes, my data files are often over 40 GB in size. And I generally need to be working on 2 or more of them at a time. The SSD does speed things up, just in terms of reading the files into RAM!

post #33 of 50
Quote:
Originally Posted by SolipsismX View Post

If 32GiB sticks will be available in 2014 I can't imagine going the rest of my life — hopefully past 2100 CE — and never seeing that exceeded, especially when I consider how much it's increased in just the past 25 years.

The increases have been in order to reach the limits of the media we use. We used to have black and white displays, then 256 colors, thousands, millions and we stopped there. We used to have really low resolution targets below 320x240 and it increased gradually (Finding Nemo in 2003 was rendered at 1600x900 and had to be redone for HD - http://www.tested.com/art/movies/449542-finding-nemo-3d-interview/ ).

When it comes to video, there are limits to the hardware:

http://www.northlight-images.co.uk/article_pages/guest/physical_limits.html

"For a typical lens set at f/10, the calculation shows that 2000 lines [The word "lines" refers here to cycles of a sine wave or line pairs.] in the vertical dimension is the most that could ever be resolved at f/10 without noticeable losses in contrast. This is significant because a vertical resolution of 2000 corresponds to 21-24 megapixels, a value that already is reached by state-of-the-art cameras. This implies that to make full use of the sensor capabilities of cameras already in production requires the use of apertures less than about f/10. And since the quality of the optics limit the resolution of most lenses today when apertures are much larger than f/5.6 or f/8, this suggests that there are steeply diminishing returns for sensors of more than about 25-35 megapixels for a 35mm camera, a limit that will apply until there are some breakthroughs in optics that can perform better at f/4 than they do at f/10."

They might try to overcome this with techniques used in astronomy but it comes down to consumer demand and willingness for the whole pipeline to adopt the extra data.
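As a quick sanity check on the quoted figure (assuming roughly 2 pixels per resolvable line pair and a 3:2 full-frame sensor):

line_pairs = 2000                       # resolvable line pairs, vertically
vertical_px = line_pairs * 2            # Nyquist: ~2 pixels per line pair
horizontal_px = vertical_px * 3 // 2    # 3:2 aspect ratio
print("%.0f MP" % (vertical_px * horizontal_px / 1e6))   # ~24 MP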
Quote:
Originally Posted by SolipsismX View Post

If a large amount of cheap, low-power RAM becomes so abundant that developers don't have to worry about making applications more RAM-efficient, then they won't, and as a result we'll need to use more RAM.

I think developers gave up on RAM efficiency a while ago. I've seen Safari use a good few GBs of memory. That's not really a good thing though and they shouldn't do it on purpose. No major developer would do it on purpose.
Quote:
Originally Posted by SolipsismX View Post

There is a history of technological changes where people in the previous generation said something wouldn't ever exist because it's not practical. We'll see 4K video being streamed as the standard to our homes even though not too long ago it didn't exist at all.

2.7 million UHDTVs sold in 2013, 'industry sources' think 26m this year:

http://hometheaterreview.com/10x-more-ultra-hd-tvs-will-be-sold-this-year/

That's vs 225m HDTVs. If the number of units gets high enough and there's enough people with 20Mbps sustained broadband who also have 4K displays then it'll start to take off. I don't think it will become the standard for a while though.
post #34 of 50
Quote:
Originally Posted by Marvin View Post

The increases have been in order to reach the limits of the media we use. We used to have black and white displays, then 256 colors, thousands, millions and we stopped there. We used to have really low resolution targets below 320x240 and it increased gradually (Finding Nemo in 2003 was rendered at 1600x900 and had to be redone for HD - http://www.tested.com/art/movies/449542-finding-nemo-3d-interview/ ).

When it comes to video, there are limits to the hardware:

http://www.northlight-images.co.uk/article_pages/guest/physical_limits.html

"For a typical lens set at f/10, the calculation shows that 2000 lines [The word "lines" refers here to cycles of a sine wave or line pairs.] in the vertical dimension is the most that could ever be resolved at f/10 without noticeable losses in contrast. This is significant because a vertical resolution of 2000 corresponds to 21-24 megapixels, a value that already is reached by state-of-the-art cameras. This implies that to make full use of the sensor capabilities of cameras already in production requires the use of apertures less than about f/10. And since the quality of the optics limit the resolution of most lenses today when apertures are much larger than f/5.6 or f/8, this suggests that there are steeply diminishing returns for sensors of more than about 25-35 megapixels for a 35mm camera, a limit that will apply until there are some breakthroughs in optics that can perform better at f/4 than they do at f/10."

They might try to overcome this with techniques used in astronomy but it comes down to consumer demand and willingness for the whole pipeline to adopt the extra data.
I think developers gave up on RAM efficiency a while ago. I've seen Safari use a good few GBs of memory. That's not really a good thing though and they shouldn't do it on purpose. No major developer would do it on purpose.
2.7 million UHDTVs sold in 2013, 'industry sources' think 26m this year:

http://hometheaterreview.com/10x-more-ultra-hd-tvs-will-be-sold-this-year/

That's vs 225m HDTVs. If the number of units gets high enough and there's enough people with 20Mbps sustained broadband who also have 4K displays then it'll start to take off. I don't think it will become the standard for a while though.

Can you sum that up as to why throughout the rest of human existence there will never be 256GiB of RAM available for a shipping "PC" that is not designed solely as a server?

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply
post #35 of 50
Quote:
Originally Posted by Marvin View Post
 
Other things that have limits are display colors and audio quality. Human senses have limits and that's where the demand comes from. If there's no perceptible improvement in a technology change then the demand goes away. We should have moved to higher quality colors (16-bpp or 12-bpp minimum) but the demand just isn't there, people are content with 8-bpp.

 

There are so many aspects of current TV technology that really need improvement, yet manufacturers ignore them and go for resolution because it's an easy sell. Thus we get more pixels instead of more chroma bandwidth or wider dynamic range or just fewer compression artifacts.

Lorin Schultz (formerly V5V)
Audio Engineer
V5V Digital Media, Vancouver, BC Canada
post #36 of 50
Quote:
Originally Posted by Marvin View Post
 
We base it on our own limits. The limit of audio quality is what the human ear can distinguish. The limit in color accuracy and resolution is what the eye can distinguish. The amount of memory needed is based on the limited content that needs to be processed.

 

But those limits have no effect whatsoever on what a manufacturer can SELL. I can introduce you to professional audio engineers who will swear that higher sampling rates offer benefits beyond raising the upper frequency limit higher than what any human could ever perceive, even though the math completely and irrefutably demonstrates otherwise.

 

As long as people continue to believe the impossible, pointless specsmanship will sell products.
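For reference, the math being alluded to is the Nyquist limit: a sample rate of fs can only represent frequencies up to fs/2, while human hearing tops out near 20 kHz. A two-line illustration:

for fs in (44100, 96000, 192000):
    print("%6d Hz sampling -> %5.1f kHz ceiling" % (fs, fs / 2 / 1000))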

Lorin Schultz (formerly V5V)
Audio Engineer
V5V Digital Media, Vancouver, BC Canada
post #37 of 50
Quote:
Originally Posted by digitalclips View Post

Meanwhile I wish Apple and ATI would work on the dual-GPU access at the core level (Central Dispatch, Grand Central... how about King's Cross Station as a name?) even if a rewritten application like FCPro is the better route. At least let all the other 64-bit applications that would benefit get an increase ASAP. It is sad to see an i7 iMac besting a 4-core nMP in Geekbench tests that have no idea what to do with the new architecture (or so it seems to me... not my area of expertise; if this is not the case I am all ears).

 

There are websites that show AMD's Crossfire working on the new Mac Pro when booted into Windows.

 

Reminds me of the early days of dual processor Macs before OS X and symmetric multiprocessing.  Applications had to be written specifically just to see the other CPU.  For example, Photoshop required certain plugins to enable it to use both CPUs.  Did the OS itself (System 7-9) even use both CPUs at that time?

 

What about having to manually set memory usage for each application in System 9 and older by going into Get Info and changing the number?  Set the number too low and the application would run out of memory.  Set it too high and the memory is wasted and unavailable to other applications.

 

You may think such concepts seem silly today, but you wouldn't believe the number of people on Mac discussion sites at the time dismissing these practices, sometimes defending them.  Who knows, some of those same people might be on this site, dismissing and defending certain Mac, Mac OS or iOS limitations that exist today.


Edited by Haggar - 3/15/14 at 10:30pm
post #38 of 50
Quote:
Originally Posted by SolipsismX View Post

Can you sum that up as to why throughout the rest of human existence there will never be 256GiB of RAM available for a shipping "PC" that is not designed solely as a server?

Availability is different from need but I implied larger sizes wouldn't be offered (mainly from Apple - standard PCs are used as servers so they might have some crossover) and it's based on a couple of things. Demand drives the prices to a certain point and determines manufacturers' options. Without the ever increasing demand, the prices won't keep dropping. If there reaches a point where Apple ships machines with 128GB at affordable prices, why would they offer a 256GB upgrade? It would be solely based around the possibility of someone buying it. I don't think that any buyer will ever feel constrained by 128GB and if no one uses the 256GB option, Apple will stop offering it (they didn't even offer 128GB in the old Mac Pro). You don't for example get to choose the RAM in your iOS device mainly because it stops being a consideration and that will happen with desktops. Apple has already started cutting RAM options from some of the Mac line. Intel limits their CPUs to certain RAM levels too (i7 chips max out at 32GB just now). We might even reach a point where the memory is stuck to the CPU:

http://www.techspot.com/news/52003-future-nvidia-volta-gpu-has-stacked-dram-offers-1tb-s-bandwidth.html

Intel is pushing their integrated graphics so if they have a 32GB RAM limit and the density allows it, why not stick it right onto the chip? Then you get unified memory with very high bandwidth and they don't need an EDRAM cache for Iris Pro. They'd sell their Atoms and other entry chips with 8GB, i5s with 16GB and i7s with 32GB. This keeps certain kinds of buyers always going for the higher-end CPUs.

An alternative I mentioned is if the industry goes with non-volatile memory and uses it both for storage and RAM. That would easily exceed 128GB because people need to use more than 128GB for storage but the interconnect might limit the possibilities here. If RAM is solely for process memory then I don't see it being offered in such large amounts.
Quote:
Originally Posted by Lorin Schultz 
those limits have no effect whatsoever on what a manufacturer can SELL. I can introduce you to professional audio engineers who will swear that higher sampling rates offer benefits beyond raising the upper frequency limit higher than what any human could ever perceive, even though the math completely and irrefutably demonstrates otherwise.

As long as people continue to believe the impossible, pointless specsmanship will sell products.

That's true in some cases and perhaps it will happen to an extent with RAM but I don't see it overshooting 128GB. I don't see entry machines ever being offered with more than 16GB and I think all manufacturers will try to solder the RAM in.
post #39 of 50
Quote:
Originally Posted by Marvin View Post

The Great Gatsby (2013) was shot at 5K so I'd guess it was mastered to 4K and it used photorealistic backdrops rendered with Pixar's software:


 

Thanks, that video was a real eye-opener (as to how often graphics are actually used).

post #40 of 50
Quote:
Originally Posted by Marvin View Post


The increases have been in order to reach the limits of the media we use. We used to have black and white displays, then 256 colors, thousands, millions and we stopped there. We used to have really low resolution targets below 320x240 and it increased gradually (Finding Nemo in 2003 was rendered at 1600x900 and had to be redone for HD - http://www.tested.com/art/movies/449542-finding-nemo-3d-interview/ ).

When it comes to video, there are limits to the hardware:

http://www.northlight-images.co.uk/article_pages/guest/physical_limits.html

"For a typical lens set at f/10, the calculation shows that 2000 lines [The word "lines" refers here to cycles of a sine wave or line pairs.] in the vertical dimension is the most that could ever be resolved at f/10 without noticeable losses in contrast. This is significant because a vertical resolution of 2000 corresponds to 21-24 megapixels, a value that already is reached by state-of-the-art cameras. This implies that to make full use of the sensor capabilities of cameras already in production requires the use of apertures less than about f/10. And since the quality of the optics limit the resolution of most lenses today when apertures are much larger than f/5.6 or f/8, this suggests that there are steeply diminishing returns for sensors of more than about 25-35 megapixels for a 35mm camera, a limit that will apply until there are some breakthroughs in optics that can perform better at f/4 than they do at f/10."

They might try to overcome this with techniques used in astronomy but it comes down to consumer demand and willingness for the whole pipeline to adopt the extra data.
I think developers gave up on RAM efficiency a while ago. I've seen Safari use a good few GBs of memory. That's not really a good thing though and they shouldn't do it on purpose. No major developer would do it on purpose.
2.7 million UHDTVs sold in 2013, 'industry sources' think 26m this year:

http://hometheaterreview.com/10x-more-ultra-hd-tvs-will-be-sold-this-year/

That's vs 225m HDTVs. If the number of units gets high enough and there's enough people with 20Mbps sustained broadband who also have 4K displays then it'll start to take off. I don't think it will become the standard for a while though.

 

 

People aren't content with 8bpp. People don't know jack about pixel depth per channel. People see asinine pricing for 4K and walk away. The industry is caught yearly in price-fixing fraud cases, whether for RAM, LED/LCD panels, or audio chips, you name it.

 

Price is the singularity in selling electronics. 4K becoming common allows studios to jump to 8K and beyond. A company that can produce a quality 16-bit-per-channel 4K panel for home theaters at 50" diagonal for $1500 would own the industry.

 

Here's the problem: There are how many major panel manufacturers left? LG, Samsung, SHARP [Samsung], Panasonic, SONY, Toshiba, Hitachi, etc

 

Nice list: http://www.avsforum.com/t/1060466/list-of-lcd-panel-manufactures-the-panel-behind-the-brand

 


LG, Samsung busted for billions in collusion. SHARP bankrupt, if not for Samsung. And Toshiba is high on crack with this kind of pricing:

 

http://www.toshiba.com/us/tv/4k/84l9300u

 

Toshiba 84L9300U 84" Class 4K LED TV: List Price $19,999.99, Current Price $16,999.99.

 

They are pricing 4K like they did in 1998 when HDTV true 1080p panels showed up at around $25k and people laughed their asses off. Then we had US Senate hearings and heard about all the sorrow from Manufacturers needing to recoup their R&D, never mind they get tax subsidies.

Sorry, but with the advent of 4K hitting big in motion pictures and more, these companies had better bring pricing down to sane levels or you will see companies like Apple creating a subsidiary [joint venture] with other big players and producing their own.
 

FWIW: With 10GbE going in on 3rd-party clone motherboards for PCs and Thunderbolt 2 out at 20Gbps on current Intel PC clones, and later, with most major markets offering 20Mbps-40Mbps, never mind 100Mbps FiOS and beyond, you won't have a problem with demand.

 

Here's the real reality: Sony/Panasonic releasing a 300GB-1TB archive-format disc to replace Blu-ray means disc catalogue sales will be introduced.

 

http://www.escapistmagazine.com/news/view/132786-Sony-Panasonic-Reveal-Archival-Disc-300GB-to-1TB-Optical-Disc

 

 

 

This won't flop. This won't be Blu-ray all over again. A whole series on 1 disc, Aliens on that same disc, etc. Studios can create Marvel release discs, etc.


Edited by mdriftmeyer - 3/16/14 at 4:11am