Transcend's 32GB RAM modules for Apple's Mac Pro double max memory to 128GB


Comments

  • Reply 41 of 59
    Quote:
    Originally Posted by reroll View Post



    And you can use even more RAM than that for scientific computing (biology, astronomy, ...). RAM is in general useful when working on lots of data, because the OS uses it to cache disk files. So if you have a lot of small files that you need to access frequently, it will speed things up a lot.

     

    Yes, my data files are often over 40 GB in size. And I generally need to be working on 2 or more of them at a time. The SSD does speed things up, just in terms of reading the files into RAM!

  • Reply 42 of 59
    Marvin Posts: 15,326, moderator
    solipsismx wrote: »
    If 32GiB sticks will be available in 2014 I can't imagine going the rest of my life — hopefully past 2100 CE — and never seeing that exceeded, especially when I consider how much it's increased in just the past 25 years.

    The increases have been in order to reach the limits of the media we use. We used to have black and white displays, then 256 colors, thousands, millions and we stopped there. We used to have really low resolution targets below 320x240 and it increased gradually (Finding Nemo in 2003 was rendered at 1600x900 and had to be redone for HD - http://www.tested.com/art/movies/449542-finding-nemo-3d-interview/ ).

    When it comes to video, there are limits to the hardware:

    http://www.northlight-images.co.uk/article_pages/guest/physical_limits.html

    "For a typical lens set at f/10, the calculation shows that 2000 lines [The word "lines" refers here to cycles of a sine wave or line pairs.] in the vertical dimension is the most that could ever be resolved at f/10 without noticeable losses in contrast. This is significant because a vertical resolution of 2000 corresponds to 21-24 megapixels, a value that already is reached by state-of-the-art cameras. This implies that to make full use of the sensor capabilities of cameras already in production requires the use of apertures less than about f/10. And since the quality of the optics limit the resolution of most lenses today when apertures are much larger than f/5.6 or f/8, this suggests that there are steeply diminishing returns for sensors of more than about 25-35 megapixels for a 35mm camera, a limit that will apply until there are some breakthroughs in optics that can perform better at f/4 than they do at f/10."

    They might try to overcome this with techniques used in astronomy but it comes down to consumer demand and willingness for the whole pipeline to adopt the extra data.
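
    As a quick sanity check of the arithmetic in that quote (my own back-of-the-envelope numbers, assuming the usual two pixels per line pair and a 3:2 full-frame aspect ratio):

```python
# Rough check of the "2000 lines ~ 21-24 megapixels" claim from the quote.
# Assumptions are mine, not the article's: 2 pixels per line pair (Nyquist)
# and a 3:2 aspect ratio for a full-frame 35mm sensor.
line_pairs_vertical = 2000
pixels_per_line_pair = 2                # one pixel per half-cycle
aspect_ratio = 3 / 2                    # width : height

vertical_px = line_pairs_vertical * pixels_per_line_pair    # 4000
horizontal_px = int(vertical_px * aspect_ratio)             # 6000
megapixels = vertical_px * horizontal_px / 1e6              # 24.0

print(f"{vertical_px} x {horizontal_px} = {megapixels:.0f} MP")
```

    So 2000 resolvable line pairs vertically works out to about 24 megapixels, right at the top of the quoted range.
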
    solipsismx wrote: »
    If large amounts of cheap, low-power RAM become so abundant that developers don't have to worry about making applications more RAM efficient, then they won't, and as a result we'll need to use more RAM.

    I think developers gave up on RAM efficiency a while ago. I've seen Safari use a good few GBs of memory. That's not really a good thing though, and no major developer would do it on purpose.
    solipsismx wrote: »
    There is a history of technological changes where people in the previous generation said something wouldn't ever exist because it's not practical. We'll see 4K video being streamed as the standard to our homes even though not too long ago it didn't exist at all.

    2.7 million UHDTVs sold in 2013, 'industry sources' think 26m this year:

    http://hometheaterreview.com/10x-more-ultra-hd-tvs-will-be-sold-this-year/

    That's vs 225m HDTVs. If the number of units gets high enough and there's enough people with 20Mbps sustained broadband who also have 4K displays then it'll start to take off. I don't think it will become the standard for a while though.
  • Reply 43 of 59
    solipsismx Posts: 19,566, member
    Marvin wrote: »
    [quoting Marvin's post above in full]

    Can you sum that up as to why throughout the rest of human existence there will never be 256GiB of RAM available for a shipping "PC" that is not designed solely as a server?
  • Reply 44 of 59
    Quote:

    Originally Posted by Marvin View Post

     
    Other things that have limits are display colors and audio quality. Human senses have limits and that's where the demand comes from. If there's no perceptible improvement in a technology change then the demand goes away. We should have moved to higher quality colors (16-bpp or 12-bpp minimum) but the demand just isn't there, people are content with 8-bpp.


     

    There are so many aspects of current TV technology that really need improvement, yet manufacturers ignore them and go for resolution because it's an easy sell. Thus we get more pixels instead of more chroma bandwidth or wider dynamic range or just fewer compression artifacts.
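
    To put rough numbers on that tradeoff (my own illustrative arithmetic, not figures from the thread): improving a 1080p signal from 8-bit 4:2:0 to 10-bit 4:4:4 costs far less raw bandwidth than quadrupling the pixel count to 2160p at the same 8-bit 4:2:0.

```python
# Illustrative uncompressed-bandwidth comparison (assumed figures, 60 fps):
# spending bits on colour fidelity at 1080p vs. on pixel count at 2160p.
def raw_gbps(width, height, fps, bits_per_sample, samples_per_pixel):
    """Uncompressed video data rate in gigabits per second."""
    return width * height * fps * samples_per_pixel * bits_per_sample / 1e9

# samples_per_pixel: 4:2:0 averages 1.5 (full-res luma, quarter-res chroma);
# 4:4:4 carries all three channels at full resolution (3.0).
baseline  = raw_gbps(1920, 1080, 60, 8,  1.5)   # 1080p60, 8-bit 4:2:0   ~1.5 Gbps
better_hd = raw_gbps(1920, 1080, 60, 10, 3.0)   # 1080p60, 10-bit 4:4:4  ~3.7 Gbps
uhd_same  = raw_gbps(3840, 2160, 60, 8,  1.5)   # 2160p60, 8-bit 4:2:0   ~6.0 Gbps

print(f"1080p 8-bit 4:2:0  : {baseline:.1f} Gbps")
print(f"1080p 10-bit 4:4:4 : {better_hd:.1f} Gbps")
print(f"2160p 8-bit 4:2:0  : {uhd_same:.1f} Gbps")
```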

  • Reply 45 of 59
    Quote:

    Originally Posted by Marvin View Post

     
    We base it on our own limits. The limit of audio quality is what the human ear can distinguish. The limit in color accuracy and resolution is what the eye can distinguish. The amount of memory needed is based on the limited content that needs to be processed.


     

    But those limits have no effect whatsoever on what a manufacturer can SELL. I can introduce you to professional audio engineers who will swear that higher sampling rates offer benefits beyond raising the upper frequency limit higher than what any human could ever perceive, even though the math completely and irrefutably demonstrates otherwise.

     

    As long as people continue to believe the impossible, pointless specsmanship will sell products.
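
    The math being alluded to is presumably the Nyquist-Shannon sampling theorem: a sample rate fs can represent any frequency below fs/2, and human hearing tops out around 20 kHz, so even 44.1 kHz already covers the audible band. A trivial sketch of the numbers:

```python
# Nyquist in numbers: a sample rate fs captures everything below fs/2,
# while human hearing is commonly taken to end around 20 kHz.
HEARING_LIMIT_HZ = 20_000

for fs in (44_100, 48_000, 96_000, 192_000):
    nyquist = fs / 2
    headroom = nyquist - HEARING_LIMIT_HZ
    print(f"{fs/1000:>5.1f} kHz sampling -> captures up to {nyquist/1000:.2f} kHz "
          f"({headroom/1000:+.2f} kHz beyond the audible limit)")
```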

  • Reply 46 of 59
    haggar Posts: 1,568, member
    Quote:



    Originally Posted by digitalclips View Post



    Meanwhile I wish Apple and ATI would work on dual GPU access at the core level (Central Dispatch, Grand Central ... how about King's Cross Station as a name?) even if a rewritten application like FCPro is the better route. At least let all the other 64-bit applications that would benefit get an increase ASAP. It is sad to see an i7 iMac besting a 4-core nMP in Geekbench tests that have no idea what to do with the new architecture (or so it seems to me ... not my area of expertise; if this is not the case I am all ears).

     

    There are websites that show AMD's Crossfire working on the new Mac Pro when booted into Windows.

     

    Reminds me of the early days of dual processor Macs before OS X and symmetric multiprocessing.  Applications had to be written specifically just to see the other CPU.  For example, Photoshop required certain plugins to enable it to use both CPUs.  Did the OS itself (System 7-9) even use both CPUs at that time?

     

    What about having to manually set memory usage for each application in System 9 and older by going into Get Info and changing the number?  Set the number too low and the application would run out of memory.  Set it too high and the memory is wasted and unavailable to other applications.

     

    You may think such concepts seem silly today, but you wouldn't believe the number of people on Mac discussion sites at the time dismissing these practices, sometimes defending them.  Who knows, some of those same people might be on this site, dismissing and defending certain Mac, Mac OS or iOS limitations that exist today.

  • Reply 47 of 59
    Marvin Posts: 15,326, moderator
    solipsismx wrote: »
    Can you sum that up as to why throughout the rest of human existence there will never be 256GiB of RAM available for a shipping "PC" that is not designed solely as a server?

    Availability is different from need, but I implied larger sizes wouldn't be offered (mainly by Apple - standard PCs are used as servers so they might have some crossover), and it's based on a couple of things. Demand drives the prices to a certain point and determines manufacturers' options. Without ever-increasing demand, the prices won't keep dropping.

    If we reach a point where Apple ships machines with 128GB at affordable prices, why would they offer a 256GB upgrade? It would be based solely on the possibility of someone buying it. I don't think any buyer will ever feel constrained by 128GB, and if no one uses the 256GB option, Apple will stop offering it (they didn't even offer 128GB in the old Mac Pro). You don't, for example, get to choose the RAM in your iOS device, mainly because it stops being a consideration, and that will happen with desktops. Apple has already started cutting RAM options from some of the Mac line. Intel limits their CPUs to certain RAM levels too (i7 chips max out at 32GB just now).

    We might even reach a point where the memory is stuck to the CPU:

    http://www.techspot.com/news/52003-future-nvidia-volta-gpu-has-stacked-dram-offers-1tb-s-bandwidth.html

    Intel is pushing their integrated graphics so if they have a 32GB RAM limit and the density allows it, why not stick it right onto the chip? Then you get unified memory with very high bandwidth and they don't need an EDRAM cache for Iris Pro. They'd sell their Atoms and other entry chips with 8GB, i5s with 16GB and i7s with 32GB. This keeps certain kinds of buyers always going for the higher-end CPUs.

    An alternative I mentioned is if the industry goes with non-volatile memory and uses it both for storage and RAM. That would easily exceed 128GB because people need to use more than 128GB for storage but the interconnect might limit the possibilities here. If RAM is solely for process memory then I don't see it being offered in such large amounts.
    those limits have no effect whatsoever on what a manufacturer can SELL. I can introduce you to professional audio engineers who will swear that higher sampling rates offer benefits beyond raising the upper frequency limit higher than what any human could ever perceive, even though the math completely and irrefutably demonstrates otherwise.

    As long as people continue to believe the impossible, pointless specsmanship will sell products.

    That's true in some cases and perhaps it will happen to an extent with RAM but I don't see it overshooting 128GB. I don't see entry machines ever being offered with more than 16GB and I think all manufacturers will try to solder the RAM in.
  • Reply 48 of 59
    ascii Posts: 5,936, member
    Quote:

    Originally Posted by Marvin View Post



    The Great Gatsby (2013) was shot at 5K so I'd guess it was mastered to 4K and it used photorealistic backdrops rendered with Pixar's software:

    [embedded video]

    Thanks, that video was a real eye-opener (as to how often graphics are actually used).

  • Reply 49 of 59
    mdriftmeyer Posts: 7,503, member
    Quote:
    Originally Posted by Marvin View Post

    [quoting Marvin's post above in full]

     

     

    People aren't content with 8bpp. People don't know jack about pixel depth per channel. People see asinine pricing for 4K and walk away. The industry is caught yearly in price-fixing fraud cases, whether for RAM, LED/LCD panels, or audio chips, you name it.

     

    Price is the singularity in selling electronics. 4K becoming common allows studios to jump to 8K and beyond. A company that could produce a quality 16-bit-per-channel 4K panel for home theaters at $1500 for a 50" diagonal would own the industry.

     

    Here's the problem: There are how many major panel manufacturers left? LG, Samsung, SHARP [Samsung], Panasonic, SONY, Toshiba, Hitachi, etc

     

    Nice list: http://www.avsforum.com/t/1060466/list-of-lcd-panel-manufactures-the-panel-behind-the-brand

     



    LG, Samsung busted for billions in collusion. SHARP bankrupt, if not for Samsung. And Toshiba is high on crack with this kind of pricing:

     

    http://www.toshiba.com/us/tv/4k/84l9300u

     

    Toshiba 84L9300U 84" Class 4K LED TV: List Price $19,999.99, Current Price $16,999.99.

     

    They are pricing 4K like they did in 1998 when HDTV true 1080p panels showed up at around $25k and people laughed their asses off. Then we had US Senate hearings and heard about all the sorrow from Manufacturers needing to recoup their R&D, never mind they get tax subsidies.



    Sorry, but with the advent of 4K hitting big in motion pictures and more, these companies had better bring pricing down to sane levels or you will see companies like Apple creating a subsidiary [joint venture] with other big players and producing their own.

     

    FWIW: With 10GbE going onto third-party PC motherboards, Thunderbolt 2 out at 20Gbps on current Intel machines, and most major markets offering 20-40Mbps broadband, never mind 100Mbps FiOS and beyond, you won't have a problem with demand.

     

    Here's the real reality: Sony/Panasonic are releasing a 300GB-1TB Archival Disc format to replace Blu-ray, which means disc catalogue sales will be introduced.

     

    http://www.escapistmagazine.com/news/view/132786-Sony-Panasonic-Reveal-Archival-Disc-300GB-to-1TB-Optical-Disc

     

     

     

    This won't flop. This won't be Blu-ray all over again. An entire film series on one disc, Aliens on that same disc, etc. Studios can create Marvel release discs, and so on.
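
    For a rough sense of scale (my own assumed figures, not from the announcement): at something like 80 Mbps for a 4K stream, a two-hour feature is on the order of 70 GB, so even the 300GB entry capacity could plausibly hold a small film series, and the 1TB version a dozen or more.

```python
# Rough capacity check for the proposed Archival Disc.
# The 80 Mbps 4K bit rate and two-hour runtime are assumptions on my part.
def movie_size_gb(hours, mbps):
    """Approximate size in GB of a stream at `mbps` megabits per second."""
    return hours * 3600 * mbps / 8 / 1000   # Mbit -> MB -> GB

feature_gb = movie_size_gb(hours=2, mbps=80)        # ~72 GB per film
for disc_gb in (300, 500, 1000):
    print(f"{disc_gb:>4} GB disc: ~{int(disc_gb // feature_gb)} two-hour 4K features "
          f"(~{feature_gb:.0f} GB each)")
```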

  • Reply 50 of 59
    Marvin Posts: 15,326, moderator
    ascii wrote: »
    Thanks, that video was a real eye-opener (as to how often graphics are actually used).

    There's a studio here that does a lot of films, and they show VFX breakdowns for a few of them (click on the movie and then the breakdown button):

    http://www.rsp.com.au/portfolio.htm#

    Another studio, Atomic Fiction, uses iMacs and MBPs to do effects work and then pushes it to different cloud rendering farms:

    [embedded video]

    When the Mac Pro was being shown off at WWDC, some people had concerns about the 64GB of memory:

    http://community.thefoundry.co.uk/discussion/topic.aspx?f=34&t=77930&page=1

    "Honestly Jack, "just" 12-cores (we got E5 processors with 8 now, so it could have 16), and 4 memory dimms, even if we get 16 dimms, it´s still just a 64 ram machine, working nuke with 4k, that memory will be gone in 30 frames and 5 nodes, using agressive caching. "

    Jack Greasley who was on stage at WWDC showing off MARI replied there:

    "I've been working on one for the last three weeks. It is the best off the shelf performance I have ever seen.
    Don't assume anything."

    and there's a VFX guy here:

    http://forums.thefoundry.co.uk/phpBB2/viewtopic.php?t=8421
    http://www.imdb.com/name/nm0754146/

    "I just got a new linux box with hex core, 64GB ram and SSD.

    Best machine I ever had for Nuke. With the SSD I can run 4k dpxfiles in real time in Nuke (caching can't even keep up with the playback in 2k). As for RAM, you can never have enough in Nuke"

    That's the thing with SSDs now: they're getting close to old RAM speeds. The memory bandwidth of the old G5 towers could be as low as 2-3GB/s. With PCIe SSDs at ~1GB/s, a 1TB SSD is almost like having 1TB of RAM. It's pretty slow RAM, but fast enough to stream large amounts of data from.
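
    Rough numbers behind that (my own frame-size assumptions): a 4K 10-bit DPX frame packs three 10-bit channels into 32 bits per pixel, roughly 35 MB per frame, so 24 fps playback needs on the order of 850 MB/s sustained, which is right at the limit of a ~1GB/s PCIe SSD.

```python
# Why a ~1 GB/s PCIe SSD can just about stream 4K DPX in real time.
# Frame-size assumptions are mine: 4096x2160, 10-bit RGB packed into 32 bits/pixel.
width, height = 4096, 2160
bytes_per_pixel = 4        # three 10-bit channels packed into a 32-bit word
fps = 24

frame_mb = width * height * bytes_per_pixel / 1e6    # ~35 MB per frame
throughput_mb_s = frame_mb * fps                      # ~850 MB/s sustained

print(f"Frame size : {frame_mb:.0f} MB")
print(f"24 fps rate: {throughput_mb_s:.0f} MB/s (vs ~1000 MB/s for a PCIe SSD)")
```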

    Another VFX artist uses 128GB-256GB in 24-core machines - they do fluid sims and other effects work at 4K, 5K and 6K:

    http://www.imdb.com/name/nm1436046/
    http://www.linkedin.com/in/dekekincaid

    Here you can see him working on a MacBook Pro:

    [embedded video]

    He was jumping between 5K, 2K, 1080p and 3D files all on the MBP, which can only take 16GB of RAM, while screen recording at the same time, which slowed things down.

    The motivation for more memory seems to be a bigger data cache, but ideally real-time processing shouldn't even need a cache. For the instances where it is needed, RAM is faster than storage, but if the storage is fast enough, that's fine too. An MBP with a 1TB SSD can cache frames to the drive.
    A company that can produce a quality 16bit per channel, 4K Panel for Home Theaters for $1500 at 50" diagonal would own the industry.

    It certainly seems like the manufacturers are just experimenting to see what margins they can get away with. I can't imagine the unit volumes would be too high at the prices they sell at just now.

    OLED looks great for this:

    [embedded video]

    Lightweight, deep black levels, accurate enough colors, fast refresh rate. It's just the price. It would be good if Apple could make 4K OLED displays suitable for the iMacs, Thunderbolt displays and possibly TVs with laminated anti-glare glass and metal backing. The guy in the video mentioned that they wouldn't bring out the monitors until they could sell them cheaper than a car but if it's just down to the supplier then it's not necessarily build costs.
  • Reply 51 of 59
    haggar Posts: 1,568, member
    Quote:
    Originally Posted by Lorin Schultz View Post

     

    There are so many aspects of current TV technology that really need improvement, yet manufacturers ignore them and go for resolution because it's an easy sell. Thus we get more pixels instead of more chroma bandwidth or wider dynamic range or just fewer compression artifacts.


     

     

    Quote:
    Originally Posted by mdriftmeyer View Post

     

    People aren't content with 8bpp. People don't know jack about pixel depth per channel. People see asinine pricing for 4K and walk away. The industry is caught yearly with price fixing fraud cases, whether they be for RAM, LED/LCD panels, to audio chips, you name it.

     


     

    Do Mac users have short memory or selective memory?

     

    http://web.archive.org/web/20070409070320/http://www.apple.com/macbookpro/graphics.html

     

    http://forums.appleinsider.com/t/74635/apple-hit-with-class-action-suit-over-macbook-macbook-pro-displays

     

    Surely cost could not be a reason for Apple using lower quality and less accurate display panels in those Macbook Pros, because Mac users are supposed to be willing to pay more for higher quality parts.  And you can't blame the LCD manufacturer for selling Apple those low quality screens because those same manufacturers also make higher quality panels which Apple could have used at that time.

  • Reply 52 of 59
    solipsismx Posts: 19,566, member
    haggar wrote: »

    Your comments back up their comments, not disprove them. "A pair of owners" is not the general public usage of "people" that @mdriftmeyer used. "People" don't know about pixel depth per channel.
  • Reply 53 of 59
    haggar Posts: 1,568, member
    Quote:
    Originally Posted by SolipsismX View Post





    Your comments back up their comments, not disprove them. "A pair of owners" is not the general public usage of "people" that @mdriftmeyer used. "People" don't know about pixel depth per channel.

     

    Does that mean Apple was betting on the general public's lack of knowledge about display quality in order to get away with selling Macbook Pros with the inferior display panels, and marketing them as "a nuanced view simply unavailable on other portables"?  Or betting on pro-Apple users on sites such as AppleInsider to come to Apple's defense?

  • Reply 54 of 59
    solipsismx Posts: 19,566, member
    haggar wrote: »
    Does that mean Apple was betting on the general public's lack of knowledge about display quality in order to get away with selling Macbook Pros with the inferior display panels, and marketing them as "a nuanced view simply unavailable on other portables"?

    Inferior to what other consumer machine in its class?
  • Reply 55 of 59
    Quote:

    Originally Posted by Haggar View Post

    [quoting Haggar's post above in full]


     

    I'm sorry, I don't understand the connection between what I wrote and your comment. Can you elaborate?

  • Reply 56 of 59
    MacPro Posts: 19,728, member

    [quoting mdriftmeyer's post above in full]

    That's fascinating news ... thanks, I had not heard of this.

    For archival purposes I can see this, but I am dubious about them as a method of selling games etc., given the trend to streaming as the internet (in theory) gets cheaper and faster. I currently use 1TB bare HDs for archival and they are dirt cheap these days, but bulky obviously. I wonder what the cost of the burner and the burning speed will be. I have been through so many 'archival systems' in the last 35 years that I have lost count. The problem is always that, one day, the device to play them back is obsolete ... anybody want a SCSI DLT and 100 tapes?
  • Reply 57 of 59
    mdriftmeyer Posts: 7,503, member
    Quote:

    Originally Posted by digitalclips View Post

    [quoting digitalclips' post above in full]

     

    An HD will fail long before this format. Games with gigabytes of textures, streaming across Thunderbolt 2/2+ from an Archival Disc [there will have to be an integrated SSD installed], will resolve your concerns.

  • Reply 58 of 59
    The Transcend 32GB RAM is only 1333MHz, vs. the 1866MHz memory Apple ships, so there is a minor tradeoff of sorts...
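
    To put a rough number on that tradeoff (my own arithmetic, assuming standard DDR3 figures of 8 bytes per transfer per 64-bit channel and the Mac Pro's four memory channels): DDR3-1333 peaks at about 10.7 GB/s per channel versus 14.9 GB/s for DDR3-1866, so the slower modules give up roughly 17 GB/s of theoretical quad-channel bandwidth.

```python
# Peak theoretical DDR3 bandwidth: transfers/s x 8 bytes per 64-bit channel.
# The channel count of 4 is an assumption based on the Mac Pro's Xeon E5.
def peak_gb_s(mt_per_s, channels=1, bytes_per_transfer=8):
    return mt_per_s * 1e6 * bytes_per_transfer * channels / 1e9

for speed in (1333, 1866):
    print(f"DDR3-{speed}: {peak_gb_s(speed):.1f} GB/s per channel, "
          f"{peak_gb_s(speed, channels=4):.1f} GB/s quad-channel")
```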
  • Reply 59 of 59
    tallest skil Posts: 43,388, member
    Originally Posted by libertyforall View Post

    The Transcend 32GB RAM is only 1333MHz, vs. the 1866MHz memory Apple ships, so there is a minor tradeoff of sorts...

     

    Yowza. Can it even be used safely?
