Blu-ray Disc (Next Generation DVD)


Comments

  • Reply 21 of 43
    The Blu-ray tech can put out 30Mb per second. This is in line with its HDTV capabilities. Likely a separate red laser assembly could be employed for backwards compatibility. This standard seems to be geared toward HDTV, which is getting more popular in the consumer market but has a long way to go. Blu-ray may end up going into computers first, but the current DVDs will go on for quite a while yet.



    I have no doubt it will become a consumer standard eventually, but there just aren't enough people with HDTV sets yet, and using Blu-ray for 13 hours of standard-definition video doesn't make a lot of sense for any home application. None of the manufacturers have announced any hardware yet, but getting the standard together is the first step.
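


    For what it's worth, that 13-hour figure roughly checks out. Here's a quick back-of-the-envelope sketch; the per-layer capacity and the SD bitrate are my own ballpark guesses, not numbers from any announcement:

    # Rough check: hours of standard-definition video on one Blu-ray layer.
    # Assumed figures (not from any announcement): ~25 GB per layer,
    # SD MPEG-2 averaging ~4 Mbit/s.
    capacity_bits = 25e9 * 8               # ~25 GB layer, in bits
    sd_bitrate = 4e6                       # ~4 Mbit/s average SD stream
    hours = capacity_bits / sd_bitrate / 3600
    print(f"~{hours:.0f} hours of SD video")   # prints "~14 hours of SD video"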



    As I understand it, the FMD discs have hit a funding snag.
  • Reply 22 of 43
    rickag Posts: 1,626 member
    For Matsu's sake, I hope all these heavyweight manufacturers bring Motorola on board for processors; then he could breathe easier, since the rollout of this would be delayed indefinitely.
  • Reply 23 of 43
    Matsu Posts: 6,558 member




    The only reason I object to these new technologies is that they are supposed to be means of exchange. As a back-up system of storage they might be quite useful, though for general file sharing a DVD-RW and CD-RW are probably more useful. You can put music and video on them in formats you don't need a computer to enjoy. Archivists, knock yourselves out. Most consumers should be quite happy with 9+ GB (eventually) of DVD storage.



    As for hard-drive replacement, I don't think so. Hard drives keep getting bigger and faster, and research keeps pushing the limits of areal density. Notebook drives will push 100GB by year's end. Desktop drives will probably hit 200-300GB. Nah, I think that big, cheap hard drives plus a storage medium with consumer applications is the best formula for computers.
  • Reply 24 of 43
    addison Posts: 1,185 member
    This format has been discussed in home cinema circles for some time. It is primarily aimed at HDTV and will be capable of storing data which can be read by most existing players.



    Naturally it will be useful in other areas such as computer storage.
  • Reply 25 of 43
    Blu-ray will not find its way onto computers easily, or soon.



    Why? Blu-ray uses caddies! Just like the old DVD-RAM.



    Until.... Pioneer can make its SuperDrive a 3-laser device (CD laser + DVD laser + Blu-ray laser), and add to its form factor the ability to take discs and caddies, and price this under $500 (which is ****ing expensive if you ask me, $200 is decent heh).... you won't see it.



    SuperDrive 4, maybe. 2 being a sped-up version with support for DVD+RW (please please please please), and 3 being a laptop edition.
  • Reply 26 of 43
    moogs Posts: 4,296 member
    I tend to agree. If Blu-ray won't be in commercial electronics until next year, it's going to be at LEAST 18-24 months before we see it in a computer drive, and when we do I doubt it will be cheap. Talking $1000 or more here, the way the existing Pioneer DVR drive was when it first came out.
  • Reply 27 of 43
    Can you imagine how much these discs will cost??



    --Jay [Chilling]
  • Reply 28 of 43
    You are all missing the HUGE point of this.



    You will NOT be using it for a long time.



    DVDs were available in the early 90s - it's been nearly 10 years. They were "developed" probably before 1990.





    However you slice it, though, this new technology is in the same position the first DVDs were in - they started at about 4.7GB, and now DVDs are up to 18GB. So Blu-ray discs will be well over 100GB in the future.



    And they are made for HDTV, not for regular movies.



    Because if you watch your DVDs on your new HDTV set, you will have wasted your money on your TV.



    The point of this is first to allow people to buy and rent movies and material that works WITH their HDTV.



    After this comes out, we'll be seeing camcorders becoming HDTV-native, as well as computer editing systems (FireWire 2 will support data rates for HDTV).





    The future is HDTV, and this is the future. But don't expect it too soon - give it at least 5 more years before you can rent them at any Blockbuster you walk into.



    Andrew
  • Reply 29 of 43
    matsumatsu Posts: 6,558member
    Something of use from a pretty good DVD <a href="http://www.dvddemystified.com/" target="_blank">FAQ</a>:



    [quote]Will high-definition DVD or 720p DVD make current players and discs obsolete?



    Not for a long time. HD-DVD "technology demonstrations" being made by various companies do not mean that HD-DVD is around the corner (the demonstrations mean only that companies are busy jockeying for technology and patent positions in developing the future DVD format). Consider that U.S. HDTV was anticipated to be available in 1989, yet was not finalized until 1996, and did not appear until 1998. And has it made your current TV obsolete yet?



    HD-DVD (HD stands for both high-density and high-definition) may be available in 2003 at the very earliest, though 2006 is more likely. It will use blue or violet lasers to read smaller pits, increasing data capacity to around 20 GB per layer. MPEG-2 Progressive Profile--or perhaps another format such as H.263--will probably be used to encode the video. All ATSC and DVB formats will be supported, possibly with the addition of 1080p24. HD-DVD players will play current DVD discs and will make them look even better (with progressive-scan video and picture processing), but new HD-DVD discs won't be playable in older DVD players (unless one side is HD and the other standard DVD).



    Ironically, computers will support HDTV before settop players do, since 2x DVD-ROM drives coupled with appropriate playback and display hardware meet the 19 Mbps data rate needed for HDTV. This has led to various "720p DVD" projects, which use the existing DVD format to store video in 1280x720 resolution at 24 progressive frames per second. It's possible that 720p DVDs can be made compatible with existing players (which would only play the 480-line data).



    Note: The term HDVD has already been taken for "high-density volumetric display."



    Some have speculated that a "double-headed" player reading both sides of the disc at the same time could double the data rate or provide an enhancement stream for applications such as HDTV. This is currently impossible since the track spirals go in opposite directions (unless all four layers are used). The DVD spec would have to be changed to allow reverse spirals on layer 0. Even then, keeping both sides in sync, especially with MPEG-2's variable bit rate, would require independently tracking heads, precise track and pit spacing, and a larger, more sophisticated track buffer. Another option would be to use two heads to read both layers of one side simultaneously. This is technically feasible but has no advantage over reading one layer twice as fast, which is simpler and cheaper.

    <hr></blockquote>



    Clearly there's enough room in the current DVD for HDTV-quality playback. Using an HD (720p) DVD system based on the current spec in a 2x drive would make it possible to have discs encoded so that they are compatible with new HD players and older SD players.



    I think it's the best solution. 2x or greater drive mechanisms are plentiful and cheap. A way of producing dual-layer and dual-layer double-sided discs already exists. And it would be possible to encode in such a way that older drives would only read standard-def information, while newer drives would read high-def information. On the whole it would be cheaper than rolling out a whole new spec, especially since HDTV penetration is only expected to be about 30% of homes by the year 2006.



    What I'm impressed with is the relatively low bitrate of all DVDs. It would practically fit a USB carrier: in its compressed form, less than 20Mbps gives a high-def data stream under MPEG-2 compression, and SD needs only half as much. WOW, that's compression!
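


    To put rough numbers on the capacity side of that (the disc sizes and the 19 Mbps HD figure below are nominal, assumed values, so treat this as a sketch):

    # Approximate playing time for 720p HDTV (~19 Mbit/s, per the FAQ above)
    # on existing DVD capacities. Capacities are nominal, assumed values.
    hd_bitrate = 19e6                                    # bits per second

    for name, gigabytes in [("single layer", 4.7),
                            ("dual layer", 8.5),
                            ("dual layer, double sided", 17.0)]:
        minutes = gigabytes * 1e9 * 8 / hd_bitrate / 60
        print(f"{name}: ~{minutes:.0f} minutes of 720p")
    # single layer: ~33 min; dual layer: ~60 min;
    # dual layer, double sided: ~119 min -- roughly one feature film per disc.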
  • Reply 30 of 43
    [quote]Originally posted by Matsu:

    <strong>That's a fvcking rip. I hope we don't see anything for at least another 5 years. DVD is just getting widespread use. An audio format is finally coming and now people want to move to another disc entirely? WTF. These media should have at least decade long life spans. CD has had an acceptable life, DVD has just started and people want to retire it already? damn.</strong><hr></blockquote>





    The good news this time, Matsu - and I hope I got this right - is that the powers that be (read: Philips, Sony, Toshiba et al) seem to have actually agreed to a standard for a new RW blue laser before the sucker even comes out on the market.



    This is a good thing; it will avoid the moronic format wars which slowed down the adoption of DVD by at least 2 years.



    Now all we need is a real multichannel audio standard which supports 192kHz/24-bit clean in all 5.1 or 7.1 channels, plus a 192kHz/24-bit stereo mix as well ... with complete indexing (track names etc).



    That would be an audio standard worthy of my $20, as opposed to this 20-year-old two-channel 44.1kHz 16-bit BS.



    It would be a breakthrough for everybody, first because at 192kHz/24-bit we're finally at the point where our audio storage medium is BETTER than the human ear can hear ... it's taken over 100 years to get this far.



    I just heard pure 5.1, 96kHz/24-bit the other day (not this DTS or Dolby Digital collapsed stuff), and man ... all those folks out there who've got a home theatre system, once they hear a real mix in 192/24 5.1 ... they will run (not walk) to pick up these things.



    The difference?



    1 - the high end isn't screwed up like it is in 44.1



    2 - way more resolution in low level detail, and ambience.



    3 - huge dynamic range. Even if you were to foolishly record a signal at 48dB below maximum (or if you were forced to, like in some very wide-ranging orchestral works), you'd then have the same amplitude resolution that you have in a current CD player (see the quick arithmetic sketch below) ... and considering that you'll almost always record much hotter than -48dB, it means every breath, minor nuance or touch on any note is heard with far greater resolution than even the loudest stuff currently available on CD.
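


    Here's that quick arithmetic sketch, using the usual rule of thumb of roughly 6 dB of dynamic range per bit (an approximation, not gospel):

    # Rule of thumb: each PCM bit buys about 6.02 dB of dynamic range.
    def dynamic_range_db(bits):
        return 6.02 * bits + 1.76          # ideal quantization-noise figure

    cd_16bit = dynamic_range_db(16)        # ~98 dB
    new_24bit = dynamic_range_db(24)       # ~146 dB

    # Record 48 dB below full scale on a 24-bit system and what's left is
    # roughly what a CD gives you at full scale:
    print(round(new_24bit - 48, 1), round(cd_16bit, 1))   # 98.2 98.1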



    Subjective translation?



    If the music industry gets its head straightened out and lets this happen, that is:



    A - in stereo, at best, the performance sounds like it's happening on a stage in front of you ... in 5.1, it's happening all around you. This makes the experience far more intimate - you're now part of the band ... not just watching from a distance.



    B - the singer, if miked properly and recorded well, coming out of the centre speaker, can (if your gear is good enough to handle it, which most new 5.1 systems are) literally sound like they're in the room with you. The effect of that 96K/24-bit 5.1 recording I heard the other day was exactly like that; it was quite stunning hearing a voice so well defined, like it was right in the room.



    Basically, this new format is the music industry's chance to snatch defeat from the jaws of victory.
  • Reply 31 of 43
    MarcUK Posts: 4,442 member
    Humorous post...



    And what the hell is wrong with 44.1?



    People like you make me laugh; I bet 99.99% of the population couldn't tell the difference between 44.1 and 192. You've been reading too many audio geek tech papers. OK, I understand all the audio stuff pretty well, being a studio engineer for 5 years, but 192 takes the piss. I'd say 96kHz/24-bit is all anyone would ever need. People like you should learn to enjoy the music, not worry about the aliasing on a 20kHz signal which you can't hear anyway, but you think you can because you've been 'educated'.



    And what the ... with dynamic range? 99% of music gets compressed to hell to reduce the dynamic range anyway.



    Just said in jest



    BTW, did you really mean 192kHz? I can't think of any real reason to go this high. It may be good for recording live orchestras, but only at the pro level. Any music released would be compressed before release, so consumers would never benefit from or need 192kHz.



    [ 02-24-2002: Message edited by: MarcUK ]
  • Reply 32 of 43
    matsumatsu Posts: 6,558member
    Well, I could see it across the front three channels. Most people (including the tin-ear kind who buy CDs) can hear a more accurate and wider stereo image/presentation from 3 channels (L, center, and R) than they can from just two (L and R). But this would sound better even with 44.1 PCM. With three front channels the audio engineers have a fair bit more control over the 'placement' of instruments and 'movement' across the sound stage. So a nice live recording -- or really something that sounds live, but has been worked to sound that way -- might be more convincing. The players stay put on their spot in the sound stage, and if the singer walks across the lounge, then her voice tracks across the 'FRONT' of the soundstage. But it's real easy to make this effect rather kitschy and un-natural, like dance mixes where keyboard note A comes from the left front but B and C come from the right front. Are we to believe the keyboardist is leaping across the stage in an 8th of a second?



    But what do you need 192kHz for in the multiple rear channels? The audience clapping? Total waste of bandwidth. Rears are mostly a home cinema gimmick/tool, depending on how they're used. I'd argue that even for home cinema, the front soundstage is the one that matters most -- that is where the action is presented, after all. And because that sound could match an image (unless we all start watching 3-D movies at home), it's much more viscerally rewarding to have, say, a helicopter, or baseball, flying across the screen with an accompanying smooth track of sound following the image from one side to the other. Sometimes the rears are cool: you hear that helicopter buzz overhead or behind, and it's so convincing that you just have to look. And when you do, it's kinda ruined a little because there ain't no images on the back wall behind your seat.



    The point is that current, compatible DVD media has enough storage for a high-quality audio recording, and it also has enough storage for a backward-compatible HDTV recording. Does it have enough bandwidth for an HDTV video feed plus a 192kHz audio feed? No.
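


    The quick math on why not (the stream rates and the 1x figure here are rough, commonly quoted numbers, so treat this as a sketch):

    # Why 720p HDTV plus uncompressed 192 kHz multichannel PCM won't fit in
    # a 2x DVD stream. All rates below are rough assumptions.
    dvd_1x = 11.08e6                  # nominal 1x DVD user data rate, bits/s
    drive_2x = 2 * dvd_1x             # ~22 Mbit/s available

    hd_video = 19e6                   # 720p MPEG-2, per the FAQ quoted earlier
    pcm_5_1 = 6 * 192_000 * 24        # six channels of 192 kHz / 24-bit PCM

    print(f"needed: ~{(hd_video + pcm_5_1) / 1e6:.0f} Mbit/s")   # ~47 Mbit/s
    print(f"available at 2x: ~{drive_2x / 1e6:.0f} Mbit/s")      # ~22 Mbit/s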



    But do you really think you'd notice? Picture and sound is a lot to process. I bet a DTS audio signal (quite a bit more bit-rate than Dolby digital) plus a 720p HDTV signal would keep everybody's senses more than saturated with info.



    Better to keep a standard that doesn't obsolete 5 years of standards bickering. At least we could have a decade or one generation of standards compliance before we have to upheave all our A/V gear one more time.
  • Reply 33 of 43
    [quote]Originally posted by MarcUK:

    <strong>

    And what the hell is wrong with 44.1?



    [ 02-24-2002: Message edited by: MarcUK ]</strong><hr></blockquote>



    I'll give you the explanation I used to give when I taught future budding digital recording engineers the subject. I'm not going to get into Nyquist plots and such, but it's not all that hard to understand anyway.



    Your ears, in fact anybody's ears, have been telling people for two decades now that digital sounds "cold" or "harsh" ... some of that is just bad old listening expectations, but most of that is just the sound of 44.1, which there's no escape from. No matter how much better they make the filters, your high end will always be impaled on a 44.1K system ... which goes real far to explain all the "nice dither" enhancing devices with tubes in them that have been going for outrageous prices lately, "warming up" the studio industry. Not that I'd sneeze at a Telefunken U-47, just that I'd have a hard time shelling out $5K for one, anyway - I digress ...



    ... the point of all this tuby-ness in the digital realm has mostly been to deal with 44.1K's high end, make it less harsh, make it "warm". Which boils down to: if you can't handle the ugly, smear a little vaseline on the lens.



    Well ... why?



    Getting back to Nyquist and his evil plot ... the truth is: sampling at double the frequency is all that's required to capture all the information about a wave ... just as often as it captures no information about the wave ... and neither of those happens as often as it sorta captures the wave, but screws it up.



    Picture this:



    Take a true work of art, a lovely 20K sine wave - which is supposed to be the limit of human hearing (I know, I know, just play along for now). Since a sine wave has no higher harmonics, if 20K truly is the limit of your hearing, then a sine wave really is all you could hear at 20K anyway, because any other wave would just sound like a sine wave to you, since you couldn't hear its upper harmonics ...



    OK, so here's the theory:



    Sample at double the frequency, ok - 40K



    Capture the wave at maximum, capture the wave at minimum - you're done. You now have all the information you require to perfectly re-create that lovely 20K sine wave: this is absolutely true.



    However - this is also absolutely true:



    Take that same wave, move it to the left or right of your sampling point by 1/4 of its wavelength ... and sample again ... what do you hear?



    A: Nothing.



    You've successfully sampled that wave at its zero crossings; when you connect the dots here, you just join zero to zero and hear exactly that, nothing. The wave disappears between the sampling points ... OK, big deal you say, how often does that really happen anyway?



    A - just as often as you capture the wave perfectly at its peak amplitudes.
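


    If you'd rather see the thought experiment in numbers than take my word for it, here's a tiny sketch of those two extreme cases (just an illustration of critically sampling a sine, nothing fancier):

    import math

    # Sample a 20 kHz sine at exactly 40 kHz: once lined up with the peaks,
    # once shifted a quarter wavelength so every sample lands on a zero
    # crossing -- the two extreme cases described above.
    f, fs = 20_000.0, 40_000.0

    peaks = [math.sin(2 * math.pi * f * n / fs + math.pi / 2) for n in range(8)]
    zeros = [math.sin(2 * math.pi * f * n / fs) for n in range(8)]

    print([round(x, 3) for x in peaks])   # [1.0, -1.0, 1.0, -1.0, ...] full swing
    print([round(x, 3) for x in zeros])   # all ~0.0 -- the wave vanishes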



    The truth is, most of the time, the sampling is as off as it's on, creating false amplitudes and phase shifts in the high-end detail of a sampled wave - take note: this would be the case even if you had a theoretically perfect A/D operating at 40K, no alias frequencies, nothing.



    How does this sound to your human ear?



    Harsh.



    Ok, so 44.1K isn't 40K, and most people can't hear much past 17K anyway ... fine.



    Have a look at what a 15K wave looks like, sampled at 44.1 on your computer screen.



    Now, to find out what it actually sounds like, just round off all those jagged edges which is all that those sophisticated D/A's do anyway ... what do you get? A watered down version of the screw ups I mentioned above.



    Yes, folks, despite all that nonsense you heard 20 years ago, about CDs being perfect, and 44.1K being all that's necessary ... well, I'm afraid it's nonsense ... 44.1K is pretty nasty stuff, but for the longest time, it was all we had ... now we can do better.



    We should.



    [ 02-25-2002: Message edited by: OverToasty ]
  • Reply 34 of 43
    Matsu Posts: 6,558 member
    Yes, but you don't need a Blu-ray disc for that.
  • Reply 35 of 43
    [quote]Originally posted by Matsu:

    <strong>Yes, but you don't need a Blu-ray disc for that.</strong><hr></blockquote>



    True ... but to get 192K/24-bit in all 5 or 7 channels (not just two, as in the current DVD-Audio spec), add a 192K/24-bit stereo mix which could be played on stereo-only systems, and add some cool visual and information features ... a Blu-ray system would be required; there just isn't the space on current DVDs.



    Now, why does any of this matter?



    Because almost nobody currently has proper audio DVD players, they've all got these DTS/Dolby Digital thingies ... may as well work things out now, such that when people start buying Blu-ray systems for movies and audio in a few years, we'll have the material available for them, not some hampered standard from Red-Ray days: which, if some thought was actually put into it, could probably be made backwards compatible anyway.
  • Reply 36 of 43
    Well, optical storage has been predicted since the first episode of Star Trek, so it isn't a matter of "if" but "when", and that will relegate the mechanically intensive hard drive to minor devices like iPods and iCameras.
  • Reply 37 of 43
    Matsu Posts: 6,558 member
    DVD-Audio already supports multi-channel surround. You can have all 5 channels at 96kHz, or you can have the front channels at 192kHz and the others at something less. It is also possible to use MLP (a 'lossless' form of compression) to squeeze more 192/24 sound out of the other channels. It's good enough. A 144dB s/n ratio and 96kHz sampling are way beyond our hearing. Should be fine, even with a bit of compression. Most people only have DTS or DD because they're part of the DVD-Video spec. A DTS or DD track could be simultaneously pressed onto a DVD-Audio disc so that it would play in a DVD-Video player, but that would only be a home-theatre-quality track (compressed).
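


    To put numbers on why MLP matters there (the ~9.6 Mbit/s DVD-Audio ceiling is quoted from memory, so treat it as approximate):

    # Raw PCM bandwidth for DVD-Audio channel layouts, versus the format's
    # maximum audio data rate (~9.6 Mbit/s -- figure from memory, approximate).
    def pcm_rate_mbps(channels, sample_rate, bits):
        return channels * sample_rate * bits / 1e6

    six_ch_96k = pcm_rate_mbps(6, 96_000, 24)     # ~13.8 Mbit/s: too fast raw
    stereo_192k = pcm_rate_mbps(2, 192_000, 24)   # ~9.2 Mbit/s: just squeezes in

    print(six_ch_96k, stereo_192k)
    # MLP's roughly 1.5:1 to 2:1 lossless packing is what gets the 96 kHz
    # multichannel mix under that ceiling.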



    You really only need a Blu-ray disc if you want to combine uncompressed or lossless audio with HDTV video. Nice, but still not necessary since, if we stick to DTS, there is enough room on a dual-layer disc for a 720p HDTV image. You just have to spin it at 2x instead of 1x to get the necessary bandwidth. You still need a new player, but people with old players could get a lower-quality video stream (provided the discs are properly encoded). This sounds like the better solution to me when we consider that there still aren't too many HDTVs out there (even projections of 30% by 2006 seem optimistic now). Computer users will have widespread access to DVD recordable formats, and a version of DVD will finally supplant the VCR by that time.



    We certainly don't need Blu-ray discs until at least 2006, when HDTV was supposed to have become mandatory for broadcasters. I think it's been pushed back again, but I'm not sure. Even then, with 1 in 3 sets an HDTV set, cable and satellite will continue to cater to users with analogue sets. They can't afford to lose the customers. A relatively cheap, high-quality analogue set will deliver a very good picture when hooked into a digital cable or satellite box. It isn't a good time to introduce a new disc technology for consumers when, probably for the next decade, we're going to live in an environment of emerging/hybrid technologies. Tons of old stuff and lots of new stuff. The current DVD disc, though more limited in capacity, can easily straddle both sides of that divide. Any new disc, despite obvious storage advantages, needlessly obsoletes a whole range of consumer hardware, and complicates the playback and recording of old and new material alike.



    [ 02-25-2002: Message edited by: Matsu ]
  • Reply 38 of 43
    MarcUK Posts: 4,442 member
    [quote]Originally posted by OverToasty:

    <strong>... Yes, folks, despite all that nonsense you heard 20 years ago, about CDs being perfect, and 44.1K being all that's necessary ... well, I'm afraid it's nonsense ... 44.1K is pretty nasty stuff, but for the longest time, it was all we had ... now we can do better.

    We should.</strong><hr></blockquote>





    I understand all this theory perfectly well, but my argument will always be that all of those points you mentioned, while perfectly true, are just techno-babble. Only a select few with near-perfect audio replay systems will ever tell the difference between 44.1 and 192. In fact, the only people to detect it are scientists with all their oscilloscope measuring systems: "Oh look, that 20kHz wave played in isolation is getting clipped, oh yeah, I see it on my oscillo, and come to think of it I can hear it now, damn, I'll never be able to listen to CDs ever again."



    Can you honestly detect a 15kHz phase distortion? Have you ever heard a 20kHz sine wave in any recorded material? No, I thought not. And nor could 99.9% of the rest of us.



    Probably the most amazing piece of music equipment I ever owned was an SPL Vitalizer psychoacoustic enhancer. It worked on audio by phase-shifting and adding distortion to high-frequency signals, and adding distortion to the bass, which is how all these enhancers work. The same goes for the Roland boxes which make audio appear from further extremes than the speakers. Once you get used to listening through these devices, turning them off is really a revelation. The industry would be better off providing this or something like BBE in their kit, which everyone could benefit from, than moving to 192kHz, which only scientists could measure.
  • Reply 39 of 43
    Actually, I can hear more than the average human. We tested it out in a physics class last year and I could hear down to 16Hz and up to 24kHz. The last five minutes really hurt, though; I don't think it was really up to 24, it might have been my ears ringing. But most people can discern between 44.1 and 96. At least in my family we can.
  • Reply 40 of 43
    [quote]Originally posted by MarcUK:

    <strong>



    I understand all this theory perfectly well, but my argument will always be that all of those points you mentioned, while perfectly true, are just techno-babble. Only a select few with near-perfect audio replay systems will ever tell the difference between 44.1 and 192.



    Can you honestly detect a 15kHz phase distortion? Have you ever heard a 20kHz sine wave in any recorded material? No, I thought not. And nor could 99.9% of the rest of us.



    Probably the most amazing piece of music equipment I ever owned was an SPL Vitalizer psychoacoustic enhancer. It worked on audio by phase-shifting and adding distortion to high-frequency signals, and adding distortion to the bass, which is how all these enhancers work. The same goes for the Roland boxes which make audio appear from further extremes than the speakers. Once you get used to listening through these devices, turning them off is really a revelation. The industry would be better off providing this or something like BBE in their kit, which everyone could benefit from, than moving to 192kHz, which only scientists could measure.</strong><hr></blockquote>



    Figured I'd take a moment to remind you what my earlier post was about:



    1 - Most people can hear the difference, and can hear it on a half-decent pair of bookshelf speakers. No magic required - because ...



    2 - You don't have to be able to hear 20k to hear the difference.



    And as a special bonus:



    The SPL Vitalizer was nice for some things (like mushing up the low end to give a distinct lo-fi sound), but mostly it wound up being an over-used mid-90s fad ... if I had a nickel for every mix that came in over-mushed, over-separated and over-"shiny" thanks to that box and others like it, I'd be up to my eyelids in hookers and cocaine ... but alas - the artists beat me to it <sigh> - and so many mixes came in over-mushed, over-separated and over-"shiny" - but I digress.



    All these things, the Aphex, the BBE, and the Vitalizer, were all designed as compensators for bad storage media ... magic one-stop "fix it" boxes (which, if you actually believe in their effectiveness as you claim, means your actions are admitting to the limitations of the medium, your ability to hear those limitations despite your mere "techno-babble" claim, and your personal need to fix them), so put your lips together and say "mea culpa".



    ... that out of the way, here's a simple idea ...



    ... why don't we just build an audio standard we don't have to compensate for?



    Like say 192/24.



    :eek: