zimmie

About

Username: zimmie
Joined
Visits: 172
Last Active
Roles: member
Points: 2,737
Badges: 1
Posts: 651
  • Apple TV+ 'Tiny World' filmmakers used gas-retaining diving gear for underwater shots

    It is called a “Rebreather”. I have one for my dives. It “scrubs” the air clean of carbon dioxide and refreshes and recirculates the air we use underwater. Expensive and needs careful set up to avoid something going seriously wrong…
    It sounds like you can stay underwater indefinitely if it keeps recycling the air, but I’m sure there’s a limit. I’m curious, how long can it supply clean air?
    Some CO2-absorbent materials used in rebreathers release oxygen as they absorb carbon dioxide, but as I recall they aren't commonly used anymore.

    Basic rebreathers work on air and just sequester carbon dioxide from what you exhale. They're not really safe for diving, but are sometimes used as emergency escape equipment in industrial situations. For example, if you have a fire suppressant system which works by sealing a room and extracting oxygen from it, low-end rebreathers like this are common safety gear placed on pillars in the facility.

    Units suitable for diving add some oxygen back to the loop. The time on these is limited mostly by the capacity of the oxygen tank, and to a lesser extent by the capacity of the system as a whole to deal with gas expansion. These are also used for firefighting and surgical anesthesia.
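
The "limited mostly by the capacity of the oxygen tank" point can be sketched with some back-of-the-envelope arithmetic. All figures below (tank size, fill pressure, metabolic rate) are illustrative assumptions, not specs for any real unit:

```python
# Rough rebreather duration from the oxygen supply alone (ideal-gas sketch).
# A diver at light work metabolizes roughly 0.5-1.5 L of O2 per minute
# regardless of depth, which is why closed-circuit units last so long.

def rebreather_duration_min(tank_volume_l: float,
                            fill_pressure_bar: float,
                            metabolic_o2_l_per_min: float = 1.0) -> float:
    """Minutes of oxygen supply: free gas volume / metabolic consumption."""
    free_gas_liters = tank_volume_l * fill_pressure_bar  # ideal-gas approximation
    return free_gas_liters / metabolic_o2_l_per_min

# A small 2 L oxygen cylinder at 200 bar holds ~400 L of free gas:
print(rebreather_duration_min(2.0, 200.0))  # ~400 minutes at 1 L/min
```

Note the duration doesn't shrink with depth the way open-circuit scuba does, because the loop recirculates the diluent and only metabolized oxygen is replaced.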

    Some rebreather designs work on a pure-oxygen loop, which simplifies a lot of things, but pure oxygen is toxic at high pressures, so they're only good down to 20 feet or so. They work beautifully in space, though, as running a space suit with pure oxygen means you can run it at a much lower pressure, meaning the joints don't fight the user as much.
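
The "only good down to 20 feet or so" figure falls out of the standard maximum-operating-depth arithmetic, sketched here assuming a common 1.6 bar oxygen partial-pressure limit (agencies vary on the exact number):

```python
# Maximum operating depth for a given oxygen fraction. Ambient pressure is
# ~1 bar at the surface plus ~1 bar per 10 m of seawater; oxygen becomes
# toxic when its partial pressure (ambient pressure * FO2) exceeds the limit.

def max_depth_m(fo2: float, ppo2_limit_bar: float = 1.6) -> float:
    ambient_limit_bar = ppo2_limit_bar / fo2
    return (ambient_limit_bar - 1.0) * 10.0

print(max_depth_m(1.0))   # pure O2: 6.0 m, about 20 feet
print(max_depth_m(0.21))  # air: ~66 m (other limits kick in first in practice)
```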
  • Supreme Court rules in favor of Google in Oracle Java fight

    So... oracle made a product. They spent a lot of money creating this product. 

    Everyone paid to license this product. 

    Except Google, who decided to just steal it and hope no one noticed. 

    Today’s Supreme Court says that’s “okay.” 

    What the...

    where is the justice? How is this fair to oracle? How is it fair to competitors, who decided to “not be evil” and not steal? 

    It may seem like “fair use” NOW, because 10 years have gone by and such features SEEM ubiquitous, but they weren’t 10 years ago. Oracle made these things happen and they should be treated fairly. 

    Something stinks here. Smells like corporate influence. 

    That’s not even close to what happened.

    Sun wrote an API. API specifications are inherently not copyrightable. Lots of things aren’t copyrightable. For example, languages (like Klingon) and game rules.

    Oracle bought Sun.

    Google decided to make their own API matching the specification of Sun’s. They gave their functions the same names Sun gave theirs. The function names (or, technically, the function signatures) are the “copied” code.

    Google wrote their own implementations of those functions from scratch. The implementations used no Sun/Oracle code.

    Oracle sued Google for copying Java’s API signatures.

    This ruling effectively says function signatures are part of the specification of the function, as opposed to part of the creative work of meeting that specification. This is a correct interpretation of what function signatures are at a technical level.
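
The signature-versus-implementation distinction can be illustrated with a toy example (function name and bodies are hypothetical, chosen only to show the point):

```python
# Two independent implementations sharing one "specification": the
# signature max_of(values) -> the largest element. The signature is what
# callers depend on; the bodies are the independent creative work.

def max_of_a(values):
    """Implementation A: linear scan."""
    best = values[0]
    for v in values[1:]:
        if v > best:
            best = v
    return best

def max_of_b(values):
    """Implementation B: sort and take the last element."""
    return sorted(values)[-1]

# Same name, same behavior, no shared code:
print(max_of_a([3, 1, 4]), max_of_b([3, 1, 4]))  # 4 4
```

Callers written against the signature work with either body unchanged, which is exactly why reimplementing an API with matching signatures enables compatibility.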
  • NFT -- Everything you need to know about non-fungible tokens

    It's worth noting that the NFT actually refers to a JSON file somebody hosts which then refers to the work via a URL. There are already lots of NFTs which refer to JSON files which are no longer hosted where the NFT says they are, and others where the JSON is still accessible, but the work is not. In either case, the token now refers to something which can't be proven, so its value as provenance is zero.
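
The double indirection described above can be sketched concretely. The field names follow the common ERC-721 metadata convention; the URLs and values are made up for illustration:

```python
import json

# What the chain typically records is only a URI pointing at metadata:
token_uri = "https://example.com/metadata/42.json"  # hypothetical host

# The metadata JSON at that URI then points at the actual work via a URL:
metadata = json.loads("""
{
  "name": "Example Work #42",
  "description": "Illustrative only",
  "image": "https://example.com/art/42.png"
}
""")

# Two independent points of failure before you ever reach the work:
# 1. token_uri can stop resolving; 2. metadata["image"] can stop resolving.
print(metadata["image"])
```

If either host goes away, the token still exists on-chain but refers to nothing verifiable.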
  • AMD Radeon RX 6700 XT GPU may launch at March 3 event

    Right now, no one is talking about Apple's upcoming discrete GPUs. Apple has not announced them but they pretty much have to be released this year. The release of Apple discrete GPUs will be an extremely important event. It is clear that Apple can compete with Intel/AMD in CPUs but how will it compare on GPUs? Apple's embedded GPUs are at best 1/10th the speed of discrete GPUs. That's actually pretty impressive for a mobile GPU built into an iPhone or iPad but it is not going to impress anyone buying an iMac, let alone a Mac Pro. The only other choice Apple has is to write Apple Silicon drivers for an AMD discrete GPU. That seems counterproductive given Apple's stated intention to build its own CPUs and GPUs going forwards.
    Your "1/10th the speed" statement is incorrect. The M1's GPU can perform 2.6 TFLOPS with eight cores. That's 325 GFLOPS per core.

    The Radeon RX 5700 XT 50th Anniversary can get 10.1 TFLOPS with 40 compute units, so 253 GFLOPS per compute unit. This is actually the best performance per compute unit across the Radeon RX 5000 line.

    The Radeon RX 6000 series is technically released, but they're extremely rare right now. I'm not going to count them until stores can go longer than an hour without selling out. Even so, they get 288 GFLOPS per compute unit.

    GPU compute performance scales roughly linearly with core count. Power draw scales a bit worse than linearly, because high-core-count interconnects have to be more complicated, but not hugely so. 32 of Apple's A14/M1 GPU cores would get about 10.4 TFLOPS, beating the best AMD consumer cards you could get last generation and beating the Nvidia RTX 2080 (also 10.1 TFLOPS). That would still have low enough power draw to fit in a laptop, though admittedly, not a laptop Apple is interested in making. An iMac could easily have four times the M1's GPU.
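
The per-core figures above are simple division; reproducing the arithmetic (all TFLOPS and core counts are the figures quoted in the post):

```python
# Per-core throughput and a linear-scaling estimate, using the post's figures.

def gflops_per_core(total_tflops: float, cores: int) -> float:
    return total_tflops * 1000.0 / cores

print(gflops_per_core(2.6, 8))     # M1 GPU: 325 GFLOPS per core
print(gflops_per_core(10.1, 40))   # RX 5700 XT: ~253 GFLOPS per compute unit
print(gflops_per_core(2.6, 8) * 32 / 1000.0)  # hypothetical 32-core: 10.4 TFLOPS
```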
  • Spotify HiFi one-ups Apple Music with lossless audio streams

    Humans perceive louder audio as better, even if the difference is so small you don't actually think one is louder than the other. I guarantee that if you correct for minuscule volume differences, you won't have people consistently picking 320 kbit MP3 over 256 kbit AAC, and it will get closer to 50/50. I know this because I have personally run that experiment before, and the effect is extremely widely documented.

    As for high-resolution audio ...

    A 44.1 kHz sample rate can perfectly reproduce any signal, or any combination of signals, below 22.05 kHz, and human hearing doesn't go above 22 kHz (more accurately, above that range the perception threshold rises above the pain threshold, so higher-frequency signals become physically painful at a lower volume than the one at which we could hear them). There will never be a need for higher sample rates at playback for a human.
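
The 22.05 kHz figure is just the Nyquist limit, half the sample rate. A small sketch of what happens on either side of it (pure frequency arithmetic, no audio libraries):

```python
# A sampled pure tone above fs/2 is indistinguishable from its alias below
# fs/2; anything below fs/2 is captured exactly.

FS = 44_100  # CD sample rate, Hz

def alias_hz(f: float, fs: float = FS) -> float:
    """Apparent frequency after sampling a pure tone of frequency f."""
    f = f % fs
    return f if f <= fs / 2 else fs - f

print(alias_hz(21_000))  # below fs/2: reproduced as-is -> 21000.0
print(alias_hz(23_000))  # above fs/2: folds down to 21100.0
```

This is why recording chains low-pass filter before sampling; at playback, nothing below the limit is lost.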

    16 sample bits per channel means 96 dB of dynamic range above the noise floor; 24 sample bits pushes that up to 144 dB. The human auditory system has a noise floor of about 30 dB (from blood flowing in your ears, the sounds your muscles make as you move and breathe, etc.). Assuming you set the audio system so the quietest sounds can just be heard, a 16-bit signal can accurately represent anything up to 126 dB, and 120 dB is already enough to cause immediate, permanent hearing damage. A 24-bit signal would push that ceiling to 174 dB; for perspective, people have died at 160 dB.
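
The 96 dB and 144 dB figures come straight from the bit depth: each extra bit doubles the representable amplitude range, which is about 6.02 dB.

```python
import math

# Dynamic range of an n-bit linear PCM signal: 20*log10(2**n) ~= 6.02*n dB.

def dynamic_range_db(bits: int) -> float:
    return 20.0 * math.log10(2 ** bits)

print(round(dynamic_range_db(16)))  # 96 dB
print(round(dynamic_range_db(24)))  # 144 dB
```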

    24-bit is good for recording, because it leaves more room for rounding errors to accumulate before they reach the threshold of human hearing. A high sample rate is also good for recording, since you can stretch recordings to a more extreme degree before the limitations become audible. Neither is needed at all for playback; they just waste data.