Detnator

About

Username
Detnator
Joined
Visits
43
Last Active
Roles
member
Points
620
Badges
1
Posts
287
  • Apple launches new Apple TV 4K with A12 Bionic CPU, redesigned Siri remote

    neillwd said:
    Hate the remote.
    Came here to say exactly that. It reminds me of the old remotes before they used a touch pad. I dread “click, click, clicking” around the screen rather than just a swipe, which can get me to the other side of the screen in a single gesture. 

    Thought I was the only one who liked and understood what the touch remote accomplished. 
    Wow... seriously?  I agree, I like the touch remote functionality, but it constantly has sensitivity and inconsistency issues.  So they put buttons back in... which is great.  So, swiping/touch?

    It still does that AS WELL!  Did you not see that?  Buttons AND swiping...  It's the best of both worlds.  Plus scroll-wheeling on top to boot.

    And the other huge issue with the previous remote that this one fixes:  no longer vertically symmetrical so no more using it upside down.  

    Frankly, I think they finally got the damn thing right.  

    But complainers gotta complain...?  Present a legitimate issue if you have one, but seriously... maybe you guys wanna get your information straight first?
  • New Sonnet Echo 5 hub has three downstream Thunderbolt 4 ports, one USB-A

    rob53 said:
    This hub is like OWC's 4-port Thunderbolt hub. It also says it can drive two 4K displays and costs $20 less. The Sonnet hub will also have limitations when connecting too many devices that want to use as much of the TB bandwidth as possible. I've seen questions about using a hub like this to create a RAID using three NVMe external drives and software RAID. It might be fun to try but I wouldn't expect the full 5000MB/s TB3 bandwidth. This hub will be used to connect multiple devices that don't require simultaneous use. 

    I'd love to be proved wrong but I only have a 2020 M1 MBA to test with and it has Thunderbolt limitations.
    I have two of the OWC hub you mentioned, and a number of high end NVMe drives in various enclosures, and I've tried to do exactly this.  Sadly, there are some limitations.

    For one, I'm not exactly sure of the details, but I believe the 40Gb/s TB channel (in TB3 at least, and I believe this part is unchanged in TB4) is split, with some bandwidth dedicated specifically to data and the rest specifically to video, and neither gets the full 40. Some part of that sentence may not be quite right, but the upshot, as I understand it, is that the maximum SSD throughput you can get out of a single TB3 channel is about 2800MB/s or 2.8GB/s (note: MBytes/GBytes, not Mbits/Gbits).

    This 2800 figure is usually stated on TB drives and enclosures from companies that are honest with their marketing (while everyone else makes misleading claims like "full 40Gb/s bandwidth"), and I would say it mostly agrees with my experience in practice... or at least that's the case under ideal conditions, though most conditions are short of ideal (eg. background processes doing stuff that chews up some of the bandwidth, etc).
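As a back-of-the-envelope check, the numbers roughly line up. Here's a quick sketch in Python; the ~22Gb/s figure for the PCIe data allocation is my assumption about how the TB3 link is partitioned, not an official spec:

```python
# Rough Thunderbolt 3 storage-bandwidth estimate.
# Assumption: of the 40 Gbit/s link, roughly 22 Gbit/s is available
# for PCIe data traffic (the rest is reserved for DisplayPort video).
PCIE_DATA_GBITS = 22  # assumed allocation, not an official figure

def gbits_to_mbytes(gbits):
    """Convert Gbit/s to MByte/s (decimal units, 8 bits per byte)."""
    return gbits * 1000 / 8

print(f"Max storage throughput: ~{gbits_to_mbytes(PCIE_DATA_GBITS):.0f} MB/s")
```

That works out to roughly 2750 MB/s, which lands right at the ~2800 MB/s the honest vendors quote, while the naive full-link conversion (40Gb/s → 5000MB/s) is what the misleading marketing implies.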

    Based on those numbers I bought myself two OWC 4M2 TB3 enclosures, and I get about 2200MB/s throughput on each of those; when I stripe them all together I get at most about 3500, usually not quite that much. That's the case on both my maxed-out 2020 M1 13" MBP and my maxed-out 2019 Intel 16" MBP, and I've generally got similar performance out of other striped NVMe SSD configurations I've tried (eg. with some of the bus-powered single-drive enclosures). When I had an iMac Pro I got better speeds out of the same configurations. But despite trying various different configurations and setups, no setup has ever got me anywhere near the 5000MB/s you'd expect/hope from a theoretically maxed-out single 40Gb/s TB channel, and in fact, I've never even got that much out of two channels striped.

    It's important to note that all this is per CHANNEL, not per port. The Intel Macs share two ports per channel, so the four-port Intel MBPs and iMac Pro have two channels, while the two-port Intel 13" MBP and MBA have one TB channel. On all Apple Silicon Macs so far I believe it's one port per channel, which means the low-end 13" M1 MBP and MBA have two channels despite only two ports -- an upgrade over the 2-port Intel machines they replaced. And on the new 2021 14" and 16" M1 Pro/Max MBPs, the three ports, although a downgrade in port count, are a 50% upgrade in bandwidth over the previous 4-port (2-channel) 13" and 16" Intel MBPs.
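To summarize the port/channel arithmetic, here's my reading of the configurations discussed above as a small sketch (these counts are from the discussion, not official Apple specs):

```python
# Thunderbolt channel counts per machine, as described above
# (my summary of the post, not an official spec sheet).
CHANNELS = {
    "Intel 2-port 13in MBP / MBA":   1,  # two ports share one channel
    "Intel 4-port MBP / iMac Pro":   2,  # two ports per channel
    "M1 MBA / M1 13in MBP":          2,  # one channel per port
    "2021 14in/16in M1 Pro/Max MBP": 3,  # one channel per port
}

PER_CHANNEL_MBPS = 2800  # theoretical per-channel storage max

for mac, n in CHANNELS.items():
    ceiling = n * PER_CHANNEL_MBPS
    print(f"{mac}: {n} channel(s), ~{ceiling} MB/s aggregate ceiling")
```

The 3-channel vs 2-channel comparison is where the "50% bandwidth upgrade despite fewer ports" claim comes from.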

    That's a bit of a ramble, to include a lot of detail. The point is, it's my understanding that the maximum theoretical data (ie. SSD drive) throughput of any single Thunderbolt channel is 2800MBytes/s, and in practice it's generally significantly slower. I believe there are some other overheads interfering with drive performance, which no doubt include background processes trying to do stuff, among other things. The other factor I've found (and this might be where the iMac Pro benefited most) is that these ports get very hot, and if allowed to get too hot the electronics will burn out, so they have built-in throttling to slow them down before they get there. Things like ice packs or keeping your Mac in a refrigerator might help with that... that's something I want to try one day.

    So needless to say... unfortunately these hubs don't really improve speed because the bottleneck is the channel more than anything else.  Some drive enclosures are slower than the channel bottleneck, but the best these hubs will give us is bringing multiples of those enclosures, striped over one channel, up to similar speeds as the already fast enclosures can do. There might also be some benefit in spreading the heat around, although there's still the heat at the single host port, the management of which may or may not have improved in the newer MBPs.

    I'm keen to get a new M1 Max MBP and see if that's the case, and how fast I can get drives striped across all three channels to go. I'll probably do a brand new clean install to minimize other overheads, and for theoretical/experimental interest I might try it in the fridge and see if that helps. 😉   I'll report back my findings when I can get my hands on one of those machines.


  • M1 MacBook Air review: nearly as transformative as the original

    Genuinely surprised that they didn’t fix the webcam. Sounds like 2021 will be the year for something fresher, crisper and broadly compatible.
    If that's the case, I'm sorry to say you'll continue being "genuinely surprised".  People really need to get past this.

    Unless some significantly new technology comes out, they will not "fix" the webcam because it's not (currently) possible without making the display significantly thicker, which would compromise the design in ways that normal people would really prefer they didn't.

    Have you noticed the "camera bump" on all the iPhones since the 6 series over 5 years ago - which is still a problem today?  They can't fit a decent camera in the iPhone's 8mm thickness without a bump.  So how are they going to fit a decent camera in the 2mm thick laptop displays?**

    It's a webcam for goodness sake. You're not shooting a feature film with it (while some people are doing so with the iPhone cameras). Why should anyone care if you look a little fuzzier for your Zoom meeting than you would with a better camera (which, as I say, would severely compromise the experience of everything else)? Alternatively, if it really is that big a deal, enough that the compromises required are less important to you, then get a professional external one.

    Y'all need to get over it.

    ----

    ** In 10min of googling I can't find any other decent laptop with a webcam better than 720p. Anyone feel free to point me to one if you know of any.
  • Apple Studio Display only starts at $1599, and can easily climb to $2458



    cgWerks said:
    Can someone explain the pricing of this display to me? Is it that much better than the screen that came in the $1800 iMac? I was hoping it would be closer to $1k. While I realize the options for that exact kind of display (resolution, size, specs) limits you to one 3rd party (ie. LG UltraFine), do most of us really need those specs? I think I'd be just fine with something like a BenQ 4k, etc. (or almost a half-dozen of them!) What am I missing?
    For most people 4K, 350 nits, clunky build quality, and woeful customer support is enough.  That's what you get for $500-$1000.  Some of us want more than that. When the only 5K monitor around that's remotely Mac compatible was the LG, I tried that, had multiple hardware issues with it, and dealing with LG's support probably took years off my life.  I tried 4K as an alternative, but for me at least, it just doesn't cut it.  Some people say they can't tell the difference between 150 and 220 dpi, or 350 and 500 nits.  I don't get it. It's night and day for me. The extra pixels (it's almost twice as many) and increased brightness of the LG 5K make a significant difference to my productivity (when it's not going through LG's warranty repair processes).
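For what it's worth, the dpi and pixel-count figures check out. A quick sketch (27-inch diagonals assumed for both panels, which matches the monitors discussed here):

```python
import math

def ppi(w_px, h_px, diag_in):
    """Pixels per inch for a panel of the given resolution and diagonal."""
    return math.hypot(w_px, h_px) / diag_in

# Typical 27" 4K panel vs. the 27" 5K (LG UltraFine / Apple Studio Display)
print(f'27in 4K: {ppi(3840, 2160, 27):.0f} ppi, {3840 * 2160 / 1e6:.1f} MP')
print(f'27in 5K: {ppi(5120, 2880, 27):.0f} ppi, {5120 * 2880 / 1e6:.1f} MP')
```

That gives roughly 163 vs 218 ppi (the "150 and 220 dpi" in round numbers), and 5K has 16/9 ≈ 1.78x the pixels of 4K, hence "almost twice as many".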

    Now, that covers differences between the many 4K options and the LG 5K.  Then there's the LG 5K vs this new Apple Studio Display (ASD)...

    Buy the LG UltraFine 5K 27" monitor for $1299. The two are almost the same. 

    Not quite (actually not even close).  There are some very significant differences, such that I'm surprised there's only $300 difference.

    LG 5K:
    • 500 nits
    • USB3 5Gbps
    • 14lb
    • 1080p camera (~2MP)
    • Crappy speakers
    • Crappy mic
    • Terrible build quality with persistent hardware issues
    • Woeful LG customer support
    • $1300

    Apple Studio 5K: 
    • 600 nits + True Tone
    • USB3 10Gbps
    • 12lb
    • 12MP camera (6x the pixels, + Center Stage)
    • High quality 3 mic array, directional beam forming (if these are anything like the MBP mics then they're amazing)
    • Superior speakers (+ spatial audio)
    • Excellent build quality (presumably, if it's typical Apple).
    • Best in class Apple customer support
    • built in A13 CPU
    • $1600
    Bottom line, other than the 5K resolution part, nothing is the same.  (More here)


    Not that much less than the Apple monitor then. And if you consider the rear ports, Center Stage camera, Spatial compatible speakers, the classy look and the high end aluminum construction instead of cheap plastic, the LG doesn’t seem like the better deal really.

    Far from it.  And it seems many others agree. I ordered two of these ASD's within seconds of the store coming back online this morning, and enough people still got in before me that mine aren't coming until the end of the month.  They're either very short on supply or a lot of people want them.

  • Intel takes aim at Apple, instead shoots itself in the dongle

    I hope it wakes the morons at Apple up.
    My 2020 MacBook Pro barely qualifies as pro.
    A "pro computer" that I need a dongle to hook up to an external projector, monitor or ethernet.
    A pro computer that ditched the brilliant mag safe power cable.
    A pro computer that no longer allows me to stick an SD card in it.
    A pro computer that requires a dongle to connect a USB A jump drive- the industry standard way of walking files.

    And- they gouge me on the price of RAM, Storage by 3 to 10x what everyone else charges.
    With no user replaceable parts- and barely serviceable.

    And that was my Intel machine.
    Now- my M1 MacMini- can't connect to my Drobo- which is a 72 TB paperweight.
    Gee - thanks Apple.
    Y’know the whole “you’re holding it wrong” meme got a bit ridiculous back in the iPhone 4 days, but dude, you really are doing it all wrong...

    As someone else pointed out, why on earth would you buy a computer that doesn’t support your peripherals when there are plenty of other computers (including other Macs) that do? Sorry buddy, but that one is on you. 

    On all your other comments... you just don’t get Apple. All those points you’re complaining about are positive points for Apple’s target market.  Apple is not in the business of making devices for everyone. They don’t care about market share.  Since they were founded - except in the 90’s when they stopped doing this and the result was they nearly went bankrupt - Apple has always pushed things forward and left past and even current tech behind.  And a lot of that is a fundamental reason why their products stand out and the people they’re targeted at love them. 

    If Apple’s decisions are so terrible then why do you buy or want their products?  There are plenty of PCs with SD slots, all the ports you could possibly want, user upgradeable, and all at a fraction of the cost. Sounds like some of those would suit you just fine. What’s your problem?  

    Wait... I know what it is. You want macOS.  Why? Because it’s better. So why is it better? The answer to that is directly related to all your complaints.  macOS is better than Windows — and now Apple’s chips are better than anyone else’s — entirely because they push forward and don’t hold onto the past. Lack of backwards compatibility is the primary reason why macOS is better... because it’s not carrying around all the cruft the others are. And now they’re applying the same philosophy to their chips.

    Of course that assumes you’re complaining about Apple’s hardware because you want Apple’s software and Apple’s is the only hardware that will run it.  If that’s not the case then ... wtf? Troll?  Or what?
  • Samsung's $700 Smart Monitor M8 borrows Apple's multi-colored iMac style

    THIS is what I wanted from Apple's consumer monitor. 
    You lucky bastard. That’s so great. Samsung gave you what you wanted, and Apple gave me what I wanted.  We’re both happy!!  Isn’t it great to have choice?!

    Really… I don’t get it. Why are all you people complaining that Apple doesn’t make a monitor like everyone else’s?  Apple’s whole point is they make different stuff to everyone else. It’s the same with everything…

    “Why can’t Apple make a cheap expandable tower — like PC’s?”

    ”Why can’t Apple make iOS more open, sideloading apps, etc. — like Android?”

    And now it’s “Why can’t Apple make a cheap low spec consumer monitor — like everyone else?” 

    The answer: Because 99% of Apple’s customers buy Apple stuff BECAUSE Apple does it differently to everyone else (not despite it). Why is that so hard to understand?

    Apple’s market share is small, because Apple’s target market - the people they are trying to cater to — is small.  Apple makes their products for, and targets, the people who want something more premium with different features and focus to what everyone else makes, and see the value in that, and are happy to pay for it. 

    Apple’s customers are the misfits. We’re the people that aren’t happy with the consequences of a more open mobile OS, computers made by pulling together off the shelf parts, and cheap low resolution/ppi, low brightness, plastic displays.  

    We want integrated packages that just work out of the box, that work better because everything is integrated and specifically designed to work together, and that we don’t have to tinker with. We don’t buy iPhones and iPads despite the closed OS; we buy them BECAUSE of the closed OS. And we want a monitor that is retina resolution (at least 210-220 ppi), bright (no, 400 nits is not bright enough), lightweight, premium build, with a stand that isn’t wobbly and sticky and impossible to align, etc. etc.

    If you want Apple to make a consumer monitor like everyone else’s consumer monitor… WHY??  What’s the point?  You’ve got a million other choices in that market. What on earth can Apple or most of Apple’s customers possibly gain by Apple adding just another entry to that list? And how is that good for those of us who can’t stand all those other monitors that everyone else makes?

    I don’t get it.  I really don’t. 
  • Apple's Mac Studio launches with new M1 Ultra chip in a compact package


    What's missing?
    No upgradeability. At all. None. Zero. Nada.
    No way to upgrade the RAM, SSD or any other components.
    What you get is what you get. Forever. You are welcome.
    Not even a M.2 slot for when the built in SSD seems very slow and small two years from now.

    Those GPU speed/power charts were missing the name of the discrete GPUs they used for comparison. The charts shown when the M1 Pro and Max were released with the MacBook Pro ended up being very misleading. How exactly does the M1 Ultra stack up to an RTX 3090 when ray tracing in Blender? Who knows? Guess we have to wait for a real review to find out. We do know that the M1 Max hash rate is around 10.7 MH/s while a 3090 gets 121 MH/s, so even if the M1 Ultra is twice as fast, it is still 1/6th the speed of the 3090.
    I agree re: RAM/GPU, but please drop the SSD complaint. SSDs are so easy/cheap to expand with fast storage later on; it just isn't inside the case.

    As for the GPU, yes, we'll have to wait and see. But, keep in mind they should be fast on-paper. A lot of the issue is just software compatibility. Your hash-rate is a great example. While the Max isn't going to match a 3090 due to memory bandwidth, it would probably be close if the mining software were Metal. People currently getting that 10 MH/s are essentially doing an emulation hack. That's actually pretty good considering.

    If I had to take a guess, I think with a Metal miner, we'd see like 70-80% of like a 3080 for the Pro and then given more memory bandwidth, faster than a 3090 on the Ultra (would need to do more math than I care for right now to find out by how much :) ).
    I am not sure I understand your point about the SSD being external. An M.2 is an SSD. Externally you can get about 1GB/sec on USB-C 3.2 and about 2GB/sec on Thunderbolt (although I have yet to see one get that much when tested). An M.2, on the other hand, can currently get as much as 7GB/sec. The one I have in my PS5 is 6GB/sec and costs about $220 for 2TB currently. SSDs have been getting a lot faster recently at reasonable prices. Even if the M.2 speed is limited, you can expect to see a lot larger ones in a couple of years (unless China invades Taiwan, in which case all of this is moot anyway).

    An M1 Max gets a compute score of 61256 on Geekbench 5 for OpenCL. An RTX 3090 gets 205005. Apple's performance graphs are a complete fantasy.



    I don't get it. Why are you going on about OpenCL?  See here: https://support.apple.com/en-us/HT202823. OpenCL and OpenGL are not supported on any Apple Silicon Macs. Modern Macs use Metal (see the bottom of https://developer.apple.com/opencl/: "If you are using OpenCL for computational tasks in your Mac app, we recommend that you transition to Metal").

    Meanwhile, there hasn't been any Metal (or even Mac) support for NVIDIA cards since around the GTX1080 -- years ago.

    So there's no sensible "benchmark" comparison tests with NVIDIA cards. Apple's tests are based on Metal, and where they're comparing to NVIDIA cards they're doing real world tests with actual apps, on PCs with the 12th gen Intel chips and NVIDIA cards.  I expect it's probably similar to the Photoshop and other shootouts Steve used to do in keynotes.  What you're saying makes about as much sense as people who used to try to say a Pentium chip is faster than a G5 because it has a higher clock speed.

    Your failure to understand their testing procedures doesn't make the results a fantasy.  If you want to mine cryptocurrency, or do anything else with NVIDIA cards, you don't want a Mac.

    Regarding SSDs. Unlike a small vocal few here, the vast majority of Mac users DO NOT CARE about adding hardware inside their Macs. I'm sure you know there's a limited supply of PCIe lanes that can be managed by any chip. You want to take some of the available lanes and dedicate them to an internal SSD slot that almost no one will use, crippling the potential Thunderbolt and other peripheral access available to the rest of us. If there were nothing to lose, you'd have a case, but Apple is not going to take away functionality from the vast majority of their users to appease the tinkerers like you, when there are perfectly reasonable other solutions available to you.

    1. Just buy what you need up front.  
    2. Or add to it externally later.
    3. Or wait for the Mac Pro.

    I stripe three M.2 NVMe drives over the three Thunderbolt 4 ports in my M1 Max MBP, and I get about 5500MB/s.  You get up to SIX TB4 ports in the Studio.  Stripe your beloved M.2 drives over four of them and you'll get your 7000.
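As a rough model of why striping across channels scales this way, here's a sketch. The 2/3 efficiency factor is just fitted to my own ~5500MB/s three-channel result, not a measured constant, and the per-channel ceiling is the ~2750MB/s discussed earlier:

```python
# Estimated striped-NVMe throughput over N Thunderbolt channels.
PER_CHANNEL_MBPS = 2750  # approximate per-channel storage ceiling

def striped_estimate(channels, efficiency):
    """Aggregate MB/s across independent TB channels, derated for overhead."""
    return channels * PER_CHANNEL_MBPS * efficiency

# Three channels at ~2/3 efficiency lands near the ~5500 MB/s I measure;
# four channels (four of the Studio's six ports) gets into 7000+ territory.
print(round(striped_estimate(3, 2 / 3)))
print(round(striped_estimate(4, 2 / 3)))
```

The efficiency derate is the interesting part: in my experience the overhead (software RAID, background I/O, thermal throttling) eats roughly a third of the theoretical aggregate, regardless of channel count.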


  • Leaked M1 Ultra Mac Studio benchmarks prove it outclasses top Mac Pro

    darkvader said:
    melgross said:
    Well, I was hoping to be able to test this myself soon, but apparently, even though I ordered a couple of minutes after the store went live, I’ll be waiting sometime into April for the monitor and until late April and possibly early May for the computer.

    I’m using the 16” MacBook Pro with 32 graphics cores and 64GB RAM now, so I’m really interested to find out how close to a doubling in performance I’ll see with this. Double the rendering engines as well. That should prove interesting.

    I'm a bit disappointed in the monitor though. I was hoping for a model that was somewhat more expensive, with mini-LED, and possibly even 6K, as I’m hoping the new higher-end monitor expected to come out will also use mini-LED, an advance over what it has now, and possibly go to 8K, though now I’m reading something about 7K, which is an odd resolution. But I bought this new one anyway. From what I’m reading, it should be somewhat better than the present iMac versions.

    MORE expensive?  $1,600 starting price isn't highway robbery enough for you? 

    No idea why you bought one of these then.  Apple's got you covered, they can rape your wallet very effectively with their other stupidly high priced monitor.  Might as well option it up, with the stand and the VESA mount (why not have both, since you're so filthy rich) it's only $7,197.00

    For the rest of us, $500 is a very high end price for a really good 27" monitor.  My eyes can be perfectly happy with $300 one.
    You just answered a bunch of questions I've had about you for a long time. This explains a lot. You describe us as "filthy rich". That's a relative term, but whatever it means to you, you seem to have a problem with it.

    I'm sorry.

    Your $300 27" monitor is 4K at best, and about 300 nits. That's about half the pixels, and half the brightness. It has no color accuracy. It looks washed out and fuzzy compared to this one.  And if it has a mic and speakers, it doesn't have the 3 mic beam forming array, or the high end speakers. If it has a camera it's HD with 1-2MP, while this has 12MP with processing tech in it that can automatically follow us around the room.

    Sure, maybe you don't need any of that for what you do for a living. It would explain your "filthy rich" comment. But you might consider that maybe us "filthy rich" people can afford these "MORE expensive" products because we do well-paid work that actually benefits from those features. These products make us more productive, which in turn makes us more money, more pleasantly, and so it all pays for itself pretty quickly. After that we're making more money we can keep for providing better for our families, or to build our businesses bigger, provide more jobs, contribute to the economy, or new innovations, etc.

    That's kinda what "pro" (you know... short for "professional") is all about. We're hardly being robbed. Maybe you should try it and you too could be "filthy rich".

    You hate and regularly diss everything Apple, but with arguments that make no sense -- they are either deliberate misinformation or completely clueless. You seem to have this deep resentment for everything Apple and Apple users. (Perhaps you resent how "filthy rich" we apparently are?)

    That explains your comments, but the thing I still don't get is: You clearly hate Apple and Apple users so much, but here you are, regularly, on an Apple forum.  WHY ON EARTH ARE YOU HERE?
  • Leaked M1 Ultra Mac Studio benchmarks prove it outclasses top Mac Pro

    Detnator said:


    That explains your comments, but the thing I still don't get is: You clearly hate Apple and Apple users so much, but here you are, regularly, on an Apple forum.  WHY ON EARTH ARE YOU HERE?
    It's either a troll or somebody extraordinarily undeserving of the effort to ridicule / rebut.

    Can we get an 'ignore' function in this joint? Kthxbai
    Yeah, but the trouble with 'ignore' is when other people reply we still see that but have missed the context.  

    As much as I probably should, I don't want to ignore him. I've noticed the pattern in darkvader's comments over the last couple of years and it actually kinda fascinates me. He pops in with these obscure/obtuse comments and then disappears until he posts the next one on something completely new. He never replies to any kind of reason and he rarely replies to anyone at all. Plus I've asked him the "why are you here?" question before.  Of course he's never answered that.

    Until now I'd concluded he's almost certainly a troll (just poking at us for his own entertainment) or a shill (paid by some competition of Apple's to try to deter people). However, with phrases like "filthy rich" in his comment this time... this one was more personal. So maybe there's some genuine resentment or something going on. As snarky as I acknowledge my tone is, my "you should try it" is genuine.

    Still, these are only theories. I admit it, I wish I knew the truth (hence the question, just in case he'll answer). But I won't lose any sleep over it.
  • What it took for Apple to win Best Picture at the Oscars with 'CODA'

    jospkelly said:
    As expected, accolades for Apple as being the "First Streaming Service to Win the Best Picture Oscar" are rolling in. Instead the focus should be on those that contributed artistically and technically to the film. Apple merely had deep pockets that allowed them to buy the distribution rights to the movie (beating out Amazon). This is not Apple's award.
    In this article, CODA is referred to as "Original Content" multiple times. Technically that is a true statement because CODA is not available to view on other outlets. But the term "original content" is misleading because it implies that Apple developed the material. They did not.
    I enjoyed this movie, and I enjoy other entertainment that Apple offers on its service. But Apple's role in this situation was akin to buying an escort to accompany them to the Prom in order to be voted Prom Royalty.
    I will gladly acknowledge Apple, or any other company, that financially supports a project from its inception, hires a casting director, cinematographer, screenplay writer, etc., and truly nurtures the project from beginning to end. This is not what Apple did regarding CODA.
    Have to agree with this.  It seems strange to think Apple had anything to do with winning these awards. 

    Apple gets some credit for making a bold choice in purchasing decisions. No doubt they chose it because it’s a bit left field, brings to light some of the challenges of a differently abled minority, etc., which are topics close to Apple’s virtual heart.  That’s worth something. Or in the above analogy they get credit for choosing the right escort to hire for the prom. But yeah, I think that’s about as far as it goes. Apple didn’t win any of this in any shape or form. 

    I appreciate that Tim Cook has the decency and grace to praise the cast, crew, etc. and make no mention of Apple in his tweet, and  Zack van Amburg only mentions Apple as who he’s representing in expressing his and their praise for the creatives. At least these guys seem to have their heads on straight about it, more or less. 