dewme

About

Username: dewme
Joined:
Visits: 932
Last Active:
Roles: member
Points: 15,799
Badges: 2
Posts: 6,115
  • Apple shows off the lean, efficient processor lab behind the M3

    eriamjh said:
    I’d love to know what they actually do in this lab.  

    It’s clear they are temperature cycling the CPUs, but how are they stressing them?   Are they running benchmarks?  Are they running “yes”?   

    How are they testing GPUs?  What data are they collecting? 

    These must be the mobos that eventually leak to eBay, etc. and end up in YouTube videos 10 years later.
    Those boards look much larger than what I’d expect to find in an actual product. They are more likely development test fixtures that allow the engineers to gain access to signals, test points, and embedded instrumentation on the SoC itself that are not exposed in a production motherboard. They probably also have additional instrumentation and sensors on the fixture itself to observe interactions external to the SoC and see how it works in conjunction with the other critical system components that will be part of a production motherboard.

    In the past you would see breadboards and test fixtures with all kinds of test instruments, such as signal analyzers, oscilloscopes, counters, and timers, connected to the fixture (board) through a rat’s nest of wires and test clips to bring the test signals out to the instruments. Modern systems built with VLSI are far too small and highly integrated to easily attach conventional test instruments to the systems under test. The designers of the chips and SoCs must work closely with those who will be testing the chips to make sure the chip designs provide ways for the chip to be testable. Testability is built into the design.

    If you imagine a test fixture as a bed of nails where each nail is a test point, the chip, package, or board designer must identify exactly where each test point will be and what signal is associated with it. Chips typically have additional circuitry embedded solely for test purposes.
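    To make the bed-of-nails idea a bit more concrete, here is a purely illustrative sketch in Python. Every test point name, expected voltage, and tolerance below is hypothetical, and a real fixture would read its probes through lab instrumentation rather than a stub function; the point is only to show a designer-supplied map of test points being checked one nail at a time.

    ```python
    # Illustrative only: a toy bed-of-nails check. All test point names,
    # expected voltages, and tolerances are made-up examples.
    from dataclasses import dataclass
    import random

    @dataclass
    class TestPoint:
        name: str              # label the board designer assigned to this "nail"
        expected_volts: float  # nominal signal level at the test point
        tolerance: float       # allowed deviation, in volts

    # Hypothetical map of where each nail lands and what signal it carries.
    TEST_POINTS = [
        TestPoint("TP1_CORE_VDD", 0.85, 0.05),
        TestPoint("TP2_DRAM_VREF", 0.60, 0.03),
        TestPoint("TP3_IO_RAIL", 1.80, 0.10),
    ]

    def measure(tp: TestPoint) -> float:
        """Stand-in for reading the probe; a real fixture would talk to a DMM/DAQ."""
        return tp.expected_volts + random.uniform(-0.08, 0.08)

    def run_fixture(points: list[TestPoint]) -> dict[str, bool]:
        """Probe every test point and record pass/fail against its tolerance."""
        return {tp.name: abs(measure(tp) - tp.expected_volts) <= tp.tolerance
                for tp in points}

    if __name__ == "__main__":
        for name, passed in run_fixture(TEST_POINTS).items():
            print(f"{name}: {'PASS' if passed else 'FAIL'}")
    ```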

    Product development is far more than the engineering that goes into the design of each component and the product as a whole. It must also provide mechanisms to properly test and cost-effectively manufacture each component and the product as a whole, not to mention how the product fits into a larger system or ecosystem. A lot of these essential product development activities go unnoticed by the end consumer of a product. But no product would ever make it into a consumer’s hands without these activities taking place with high degrees of quality and precision.
  • Apple TV+ may bundle with Paramount+ to cut subscriber losses

    Marvin said:
    I think an ad-supported tier would reduce churn more than bundling.  But these are both probably the future of streaming.
    Ads reduce the quality of the experience so much, but if people aren't willing to pay then it would keep them viewing. The 30-second ads in iOS games are tedious, especially when they repeat the same ones.

    Another option could be a credit system that works across streaming providers. People would subscribe to get tokens per month.

    Say someone has a $10/month plan; they get 1,000 tokens. They'd be able to open Netflix without a Netflix account, watch what they have, and it would charge some tokens. If you watch an episode of a new TV show, it may take 50 tokens per episode. An old TV series can take 10 tokens per episode.

    When the tokens run out, people can top them up or jump to a higher subscription tier and tokens can be shared with family members without having to add new viewers to accounts.

    They can be used on YouTube to get ad-free playback, Disney+ when a new movie or show is released, etc.

    This may be a downside for some platforms in that if you don't watch, you don't pay anything, but this is the case with an ad-supported service too. People can also watch ads to accumulate tokens. I'd guess an ad view would be worth around $0.02, so it would take a fair number of ads to build up tokens, and a lot of people would rather pay.

    For young people with no income, they can be on family plans or gifted tokens.
    What you are suggesting makes perfect sense and is entirely logical. Unfortunately, the whole notion of buying “only what you need” runs counter to how pretty much all consumer and business products and services are sold. 

    People pay huge sums for 3-ton vehicles that can carry 7 passengers and exceed 150 mph. But 95% of actual vehicle usage is a single person, the driver, traveling at 60 mph on roads where the maximum legal speed, even on the best maintained routes, is around 70 mph.

    Grocery stores put in 12 checkout lanes, but other than a couple of weeks a year, no more than 3 of them are typically staffed. 

    People routinely buy high-end computers with far more memory, storage, and processing performance than they’ll need for 95% of their computer usage.

    I pay for Netflix and 95% of the content is material I’ll probably never watch. 

    There are certainly fringe cases, but so much of consumerism is about selling people far more than they’ll ever need. Whether it’s streaming media or 6,500 sq ft McMansions with 7 bathrooms for a family of 4, people want access to and ownership of as much stuff as they can acquire, whether or not they actually need it, because it represents the promise and opportunity to choose from a vast array of things that may enrich their lives in some small way.

    As a consumer, once you start doing the math or tracking your actual consumption you may realize that you’re probably not making the most economically beneficial choices. Or, like most consumers, you quit trying to do the math entirely and sign up for another service while you’re pining away for the day your preordered new Cybertruck will be available for pickup. 
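    If you actually do the math on the token idea quoted above, it only takes a few lines. Purely as an illustration: the per-plan, per-episode, and per-ad figures below are the guesses from the quoted post, and the tokens-per-ad rate simply falls out of them.

    ```python
    # Back-of-the-envelope math for the streaming-token idea quoted above.
    # All figures are the illustrative guesses from that post, not real pricing.
    PLAN_PRICE_USD = 10.00
    TOKENS_PER_PLAN = 1000
    NEW_EPISODE_TOKENS = 50
    OLD_EPISODE_TOKENS = 10
    AD_VIEW_VALUE_USD = 0.02

    usd_per_token = PLAN_PRICE_USD / TOKENS_PER_PLAN   # $0.01 per token
    tokens_per_ad = AD_VIEW_VALUE_USD / usd_per_token  # 2 tokens per ad view

    print(f"New episodes per month: {TOKENS_PER_PLAN // NEW_EPISODE_TOKENS}")             # 20
    print(f"Old episodes per month: {TOKENS_PER_PLAN // OLD_EPISODE_TOKENS}")             # 100
    print(f"Ad views to cover one new episode: {NEW_EPISODE_TOKENS / tokens_per_ad:.0f}") # 25
    ```

    Twenty-five ad views to earn a single new episode is exactly the kind of math that pushes most people to stop counting and just pay.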
  • Apple and Goldman Sachs to part ways on Apple Card, no successor named

    mpantone said:
    dewme said:
    What happened with Apple’s relationship with Barclays?  I seem to remember using Barclays’ relationship with Apple to get interest-free financing to purchase a couple of my earlier Macs and possibly iPads. 

    Perhaps partnering with a non-US based bank may make it easier to roll out support for Apple Card more globally. 

    Ultimately, shouldn’t we expect that Apple Card would be supported by multiple partners, similar to Mastercard and Visa?
    You have made a common error for someone not familiar with the basics of the consumer credit card industry.

    Mastercard and Visa aren't credit cards: they are payment networks. American Express is both a credit card company and a payment network.

    Technically, the Apple Card is a Mastercard issued by Goldman Sachs USA, NA. It says so in the fine print of the service agreement that any Apple Cardholder accepted (by clicking "Accept"). In that way, it's similar to a VentureOne Mastercard issued by CapitalOne.

    Barclays Bank would be the issuing bank. It's almost certain that Apple had discussions with Barclays Bank about the Apple Card (before it debuted) before Apple selected Goldman Sachs as a partner.

    Apple would have to partner with banks from other countries to service customers in those areas. But those banks have to have a business presence in the country and follow that country's banking regulations as well. It's not like a random American can apply for a JCB card (a Japanese card) or a UK card.

    To have American cardholders, Apple needs to find a bank located in the USA (which would be subject to US consumer banking laws, not those of the UK, Japan, Nigeria, or wherever).
    Excellent insight and enlightenment. Thank you. You are correct; my exposure to credit card payment systems is solely as an end user. Barclays’ previous relationship with Apple lasted several years and was a benefit to me on a few occasions. That is why I was surprised when they were not part of the Apple Card deal.

    Despite being a newbie when it comes to the credit card industry, I actually use credit cards extensively as a means to avoid carrying excessive cash. I never use debit cards. My go-to card has long been Discover because they have always had exceptional customer service when I’ve had to deal with them. But when I travel outside of the US, Discover has not been well supported, so I have to carry either a Visa or a Mastercard too. Perhaps I’d be better served with an Apple Card, since it is a Mastercard and has some Apple-related benefits. 

    The only other exposure I’ve had to credit cards is company-issued credit cards. Things may have changed over time, but I recall that, unlike Visa/Mastercard, American Express was often a struggle to use outside of the US, at least for some expenses. I’m curious whether the Apple Card is widely used as a company-issued credit card. Speaking of Japan, at least in the late ‘90s, business travel there was a struggle with any credit card, regardless of the issuer or the issuer’s home country. Everything was cash-only, even hotels. Hopefully that’s improved.
  • Apple's flavor of RCS won't support Google's end-to-end encryption extension

    I think Apple and most of the computing world learned everything it needed to know about trying to push proprietary implementations into the open standards space from Microsoft's Internet Explorer (MSIE). There is no long-term value in unleashing semi-open or kind-of-standard implementations onto unsuspecting users, especially if they are masquerading as being standards based.

    If Google wants to make their E2EE part of the RCS standard, they need to relinquish control and ownership of their designs and code to the standards organization. Doing so may result in their proprietary implementation becoming obsolete and non-compliant, much like what happened with MSIE. I know this isn't exactly the same scenario as MSIE vs. W3C, but I think the end result will be the same: the non-compliant, non-standard implementation goes away. RIP.

    This doesn't mean that the relinquished assets do not contribute to the open standard. MSIE did contribute valuable extensions to the W3C standards that were adopted. The adoption of standards does not prevent a vendor from building a totally proprietary implementation as long as they make no assertions about their product's compliance with the standard.

    Claiming to be RCS compliant while adding proprietary extensions is not necessarily evil, but it should be handled in a way that makes it very clear to users that what they are seeing and experiencing is a composite view with both standard and non-standard elements. Apple's current use of the blue text boxes for iMessage content and green boxes for SMS content is exactly the kind of differentiation that should be done. If Apple obscured the truth by making everything look exactly the same, e.g., everyone gets a blue bubble, they'd be deceiving their customers.

    There is a big difference between blue bubbles and green bubbles. It's solely intended to inform and should have nothing to do with shaming. If anything, the blue bubbles in iMessage are where Apple is letting the user know that those messages are using a proprietary technology rather than an open standard. If you're a standards purist you really should be annoyed to see all of those blue bubbles and push Apple to only use green bubble technology. Personally, as a loyal Apple customer who cares deeply about privacy and security, I am willing to forgive Apple for making its blue bubble technology available for my use, even though it's defiling the standards purity of iMessage. I must confess that I've chosen privacy and security over blind adherence to an inadequate standard. Call me a bastard for choosing blue over green, but sometimes you have to look out for yourself.
  • Soon, you'll be able to stream Windows through a Microsoft app on iPad, Mac, and iPhone

    anome said:
    I've had no end of problems getting RDP to work from my Mac to my Windows box, so I'll have to see if this works better. And whether it runs fast enough for games.
    Hmmm. I've been using Microsoft Remote Desktop from iPad and from several different Macs for several years on my private local network. It has worked remarkably well on everything from a late 2012 iMac to a Mac Studio, an M2 MacBook Air, and an iPad Pro. 

    The Windows machine must be running Windows Pro, not Windows Home, and Remote Desktop must be enabled on it. The Windows machine must also be configured to accept username+password login even if you normally log in using a PIN or Windows Hello. The login credentials for the PC are saved in the settings associated with the thumbnail launcher in the RD app. Do not select the "Optimize for Retina displays" option if you are using a Mac client, even with a 5K display on the client. Other than the login credentials and the PC name, which you can get from Finder under Network, I leave everything else at the defaults.

    You will get a certificate warning when it is trying to connect, but you can just hit Continue. One oddity of late is that connecting from an M1/M2 Mac to a Windows machine over RDP takes much longer than it does from an Intel Mac. The Remote Desktop app on the Mac is supposedly running natively on Apple Silicon, so I don't know why it takes significantly longer to connect. On my old iMac it connects nearly instantaneously. Immediately prior to writing this comment I installed Remote Desktop on my new M2 MacBook Air, created a PC launcher, put in my PC login credentials and the target PC name, and it connected (with the certificate warning) on the very first try.
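    If a connection hangs or fails outright, one quick sanity check, offered only as a rough sketch, is to confirm the Windows box is even reachable on the default RDP port (3389). The host name below is a placeholder; use the PC name from Finder under Network or its IP address.

    ```python
    # Minimal reachability check for a Remote Desktop target on the local network.
    # Assumes the default RDP port (3389); the host name is a placeholder.
    import socket

    def rdp_port_open(host: str, port: int = 3389, timeout: float = 3.0) -> bool:
        """Return True if the host accepts a TCP connection on the RDP port."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    if __name__ == "__main__":
        host = "MY-WINDOWS-PC"  # placeholder: your PC's name or IP address
        print(f"{host} RDP port 3389 open: {rdp_port_open(host)}")
    ```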

    Gaming? I've never tried gaming with RDP. 