Marvin

About

Username
Marvin
Joined
Visits
125
Last Active
Roles
moderator
Points
6,816
Badges
2
Posts
15,515
  • System admins irate at Apple's plan for shorter cert lifespans

    Joer293 said:
    Ive worked for multiple cloud providers and security red/blue teams. Rotating certs for some systems is entirely hands off and flawless.  Like Microsoft AD and desktops. MS solved the desktop rotation headache decades ago. But, That is not the case for big businesses at all. They cant use lets encrypt or free tools to rotate because those all violate some other security compliance issue for their regulated work loads. certs for business critical apps are labor intensive, the change to 1 year mark for TLS has led to an industry wide increase of outages related to cert rotations. Sacrificing Availability for almost no gain of confidentiality isnt worth it for business.. Sure, changing 1 system is easy. Synchronization of changing certs on 100,000 servers of 100 different functions isnt easy, and automation would need changes every single rotation, making it not as helpful as executives think. Most internet infrastructure requires 30+ day notice for downtime and have dictated maintenance windows per customer contracts. Rotating certs on 24/7 apps requires downtime. Often its kept to a 60 second cut over, but worst case with banks mainframes, dozens of teams, it can be several hours. There are set maintenance windows too. 45 days really means 30 days + 15 grace period. Just like 13 months is 12 months + 1 month grace period. This will be a nightmare for security. Outside of researchers, nobody is breaking certs besides governments. Hackers have 1,000 ways to break in, certs dont even make it on that list of things to try. Like rekeying your front door monthly, when that effort distracts you from closing windows. 
    For big installations, it can cause a lot of headaches. They usually have teams of people to fix it, but hours of downtime can cost millions.


    It feels like there could be a separation of privacy and trust. Encryption could be done ad hoc at the protocol level and enabled all the time so there's never such a thing as plain HTTP. Then there would be a separate process for trust.


    Encrypt the connection ad-hoc to ensure the communication is always safe and private.
    Have trust certificates to verify the company is who they claim to be.

    This separates the functionality of the service from the trust in the service, so an expired certificate wouldn't matter so much, and it would possibly be easier to deploy.

    There should be a habit of renewing trust certificates well before expiry. If the certificates triggered internal warnings to sysadmins 3 months ahead of expiry, most would renew early and stage the new certificates so they could be swapped in transparently.
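
    A minimal sketch of that kind of early-warning check, using only the Python standard library; the host list and the 90-day threshold are illustrative assumptions:

        import socket
        import ssl
        from datetime import datetime, timezone

        WARN_DAYS = 90                           # "3 months in advance"
        HOSTS = ["internal-app.example.com"]     # hypothetical server inventory

        def days_until_expiry(host, port=443):
            # Fetch the server's TLS certificate and return days until its notAfter date.
            ctx = ssl.create_default_context()
            with socket.create_connection((host, port), timeout=5) as sock:
                with ctx.wrap_socket(sock, server_hostname=host) as tls:
                    cert = tls.getpeercert()
            expires = datetime.fromtimestamp(
                ssl.cert_time_to_seconds(cert["notAfter"]), tz=timezone.utc)
            return (expires - datetime.now(timezone.utc)).days

        for host in HOSTS:
            remaining = days_until_expiry(host)
            if remaining < WARN_DAYS:
                print(f"WARNING: {host} certificate expires in {remaining} days")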

    Switching to shorter validity periods needs the renewal process to be more seamless first.
  • Surgeons say Apple Vision Pro saves them pain and injury

    hogman said:
    All I got from this is, how does a monitor cost $30,000? Especially when all the monitors I see in hospitals display simple graphs and numbers.
    It's similar to reference monitors for movies. Apple compared its XDR displays to $20,000+ reference displays from Sony. These monitors tend to have better calibration and features for their specific use cases.

    https://www.monitors.com/collections/surgical-displays
    https://www.monitors.com/collections/surgical-displays/products/barco-mdsc-8358-k9307938

    https://synergymedco.com/product/sony-lmd-xh550mt-55in-4k-3d-2d-lcd-medical-monitor/ (medical features listed)
    https://www.medicalecart.com/products/sony-lmd-xh550mt-55-inch-4k-3d-2d-lcd-monitor-high-performance-medical-monitor-box-of-01.html

    Low glare, picture-in-picture, 3D image input, color accuracy, designs that are easy to clean for hygiene, etc.
  • Apple cancels California DMV permit for self-driving car testing

    MisterKit said:
    I don't see how a self-driving vehicle could ever interact safely with the idiot drivers already on the roads. A lost cause.
    One goal of self-driving vehicles is to take the idiot drivers off the roads entirely, along with elderly, distracted, and inexperienced drivers, so nobody has to interact with them.

    There was an Apple software engineer killed in a Tesla running Autopilot in 2018:

    https://www.theverge.com/2024/4/8/24124744/tesla-autopilot-lawsuit-settlement-huang-death

    The cost of even a single person dying as a result of a product mistake must weigh heavily on the people making self-driving vehicles.

    It's a worthy cause: transport would be vastly improved, safer, cheaper, and more efficient with self-driving vehicles making the majority of journeys, and they would be very useful for elderly and disabled people. They just need to be implemented exceptionally well; anything less will cost lives, even if proportionally fewer than human drivers cost.
  • Cheaper Apple Vision headset rumored to cost $2000, arriving in 2026

    DAalseth said:
    DAalseth said:
    Dropping EyeSight is more than the screen on the outside. It's the cameras that looked at the wearer's face, and all of the processing overhead to assemble and 'undistort' the eyes into the image on the front. This was all extra cost and processing overhead that did not add to the user's experience. This is a very good first step.
    I wonder if some of those cameras might be needed for the digital avatar feature, but a lot of people would probably be willing to give that up too if it meant a lighter and less expensive device. And some of the hardware will probably also just cost less over time, so they may not need to make too many sacrifices to produce a cheaper model.
    As others have said elsewhere, it may not make sense to go to a “less powerful chip”. Now that the M4 is out, the M2 IS the less powerful chip. 
    If they are happy with M2-level performance, the iPhone chips will reach this level soon and cost less. The A18 Pro is just behind the M2, and the A19 Pro on 2nm in 2026 will get even closer.

    This could save them $150. To hit a $2K price point, they need to get $1,700 in costs down to around $1,000. Cutting the EyeSight feature will save around $100, maybe more.

    The number of AVP units sold is likely below 300,000.
    At a $2,000 price point, they can sell 3M units, which is $6B.
    At a $1,500 price point, they can sell 5M units, which is $7.5B. If they hit $2,000, there will eventually be units available at $1,500.
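
    A rough back-of-the-envelope check of those numbers (all figures are the estimates above, not Apple's):

        # All figures are the rough estimates from the post above.
        current_cost = 1700      # estimated AVP build cost ($)
        target_cost = 1000       # build cost needed for a $2,000 retail price
        chip_saving = 150        # swapping the M2 for an iPhone-class chip
        eyesight_saving = 100    # dropping the EyeSight display and face cameras

        remaining = current_cost - chip_saving - eyesight_saving - target_cost
        print(f"Still to cut from other components: ${remaining}")   # $450

        # Revenue scenarios at the two price points.
        for price, units in [(2000, 3_000_000), (1500, 5_000_000)]:
            print(f"${price:,} x {units:,} units = ${price * units / 1e9:.1f}B")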

    Meta has a VR install base of over 20 million, with an active user base around a third of that.

    Within 2 years of having a more affordable headset, Apple could have the most widely used platform. This would need a focus on comfort and usefulness. Movie content is the most widely appealing use case.

    There's an 8K headset (same as AVP, dual 4K) that was announced recently but may not ship. It's priced at $1,899 and uses a headband for comfort like the PSVR and HoloLens.


    Distributing the weight across the top and sides of the head and away from the eyes and nose in a compact design would make it more widely appealing to wear on a regular basis.

    The headsets in the video weigh a third as much as the AVP; Apple can get to this kind of form factor with the 2nd model.

  • Apple's study proves that LLM-based AI models are flawed because they cannot reason

    LLMs aren’t sentient. They look for patterns in the query and then apply algorithms to those patterns to identify details that then are used to search databases or perform functions. LLMs can’t learn. If the data they search contains errors, they will report wrong answers. Essentially they are speech recognition engines paired with limited data retrieval and language generation capabilities.
    Apart from not being able to learn in real time, this describes what people do too. At any given point in time, without new information, the training available to an AI is similar in nature to what a person has.

    Reasoning skills don't necessarily require real-time learning; that can be handled by another pre-trained model (or code) that reformats queries before the LLM processes them.
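
    A minimal sketch of that pre-processing idea, using hand-written rules purely for illustration (a real pipeline might use a second pre-trained model instead); the distractor clause is in the style of the problems Apple's GSM-NoOp benchmark adds:

        import re

        # Hypothetical distractor patterns; a production system would use a trained
        # classifier or a second model rather than hard-coded regexes.
        DISTRACTOR_PATTERNS = [
            r",?\s*but five of them were a bit smaller than average",
        ]

        def reformat_query(query: str) -> str:
            # Strip clauses that don't change the arithmetic, then hand the cleaned
            # text to whatever LLM does the actual answering.
            for pattern in DISTRACTOR_PATTERNS:
                query = re.sub(pattern, "", query)
            return query.strip()

        q = ("Oliver picks 44 kiwis on Friday and 58 kiwis on Saturday, "
             "but five of them were a bit smaller than average. "
             "How many kiwis does Oliver have?")
        print(reformat_query(q))  # the distractor clause is removed before the LLM sees it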

    The paper suggests moving beyond pattern matching to achieve this, but understanding varied queries is still pattern matching.

    The image generators have the same problem: very small changes in tokens can produce very different outputs. That makes it difficult to use them for artwork that reuses the same designs, like an illustrated book, because the same character looks different on each page.


    This can be improved using a ControlNet, which places constraints on the generation process. Video generators need to be stable from one frame to another, and there's a recent video showing an old video game converted to photoreal video.
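
    As a sketch of that kind of constraint, here is how the open-source diffusers library pairs a ControlNet with Stable Diffusion; the model IDs, prompt, and edge-map file are illustrative:

        import torch
        from diffusers import StableDiffusionControlNetPipeline, ControlNetModel
        from diffusers.utils import load_image

        # A Canny-edge ControlNet: the edge map constrains the layout so repeated
        # generations keep the same character design.
        controlnet = ControlNetModel.from_pretrained(
            "lllyasviel/sd-controlnet-canny", torch_dtype=torch.float16)
        pipe = StableDiffusionControlNetPipeline.from_pretrained(
            "runwayml/stable-diffusion-v1-5", controlnet=controlnet,
            torch_dtype=torch.float16).to("cuda")

        edges = load_image("character_pose_canny.png")  # pre-computed edge map (assumed file)
        image = pipe("an illustrated fox character, storybook style",
                     image=edges, num_inference_steps=30).images[0]
        image.save("page_01.png")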


    When it comes to understanding language queries, people know that a phrase like 'girls college' has a different meaning from 'college girls' because of training on word association, not through any mystical reasoning capability.

    Apple's paper doesn't define what it means by formal reasoning, only stating that it differs from probabilistic pattern-matching. We know that brains are made of connections between neurons, around 100 trillion connections in some kind of structure, and AI is trying to reverse-engineer what that structure is doing.

    Recreating what a brain does requires massive computational power and data, well beyond personal computer performance. Server clusters can get closer, but getting the right models that work well in every scenario is going to take trial and error. Humans have had a 50,000+ year head start.


    Modern AI is doing pretty well for being under 8 years old, certainly more capable than a human 8-year-old.

    The main things an AI lacks versus a human are survival instinct, motivations, and massive real-time data input and processing; the rest can be simulated with patterns and algorithms, and some of the former can be too. Some of the discussions around AI border on religious arguments in assuming there's a limit to how well a machine can simulate a human, but there would be no such assumption if an AI were simulating a more primitive mammal, which humans evolved from.