Marvin
About
- Username: Marvin
- Joined
- Visits: 124
- Last Active
- Roles: moderator
- Points: 6,719
- Badges: 2
- Posts: 15,493
Reactions
Russian YouTubers keep showing off alleged M4 MacBook Pros in historic Apple leak
rjharlan said: I can't help but believe that this is a complete fake. There's no way that two Russian influencers got their hands on two Macs that haven't even been introduced to the market! Even the associates haven't been introduced to them yet. I think it's much more likely that they're searching for eyes on their channels.
https://www.tomshardware.com/laptops/apple-macbook-pro-m4-leakage-gets-serious-with-200-units-reportedly-up-for-sale-on-social-media
Apple sometimes stocks products in store before launch so that they are available for order the same day as the announcement.
If it's real, it's not like this would be a huge leak for them; it's a minor refresh. The M4 chip is already available in the iPad, and most of the rest of the Mac is unchanged versus the M3, other than the 14" model gaining an extra Thunderbolt port like the 16" model.
If it were the Pro/Max models, it would be more interesting.
This may also suggest an event soon. I doubt they'd ship boxes 3 weeks in advance, but this could vary by country. Last year, they announced the 30th October event on the 24th. They usually do events on a Monday or Tuesday, so that leaves the 14th, 15th, 21st, 22nd, 28th and 29th of October. The earliest announcement would be today or tomorrow, and the latest would be the 23rd, within 2 weeks.
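For what it's worth, the Monday/Tuesday arithmetic is easy to enumerate; a minimal sketch, where the "today" date is an assumption purely for illustration:

```python
from datetime import date, timedelta

# List the remaining Mondays/Tuesdays in October 2024 after an assumed "today".
today = date(2024, 10, 8)  # assumption for illustration; adjust to the post date

d = today + timedelta(days=1)
candidates = []
while d.month == 10:
    if d.weekday() in (0, 1):  # 0 = Monday, 1 = Tuesday
        candidates.append(d.strftime("%a %d %b"))
    d += timedelta(days=1)

print(candidates)
# ['Mon 14 Oct', 'Tue 15 Oct', 'Mon 21 Oct', 'Tue 22 Oct', 'Mon 28 Oct', 'Tue 29 Oct']
```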
Commemorating Steve Jobs and his continuing influence on technology
charlesn said: As for Vision Pro being a dud--that's hilarious for a product that has been in the hands of consumers for all of 32 weeks. Please cite fact-based sources for its "failure" that include Apple's actual internal projection for sales and how actual sales have fallen short of that number.

nubus said: There are Apple patents for this product from 2006. The images in the patent look almost exactly like AVP.

The problem is this technology is still at the large-helmet stage, even after nearly 20 years since that patent was filed and the accompanying video was made.
What it really needs is a major technology breakthrough, like how the multi-touch glass interface enabled the iPhone. There's a missing piece for AR, and they are stuck trying to shoehorn currently available technology into making it work.
macOS Sequoia can run on Valve's Steam Deck with hacks
tipoo said: Now if only Apple dropped a bag of cash at Valve to bring that ARM Proton port to macOS natively, so that Steam games "Just Work" on macOS...

That's the dream.
Valve's Proton has a DirectX-to-Vulkan translation layer (DXVK), which would need MoltenVK to convert the Vulkan calls to Metal, as macOS doesn't support Vulkan directly.
CrossOver and Whisky instead integrate Apple's D3DMetal translation layer to run Windows games, much like Proton does on Linux. It supports a lot of games; over 500 have been tested on the following channel:
https://www.youtube.com/@macprotips/videos
It supports higher-end recent games like Horizon Forbidden West.
The only problem with this setup is that you have to install Steam inside the compatibility layer, then install the games inside that, sometimes with patches. It's not very user-friendly (a rough sketch of the flow is below).
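To make that flow concrete, here's a rough sketch of the manual steps using a plain Wine prefix; the bare `wine` binary, the prefix location and the Steam paths are assumptions for illustration, and CrossOver/Whisky automate the same steps while adding D3DMetal on top:

```python
import os
import subprocess
from pathlib import Path

# Rough sketch of the manual "Steam inside a compatibility layer" flow, using a
# plain Wine prefix. CrossOver/Whisky wrap the same idea and add Apple's
# D3DMetal, so treat the `wine` binary and the paths here as assumptions.
env = os.environ.copy()
env["WINEPREFIX"] = str(Path.home() / "wine-steam")  # isolated prefix for Steam

# 1. Install Steam into the prefix (SteamSetup.exe downloaded from Valve
#    beforehand and sitting in the current directory).
subprocess.run(["wine", "SteamSetup.exe"], env=env, check=True)

# 2. Launch Steam from inside the prefix; each game is then installed, and
#    sometimes patched, through Steam's own UI, which is the fiddly part.
subprocess.run(["wine", r"C:\Program Files (x86)\Steam\steam.exe"], env=env, check=True)
```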
If it were possible to use either the App Store or the native Mac Steam client to install a pre-wrapped version of each game that is already set up and tuned, that would be much easier, and a license per install could be paid to the company that provides the wrapper if it's from a 3rd party.
Big Tech-funded TV facing a 'schism' in production styles claims Jon Stewart
entropys said: Writing by committee is no doubt a key factor in the malaise plaguing Hollywood these days. That and risk-averse studios doing remakes, sequels and having series run way beyond their use-by date.

chasm said: Your comments on the problems with risk-averse studios, however, I heartily agree with.

entropys said: I just wholeheartedly disagree with Stewart's view. In fact his whinging that he only had four writers instead of his usual 14 seemed like a "let them eat cake" moment. Each story as a general rule should only have one writer. One. Maybe I would admit a writer and an apprentice. But that's it.

AppleZulu said: So you fire all but one and a half of the writers and demand that they "produce content" on a tighter deadline. You only register the lower quality when viewership falls, because as a widgeteer, you have no idea what's funny or not. Next, you fire the writer and keep the less expensive apprentice. Things get worse. You threaten to replace the apprentice writer with AI, which ironically "produces content" by distilling and regurgitating the output of all those writers rooms that used to exist, but algorithms have no actual sense of what's funny, so that "product" stinks as well, and nobody's watching your more efficiently managed show any more. But sure, you know more than the "whinging" Jon Stewart, who has nurtured and shepherded a writers room that produced a generation of the top comic actors and writers, who have themselves gone on to collectively produce billions of dollars' worth of the best comedic "content" in the last couple of decades. Of course, you'll be oblivious to what you've destroyed and file a report on how you saved money, first by cutting the 14 writers, then by closing down a failing production unit. Your fellow widgeteers will reward you for all the money you saved and give you a huge bonus. Next thing you know, you're a hot commodity hired by Boeing because they need an expert Vice President of Widgeteers to slash costs to meet their mysteriously falling revenue.

entropys said: Writing by committee is the path to mediocrity.

Different types of content have different production pipelines. Jon Stewart is talking about a daily current affairs comedy talk show. Those writers aren't all working on the same material; it's a joke factory where they are looking at current news and trying to come up with something funny to say about each news segment.

A TV show with a storyline needs fewer writers as it has to follow a story arc. Game of Thrones only had a few writers. In an episode's credits, George R.R. Martin is listed as the original writer of the book, with David Benioff and D.B. Weiss adapting it for a TV screenplay. Foundation was written by Isaac Asimov, with the screenplay by David S. Goyer and Josh Friedman and some editors.

There can be a disconnect when production companies who are accustomed to making narrative TV shows try to produce comedy talk shows the same way, and vice versa.
Apple is reportedly not investing in OpenAI
jdw said: As I've mentioned under other articles in the past, my experience with ChatGPT 4o isn't that great. I would like to use it to check multiple online sources quickly, in the hope it can Google faster than I can on my own. And it is fast. But the problem is, it lies a lot. So I always ask it for sources. Then it gives me stupid links that, when clicked on, open nothing. So I then have to ask it for plain text URLs. It complies, but none of them ever work. EVER! They lead to the expected domain, but they always result in a 404 file not found. ALWAYS! I then complain to ChatGPT, saying it needs to read the articles it links for me to ensure the article truly exists and exists at the plain text URL it gives me. It apologizes and seemingly complies, but it continues to give me more bogus URLs. I have repeated that cycle multiple times in a row, until my free session with GPT-4o expires. It never learns from its mistakes. It never gets it right. I've been using it for months, and it hasn't improved at all in that regard. So I mostly find it useless. And this experience remains valid even if some GPT lover comes along and raves about how well it summarizes text. Fine and well, but it still lies and gives bogus URLs to its source info.

I find it gives very good answers for technical questions that have a correct answer that would be difficult to find online, but it does make mistakes.

DuckDuckGo has it integrated now. It offers GPT-4o, Claude 3, Llama 3 and Mistral. Try the other models to see how they compare.

Very subjective questions, like political, social and moral questions, will have subjective answers depending on the training data.

Getting access to high-quality training data is going to be the biggest challenge for AI models. It needs a trust/authority model to weight the answers: medical answers should give more weight to peer-reviewed medical texts than to random Reddit/YouTube commenters.

It's important to remember that the models are not continually trained; they are snapshots. You can ask a model directly when it was trained. GPT-4o answers up to October 2021, so it doesn't know about the past 3 years, and some of its online sources will have expired since it was trained. The new upcoming models have been trained after 2021 with more compute power.

They now have metrics for how the AI compares to human baseline performance, and future models will keep trying to outperform these baselines in different areas. It's easier for AI to excel at deterministic problems; non-deterministic problems need huge amounts of high-quality data.

The processing power they are using on the servers is increasing significantly every year, and the models will improve significantly when they are updated. Some people won't be impressed with AI models until they reach AGI level; there are people projecting this before 2030.
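On the bogus links jdw describes: whichever model is used, it's worth checking that cited URLs actually resolve before trusting them, since models will happily invent plausible-looking links. A minimal sketch of such a check using only the Python standard library (the example URLs are placeholders, not real citations):

```python
import urllib.request
import urllib.error

def url_exists(url: str, timeout: float = 10.0) -> bool:
    """Return True if the URL responds without an error status."""
    req = urllib.request.Request(url, method="HEAD",
                                 headers={"User-Agent": "link-checker/0.1"})
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status < 400
    except urllib.error.HTTPError:
        return False  # e.g. the 404s described above for hallucinated articles
    except urllib.error.URLError:
        return False  # bad domain, DNS failure, or no response

# Example: filter a model's cited sources down to links that actually resolve.
cited = [
    "https://www.tomshardware.com/",         # real site
    "https://example.com/made-up-article",   # placeholder for a hallucinated link
]
for url in cited:
    print(("OK  " if url_exists(url) else "DEAD"), url)
```

Some sites reject HEAD requests, so a fallback to a normal GET may be needed; the point is simply to catch dead links automatically instead of clicking through each one.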