danox
About
- Username
- danox
- Joined
- Visits
- 151
- Last Active
- Roles
- member
- Points
- 5,357
- Badges
- 1
- Posts
- 3,910
Reactions
- Apple again tries to cut third party app fees to avoid EU fines
- Trump Mobile drops false 'made in America' promise
- M5 OLED iPad Pro fall release more certain, screen production underway
blastdoor said: Sounds like it will be the Ultimate Computer. Hopefully it doesn't start drawing more power than expected.
More powerful and more efficient. Apple is getting close to parity with some of the better Nvidia discrete GPU cards without using 300 to 600 watts of power. By the time they get to the M6 or M7, I think Apple will be ahead in just about every metric when it comes to tablet, laptop, and desktop computers. The biggest question is whether they're going to use it for in-house Apple servers. Not bad for a company that "can't innovate."
Apple currently sits in 12th position with the M3 Ultra; the M5 Ultra will probably move into the top 5, and the M6 and M7 will be at or near the top depending on what Nvidia does and its uplift. Nvidia's trajectory has not been as good as what Apple's processor design team has been doing. What will make the M series stand out is the performance combined with the ridiculous power efficiency.
- Courts say AI training on copyrighted material is legal
mfryd said: danox said: mfryd said: It's a complicated topic.
There are good points on both sides of the training question. On one hand, AI programs are being trained based on the hard work of previous human artists. The AI companies are profiting, but the original artists get nothing.
On the other hand, the AI is not doing anything new. It's common for individuals to study the work of others, and use that study to inform their work. When interviewed, great directors often discuss how they have studied the works of great directors to learn their techniques and style. The AI programs are simply really good at this.
My understanding is that an art student can study the works of a current artist, and produce new works in that style. I don't believe an artist's style is protectable by copyright. What an artist can't do is produce work that is essentially a copy of an existing copyrighted work, or that contains copyrighted elements (including copyrighted characters). An artist also has to be careful that work done in someone else's style is not represented as being that artist's work. If I were to write a book in the style of Dr. Seuss, I would need to make it very clear that the book was *not* a work by Dr. Seuss.
Copyright allows control over making copies of a creative work. It does not allow control over works that were "inspired" by a copyrighted piece.
An issue with current AI is that it doesn't understand the limitations of copyright law, and can sometimes produce results that would typically be considered copyright infringement.
It's going to take a while to sort out what rights various parties should have. There is more than one reasonable way to resolve the legal issues. It will be interesting to see how Congress and the courts resolve them.
Disclaimer: I am not an attorney, and this is not legal advice. It is merely my imperfect understanding of some of the issues.
AI can't think, and it can't reason, and because of that it knows no limitations today. One day it will, but that day is decades away. That does not mean you should get to scrape all of the copyrighted material since 1920 at your leisure, yet this protected class gets to do so.
This is a common challenge with new technology. In the past, certain activities were limited by the technology of the time. Therefore, certain activities could not rise to the level where they were a common issue. As technology improves, so do various abilities.
For instance, 50 years ago we didn't really need laws governing the ability of private companies to track people. If they wanted to track someone, they hired a private investigator, and he would follow the person of interest. If you wanted to track 50 people, you would need 50 private investigators. The available technology limited the collection of tracking data. If a company wanted to track someone and sell that information, they could. It just wasn't a common thing.
Today, the three major cellular companies maintain a real-time database of where just about every adult is currently located. They have to: they need to know where you are so that when someone calls you, the signal only needs to go to the cell tower closest to you. That data is extremely valuable. Knowing where you are, and where you have been, makes it possible to make some very good guesses about your likes and dislikes. That makes it possible to target you with ads designed to appeal to your personal preferences, or feed off your personal fears. Once it becomes trivial to track people, we need to think about whether and how to regulate tracking.
In the past, it wasn't possible to read a large percentage of what gets published. It was even less possible to memorize every passage of every book you have ever read. Now that computers are doing this, it's important that we consider whether we need new regulations, and what they should be.
People are not allowed to scrape, if scraping means reading something once or twice or thrice and then writing a thesis or paper at a university; if you later become famous or prominent, see if you'll be allowed to get away with copying/scraping it (remembering it too well) once again. If you have all the knowledge from before 1920, which is in the public domain, shouldn't that be enough? And everything afterwards, from the last 125 years, you pay for.
How difficult is that? And the way the court systems work, if you don't raise a fuss now, you will never get satisfaction. It's similar to trademarks: if you don't keep on top of it, if you don't try to enforce it, the court system says too bad.
Greedy AI companies: all of civilized human history, from the dawn of agriculture around 11,000 B.C. until 1920, is free, and it still isn't enough. The kicker in this is that Apple will be sought out and sued for scraping in the next five years despite this ruling.
- Five years of Apple Silicon: How Apple continues to revolutionize chips
melgross said: danvm said: danox said: danvm said: programmer said: danvm said: And I have worked with some of those workstations, and the performance is not as pathetic as you mention. Some of them have advantages over Apple Silicon, especially when comparing the GPU.
It would be interesting to see a large scale data centre built from ARM-based machines compared to one built from Intel/AMD-based machines, and to compare the operating costs. Some of the big cloud vendors offer lower cost ARM-based hosts for just this reason -- they greatly reduce energy and cooling costs in the data centre. Not Apple's focus though, so we aren't likely to see Apple Silicon based data centres (except perhaps Apple's own, but they are typically very secretive about that).
I also know the benefits of Apple Silicon and ARM in general, especially with power efficiency. But there are cases where specialized applications use CUDA / OptiX, and you are required to use Nvidia hardware. In datacenters it is very different, and even more so with AI. There are even rumors of Apple dealing with Nvidia for their datacenters.
Unlikely report claims Apple is buying 250 Nvidia servers for AI
Large cloud providers also have their own AI processors (Amazon Trainium2, Azure Maia, and Google Axion). Maybe these processors have advantages over ARM and Apple Silicon for AI tasks. My point is that ARM and Apple Silicon are not magic CPUs that will solve all problems. They have many advantages over Intel and AMD in some tasks, but Intel, AMD, and Nvidia have some advantages over ARM / Apple too.
In the end, it's good to have competition working for us.
The competition from Intel, AMD, and Nvidia is commendable in theory, but Apple has several in-house operating systems (ecosystems) that make direct competition with them impractical. I don't believe any of these companies will be working for Apple again. Two out of the three had their chance, and like Samsung's chip division, they only caused trouble for Apple.
Apple Silicon isn't magical, but the absence of an in-house OS prevents these companies from optimizing their hardware for an operating system, putting them behind Apple. This is also why Microsoft is frantically flailing around with Qualcomm, attempting to revive its failing and unprofitable Surface computer line.
And when you talk about the Surface line, you have to consider that Apple is not the only one competing with it. HP, Dell, and Lenovo outsell Microsoft (and even Apple) by a large margin, and those are just the top three among many others. With Windows, customers have choices; not so much with Apple. That's why I think you cannot make a 1:1 comparison of sales numbers between Apple and Microsoft.
BTW, from what I have seen, the Qualcomm Snapdragon X Elite looks very competitive, even with Apple. And there are rumors that Nvidia has something to announce soon.