BlueLightning

About

Username
BlueLightning
Joined
Visits
166
Last Active
Roles
member
Points
303
Badges
0
Posts
163
  • Rumored next-generation Apple Silicon processor expected in fall 2023 at the earliest

    cgWerks said:
    BlueLightning said:
    Remember that TSMC makes the majority of CPUs/GPUs for AMD, Apple and NVIDIA (and many others).  
    Samsung makes most of the remaining leading edge chips.  
    Intel is way behind, and any high performance chips they have are not good candidates for laptops that are GPU intensive.  
    Intel laptops have to be plugged in to achieve high GPU performance, generating large amounts of heat and draining batteries rapidly.  
    True, though this mainly means they'll just use more power while doing it... but at least they can do it.
    Apple seems a bit ahead on the CPU front, but quite a way behind on the GPU end.
    Like many home users, I use more CPU than GPU cycles.  
    There are also hardware video encoders and decoders, as well as the 16 neural/AI cores that are used for video functions.  
    Apple is mostly a consumer electronics company.  
    Compared to my former late-2013 13" 2-core i5 MBP (with no dedicated GPU cores), the 2023 14" M2 Pro MBP (10/16/16 cores) kicks ass.  
    4K video playback pushed the fans to full speed on the 2013 i5 (and the video lagged).  
    The fans don't even spin up on the 2023 M2 Pro (and I now see short parts of video that the Intel chip had skipped).  
  • Rumored next-generation Apple Silicon processor expected in fall 2023 at the earliest

    cgWerks said:
    Bummer, I'll be waiting for the M3 minimally. But, my Intel mini w/eGPU is still doing well, so should make it just fine.

    Unfortunately, my son needs to buy something in the next few months, so we're probably looking at M1 Max or M2 Pro and trying to figure out which has the edge for what he does (and will want to do).

    It seems more like it is the M1, M1.25 (called M2), and M2 (called M3) in reality. Hopefully that M3 will put Apple a bit more back where many of us were thinking/hoping when this Apple Silicon stuff started. We're now solidly back to playing catch-up with the PC market, at least in terms of GPUs.

    We all got bitten by Apple's early planning/design, I think. I was amazed at what Apple accomplished when everyone was deep in the pandemic, but now, 3 years later, we're feeling the pandemic's impact on Apple. It was just delayed a lot more than at other companies.

    What I really hope we'll start hearing is more about the technical differences of the M3, instead of just more cores and energy efficiency (again, mostly on the GPU front). We're now years into the transition, and we still don't really know what Apple's plan is for pro users in GPU-centric disciplines.

    macxpress said:
    I wanna say that the MacBook Air is Apple's top selling laptop?
    That makes it even a bit more puzzling, as my understanding is that a huge problem here is going to be supply/yields. It seems like they might just be trying to follow the release schedule they've established (i.e., the new chip starts in the low-end models and then moves to more advanced systems). Otherwise, they'd maybe be better off introducing the M3 in the Pro and Studio.

    AniMill said:
    Either the Mac Pro is dead, or so delayed that it’s become superfluous in their product lineup. Perhaps it’s become such a niche that any further investment simply is not a viability for Apple anymore. Either way, if Apple does not at least preview a Mac Pro option at WWDC, I think it’s dead Jim. Though $3000 ski goggles are considered the “next” thing - but I believe AR/VR is already past the public interest inflection point. WWDC is going to be a very interesting show.
    Yeah, I'll certainly check it out, but kind of sounds like another yawner on the way. I have near zero interest in the VR stuff at this point. Maybe a bit in AR, but more professionally (vertical markets) than any use for myself personally.

    I'm a bit torn on the Mac Pro as well. Unless they give in on expandability, the Studio seems like the new Mac Pro. What would be the point of a huge case if it can't be expanded? (And by give in, I mean add AMD GPUs or something like that back to the platform.)

    I'm hoping they are just way behind - the M3 will be impressive - and they would just be too embarrassed to release a Mac Pro right now with M1/M2 tech in it.

    9secondkox2 said:
    A delay means that Apple is ensuring the Mac Pro is the butt-kicking, name-taking monster it’s supposed to be. 
    That's the hard thing for me to grasp. It would seem (at least from what we know), unless they add AMD back in, it's going to be adequate at best. Hopefully we're pleasantly surprised. Currently, at least for GPU, a top of the line Mac kind of equals a mid-level gaming PC... and then only somewhat (better at one task, worse at others).

    Note: this is on the pro side, though. On the consumer side, Apple is certainly kicking butt.

    YasminG said:
    I’m sick and tired of waiting for a new iMac. They should have at least offered an M2 version
    I think they just didn't want to do an iMac M1.25.
  • How Steve Jobs saved Apple with the iMac 27 years ago

    "the iMac screen is one of the best 15-inch displays available"!???
    That screen was absolutely horrible on the original iMac, probably about the worst you could get.
    Thankfully, nowadays it's the other way around. I've had a 27" iMac since the day the 5K version was available, and 4 years later there still isn't anything better on the market (at that price point). 
    Most of the places I know about would not spend extra money on color displays until after the year 2000, the same way my parents and grandparents thought color TV was a frivolous expense.  Only CAD/CAM users and folks who did graphic design and layout had color screens, and the resolution on even those was fairly awful.  I saw a Sun workstation give off a big puff of white smoke at power-on in the late 1990s (I believe it was a $40,000 workstation, probably around 20-24" or so, for CAD/CAM).  Luckily, it was under a maintenance contract, as was a Silicon Graphics workstation we needed for a different contract with a different large vendor.  

    One short-term engineer had a black-and-white Mac.  The Treasurer had a monochrome (green?) Apple II before replacing it with a PC (low-resolution color).  

    I didn't see a flat screen at any workplace until around 2005, and they weren't in common use until a few years later.  CRT (vacuum tube) displays were still used for graphic design and layout for a few more years, due to the relatively low quality and high price of flat screens.  
  • How Steve Jobs saved Apple with the iMac 27 years ago

    "I wonder why no tech journalist has really explored that Apple Microsoft deal?"

    As I remember, there were several related cases.  One was Apple vs. Microsoft and HP.  
    The GUI was largely developed around 1973 at Xerox (the Alto), which was never widely credited in the press.  
    https://en.wikipedia.org/wiki/History_of_the_graphical_user_interface  

    Around the same time, DOJ was going after IBM for antitrust in PCs, minicomputers, mainframes and software.  
    Microsoft had trouble with the DOJ, UK and EU somewhat later.  
    NCR had similar problems with their business machines in the 1920s.  
    More recently, Amazon, Google, Apple and Microsoft (again) seem to be targeted by DOJ, UK and EU.  

    Found this on the settlement between Apple and Microsoft:  
    An interesting point to consider is that Apple's market capitalization has been larger than Microsoft's for an extended period.  
    https://www.cnbc.com/2017/08/29/steve-jobs-and-bill-gates-what-happened-when-microsoft-saved-apple.html
  • How Steve Jobs saved Apple with the iMac 27 years ago

    patdiddy said:
    Steve Jobs was a true genius. The only problem with the iMac’s early debut was that nobody had really dealt with Ethernet or WiFi at the time, in fact, the internet was just really starting to become a thing. 

    Had it debuted with the iPod, Steve Jobs might have had the best way to get digital music, and saved apple completely.
    Ethernet and the internet go back a long time before there was a GUI.  I still remember when the internet was not supposed to be used for commercial/for-profit purposes.  It was initially used mostly by universities and the military-industrial complex, mostly on minicomputers and mainframes.  In 1969, the first four nodes on the ARPANET were an SDS Sigma 7 (UCLA), an SDS 940 (SRI), an IBM 360/75 (UC Santa Barbara) and a DEC PDP-10 (Univ. of Utah).  Ethernet was developed at Xerox in 1973-1974.  A NeXT computer was the first web server (at CERN in 1990).  OS X and macOS are derived from the NeXTSTEP operating system, as are most of Apple's other current OSes.  

    http://www.usna1959.com/m59/classWeb1stFour.php#:~:text=The%20sketch%20of%20ARPANET%27s%20first,and%20the%20University%20of%20Utah.
    https://en.wikipedia.org/wiki/Ethernet  
    https://en.wikipedia.org/wiki/Internet  
    https://en.wikipedia.org/wiki/CERN_httpd  
    https://en.wikipedia.org/wiki/SDS_Sigma_series  
    https://en.wikipedia.org/wiki/IBM_System/360  
    https://en.wikipedia.org/wiki/PDP-10  
    https://www.youtube.com/watch?v=SY3Y8gb3jYA  

    The advantage of using external floppy and CD-ROM drives is easier replacement.  In a business setting, it is much easier to plug in a replacement unit and send the defective unit off to be repaired or replaced, ready for the next failed drive.  
