prismatics

About

Username: prismatics
Joined:
Visits: 38
Last Active:
Roles: member
Points: 256
Badges: 1
Posts: 59
  • WPA3 will improve your Wi-Fi security, if your router supports it

    This is probably why Apple is discontinuing its current hardware: it would be a waste of resources to keep manufacturing the AirPorts when a new hardware standard is about to come out (?)
    WPA3 can be implemented easily on today's WPA2-compatible systems, since it relies on the same hardware-accelerated cryptography under the hood.
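
    A minimal sketch of the shared-hardware point, assuming the third-party Python package 'cryptography' and made-up key/frame values: WPA3-Personal swaps the handshake (SAE) but keeps the AES-CCMP data-path cipher that WPA2 silicon already accelerates.

    ```python
    # WPA2 and WPA3-Personal both protect frames with AES-CCM, so existing
    # hardware AES engines keep doing the heavy lifting. (Real CCMP uses an
    # 8-byte MIC; the default 16-byte tag here is close enough for a demo.)
    from cryptography.hazmat.primitives.ciphers.aead import AESCCM
    import os

    key = AESCCM.generate_key(bit_length=128)  # stands in for the derived temporal key
    nonce = os.urandom(13)                     # CCMP nonces are 13 bytes
    header = b"frame header"                   # authenticated, not encrypted
    payload = b"frame body"

    ccm = AESCCM(key)
    ciphertext = ccm.encrypt(nonce, payload, header)
    assert ccm.decrypt(nonce, ciphertext, header) == payload
    ```

    Only the key establishment changes (SAE replaces the pre-shared-key derivation), which is why vendors can ship WPA3 as a firmware or driver update.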
  • Two of four Thunderbolt 3 ports in new 13" MacBook Pro with Touch Bar have reduced speeds

    The comment section shows that the author of the article has not made the cause of the issue clear enough.

    Today's processors use PCIe to connect everything from storage controllers to high-speed I/O like USB and Thunderbolt, plus SD card readers and graphics cards (or the discrete GPU, in the MacBook's case). Each processor model exposes only a fixed number of these lanes. Since Apple has to rely on Intel here, it can't simply change this behavior by adding a second controller; there is no dual-core i7 chip with enough lanes. That's why Apple has chosen to use the PCH-supplied PCIe lanes.

    The PCH is an additional chip (integrated directly into the processor package on some models) that connects to the processor over a separate high-speed bus. It provides USB, keyboard support, audio, storage interfaces, and additional (roughly half-speed) PCIe lanes in a single chip. My take is that Apple is using these lanes because the processor itself does not supply enough PCIe lanes.

    So there is really nothing to add from a non-technical viewpoint: it's not about power consumption or marketing decisions, it's simply that Intel does not make the chips Apple would need for four full-speed Thunderbolt 3 ports.
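
    A back-of-the-envelope lane budget makes the squeeze visible; the controller counts, lane widths, and PCH budget below are illustrative assumptions, not Apple's actual routing:

    ```python
    # Rough PCIe lane budget for a dual-core mobile package (illustrative).
    # Each Alpine Ridge Thunderbolt 3 controller wants a PCIe 3.0 x4 uplink
    # to run its pair of ports at full speed.
    TB3_CONTROLLERS = 2        # one controller per pair of ports -> 4 ports
    LANES_PER_CONTROLLER = 4   # x4 for full 40 Gb/s operation
    NVME_SSD_LANES = 4         # the internal SSD wants x4 too
    MISC_IO_LANES = 2          # Wi-Fi, card reader, etc. (assumption)

    needed = TB3_CONTROLLERS * LANES_PER_CONTROLLER + NVME_SSD_LANES + MISC_IO_LANES
    available = 12             # typical low-power PCH budget (assumption)

    print(f"lanes needed: {needed}, lanes available: {available}")
    if needed > available:
        # Something has to give: one controller drops to x2, halving the
        # PCIe bandwidth behind two of the four ports.
        print("second TB3 controller falls back to x2")
    ```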
  • Apple plans to launch 5G iPhone in 2020, report says

    sergioz said:
    I don't think 5G will be super popular with cellphones in the beginning. As we develop new experiences and tech, demand will rise, but right now I don't see why you would need access to a 1-gigabit connection in your pocket? Plus, as tricky as 5G is as a technology, even when it becomes widely available it'll be like the icing on the cake to have it.
    The thing is, and this is what makes it matter: the gigabit is not for each individual device. A few gigabits (e.g. 4 or 8) are shared among many, many subscribers. As data plans grow to match rising media consumption (basic stuff like browsing, watching videos, or streaming music), providers are noticing that the higher data allowances translate into ever more traffic on the existing mobile infrastructure.

    The headroom that 4G technologies added over 3G is exhausted; mobile broadband networks now face increasing cell congestion where there was plenty of capacity three or four years ago.

    Continuing with existing cell topologies is no longer feasible. In big cells, the ratio of available frequency spectrum (and therefore of achievable data rate) to subscriber count is deteriorating to the point where services can no longer be maintained in a useful way, and we are approaching the barrier that defines the maximum information density a traditional 4G cell can achieve with its given parameters.

    5G takes a different approach. Rather than creating big cells in which many clients share the same frequency spectrum, as 4G and older technologies did, the concept of 5G is to create much, much smaller cells with a far shorter range.

    The obvious disadvantage is that more cells are necessary, and coordinating such a large number of cells is much harder. Clients need to switch cells far more often and must maintain connections to multiple nearby cells at once, which brings its own challenges for energy consumption and for the real-time requirements of the 5G backbone, since all cells must cooperate.

    But, as you might have gathered from the explanation, the advantage of this approach is that each cell can use its frequency spectrum far more effectively across its much reduced range, since fewer subscribers are active per cell. That yields much lower latency, much higher data rates, and better quality of service for every customer.
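
    A toy calculation shows the effect; every number below is invented, and the only modeling assumption is that a cell's air-interface capacity is shared among its active subscribers:

    ```python
    # Per-subscriber throughput when a cell's capacity is shared.
    def per_user_mbps(cell_capacity_gbps: float, active_users: int) -> float:
        return cell_capacity_gbps * 1000 / active_users

    # One 4G macro cell covering a dense area:
    print(per_user_mbps(1.0, 400))        # -> 2.5 Mb/s per subscriber

    # The same area split into 20 small 5G cells, spectrum reused per cell:
    print(per_user_mbps(1.0, 400 // 20))  # -> 50.0 Mb/s per subscriber
    ```

    The headline gain comes from reusing the spectrum across many small cells, not from extra spectrum.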

    The thing is, and many will be upset about this: the core benefits of 5G only apply to use cases (or environments) where high cell congestion is an issue (e.g. airports, universities, cities, indoor areas). Outside densely populated areas, people will not see much change.

    I hope I could address your concerns accurately.
  • 'A11 Fusion' in iPhone X appears to be a six core processor, according to iOS 11 leak [u]

    I am more than certain that it's not four cores, but four threads belonging to two physical cores, which are bundled with a Zephyr core each. SMT is low-hanging fruit, and it was just a question of time before Apple picked it.
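
    For reference, the logical-vs-physical split SMT creates is easy to see on a desktop OS; a quick sketch using the third-party psutil package:

    ```python
    # With SMT enabled, the OS reports twice as many logical CPUs as
    # physical cores.
    import psutil

    print("physical cores:  ", psutil.cpu_count(logical=False))
    print("hardware threads:", psutil.cpu_count(logical=True))
    # A "6-core" reading in a leak could therefore be 2 big cores x 2
    # threads plus small cores -- exactly the ambiguity discussed above.
    ```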
  • Editorial: Apple is making us wait for a new iMac for no good reason

    I believe Apple might consider switching to AMD processors (e.g. Zen 2), or maybe an unannounced APU with integrated Vega graphics; that would sound interesting from an Apple perspective. Getting the operating system scheduler in shape takes time, since the desktop Ryzen chips (or Ryzen Pro if you want, they are the same) effectively behave like two NUMA nodes on a single die. Other than that, I believe there is no reason to keep us waiting.
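
    To illustrate the scheduler concern: threads that share data want to stay inside one CCX (core complex) to avoid the slow cross-fabric hop. A minimal sketch, assuming Linux and a hypothetical 8-core part whose first four logical CPUs sit on one CCX:

    ```python
    # Pin the current process to one CCX so its threads share an L3 slice
    # and never pay the cross-CCX latency. The CPU numbering is assumed.
    import os

    CCX0 = {0, 1, 2, 3}            # logical CPUs on the first CCX (hypothetical)
    os.sched_setaffinity(0, CCX0)  # 0 = the calling process
    print(os.sched_getaffinity(0))
    ```

    A NUMA-aware scheduler does this automatically; tuning it for all of macOS's workloads is the part that takes time.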
  • The Nest Secure has a hidden microphone, and Google didn't tell owners for 18 months

    Soli said:
    avon b7 said:
    Soli said:
    avon b7 said:
    Soli said:
    There's absolutely no evidence that this was a working microphone that was eavesdropping on anyone and let's be clear that Alphabet is the one that announced the update that enabled the microphone, not a blogger that discovered nefarious activity. For those looking for a conspiracy you'll have to look harder. This is no different from countless other tech companies that don't disclose inactive HW for a variety of reasons.
    I think in this particular case more could have been done to correct the error beforehand.

    A team of people was involved in designing, testing, and producing the hardware. It is reasonable to think that some of these people used the finished product or gave it to friends and family. It is unreasonable to assume that none of them noticed that a key, consumer-facing element (even if inactive) was missing from the spec list and the product documentation.

    Also, this feature will have been in internal testing for a while before getting the go-ahead to go live, which would have provided more opportunities to catch the slip-up.

    I'm with you in that I don't see anything nefarious, but it should have been caught and clarified earlier, IMO.
    Apple infamously released a Mac with hidden 802.11n Wi-Fi and only announced it after the driver was ready for launch… and then charged a fee to enable it, which pissed people off even though they had purchased the machine with nary a mention of that being a promised feature.

    As I stated, this isn't uncommon and if you don't trust Google then Nest Secure was never an option for you anyway.

    How many products do we have on our person and in our homes with microphones? From security cameras to personal digital assistants to PCs to phones to my Apple Watch, I can think of at least 8 off the top of my head. And while I trust Apple not to spy on me, the bigger risk will always be someone exploiting a bug, as we recently saw with Group FaceTime.

    If I were running a company as valuable as Alphabet and I wanted to spy on people, I wouldn't do it with an undisclosed, active microphone that could be found. I'd blatantly disclose the microphone (as all our CE already does) and then build in backdoor "bugs" that people in the know could exploit, so the company has a level of deniability. We accept bugs in SW, and we accept that companies say "oopsie" and then close these holes once discovered.
    Wi-Fi isn't comparable to this. Those machines already had Wi-Fi in them. All the update did was unlock support for 802.11n.
    You’re now claiming that 802.11n over 802.11g is just better code? Is this so you can later claim that Apple was being petty for a mere “software update”? 🤦‍♂️
    Please note that the original introduction of 802.11n to the MacBook line at the date of ratification could be a software update precisely because the Wi-Fi cards in the machines sold just beforehand already implemented the 802.11n standard in its 'draft' form.
  • Apple has revoked Facebook's enterprise developer certificates after sideload violations [...

    If I were Mark Zuckerberg, I would immediately block all iOS users from using Facebook, even via Safari. Let's see which company goes bankrupt first.
    Dear, Facebook makes most of its money from iPhone users. Please re-evaluate your thought.
  • Apple hires lead ARM CPU architect Mike Filippo

    mjtomlin said:
    lkrupp said:
    My 27” iMac 14,2 (late 2013) is getting long in the tooth but I will not wait until the latter half of 2020 to find out if Macs are moving to ARM. The 2019 iMac may be my last Intel Mac but I don’t really care.
    Really? My 27" iMac (11,1 late 2009) is still chugging along just fine. Although it is stuck on High Sierra - the first time I've ever owned a Mac that doesn't have the latest OS.

    Great hire! This guy has some big chops: AMD, Intel, and ARM! Just watch what's coming!

    Possible in-house x64-based CPUs?
    No. The Intel-AMD x86 platform patent license agreements show perfectly how hard that is. The only other company allowed to make x86 products is VIA, but they're centered on embedded products.
  • Intel's first 'Ice Lake' 10-nanometer processors aimed at notebooks are shipping soon

    wizard69 said:
    Don't expect 10 nm to come anytime soon for devices more complex than dual-core ultra-low-power, maybe quad-core, parts. The defect density of the 10 nm process will never reach levels where it is viable as a replacement for 14 nm.

    Of course Intel will keep saying that their process is progressing and healthy, but that makes absolutely no sense from an economic point of view.

    Nobody who digs deeper into the matter believes what Intel is saying.

    10 nm will never make Intel any money. It would have, however, in a reality where AMD no longer existed.
    TSMC is doing fine with their quasi-equivalent processes. Even if Intel's labs are too dirty for high yields, they can always go to the bleeding edge with AMD's chiplet approach.
    The difference between Intel and TSMC is that Intel uses self-aligned quad patterning with cobalt interconnects, while TSMC gets by with double patterning (on their 7 nm ArF node). The ugly thing about quad patterning is that it more than doubles the steps required to fully process a wafer, increasing both the machine time per wafer and the likelihood that defects develop while the wafer is exposed to each processing step.

    It's not that Intel's fabs are dirty; it's that there is no way Intel can ramp 10 nm to _mass_ production (e.g. high-performance desktop, notebook, and Xeon parts) before 7 nm arrives. TSMC's upcoming N7 Pro will bring EUV, eliminating multiple-patterning steps and cutting the total required step count by more than half, which increases yield.
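
    A toy yield model shows why the step count is so punishing; the defect rate and die size below are invented, and only the shape of the math is the point:

    ```python
    # Poisson yield model: Y = exp(-D0 * A), where the cumulative defect
    # density D0 grows with every extra patterning/etch pass.
    import math

    def estimated_yield(defects_per_step_cm2: float, steps: int, die_cm2: float) -> float:
        d0 = defects_per_step_cm2 * steps  # cumulative defect density
        return math.exp(-d0 * die_cm2)

    DIE = 1.2  # cm^2, roughly a large mobile die (assumption)
    print(f"double patterning: {estimated_yield(0.02, 30, DIE):.0%}")  # ~49%
    print(f"quad patterning:   {estimated_yield(0.02, 60, DIE):.0%}")  # ~24%
    ```

    Doubling the critical step count squares the yield factor rather than merely halving it, which is exactly why EUV's shorter flow matters.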

    By the way, ASML is the only company left that can build the tools for the upcoming processes, and it's the same everywhere else in the chain: costs keep rising. If Intel stumbles too much in the next few years, they will lose much of their manufacturing capability, as new nodes are rising exponentially in cost.

    Meanwhile, Intel continues to believe, or rather must communicate to investors, that the self-aligned quad patterning 10 nm process is the way forward, which is factually incorrect. Intel starts EUV at 7 nm, by the way, and I expect them to be back in the game once they can take that process beyond risk production.