Apple working to adopt 802.11ac 5G Gigabit WiFi this year

Comments

  • jfanning Posts: 3,334 member
    Quote:
    Originally Posted by Andysol View Post


    You're wrong. Again.



    Really? You are saying those charts exactly match my eyesight, my room conditions, and those of my family, etc? Is that a fact? No, they are based on a particular eyesight measure.



    Quote:
    Originally Posted by Andysol View Post


    How the hell do you think Steve came up with "retina display"? Or, because Steve said it, he doesn't have to have any evidence?



    Hate to tell you this, but Steve didn't come up with "retina display"; it is a measure based on a particular level of eyesight from a certain distance, etc. Apple makes a phone with a certain dpi which meets that measure.



    Also, Steve used to stretch the truth quite a lot in those speeches; don't take them all at face value.



    Quote:
    Originally Posted by Andysol View Post


    That's like me getting on here touting that retina display is all a marketing scheme and no one can tell the difference because I can't, and that anyone who disagrees with me is just making up numbers without anything factual.



    Based on what you are claiming, Retina display is a marketing scheme.
  • andysol Posts: 2,506 member
    I don't know why I am responding to you because you are obviously clueless. But here goes nothing.



    Quote:
    Originally Posted by jfanning View Post


    Really? You are saying those charts exactly match my eye sight, and my room conditions, and that of my family etc? Is that a fact? No, they are based on a particular eye sight measure.



    Nope. I didn't say that. Those charts replicate 20/20 vision. If you have bifocals and are cross-eyed, I can't help you. I won't even get into the fact that 720p/1080p has nothing to do with room conditions. What I said was that you were wrong when you said the following:



    Quote:
    Originally Posted by jfanning View Post


    Wow, it didn't take someone long to bring out that old, tired chart; there is nothing to back up those figures other than that they are the author's ideas, that's all.



    You're the one who said those numbers are nothing more than the author's ideas. And that's wrong. They're proven facts, based on the quote and link I provided that you just conveniently ignored.



    Quote:
    Originally Posted by jfanning View Post


    Hate to tell you this, but Steve didn't come up with "retina display"; it is a measure based on a particular level of eyesight from a certain distance, etc. Apple makes a phone with a certain dpi which meets that measure.



    Hello? Do you think you're educating me? That's exactly what the charts were showing, the ones you said mean nothing to anyone!





    Quote:
    Originally Posted by jfanning View Post


    Based on what you are claiming, Retina display is a marketing scheme.



    Seriously? You must be very, very foreign, because what you just quoted was me saying that that is what YOU claim: that your interpretation of the charts is that they're nothing more than a marketing scheme. The same charts you said mean nothing. Although they're also the same charts that are used to determine retina display based on distance and 20/20 vision! You're unbelievable. Just say "I'm wrong" and be done with it. You're wrong. Period.
  • mactac Posts: 315 member
    Quote:
    Originally Posted by city View Post


    Verizon runs an ugly semi rigid black pipe (trees help) from pole to pole, then a fiber optic "wire" to a plastic box attached to the house.



    That black pipe to enclose the fiber is probably cheaper than armored fiber. Squirrels love to chew on fiber. We had to replace a long run of fiber with armored fiber for the county because of the little rodents.
  • jfanning Posts: 3,334 member
    Quote:
    Originally Posted by Andysol View Post


    I don't know why I am responding to you because you are obviously clueless. But here goes nothing.



    A personal insult is a great start; it indicates you don't have an argument.



    Quote:
    Originally Posted by Andysol View Post


    Nope. I didn't say that. Those charts replicate 20/20 vision. If you have bifocals and are cross-eyed, I can't help you. I won't even get into the fact that 720p/1080p has nothing to do with room conditions. What I said was that you were wrong when you said the following:



    I don't have bifocals, and I'm not cross-eyed, but if all those charts are for 20/20 you're missing the large number of people that don't have that particular level of eyesight.



    There are a number of environmental conditions that will change how you see an image; light is one of those, hence room conditions will affect things.



    Quote:
    Originally Posted by Andysol View Post


    You're the one who said those numbers are nothing more than the author's ideas. And that's wrong. They're proven facts, based on the quote and link I provided that you just conveniently ignored.



    I haven't ignored anything; they are based on one particular level of eyesight in certain conditions, a fact you constantly want to ignore.



    Quote:
    Originally Posted by Andysol View Post


    Hello? Do you think you're educating me? That's exactly what the charts were showing, the ones you said mean nothing to anyone!



    What?



    Quote:
    Originally Posted by Andysol View Post


    Seriously? You must be very, very foreign, because what you just quoted was me saying that that is what YOU claim: that your interpretation of the charts is that they're nothing more than a marketing scheme. The same charts you said mean nothing. Although they're also the same charts that are used to determine retina display based on distance and 20/20 vision! You're unbelievable. Just say "I'm wrong" and be done with it. You're wrong. Period.



    Very foreign? What the hell is that meant to mean? Are you trying to indicate that if someone is a different nationality from you, they are wrong? Hmm, great attitude you have there.



    You have your theory, based on certain circumstances, and you are more than welcome to believe it; I am not doubting those results based on the inputs listed.



    But to try to claim what I (or anyone in general) can see, based on a chart you downloaded, without knowing anything about me or any other circumstances, is pure ignorance.
  • hdboy Posts: 7 member
    Quote:
    Originally Posted by sflocal View Post


    What's the rush to wirelessly connect your phone at gigabit speeds to an access point, to an Internet connection that is probably 5-10Mb/s real-world, unless you're at a Starbucks with a bunch of other people, which will drag it down... What do you plan on doing on that tiny phone that necessitates gigabit bandwidth?



    This is all about internal home networking bottlenecks. Wireless gaming, 1080p video and larger photo files displayed using AirPlay, all from iPhones, make this a welcome upgrade.



    Even now, I regularly move 50-200MB photo files between machines on our gigabit ethernet home network (and also via WiFi). I network DirecTV DVRs to play HD movies between rooms and stream DirecTV HD video to iPads (though I expect DirecTV will be slower to deploy this -- they still need to add gigabit Ethernet). I also stream local HDTV channels to MacBook Pros using an Elgato EyeTV ethernet network tuner. All this sucks up lots of bandwidth.



    It also will ensure the success of wireless Apple (iOS) gaming (which is coming) to challenge consoles like Xbox and PlayStation with a new AppleTV, because this will help eliminate the ever-so-slight lag that is evident with current WiFi technology when using iPhones as game pads. Of course, we're hoping this technology will migrate down to iPhones, iPods and iPads too, but the antenna challenges may throttle that expectation.
  • jahonen Posts: 364 member
    Quote:
    Originally Posted by jfanning View Post


    But to try to claim what I (or anyone in general) can see, based on a chart you downloaded, without knowing anything about me or any other circumstances, is pure ignorance.



    I'm not going to claim anything about your eyesight, but I just had to ask: are you saying that your vision is better than the assumed "perfect" 20/20 vision that the conclusions in the articles are based on (i.e., biophysics and scientific measurements)? My understanding was that they test the human eye's best-case resolution and draw the conclusions from that.



    Regs, Jarkko
  • andysol Posts: 2,506 member
    My final response. Maybe this time you will see what I quote in bold again.



    Quote:
    Originally Posted by jfanning View Post


    Wow, it didn't take someone long to bring out that old, tired chart; there is nothing to back up those figures other than that they are the author's ideas, that's all.



    Except 20/20 vision. This is what I'm saying you're wrong about: that the charts have, as YOU put it, "nothing to back them up and there is nothing to them". Not "they don't match my eyesight".

    Not "those aren't what my eyesight is". No... You said "there is NOTHING to back up those figures". And you are wrong wrong wrong wrong wrong wrong wrong.



    Are we on the same page yet?



    Quote:
    Originally Posted by jfanning View Post


    I don't have bifocals, and I'm not cross-eyed, but if all those charts are for 20/20 you're missing the large number of people that don't have that particular level of eyesight.



    I haven't ignored anything; they are based on one particular level of eyesight in certain conditions, a fact you constantly want to ignore.



    That's like saying we take the crash-impact ratings of car seats and throw them out: "I never go 70mph, so those crash ratings mean nothing. In fact, there is nothing to back up those figures other than that they are the car-seat rater's ideas." See how dumb that sounds?



    You have to have a standard to measure by. That standard is 20/20. If you need glasses and don't wear them, that's no one's fault but your own.





    Quote:
    Originally Posted by jfanning View Post


    There are a number of environmental conditions that will change how you see an image, light is one of those, hence room conditions will affect things.



    And lighting doesn't impact resolution, which is the only thing we are arguing about. There's no debate that if the sun is on the screen it will affect the picture, but in terms of contrast, NOT resolution. Whether you can distinguish 720p from 1080p is a question of distance only. Period, end of argument.



    Quote:
    Originally Posted by jfanning View Post


    Very foreign? What the hell is that meant to mean? Are you trying to indicate that if someone is a different nationality from you, they are wrong? Hmm, great attitude you have there.



    Nope, just curious why you have a hard time following the English language. Either you are foreign and don't understand what "there is nothing to back up those figures" means, you have short-term amnesia, or you just can't admit you said something that simply wasn't true while telling someone else that what they said and quoted had no bearing.



    Quote:
    Originally Posted by jfanning View Post


    You have your theory, based on certain circumstances, and you are more than welcome to believe it; I am not doubting those results based on the inputs listed.



    But to try to claim what I (or anyone in general) can see, based on a chart you downloaded, without knowing anything about me or any other circumstances, is pure ignorance.



    I never claimed anything. I said that if you have 20/20 vision and a 1080p TV, the chart is exactly right on how resolution is determined. It's a fact, proven by research and science, not an opinion. If you are one of the millions and millions of people who either have 20/20 vision or wear corrective lenses, the chart is 100% fact. Yours will always be an opinion on the difference you can see. So your argument is fact vs. opinion.



    There is no chart for you, because you and your family won't go get glasses, or you have such poor lighting conditions that you can't see any resolution because your contrast is off. That's no one's fault but your own. But again, lighting conditions have no bearing on resolution. If you turned your brightness to 0 and made your screen black, you couldn't argue that your 1080p TV has poor resolution. The resolution would still be 1080p!
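The "retina" arithmetic both posters keep circling can be sketched numerically. A minimal Python sketch, assuming the standard 1-arcminute visual-acuity criterion usually associated with 20/20 vision (the function name and defaults are illustrative, not taken from the charts under discussion):

```python
import math

def retina_ppi_threshold(distance_inches, acuity_arcmin=1.0):
    """Pixel density (ppi) above which individual pixels become
    indistinguishable at the given viewing distance, assuming an eye
    that resolves `acuity_arcmin` arcminutes (the usual 20/20 criterion)."""
    pixel_angle = math.radians(acuity_arcmin / 60.0)
    # Physical size (inches) of a pixel that subtends that angle.
    pixel_size = 2 * distance_inches * math.tan(pixel_angle / 2)
    return 1.0 / pixel_size

# At a 12-inch viewing distance the threshold works out to roughly
# 286 ppi; the iPhone 4's 326-ppi screen exceeds it, hence "Retina".
print(round(retina_ppi_threshold(12)))
```

A person with worse than 20/20 vision simply hits the threshold at a lower ppi (or a larger distance), which is the crux of the disagreement above.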
  • andysol Posts: 2,506 member
    Quote:
    Originally Posted by jfanning View Post


    Wow, it didn't take someone long to bring out that old, tired chart; there is nothing to back up those figures other than that they are the author's ideas, that's all.



    Quote:
    Originally Posted by jahonen View Post


    I'm not going to claim anything about your eyesight, but I just had to ask: are you saying that your vision is better than the assumed "perfect" 20/20 vision that the conclusions in the articles are based on (i.e., biophysics and scientific measurements)? My understanding was that they test the human eye's best-case resolution and draw the conclusions from that.



    Regs, Jarkko



    Jarkko, you are exactly correct. The article is based on biophysics and scientific measurement.



    Which, in jfanning's own words, has nothing to back up those numbers except the author's ideas (opinion), that's all.



    But even when you point out that it's not opinion and is all based on biophysics and scientific measurement, he'll argue for a full page that he is right and the chart is just opinion, not fact.
  • jfanning Posts: 3,334 member
    Quote:
    Originally Posted by Andysol View Post


    My final response. Maybe this time you will see what I quote in bold again.



    You can bold all you like, but the charts you posted didn't contain any data to back them up; the article may have, but the charts didn't, not even a reference.



    Quote:
    Originally Posted by Andysol View Post


    Are we on the same page yet?



    That you can't accept another opinion? That you can't accept the possibility that you are missing a point? Yes, we are on the same page then.



    Quote:
    Originally Posted by Andysol View Post


    That's like saying we take the crash-impact ratings of car seats and throw them out: "I never go 70mph, so those crash ratings mean nothing. In fact, there is nothing to back up those figures other than that they are the car-seat rater's ideas." See how dumb that sounds?



    No, that comparison is totally different; it was about as relevant as you posting a user-edited wiki page as proof.



    Quote:
    Originally Posted by Andysol View Post


    You have to have a standard to measure by. That standard is 20/20. If you need glasses and don't wear them, that's no one's fault but your own.



    You've lost me; have we ever met? How do you know whether I wear glasses or not? In fact, how do you know what my eyes are like at all?



    Quote:
    Originally Posted by Andysol View Post


    And lighting doesn't impact resolution, which is the only thing we are arguing about. There's no debate that if the sun is on the screen it will affect the picture, but in terms of contrast, NOT resolution. Whether you can distinguish 720p from 1080p is a question of distance only. Period, end of argument.



    Actually, the article you posted talks about light being important in relation to sight. Did you actually read the document?



    Quote:
    Originally Posted by Andysol View Post


    Nope, just curious why you have a hard time following the English language. Either you are foreign and don't understand what "there is nothing to back up those figures" means, you have short-term amnesia, or you just can't admit you said something that simply wasn't true while telling someone else that what they said and quoted had no bearing.



    Let's see: according to Wikipedia (since you like wikis as proof), there are 328 million native English speakers in the world, which accounts for around 5% of the world's population. Since 95% of the world speaks a non-English language as their native language, there is a high chance I am foreign, but your definition of foreign would have to be the strangest one I have heard.



    Quote:
    Originally Posted by Andysol View Post


    I never claimed anything. I said that if you have 20/20 vision and a 1080p TV, the chart is exactly right on how resolution is determined. It's a fact, proven by research and science, not an opinion. If you are one of the millions and millions of people who either have 20/20 vision or wear corrective lenses, the chart is 100% fact. Yours will always be an opinion on the difference you can see. So your argument is fact vs. opinion.



    Go back and read what you linked. I'm not an ophthalmologist; I can't challenge the work of Kalloniatis or Luu, or the document they wrote, and the chart may be 100% fact, but they haven't referenced it one bit, not even to this article, or any other article.



    Quote:
    Originally Posted by Andysol View Post


    There is no chart for you, because you and your family won't go get glasses, or you have such poor lighting conditions that you can't see any resolution because your contrast is off. That's no one's fault but your own. But again, lighting conditions have no bearing on resolution. If you turned your brightness to 0 and made your screen black, you couldn't argue that your 1080p TV has poor resolution. The resolution would still be 1080p!



    You make strange assumptions. I say you don't know anything about the environmental conditions in people's rooms, and you assume that everything is bad? Again, did you even read the article you linked to? I really don't think you did.





    Now, since you like your chart so much, let's go back to the earlier posts, where this discussion started, and the chart you referenced.



    I replied to a user who said you had to sit within 2x the diagonal measure of the screen to enjoy Full HD. They used 50" as an example, stating no further than 2.8m, but according to the chart you must sit 2.13m or closer to benefit. Which is right, the person who claimed I was wrong, or the chart?



    And at the end of the day, the best place to sit is personal preference, as I said originally.
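For what it's worth, a figure in the ballpark of the chart's 2.13m can be derived from first principles. A rough Python sketch, again assuming the common 1-arcminute / 20/20 acuity criterion (the function is hypothetical, not taken from the chart's source):

```python
import math

def max_full_hd_distance_m(diagonal_inches, rows=1080, aspect=16/9,
                           acuity_arcmin=1.0):
    """Farthest viewing distance (metres) at which a 20/20 eye can
    still resolve individual pixel rows on a 16:9 panel."""
    # Screen height from the diagonal and aspect ratio.
    height_in = diagonal_inches / math.sqrt(1 + aspect ** 2)
    pixel_in = height_in / rows
    # Distance at which one pixel subtends `acuity_arcmin` arcminutes.
    half_angle = math.radians(acuity_arcmin / 60.0) / 2
    distance_in = pixel_in / (2 * math.tan(half_angle))
    return distance_in * 0.0254

print(round(max_full_hd_distance_m(50), 2))
```

For a 50-inch 1080p panel this comes out to roughly 2.0m, close to the chart's 2.13m and well under the 2.8m (2x-diagonal) rule of thumb; the small gap presumably comes down to which acuity constant and rounding the chart's author used.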
  • jfanning Posts: 3,334 member
    Quote:
    Originally Posted by jahonen View Post


    I'm not going to claim anything about your eyesight, but I just had to ask: are you saying that your vision is better than the assumed "perfect" 20/20 vision that the conclusions in the articles are based on (i.e., biophysics and scientific measurements)? My understanding was that they test the human eye's best-case resolution and draw the conclusions from that.



    Regs, Jarkko



    No, I am not saying that. Also, is 20/20 perfect, or nominal?
  • crowley Posts: 4,562 member
    Quote:
    Originally Posted by mstone View Post


    I'm not saying it couldn't be better, but I still like living here more than any other country, and apparently so does anyone else in the world who can figure out a way to live here. There is more to life than Internet speed. When the rest of the world downloads movies at lightning speed, there is a good chance it is a US-made movie, and probably pirated, since US media doesn't actually provide much content outside of the States, as far as I have heard or experienced in my travels abroad.



    The US media doesn't import much content from outside the states either, which is probably warping your perspective.



    In my experience of travelling, which is mainly in Western Europe, most other places you'd care to go have a rich balance of domestic and foreign produced content, including TV and movies from the states.



    The US is a big country, and produces a lot of TV and movies, but it's far from having a monopoly on good content. You also don't need to live there to enjoy it.
  • crunch Posts: 180 member
    Since when are Thunderbolt or USB 3.0 wireless standards??? And most of you don't know your bits from your bytes.



    Anyway, if anyone wants to know when Apple will support USB 3.0, I have the answer for you: now, if you have an ExpressCard slot in your MacBook Pro or a PCIe card for your Mac Pro; the rest of us will have to buy a new computer this April or May, when the new Intel Ivy Bridge CPUs are released. Intel will support USB 3.0 at the chipset level as part of Ivy Bridge.



    As Mac users, we don't have to worry about Intel not including Thunderbolt at the chipset level. We use Macs. All future Macs will have Thunderbolt (10Gbps), USB 3.0 (5Gbps), SATA III (6Gbps) and hopefully 802.11ac WiFi. I love my 3x3 MIMO (450Mbps) MacBook Pro / Time Capsule setup, and yes, I'll buy the 802.11ac-based Time Capsule the day it's released!
  • marcusj0015 Posts: 198 member
    Quote:
    Originally Posted by F1Ferrari View Post


    I'm in the same camp, but since my ISP only runs at 8 Mbps, a faster AirPort Extreme really wouldn't help much. I'm getting 7.8Mbps on my n-speed wifi devices, so an 802.11ac router would currently be overkill.



    The 802.11x specs aren't about ISP speed; they're about device-to-device speed, for which you could DEFINITELY use the extra speed.
  • crunch Posts: 180 member
    Quote:
    Originally Posted by marcusj0015 View Post


    The 802.11x specs aren't about ISP speed; they're about device-to-device speed, for which you could DEFINITELY use the extra speed.



    ...what he said!
  • tipoo Posts: 577 member

    Quote:

    Originally Posted by Sevenfeet View Post

    Let's say you have two Apple TVs in your house, both watching online streamed content in 1080p. Add to that someone doing the same on an iPad 3 and maybe a Macbook surfing the web at the same time and all of a sudden you have a pretty congested network. In order to provide consistent streams without frame dropping, you need something like 802.11ac.




    For streaming, I'd think most homes would be limited by their internet speed first, though. You may get gigabit wifi link speeds, but what good is that when your provider feeds you 20Mb/s? Now if you had localized content, say full 1080p video on your computers streaming to the ATVs, plus file transfers and whatnot, that speed would sure help.

  • joseph saponaro Posts: 4 member


    Great review of the 5G Gigabit WiFi literature.


     


    Dr. Saponaro

  • kustardking Posts: 105 member
    Quote:
    Originally Posted by mstone View Post


    A lot of confusion on the difference between Megabits (mbs) and Megabytes (MBs)

    USB 2 tops out around 12 mbs which is around 1.5 MBs

    There are 8 bits per byte.

    GigaE is 1000 mbs or about 125 MBs

    The fastest Internet you are likely to encounter is 100 mbs which is .125 MBs [fixed]

     

    Despite your lesson, your final # is wrong. 100Mbps is 12.5MBps (0.125MBps would be 1Mbps).
  • retina Posts: 8 member
    While we're off topic, please allow me to correct some more inaccuracies: USB 2 is not 12Mbps; that's USB 1.1. USB 2.0 has a maximum theoretical speed of 480Mbps, and for the sake of completeness, we now have USB 3.0, which is 5Gbps (approx. a 10x speed increase over USB 2.0).

    So, mstone, when someone gets speeds like 20Mbps or 30Mbps through their ISP, which is a high number, although becoming more common every day, it's still less than, say, 54Mbps, a.k.a. 802.11g. Using your theory, most people would not even need 802.11n, right? Besides, when you try to explain 5Gbps and 5GB/s, you haven't accounted for 5GiB/s (gibibytes per second). A lot of people use Gbps when they actually mean Gibps. Add to that the fact that you can even come close to Gigabit WiFi with 3x3 MIMO tech on both the router and the client, using both the 2.4 and 5GHz bands simultaneously at 450Mbps each for a combined 900Mbps, or 0.9Gbps. :D
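Since several posts above trip over bits versus bytes, the conversion is worth pinning down: a link rate in megabits per second divided by 8 gives megabytes per second. A quick sketch using the figures from this thread:

```python
def mbps_to_mb_per_s(megabits_per_second):
    """Convert a link rate in megabits per second (Mbps) to
    megabytes per second (MB/s): there are 8 bits per byte."""
    return megabits_per_second / 8.0

print(mbps_to_mb_per_s(100))    # 100 Mbps internet link  -> 12.5 MB/s
print(mbps_to_mb_per_s(480))    # USB 2.0 theoretical max -> 60.0 MB/s
print(mbps_to_mb_per_s(5000))   # USB 3.0 (5 Gbps)        -> 625.0 MB/s
print(mbps_to_mb_per_s(900))    # combined 3x3 MIMO above -> 112.5 MB/s
```

These are theoretical link rates; real-world throughput is always lower due to protocol overhead.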
  • tallest skil Posts: 40,402 member
    Just one correction: that's 'gibibytes', with a '-bi' suffix on the prefix. Kibi, Mebi, Gibi, Tebi.
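The gigabyte/gibibyte distinction the last two posts touch on is just decimal versus binary prefixes:

```python
GB = 10 ** 9    # gigabyte: SI decimal prefix, powers of ten
GiB = 2 ** 30   # gibibyte: IEC binary prefix, powers of two

# A gibibyte is about 7.4% larger than a gigabyte.
print(GiB)        # 1073741824
print(GiB / GB)   # 1.073741824
```

This is why a "1 TB" drive shows up as roughly 931 GiB in operating systems that report sizes in binary units.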
  • imbrucewayne Posts: 104 member


    This thread is getting me even more confused about what's what in measuring speed. All I know is my ISP is the only reason we are getting slow connections here, not my router.
