Detnator
Reactions
Apple cuts App Store commission to 15% for developers paid less than $1M per year
cloudguy said:
22july2013 said: I'm sure the Apple-haters will come out of the woodwork again, today. Bear in mind that even 30% is a lot less than the 100% markup that was always the case when selling software in boxes in retail outlets.
1. Users of other platforms have as much right to criticize Apple as Apple fans have to criticize not only competitors like Microsoft, Google and Samsung, but also long-term partners like Intel, whose CPUs are still going to be in the vast majority of Mac models available for at least a year, and which you same Apple fans still want people to buy to keep the Mac's market share from cratering.
2. Plenty of these so-called "Apple-haters" regularly buy Apple products. Including myself. I hold my Macs to the same standards that I hold the Android, ChromeOS, Windows and Linux devices that I use: I praise the things that I like about them and criticize the things that I do not. So do "Apple haters."
3. The 100% markup thing was Apple public relations. Good grief, I take what every corporation that is trying to sell me something says - as well as what every politician that is trying to get me to vote for them says - with a grain of salt, even the companies whose products I like and consistently buy and the politicians I generally support. Look, by the time the App Store was created - and certainly by the time it became large and influential enough to be considered its own marketplace with a billion consumers - buying software on CDs was long dead. Good grief, some companies had even stopped manufacturing computers with CD drives by then! People were downloading Microsoft Office, video games, programming IDEs, video and audio editing software, operating system updates, etc. over the Internet by then. The software CD/DVD sections were gathering dust. Do you know what the #1 casual video game entity was 10-15 years ago, before iPhones and iPads - and yes, Android devices - came along and killed them off? PopCap Games. So huge that EA bought them. While they - and their competitor Big Fish Games - would send you a CD if you asked for one, their entire business model was download-based. What Tim Cook went to Congress and claimed was the software distribution model from before home Internet access via local telephone and cable companies - as opposed to junk like AOL - became widespread.
1. 22july2013 didn't say anything about the Apple-haters not having any right to criticize Apple.
2. So? How does that in any way show something wrong in 22july2013's comment?
3. So? None of that makes the 100% markup statement false. Before Apple (mostly) invented this App Store model, distribution costs for developers were a lot more like 80% to the distributor and 20% to the developer - not the other way around, as it is now (15/85 or 30/70). Apple created a system that delivered a lot more back to the developer. That doesn't mean Apple's distribution costs are negligible just because they're not physical like they were before. Apple still has distribution - and marketing - costs. Apple delivers their entire customer base on a platter to developers, and that (a) costs Apple something and (b) has value, to any business that makes use of those customers, that Apple deserves to charge for.
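To put rough numbers on that split, here's a quick sketch comparing a hypothetical boxed-retail split with the App Store's 30% and 15% commission tiers. The $9.99 price and the 80% retail figure are illustrative assumptions, not data from either post:

```python
# Illustrative math: developer take-home on a hypothetical $9.99 sale
# under different distribution models. All figures are for illustration.

PRICE = 9.99

def developer_cut(price, commission_rate):
    """Developer's share after the distributor's commission is deducted."""
    return round(price * (1 - commission_rate), 2)

# Old boxed-retail model (roughly 80% to the distribution chain)
retail = developer_cut(PRICE, 0.80)    # ~2.00 to the developer

# App Store standard (30%) and small-business (15%) tiers
standard = developer_cut(PRICE, 0.30)  # ~6.99 to the developer
reduced = developer_cut(PRICE, 0.15)   # ~8.49 to the developer

print(retail, standard, reduced)
```

Whatever one thinks of the absolute percentages, the direction of the change (a few dollars versus most of the sale price going to the developer) is the point being argued here.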
So... umm... what's your point?
Apple cuts App Store commission to 15% for developers paid less than $1M per year
cjlacz said:
avon b7 said: This move doesn't tackle the root issue that is being investigated on multiple fronts. That there is only one App Store on Apple devices.
Apple can legitimately charge whatever it wants but that isn't, and has never been, the root issue.
I think Apple feels good news won't result from the different investigations, and this reduction is a move to leave them in a slightly better light when final rulings are delivered.
iOS devices themselves are not entirely closed/walled/etc. Web apps are a thing - and what you can do in a web app is constantly getting closer and closer to what you can do with native apps. Apple doesn't proactively block anything (that I'm aware of) from running in the web browser. Porn, subscription cloud-based games, whatever you like.
Apple just blocks some native apps, and those are things that use APPLE's proprietary APIs, etc. By this argument it's not just Apple's platform; it's Apple's tools. Sure, those native tools and APIs provide more functionality, but THAT stuff (the stuff that native apps can do that web apps can't) is all Apple-owned IP and technology.
Why shouldn't they have every right to control what people use their tools for on their OS (sure, you own the phone, but Apple owns the OS) - for the benefit of those of their customers who appreciate that and don't want it taken away - while still allowing "anything goes" on the parts of their platform that are open and don't use their proprietary IP, APIs and tools - the web, among other things?
Apple Silicon M1 Mac detection of Thunderbolt 3 eGPU gives hope for future support
cloudguy said:
0. Mx Macs may indeed run Windows natively ... but it will be ARM Windows, which even Microsoft acknowledges is largely useless except for (some) first-party and web applications. It was why the Surface Duo uses Android instead of Windows. And unless I am wrong, Apple has closed the door on emulating Windows themselves. So a solution for x86 and x86-64 Windows applications would need to be via 32- and 64-bit virtualization.
1. On the eGPU front it is not so much the hardware or even protocols. Instead, the APIs are different. When on Intel this wasn't an issue ... the APIs are the same. But with Apple Silicon it is Metal or nothing. While that is GREAT for iOS and Apple Arcade and other stuff developed for Apple using Apple provided tools, other stuff isn't going to be compatible unless third parties do it, and even if that happens Apple won't support it.
Remember my previous rants. Apple never wanted to join the Intel platform and support their hardware and software stuff in the first place. They were PROUD of not doing so in the 80s, 90s and early 00s. They only went to Intel, put iTunes on Windows and things like that because it was necessary to survive.
Now Apple doesn't need that stuff anymore. Macs are going to sell regardless ... they have a very dedicated customer base especially among creatives and you can't even develop iOS, watchOS etc. apps any other way. So it is going to go back to how it was before. Apple doesn't want you running Windows. They don't want you using GPUs designed for Windows. They want you to rely on their hardware and software and have the opinion that between their first party stuff, the iPad apps coming over plus web and cloud tools, that is more than enough for anyone who wants to make it work. If you cannot or do not want to, it isn't as if Wintel is going anywhere.
Convergence WITH their mobile hardware and apps. Divergence FROM the Windows stuff that they have always hated. And now that Microsoft is doing more stuff with Android and Samsung on Windows 10 and collaborating with Google in even more areas, even more so.
It is going to go back to when Apple people and Windows people were different populations who use computers to do different things, which is the way that it has been for most of these companies' histories. The key difference is that thanks to Apple's prowess in mobile, there are far more Apple people than there were in 1983, 1993, 2003 or 2013.

Sigh… It's concerning the amount of misinformation you're spreading here…
"And unless I am wrong Apple has closed the door on emulating Windows". Indeed, wrong.
Firstly, it's the wrong darn phrase! There's no such thing as "emulating Windows". You don't emulate an OS. You either virtualize or emulate a hardware machine (a computer) in software. The emulated or virtualized machine is the guest; the machine it all runs on is the host. Then, in simple terms: if the guest and host share the same hardware architecture, it's virtualization; if they're different architectures, it's emulation.
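The guest/host rule of thumb above can be sketched as a toy function (an illustration of the terminology only, nothing like real hypervisor code):

```python
def run_mode(host_arch: str, guest_arch: str) -> str:
    """Rule of thumb from the paragraph above: same architecture means
    virtualization; different architectures mean emulation."""
    return "virtualization" if host_arch == guest_arch else "emulation"

# e.g. Parallels/Fusion on an Intel Mac running x86 Windows:
print(run_mode("x86_64", "x86_64"))  # virtualization

# e.g. Virtual PC on a PowerPC Mac running x86 Windows:
print(run_mode("ppc", "x86"))        # emulation
```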
Since the Intel Macs arrived we've had Parallels Desktop, VMWare Fusion and others. These allow for virtualizing a full computer in software. They virtualize the host (x86) architecture to let you make one or more virtual/guest (x86) machines running OSes and apps (macOS, Windows, and others) compiled for that Intel architecture.
In the PowerPC (PPC) days of Macs there were apps like Virtual PC that allowed x86 virtual (guest) machines, running x86-compiled Windows, to run on Motorola PPC host hardware (the PPC Macs of the day). The experience was more or less the same as with Parallels and Fusion, except that the software had to translate the x86 chip-level instructions running in the virtual machine into PPC instructions the host could understand and execute. (This created a huge performance hit, and it was impossibly slow for anything but the most basic tasks, though that detail isn't particularly relevant for this discussion.)
Now… Rosetta 2 doesn't do any of that. Rosetta 2 is not an emulator. The following is a tad oversimplified, but on a basic level it's a translator, mostly between x86-compiled macOS applications and the native macOS running on the ASi architecture. There are far fewer, if any, calls to hardware through Rosetta 2. It's not emulating (nor virtualizing) anything. Part of the optimization is that a lot of that translation is done at INSTALL time, not runtime. Most of the optimization comes from targeting the small range of things it does (primarily translating apps to the OS, with careful choices about which parts it handles and when) and doing them extremely efficiently (compared with an emulator recreating an entire computer in software for a different architecture).
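The install-time vs. runtime distinction can be sketched with a toy ahead-of-time translator. The instruction names and the one-to-one mapping are made up for illustration; real binary translation (and the real Rosetta 2) is vastly more involved:

```python
# Toy ahead-of-time (AOT) translation sketch: pay the translation cost
# once at "install" time, then run the translated output on every launch
# with no per-run translation cost. Instruction names are hypothetical.

X86_TO_ARM = {
    "MOV": "mov",
    "ADD": "add",
    "RET": "ret",
}

def translate_at_install(x86_program):
    """One-time translation pass: the expensive step happens here."""
    return [X86_TO_ARM[op] for op in x86_program]

def run(native_program):
    """Running the already-translated program is just execution."""
    return len(native_program)  # stand-in for "execute N native instructions"

installed = translate_at_install(["MOV", "ADD", "RET"])
print(installed)  # translated once at install, reused on every launch
```

An emulator, by contrast, would have to do the equivalent of `translate_at_install` continuously at runtime, for the whole machine, which is where the performance gap comes from.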
It's impossible for Rosetta 2 to "emulate Windows" (more accurately run Windows and Windows apps through x86 emulation) because Windows isn't a macOS app that runs under macOS. It's (obviously) an entirely separate operating system that requires an entire machine (virtual or physical) to run on. And then Windows apps need to talk to that.
So, terminology correction out of the way, here are some further corrections to your rant:
- Apple doesn't hate Windows enough to ever try to stop their users from using it. They didn't block the virtualization (on Intel Macs) or emulation (on PPC Macs) solutions I mentioned above. They went out of their way to build Boot Camp - an APPLE solution for booting only APPLE computers into Windows. If anything they're indifferent about Windows, although even that's a stretch, because Boot Camp proves they proactively support Windows on their Intel Macs. It's not like someone else came up with that and Apple shrugged their shoulders and said "Oh well, we'll let it through and not block it."
- Apple hasn't closed the door on Windows in any form on the ASi Macs. They simply said Rosetta 2 won't support it (for the reasons I've stated above - Rosetta 2 isn't an emulator), and they're just not proactively pursuing it, instead leaving it up to third parties. Apple has gone on record saying that "it's up to Microsoft". Parallels and VMWare have both gone on record saying they're working closely with Apple to bring their solutions to ASi. Although... they haven't specifically committed to Windows, and I'm guessing that's because they're not sure whether they can pull it off, for two reasons. The first is that to pull off x86 Windows they'll have to figure out emulating entire x86 computers, in software, which is an altogether more complex task than the virtualizing that their solutions currently do.
- And the second reason is this whole thing about Windows on ARM. Those who think that ASi is just an ARM chip in a Mac instead of an Intel one just don't get it. ASi isn't just another ARM chip. It's an entirely new, Apple-designed-and-built CUSTOM architecture (not just a custom CPU) where a small subset of the CPU part of it happens to include the ARM instruction set. For Windows to run natively on ASi, Microsoft is going to have to do a lot more than just tweak ARM Windows for it, and/or Parallels, VMWare, etc. will most likely have to produce some combination of virtualization and emulation between ASi and "standard" ARM chips like what's in the Surface - and that's if MS ever gets Windows working properly on standard ARM chips anyway.
- (Frankly, I don't think Windows on ARM will happen for a long time, if ever. In my opinion Windows on ASi is almost certainly only going to happen if someone successfully emulates x86 hardware on ASi, at which point anything that Intel Macs run that can't natively run on ASi Macs - including Windows, as well as older macOS versions and their apps - will run in VMs on ASi. If nothing else, someone really needs to figure out x86 emulation on ASi for the sake of developers and others wanting to test on older macOS versions etc., not just Windows. I can run Mountain Lion on my Intel 16" MBP if I want - in a (virtualized) virtual machine. I won't be able to do that on an ASi Mac without x86 emulation.)
- eGPUs. Your paragraph on that is mostly reasonable except the end: "Apple won't support it". FFS, you don't know that. Again, such aggressive negativity towards Apple… and why? It's entirely because you just don't get Apple (more on that below). eGPU compatibility (and internal discrete GPUs too) may or may not come. If they're needed, Apple very likely won't proactively block them (corporate issues between NVIDIA and Apple aside). But I'm willing to bet they won't come, and it'll be because they won't be needed. Discrete GPUs (internal or external) will more than likely be a hindrance to ASi Macs. The notion that a GPU must be discrete to be performant is a fallacy. Apple has put an integrated GPU in their lowest-end Macs (and even in their iPads) that keeps up with the discrete GPUs in all but their highest-end pro Macs. And this is only their first iteration. So what's coming? It's not like they can't build their own extremely performant graphics silicon - just look at the Afterburner card. They've got the chops to do this. I'll bet dollars to donuts they already have integrated GPUs in M3 or M4 (or whatever they're calling them) chips in their super secret labs that will go into future Mac Pros, and will eat all but the highest end of today's and tomorrow's desktop graphics cards for lunch.
I mentioned you "just don't get Apple (more on that below)". Here's what I'm referring to: these wild negative rants about Apple's decisions completely misunderstand the point of Apple. Apple is not interested in making "PCs". No, that's not about them abandoning the Mac. Apple has (almost) NEVER been interested in making "PCs". Apple has never been interested in trying to make a better alternative to Windows PCs for PC users to buy (except in the 90's and early 2000's, when they nearly went broke and then tried to claw their way back into the game). Steve specifically said that for Apple to win, Microsoft doesn't have to lose. They weren't, and aren't, trying to make better PCs than Windows machines. And they're not trying to make the best smartphones to pull people away from Android. You actually do get this on some level - though mostly only from the Windows and Android users' perspectives. You wrote about it in your post the other day, about why all this ASi won't create a mass switch from Windows and Android to Apple. And you're right. PC people don't want Macs because Macs are not PCs. By deliberate design, they don't come with all the flexibility, hack-ability, enterprise-friendly functionality, and whatever else that Windows users have and want. And that's OK. Apple doesn't make PCs.
So what do they make? Apple makes appliances and devices specifically targeted to do certain specific things that their target market wants to do - a limited number of things compared with what PCs and Android phones can do. But they prefer to deliver on a few things exceptionally well, rather than everything "good enough".** That's always been their gig. That's the case with the Macs, their iDevices, and almost everything else they make. It's why the HomePod isn't just another Bluetooth smart speaker. And it's why PC users won't flock to the Mac now just because Macs are faster. And it's why (most) Mac users don't care about eGPUs, core counts, GHz, etc.
(** It's impossible to do both [everything exceptionally well]. You can either do a few things exceptionally well, or everything "good enough". MS/Intel choose the "everything good enough" route. Apple chooses the "few things exceptionally well" route.)
The fact that these devices sometimes resemble PCs and other companies' smartphones has... well... a lot more to do with everyone usually (not always, so don't throw exceptions at me) copying Apple than with Apple trying to compete in their markets. Apple created most of these markets. If you want to argue with that, then tell me what else on the market looked like, or served the function of, anything like the iPhone before the iPhone... Or like the iPad before the iPad. What else on the market looked like or did what the original MacBook Air did before Apple made that? Same with each iMac generation. And even at the very start: what else on the market looked anything like, or did anything like, what an Apple II did before the Apple II?
So Apple doesn't want to actively block Windows or eGPUs or anything else specifically from the Mac -- unless doing so hurts the experience for other functionality, which is why I think eGPUs won't happen but Windows will. Apple wants to build devices that serve their relatively small number of users, and they'll include or exclude whatever's going to serve that purpose best. The other companies look at and start with GHz, cores, discrete vs integrated, NVIDIA vs AMD, folding phones, NFC, keeping legacy ports, replaceable components, and all the other specs and technology, and then try to figure out what to do with it all. Then they sometimes do something useful with it, or sometimes just come up with something pretty mediocre that provides little by way of a good experience - merely bragging rights - or, if nothing else, throw it out there and see what the third-party devs might do with it -- which sometimes works and sometimes doesn't (e.g. NFC before Apple Pay).
Apple doesn't care about any of that. And just as important, neither does the vast majority of Apple's users (excepting a few vocal nerds on Apple forums). Apple asks "what do our users want to be able to DO?" - and the answer to that isn't something like "have an HDMI port for the rare few people who still might need one". It's "play and edit raw 8K footage with multiple effects on the fly" (or whatever), and then they invent new tech, and refine and combine existing tech, as needed, to DO THE JOB. The point: why should anyone care if a GPU is discrete or integrated if it handles that footage well enough? And that's why Apple's stuff is SO much better - for the people that care about that stuff - than anything else. (And the people that care about GHz, cores, discrete GPUs, and replaceable components - and can figure out how to do useful stuff with those things - or even just want the "do everything" flexibility, buy PCs. And that's great for them. Yes, I'm oversimplifying, but in principle it's accurate.) Apple doesn't make PCs.
If/when you ever really get that, you'll be right a lot more than you are in a lot of your rants about Apple today.
(Edited for spelling, grammar and formatting).
Apple's MacBook business grew 39% in the September quarter
cloudguy said:
Xed said: That’s right, you claimed that the processors would cost Apple more. They don’t, as we see either lower or the same price point with other improved features with much improved performance and power efficiency (as I predicted).
Core i3 with 8 GiB RAM. LOL, I can’t believe that’s still your line in the sand on where they could hit on performance, despite plenty of proof showing you the superiority of the M-series SoC.
A huge story for every Apple customer, Apple shareholders, for Intel, and the WinPC market as a whole. Your inability to understand why a $999 MacBook Air with no fan outperforming a loaded 16” MBP is a good thing for customers, or how the ability to run iOS and iPadOS apps on the M1 Macs will attract switchers, is neither something I understand nor am willing to fix. Instead I’ll let you wallow in your misery as you ramp up your trolling.
A huge story for Apple customers and shareholders? Obviously. For everyone else? Not so much ...
"Your inability to understand why a $999 MacBook Air with no fan outperforming a loaded 16” MBP is a good thing for customers ..."
No. It is great for Mac fans! But for the 92-95% of people in any given quarter who do not buy Macs not so much.
"or how the ability to run iOS and IPadOS apps on the M1 Macs will attract switchers"
Yeah, if you think that there is going to be this stampede of Windows laptop owners paying $999 for MacBooks for the privilege of running iPad apps - on a non-touchscreen UX/UI, no less! - instead of better, more powerful x86 applications that are actually built for personal computers and not mobile devices, then you really don't know people who actually own and use Windows laptops very well. I suppose, rather than changing your social and professional circles, that is something that you might want to understand or be willing to fix.
Look, I have seen this site's archives. I have also been on other Apple-centric sites. It is always the same thing: declarations that with each new Apple advance, the competition (whether Android or Windows) is going to dry up and blow away. Why? The presumption is that everyone else loves Apple products as much as you do and all they need is something, some push or incentive or anything to liberate them from the misery that they are wallowing in and join the happy existence of Apple users.
What folks like this never realize is that Windows (and Android) users are happy already. They like their products. They like using them. They have the same anticipation towards buying new ones that you do. They even have the benefit of something that you don't, which is OEMs with great R&D departments that compete against each other for their attention and dollar. And Apple fans fundamentally misunderstand them.
For example: you folks love to say "Windows and Android devices are cheaper than Apple devices upfront ... but you are not considering the total cost of ownership ... Apple devices are more reliable because they last 5-7 years or more."
Makes perfect sense, right? TO YOU. But they are not you.
Windows fan: who wants to wait 7 years to replace my laptop? I like replacing my laptop every 3 years so I can get the latest Threadripper from AMD and really be able to play the latest Steam games!
Android fan: hold onto a smartphone for 5 years? LG/Samsung/Pixel/Huawei/Motorola etc. etc. etc. come out with cool new features every year! And you only need to spend $200 on the latest decent Android phone to get them!
Sorry, but Apple was only able to dominate the MP3 player, smartphone, tablet and smartwatch markets because there were no established products from large, well-known companies in them beforehand. You had Blackberry, Android Wear and a bunch of other stuff even more obscure. But Apple has never been able to achieve the sort of mass migration from one established product or platform to another that you are talking about. The closest that we have seen to that is the AirPods. The reason: you need to give people a compelling incentive or reason to switch. Meaning that you need to give people a reason to give up a successful product that they already fundamentally like for a product that they like better.
Android - for example - accomplished that by offering the combination of much bigger screens and substantially lower prices.
What doesn't accomplish that is having people pay more money for machines that run software they don't use and/or have no interest in - just faster.
"Hey Billy, put down that $750 gaming PC and buy this $1000 MacBook Air instead."
"Can I play Rocket League on it"?
"No but Apple design the CPU!"
"Can I play The Division on it"?
"No, but you won't get viruses and it doesn't have a fan!"
"Can I at least get my copy of Windows going in VirtualBox on it so I can use the programs that I need for work on it?"
"No but it runs a ton of iPhone and iPad apps that your job absolutely doesn't use! And you can use Continuity to hand off from your iPad and iPhone to your Mac!"
"All right fine but can I dual boot, upgrade the RAM or add an eGPU?"
"No but you will be able to use it for 5 years and still sell it for $500!"
See above. You have failed to explain how switching from Windows to macOS makes his life better. Instead, you are making the case why he should completely change his computing use case - the reasons why he buys his computers in the first place, which is to game and to use software that his job requires - around becoming a consumer of Apple hardware. Please realize that nearly no one is going to do that no matter how fast Macs are.
Now note that I did say nearly. You do have some people who want and need the most powerful machine they can get their hands on without having to deal with the difficulty or expense of an actual workstation or server, and who don't need much in the way of consumer-facing or workspace-specific software. For those people - developers are a great example, although not right now, as a lot of the software and tools that they need aren't on the M1 Macs, can't be translated with Rosetta 2 (or the developers don't want them to be, for performance reasons) and won't be for a while yet - the M1 Macs, as well as Macs with the even better chips that Apple will release starting next year, will be great. The problem is that there aren't very many such people. I will say it again: the number of people who actually need a chip faster than an Intel Core i5 or i7 and who won't be inconvenienced by giving up their Windows software isn't very big.
Just do the whole Venn diagram thing ... most people who want a faster chip than the i7 also want/need to hold onto their Windows software. Most people who have no real ties to Windows software don't want/need fast performance in the first place. And - this will really get you - most people who need the fastest performance they can get and don't need Windows software are in the creative industries, and as a result have Macs already!
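The Venn-diagram argument above can be made concrete with sets. The user names and group memberships are made-up sample data, purely to show the overlap logic:

```python
# Toy illustration of the Venn argument, with hypothetical users.
wants_fast_chip = {"dev_anna", "editor_bo", "gamer_cy", "analyst_di"}
needs_windows_sw = {"gamer_cy", "analyst_di", "office_ed"}
already_on_mac = {"dev_anna", "editor_bo"}

# People who want speed AND have no Windows-software ties...
switch_candidates = wants_fast_chip - needs_windows_sw

# ...minus those who already own Macs: the pool of likely switchers.
likely_switchers = switch_candidates - already_on_mac

print(sorted(likely_switchers))  # empty: the pool the post argues is tiny
```

With this (admittedly rigged) sample, the speed-hungry users who don't need Windows are exactly the ones already on Macs, which is the shape of the argument being made.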
So yeah, I am trolling you, I guess. But this "troll" has actual experience working with macOS (and its predecessors), iOS, Windows (and its predecessors MS-DOS and IBM PC-DOS), Linux, various mainframes, Android and ChromeOS. And being around a bunch of coworkers, students etc. who have the same. My guess is ... you don't? So, my trolling comes from the perspective of the 95% (give or take, depending upon the product) of people who don't exclusively use Apple products and actually like the non-Apple products that we do use - and like them a lot. And that is a perspective that you do not have. I repeat: if you think that there is this whole sea of people out there who hate Wintel or love Apple as much as you do and are just itching and dying to switch, you need to first ask yourself why they haven't jumped ship already. If they hated Windows that badly, they could have switched to macOS at any time. Since they didn't ... shouldn't you presume that they don't hate Windows at all then?

[At the time of writing this, I haven't read comments past the above post I'm quoting, so after posting this it'll be interesting to see what others have to say. But I want to post this before I read on and have my thoughts otherwise influenced. So...]
Y'know @cloudguy, sometimes on these forums (idk what you're like in real life so no assumptions there) you come across as an arrogant, obnoxious jerk, ;-) but sometimes you make some well thought out, well expressed and generally good points, and you've done so in this post. You've given it a lot of thought and presented your explanation very clearly and logically. For what it's worth I appreciate that.
At the risk of pouring it on a little too thick (not my intention, just speaking my mind) in particular I appreciate and agree with your exposition of how different people have different priorities and your acknowledgement that Apple meets the priorities of some people really well and Windows etc. meet the priorities of the other people really well. It would serve many over the top Apple fans well to figure that out and get over themselves.
All that said, I'd venture to say that I think sometimes you read more into some people's posts than is there. I agree that plenty of Apple fans have this incorrect idea every time Apple comes out with some new thing that it'll finally be the thing that brings the mass switch. I'm just not seeing how the posts you're responding to in this instance are claiming that. I just wonder if you're a little over the top on the defensive sometimes. But feel free to ignore that opinion if you disagree. Just a consideration.
Now all that said… my opinions for what they're worth…
I agree that very few people if any who are in the market for a $500 laptop will buy a $1K MBA just because it's faster (and/or whatever else) now than it was before. Because the person who wants the $500 laptop doesn't want what the $1K MBA - or what any $1K laptop - brings to the table. Apple has never competed in that market and has no desire to.
But then there IS a large market for $1K+ laptops, and I just wonder how many people who might have been in the market for a $1K Dell or HP or Lenovo, might now consider a $1K MBA if it delivers practically 2-3x the performance. Or how many people in the market for a larger more performant $2K Dell or HP or Lenovo might now consider a $1.5K MBP because it brings better performance for less. Or a $2K MBP (when they arrive next year) because it brings much better performance for the same price.
It won't be a mass switch, but I suspect it won't be insignificant. Maybe Apple's PC market share will hit double digits again, but it won't be much more than that. But the good news for Apple shareholders is, even a few more percentage points will significantly improve Apple's bottom line.
Then there's the question of whether or not Windows will come to ASi. It could come through emulation of x86. It could come through MS finally getting ARM Windows even remotely right. And if these ASi chips are as good as they seem to be, I wonder how many people in the $1K+ market might buy Apple hardware just to run Windows at 2-3x the performance of similarly priced PCs. I'm not pushing that idea. Just speculating possibilities. (You touch on these points too, and correctly note that those people aren't very many.) If nothing else, if Parallels, VMWare, or someone can figure out emulating x86, and if these ASi chips are as fast as they seem, then even emulated x86 Windows on ASi could be as fast as it is on native Intel chips - and that, coupled with everything a Mac brings to the table, will bring some switchers too. But agreed again, not en masse.
Whatever happens, it will be really interesting to see how this plays out. I for one am looking forward to it all, tentatively.
macOS Big Sur telling Apple what app you've opened isn't a security or privacy issue
OctoMonkey said:
larryjw said:
OctoMonkey said: Every time you open a web page your IP address is exposed. Is it possible that someone could infer where your requests are coming from based on your IP address? Sure, just like anyone has been able to do since the creation of the internet more than 30 years ago. The certificate serial number has no personally identifiable information related to YOU. It only identifies the bundle of code that is associated with the certificate, for example, a conspiracy storyboarding application.
If you believe that any of this is a huge privacy risk, put your computer behind a VPN or disconnect it from the internet. Problem solved.
This methodology works quite well for normal browsing and provides a great deal of anonymity.
Has it really been only 30 years since Al Gore created the internet? How time flies! I could swear it was 40 years ago when I worked at ANL.
----------------------
Al Gore and the Internet
By Robert Kahn and Vinton Cerf
Dated: 28 Sep 2000
Al Gore was the first political leader to recognize the importance of the Internet and to promote and support its development.
No one person or even small group of persons exclusively invented the Internet. It is the result of many years of ongoing collaboration among people in government and the university community. But as the two people who designed the basic architecture and the core protocols that make the Internet work, we would like to acknowledge VP Gore’s contributions as a Congressman, Senator and as Vice President. No other elected official, to our knowledge, has made a greater contribution over a longer period of time.
Last year the Vice President made a straightforward statement on his role. He said: “During my service in the United States Congress I took the initiative in creating the Internet.” We don’t think, as some people have argued, that Gore intended to claim he invented the Internet. Moreover, there is no question in our minds that while serving as Senator, Gore’s initiatives had a significant and beneficial effect on the still-evolving Internet. The fact of the matter is that Gore was talking about and promoting the Internet long before most people were listening. We feel it is timely to offer our perspective.
As far back as the 1970s Congressman Gore promoted the idea of high speed telecommunications as an engine for both economic growth and the improvement of our educational system. He was the first elected official to grasp the potential of computer communications to have a broader impact than just improving the conduct of science and scholarship. Though easily forgotten, now, at the time this was an unproven and controversial concept. Our work on the Internet started in 1973 and was based on even earlier work that took place in the mid-late 1960s. But the Internet, as we know it today, was not deployed until 1983. When the Internet was still in the early stages of its deployment, Congressman Gore provided intellectual leadership by helping create the vision of the potential benefits of high speed computing and communication. As an example, he sponsored hearings on how advanced technologies might be put to use in areas like coordinating the response of government agencies to natural disasters and other crises.
As a Senator in the 1980s Gore urged government agencies to consolidate what at the time were several dozen different and unconnected networks into an Interagency Network. Working in a bi-partisan manner with officials in Ronald Reagan and George Bush’s administrations, Gore secured the passage of the High Performance Computing and Communications Act in 1991. This Gore Act supported the National Research and Education Network (NREN) initiative that became one of the major vehicles for the spread of the Internet beyond the field of computer science.
As Vice President Gore promoted building the Internet both up and out, as well as releasing the Internet from the control of the government agencies that spawned it. He served as the major administration proponent for continued investment in advanced computing and networking and private sector initiatives such as Net Day. He was and is a strong proponent of extending access to the network to schools and libraries. Today, approximately 95% of our nation's schools are on the Internet. Gore provided much-needed political support for the speedy privatization of the Internet when the time arrived for it to become a commercially-driven operation.
There are many factors that have contributed to the Internet’s rapid growth since the later 1980s, not the least of which has been political support for its privatization and continued support for research in advanced networking technology. No one in public life has been more intellectually engaged in helping to create the climate for a thriving Internet than the Vice President. Gore has been a clear champion of this effort, both in the councils of government and with the public at large.
The Vice President deserves credit for his early recognition of the value of high speed computing and communication and for his long-term and consistent articulation of the potential value of the Internet to American citizens and industry and, indeed, to the rest of the world.
------------------------------
Politicians (and perhaps people in general) tend to blow their own horn and inflate their achievements and self-worth. Had Gore stated something to the effect that he sponsored / spearheaded legislation which helped expand ARPANET into the publicly accessible network known as the internet, that claim would have been defensible. He did not. The foundations of the internet began in the 1960s and evolved over time into what we have today. While Gore may have had a significant hand in the legislative aspect of the internet's development as we know it, he absolutely did not create it.
As for "Massive lying and misinformation started before Trump." Absolutely correct! We cannot forget Obama, Bush (43), Clinton, Bush (41) or most any other President / politician / human being.
"I took the initiative in creating the internet" =/= "I created the internet".
The internet was created. A lot of people and organizations played different parts in its creation. "Took the initiative" is one of those parts, or possibly even a part of one of those parts. In that statement, Al Gore (correctly or not) is claiming credit for that particular part, not the creation of the internet in entirety.
Perhaps then the question is "Well what does 'took the initiative' in that context even mean?" It's arguable that the description by Robert Kahn and Vinton Cerf is a pretty reasonable outline of what it might have meant.
Needless to say... I'm not taking a stand on what contributions Al Gore did or didn't make to the internet. I'm merely pointing out that the words Al Gore allegedly said do NOT mean "I created the internet" by any educated understanding of the English language.