Last Active
  • Arkansas Sen. Tom Cotton says Apple's Tim Cook 'omitted critical facts' in encryption stance

    Hmmm who should I trust?

    Apple, a company who has not done any wrong by me and has only served to make my life just that little bit easier?

    How about the government that wants a back door into my system on the off chance I might decide to commit a crime, and therefore sees me as nothing more than a potential criminal? The same government whose policies let it basically avoid fixing the real issues, neglect the people it is charged with looking after, and get rich from what by dictionary definition amounts to bribes?

    I worked in the IT Department in the New Zealand Parliament for about 4 months. Frankly that solidified my belief that I shouldn't vote for any government because they are worse than Kindergarten children. Parliament is the scariest place on earth because you quickly realise that those muppets are the ones running the country.

    Nope, I'm on Apple's side here people.
  • Mark Zuckerberg was ready to pounce on Apple's data practices at Senate hearing

    The difference between the data collected by Apple and the data collected by Facebook is that Apple's collection is anonymised and not traceable to a single person. Facebook can tie its data to a specific person.
  • European emergency agency requests Apple enable AML location tracking in iPhone for first ...

    Soli said:
    Is it too much to ask that every modernized country use the same 3-digit code for calling emergency services?
    America wanted to be different to England which is also the reason why America drives on the right hand side of the road.

    We'll talk again when America accepts the internationally recognised dd-mm-yyyy format instead of its inane mm-dd-yyyy format

     :D  ;) :p
    I've heard such stories before, but I've never seen any evidence that an entire nation changed policy out of pettiness against the UK, especially when the two countries have so much else in common, when there's evidence that US English is closer to colonial English than British English is today, and when more reasonable explanations exist. For example, dialing 9 first on a rotary phone would limit accidental dials, while using the number 1 for the next two digits would help speed up calls to emergency services. That is also unsubstantiated, but it at least sounds like a reasonable explanation based on utility.
    Actually the 999 thing (here in New Zealand it's 111) has to do with the old phone systems. 999 and 111 sit at the opposite end of the old rotary dial from the start position, so dialling them creates a pulse pattern long enough for the old systems to recognise, which meant you didn't get a misdial. Here in New Zealand, when we went to push-button tone phones, people were pressing 111 too fast and it would misdial every time because the phone system couldn't pick up the tones correctly. We had a campaign called "Think 1-go-1-go-1" to alleviate the issue. Not so much an issue now though. At one stage New Zealand's Telecom had to add 911 to the emergency service numbers thanks to the popularity of William Shatner's Rescue 911 in the '90s: people were dialling 911 instead of 111 and getting failed calls.
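    The pulse-timing point above can be sketched in code. On a rotary exchange each digit is sent as a train of pulses: on a UK-style dial, digit d sends d pulses (0 sends ten), while New Zealand's dials were reversed, so dialling 1 sent nine pulses, which is why 111 behaves like 999. A minimal Python sketch, with timing constants that are illustrative assumptions rather than real exchange specifications:

```python
# Illustrative sketch of rotary pulse dialling. Timings are assumptions
# (~10 pulses per second, fixed inter-digit pause), not real exchange specs.

PULSE_PERIOD = 0.1  # seconds per pulse (assumed)
DIGIT_PAUSE = 0.7   # seconds while the dial returns (assumed)

def pulses(digit: int, inverted: bool = False) -> int:
    """Pulses sent for one dialled digit; 0 always sends ten.
    inverted=True models a NZ-style reversed dial, where 1 sends nine."""
    if digit == 0:
        return 10
    return 10 - digit if inverted else digit

def dial_time(number: str, inverted: bool = False) -> float:
    """Total seconds to pulse-dial a number string."""
    digits = [int(c) for c in number]
    pulse_time = sum(pulses(d, inverted) for d in digits) * PULSE_PERIOD
    pause_time = DIGIT_PAUSE * (len(digits) - 1)
    return round(pulse_time + pause_time, 2)

# UK 999 and NZ 111 both send 27 pulses -- the longest possible
# three-digit train, and hard to produce by accident.
print(dial_time("999"))                  # 4.1
print(dial_time("111", inverted=True))   # 4.1
print(dial_time("111"))                  # 1.7 (on a UK-style dial)
```

    Either emergency code maximises the pulse train, which is what made an accidental dial so unlikely on the old systems.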

    America might like to think that their version of English is the closest to colonial English, but truthfully there is no such thing. The reason British English uses "colour" instead of "color" is that many English words were taken from French and Spanish as a result of conquests, so by dropping these letters America actually moves itself further from colonial English than modern British English is. English is also a constantly changing language, so there literally is no right or wrong English.

    English itself is such a mongrel language that believing in a standardised English is a bit of a misbelief. You've got a Germanic language using words from French, Spanish, Hindi, Gujarati, Latin, Greek, Mandarin, various African languages, and basically any other language that England came into contact with through its own conquests, so you can't actually say English has a standard. Even within England there are different forms of English and different ways of using words, so getting pedantic about grammar and spelling is a fool's errand. Hell, we still use, in some form, words that were made up by William Shakespeare, so English literally contains made-up words, and people get all up in arms about how to properly use them? Fool's errand indeed.
  • Apple Services and the ecosystem of value capture

    A lot of people bemoan Apple’s apps as not being very powerful. For example, they rail on Numbers because it doesn’t support AppleScript very well and thus supposedly can’t compete with Excel, but I’ve not seen anything that proves that. The only thing Excel seems to do that Numbers can’t is read data from another spreadsheet. Meanwhile, I can do things in Numbers that require Visual Basic knowledge in Excel.

    Take checkboxes, for example. In Numbers I simply change the format of the cell to be a checkbox, then create a formula that references that checkbox. To do the same thing in Excel you have to write a screed of VBA code, which is time consuming and daunting for your average person.
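    As a sketch of what that looks like in Numbers (the cell references here are hypothetical examples): format a cell as a checkbox so it holds TRUE or FALSE, then reference it from an ordinary formula such as IF or COUNTIF.

```
B2 is formatted as a checkbox, so it holds TRUE or FALSE.
=IF(B2, "Done", "Pending")     reacts to a single checkbox
=COUNTIF(B2:B10, TRUE)         counts the ticked boxes in a column
```

    No scripting is involved; as far as the formulas are concerned, the checkbox is just a boolean cell.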

    In other words Apple’s software is at its most powerful when they are making complex things easy. People fail to realise this and just rail on Apple for writing simplistic apps.

    Of course the likes of Microsoft and Adobe will make this point louder in order to sell their own software, but frankly their software is pretty pathetic on this front. People want to use software, not learn to code.

    Apple’s apps aren’t perfect, but they are good enough for most people, and most people are who Apple is aiming for, because most people spend money.
  • Why macOS Mojave requires Metal -- and deprecates OpenGL

    It is built-in obsolescence. Windows will run on 15-year-old computers while Mac OS will refuse to run on any Mac built more than six years ago. There is really no technical reason why Mac OS could not run on a 2010 Intel CPU. These are decisions made at the top of the company. Apple does what is best for Apple and not for its customers. The reason developers are not up in arms about Apple dropping OpenGL and OpenCL is that it really happened years ago when Apple stopped updating them. Mac OS is now about five years out of date. When you look at the extremely poor library of AAA games available on the Mac, know that Apple's poor hardware features and lack of cross-platform software support are the major reason. Of course people don't buy Macs to play games. Pretty soon people won't buy Macs at all.
    BULLCRAP. When Vista came out it barely ran on one-year-old machines. Windows 7 increased that to about two-year-old machines, and Windows 8 got to five-year-old machines. The only machines still running Windows after fifteen years were ones that, when new, would have cost close to NZ$10,000 thanks to beefy graphics cards and CPUs; your average home computer barely ran XP with any performance.

    I know this because trying to get two-year-old machines to run Windows 7 was a freaking nightmare: manufacturers decided they didn’t want to write drivers for the new OS because they preferred people to buy new machines. The headaches I had trying to get some big-name machines running even Windows 10 were more than Nurofen could handle.

    My Early 2011 MBP has been running the latest OS all that time, which is great for a machine that is 7 years old, and it’s actually nice to know that the only reason it won’t run Mojave is literally a hardware limitation.
  • Apple to let developers port iOS apps to Mac, starts with own apps in macOS Mojave

    I knew Apple would go this route because it’s the only route that makes logical sense.

    Adding UIKit made the most sense to me when I was trying to port a barcode tutorial app to macOS and couldn’t work out how to do it, despite developing it on iOS being super simple. I wished Apple would just port UIKit to macOS and have it work the way a macOS app would.

    I think Apple is able to read my mind. I mean, given my iPhone and Apple Watch are connected to me constantly, it seems maybe they’ve worked out how to read brain waves as well. That’s both super cool and super scary :wink: 
  • Apple agrees to settlement in shareholder derivative complaint over e-book antitrust case ...

    Rayz2016 said:
    Ah yes. The case where the judge indicated that Apple would lose even before the trial started, and then installed a personal friend as the monitor when they did lose. 

    At least now Cook has learned the value of lobbying. Pity it was a little bit late. 

    The money is loose change, but the oversight may hamper the company going forward. 

    That, and now e-books are priced higher because Amazon has a near monopoly, and yet the publishers, and more specifically the authors, are worse off than before.

    Good one America.
  • Here are the five biggest iPad Pro problems, because no device is perfect

    lowededwookie said:
    [...] I can edit video on an iPhone just as easily as using iMovie on the Mac
    "Easily," yes. Accurately, no. Fine adjustments are difficult using a finger on a small screen. One's choices are endless re-zoom operations or accepting edits that are "in the ballpark."

    The fact that a task is possible on a phone or tablet does not mean it's automatically equivalent to a laptop or desktop in terms of ease-of-use, speed, workflow (particularly within a facility where one's work is part of a chain), or any other productivity measure. iPads have opened up a new form of computing that is better than a laptop for some things. That's awesome in itself. It doesn't mean that it's better than -- or even equivalent to -- previously existing input and interaction methods for some kinds of work.

    Besides, even putting all that aside, the iPad Pro's marketing includes using the keyboard stand and an external monitor. Both make touch a less effective control method than using a mouse.
    I disagree for the simple reason that the pen is mightier than the sword/mouse. Ever tried drawing with a mouse? Ever tried to be precise with a trackpad? All they are good for is moving stuff around but you can do that with a touchscreen and get immense precision using the Pencil.

    Just because you’ve always done something one way doesn’t make it the best option. In fact, if you’re serious about video editing and precision on a Mac you’d use a jog control, not a mouse, or at the very least a trackball such as Logitech’s MX Ergo, which I’ve used since back when it was the TrackMan. It is far superior to a mouse or trackpad.

    While my complaint is largely semantic, I completely disagree that it’s the lack of a mouse that means the iPad Pro can’t be used as a desktop/laptop replacement. If that were so, why the hell is Adobe bringing over full Photoshop? Why is AutoDesk bringing over the full AutoCAD engine? The issue is not lack of mouse support but lack of software support, and we are starting to see that change now thanks to the original iPad Pro.

    The iPad Pro combined with the Apple Pencil is a very precise device and those decrying its abilities just don’t understand how computing is going to evolve because they’re stuck in the past. It’s Final Cut Pro X all over again and that was a moronic debate back then as well.
  • Apple's iOS 12 update is causing sporadic issues with iPhone charging

    My iPhone 6 Plus does the same thing, but it actually brings up a message about needing to be unlocked, and when I unlock it, it charges fine.

    My thinking is that it’s due to USB Restricted Mode, purely based on the messages I’m seeing.
  • Here are the five biggest iPad Pro problems, because no device is perfect

    Having spent over eighteen years working in IT, the one thing I’ve noticed about the way people use computers is that most of them would be far better suited to using an iPad, and that includes office workers. The writer of this very strange article complains that most people won’t use the power of the new iPad Pro, but then 99% of people don’t use the full power of a desktop or laptop either.

    In the past I needed a desktop/laptop simply because the software I needed to use wasn’t on iOS, but now that I’ve left the drudgery of IT this isn’t a problem, except for two pieces of software: Flux and Fusion360. I can work around Flux, and AutoDesk are going to bring the AutoCAD engine to iOS, so Fusion360 might not be far away.

    To say the iPad Pro is not a real computer simply because of its input methods proves that you don’t actually know the definition of a computer and therefore shouldn’t be writing for a computer-based site. That might seem rude, but an iPhone is literally, by definition, a computer. Are you going to tell me the ECU in a car is not a computer simply because there’s no way a human can interact with it directly? It’s absurd to think that way, and absurd to think that an iOS device can’t replace a desktop or a laptop when, as I’ve already mentioned, 99% of computer users would be better served by an iPad given their computing needs. That leaves the 1% who genuinely need more, and that seems to include the writer of this article.

    To say the iPad can’t be used as a computer simply because it can’t do what you want/need is a pathetic view of computing that is not grounded in reality. Apple knows this. Apple developed in the iPad a computer that is more powerful than 92% of the laptops on the market. Just let that sink in for a second. That means the iPad is capable of handling massive spreadsheets, 3D rendering, and music composition, because its power is amazing. The only thing that stops it doing so is not a lack of mouse but a lack of software.

    A mouse is cumbersome and not very accurate despite the claims. There’s nothing about file transfer that warrants a mouse at all. Hell, I can edit video on an iPhone just as easily as using iMovie on the Mac, so the lack of a mouse does not hold the iPad Pro back as a serious computing device.