Marvin

About

Username: Marvin
Visits: 116
Roles: moderator
Points: 6,091
Badges: 2
Posts: 15,326
  • Apple AI guru Tom Gruber speaks of artificial intelligence's 'inevitability' at TED

    macplusplus said:
    Alan Turing is the one who submitted one of the most powerful proofs against thinking machines: the Halting Problem. Basically it says something like this: it is impossible to write a computer program that will determine, for a given input, whether another program will get stuck or terminate successfully.
    This isn't a proof against thinking machines; human beings also can't determine, given a question, whether they will be able to provide a definitive response. The solution to this is a timeout: give a rambling, incoherent response to cover the fact you didn't figure it out, or do something else while the problem is being considered, always using asynchronous processing. People don't dwell on questions or inputs indefinitely, and what differentiates machines from people here is motivations and priorities.
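    The timeout idea can be sketched directly: since we can't decide in general whether a computation halts, we bound the work we're willing to do and fall back to "something else" when the budget runs out. A minimal sketch in Python (the step functions are hypothetical examples for illustration, not anyone's actual code):

```python
def run_with_budget(step_fn, state, max_steps=10_000, fallback=None):
    """Iterate a step function until it reports completion or the budget runs out.

    We can't decide in advance whether the process halts (halting problem),
    so we bound the number of steps we'll take -- the machine equivalent of
    a person giving up and moving on to something else.
    """
    for _ in range(max_steps):
        done, state = step_fn(state)
        if done:
            return state
    return fallback  # budget exhausted: answer with something else instead of hanging


# Hypothetical examples: Collatz iteration from 27 halts well within the
# budget, while a self-looping step never reports completion.
def collatz_step(n):
    if n == 1:
        return True, "halted"
    return False, (n // 2 if n % 2 == 0 else 3 * n + 1)

def loop_step(n):
    return False, n  # never done
```

    A caller never hangs: `run_with_budget(collatz_step, 27)` finishes normally, while `run_with_budget(loop_step, 0, fallback="gave up")` hits the step budget and returns the fallback.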

    Machines don't have overarching drives or motives to produce a result, they are commanded to do tasks by people. An AI simulation is controlled by a person as to what inputs it is exposed to, how it processes the inputs and how it delivers the output. Human beings have biological motivations that determine how they process information. Every day there is an energy cycle that produces motivations for dealing with hunger, tiredness, boredom and so on. The motivations are what determine how long people dwell on certain activities and they are dynamically culled if other motivations exceed their priority level.

    Another driver of future decisions is the biological response, also missing in machines, which is partly formed from our identity. We have motives to survive or to enhance quality of life that make us undertake certain activities, and failure to achieve the desired results creates an emotional response that shapes future priorities.

    Fundamentally, people don't want AI because we want to stay in control. We never want a response from an AI assistant to be 'do it yourself', and we don't want a vehicle AI to decide that it wants to see Canada on the commute to work, so we fence them in to think like a human but not behave like one. But you can't separate those two things and produce the same output, because they drive each other.

    This limiting of inputs limits how well an AI can perform. The mind doesn't just have lots of processing power; it also has huge storage capacity, high bandwidth to that storage, and complex mappings across it:

    https://www.scientificamerican.com/article/what-is-the-memory-capacity/

    The data covers a huge variety of information of differing classes and importance. There's visual and audio data and data on how to interpret those differing classes of data, which includes understanding of cultural/social/technological differences.

    If we have machines with enough storage, properly formatted data, enough bandwidth, enough processing power and fully asynchronous processing, and give them simulations of the motivations that drive people, then there's no reason they can't accurately simulate the output. I don't think there's a practical way to do that and, as I said, there's not really a desire to do it because it couldn't be controlled.

    We can certainly have AI in specific areas that do as good a job as a human. One of the most obvious areas would be customer support as there are limited inputs and responses. Driving also involves limited decision making. AI doesn't need to reach the level of a human being to replace a limited role that human beings do.
  • Why Apple should cater to 'serious' gamers - and why it probably won't

    sockrolid said:
    Why?
    Because high-end gaming rigs generate high-margin sales.

    Why not?
    Because the high-end gamer market is a tiny niche.
    The high-end gaming market is not a tiny niche. PC gamers generated almost $32 billion in revenue in 2016. The mobile gaming market generated $36.9 billion. For comparison, the console game market generated $30 billion in revenue in 2016.
    The overall industry generates a lot of revenue, but those figures merge everything (hardware and software) and include the online MMOs that generate billions, mostly in Asia and from in-app purchases:

    http://2p.com/44700975_1/F2P-MMO-Earns-171-Billion-in-2016-Thats-6-times-more-than-P2P-MMOs-by-shaylynsun.htm
    http://dotesports.com/league-of-legends/league-of-legends-2015-revenue-2839

    Apple has to buy the GPUs from the manufacturer, so they don't keep all the revenue from the hardware sale, and they'd just be taking a cut of the games. There are some hardware shipment figures at the following site; some numbers are estimates from market share:

    http://www.anandtech.com/show/10613/discrete-desktop-gpu-market-trends-q2-2016-amd-grabs-market-share-but-nvidia-remains-on-top

    There are some revenue numbers here:

    https://www.nextplatform.com/2017/02/13/nvidia-tesla-compute-business-quadruples-q4/

    You can determine some numbers from the GPU company earnings. NVidia makes about $4b from GPUs per year. The NVidia 1080 is about $500 so if they only sold 1080s, they'd be selling 8m units per year across the entire 300m unit PC industry. If Apple's ratio here mirrored their worldwide PC marketshare, that would be <1m units vs ~18m Macs.
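    Those back-of-envelope numbers can be checked directly (all inputs are the figures above; the Mac share is simply 18m Macs over a 300m-unit PC market):

```python
# Unit math from the figures above (rough by design).
gpu_revenue = 4_000_000_000        # Nvidia GPU revenue per year, ~$4b
price_1080 = 500                   # GTX 1080, ~$500

units_if_all_1080 = gpu_revenue // price_1080   # 8,000,000 units/year ceiling

pc_units = 300_000_000             # whole PC industry, units/year
mac_units = 18_000_000             # ~18m Macs/year
mac_share = mac_units / pc_units   # 0.06

# If Apple's slice of high-end GPU buyers mirrored its PC market share:
apple_slice = units_if_all_1080 * mac_share     # 480,000 -- well under 1m
```

    And 8m/year is itself an overestimate of 1080-class sales, since that revenue includes mobile and lower-end parts.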

    NVidia doesn't sell just high-end units either so GPUs like the 1080 aren't selling as many as 8m per year, their revenue includes all their mobile and lower-end desktop GPUs.

    To see where the iMac stands vs desktop GPUs like the 1070, there are some details here:

    https://www.techpowerup.com/gpudb/2708/radeon-r9-m380 (1.5TFLOP)
    https://www.techpowerup.com/gpudb/2809/radeon-r9-m395x-mac-edition (3.7TFLOP)
    https://www.techpowerup.com/gpudb/2840/geforce-gtx-1070 (5.7TFLOP)
    https://www.techpowerup.com/gpudb/2870/geforce-gtx-1080-mobile (7.9TFLOP)
    https://www.techpowerup.com/gpudb/2877/geforce-gtx-1080-ti (10.6TFLOP)

    Here's the R9 M390 in the $2k iMac; it only really struggles at 4K:
    https://www.notebookcheck.net/AMD-Radeon-R9-M390-Benchmarks.153797.0.html

    The iMac GPUs haven't been updated in a while, so this year's model should be able to go up about 50-100% in performance. I'd say at least 5TFLOPs, and it could be as high as 7TFLOPs at the high end (Xbox Scorpio should be around 6TFLOPs). That's around a 980 Ti and enough to run almost any game at 4K Ultra:
    https://www.notebookcheck.net/NVIDIA-GeForce-GTX-980-Ti-benchmarks-and-specs.169446.0.html

    The majority of gamers are on mid-range hardware. The Steam survey lists GPUs used:

    http://store.steampowered.com/hwsurvey/videocard/

    The top GPU is a 970:
    https://www.techpowerup.com/gpudb/2620/geforce-gtx-970 (3.5TFLOP)
    https://www.notebookcheck.net/NVIDIA-GeForce-GTX-970.146750.0.html

    The game results place this about 2x the performance of the M390 in the $2k iMac and roughly the same as the M395X in the $2300 iMac.

    Apple is already catering to the widest gaming audience with the iMac. HP has some gaming PCs here:

    http://store.hp.com/us/en/mdp/desktops/hp-omen-desktop

    The $999 1060 model is around the iMac performance level. Add a 27" UHD display ($400 for 4K; 5K is $800+ if it's available at all), while a decent 24" monitor is $200. Overall that's about $1200-1400. For the build quality, the iMac is worth the extra when you have to use it every day for years.

    Having to set up Boot Camp isn't for everyone, and nobody really wants to pay for Windows. If there was a way to get all the Windows games to run close to full speed without the annoyance of Windows and rebooting, that would make gaming better on the Mac. I think it would be good if Apple commissioned game ports, but another way would be to design the system so that Windows can run alongside the Mac system without all the partitioning and rebooting. They could have a filesystem container that isolated the Windows data in NTFS format; it might need a custom version of Windows, though, and it would only be for a few million potential users.

    The external GPU route would help the majority of Apple users as they are on laptops. Apple could design their own TB3 boxes with 1x, 2x or 4x included GPUs suitable for different tasks and allow the 4 GPU box to have more than one TB3 connection.
  • All-new Mac Pro with modular design, Apple-branded pro displays coming in 2018

    I don't think Apple got the design "wrong." I really like the one we have. I just couldn't understand why it remained untouched for three freakin' years. Now we know -- the current components already max the thermal capacity of the existing design. So I guess in a way you're right that the design had a flaw -- it can't accommodate new components -- but with the components it DOES support it works really well.
    It can accommodate new components, because the new components increased performance per watt, which is all that matters. They can underclock any part to fit a thermal profile. The new model outperformed the tower model it replaced with far lower power usage. They can double the performance of the GPUs in the same design and increase the CPU by about 60%.

    The truth is the high-value workstation market was dead, just like the XServe. If it wasn't dead in 2012, when they still sold the expandable towers, they wouldn't have redesigned it. The redesign and marketing was to try to boost sales again and win market share back from Dell and HP. We know it was dead because they practically stopped selling the tower model and it made zero difference to their balance sheet, so revenue would likely have been under $0.5b.

    The possible components that could go in the machine right now are single CPUs that are about 60% faster (or 100% faster if they used dual CPUs), and GPUs at 50-60GFLOPs/W vs 24GFLOPs/W in the Mac Pro, i.e. 100% faster GPUs.
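    Since the enclosure fixes the thermal budget, the GPU gain follows directly from performance per watt. A quick sanity check (the efficiency figures are the ones quoted above; the 130W per-GPU budget is an assumed round number, not Apple's spec):

```python
# At a fixed power budget, performance scales with GFLOPs per watt.
old_eff = 24                         # GFLOPs/W, 2013 Mac Pro GPUs
new_eff_low, new_eff_high = 50, 60   # GFLOPs/W, current parts
power_budget = 130                   # watts per GPU -- assumed, fixed by the enclosure

old_gflops = old_eff * power_budget
speedup_low = new_eff_low * power_budget / old_gflops    # ~2.1x
speedup_high = new_eff_high * power_budget / old_gflops  # ~2.5x
# i.e. roughly 100-150% faster GPUs in the same thermal envelope
```

    Note the power budget cancels out of the ratio, which is why "performance per watt is all that matters" for a sealed design like this.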

    The reason they don't just update it is probably that the new one hasn't boosted sales: it's still selling as poorly as the old tower models, and they might not be able to source custom GPUs from AMD, because if AMD only produces monolithic chips, Apple can't afford to use two of them and underclock them. The move to the iMac Pro was fully expected as the next step. It's unexpected that they'd build a modular computer on top of this, because it's going to sell even worse, but it's obviously a market that they still want to try to serve. In the long term, these issues work themselves out; the XServe is a distant memory now and nobody even talks about it any more, because the industry found more suitable solutions. GPUs can probably still double in performance another 2-3 times, so where people need quad GPUs now, a single GPU will do that in about 4 years (and mobile in 6 years), eroding the need for higher-end workstations to the point that the complaints just die out.
  • Apple upgrades 2013 Mac Pros with more cores & faster GPUs

    Soli said:
    I also wouldn't have allowed this Mac Pro to launch unless I thought that there was a long upgrade path for the case and chassis design (which I assume they also thought was going to be possible as the industry had mostly moved toward lower TDP and power consumption of components). I certainly can't imagine that they didn't have future plans.

    PS: Is this Mac Pro now more of a flop than the Cube? I think it might be.
    Part of the problem would be the custom AMD GPUs. Quadros are much more expensive so they are mostly stuck with AMD:

    https://www.amazon.com/PNY-NVIDIA-Quadro-M6000-VCQM6000-PB/dp/B00UXHQHJS

    AMD isn't in a position financially to engineer custom GPUs for such a small volume of buyers. AMD has made close to $3b in losses over the last 5 years:

    https://www.reddit.com/r/Amd/comments/2vzk5z/amd_is_getting_very_close_to_bankruptcy_nobody/
    http://marketrealist.com/2016/10/amd-turn-near-bankruptcy-financial-flexibility/

    They have done some debt restructuring to help but they are still producing massive losses. They haven't delivered much in the way of workstation GPUs for a while.

    Between that, Intel not delivering much in the way of worthwhile upgrades at the same price points, and the migration of users to iMacs, the audience for this type of machine is very small now. Across the whole PC industry, workstations are 1-2%, and as with every segment, most of the sales are the lower-end models. That's where it's different from the Cube, because back then the workstation audience was a far larger portion of buyers. Most users that were on workstations are now on iMacs and MBPs.

    The 2012 Mac Pro was the old design and people didn't upgrade to it from their older models. No Mac Pro design will ever sell in high numbers any more, because the entire worldwide market is small; this is about satisfying a very small portion of buyers who do very computationally intensive workloads, for the sake of respecting the importance of that work.

    Part of the solution here will be the iMac Pro, because it will offer better value than the entry-level Mac Pro, and this will further diminish the audience for a headless box. What that box will be like remains to be seen, but it's still not going to be what everyone wants: it will have compromises like any machine, it will be very expensive, and it will be very late, being over a year away from now. If it keeps everyone happy then it's all good, and it's a good thing for Apple to be reaching out to Mac users, but there has to be a minimum threshold of customers below which they stop selling models altogether, maybe 10k units/year or something.

    The iMac Pro should suit a lot of users. They can design it so that the display opens up, e.g. have the display on a hinge at the base and have it open forward with clasps that hold it at the top. In the middle, they can put a full-length desktop GPU (the 12.5TFLOP Vega is somewhere around 200W) that can be switched out, and have a large fan (bigger than the Mac Pro's) behind it pulling air out the back.

    Easy access to storage, RAM and GPU. They could even leave a mini PCIe slot to allow for a card to connect to external boxes, like a multi-GPU box, via a vent. They'd just copy the framebuffer from the GPU to the IGP to allow it to send display output over Thunderbolt.
  • Adobe research creates AI tool for transferring image styles between photographs

    spice-boy said:
    Am I the only one that thinks the "after" pictures are horrific?
    They look close to the reference. This would be an extreme example of how much they can change the source to become like a given reference. In real-world cases, people would just want to match composited elements into a scene, e.g. you crop a person out and paste them in somewhere and need the lighting/shadowing and color temperature to match the surroundings, or you have a series of images that you need to match as a group.
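    The color-matching part of that workflow can be approximated with a classic statistics transfer: shift each channel of the pasted element so its mean and spread match the surrounding region. A toy per-channel sketch in Python (not Adobe's method; real tools work per-region and in perceptual color spaces):

```python
from statistics import mean, stdev

def match_channel(source, reference):
    """Remap one color channel so its mean/stdev match the reference's.

    A crude stand-in for matching color temperature between a composited
    element and its surroundings: the pasted pixels take on the reference
    region's brightness and contrast statistics.
    """
    s_mu, s_sigma = mean(source), stdev(source)
    r_mu, r_sigma = mean(reference), stdev(reference)
    scale = (r_sigma / s_sigma) if s_sigma else 1.0
    return [min(255.0, max(0.0, (v - s_mu) * scale + r_mu)) for v in source]

# Toy example: a dark, flat patch remapped toward a brighter scene region.
pasted_red = [10, 20, 30]
scene_red = [100, 120, 140]
matched = match_channel(pasted_red, scene_red)
```

    Run per R/G/B channel over the pasted region, the element inherits the scene's overall color cast; the AI part is deciding which regions to match against.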

    You'd be able to have the computer match portions of images to other images, something that would take hours to do manually. All these tools are just there to save time. They should apply AI to cropping too: humans can easily see what an object is relative to a background, but computers can't, so humans have to slowly crop around the image. With object/shape recognition, the computer could do the job much more effectively.