Most of the code of an iOS application compiles and runs natively on macOS; most of the libraries are the same, and a large part of the UI infrastructure is also the same.
Actually, from a developer's point of view, iOS and macOS are already very, very close.
What is different: everything that deals with files, because of the iOS sandbox, and the actual UI code; the toolkits are different, with different hierarchies of Objective-C/Swift classes.
My personal guess (as an experienced developer) is that this project concerns a new user-interface toolkit, on macOS, on iOS, or probably on both, that allows writing user interfaces that run on both platforms, plus some other limited work, mostly support in Xcode for building for both platforms.
My guess is also that this will not be transparent and automatic; rather, it will allow a developer, through an explicit design decision, to target both platforms at the same time with the same code base, but possibly with custom code and functionality on each of the two (or more) platforms.
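To make the "same code base, custom code per platform" idea concrete: Swift already offers conditional compilation for exactly this. A minimal sketch (the `platformName` helper is hypothetical, just for illustration; `#if os(...)` is the standard mechanism):

```swift
// One shared code base; platform-specific branches are chosen
// at compile time via Swift's conditional compilation blocks.
// The same technique is commonly used to alias UIKit/AppKit
// types, e.g. `typealias PlatformColor = UIColor` vs `NSColor`.
func platformName() -> String {
    #if os(iOS)
    return "iOS"        // compiled only when targeting iOS
    #elseif os(macOS)
    return "macOS"      // compiled only when targeting macOS
    #else
    return "other"      // e.g. Linux, tvOS, watchOS
    #endif
}

print("Running on \(platformName())")
```

Only the branch for the current target is compiled at all, so each platform can carry code that would not even build on the other.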
Just a side point: if you do a technical analysis, rather than a market analysis, you should trace the history of NeXTSTEP, not the history of the Mac. The deeper you go inside Mac OS X, the more you find its NeXTSTEP origins, down to its BSD+Mach kernel. So the history of NeXTSTEP's platform changes is more interesting in terms of the technical potential of such a change.

And the truth is, NeXTSTEP was a multi-platform system from the beginning, based on portable code: in its early days, it moved from the 680x0 to the Motorola 88k RISC family, then to Intel, PA-RISC and SPARC. With Apple, it moved to PowerPC and became Mac OS X, moved back to Intel, and then, in some form, to ARM, as the basis for iOS and tvOS. And these are just the commercially available versions; I am sure that in the labs there were, and are, more.

Another technical point: no, most modern applications will not need a big effort; modern software technologies are far less platform dependent, even in performance-oriented code (after all, how many developers have the faintest idea of how Intel processors work internally?). My personal bet is that in more than 95% of cases, porting will be as hard as clicking "Build" in Xcode. Remember that the iOS development environment already compiles and runs iOS application code on Intel processors. Of course, then you need a big testing phase. And of course, it is still possible to write platform-dependent code, and somebody is forced to do it; but as long as the compiler is the same, it will not be a problem for most developers.

Maurizio
madan said:You know, come to think of it. You could get an iMac Pro, with 64 GB of RAM, a base Xeon, more storage and a Vega 56, stone the base Mac Pro over the head and it STILL COMES WITH A 5K LG MONITOR BUILT IN. How nuts is that?
Sure it doesn't come with support with 12 TB 3 lanes but srsly, you're probably not going to need that. A base Xeon wouldn't be able to handle that throughput anyways. So honestly, the iMac Pro is a better deal because in 5-6 years you just buy another iMac Pro and you get a whole new system, PLUS A WHOLE NEW MONITOR, to boot, for the same price. Like I said, the Mac Pro only makes "sense" once you start cracking the 20,000 USD threshold, once you start putting in GPU and CPU configurations that can handle the crazy bandwidth and performance that an iMac Pro just can't touch. But the base system? A Vega 56 is 30% faster than the Mac Pro's base Radeon 580. And the system costs LESS and BRINGS A MONITOR.
I have a Mac Pro 2009 running in my home studio; in 2009, I paid 3000 euros for it. It had SATA 2, USB 2, a few hard disks, a GT120 graphics card, and two 4-core CPUs running at 2.16 GHz. The lowest possible end.
Today, it runs with an NVMe SSD, has USB 3, an RX 580, and two 6-core 3.4 GHz CPUs, and it is still current.
All this was massively less expensive than buying the three iMacs that would have become obsolete in the same timeframe.
The Mac Pro is a PCIe machine; it can evolve, and that is the whole point. Of course, it shines most when you spend more than $20K, but it fully makes sense in any context where needs and gains evolve.
As of today, if I had that kind of need and money, I wouldn't buy an iMac Pro; I would buy a new Mac Pro, low end, and let it evolve as my needs do. It would be massively less expensive, and upgradable to technologies that do not yet exist, like USB 4. Ten years from now, it would still be useful.
An iMac Pro bought today, just before a refresh, will be obsolete in about three years, and is not upgradable.
Anyway, the point about the Mac Pro, as many posters here have said, is that it is not a machine for gamers, and not a machine for the masses; it is a machine for those who need it, and they will recognise it.
davgreg said: This gives me reason to pause on ordering one of these. As much as I want to have a headless desktop Mac that (hopefully) will have a long service life, Apple's propensity for proprietary connectors and stuff is a serious concern. I want to see the thing for myself and see what aftermarket stuff will be possible, as Apple charges a king's ransom for memory on everything it sells.

Well, the only actual limit is the first SSD you buy; nothing forbids you from installing a $20 NVMe PCIe adapter card with a Samsung SSD and using it as the system disk; you have plenty of PCIe slots on the Mac Pro. There are also very fast NVMe RAID cards that use 16 PCIe lanes and give stellar performance; you are really not limited to the proprietary SSD in a machine like this. I mean, I agree that the pricing just makes no sense, but in practice it is hardly a problem, if you have the need and the money for a machine like this one.
thrang said:So for the few percent of people that listen to 10 hours of music straight a day, or will be switching their ear buds among several devices constantly, I suppose there is some legitimate agitation. Just like there were a few percent of people who lamented the loss of floppy and CD media. But this is no reason to delay progress.
And this has nothing to do with the loss of floppy or CD media. In both cases, it was a natural evolution, following the simple fact that floppies and CDs no longer
met the needs. USB sticks and external disks took their place. USB is a standard, fits the needs better, and the whole market moved to it.
As a Sennheiser user, you know very well that 99.9% of mid-range and high-end headphones and earpieces use a jack for the connection, i.e. a standard connection for an analogue signal that is completely adequate to the need; the market is not moving away from analogue connections. All the music you listen to is made with analogue connections at the source, even for digital instruments and effects.
Lightning is not a standard analogue connection; you cannot use it even with your Mac, nor with any other equipment. It is proprietary, and does not fit the needs.
USB-C is a standard, and a move to it may eventually happen, but it leaves out all the equipment that is not digital.
Bluetooth is a standard connection, but the high-end audio market is simply not taking up Bluetooth, because most of the "other" uses of a headphone do not require a wireless connection.
There is no economic reason for companies like Sennheiser, Audio-Technica and Shure to move to Bluetooth, adding the cost of a receiver and of a good DAC system to products that already cost a significant fraction of the price of an iPhone.
So, no, Apple did not kill the analogue jack; it just moved the jack out to an adapter. This is not progress; it is just a nuisance, even if a minor one.
Can somebody explain to me where the innovation is? Three-sensor (video) cameras were common in the late '80s and early '90s. Probably the innovation is elsewhere, perhaps around resolution.