Apple unveils Swift, a brand new Xcode programming language for developers


Comments

  • Reply 141 of 214
crowley Posts: 10,453 member
[quote]var occupations = [
    "Malcolm": "Captain",
    "Kaylee": "Mechanic",
]
occupations["Jayne"] = "Public Relations"
[/quote]
    I like Swift already.
  • Reply 142 of 214
asdasd Posts: 5,686 member
It is somewhat disappointing that Swift leaves the Smalltalk syntax behind.

    I would have thought that Modern Obj-C take 2 would
    - resemble F-Script even more (do away with the bracket syntax for messages)
    - unify base types and classes (in F-Script, every value is an object, but Swift also tries to do this -- we'll see)
    - provide more developer friendly string manipulation syntax

It does allow easier string manipulation, very like C#.
  • Reply 143 of 214
asdasd Posts: 5,686 member
That said, the language has some potential pitfalls, such as the ease with which functions can be returned from functions. I can see that being overused. Returning tuples from functions? My solution to being able to return only one item, but needing more information than a primitive, was always to return a strongly typed object filled out with the needed values. I can see amateur programmers returning 100 results from a function and referencing them by index. Good lord.
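For what it's worth, Swift's named tuple elements mitigate the index-referencing worry; a minimal sketch (current Swift syntax, which differs slightly from the 2014 beta, and `minMax` is a hypothetical helper):

```swift
// Hypothetical helper: returns two related values as a *named* tuple,
// so callers write .min/.max rather than opaque .0/.1 indices.
func minMax(of values: [Int]) -> (min: Int, max: Int)? {
    guard let first = values.first else { return nil }  // empty input: no answer
    var lo = first, hi = first
    for v in values.dropFirst() {
        lo = min(lo, v)
        hi = max(hi, v)
    }
    return (min: lo, max: hi)
}

if let bounds = minMax(of: [8, 3, 5, 1, 9]) {
    print(bounds.min, bounds.max)   // named access, not by index
}
```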

There is some strange syntax around closures too. In the book they introduce a sorting closure which acts on an array. It starts off as a few lines of code and ends up as just >.

    Easy to write maybe. Hard to read, definitely.
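The progressive shortening described there looks roughly like this in current syntax (the 2014 book used a free `sort` function, but the shape of the progression is the same):

```swift
let names = ["Chris", "Alex", "Ewa", "Barry", "Daniella"]

// Step 1: full closure syntax, explicit parameter types and return.
let s1 = names.sorted(by: { (a: String, b: String) -> Bool in
    return a > b
})

// Step 2: shorthand argument names.
let s2 = names.sorted(by: { $0 > $1 })

// Step 3: just the operator -- the "ends up with just >" the book arrives at.
let s3 = names.sorted(by: >)
```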
  • Reply 144 of 214
asdasd Posts: 5,686 member
    crowley wrote: »
    I like Swift already.

Except for the var, that is available in Objective-C 2.0

    It would be:
    NSMutableDictionary *occupations = [@{
        @"Malcolm": @"Captain",
        @"Kaylee": @"Mechanic",
    } mutableCopy];

    occupations[@"Jayne"] = @"Public Relations";


So I suppose you do need the square brackets to make the dictionary mutable, but mostly that's the same.
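One nuance on the Swift side of that comparison: mutability there comes from `var` versus `let`, not from the literal syntax; a quick sketch:

```swift
// Declared with var, so the dictionary is mutable in place.
var occupations = ["Malcolm": "Captain", "Kaylee": "Mechanic"]
occupations["Jayne"] = "Public Relations"

// Declared with let, so any mutation is rejected at compile time:
let crew = ["Malcolm": "Captain"]
// crew["Jayne"] = "Public Relations"   // error: cannot assign through subscript
```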
  • Reply 145 of 214
ascii Posts: 5,936 member
Quote:

Originally Posted by Rayz View Post

Performance an issue? Could be; no one knows for sure just yet, but your argument is very similar to ones I heard many years ago:

"C??? Do me a favour. No one will use it, for performance reasons; assembler's the way to go. Always will be."

And the fact is that 99% of the applications written for Apple's platform don't need bare-metal coding to run at a decent speed, especially when weighed up against how much faster folk will be able to develop with Swift as opposed to using Objective-C. The language will also give them a boost in the enterprise space, where they certainly haven't shown much interest in developing Objective-C applications, even with Apple's surging numbers there.


Your argument is out of date. It might have been convincing if this were the early 2000s, but it's 2014 now. Back then everyone was on PCs, and it was more memory and more CPU cycles as far as the eye could see. At that point it made sense to give up a bit of efficiency for some ease of development. But all that changed in 2007 with the introduction of the iPhone: suddenly the specs reset, and 128MB of RAM and a 400MHz 32-bit processor was the new baseline. We had entered the age of miniaturization.

The iPhone was basically a little pocket computer, running a cut-down version of a full desktop OS. Ask yourself, then, why Microsoft was not the next cab off the rank with a fully featured computer-phone. They were the best positioned: they had a full desktop OS all ready to go; they just had to shrink it down to run on those lower specs. But they couldn't, because they had spent the last several years rewriting a bunch of their stuff in .NET because of arguments like yours. In order to save a few developers from having to wrestle with pointers they had strategically, at a corporate level, lost flexibility.

The iPhone has respectable specs now, and so the same old arguments about whether to move to a more script-like language are coming back. Clearly they have been taking place inside Apple! The thing is, do we believe the mobile phone is the final form the personal computer will take, for all time? Or is it likely there might be another spec reset in the future, when the PC becomes a wristwatch? And then the specs of the watch grow until, boom, another reset: the PC is now a ring on your finger. 128MB RAM again. A 150MHz body-heat-powered CPU. Uh-oh, all this stuff written in Swift will have to be rewritten in C to work at 150MHz; now Samsung will beat us to market.

I don't believe in wasting developers' time unnecessarily, but until the client-side form factors settle down, it behooves corporations to use high-performance languages on the client side, at least for the next 10 years or so. Apple should deprecate Swift immediately, or if that is too embarrassing, repurpose it as a server-side language. On the server, the form factors are settled and the assumptions that underlie scripty trade-offs are known to hold. Java is a good example of this: it was launched as a client-side language but really hit its stride on the server.

  • Reply 146 of 214
Quote:

Originally Posted by Danox View Post

Yes, in time the real reason for the existence of Swift will become apparent. Swift is easily the biggest long-range news of the many announcements at this year's conference.

The clue is in the name 'Swift'.

Swift is related to current cutting-edge languages with functional features, such as Scala.

The clever thing about this is the way in which it helps future hardware run programs faster, because more of the code can automatically be run on more processor cores. That's the user benefit, plus of course the fact that more apps will be built more quickly by developers.

But the performance potential of Swift is what will hurt the competition immediately. More importantly, this makes functional programming with these modern features a mainstream technology. It will hurt Java most strongly and give a big boost to Scala. Once a programmer understands how to work with this kind of language, it's easy to switch to another one.

Once again, Apple uses its industrial might to move the world's technology stack onto another level.

Once again Apple shows the value of owning the technology stack, matching the software to their custom hardware. The only snag is that it's proprietary, but that's fine for developers.

In addition, with Playgrounds, Apple has realised a dream of programming-tools designers that is over forty years old; that's why the jaws of the old guys in the audience were dropping. It's the dream come true.

  • Reply 147 of 214
Marvin Posts: 15,309 moderator
    ascii wrote: »
    I agree with mdriftmeyer that Swift may find a place as a rapid prototyping language but won't displace ObjC. And I agree with ecs that eventually Swift will be forgotten and C/C++ will still be going strong.

    I don't think so. C# and .Net have shown how there can be wide adoption of a new language if it improves developer productivity:

    http://www.computerworld.com.au/article/261958/a-z_programming_languages_c_/?

    Most people were familiar with C, C++, Java, Python before Objective-C arrived and it was a language nobody wanted to learn and many didn't. People took ages to port Carbon code to Cocoa, even Apple with FCP. 3rd party SDKs for making games bypass it - Unreal, Unity, Corona etc don't use Objective-C.

    If developers are more familiar with the language and SDK tools, they won't need to bother about using the 3rd party ones.

    Cross-platform developers won't bother with it just as they don't with Objective-C so it won't replace C/C++ but it can easily displace a lot of Objective-C code and Apple seems to be encouraging this:

    https://developer.apple.com/library/prerelease/ios/documentation/Swift/Conceptual/BuildingCocoaApps/Migration.html

The interactivity aspect of it is huge, as it behaves like a runtime scripting language. IMO, Objective-C is history, not Swift.
  • Reply 148 of 214
asdasd Posts: 5,686 member
    ascii wrote:


I don't believe in wasting developers' time unnecessarily, but until the client-side form factors settle down, it behooves corporations to use high-performance languages on the client side, at least for the next 10 years or so. Apple should deprecate Swift immediately, or if that is too embarrassing, repurpose it as a server-side language. On the server, the form factors are settled and the assumptions that underlie scripty trade-offs are known to hold. Java is a good example of this: it was launched as a client-side language but really hit its stride on the server.

It's not just a scripting language. It runs on the Objective-C runtime, presumably compiled at compile time but interpreted when "scripting".
  • Reply 149 of 214
asdasd Posts: 5,686 member
Quote:

Originally Posted by Marvin View Post

I don't think so. C# and .Net have shown how there can be wide adoption of a new language if it improves developer productivity:

http://www.computerworld.com.au/article/261958/a-z_programming_languages_c_/?

Most people were familiar with C, C++, Java, Python before Objective-C arrived and it was a language nobody wanted to learn and many didn't. People took ages to port Carbon code to Cocoa, even Apple with FCP. 3rd party SDKs for making games bypass it - Unreal, Unity, Corona etc don't use Objective-C.

If developers are more familiar with the language and SDK tools, they won't need to bother about using the 3rd party ones.

Cross-platform developers won't bother with it just as they don't with Objective-C so it won't replace C/C++ but it can easily displace a lot of Objective-C code and Apple seems to be encouraging this:

https://developer.apple.com/library/prerelease/ios/documentation/Swift/Conceptual/BuildingCocoaApps/Migration.html

The interactivity aspect of it is huge, as it behaves like a runtime scripting language. IMO, Objective-C is history, not Swift.

     

The problem is that Swift doesn't interoperate with C++ at all, which means the app has to be Objective-C at least in the parts where you interoperate with the C++ layers.

  • Reply 150 of 214
ascii Posts: 5,936 member
Quote:

Originally Posted by Marvin View Post

I don't think so. C# and .Net have shown how there can be wide adoption of a new language if it improves developer productivity:

http://www.computerworld.com.au/article/261958/a-z_programming_languages_c_/?

Most people were familiar with C, C++, Java, Python before Objective-C arrived and it was a language nobody wanted to learn and many didn't. People took ages to port Carbon code to Cocoa, even Apple with FCP. 3rd party SDKs for making games bypass it - Unreal, Unity, Corona etc don't use Objective-C.

If developers are more familiar with the language and SDK tools, they won't need to bother about using the 3rd party ones.

Cross-platform developers won't bother with it just as they don't with Objective-C so it won't replace C/C++ but it can easily displace a lot of Objective-C code and Apple seems to be encouraging this:

https://developer.apple.com/library/prerelease/ios/documentation/Swift/Conceptual/BuildingCocoaApps/Migration.html

The interactivity aspect of it is huge, as it behaves like a runtime scripting language. IMO, Objective-C is history, not Swift.

At this point in history, the age of miniaturization, improving developer productivity must come second to hardware efficiency.

     

Currently Apple's entire system is written in C/C++/ObjC, which is very efficient and a major strategic advantage for the company. For example:

Microsoft (due to their use of .NET) and Google (due to their use of Java) will not be able to release an iRing (for example) until a ring form-factor device can be manufactured that has 512MB of RAM and an 800MHz processor. Apple, due to their C-based stack, can release an iRing as soon as a 128MB/400MHz device becomes possible.

     

Depending on the speed of technological advance, this could give Apple a 1-2 year head start in the market for every new form-factor PC. That is not worth giving up for a bit of extra developer productivity; I mean, it's not like there's a shortage of apps.

     

Since Tim Cook took over, he has been putting pressure on his competitors by maintaining a breakneck pace (e.g. OS releases every year). This plays to his strengths as an operations maestro. But if he really wants to screw with his competitors he should release devices that are too small to run their stacks.

  • Reply 151 of 214
ascii Posts: 5,936 member
    Quote:
    Originally Posted by asdasd View Post





It's not just a scripting language. It runs on the Objective-C runtime, presumably compiled at compile time but interpreted when "scripting".

Are you sure about that? In the Swift demo in the State of the Union video, at about the 21-minute mark, he explicitly says that it has its own runtime, and that "much of the functionality is not baked in to the compiler, but is done in the runtime, which itself is written in Swift." He says that even core things like Int are defined in the runtime! That sounds like a very dynamic, runtime-y, script-y language to me.

  • Reply 152 of 214
asdasd Posts: 5,686 member
    Quote:
    Originally Posted by ascii View Post

     

Are you sure about that? In the Swift demo in the State of the Union video, at about the 21-minute mark, he explicitly says that it has its own runtime, and that "much of the functionality is not baked in to the compiler, but is done in the runtime, which itself is written in Swift." He says that even core things like Int are defined in the runtime! That sounds like a very dynamic, runtime-y, script-y language to me.


Runtime introspection is also a feature of Objective-C. What they mean, I think, is that it will decide that your variable is an Int or a Double depending on what you are assigning to it. That code has still been compiled. However, you can absolutely still generate compile-time warnings; that's all over the document. Therefore the final code is compiled, not run on an interpreter or JIT. The playground is probably interpreted, though.

http://www.theregister.co.uk/2014/06/02/apple_aims_to_speed_up_secure_coding_with_swift_programming_language/
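A small sketch of that point: the inference is resolved at compile time from the initializer expression, and the compiler then enforces the fixed type, so none of these checks survive into the running program:

```swift
let count = 42                       // inferred as Int when compiled
let ratio = 3.5                      // inferred as Double
// let bad: Int = ratio              // compile-time error, not a runtime check
let total = Double(count) + ratio    // mixing the two needs an explicit conversion
print(total)
```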

  • Reply 153 of 214
nht Posts: 4,522 member
    crowley wrote: »
    I like Swift already.

    Go shi! Did I miss that in the book? Shiney!
  • Reply 154 of 214
nht Posts: 4,522 member
    asdasd wrote: »
    Except for the var that is available in objective C 2.0

    Please turn in your geek badge on your way out the door.
  • Reply 155 of 214
asdasd Posts: 5,686 member

It's still a second-class language, though. As far as I can see all the frameworks are Objective-C, but they can be imported into Swift projects. However, that means the nomenclature of the methods or functions you are calling is different: the square brackets are gone, the init... is gone. You have to be aware of this, like Xamarin.
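A concrete example of the renaming asdasd describes, with the Objective-C spelling left in comments (the bridged names shown are the current ones and have shifted across Swift versions, so treat the exact spellings as illustrative):

```swift
import Foundation

// Objective-C: NSString *s = [[NSString alloc] initWithFormat:@"%d items", 3];
// In Swift the brackets disappear and initWith... becomes an initializer call:
let s = String(format: "%d items", 3)

// Objective-C: [@[@"a", @"b"] componentsJoinedByString:@", "];
// The selector is bridged to a plain method with a trimmed argument label:
let joined = (["a", "b"] as NSArray).componentsJoined(by: ", ")
```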

  • Reply 156 of 214
asdasd Posts: 5,686 member
Quote:

Originally Posted by nht View Post

Please turn in your geek badge on your way out the door.

Literals are available in Objective-C 2.0.

  • Reply 157 of 214
dewme Posts: 5,332 member

Very interesting discussion, and very insightful reasoning behind ascii's point of view. There's something to be said for maintaining your portability and agility by staying lean and mean. But it's not as if Microsoft didn't try to make a play for portability and small form factors with Windows CE. I'm not sure whether Microsoft's inability to quickly produce an iPhone competitor was determined more by their technology decisions or by their business strategy. I think it's primarily the latter: the business strategy stumbled, and once they were exposed by their actions on the business side they didn't have the right technology to react quickly enough to the disruptive threat of the iPhone and its App Store ecosystem.

I believe the App Store model disrupted Microsoft and the PC market as much as the iPhone itself did, although you could argue that the two are intertwined. Tens of thousands of cheap, snack-sized apps versus big, fat, integrated do-everything apps that shipped on at least one DVD (and sometimes more) and cost hundreds of dollars. Apple's decision to open up the iPhone as an app platform (even if curated) was a broadside shot that the PC juggernaut was not prepared to handle. An iPhone with an Apple-only app store would still be a compelling smartphone product, and possibly a market leader, but it would not have set the stage to disrupt the entire PC industry like it has in its different form factors, including iPad and AppleTV.

     

Microsoft's addiction to selling Big $ OS and Big $ Office licenses, at a time when PC vendors were still in a healthy growth phase, was as much of an impediment to Microsoft meeting Apple's disruptive force as their technology factors were. I don't think Microsoft's efforts around more accessible programming frameworks and languages like .NET and C# (and VB.NET) were the reason they fell behind. Those technology and toolset innovations were the right things to do for sustaining their existing businesses at the time, which they had to do. In retrospect they could have positioned themselves to better handle the disruption that the iPhone and App Store represented by having a leaner and more portable stack in hand. But retrospection is always easier than prediction, and the nature of disruptive innovation is that many of the things you do to sustain your current business and current customers set you up to fail when disruption hits. That's why you need a multi-pronged strategy, not a singularly focused investment strategy, in either technology or business. Apple's effort to improve their current toolset is necessary and provides incremental improvements for some existing customers. If this is Apple's only play, then yes, they are making themselves more susceptible to disruption for iWatch, iRing, and iFollicle potential form factors. One would hope that Apple is preparing for multiple scenarios that could occur over the next several years, rather than betting the farm on one approach or milking one cow to death.

  • Reply 158 of 214
nht Posts: 4,522 member
    ascii wrote: »
At this point in history, the age of miniaturization, improving developer productivity must come second to hardware efficiency.

    First, no.
    Second, it's not either or anyway.
Currently Apple's entire system is written in C/C++/ObjC, which is very efficient and a major strategic advantage for the company. For example:

Microsoft (due to their use of .NET) and Google (due to their use of Java) will not be able to release an iRing (for example) until a ring form-factor device can be manufactured that has 512MB of RAM and an 800MHz processor. Apple, due to their C-based stack, can release an iRing as soon as a 128MB/400MHz device becomes possible.

There is no real reason why Swift can't be as efficient; it depends on the compiler. Also, Java can be compiled to a native binary if desired; the syntax isn't all that relevant. Google could do that tomorrow if they wanted. C++/CX is MS's version for WinRT.
Depending on the speed of technological advance, this could give Apple a 1-2 year head start in the market for every new form-factor PC. That is not worth giving up for a bit of extra developer productivity; I mean, it's not like there's a shortage of apps.

Except that developer productivity also means faster time to market and, in this case, even more developer lock-in.
  • Reply 159 of 214
nht Posts: 4,522 member
    asdasd wrote: »
     Literals are available in Objective C 2.0. 

I would never hire a coder that misses sci-fi references when they're pointed out a second or third time. It indicates either an incomplete education or an inappropriate personality, as well as an inability to step back, see the forest instead of the trees, and google if unfamiliar.
  • Reply 160 of 214
asdasd Posts: 5,686 member
Quote:

Originally Posted by nht View Post

I would never hire a coder that misses sci-fi references when they're pointed out a second or third time. It indicates either an incomplete education or an inappropriate personality, as well as an inability to step back, see the forest instead of the trees, and google if unfamiliar.

I do the hiring here. That's hilarious: a conversation with Comic Book Guy. OK, I'll google all your comments in future to see if they reference a 1960s Russian TV program abandoned after two months.

     

    Just in case.
