Apple's top-secret Swift language grew from work to sustain Objective-C, which it now aims...


Comments

  • Reply 61 of 93
    SpamSandwich Posts: 33,407 member
    Google is going to have a really tough time impressing developers at their upcoming conference. I predict the sound of a very loud 'sad trombone' opening their keynote.
  • Reply 62 of 93
    techguy911 Posts: 269 member
    Quote:
    Originally Posted by wizard69 View Post





    I think I've managed to read a dozen or so pages and I have to agree it has a nice feel to it.

    Obviously I haven't if the manual is yet to be read. Maybe a little less time on AI, Reddit and so forth.



    In any event I see some initial possibilities that will make this platform great if they come to pass:

    1. Apple submits the language to a standards body so that wider adoption can happen!

    2. The way this works within Xcode would make it an excellent platform for use as a scripting language within apps. In fact, if I were Apple, this would be a high priority for iWork.

    3. I have a feeling this could crush Python if it gets widely adopted. I really hope that Apple is hard at work porting libraries to the platform and doesn't forget about non GUI apps. So I'm hoping that Apple has goals to develop a large standard library in the same way many scripting languages have such libraries. It would be even better if Apple supports the language with a repository of modules that are open sourced.


     

    Apple really should have done this at the announcement of Swift.  We really need a modern native speed compiled language that can compete with Python and Java.  I hope Apple is not planning on keeping Swift proprietary and Apple only.  Google half-assed the Go language like they do everything and it still feels like a beta project they are not serious about finishing.  Microsoft tolerated C# and .NET being used on non-Microsoft platforms but did nothing to help its adoption elsewhere.


  • Reply 64 of 93
    ripper Posts: 7 member
    Quote:

    Originally Posted by asdasd View Post





    what the optionals buy you is safety. With obj C the nil pointers are (unlike other languages) safe to message but not safe to pass to methods which expect a non-nil value.


    Darn, I fat-fingered the reply I was working on.  If it shows up, ignore it.


    Perhaps it shows up later in the iBook than I've reached, but I don't see how optionals provide me with safety; at least, not in any automatic sense.


    If I have a variable:

        var foo : Int?

    and a function:

        func bar(foobar : Int)

    how does:

        bar(foo)

    behave? [Or should it be 'func bar(foobar : Int?)'?]


    You are right about the strange mutability of the immutable array.

    Because it's not bedroom developers anymore. Actually, adding Swift to existing Objective-C projects shouldn't take that long, but because interoperability is only perfect in one direction it doesn't buy you much, as you can't use new features like tuples in Swift code called from Objective-C. That said, the playground is awesome.

    Trust me. Except for Unity, no serious app is, at the moment, developed in anything other than Objective-C. Which is fairly rapid development, by the way, once you know it well; the learning curve is the framework, not the language. I don't think a huge amount of time is actually saved by not typing semicolons or getting type inference if you have to consult the manual to work out how a table view controller works. In fact Swift is harder to master, albeit easier to type, because it has new features. You could ignore the new stuff, I suppose, but extensions, methods in structures, and generics look good. Tuples schmuples.



    All of this language learning is no time at all compared to learning the decades-old and still-accelerating iOS/OS X frameworks. I've read the manual and watched the first WWDC video, and I think I could port an app in a weekend. Provided it works as advertised.



    As for C and Swift: I haven't seen much explanation of how to import it yet.

  • Reply 65 of 93
    Quote:

    Originally Posted by macinthe408 View Post



    I love reading this stuff, even though I understood only three words in the entire article.

     

    I completely agree and Swift has me thinking about learning how to code (the last stuff I did was some advanced BASIC back in 1984...).  Now I just need to find the right materials or an online course to get me started.

  • Reply 66 of 93

    Originally Posted by SpamSandwich View Post

     

    Just chiming in here, but I thought the beauty of Swift was that it will accept C in addition to Swift code. Am I wrong?


    Swift eliminates pointers, and appears to do so at a fundamental level - they're not just hiding the *'s.

     

    So while I fully expect you'll be able to USE C code in a Swift application, I doubt the mapping will be as invisible as it is in Obj-C.

     

    But so far, I can't find out how to do it *at all*.

  • Reply 67 of 93
    ripper Posts: 7 member
    Quote:
    Originally Posted by asdasd View Post

    what the optionals buy you is safety. With obj C the nil pointers are (unlike other languages) safe to message but not safe to pass to methods which expect a non-nil value.

     

    Perhaps it's covered in later portions of the iBook than what I've read, but I don't see how optionals buy me the safety you mention; at least not in an automatic way.

     

    If I have an optional: var foo : Int?

    and a function: func bar(foobar : Int)

    what happens if I make the call: bar(foo)  ?

    Is that a type error since the function parameter was Int and not Int?

    Does it try to decompose 'foo' into its value? In that case, if it has no value, I'd expect a runtime exception (with a compile-time warning/error beforehand if the compiler could determine 'foo' had never been assigned a value).

    Perhaps the function definition has to be: func bar(foobar : Int?).

    Even so, I'm going to have to check within the function that 'foobar' has a value before I try to use it.  That may make the treatment of parameters consistent between basic types and objects, but how is it fundamentally different from having to check for nil (in the case of objects)?
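Since ripper's question is concrete, here is a minimal sketch of how the compiler answers it. The foo/bar names are his; everything else is illustrative, and the code is written in current Swift syntax (the 2014 beta spelled function parameters slightly differently). The safety asdasd mentions is automatic in the sense that passing an optional where a non-optional is expected is rejected at compile time, so the nil check is forced to happen before the call, not inside the function:

```swift
// A function that expects a non-optional Int: inside the body,
// foobar is guaranteed to hold a value, so no nil check is needed.
func bar(_ foobar: Int) -> Int {
    return foobar * 2
}

var foo: Int? = nil

// bar(foo)   // compile-time error: 'Int?' cannot be passed as 'Int'

// The idiomatic unwrap is if-let: the body runs only when foo has a value.
if let unwrapped = foo {
    _ = bar(unwrapped)
} else {
    // foo had no value; handled here instead of crashing at runtime
}

foo = 21
if let unwrapped = foo {
    _ = bar(unwrapped)   // unwrapped is a plain Int here; bar(21) == 42
}
```

So it is a type error, not a runtime decomposition: declaring the parameter as Int? would instead push the nil check inside the function, which is exactly the Objective-C situation ripper describes.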

  • Reply 68 of 93
    mjtomlin Posts: 2,673 member
    Quote:
    Originally Posted by Ripper View Post



    Variables are declared: var foo: Int

    That's fair enough, but then 'optionals' are declared: var foo:Int?

    That's too little of a difference (ok, I'm used to C's * for pointers and that's only a single character, so...). Anyhow, say I want to use my optional; in a boolean context it's easy to determine if it has a value:



    if foo {...



    but in a scalar context I have to refer to it:



    bar = foo!



    or decompose it in the boolean context:



    if var bar = foo {
        // do something with bar
    }



    But, that's not all. I could actually declare it:



    var foo: Int!



    What does that buy me? The ability to refer to it in a scalar context directly:



    bar = foo



    In the boolean context it's still interpreted as 'does it have a value' thus the context is clearly known in interpreting it. So, why two different ways to specify it and use it?



    In light of the contextual interpretation, I could argue that having all variables be 'optionals' by default would be a more natural and consistent approach. The compiler can certainly warn you if you're attempting to use it before assigning it a value and even if you ignore that warning it's a runtime fault to use it if it doesn't have a value. With 'optional' as the default (only?) type, all variables could be used in the exact same way; e.g.



    var foo: Int
    ...
    if foo {
        // do something with foo
    } else {
        // foo has no value
    }

     

    Read the rest of the manual and/or watch the WWDC videos online - it'll make sense where each of these is applicable.

     

    Hint: a lot of it has to do with cross-compatibility with Objective-C and Cocoa. This is still needed because even though you're programming in a different language, these objects exist within the Objective-C/Cocoa runtime. This is a transitional period - and eventually (several years) the need for Objective-C / Swift code compatibility will be deprecated, and the true potential of the Swift language within Cocoa will be felt through further optimizations in code compilation and the Cocoa runtime environment. Currently there are features in Swift that are incompatible with almost all of the Cocoa APIs, so Swift code has to be somewhat dumbed down if you want to mix it with Objective-C.

     

    Optionals are something that should absolutely be added to every language. It is extremely powerful to be able to specify a universal non-value within a strictly typed language. (Having this as the default is NOT the way to implement it; that would force you to check every variable for a valid value.) With strict typing, you can write a function that expects a String and be sure you'll get one, with no nil check needed, reducing the amount of code you have to write.
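To make the two declarations mjtomlin and Ripper are debating concrete, here is a short sketch in current Swift syntax (variable names invented for the example; note that shipping Swift later required the explicit `!= nil` test rather than the beta's bare boolean use of an optional):

```swift
var maybe: Int? = 7    // ordinary optional: must be unwrapped explicitly
var surely: Int! = 7   // implicitly unwrapped: usable directly, traps if nil

// Ordinary optional: explicit unwrap required before scalar use.
let a = maybe! + 1     // forced unwrap; traps at runtime if maybe were nil

// Implicitly unwrapped: reads like a plain Int at the use site.
let b: Int = surely + 1

// Both can still be tested for a value the same way.
if maybe != nil && surely != nil {
    // both hold values here
}
```

The Int! form exists largely for the Objective-C bridging mjtomlin mentions: an API known to return a real object, but typed as nullable on the Cocoa side.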

  • Reply 69 of 93

    Originally Posted by pauldfullerton View Post

    particularly if there is an interpretive compiler for language Y.

    So Apple used to have something called WebScript, which, contrary to anything you might divine from the name, was an Obj-C interpreter. It was also almost useless, because it didn't have pointers, so your "real" Obj-C code didn't run properly (at least not in general).

     

    Why do I mention this? Because there are two missing pieces of the development puzzle on Cocoa, and Swift solves one of them. As noted in the demos, Swift can run immediately, right there.

     

    This leaves the other big missing piece: the ability to *easily* tie code to the UI and edit it live. The current "solution" is the traditional compile/link/debug cycle, which is a PITA. But more basically, there's no real direct route from UI to code or vice versa; it's all mediated through a field in the UI, which you then have to go hunt for in your code.

     

    .net takes a stab at this latter problem. If you double-click on the control in the UI builder, it takes you to the function in the code that "handle"s that widget. Unfortunately it also "helpfully" *creates* the code if it's not there, so inadvertent clicks cause code inserts, which is a MAJOR PITA that gives me fits. But with that exception, it kinda does what you want.

     

    But HyperCard took this one step further. In HC, the code was physically stored as a string in the widget itself. This meant that when you command-clicked on your button, there it was, *just* that code by itself. The upside is that editing was much easier because you could see everything that button did without a lot of extraneous code. The downside is that sometimes that code interacts with other code, which was much harder to see from inside a single widget's script.

     

    I think you can get the functionality of HC in a traditional language like Swift through pragmas. Imagine if you command-clicked on a button (or whatever) in the UI and it normally just showed you the code for that button. Then there's a widget somewhere that lets you see everything around it. Or maybe non-related code is greyed out. You get the idea.

     

    The problem here is that Cocoa, and especially CocoaTouch, use lots of delegate actions, which means there's no direct link between the widget and the code. Nevertheless, I suspect you could figure it out in the majority of cases even then.

     

    This of course leaves us with the requirement for a UI builder that doesn't suck as much as Storyboards...

  • Reply 70 of 93
    ripper Posts: 7 member
    Quote:
    Originally Posted by malax View Post

     

     

    How can you have a discussion about optionals without even mentioning if-let?  That's the standard method for using them.  I would expect to see if-let throughout all well-written Swift code.  Much more so than forced unwrapping (the ! operator).


    I used if-var, which is in the same class of construct as if-let.  My point isn't that if-let/if-var shouldn't be allowed; it's that, by their own documentation, Apple has shown Swift has the ability to interpret an optional based on context and, given that, it makes more sense (IMO) to have that as the default way to treat optionals rather than having it be a different *declaration* of the optional.

     


    As to why immutable arrays and immutable dictionaries behave differently, I'm sure that's based on excellent feedback from the Apple "dogfooding group" (I love that term) and not someone's carelessness or desire to think different.  Also, as explained in the documentation, having an array that can't change size is a great advantage to the compiler (compared to an array that can be any size and change size).  There is no efficiency gain from disallowing changes to an array's elements, so why tie the developers' hands?  Of course they could have introduced fixed-size immutable arrays, fixed-size mutable arrays, and mutable arrays, but that would be added complexity for very, very little gain.


    I don't disagree as to the benefits of improved efficiency for a non-dynamically-sized array.  But calling it immutable when it can clearly be mutated is incorrect and using the constant defining 'let' for something that may not be constant... that's inconsistent and misleading.  Perhaps arguably even more important is the inconsistency between this 'immutable' array and NSArray in incorporating Cocoa objects into Swift code.  That might not end up being a practical issue given the strong typing that Swift uses, but if there's a generic object reference type (I haven't gotten to the iBook portion on OO yet) one could end up with an array reference to what is expected to be content-mutable (i.e., a Swift 'let' array) but is actually completely non-mutable (i.e., an NSArray), which means only finding the problem at runtime.  Not that that's anything new though; done 'incorrectly' you can always pass an NSArray to something expecting NSMutableArray.

  • Reply 71 of 93
    mjtomlin Posts: 2,673 member
    Quote:

    Originally Posted by Maury Markowitz View Post

     

    Swift eliminates pointers, and appears to do so at a fundamental level - they're not just hiding the *'s.

     

    So while I fully expect you'll be able to USE C code in a Swift application, I doubt the mapping will be as invisible as it is in Obj-C.

     

    But so far, I can't find out how to do it *at all*.


     

    Objective-C is an extension of the 'C' language, and as such you can continue to use 'C' code inside any Objective-C file; and, as has been stated, you can mix Objective-C files with Swift files in the same application.

     

    You cannot mix 'C' code inside a .swift file. Swift is not at all 'C'. They are as different as Pascal and FORTRAN.

     

    As far as mapping to 'C' libraries, as long as they're 'standard' types, the Swift compiler can transparently map those types back and forth. So if you need to use a 'C' function that takes a (C) "int" as a parameter, you can simply pass it a (Swift) "Int" - no casting or bridging is needed. Same goes for return types. This goes for the types defined within the Cocoa API as well: NSInteger, CGFloat, etc.
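A tiny sketch of the transparent mapping mjtomlin describes, using the C standard library that Swift imports on Apple platforms (the `Darwin` module; on Linux it would be `Glibc`). The specific functions chosen here are just convenient examples:

```swift
import Darwin

// strlen is a plain C function: size_t strlen(const char *s).
// Swift bridges the String literal to UnsafePointer<CChar> automatically,
// and C's size_t comes through as a Swift integer type; no casts needed.
let n = strlen("hello")        // n == 5

// C's int is imported as Int32; Swift integer literals and explicit
// conversions cover the gap where the imported signature requires it.
let m = abs(Int32(-42))        // m == 42
```

So "using C from Swift" in practice means calling imported C functions through a bridged module rather than pasting C source into a .swift file, which matches the point above.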

  • Reply 72 of 93
    timmillea Posts: 238 member

    "Swift" is still all very 'C'. It is a student's idea of a programming language from 40 years behind. It is a non-starter.

     

    Where is parallelism?

     

    Where is 'intention' over 'how'?

     

    Apple has failed. The rest is history.

  • Reply 73 of 93
    mjtomlin Posts: 2,673 member
    Quote:
    Originally Posted by timmillea View Post

     

    "Swift" is still all very 'C'. It is a student's idea of a programming language from 40 years behind. It is a non-starter.

     

    Where is parallelism?

     

    Where is 'intention' over 'how'?

     

    Apple has failed. The rest is history.


     

    You're absolutely correct ... programming languages are completely set in stone once they're released and can never be updated, extended, or even optimized at some point in the future¡

     

    Game over Apple, shut down your business, you completely failed¡

     

    Apple didn't develop a language for everyone to use - they developed a language that would make it easier for Cocoa programmers to write code.

  • Reply 74 of 93
    danox Posts: 2,683 member
    Quote:

    Originally Posted by techguy911 View Post

     

     

    Apple really should have done this at the announcement of Swift.  We really need a modern native speed compiled language that can compete with Python and Java.  I hope Apple is not planning on keeping Swift proprietary and Apple only.  Google half-assed the Go language like they do everything and it still feels like a beta project they are not serious about finishing.  Microsoft tolerated C# and .NET being used on non-Microsoft platforms but did nothing to help its adoption elsewhere.


     

    ???? Apple's job is to sell more hardware at a profit; if Swift is good, it will help that. Apple's job isn't to help the competition program better.

  • Reply 75 of 93
    danox Posts: 2,683 member
    Quote:

    Originally Posted by Rogifan View Post



    Of course ZDNet, aka Microsoft's PR arm, pisses all over Swift.



    http://www.zdnet.com/apples-new-swift-development-language-highlights-the-companys-worst-side-7000030150/

     

    The joke as usual is on them, the old and new Apple developers are going to have a field day with Swift, just like the long term investors.

  • Reply 76 of 93
    timmillea wrote: »
    If this were the case, functional languages would have taken over several decades ago. The average intelligence of the programmer is the limiting factor. Sorry - it is hard to say - but true.
    If the average programmer was much more focussed on the functional aspects of what they are writing rather than whether they have put the requisite number of semicolons or curly brackets in the right place, then perhaps I might rate them as being 'intelligent'! The big problem with programmers is that they have become so focussed on the technology of writing programs that they have lost the capacity to communicate with the people who have a functional problem to solve and who will have to use those programs.
    I put it to you that there is at least 10 times as much code written by your 'intelligent' programmers as is actually needed to solve the information processing problems of people and organisations today. That means there are at least 10 times as many bugs floating around waiting to bite people on the bum, and we have to pay at least 10 times as much for buggy programs that don't really address the functional problems they need to solve. So your 'intelligent' programmers need to learn what is really important - the technology of programming or the quality of the product of programming. And languages like C fail abysmally in that respect.
    So there - that needs to be said also!!
  • Reply 77 of 93
    ripper Posts: 7 member
    Quote:

    Originally Posted by pauldfullerton View Post





    If the average programmer was much more focussed on the functional aspects ...

    ... 'intelligent' programmers need to learn what is really important - the technology of programming or the quality of the product of programming. And languages like C fail abysmally in that respect.

    I agree with the focus on usability and functionality, but blaming the language is like blaming the hammer for the cockeyed shelving.

    C has its issues and other languages are better for certain problem domains but the major issue as you describe it is with the carpenter rather than his tools.

  • Reply 78 of 93
    stewart Posts: 1 member

    More information on learning Swift can be found here.

  • Reply 79 of 93
    Sounds like Apple recognized good talent from the LLVM compiler team. Let's hope this language really takes off and is robust enough to supplant Objective-C and other languages that, while powerful, force programmers to spend too much time on housekeeping chores.

    It's about time that a "script like" programming language compete with the big boys.

    I'm especially hopeful after reading this sentence: "...an incredibly important internal dogfooding group who provided feedback to help refine and battle-test ideas".

    Always good when a company eats its own dog food.
  • Reply 80 of 93
    scottyo Posts: 45 member
    Quote:

     Second, the inconsistency between:



    let array = [a, b, c]

    let dictionary = [d : 1, e : 2]



    According to the documentation this creates, respectively, an immutable array and dictionary, except the dictionary is truly immutable and arrays are only sort-of immutable.

    While the dictionary cannot be changed at all and the array can't change size, the contents of the array can be changed. That's hardly immutable and certainly different from the true immutability of NSArray.


    I'm with you, but maybe even more so.

     

    let dictionary = [akey : avalue]

     

    is cool. It's truly immutable, in the ordinary sense, as Apple's NS classes do it.

     

    let array = [athing0, athing1] // agreed that array is not really immutable since its elements can be changed

     

    // but wait, there's more

    let anotherRef = array

    let yetAnotherRef = array

     

    array[0] = somethingelse // the other two refs see this change also, because it's NOT really by-value, these are references

     

    // here it gets weird

    array += anotherthing // as a side effect of resizing, Swift allocates and inits a new Array that only array refers to

     

    // array is now [somethingelse, athing1, anotherthing]

    // anotherRef and yetAnotherRef are both [somethingelse, athing1] because they both refer to the same allocation, the one made at the beginning

     

    The other two refs still refer to the original allocation, but as far as array is concerned, the "kinda-sorta by-value-ness" has finally turned into a real by-value-ness (at least till a new copy of the array ref is made, then the weirdness can start again).

     

    and it's getting complicated:

     

    array[0] = somethingElseAgain // this will NOT affect what anotherRef and yetAnotherRef contain

     

    anotherRef[0] = anotherDamnThing // WILL also affect what yetAnotherRef contains, but not what array contains

     

    but a few lines earlier, those same statements would have affected all 3 array refs.

     

    I cringe, thinking this could cause very hard-to-find bugs. Side-effect city, because the by-value contract has been broken. The breakage occurred at the let anotherRef = array and let yetAnotherRef = array steps, and was first visible when array[0] was allowed to change and affect the other 2 refs. The resize problem is a victim, not a perp; it did the right thing.

     

    I see why they did it. It's a common (and needed!, for performance reasons) tactic not to actually copy even when the programmer's model says a copy occurs, holding off to see if a change is actually made, then do the copy. But at that point there are no other refs that could be made inconsistent by a new allocation.

     

    I assume the compiler jocks at Apple, being compiler jocks, have a proof that this is entirely safe and will create no new surprises for devs, but till I see the proof, I still cringe.
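For what it's worth, the semantics walked through above were a property of the early betas; in shipping Swift, Array became a full value type with copy-on-write, so a sketch of the same experiment (variable names invented, current syntax) behaves predictably:

```swift
var array = [1, 2]
var anotherRef = array        // logically a copy; storage is shared until a write
var yetAnotherRef = array

array[0] = 99                 // the write triggers the copy for 'array' only

// Each variable now has independent contents:
// array == [99, 2]; anotherRef == [1, 2]; yetAnotherRef == [1, 2]

array.append(3)               // resizing likewise affects only 'array'
// array == [99, 2, 3]; the other two are still [1, 2]

// And a 'let' array is now immutable through and through:
let fixed = [1, 2]
// fixed[0] = 0               // compile-time error in shipping Swift
// fixed.append(3)            // compile-time error
```

In other words, the copy-on-write optimization survived, but the visible "side-effect city" did not: the copy is taken before any other reference can observe the mutation.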

     

    Also, closed vs open ranges look like they might be typo traps and homes for off-by-one errors, because the difference between [a..b] and [a...b] may not be obvious to tired code-reviewing eyes. It's also slightly disconcerting that ranges can only be open at the right end and not the left; feels like the syntax is lacking and ungeneral.
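The two range operators, in shipping Swift spelling. (The betas wrote the half-open form as `..`; it later became `..<`, precisely to make the typo worried about above harder to commit.)

```swift
let closed = Array(1...4)     // [1, 2, 3, 4]  includes the right end
let halfOpen = Array(1..<4)   // [1, 2, 3]     excludes the right end
```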

     

    On the good side, they killed off 

    if (a=b) {

    }

     

    But I don't want to dump on Swift. On the whole, I think Swift will make it considerably easier to write code and avoid bugs than C, Obj-C and C++ ever did, but I don't think it's quite soup yet.

     

    I think Swift's biggest obstacle will be adoption. Cocoa was markedly superior to Carbon and OS 9 libraries in most (maybe not all) ways, but adoption is still not complete, AFAIK; Finder was only Cocoa-fied by Apple recently, at Lion or ML IIRC. And 99% of all new languages don't make the big-time, otherwise there'd be 50,000 languages in common use.

     

    Will Swift make it, especially as it's (currently) proprietary to Apple?
