
Apple's top secret Swift language grew from work to sustain Objective C, which it now aims to... - Page 2

post #41 of 89
Quote:
Originally Posted by mstone View Post
 

Ok, you have been working with Swift for 24 hours. Lattner has been working with it since 2010 and on the building blocks for 15 years. He also probably has an IQ of 200. He says it can replace Obj-C and, by extension, most likely C, C++, C#. He has a whole team of geniuses working together on this. I'd take his word over anyone else's, especially someone who has only one day under their belt.

 

Lattner knows jack squat about ObjC [he's admitted as much consistently on the LLVM/Clang threads]. He also knows squat about Clang and how it's put together, though he's well-versed in C++ and knows C. His expertise and thesis are centered on the architecture and implementation of compilers, and LLVM is a product of that.

Those he would have relied on heavily are the likes of Federighi, Forstall, Peter Graffagnino, several Cocoa experts at Apple, outside language experts like Aaron Hillegass, and several others working intimately on Clang. His addition of blocks and ARC as a replacement for manual object retain/release wasn't a solo project. There is specific leadership at Apple with decades more experience than Chris.

post #42 of 89
Quote:
Originally Posted by Frac View Post

For Python fans, I can see quite a bit in there...
http://www.LearnSwift.tips/
Heads up, hackers - it may be that scripting is supported.

 

Good link. Thanks.

Proud AAPL stock owner.

GOA
post #43 of 89
Originally Posted by Ripper View Post
In light of the contextual interpretation, I could argue that having all variables be 'optionals' by default would be a more natural and consistent approach. 

It could also increase program size and complexity. Optionality also implies nullability, which means you can't just use a simple scalar type - you either box it into an object or carry flags alongside the value. Personally I don't know if this is a real issue or not, but it's the one that leaves bits of crud like this in a number of languages.
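To sketch what I mean with a hypothetical type (the real standard library spells this Optional&lt;T&gt;, but the shape is the point):

enum MaybeInt {
case None        // the "no value" flag
case Some(Int)   // the wrapped scalar
}

Every variable of such a type carries that discriminator around whether you ever need it or not.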

post #44 of 89
Originally Posted by Marvin View Post
The next step (pun intended) would be to allow Swift development on iOS itself. In fact, only Swift development and not C/C++/Objective-C. People can build their sprites in some drawing app, use the app extensions to bring them into Xcode for iOS and run it in real-time. That should make Alan Kay happy.

I would much prefer a different next step, which they kinda sorta have with those interactive debuggers - I'd like to directly link the GUI builder to the code, similar to but better than .net/VB. Ideally I'd like the ease of HyperCard in terms of moving from the GUI to the code that runs it. I think that is a much greater barrier to easy development, and THEN you'd be ready for the jump to iOS. I mean, can you imagine storyboards on an iPad? *shudder*.

post #45 of 89
Quote:
Originally Posted by Ripper View Post

I want to love Swift, but at this stage I just can't, not in its current state. I've only gotten through about the first 200 pages of the iBook but there are things that, IMO, could/should be more rationalized, particularly some of the syntax. A couple things off the top of my head:

Variables are declared: var foo: Int
That's fair enough, but then 'optionals' are declared: var foo:Int?
That's too little of a difference (ok, I'm used to C's * for pointers and that's only a single character so...). Anyhow, so I want to use my optional, it's easy in a boolean context to determine if it has a value:

if foo {...

but in a scalar context I have to refer to it:

bar = foo!

or decompose it in the boolean context:

if var bar = foo {
// do something with bar

But, that's not all. I could actually declare it:

var foo: Int!

What does that buy me? The ability to refer to it in a scalar context directly:

bar = foo

In the boolean context it's still interpreted as 'does it have a value' thus the context is clearly known in interpreting it. So, why two different ways to specify it and use it?

In light of the contextual interpretation, I could argue that having all variables be 'optionals' by default would be a more natural and consistent approach. The compiler can certainly warn you if you're attempting to use it before assigning it a value and even if you ignore that warning it's a runtime fault to use it if it doesn't have a value. With 'optional' as the default (only?) type, all variables could be used in the exact same way; e.g.

var foo: Int
...
if foo {
// do something with foo
} else {
// foo has no value
}

Second, the inconsistency between:

let array = [a, b, c]
let dictionary = [d : 1, e : 2]

According to the documentation this creates, respectively, an immutable array and dictionary, except the dictionary is truly immutable and arrays are only sort-of immutable.
While the dictionary cannot be changed at all and the array can't change size, the contents of the array can be changed. That's hardly immutable and certainly different from the true immutability of NSArray.

That's probably enough for now for you all to take me to task. :-)
I'd love something more modern than Objective-C, something where everything is an object, say. Or any number of other things, but as it stands, I don't think Swift is it.

What the optionals buy you is safety. With Obj-C, nil pointers are (unlike in other languages) safe to message but not safe to pass to methods which expect a non-nil value. You are right about the strange mutability of the immutable array.
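A minimal sketch of the difference (hypothetical names):

func double(x: Int) -> Int { return x * 2 }
var maybe: Int? = nil
// double(maybe)               // compile error: Int? is not Int, so nil can never sneak in
if let x = maybe { double(x) } // unwrap first, or it doesn't compile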
Quote:
Originally Posted by Maury Markowitz View Post

Given that the App Store launched in 2008, and I'd have to surmise that the vast majority of submissions are from people who never looked at Obj-C before that point, that implies Apple convinced thousands (millions?) of people to pick up Obj-C in the last five years.

So why shouldn't they convince them to switch to something that's supposedly much better? MS did with C#.

Because it's not bedroom developers anymore. Actually, adding Swift to existing Objective-C projects shouldn't take that long, but because the interoperability is only perfect in one direction it doesn't buy you much: you can't use new features like tuples in Swift code called from Objective-C. That said, the playground is awesome.
Quote:
Originally Posted by Marvin View Post

A lot of modern development is based around web programming, and the iPhone was originally going to be web apps only. It needed performance and access to graphics frameworks, though, and while C-based languages are the most popular, they are no good for rapid application development, and that's what's needed these days. Turnaround times are more important than raw performance for most apps. For the ones that need the performance, they can do it the way they know how. Swift offers both performance and fast turnaround times.

It could even replace Applescript (hopefully).

It would be nice if they'd open source it and make moves for it to compete with C#. If it was in 3rd party SDKs like Unity, they can natively compile apps without XCode and could more easily call iOS system APIs. That might mean people don't bother buying Macs for iOS development, although the simulator would still be Mac-only and wide support and adoption of the language would be beneficial.
I doubt that's the case. A lot of apps are very simple, need to be turned around quickly, and are deployed to more than iOS. 3rd party SDKs are used a fair bit and many simply use web languages:

http://www.sencha.com
https://xamarin.com
http://www.appcelerator.com
http://phonegap.com
https://www.unrealengine.com
http://unity3d.com
http://coronalabs.com/products/corona-sdk/

An exception is the following used for Badland, which uses Objective-C:

http://www.cocos2d-iphone.org

The next step (pun intended) would be to allow Swift development on iOS itself. In fact, only Swift development and not C/C++/Objective-C. People can build their sprites in some drawing app, use the app extensions to bring them into Xcode for iOS and run it in real-time. That should make Alan Kay happy.

Trust me, except for Unity no serious app is, at the moment, developed in anything other than Objective-C. Which is fairly rapid development, by the way, once you know it well; the learning curve is the framework, not the language. I don't think a huge amount of time is actually saved by not typing semicolons or getting type inference if you have to consult the manual to work out how a table view controller works. In fact Swift is harder to master, albeit easier to type, because it has new features. You could ignore the new stuff, I suppose, but extensions, methods in structures and generics look good. Tuples schmuples.

All of this language learning is no time at all compared to learning the decades-old and still accelerating iOS/OS X frameworks. I've read the manual and watched the first WWDC video, and I think I could port an app in a weekend. Provided it works as advertised.

As for C and Swift: I haven't seen much explanation of how to import C yet.
I wanted dsadsa but it was taken.
post #46 of 89
Quote:
Originally Posted by mdriftmeyer View Post
 

 

It doesn't have the performance of C, or Lattner and Federighi would have put up the sorting and other graphs to show it. It doesn't have the added overhead of ObjC 2.0 [they've put the magic into the compiler], but it sure won't match C11/C99.

 

The Python comparison was obnoxious.

That doesn't make any sense.  If one coded an algorithm in Xcode using C, Objective-C, or Swift the compiled code would be identical.  Ergo, Swift has the performance of C, period.

post #47 of 89
Quote:
Originally Posted by malax View Post

If one coded an algorithm in Xcode using C, Objective-C, or Swift the compiled code would be identical.  Ergo, Swift has the performance of C, period.

Wrong. Objective-C has runtime overhead that C does not. Swift may have less overhead, but it will still have runtime costs that C doesn't. Swift's memory safety, bounds checking, etc. all have runtime costs that C does not share. The generated code could not be the same between C and Swift, precisely because Swift has to emit those runtime features.
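To make one of those costs concrete, a sketch with hypothetical values:

let xs = [1, 2, 3]
let i = 5
// xs[i] compiles, but traps at runtime: Swift emits a bounds check on every subscript.
// The equivalent C read performs no check at all - it's simply undefined behaviour.
// That per-access check is exactly the kind of runtime cost C never pays.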
post #48 of 89
Quote:
Originally Posted by Ripper View Post

I want to love Swift, but at this stage I just can't, not in its current state. I've only gotten through about the first 200 pages of the iBook but there are things that, IMO, could/should be more rationalized, particularly some of the syntax. A couple things off the top of my head:

Variables are declared: var foo: Int
That's fair enough, but then 'optionals' are declared: var foo:Int?
That's too little of a difference (ok, I'm used to C's * for pointers and that's only a single character so...). Anyhow, so I want to use my optional, it's easy in a boolean context to determine if it has a value:

if foo {...

but in a scalar context I have to refer to it:

bar = foo!

or decompose it in the boolean context:

if var bar = foo {
// do something with bar

But, that's not all. I could actually declare it:

var foo: Int!

What does that buy me? The ability to refer to it in a scalar context directly:

bar = foo

In the boolean context it's still interpreted as 'does it have a value' thus the context is clearly known in interpreting it. So, why two different ways to specify it and use it?

In light of the contextual interpretation, I could argue that having all variables be 'optionals' by default would be a more natural and consistent approach. The compiler can certainly warn you if you're attempting to use it before assigning it a value and even if you ignore that warning it's a runtime fault to use it if it doesn't have a value. With 'optional' as the default (only?) type, all variables could be used in the exact same way; e.g.

var foo: Int
...
if foo {
// do something with foo
} else {
// foo has no value
}

Second, the inconsistency between:

let array = [a, b, c]
let dictionary = [d : 1, e : 2]

According to the documentation this creates, respectively, an immutable array and dictionary, except the dictionary is truly immutable and arrays are only sort-of immutable.
While the dictionary cannot be changed at all and the array can't change size, the contents of the array can be changed. That's hardly immutable and certainly different from the true immutability of NSArray.

That's probably enough for now for you all to take me to task. :-)
I'd love something more modern than Objective-C, something where everything is an object, say. Or any number of other things, but as it stands, I don't think Swift is it.

 

How can you have a discussion about optionals without even mentioning if-let? That's the standard method for using them. I would expect to see if-let throughout all well-written Swift code. Much more so than forced unwrapping (the ! operator).

 

Also a major reason for the optional variable addition is to deal with the fact that any and all existing object methods can return nil in place of a real object.  if-let gives us a much cleaner, more consistent way to address this.
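For anyone who hasn't hit that chapter yet, a minimal sketch (hypothetical dictionary; lookups return an optional):

let ages = ["Alice": 30, "Bob": 41]
if let age = ages["Carol"] {     // the subscript returns Int?; if-let unwraps it
println("Carol is \(age)")
} else {
println("No entry for Carol")    // no forced unwrap, no chance of a crash
}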

 

As to why immutable arrays and immutable dictionaries behave differently, I'm sure that's based on excellent feedback from the Apple "dogfooding group" (I love that term) and not someone's carelessness or desire to think different. Also, as explained in the documentation, having an array that can't change size is a great advantage to the compiler (compared to an array that can be any size and change size). There is no efficiency gain in forbidding changes to an array's elements, so why tie the developer's hands? Of course they could have introduced fixed-size immutable arrays, fixed-size mutable arrays, and mutable arrays, but that would be added complexity for very, very little gain.
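In code, the beta semantics being discussed look like this (a sketch; hypothetical values):

let fixed = [10, 20, 30]
fixed[1] = 99      // allowed in the current beta: elements of a 'let' array may change
// fixed += 40     // not allowed: a 'let' array can never change size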

 

Anyway, I expect that only after use will some of these design decisions become more obvious, but at first blush it appears that they have tried really hard to learn the lessons from other languages while developing a language that's perfectly suited for the massive libraries at the heart of OS X and iOS.

post #49 of 89
Quote:
Originally Posted by mdriftmeyer View Post
 
Lattner knows jack squat about ObjC [he's admitted as much consistently on the LLVM/Clang threads]. He also knows squat about Clang and how it's put together, though he's well-versed in C++ and knows C.

Apple must really have their head up their collective ass to make such an incapable moron the Director of the Developer Tools overseeing hundreds of brilliant engineers.

Life is too short to drink bad coffee.

post #50 of 89
Quote:
Originally Posted by bdkennedy1 View Post

"LLVM not only powers Apple's software, but is also tightly integrated into the development of Apple's custom silicon, including the A6 and A7 Application Processors."

And there we have the proof that Apple with put A chips in their Macs.

That's not proof but it's definitely a powerful point for the next time the rumour crops up.

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

post #51 of 89
Quote:
Originally Posted by Rogifan View Post

Of course ZDNet, aka Microsoft's PR arm, pisses all over Swift.

http://www.zdnet.com/apples-new-swift-development-language-highlights-the-companys-worst-side-7000030150/

That guy has an absolutely biased perspective. Microsoft .net = the pinnacle of software development? Silverlight the choice for cross-platform? Bitch, please. Microsoft is fighting to stay relevant outside of enterprise coding.

It must suck to realize that despite how "superior" the Windows Phone 8 development languages and SDK are, all the notable mobile app developers have learned Obj-C (maybe even put up with it) to get published on iOS. It's gotta burn.

He's also not giving Apple any credit for WebKit, and their support for fast JavaScript engines: two web developer technologies that helped kill the clumsy, misguided XHTML spec and bring about the age of HTML5 and modern Web apps. Web apps that have ensured the web remains open and cross platform, not "Best Viewed in IE6".

"Apple should pull the plug on the iPhone."

John C. Dvorak, 2007
post #52 of 89
Quote:
Originally Posted by mstone View Post

Apple must really have their head up their collective ass to make such an incapable moron the Director of the Developer Tools overseeing hundreds of brilliant engineers.

Did you forget the /s?

"Apple should pull the plug on the iPhone."

John C. Dvorak, 2007
post #53 of 89
Is this the rock and mineral club?
post #54 of 89
I think this is one of the most significant announcements by Apple in recent years. Although C variants, including Obj-C, are powerful in the right hands, they are just 2nd-generation languages in the sense that we used to refer to Fortran and Cobol as 3rd-generation languages. The syntax is inherently far too complicated for C-based languages to be used efficiently by all but highly trained professional programmers.

From what I have seen from the Swift documentation so far, it is moving away from these unnecessary complications of C-based languages. If this simplicity translates to the use of Swift with the incredibly powerful Cocoa frameworks, then Swift will open the app developer world up to far more people and stimulate the development of far more imaginative apps in the future, regardless of the platform used.

Well done Apple and Chris Lattner!
post #55 of 89
Swift is most certainly a C-based language - a mixture of procedural and object-orientated. It's not a scripting language.
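In the sense that both styles sit side by side (a trivial sketch):

// procedural
func greet(name: String) -> String { return "Hello, " + name }
// object-orientated
class Greeter {
func greet(name: String) -> String { return "Hello, " + name }
}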
I wanted dsadsa but it was taken.
post #56 of 89
Quote:
Originally Posted by malax View Post

That doesn't make any sense.  If one coded an algorithm in Xcode using C, Objective-C, or Swift the compiled code would be identical.  Ergo, Swift has the performance of C, period.

And yet Apple themselves showed Swift being faster than Objective-C in at least two common scenarios on their slides.

How could it be?
I wanted dsadsa but it was taken.
post #57 of 89
Quote:
Originally Posted by pauldfullerton View Post

I think this is one of the most significant announcements by Apple in recent years. Although C variants, including Obj-C, are powerful in the right hands, they are just 2nd-generation languages in the sense that we used to refer to Fortran and Cobol as 3rd-generation languages. The syntax is inherently far too complicated for C-based languages to be used efficiently by all but highly trained professional programmers.

From what I have seen from the Swift documentation so far, it is moving away from these unnecessary complications of C-based languages. If this simplicity translates to the use of Swift with the incredibly powerful Cocoa frameworks, then Swift will open the app developer world up to far more people and stimulate the development of far more imaginative apps in the future, regardless of the platform used.

Well done Apple and Chris Lattner!

 All general purpose programming languages are equally 'powerful'. Swift is strictly 2nd generation 'plus'. It is C but has borrowed a few features from Haskell. It would be embarrassing if it had not. Apple has sealed its fate by promoting such an old-fashioned 'new' programming language. Let fate take its course....

post #58 of 89
I'm a little surprised no one has mentioned the legal advantages this code switch is going to give Apple. Now, no company will be able to claim code theft or infringement in their software. Swift will enable the creation of code that may perform nearly identical functions, but the underlying code could be significantly different.
Edited by SpamSandwich - 6/5/14 at 5:28am

Proud AAPL stock owner.

GOA
post #59 of 89
Quote:
Originally Posted by timmillea View Post
 

 All general purpose programming languages are equally 'powerful'. Swift is strictly 2nd generation 'plus'. It is C but has borrowed a few features from Haskell. It would be embarrassing if it had not. Apple has sealed its fate by promoting such an old-fashioned 'new' programming language. Let fate take its course....

Rubbish. If it takes a programmer 500 lines of code in language X to do what another programmer can do with 50 lines of code in language Y, then language Y is far more 'powerful' than language X because it costs far less to complete a specific programming task. On top of that it should take far less time to debug and maintain 50 lines of code than 500 lines of code, particularly if there is an interpretive compiler for language Y.

 

Now how many lines of code do you think a programmer will save by using Swift rather than Objective C? And how much steeper is the learning curve for Objective C than for Swift?

 

As for 'Apple sealing its fate' - can you name just one company that would not LOVE to share Apple's fate in the future? Microsoft, Google, Samsung ... ?

post #60 of 89

If this were the case, functional languages would have taken over several decades ago. The average intelligence of the programmer is the limiting factor. Sorry - it is hard to say - but true.

post #61 of 89
Google is going to have a really tough time impressing developers at their upcoming conference. I predict the sound of a very loud 'sad trombone' opening their keynote.

Proud AAPL stock owner.

GOA
post #62 of 89
Quote:
Originally Posted by wizard69 View Post


I think I've managed to read a dozen or so pages and I have to agree it has a nice feel to it.
Obviously I haven't if the manual is yet to be read. Maybe a little less time on AI, Reddit and so forth.

In any event I see some initial possibilities that will make this platform great if they come to pass:
  1. Apple submits the language to a standards body so that wider adoption can happen!
  2. The way this works within XCode would make it an excellent platform for use as a scripting language within apps. In fact if I was Apple this would be a high priority for iWork.
  3. I have a feeling this could crush Python if it gets widely adopted. I really hope that Apple is hard at work porting libraries to the platform and doesn't forget about non GUI apps. So I'm hoping that Apple has goals to develop a large standard library in the same way many scripting languages have such libraries. It would be even better if Apple supports the language with a repository of modules that are open sourced.

 

Apple really should have done this at the announcement of Swift.  We really need a modern native speed compiled language that can compete with Python and Java.  I hope Apple is not planning on keeping Swift proprietary and Apple only.  Google half-assed the Go language like they do everything and it still feels like a beta project they are not serious about finishing.  Microsoft tolerated C# and .NET being used on non-Microsoft platforms but did nothing to help its adoption elsewhere.


Edited by techguy911 - 6/5/14 at 6:20am
post #63 of 89
Quote:
Originally Posted by macinthe408 View Post

I love reading this stuff, even though I understood only three words in the entire article.

 

I completely agree and Swift has me thinking about learning how to code (the last stuff I did was some advanced BASIC back in 1984...).  Now I just need to find the right materials or an online course to get me started.

post #64 of 89
Originally Posted by SpamSandwich View Post
 

Just chiming in here, but I thought the beauty of Swift was that it will accept C in addition to Swift code. Am I wrong?

Swift eliminates pointers, and appears to do so at a fundamental level - they're not just hiding the *'s.

 

So while I fully expect you'll be able to USE C code in a Swift application, I doubt the mapping will be as invisible as it is in Obj-C.

 

But so far, I can't find out how to do it *at all*.

post #65 of 89
Originally Posted by asdasd View Post

What the optionals buy you is safety. With Obj-C, nil pointers are (unlike in other languages) safe to message but not safe to pass to methods which expect a non-nil value.

 

Perhaps it's covered in later portions of the iBook than what I've read, but I don't see how optionals buy me the safety you mention; at least not in an automatic way.

 

If I have an optional: var foo : Int?
and a function: func bar(foobar : Int)
what happens if I make the call: bar(foo)  ?
Is that a type error since the function parameter was Int and not Int?
Does it try to decompose 'foo' into its value? In that case, if it has no value, I'd expect a runtime exception (with a compiler warning/error beforehand if it could determine 'foo' had never been assigned a value).
Perhaps the function definition has to be: func bar(foobar : Int?).
Even so, I'm going to have to check within the function that 'foobar' has a value before I try to use it.  That may make the treatment of parameters consistent between basic types and objects, but how is it fundamentally different from having to check for nil (in the case of objects)?

post #66 of 89
Quote:
Originally Posted by Ripper View Post

Variables are declared: var foo: Int
That's fair enough, but then 'optionals' are declared: var foo:Int?
That's too little of a difference (ok, I'm used to C's * for pointers and that's only a single character so...). Anyhow, so I want to use my optional, it's easy in a boolean context to determine if it has a value:

if foo {...

but in a scalar context I have to refer to it:

bar = foo!

or decompose it in the boolean context:

if var bar = foo {
// do something with bar

But, that's not all. I could actually declare it:

var foo: Int!

What does that buy me? The ability to refer to it in a scalar context directly:

bar = foo

In the boolean context it's still interpreted as 'does it have a value' thus the context is clearly known in interpreting it. So, why two different ways to specify it and use it?

In light of the contextual interpretation, I could argue that having all variables be 'optionals' by default would be a more natural and consistent approach. The compiler can certainly warn you if you're attempting to use it before assigning it a value and even if you ignore that warning it's a runtime fault to use it if it doesn't have a value. With 'optional' as the default (only?) type, all variables could be used in the exact same way; e.g.

var foo: Int
...
if foo {
// do something with foo
} else {
// foo has no value
}

 

Read the rest of the manual and/or watch the WWDC videos online - it'll make sense where each of these is applicable.

 

Hint: a lot of it has to do with cross-compatibility with Objective-C and Cocoa. This is still needed because even though you're programming in a different language, these objects exist within the Objective-C/Cocoa runtime. This is a transitional period - eventually (in several years) the need for Objective-C / Swift code compatibility will be deprecated, and the true potential of the Swift language within Cocoa will be felt through further optimizations in code compilation and the Cocoa runtime environment. Currently there are features in Swift that are not compatible with almost all the Cocoa APIs ... so Swift coding has to be somewhat dumbed down if you want to mix it with Objective-C.

 

Optionals are something that should absolutely be added to every language. It is extremely powerful to be able to alternately specify a universal non-value within a strictly typed language. (Having this as the default is NOT the way to implement it - that would force you to check every variable for a valid value.) With strict typing, you can write a function that expects a String and be sure you'll get one, with no need to check for nil, reducing the amount of code you need to write.
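A sketch of how that plays out with Ripper's bar(foo) example above (same hypothetical names):

func bar(foobar: Int) { println(foobar) }
var foo: Int? = 42
// bar(foo)   // rejected at compile time: Int? is not Int
bar(foo!)     // forced unwrap: compiles, but traps at runtime if foo is nil
if let f = foo {
bar(f)        // the safe route: only runs when foo actually has a value
}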


Edited by mjtomlin - 6/5/14 at 7:21am
Disclaimer: The things I say are merely my own personal opinion and may or may not be based on facts. At certain points in any discussion, sarcasm may ensue.
post #67 of 89
Originally Posted by pauldfullerton View Post
particularly if there is an interpretive compiler for language Y.

So Apple used to have something called WebScript, which, contrary to anything you might divine from the name, was an Obj-C interpreter. It was also almost useless, because it didn't have pointers, so your "real" Obj-C code didn't run properly (at least not in general).

 

Why do I mention this? Because there are two missing pieces of the development puzzle on Cocoa, and Swift solves one of them. As noted in the demos, Swift can run immediately, right there.

 

This leaves the other big missing piece: the ability to *easily* tie code to the UI and edit it live. The "solution" in the system currently is the traditional compile/link/debug cycle, which is a PITA. But more basically, there's no real direct route from UI to code or vice versa; it's all mediated through a field in the UI, which you then have to go hunt for in your code.

 

.net takes a stab at this latter problem. If you double-click on the control in the UI builder, it takes you to the function in the code that "handle"s that widget. Unfortunately it also "helpfully" *creates* the code if it's not there, so inadvertent clicks cause code inserts, which is a MAJOR PITA that gives me fits. But with that exception, it kinda does what you want.

 

But HyperCard took this one step further. In HC, the code was physically stored as a string in the widget itself. This meant that when you command-clicked on your button, there it was, *just* that code by itself. The upside is that editing was much easier because you could see everything that button did without a lot of extraneous code. The downside is that sometimes that code interacts with other code, which was much harder to see.

 

I think you can get the functionality of HC in a traditional language like Swift through pragmas. Imagine if you command-clicked on a button (or whatever) in the UI and it normally just showed you the code for that button. Then there's a widget somewhere that lets you see everything around it. Or maybe non-related code is greyed out. You get the idea.

 

The problem here is that Cocoa, and especially CocoaTouch, use lots of delegate actions, which means there's no direct link between the widget and the code. Nevertheless, I suspect you could figure it out in the majority of cases even then.

 

This of course leaves us with the requirement for a UI builder that doesn't suck as much as Storyboards...

post #68 of 89
Quote:
Originally Posted by malax View Post
 

 

How can you have a discussion about optionals without even mentioning if-let? That's the standard method for using them. I would expect to see if-let throughout all well-written Swift code. Much more so than forced unwrapping (the ! operator).

I used if-var, which is in the same class of construct as if-let. My point isn't that if-let/-var shouldn't be allowed; it's that, by their own documentation, Apple has shown Swift has the ability to interpret an optional based on context and, given that, it makes more sense (IMO) to have that as the default way to treat optionals rather than having it be a different *declaration* of the optional.

 

As to why immutable arrays and immutable dictionaries behave differently, I'm sure that's based on excellent feedback from the Apple "dogfooding group" (I love that term) and not someone's carelessness or desire to think different. Also, as explained in the documentation, having an array that can't change size is a great advantage to the compiler (compared to an array that can be any size and change size). There is no efficiency gain in forbidding changes to an array's elements, so why tie the developer's hands? Of course they could have introduced fixed-size immutable arrays, fixed-size mutable arrays, and mutable arrays, but that would be added complexity for very, very little gain.

I don't disagree as to the benefits of improved efficiency for a non-dynamically-sized array. But calling it immutable when it can clearly be mutated is incorrect, and using the constant-defining 'let' for something that may not be constant... that's inconsistent and misleading. Perhaps arguably even more important is the inconsistency between this 'immutable' array and NSArray in incorporating Cocoa objects into Swift code. That might not end up being a practical issue given the strong typing that Swift uses, but if there's a generic object reference type (I haven't gotten to the iBook portion on OO yet) one could end up with an array reference to what is expected to be content-mutable (i.e., a Swift 'let' array) but is actually completely non-mutable (i.e., an NSArray), which means only finding the problem at runtime. Not that that's anything new though; done 'incorrectly' you can always pass an NSArray to something expecting NSMutableArray.

post #69 of 89
Quote:
Originally Posted by Maury Markowitz View Post
 

Swift eliminates pointers, and appears to do so at a fundamental level - they're not just hiding the *'s.

 

So while I fully expect you'll be able to USE C code in a Swift application, I doubt the mapping will be as invisible as it is in Obj-C.

 

But so far, I can't find out how to do it *at all*.

 

Objective-C is an extension of the 'C' language and as such you can continue to use 'C' code inside any Objective-C file and as has been stated you can mix Objective-C files with Swift files in the same application.

 

You cannot mix 'C' code inside a .swift file. Swift is not at all 'C'. They are as different as Pascal and FORTRAN.

 

As far as mapping to 'C' libraries, as long as they use 'standard' types, the Swift compiler can transparently map those types back and forth. So if you need to use a 'C' function that takes a (C) "int" as a parameter, you can simply pass it a (Swift) "Int" - no casting or bridging is needed. Same goes for return types. This goes for the types defined within the Cocoa API as well: NSInteger, CGFloat, etc.
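A quick sketch of that mapping, using a C math function that's known to be exposed:

import Darwin             // the C standard library on OS X/iOS
let x = pow(2.0, 10.0)    // C's double pow(double, double): takes and returns Swift's Double
println(x)                // 1024.0 - no casting or bridging in sight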

Disclaimer: The things I say are merely my own personal opinion and may or may not be based on facts. At certain points in any discussion, sarcasm may ensue.
post #70 of 89

"Swift" is still all very 'C'. It is a 40 years' student's behind idea of a programming language. It is a non-starter.

 

Where is parallelism?

 

Where is 'intention' over 'how'?

 

Apple has failed. The rest is history.

post #71 of 89
Quote:
Originally Posted by timmillea View Post
 

"Swift" is still all very 'C'. It is a 40 years' student's behind idea of a programming language. It is a non-starter.

 

Where is parallelism?

 

Where is 'intention' over 'how'?

 

Apple has failed. The rest is history.

 

You're absolutely correct ... programming languages are completely set in stone once they're released and can never be updated, extended, or even optimized at some point in the future¡

 

Game over Apple, shut down your business, you completely failed¡

 

Apple didn't develop a language for everyone to use - they developed a language that would make it easier for Cocoa programmers to write code.

Disclaimer: The things I say are merely my own personal opinion and may or may not be based on facts. At certain points in any discussion, sarcasm may ensue.
post #72 of 89
Quote:
Originally Posted by techguy911 View Post
 

 

Apple really should have done this at the announcement of Swift.  We really need a modern native speed compiled language that can compete with Python and Java.  I hope Apple is not planning on keeping Swift proprietary and Apple only.  Google half-assed the Go language like they do everything and it still feels like a beta project they are not serious about finishing.  Microsoft tolerated C# and .NET being used on non-Microsoft platforms but did nothing to help its adoption elsewhere.

 

???? Apple's job is to sell more hardware at a profit; Swift, if it is good, will help that. Apple's job isn't to help the competition program better.

post #73 of 89
Quote:
Originally Posted by Rogifan View Post

Of course ZDNet, aka Microsoft's PR arm, pisses all over Swift.

http://www.zdnet.com/apples-new-swift-development-language-highlights-the-companys-worst-side-7000030150/

 

The joke as usual is on them; the old and new Apple developers are going to have a field day with Swift, just like the long-term investors.

post #74 of 89
Quote:
Originally Posted by timmillea View Post

If this were the case, functional languages would have taken over several decades ago. The average intelligence of the programmer is the limiting factor. Sorry - it is hard to say - but true.
If the average programmer was much more focussed on the functional aspects of what they are writing rather than on whether they have put the requisite number of semicolons or curly brackets in the right place, then perhaps I might rate them as being 'intelligent'! The big problem with programmers is that they have become so focussed on the technology of writing programs that they have lost the capacity to communicate with the people who have a functional problem to solve and who will have to use those programs.
I put it to you that there is at least 10 times as much code written by your 'intelligent' programmers as is actually needed to solve the information processing problems of people and organisations today. That means there are at least 10 times as many bugs floating around waiting to bite people on the bum, and we have to pay at least 10 times as much for buggy programs that don't really address the functional problems that need to be solved. So your 'intelligent' programmers need to decide what is really important - the technology of programming or the quality of the product of programming. And languages like C fail abysmally in that respect.
So there - that needs to be said also!!
post #75 of 89
Quote:
Originally Posted by pauldfullerton View Post


If the average programmer was much more focussed on the functional aspects ...
... 'intelligent' programmers need to learn what is really important - the technology of programming or the quality of the product of programming. And languages like C fail abysmally in that respect.

I agree with the focus on usability and functionality, but blaming the language is like blaming the hammer for the cockeyed shelving.

C has its issues and other languages are better for certain problem domains but the major issue as you describe it is with the carpenter rather than his tools.

post #76 of 89
Sounds like Apple recognized a good talent from the LLVM compiler. Let's hope this language really takes off and is robust enough to supplant Objective C and other languages that, while powerful, force programmers to spend too much time on housekeeping chores.

It's about time a "script-like" programming language competed with the big boys.

I'm especially hopeful after reading this sentence; " and an incredibly important internal dogfooding group who provided feedback to help refine and battle-test ideas"

Always good when a company eats its own dog food.
post #77 of 89
Quote:
 Second, the inconsistency between:

let array = [a, b, c]
let dictionary = [d : 1, e : 2]

According to the documentation this creates, respectively, an immutable array and dictionary, except the dictionary is truly immutable and arrays are only sort-of immutable.
While the dictionary cannot be changed at all and the array can't change size, the contents of the array can be changed. That's hardly immutable and certainly different from the true immutability of NSArray.

I'm with you, but maybe even more so.

 

let dictionary = [akey : avalue]

 

is cool. It's truly immutable, in the ordinary sense, as Apple's NS classes do it.

 

let array = [athing0, athing1] // agreed that array is not really immutable since its elements can be changed

 

// but wait, there's more

let anotherRef = array

let yetAnotherRef = array

 

array[0] = somethingelse // the other two refs see this change also, because it's NOT really by-value, these are references

 

// here it gets weird

array += anotherthing // as a side effect of resizing, Swift allocates and inits a new Array that only array refers to (note: array would have to have been declared with var, not let, for this resize to compile)

 

// array is now [somethingelse, athing1, anotherthing]

// anotherRef and yetAnotherRef are both [somethingelse, athing1] because they both refer to the same allocation, the one made at the beginning

 

The other two refs still refer to the original allocation, but as far as array is concerned, the "kinda-sorta by-value-ness" has finally turned into a real by-value-ness (at least till a new copy of the array ref is made, then the weirdness can start again).

 

and it's getting complicated:

 

array[0] = somethingElseAgain // this will NOT affect what anotherRef and yetAnotherRef contain

 

anotherRef[0] = anotherDamnThing // WILL also affect what yetAnotherRef contains, but not what array contains

 

but a few lines earlier, those same statements would have affected all 3 array refs.

 

I cringe, thinking this could cause very hard-to-find bugs. Side-effect city, because the by-value contract has been broken. The breakage occurred at the let anotherRef = array and let yetAnotherRef = array steps, and was first visible when array[0] was allowed to change and affect the other 2 refs. The resize problem is a victim, not a perp; it did the right thing.

 

I see why they did it. It's a common (and, for performance reasons, needed!) tactic not to actually copy even when the programmer's model says a copy occurs, holding off to see if a change is actually made, and only then doing the copy. But at that point there are no other refs that could be made inconsistent by a new allocation.

 

I assume the compiler jocks at Apple, being compiler jocks, have a proof that this is entirely safe and will create no new surprises for devs, but till I see the proof, I still cringe.

 

Also, closed vs open ranges look like they might be typo traps and homes for off-by-one errors, because the difference between [a..b] and [a...b] may not be obvious to tired code-reviewing eyes. It's also slightly disconcerting that ranges can only be open at the right end and not the left; it feels like the syntax is lacking and ungeneral.
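Concretely, with the operators as they currently stand in the beta (a sketch):

for i in 0..3 { println(i) }    // half-open: prints 0, 1, 2
for i in 0...3 { println(i) }   // closed: prints 0, 1, 2, 3 - one dot of difference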

 

On the good side, they killed off

if (a=b) {

}

which in C compiles cleanly and silently tests the assigned value (a classic typo for ==); Swift rejects it outright, since an assignment doesn't produce a usable value.

 

But I don't want to dump on Swift. On the whole, I think Swift will make it considerably easier to write code and avoid bugs than C, Obj-C and C++ ever did, but I don't think it's quite soup yet.

 

I think Swift's biggest obstacle will be adoption. Cocoa was markedly superior to Carbon and OS 9 libraries in most (maybe not all) ways, but adoption is still not complete, AFAIK; Finder was only Cocoa-fied by Apple recently, at Lion or ML IIRC. And 99% of all new languages don't make the big-time, otherwise there'd be 50,000 languages in common use.

 

Will Swift make it, especially as it's (currently) proprietary to Apple?

post #78 of 89
Quote:
 If one coded an algorithm in Xcode using C, Objective-C, or Swift the compiled code would be identical.  Ergo, Swift has the performance of C, period.

 

Nonsense! Speed depends on optimizations that exist in the compiler (and even linker, etc) and the ability of the language to expose code and data to optimization. The more guarantees you have about what your variables are, the more chances you have to optimize with them. The better the language exposes code interdependencies, the more chances the compiler has to optimize them.

 

See the discussion in the Swift manual (iBooks) on Arrays and Dictionaries and how you can optimize copies away most of the time for constant data. In C, you may have a float array and a separate pointer that points to its 19th element but treats it as char * rather than float[]. It's pretty hard for a compiler to come up with as many safe optimizations in that environment as in Swift. On the other hand, C allows the programmer to do flexible things like treat his array as both types at once, which may lead to better speed than Swift can hope for, but at potentially great risk to reliability and safety.

 

And Assembly code can be hand-tuned to go faster than C if you know the machine language in detail and are willing to spend hours sweating the details. I've done it. It ain't fun, and is seldom productive unless used in the very, very inmost loops.

 

What Swift seems to promise is a nice balance of development speed and ease vs execution speed and reliability. 

 

For a very concrete example: C doesn't check that data fits inside the memory allocated for variables; this lack of checking makes it faster when dealing with arrays than other, safer languages. But it also leads to the infamous buffer overflows and vulnerabilities like Heartbleed (though some really bad design helped Heartbeat achieve fame).

 

Swift does guarantee that data can't overflow its variables. This takes a bit longer, but is vastly safer. Programmers will probably still be able to create buffer overflows, but they'll have to work really hard to do so.

 

Anyway, it's impossible in most cases to look at a language description and say whether, in practice, programs written to normal expectations of productivity will necessarily run faster or slower than in other languages when written to similar standards. You have to use the language to find out, and use it on the kind of application you want to write. If Apple's numbers say it's very fast, but not as fast as C for most things, I'll tend to believe them.

post #79 of 89
I watched the demo at its intro. Great and revolutionary. Question is: how quickly will Google and the other me-too robber baron companies filch it???
post #80 of 89
Quote:
Originally Posted by greatrix View Post

I watched the demo at its intro. Great and revolutionary. Question is: how quickly will Google and the other me-too robber baron companies filch it???

Is that possible? Seems like it took years to get to this point and Apple had full control of mature tools.

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow
