
Book sales mark shift toward Mac, iPhone development - Page 2

post #41 of 78
Quote:
Originally Posted by MacTel View Post

That is true but in practice I believe those developers prefer to develop for a stable mobile platform. What I mean by stable is that all the iPhone and iPod Touch devices have similar base hardware whereas the Mac lines do not. Varying graphic cards, displays and major versions of OSX require extra development to keep-up with. The iPhone OS is simpler to develop for and support.

This is not true... OpenGL hides the underlying graphics hardware. All programmers need to do is check whether "feature A" exists on the system and, if not, skip those features in the game. Most graphical games are done using vector rendering with bitmap overlays for textures; this scales well across higher resolutions.
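To make that concrete, here's a rough sketch of what a runtime feature check looks like in OpenGL (the extension name is just an example; a real game would check for whatever capability it actually needs):

Code:

// Query the driver's extension list at startup (plain C, callable from any Mac game).
const char *extensions = (const char *)glGetString(GL_EXTENSIONS);
int hasFBO = (extensions != NULL)
    && (strstr(extensions, "GL_EXT_framebuffer_object") != NULL);
if (!hasFBO) {
    // Fall back to a simpler rendering path on older graphics cards.
}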
Disclaimer: The things I say are merely my own personal opinion and may or may not be based on facts. At certain points in any discussion, sarcasm may ensue.
post #42 of 78
Here's my take on the state of Apple development tools:

1) Objective-C: To the layman, Objective-C has all the features that make it easy to sell to the higher-ups: superset of C, low-level resource management (pointers), object-oriented, rich APIs, properties, garbage collection (not for iPhone), and dynamic typing. However, to the programmer, it lacks the simplicity of other frameworks such as .NET and Java. The syntax is really alien to programmers coming from C#, C++ or Java. There's too much boilerplate code required, such as header files and alloc/dealloc statements. I can't argue with the end result, but from the perspective of a Java developer, it doesn't feel like a pleasant language to program in. Also, I find the lack of strong typing to be an issue, as I rely on the compiler in Java to catch a lot of potential problems that I would never be able to unit test.

2) Xcode code editor. The code editor lacks many features of modern IDEs such as Visual Studio and Eclipse. Boilerplate code generation and IntelliSense are not as advanced. The debugger is apparently (from what I've heard) less reliable and has fewer features. I'm not sure if it supports unit testing similar to JUnit (though I could be wrong). Unless I'm missing some information, Apple needs to make a number of improvements to Xcode to make it a first-class IDE.

3) InterfaceBuilder. From my limited experience with this tool, it's really first class, and I haven't seen a better GUI builder (although I haven't used the one in VisualStudio). All the GUI elements are available for a quick drag and drop. There are boundaries that let you drop in elements in just the right place, with just the right amount of spacing between the elements. Binding the GUI elements to code properties is also very easy, with a simple drag-and-drop mechanism. It's also easy to get the GUI up and running.

4) Instruments: I haven't used it but from what I've read, it's an excellent front end for DTrace.

5) API documentation. This is not as trivial to access from the IDE or the web as the javadoc documentation. Then again, perhaps it's just organized in a way that I'm not used to.

The bottom line is that if Apple can improve the Xcode editor (better code generation, error correction, easier linking to documentation, a stronger debugger, and an intuitive unit-testing framework), then it can evolve the platform further. Developers are important, and the more developers Apple can get on board, the richer its ecosystem will become.
post #43 of 78
You couldn't pay me enough money to write another tech book and I've written two successful ones.
post #44 of 78
Quote:
Originally Posted by JavaCowboy View Post

Here's my take on the state of Apple development tools:

1) Objective-C: To the layman, Objective-C has all the features that make it easy to sell to the higher-ups: superset of C, low-level resource management (pointers), object-oriented, rich APIs, properties, garbage collection (not for iPhone), and dynamic typing. However, to the programmer, it lacks the simplicity of other frameworks such as .NET and Java. The syntax is really alien to programmers coming from C#, C++ or Java. There's too much boilerplate code required, such as header files and alloc/dealloc statements. I can't argue with the end result, but from the perspective of a Java developer, it doesn't feel like a pleasant language to program in. Also, I find the lack of strong typing to be an issue, as I rely on the compiler in Java to catch a lot of potential problems that I would never be able to unit test.

I love Objective-C. I did a lot of C programming, and C++ just seemed like a big mess. ObjC makes the concepts of OOP so much easier to grasp. I subsequently got into some Java and C# and found that what I learned from ObjC served me well, and made me still prefer ObjC. When I learned Java and C#, I don't remember saying, "I wish ObjC could do that." I also found that the introductory book from Apple on the Objective-C language was clearer, more concise and better-written than the books I got to learn Java and C#.

Typing in ObjC is as strong or as weak as you want it to be. You can go as generic as "id" or as specific as the exact class of the object in question, or any class in between.
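For anyone unfamiliar, here is a small sketch of that spectrum (the variable names are made up; NSString and NSObject are real Foundation classes):

Code:

id anything = @"hello";        // fully dynamic: any known message compiles
NSObject *object = @"hello";   // weakest useful static type
NSString *string = @"hello";   // exact class: compiler checks messages sent to it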

A lot of that boilerplate code generation went away with the introduction of properties in ObjC 2.0, but there were tools out there before that to generate a lot of that code automatically.
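For readers who haven't seen ObjC 2.0 properties, a rough sketch of what replaced that boilerplate (Person and name are invented for illustration; this assumes manual retain/release, as on the iPhone):

Code:

@interface Person : NSObject {
    NSString *name;
}
@property (copy) NSString *name;  // one line replaces a hand-written getter/setter
@end

@implementation Person
@synthesize name;  // the compiler generates the accessor pair

- (void)dealloc {
    [name release];  // still manual under retain/release
    [super dealloc];
}
@end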

You have to separate the language from the frameworks. I prefer Cocoa to Java and .NET.

I'm glad to see Objective-C on the rise. (A few years ago, there was a fear that Apple might make Cocoa Java-only.) I hope that reflects a trend where many people are deciding that it is a great language for creating outstanding Mac and iPhone apps. It is certainly fundamentally different from Java, C++ and C# (C# is just another Microsoft me-too product that has only fairly trivial departures from Java), so it represents a unique option to developers.
post #45 of 78
Quote:
Originally Posted by macFanDave View Post

I love Objective-C. I did a lot of C programming, and C++ just seemed like a big mess. ObjC makes the concepts of OOP so much easier to grasp. I subsequently got into some Java and C# and found that what I learned from ObjC served me well, and made me still prefer ObjC. When I learned Java and C#, I don't remember saying, "I wish ObjC could do that." I also found that the introductory book from Apple on the Objective-C language was clearer, more concise and better-written than the books I got to learn Java and C#.

YMMV. I guess it's all subjective, but I think I speak for a majority of "curly braces" programmers when I say that Objective-C syntax is just plain weird. I'm not saying it's wrong or lacking functionality, but in my admittedly brief attempts to learn it, I was put off by the weird syntax.

Quote:
Typing in ObjC is as strong or as weak as you want it to be. You can go as generic as "id" or as specific as the exact class of the object in question, or any class in between.

Does the compiler do the type checking? If so, how easy is this to set? Honest question.

Quote:
A lot of that boilerplate code generation went away with the introduction of properties in ObjC 2.0, but there were tools out there before that to generate a lot of that code automatically.

From what I understand, there's still a lot of boilerplate related to header files, alloc and dealloc statements, interface and implementation (- and +) methods/functions, etc. Still, compact property syntax is a good idea, as Java still doesn't have it for backward-compatibility reasons.

Quote:
You have to separate the language from the frameworks. I prefer Cocoa to Java and .NET.

At a high level, the Cocoa APIs seem to be very useful and have lots of functionality (e.g. Core Animation). This ties in to the strengths of OS X and its NeXTSTEP underpinnings. Still, the Java/.NET APIs have their own sets of strengths, relating to putting common programming tasks into easily accessible libraries and business-related functionality.

Also, stupid question: Is there a unit testing framework for Cocoa/Objective-C?

Quote:
I'm glad to see Objective-C on the rise. (A few years ago, there was a fear that Apple might make Cocoa Java-only.) I hope that reflects a trend where many people are deciding that it is a great language for creating outstanding Mac and iPhone apps. It is certainly fundamentally different from Java, C++ and C# (C# is just another Microsoft me-too product that has only fairly trivial departures from Java), so it represents a unique option to developers.

Apple's about-face on Cocoa and Java was interesting. Apple made lots of promises about Java, only to break them when they deprecated the Java-Cocoa bridge. Their Java GUI documentation still contains screenshots with the old pinstripe theme. I have no idea why Sun allowed them to distribute Java. Perhaps it was the low market share of OS X at the time.

Objective-C is older than many modern languages, including Java and C#. .NET supports multiple alternative languages (e.g. IronPython), and Java is moving in that direction (Groovy, Jython, JRuby, Scala, JavaFX). I believe (though I'm not sure) that Cocoa supports Ruby and Python, but the iPhone still requires Objective-C. If Apple could support more languages for Cocoa, then I think the framework could really increase developer adoption and take off.
post #46 of 78
Quote:
Originally Posted by timgriff84 View Post

Doesn't this go against the argument that Macs are easier to use if a larger percentage of Mac users are buying books compared to PC users?

Considering that most of these books are about programming, what does that have to do with Macs being easier to use?
post #47 of 78
Quote:
Originally Posted by badNameErr View Post

If only that was the case - AI has been doing much more than just "comparisons" recently.

Seriously, if I want to read about how MS is doing I can read that on the PC sites - they are full of MS stories. It's not needed on AI and it isn't why I visit.

Just my 2c. This is intended to be constructive criticism, BTW.

Some of the most popular articles on the web's two biggest PC sites, PCMag and PCWorld, are about Apple, Macs, iPods, and iPhones.

While most of these are positive, some are negative. Should they stop their coverage?

Ars Technica, another big news/rumors/fan site, has had extensive coverage of Apple going way back. Some of those articles are very critical, and over the years, though less so now, posters were very Windows-oriented, and very derisive about anything Apple.

AnandTech, another mostly Windows tech site, has extensive Apple coverage, which has increased over the years, particularly since Anand himself moved to a Mac laptop and Mac Pro as his own mainstay computers.

Most of those sites are news and technology. Of those, only Ars is really a rumors site as well.

But AI is a rumors site and a news site, and is unabashedly Apple based. What would you expect?

If you want more objective news oriented information on a Mac site, then go to MacWorld instead.

But do you really want that, or do you want to come here to complain about Windows coverage?

I suspect the latter.
post #48 of 78
Quote:
Originally Posted by jfanning View Post

It is more significant that 50% of books are not. Without that additional data it is a waste of time trying to discuss anything here. You don't know if the drop in sales was made up with direct sales, or in fact if the people that were previously gaining the information from books are now gaining the same information from online sources or training courses.

You're attempting to twist it the way you want. No, you're not correct.

And you haven't provided any evidence to show that the "other" 50% is substantially different. Until you do that, you can't assume it is, which is what you're trying to do for whatever reason.
post #49 of 78
I can't answer all of your questions with any firm authority -- I hope a more experienced Mac/iPhone developer will come along and give you some satisfactory answers. I will take a crack at these:

Quote:
Originally Posted by JavaCowboy View Post

YMMV. I guess it's all subjective, but I think I speak for a majority of "curly braces" programmers when I say that Objective-C syntax is just plain weird. I'm not saying it's wrong or lacking functionality, but in my admittedly brief attempts to learn it, I was put off by the weird syntax

We in the "square brackets" crowd understand that ObjC can be jarring at first, but if you give it a chance, you might come to love it like we do. Of all the weirdness, I love the labeled arguments. In C, you have to memorize which variable comes in the first position, second position, etc. in a function's argument list. I was disappointed that Java and C++ didn't improve that problem. ObjC methods are much closer to being self-documenting.
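A quick illustration of the difference (stringByReplacingCharactersInRange:withString: is a real NSString method; the values are arbitrary):

Code:

// C-style: you have to remember which argument is which.
NSRange r = NSMakeRange(3, 5);  // location first, or length first?

// Objective-C: the method name itself labels each argument.
NSString *s = [@"hello world" stringByReplacingCharactersInRange:r
                                                      withString:@"there"];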

Quote:
From what I understand, there's still a lot of boilerplate related to header files, alloc and dealloc statements, interfaces and implementations (- and +) methods/functions, etc. Still, compact property syntax is a good idea, as Java still doesn't have it for backward compatibility reasons.

Xcode cranks out all of the "boilerplate" matter for interface and implementation files. I wish that some of the things that are duplicated in the two files could be automated instead of requiring a cut-and-paste. Properties eliminate the manual writing of accessor code, but the instances still need to be manually released in the implementation's dealloc method. In C++, I never "got" the distinction between class methods and instance methods, and ObjC's use of the + or - makes it crystal clear.
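As a small sketch of that + / - distinction (Widget is a made-up class; only the syntax matters here):

Code:

@interface Widget : NSObject
+ (Widget *)widget;  // class method: sent to the Widget class itself
- (void)draw;        // instance method: sent to a particular widget
@end

// Usage:
Widget *w = [Widget widget];  // message to the class
[w draw];                     // message to the instance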

Quote:
At a high level, the Cocoa APIs seem to be very useful and have lots of functionality (ex Core Animation). This ties in to the strengths of OS X and its NeXTStep underpinnings. Still, the Java/.NET APIs have their own sets of strength, relating to putting common programing tasks into easily accessible libraries and business-related functionality.

I guess learning how to navigate a given framework will give you a better or worse view of that framework. I "got" Cocoa a lot more easily than Java or .NET. Maybe those other two have a lot of power, but I found them very difficult to use.

Quote:
Apple's about-face on Cocoa and Java was interesting. Apple made lots of promises about Java, only to break them when they deprecated the Java-Cocoa bridge.

Yeah, the about-face was dramatic. At one point, it was going to be ObjC that was going to get the deprecation ax, but then Java ended up getting it.
post #50 of 78
Quote:
Originally Posted by macFanDave View Post

I'm glad to see Objective-C on the rise. (A few years ago, there was a fear that Apple might make Cocoa Java-only.) I hope that reflects a trend where many people are deciding that it is a great language for creating outstanding Mac and iPhone apps. It is certainly fundamentally different from Java, C++ and C# (C# is just another Microsoft me-too product that has only fairly trivial departures from Java), so it represents a unique option to developers.

In hindsight, I think this was one of Apple's best ever technical decisions (not to go all Java). There's definitely a certain amount of vindication for all of the NeXT fanbois who have persevered with Objective-C and the NeXT APIs since the late 1980s. A lot of casualties and change along the way, and basically 2 or 3 virtual deaths, but it made it. Made it into the hands of millions and millions of people, something many thought not possible. The iPhone and iPhone OS X are in so many ways the embodiment of what NeXT was all about.

If Apple had gone all Java, I don't think they would be where they are today. There just wouldn't be that much enthusiasm in the product anymore, as most of the people involved at Apple are NeXT veterans, especially the software people: Jobs, Serlet, Forstall, Tamaddon, Tevanian before he left, Heinen before she left over the backdating scandal, Rubinstein before he left (though he was a NeXT hardware VP), and who knows who else in the trenches. I sometimes speculate that a lot of people at Apple are really sore about WebObjects going Java. If it were still Objective-C, I bet it would be a much more prominent product.

Hell, Java wouldn't be Java without the NeXT influence. It's amazing how small the world really is.
post #51 of 78
Quote:
Originally Posted by Shrike View Post

In hindsight, I think this was one of Apple's best ever technical decisions (not to go all Java). There's definitely a certain amount of vindication for all of the NeXT fanbois who have persevered with Objective-C and the NeXT APIs since the late 1980s. A lot of casualties and change along the way, and basically 2 or 3 virtual deaths, but it made it. Made it into the hands of millions and millions of people, something many thought not possible. The iPhone and iPhone OS X are in so many ways the embodiment of what NeXT was all about.

I'm not trying to start an argument, but why was dumping Java a good *technical* decision?

Quote:
If Apple had gone all Java, I don't think they would be where they are today. There just wouldn't be that much enthusiasm in the product anymore, as most of the people involved at Apple are NeXT veterans, especially the software people: Jobs, Serlet, Forstall, Tamaddon, Tevanian before he left, Heinen before she left over the backdating scandal, Rubinstein before he left (though he was a NeXT hardware VP), and who knows who else in the trenches. I sometimes speculate that a lot of people at Apple are really sore about WebObjects going Java. If it were still Objective-C, I bet it would be a much more prominent product.

You've lost me. First of all, it sounds like you're making a good case for why dumping Java was a good *political* decision, since the "software people" from NeXT were clearly in the Objective-C camp, so favouring their language was obviously the best way to retain these people. However, you really haven't made the case for why Objective-C was better *technically* for Apple than Java. I suppose one advantage of Objective-C is that while it's an open standard, it's Apple's technology to tailor as it sees fit, as opposed to Java, which follows Sun and the JCP's standards, and tends to evolve more slowly due to backward compatibility concerns (not to go off on a tangent, but that's one of the reasons Sun created JavaFX).

What's more, I think you're totally wrong on WebObjects. I would argue WebObjects succeeded *because* it's a Java web container, not *in spite of it*. Java has proven itself as a server-side technology and it's been battle-tested in large-scale enterprise applications. This is in stark contrast to the client-side, where Java just never took off and was the victim of questionable design decisions. WebObjects applications, while they can only be developed in OS X, can be deployed on any JEE-compliant container. This gives WebObjects a tremendous advantage over an Objective-C container, since many developers wouldn't adopt it if they had to run it on either proprietary hardware or proprietary application server software.

Quote:
Hell, Java wouldn't be Java without the NeXT influence. It's amazing how small the world really is.

Well, I'd argue that both were influenced by Smalltalk, but that's beside the point.
post #52 of 78
JavaCowboy: You've asked a lot of good questions, and generally seem to be more curious than attacking, which I respect. A few comments...

Quote:
Originally Posted by JavaCowboy View Post

However, to the programmer, it lacks the simplicity of other frameworks such as .NET and Java. The syntax is really alien to programmers coming from C#, C++ or Java.

I suspect you're referring to the fact that Objective-C uses message passing with named parameters, rather than method calls. So instead of something like this:
Code:

firstHeader.style("red", 32, true, false);

In Objective-C you'd have something like this:
Code:

[firstHeader styleWithColor:@"red" fontSize:32 bold:YES italics:NO];

Admittedly the brackets everywhere and the at signs can be a little disconcerting at first, but what I've always found a little interesting is that this really isn't different from say how jQuery specifies parameters or how Wordpress uses query strings in their internal functions. A few examples:

JavaScript
Code:

$("#firstHeader").css({'color': 'yellow', 'font-size': '32px', 'font-weight': 'bold', 'font-style': 'italic'});

PHP/Wordpress
Code:

style('firstHeader', 'color=yellow&fontSize=32&fontWeight=bold&fontStyle=italic');

Both of which are considered (again, my biases are creeping in) to be very enjoyable environments to code with/in. Neither JavaScript nor PHP has named parameters built in, but the developers found it useful enough that they rolled the functionality themselves. Objective-C just has it built in. Personally, it's one of my favorite parts of using Objective-C and jQuery. But I can understand that it's more difficult to type and can seem overly verbose. A few things make this less of an issue. The first is that Xcode's code sense is really, really good. Here's an example:

[Screenshot: Xcode's code-completion suggestions after typing a few characters]
I've typed about three characters, and it can already present possible alternatives. If it's not what I'm looking for, I can do control-period to cycle through the options, or hit escape to pull up a contextual menu of possible completions. Or I can keep typing whatever else I intended. The dashed line underneath lets me see where else in the code I've used that message call. But what's particularly awesome is that I don't have to hit tab and then fill in the fields. If you do control-/ (or control-shift-/ to go backwards) you can cycle through the tokens and fill in the parameters. Another thing: I typed the opening bracket, but even that isn't necessary. As soon as you type the closing bracket, it'll close off the initial one for you. I end up typing far fewer characters and can actually understand and see what the parameters mean, compared to when I'm working in something like Eclipse.

Quote:
There's too much boilerplate code required, such as header files and alloc/dealloc statements. I can't arge with the end result, but from the perspective of a Java developer, it doesn't feel like a pleasant language to program in.

Just out of curiosity, do you primarily do backend Java development, servlets, or GUI applications? I mostly do backend-type stuff and servlets, so please excuse me if the GUI tools require significantly less code. From what I've found, however, even for something like the Spring Framework, which is meant to simplify development, there's still a lot of very painful boilerplate code required. For example, if you want to pass anything more than a string of data to a template, you have to wrap it up in a model object, write its accessor methods, and then pass it to the template.

Header files are an interesting case. They're present in all the C-based languages I'm familiar with (I'm not sure about C#), while Java is one of the few (compiled) languages that doesn't have them. The reason they exist is that in order for code to link against compiled libraries, it needs to know what variables and functions are available for it to link against, since all the actual code is in machine code. How does Java get around this? It links the filename to the class name (or interface or whatever). This actually isn't the worst idea (and is the de facto standard in Objective-C projects anyway), but it does have a few implications. One is that you can only have one class or interface per file. This might not seem like a bad thing, but sometimes you need to create helper objects (like Iterators) or want to define an interface that objects in your collection class have to conform to. This actually isn't impossible in Java, but then you have to resort to inner classes, and because Java is strongly typed, you have to deal with properly typecasting things as necessary and properly initializing your inner classes. This can be painful.

Again, this isn't an argument for header files, but as I mentioned Xcode helping with method names, Xcode also makes creating new classes painless. Your header files will be created for you and your implementation files will properly reference them. You can write the code yourself or have it create UIView, or UITableViewController subclasses with accompanying NIB files, if you want. I've found it to be incredibly helpful to be able to jump through different header files and get a sense of the overall class. It's a form of at-a-glance documentation that can be hard to get with Java without generating automatic documentation (which you can also do in Xcode and also view your custom documentation within its reference library).

Quote:
Also, I find the lack of strong typing to be an issue, as I rely on the compiler in Java to catch a lot of potential problems that I would never be able to unit test.

I think this concern might be due to unfamiliarity. Type-checking using proper types in Objective-C is de facto and expected. The only time that you don't specify an explicit type is with delegate objects (objects to which you can delegate responsibility) and for objects in collection classes. You're probably familiar with the delegate pattern, but as a simple example, you might need to do an asynchronous URL request in code. In Java, the typical pattern would be to subclass whatever class is available (something like URLConnection; I'm not sure), and then override methods that deal with the response and any other error conditions. Then you'd include this class you created in your main controller code, initialize it with your URL, and you're done. In Objective-C you just initialize an NSURLConnection and set your controller as its delegate (something like urlConn.delegate = self). Then when the URL connection has either succeeded or failed, it will see if you respond to a particular selector (a fancy word for Objective-C's representation of message calls; you can make one like this: @selector(didReceiveURLResponse)), and if you do, send the message.

This is why you don't specify an explicit type for delegate variables. But I can tell that checking whether an object supports a message bothers you, so if you want you can also declare a protocol such as NSURLConnectionDelegate, with required methods, that your controller object conforms to (MyControllerClass <NSURLConnectionDelegate, SomeOtherDelegate>), and the compiler will tell you when you've sent a message that your object can't understand. And you don't have to check in code anymore. This was less common before, but it's used throughout UIKit.
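Putting the two styles side by side (the protocol and method names here are invented, in the spirit of the NSURLConnection example above):

Code:

// A delegate protocol with required methods.
@protocol DownloaderDelegate
- (void)downloadDidFinish:(NSData *)data;
@end

// Declaring conformance lets the compiler check the messages:
@interface MyController : NSObject <DownloaderDelegate>
@end

// Without a protocol, the caller checks at runtime instead:
if ([delegate respondsToSelector:@selector(downloadDidFinish:)]) {
    [delegate downloadDidFinish:data];
}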

Finally, to answer a question you asked below, there's no need to "turn on type checking" or to fiddle with settings. It's built in and at the heart of how all Objective-C programs are written. If your code tries to send a message to an object declared as the generic type (id), the compiler will throw a warning (unless it explicitly supports it using a protocol or checks if it responds to the selector, as I mentioned above). Similarly, with collection classes, it's useful to say that your custom array classes hold "id"s. It's not necessary to declare an explicit data type since you're simply holding references to your objects, rather than manipulating them or sending messages to them. Since you're a Java programmer, I'm sure you're more than familiar with setting up generic types for your collection types, coercing them back to their proper types when you need to deal with them. In Objective-C, this is a lot less painful. The general reasoning behind Objective-C's type design is that it's strong where you need it and weak where you don't.
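A small sketch of the collection-class side of this (NSMutableArray is a real Foundation class; there are no generics to declare):

Code:

NSMutableArray *items = [NSMutableArray array];  // holds ids; nothing to parameterize
[items addObject:@"a string"];

// Pulling an object back out requires no cast, unlike a pre-generics Java collection:
NSString *s = [items objectAtIndex:0];
NSUInteger len = [s length];  // full static checking resumes from here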

Quote:
The code editor lacks many features of modern IDEs such as VisualStudio and Eclipse. Boilerplate code generation and intellisense is not as advanced. The debugger is apparently (from what I've heard) less reliable and has less features.

I suspect you're right about the debugger. For the simple use cases I use it for (setting breakpoints and inspecting variables), it works more than well enough for me, but I imagine it could be better. No promises, but I wouldn't be surprised if an upcoming version of Xcode had a fantastic new feature to let you debug your code. Check out more recent versions of Xcode; the code sense and code generation these days are very good.

Quote:
I'm not sure if it supports unit testing similar to JUnit (though I could be wrong).

Projects at Apple aren't written without unit testing. OCUnit has been around since before Mac OS X and has been integrated into Xcode since 2005. Check out this post by Bill Bumgarner, one of the engineers at Apple who wrote Core Data.

Quote:
API documentation. This is not as trivial to access from the IDE or the web as the javadoc documentation. Then again, perhaps it's just organized in a way that I'm not used to.

Option-double-click on any symbol or keyword in your app and it'll pull this up:
[Screenshot: Xcode's documentation popup for the selected symbol]

Includes links to sample code, related documents, the header file, and a link to the full documentation. Also I've found Apple's programming guides to be excellent.

Quote:
Originally Posted by JavaCowboy View Post

YMMV. I guess it's all subjective, but I think I speak for a majority of "curly braces" programmers when I say that Objective-C syntax is just plain weird. I'm not saying it's wrong or lacking functionality, but in my admittedly brief attempts to learn it, I was put off by the weird syntax.

See my examples above. I think it's worth the weirdness.

Quote:
I believe (not sure) that Cocoa supports Ruby and Python, but the iPhone still requires Objective-C. If Apple could support more languages for Cocoa, then I think the framework could really increase developer adoption and take off.

Not to disappoint you, but from what I've seen, I don't think that Apple is looking to push Python and Ruby for Mac or iPhone development. I think that you'll find, if you spend enough time with it, that the difficult part isn't the language (I actually find it to be one of the less important things; I just happen to like it), but rather the Cocoa framework. To be, as Aaron Hillegass calls it, a stylish Cocoa programmer is difficult. And the Ruby and Python projects are just Cocoa scripting bridges; you still have to learn Cocoa. The ability to use Python instead (with a performance hit) is mostly a comfort thing. I'm not knocking the projects. I had a lot of fun with AppleScript Studio back in the day, which is built on a similar premise, but it's not really appropriate for larger or complex projects.

Also, I think that Apple has already increased developer adoption, and it's already taken off. Its developer conference has sold out for the past few years, and sales of books about Objective-C programming have apparently gone up over 500% (!). Take a look at Objective-C again and I think you might be surprised. You do have to write a lot of classes in order to do what you want, but no more than in Java. Both require a seemingly large number of files because they're object-oriented, and that's the Object-Oriented Way™. If you don't have a lot of objects (and are instead using things like switch statements and constants) you're probably doing something wrong. I've generally found that Cocoa rewards me when I abstract out functionality and create additional classes. Java, on the other hand, often requires you to create a separate class to do seemingly simple things, which can be frustrating. I've already typed far too much, but one last controversial example:

Java:
Code:

// Note: java.lang.String is final, so it can't actually be subclassed;
// in practice you'd need a wrapper class, which makes the point stronger.
public class ReverseString {
    private final String value;

    public ReverseString(String value) { this.value = value; }

    public String reverse() {
        // implementation
    }
}

Objective-C:
Code:


@interface NSString (ReverseAdditions)
- (NSString *)reverse;
@end

@implementation NSString (ReverseAdditions)
- (NSString *)reverse {
    // implementation
}
@end

In Java, whenever you want to use your new reverse method, you have to create and initialize a ReverseString and declare it as a ReverseString. But if the collection class you're passing your string to only accepts strings, then you have to cast it to a String to put it in the collection, and then cast it back when you remember that it can be reversed. It's true that Java has strong typing, but when you're forced to cast constantly, the type information it enforces becomes less meaningful. In Objective-C you can just open up NSString, add in a reverse method, and all strings (even string literals) will have your additional method. You can do similar things with Ruby and JavaScript's prototype hook. Anyway, that's all I have to say for now.
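To make the casting pain concrete, here's a self-contained sketch. Since java.lang.String is final, the ReverseString here is a hypothetical wrapper class (invented for this example) rather than a subclass, and the collection is a raw, pre-generics list, so everything comes back out as Object:

```java
import java.util.ArrayList;
import java.util.List;

public class CastingDemo {
    // Hypothetical wrapper: String is final, so it can't be subclassed.
    public static class ReverseString {
        private final String value;
        public ReverseString(String value) { this.value = value; }
        public String reverse() { return new StringBuilder(value).reverse().toString(); }
        @Override public String toString() { return value; }
    }

    public static void main(String[] args) {
        List strings = new ArrayList();                      // raw, pre-generics collection
        strings.add(new ReverseString("hello").toString());  // stored as a plain String

        // Getting it back means casting, then re-wrapping to reverse it.
        String s = (String) strings.get(0);
        System.out.println(new ReverseString(s).reverse());  // prints "olleh"
    }
}
```

The round trip through Object and the re-wrap at the end is exactly the bookkeeping that an NSString category makes unnecessary.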

--

An update: JavaCowboy: It's true that Java borrowed from Smalltalk, but it borrowed from Objective-C, too (Java's interfaces were taken directly from Objective-C's protocols). Here's a good post by one of the designers of Java. An excerpt:

Quote:
When I left Sun to go to NeXT, I thought Objective-C was the coolest thing since sliced bread, and I hated C++. So, naturally when I stayed to start the (eventually) Java project, Obj-C had a big influence. James Gosling, being much older than I was, he had lots of experience with SmallTalk and Simula68, which we also borrowed from liberally.
post #53 of 78
Quote:
Originally Posted by Oh-es-Ten View Post

I agree with you here. I am teaching myself Obj-C (coming from a web dev background in PHP) and although I could get something made fairly easily by cutting and pasting sample code from forums etc., I am starting at the beginning by learning the basics, as it is a huge change in methodology and I want to be able to see my problems without asking for help. At the moment I am using the Goldstein 'For Dummies' book, which has been quite good, as well as following some of the excellent video tutorials at icodeblog. These together are making this less painful than it would be by just slapping something together...

Keep it up, it's worth it. I was probably a little harsh on the web tutorial crowd; I also started out mostly doing PHP/MySQL type stuff, but it wasn't until I sat down with a PHP/MySQL book that I had any idea what was going on. It's not that it's a bad thing to learn quickly and get things going. But if you want to understand what you're doing and create amazing things, there's so much out there. Much better than most things I've found on the web (although there are some gems). Also, some advice since you're interested in Objective-C and Cocoa development: I leafed through the For Dummies book, and while it looked decent, the projects you'd be creating by the end of it didn't seem that exciting.

If you're interested, check out the Apress series of Mac programming books: Learn C on the Mac, Learn Objective-C on the Mac, and Beginning iPhone Development. Even if you're not interested in the iPhone, the patterns are similar, and it's good practice to work on a memory-limited device. You can also get the SDK for free. They're coming out with a counterpart Cocoa book in September if you're more interested in that. They're just as approachable as the Dummies book and I guarantee you can get through them.

Don't skip the Learn C on the Mac book; it's a very good book and you'll become a lot more comfortable with pointers, structs, and manual memory management. Learn Objective-C on the Mac also has the best treatment of memory management in Objective-C that I've seen (better than Hillegass), and its object-oriented introduction is also pretty good. If you're a PHP guy, also check out Zandstra's PHP Objects, Patterns, and Practice. The first third is fantastic (the rest is good, too, but not as relevant if you're just reading it for the object-oriented stuff) and gives a very convincing argument for objects if you're not already convinced.

Finally, read Hillegass' Cocoa Programming book. It's considered the gold standard and you have to read it eventually. Personally, though, I tried to get through it a few times and found that you really need a foundation in Objective-C and C first to get the most out of it. So unless you can understand most of it on your first pass through, start with the Apress books. Some good supplements are Kochan's Objective-C 2.0 book and of course The C Programming Language. And don't get that old Apple O'Reilly Cocoa book. It's terrible. Just some friendly advice.
post #54 of 78
Quote:
Originally Posted by melgross View Post

You're attempting to twist it the way you want. No, you're not correct.

And you haven't provided any evidence to show that the "other" 50% is substantially different. Until you do that, you can't assume it is, which is what you're trying to do for whatever reason.

No, I'm not trying to twist anything. Fine, you are right again, just like you are right in every situation. The limited subset of data proves the point you want to follow, fine.

But I have one simple point in this: how can you infer anything useful from the data provided when 50% of the data is unknown?
post #55 of 78
Quote:
Originally Posted by Synotic View Post


If you're interested, check out the Apress series of Mac programming books.

Thanks for your recommendation; I was aware of this learning path, and have started on it but now feel more comfortable that it is a worthwhile one. I don't intend to be a developer; just want a more in-depth understanding of Cocoa.

Nullius in verba -- "on the word of no one"

 

 

 


post #56 of 78
Thanks, Synotic, for helping me out where I was deficient.

I want to add one comment: the influence of Smalltalk on Objective-C is not some obscure, esoteric connection. Brad Cox invented Objective-C to bring Smalltalk into a language that C programmers could use. He was quite open about that in his book.
post #57 of 78
Quote:
Originally Posted by jfanning View Post

No, I'm not trying to twist anything. Fine you are right again, just like you are right in every situation. The limited subset of data proves the point you want to follow, fine.

But I have one simple point in this: how can you infer anything useful from the data provided when 50% of the data is unknown?

Not all the time. I'm also not the only one saying this you know.

The data has value. Retail sales of these books are skewing that way. That is important. Do you think that if retail book sales on various Apple programming topics are soaring, the 50% not covered in this survey are moving the other way?

Are you saying that you think that could be the case? Why would that be the case?

There are courses on iPhone/Touch programming being given at various colleges now that weren't being given before. There are new textbooks on the subject out. There are more developers than ever before on the Apple platforms.

You say that doesn't matter, and isn't an indication that even the unknown book sales numbers in this survey might be rising?

Even if they held steady, then overall, Mac books on programming would be rising in sales.

None of this is logical to you? All you want to do is say that the other 50% isn't known, so it may be very bad? Because that's how what you are saying reads.
post #58 of 78
Quote:
Originally Posted by melgross View Post

Most of those sites are news and technology. Of those, only Ars is really a rumors site as well.

But AI is a rumors site and a news site, and is unabashedly Apple based. What would you expect?

The site is called AppleInsider. I'd expect Apple news. A constant stream of juicy juicy Apple news. Which you do really really well.

What I don't expect is lots of Oracle news, or lots of 3COM news, or IBM news, or Sony, or Samsung, or whoever. And we've been getting lots of MS news recently.

That's why I've been complaining.
post #59 of 78
Quote:
Originally Posted by JavaCowboy View Post

I'm not trying to start an argument, but why was dumping Java a good *technical* decision?
...
You've lost me. First of all, it sounds like you're making a good case for why dumping Java was a good *political* decision: the "software people" from NeXT were clearly in the Objective-C camp, so favouring their language was obviously the best way to retain them. However, you really haven't made the case for why Objective-C was better *technically* for Apple than Java. I suppose one advantage of Objective-C is that while it's an open standard, it's Apple's technology to tailor as it sees fit, as opposed to Java, which follows Sun and the JCP's standards and tends to evolve more slowly due to backward compatibility concerns (not to go off on a tangent, but that's one of the reasons Sun created JavaFX).

However you want to describe the decision, be it technical, programmatic, or business-related (they are all facets of one another anyway), it was a sound decision. They would never have developed iPhone OS X as fast otherwise.

Would the 1st gen iPhone have been as slick and as fast if it was done in Java? Even for Apple, I don't think so. Cocoa, née NeXTSTEP, is the ultimate prototyping environment for UI design. Interface Builder really can't be done without Objective-C, and the NeXTies had almost two decades of experience working with it. I don't think that with Java, even if they already had a wealth of corporate/engineering knowledge of it, they would have been able to create a mobile OS that was as fast, as slick, and as elegant in the time that they had.

Quote:
What's more, I think you're totally wrong on WebObjects. I would argue WebObjects succeeded *because* it's a Java web container, not *in spite of it*. Java has proven itself as a server-side technology and it's been battle-tested in large-scale enterprise applications. This is in stark contrast to the client-side, where Java just never took off and was the victim of questionable design decisions. WebObjects applications, while they can only be developed in OS X, can be deployed on any JEE-compliant container. This gives WebObjects a tremendous advantage over an Objective-C container, since many developers wouldn't adopt it if they had to run it on either proprietary hardware or proprietary application server software.

WebObjects has pretty much ceased to be a developed, vendable product for Apple. I'd say it's been in maintenance mode for over two years, no less. I don't think they even use it for MobileMe; someone correct me if I'm wrong. Now, that's largely because Apple has found its raison d'être in selling premium consumer hardware, not server-side application tools or any programming tools, but my judgement is that if WebObjects had stayed Obj-C it would still be alive, kicking, and a prominent part of OS X Server.

Quote:
Well, I'd argue that both were influenced by Smalltalk, but that's beside the point.

Yes. Influenced by Smalltalk too, but many of the original Java developers (the people who developed the language and the development hardware) were ex-NeXT employees! If it weren't for NeXT's hardware business cratering, Java might never have come out of incubation.
post #60 of 78
I think the reason for dumping Java from Cocoa was a practical one.

Almost all of the sample code that came out when Cocoa was introduced was in Objective-C. To many developers, sample code is mother's milk. The Java baby just starved to death. Same thing with the books.

I remember in Aaron Hillegass' book, there was one chapter on Java (out of over 20, I believe), and it started with a message that said something like: you should really code in ObjC, but if you absolutely insist on using Java, here's what you do.

The mailing lists quickly became almost exclusively ObjC, because the experts who could really help you were former NeXTies who grew up with the language.

I guess if there had been a group of dedicated Java programmers who made sure every piece of sample code Apple ever released in ObjC was duplicated in Java, the outcome might have been different. But there wasn't, and it wasn't.
post #61 of 78
Quote:
Originally Posted by badNameErr View Post

The site is called AppleInsider. I'd expect Apple news. A constant stream of juicy juicy Apple news. Which you do really really well.

What I don't expect is lots of Oracle news, or lots of 3COM news, or IBM news, or Sony, or Samsung, or whoever. And we've been getting lots of MS news recently.

That's why I've been complaining.

But this is all interrelated.

Take the iPod as an example. When Apple introduced iTunes, it took a while before sales became really meaningful. MS had Plays for Sure.

Writers all over began to compare Apple's music and player sales to those from companies that used Plays for Sure. As Plays for Sure began to fail, that was reported widely.

That shouldn't have been reported here as part of Apple's success? Of course it should have. It was meaningful. Different methodologies between two companies that had been rivals for many years.

When MS countered with the Zune, and Ballmer stated quite baldly (as he seems to do every time Apple comes out with something new, or MS comes out with something to counter it) that it would sell so well it would knock Apple out of first place, that shouldn't have been reported?

When the Zune failed to gain any traction, that shouldn't have been reported?

Why not? Ballmer as much as challenged Apple.

Same thing with the iPhone. Ballmer said it would fail to get more than a percent or so of the smartphone market, and that Win Mobile would be on many more phones. He said something like 100 million Win Mobile, against a few million iPhones.

That didn't work out so well either. It should be ignored?

I'm not sure I understand. Does Apple function in a vacuum, or are they compared to other firms all over, all of the time?

As they are, shouldn't that be mentioned here as well?
post #62 of 78
Quote:
Originally Posted by melgross View Post

I'm not sure I understand. Does Apple function in a vacuum, or are they compared to other firms all over, all of the time?

As they are, shouldn't that be mentioned here as well?

Don't you think that all the other websites already do that job perfectly well?
I don't see why you guys need to do that too.

I guess what I'm saying is that (and I admit that this could just be me) I really couldn't care less about any of that "stuff". I certainly don't care about how well the Zune is doing!

If I want to read about the Zune I'll go to engadget or wherever.
post #63 of 78
Quote:
Originally Posted by badNameErr View Post

Don't you think that all the other websites already do that job perfectly well?
I don't see why you guys need to do that too.

I guess what I'm saying is that (and I admit that this could just be me) I really couldn't care less about any of that "stuff". I certainly don't care about how well the Zune is doing!

If I want to read about the Zune I'll go to engadget or wherever.

So, other websites shouldn't mention Apple because it's done here, Macworld, MacLife etc?

Linux should only be mentioned on Linux sites. Windows only on Windows sites, etc?

You know that's what you're saying.
post #64 of 78
Quote:
Originally Posted by melgross View Post

Not all the time. I'm also not the only one saying this you know.

The data has value. Retail sales of these books are skewing that way. That is important. Do you think that if retail book sales on various Apple programming topics are soaring, the 50% not covered in this survey are moving the other way?

Are you saying that you think that could be the case? Why would that be the case?

There are courses on iPhone/Touch programming being given at various colleges now that weren't being given before. There are new textbooks on the subject out. There are more developers than ever before on the Apple platforms.

You say that doesn't matter, and isn't an indication that even the unknown book sales numbers in this survey might be rising?

Even if they held steady, then overall, Mac books on programming would be rising in sales.

None of this is logical to you? All you want to do is say that the other 50% isn't known, so it may be very bad? Because that's how what you are saying reads.

OK, the populations of the USA and Europe are roughly the same.

Let's take the sales of iPhones compared to other devices in Europe and infer the US sales from that.
post #65 of 78
Thanks for the detailed post. I'm still digesting it and will have to read it several times to process it. Still, this is the thing I love about mature discussion. There's the opportunity to learn a lot from people with different perspectives.

There's one point I want to make about language support, though. Programming languages evolve constantly, but there are times when a language can't evolve further. This happened with languages such as C and COBOL, and it's happening with Java. The problem with Java in particular is that it's so widely deployed, and so many enterprise applications depend on it. Backwards compatibility is as important in Java as it is in Windows, perhaps more so. That is to say, if the Java language were to add support for closures in Java 7, it would involve significant changes to javac and an ungodly amount of testing to ensure that the compiler was fully backwards compatible. After some bad experiences with the language changes in Java 5, including some silly compromises made with features such as generics and autoboxing, Sun decided that the Java language has only so much room to evolve and chose to support alternative languages within the JVM.
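Those Java 5 compromises are easy to see in practice. A minimal sketch (assuming a standard JVM with default settings) of two of them, type erasure in generics and the autoboxing integer cache:

```java
import java.util.ArrayList;
import java.util.List;

public class Java5Compromises {
    public static void main(String[] args) {
        // Generics are implemented by type erasure: the element type is
        // gone at runtime, so both lists share one runtime class.
        List<String> a = new ArrayList<String>();
        List<Integer> b = new ArrayList<Integer>();
        System.out.println(a.getClass() == b.getClass()); // prints "true"

        // Autoboxing is only guaranteed to cache Integer objects for
        // -128..127, so == behaves differently across that boundary.
        Integer small1 = 127, small2 = 127;
        Integer big1 = 128, big2 = 128;
        System.out.println(small1 == small2); // prints "true" (cached)
        System.out.println(big1 == big2);     // "false" on default JVM settings
    }
}
```

Both behaviors were deliberate trade-offs for backwards compatibility with pre-1.5 bytecode and code, which is exactly the constraint described above.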

New languages like Groovy, Scala and JavaFX make up for many shortcomings in the Java language. Once Objective-C/Cocoa apps are more widely deployed, backwards compatibility will become an issue and the language will eventually stagnate. Also, developer adoption is key. If aspiring Cocoa developers for whatever reason find Objective-C distasteful, but prefer Scala (random example), then Apple would be well served to make a framework that supports alternative languages.

Quote:
Originally Posted by Synotic View Post

JavaCowboy: You've asked a lot of good questions, and generally seem to be more curious than attacking, which I respect. A few comments...
post #66 of 78
Quote:
Originally Posted by Synotic View Post

I'd find this disappointing, although I don't disagree that it's true, or that it's the direction that things are going.

From what I've seen (and I'm biased), most of the crowd that "learns" from scraps of advice on blogs and reading a few overviews tend to only create what you can find on tutorials. The crazy people, the ones who are actually out there making all the tools we're using (engineers at Apple, Facebook, Google, etc...) however didn't learn this way.

Eh? The blogs, code fragments and tutorials are all starting points. You learn to code by... coding. Trying things out. Figuring out what works and what doesn't. Figuring out how something works.

The crazy people are the ones coding, not reading books.

Quote:
They either started programming when it was simpler and kept up, or learned through books and school.

Coding has gotten simpler. This is why we can do more than before. But I agree to a point. You need to know the foundational elements, and school is a good place to learn that stuff: data structures, algorithms, fundamentals of languages, compilers, and OS design. That puts a lot of things in a solid context.

Not that you can't learn these things outside of school but school provides a good framework of what to learn in what order.

Quote:
I don't think this is the only approach, but when you're doing things like multithreaded programming, networking (like creating Facebook chat which scales to however millions of users they have) a tutorial or two on the web isn't going to cut it.

The principles of programming you learn from a textbook and school. These foundational elements haven't changed so much as been added to. However, scalability is not so much taught as apprenticed. Yes, there are fundamental principles you can apply, but a lot of the nuts and bolts of building hugely scalable systems is learned by screwing it up a couple of times, or learning from someone who's screwed it up a couple of times. Or reading a blog about how someone else did it after screwing it up a couple of times.

If I had to build a really scalable chat I'd start with XMPP and work from there. Where'd I learn that? Off the internet then working with XMPP for a while.

Quote:
Also I think it's important to clarify that a book isn't necessarily something you have to hold in your hand. I admittedly haven't bought as many physical computer books lately, but rather do most of my reading on a Safari Books subscription. I do think that the era of the physical book is over.

The problem with books is that they take time to write, and often by the time one gets published the APIs have changed or will change within a couple of months.
post #67 of 78
Quote:
Originally Posted by Synotic View Post

I'd find this disappointing, although I don't disagree that it's true, or that it's the direction that things are going.

From what I've seen (and I'm biased), most of the crowd that "learns" from scraps of advice on blogs and reading a few overviews tend to only create what you can find on tutorials. The crazy people, the ones who are actually out there making all the tools we're using (engineers at Apple, Facebook, Google, etc...) however didn't learn this way. They either started programming when it was simpler and kept up, or learned through books and school. I don't think this is the only approach, but when you're doing things like multithreaded programming, networking (like creating Facebook chat which scales to however millions of users they have) a tutorial or two on the web isn't going to cut it.

I agree and also strongly disagree. I went to uni to learn programming and it gave me a good base understanding as well as good architecture knowledge. However, anything I learnt about actual languages is irrelevant, as the programming world moves too fast.

As someone else has already mentioned, blogs and videos are often just a starting point which you're going to build and experiment from. Not to mention the greatest resource of all (at least for Microsoft developers): the MSDN Library. No book is ever going to come close to the amount of information in there. For a really serious developer that's experimenting, creating something new rather than following a guide in a book or a blog, having an online library of all the functions and classes is unbeatable.

Keeping up with programming, I think, is also only possible through blogs and other internet resources. Microsoft releases blog posts all the time, and betas of all the new bits of functionality they're working on; in many cases video tutorials have been released before the product has even shipped. No book is ever going to keep you up to date in that way, as it will have a 6-12 month delay waiting for the product to actually come out, then for someone to use it thoroughly enough that a book could be written, and then to actually write the book.
post #68 of 78
Quote:
Originally Posted by jfanning View Post

OK, the populations of the USA and Europe are roughly the same.

Let's take the sales of iPhones compared to other devices in Europe and infer the US sales from that.

Ok. Sales in those other countries of Apple's phones and computers are rising as well.
post #69 of 78
Quote:
Originally Posted by melgross View Post

Ok. Sales in those other countries of Apple's phones and computers are rising as well.

But they have sold sfa using your theory
post #70 of 78
Quote:
Originally Posted by jfanning View Post

But they have sold sfa using your theory

Excuse me? sfa?
post #71 of 78
Quote:
Originally Posted by melgross View Post

Excuse me? sfa?

Oops, sorry, I thought that saying was more widely known...

Sweet f**k all
post #72 of 78
Quote:
Originally Posted by melgross View Post

So, other websites shouldn't mention Apple because it's done here, Macworld, MacLife etc?

Linux should only be mentioned on Linux sites. Windows only on Windows sites, etc?

You know that it's what you're saying.

Now you know it isn't.

But whatever happened to the idea of specialized reporting?
I can understand why AI wants to expand its market - but I think it just dilutes the appeal of the site. You become like everybody else - so what's the point?

Anyway, somehow I don't think I'm going to convince you!
But I'll keep reading. I'm a fan.

(But if you start posting articles about Oracle's financial results I'll really complain!)
post #73 of 78
Quote:
Originally Posted by jfanning View Post

Oops, sorry, I thought that saying was more widely known...

Sweet f**k all

I don't use those words as much as you might.

But we're talking about TRENDS.

Apple has only a small market share in computers here as well, though not quite twice as large as in the rest of the world.

But the number of these books sold at retail is now more than half, I think the article says. That means that a lot of people are interested in programming for them. Likely, many of these people already program for Windows machines and other OSes.
post #74 of 78
Quote:
Originally Posted by badNameErr View Post

Now you know it isn't.

But whatever happened to the idea of specialized reporting?
I can understand why AI wants to expand its market - but I think it just dilutes the appeal of the site. You become like everybody else - so what's the point?

Anyway, somehow I don't think I'm going to convince you!
But I'll keep reading. I'm a fan.

(But if you start posting articles about Oracle's financial results I'll really complain!)

You know, we have 63,000 members. Those numbers are increasing.
post #75 of 78
Quote:
Originally Posted by Synotic View Post

JavaCowboy: You've asked a lot of good questions, and generally seem to be more curious than attacking, which I respect. A few comments...

I suspect you're referring to the fact that Objective-C uses message passing with named parameters, rather than method calls. So instead of something like this:
Code:

firstHeader.style("red", 32, true, false);

In Objective-C you'd have something like this:
Code:

[firstHeader styleWithColor:@"red" fontSize:32 bold:YES italics:NO];

Admittedly the brackets everywhere and the at signs can be a little disconcerting at first, but what I've always found a little interesting is that this really isn't different from, say, how jQuery specifies parameters or how WordPress uses query strings in its internal functions. A few examples:

JavaScript
Code:

$("#firstHeader").css({'color': 'yellow', 'font-size': '32px', 'font-weight': 'bold', 'font-style': 'italic'});

PHP/Wordpress
Code:

style('firstHeader', 'color=yellow&fontSize=32&fontWeight=bold&fontStyle=italic');

Both of which are considered (again, my biases are creeping in) to be very enjoyable environments to code in. Neither JavaScript nor PHP has named parameters built in, but the developers found it useful enough that they rolled the functionality themselves. Objective-C just has it built in. Personally, it's one of my favorite parts of using Objective-C and jQuery. But I can understand that it's more to type and can seem overly verbose. A few things make this less of an issue. The first is that Xcode's code sense is really, really good. Here's an example:



I've typed about three characters, and it can already present possible completions. If it's not what I'm looking for, I can do control-period to cycle through the options, or hit escape to pull up a contextual menu of possible completions. Or I can keep typing whatever else I intended. The dashed line underneath lets me see where else in the code I've used that message call. But what's particularly awesome is that I don't have to hit tab and then fill in the fields: if you do control-/ (or control-shift-/ to go backwards) you can cycle through the tokens and fill in the parameters. Another thing: I typed the opening bracket, but even that isn't necessary. As soon as you type the closing bracket, it'll close off the initial one for you. I end up typing far fewer characters and can actually understand and see what the parameters mean, compared to when I'm working in something like Eclipse.

Just out of curiosity, do you primarily do backend Java development, servlets, or GUI applications? I mostly do backend-type stuff and servlets, so please excuse me if the GUI tools require significantly less code. From what I've found, however, even for something like the Spring Framework, which is meant to simplify development, there's still a lot of very painful boilerplate code required. For example, if you want to pass anything more than a string of data to a template, you have to wrap it up in a model object, write its accessor methods, and then pass it to the template.

Header files are an interesting case. They're present in all the C-based languages I'm familiar with (I'm not sure about C#), while Java is one of the few (compiled) languages that doesn't have them. The reason they exist is that in order for code to link against compiled libraries, it needs to know what variables and functions are available to link against, since all the actual code is machine code. How does Java get around this? It links the filename to the class name (or interface or whatever). This actually isn't the worst idea (and is the de facto standard in Objective-C projects anyway), but it does have a few implications. One is that you can only have one class or interface per file. This might not seem like a bad thing, but sometimes you need to create helper objects (like Iterators) or want to define an interface that objects in your collection class have to conform to. This isn't impossible in Java, but then you have to resort to inner classes, and because Java is strongly typed, you have to deal with properly typecasting things as necessary and properly initializing your inner classes. This can be painful.
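As a sketch of the helper-object situation described above (IntBag is a hypothetical collection, invented for illustration), the Iterator helper typically ends up as an inner class, here an anonymous one, precisely because each top-level class wants a file of its own:

```java
import java.util.Iterator;
import java.util.NoSuchElementException;

// Hypothetical collection; its Iterator helper lives inside it rather
// than in a file of its own.
public class IntBag implements Iterable<Integer> {
    private final int[] items;

    public IntBag(int... items) { this.items = items; }

    public Iterator<Integer> iterator() {
        return new Iterator<Integer>() { // anonymous inner helper class
            private int i = 0;
            public boolean hasNext() { return i < items.length; }
            public Integer next() {
                if (!hasNext()) throw new NoSuchElementException();
                return items[i++]; // autoboxed int -> Integer
            }
            public void remove() { throw new UnsupportedOperationException(); }
        };
    }
}
```

Iteration then works with the for-each loop (`for (int v : new IntBag(1, 2, 3)) { ... }`), but note how the helper has to be nested and how the primitive ints get boxed on the way out.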

Again, this isn't an argument for header files, but as I mentioned Xcode helping with method names, Xcode also makes creating new classes painless. Your header files will be created for you and your implementation files will properly reference them. You can write the code yourself or have it create UIView, or UITableViewController subclasses with accompanying NIB files, if you want. I've found it to be incredibly helpful to be able to jump through different header files and get a sense of the overall class. It's a form of at-a-glance documentation that can be hard to get with Java without generating automatic documentation (which you can also do in Xcode and also view your custom documentation within its reference library).

I think this question might be due to unfamiliarity. Type-checking using proper types in Objective-C is de facto and expected. The only time that you don't specify an explicit type is with delegate objects (objects to which you can delegate responsibility) and for objects in collection classes. You're probably familiar with the delegate pattern, but as a simple example, you might need to do an asynchronous URL request in code. In Java, the typical pattern would be to subclass whatever class is available (something like URLConnection, I'm not sure), and then override the methods that deal with the response and any other error conditions. Then you'd include this class you created in your main controller code, initialize it with your URL, and you're done. In Objective-C you just initialize an NSURLConnection and set your controller as its delegate (something like urlConn.delegate = self). Then when the URL connection has either succeeded or failed, it will see if you respond to a particular selector (a fancy word for Objective-C's representation of message calls; you can make one like this: @selector(didReceiveURLResponse)), and if you do, send the message.

This is why you don't specify an explicit type for delegate variables. But I can tell that checking whether an object supports a message bothers you, so if you want, you can also declare a protocol such as NSURLConnectionDelegate, with required methods, that your controller object conforms to (MyControllerClass <NSURLConnectionDelegate, SomeOtherDelegate>), and the compiler will tell you when you've sent a message that your object can't understand. And you don't have to check in code anymore. This was less common before, but it's used throughout UIKit.

Finally, to answer a question you asked below, there's no need to "turn on type checking" or to fiddle with settings. It's built in and at the heart of how all Objective-C programs are written. If your code tries to send a message to an object declared as the generic type (id), the compiler will throw a warning (unless the object explicitly supports it via a protocol, or you check whether it responds to the selector as I mentioned above). Similarly with collection classes: it's useful to say that your custom array classes hold ids. It's not necessary to declare an explicit data type, since you're simply holding references to your objects rather than manipulating them or sending messages to them. Since you're a Java programmer, I'm sure you're more than familiar with setting up generic types for your collection types and coercing objects back to their proper types when you need to deal with them. In Objective-C, this is a lot less painful. The general reasoning behind Objective-C's type design is that it's strong where you need it and weak where you don't.
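Here's what that coercion dance looks like on the Java side: a raw (pre-generics) collection is the closest analogue to holding plain "id" references, and it forces an explicit cast on the way out, while a generic collection lets the compiler check for you. The class and method names are my own for the example.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative only: the "coerce it back" dance with a raw collection
// (the closest Java analogue to holding plain "id" references) versus a
// generic one, where the compiler tracks the element type.
public class RawVsGeneric {
    @SuppressWarnings({"rawtypes", "unchecked"})
    public static String viaRawList(String s) {
        List raw = new ArrayList();   // holds plain Objects, like "id"
        raw.add(s);
        return (String) raw.get(0);   // explicit cast required on the way out
    }

    public static String viaTypedList(String s) {
        List<String> typed = new ArrayList<String>(); // compiler-checked type
        typed.add(s);
        return typed.get(0);          // no cast needed
    }
}
```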

I suspect you're right about the debugger. For the simple use cases I have (setting breakpoints and analyzing variables), it works more than well enough for me, but I imagine it could be better. No promises, but I wouldn't be surprised if an upcoming version of Xcode had a fantastic new debugging feature. Check out more recent versions of Xcode; the code sense and code generation these days are very good.

Projects at Apple aren't written without unit testing. OCUnit has been around since before Mac OS X and has been integrated into Xcode since 2005. Check out this post by Bill Bumgarner, one of the engineers at Apple who wrote Core Data.

Option-double-click on any symbol or keyword in your app and it'll pull up a quick-reference window [screenshot omitted] with links to sample code, related documents, the header file, and the full documentation. Also, I've found Apple's programming guides to be excellent.

See my examples above. I think it's worth the weirdness.

Not to disappoint you, but from what I've seen, I don't think Apple is looking to push Python and Ruby for Mac or iPhone development. I think you'll find, if you spend enough time with it, that the difficult part isn't the language (I actually find the language to be one of the less important things; I just happen to like it), but rather the Cocoa framework. Becoming, as Aaron Hillegass calls it, a stylish Cocoa programmer is difficult. And the Ruby and Python projects are just Cocoa scripting bridges; you still have to learn Cocoa. The ability to use Python instead (with a performance hit) is mostly a comfort thing. I'm not knocking the projects. I had a lot of fun with AppleScript Studio back in the day, which is built on a similar premise, but it's not really appropriate for larger or more complex projects.

Also, I think that Apple has already increased developer adoption and it's already taken off. Its developer conference has sold out for the past few years, and sales of books about Objective-C programming have apparently gone up over 500% (!). Take a look at Objective-C again and I think you might be surprised. You do have to write a lot of classes in order to do what you want, but no more than in Java. Together they add up to a seemingly large number of files, but that's because the code is object-oriented, and that's the Object-Oriented Way. If you don't have a lot of objects (and are instead using things like switch statements and constants), you're probably doing something wrong. I've generally found that Cocoa rewards you in the cases where you abstract out your functionality and create additional classes. Java, on the other hand, often requires you to create a separate class to do seemingly simple things, which can be frustrating. I've already typed far too much, but one last controversial example:

Java:
Code:

// java.lang.String is final, so you can't actually subclass it; the
// closest Java equivalent is a separate utility class.
public final class ReverseString {
    public static String reverse(String s) {
        // implementation
    }
}

Objective-C:
Code:

@interface NSString (ReverseAdditions)
- (NSString *)reverse;
@end

@implementation NSString (ReverseAdditions)
- (NSString *)reverse {
    // implementation
}
@end

In Java, String is final, so you can't even subclass it; whenever you want your new reverse method, you have to route every string through a separate utility class. And even where subclassing is allowed, the same problem shows up: if the collection class you're passing your object to only accepts the base type, you have to cast your subclass to put it into the collection and cast it back out when you remember that it has your extra method. It's true that Java has strong typing, but when you're forced to cast constantly, the type information it enforces becomes less meaningful. In Objective-C you can just open up NSString with a category, add a reverse method, and all strings (even string literals) will have your additional method. You can do similar things in Ruby, or in JavaScript via the prototype chain. Anyway, that's all I have to say for now.
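As an aside, here's a compilable take on the Java side of the example. Since java.lang.String is final and can't be subclassed, a static utility method is the realistic shape, and StringBuilder already provides the reversal.

```java
// java.lang.String is final, so a subclass isn't possible; a static
// utility method is the realistic Java shape for the reverse example.
public final class ReverseString {
    private ReverseString() {} // no instances; it's a pure utility class

    public static String reverse(String s) {
        // StringBuilder reverses its character sequence in place.
        return new StringBuilder(s).reverse().toString();
    }
}
```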

--

An update for JavaCowboy: it's true that Java borrowed from Smalltalk, but so did Objective-C (and Java's interfaces were taken directly from Objective-C's protocols). Here's a good post by one of the designers of Java. An excerpt:

A really long post that made me strain my eyes, but really useful too. Thanks!
post #76 of 78
One thing I realized while Googling is that Objective-C does not have closures, and will only be getting them along with Snow Leopard. They'll be called "blocks":

http://landonf.bikemonkey.org/2009/0...Phone.20090703

Yeah, I know that Java doesn't have closures either (and won't get them in Java 7), but it strikes me as odd that a language claimed to be more flexible than Java doesn't yet have a feature that languages such as Scala, Ruby, Perl, and Python have had for years.
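For reference, the standard Java workaround for the missing feature is an anonymous inner class capturing a final local variable, which is what blocks (and real closures) replace with a single expression. The interface name here is invented for the sketch.

```java
// What Java programmers write instead of a closure: an anonymous inner
// class capturing a final local variable. A true closure (like
// Objective-C's upcoming blocks) expresses the same thing inline.
public class ClosureWorkaround {
    interface IntFunction {
        int apply(int x);
    }

    // Returns a function that adds a captured offset to its argument.
    public static IntFunction adder(final int offset) {
        return new IntFunction() {
            public int apply(int x) {
                return x + offset; // "offset" is captured from the enclosing scope
            }
        };
    }
}
```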

With the addition of properties and garbage collection in Objective-C 2.0, why the sudden push to add features to the language, and why didn't NeXT and Apple add more features to it before now?

Just curious.

Quote:
Originally Posted by JavaCowboy View Post

Thanks for the detailed post. I'm still digesting it and will have to read it several times to process it. Still, this is the thing I love about mature discussion. There's the opportunity to learn a lot from people with different perspectives.....
post #77 of 78
Quote:
Originally Posted by melgross View Post

Some of the most popular articles on the web's two biggest PC sites, PCMag and PCWorld, are about Apple, Macs, iPods, and iPhones.

While most of these are positive, some are negative. Should they stop their coverage?

Ars Technica, another big news/rumors/fan site, has had extensive coverage of Apple going way back. Some of those articles are very critical, and over the years, though less so now, posters were very Windows-oriented and very derisive about anything Apple.

Anandtech, another mostly Windows tech site, has extensive Apple coverage, which has increased over the years, particularly since Anand himself moved to a Mac laptop and a Mac Pro as his own mainstay computers.

Most of those sites are news and technology. Of those, only Ars is really a rumors site as well.

But AI is a rumors site and a news site, and is unabashedly Apple based. What would you expect?

If you want more objective, news-oriented information on a Mac site, then go to Macworld instead.

But do you really want that, or do you want to come here to complain about Windows coverage?

I suspect the latter.

Oh. MacRumors too.
post #78 of 78
Quote:
Originally Posted by badNameErr View Post

Don't you think that all the other websites already do that job perfectly well?
I don't see why you guys need to do that too.

I guess what I'm saying is that (and I admit this could just be me) I really couldn't care less about any of that "stuff". I certainly don't care about how well the Zune is doing!

If I want to read about the Zune I'll go to engadget or wherever.

You wouldn't want to see Steve Ballmer on an Apple Rumor site.