Apple's secrecy hampering AI development, report says
Apple's intense levels of secrecy are preventing the company from making real leaps in the quality of its artificial intelligence, a report said on Thursday.

The company, for example, kept a low profile at last year's Neural Information Processing Systems conference, to the extent that its representatives wouldn't admit whom they worked for unless asked, Bloomberg observed. The company has also published zero AI research papers so far, even though researchers at rivals like Google submit them on a regular basis.
People on the company's AI teams are told to lock their offices whenever they leave, and are kept in the dark about what similar internal teams are doing, sources said.
Secrecy may also reportedly be scaring away potential hires, who could be worried about factors like work freedom and the ability to stay involved in the scientific community. Without the ability to publish papers, it can be difficult to maintain status in the research world and build on ideas.
Google in fact has a new residency program dedicated to AI research and publication, geared to draw in yet more experts.
Apple is at least hiring dozens of new people for AI-related work, and acquiring companies like Perceptio and VocalIQ. Rumors from AI researchers suggest that Apple may soon finally publish a research paper, but nothing else is known.
AI is quickly becoming essential to Apple. Its Siri voice assistant is found across iOS devices and the Apple TV, and work is believed to be underway on self-driving car technology. The latter will require synthesizing various branches of AI with absolute attention to detail, since mistakes could result in injuries or even death.

Comments
So it's an editorial, not "news."
Honestly, this is nothing new for Apple; it was not unusual for Apple employees to go to trade shows and conferences registered as working for fake companies. The reason this is an issue is that most people in AI were stuck in some university all these years, since no one would hire them, since nobody really knew what to do with it. They are so used to taking other people's ideas and running with them, or sharing their ideas when they could not figure out what to do next. Now they're being told not to share anymore and they do not know how to function.
The Bloomberg article claims Apple's secrecy "hurts" its AI software development but then presents mere opinions to back that up. There's no evidence presented that proves any AI endeavors have been harmed by secrecy.
Any scientist will tell you that science does not exist in a vacuum. What do scientists do to get ideas? They read papers. What do they do when they're done? They publish papers. Isaac Newton knew this 340 years ago, with his "standing on the shoulders of giants" remark.
Even scientists who work on classified research publish and share ideas. The government sponsors classified journals and conferences.
Unlike your comment, the Bloomberg article relies on common sense to those in the field.
Not sure about these whispers and twittering, but it does seem like Siri is improving at a glacial pace while Google Now and Cortana are able to do some really interesting things.
It is modern liberal science. You take your position as fact and then massage the data to fit your perspective. If the data doesn't fit, well, throw out anything that might indicate that you are wrong.
Correct. My post relies on understanding the difference between fact and "common sense." I demand facts. And so do scientists.
The Bloomberg article depends solely on "common sense," something that I regard as an editorial, not news.
You may want to step back from trying to use the logical fallacy of appealing to common sense. I assure you, it won't work on me.
http://scienceornot.net/2014/02/06/the-appeal-to-common-sense-garbage-in-the-guise-of-gumption/
I'm hearing/feeling/thinking the same way. Cortana for sure is getting some good press, although I cannot verify for myself.
What's going on here? Apple lagging in Siri development or black ops by Google & MS?
Edit: . . . or maybe, to put words in S Newton's mouth: "This is just news media frantically trying to be relevant - scribbling copy, rather than checking facts".
Revisit the IBM partnership to include deep integration of IBM Watson into Siri and other Apple services. Problem solved.
You're very obviously not a scientist. Go talk to one and ask them about how scientific consensus occurs. People publish a bunch of opinions until it becomes scientifically accepted. That's all. And 25-50 years later when new evidence arises, all of that stuff will be seen as wrong. If you want to know the truth, science will never give it to you.
Example: Newton's laws of motion are wrong. Einstein came along and showed they are broken. 200 years of scientists were mistaken.
This story is lame. It makes a claim, but doesn't provide one example of how secrecy actually hurts Apple when it comes to AI. First, why would Apple write papers on AI? Apple doesn't want to share its findings with competitors. Look at Webkit. Apple contributed tons of open source information to that, only for Google to take all of that information and break away to create its own competing web browsing engine.
Second, why would well-paid, well-treated employees care about Apple's secrecy requirements? All companies keep trade secrets. I suspect some people want to work for Apple for this reason. The mystique likely increases Apple's ability to hire great people.
It's the complete opposite. If you want to move to a new job as a scientist, you have to show your publication record, and show that those publications made a big impact in the field. If you can't publish, the good scientists will avoid that résumé stain like the plague. Google, Microsoft Research, Xerox PARC, HP Labs, IBM Research, and the remains of Bell Labs have no problem publishing; in fact, the latter three have their own in-house journals.
Personal example: right out of grad school I was offered a job at a Department of Defense lab, but the stuff I would be working on would be subject to publication restrictions. Because of this, I turned them down, even though they offered 80% more money (really about 50% with the cost of living difference) than the job I took.
Yup. What the liberal education system teaches. You are the expert in everything and entitled to your correct opinion, even if you aren't at all qualified.
But nowhere does it say Apple's AI researchers aren't going to conferences and reading papers to get ideas. In fact, the article confirms they are. All it says is that they aren't publishing their own work because it's secret. It may be hampering progress outside of Apple, and it may even limit Apple researchers' future job prospects, but there is no evidence at all that it's hampering Apple's own progress.
I can see where researchers might feel they have their hands tied if they're not permitted to share or consult with other academics and researchers in the field. Heck, I've consulted with my own competitors on certain issues that required a couple of good heads put together to solve a problem. In each case we both benefited, too.
Now whether Apple has a problem with secrecy and research no one outside of Apple would know. I agree the article is supposition.
What are examples of things Google Now and Cortana can do that Siri can't?
I don't believe a damn thing Bloomberg says.
Looks like a reporter couldn't find a good source for info about Apple AI and that hampered development of their story. So we got this.