
Posts by NormM

A large leap, I'd say. If this name is right, this is not the iPad 2HD or iPad 2S; it's the iPad HD. They consider HD a big enough feature to name the device after it alone. The Retina display was not a minor feature when it appeared in the iPhone 4 -- that was not an incremental upgrade. This is like saying the MacBook Air was a minor upgrade because Apple named it after just one feature, its light weight.
In response, jragosta posted a quote from the Wall Street Journal: Since the earlier agreement was a purchase and sale agreement, according to my understanding of the word "bought," this third-party analysis says the court ruled the trademark was bought but not delivered.
I assume the article is quoting an Apple spokeswoman saying Proview "refuse to honor" the agreement. The writing could be clearer.
If these displays are being shipped to South Korea, I think that's pretty good evidence they aren't for the iPad 3.
All the current Airs come with Thunderbolt connectors, so I assume you would use that for imaging.
According to Apple's numbers, they paid out $2 billion to developers last year. If the top 100,000 got most of the money, that's an average of around $20,000 each. Some must have gotten more and made a good living, and some must have just augmented other income. But it's still a lot of jobs, and obviously much more than your initial estimate of 2,000. And this estimate doesn't include any Apple employees, which I assume would be included in the 43,000 US employee number...
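Just to make the arithmetic explicit, here's a quick back-of-the-envelope sketch in Python; the $2 billion and 100,000 figures are the only inputs, and the 100,000 split is my assumption, not Apple's:

```python
# Back-of-the-envelope check: Apple's reported $2 billion in developer payouts,
# assumed (for this estimate) to be split mostly among the top 100,000 developers.
total_paid = 2_000_000_000        # dollars paid to developers last year (Apple's figure)
top_developers = 100_000          # assumed size of the group that received most of it

average_payout = total_paid / top_developers
print(f"${average_payout:,.0f} per developer on average")   # prints: $20,000 per developer on average
```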
This American Life broadcast part of this show recently. The second part of the radio program was independent fact-checking of part 1, and was actually pretty favorable toward Apple.
As long as there is enough information in the captured pixels to produce a crude depth map for an all-in-focus image (i.e., roughly how far away the thing captured by each pixel is), then a blur filter applied to parts of the in-focus image can do a pretty good simulation of controlled depth of field. Computational imaging techniques seem to be just getting started, and I doubt the Lytro idea is efficient enough in its use of available light to be the future.
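Here's a minimal sketch of what I mean, in Python with numpy/scipy; the depth map, focus depth, and blur levels are all made-up inputs for illustration, not anyone's actual pipeline:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def simulate_depth_of_field(image, depth_map, focus_depth, max_sigma=8.0):
    """Blur each pixel in proportion to its distance from a chosen focal depth.

    image: float array, (H, W) or (H, W, 3), an all-in-focus capture.
    depth_map: float array, (H, W), crude per-pixel distance estimate.
    focus_depth: depth value that should stay sharp.
    max_sigma: blur applied to the pixels farthest from focus_depth.
    """
    # Normalized distance of every pixel from the focal plane (0 = in focus).
    dist = np.abs(depth_map - focus_depth)
    dist = dist / (dist.max() + 1e-9)

    # Precompute a small stack of progressively blurred copies of the image,
    # then pick, per pixel, the copy whose blur matches that pixel's distance.
    n_levels = 6
    sigmas = np.linspace(0.0, max_sigma, n_levels)
    stack = [image if s == 0 else
             gaussian_filter(image, sigma=(s, s) + (0,) * (image.ndim - 2))
             for s in sigmas]

    level = np.clip((dist * (n_levels - 1)).round().astype(int), 0, n_levels - 1)
    out = np.empty_like(image)
    for i in range(n_levels):
        mask = level == i
        out[mask] = stack[i][mask]
    return out
```

A discrete blur stack like this is crude (real lens blur varies continuously and occludes across depth edges), but it's enough to show that a depth map plus an in-focus image gets you most of the "shoot now, focus later" effect.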
There have been cellphone cameras for more than five years that have used wavefront coding for extended depth of field (no need to focus -- everything is always in focus after post-processing of the sensor's light field information). In fact, there were rumors that Apple would use one of OmniVision's "TrueFocus" wavefront coding sensors in the original iPhone in 2007, but Apple used a different OmniVision chip instead. I'm not sure why the new light field stuff is getting...
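As I understand it, the post-processing step in wavefront coding is basically deconvolution with a single depth-invariant point-spread function, since the phase mask makes the blur look the same at every depth. Here's a generic Wiener-filter sketch of that idea (not OmniVision's actual algorithm; the PSF would have to come from the specific phase mask design):

```python
import numpy as np

def wiener_deblur(coded_image, psf, noise_to_signal=0.01):
    """Recover an everything-in-focus image from a wavefront-coded capture.

    Grayscale, frequency-domain Wiener deconvolution with one known PSF.
    """
    H = np.fft.fft2(psf, s=coded_image.shape)   # PSF spectrum, zero-padded to image size
    G = np.fft.fft2(coded_image)                # spectrum of the blurred capture
    # Wiener filter: H* / (|H|^2 + k) suppresses noise where the PSF has little energy.
    W = np.conj(H) / (np.abs(H) ** 2 + noise_to_signal)
    return np.real(np.fft.ifft2(G * W))
```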
Apple is using exactly the same atoms and molecules as everyone else. They didn't invent a single new atom. So what is this nonsense about others copying Apple's MacBook Air? Apple just markets the same old atoms better to its deluded tiny fanbase.