AppleInsider › Forums › Mac Hardware › Future Apple Hardware › Apple's multi-touch technology seen spawning "mega-platform"

Apple's multi-touch technology seen spawning "mega-platform" - Page 4

post #121 of 199
Quote:
Originally Posted by melgross View Post

As far back as you can go implies all the way to the back of the desk. From a normal seating position, that would give 30", as I stated. Since I've not seen people push their monitors to the back of their 30" desks, it does sound unusual.

But, now you state that it isn't as far back as it can go.

So, fine.

It's 2" from the back of the desk. What, are you insane? Is the monitor supposed to levitate off the back of my desk so the FRONT of the screen can be 30" from the front of the desk? Fine, add 2" to all the calculations and you still come nowhere near the limitations of the human eye.

Or do you have some LCD monitor that is 0" thick?

Your original quote: "What a lot of people don't realise is that LCDs save no room at all on a standard depth desk."

Now I know why you think that...to "save room" in your book you need to occupy zero space.

Quote:
I love those definitive charts and calculators. The problem with them is that they only tell part of the story. While it's true that 20/20 vision enables one to resolve one arc-minute, a number I've used myself on these forums more than once, and supplied links to charts as well, it only applies to a black and white pixel array. The closer the grey tones are, the lower the resolution we can see. When the greys (or colors) are close, we can't tell the difference between them unless the size is great. Try it yourself.

Gee...and humans can actually see much finer detail than 1 arc-minute, as has been shown in studies (things like power lines in the distance). Do you want the breakdown by color spectrum as well? Because humans have different resolving capabilities there as well.

This is stupid. If the resolution were close I'd give it to you. It's HALF.
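For what it's worth, the arithmetic both posters keep waving at each other can be checked directly. A minimal sketch, assuming a 30" Cinema Display-class panel with roughly 0.25 mm pixel pitch viewed from 30", and the conventional 1 arc-minute figure for 20/20 acuity (the pitch and distance are my assumptions for illustration, not figures from the thread):

```python
import math

# Back-of-the-envelope acuity check (assumed figures):
# a 30" 2560x1600 panel has roughly 0.25 mm pixel pitch.
pixel_pitch_mm = 0.25
viewing_distance_in = 30
viewing_distance_mm = viewing_distance_in * 25.4

# Angle subtended by one pixel, in arc-minutes.
pixel_arcmin = math.degrees(math.atan(pixel_pitch_mm / viewing_distance_mm)) * 60

# 20/20 vision is conventionally taken as resolving 1 arc-minute of detail.
acuity_limit_arcmin = 1.0

print(f"one pixel subtends {pixel_arcmin:.2f} arc-min")              # ~1.13
print(f"resolvable at 20/20? {pixel_arcmin >= acuity_limit_arcmin}")
```

Under these assumed numbers a single pixel subtends a bit more than 1 arc-minute, so single-pixel features would sit just above the nominal 20/20 threshold at that distance.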

Quote:
These people aren't using these monitors for detailed viewing, certainly not for any length of time, so the occasional "leaning in" works.

These people don't use it for detailed viewing of high resolution photos/graphics at ALL (well, not strictly true, but close enough). The point is not everyone who owns what you deem to be a "high resolution" monitor uses it for photo work or detailed graphics. Neither will users of ANY MT display.

This also has nothing to do with your assertion that LCDs SAVE NO SPACE ON THE DESKTOP.

Which is a pretty stupid assertion to begin with. What space DO they save then? None? Because they're still the same width and height.

Rest assured that a 30" CRT would be a hell of a lot deeper than 10" on my desk. IF it even fit without overhanging so far forward that it tipped over.

Quote:
No, you don't always get it right, but you sure try.

Getting stupid like that seems to be your best shot, though.

Saying LCDs save no space is just bizarre. And to then claim victory because the FRONT of my ACD is 10" from the rear of the desk instead of 8" is even more bizarre. It's not like I can even really get it THAT much further back...the cord cuts into some of that space.

Get it right? Jeez...you aren't even on the same plane of reality on this issue. Stupid is indeed the right word.

Vinea
post #122 of 199
Quote:
Originally Posted by vinea View Post

True, you didn't say MT was better. You said MT (flat, no feedback) keyboards would be no worse than conventional ones. Mkay...the study shows that MT keyboards ARE indeed worse.

Well, just to hassle you a bit, that test wasn't anywhere near long enough to really tell what would happen over a really extended time. As it takes months for most people to reach any sort of comfort, accuracy, and speed on a conventional keyboard, why should we believe that a limited-time study, where people have already become expert on a standard keyboard, would tell us much about using a very different keyboard? These people had to unlearn how to use the standard keyboard when starting to use the MTK, and then learn the MTK, something that is very difficult. A better study would have compared people who never used any keyboard, and had two groups: one learn on the standard model, and the other on the MTK.

That would actually be scientific.

Quote:
You stated: "I find that most keyboards these days have poor enough feedback that typing on a monitor keyboard wouldn't be worse. If the monitor is big enough, that would work for a lot of people. My wife's old Atari 400 had a flat, feedbackless keyboard, and she got used to it. A lot of industrial, factory floor equipment, in critical applications, also use such keyboards, so it's not out of the question."

This study says different. That's ALL I said in the original post. How is that "overstating" my case? You made an assertion that a study says is likely incorrect.

1) Typing on a monitor keyboard IS likely worse based on a study with another kind of zero travel MT keyboard

2) Typing on an Atari 400 feedbackless keyboard was shown to be worse (inasmuch as it's closer to the MT keyboard in question) regardless of whether or not your wife "got used to it".

And yes...they used "proficient" typists because, well...using non-proficient typists in evaluating keyboards would be dumb. 45 WPM is by no means excessive...it's within 1 SD of the median in one study (21.3 to 54.7 WPM - median of 38 WPM). "Hunt and peck" typists can achieve up to 37 WPM.

Yes, 45 WPM is faster than the median but still within "average" and below many WPM minimums for typists.
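The "within 1 SD" claim is easy to check with the numbers as quoted (reading the 21.3-54.7 WPM range as median ± 1 SD is my assumption; the figures themselves come from the post above):

```python
# Check whether 45 WPM falls within 1 SD of the cited median,
# reading the quoted 21.3-54.7 WPM range as median +/- 1 SD.
median_wpm = 38.0
low, high = 21.3, 54.7
sd = (high - low) / 2          # 16.7 WPM

tested_wpm = 45
z = (tested_wpm - median_wpm) / sd
print(f"45 WPM is {z:+.2f} SD from the median")  # about +0.42
```

So 45 WPM sits less than half a standard deviation above the median under that reading.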


Gee, amazing how in every case I quote exactly the words you wrote alongside a piece of information that contradicts them. You choose to turn this into some bizarre evasion of what was a very simple post that said:

"A study showed that it IS worse than your Apple keyboard".

I didn't comment on ANYTHING else you wrote in that post. Any amplification was limited to saying "this study is limited in the usual ways...".

I provided info in a neutral format about something you were misinformed about.

Vinea

Because, what you do is to take what I'm saying out of context, as you often do.

Those of us who were discussing this, before you butted in, were talking about using this for editing in PS, or other similar tasks. That was why we were talking about Multi-Touch and stylus usage together, the idea I brought up. We were talking about limited uses for certain tasks. In THAT context, typing a number, or a word, or naming a file, or typing a short description, would work out fine.

And for that, the 400 keyboard mention was appropriate. I NEVER said that this should substitute for a regular keyboard for long typing tasks, or continued use.

You should read my other posts, before jumping on one thing, and ignoring all the rest.
post #123 of 199
Quote:
Originally Posted by melgross View Post

Well, just to hassle you a bit, that test wasn't anywhere near long enough to really tell what would happen over a really extended time.

Which is a perfectly reasonable response. As I said, the study has all the usual limitations of that kind of study.

Saying that "most keyboards" are not like the conventional keyboard studied is just a weird and very defensive response.

Quote:
A better study would have compared people who never used any keyboard, and had two groups, one learn on the standard model, and the other on the MTK.

That would actually be scientific.

So you're saying that a peer-reviewed paper, published at a professional conference (one of the more significant human factors ones) by a doctoral candidate from Cornell, is "unscientific" in nature?

Yah, okay. How many peer reviewed HCI papers have you written?

Jeez, can we be more condescending?

Quote:
Because, what you do is to take what I'm saying out of context, as you often do.

Those of us who were discussing this, before you bumped in, were talking about using this for editing in PS, or other similar tasks.

That was why we were talking about Multi-Touch, and stylus usage together, the idea I brought up. We were talking about limited uses for certain tasks. In THAT context, typing the number, or word, or naming a file, or typing a short description, would work out fine.

The previous messages were not limited to "PS or other similar tasks". And Onlooker specifically addressed keyboards directly, if you're going for the "whole thread" context, and you yourself "corrected" another poster about the multiple threads of conversation.

This is also only one of several MT threads in the forum and given that I don't live here 24/7 like you, I don't follow every thread unless the title or length piques my interest. This one started out more iPhone centric (not so interested) then moved into applications of MT on the desktop (more interested given that's an area that I contend with).

In any case...IF you were only talking about "casual" use then why talk about feedback and error at all? And you SPECIFICALLY state that "If the monitor is big enough, that would work for a lot of people."

How would you take THAT out of context? Especially with the amplification that your wife got used to the 400 keyboard for whatever her tasks were.

Presumably that included normal typing, under continued use and she would be included under the category of "a lot of people".

Quote:
And for that, the 400 keyboard mention was appropriate. I NEVER said that this should substitute for a regular keyboard for long typing tasks, or continued use.

You said so in the very paragraph under discussion. Come on. Are you saying that you completely miswrote that paragraph? That you didn't think it would be good enough for a lot of people if the monitor was big enough, but wrote it that way by accident?

I mean, most folks don't do Photoshop or use a Wacom tablet, so the only other thing (besides keyboarding) they would use MT for is manipulation of the desktop and some limited iLife usage. Do you regularly write "a lot of folks" when you mean just some niche group?

Quote:
You should read my other posts, before jumping on one thing, and ignoring all the rest.

Would it amaze you if I said that I did? The comment was very limited in scope and applied directly to the paragraph in question. This is the usual run around you give...the thread is still very short. Folks can take a quick gander over the course of the topic.

Do I NEED to comment on EVERY post you make in order to inject a comment in a thread? Oh my...a forum rule that I must have missed...

Vinea
post #124 of 199
Quote:
Originally Posted by vinea View Post

It's 2" from the back of the desk. What, are you insane? Is the monitor supposed to levitate off the back of my desk so the FRONT of the screen can be 30" from the front of the desk? Fine, add 2" to all the calculations and you still come nowhere near the limitations of the human eye.

Or do you have some LCD monitor that is 0" thick?

That's not what I said at all. You said that your monitor was three inches from the back of the desk, not as far back as it could go, as you first said. That may not seem to be much, but unless you sit with your belly pressed against the desk, your head is a good ten inches away, particularly if you use a keyboard drawer.

I just bought my wife a 22" Samsung 225BW monitor. The front of the screen is 7" from the back of the stand. Some monitors are more, some less. So, add 7 to 3, and we get 10. If you sit normally, which I assume you do, we're talking about 30" again, just as I said. If you have to lean in sometimes, as you said you did sometimes, then you need to get closer to see the detail clearly, again, as I said.

Quote:
Your original quote: "What a lot of people don't realise is that LCDs save no room at all on a standard depth desk."

Well, most monitors I see on desks are about twelve inches in from the front, just as they were with CRT versions. I sometimes see them even closer, and sometimes further, but not pushed to the rear, as you say. But, I'm not arguing that you do what you do there, just that I've never seen it anywhere.

Quote:
Now I know why you think that...to "save room" in your book you need to occupy zero space.

Amusing, as always.

Quote:
Gee...and humans can actually see much finer detail than 1 arc-minute, as has been shown in studies (things like power lines in the distance). Do you want the breakdown by color spectrum as well? Because humans have different resolving capabilities there as well.

This is stupid. If the resolution were close I'd give it to you. It's HALF.

It's amazing that you can say some things, then admit that you can't always see the detail at even half the distance yourself, so you have to get even closer.

By the way, resolution for humans is measured in line PAIRS. One dark line (or dot) and one light one. You do have to cut the rez in half.

Quote:
These people don't use it for detailed viewing of high resolution photos/graphics at ALL (well not strictly true but close enough). The point is not everyone that owns what you deem to be a "high resolution" monitor uses it for photo work or detailed graphics. Neither will users of ANY MT display.

And that is why it doesn't matter to them. They don't care if they aren't resolving every pixel.

Quote:
This also has nothing to do with your assertion that LCDs SAVE NO SPACE ON THE DESKTOP.

Go back a few lines and re-read.

Quote:
Which is a pretty stupid assertion to begin with. What space DO they save then? None? Because they're still the same width and height.

Side to side. That's true.

Quote:
Rest assured that a 30" CRT would be a hell of a lot deeper than 10" on my desk. IF it even fit without overhanging so far forward that it tipped over.

Relevance?


Quote:
Saying LCDs save no space is just bizarre. And to then claim victory because the FRONT of my ACD is 10" from the rear of the desk instead of 8" is even more bizarre. It's not like I can even really get it THAT much further back...the cord cuts into some of that space.

Get it right? Jeez...you aren't even on the same plane of reality on this issue. Stupid is indeed the right word.

Vinea

When you get stuck on something, you really are a junkyard dog, aren't you?
post #125 of 199
Quote:
Originally Posted by vinea View Post

Which is a perfectly reasonable response. As I said, the study has all the usual limitations of that kind of study.

Saying that "most keyboards" are not like the conventional keyboard studied is just a weird and very defensive response.



So you're saying that a peer-reviewed paper, published at a professional conference (one of the more significant human factors ones) by a doctoral candidate from Cornell, is "unscientific" in nature?

Yah, okay. How many peer reviewed HCI papers have you written?

Jeez, can we be more condescending?

At the beginning of my career I worked at the Museum of Natural History, here in NYC. I wrote grants, and did research as well, so I'm familiar with the process.

You think that all papers delivered are valid just because the old-boy peer review approved their publication? Think again.

In an article published in the April 2007 issue of Discover, Seth Lloyd, the MIT engineer who developed the first quantum computer designs, and who has written for Scientific American, several books, etc., said that

"I would bet that 99.8% of ideas put forth by scientists are wrong..."

Murray Gell-Mann, quoted in the article, "is fond of saying, 'The job of a scientist is to generate wrong ideas as fast as possible.'"

Estimates are that as much as 30% of all scientific peer reviewed papers are wrong, and the fear is that the number may be higher.

So, yes, I find their work to be poorly thought out, and it's not condescending to say so. If you didn't agree, you would have said so.


Quote:
The previous messages were not limited to "PS or other similar tasks". And Onlooker specifically addressed keyboards directly, if you're going for the "whole thread" context, and you yourself "corrected" another poster about the multiple threads of conversation.

This is also only one of several MT threads in the forum and given that I don't live here 24/7 like you, I don't follow every thread unless the title or length piques my interest. This one started out more iPhone centric (not so interested) then moved into applications of MT on the desktop (more interested given that's an area that I contend with).

It helps to know what someone is saying before commenting on one thing. The proper thing to do is to respond to the correct topic, which was the other poster's error.

Quote:
In any case...IF you were only talking about "casual" use then why talk about feedback and error at all? And you SPECIFICALLY state that "If the monitor is big enough, that would work for a lot of people."

How would you take THAT out of context? Especially with the amplification that your wife got used to the 400 keyboard for whatever her tasks were.

I was talking about using it for graphics, and for using multi-touch to move objects and images around, so that you would hit the area you needed, which would be difficult on a smaller screen, as the point would be too easily obscured by your finger. Feedback and error always matter.

Quote:
Presumably that included normal typing, under continued use and she would be included under the category of "a lot of people".

A "lot of people" meaning that they would (in my opinion) find multi-touch easier than using two or three key combos when selecting, or when using a mouse/trackball to move objects, etc.

The concept was that it would be easier, if the monitor was low and angled when needed, to select, move, increase or decrease the size of, or whatever, with a finger or two than to type, or move the cursor around and do different button pushes, sometimes needing two hands.

I'm pretty sure that you found the Cintiq easier to use, when you just had to touch the screen, rather than to remove your hand from the screen, move it to the keyboard, or mouse, type whatever you had to, and then move your hand to the screen again.

Quote:
You said so in the very paragraph under discussion. Come on. Are you saying that you completely miswrote that paragraph? That you didn't think it would be good enough for a lot of people if the monitor was big enough, but wrote it that way by accident?

I mean, most folks don't do Photoshop or use a Wacom tablet, so the only other thing (besides keyboarding) they would use MT for is manipulation of the desktop and some limited iLife usage. Do you regularly write "a lot of folks" when you mean just some niche group?

I just explained what I meant. But, to make you happy, a large monitor would also be needed for a full size keyboard, if it is laid out the way keyboards are. Measure yours, and include the numeric keypad in that.


Quote:
Would it amaze you if I said that I did? The comment was very limited in scope and applied directly to the paragraph in question. This is the usual run around you give...the thread is still very short. Folks can take a quick gander over the course of the topic.

Do I NEED to comment on EVERY post you make in order to inject a comment in a thread? Oh my...a forum rule that I must have missed...

Vinea

When you get the concept wrong, because you are responding to just one small part of it, then yes. I don't post every bit of a discussion in just one post. It evolves as the discussion proceeds. As people comment, we all think of more to say, and (horrors!) might even make slight changes in how we think about a topic, based on what others come up with.
post #126 of 199
Quote:
Originally Posted by melgross View Post

That's not what I said at all. You said that your monitor was three inches from the back of the desk, not as far back as it could go, as you first said. That may not seem to be much, but unless you sit with your belly pressed against the desk, your head is a good ten inches away, particularly if you use a keyboard drawer.

So add 2 or 3 inches. Are you saying that having the rearmost part of the monitor within a couple inches is NOT pushed to the back of the desk? That's bizarre. And I said a couple or 3 inches because I didn't measure it. Now it's suddenly THREE. No, really, it wasn't.

Fine, I just pushed it all the way back. It makes zero difference except now I have an extra 2 inches in the front. The cord keeps the stand about a quarter to a half inch from the partition, so I'm sure you'll find fault with that as well.

Quote:
I just bought my wife a 22" Samsung 225BW monitor. The front of the screen is 7" from the back of the stand. Some monitors are more, some less. So, add 7 to 3, and we get 10. If you sit normally, which I assume you do, we're talking about 30" again, just as I said. If you have to lean in sometimes, as you said you did sometimes, then you need to get closer to see the detail clearly, again, as I said.

No, I don't need to lean in to see detail clearly. I lean in to futz around with 1-pixel offsets on displays because PEOPLE can see when UI elements are off by 1 pixel. Yes, even from 30" away...it looks wrong and sloppy. It's more of a hand/eye thing than to see the difference. It doesn't make me any more accurate with the mouse...it just feels more accurate.

I'm looking at an image now and I can see pixel structure. Ironically, it's the HFES logo for the 2006 meeting. I can see pixel structure in some photos (mostly in gradients). I can clearly see 1-pixel-wide lines. I can even see 1-pixel-sized dots.

Quote:
Well, most monitors I see on desks are about twelve inches in from the front, just as they were with CRT versions. I sometimes see them even closer, and sometimes further, but not pushed to the rear, as you say. But, I'm not arguing that you do what you do there, just that I've never seen it anywhere.

So, simply because of your limited experience (or lack of awareness) you state that most folks are unaware that LCDs don't actually save desk space?

Quote:
It's amazing that you can say some things, then admit that you can't always see the detail at even half the distance yourself, so you have to get even closer.

Except that I CAN see the detail. What? Are you saying that I'm lying that I can see 1-pixel-wide lines or 1-pixel dots on my display from 30" away? Or that I have some kind of superior vision, when normal 20/20 vision would be able to see lines and dots half the width? At least in spectra to which human eyes are more receptive?

Quote:
By the way, resolution for humans is measured in line PAIRS. One dark line (or dot) and one light one. You do have to cut the rez in half.

By the way, resolution for humans is measured in a variety of ways, of which the most common is the Snellen number (20/20), followed by arc seconds/minutes/degrees.

Note that human acuity is 1/a where a = x per arc-minute. Folks vary what x is...it can be lines, line pairs, or cycles. It depends on the method used to measure acuity, of which line pairs is one. You don't "cut the rez in half" inasmuch as when you convert to arc-minutes the result is for 1 line.
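The unit juggling here is easy to get wrong, so a small sketch may help, using the standard conventions (20/20 corresponds to a minimum angle of resolution, MAR, of 1 arc-minute; one cycle is one line PAIR, i.e. two single lines). The function names are mine:

```python
# Converting between the acuity measures both posts mention, using the
# usual conventions: 20/20 corresponds to a minimum angle of resolution
# (MAR) of 1 arc-minute; a cycle (one line PAIR: one dark + one light
# line) spans two single lines.
def snellen_to_mar_arcmin(denominator, numerator=20):
    """20/denominator Snellen -> MAR in arc-minutes."""
    return denominator / numerator

def mar_to_cycles_per_degree(mar_arcmin):
    """One cycle = 2 * MAR, and there are 60 arc-min per degree."""
    return 60.0 / (2.0 * mar_arcmin)

mar = snellen_to_mar_arcmin(20)       # 1.0 arc-min for 20/20
cpd = mar_to_cycles_per_degree(mar)   # 30 cycles/degree
print(mar, cpd)
```

Whether you "halve" anything depends entirely on whether you are counting single lines (1 arc-minute each at 20/20) or cycles/line pairs (2 arc-minutes each), which is exactly the point under dispute.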

Quote:
And that is why it doesn't matter to them. They don't care if they aren't resolving every pixel.

Which has nothing to do with the assertion that LCDs save no desk space.

Quote:
Relevance?

The relevance is that even if you push the monitor all the way back it isn't 30" from the front of the desk. And the fact that you can FIT a 30" LCD monitor on a desk is an indicator that LCDs save depth.

Quote:
When you get stuck on something, you really are a junkyard dog, aren't you?

Riiight...because you aren't the one arguing about whether 2-3 inches from the back counts as "all the way back".

And I'm not the one frantically trying to make the statement "What a lot of people don't realise is that LCDs save no room at all on a standard depth desk" not sound stupid by engaging in a multi-post defence of that statement...

Vinea
post #127 of 199
Quote:
Originally Posted by melgross View Post

At the beginning of my career I worked at the Museum of Natural History, here in NYC. I wrote grants, and did research as well, so I'm familiar with the process.

...

In an article published in the April 2007 issue of Discover, Seth Lloyd, the MIT engineer who developed the first quantum computer designs, and who has written for Scientific American, several books, etc., said that

"I would bet that 99.8% of ideas put forth by scientists are wrong..."

Yes, and then you run an experiment to show that your idea was wrong. Or not. This IS the process which you claim some familiarity with.

Please, I could claim to be the chief scientist in some department of a prestigious university but it wouldn't necessarily make it true. And even if true WHO CARES?

Quote:
Estimates are that as much as 30% of all scientific peer reviewed papers are wrong, and the fear is that the number may be higher.

So, yes, I find their work to be poorly thought out, and it's not condescending to say so. If you didn't agree, you would have said so.

Nice putting words in my mouth. I didn't say it was poorly thought out. I said it was subject to the normal limitations of such studies (small sample size, duration, drawing from a student population, etc). Within that context it appears reasonable and SCIENTIFIC.

You have some evidence that contradicts your assertions, but you choose to poo-poo it without any data of your own except anecdotal observation. Now that's scientific, from someone who used to "write grants and did research".

Vinea
post #128 of 199
Quote:
Originally Posted by vinea View Post

So add 2 or 3 inches. Are you saying that having the rearmost part of the monitor within a couple inches is NOT pushed to the back of the desk? That's bizarre.

...

And I'm not the one frantically trying to make the statement "What a lot of people don't realise is that LCDs save no room at all on a standard depth desk" not sound stupid by engaging in a multi-post defence of that statement...

Vinea

You are amazing. You answer statements with irrelevant comments. The rest of your statements are contradictions. You lean forward to see it better, but you see it fine without doing so.

You seem to be stuck in your cubicle. Get outside once in a while.
post #129 of 199
Quote:
Originally Posted by vinea View Post

Yes, and then you run an experiment to show that your idea was wrong. Or not. This IS the process which you claim some familiarity with.

Please, I could claim to be the chief scientist in some department of a prestigious university but it wouldn't necessarily make it true. And even if true WHO CARES?

That's your problem. You learn the issues as you look them up. Sometimes you only learn part.

But the experiment must be correctly designed, or the results won't reflect what you are looking for, but something else. Yes, I do understand. There is no control here. A control is required. If you knew something about the scientific methodology of experimental design, you would see that.

We may dislike each other, but I assume you do understand these matters. I think you simply aren't looking.


Quote:
Nice putting words in my mouth. I didn't say it was poorly thought out. I said it was subject to the normal limitations of such studies (small sample size, duration, drawing from a student population, etc). Within that context it appears reasonable and SCIENTIFIC.

Actually I didn't mean what I wrote in that last sentence. Honestly, it was several hours ago, and I don't remember what I meant to say, so ignore it.

Quote:
You have some evidence that contradicts your assertions, but you choose to poo-poo it without any data of your own except anecdotal observation. Now that's scientific, from someone who used to "write grants and did research".

Vinea

I'm not denying the results of the paper. I didn't do that. It says what it says. But, as you yourself stated, it is a small sample. What I don't agree with is the design of the experiment.

Most of what we say here isn't rigorously affirmed. When links are hard to find, as you can see for yourself with this one, sometimes anecdotal evidence is the best we can do without spending hours looking, which I've done, though not for this trivial affair.
post #130 of 199
Quote:
Originally Posted by melgross View Post

But, to make you happy, a large monitor would also be needed for a full size keyboard, if it is laid out the way keyboards are. Measure yours, and include the numeric keypad in that.

Not that I really want to get into the middle of your little war here, nor do I want to advocate MT keyboards, but that's just not true. An MT keyboard could by its very nature fit in smaller spaces. It's precisely because it's not physical that you could leave out the numeric and cursor keypads until you actually need them, and then it's just a matter of pushing one virtual button to switch from keyboard to keypad layout or back, and not the staggered numeric layouts embedded in laptop keyboards. Of course, it's still virtual with no feedback, which would make it harder to use than a physical keypad, but I'm just saying it need not be as wide as a physical, fixed keyboard. I think it could fit on a 15" LCD if necessary. Of course, that wouldn't leave much room above it for the regular display duties, but that's an unavoidable drawback to on-screen keyboards. You don't even really need the cursor keypad with MultiTouch, with or without a touchscreen. I can move either the mouse pointer or the text cursor with my iGesture pad depending on whether I'm using one finger or two.
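The size claim is easy to sanity-check with rough numbers. All dimensions here are my own ballpark assumptions (roughly 18" for a full-size keyboard including the numeric keypad, roughly 11" for a compact layout without it, and a 4:3 15" panel), not figures from the post:

```python
import math

# Rough check of whether an on-screen keyboard fits on a 15" 4:3 LCD.
# All dimensions are illustrative assumptions.
diag_in = 15.0
aspect_w, aspect_h = 4, 3
scale = diag_in / math.hypot(aspect_w, aspect_h)
panel_w, panel_h = aspect_w * scale, aspect_h * scale   # 12" x 9"

full_kbd_w = 18.0      # typical full-size keyboard incl. numeric keypad
compact_kbd_w = 11.0   # typical layout without numpad/cursor keys

print(f"panel: {panel_w:.0f}\" x {panel_h:.0f}\"")
print("full layout fits:", full_kbd_w <= panel_w)       # False
print("compact layout fits:", compact_kbd_w <= panel_w) # True
```

Under these assumptions a full layout with numpad would overhang a 15" 4:3 panel, while a compact layout fits with room to spare, which is the trade-off Kolchak describes.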
post #131 of 199
Quote:
Originally Posted by Kolchak View Post

Not that I really want to get into the middle of your little war here, nor do I want to advocate MT keyboards, but that's just not true. An MT keyboard could by its very nature fit in smaller spaces. It's precisely because it's not physical that you could leave out the numeric and cursor keypads until you actually need them, and then it's just a matter of pushing one virtual button to switch from keyboard to keypad layout or back, and not the staggered numeric layouts embedded in laptop keyboards. Of course, it's still virtual with no feedback, which would make it harder to use than a physical keypad, but I'm just saying it need not be as wide as a physical, fixed keyboard. I think it could fit on a 15" LCD if necessary. Of course, that wouldn't leave much room above it for the regular display duties, but that's an unavoidable drawback to on-screen keyboards.

I don't really want to have this "war", but he seems to enjoy it, so I go along until it gets too boring, and tiring.

I agree. I'm only saying that because most people have physical muscle memory that is hard to give up. If the physical keyboard has one layout and the virtual one has another, it might cause typos. I rarely use the numeric keypad, so it wouldn't bother me, but it might be a problem for some others.

It's possible that the keyboard could be made to appear and disappear the way the Dock can be made to, by touching a part of the screen, or by some other method.
post #132 of 199
Quote:
Originally Posted by melgross View Post

You are amazing. You answer statements with irrelevant comments. The rest of your statements are contradictions. You lean forward to see it better, but you see it fine without doing so.

You seem to be stuck in your cubicle. Get outside once in a while.

No, they are not contradictions. You're saying I can't see 1 pixel objects from my seating position and distance to the monitor.

Given that it's my eyes and my desk, when I say I can, I would think that would be accepted...especially since I provided the figures that show that anyone with average vision can also see 1-pixel objects from 30".
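As a side note, the arithmetic behind that "1-pixel objects from 30 inches" claim is easy to check. This is a minimal sketch assuming the standard 1 arc minute figure for 20/20 acuity and an assumed typical ~100 ppi LCD pixel pitch (neither number comes from the posts above):

```python
import math

# 20/20 vision resolves detail subtending roughly 1 arc minute.
# At a 30-inch viewing distance, how large is that on the screen,
# and is a single pixel on a ~100 ppi LCD bigger than it?
viewing_distance_in = 30.0
subtense_in = viewing_distance_in * math.tan(math.radians(1 / 60))

pixel_pitch_in = 1 / 100  # assumed 100 ppi display: 0.01 inch per pixel

print(round(subtense_in, 4))         # 0.0087 (inches subtended by 1 arcmin)
print(pixel_pitch_in > subtense_in)  # True: a single pixel is resolvable
```

So at 30", a 1-arcmin feature is about 0.0087", while a 100 ppi pixel is 0.01" wide, which is consistent with the claim that average vision can resolve single pixels at that distance.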

Get over it. You made an incredibly stupid statement when you claimed that LCDs don't save desk space, because you wanted to "prove" that you wouldn't increase desktop footprint by moving to a horizontal form factor. This is patently false, as any user of the Cintiq can tell you.

Vinea
post #133 of 199
Quote:
Originally Posted by melgross View Post

That's your problem. You learn the issues as you look them up. Sometimes you only learn part.

But the experiment must be correctly designed, or the results won't reflect what you are looking for, but something else. Yes, I do understand. There is no control here. A control is required. If you knew anything about the scientific methodology of experimental design, you would see that.

You're narrowly informed and yet claim broad expertise.

If you had expertise in science, you'd realize that not all studies are controlled experiments, nor are controlled experiments required in all fields. It depends on the rigor required and the ethics of the subject matter. You can statistically isolate the effects of confounding variables from the independent variables you're testing.

You CAN argue that the study is a study rather than an experiment. Which, to be pedantic, it is...and it is clearly stated as such.

You CAN argue that studies have less rigor than controlled experiments. This is very true.

You CANNOT argue that the study is not scientific, as you did, unless you're simply trying to be insulting to the researcher or the field.

I look things up to reinforce my memory of a subject so I don't say completely stupid things.

Quote:
We may dislike each other, but I assume you do understand these matters. I think you simply aren't looking.

I think you're once again claiming expertise in an area where you simply have some general knowledge. There are many kinds of "experiments" (using the generic definition of the term...most research folks will be rather specific in calling a study a study and an investigation an investigation, not an "experiment", when it's not one) with different standards of rigor and requirements. If you look at HCI study results, and as long as you understand the limitations of the techniques, you'll see there is nothing out of the ordinary for that particular "experiment".

If there is anything specific about the study vis-à-vis other similar studies, perhaps, given your vast experience and expertise in such matters, you can point it out.

No? I thought not.

Quote:
Actually I didn't mean what I wrote in that last sentence. Honestly, it was several hours ago, and I don't remember what I meant to say, so ignore it.

Fine...consider it forgotten. I presume you're talking about my thinking it was poorly designed?

Quote:
I'm not denying the results of the paper. I didn't do that. It says what it says. But, as you yourself stated, it is a small sample. What I don't agree with is the design of the experiment.

Most of what we say here isn't rigorously affirmed. When links are hard to find, as you can see for yourself with this one, sometimes anecdotal evidence is the best we can do without spending hours looking, which I've done, though not for this trivial affair.

Nothing we say here, even with links, is rigorously affirmed. The simple point was that one study did not agree with your assertion based on anecdotal evidence.

Whether you should change your mind is up to you, but getting all weird about what "most" keyboards means and claiming the study is unscientific, followed by a broad dismissal of the field of science with "99.8% are wrong" and "30% of peer-reviewed studies are wrong", is silly (not to mention out of context).

As far as the design of the study goes, there's nothing obviously wrong based on the slides. The paper (presumably her thesis) is only available to those within Cornell, so it's hard to look at the specific statistical design. Not that I'm actually qualified to judge the statistical design even if I did look at it, except to compare it against what I have seen.

Given that a major criterion for getting a doctorate in human studies is showing that you can competently create a statistically sound observational study (controlled experiments being rare), I'm going to assume they didn't bollix up something basic. Given that reputation is all you have in the scientific world, I'm going to assume that her advisor made sure it wouldn't be judged stupidly incompetent by their peers.

To answer your next question, no, I don't have a PhD in human factors or anything else. I never claim any expertise in anything I write, because on the net it's completely meaningless. I DO include links so that what I say can be verified and judged independent of claimed expertise/experience. If you like to characterize this as "looking things up", that's your prerogative.

In any case, I've never spent more effort on these posts than a couple of Google searches or turning around and looking something up in a reference.

Vinea
post #134 of 199
Quote:
Originally Posted by vinea View Post

You're narrowly informed and yet claim broad expertise.

Please stop with your ad-hominem attacks. Make your points and leave it at that. Thank you.
post #135 of 199
I saw a demo of a Solaris Labs UrbanFace system that uses a no-touch interface that is fast and accurate. With some tweaking, this could be the way to add direct interaction with the screen to the Mac.
post #136 of 199
Quote:
Originally Posted by Chucker View Post

Please stop with your ad-hominem attacks. Make your points and leave it at that. Thank you.

Saying he's narrowly informed is less abusive than "get out of the cubicle" and his other ad hominem attacks. It's also milder than saying he's misinformed (i.e., wrong). He's right about what a controlled experiment IS, but not that it encompasses the whole of scientific endeavor. Hence, narrow.

You're "not welcome" as far as false courtesy goes. No need to couch your partisanship.

Vinea
post #137 of 199
Quote:
Originally Posted by rufusr View Post

I saw a demo of a Solaris Labs UrbanFace system that uses a no-touch interface that is fast and accurate. With some tweaking, this could be the way to add direct interaction with the screen to the Mac.

Oddly, it didn't appear to be multi-touch. Did they show that in the demo? With the two side-mounted cameras, I suppose you would have problems with one hand obscuring the position of the other.

The only other comment is that, for a visual wall during presentations, folks like to point/wave at things on the screen...which UrbanFace would interpret as a gesture. A laser pointer would resolve that issue.

Vinea
post #138 of 199
To be fair here, I think the bickering in this case is getting tiring.
post #139 of 199
Quote:
Originally Posted by Chucker View Post

Please stop with your ad-hominem attacks. Make your points and leave it at that. Thank you.

I agree.

I'm never the one to start those attacks. If you look at all of the posts, which he starts, you will see, as I'm sure everyone else has, that it's he who starts attacking me. I'm not the one who throws curses around. He is. I respond, after letting it go for a while. Check his posts.

People have advised me not to respond to him at all. I think that's a good idea.
post #140 of 199
Quote:
Originally Posted by JeffDM View Post

To be fair here, I think the bickering in this case is getting tiring.

You may notice that I never start it. My mistake is in responding to it in the first place.
post #141 of 199
But, but he started it.

post #142 of 199
Quote:
Originally Posted by SpinDrift View Post

But, but he started it.


I was gonna point that out, too
post #143 of 199
I really, really, really didn't want to get into this fray, but I'm forced to based on one of Mel's "citations". My apologies, but this is going to be a rather long post which will likely turn off most readers. I won't be offended if you click the back button on your browser right now. However, I can't present it any other way and make my point clearly. I guarantee that if you stay with me, you'll learn a few things.

I am a scientist (Ph.D. in 1977 from Cornell in Biochemistry). I have not yet received my April 2007 issue of Discover, so I can't tell how out of context and/or wrong the quotes he attributed to Dr. Seth Lloyd and Dr. Murray Gell-Mann are. To wit, Mel's quote:

Quote:
Originally Posted by melgross View Post


In an article published for the April 2007 issue of Discover, Seth Lloyd, the MIT engineer, who developed the first quantum computer designs, and who has written for Scientific American, several books, etc said that

"I would bet that 99.8% of ideas put forth by scientists are wrong..."

Murray Gell-Mann quoted in the article "is fond of saying, "The job of a scientist is to generate wrong ideas as fast as possible."

Estimates are that as much as 30% of all scientific peer reviewed papers are wrong, and the fear is that the number may be higher.

(Much of what follows is readily available in Wiki)

I'll present a little background to help the reader. Dr. Lloyd is a world-renowned Professor of Mechanical Engineering at MIT. His research area is the interplay of information with complex systems, especially quantum systems. He has made contributions to the field of quantum computation and proposed a design for a quantum computer. He refers to himself as a "quantum mechanic". Dr. Gell-Mann is a Nobel Prize-winning physicist who works in the area of theoretical particle physics. He is best known for his work in discovering quarks and the quark model. The modern theory of the interactions of quarks, quantum chromodynamics (QCD), is based on Gell-Mann's work. So both of Mel's references are to scientists engaged in the work of theoretical physics, a breed apart in the universe of scientists. Theoretical physics employs mathematical models and abstractions of physics, as opposed to experimental processes, in an attempt to understand nature. Central to it is mathematical physics, though other conceptual techniques are also used. The goal is to rationalize, explain, and predict physical phenomena. The advancement of science depends in general on the interplay between experimental studies and theory. In some cases, theoretical physics adheres to standards of mathematical rigor while giving little weight to experiments and observations.

Performing experiments in theoretical physics is not an easy thing to do. To study, create, or prove the existence of certain subatomic particles, you need access to massive, unbelievably expensive, government-funded structures called particle colliders. Only a few of these exist in the world. Hence, most theoretical physicists mainly theorize or attempt to mathematically prove their theories. If their theories pass a complex government/scientific peer review process, they may be able to test them in one of the colliders. In stark contrast to the biological, chemical, social, or behavioral sciences, then, it is no trivial matter to come up with a theory, offer a mathematically rigorous, defensible proof, convince the government and the review committees, and go to Brookhaven National Laboratory or Fermilab to "do an experiment". To put it more crudely, it's a big mother f**ken deal to do an experiment in theoretical physics!

In the aforementioned disciplines, immunologists, for example, can propose theories (if I knock out the gene for protein X, mice will not develop a certain type of autoimmune disease), design experiments, test them, publish them, and have their work peer-reviewed rather easily (duh!) compared to theoretical physicists. In these disciplines, the statement "I would bet that 99.8% of ideas put forth by scientists are wrong..." is pure, unadulterated, nonsensical gibberish. This is because there is a massive amount of verifiable and reproducible experimental data available in the biological sciences. Witness the cures for certain types of cancers in recent years, the complete sequencing of the human genome, the conversion of AIDS to a chronic condition rather than a near-100%-fatal disease, and the great discoveries coming out of stem cell research, to name a few of the great leaps forward in medical science in the past ten years.

I'll close with "quantum mechanic" Dr. Lloyd's own words, excerpted from an obituary for his colleague Dr. Rolf Landauer. These words are probably the context in which he may have spoken the words in Mel's citation.

..."In any field of technology, let alone one that attempts to store information on subatomic particles, most ideas will probably not work. It is by identifying these wrong ideas that scientists and engineers winnow out those few ideas that are potentially right. Yet, despite the importance of mistakes in science, most published papers report work that is potentially right, rather than provably wrong. In such an environment, it is important for a field to have a mechanism for remembering past mistakes and for preventing them from recurring." ...

Seth Lloyd
Department of Mechanical Engineering
Massachusetts Institute of Technology
post #144 of 199
Quote:
Originally Posted by lfe2211 View Post

Theoretical physics employs mathematical models and abstractions of physics, as opposed to experimental processes, in an attempt to understand nature. Central to it is mathematical physics, though other conceptual techniques are also used. The goal is to rationalize, explain, and predict physical phenomena.

Thanks, lfe2211, for these comments. I personally find it good even to just read some general remarks on this topic, especially for those who will never get a chance in their lives to acquire scientific knowledge. I would like only to add that the goal of theoretical physics is not only to explain and predict physical phenomena, but to explain why the known laws of physics are the way they are. This is an even heavier challenge, going deeper into the secrets of nature than any other scientific discipline.

Quote:
Originally Posted by lfe2211 View Post

Performing experiments in theoretical physics is not an easy thing to do. To study, create, or prove the existence of certain subatomic particles, you need access to massive, unbelievably expensive, government-funded structures called particle colliders. Only a few of these exist in the world. Hence, most theoretical physicists mainly theorize or attempt to mathematically prove their theories. If their theories pass a complex government/scientific peer review process, they may be able to test them in one of the colliders. In stark contrast to the biological, chemical, social, or behavioral sciences, then, it is no trivial matter to come up with a theory, offer a mathematically rigorous, defensible proof, convince the government and the review committees, and go to Brookhaven National Laboratory or Fermilab to "do an experiment". To put it more crudely, it's a big mother f**ken deal to do an experiment in theoretical physics!

At last, someone had to explain this! And I would change "no trivial matter" to "highly non-trivial matter".

Quote:
Originally Posted by lfe2211 View Post

In the aforementioned disciplines, immunologists, for example, can propose theories (if I knock out the gene for protein X, mice will not develop a certain type of autoimmune disease), design experiments, test them, publish them, and have their work peer-reviewed rather easily (duh!) compared to theoretical physicists.

After some point, theoretical physics is like mathematics. You start with some assumptions and work through a mathematical proof to your conclusions. I have to admit, though, that I don't like the way physicists often handle the mathematical tools available to them. So, if the reasoning and computations are correct, the results will be correct, can be verified, pass peer review, and get published. That's why there is a significant number of publications in peer-reviewed theoretical physics journals that are of course correct and deal with very specialized aspects of the more general problems. The BIG problem appears when experimental verification comes into play, and I think that is what you are talking about.
post #145 of 199
Quote:
Originally Posted by melgross View Post

You may notice that I never start it. My mistake is in responding to it in the first place.

So don't. Other than calling your assertion that LCDs save no table space silly, there's nothing that needed your response anyway. Certainly not the first post, which spiraled into a general and baseless denunciation of the scientific community...

It wasn't my intent to get into yet another useless flamewar with you, but I'm not going to stay silent when you say something as silly as that LCD comment.

Takes two to have a flamewar.

Vinea
post #146 of 199
Quote:
Originally Posted by vinea View Post

So don't. ...

It wasn't my intent to get into yet another useless flamewar with you, but I'm not going to stay silent when you say something as silly as that LCD comment.

Takes two to have a flamewar.

What I am seeing here is that you are suggesting that melgross not continue a flame war, but that you have no problem continuing it. I've heard things like that are called a double standard. If you aren't willing to stop, you have no grounds to suggest that someone else stop.
post #147 of 199
Quote:
Originally Posted by lfe2211 View Post

In stark contrast to the biological, chemical, social, or behavioral sciences, then, it is no trivial matter to come up with a theory, offer a mathematically rigorous, defensible proof, convince the government and the review committees, and go to Brookhaven National Laboratory or Fermilab to "do an experiment". To put it more crudely, it's a big mother f**ken deal to do an experiment in theoretical physics!

Well...I wouldn't call it "trivial" to get an experiment done, but it certainly is several orders of magnitude easier, since the funding requirements are typically a couple of orders of magnitude lower than booking time at Brookhaven or Fermilab.

Even then, I'd guess that getting a space experiment done is harder. I worked on COBE (Smoot, Mather, and Hauser were the PIs), and COBE was an effort from 1976 to 1998.

Apple trivia: the skymaps from COBE that folks saw at the AAAS were generated on a Mac II, converted from COBE datasets in FITS format using modified NCSA code. The software design (structure charts) was done on three Apple Lisas.

Vinea
post #148 of 199
Quote:
Originally Posted by JeffDM View Post

What I am seeing here is that you are suggesting that melgross not continue a flame war, but that you have no problem continuing it. I've heard things like that are called a double standard. If you aren't willing to stop, you have no grounds to suggest that someone else stop.

No, I'm saying that if he wishes not to respond to my posts, that's fine, but I'm not going to refrain from commenting, as with the first two posts. The first was very neutral. The second only addressed his COMMENT as silly. I suppose I could not call it silly, but geez...it was, in my opinion.

I only made that comment because it seems the entire onus has been placed on me. As I said, it takes two, but whatever makes y'all happy. Blame me if it makes ya feel better.

Vinea
post #149 of 199
Quote:
Originally Posted by vinea View Post

I only made that comment because it seems the entire onus has been placed on me. As I said, it takes two but whatever makes y'all happy. Blame me if it makes ya feel better.

I do think melgross was right in what he said, but he didn't say it well, and he didn't need to refer to his experience to make his case either; I think that is considered a logical fallacy as well.

One problem I have is with some of the tone, and your previous conversations make you look like a heckler with an axe to grind. I think your current argument with melgross really has far more to do with previous conversations than anything regarding multi-touch or keyboards.
post #150 of 199
Quote:
Originally Posted by vinea View Post

Well...I wouldn't call it "trivial" to get an experiment done but it certainly is several orders of magnitude easier since the funding requirements are typically a couple orders of magnitude lower than booking time at Brookhaven or Fermi.

Even there, I'd guess that getting a space experiment done is harder to do. I worked on COBE (Smoot, Mather and Hauser were the PIs) and COBE was an effort from 1976 to 1998.

It seems that you are missing the non-financial part of what lfe2211 said.
post #151 of 199


MacPro Multitouch
post #152 of 199
PB,

Nice to make your acquaintance. It's good to know at least one forum member understood my rather lengthy, complex post. Are you a mathematician? I'd be glad to have your input any time, pro or con.
post #153 of 199
Thought I might post here as well.

Stop it with all of this multi-touch jabber.

"Multi-touch this!"
"Multi-touch that!"
"Ohh - I bet we could Multi-touch that!"
"Yes! A multi-touch toilet flusher! Amazing!"

Multi-touch is a technology invented for portable use. The advantage of a mouse interface (on non-portables) is that you can move your cursor on a 30" screen from the lower left to the upper right by moving your hand only from one edge of your mouse pad to the other.

In a portable, you don't have a lot of room for the larger interfaces like a mouse and keyboard. And THAT is where multi-touch is useful.

And when you have the precision of a mouse, why do you need gestures? With a mouse, I can easily click and drag the corner of a photo with a slight twitch of the hand, whereas with multi-touch, I'd have to lift both arms and make huge pinching motions.
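The mouse advantage described here is what HCI folks call control-display (CD) gain: cursor distance is hand distance multiplied by the gain. A minimal sketch of that arithmetic, where the gain value of 8 is an assumed illustrative figure rather than any measured OS default:

```python
# Control-display gain: cursor_distance = hand_distance * cd_gain,
# so the hand only needs to move cursor_distance / cd_gain.
# cd_gain = 8 is a hypothetical value chosen for illustration.
def hand_travel(cursor_distance_in, cd_gain):
    """Hand movement (inches) needed to move the cursor a given distance."""
    return cursor_distance_in / cd_gain

# Traversing roughly the 30-inch diagonal of a 30" display:
print(hand_travel(30.0, 8))  # 3.75 inches of mouse movement
```

A direct-touch surface has a gain of 1 by definition, which is why the same 30" traversal requires a full arm sweep instead of a wrist flick.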

I know you are all excited for the iPhone, but cool it on the multi-touch. Think for a moment about why we don't have the click wheel as a computer interface, and it might dawn on you why we don't need multi-touch over what we have now.

Next person who mentions a multi-touch anything other than an iPhone... I'll multi-touch your mum. (JK)

post #154 of 199
Quote:
Originally Posted by SpinDrift View Post



MacPro Multitouch

Hey, cool! If I could get a 15" one that could slide into my backpack, I'm sold.
post #155 of 199
Quote:
Originally Posted by icfireball View Post

Thought I might post here as well.

Stop it with all of this multi-touch jabber.

"Multi-touch this!"
"Multi-touch that!"
"Ohh - I bet we could Multi-touch that!"
"Yes! A multi-touch toilet flusher! Amazing!"

Multi-touch is a technology invented for portable use. The advantage of a mouse interface (on non-portables) is that you can move your cursor on a 30" screen from the lower left to the upper right by moving your hand only from one edge of your mouse pad to the other.

In a portable, you don't have a lot of room for the larger interfaces like a mouse and keyboard. And THAT is where multi-touch is useful.

And when you have the precision of a mouse, why do you need gestures? With a mouse, I can easily click and drag the corner of a photo with a slight twitch of the hand, whereas with multi-touch, I'd have to lift both arms and make huge pinching motions.

I know you are all excited for the iPhone, but cool it on the multi-touch. Think for a moment about why we don't have the click wheel as a computer interface, and it might dawn on you why we don't need multi-touch over what we have now.

Next person who mentions a multi-touch anything other than an iPhone... I'll multi-touch your mum. (JK)


Brilliant. You weren't content to flaunt your complete and utter ignorance of MultiTouch in just one topic; you wanted to cross-post it in two.

Here's another bit of news for you, Einstein: nobody's forcing you to read these topics. If you don't like it, don't read it. Nobody died and made you king, so you can't tell people what they can or can't post.
post #156 of 199
Quote:
Originally Posted by lfe2211 View Post

I really, really, really didn't want to get into this fray, but I'm forced to based on one of Mel's "citations". My apologies, but this is going to be a rather long post which will likely turn off most readers. I won't be offended if you click the back button on your browser right now. However, I can't present it any other way and make my point clearly. I guarantee that if you stay with me, you'll learn a few things.

I am a scientist (Ph.D. in 1977 from Cornell in Biochemistry). I have not yet received my April 2007 issue of Discover, so I can't tell how out of context and/or wrong the quotes he attributed to Dr. Seth Lloyd and Dr. Murray Gell-Mann are. To wit, Mel's quote:



(Much of what follows is readily available in Wiki)

I'll present a little background to help the reader. Dr. Lloyd is a world-renowned Professor of Mechanical Engineering at MIT. His research area is the interplay of information with complex systems, especially quantum systems. He has made contributions to the field of quantum computation and proposed a design for a quantum computer. He refers to himself as a "quantum mechanic". Dr. Gell-Mann is a Nobel Prize-winning physicist who works in the area of theoretical particle physics. He is best known for his work in discovering quarks and the quark model. The modern theory of the interactions of quarks, quantum chromodynamics (QCD), is based on Gell-Mann's work. So both of Mel's references are to scientists engaged in the work of theoretical physics, a breed apart in the universe of scientists. Theoretical physics employs mathematical models and abstractions of physics, as opposed to experimental processes, in an attempt to understand nature. Central to it is mathematical physics, though other conceptual techniques are also used. The goal is to rationalize, explain, and predict physical phenomena. The advancement of science depends in general on the interplay between experimental studies and theory. In some cases, theoretical physics adheres to standards of mathematical rigor while giving little weight to experiments and observations.

Performing experiments in theoretical physics is not an easy thing to do. To study, create, or prove the existence of certain subatomic particles, you need access to massive, unbelievably expensive, government-funded structures called particle colliders. Only a few of these exist in the world. Hence, most theoretical physicists mainly theorize or attempt to mathematically prove their theories. If their theories pass a complex government/scientific peer review process, they may be able to test them in one of the colliders. In stark contrast to the biological, chemical, social, or behavioral sciences, then, it is no trivial matter to come up with a theory, offer a mathematically rigorous, defensible proof, convince the government and the review committees, and go to Brookhaven National Laboratory or Fermilab to "do an experiment". To put it more crudely, it's a big mother f**ken deal to do an experiment in theoretical physics!

In the aforementioned disciplines, immunologists, for example, can propose theories (if I knock out the gene for protein X, mice will not develop a certain type of autoimmune disease), design experiments, test them, publish them, and have their work peer-reviewed rather easily (duh!) compared to theoretical physicists. In these disciplines, the statement "I would bet that 99.8% of ideas put forth by scientists are wrong..." is pure, unadulterated, nonsensical gibberish. This is because there is a massive amount of verifiable and reproducible experimental data available in the biological sciences. Witness the cures for certain types of cancers in recent years, the complete sequencing of the human genome, the conversion of AIDS to a chronic condition rather than a near-100%-fatal disease, and the great discoveries coming out of stem cell research, to name a few of the great leaps forward in medical science in the past ten years.

I'll close with "quantum mechanic" Dr. Lloyd's own words, excerpted from an obituary for his colleague Dr. Rolf Landauer. These words are probably the context in which he may have spoken the words in Mel's citation.

..."In any field of technology, let alone one that attempts to store information on subatomic particles, most ideas will probably not work. It is by identifying these wrong ideas that scientists and engineers winnow out those few ideas that are potentially right. Yet, despite the importance of mistakes in science, most published papers report work that is potentially right, rather than provably wrong. In such an environment, it is important for a field to have a mechanism for remembering past mistakes and for preventing them from recurring." ...

Seth Lloyd
Department of Mechanical Engineering
Massachusetts Institute of Technology

Excuse me. But perhaps you didn't read all of what I said. The quotes were from the current issue of Discover (April 2007), which I cited.

The quotes are from the article that HE WROTE. They are either from what HE wrote or from Nobel Prize winner Murray Gell-Mann, whom HE quoted in his article, though I have read it elsewhere.

The entire sentence of the quote from Dr. Lloyd is as follows:

"I would bet that 99.8 percent of ideas put forth by scientists are wrong and will never be included in the body of scientific fact."

He also states:

"The vast majority of scientific ideas are (a) wrong and (b) useless. The briefest acquaintance with the real world shows that there are some forms of knowledge that will never be made scientific."

He doesn't refer specifically to the physical sciences, but is speaking in general.

He then goes on to discuss the sequencing of the human genome, certainly not a matter of physics.

In this lengthy paragraph, which I'm not going to quote, he talks about citations, which, as anyone in the scientific field knows, is something to live or die by (professionally).

He could have mentioned physics, or perhaps math, but instead chose to mention biology, in contradiction to the idea put forth by you. Though he is not referring to error in that paragraph, I cite it to show that his discussion is, again, not limited to physics and math.

The actual point of his article is not the correctness or incorrectness of scientific work, but its dilution by non-scientific writing, in the arts, etc.

Nevertheless, his quotes stand.

The information about the 30% incorrect papers was not stated as a quote from Dr. Lloyd, but came from an article in Science I read several months ago in which scientific fraud and error were discussed. The discussion centered on the "doctoring" of photos in papers. The article noted that the "enhancing" of images in astronomical work is not only accepted, if done properly, but is actually a requirement! Without such enhancement, many astronomical photos are useless.

However, the doctoring of photos in the fields of biology and chemistry is showing itself to be a problem, and is one reason why scientists are concerned about invalid papers. We are familiar with several major papers that have been withdrawn because of it, among other problems such as the lifting of data, charts, etc. from other papers.

Most fraud is assumed to go undetected, as are most innocent errors.

Other papers are simply wrong for many reasons, and go undetected by peer review.

Thank you for supplying the background of those two individuals, as I didn't intend my post to be lengthened by including it myself.

Lest I give the impression of being anti-science myself, I would like to conclude by saying that nothing could be further from the truth. But it must be acknowledged that fraud, error, and poor experimental design are part of scientific life, no matter how much the majority of scientific workers strive to be free of them.

As the editors of major scientific journals such as Science and Nature plead mea culpa over their less-than-stellar records in detecting problematic papers, and strain to tighten their checks, we must remember that the world of scientific inquiry is open to correction, while other areas are not.

A flat-out denial that a particular paper may be incorrect in its conclusions, or even its design, just because it has been published, particularly in a more obscure journal where publishing papers is the lifeblood of that journal, is not helpful. I'm not referring to you here, lfe2211. But that is what started this.
post #157 of 199
Quote:
Originally Posted by PB View Post

Thanks lfe2211 for these comments. I personally find it good even to just read some general remarks on this topic, especially for those who will never get a chance in their life to acquire scientific knowledge. I would like only to add that the goal of Theoretical Physics is not only to explain and predict physical phenomena, but to explain why the known laws of Physics are the way they are. This is an even heavier challenge, going deeper into the secrets of nature than any other scientific discipline.


At last, someone had to explain this! And I would change "no trivial matter" to "highly non-trivial matter".


After some point, Theoretical Physics is like Mathematics. You start with some assumptions and proceed through a mathematical proof to your conclusions. (I have to admit, though, that I don't like the way physicists often handle the mathematical tools available to them.) So, if the reasoning and computations are correct, the results will be correct, can be verified, and can be peer-reviewed and published. That's why there is a significant number of publications in peer-reviewed Theoretical Physics journals that are of course correct and deal with very specialized aspects of the more general problems. The BIG problem appears when experimental verification comes forth, and I think that is what you are talking about.

While we seem to have been dragged far afield into a discussion about experimentation in physics, I can certainly agree that, today, most experiments in that field are so costly that this country has, almost criminally, gutted its physics community by refusing to fund the very expensive facilities required.

While our main facilities have either recently closed down or will do so in the near future, other groups of countries continue to build. We do take part in those works as well; the US will have the largest contingent when the new facility in Europe goes online fairly shortly ("fairly shortly" means something different in the physics community than it does in other disciplines!). But we are subject to a dwindling base of resources here.

The same problem is seen relating to work in fusion.

Modern physics is also very different from most other sciences. Some theoretical advances (String Theories, anyone?) are proving difficult to design experiments around, though Dr. Randall is hopeful about some aspects of the new collider.
post #158 of 199
Quote:
Originally Posted by SpinDrift View Post



MacPro Multitouch

This is the worst idea I've ever seen at AI.

Quote:
Originally Posted by icfireball View Post

Thought I might post here as well.

Stop it with all of this multi-touch jabber.

"Multi-touch this!"
"Multi-touch that!"
"Ohh - I bet we could Multi-touch that!"
"Yes! A multi-touch toilet flusher! Amazing!"

Multi-touch is a technology invented for portable use. The advantage of a mouse interface (in non-portables) is that you can move your cursor on a 30" screen from the lower left to the upper right by only moving your hand from one edge of your mouse pad to the other.

In a portable, you don't have a lot of room for the larger interfaces like a mouse and keyboard. And THAT is where multi-touch is useful.

And when you have the precision of a mouse, why do you need gestures? With a mouse, I can easily click and drag the corner of a photo with a slight twitch of the hand, whereas with multi-touch, I'd have to lift both arms and make huge pinching motions.

I know you are all excited for the iPhone, but cool it on the multi-touch. Think for a moment about why we don't have the click wheel as a computer interface, and it might dawn on you why we don't need multi-touch over what we have now.

Next person who mentions a multi-touch anything other than an iPhone... I'll multi-touch your mum. (JK)



icfireball I completely agree with you 100%
onlooker
Registered User

http://www.apple.com/feedback/macpro.html
post #159 of 199
Mel,

With regard to your reply to my lengthy post, I'll just let your own words speak for themselves. (BTW, have you figured out yet which Discover article is the traditional annual April "fake" entry? I'm on the road so I haven't seen my Discover copy yet).

With regard to your reply to PB's excellent reply to my post, we could start a whole forum on the excitement, promise and unimaginable brilliance of string theory, if it's even close to being correct. (I feel certain Albert E. is doing a jig somewhere in the universe).

My own perspective is that we are so fortunate to live in a truly great, exciting, unbelievably productive time vis-a-vis medicine, immunology, genomic science, stem cell science, physics (string theory), computer technology and information communication technology (i.e the internet)!

But alas, we digress from the intent of this forum.
post #160 of 199
Quote:
Originally Posted by lfe2211 View Post

Mel,

With regard to your reply to my lengthy post, I'll just let your own words speak for themselves. (BTW, have you figured out yet which Discover article is the traditional annual April "fake" entry? I'm on the road so I haven't seen my Discover copy yet).

With regard to your reply to PB's excellent reply to my post, we could start a whole forum on the excitement, promise and unimaginable brilliance of string theory, if it's even close to being correct. (I feel certain Albert E. is doing a jig somewhere in the universe).

My own perspective is that we are so fortunate to live in a truly great, exciting, unbelievably productive time vis-a-vis medicine, immunology, genomic science, stem cell science, physics (string theory), computer technology and information communication technology (i.e the internet)!

But alas, we digress from the intent of this forum.

I'm not quite sure of your meaning there. If you are not happy with any part of my reply, you certainly may respond, instead of hinting and then leaving it in the air. I didn't say anything that hasn't been said to me, or that I haven't read in the journals.

I don't read all of the articles, only the ones of interest, but I'll go back and take a look.

I certainly agree with your second and third paragraphs.

It's sad that AI is so specialized that we don't have science reporting as they do at ARs.