<strong>This is pretty much fuzzy logic (not akin to "fuzzy math"). DNA-based computers are a long way down the road, though they will one day see the light, since base-4 would allow for more accurate AI: the decision cycle has not two choices (0 or 1) but four (A, T, C, G). I don't see Apple ever using this. By then computers will be nowhere near what we think they are now. This is 50 years down the road at least.</strong><hr></blockquote>
There was a lot of talk about "DNA"-based computers some years ago, when scientists believed that switching to base-4 could make computers a hundred times more powerful than they are today. They even showed a prototype running a simple OS capable of calculating only a few algorithms, and the results were, even then, a lot better than what conventional machines managed.
I can't remember the specifics, but they were very optimistic at the time (it must have been 5 or 6 years ago); I guess the costs killed the project off eventually.
I think the first problem they solved was the travelling salesman problem.
The salesman wants to plot the shortest route through n cities without visiting the same city more than once.
Sounds simple, but even the best supercomputers can only handle a small number of cities.
A DNA computer was able to handle many, many cities and find the answer in seconds... it just took some time to verify the answer.
An acquaintance of mine works at a research company developing new methods for making chips. They have been working on them for a long time already, and the work they are starting now will go into its final phase five years from now.
When 30nm-process chips make it to market, you can't go any smaller; 20nm is the absolute theoretical limit.
After that we will see quantum computing, which will work with base 3, i.e. trits: spin up, spin down, and neutral. Maybe after that we will go to base 4.
Expect them 10 to 15 years from now.
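As a side note on what base 3 buys you (a quick illustrative sketch, nothing specific to quantum hardware): each trit carries log2(3) ≈ 1.58 bits of information, so the same number needs noticeably fewer trits than bits.

```python
def to_digits(n, base):
    """Digits of n in the given base, most significant first."""
    if n == 0:
        return [0]
    digits = []
    while n:
        n, r = divmod(n, base)
        digits.append(r)
    return digits[::-1]

n = 1_000_000
print(len(to_digits(n, 2)), len(to_digits(n, 3)))  # 20 bits vs. 13 trits
```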
<strong>DNA computers exist and work just fine.
I think the first problem they solved was the travelling salesman problem.
The salesman wants to plot the shortest route through n cities without visiting the same city more than once.
Sounds simple, but even the best supercomputers can only handle a small number of cities.
A DNA computer was able to handle many, many cities and find the answer in seconds... it just took some time to verify the answer.</strong><hr></blockquote>
Wait, if they can solve that problem, they can solve all of the NP-complete problems. That would be a huge milestone; how come I didn't hear about it?
<strong>
Wait, if they can solve that problem, they can solve all of the NP-complete problems. That would be a huge milestone; how come I didn't hear about it?</strong><hr></blockquote>
Heh, getting into a field that I know well.
DNA computers did recently solve the traveling salesman problem.
For those unfamiliar with graph theory, the travelling salesman problem (TSP) is: given an arbitrary number of destinations that must be visited, find the order of visitation that incurs the least cost. This problem is classified as NP-complete (NP = Nondeterministic Polynomial), meaning that a proposed solution can be checked quickly, but every known exact algorithm needs time that grows exponentially with the number of stops, for example proportional to 2^n for n stops. Plain English: each stop you add makes finding the optimal solution dramatically slower.
So, if I remember correctly, a few months ago, a DNA computer solved the TSP and managed to do it in ROUGHLY CONSTANT TIME.
Did this usher in a brave new world where all the UPS and FedEx trucks are more efficiently used? Nope. The reason is that while the DNA computer could solve the TSP in constant time, it could not solve it in constant time with constant quantities of DNA. To solve larger and larger problems, the "computer" needed polynomially more DNA. Unfortunately, the amount of DNA needed to solve a large TSP would weigh more than the earth itself.
There are no free lunches in computer science.
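To make "dramatically slower" concrete, here is a minimal brute-force TSP solver in Python (an illustrative sketch only, nothing to do with the DNA experiment): it tries every ordering of the stops, so the work grows factorially with n.

```python
from itertools import permutations

def tsp_brute_force(dist):
    """Exact TSP by trying every tour. dist[i][j] is the cost from i to j.
    O(n!) time: hopeless beyond a dozen or so stops."""
    n = len(dist)
    best = (float("inf"), None)
    for perm in permutations(range(1, n)):          # fix stop 0 as the start
        tour = (0,) + perm
        cost = sum(dist[tour[i]][tour[i + 1]] for i in range(n - 1))
        cost += dist[tour[-1]][tour[0]]             # return home
        best = min(best, (cost, tour))
    return best

dist = [[0, 2, 9, 10],
        [1, 0, 6, 4],
        [15, 7, 0, 8],
        [6, 3, 12, 0]]
print(tsp_brute_force(dist))  # -> (21, (0, 2, 3, 1))
```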
OK, as my first post pointed out, quantum computers also exist today, and work just fine.
IBM has working multiple-qubit computers.
That being said, a practical desktop quantum computer is years away, if it ever happens, but that's just an engineering problem. The biggest hurdle is isolating the qubits from... well... everything.
Please, please, please read "The Age of Spiritual Machines". It's one of the most insightful books on the future of computation and, well, the future of technology and humans in general.
That's where I read about the travelling salesman project, but it wasn't months ago; I read it was conducted in the '90s.
[ 07-30-2002: Message edited by: JasonPP ]
If quantum/qubit computers are so good at solving optimal-path problems, then why not ask one what the optimal number of bits a computer logic path needs? [Laughing] [Hmmm] [Skeptical] [No]
<strong>
Did this usher in a brave new world where all the UPS and FedEx trucks are more efficiently used? Nope. The reason is that while the DNA computer could solve the TSP in constant time, it could not solve it in constant time with constant quantities of DNA. To solve larger and larger problems, the "computer" needed polynomially more DNA. Unfortunately, the amount of DNA needed to solve a large TSP would weigh more than the earth itself.</strong><hr></blockquote>
So in other words, given an NP-complete problem, you can either have a machine of constant size that can solve it in polynomial time, or a machine of polynomial size that can solve it in constant time?
Somewhere, God is laughing.
[ 07-30-2002: Message edited by: Amorph ]
<strong>
So in other words, given an NP-complete problem, you can either have a machine of constant size that can solve it in polynomial time, or a machine of polynomial size that can solve it in constant time?
Somewhere, God is laughing.
</strong><hr></blockquote>
Yes, this seems to be the case. DNA computers can be considered extremely parallel machines (lots of extremely slow CPUs). As the problem size increases, you have to add CPUs.
Mind you, nobody in their right mind solves the TSP by brute force. There are well-known heuristics for this problem. Most commonly, you cluster your points and then find a quick, near-optimal path between the clusters of points.
As I said, there are no free lunches. A solution to an NP-complete problem that ran in linear time and used linear resources (RAM, DNA, etc.) would really revolutionize computer science.
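For a taste of what such a heuristic looks like, here is a minimal nearest-neighbour sketch in Python (one of the simplest TSP heuristics; the clustering approach Yevgeny describes is more sophisticated, this just shows the trade-off): it runs in O(n^2) time and settles for a decent tour instead of the optimal one.

```python
def nearest_neighbour_tour(dist):
    """Greedy TSP heuristic: always hop to the closest unvisited stop.
    O(n^2) time; not optimal, but vastly cheaper than brute force."""
    n = len(dist)
    tour, unvisited = [0], set(range(1, n))
    while unvisited:
        here = tour[-1]
        nxt = min(unvisited, key=lambda j: dist[here][j])
        tour.append(nxt)
        unvisited.remove(nxt)
    cost = sum(dist[tour[i]][tour[i + 1]] for i in range(n - 1))
    return cost + dist[tour[-1]][tour[0]], tour

# With the dist matrix from the brute-force sketch above:
# nearest_neighbour_tour(dist) -> (33, [0, 1, 3, 2]), vs. the optimal 21.
```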
<strong>So in other words, given an NP-complete problem, you can either have a machine of constant size that can solve it in polynomial time, or a machine of polynomial size that can solve it in constant time?</strong><hr></blockquote>
ROTFL.
This is probably the best thread I've read on AI in a long time. I thank you, Eupfhoria.
Damn, the ASCII art doesn't come out even using the CODE tag, so I give up ^_^; (I was trying to draw a CMOS NAND gate, but TJM gave a better explanation.)
In an FPGA design class, I learned that you can trade hardware size for software time... While DNA computing can handle parallel processing much better, it's not really "solving" the TSP, at least to my understanding of what Yevgeny said. I was under the impression that quantum computing will do the trick, though... heck, throw in quantum encryption while we're at it! The absolutely secure method, until someone proves Heisenberg wrong [Hmmm]
[ 07-30-2002: Message edited by: evangellydonut ]
How about one's-complement bits?
You have 1, 0, and -0. [Laughing]
Tri-Logic Test:
0 OR -0 = ?
1 AND -0 = ?
ANSWERS:
Tri-Logic Test:
0 OR -0 = -0
1 AND -0 = 0
Did you get either right?
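For reference, the textbook way to do three-valued logic (Kleene logic, which is an assumption here; TJM may have different rules in mind) orders the values false < unknown < true, then takes AND as minimum and OR as maximum. A minimal sketch, mapping -0 onto the middle value:

```python
# Kleene three-valued logic: values ordered F < U < T.
F, U, T = 0, 1, 2                     # let U ("unknown") play the -0 role
def tri_and(a, b): return min(a, b)   # AND = minimum
def tri_or(a, b):  return max(a, b)   # OR  = maximum
def tri_not(a):    return 2 - a       # NOT swaps T and F, fixes U

name = {F: "0", U: "-0", T: "1"}
print(name[tri_or(F, U)])   # 0 OR -0  -> -0, agreeing with the first answer
print(name[tri_and(T, U)])  # 1 AND -0 -> -0, not the 0 given above
```

So Kleene rules reproduce the first answer but not the second, which is exactly the point RazzFazz pushes on below.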
<strong>Mind you, nobody in their right mind solves the TSP by brute force. There are well-known heuristics for this problem. Most commonly, you cluster your points and then find a quick, near-optimal path between the clusters of points.</strong><hr></blockquote>
lol, I have a poorly coded program in LISP that "tries" to do this... it turned out pretty damn inefficient, though ^_^; that's why I'm not a CS major, I guess...
<strong>ANSWERS:
Tri-Logic Test:
1 AND -0 = 0
</strong><hr></blockquote>
Shouldn't that result in 1 instead?
Let's say we have 4-bit numbers; then 1 is 0001, 0 is 0000, and the latter's one's complement (i.e. -0) is !0000 = 1111. 0001 & 1111 = 0001 = 1. Or am I missing something?
Bye,
RazzFazz
[ 07-30-2002: Message edited by: RazzFazz ]
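RazzFazz's bitwise reading is easy to check mechanically (a quick sketch; the 0xF mask just stands in for the 4-bit width):

```python
WIDTH_MASK = 0xF               # work with 4-bit numbers
zero = 0b0000
neg_zero = ~zero & WIDTH_MASK  # one's complement of 0 -> 0b1111
one = 0b0001
print(bin(neg_zero))           # 0b1111, i.e. -0 in one's complement
print(bin(one & neg_zero))     # 0b1: bitwise, 1 AND -0 = 1, as argued above
```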
But tristate bit logic actually exists, and it has its own rules.
The reason I said one's-complement bits is that the third state's value is -0, not -1.
The concept of -0 does not generally exist, since most processors do two's-complement math. That choice of value for the third state helps when doing tristate calculations, since -1 could be confused with the value rather than the bit state.
Class dismissed.
<strong>
So in other words, given an NP-complete problem, you can either have a machine of constant size that can solve it in polynomial time, or a machine of polynomial size that can solve it in constant time?</strong><hr></blockquote>
Careful here. NP means non-deterministic polynomial, NOT polynomial! If you had a deterministic machine of constant size that could solve NP-hard problems in polynomial time, you'd be a rich man (for proving P=NP).
Non-deterministic polynomial time means that you can check/verify a solution in polynomial time, but, as far as anyone knows, you cannot solve the problem in polynomial time on a deterministic machine, which is what we have. On a non-deterministic Turing machine (a machine that evaluates the problem for all possible states in parallel; quantum computers...) such a problem can be solved in polynomial time.
Runtimes for problems in P are O(n^0) = constant, O(n^2), O(n^10000), and in general any O(n^k): polynomial time.
Runtimes for NP problems are typically O(k^n) on deterministic machines.
((I think what you meant to ask was: "So in other words, given an NP-complete problem, you can either have a deterministic machine of constant size that canNOT solve it in polynomial time, or a machine of polynomial (or even exponential?) size that can solve it in polynomial time?"))
123
[ 08-01-2002: Message edited by: 123 ]
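The check-versus-solve asymmetry 123 describes is easy to see in code: verifying a proposed TSP tour against a cost bound is a single O(n) pass, even though finding a good tour is the hard part. A minimal sketch (a hypothetical helper, reusing the dist convention from the sketches above):

```python
def verify_tour(dist, tour, bound):
    """Check in O(n) that tour visits every stop exactly once and
    costs at most bound: the easy, polynomial-time half of NP."""
    n = len(dist)
    if sorted(tour) != list(range(n)):   # every stop exactly once
        return False
    cost = sum(dist[tour[i]][tour[i + 1]] for i in range(n - 1))
    cost += dist[tour[-1]][tour[0]]
    return cost <= bound

# With the earlier dist matrix: verify_tour(dist, [0, 2, 3, 1], 21) -> True
```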
Guys, all this stuff about switches only having two positions, magnetic media only having two states, wires only carrying two voltages, and transistors only conducting or not, is all rubbish.
Note that audio and video MAGNETIC tapes are still mostly analogue, as is radio circuitry (ever heard of a TRANSISTOR radio?). Transistors are amplifiers: you put in a power supply and a small voltage signal to produce a bigger voltage signal. Not only do all these electronic systems work with analogue, they worked with analogue FIRST.
The reasons digital electronics are used are simplicity and reliability. Signals get distorted easily, especially if you try to run lots of them near each other. A digital signal has only two states, as far apart as possible (typically a 0-5 V or 0-3 V separation for computer hardware, falling to much less within the chip). Because of this there is minimal chance of a '1' being read as a '0' or vice versa, which would make your program crash, or worse.
For digital radio transmission, more than two states are frequently used. The binary signal is translated into 4 or 16 voltage levels (hex) and sent like that, but for this you ideally want high voltages (hence high separation), and you also need error-correction code, i.e. redundant signal information that allows detection/correction of the odd bit that goes astray (the simplest form being parity).
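Parity is the simplest case of that redundancy: one extra bit records whether the data bits contain an even or odd number of 1s, letting the receiver detect (though not correct) any single flipped bit. A quick sketch:

```python
def add_even_parity(bits):
    """Append a parity bit so the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def parity_ok(bits):
    """True if no single-bit error is detected."""
    return sum(bits) % 2 == 0

word = add_even_parity([1, 0, 1, 1])  # -> [1, 0, 1, 1, 1]
print(parity_ok(word))                # True: parity holds
word[2] ^= 1                          # a bit flips in transit
print(parity_ok(word))                # False: error detected
```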
This would all be highly inefficient on the scale of a modern CPU, and so is not used. Analogue chips do exist for certain applications, particularly radio, as they are much faster; but using an analogue chip to process digital information (which computer info is bound to be, whatever base you use, unless you want 16.7 million voltage levels to represent each pixel of your 24-bit picture faithfully) is inherently inefficient.
Anyway, it's not like bus width is the primary factor holding back computer tech at the moment.
The immediate future of CPUs will probably lie in 3D chips (wires running in 3 dimensions instead of just on a flat surface as now), and beyond that maybe optical.
Quantum tech will happen, but it will not use 4-state electronics; the fact that a qubit can sit in a superposition of states is irrelevant here, because a measured qubit still only reads out 1 or 0, depending on its spin. The purpose of qubits is to do massively parallel processing by splitting tasks across parallel universes (don't believe me? look it up; check out <a href="http://www.qubit.org" target="_blank">http://www.qubit.org</a> and click on FAQ). This won't speed up all tasks, though; it's sadly not quite equivalent to having infinite processors in your computer, although that is the general idea. Oh, and don't expect to see one on the shelves for about 50 years either.
Did any of that make sense? The first bit, at least, I hope.
<strong>Guys, all this stuff about switches only having two positions, magnetic media only having two states, wires only carrying two voltages, and transistors only conducting or not, is all rubbish....
Did any of that make sense? The first bit, at least, I hope.</strong><hr></blockquote>
Nice summary, Socrates. I was thinking along the very same lines while reading this thread. Just as a side note, analog computers do still exist (as mentioned elsewhere in the thread). Without them, chaos theory would not be where it is today. Of course, some feel the interpretation at Santa Cruz was more than helped by certain organic substances....
As for quantum computers, we'll see those on the shelves as soon as the engineers figure out a way to fit Schrödinger's cat in a 5"x16" box.
- S&M
<strong>
Runtimes for problems in P are O(n) = constant, O(n^2), O(n^10000), and any O(n^k) = polynomial time.
</strong><hr></blockquote>
Just a slight nitpick: Constant run time is O(1), not O(n).
Bye,
RazzFazz