What's wrong with larger than binary code? (Base 4?)

post #1 of 69
Thread Starter 
For all of you enlightened people here, why couldn't a company move to, say, base 4 code rather than base 2?

Conceivably it wouldn't be that tough for wiring to be accurate enough to support 4 levels of light/electricity intensity, plus base 4 would save a lot of space on a hard drive.
(Considering DNA uses base 4 with its ATCG codes)

I'm asking, what is the probability of this happening in the future (yeah, we'll probably get yottahertz G28s first, but....)
Dfn Eupfhoria: the joy of playing the 21st level of marathon.
post #2 of 69
I think I read this post 3 times and I'm still confused. Does anybody understand what he is saying?

I'm too lazy right now to do any research at google.com
post #3 of 69
[quote]Originally posted by Eupfhoria:
For all of you enlightened people here, why couldn't a company move to, say, base 4 code rather than base 2?[/quote]

I don't think this is the forum for such a question, but here is an answer:
The Russians had computers working in base 3 (simpler than base 4). This is first helpful from a storage perspective, where you can store a 32-bit binary integer in about 20 ternary digits (trits), a roughly 37% memory saving. However, the complexity of the memory and the CPU ends up costing you more.
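
A quick back-of-the-envelope check in Python (just a sketch of the arithmetic, nothing more):

import math

bits = 32
trits_exact = bits / math.log2(3)     # ~20.2 trits carry the same information
trits_whole = math.ceil(trits_exact)  # 21 whole trits to cover the full 32-bit range
saving = 1 - trits_exact / bits       # ~0.37, i.e. the ~37% digit saving
print(trits_exact, trits_whole, saving)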
We will probably only see base 4 computing when we move to quantum mechanical computing, as a quantum bit is really a base-4 number.
post #4 of 69
The way I understand it, a chip works essentially by being a huge network of switches. A switch has only 2 positions (on/off), so all machine code must be in base 2 (0/1).
I could be wrong, but I'm pretty sure that's the reason in a nutshell.
I've been a Macintosh user for as long as there have been Macintoshes, right from the very first one. It was elegantly thought out, intuitive, and it was such a pleasure to use that you wanted to...
post #5 of 69
As I said... Apple to introduce the first quantum computer in August, running OS X.
go AAPL, go to $70 !!! © 2004
post #6 of 69
[quote]Originally posted by Eupfhoria:
For all of you enlightened people here, why couldn't a company move to, say, base 4 code rather than base 2?

Conceivably it wouldn't be that tough for wiring to be accurate enough to support 4 levels of light/electricity intensity, plus base 4 would save a lot of space on a hard drive.
(Considering DNA uses base 4 with its ATCG codes)

I'm asking, what is the probability of this happening in the future (yeah, we'll probably get yottahertz G28s first, but....)[/quote]

The main reason computing takes place in binary is the binary nature of a switch: it is either on or off, 1 or 0. In terms of processing, the switches on the CPU are descendants of the large physical switches from the very early days of computing.

Another reason is magnetics: if you want to store data on a magnetic medium, such as a hard disk platter, you can store each bit in one of two states, positive or negative.

Four- and three-state processing is possible; chips and switches have been developed that hold more than two states. But the storage medium, magnetics, is still positive or negative, so a translation step is needed between the binary stored in the memory device and the processor, and that slows the whole system down to the point of inefficiency.

As another poster pointed out, quantum computing offers new vistas of conceivable processing systems, and some very interesting things have been done with quartz crystals for storage of more than two states (imagine having a bit for every shade of colour in the rainbow).

There's also a lot of talk about optical computing, which would transmit light through fiber optics rather than electricity through semiconductors. That would mean much less heat and a lot more flexibility, and instead of having two states to compute with, you would have any color of the rainbow, and any shade in between.

It'll be interesting to see what happens in the future.
this is the way the world ends
this is the way the world ends
this is the way the world ends
not with a bang, but with a wintel machine

-T.S. Eliot
post #7 of 69
[quote]Originally posted by *l++:
I don't think this is the forum for such a question, but here is an answer:
The Russians had computers working in base 3 (simpler than base 4). This is first helpful from a storage perspective, where you can store a 32-bit binary integer in about 20 ternary digits (trits), a roughly 37% memory saving. However, the complexity of the memory and the CPU ends up costing you more.
We will probably only see base 4 computing when we move to quantum mechanical computing, as a quantum bit is really a base-4 number.[/quote]

A quantum bit is not a base-4 number. Instead, a qubit stores both a 1 and a 0 at the same time in a quantum state called superposition. The idea of superposition is what makes quantum computers incredibly fast in some instances, such as factoring and database searching.
post #8 of 69
I would think the move to base 3 would be possible in the magnetic arena: north/south/null.

Electrical: pos/neg/null.

I would imagine that one reason why things are still base 2 is because the notion of computation started that way. Call it inertia.

ting5
- I used to be SdC, ting5, and YAR, but I'm sticking to Jet, I promise.
- http://suckful.com <-- My NEW! weblog
post #9 of 69
It is too expensive to change now.

It's like saying that if you could seat four across in a car, you could get more people in it. That's true, but there aren't any roads to drive that new car on.

The most tried idea that I have heard of is tri-state logic. It is already used in many places, but not for things like memory storage. You get 1, 0, and -1. It gets complicated trying to conform to the current two-state logic, since a positive voltage is normally the zero state.

Of course, why stop at 3 or 4 states? Many analog computer models currently exist. Some have actually worked.
The Bush public works project to repave the road from Suspicion to Paranoia is over budget.
post #10 of 69
[quote]Originally posted by Yet Another Registration:
I would think the move to base 3 would be possible in the magnetic arena: north/south/null.

Electrical: pos/neg/null.

I would imagine that one reason why things are still base 2 is because the notion of computation started that way. Call it inertia.

ting5[/quote]

Good point regarding inertia. If you've had trouble switching from OS 9 to OS X, imagine what kind of nightmare would be involved in switching code from base 2 to base 3, base 4, or any other kind of numeric system.

As for your thoughts on tertiary states in magnetics and electronics, I have no idea if that's possible or not... My source is the Discovery Channel, and things are always over-simplified in TV-land. :^)
this is the way the world ends
this is the way the world ends
this is the way the world ends
not with a bang, but with a wintel machine

-T.S. Eliot
post #11 of 69
I am by no means an expert here, but I do not believe that null would work on a hard drive. (Correct me if I am wrong.) Does the read head thingy know exactly what its position is? Or something like that... Because if there was no charge whatsoever, how would the drive not treat it as a bad sector, or even know a bit exists there? Maybe I am just not making sense...
1 Peter 1:6-7
Powerbook G4 12" 1.33ghz, 60gig hd, 1.25 gigs ram.

Powermac G4 "Sawtooth" 400 mhz, 80gig hd, 384mb of ram, Rage 128 Pro graphics.
post #12 of 69
This is pretty much fuzzy logic (not akin to "fuzzy math"). DNA-based computers are a long way down the road, though they will one day see the light of day, since base 4 would allow for more accurate AI: the decision cycle has not two choices (0 or 1) but four (A, T, C, G). I don't see Apple ever using this. By then computers will be nowhere near what we think they are now. This is 50 years down the road at least.
post #13 of 69
Even if a good three- or four-state logic medium existed, it would still have to interface with everything else, which is binary. The complexity burden far outweighs any small benefits.

Such a discussion belongs with other such topics, like: why do we have a base 10 number system? A base 16 numbering system would be much better; it works great for logarithms and computers. Base 10 is a bummer, but you know we will never change.
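
(How naturally base 16 lines up with binary is easy to see in Python; each hex digit is exactly four bits:)

print(hex(0b11110101))  # 0xf5 -- two hex digits for one byte
print(bin(0xF5))        # 0b11110101 -- and straight back again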
post #14 of 69
Thread Starter 
Thanks for the posts, everyone. I just wanted to get people thinking about it.

Plus, this IS future hardware, just a lot further in the future.

[ 07-30-2002: Message edited by: Eupfhoria ]
Dfn Eupfhoria: the joy of playing the 21st level of marathon.
post #15 of 69
Running binary code on a ternary or quaternary system is conceivably possible, as long as 2 of the states in the higher-order system equate to the original binary states.

Ternary could be
+1, 0, -1
Easily done: computers only use half a sine wave now. This just uses the lower half as a negative voltage.

Quaternary could be
Frequency: 0, 1
Amplitude (current): 0, 1
Modulate the voltage and the current.
This gives you 4 discrete states:
0, 0
0, 1
1, 0
1, 1
This still only uses 1/4 of the available states in AM/FM signals.
0, 1 = 0 binary
1, 1 = 1 binary

Holographic storage would be a necessity. OK children, turn in your data crystals so I can check your homework.

Edit: Just did the math. If you include negative voltage and negative current, then you get 9 discrete states, allowing you to do base-9 math.
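
(A quick enumeration in Python to check the count; just a sketch, with the two dimensions standing in for voltage and current:)

import itertools

# each of two signed signal dimensions can be -1, 0, or +1
states = list(itertools.product([-1, 0, 1], repeat=2))
print(len(states))  # 9 discrete states -> base-9 digits

# restricting each dimension to 0/1 gives the 4 quaternary states above
print(list(itertools.product([0, 1], repeat=2)))  # [(0,0), (0,1), (1,0), (1,1)]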

[ 07-29-2002: Message edited by: Plague Bearer ]
post #16 of 69
[quote]Originally posted by Eupfhoria:
For all of you enlightened people here, why couldn't a company move to, say, base 4 code rather than base 2?[/quote]


Lots of reasons, not the least of which is signal-to-noise. In DNA the 4 steps are each fully differentiated; it's a fully quantized system, and you can't have a base which is half one and half another. Thus base 4 works.

But start trying to slice computer voltages even finer than they already are (they're at the limit as it is), and the ability of the system to differentiate states breaks down pretty quickly. Not always: it depends on the medium you're working with. But hell, to remain consistent, it's best just to slice it as fine as you can across the board, and reap the energy and storage space bonus anyway.
In life, as in chess, the moves that hurt the most, are the ones you didn't see ...
post #17 of 69
Hey, my dad tried working on a ternary-based computer back when he was younger. The biggest advantage you guys haven't mentioned is neural networks. Neural networks running on DNA-style computers, which are already being developed by a dropout from Stanford, would be sweet. They can learn from your data; it doesn't matter what you give them, they learn it, almost like a baby. For example, if you gave a network several pics that looked like a symbol, the network would begin to remember the signal. My friend built one last year; it was slow but sweet. Anyway, my point is that a different style of logic would allow computers to think more like humans, since human neurons work around the same way a trit-based chip would. I am only 18, so if any of you pros can better explain what I am trying to say, please do.
If you had game like me You would still have your girl.
post #18 of 69
Another issue is the actual microchip construction. Chips are made of transistors, which are essentially electronic switches with two states: on or off. Transistors have three leads: the source, the drain, and the gate. It is the gate voltage that is varied to turn the transistor on and off, so the clock frequency determines how fast the transistors switch on and off.

I'm contemplating how I would go about building a microchip that would have three (or more) stable states and how I would go about setting and reading the state, and frankly I can't figure it out. I'm not sure how to construct a device that could be integrated to the level of millions per square cm, and still keep everything straight. If you have three states, for example, you end up with, say, -1, 0, and +1 as your three states. The problem I see is that "-1" represents a current flowing in the opposite direction of a "+1", so we need a device which can spit out bits of current in both directions. But then how do you keep a "-1" from cancelling out a "+1"? i.e., how do you detect the difference between a "+1/-1" superposition and a true "0"? My head starts spinning when I try to sort this all out, as I have reached the limit of my own electronics education. I'm sure it can be done in principle, but it seems hugely complex for whatever benefit may arise from it.

The strength of binary logic is in its simplicity. It may require more basic units to do computations than a ternary (or higher) logic system, but the number of ways you can go wrong with the higher number systems seems to rise exponentially. Others who know more may have better insights, but that's my take on it.
"Mathematics is the language with which God has written the Universe" - Galileo Galilei
post #19 of 69
The reason is pretty straightforward.
In any given chip process, the system is capable of distinguishing a certain voltage level. Current chips run at just over one volt internally.
That can't be split any further, so having three or four levels means having a higher voltage.

Then it becomes a trade-off of speed (more data per clock) versus the increased complexity of the design and its greater power requirements, and therefore heat dissipation.

I don't think those issues weigh in favour of greater-than-binary systems at the moment. However, if we reach some physical limit (maybe quantum computing), then advances in speed must come from an approach other than making the chip smaller.

All the other comments about interfacing with other elements are valid. To get people to migrate you have to show a significant improvement over their current systems in some way (performance is easy to measure), e.g. twice as fast for the same price (see PCs versus Macs), half the price for the same speed (see PCs versus Macs), or whatever.
post #20 of 69
[quote]Originally posted by Eupfhoria:
For all of you enlightened people here, why couldn't a company move to, say, base 4 code rather than base 2?[/quote]
What a great idea. This would put Apple at the front of the competition immediately. Since 4 is a larger number than 2, the resulting CPUs would be much faster than x86 - incidentally, for the same reason 64-bit CPUs are necessary to stay competitive on the desktop.

I just wonder why no one ever implemented that?
post #21 of 69
The superposition of electrons in up and down states simultaneously in quantum computers increases potential processing power by a huge amount.

Instead of an array of on/off switches as in current machines, qubits multiply the number of operations...

a 1-qubit QC has 2 "switches", a 2-qubit QC has 4, a 3-qubit QC has 8, a 4-qubit QC has 16...

You can see that at around a few hundred qubits, the processing power of a quantum computer accelerates exponentially. We have 7-qubit quantum computers today. There was a discovery recently which may very well lead to massive thousand-qubit quantum computing. In that range it's likely we'll be able to simulate every atom in the Earth's weather systems... so we can predict weather out a few hundred years with accuracy.
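
(The doubling is easy to see in a couple of lines of Python; a sketch, nothing more:)

# n qubits span 2**n basis states, so every added qubit doubles the space
for n in (1, 2, 3, 4, 7):
    print(n, 2 ** n)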
You're going sane in a crazy world!
post #22 of 69
[quote]Originally posted by Animaniac:
This is pretty much fuzzy logic (not akin to "fuzzy math"). DNA-based computers are a long way down the road, though they will one day see the light of day, since base 4 would allow for more accurate AI: the decision cycle has not two choices (0 or 1) but four (A, T, C, G). I don't see Apple ever using this. By then computers will be nowhere near what we think they are now. This is 50 years down the road at least.[/quote]

There was a lot of talk about "DNA" computers some years ago, when scientists believed that switching to base 4 could make computers a hundred times more powerful than they are today. They even showed a prototype running some simple OS capable of calculating only a few algorithms. And even then, the results were a lot better than what they otherwise had.

I can't remember the specifics, but they were very optimistic at the time (it must have been 5 or 6 years ago); I guess the costs killed the project off eventually.

In the real world, ignorance is truly a bliss.
post #23 of 69
DNA computers exist and work just fine.

I think the first problem they solved was the travelling salesman problem.

A salesman wants to plot the shortest route between n cities without visiting the same city more than once.

Sounds simple, but the best supercomputers can only do a small number of cities.

A DNA computer was able to do many, many cities and find the answer in seconds... it just took some time to verify the answer.
You're going sane in a crazy world!
post #24 of 69
An acquaintance of mine works at a research company working on new methods for making chips. They have been working on them for a long time already, and they have started work that will enter its final phase five years from now.

When the 30nm process chips make it to the market, you can't go any smaller. 20nm is the absolute theoretical limit.

After that we will see quantum computing, which will work with base 3: spin up, down and neutral.

Maybe after that we will go for base 4, but quantum computers work with base 3: trits.

Expect them 10 to 15 years from now.
post #25 of 69
[quote]Originally posted by JasonPP:
DNA computers exist and work just fine.

I think the first problem they solved was the travelling salesman problem.

A salesman wants to plot the shortest route between n cities without visiting the same city more than once.

Sounds simple, but the best supercomputers can only do a small number of cities.

A DNA computer was able to do many, many cities and find the answer in seconds... it just took some time to verify the answer.[/quote]


Wait, if they can solve that problem, they can solve all of the NP-complete problems. That would be a huge milestone; how come I didn't hear about it...
post #26 of 69
[quote]Originally posted by Animaniac:
Wait, if they can solve that problem, they can solve all of the NP-complete problems. That would be a huge milestone; how come I didn't hear about it...[/quote]

Heh, now we're getting to a field that I know well.

DNA computers did recently solve the traveling salesman problem.

For those unfamiliar with graph theory, the travelling salesman problem (TSP) is: given an arbitrary number of destinations that must be visited, find the order of visitation that incurs the least cost. This problem is classified as NP-complete, which (roughly) means that a proposed solution can be verified quickly, but no known method finds the truly optimal tour without the running time exploding as destinations are added; brute force over n stops means checking on the order of n! orderings. Plain English: the more stops you add, the dramatically slower your optimal solution is found.
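
(A brute-force sketch in Python, with a made-up 4-city distance matrix, just to show the blow-up:)

import itertools, math

# hypothetical symmetric distances between 4 cities
dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 3],
        [10, 4, 3, 0]]

def tour_cost(order):
    return sum(dist[a][b] for a, b in zip(order, order[1:]))

best = min(itertools.permutations(range(len(dist))), key=tour_cost)
print(best, tour_cost(best))
print(math.factorial(20))   # orderings for just 20 cities: ~2.4e18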

So, if I remember correctly, a few months ago, a DNA computer solved the TSP and managed to do it in ROUGHLY CONSTANT TIME.

Did this usher in a brave new world where all the UPS and FedEx trucks are more efficiently used? Nope. The reason is that while the DNA computer could solve the TSP in constant time, it could not do so with constant quantities of DNA. To solve larger and larger problems, the "computer" needed polynomially more DNA. Unfortunately, the amount of DNA needed to solve a large TSP would weigh more than the earth itself.

There are no free lunches in computer science.
King Felix
post #27 of 69
OK, as my first post pointed out, quantum computers also exist today... and work just fine.

IBM has working multiple qubit computers.

That being said, a practical desktop quantum computer is years away, if ever. But that's just an engineering problem. The biggest hurdle is isolating the qubits from... well... everything.

Please, please, please read "The Age of Spiritual Machines". It's one of the most insightful books on the future of computation and, well, the future of technology and humans in general.

That's where I read about the travelling salesman project, but it wasn't months ago; I read that it was conducted in the '90s.

[ 07-30-2002: Message edited by: JasonPP ]
You're going sane in a crazy world!
post #28 of 69
If quantum / qubit computers are so good at solving optimal-path problems, then why not ask one what the optimal number of bits a computer logic path needs? [Laughing] [Hmmm] [Skeptical] [No]
The Bush public works project to repave the road from Suspicion to Paranoia is over budget.
post #29 of 69
[quote]Originally posted by Yevgeny:
Did this usher in a brave new world where all the UPS and FedEx trucks are more efficiently used? Nope. The reason is that while the DNA computer could solve the TSP in constant time, it could not do so with constant quantities of DNA. To solve larger and larger problems, the "computer" needed polynomially more DNA. Unfortunately, the amount of DNA needed to solve a large TSP would weigh more than the earth itself.[/quote]

So in other words, given an NP-complete problem, you can either have a machine of constant size that can solve it in polynomial time, or a machine of polynomial size that can solve it in constant time?

Somewhere, God is laughing.

[ 07-30-2002: Message edited by: Amorph ]
"...within intervention's distance of the embassy." - CvB

Original music:
The Mayflies - Black earth Americana. Now on iTMS!
Becca Sutlive - Iowa Fried Rock 'n Roll - now on iTMS!
post #30 of 69
[quote]Originally posted by Amorph:
So in other words, given an NP-complete problem, you can either have a machine of constant size that can solve it in polynomial time, or a machine of polynomial size that can solve it in constant time?

Somewhere, God is laughing.[/quote]

Yes, this seems to be the case. DNA computers can be considered extremely parallel machines (lots of extremely slow CPUs). As the problem size increases, you have to add CPUs.

Mind you, nobody in their right mind solves the TSP by brute force. There are well-known heuristics for this problem. Most commonly, you cluster your points and then find a quick, near-optimal path between clusters of points.
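
(For flavor, the simplest such heuristic, nearest neighbor, in a few lines of Python; dist here is a distance matrix like the one in the earlier sketch:)

def nearest_neighbor_tour(dist, start=0):
    # greedy heuristic: always hop to the closest unvisited city
    unvisited = set(range(len(dist))) - {start}
    tour = [start]
    while unvisited:
        nxt = min(unvisited, key=lambda c: dist[tour[-1]][c])
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour  # fast and usually decent, but no optimality guarantee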

As I said, there are no free lunches. A solution to an NP-complete problem that ran in linear time with linear resource requirements (RAM, DNA, etc.) would really revolutionize computer science.
King Felix
post #31 of 69
[quote]So in other words, given an NP-complete problem, you can either have a machine of constant size that can solve it in polynomial time, or a machine of polynomial size that can solve it in constant time?[/quote]

ROTFL.

This is probably the best thread I've read on AI in a long time. I thank you, Eupfhoria.
post #32 of 69
Damn, the ASCII art doesn't come out even using the CODE tag. I give up ^_^; (I was trying to draw a CMOS NAND gate, but TJM gave a better explanation.)

In an FPGA design class, I learned that you can trade hardware size for software time... While DNA computing can handle parallel processing much better, it's not really "solving" the TSP, at least to my understanding from what Yevgeny said. I was under the impression that quantum computing will do the trick, though... heck, throw in quantum encryption while we're at it! The absolutely secure method, until someone proves Heisenberg wrong. [Hmmm]

[ 07-30-2002: Message edited by: evangellydonut ]
post #33 of 69
This is all so very confusing.

How about one's complement bits?

You have 1, 0, and -0. [Laughing]

Tri-Logic Test:
0 OR -0 = ?
1 AND -0 = ?

:confused: [Hmmm] [Skeptical] [Laughing]
The Bush public works project to repave the road from Suspicion to Paranoia is over budget.
post #34 of 69
ANSWERS:

Tri-Logic Test:
0 OR -0 = -0
1 AND -0 = 0

Did you get either right?
The Bush public works project to repave the road from Suspicion to Paranoia is over budget.
post #35 of 69
[quote]Originally posted by Yevgeny:
Mind you, nobody in their right mind solves the TSP by brute force. There are well-known heuristics for this problem. Most commonly, you cluster your points and then find a quick, near-optimal path between clusters of points.[/quote]

lol, I have a poorly coded program in LISP that "tries" to do this... it turned out pretty damn inefficient, though ^_^; that's why I'm not a CS major, I guess...
post #36 of 69
[quote]Originally posted by MrBillData:
ANSWERS:

Tri-Logic Test:
1 AND -0 = 0[/quote]

Shouldn't that result in 1 instead?

Let's say we have 4-bit numbers. Then 1 is 0001, 0 is 0000, and the latter's one's complement (i.e. -0) is !0000 = 1111. 0001 & 1111 = 0001 = 1. Or am I missing something?
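
(Easy to check in Python, masking to 4 bits:)

MASK = 0xF            # pretend we only have 4 bits
neg_zero = ~0 & MASK  # one's complement of 0000 -> 1111
print(bin(1 & neg_zero))  # 0b1, so plain binary bit logic does give 1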

Bye,
RazzFazz

[ 07-30-2002: Message edited by: RazzFazz ]
post #37 of 69
If you do binary bit logic, you are right.

But tristate bit logic actually exists, and has its own rules.

The reason I said one's complement bits is that the third state's value is -0 and not -1.
The concept of -0 does not generally exist, since most processors do two's complement math. The value given to the third state helps when doing tristate calculations, since -1 could be confused with a value rather than a bit state.

Class dismissed.
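
(For reference, one standard three-valued system is Kleene logic, where the third state means "unknown" and AND/OR become min/max over false < unknown < true. A Python sketch; these are not necessarily the rules used in the test above:)

F, U, T = 0, 1, 2  # false < unknown < true

def and3(a, b):
    return min(a, b)  # AND is the minimum

def or3(a, b):
    return max(a, b)  # OR is the maximum

print(or3(F, U))   # 1, i.e. unknown
print(and3(T, U))  # 1, i.e. unknown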
The Bush public works project to repave the road from Suspicion to Paranoia is over budget.
post #38 of 69
[quote]Originally posted by Amorph:
So in other words, given an NP-complete problem, you can either have a machine of constant size that can solve it in polynomial time, or a machine of polynomial size that can solve it in constant time?[/quote]

Careful here. NP means non-deterministic polynomial, NOT polynomial!!! If you had a deterministic machine of constant size that could solve NP-hard problems in polynomial time, you'd be a rich man (for proving P = NP).

Non-deterministic polynomial time means that you can check/verify a solution in polynomial time, but you cannot (as far as anyone knows) solve the problem in polynomial time on a deterministic machine, which is what we have. On a non-deterministic Turing machine (a machine that evaluates the problem for all possible states in parallel; quantum computers...) such a problem can be solved in polynomial time.
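
(The cheap-to-verify half is the key point. Checking a proposed TSP tour against a cost budget is fast; a Python sketch, with dist a distance matrix as in the earlier sketches:)

def verify_tour(dist, tour, budget):
    # cheap check: visits every city exactly once and stays within budget
    if sorted(tour) != list(range(len(dist))):
        return False
    cost = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
    return cost <= budget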

Runtimes for problems in P are O(n^0) = constant, O(n^2), O(n^10000), and in general any O(n^k) = polynomial time.

Runtimes for NP problems are typically O(k^n) on deterministic machines.

((I think what you meant to ask was: "So in other words, given an NP-complete problem, you can either have a deterministic machine of constant size that canNOT solve it in polynomial time, or a machine of polynomial (or even exponential?) size that can solve it in polynomial time?"))

123

[ 08-01-2002: Message edited by: 123 ]
post #39 of 69
Guys, all this stuff about switches only having two positions, magnetic media only having two states, wires only carrying two voltages, and transistors only conducting or not is all rubbish.

Note that audio and video MAGNETIC tapes are still mostly analogue, as is radio circuitry (ever heard of a TRANSISTOR radio?). Transistors are amplifiers: you put in a power supply and a small voltage signal to produce a bigger voltage signal. Not only do all these electronic systems work with analogue, they worked with analogue FIRST.

The reasons that digital electronics are used are simplicity and reliability. Signals get distorted easily, especially if you try to run lots of them near each other. A digital signal has only two states, as far away from each other as possible (typically 0-5 or 0-3 volts of separation for computer hardware, falling to much less within the chip). Because of this there is minimal chance of a '1' being read as a '0' or vice versa, which would make your program crash, or worse.

For digital radio transmission, more than two states are frequently used. The binary signal is translated into 4 or 16 voltage levels (hex) and sent like that, but for this you ideally want high voltages (hence, high separation), and you also need error-correcting code, i.e. redundant signal information to allow for detection/correction of the odd bit that goes astray (the simplest form is a parity bit).
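
(The 4-level idea in miniature; a Python sketch of mapping bit pairs onto 4 amplitude levels, PAM-4 style, with made-up level values:)

# two bits per transmitted symbol, each pair mapped to one of 4 levels
LEVELS = {(0, 0): -3, (0, 1): -1, (1, 0): +1, (1, 1): +3}

def modulate(bits):
    return [LEVELS[pair] for pair in zip(bits[::2], bits[1::2])]

print(modulate([1, 0, 1, 1, 0, 0]))  # [1, 3, -3]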

This would all be highly inefficient on the scale of a modern CPU, and so is not used. Analogue chips do exist for certain applications, particularly radio, as they are much faster; but using an analogue chip to process digital information (which computer info is bound to be, whatever base you use, unless you want to use 16.7 million voltage levels to represent each pixel of your 24-bit picture faithfully) is inherently inefficient.

Anyway, it's not like bus width is the primary factor holding back computer tech at the moment.

The immediate future of CPUs will probably lie in 3D chips (wires running in 3 dimensions instead of just on a flat surface as now), and beyond that maybe optical.

Quantum tech will happen, but it will not use 4-state electronics; the fact that a qubit's state has four real components is irrelevant, because a qubit still only represents 1 or 0 depending on its spin. The purpose of qubits is to do massive parallel processing by splitting tasks across parallel universes (don't believe me? look it up: check out http://www.qubit.org and click on FAQ). This won't speed up all tasks, though; it's sadly not quite equivalent to having infinite processors in your computer, although that is the general idea. Oh, and don't expect to see one on the shelves for about 50 years either.

Did any of that make sense? The first bit, at least, I hope.
"There's no chance that the iPhone is going to get any significant market share. No chance" - Steve Ballmer
post #40 of 69
[quote]Originally posted by Socrates:
Guys, all this stuff about switches only having two positions, magnetic media only having two states, wires only carrying two voltages, and transistors only conducting or not is all rubbish...
Did any of that make sense? The first bit, at least, I hope.[/quote]

Nice summary, Socrates - I was thinking along the very same lines while reading this thread. Just as a side note, analog computers do still exist (as mentioned elsewhere in the thread). Without them, chaos theory would not be where it is today. Of course, some feel the interpretation at Santa Cruz was more than helped by certain organic substances....

As for quantum computers, we'll see those on the shelves as soon as the engineers figure out a way to fit Schrödinger's cat in a 5"x16" box.

- S&M
Don't ask me, I don't work here.