Where does Code start?

Posted in General Discussion, edited January 2014
I know barely anything about code. I know that there are many different coding languages, and that's about it.



I know that computers are based on a binary system of 0s and 1s, but what I don't get is how coding languages can be created from 0s and 1s. Additionally, it boggles my mind as to how text input is possible.



Such a seemingly simple thing makes little sense to me. Like, typewriters make sense: small pieces of metal shaped like letters, covered in ink... yadda yadda. But digital means of text input, like, where'd that start? How?



Please go easy on me, I really am ignorant about this stuff. It may be a dumbingly dumb answer that I'm just not realizing, and if that's the case, a simple answer will suffice.



So yeah, that's my main question: how did text input become a possibility? And from that, I'm assuming writing a coding language was fairly easy, but still, I'd like to know some of the history behind this stuff.

Comments

  • Reply 1 of 21
a_greer Posts: 4,594 member
Well, most code starts in a "high-level language" like C, C++, or Fortran. A compiler then converts it into the appropriate files that are stored on the disk in what amounts to binary, which, along with hexadecimal and machine code, falls into a class of languages called "low-level" languages.



A high-level language is a simplified way of programming, about as close to just feeding the computer commands as we can conceivably get.



    hope that helps
  • Reply 2 of 21
johnq Posts: 2,763 member
Don't walk, run to the nearest copy of:



Code: The Hidden Language of Computer Hardware and Software

by Charles Petzold



    Don't skim through it, read it from page one, no skipping ahead.



Building from simple, familiar concepts, he shows you exactly how it all ties together in a very accessible way.



Simply THE best book I've ever bought on computing, and it rates high in general, regardless of topic.



(Don't be put off by the fact that Microsoft Press puts it out, or that he has a Windows logo tattoo, OK?)



You'll learn to appreciate Morse code, Braille, unary/binary/etc., Unicode... the works. It's all related and not anywhere near as hard as you think it is. By learning about such a seemingly broad set of things you'll see the bigger picture of "code" or "programming" as concepts rather than any specific implementation like C or ASP or AppleScript... it's actually less daunting to learn a wide array of things, then narrow your focus.



    Were you to learn C for example, you might never be able to back up the tree to see programming as a concept. You'd be too mired in the details of an implementation.
  • Reply 3 of 21
I must add, I have no intention whatsoever of learning to code; that's just not for me. I just wanted a little 101 on the how.



    thanks for the book suggestion though, I'll look into it
  • Reply 4 of 21
johnq Posts: 2,763 member
Right, the book I recommended actually has, I believe, no actual programming code whatsoever, as far as languages go. So you're safe.



Definitely meant for reading on the porch, away from the computer! Clears the mind.
  • Reply 5 of 21
Computers operate through instructions. An instruction is like a sentence, but instead of subject - verb - object, it's more like register - operation - data (but not exactly).



All things in a computer (operations, memory, registers) are numerically addressed. Operations are like verbs in the analogy; addition is an operation, for example. Data lives in memory, which is manifested in RAM and cache. Memory is separated into chunks which are numerically addressed. The G5 is a 64-bit computer because it accesses memory in 64-bit chunks. Registers are memory too, but they are on the chip and are integrated directly into the execution cycle. So to do addition, you generally load numbers into two registers and the answer gets put into a third (also specified).



    If you were to run through some assembly code you might see something like:



    Add R0 R1 R2



    Presumably, R1 and R2 (registers) have numbers loaded into them and you're going to get an answer in R0 after this instruction runs.



    In C, you can type:



    variable0 = variable1 + variable2;



The compiler will turn this into Add R0 R1 R2. (It will also load and save the variables between RAM/cache and the registers, but forget that for now.) As an aside, it requires two instructions to get the values from memory into the registers (R1, R2), and one to get register R0 back to memory. This is why it's important to have a fast memory bus, and also why caches were invented.



Getting back: the Add instruction is converted from human-readable text to computer-readable binary in a one-to-one way. In a clean architecture like MIPS or PPC, the Add corresponds to one number and the registers to other numbers, and the computer would see:



    36 0 1 2



but in binary. In this case I said Add was 36. Different instructions have different syntax, but I think you get the idea.
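To make that one-to-one mapping concrete, here's a toy sketch in Python of how `Add R0 R1 R2` might get packed into a binary word. The opcode value 36 comes from the example above, but the field widths and layout are invented for illustration; a real instruction set defines these precisely.

```python
# Toy encoder for "Add R0 R1 R2". The opcode (36) is from the example
# above; the 8-bit field widths are made up for illustration, loosely
# following a MIPS-style opcode-then-registers layout.

OPCODES = {"Add": 36}  # hypothetical opcode table

def encode(mnemonic, rd, rs, rt):
    """Pack the opcode and three register numbers into one integer."""
    op = OPCODES[mnemonic]
    # 8 bits of opcode, then three 8-bit register fields
    return (op << 24) | (rd << 16) | (rs << 8) | rt

word = encode("Add", 0, 1, 2)
print(format(word, "032b"))  # the "binary" the CPU actually sees
```

Printing the word in base 2 shows the opcode bits (00100100, i.e. 36) followed by the three register numbers, which is essentially the `36 0 1 2` from the post, just in binary.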
  • Reply 6 of 21
ast3r3x Posts: 5,012 member
It's cool how it works. I know there have been thousands of people working to make computers as advanced as they are, but I am simply amazed at how far we've come, from such "simple" concepts to the amazing advancements we have now.



    I know they couldn't imagine it, but even now I am so impressed at how we have tamed electricity.



    The one thing (well one of the things ) I don't get about computers is processors/transistors...they are supposed to be just switches, but I never understood how they flip on and off.
  • Reply 7 of 21
    Quote:

    Originally posted by ast3r3x

    The one thing (well one of the things ) I don't get about computers is processors/transistors...they are supposed to be just switches, but I never understood how they flip on and off.



    Hooboy.



    There are two basic kinds of transistors. . .



    Bipolar Junction Transistors (BJT)

    Field Effect Transistors (FET)



Most stuff these days is fabricated as CMOS, which means Complementary Metal Oxide Semiconductor. The MOS here is short for MOSFET, which, as you can guess, is a FET transistor.



    But I really don't want to have to get into device physics, so I'll give the really quick explanation.



Silicon is a semiconductor. Silicon dioxide is an extremely good insulator. Furthermore, you can dope silicon with other elements to make something to the effect of stained glass, except that these doped impurities give silicon interesting electrical characteristics, namely polarity.



It's been a few years since I took that intro to semiconductor physics class, and it taught me one thing above all else: semiconductor physics gets really tedious. So I forget which dopants they actually use, but their atomic electron properties determine whether they dope the Si to be negative or positive.



Anyway, if you fuse a positive and a negative piece of Si together, you get a diode (a junction). Current is conventionally positive if it flows from positive to negative, so current can flow through the diode as P->N but not the other way (N->P).



The next step is to fuse another piece of P Si onto the end: essentially a bipolar junction transistor. If you control the amount of dopant in each of the pieces, you can dump some charge into the middle piece from a 3rd source and create a temporary channel through the middle N piece so current can flow all the way through. (A switch has an in, an out, and a control. In BJTs they're called the emitter, collector, and base.) The base is the 3rd terminal here.



A FET is similar in operation but different electromechanically. The 90nm stuff you hear about all the time has to do with the "feature size" of the transistor, which is roughly the smallest dimension that can be patterned, on the order of the gate length.



    ---



That's the best I can do. I probably got a lot wrong, since I wasn't a big fan of solid state, but on the whole the idea is there. Anyway, as a disclaimer, you don't actually "fuse" but rather "diffuse"; I just figured this was easier to understand.
  • Reply 8 of 21
ast3r3x Posts: 5,012 member




Well, here is your bipolar junction transistor... when I was working on my EMP, I understood this schematic symbol to be a transistor.



From what I learned while working on that, you have PNP and NPN transistors? Now what I don't get is how, at least to me, they're any different from a diode really.



If it's PNP (B=N; C,E=P), then C and E can't pass electricity into each other or into B; only B can pass into C and E.



If it's NPN (B=P; C,E=N), then C and E can't pass electricity into each other; only into B.



I understand resistors and diodes, but transistors (millions of which make up processors? which somehow linked together create logic circuits?) I can't follow.



Perhaps since I've derailed this thread into a hardware discussion it should be a different thread, but could you tell me how coils work?



Edit: I just noticed that they show both PNP and NPN in the picture I have here, but I didn't notice that while writing this post... and I don't remember which is which, so I can't fix any mistakes I've made. If you tell me which is which, you can refer to them as A and B if it makes it easier.
  • Reply 9 of 21
    A long time ago, in a galaxy not far from toggle switch interface and EBCDIC punch card entry...



    10 Print "Hello World"

    20 Goto 10



is the classic first program, in ASCII-coded characters, for many dialects of BASIC; languages like Pascal, FORTRAN, and COBOL have their own equivalents...



    ASCII characters are everything you can type, upper and lower, punctuation, etc, represented to the computer by coded numerical equivalents.

If you're asking how the machine knows what letters are supposed to mean as ones and zeros, the short answer may be partially explained with ASCII codes.



"Machine language" code is/was represented in hexadecimal pairs.

Hexadecimal digit values go from 0 to 9, then A is 10, up to F, which is 15,

and pairs from 00 through FF can thus represent 256 values, sufficient for the ASCII set.



    To see conversion tables and brief explanations, click asciitable.com

    Useful reference links on ASCII, Hex, HTML, and the latest International set, UNICODE.

from guys like Vint Cerf, who really did help father the Internet, here



    RGB colour codes for the web are still represented in Hexadecimal pairs. Monkey will be missed



Early utility programs allowed sector-by-sector viewing of your drive, and I recall several that offered hex on one side of the screen and the equivalent ASCII on the other, so maybe you could score a modern equivalent if you're really curious to see what your actual drive data looks like. But to just test some simple ASCII-to-hex conversions online, try this.
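For the curious, that side-by-side hex/ASCII view is easy to sketch yourself; this little Python function mimics it for any bytes you hand it (the 8-byte row width is an arbitrary choice, and non-printable bytes become dots like the real utilities showed):

```python
# A rough sketch of the hex-on-one-side, ASCII-on-the-other display
# those old sector viewers gave you.

def hex_dump(data, width=8):
    lines = []
    for i in range(0, len(data), width):
        chunk = data[i:i + width]
        hex_part = " ".join(f"{b:02X}" for b in chunk)
        ascii_part = "".join(chr(b) if 32 <= b < 127 else "." for b in chunk)
        lines.append(f"{hex_part:<{width * 3}} {ascii_part}")
    return lines

for line in hex_dump(b"Hello, World!"):
    print(line)
# prints two rows: hex bytes on the left ("48 65 6C ..."),
# the matching ASCII ("Hello, W" / "orld!") on the right
```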



    hope some of that helps
  • Reply 10 of 21
    Quote:

    Originally posted by ast3r3x

Well here is your BiPolar Junction Transistor...which when I was working on my EMP, I understood this schematic symbol to be a transistor.





    Hey, I think I know some people from there!

    Real live wires some of the time, scheming at others.
  • Reply 11 of 21
    Quote:

    Originally posted by ast3r3x



    . . .





the only real difference between PNP and NPN (or, for that matter, N-type and P-type MOSFETs) is what you have to do at the base to turn them on: an NPN needs current driven into the base, while a PNP needs current pulled out of it. (Base = B... known as the "gate" in FETs... the switch and the real input to a transistor.)



So in other words, NPNs open up when you send 'em a logic 1 (positive voltage), and PNPs open up when you send 'em a logic 0 (ground). So there's your switch.



    You can make a logical gate out of a few transistors. A year ago I could have drawn you all the NANDs and NORs you'd ever want, but it's easy enough to look them up if you actually need that information.
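You can also play with that "look them up" step in software: NAND is functionally complete, so every other gate falls out of it. A quick Python sketch (pure logic, no device physics):

```python
# NAND is "functionally complete": every other logic gate can be built
# from NAND alone, which is part of why NAND transistor layouts matter.

def NAND(a, b):
    return 0 if (a and b) else 1

def NOT(a):
    return NAND(a, a)           # tie both inputs together

def AND(a, b):
    return NOT(NAND(a, b))      # NAND followed by an inverter

def OR(a, b):
    return NAND(NOT(a), NOT(b)) # De Morgan: a OR b = NOT(NOT a AND NOT b)

def XOR(a, b):                  # classic four-NAND XOR
    n = NAND(a, b)
    return NAND(NAND(a, n), NAND(b, n))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", AND(a, b), "OR:", OR(a, b), "XOR:", XOR(a, b))
```

Each of these functions corresponds to a handful of transistors on real silicon; the simulation just swaps voltages for 0s and 1s.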



Anyway, most electronics are made up of N-type MOSFETs. N-type FETs have a slight advantage in switching speed over P-type FETs because electrons are more mobile in silicon than holes (really f-ing strange device physics stuff... won't get into that). Plus the transistor layouts for NANDs and NORs are really elegant... more so than ANDs and ORs.



    Let me know if you need any other information. (PM/email) You can get lots of freebie parts from big vendors if you're interested in this stuff.



Getting back to the topic (slightly), you can make a 1-bit register (memory cell) with 6 transistors. Typical DRAM requires only one transistor and a capacitor per bit, but you need to refresh it all the time.



So, those registers that handle everything your computer does are made up of 6 FETs per bit. That's 384 transistors per 64-bit word on a G5. And, believe it or not, there are lots of banks of registers that are there just to account for timing and delay issues.
  • Reply 12 of 21
    I think Wrong Robot is going to have a seizure when he comes back to see what the hell happened to his innocent thread.



    . . . let's derail it some more.



    Quote:

    10 Print "Hello World"

    20 Goto 10



    Dude, what is that. . . VAX?



    Everyone knows that 32bit machines increment the program counter by 4 and 64bit machines by 8. . . . but 10?







    Maybe I do need a girlfriend after all.

  • Reply 13 of 21
ast3r3x Posts: 5,012 member
    Quote:

    Originally posted by Splinemodel

    I think Wrong Robot is going to have a seizure when he comes back to see what the hell happened to his innocent thread.



    Haha





Well, thanks. I mostly understand, and I at least have a better understanding than I ever have before. You kinda lost me at sending them logic 0/1, positive/ground, mostly because I don't know what that means. Electricity is electrons, which are negative.



IM ast3r3x or email me if you need to explain further. I love learning this, but I feel bad about giving Wrong a seizure.
  • Reply 14 of 21
bauman Posts: 1,248 member
    Quote:

    Originally posted by Splinemodel

    I think Wrong Robot is going to have a seizure when he comes back to see what the hell happened to his innocent thread.



    . . . let's derail it some more.







    Dude, what is that. . . VAX?



    Everyone knows that 32bit machines increment the program counter by 4 and 64bit machines by 8. . . . but 10?







    Maybe I do need a girlfriend after all.





    I remember using that kind of syntax on an Apple II, but I don't know what language it was.
  • Reply 15 of 21
kickaha Posts: 8,760 member
    No kidding. Jeez guys, way to confuse the poor guy...



    Here's my spin on it:



Computers are logical machines... literally. They operate on simple propositional logic, which has two values: true and false, i.e., 1 and 0. It was recognized a *long* time ago that all meaningful computation can be performed with propositional logic, meaning that all computation can be performed with zeros and ones. This is known as a Big Deal(tm).



    Transistors are just little switches - they go between an 'off' state and an 'on' state. These little switches can be combined in logical circuits called 'gates' to perform basic logic operations such as AND, NOT, OR, etc. With these logical operators, propositional logic is possible... and hence, computation.



Alan Turing proposed the a-machine in 1936, which proved mathematically that computation (any computation) could be performed from input with a *highly* simple machine: one that read and wrote symbols on a tape, and could move the tape one space left or right at a time. (Colloquially, this is known simply as a 'Turing machine', even though he also proposed others. The a-machine was simply the easiest to implement at the time.)



    Turns out that these simple Turing tape operations can be emulated with this little thing we call memory...



    So we have a propositional logic engine (CPU) and a Turing tape emulator (memory). So far, so good. Theoretically, we can perform any calculation we want now.



    Except... how to program it. Zeros and ones *SUCK* for people to think in. It's just not natural.



So early machines (Zuse Z3, Mark II) came up with mnemonics for the instructions, converting the binary codes to, simply, equivalent base-10 numbers. If '01100010' was the setting of switches that triggered a memory read, it was easier to simply remember '98 = memory read'. So now they used base-10 numbers.
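Python's built-in conversions show that mapping directly; '01100010' and 98 are the same value written in two bases (the "memory read" meaning of that opcode is, of course, invented for the example):

```python
# The switch setting '01100010' and the mnemonic number 98 are the same
# bit pattern in different bases; the "memory read" meaning is hypothetical.
print(int("01100010", 2))   # → 98
print(format(98, "08b"))    # → 01100010
```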



    Quickly it became obvious that you could just map the numbers to even easier mnemonics 'memrd = 98 = memory read'. Programs started to be written in pseudo-English:



    memrd 1004 0 ; read memory location 1004 into register 0

    memrd 1005 1 ; read memory location 1005 into register 1

    add 0 1 ; add registers 0 and 1, place sum into 1

    memwr 1006 1 ; write register 1 into memory location 1006



    etc, etc



Doesn't look like much, but remember... we can perform *any* calculation with this little bit of ability. This kind of programming is 'assembly'; the binary it maps to is 'machine code'. A simple program called an assembler reads in a file that a human writes and performs trivial one-to-one translations to the binary numbers which the computer uses directly.
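A working toy version of such an assembler fits in a few lines of Python. The numeric codes below are invented for illustration (only 'memrd = 98' comes from the paragraphs above); a real assembler does essentially this same one-to-one lookup, emitting binary rather than lists of decimal numbers:

```python
# Toy assembler for the pseudo-English above. Opcode numbers are
# hypothetical ("memrd = 98" is from the example; "memwr" and "add"
# values are made up).

OPS = {"memrd": 98, "memwr": 99, "add": 32}

def assemble(source):
    """Translate mnemonic lines one-to-one into numeric instruction words."""
    words = []
    for line in source.strip().splitlines():
        line = line.split(";")[0].strip()   # drop the ; comments
        if not line:
            continue
        mnemonic, *operands = line.split()
        words.append([OPS[mnemonic]] + [int(x) for x in operands])
    return words

program = """
memrd 1004 0 ; read memory location 1004 into register 0
memrd 1005 1 ; read memory location 1005 into register 1
add 0 1      ; add registers 0 and 1, place sum into 1
memwr 1006 1 ; write register 1 into memory location 1006
"""
print(assemble(program))
# → [[98, 1004, 0], [98, 1005, 1], [32, 0, 1], [99, 1006, 1]]
```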



    And... guess what. Doing the above for anything other than uber-simple is *HORRENDOUS*. I've done it. It sucks.



    So along started to come higher level languages that abstracted out big chunks of the above type of programming and hid them behind even easier to remember mnemonics: 'if' 'for' 'while', and so on.



    Programs were written to translate from these higher languages into machine code... they're called compilers. They let the programmer deal with the abstractions, and let the computer do the grunt work of converting it to assembler... after all, that's what computers are good at.



    And it continues to this day. Object-oriented languages provide higher level abstractions that are (mostly) simply wrappings for older collections of abstractions which are wrappings for older collections of abstractions, and so on down until you hit those pesky, but vital, zeros and ones.
  • Reply 16 of 21
    Quote:

    Originally posted by Splinemodel

    I think Wrong Robot is going to have a seizure when he comes back to see what the hell happened to his innocent thread.



    . . . let's derail it some more.







    Dude, what is that. . . VAX?



    Everyone knows that 32bit machines increment the program counter by 4 and 64bit machines by 8. . . . but 10?



    Maybe I do need a girlfriend after all.





worked on Apple II, PDP-11, VAX 11/750/780, the comp sci pit terminals during first year at Uni...



    but maybe I'm dating myself
  • Reply 17 of 21
    Quote:

    Originally posted by curiousuburb

worked on Apple II, PDP-11, VAX 11/750/780, the comp sci pit terminals during first year at Uni...



    but maybe I'm dating myself




    I was actually making a quasi joke. . . .Compilers are for sissies.





Thanks for clearing it up, Kick. But you've gotta admit that my first post wasn't too far off base. It only went sideways when I started getting pegged with wild questions and silicon dreams.



    Anyway, the code really starts with voltage, which is just energy, and you can't take it deeper than that. Thank Volta.
  • Reply 18 of 21
    Quote:

    Originally posted by Kickaha

    No kidding. Jeez guys, way to confuse the poor guy...



    at least the universal coder-cherry-busting "Hello World" program doesn't need translation



    and having just finished The Code Book, although Turing deserves major praise,

    "Where Does Code Start" could go back to Babbage, Trithemius, and further.
  • Reply 19 of 21
kickaha Posts: 8,760 member
    Quote:

    Originally posted by Splinemodel

    I was actually making a quasi joke. . . .Compilers are for sissies.





Thanks for clearing it up, Kick. But you've gotta admit that my first post wasn't too far off base. It only went sideways when I started getting pegged with wild questions and silicon dreams.




    No, not bad - I prefer to start at the bottom and work my way up also - I find most people can understand simple logic "A and B or C", then handwave over the propositional logic = computation bit, then start giving them reasons why coding abstractions are good, and how it gets converted back down the chain at each step. Leave out any mention of the guts of a computer (like 64 bit memory access), other than 'CPU' and 'memory' since the rest is just implementation detail. Keep things high-level, and most people can follow it pretty well. If they want more detail, there's plenty out there for them to go get, but it won't do them much good unless they get the concepts first.



    Quote:

    Anyway, the code really starts with voltage, which is just energy, and you can't take it deeper than that. Thank Volta.



    Not up on your superstring theory, are ya?
  • Reply 20 of 21
kickaha Posts: 8,760 member
    Quote:

    Originally posted by curiousuburb

    at least the universal coder-cherry-busting "Hello World" program doesn't need translation



    and having just finished The Code Book, although Turing deserves major praise,

    "Where Does Code Start" could go back to Babbage and further.




    Babbage was a latecomer.



Babbage designed much of what we would consider a primitive calculating machine, but it had almost no formal basis: it was an engineering marvel, but not a theoretically sound construct. Turing, on the other hand, had the theory down *pat*... but the machine he proposed was impractical to build due to its *overly* simplistic nature.



Luckily, the experience with mechanical systems à la Babbage and Hollerith, combined with Turing's vision and timely advances in electromechanical relays, led to the first computing machines as we might recognize them.