ericpeets

About

Username
ericpeets
Visits
3
Roles
member
Points
95
Badges
0
Posts
99
  • The TextBlade keyboard is superb, but you'll have to be patient

    ericpeets said:
    Sorry, I meant "Comfortably Numb"



    Insufferably Glum

    Hello?
    Is there anybody in there?
    Just type if you can hear me.
    Is there anyone at home?
    Come on, now,
    I hear you're feeling down.
    Well I can't ease your pain
    Knock you off your feet again.
    Panics.
    I'll need some information first.
    Just the basic facts.
    Can you tell me where Mark blurts?

    There is such pain, your brain is bleeding
    A distant ship date on the horizon.
    Blather only coming through in waves.
    Mark's lips move but I can't hear what he's saying.
    When Text Blade launched I was a believer
    My hands felt just like two balloons.
    Now I've lost that feeling once again
    I can't explain you would not understand
    This is now what I am.
    Mark has become insufferably ho-hum.

    O.K.
    Mark's a little pinprick.
    There'll be tons more aaaaaaaaah!
    But you may feel a little sick.
    Can you stand up?
    I don't believe it's working, bad.
    Mark's excuses just mean a no show
    Refund it's time to go.

    There is such pain, your brain is bleeding
    A distant ship date on the horizon.
    Blather only coming through in waves.
    Mark's lips move but I can't hear what he's saying.
    When Text Blade first launched 
    I caught a fleeting glimpse
    Out of the corner of my eye.
    I turned to look but it was gone
    I cannot put my finger on it now
    The lies have grown, 
    The dream is gone.
    Mark has become insufferably "umm".


    Hi Alex,

    Thank you, thank you. This is one of my favorite songs. And it's funny how it always entered my head whenever WT posted...

    You truly are a genius!!!
    alexonline
  • The TextBlade keyboard is superb, but you'll have to be patient

    arkorott said:
    @alexonline: these are all awesome !
    You should get a Grammy

    BTW, through the DBK and ericpeets discussion I have a question, as I got intrigued: in a distributed system with several CPUs and controllers, how can you ensure they stay in sync? i.e., how do you avoid a given controller bringing the whole orchestra out of whack? How, and who, keeps the tempo according to spec?
    I will look it up in any case but perhaps somebody knows.

    The answer is, it depends. There are two types I've worked with, but there are probably more in other industries I'm not familiar with.

    First is what I call the distributed database model, where the main objective is to dole out info. A simple example is DNS. There are what are called root servers that only know how to answer part of the question, namely which servers handle .com, .net, .org, and so on. If you want to know more, they refer you to other servers with more detailed answers, on and on until you get to the authoritative server.

    And even the final authoritative servers are distributed. There are master and slave servers. The master holds the editable copy of the zone data, while the slaves replicate it, answer queries alongside it, and act as backups. You run enough servers to handle the query load, and if one dies the others carry on.
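
    Here's a toy sketch of that referral chain in Python -- mock data only; the names, addresses, and "servers" below are made-up stand-ins, not how any real resolver is implemented:

    # Each mock "server" either knows the answer or refers you onward.
    ROOT = {"refer": {"com.": "tld-com"}}
    SERVERS = {
        "tld-com": {"refer": {"example.com.": "ns-example"}},
        "ns-example": {"answer": {"www.example.com.": "93.184.216.34"}},
    }

    def resolve(name: str) -> str:
        """Walk from the root down the referral chain to an authoritative answer."""
        server = ROOT
        while True:
            if name in server.get("answer", {}):
                return server["answer"][name]            # authoritative answer
            for suffix, next_name in server.get("refer", {}).items():
                if name.endswith(suffix):                # follow the referral
                    server = SERVERS[next_name]
                    break
            else:
                raise LookupError(f"no referral or answer for {name}")

    print(resolve("www.example.com."))                   # 93.184.216.34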

    Web applications operate in a similar way. Perimeter servers figure out whether a page request is static, dynamic, and/or requires a database query. Once that's decided, they either serve the static page themselves or hand the request off to backend servers that complete it.

    In this case, you essentially dedicate one server to be a traffic cop directing you toward your destination (and your answer), and the rest either have the answer or are another cop with more detailed directions.
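
    A minimal sketch of that traffic-cop arrangement (the paths and handlers here are hypothetical, just to show the split between serving static pages and handing off the rest):

    STATIC_PAGES = {"/index.html": "<html>hello</html>"}

    def backend(path: str) -> str:
        """Stand-in for the backend servers that build dynamic pages or hit the DB."""
        return f"<html>dynamic page for {path}</html>"

    def perimeter(path: str) -> str:
        """The traffic cop: answer what it can itself, hand everything else off."""
        if path in STATIC_PAGES:
            return STATIC_PAGES[path]        # has the answer itself
        return backend(path)                 # another "cop" with more detail

    print(perimeter("/index.html"))
    print(perimeter("/grades?student=42"))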

    There's really no guarantee you'll find your answer, or even get there. That's why there's DNS hijacking, 404 pages, and so on. The vast nature of the distribution makes end-to-end checking impossible.

    The second model is the parallel computing model, where it's mainly about how to divide up computation. In scientific computing, for example, you have one server (or CPU) dedicated to figuring out how much computation something will take and how to divide that task among other servers (or CPUs).

    So, in this case you dedicate one server to be the factory manager, and the rest are grunts who are given their share of the task and do it.
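
    A minimal sketch of that manager/grunt split using Python's standard library -- the work itself (summing squares over chunks) is just a placeholder:

    from concurrent.futures import ProcessPoolExecutor

    def grunt(chunk):
        """Each grunt does its assigned share of the computation."""
        return sum(x * x for x in chunk)

    def manager(data, workers=4):
        """The manager sizes up the job, splits it, and combines the results."""
        size = (len(data) + workers - 1) // workers
        chunks = [data[i:i + size] for i in range(0, len(data), size)]
        with ProcessPoolExecutor(max_workers=workers) as pool:
            return sum(pool.map(grunt, chunks))

    if __name__ == "__main__":
        print(manager(list(range(1_000_000))))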

    Oracle, with grid computing, does something of both (distributed database + parallel computing).

    In this case, end-to-end checking is possible and even required.

    Now, how does that apply to the TextBlade? Since they claim to have four CPUs (cores or separate SoCs) when there are only three physical parts to it, I'm pretty sure they use something closer to the second model, where there's a manager CPU plus three grunts. The manager gathers reports from the grunts and communicates with the host computer. Each grunt does its share by listening for keypresses, figuring out the exact scancode, and relaying that to the manager. I think the manager plus one grunt reside in the spaceblade, for a total of two CPUs there.
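
    Purely as a guess-in-code, here's roughly what that manager-plus-grunts arrangement would look like (the queue, the scancode encoding, and everything else here are hypothetical stand-ins, not TextBlade's actual firmware):

    from queue import Queue

    def grunt(grunt_id: int, keypress: int, reports: Queue) -> None:
        """A grunt watches its own keys, works out a scancode, and reports it."""
        scancode = (grunt_id << 8) | keypress      # made-up encoding
        reports.put(scancode)

    def manager(reports: Queue) -> None:
        """The manager gathers the grunts' reports and talks to the host."""
        while not reports.empty():
            print(f"send to host: {reports.get():#06x}")

    reports = Queue()
    grunt(1, keypress=0x04, reports=reports)       # pretend key on one blade
    grunt(2, keypress=0x2C, reports=reports)       # pretend key on the other
    manager(reports)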

    But of course, that's my guess. What the hell do I know? I don't have a textblade.
    alexonline
  • The TextBlade keyboard is superb, but you'll have to be patient

    Well, let's see what info I'll reveal - because there are some people who would use such information to try to find out who I am. Which normally I wouldn't even care about, but considering some of the crazy things I've seen some people write, I think I'll be a bit more conservative about it. And if anyone does happen to know who I am, consider that a signal that I don't approve of them telling anyone else.

    First, it wasn't an Apple, though it was during the Apple II+ period. I just pointed out it was better than Apple programs I had seen (at least that weren't really expensive and thus out of the price range of the typical teacher). At the time, at least where I was, teachers were just starting to get computers for their use, but only a few. And they didn't know what to do with them!

    If I gave the BASIC name (or the computer name) it could narrow things down too much. I was not writing on the typical computers people thought of at the time. I didn't get an Apple until the Apple IIgs, long afterwards. BTW, this was also where I learned it wasn't how popular a computer was that mattered the most when it came to good software. It also wasn't how powerful it was either. It was the programming.

    ...

    The BASIC program would provide my menus - hit a key to show a particular class, or an individual student, etc, and it would call a MC routine to calculate and show the information.

    What are you trying to cover? What do you think it will reveal? It just seems you're throwing any detailed discussion into murky waters. I don't need to know what you named your BASIC program or how it operates in general, and possibly not even your system. Just the CPU you used, because I'm curious about how you wrote the machine language code. So, are you able to say what CPU it was? I'm familiar with several of the era: the Zilog Z80, Intel 8080 and 8088, Motorola 6800?

    If you can't even reveal that, then what year (approximately) did you write your machine code? If cassette tape was in use at the time, that's a short window. Cassettes were a temporary measure for the home market in the late, late '70s before being displaced by floppies in the very early '80s. So that window must have been around 1979-1980.

    There was a little peek and poke with my program, but as I recall, this was only for saving and loading. Since it used a cassette, normally a save operation would save the entire program plus data. Very time-consuming. So I came up with the trick of fooling the computer into thinking the program started where, in actuality, the data started so only that was saved (unless you wanted to save both together - that was a choice). When loading back in, it would just have to load the new data to replace whatever class you had finished (and saved).

    Peeks and pokes are ways to read and write values in memory. How in God's name did you use them to initiate a routine to record things and save? No wonder it was time-consuming. You can peek and poke every memory address and it still won't do anything except crash your computer.

    As far as cassettes are concerned, did you use parity and/or error checking? If so, it must have increased the size of your program and data three-fold (or more). If not, reading and writing anything must have been horrendously unreliable.

    My math was done by MC routines. At the touch of a key you could get the averages of each student in a class either with every grade counting the same or using the weighted grades if you created those. I do not recall the details now, but it seemed like a simple enough matter at the time to calculate the grades either way. Certainly no errors ever showed up when I finished. The MC also printed most of the stuff to the screen (with BASIC back then, you could see the lines being drawn one by one when you had a lot of data to display, especially if there were calculations going on at the same time).

    Forgive my ignorance, but what is 'MC'? Is this short for machine code? If by 'MC routines' you mean 'machine code routines', yes, that's what I'd like to talk about. You make only a passing reference to it, abbreviating it even. But it's those very 'MC routines' -- math routines of the kind that stumped Wozniak, Jobs, and the engineers at Apple for years, until Bill Gates dedicated a team to write them and finish out the full BASIC that became Applesoft -- that interest me. Another great milestone for the Apple II was VisiCalc, which is, if you think about it, nothing more than a large math package that happens to display interesting things if you enter some numbers.

    You seem to make light of your achievement. If you created specialized math 'routines' (however simple, and whatever CPU they were for -- if it was an 8-bit CPU used around the same era as the 6502), I'd love to know about them.

    Any math, arbitrary precision or otherwise, floating point or otherwise, was difficult to do on the Apple II. Those 8-bit CPUs, the 6502 and others of its era, can only add and subtract, one byte at a time, with a carry; there is no multiply or divide instruction at all. Anything more complicated than that, like 11 x 17 for example, requires you to write custom math routines. Multiplication is traditionally built from shifts and repeated addition; division from shifts and repeated subtraction of the divisor. Each method takes up valuable cycles. These days, with 5 GHz CPUs common, cycles are cheap. But in those days, they were precious.
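
    Here's a minimal sketch, in Python, of the shift-and-add multiply and shift-and-subtract divide a 6502 programmer would have to hand-roll (the real thing would be a few dozen bytes of machine code; this just shows the technique):

    def mul8(a: int, b: int) -> int:
        """8-bit x 8-bit -> 16-bit multiply built from shifts and adds."""
        product = 0
        for bit in range(8):
            if b & (1 << bit):
                product += a << bit        # add a shifted copy of a for each set bit
        return product & 0xFFFF

    def div8(dividend: int, divisor: int) -> tuple[int, int]:
        """8-bit divide by shift-and-subtract; returns (quotient, remainder)."""
        quotient, remainder = 0, 0
        for bit in range(7, -1, -1):
            remainder = (remainder << 1) | ((dividend >> bit) & 1)
            if remainder >= divisor:
                remainder -= divisor
                quotient |= 1 << bit
        return quotient, remainder

    print(mul8(11, 17))        # 187
    print(div8(200, 7))        # (28, 4)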

    The crude methods are stable, but too slow to be practical. Or you can go all out with bit-flipping trickery like Bill Atkinson did when he wrote QuickDraw, but that requires you to be at the top of your Boolean math game. You're not Atkinson, Kahuna. But that's okay... no one is. Thus most methods land somewhere in between -- they have to. I want to know which way your 'MC routines' lean.

    I didn't do decimals or fractions, which was probably the only limitation compared to the typical programs like mine back then (actually, most didn't do fractions). Grades were whole numbers and could be from zero to 254 as I recall, though, obviously few would need that! But it was just as easy to go to that as to go to 100 and it allowed for flexibility - and certainly for ordinary extra credit on a test.

    You don't understand the meaning of decimal, which means powers of 10, versus binary, which is powers of 2. So, you didn't do floating point, just whole numbers. You only used 0-254 -- written in decimal, by the way. Why 0-254 and not 0-255? Did you do negative numbers? One-byte numbers covering both positive and negative limit you to -128 through +127 (in two's complement). My first question is: how did you display those numbers?

    You make it seem like if you just throw 0xFF (hex) into display memory, it automagically appears as 255 (decimal). How did you convert the binary into a decimal number, and vice versa? How did you add these binary numbers, subtract them, multiply them, divide them? And how did you store them, convert them to decimal, and display them?

    Again, the CPU doesn't know how to do any of that. None of it happens automatically. It all has to be meticulously programmed. I just want to know how you programmed it. You don't need to show me any source code. I can fill in the opcodes if you can give me a concise description, maybe even a loose one.
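
    For instance, here's a minimal Python sketch of the kind of conversion being asked about -- turning one raw byte into the ASCII digits you can put on screen, using nothing fancier than the compare-and-subtract an 8-bit routine would use:

    def byte_to_decimal_ascii(value: int) -> bytes:
        """Convert one byte (0-255) into its decimal digits as ASCII characters."""
        assert 0 <= value <= 255
        digits = []
        for divisor in (100, 10, 1):
            count = 0
            while value >= divisor:        # repeated subtraction instead of division
                value -= divisor
                count += 1
            digits.append(0x30 + count)    # 0x30 is ASCII '0'
        return bytes(digits)

    print(byte_to_decimal_ascii(0xFF))     # b'255'
    print(byte_to_decimal_ascii(95))       # b'095'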

    No idea, at this point, what memory area the MC occupied.

    In the case of the Apple II, the ROM reserves chunks of memory for devices and for lo-res and hi-res graphics, and BASIC reserves a big chunk for its runtime. You would have had to divide the userland memory into two parts: one for your BASIC program and one for the machine language code. After all that, there are really only a few places where you can safely store anything without clobbering everything else, even with the maximum 48K.
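
    For reference, a rough sketch of how a conventional 48K Apple II map gets carved up (these are the commonly cited regions, not anything specific to his machine, and the boundaries are approximate):

    MEMORY_MAP = [
        (0x0000, 0x00FF, "zero page (heavily used by the ROM and BASIC)"),
        (0x0100, 0x01FF, "6502 stack"),
        (0x0300, 0x03CF, "page 3 -- a classic spot to tuck a small ML routine"),
        (0x0400, 0x07FF, "text / lo-res screen memory"),
        (0x0801, 0x95FF, "Applesoft program, variables, and strings (roughly)"),
        (0x2000, 0x3FFF, "hi-res page 1 (overlaps the BASIC area if used)"),
        (0x9600, 0xBFFF, "DOS, if booted from disk"),
        (0xC000, 0xFFFF, "I/O space and ROM"),
    ]

    for start, end, what in MEMORY_MAP:
        print(f"${start:04X}-${end:04X}  {what}")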

    But you mention you had 16K, which makes me wonder where in the world you hid your machine language code?

    I'll stop here. There's just too much that says nothing and raises more questions than it answers, even after sifting through all the self-congratulatory comments, which are moot since he won't reveal any details of the system.

    Kahuna, your babbling post tells me (and any programmer out there) that you have no idea what you're talking about. Not only can you not tell me anything about your math 'routines', it's obvious you don't even know some basic programming terms, yet you throw them about freely in the wrong contexts. You don't seem to know the difference between machine language coding and assembly. In all my years, no one has developed machine code using manual input, peek-and-poke, and an assembler all at once -- and I'm sure you don't even know why.

    If there is a morsel of truth in what Kahuna posted, any programmer out there please tell me, because I certainly can't find it.




    alexonline
  • The TextBlade keyboard is superb, but you'll have to be patient

    TextBladeDenied said:

    Cool. Post some of your old 'BASIC AND machine code' on Github so we can check it out.
    Not likely since the computers I wrote for were decades ago, stored on tape which would have long since worn out, and I don't have the paperwork anymore.

    You mean cassette tapes? Wow. That's really ancient. You're talking about the first year (or less) of the Apple II, since floppy disk drives came out fairly quickly afterward. Which tells me a few things: you must have only had access to Integer BASIC then, and there were virtually no programming tools on the market.

    (hint: Mine were faster, 
    Which part, and how? I'm just curious about the math part. Character output and keyboard input I'm sure were lightning fast (for the day).

    held more data
    That's not really up to the program you wrote. It just depends on how much memory you had (which must have been 48K). I'm assuming you're not referring to storage, which would be negligible if you only had cassette tapes.

    , and structured into a more useful form on screen).
    What the heck does this mean? Are you talking about your UI? How pretty things are shown on screen? Normally, programmers talk about structures when referring to the data, not how something manifests on screen.
    alexonline
  • The TextBlade keyboard is superb, but you'll have to be patient


    Wrong again. Sure, I wrote some things in basic. So? If it gets THE JOB done, so what? One of the first things I learned was that there were proficient/expert programmers out there who couldn't write a program that really met the needs of the end user. I'd give examples, but I don't think you'd pay attention. I always focused on the end user and then figured out how to accomplish what they needed.

    But what I mostly did - and have posted about - is write in BASIC AND machine code for the best mesh of each (BASIC being simpler/quicker to do, but machine code being faster and often requiring less memory).

    It was really quite effective. Using a BASIC line for "Input name" is close enough to doing the same thing with machine code. But machine code paid off in a big way when sorting or calculating, etc. Even had a few full time programmers ask to use some things I worked out.

    Ah yes, this reminds me of the discussion you and I had many moons ago.

    You never did answer my questions about how you accomplished the things you claim. Just to be sure, you're claiming you wrote some grading program (a mix of BASIC + machine language code) on your Apple II that did some math for you (IIRC).

    My questions were:

    • Which model of Apple II was it? Just II, or II+, or IIe?
    • Which version of BASIC did you use? The Integer BASIC or Applesoft?
    • Which portion (BASIC or the machine language code) did the math calculations? Since there was this speed increase, I'm assuming the machine language code did. So then...
    • How did you enter the machine language portion? Did you punch it in by hand as hex through the monitor? What assembler/disassembler did you use? Or did you POKE each byte of the machine language code from BASIC?
    • How the heck did you debug the machine language code?
    • How did the BASIC program communicate with the machine language portion and vice versa?
    • What memory area (approximately) did the machine code occupy?
    • How were you able to do decimal math using binary? How did you store the numbers, and what were the precisions, especially for the fractional parts?

    The reason I ask the last one is that doing decimal math in binary is not for the faint of heart. It takes a lot of coding and bit-flipping wizardry to create routines that are accurate, stable, and fast. Nowadays there are math packages and libs you can simply load and call, but for most of the Apple II's life there was nothing like that, meaning you pretty much had to roll your own. Nor was there any sort of math coprocessor for it, so the CPU (the 6502) had to do it all.
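
    To make that concrete, here's a minimal Python sketch of why even plain addition takes work once numbers are wider than 8 bits: you add one byte at a time and carry by hand, the way the 6502's add-with-carry chains (illustration only, not anyone's actual routine):

    def add16(a: int, b: int) -> int:
        """Add two 16-bit values using only 8-bit adds plus an explicit carry."""
        lo = (a & 0xFF) + (b & 0xFF)
        carry = lo >> 8                          # did the low bytes overflow?
        hi = (a >> 8) + (b >> 8) + carry         # fold the carry into the high bytes
        return ((hi & 0xFF) << 8) | (lo & 0xFF)

    print(add16(300, 700))                # 1000
    print(hex(add16(0x01FF, 0x0001)))     # 0x200 -- the carry ripples up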

    Even Steve Wozniak was asked several times by Jobs to create a floating point version of his Integer BASIC, but Wozniak never did. Jobs had to resort to paying Bill Gates and a team of engineers a lot of money to write Applesoft BASIC for him. Wozniak said the scope of doing that was beyond his attention span (which he approximated as more than a month). I mean, this guy built the Apple II's disk interface AND the disk operating system in about a week. So he could have written an FP version of BASIC, but he figured it would take longer than he cared to spend.

    How do I know? Because I asked him. In my freshman year in college, I had to write a 6502 emulator, which itself is not hard, except I was given cycle budgets for the instructions and had to match the clock speed of the Apple II they had in the lab for us to take apart. Wozniak, being a friend of the prof, suddenly appeared one day to do an impromptu AMA session for 30 minutes. By that time, I had the emulator pretty much finished and was looking to emulate the rest of the Apple II.
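
    Just to illustrate the flavor of that assignment, here's a minimal cycle-counting fetch/execute loop in Python that handles a handful of opcodes (the cycle counts for these four are the real ones; everything else about the "machine" is simplified):

    CYCLES = {0xA9: 2, 0x69: 2, 0xEA: 2, 0x00: 7}    # LDA #, ADC #, NOP, BRK

    def run(program: bytes):
        """Fetch, decode, execute until BRK, tallying cycles as we go."""
        a, carry, pc, cycles = 0, 0, 0, 0
        while True:
            opcode = program[pc]; pc += 1
            cycles += CYCLES[opcode]
            if opcode == 0xA9:                       # LDA #imm
                a = program[pc]; pc += 1
            elif opcode == 0x69:                     # ADC #imm (add with carry)
                total = a + program[pc] + carry; pc += 1
                a, carry = total & 0xFF, total >> 8
            elif opcode == 0xEA:                     # NOP
                pass
            elif opcode == 0x00:                     # BRK -- stop the sketch here
                return a, cycles

    # LDA #$0B ; ADC #$11 ; BRK  ->  A = $1C (28) in 11 cycles
    print(run(bytes([0xA9, 0x0B, 0x69, 0x11, 0x00])))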

    That's why I'm kinda fond of the 6502 CPU, or rather the Apple II. So, just out of curiosity, I'd like to know how you did the things you keep claiming. I'm not asking you to produce the machine language math code, since I have my own... somewhere, I think, on my NAS. I'm just asking how you approached such a project: just tell us, even in theory, how you managed to create something that eluded even Wozniak -- a bona fide genius, by the way, even if a bit impatient.

    edit: added spacing to questions to make them clearer
    alexonline