But he was wrong about a bunch of things, some of which are easy to show, such as wondering why I didn't save variables using BASIC when that option didn't exist on my system.
You are making things up that are easily fact checked.
There was never a version of BASIC implemented on a kit computer or micro computer that did not offer some means of storing a value in a variable. That was like half the point of BASIC, to be able to write things like y = 2; for x = 1 to 10; print x * y; next x
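That BASIC one-liner has a direct equivalent in just about any later language; a quick sketch in Python, purely as a modern illustration:

```python
# Equivalent of the BASIC: y = 2; for x = 1 to 10; print x * y; next x
y = 2
for x in range(1, 11):  # BASIC's "1 to 10" is inclusive on both ends
    print(x * y)
```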
dabigkahuna said: So, yes, he could CLAIM and I can claim. But I can back it up.
Or you can claim secret knowledge which no one else possesses based on details you are alleging you must withhold in order to protect your identity (the make of computer and version of BASIC you were using).
You are intentionally withholding key details that would disprove your claims, which is the opposite of backing up a claim.
> You are making things up that are easily fact checked. There was never a version of BASIC implemented on a kit computer or micro computer that did not offer some means of storing a value in a variable.
Thank you Mr "I don't keep up on context"!
Because it wasn't about STORING variables in a program. It was about SAVING ONLY the variables to cassette, without also saving the BASIC program itself. It saved at least 50% of save/restore time.
You even made three links to something that wasn't even about the issue!
> You are intentionally withholding key details that would disprove your claims
Logic check - if you don't know what the data is, how the heck do you know it would disprove anything? Gotcha!
Besides, as already pointed out on the matter of "ADD", you don't need to know what CPU I use. I already pointed out you can search for:
assembly ADD
And find hundreds of millions of results! So you don't need to know what I use to know it exists.
I have “The Orville” on the brain; I love that show. I actually saw a TextBlade being used on the bridge in one of the episodes. If you squinted, there was a reference somewhere that it finally launched in 2398, after extended firmware rewrites and several thousand generations of new Li-Ion batteries replacing the original batch. The key caps and such lasted all those hundreds of years of extended re-development, and the magnets stayed magnetised, but the batteries died thousands of times in the intervening centuries due to flux capacitor problems.
Of course I meant Orwell when I spoke of Orville, but it’s too late to edit it now. I knew it didn’t seem right but I was in the midst of things when I wrote it. Long live The Orville!
Maybe DBK is Da Big Kaylon? Those super fast fingers would produce amazing WPM test results :-)
I’m guessing Rolanbek means the stuff about trying to pin DBK for his lack, or otherwise, of the knowledge he does or doesn’t have about programming, BASIC, machine language, etc.
I can see it would seem delicious to catch DBK in some kind of untruth about programming, but as you have been able to see, DBK can explain and rationalise anything, whether he does or doesn’t know anything about it.
Well you chaps are doing a decent enough job, but you are just going to end up in the same semantic moon logic that DBK lives in.
He has not specified enough detail for you guys to actually challenge him on a specific, for example the chip, and through that he will retain plausible deniability. If you are going for the pin, just tell him that withholding information to ensure an argument from ignorance is always available is a disingenuous move, and that the burden of proof is on the claimant. The claims being what was said in his WTF bio and, in various places, about a "grading program".
It's never about the argument, and he never makes a relevant point that can't be kicked over in an instant, it's always about prolonging the argument.
It's pretty clear that DBK falls into the 'enthusiastic amateur' bracket here, even if at several decades' remove.
I was updating an old Microsoft Surface computer, and in the documents folder I found a file called "Blade Delay". I think this must have ended up at the TextBlade Reddit, because I think I was already shadow banned from the WTF before I wrote this.
Given I have been re-writing songs, and have clearly been doing it for a long, long time, I thought I might as well update the AI Forum with this re-write of Aerosmith's Walk This Way, which I originally wrote on 25 August, 2017.
The sentiments towards DBK were the sentiments at the time, they are preserved for historical reasons and do not reflect my current sentiments towards DBK. Apologies to DBK for the historical diss, I'm guessing DBK remembers when this was originally published and that it would have been TextBlades off a Tregger's back a long time ago. Apologies again, I do not want to fall foul of AI's rules, this is merely a historical reprint.
It's clear that the Textblade delays have been annoying people for a LONG LONG LOOOOOOOOOOOONG time, and MK's extended delays have been a laughing stock for quite a while now!
"Blade Delay"
Textblade lover
Waytools hidin' 'neath the covers
'Til I talked to your Knighton, he say
He said, "You won't see nothin'
'Til we stop all our puffin
Then you're sure we be a-changin' our ways."
I met a cheerleader,
Was a Big Kahuna bleeder
Oh, the times I could him diss
'Cause the worst BS he shovin'
With his Tregblade or a dozen
Only started with a timeframe miss
Weak piss!
Seesaw swingin' with the dates of the launch
And your ship date up up in the air
Singing, "Hey fiddle diddle"
On your timeframes do you piddle
Screwing us like you didn't care
So I took a big chance
Said my piece in the rants
With some satire, I was ready to play
Wasn't you all I was foolin'
'Cause I knew what I was doin'
As I yearned to just type my way
When I found out there was a...
Blade delay
Type no way
Blade delay
Do not pray
Knighton say
Blade delay
Waytools say
TextBlade nay
Just timeframe miss
Weak piss!
Textblade nerdy is a gadget kinda sassy
Little Textblade's beggin' to be set free
There was three young Treggers in the Waytools forum
When I noticed they was laughin' at me
I was a TextBlade wanter, never made it to a tregger
'Til the Treggers told me somethin' I missed
Then my next door neighbor with a Textblade had a favor
@alexonline: Wow, you have talent. Another Grammy for you. You should post all the songs someplace. They are fun.
Hope the guys drop the argument on the grading program. That was not so much fun to watch, but it reminded me of a lot of forgotten things I did long ago, and questions came up like: are compilers so efficient now that nobody fools around in assembly anymore?
At that time I really hated that the simplest thing in assembly consumed so much time and effort, but it was sometimes an unavoidable pain because BASIC was so slow. Nowadays, it would seem, only a masochist would invest the effort when you have languages like Swift 5 and Kotlin 1.3, very efficient compilers and IDEs, and an incredible amount of horsepower under the hood.
I guess only people working on device drivers / firmware need to get that close to the metal, correct? edit: I do not believe people would touch it for 3D animation either, as it is complex with all the math and matrix manipulations, and there are so many tools available.
I didn't like assembly then, and I don't like it now. (edit: it is as much fun as reading a phone book)
Then if that is a decent enough job - basically ignoring what I said, changing the context totally, and making assumptions, I'd hate to see them do a bad job!
> As he has not specified enough detail for you guys to actually challenge him on a detail, for example; the chip, and through that he will retain plausible deniability.
Yet they did challenge me on things which they clearly made assumptions about anyway. Which goes to my point about much of what they post about anything.
> the burden of proof is on the claimant.
I didn't make my programming an issue. So it is their claims about how hard it would be, what commands are possible in MC, etc, that have to be proven then.
I never questioned what they could do with programming. Maybe they made up their stuff, but I sure didn't make it part of my arguments. I addressed what they actually said that was factually wrong.
But, I think you did the best job you could in covering for their ineptitude in their criticisms.
Retired teacher. Self-taught programmer (decades ago) using BASIC and machine code depending on need: Gradebook program, crossword puzzle solver, cryptogram solver, etc. Also developed improvements to existing programs, primarily a database program. Beta tester for AppleWorks. Developed an Access database to automate auditions. Other stuff I can't recall at the moment. Everything I did was to solve my own needs as a teacher originally, but then made it available to others.
Eric would presumably say he could back it up too, and that claims are claims.
But he won't back it up. But let's take a recent one and see what happens.
I'm not the one making claims. You're the one who needs to "back it up." Debunking your claim is not a claim. Nevertheless, I spent a great deal of time explaining why your story is so full of holes, if not downright admission of your utter ignorance. Try re-reading my posts which includes: sample code, links and detailed explanations. Did you miss them? Here's a small sample:
Post #841:
How do I know? Because I asked him. In my freshman year in college, I had to write a 6502 emulator, which itself is not hard except I was given limited cycles for the instruction codes and I had to match the clock speed on the Apple II they had in the labs for us to take apart. Wozniak, being a friend of the prof, suddenly appeared one day to do an impromptu AMA session for 30 min. By this time, I had the emulator pretty much finished and was looking to emulate the rest of the Apple II.
“In the case of Apple II, the ROM reserves a chunk of memory for devices, lo-res and hi-res graphics, and BASIC reserves a big chunk for runtime. You would have had to divide userland memory into two parts: for your BASIC program and machine language code. After all that, there's really only a few places where you can safely store anything without clobbering everything, even with the maximum 48K.”
You don't understand the meaning of decimals, which means power of 10, versus binary which is power of 2. So, you didn't do floating point, just whole numbers. You only used 0-254 -- which are decimals, by the way.
You obviously didn't understand any of the explanations, so I even bolded and quoted words, and formatted things as bullet items. What more do I need to do, Kahuna? Decorate them with emojis and illustrations? Is your reading comprehension that bad?
And remember, I'm not the one with the burden of proof. You have made your claim several times in the past, but somewhere along the way you became brazen enough to attack people with it, not with any sort of wisdom, mind you, but with more hardened ignorance. To paraphrase the ZDNet writer: "If it wasn't good enough to remember, it wasn't good enough to brag about it."
I wrote: "Even similar programs that only allowed whole numbers (no fractions) for grades - and that was almost all of them - were storing the grades as floating point numbers."
His response:
> Floating point is another beast altogether -- something which further proves the limitless bottom to your ignorance. Besides, you said you didn't do any floating points.
How long should we wait for him to back up his criticism? Maybe you could help him out?
Ah floating point. Yes, let's talk about that even when you said you didn't do them in your program. What do you know about doing floating point in binary that you were confident to compare other programs that did them (using 5 bytes) vs your limited (0-254 number range) grading program? Yours didn't do "fractions" or even "decimals" at one point.
Since it is true that it took 5 bytes on my computer, what is he basing his made-up ridicule on? Is he going to say it isn't 5 bytes? If so, that would be another mistake since it would probably only mean that it was different on whatever systems he uses.
Why 5, when the industry standard is 4? Not just on my computer. The standard format for single-precision floating point numbers uses 32 bits in this format:
1 bit for the sign of the number.
8 bits for the exponent.
23 bits for the mantissa.
(wah! wah! You can't back it up!!) Here's a link that explains the standard and how to put it to use (hint: the standard is called IEEE-754): https://www.h-schmidt.net/FloatConverter/IEEE754.html
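The 1/8/23 split is easy to check in any language that can reinterpret a float's raw bits; a quick sketch in Python (the helper name is mine):

```python
import struct

def float_bits(x: float) -> str:
    """Return x as an IEEE-754 single, split into sign / exponent / mantissa."""
    (raw,) = struct.unpack(">I", struct.pack(">f", x))
    bits = f"{raw:032b}"
    return f"{bits[0]} {bits[1:9]} {bits[9:]}"

print(float_bits(1.0))   # 0 01111111 00000000000000000000000
print(float_bits(-2.0))  # 1 10000000 00000000000000000000000
```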
What kinda messed up BASIC did you use that needed five bytes? Is it the same one that couldn't do variables, so you had to dump them manually into memory and redump onto cassette? Is it the same one you programmed your grading program on? You can't say.
But still, did you forget where you said:
If I saw a bunch of things like "Goto 10", I'd set up something like "Let A = 10". Then everything else would be "Goto A". One byte, versus the 5 for the number 10 being floating point, and that number would also take two bytes for each digit - so I saved 6 bytes each time.
This is such a dumb statement that it tells me you never even programmed in BASIC; until now I was willing to give you the benefit of the doubt. This tells me without a doubt that you don't even understand the most fundamental parts of BASIC and programming in general.
A "GOTO" would be converted to a token before execution, a one-byte hex token, and '10' would also be tokenized into a one-byte value of '0xa'. So that's two bytes. It doesn't store the "10" as five bytes!! Unless the author was brain damaged. And no programmer would think that, unless they were more severely brain damaged.
Anyways, "Let A = 10" would create another 1-byte token for "Let", create a variable storage for "A", a two-byte pointer to where "A" is, create a 1-byte value of '0xa' and store it there. Now, "GOTO A" would have to fetch the 'A' each time you use them, then expand each one to "GOTO 10". Not only are you wasting memory by using 'A', you also slow down the process because the variable has to be expanded each time. How you think this saves you memory or time is beyond me.
It's much faster to use a constant rather than a variable, as that would not require expansion, nor use unnecessary cycles to fetch the same value over and over. You'd know this if you ever programmed anything at all.
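Whatever the byte counts were on the disputed machine, the constant-versus-variable trade-off is still visible today. A sketch using CPython's bytecode, purely as a modern illustration (not the BASIC in question): a literal expression compiles down to a direct constant load, while a variable costs an extra store plus a lookup on every use.

```python
import dis

def with_constant():
    return 10 + 10

def with_variable():
    a = 10
    return a + a

const_ops = [i.opname for i in dis.get_instructions(with_constant)]
var_ops = [i.opname for i in dis.get_instructions(with_variable)]

# CPython folds 10 + 10 into a single constant at compile time; the
# variable version needs a STORE_FAST once and a LOAD_FAST per use.
print(const_ops)
print(var_ops)
```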
Just as he got it wrong about LD A and ADD because they were different or didn't exist on what he was familiar with. As such, making such an argument would only show he is ignorant.
Just be clear. Was 'ADD' a part of your CPU's instruction set? Or was it a part of the 'app' that you did your MC on? If former, I know of no 8-bit CPU that had it built in. If it was part of your 'app' then it may have been a macro, one where it converts several native CPU instructions to give you the 'ADD' instruction.
Also 'LD A' is different from 'LDA'. I guess you can't tell the difference. The former would mean it's a two-byte opcode -- impossible on an 8-bit CPU (only on a 16-bit CPU), whereas 'LDA' is an assembly mnemonic for a 1-byte opcode.
10 *= $0600
0100 ; CIO
0110 ICHID = $0340 ;IOCB 0 S:
0120 ICCOM = $0342 ;IOCB Command
0130 ICBAL = $0344 ;Xfer Buffer Adr
0140 ICBAH = $0345
0150 ICPTL = $0346 ;PutByte Adr
0160 ICPTH = $0347
0170 ICBLL = $0348 ;Buffer Len
0180 ICBLH = $0349
0190 CIOV = $E456 ; CIO Vector
0500 ; Setup CIO Call
0510 LDX #0 ;IOCB 0
0520 LDA #9 ;Put Cmd Val
0530 STA ICCOM,X ;Set it as the cmd
0540 LDA #HELLO&255 ;Str low byte
0550 STA ICBAL,X
0560 LDA #HELLO/256 ;Str high byte
0570 STA ICBAH,X
0580 LDA #0 ;Str Len low byte
0590 STA ICBLL,X
0600 LDA #$FF ;Str Len high byte
0610 STA ICBLH,X
0620 ; Call CIO
0630 JSR CIOV
0640 RTS
1000 HELLO .BYTE "Hello World!",$9B
Wow, that was fun, right? edit: the guy read 3 BOOKS to do that....
In a way, it was more challenging to do assembly on earlier chips like the 6502. Today's chips have cycles to burn and can do advanced instructions that handle complex tasks. We're now spoiled with better IDEs, actual operating systems with fully documented APIs, etc. Here's a similar task in Linux:
section .text
global _start ;must be declared for linker (ld)
_start: ;tell linker entry point
mov edx,len ;message length
mov ecx,msg ;message to write
mov ebx,1 ;file descriptor (stdout)
mov eax,4 ;system call number (sys_write)
int 0x80 ;call kernel
mov eax,1 ;system call number (sys_exit)
int 0x80 ;call kernel
section .data
msg db 'Hello, world!',0xa ;our dear string
len equ $ - msg ;length of our dear string
Just set up a string and make a call! What wusses we've become.
In the 8-bit days, you could tell a real man from the girls. We didn't need stinking high level languages like BASIC or even assemblers. We'd compose everything in hexadecimal in our heads and punch them in with our toes on a keypad. Or better yet, flip DIP switches. We'd eat KIM-1s for breakfast. We could make Altairs sing with paper tapes. Real men used cassette, no stinking floppies. No one would ever need more than 16k!
> Nevertheless, I spent a great deal of time explaining why your story is so full of holes
Not really. Let's look at your poor examples:
I told you early on that mine wasn't a 6502. So it doesn't matter what Woz told you about that one.
> After all that, there's really only a few places where you can safely store anything without clobbering everything, even with the maximum 48K.
Yet, in just 16K, I had probably something over 6K left for data. That's why it was so important to save grades in 1 byte each.
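For what it's worth, the one-byte-per-grade idea itself is straightforward to sketch; here it is in Python (the names and the 255 sentinel are my own illustration, not DBK's stated design):

```python
# Each grade is a whole number 0-254, so one byte each; 255 could be
# reserved as a "no grade" sentinel (an assumption for this sketch).
grades = bytearray([98, 87, 100, 0, 254])

print(len(grades))  # storage used: one byte per grade
print(max(grades))  # top of the allowed range
```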
> You don't understand the meaning of decimals, which means power of 10
Fibbing by leaving out context is still fibbing. I told you I was referring to the grades not including decimals or fractions. That is, you could give a grade of 99, but not 99.2, 99-1/2 or anything like that. That was my context. You are free to use your context, but only for what YOU are saying. Oh, and in your post I'm responding to, you wrote: "Yours didn't do "fractions" or even "decimals" at one point." So you knew my context. Gotcha.
> I'm not the one with the burden of proof. You have made your claim several times in the past, but somewhere along the way you became brazen enough to attack people with it
Really? Yet you brought up my grade book here and made it an issue - in post 841 which you referenced above so you must have seen it!
> What do you know about doing floating point in binary that you were confident to compare other programs that did them (using 5 bytes) vs your limited (0-254 number range) grading program?
LOL, the only thing I needed to know about floating point for my program was storing grades in it would take 5 bytes instead of the single byte my way required!
> Why 5, when the industry standard is 4?
That's a trick question, right? Because the "why" is simple enough. Because that is what MY computer used. I couldn't care less what yours used because I wasn't writing on your computer. Do you think people writing for computer system "A" should write it based on how computer system "B" works? So your link to what OTHER computers use is meaningless.
It doesn't even take much work to find this:
vvvvvvvvvvvvvvvvvvvvvv
Representation in the C-64
There are basically two different layouts for how floats are stored:
for calculations with float registers, in some kind of expanded layout of 6 or 7 bytes;
for variables, compacted into 5 bytes.
^^^^^^^^^^^^^^^^^^^
Gotcha.
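The quoted 5-byte variable layout is the Microsoft binary format the C-64 used: one excess-128 exponent byte, then four mantissa bytes, with the sign bit replacing the implied leading mantissa bit. A rough Python sketch of that encoding (my reconstruction from published C-64 documentation, not anyone's posted code):

```python
def cbm_float(x: float) -> bytes:
    """Encode x into the C-64 BASIC 5-byte float layout:
    1 excess-128 exponent byte + 4 mantissa bytes, with the sign bit
    replacing the implied leading mantissa bit."""
    if x == 0:
        return bytes(5)
    sign = 0
    if x < 0:
        sign, x = 0x80, -x
    # Normalize the mantissa into [0.5, 1), tracking the exponent.
    exp = 128
    while x >= 1.0:
        x /= 2.0
        exp += 1
    while x < 0.5:
        x *= 2.0
        exp -= 1
    mant = int(x * (1 << 32))          # 32 mantissa bits
    m = mant.to_bytes(4, "big")
    first = (m[0] & 0x7F) | sign       # implied leading 1 replaced by sign
    return bytes([exp, first]) + m[1:]

print(cbm_float(10.0).hex())  # 8420000000
print(cbm_float(1.0).hex())   # 8100000000
```

So yes, a whole number like 10 really does occupy five bytes when stored as a BASIC variable in this format, which is the crux of the 5-versus-1-byte argument.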
> What kinda messed up BASIC did you use that needed five bytes? Is it the same one that couldn't do variables
You're fibbing again. Considering that I mentioned saving the main data in a D$ string (you know, one of those variable things), I also repeatedly talked about saving that data (which was in variables) to cassette without having to include the BASIC program itself every time. So you know darn well I never said it couldn't use variables, because I was using them. As for what did use 5 bytes, see above about the C-64.
> you also slow down the process because the variable has to be expanded each time.
First, you are unqualified to comment on anything in my basic program because you obviously are basing everything you say on some other system.
But as for slowing down the process, yep, it sure did. I knew that at the time. But it was used in areas of the program where speed wasn't important. So the memory savings were more important.
> It's much faster to use a constant rather than a variable
Well, I guess you could say it was a constant - since the ones I was using for that part of the program were set from the start and never changed. Feel better?
> If former, I know of no 8-bit CPU that had it built in. If it was part of your 'app' then it may have been a macro, one where it converts several native CPU instructions to give you the 'ADD' instruction.
Yet I already told you that a search will find hundreds of millions of matches: assembly ADD
You seem to be basing what you say on your lack of knowledge about other systems and not doing simple research. It would be one thing if you thus remained silent, but no, you just claim I'm wrong. It's like trying to explain snow to someone who lives in a place where it is always hot. Some of them simply refuse to believe it and choose to be ignorant.
> Also 'LD A' is different from 'LDA'. I guess you can't tell the difference.
Really? No, you just fibbed. Because I specifically referred to "LD A", to which YOU then said (in post 927): "LDA (not LD A)". Maybe you couldn't tell the difference. Gotcha again.
Thank you for the opportunity to detail your false and misleading statements though. It was worth the wait.
Retired teacher. Self-taught programmer (decades ago) using BASIC and machine code depending on need: Gradebook program, crossword puzzle solver, cryptogram solver, etc. Also developed improvements to existing programs, primarily a database program. Beta tester for AppleWorks. Developed an Access database to automate auditions. Other stuff I can't recall at the moment. Everything I did was to solve my own needs as a teacher originally, but then made it available to others.
That's a lot of big words he wrote checks for that can't be cashed. It certainly reveals more about himself than the computer, CPU and version of BASIC he claims to have used to write a grading program. Now I see that he also wrote a crossword puzzle solver and cryptogram solver!
If you're going to go that far, why not add operating system onto the list? Faster Linux using only BASIC + MC routines (sic) as needed. I punched the whole code using nothing but hex codes I memorized. I could have been a millionaire many times over but I lost the cassette tape it was on.
But believe me, I wrote things even Steve Wozniak could not. I just can't remember what they were. It was much more disruptive than VisiCalc except it wasn't a spreadsheet. Can't tell you what it was, but it was a hell of a lot faster than VisiCalc, and it wasn't on the Apple II. If you ask me more than that, I'll call the police on you for privacy invasion, since that means you want to know more about me than the program.
I'm not going to tell you anything about me, except where I live, what I did in the past, what I'm doing now, and what a lovely time I'm having with my Textblade, what my typing speed is (in real time if you want)...
Stephen Colbert would bring the whole issue of "truthiness" into the equation.
I aced Pascal as a Yr 9 student in a Yr 10 class, but that's the extent of my programming skills (despite dabbling in very basic Basic as a kid in the early 80s), so I'm in no position to judge the truthiness or otherwise of either side.
We know both sides hate each other to death (or so it seems), so perhaps the only thing left to do is to agree to disagree, and to get back to wondering when the TextBlade update will come this month, or if it will come, seeing as keeping to deadlines just isn't something MK is known for.
Perhaps one observation to make, whatever it's worth (or worthlessness), is that DBK has not mentioned he is also a beta tester for TextBlade, despite (presumably proudly) mentioning he was a beta tester for AppleWorks.
One might have imagined that being a Tregger is something DBK would have proudly mentioned, given he has probably been beta testing TextBlade for years longer than AppleWorks, but I can only make vague suggestions here as to how proud DBK is or isn't about TextBlade beta-treggerhood, as I am not DBK, don't live in Hawaii, don't program in BASIC and only enjoy it when Arnie portrays machine language in Terminator hardware form.
Less than two weeks to go before we discover whether MK releases a TextBlade truth bomb, or whether May becomes June.
> That's a lot of big words he wrote checks for that can't be cashed.
Funny, coming from a guy who didn't realize there was an ADD capability in some systems, or that some computers used 5 bytes for floating point.
> Now I see that he also wrote a crossword puzzle solver and cryptogram solver!
Because I did. They were part of a set of things I worked on based on newspaper puzzles (crosswords, cryptograms, and word jumbles). They were at least mostly in BASIC, though it is possible the crossword one had an MC operation for searching.
> If you're going to go that far, why not add operating system onto the list?
Because I didn't do that. Way beyond my ability anyway. Was that too confusing for you to follow? I did once write a machine code version of the Shell-Metzner sort, though, which I was pretty proud of. Never used it in anything; just wanted to see if I could do it. And then an app writer asked if I would let him use it.
> I punched the whole code using nothing but hex codes I memorized.
You just fibbed again. Because I never said I punched in the whole code using memorized hex codes. I specifically said I did SOME in hex because they came up so often I had memorized them. But others I used assembly terms. Why do you keep making such false statements? I mean, it works for me to have you be so blatant about it, but it sure isn't going to do you any good.
People will note that Eric went through a bunch of insinuations that had not only nothing to back them up, but often were directly contrary to what I really said (such as the example above). But he and others will repeat such nonsense, hoping that repetition will fool people into thinking it must be true. I doubt it though. They are being too obvious now. Which works for me!
> Perhaps one observation to make, whatever it's worth (or worthlessness), is that DBK has not mentioned he is also a beta tester for TextBlade, despite (presumably proudly) mentioning he was a beta tester for AppleWorks.
> One might have imagined that being a Tregger is something DBK would have proudly mentioned
Don't imagine. It really is quite simple. I put that info in there before I got in Treg. While I had mentioned some of that on the forums, I thought it might help my chances if I put it in my account info, in case WayTools checked accounts but didn't remember what I had posted.
Also, considering how plainly I talk about being in Treg on the forums, I don't think there is much need to add it to my account info. Might do it, but either way, it isn't a big deal.
> Nevertheless, I spent a great deal of time explaining why your story is so full of holes
Not really. Let's look at your poor examples:
I told you early on that mine wasn't a 6502. So it doesn't matter what Woz told you about that one.
It was before you copped out, saying it wasn't the Apple II. You said your little grading program ran faster than stuff on the Apple II. First of all, why and how did you compare speeds when yours did not run on the Apple II?
In this and subsequent posts, I'm using the Apple II and 6502 as REFERENCE, since you refuse to tell anyone what system you used. Do you know REFERENCE? You need a REFERENCE to base any intelligible discussion on. You, especially, need a REFERENCE, since you keep pointing to multiple things depending on the point you want to make. Without REFERENCE, your points are moot. Understand?
> After all that, there's really only a few places where you can safely store anything without clobbering everything, even with the maximum 48K.
Yet, in just 16K, I had probably something over 6K left for data. That's why it was so important to save grades in 1 byte each.
I guess you could have as much memory left over on your mysterious system as you want. You say '6K', and you apparently realize that's small; you say that's why you had to use 1 byte to save each grade. That's more detail than I asked for, yet you keep saying you don't want to reveal any details. But collectively it does beg the question: how did you do it? That's a more general question that can be answered with much less detail. Yet you don't. Why?
> You don't understand the meaning of decimals, which means power of 10
Fibbing by leaving out context is still fibbing. I told you I was referring to the grades not including decimals or fractions. That is, you could give a grade of 99, but not 99.2, 99-1/2 or anything like that. That was my context. You are free to use your context, but only for what YOU are saying. Oh, and in your post I'm responding to, you wrote: "Yours didn't do "fractions" or even "decimals" at one point." So you knew my context. Gotcha.
Here's what I was replying to, in full context. You said:
Post #843:
I didn't do decimals or fractions, which was probably the only limitation compared to the typical programs like mine back then (actually, most didn't do fractions). Grades were whole numbers and could be from zero to 254 as I recall, though, obviously few would need that! But it was just as easy to go to that as to go to 100 and it allowed for flexibility - and certainly for ordinary extra credit on a test.
You do realize that 99, 99-1/2, and 99.2 are all decimals, that the former is a whole number, the latter contains a fraction, and that the dot in '99.2' is called a decimal point? You said you didn't do "decimals", yet gave examples of "decimal" numbers. You clearly didn't know what decimal meant. I even gave an explanation that "decimal" is a number representation using power of 10, then further explained by saying versus "binary", which is a number representation using power of 2. I think I even went as far as explaining that "octal" is based on power of 8 and "hexadecimal" uses power of 16.
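The base distinction being argued here is mechanical; in Python, the same value rendered in each of the bases mentioned (decimal, binary, octal, hexadecimal):

```python
n = 99
print(n)               # decimal (power of 10): 99
print(format(n, "b"))  # binary (power of 2): 1100011
print(format(n, "o"))  # octal (power of 8): 143
print(format(n, "x"))  # hexadecimal (power of 16): 63
```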
(That's a lot of explanation for someone who doesn't "Back it up"!)
I'm not the one that said you didn't do "decimals" and "fractions" -- you did. I merely quoted you to make sure you knew what "decimal" means by now. But I'm still not sure. So let me ask you again: Do you know what "decimal" means? Can you say?
And you don't do "fractions" -- which are numbers to the right of the "decimal" point. So you only used INTEGER/WHOLE numbers. I heard you the first time. I even deduced that, despite you saying "I didn't do decimals" you in fact did. Let's move on...
> What do you know about doing floating point in binary that you were confident to compare other programs that did them (using 5 bytes) vs your limited (0-254 number range) grading program?
LOL, the only thing I needed to know about floating point for my program was storing grades in it would take 5 bytes instead of the single byte my way required!
Yet you claimed your program, which only did INTEGER/WHOLE numbers (in the very limited range of 0-254), was better than comparable programs that used 5 bytes to represent numbers? How is that a fair comparison?
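The byte arithmetic both sides keep invoking is easy to check. A sketch in Python, assuming one byte per whole-number grade versus the five bytes per stored BASIC float quoted elsewhere in the thread:

```python
grades = [87, 92, 100, 254, 0]   # whole-number grades, 0-254 each

one_byte_each = bytes(grades)    # the single-byte-per-grade scheme
BYTES_PER_BASIC_FLOAT = 5        # the C-64 figure quoted in the thread

float_cost = len(grades) * BYTES_PER_BASIC_FLOAT

# Same whole numbers, five times the storage as floats:
assert len(one_byte_each) == len(grades)
assert float_cost == 5 * len(one_byte_each)
```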
> Why 5? when the industry standard is 4?
That's a trick question, right? Because the "why" is simple enough. Because that is what MY computer used. I couldn't care less what yours used because I wasn't writing on your computer. Do you think people writing for computer system "A" should write it based on how computer system "B" works? So your link to what OTHER computers use is meaningless.
Yes, that was a trick question. But not to say my computer vs. your computer...
It doesn't even take much work to find this:
vvvvvvvvvvvvvvvvvvvvvv
Representation in the C-64
There are basically two different layouts how floats are stored:
for calculations with float registers in a some kind expanded layout of 6 or 7 bytes.
for variables compacted into 5 bytes.
^^^^^^^^^^^^^^^^^^^
Gotcha.
So then, is this the computer you used? A C-64. You could have saved a lot of words (both yours and mine) if you just said so. I mean, what's the big deal?
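For anyone curious what that compacted 5-byte variable layout actually looks like: below is a rough Python sketch of the Commodore/Microsoft format as I understand it (excess-128 exponent byte, then four mantissa bytes whose top bit is reused as the sign; positive values only). Treat the details as illustrative rather than authoritative:

```python
import math

def to_5byte_float(x):
    """Pack a non-negative number into a 5-byte float: one excess-128
    exponent byte, then a 32-bit mantissa whose top bit carries the sign
    (cleared here, since we only handle positive values)."""
    if x == 0:
        return bytes(5)                       # zero is stored as all zeros
    m, e = math.frexp(x)                      # x = m * 2**e, 0.5 <= m < 1
    mantissa = int(m * 2 ** 32) & 0x7FFFFFFF  # clear top bit: positive sign
    return bytes([e + 128]) + mantissa.to_bytes(4, "big")

# 1.0 packs to 81 00 00 00 00; 10.0 packs to 84 20 00 00 00
assert to_5byte_float(1.0) == bytes([0x81, 0x00, 0x00, 0x00, 0x00])
assert to_5byte_float(10.0) == bytes([0x84, 0x20, 0x00, 0x00, 0x00])
```

Either way, the point stands that this family of BASICs spent five bytes on every numeric variable, integer or not.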
> If former, I know of no 8-bit CPU that had it built in. If it was part of your 'app' then it may have been a macro, one where it converts several native CPU instructions to give you the 'ADD' instruction.
Yet I already told you that a search will find hundreds of millions of matches: assembly ADD
You seem to be basing what you say on your lack of knowledge about other systems and not doing simple research. It would be one thing if you thus remained silent, but no, you just claim I'm wrong. It's like trying to explain snow to someone who lives in a place where it is always hot. Some of them simply refuse to believe it and choose to be ignorant.
First of all, my lack of knowledge is only because you can't even narrow down the system you're REFERENCING. Do you know REFERENCING? I've explained the difference between ASSEMBLY vs NATIVE instruction set, and gave a link to the 6502's instruction set as REFERENCE. You're looking to Google to save you, but it won't. Type in any general two words and it will spit out millions of pages. That should tell you that you need to be more specific.
Since you said you memorized the opcodes of your mysterious CPU, you must remember whether there was this ADD in its NATIVE instruction set. That's what really matters, right? Not the hundreds of millions of potentially different systems out there. The 6502 was a very typical CPU of the 8-bit era, and I gave a link to its whole instruction set, and there is no such thing. So, are you referring to the machine you wrote your grading program on, or just anything Google spits out?
> Also 'LD A' is different from 'LDA'. I guess you can't tell the difference.
Really? No, you just fibbed. Because I specifically referred to "LD A" which YOU then said (in post 927): >LDA (not LD A) Maybe you couldn't tell the difference. Gotcha again.
That's because you said your machine had an 8-bit CPU, not a 16-bit one. That would make it a two-byte opcode (as on a 16-bit CPU), not a one-byte opcode (as on an 8-bit CPU). I explained this. Go read it again.
Thank you for the opportunity to detail your false and misleading statements though. It was worth the wait.
Didn't I reprimand you re: snide remarks, side swipes, etc? Do you ever learn? Because it sure sounds like you're running out of argument, and so you have to resort to desperate methods: snarky remarks -> name calling -> attacking. Where are you going to fit in the "victimization" and conspiracy theories into that? Better ask Waytools!
edit: further discussion really, really requires Kahuna to pinpoint the system he's referring to. Unless he wants to keep going in circles.
edit 2: jesus, i'm still really drunk.
Perhaps one observation to make, whatever its worth (or worthlessness), is that DBK has not mentioned he is also a beta tester for TextBlade, despite (presumably proudly) mentioning he was a beta tester for AppleWorks.
One might have imagined that being a Tregger is something DBK would have proudly mentioned
Don't imagine. It really is quite simple. I put that info in there before I got in Treg. While I had mentioned some of that on the forums, I thought it might help my chances if I put in my account info in case WayTools checked those, but may not remember what I had posted.
Yeah, sure. Plaster the whole world with your BS, sweetheart. But note that: 1) people will not believe, and more importantly 2) don't "attack" people with it.
Also, considering how plainly I talk about being in Treg on the forums, I don't think there is much need to add it to my account info. Might do it, but either way, it isn't a big deal.
Yet your lips are sealed when it comes to details. What's your privacy policy? Is that private as well? Ah, recursive programming. Very clever.
Comments
There was never a version of BASIC implemented on a kit computer or micro computer that did not offer some means of storing a value in a variable. That was like half the point of BASIC, to be able to write things like y = 2; for x = 1 to 10; print x * y; next x
https://en.wikipedia.org/wiki/BASIC
Even the most primitive BASIC implementations had the means to assign values to variables.
https://en.wikipedia.org/wiki/Tiny_BASIC
http://altairbasic.org/int_dis_12.htm
Or you can claim secret knowledge which no one else possesses based on details you are alleging you must withhold in order to protect your identity (the make of computer and version of BASIC you were using).
You are intentionally withholding key details that would disprove your claims, which is the opposite of backing up a claim.
You are full of it Kahuna.
Thank you Mr "I don't keep up on context"!
Because it wasn't about STORING variables in a program. It was about SAVING ONLY the variables to cassette, without also saving the BASIC program itself. It saved at least 50% of save/restore time.
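The technique being described, writing only the data to tape rather than the whole program, can be outlined like this (modern Python standing in for BASIC plus a monitor; the record format here is invented for illustration):

```python
import io

def save_grades(stream, grades):
    """Dump only the data: a one-byte count, then one byte per grade."""
    stream.write(bytes([len(grades)]))
    stream.write(bytes(grades))

def load_grades(stream):
    """Read the count byte back, then that many grade bytes."""
    count = stream.read(1)[0]
    return list(stream.read(count))

tape = io.BytesIO()                  # stands in for the cassette
save_grades(tape, [87, 92, 254])
tape.seek(0)
assert load_grades(tape) == [87, 92, 254]
```

Re-recording a few bytes of data is obviously far quicker than re-recording the data plus the entire program text, which is the 50%-plus saving being claimed.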
You even made three links to something that wasn't even about the issue!
> You are intentionally withholding key details that would disprove your claims
Logic check - if you don't know what the data is, how the heck do you know it would disprove anything? Gotcha!
Besides, as already pointed out on the matter of "ADD", you don't need to know what CPU I use. I already pointed out you can search for:
assembly ADD
And find hundreds of millions of results! So you don't need to know what I use to know it exists.
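For what it's worth, whether a bare ADD mnemonic exists depends entirely on which CPU you mean, which is exactly why naming the chip would settle this. A few examples I can vouch for:

```python
# Add-to-accumulator mnemonics on a few well-known 8-bit CPUs
add_mnemonics = {
    "MOS 6502":      "ADC",   # add-with-carry only; no plain ADD
    "Intel 8080":    "ADD",   # ADD r, plus ADI for immediates
    "Zilog Z80":     "ADD",   # ADD A, r -- and it also spells loads LD A, n
    "Motorola 6800": "ADDA",  # ADDA / ADDB, one mnemonic per accumulator
}

assert add_mnemonics["MOS 6502"] != "ADD"   # the 6502 objection holds
assert add_mnemonics["Zilog Z80"] == "ADD"  # but ADD certainly existed
```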
I really did appreciate your post!
R
Of course I meant Orwell when I spoke of Orville but, it’s too late to edit it now. I knew it didn’t seem right but I was in the midst of things when I wrote it. Long live The Orville!
Maybe DBK is Da Big Kaylon? Those super fast fingers would produce amazing WPM test results :-)
He has not specified enough detail for you guys to actually challenge him on any particular, for example the chip, and through that he retains plausible deniability. If you are going for the pin, just tell him that withholding information to ensure an argument from ignorance always remains available is a disingenuous move, and that the burden of proof is on the claimant. The claims being what was said in his WTF bio and in various places about a "grading program".
It's never about the argument, and he never makes a relevant point that can't be kicked over in an instant, it's always about prolonging the argument.
It's pretty clear that DBK falls into the 'enthusiastic amateur' bracket here, even if at several decades' remove.
Have fun chaps.
PS. I like Clam chowder too.
R
It's clear that the Textblade delays have been annoying people for a LONG LONG LOOOOOOOOOOOONG time, and MK's extended delays have been a laughing stock for quite a while now!
Hope the guys drop the argument on the grading program. That was not so much fun to watch, but it reminded me of a lot of forgotten things I did long ago, and questions came up like: are compilers so efficient now that nobody fools around in assembly anymore?
At the time I really hated that the simplest thing in assembly consumed so much time and effort, but it was an unavoidable pain sometimes because BASIC was so slow. Nowadays, it would seem, only a masochist would invest the effort when you have languages like Swift 5 and Kotlin 1.3, very efficient compilers and IDEs, and an incredible amount of horsepower under the hood.
I guess only people working on device drivers / firmware need to get that close to the metal, correct?
edit: I do not even believe people would touch it for 3D animations either, as they are complex with all the math and matrix manipulations and as there are so many tools available.
I didn't like assembly then, and I don't like it now.
(edit: it is as much fun as reading a phone book)
https://unfinishedbitness.info/2014/04/25/6502-displaying-text-finally/
Basic:
Assembly: Wow, that was fun, right?
edit: the guy read 3 BOOKS to do that...
Then if that is a decent enough job - basically ignoring what I said, changing the context totally, and making assumptions, I'd hate to see them do a bad job!
> As he has not specified enough detail for you guys to actually challenge him on a detail, for example; the chip, and through that he will retain plausible deniability.
Yet they did challenge me on things which they clearly made assumptions about anyway. Which goes to my point about much of what they post about anything.
> the burden of proof is on the claimant.
I didn't make my programming an issue. So it is their claims about how hard it would be, what commands are possible in MC, etc, that have to be proven then.
I never questioned what they could do with programming. Maybe they made up their stuff, but I sure didn't make it part of my arguments. I addressed what they actually said which were factually wrong.
But, I think you did the best job you could in covering for their ineptitude in their criticisms.
https://forum.waytools.com/users/dabigkahuna/activity
dabigkahuna
Hawaii
Retired teacher. Self-taught programmer (decades ago) using BASIC and machine code depending on need: Gradebook program, crossword puzzle solver, cryptogram solver, etc. Also developed improvements to existing programs, primarily a database program. Beta tester for AppleWorks. Developed an Access database to automate auditions. Other stuff I can't recall at the moment. Everything I did was to solve my own needs as a teacher originally, but then made it available to others.
Nevertheless, I spent a great deal of time explaining why your story is so full of holes, if not downright admission of your utter ignorance. Try re-reading my posts which includes: sample code, links and detailed explanations. Did you miss them? Here's a small sample:
You obviously didn't understand any of the explanations, so I even bolded and quoted words, formatted things as bullet items. What more do I need to do, Kahuna? Decorate them with emoji's and illustrations? Is your reading comprehension that bad?
And remember, I'm not the one with the burden of proof. You have made your claim several times in the past, but at some point you became brazen enough to attack people with it: not with any sort of wisdom, mind you, but with ever more hardened ignorance. To paraphrase the ZDNet writer: "If it wasn't good enough to remember, it wasn't good enough to brag about it."
Ah, floating point. Yes, let's talk about that even though you said you didn't do it in your program. What do you know about doing floating point in binary that made you confident enough to compare other programs that did it (using 5 bytes) vs your limited (0-254 number range) grading program? Yours didn't do "fractions" or even "decimals" at one point.
Why 5, when the industry standard is 4? Not just on my computer. The standard format for single-precision floating point numbers uses 32 bits in this layout:
1 bit for the sign.
8 bits for the exponent.
23 bits for the mantissa.
(wah! wah! You can't back it up!!)
Here's a link that explains the standard and how to put it to use (hint: the standard is called IEEE-754):
https://www.h-schmidt.net/FloatConverter/IEEE754.html
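That 32-bit layout (1 sign bit, 8 exponent bits, 23 mantissa bits) can be inspected directly with Python's struct module; a grade of 99.5 fits in exactly 4 bytes:

```python
import struct

raw = struct.pack(">f", 99.5)   # big-endian IEEE-754 single precision
bits = int.from_bytes(raw, "big")

sign     = bits >> 31           # 1 bit
exponent = (bits >> 23) & 0xFF  # 8 bits, excess-127
mantissa = bits & 0x7FFFFF      # 23 bits

# 99.5 = 1.1000111 binary * 2**6, so the stored exponent is 127 + 6 = 133
assert len(raw) == 4
assert (sign, exponent, mantissa) == (0, 133, 0b1000111 << 16)
```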
What kinda messed up BASIC did you use that needed five bytes? Is it the same one that couldn't do variables, so you had to dump them manually into memory and redump onto cassette? Is it the same one you programmed your grading program on? You can't say.
But still, did you forget where you said:
A "GOTO" would be converted to a token before execution, a one-byte hex token, and '10' would also be tokenized into a one-byte value of '0xa'. So that's two bytes. It doesn't store the "10" as a five-bytes!!, unless the author was brain damaged. And no programmer would think that, unless they were more severely brain damaged.
Anyways, "Let A = 10" would create another 1-byte token for "Let", create a variable storage for "A", a two-byte pointer to where "A" is, create a 1-byte value of '0xa' and store it there. Now, "GOTO A" would have to fetch the 'A' each time you use them, then expand each one to "GOTO 10". Not only are you wasting memory by using 'A', you also slow down the process because the variable has to be expanded each time. How you think this saves you memory or time is beyond me.
It's much faster to use a constant rather than a variable, as that would not require expansion, nor use unnecessary cycles to fetch the same value over and over. You'd know this if you ever programmed anything at all.
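A toy version of the "crunching" described above; the token values and the one-byte-constant scheme are made up for illustration, since real interpreters differed (several stored numeric constants as ASCII digits):

```python
# Hypothetical one-byte keyword tokens, loosely Microsoft-BASIC-flavored
TOKENS = {"PRINT": 0x99, "GOTO": 0x89, "LET": 0x88}

def crunch(line):
    """Tokenize keywords to one byte each, small integer constants to a
    single binary byte, and pass everything else through as ASCII."""
    out = bytearray()
    for word in line.upper().split():
        if word in TOKENS:
            out.append(TOKENS[word])
        elif word.isdigit() and int(word) < 256:
            out.append(int(word))
        else:
            out.extend(word.encode("ascii"))
    return bytes(out)

# "GOTO 10" is 7 characters of source text but crunches to 2 bytes:
assert crunch("GOTO 10") == bytes([0x89, 0x0A])
```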
Just be clear. Was 'ADD' a part of your CPU's instruction set? Or was it a part of the 'app' that you did your MC on? If former, I know of no 8-bit CPU that had it built in. If it was part of your 'app' then it may have been a macro, one where it converts several native CPU instructions to give you the 'ADD' instruction.
Also 'LD A' is different from 'LDA'. I guess you can't tell the difference. The former would mean it's a two-byte opcode -- impossible on an 8-bit CPU (only on a 16-bit CPU), whereas 'LDA' is an assembly mnemonic for a 1-byte opcode.
Oh wait!! Did I just back up my statements?
edit: clean up some line spacing for Kahuna
Just set up a string and make a call! What wusses we've become.
In the 8-bit days, you could tell a real man from the girls. We didn't need stinking high level languages like BASIC or even assemblers. We'd compose everything in hexadecimal in our heads and punch it in with our toes on a keypad. Or better yet, flip DIP switches. We'd eat KIM-1s for breakfast. We could make Altairs sing with paper tapes. Real men used cassette, no stinking floppies. No one would ever need more than 16k!
Not really. Let's look at your poor examples:
I told you early on that mine wasn't a 6502. So it doesn't matter what Woz told you about that one.
> After all that, there's really only a few places where you can safely store anything without clobbering everything, even with the maximum 48K.
> You don't understand the meaning of decimals, which means power of 10
> I'm not the one with burden of proof. You have made your claim several times in the past, but somewhere since you became more brazen to attack people with it
> What do you know about doing floating point in binary that you were confident to compare other programs that did them (using 5 bytes) vs your limited (0-254 number range) grading program?
LOL, the only thing I needed to know about floating point for my program was storing grades in it would take 5 bytes instead of the single byte my way required!
> Why 5? when the industry standard is 4?
That's a trick question, right? Because the "why" is simple enough. Because that is what MY computer used. I couldn't care less what yours used because I wasn't writing on your computer. Do you think people writing for computer system "A" should write it based on how computer system "B" works? So your link to what OTHER computers use is meaningless.
It doesn't even take much work to find this:
vvvvvvvvvvvvvvvvvvvvvv
Representation in the C-64
There are basically two different layouts how floats are stored:
- for calculations with float registers in a some kind expanded layout of 6 or 7 bytes.
- for variables compacted into 5 bytes.
^^^^^^^^^^^^^^^^^^^Gotcha.
> What kinda messed up BASIC did you use that needed five bytes? Is it the same one that couldn't do variables
You're fibbing again. Considering that I mentioned saving the main data in a D$ string (you know, one of those variable things), I also repeatedly talked about saving that data (which was in variables) to cassette without having to include the BASIC programming every time. So you know darn well I never said it couldn't use variables, because I was using them. As for what did use 5 bytes, see above about the C-64.
> you also slow down the process because the variable has to be expanded each time.
But as for slowing down the process, yep, it sure did. I knew that at the time. But it was used in areas of the program where speed wasn't important. So the memory savings were more important.
> It's much faster to use a constant rather than a variable
> If former, I know of no 8-bit CPU that had it built in. If it was part of your 'app' then it may have been a macro, one where it converts several native CPU instructions to give you the 'ADD' instruction.
You seem to be basing what you say on your lack of knowledge about other systems and not doing simple research. It would be one thing if you thus remained silent, but no, you just claim I'm wrong. It's like trying to explain snow to someone who lives in a place where it is always hot. Some of them simply refuse to believe it and choose to be ignorant.
> Also 'LD A' is different from 'LDA'. I guess you can't tell the difference.
Really? No, you just fibbed. Because I specifically referred to "LD A" which YOU then said (in post 927): >LDA (not LD A)
Maybe you couldn't tell the difference. Gotcha again.
Thank you for the opportunity to detail your false and misleading statements though. It was worth the wait.
If you're going to go that far, why not add operating system onto the list? Faster Linux using only BASIC + MC routines (sic) as needed. I punched the whole code using nothing but hex codes I memorized. I could have been a millionaire many times over but I lost the cassette tape it was on.
But believe me, I wrote things even Steve Wozniak could not. I just can't remember what they were. It was much more disruptive than VisiCalc except it wasn't a spreadsheet. Can't tell you what it was, but it was hell of lot faster than VisiCalc, and it wasn't on the Apple II. If you ask me more than that, I'll call the police on you for privacy invasion, since that means you want to know more about me than the program.
I'm not going to tell you anything about me, except where I live, what I did in the past, what I'm doing now, and what a lovely time I'm having with my Textblade, what my typing speed is (in real time if you want)...
I aced Pascal as a Yr 9 student in a Yr 10 class, but that's the extent of my programming skills (despite dabbling in very basic Basic as a kid in the early 80s), so I'm in no position to judge the truthiness or otherwise of either side.
We know both sides hate each other to death (or so it seems), so perhaps the only thing left to do is to agree to disagree, and to get back to wondering when the TextBlade update will come this month, or if it will come, seeing as keeping to deadlines just isn't something MK is known for.
Perhaps one observation to make, whatever its worth (or worthlessness), is that DBK has not mentioned he is also a beta tester for TextBlade, despite (presumably proudly) mentioning he was a beta tester for AppleWorks.
One might have imagined that being a Tregger is something DBK would have proudly mentioned, given he has probably been beta testing TextBlade for years longer than AppleWorks, but I can only make vague suggestions here as to how proud DBK is or isn't about TextBlade beta-treggerhood, as I am not DBK, don't live in Hawaii, don't program in BASIC, and only enjoy it when Arnie portrays machine language in Terminator hardware form.
Less than two weeks to go before we discover whether MK releases a TextBlade truth bomb, or whether May becomes June.
Cheers to all, and may the popcorn be with you.
A.
Funny, coming from a guy who didn't realize there was an ADD capability in some systems as well as some computers using 5 bytes for floating point.
> Now I see that he also wrote a crossword puzzle solver and cryptogram solver!
Because I did. They were part of a set of things I worked on based on newspaper puzzles (crosswords, cryptograms, and word jumbles). They were at least mostly in Basic, though it is possible the crossword one had an MC operation for searching.
> If you're going to go that far, why not add operating system onto the list?
Because I didn't do that. Way beyond my ability anyway. Was that too confusing for you to follow? I did once write a machine code version of the Shell-Metzner sort, though, which I was pretty proud of. Never used it in anything. Just wanted to see if I could do it - and then an app writer asked if I would let him use it.
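For anyone who never ran into it: the Shell-Metzner sort is the gap-halving variant of Shell sort that circulated in 1970s computer magazines. A Python rendering (the original would have been BASIC or machine code):

```python
def shell_metzner_sort(a):
    """Sort the list in place: start with a gap of n//2, swap out-of-order
    pairs that are gap apart (sinking each element back through its chain),
    then halve the gap until it reaches zero."""
    n = len(a)
    gap = n // 2
    while gap > 0:
        for i in range(gap, n):
            j = i - gap
            while j >= 0 and a[j] > a[j + gap]:
                a[j], a[j + gap] = a[j + gap], a[j]
                j -= gap
        gap //= 2
    return a

assert shell_metzner_sort([93, 7, 254, 0, 87]) == [0, 7, 87, 93, 254]
```

Its appeal on small machines was exactly what made it worth hand-coding: no recursion, no extra storage, just compares and swaps.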
> I punched the whole code using nothing but hex codes I memorized.
You just fibbed again. Because I never said I punched in the whole code using memorized hex codes. I specifically said I did SOME in hex because they came up so often I had memorized them. But others I used assembly terms. Why do you keep making such false statements? I mean, it works for me to have you be so blatant about it, but it sure isn't going to do you any good.
People will note that Eric went through a bunch of insinuations that had not only nothing to back them up, but often were directly contrary to what I really said (such as the example above). But he and others will repeat such nonsense, hoping that repetition will fool people into thinking it must be true. I doubt it though. They are being too obvious now. Which works for me!
Also, considering how plainly I talk about being in Treg on the forums, I don't think there is much need to add it to my account info. Might do it, but either way, it isn't a big deal.
In this and subsequent posts, I'm using the Apple II and 6502 as REFERENCE, since you refuse to tell anyone what system you used. Do you know REFERENCE? You need a REFERENCE to base any intelligible discussion on. You, especially, need a REFERENCE, since you keep pointing to multiple things depending on the point you want to make. Without a REFERENCE, your points are moot. Understand?