Watching a seminar on RISC-V and LLM optimizations being done by SiFive. 15 mins in they are talking about assembler and vector math.
They don't screw around.
Learned about a new compiler thing that converts PyTorch code into some other runtime that's closer to the metal (and in theory bare metal). Great, more stuff to learn... lol
"this AI coding stuff is stupid"
It's just cribbing from Stack Overflow, which is what everyone was already doing anyway.
Far more came from GitHub, actually, since there was zero paywall there, so it was actual (mostly) functioning code. But yes, some came from Stack Overflow, and some from Reddit, Medium, other places, manuals, and third-party books (big stink on that one, due to the source of the books). But really it's more than the simple sum of its parts. Just like people. That is how transformers work, just like how we operate. The only real difference is the real-time feedback to change the model on the fly, like humans have.
And no, they are not perfect (just like people), but they can save a lot of time if used with the correct expectations. A suggestion: yes, I know I've been a big supporter of AI for decades (long before transformers came onto the scene, which is what most are built on now), but instead of being so negative towards the idea, you should try it sometime and get some first-hand experience; you might find it's helpful. If you try one of the big free commercial ones, do keep in mind they are not 'tailored' for coding; it's just one piece of 'general use' training, so it won't be as good as a locally run model that is tailored for coding.
Resource needs aside, they are brain-dead easy to run locally these days. But perhaps if we coordinated a time, I could open mine up to the outside for a bit if you wanted. You could try different models and such. I don't have a lot of horsepower, so I really can't run two people at once, and large models are slow, but that doesn't affect accuracy, just response time.
Sun Oct 20 2024 18:17:32 EDT from IGnatius T Foobar
"this AI coding stuff is stupid"
It's just cribbing from Stack Overflow, which is what everyone was already doing anyway.
Oh, and you know I was not trying to be insulting.. I just see sooo much opportunity in this stuff that I hate for people to miss it. You've got nothing to lose by trying; Zoe won't eat you.
As we all know by now, my wording is not always the best and often sounds harsher than I mean it. That silly lack-of-emotion thing creeps in..
Ok, I'll give you that one. Marketing (for anything, really) is awful these days. But I think it's in part due to the general public's attention span being that of a flea. If it's not 'rah rah, look at me' every 10 seconds, the public moves on to the next video....
( but, the offer still stands, if you want to experiment ;) )
Tue Oct 22 2024 09:25:47 EDT from IGnatius T Foobar
It isn't the technology that is detestable, it is the hype cycle.
May be too late by the time anyone sees this, but we have a FORTH Zoom meeting going. Charles Moore is speaking at the moment.
https://app.zoom.us/wc/92427635440/join?fromPWA=1&pwd=a2xUR0M0bk5ISDF3c1g1RTM3QVdkdz09
2024-11-09 14:15 from Nurb432 <nurb432@uncensored.citadel.org>
May be too late by the time anyone sees this, but we have a FORTH Zoom meeting going. Charles Moore is speaking at the moment.
https://app.zoom.us/wc/92427635440/join?fromPWA=1&pwd=a2xUR0M0bk5ISDF3c1g1RTM3QVdkdz09
Aah, too bad I missed it. I couldn't have joined anyway, for one because I'm anonymous, and second because I don't have a Zoom.
My year-long revisit of FORTH has left me in a very bipolar place. Some days I wake up excited about writing what is now probably the sixth or seventh iteration of my ultimate FORTH engine, and other days I look at it and think "what am I doing? I'll never be productive with this shit." FORTH has this bizarre property, at least for me, that when I'm using it, it feels like I'm in full control, and when I make something work, it feels like I could do anything; but then when I look at what I've done in the past with C, it feels very daunting to imagine how I would accomplish the same thing in FORTH. And I can't exactly put my finger on why that is. I think it might be the lack of typing and structs, and I know some FORTHs have something kind of like struct (I've even posted about my own interpretation here), but it just isn't the same. I haven't been able to find something that lets me picture the layout of data structures in FORTH, or manipulate them, quite as intuitively as it comes to me in C.
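Roughly what I mean, as a throwaway sketch (made-up names, not code from any of those engine attempts): in C the compiler owns the record layout, while the usual FORTH route is doing the offset arithmetic yourself.

#include <stddef.h>
#include <stdio.h>

/* In C the layout is declared once and the compiler does the offset math. */
struct point {
    int  x;
    int  y;
    char label[16];
};

/* The hand-rolled equivalent -- offsets maintained by the programmer, which
 * is closer to how a typical FORTH "struct" word set ends up feeling.
 * (These numbers assume 4-byte ints and no extra padding.)                  */
enum { PT_X = 0, PT_Y = 4, PT_LABEL = 8, PT_SIZE = 24 };

int main(void) {
    struct point p = { 3, 4, "corner" };
    printf("%d %d %s\n", p.x, p.y, p.label);
    printf("compiler says label lives at offset %zu\n",
           offsetof(struct point, label));
    return 0;
}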
Anyway, what did Chuck have to say? I think it's awesome he's still involved. Is the GA144 still his focus or have they moved on to the next incarnation yet?
They have a web client (it's what I use.. no app for me) and as far as I know you don't need an account. It just asks for a random name.
It will be on YT though, and will be posted here soon.
Sun Nov 10 2024 21:55:25 UTC from zelgomer
Aah, too bad I missed it. I couldn't have joined anyway, for one because I'm anonymous, and second because I don't have a Zoom.
I didn't get to hear it all as I was busy with family stuff, but at one point he was talking about some long-distance networking stuff back in the old days.. and they were talking about analog computers..
Sun Nov 10 2024 21:55:25 UTC from zelgomer
Anyway, what did Chuck have to say? I think it's awesome he's still involved. Is the GA144 still his focus or have they moved on to the next incarnation yet?
My year-long revisit of FORTH has left me in a very bipolar place. Some days I wake up excited about writing what is now probably the sixth or seventh iteration of my ultimate FORTH engine, and other days I look at it and think "what am I doing? I'll never be productive with this shit." FORTH has this bizarre property, at least for me, that when I'm using it, it feels like I'm in full control and when I make something work, it feels like I could do anything, but then when I look at what I've done in the past with C, it feels very daunting to imagine how I would accomplish the same thing in FORTH.
I've heard LISP described the same way. Elegant but useless. But that's probably not fair because people have used FORTH for actual useful projects.
It just isn't used as a general purpose language much, I guess.
LISP is not at all useless; people like Paul Graham have used it for significant real-world projects (don't get me started about emacs).
It's not useless, it just sucks LOL.
My code in C looked like BASIC for the first couple of years. It was awful.
Even once I got out of the bad habit of using gotos everywhere, I still had the habit of traversing a string like this:
char str[128];
int i;
...
for (i=0; i<strlen(str); ++i) {
...
}
Willi (dothebart) used to chastise me for doing it that way. Although it was functionally correct and easy to read, he complained that it required a full traversal of the string for each character scanned. In big-O notation that's O(too_many).
He rewrote a lot of my loops like this:
char str[128];
int i, j;
...
j = strlen(str);
for (i=0; i<j; ++i) {
...
}
Something about that bugged me. Maybe it was just that the extra variable and extra line took up too much space. Or maybe it was the sinking feeling that if the loop did something to change the string length it would fail.
But he wasn't wrong about the extra CPU time, even if it was unnoticeable on nearly all modern computers.
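A contrived sketch of that hazard (my own made-up example, not one of Willi's actual rewrites): if the loop body shortens the string, the cached length goes stale and the loop keeps walking past the new terminator.

#include <stdio.h>
#include <string.h>

int main(void) {
    char str[128] = "hello world";
    int i, j;

    j = strlen(str);            /* length cached up front               */
    for (i = 0; i < j; ++i) {
        if (str[i] == ' ')
            str[i] = '\0';      /* the body shortens the string...      */
        /* ...but j still holds the old length, so the loop keeps going
           and now reads characters past the new terminator.            */
    }
    printf("%s\n", str);        /* prints "hello"                       */
    return 0;
}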
Lately I've started doing it like this, and I'm kicking myself for not figuring it out decades ago:
char str[128];
int i;
...
for (i=0; str[i]; ++i) {
...
}
It sure took me long enough. But look on the bright side, at least I'm not a self-righteous Rust programmer.
for (char *c = s; *c; ++c) {
    putchar(*c); /* or whatever */
}
There's always a tradeoff between readability and efficiency. The common wisdom these days is that "the computer's time is worth less than yours" which is often true. As long as you're not making it readable so that an imported idiot can take over your code while being paid a fifth of your pay, of course.
The example above could be made more readable, for example, by expanding it out to:
char *c;
for (c = s; *c != '\0'; ++c) {
...
}
That would optimize out to pretty much the same code while explaining a bit more why each piece is there.
What I've found while revisiting decades-old code is that it's easier to write readable code when you're not constrained to the old 80x24 terminal size. You can write things that are both beautiful and efficient, add longer variable names that make more sense, and work with line lengths that conform to the task being done rather than the size of the screen.
And it's still a beautiful thing to write in C. If you are writing code strictly to be arrogant, you write in Rust or Go.
And the C99-style declarations inside the for () are good because they restrict the scope of those variables. Again, it's about keeping the context on my screen tightly limited.
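A minimal sketch of what that buys (made-up example): the loop variable exists only inside the loop, so nothing later in the function can touch it by accident.

#include <stdio.h>

static void print_string(const char *s) {
    for (const char *c = s; *c; ++c) {   /* C99: c is scoped to the loop */
        putchar(*c);
    }
    /* c is out of scope here; any later reference to it is a compile error,
       so stale loop state can't leak into the rest of the function.        */
    putchar('\n');
}

int main(void) {
    print_string("hello");
    return 0;
}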