It's not "AI", please call it...
As all sensible people know, what the technology pundits are calling "Artificial Intelligence" is neither artificial nor intelligent.

This is not to say that the technology isn't doing some very impressive things. But it isn't sentient, it never will be, and some of the noisiest and most unpleasant people are leading the conversation in the cluelessosphere tech press.

Let's come up with a better term to refer to this technology. I sometimes use "very dense algorithms" but that isn't very catchy. "The technology inappropriately referred to as AI" is obviously too many syllables. But as the commoditization of massively parallel processing and inference models continues, it'll be in the hands of more people, so we need a better name.

Any ideas?

Posted by IGnatius T Foobar on Mon Apr 15 2024 16:37:17 EDT
4 comments 0 new | permalink
My WWVB clock is still cool

Let's talk 60 kHz.

As you may know, I have a lovely WWVB-synchronized clock that I built by hand about five years ago.  Here's a photo:

The firmware is my own code and once in a while I tweak it a bit to try to get a little better performance out of it.  The little module on the top left, just below the ferrite antenna, is a WWVB receiver module that pulls in the 60 kHz signal and outputs it on a single pin as a high-or-low state.  The pulse train from this pin must then be decoded.  I think you can see that I used an Arduino Nano as the brains of the operation.  It then speaks to the display using I2C.

We had a brief power blip, just for a few seconds, on Monday morning.  Here on Wednesday evening the clock still hadn't picked up a full frame of time code.  There is no battery-backed RTC on this thing, and no manual setting buttons (a deliberate design choice), so it's been sitting there with a blank screen for three days.  Not cool.

As I drifted off to sleep last night, I noticed that the time code pulses (the yellow LED) were perfectly readable just by watching them, so why couldn't I train the software to be smarter?  What it needed was some hysteresis.  The original code read the receiver pin every loop cycle and considered high=1 and low=0, so if there was even the tiniest flickering of the signal in the middle of a pulse, it would throw the whole frame away.  I added some new code: now we take 100 samples (which still happens in less than 1 millisecond), and if at least 20% of those samples are high we consider that a 1; fewer than 20% is a 0.  If that sounds like a ridiculously low threshold to you, I agree.  But I originally set it to 50% and even that was not enough.
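The threshold logic is simple enough to sketch in a few lines of C++.  The function and constant names here are mine, not the actual firmware's:

```cpp
// Decide whether a pulse reads as "high" from an oversampled batch of
// pin reads.  At least 20% of the samples high counts as high; a 50%
// majority vote proved too strict on this noisy signal.
bool pulseIsHigh(int highReads, int totalReads) {
    // highReads / totalReads >= 0.20, kept in integer arithmetic
    return highReads * 5 >= totalReads;
}
```

On the real hardware the 100 reads would come from `digitalRead()` on the receiver pin in a tight loop, with the count fed into a check like this one.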

The photo above was taken this evening, right after I brought the clock back into my bedroom after reprogramming it.  It locked on to the time signal in the very first minute and set the clock.

That's what the red LED on the bottom is, by the way.  I cut out the Nano's power LED.  The built-in LED on D11 illuminates when the software has detected the top-of-minute frame marker and is locked in and receiving time code.  If it receives any pulse length other than 200 (logical 0), 500 (logical 1), or 800 (marker bit) milliseconds, it assumes it's receiving junk and throws the frame away.  The red LED goes out and it waits for a new frame to begin.  If it completes the frame, however, it sets (or resets) the time, and illuminates the green LED.  When I see green, I know it has performed a complete synchronization within the last 24 hours.
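The pulse classification can be sketched like this (the ±100 ms tolerance window is my own assumption; the real firmware may use different limits):

```cpp
enum Symbol { SYM_ZERO, SYM_ONE, SYM_MARKER, SYM_JUNK };

// Classify a received pulse by its measured length in milliseconds.
Symbol classifyPulse(int lengthMs) {
    auto near = [](int v, int target) {
        return v >= target - 100 && v <= target + 100;
    };
    if (near(lengthMs, 200)) return SYM_ZERO;    // logical 0
    if (near(lengthMs, 500)) return SYM_ONE;     // logical 1
    if (near(lengthMs, 800)) return SYM_MARKER;  // top-of-minute marker
    return SYM_JUNK;  // anything else: junk, throw the frame away
}
```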

It's got a few more tricks up its sleeve, however.

As previously noted, there's no battery-backed RTC.  It's a completely free-running software clock timed by the 1 ms timer interrupt on the Arduino.  That means 60,000 timer ticks is one minute.  Or is it?  As it turns out, the accuracy of this timer is terrible.  One minute is somewhere around 60,000 ticks, give or take a few hundred.  So even if we didn't receive a full frame of time code, if we receive the start-of-minute marker we can do a couple of things:

1. We know the end of the marker is exactly 800 ms past the top of the minute.  So if the software clock reads anywhere between :45 and :15 seconds, we snap to :00.8.

2. If we have two top-of-minute markers in a row, even if the time code in between was garbled, we now know the exact number of timer ticks that comprise a minute on this hardware.  So we save that number.  Up to ten of them, actually.  And we average those out, and that becomes our reference minute.
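The averaging step might look something like this (a sketch under my own naming, not the actual firmware):

```cpp
// Keep the last ten marker-to-marker tick counts in a ring buffer
// and use their mean as the calibrated "reference minute".
constexpr int kMaxSaved = 10;
long savedTicks[kMaxSaved];
int savedCount = 0;
int nextSlot = 0;

void recordMinute(long ticks) {
    savedTicks[nextSlot] = ticks;
    nextSlot = (nextSlot + 1) % kMaxSaved;  // overwrite the oldest entry
    if (savedCount < kMaxSaved) ++savedCount;
}

long referenceMinute() {
    if (savedCount == 0) return 60000L;  // nominal: 60,000 x 1 ms ticks
    long sum = 0;
    for (int i = 0; i < savedCount; ++i) sum += savedTicks[i];
    return sum / savedCount;
}
```

The ring buffer means a single garbled measurement ages out after ten good minutes instead of polluting the average forever.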

Bonusfest: there's also a cadmium sulfide (CdS) photocell detecting ambient light.  We crank up the display brightness when the room is bright, to avoid it washing out, and we dim the display in darkness, so it doesn't blasticate the room with display light while we're trying to sleep.
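A minimal sketch of the auto-brightness mapping.  The thresholds and levels here are made up for illustration; on an Arduino, `analogRead()` of a photocell divider returns 0 to 1023:

```cpp
// Map an ambient-light reading (0-1023) to a display brightness level.
int brightnessFor(int lightReading) {
    if (lightReading > 700) return 15;  // bright room: full brightness
    if (lightReading > 300) return 8;   // ordinary room light
    return 1;                           // dark room: dim the display
}
```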

All in all it's been a ton of fun, and in addition to being a useful bedroom clock that never needs to be set manually, it also lets me geek out every night while I am drifting to sleep by watching das blinkenlichten.

Posted by IGnatius T Foobar on Wed Oct 11 2023 23:54:28 EDT
0 comments 0 new | permalink
In death, as in life...

David Heinemeier Hansson is the CTO of 37signals, the company behind HEY and Basecamp. One feature of HEY is that you can post to your blog simply by composing a new message addressed to it. It works well, but I've got news for ya, Dave: Citadel has had that feature for well over a decade.

I like him, though. He's got good insights and is a fun read. If you're reading my blog here on Uncensored, and you want to read his blog too, go to the hidden room called "DHH" to follow it.

But this blog post isn't about him. It's about people I *don't* like. Specifically, Dianne Feinstein, and Richard Marx Stallman. The former recently died, and the latter is terminally ill. And so, out come the chants of "YOU MUST RESPECT TEH DEAD!!!1"

Nope. In death, as in life, and as I have blogged to death (heh) in this location before: RESPECT IS EARNED. If you were not worthy of respect when you were alive, you don't suddenly become respectable simply because you're now taking a dirt nap. DiFi was a scumbag of the worst kind in life, and she remains a scumbag in death.

As Richard Marx Stallman once said of Steve Jobs: "I'm not glad he's dead, but I'm glad he's gone." Stallman was glad that Jobs was no longer a malign influence on computing. It is one of the few things I agree with him on. And when Stallman himself is gone, I will say the same thing about him.

To wish death upon another human being or to celebrate their death is in poor taste, with one and only one exception: I will take the indulgence of celebrating the death of Bill Gates. May that day come soon and may Gates, the worst person in the history of the entire world, experience extreme pain and misery on his way down.

But no, if you're an Ultimate Scumbag you don't get respect simply because you died. Some of you might know that I come from a family with quite a few funeral directors in it. One of them was my great-grandfather, who famously said, "don't be concerned about dead people; it's the alive ones you've got to worry about."

Someday I too will pass from this side of eternity into the other side. I will continue to do my best to keep my affairs in order and maintain the respect of my family, friends, and peers. Because, for the thousandth time, respect is earned. I hope that I have done my best to earn the respect of you my readers. I usually only say this on March 10 (the site's birthday) but I truly appreciate the online company of each and every one of you.

Posted by IGnatius T Foobar on Sun Oct 01 2023 18:23:56 EDT
0 comments 0 new | permalink
IBM is obsolete

Thirty years ago, I completed a college degree and joined the millions of people who discovered that a college degree is completely useless and I want my money back. But that's not what this blog post is about.

My alma mater ran just about everything on a Burroughs A-9 mainframe. Burroughs later merged with Sperry to form Unisys, and at some point the school upgraded to a Unisys A-12 mainframe. Whatever. Today, they are still keeping track of class enrollments, grades, tuition, housing, and all the other day-to-day minutiae using the same software, but on "Unisys ClearPath" -- which of course runs on bog-standard AMD64 hardware and emulates the old mainframe.

Unisys people have accepted this fate. IBM people have not. Because they are morons.

From a purely subjective level, you know a platform is dead when the information stupidhighway is saturated with articles written by people who insist that it is not dead. Oh, it's so MODERN now, they breathlessly chant to anyone who is willing to listen (which is nobody, so they then move on to bothering people who don't want to hear about it). It has files and pipes and internet and cloud and rainbows and unicorns and all sorts of modern wonders!

This is very true with regard to AIX (sorry, "IBM p"), which is so dead that IBM has sacked all of their AIX developers in the United States and moved support and maintenance to an offshore sweatshop. But it's *extremely* true with regard to System/38, which eventually got renamed to "AS/400" and is now known as "IBM i" and runs on the same hardware as AIX, using the same CPU family that Apple abandoned two generations ago.

There's a litmus test. Simply ask yourself, "Would I build a brand new (greenfield) workload on this platform?" And I'm not talking about some bank or insurance company that has a bunch of old AS/400 stuff already running and just needs to add one more task. If you're opening a new organization with a new IT department and all new software, are you going to build them on "IBM p" or "IBM i"? No, you would get fired for that, and you would deserve it.

Anything from IBM is, without question, a legacy platform. You might be supporting existing workloads for a few more decades, but as a go-forward play it's dead. It's technical debt. Call it what it is, and stop trying to pretend otherwise. You sound like an idiot.

Posted by IGnatius T Foobar on Mon Apr 17 2023 00:10:33 EDT
7 comments 0 new | permalink
Thoughts from AWS re:Invent 2022

[DISCLAIMER: the opinions posted here do not necessarily represent those of my employer.]

To my great relief, I am now back in the Northeast after a trip to AWS re:Invent 2022.  For the benefit of no one in particular, I am now journaling my thoughts, in no particular order.

My biggest observation is, quite simply, to hell with Las Vegas.  It's not the place for me.  Very overstimulating.  Everyone and everything wants your attention, and there is almost no escape from it.  I can only imagine what my introvert daughter would do if she were there ... she'd probably curl into a ball with a blanket over herself and noise-canceling headphones on for the entire time.  There are not just lights, but jumbotrons everywhere.  On the sides of buildings, on the backs of trucks, there are flashing lights everywhere everywhere everywhere.  There is no keeping to yourself in Las Vegas; everyone and everything is in your face.

Amazon wasted a lot of space.  Their convention took up space in half a dozen different hotels.  And these aren't just ordinary hotels; each one of them is a mini city with a large convention center, a casino, an entire shopping mall, and thousands of rooms.  I believe they wasted a lot of space and they could have done the convention in maybe one or two of these hotels.  And they didn't have to spread it out all over the strip either.  How about using hotels that are all next to each other so you don't need buses to get between them!  Including the beautiful and fabulous Trump hotel, which they passed on because Amazon is full of the kind of people who work for Amazon.

How about the food?  The food in Las Vegas is overpriced.  It's good, but I'm from New York so I'm no stranger to good food.  It's just "good".  Not out-of-this-world.  But the food inside the convention?  Practically inedible.  Once again, it was put together by the kind of people who would work for Amazon.  Nearly all of it gluten-free, dairy-free, and taste-free.  Even the "ethnic" food was lousy: a man of Indian origin who sat at a table with me said "I have eaten a lot of curry, and this is not good curry."  Hey Amazon, how about you just put out a table full of hot dogs?  It's the easiest food in the world to serve to tens of thousands of people.  In the world of food, wide appeal is diametrically opposed to politically correct food fads.  After about the second day our team didn't even bother with the grub hall and we just went out for lunch.  Maybe they're counting on that.  (If I go back next year I'll probably hack the system by requesting a kosher meal.  Those looked edible.)

To be honest ... I really think that Amazon simply doesn't care.  They're the biggest name in technology right now and this whole convention is just a way for them to flaunt their bigness.  There's no pan-industry conference like COMDEX anymore, so "anyone who's anyone" simply shows up at re:Invent, sometimes with only a barely viable token connection to cloud computing.  And that's probably what Amazon is thinking: "bring the whole industry here, because we are the industry."  And for the time being, that is true, since "cloud" is the current mania.

(For the truth about cloud mania, read David Hansson's excellent blog post "Why We're Leaving The Cloud", in which he correctly points out the places where hyperscale cloud computing excels -- at the small end of the market, where a new organization can't afford infrastructure, and at the high end, where massive elasticity is needed -- and that the stable, predictable middle is better served by other hosting setups.)

As mentioned in the disclaimer, my opinions do not necessarily represent those of my employer.  As an IT architecture professional I work in both public and private cloud spaces.  As a technolibertarian I want a level playing field, and my opinion is that both Amazon and its conference are too damn big.  The only relief I found was at a Denny's across the street from my mega-hotel, where I sat, late in the evening, in a room that was not overcrowded, without flashing lights in my face, without loud noise everywhere, sipping some good coffee and finding my zen.  In that moment, I found a peaceful space that mimicked home, until I could finally get home.

Posted by IGnatius T Foobar on Sun Dec 04 2022 16:28:31 EST
1 comment 0 new | permalink