Hi Grandma!

My grandmother supposedly died in 2017.  Except she didn't.

One of her favorite restaurants was Piazza Roma on Route 6 in Mohegan Lake.  She loved when we went there.

She wasn't happy when my aunt and uncle moved out of town in 2016 and were not up front with her about their intentions.  She was also not happy when my sister got married in 2017 and had the wedding out of town, with Grandma too frail to travel.

So it's pretty clear what happened.  Grandma didn't die.  She faked her own death and has been hiding at Roma's ever since.  She's 101 years old now and as far as I'm concerned she's still there.

A couple of months ago my father died at age 81 with mid-to-late-stage Alzheimer's disease.  It is also very clear that, because of the Alzheimer's, he forgot that he was dead and went to hang out with Grandma at the restaurant.

I'd like to go visit them both, but the restaurant isn't there anymore.



Posted by IGnatius T Foobar on Tue Dec 16 2025 02:40:47 UTC
0 comments 0 new | permalink
It's not "AI", please call it...
As all sensible people know, the technology that pundits are calling "Artificial Intelligence" is neither artificial nor intelligent.

This is not to say that these systems aren't doing some very impressive things. But they aren't sentient, they never will be, and some of the noisiest and most unpleasant people are leading the conversation in the cluelessosphere tech press.

Let's come up with a better term to refer to this technology. I sometimes use "very dense algorithms," but that isn't very catchy. "The technology inappropriately referred to as AI" is obviously too many syllables. But as the commoditization of massively parallel processing and inference models continues, this technology will be in the hands of more and more people, so we need a better name.

Any ideas?

Posted by IGnatius T Foobar on Mon Apr 15 2024 20:37:17 UTC
4 comments 0 new | permalink
My WWVB clock is still cool

Let's talk 60 kHz.

As you may know, I have a lovely WWVB-synchronized clock that I built by hand about five years ago.  Here's a photo:

The firmware is my own code, and once in a while I tweak it a bit to try to get a little better performance out of it.  The little module on the top left, just below the ferrite antenna, is a WWVB receiver module that pulls in the 60 kHz signal and outputs it on a single pin, as a high-or-low state.  The pulse train from this pin must then be decoded.  I think you can see that I used an Arduino Nano as the brains of the operation.  It then speaks to the display using I2C.
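
For the curious, the I2C side is nothing exotic.  Roughly along these lines (the display type and address here are placeholders, not the actual part):

    // Illustrative only: a generic I2C write using the Arduino Wire library.
    // The display address is a placeholder, and Wire.begin() is assumed to
    // have been called in setup().
    #include <Wire.h>

    const uint8_t DISPLAY_I2C_ADDR = 0x70;   // hypothetical 7-bit address

    // Push a buffer of display data out in a single I2C transaction.
    void sendToDisplay(const uint8_t *data, size_t len) {
      Wire.beginTransmission(DISPLAY_I2C_ADDR);
      Wire.write(data, len);
      Wire.endTransmission();
    }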

We had a brief power blip, just for a few seconds, on Monday morning.  Here on Wednesday evening it still hadn't picked up a full frame of time code.  There is no battery-backed RTC on this thing, and no manual setting buttons (a deliberate design choice), so it's been sitting there with a blank screen for three days.  Not cool.  As I drifted off to sleep last night, I noticed that the time code pulses (the yellow LED) were totally readable just by looking at them, so why couldn't I train the software to be smarter?  What it needed was some hysteresis.  The original code read the receiver pin every loop cycle and considered high=1 and low=0, so if there was even the tiniest flickering of the signal in the middle of a pulse, it would throw the whole frame away.  I added some new code: now we take 100 samples (which still happens in less than 1 millisecond) and if at least 20% of those samples are high then we consider that a 1; fewer than 20% is a 0.  If that sounds like a ridiculously low threshold to you, I agree.  But I originally set it to 50% and it was still not enough.
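
The sampling logic looks roughly like this in Arduino-flavored C++.  This is a sketch of the idea, not the actual firmware; the pin number and function name are assumptions:

    // Take 100 quick samples of the receiver output (well under 1 ms total)
    // and call the bit "high" if at least 20% of them read high.  This rides
    // through brief flickers mid-pulse that used to invalidate a whole frame.
    // RADIO_PIN is an assumption; the real pin assignment may differ.
    const uint8_t RADIO_PIN = 2;

    bool sampleReceiver() {
      uint8_t highCount = 0;
      for (uint8_t i = 0; i < 100; i++) {
        if (digitalRead(RADIO_PIN) == HIGH) {
          highCount++;
        }
      }
      return highCount >= 20;   // 20% threshold; 50% proved too strict
    }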

The photo above was taken this evening, right after I brought the clock back into my bedroom after reprogramming it.  It locked on to the time signal in the very first minute and set the clock.

That's what the red LED on the bottom is, by the way.  I cut out the Nano's power LED.  The built-in LED on D11 illuminates when the software has detected the top-of-minute frame marker and is locked in and receiving time code.  If it receives any pulse length other than 200 (logical 0), 500 (logical 1), or 800 (marker bit) milliseconds, it assumes it's receiving junk and throws the frame away.  The red LED goes out and it waits for a new frame to begin.  If it completes the frame, however, it sets (or resets) the time, and illuminates the green LED.  When I see green, I know it has performed a complete synchronization within the last 24 hours.
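
That classification step boils down to something like the sketch below.  The symbol names and the tolerance window are illustrative, not lifted from the real firmware:

    // Classify a measured pulse width (in milliseconds) into a WWVB symbol.
    // Nominal widths: 200 ms = logical 0, 500 ms = logical 1, 800 ms = marker.
    // The +/-100 ms tolerance here is an assumption for illustration.
    enum Symbol { SYM_ZERO, SYM_ONE, SYM_MARKER, SYM_JUNK };

    Symbol classifyPulse(unsigned long widthMs) {
      const unsigned long TOL = 100;
      if (widthMs >= 200 - TOL && widthMs <= 200 + TOL) return SYM_ZERO;
      if (widthMs >= 500 - TOL && widthMs <= 500 + TOL) return SYM_ONE;
      if (widthMs >= 800 - TOL && widthMs <= 800 + TOL) return SYM_MARKER;
      return SYM_JUNK;   // anything else is junk: drop the frame, wait for a new one
    }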

It's got a few more tricks up its sleeve, however.

As previously noted, there's no battery-backed RTC.  It's a completely free-running software clock timed by the 1 ms timer interrupt on the Arduino.  That means 60,000 timer ticks is one minute.  Or is it?  As it turns out, the accuracy of this timer is terrible.  One minute is somewhere around 60,000 ticks, give or take a few hundred.  So even if we didn't receive a full frame of time code, if we receive the start-of-minute marker we can do a couple of things (sketched in code after the list):

1. We know the end of the marker is exactly 800 ms past the top of the minute.  So if the software clock reads anywhere between :45 and :15 seconds, we snap to :00.8.

2. If we have two top-of-minute markers in a row, even if the time code in between was garbled, we now know the exact number of timer ticks that comprise a minute on this hardware.  So we save that number.  Up to ten of them, actually.  And we average those out, and that becomes our reference minute.
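
In rough form, with names and details assumed rather than copied from the real firmware:

    // Trick 1: the end of the frame marker lands exactly 800 ms past the top
    // of the minute, so if the free-running clock already reads between :45
    // and :15 we snap it to :00.8.
    void snapToMarker(unsigned long &msIntoMinute) {
      if (msIntoMinute >= 45000UL || msIntoMinute <= 15000UL) {
        msIntoMinute = 800UL;   // :00.8
      }
    }

    // Trick 2: two consecutive top-of-minute markers give us the true number
    // of timer ticks in a minute on this hardware.  Keep the last ten and
    // average them into the reference minute.
    const uint8_t HISTORY = 10;
    unsigned long minuteTicks[HISTORY];
    uint8_t samplesStored = 0;
    uint8_t nextSlot = 0;

    void recordMinuteLength(unsigned long ticks) {
      minuteTicks[nextSlot] = ticks;
      nextSlot = (nextSlot + 1) % HISTORY;
      if (samplesStored < HISTORY) samplesStored++;
    }

    unsigned long referenceMinute() {
      if (samplesStored == 0) return 60000UL;   // nominal value until calibrated
      unsigned long sum = 0;
      for (uint8_t i = 0; i < samplesStored; i++) sum += minuteTicks[i];
      return sum / samplesStored;
    }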

Bonusfest: there's also a cadmium sulfide (CdS) cell detecting ambient light.  We crank up the display brightness when the room is bright, to avoid it washing out, and we dim the display in darkness, so it doesn't blasticate the room with display light while we're trying to sleep.
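
That part is about as simple as it sounds.  A minimal sketch, assuming the cell is wired as a voltage divider into an analog pin and the display has a handful of brightness levels (neither detail is spelled out above):

    // Read ambient light from the CdS cell (assumed wiring: divider into A0)
    // and map it to a display brightness level.  The 1-15 range is illustrative.
    const uint8_t LIGHT_SENSOR_PIN = A0;

    uint8_t displayBrightness() {
      int ambient = analogRead(LIGHT_SENSOR_PIN);   // 0..1023
      return map(ambient, 0, 1023, 1, 15);          // brighter room -> brighter display
    }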

All in all it's been a ton of fun, and in addition to being a useful bedroom clock that never needs to be set manually, it also lets me geek out every night while I am drifting to sleep by watching das blinkenlichten.



Posted by IGnatius T Foobar on Thu Oct 12 2023 03:54:28 UTC
0 comments 0 new | permalink
In death, as in life...


David Heinemeier Hansson is the CEO of 37signals, the company behind HEY and Basecamp. One feature of HEY is that you can post to your blog simply by composing a new message addressed to it. It works well, but I've got news for ya, Dave: Citadel has had that feature for well over a decade.

I like him, though. He's got good insights and is a fun read. If you're reading my blog here on Uncensored, and you want to read his blog too, go to the hidden room called "DHH" to follow it.

But this blog post isn't about him. It's about people I *don't* like. Specifically, Dianne Feinstein and Richard Marx Stallman. The former recently died, and the latter is terminally ill. And so, out come the chants of "YOU MUST RESPECT TEH DEAD!!!1"

Nope. In death, as in life, and as I have blogged to death (heh) in this location before: RESPECT IS EARNED. If you were not worthy of respect when you were alive, you don't suddenly become respectable simply because you're now taking a dirt nap. DiFi was a scumbag of the worst kind in life, and she remains a scumbag in death.

As Richard Marx Stallman once said of Steve Jobs: "I'm not glad he's dead, but I'm glad he's gone." Stallman was glad that Jobs was no longer a malign influence on computing. It is one of the few things I agree with him on. And when Stallman himself is gone, I will say the same thing about him.

To wish death upon another human being or to celebrate their death is in poor taste, with one and only one exception: I will take the indulgence of celebrating the death of Bill Gates. May that day come soon and may Gates, the worst person in the history of the entire world, experience extreme pain and misery on his way down.

But no, if you're an Ultimate Scumbag you don't get respect simply because you died. Some of you might know that I come from a family with quite a few funeral directors. One of them was my great-grandfather, who famously said, "Don't be concerned about dead people; it's the alive ones you've got to worry about."

Someday I too will pass from this side of eternity to the other. I will continue to do my best to keep my affairs in order and maintain the respect of my family, friends, and peers. Because, for the thousandth time, respect is earned. I hope I have earned the respect of you, my readers. I usually only say this on March 10 (the site's birthday), but I truly appreciate the online company of each and every one of you.

Posted by IGnatius T Foobar on Sun Oct 01 2023 22:23:56 UTC
0 comments 0 new | permalink
IBM is obsolete

Thirty years ago, I completed a college degree and joined the millions of people who discovered that a college degree is completely useless, and I want my money back. But that's not what this blog post is about.

My alma mater ran just about everything on a Burroughs A-9 mainframe. Burroughs later merged with Sperry to form Unisys, and at some point they upgraded to a Unisys A-12 mainframe. Whatever. Today, they are still keeping track of class enrollments, grades, tuition, housing, and all the other day-to-day minutiae using the same software, but on "Unisys ClearPath" -- which of course runs on bog-standard AMD64 hardware and emulates the old mainframe.

Unisys people have accepted this fate. IBM people have not. Because they are morons.

On a purely subjective level, you know a platform is dead when the information stupidhighway is saturated with articles written by people who insist that it is not dead. Oh, it's so MODERN now, they breathlessly chant to anyone who is willing to listen (which is nobody, so they then move on to bothering people who don't want to hear about it). It has files and pipes and internet and cloud and rainbows and unicorns and all sorts of modern wonders!

This is very true with regard to AIX (sorry, "IBM p"), which is so dead that IBM has sacked all of their AIX developers in the United States and moved support and maintenance to an offshore sweatshop. But it's *extremely* true with regard to System/38, which eventually evolved into the "AS/400" and is now known as "IBM i" and runs on the same hardware as AIX, using the same CPU that Apple abandoned two generations ago.

There's a litmus test. Simply ask yourself, "Would I build a brand new (greenfield) workload on this platform?" And I'm not talking about some bank or insurance company that has a bunch of old AS/400 stuff already running and just needs to add one more task. If you're opening a new organization with a new IT department and all new software, are you going to build it on "IBM p" or "IBM i"? No, you would get fired for that, and you would deserve it.

Anything from IBM is, without question, a legacy platform. You might be supporting existing workloads for a few more decades, but as a go-forward play it's dead. It's technical debt. Call it what it is, and stop trying to pretend otherwise. You sound like an idiot.



Posted by IGnatius T Foobar on Mon Apr 17 2023 04:10:33 UTC
7 comments 0 new | permalink