Truckers at Rest stops don't count.
On TOS/GEM...
It wasn't a terrible OS. There just wasn't a lot to do with it. It was squarely aimed to be an affordable and direct competitor to Mac OS Classic. There was even a focus on paperwhite mono screens at release - they were going after the DTP niche originally.
And the hardware itself was *faster*... but unfortunately, without the custom graphics and sound chips - it simply couldn't deliver the kind of multimedia experience that the Amiga pioneered. Side-scrolling platformers really suffered... You would get from the left side to the right side of the screen, and the entire scene would freeze as it scrolled the entire next screen back to the left side. Being that it was *ATARI* and most people expected it to be very competitive with entertainment - no amount of OS was going to make up for the machine's shortcomings. By the time they started to close the gap, it was already too late.
But if you wanted to make church newsletters or digital music - it was the tits, for sure.
From my view, ultimately the ST was primarily marketed toward musicians due to its early adoption of MIDI, which I think limited its market potential too much to survive even in the best of situations. ( and built-in SCSI was ahead of its time too, even if it was SCSI-1 ). With such a small target market they sort of sealed their fate. I do think they could have been a 'low cost alternative' to the Mac as the hardware was there, but due to lack of solid business programs and bad marketing, they missed the boat. Sure we had Calamus, AtariWorks, etc. and they did function, but it wasn't what business really wanted at the time.
While I am still a hard core fan ( no news there ) Atari dropped the ball when they didn't keep advancing the OS. It wasn't until we got MiNT, which was 3rd party at first, that we had a modern OS. ( sure you could run Minix, CP/M, or a few others, but those really didn't matter in the practical picture ) But by the time MiNT became official it was too late anyway, the WinTel boat built by IBM had already left port, and left us behind. ( and the Amiga people, and Digital Research crowd.. ) Apple was in a life raft trying to not drown, but the rest of us were all standing on the dock looking at each other wondering what happened while we fought among ourselves.
We did have custom sound ( but really the machine was meant for MIDI, so onboard sound was 'well, we need to make noises' ) and the blitter chip in the STE ( and newer models ). But I agree, too little too late.
The Falcon was a beast for its time. But again, it was so late in the game and so few people even knew it existed due to piss-poor marketing. And don't get me started on the ATW... Damned idiots didn't know what they had. It could have been a game changer and changed the entire market.
Linux is clearly where the flies swarm.
That's basically saying "I know you are but what am I"
I accept your concession. Now go play with the other mean kids.
was squarely aimed to be an affordable and direct competitor to Mac
OS Classic. There was even a focus on paperwhite mono screens at
I *liked* Mac OS Classic. It really was a shame that they gave up that UI when they finally put a real OS underneath it. Seems like that was the last time almost anyone was willing to do a sensible desktop where every element actually looked the way it acted and acted the way it looked.
Mac OS X led the charge into the overly skeuomorphic, 16 million color 3-D everything, which everyone else aped, and then Microsoft led the charge into the flatso look, which everyone else aped. Mac OS Classic, Amiga, CDE, even Windows 95 got it right. Then the whole world lost its mind.
Snarky meme poster obviously hasn't used Linux, which these days demands reboots more often than Windows.
Then you are doing it wrong. Only time I have to reboot Linux is with a kernel upgrade. Never due to an app upgrade. I have uptimes in the realm of months, and not longer only because I am too lazy to get a UPS and our power here blinks a lot ( we have overhead power, so trees )
Even after kernel upgrades, it's optional and you don't *have* to reboot. And yes, that could be avoided if it was not a monolithic blob. But hey, at least it's just the kernel that needs it, not the rest of the system.
Mon Mar 14 2022 12:29:47 PM EDT from nonservator
Snarky meme poster obviously hasn't used Linux, which these days demands reboots more often than Windows.
You should have liked TOS/GEM too. Very similar.
I agree we have lost something over the decades, adding flash and fancy stuff just to sell it as 'new', not really an improvement.
Mon Mar 14 2022 09:09:41 AM EDT from IGnatius T Foobar
I *liked* Mac OS Classic. It really was a shame that they gave up that UI when they finally put a real OS underneath it. Seems like that was the last time almost anyone was willing to do a sensible desktop where every element actually looked the way it acted and acted the way it looked.
Even after kernel upgrades, it's optional and you don't *have* to
reboot. And yes, that could be avoided if it was not a monolithic
blob. But hey, at least it's just the kernel that needs it, not the
rest of the system.
Remember the old days when all the computer science types were claiming that everything would go to microkernels?
Linux didn't, and still took over the world.
Windows did, and then they screwed it all up by continuing to put stuff in the kernel instead of building everything as kernel servers.
GNU Hurd *did* , and sucked so badly that it was never finished.
So maybe microkernels are a bad idea after all?
GNU Hurd *did* , and sucked so badly that it was never finished.
If they had wanted to, they probably could have built a halfway decent microkernel or mostly-microkernel based OS.
But they didn't want to actually *build* an OS, instead they wanted to spend 10 years *designing* the *perfect* OS.
You can get the Hurd kernel with a Debian user space, so I'm not sure it qualifies as never finished.
I agree it has issues, partially due to its leadership. But conceptually there's nothing wrong with microkernels. I think it's just a matter that monolithic was already there, getting the resources and marketing and acceptance. Microkernels, while a better idea, didn't offer enough to undo the damage. They do still exist but never hit 'mainstream'. ( tho if you include embedded Minix, there are more microkernels on 'PCs' than anything else now )
Mon Mar 14 2022 03:04:57 PM EDT from IGnatius T Foobar
Even after kernel upgrades, it's optional and you don't *have* to
reboot. And yes, that could be avoided if it was not a monolithic
blob. But hey, at least it's just the kernel that needs it, not the
rest of the system.
Remember the old days when all the computer science types were claiming that everything would go to microkernels?
Linux didn't, and still took over the world.
Windows did, and then they screwed it all up by continuing to put stuff in the kernel instead of building everything as kernel servers.
GNU Hurd *did* , and sucked so badly that it was never finished.
So maybe microkernels are a bad idea after all?
Doesn't matter much at this point. Hurd will go down in history as an interesting research experiment but not much more. Linux succeeded because others did what Stallman only talked about. And then Stallman has spent the last three decades doing little more than bitching about it.
I do agree with that.
It is a shame tho; perhaps if he was not involved it might have become something, as it IS a better design concept. But then again, without him the overall free-software movement would have been decades later, I think. He had a place and got things moving, but he should have stepped aside after that... "my job here is done, thank you" and exit stage left.
I forget who else ( other than Minix ) is using a microkernel. I think QNX does, and a few others. Not as much into operating systems ( or ISAs ) as I was 20 years ago. These days I just want it to work.
Fri Mar 18 2022 10:28:56 AM EDT from IGnatius T Foobar
Doesn't matter much at this point. Hurd will go down in history as an interesting research experiment but not much more. Linux succeeded because others did what Stallman only talked about. And then Stallman has spent the last three decades doing little more than bitching about it.
ST and Amiga started out as a fight amongst EACH OTHER that shouldn't have been. It should have been one platform - instead of two different companies fighting one another on small differences and dividing the market. It was the TIME though - right... where there would be a schism among the founding developers and architects, and instead of figuring it out, they would tear the baby in two and go in their different directions to try and prove to one another that one team was right and the other was wrong.
Then they would get bought out by some company and suits would come in and demand certain parameters and metrics be met, and both things would get wrapped up in the shady corporate side of the business. Atari and Commodore both had it all going for them and big head starts and managed to piss it all away through bad management, marketing...
The Mega, the Falcon, the Amiga 1200/3000/4000 - you're absolutely right about that - by the time Atari and Commodore got the product right, the corporations had screwed themselves beyond salvation.
The built-in MIDI on the ST was a killer feature - and had application beyond music. You used to be able to hook up to 16 STs together, as early as 1987, in a MIDI network, and then there was a game, MIDI Maze, a first-person 3D maze shooter where you were a smiling spherical happy face. It was digital laser tag, and it was networked LAN gaming, half a decade before PC gamers would start messing around with IPX/SPX networking and serial null-modem cables for FPS games like Duke Nukem. As much as Amiga put me ahead of the curve in being ready for Windows becoming the dominant corporate business platform, MIDI Maze is why I was way ahead of the curve on understanding FPS physics and strategies, resulting in a period when I was world-ranked in the top 500 Quake III Arena players.
So don't get me wrong about the ST. I had a 520 and a 1040 in addition to my Amiga 2000.... I know where it had advantages as well as I know where its weaknesses were.
As for the "If you have to reboot Linux, you're doing it wrong..."
THIS... exactly THIS.... what I've been saying about Linux and the people who love it and why it will never work, and exactly why Microsoft and Apple continue to dominate.
If you're telling the end user "It isn't the OS, it's that you're an idiot who doesn't know how to work it..."
You're going to remain a niche. Especially if you're doing it so much it kind of becomes the calling card of your platform. Linux, the "RTFM" OS for brilliant, socially maladjusted neckbeards. There is a REASON this stereotype exists of Linux.
Every time I apply a security patch on Debian it REQUIRES a reboot to apply the changes - JUST like OS X or Windows.
If there is a way to do it, an EXPERT level way to do it, that avoids this - the same COULD be implemented into OS X (probably already exists) and Windows...
But the fact is - it is almost ALWAYS easier to just reboot... especially for end-user consumers - or ANYTHING that isn't a fault-redundant, no-single-point-of-failure, mission-critical enterprise server.
It is EASIER to just reboot my Citadel every now and then and have a 30 second downtime than to learn the complex method by which I could apply security patches without rebooting. 99.8% of the WORLD feels this way about things like this. The other 0.2% are Linux evangelists who don't understand why all the other idiot users of other platforms won't admit that they're doing it wrong - and who can't wait to tell them so every chance they get.
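(Aside, for anyone who would rather check than argue: below is a rough sketch of how you could ask a Debian-family box whether a reboot is actually pending after patching. It leans on the usual Debian/Ubuntu conventions - the /var/run/reboot-required marker file and kernel images named /boot/vmlinuz-* - neither of which anyone in this thread has vouched for, so treat it as a sketch under those assumptions, not gospel.)

  #!/usr/bin/env python3
  """Rough sketch: does this Debian-family box actually want a reboot?

  Assumes the usual Debian/Ubuntu conventions (not taken from this thread):
    * packages touch /var/run/reboot-required when a restart is advised
    * installed kernel images live at /boot/vmlinuz-<version>
  """
  import glob
  import os
  import platform

  def reboot_marker_present() -> bool:
      # Package postinst scripts create this file when an upgrade
      # (kernel, libc, etc.) wants a restart.
      return os.path.exists("/var/run/reboot-required")

  def kernels() -> tuple[str, list[str]]:
      running = platform.release()                    # e.g. "5.10.0-21-amd64"
      installed = sorted(p.split("vmlinuz-", 1)[1]
                         for p in glob.glob("/boot/vmlinuz-*"))
      return running, installed

  if __name__ == "__main__":
      running, installed = kernels()
      print("reboot-required marker:", reboot_marker_present())
      print("running kernel:        ", running)
      print("installed kernels:     ", installed or "(none found in /boot)")
      # No marker, and the running kernel is the newest one installed?
      # Then the post-patch reboot really is optional, as claimed earlier
      # in the thread.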
More or less, if there is something mission critical I am serving that needs 100% uptime, this can be practically achieved on Windows as easily as on Linux, if you have that kind of corporate budget to put the right infrastructure in place (and we know, even with all those checks and balances and redundancy in place, getting true 100% uptime is a unicorn, on ANY platform, even Google, Facebook and Amazon have had downtime).
If you're not that important - if 20 seconds of downtime doesn't measure in a 7 figure revenue hit... or even a 5 figure one... then 10 minutes of downtime, 30 minutes of downtime, a WEEK probably doesn't matter - let alone a few minutes to reboot a server.
It is this benchmark that *nix users pride themselves on that is a leftover of the days when you would get BSODs 2 or 3 times in an 8 hour day and you would lose hours of work if you hadn't saved often... and it took 10 minutes to get back to where you were each time - which WAS a problem at one point. That doesn't happen anymore. Linux users are living in a past where this DID matter and they thought THEY were the solution. That never really happened.
Machines boot in 7 seconds now... even faster, and they hardly ever crash, and they all need an occasional reboot for the average user - and it isn't any big deal - and that is the same experience no matter WHAT OS you run.
I agree with the fight. But not so much that it shouldn't have been. They really targeted 2 different markets and the 'fight' made them both improve, as innovation won't take place without some competition.
Problem is for a while it was ONLY about the competition between us, and we stopped paying attention to what was around us, as IBM/Intel/MS strolled past and took the prize while we bickered.
And while I don't like modern Apple, back then at least they did have the internal goal of 'doing better'. Tho at the cost of, well, cost... man their stuff was expensive for what it was.
Sat Mar 19 2022 09:48:40 AM EDT from ParanoidDelusions
ST and Amiga started out as a fight amongst EACH OTHER that shouldn't have been. It should have been one platform - instead of two different companies fighting one another on small differences and dividing the market.
Sort of like the 2 drivers in 1st and 2nd place.. get out and start punching each other in the face as one cut the other off. Meanwhile the dude in 3rd place, 10 laps behind, just slowly catches up and rolls past without any extra effort, giving them the finger as they *still* were punching each other, not paying attention.
You're going to remain a niche. Especially if you're doing it so much
it kind of becomes the calling card of your platform. Linux, the
"RTFM" OS for brilliant, socially maladjusted neckbeards. There is a
REASON this stereotype exists of Linux.
Dude, that ship has sailed. Microsoft has lost every OS sector except the desktop. Believe me, I work in the data center business. *Everything* is Linux. The cloud is Linux, telecom is Linux, supercomputing is Linux, mobile is Linux, no one runs Windows Server anymore except to run Microsoft's own server software, and even that is questionable (SQL Server now runs on Linux, for example).
And the fact that the desktop is the only place where an oddball OS is still dominant is kind of an albatross around its neck at this point. If you're comfortable with rebooting, you and the rest of the 99% can keep doing it.
The other 1% are the IT people who keep the world running.
And I could be wrong, but it 'feels' like the foundation is starting to show cracks on the desktop too... I think the coming Windows 11 debacle, where they obsolete so many functioning machines, may be a tipping point. Most people are smart enough now to go 'wtf, why do I need a new computer, it was fine yesterday'.