I don't think so.
Most real work doesn't get done in the browser. The same problems that applied to thin computing back when that Oracle CEO was promoting it as "The Next Big Thing" apply today.
Whatever you can do on your REMOTE computer that is WORTH doing... I can do better, faster and cheaper on my LOCAL computer. Inevitably. It is the same reason FPGA is superior to software emulation, regardless of the theory that Turing machines and enough cycles make it impossible to tell the difference. It is the same reason that deep fakes are still detectable: they can't overcome the uncanny valley. The more abstraction layers between the immediate reality and the virtualized one - the cloud, the hardware abstraction layer, the CGI-animated embedded face... the less *satisfying* the end experience is. The less convincing it is. The more distant it feels.
You can't overcome that paradox.
If I want "good enough", sure, I can edit a movie or design a 3D printed STL or create a spreadsheet or a database on a web based program.
But if I want to do it WELL - I don't do it on someone else's processors. I do it on my own.
There is a balance. People who have *ideologies* behind their views on technology are very rarely able to see those balances. They see "defeating X" as the motivating factor.
But it isn't. Doing it the most efficiently, the least expensively, the easiest, the most reliably... there are a billion factors more important to whether a platform, technology or solution succeeds than whether it will displace Windows or OS X or Google or Facebook. That is why technology is littered with the corpses of good ideas that were technically superior on paper but were never adopted.
It is why some of you will constantly be disappointed when the crushing defeat you are banking on never really arrives. It will ebb and flow - because both paradigms have advantages and disadvantages that are highly situational.
Mon Apr 12 2021 14:57:30 EDT from IGnatius T Foobar
They certainly won, but it was a Pyrrhic victory.
The real "browser war" was not between Netscape's browser and Microsoft's browser. It was the browser as a platform vs. Windows as a platform. The browser won, it won decisively, and it won big. Netscape unfortunately died in battle.
And that's ok, because all of the Netscape people I've met or read on the Internet have been monumental assholes.
And we've been watching this ebb and flow for what... 60 years now. It should be obvious by now... thin-client vs. local computing is like a tide going in and out. You may think thin client is winning NOW, and you may DIE thinking it won...
But the tide will go back out - and the power of local computing will come in on the next tide... it is clearly cyclic.
We don't put applications behind the glass because they run better there.
We do it because distributing software to all of its users and keeping every installation maintained is a gigantic nuisance.
Some things just run better locally. Games, graphics processing, video processing, stuff that needs high power and low latency, if you can get your own computer running it, it's just better.
But your ordinary information-based applications ... those are never coming back to the desktop. They shouldn't. They ran better centrally before the personal computing revolution, they run better centrally now, and it simply works better.
You forgot the lock-in angle too.
"you now get to pay, forever, to access your stuff" Renting software out is far more profitable than selling it. And when you can hold their data hostage, even the better.
Tue Apr 13 2021 09:28:22 EDT from IGnatius T Foobar
We don't put applications behind the glass because they run better there.
We do it because distributing software to all of its users and keeping every installation maintained is a gigantic nuisance.
Some things just run better locally. Games, graphics processing, video processing, stuff that needs high power and low latency, if you can get your own computer running it, it's just better.
But your ordinary information-based applications ... those are never coming back to the desktop. They shouldn't. They ran better centrally before the personal computing revolution, they run better centrally now, and it simply works better.
I think Ig makes the mistake that a lot of software developers make.
"It is EASIER for the developers means it is better for the end users."
Which is hardly EVER the actual case. Especially in the long run. It may start out that way - but eventually it turns into, "what is better for the developers is best, and the end users are locked in, so screw them."
And the lock-in angle is another actual example of that.
It tends to cause the ebb and flow. Here is the deal - eventually these web-based solutions are going to make mistakes, and businesses and individuals are going to seek alternatives. More than that - offering a local alternative is going to be a market opportunity for some group of developers against an incumbent. Someone will INNOVATE and use local deployment as a value-add to compete with server-side deployment.
Nurb didn't even have to try hard to come up with one of the major liabilities of distributed computing and software.
The idea of "continuous improvement" is BS too. Every time I boot up a Linux machine, there are dozens of updates. Same for OS X, Windows, and honestly - even my XBox.
There was a time when things weren't constantly being updated and tweaked remotely. You bought a software package - and MAYBE there was an update if there was a critical bug - but generally, you weren't part of this circus. Adobe moves your cheese so constantly that the biggest challenge in Photoshop and Illustrator is figuring out how to do something you knew how to do perfectly just last week.
There are unique disadvantages to this model. Eventually, it'll cause a backlash in the market - and the developers don't get to CONTROL the market - they respond to it.
Hell, Corel has actually fought their way into the position of viable competitor against Adobe simply by offering an alternative to the Creative Cloud distribution model.
Tue Apr 13 2021 10:04:24 EDT from Nurb432
You forgot the lock-in angle too.
"you now get to pay, forever, to access your stuff" Renting software out is far more profitable than selling it. And when you can hold their data hostage, all the better.
Tue Apr 13 2021 09:28:22 EDT from IGnatius T Foobar
We don't put applications behind the glass because they run better there.
We do it because distributing software to all of its users and keeping every installation maintained is a gigantic nuisance.
Some things just run better locally. Games, graphics processing, video processing, stuff that needs high power and low latency, if you can get your own computer running it, it's just better.
But your ordinary information-based applications ... those are never coming back to the desktop. They shouldn't. They ran better centrally before the personal computing revolution, they run better centrally now, and it simply works better.
When I worked at Ford back in the early 90s, they had a plant floor system that people used on the assembly lines on a daily basis. To avoid the problem most dev teams have (sitting in a cube thinking they do good stuff), they had teams made up of both dev people AND people from the floor. Really wonderful and usable stuff came out of that process of including the actual users up front in the requirements and design phases. And to keep from getting stale, they would rotate people on/off the floor every 6 months or so, and move dev guys to different plants.
After I left, I heard that to cut costs they dropped the co-op program and went back to isolated developers (in another state, perhaps another country, I dunno) and the results were dismal. In a matter of months, everyone hated the system and started destroying the floor terminals again. These weren't traditional terminals, but dedicated-use, huge NEMA-style weather-sealed cabinets with touch screens. The first thing I noticed is that now they required a keyboard and mouse (in a manufacturing plant.. wtf). Complexity of use went through the roof.
*thumbs up*
Developers are often too smart to design things that work well for dumb, or even average, people - and they've got short tempers for people who aren't smart enough to see how superior the way they DID design it is - provided, of course, you're smart enough to understand it conceptually.
Tue Apr 13 2021 11:43:03 EDT from Nurb432
When I worked at Ford back in the early 90s, they had a plant floor system that people used on the assembly lines on a daily basis. To avoid the problem most dev teams have (sitting in a cube thinking they do good stuff), they had teams made up of both dev people AND people from the floor. Really wonderful and usable stuff came out of that process of including the actual users up front in the requirements and design phases. And to keep from getting stale, they would rotate people on/off the floor every 6 months or so, and move dev guys to different plants.
After I left, I heard that to cut costs they dropped the co-op program and went back to isolated developers (in another state, perhaps another country, I dunno) and the results were dismal. In a matter of months, everyone hated the system and started destroying the floor terminals again. These weren't traditional terminals, but dedicated-use, huge NEMA-style weather-sealed cabinets with touch screens. The first thing I noticed is that now they required a keyboard and mouse (in a manufacturing plant.. wtf). Complexity of use went through the roof.
Or smart people in other fields they don't understand.
Though they had their faults, EDS (in the old days) had that right. They had you 'live' with the customers for a while before you were allowed to do anything of any value.
Tue Apr 13 2021 14:09:33 EDT from ParanoidDelusions
*thumbs up*
Developers are often too smart to design things that work well for dumb, or even average people -
Don't underestimate brilliant marketing and dumb luck.
Thu Apr 15 2021 15:49:55 EDT from IGnatius T Foobar
That's ok, I know this company in Armonk that designed a personal computer that was so awful compared to its competitors that it became the industry standard design for all computers for decades to come.
Cool. Pine64 released their watch. Cheap too.
https://pine64.com/product-category/pinetime-smartwatch/?v=0446c16e2e66
Sort of funny to think: here I am doing an update on an SBC that cost under 150 bucks and has more 'power' than one could even dream of having perhaps 20 years ago. It's got a 2 GB OS update today, and I don't even blink an eye.
Some days its the little things.
When you want to rsync ~400 GB across 2 USB devices, be sure you plug BOTH into a USB 3 port, not just one, if you want it to finish in your lifetime.. (idiot)
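For anyone else who trips over this, here's roughly the sanity check I mean before kicking off a big copy. The mount points /mnt/src and /mnt/dst are just placeholders for wherever your two drives end up:

    # Show the USB topology with the negotiated speed for each device.
    # 480M means the drive fell back to USB 2.0; 5000M (or better) is USB 3.
    # Both drives should report 5000M before you start.
    lsusb -t

    # Archive-mode copy with overall progress, so a slow link is
    # obvious in the first minute instead of hours later.
    rsync -a --info=progress2 /mnt/src/ /mnt/dst/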
Not the first attempt at something like this, but it looks like they have a good chance. If they make it to ARM and RISC-V support, I might just have to get one, as that means they are going to be around a while.
But they need to market it and get some retailers involved.. I ran across this just by chance.
I wouldn't have built it with a motherboard. I would have built it with a passive midplane that carries power and PCI-E lanes, and put maybe six internal bays in the machine, one of which would be for a "brain" that feeds the bus.
Basically like an S-100 but much smaller.
Agreed, it was a shame they used a 'traditional' MB form factor.
From the videos and stuff I have seen, you can fairly easily remove it for upgrades, or architecture swaps for people like me, but making it a "module" you shove in the back, like their USB-C modules, would have been even better.
Forgot there was a company who tried this before; even their 'CPU' was on a PCMCIA card you shoved into the side of the device, like a modem, but it was the 'brains' instead.
Wonderful idea, but they didn't survive long.
Not quite to the level we would all want, but I was updating the m.2 in my PinebookPro today and decided to take a picture. (Finally found one that fits properly - it's thin enough; many bulge the case.)
It is modular. Their intent is to offer new mainboards as time goes on.