[#] Tue Sep 07 2021 17:16:20 EDT from Nurb432


My understanding is that the reason they moved away from PPC was power consumption, not overall supply issues. "We need to put more powerful chips in laptops NOW, not in 5 years" - and the chips just weren't to that point, so Steve bailed.

And I do agree with him, in that while PPC was a far better architecture than x86 could ever hope to be, it was never really meant for mobile use, and the market said: go mobile or go home.



[#] Wed Sep 08 2021 02:41:30 EDT from ParanoidDelusions


They've been playing this "we're *actually* kicking Intel's ass" game all the way back to the 68000. They never actually have been. In carefully optimized benchmarks run on fine-tuned machines, using very specific software designed specifically for their platform, they turn in a benchmark number that is higher.

But in real-world application it generally means shit. It is kind of like 0-60 times published by automotive journalists. For things that people actually do - whenever impartial people have run real-world benchmarks in any generation where Apple claimed to be the fastest and most powerful - the highest-end Intel processors were always smoking the Mac stuff. Or if they weren't at the start, they were shortly thereafter. For a while there might have been some leapfrogging - but eventually Intel vs. AMD became the rivalry for actual performance, and it wasn't long after that Apple announced it was abandoning PPC for Intel.
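
The kind of software that produces those headline numbers is easy to picture: one branch-free, perfectly vectorizable hot loop, nothing like a real application mix. A minimal C sketch of that shape (hypothetical - not any benchmark Apple actually shipped):

/* A deliberately benchmark-friendly kernel: a single hot loop with no
 * branches, no I/O, and perfectly predictable memory access. Vector
 * units (AltiVec then, SSE/AVX now) eat this alive, which is why
 * "very specific software" benchmarks so well. Build with -O2, then
 * with -O3 -march=native, and watch the number change. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1 << 24)

int main(void) {
    float *x = malloc(N * sizeof *x);
    float *y = malloc(N * sizeof *y);
    for (int i = 0; i < N; i++) { x[i] = 1.0f; y[i] = 2.0f; }

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (int i = 0; i < N; i++)
        y[i] = 2.5f * x[i] + y[i];        /* saxpy: y = a*x + y */
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double ms = (t1.tv_sec - t0.tv_sec) * 1e3
              + (t1.tv_nsec - t0.tv_nsec) / 1e6;
    printf("saxpy over %d floats: %.2f ms (y[0] = %f)\n", N, ms, y[0]);
    free(x);
    free(y);
    return 0;
}

Real workloads mix branches, cache misses, disk waits, and GPU stalls - which is exactly where numbers like that stop mattering.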

A great example is the 2012 15" i7 MacBook Pro. It was hailed as running Windows faster than Windows-specific Intel machines with the same specs at the time. Lots of Apple guys were gloating about this when it first came out. I remember the benchmarks comparing it to an HP flagship and showing it easily outpacing it.

But this particular system has the ATI Radeon flaw in it - and if you run it at full power for very long, it cooks itself. Apple was already dealing with heat-dissipation issues, and the throttling is super aggressive on this model. It is like having a twin-turbo engine with a front-mount intercooler so small it heat-soaks the first time you get on the gas - and worse, that will eventually crack the head, sooner rather than later.

Even very recent MacBook Airs have this issue. There is a video on YouTube where a guy runs one in a tub of ice water to show how much more performance it could get if it weren't so heat-constrained. The difference was significant.

This is *why* Apple is going with ARM: they can't get Intel to run cool enough, and their design ethos calls for low-power computing. And yeah, I get that the M1 is hitting some mid-range Intel Core performance numbers at much lower thermals - but again, they're not comparing it to a latest-gen Core i9, and there are Core i5s that skunk my 2012 i7. And of course, no one is suggesting that the M1 - or anything Apple will produce on ARM any time soon - will ever challenge Xeon processors. Intel has a "no replacement for displacement" ideology - not caring about wattage when going after raw performance - and I doubt ARM's ideology will ever get out in front of that.






Tue Sep 07 2021 15:27:52 EDT from LoanShark
two vendors. ARM brought it in-house, which to me sounds more like
hubris than pragmatism.

Maybe not; Apple has been building ARM chips for years in the iPhone and iPad products, so they know what they're doing by now.

M1 is probably outperforming Rocket Lake on some metrics.

 



[#] Wed Sep 08 2021 06:21:16 EDT from Nurb432


I dunno about that.

I had a couple of PPC machines.  Other than power use, they outperformed a comparable Intel box in everyday life for me (it's hard to compare; it's not 1:1). By a lot. Even something as simple as transcoding wav->mp3.
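
That workload is still trivial to reproduce with libmp3lame, if anyone wants to race their own hardware. A minimal sketch - it encodes a generated tone instead of parsing a real .wav so it stays self-contained; the bitrate and filename are arbitrary:

/* Minimal PCM -> MP3 encode using libmp3lame.  Build: cc tone.c -lmp3lame -lm
 * A real transcoder would read interleaved samples from a .wav file;
 * here we synthesize a 440 Hz stereo tone to keep the example standalone. */
#include <lame/lame.h>
#include <math.h>
#include <stdio.h>

#define RATE 44100
#define SECS 5
#define NSAMP (RATE * SECS)

static short pcm[NSAMP * 2];              /* interleaved stereo samples */
static unsigned char mp3[1024 * 1024];    /* plenty for 5 s at 192 kbps */

int main(void) {
    const double tau = 6.283185307179586;
    for (int i = 0; i < NSAMP; i++) {
        short s = (short)(10000 * sin(tau * 440.0 * i / RATE));
        pcm[2 * i] = pcm[2 * i + 1] = s;
    }

    lame_t gf = lame_init();
    lame_set_in_samplerate(gf, RATE);
    lame_set_num_channels(gf, 2);
    lame_set_brate(gf, 192);              /* 192 kbps CBR */
    lame_init_params(gf);

    FILE *out = fopen("tone.mp3", "wb");
    int n = lame_encode_buffer_interleaved(gf, pcm, NSAMP, mp3, sizeof mp3);
    fwrite(mp3, 1, n, out);
    n = lame_encode_flush(gf, mp3, sizeof mp3);
    fwrite(mp3, 1, n, out);

    fclose(out);
    lame_close(gf);
    return 0;
}

Nothing exotic - but it's exactly the kind of tight DSP loop where architectures of that era genuinely differed.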

 

 



[#] Thu Sep 09 2021 14:39:29 EDT from ParanoidDelusions


I've owned *lots* of PPCs. I have 3 in the room I'm in now: a Power Mac 8500, which was the flagship Mac at one point, a G4 Cobalt, and a G4 Quicksilver (also the flagship at the time). They didn't outperform. Games - once again - are actually one of the most demanding things we task our machines with, and the PPC architecture ultimately could not deliver games as rich, complex, and demanding as Intel processors with the GPUs of the time could. The PPC had very specific functions where it was superior to Intel - but in real-world application it fell short.

I'm not saying that the PPC was a bad architecture. 

I do notice that people who do a lot of compiling, coding, and other "nuts and bolts" operations with their computers still rave about PPC performance. I don't consider these "real world" operations - a mere fraction of the computer-using population cares about these kinds of things - but when I argue with someone who has had a night-and-day experience, often once we peel the layers away, this is at the root of the disagreement.



Wed Sep 08 2021 06:21:16 EDT from Nurb432

I dunno about that.

I had a couple of PPC machines.  Other than power use, they outperformed a comparable Intel box in everyday life for me (it's hard to compare; it's not 1:1). By a lot. Even something as simple as transcoding wav->mp3.

 

 



 



[#] Thu Sep 09 2021 16:45:07 EDT from Nurb432


Games are meaningless to me.  To me everything else is the 'real world' workload, and there they would eat Intel for lunch.

And as you say, in modern games the CPU is not relevant anyway; it's all about the GPU.



[#] Thu Sep 09 2021 16:52:42 EDT from Nurb432


Besides, in my day, you played games on consoles.  We did work on computers. 

 

Now, get the hell off my lawn :) 

 

(Kidding aside, it really was like that. Games on computers? Nah.)



[#] Thu Sep 09 2021 17:15:19 EDT from Nurb432


OK, that is odd - suddenly the person who posted is no longer there... did I break something here?



[#] Thu Sep 09 2021 21:28:03 EDT from IGnatius T Foobar


They've been playing this, "We're *actually* kicking Intel's ass"
game all the way back to the 68000. They never actually have been. In

The 68000 was in every way superior to the 8086. Unfortunately for Motorola, they went to production just a *little* bit too late for it to be used in the 5150 PC. Things could have been soooooo much different. The 68000's clean, flat address space - 32-bit registers, no segments - would have been a dream to work in, and the PC might even have had a usable OS from the beginning.

I still cringe when I remember working in segmented memory mode, and I hope AMD (and its follower, Intel) drops support for Real Mode and anything else that doesn't work in a flat memory model VERY soon.
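
For anyone who never had the pleasure: a real-mode "address" is a 16-bit segment and a 16-bit offset, combined as segment*16 + offset. A minimal C sketch of the arithmetic - note how many different pairs name the same byte:

/* 8086 real-mode address arithmetic: physical = segment * 16 + offset.
 * Thousands of distinct segment:offset pairs alias every physical byte,
 * which is why comparing or normalizing far pointers was a constant
 * source of bugs. */
#include <stdio.h>

static unsigned phys(unsigned seg, unsigned off) {
    return (seg << 4) + off;              /* 20-bit result on a real 8086 */
}

int main(void) {
    /* Three different far pointers, one physical address: */
    printf("1234:0005 -> %05X\n", phys(0x1234, 0x0005));   /* 12345 */
    printf("1230:0045 -> %05X\n", phys(0x1230, 0x0045));   /* 12345 */
    printf("1000:2345 -> %05X\n", phys(0x1000, 0x2345));   /* 12345 */
    return 0;
}

Compilers papered over it with near, far, and huge pointers and half a dozen "memory models," which is most of what the cringing is about.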

[#] Fri Sep 10 2021 10:30:54 EDT from Nurb432


And don't forget the 6502 :)



[#] Sat Sep 11 2021 14:05:43 EDT from ParanoidDelusions


Games utilize all subsystems more completely than anything except possibly complex 3D rendering. If you're outperforming on games, you're outperforming on "real world" workloads in general - as far as your average consumer is concerned.

Again - it makes me wonder if there is something in what *you* consider "real world" workloads that is better optimized on the PPC architecture - because it is generally guys like you and Ig who rave about the PPC - which really had a brief moment of "best" that couldn't scale with the competition.




Thu Sep 09 2021 16:45:07 EDT from Nurb432

Games are meaningless to me.  To me everything else is the 'real world' workload, and there they would eat Intel for lunch.

And as you say, in modern games the CPU is not relevant anyway; it's all about the GPU.



 



[#] Sat Sep 11 2021 17:14:23 EDT from Nurb432


Technically PPC never went away. It just moved to more powerful devices, not suited for lame personal use (the AS/400, for example). And they dropped the "PC" from the name, just to prove the point :)

 



[#] Sun Sep 12 2021 10:51:34 EDT from Nurb432

[Reply] [ReplyQuoted] [Headers] [Print]

Lol.  Spending the day throwing things away (or packing them up to take to the local bookseller).  Ran across my PPC programmer's manual and data books in the second box.

 



[#] Sun Sep 12 2021 16:13:08 EDT from IGnatius T Foobar


architecture - because it is generally guys like you and Ig who rave
about the PPC - which really had a brief moment of "best" that
couldn't scale with the competition.

Still hoping those Intel shares are going to go above the strike price, are we? ;)

I actually don't have any feelings one way or another about PPC/POWER. My complaint is that Intel made the stupid decision to use segmented addressing when flat 32-bit spaces had already been in use for some time. It created a lot of hassle, and spawned a lot of bad architecture decisions such as Bill Gates' "640K ought to be enough for anyone".

It took decades of evolution to get to where we are now, and we might have gotten there sooner if we weren't dragging such a gigantic pile of legacy cruft behind us. Honestly, now that all modern operating systems have the ability to boot directly into a *flat* 64-bit address space using UEFI, and will stay there for the entire uptime of the machine, it's time to ditch the legacy modes completely.

That means no BIOS, no real mode, no 16-bit protected mode, no virtual-8086 mode. We should be segmenting people who want to keep segmented addressing.
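
To make the cruft concrete, even the magic numbers of the PC memory map fall straight out of that segment arithmetic - the 640K ceiling, and the A20 gate every PC carried for decades just to emulate an 8086 wraparound. A worked C sketch:

/* Where the PC memory map's magic numbers come from. */
#include <stdio.h>

int main(void) {
    /* The highest address segment arithmetic can form is FFFF:FFFF: */
    unsigned top = (0xFFFFu << 4) + 0xFFFFu;                /* 0x10FFEF */
    printf("FFFF:FFFF = 0x%X (just past 1 MB)\n", top);

    /* An 8086 had only 20 address lines, so that silently wrapped;
     * the A20 gate existed solely to reproduce the wrap forever after. */
    printf("after the 20-bit wrap: 0x%05X\n", top & 0xFFFFFu);

    /* The famous 640K: 1 MB minus the 384K above 0xA0000 reserved
     * for video memory, adapter ROMs, and the BIOS. */
    printf("conventional memory: %u K\n", 0xA0000u / 1024);  /* 640 */
    return 0;
}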

[#] Sun Sep 12 2021 16:29:20 EDT from Nurb432


That is something I always wonder about too. Sure, it has held us back and made a lot of things bloated and unstable, but would a complete redo too early on have killed the entire industry?  It's possible.

Sun Sep 12 2021 04:13:08 PM EDT from IGnatius T Foobar
 if we weren't dragging such a gigantic pile of legacy cruft behind us. 

 



[#] Sun Sep 12 2021 17:12:59 EDT from LoanShark



I haven't really looked at M1 vs Intel in any kind of detail, but if I could get 95% of the single-thread performance at 50% of the power consumption vs Rocket Lake, that's probably a tradeoff I would take.

(I'm typing this from a Ryzen 3900X right now. It does not have the single-thread performance of Rocket Lake, and I just bought it last year. It's still way better than my old Haswell rig, and it would outperform on multithreaded workloads requiring >6 cores if I were actually running that kind of workload...)

[#] Sun Sep 12 2021 18:16:55 EDT from Nurb432


As much as I hate to admit it, I may end up with an M1 at some point.  Not that I dislike my Pinebook Pro or my ASUS Chromebook (both running an RK3399), but having a 'legit' ARM laptop may just be too much of a temptation.



[#] Sun Sep 12 2021 18:19:33 EDT from IGnatius T Foobar


I still don't get the hell-bent push for vax.


In another room, Nurb432 asked this question. He might have been talking about something else, but it brought to mind the 32-bit VAX architecture released by Digital in 1977, which ran a flat 32-bit memory model a year before Intel released its brain-damaged segmented-memory 8086.

Seriously ... there was NO excuse for such a stupid design. And there was no excuse for IBM to choose it, especially when the PC was not built to be backwards compatible with 8-bit CP/M software.

[#] Sun Sep 12 2021 18:43:02 EDT from Nurb432


I agree the DEC architecture was decent. Though the other post was referencing vaccinations :)

Back in the late 80s (or early 90s... can't remember now) I had one of the Intersil 'VAX on a chip' dev boards, which I managed to get at a hamfest/flea market where they didn't know what they had, so it was cheap.  Never did much with it, though.

 

Sun Sep 12 2021 06:19:33 PM EDT from IGnatius T Foobar
I still don't get the hell-bent push for vax.


In another room, Nurb432 asked this question. He might have been talking about something else, but it brought to mind the 32-bit VAX architecture released by Digital in 1977, which ran a flat 32-bit memory model a year before Intel released its brain-damaged segmented-memory 8086.

Seriously ... there was NO excuse for such a stupid design. And there was no excuse for IBM to choose it, especially when the PC was not built to be backwards compatible with 8-bit CP/M software.

 



[#] Sun Sep 19 2021 08:38:31 EDT from ParanoidDelusions


They did. I have two new AC units - overkill for my square footage, courtesy of Intel. 

Only took 20 years to happen, but once it did, the payout was nice. 

It is an interesting observation, in that from a developer's perspective I can see why these things frustrate you... but Intel's stupid decisions became the legacy developers had to deal with through a whole chain of circumstances - and most of those circumstances weren't Intel's *decisions* in architecting x86 memory addressing at all.

IBM deciding to go with the 8086/8088 on the original PC was part of it. IBM not defending their IP early on against off-brand clones and losing control of the PC architecture was another part of it. Commodore fumbling the momentum of the C-64 with the Amiga was part of it. Apple being so sure they were doing it "right" that they kept the platform proprietary and niche was part of it. The specific time frames at which these events happened in the evolution of personal computing was part of it.

If just one thing had gone differently here or there, the x86 architecture might be a forgotten architecture of the 80s/90s, much like PPC and 68xxx are today.

For 90% of the computing public - maybe more - it has all been one linear chain of awesome, constant improvement in technology, and they don't care what difficulties it presented to the people designing the hardware or the software that ran on it.


It is also a way of looking at things... an almost entrepreneurial mindset. Business owners tend to lose sight of "my business did not fail and it generates revenue..." and instead think, "it should be generating MORE revenue, and since we're not doing as well as we possibly could, I am not *making* money, I am *losing* it!"

And... I suppose this is necessary - it is why entrepreneurs who are successful tend to be so, I guess.  It seems to have been a key part of Jobs's success. He was never *really* satisfied with the results. I think his pragmatic side was, "well, this is probably the best I could drive these idiots to deliver - but I've got to keep beating them to reach higher goals..."

Any backward compatibility that's still necessary should increasingly be done with emulation and/or FPGA. Hell... do whatever you want with the future of personal computing technology - that *seems* to be what drives innovation in how PCs handle legacy code where necessary. We've hit a weird stage where almost all PCs have become portable, and "peripheral and resource expansion" is no longer a requirement. PCMCIA failed; the idea of modular laptops failed. You buy what you want, and USB pretty much handles anything additional or custom you might want to plug in. Outside of *gaming*, no one worries about upgrades - even memory and storage are things where, "you need more? Ditch the old machine and get a replacement."

But it would be cool to see an FPGA integrated as a subsystem in PCs. I suppose it is a corner-case deal, though. Most people just don't care. Smartphones *should* be killing the PC right now. I've got Samsung DeX on my Note 10+ - which has 12 GB of RAM. It very easily integrates with LCDs and keyboards. There is *no* reason the huge majority of the computing public needs anything *more* than this - and it would offer them huge advantages in convenience, as well as overall cost savings, if they adopted *just* this platform as an approach. I'm not sure why this doesn't gain more traction. I think it is just the inertia of what is *established*. Samsung's DeX solution is quantifiably *better* for most users in most use cases - but there is comfort in maintaining the status quo they are familiar with.

There may also be a distrust of the tethered nature of mobile devices. I was thinking about something similar with smart devices and IoT and the complaints about them listening in, data-mining for algorithms, and selling the consumer's soul to corporations. Smart TVs in particular - you just have to turn off the network. But consumers want to have their cake and eat it too: they want all the bells and whistles of network connectivity without having Alexa listening in 24x7 so that Amazon and Facebook can better serve them ads. They want free broadcast TV without the commercials, more or less.

The truly "smart" house would have one machine connected to the Internet, and everything else would be sneakernet from that device. It is really the only way to maintain a black hole that prevents corporate networked solutions from building an almost complete algorithm of pretty much who you are. But doing so would also make you a digital hermit, isolated from the rest of society - and personally I think they've made the algorithms so efficient and accurate that even *that* would give them a wealth of information about who you are. When you gather enough information about who everyone ELSE is, you also start to build a profile of who the people who are NOT participating are, too.

 

Sun Sep 12 2021 16:13:08 EDT from IGnatius T Foobar
architecture - because it is generally guys like you and Ig who rave
about the PPC - which really had a brief moment of "best" that
couldn't scale with the competition.

Still hoping those Intel shares are going to go above the strike price, are we? ;)

I actually don't have any feelings one way or another about PPC/POWER. My complaint is that Intel made the stupid decision to use segmented addressing when flat 32-bit spaces had already been in use for some time. It created a lot of hassle, and spawned a lot of bad architecture decisions such as Bill Gates' "640K ought to be enough for anyone".

It took decades of evolution to get to where we are now, and we might have gotten there sooner if we weren't dragging such a gigantic pile of legacy cruft behind us. Honestly, now that all modern operating systems have the ability to boot directly into a *flat* 64-bit address space using UEFI, and will stay there for the entire uptime of the machine, it's time to ditch the legacy modes completely.

That means no BIOS, no real mode, no 16-bit protected mode, no virtual-8086 mode. We should be segmenting people who want to keep segmented addressing.

 



[#] Mon Sep 20 2021 17:07:55 EDT from IGnatius T Foobar


Ironically, the machine that came closest to what you're describing was the Apple IIgs.  What a great machine.  It was superior in every way to the Mac, and it handled legacy Apple II software using the "Mega II," which was basically an entire Apple II on a chip.  It was only the hubris of Steve Jobs that killed the IIgs and put the Mac out in front.

Anyway, the missing part of this conversation is that there is no longer a desire for anyone to make a "better computer," because what all the tech companies now want most is recurring revenue.  Build the machines as cheap and disposable as possible, and make everything depend on The Cloud.  Mobile devices are, in fact, now powerful enough that something like Samsung DeX could replace most laptops and no one would notice, but what's probably going to happen is that we'll skip right past that: there will be a remote-desktop program running on a SoC inside your monitor or keyboard, the desktop will be on a server somewhere, and you'll pay yet another monthly utility bill for it.

You are correct about one thing: most people don't care, they just want it to work, and they don't mind spending a few dollars to make it someone else's problem.  Then again, most people are morons, and are constantly bombarded by stimuli intended to keep them that way.


