Intel's biggest liability right now is their fabs. AMD doesn't own
fabs and can send manufacturing to whatever fab has the process they need.
The fabs used to be their asset; they used to have an 18-month lead on all the other fab players. Now it's looking like the other way around.
The transition to 10nm (TSMC calls it 7nm for marketing reasons, but it's the same as Intel's 10nm) and EUV lithography has been brutal for everyone in the industry. You can't do this with a bunch of lazyfuck engineers sitting at their desks doing nothing and resting on their laurels.
market competition. AMD got to 1 GHz, the FTC backed off, AMD wasn't
able to *scale* production, and shortly thereafter Intel announced it
had broken 1 GHz and flooded the market with processors. That was the
LAST time AMD was a threat to Intel.
Disagree. The Athlon 64 was the technically superior product until the introduction of Conroe in August 2006. NetBurst was a major misstep. There was a period of several years, from the introduction of the original 1 GHz Athlon until Conroe, during which AMD products were a very smart choice.
Well, you are right in a marketing sense: AMD has never really THREATENED Intel in an existential sort of way, except during the whole NetBurst debacle, and even then people kept buying NetBurst even though it sucked.
There is a difference between "being competitive" and "being an existential threat".
I welcome the newly competitive (on more than just price) AMD - perhaps they will start turning a profit again.
By the way, I was one of those people who bought NetBurst even though it sucked. I owned, I can't exactly remember, a 2-point-something GHz P4, and specced out a 3.0 GHz P4 with Hyper-Threading for my mom.
We bought the hype. We thought Intel knew what they were doing, and that clock speed was what mattered. But the benchmark numbers were telling: AMD was winning on performance per watt.
But we convinced ourselves that this disadvantage would go away once developers started optimizing their code for the P4's longer pipelines. That never happened, and Intel ended up backtracking. Conroe's pipelines were half the length. Intel was doing what AMD was doing, again. Only better, because Conroe was wider superscalar: it could dispatch one more instruction per clock cycle. Intel started winning on IPC and TDP again.
Those cheap foreigners are going to run us over with their superior
expertise at some point. It's already happening. You can't find any
tooling engineers in this country anymore because they're all in China.
How is that distinguishable from "you can't find any tooling engineers in this country anymore because their jobs were all shipped off to China" ?
I don't work in the semiconductor biz, and maybe it's different there. All I know is that in IT when I see a name like "Rumjumtunesh Balasubrishkarvar" I know I'm usually talking to a halfwit whose only talent is a willingness to work for peanuts.
Yes, there are always exceptions.
I don't want to argue either for or against the generalizations. What I'm saying is this: get used to the contractors, and the consultants, and the more junior developers. They're not going away, they're going to define *how* things get done, whether we like it or not, by the simple act of *doing it* without seeking our advice. They need to be treated with professionalism and a modicum of respect, and you and I as the old guys in the room need to continue to figure out how to compete and earn our paychecks. That is all.
expertise at some point. It's already happening. You can't find any
tooling engineers in this country anymore because they're all in China.
Actually, there are still some in the US. My dad is one. But it's hard as heck to find them.
There was an interesting article a while back. It said that you could fill a few football fields with the tooling engineers in China; in the US, you would be lucky to fill a large conference room.
Not computer hardware, but to go with the conversation: the US is losing a number of specialists due to age and to jobs being shifted to other countries. There is a type of power plant that is still in use around the world, and last I knew there were two engineers with the level of expertise to do a complete shutdown, restart, and troubleshooting of that plant. My information is from 2015, though. In 2014 there were three people in the world with that knowledge; my father-in-law was one of them. He and the other two were working on a book, so that others could work on those plants.
I wonder if no one bothered to learn those jobs, and that is why you can only fill a conference room with the tooling engineers in America. Just like with that power plant, everyone wanted to work on the newest, greatest thing, and as a result the people who know the old stuff will slowly die off until their knowledge is needed again.
My father-in-law was able to command a very high price as a consultant when he would work, due to being one of a select few.
I can't speak for every country, but my gut feeling is that the main reason it is hard to replace old, capable professionals is that we are just not producing capable new ones anymore.
My time in college was a joke. I learnt a lot of things, half of them useful, but I didn't learn a profession. In the Middle Ages, if you wanted to be a master builder, you became an apprentice to a master builder and learnt your craft by working and seeing how stuff was done in real life. Now you go to a building where they feed you a government-approved program that does not necessarily match the needs of the customers you will have to satisfy once you are out. The end result is that, depending on the season, I am managing two or three jobs at once, and none of them is related to my formal education at all.
I was once enrolled in a crash-course coding program alongside some fifteen people who had just gotten their computer science degrees. They only knew Java, and not very good Java at that. Many didn't know very basic networking concepts. It is OK not to be an expert straight out of college, but there is a minimum, and I suspect many education programs in the West are not meeting it. I mean, my experience with coding before that training program was a bit of ksh and the exercises from some old Perl book, and I was already in a better starting position than most of my classmates. Does anybody expect an education system like this to produce reliable professionals to replace the old ones?
Trades are still in demand, still fairly easy to get into as an apprentice, and still pay well once you get a few years in.
Have a plan, know the skill you want to learn, and how you intend to apply it in the workforce. And then keep learning. Learn something new every day, or you die stupid.
https://www.anandtech.com/show/14694/amd-rome-epyc-2nd-gen
Interesting times. This thing just rendered Intel's entire Xeon lineup obsolete.
This would be just the thing for "ultra high density computing". Those servers are insane. We have customers putting those in some of our colocation data centers now. 35-40 kilowatts per cabinet, with chilled water being fed into each server. One rack with ultra high density servers has the same computing power as an entire row of racks (7 to 10) with conventional air-cooled servers.
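The rack-equivalence claim above is easy to sanity-check with back-of-the-envelope arithmetic. A quick sketch, assuming a conventional air-cooled rack draws roughly 4-5 kW (that figure is my assumption; the 35-40 kW number is from the post, and this treats power draw as a rough proxy for computing capacity):

```python
# Density figures from the post: 35-40 kW per chilled-water cabinet.
dense_kw_min, dense_kw_max = 35, 40

# Assumed typical air-cooled rack: roughly 4-5 kW (hypothetical figure).
conv_kw_min, conv_kw_max = 4, 5

# How many conventional racks one high-density cabinet replaces,
# using power draw as a rough proxy for compute.
low = dense_kw_min / conv_kw_max   # worst case: 35 / 5 = 7.0
high = dense_kw_max / conv_kw_min  # best case:  40 / 4 = 10.0

print(f"one dense cabinet ~= {low:.0f} to {high:.0f} conventional racks")
```

The 7-to-10 range that falls out matches the "entire row of racks" figure in the post.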
Who knows whether they can really scale supply like people might want; this may be a bit of a PR launch. Time will tell. Great product though, if you can get a hold of one.
Looks like AMD is playing a sly marketing game with their maximum advertised boost frequency. https://www.tomshardware.com/reviews/amd-ryzen-3000-turbo-boost-frequency-analysis,6253.html
Nonetheless, their new EPYC product is game-changing. If you can get a hold of one.
About a decade and a half late to the party ... I finally bought my first Arduino. Two of them, actually (a couple of Nano clones, along with some extra parts to play around with).
I wish I had access to this stuff when I was a kid playing around with LEDs and other electronic components. I had to make do with the Radio Shack electronics kits. For that matter, I wish I had access to this stuff in the late 1990s when I was playing around with home automation.
It seems like the really good components started coming out right after I got out of the hobby.
Also late to the party, I'm finally upgrading from mechanical RAID 1 to an SSD. SATA-based. I really should have bought a Z97 motherboard when I built this PC 5 years ago; that would have supported NVMe.
LG makes ultrawides that even let you split the display between two computers (which happens to be the exact way I work all day). Of course, for the $500 I am looking at spending [ https://www.microcenter.com/product/460907/34UC88-B_34_UW-QHD_75Hz_HDMI_DP_FreeSync_Curved_LED_Monitor ] I could probably buy two regular monitors, but then I'd be staring straight at a pair of bezels all day, which might bother me.