Go to page: First ... 18 19 20 21 [22] 23 24 25 26 ... Last
[#] Tue Jan 19 2021 21:44:17 EST from ParanoidDelusions

I know it was a pig. We had to put it on a whip on an isolated circuit, because when it shared circuits with the rest of its row of racks in the DC, it would brown out those systems when we powered it up. Because we were Intel black-ops, we got them while they were still in a very developmental phase - but they never got much further than that, to be honest.

Tue Jan 19 2021 16:49:25 EST from IGnatius T Foobar

You mentioned IA64 but I don't know anything about that ill-fated design.  I know it required a special compiler so I would have to assume it had at least some RISC-like design elements.

[#] Wed Jan 20 2021 08:49:42 EST from Nurb432

From the outside, it looked like it had a lot of promise.  But that x86 problem .. 

[#] Wed Jan 20 2021 10:05:00 EST from LoanShark

You mentioned IA64 but I don't know anything about that ill-fated
design.  I know it required a special compiler so I would have to
assume it had at least some RISC-like design elements.

It was a lot like VLIW. Essentially more RISCy than RISC: not only were the instructions simpler and fixed-length, but it left all of the superscalar instruction scheduling to be handled by the compiler. I assume this means it was a superscalar, but *in-order* design.
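The compiler-side scheduling described above can be pictured with a toy sketch (purely illustrative - `WIDTH`, `schedule`, and the register names are invented here, not actual IA-64 semantics): the "compiler" packs independent operations into fixed-width bundles ahead of time, and the in-order "hardware" would simply issue each bundle as given, never reordering anything itself.

```python
WIDTH = 3  # issue slots per bundle (IA-64 bundles also held 3 instructions)

def schedule(ops):
    """Greedy list scheduling, done entirely ahead of time.
    ops is a list of (dest_register, source_registers) pairs."""
    bundles = []    # each bundle is a list of ops issued together
    ready_at = {}   # register -> index of first bundle that may read it
    for dest, srcs in ops:
        # an op cannot issue before all of its inputs are ready
        earliest = max((ready_at.get(s, 0) for s in srcs), default=0)
        i = earliest
        # find the first bundle at/after 'earliest' with a free slot
        while i < len(bundles) and len(bundles[i]) >= WIDTH:
            i += 1
        if i == len(bundles):
            bundles.append([])
        bundles[i].append((dest, srcs))
        ready_at[dest] = i + 1  # result visible to later bundles
    return bundles

# three independent ops share a bundle; the dependent one waits
ops = [("r1", []), ("r2", []), ("r3", ["r1", "r2"]), ("r4", [])]
bundles = schedule(ops)
# bundles[0] holds r1, r2, r4; bundles[1] holds only r3
```

All of that dependency analysis happens at compile time; an out-of-order design would instead do it in hardware, at run time.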

[#] Wed Jan 20 2021 11:36:16 EST from ParanoidDelusions

All I know is that it sucked power like a $10 street hooker sucked... on a crack pipe... 

Wed Jan 20 2021 10:05:00 EST from LoanShark
You mentioned IA64 but I don't know anything about that ill-fated
design.  I know it required a special compiler so I would have to
assume it had at least some RISC-like design elements.

It was a lot like VLIW. Essentially more RISCy than RISC: not only were the instructions simpler and fixed-length, but it left all of the superscalar instruction scheduling to be handled by the compiler. I assume this means it was a superscalar, but *in-order* design.

[#] Sat Jan 23 2021 19:50:26 EST from IGnatius T Foobar

From the outside, it looked like it had a lot of promise.  But that

From the outside, it looked like Intel was trying to do the same thing with IA-64 that IBM tried to do with Micro Channel ... and the industry responded the exact same way. And now Intel and AMD have to cross-license each other FOREVER.

They should have known better than to break the cardinal rule of Wintel: you don't break backwards compatibility.

[#] Sun Jan 24 2021 11:29:21 EST from ParanoidDelusions

From the inside, it looked like Intel didn't know what they were trying to do. 

Craig Barrett was a spectacularly bad CEO. He knew a critical inflection point was coming, and he wanted to redirect Intel - but he wasn't sure how, so he tried to do it all. 

We were buying up every little remotely related business... hosting businesses, home networking device businesses, toy businesses, mobile device businesses - and then we didn't know what to do with them, and they bled money. Most of them we eventually spun back off after a few years of going, "this really doesn't remotely fit our core business model." 

That is the worst part: Intel's mantra is that they're not a tech company, they're a manufacturer - and that everything they do should focus on that core business. They drill this into you constantly. But they started buying consumer retail businesses and service businesses and all kinds of things that weren't in our wheelhouse at all. 

He was probably the most incompetent CEO in Intel's history. 

Sat Jan 23 2021 19:50:26 EST from IGnatius T Foobar
From the outside, it looked like it had a lot of promise.  But that

From the outside, it looked like Intel was trying to do the same thing with IA-64 that IBM tried to do with Micro Channel ... and the industry responded the exact same way. And now Intel and AMD have to cross-license each other FOREVER.

They should have known better than to break the cardinal rule of Wintel, you don't break backwards compatibility.

[#] Sun Jan 24 2021 11:31:20 EST from ParanoidDelusions

He was so bad... Intel spends *billions* on their branding - and the first thing Otellini did when he took the helm was to completely rebrand all of the iconic things Intel had developed. Dancing Bunnymen - OUT. The dropped "e" in the logo? Out. 

I bet he probably wanted to drop Intel Blue as the corporate color. 

He wasn't a great CEO either - but standing in Barrett's shadow, he looked like a giant. 

Sun Jan 24 2021 11:29:21 EST from ParanoidDelusions

He was probably the most incompetent CEO in Intel's history. 

[#] Mon Jan 25 2021 12:00:42 EST from LoanShark

responded the exact same way. And now Intel and AMD have to
cross-license each other FOREVER.

Mega-win for the consumer. But that was already settled in court.

[#] Mon Jan 25 2021 15:42:36 EST from IGnatius T Foobar

The other problem with the "RISC + microcode" design of modern x86 and IBM Z architectures is that the lower layer is not offered directly to the user as a way to write code. One cannot build a compiler that "just uses the RISC part" so that the legacy instruction set can be phased out. The documented ISA is still CISC and will continue to be that way.

So it may be a win for CPU designers, but not for computer manufacturers, compiler designers, etc.
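One toy way to picture that hidden lower layer (purely illustrative - `decode`, the tuple format, and the temp register `t0` are invented, not any real chip's internals): the front end splits one CISC-style memory-operand instruction into RISC-like micro-ops, and nothing in the documented ISA lets you emit the micro-ops yourself.

```python
def decode(insn):
    """Split a CISC-style instruction into RISC-like micro-ops.
    An instruction is an (opcode, dest, src) tuple; a source written
    as "[reg]" is a memory operand."""
    op, dst, src = insn
    if src.startswith("["):
        # memory operand: a hidden load micro-op feeds a temp register
        return [("load", "t0", src), (op, dst, "t0")]
    # register-register ops pass through unchanged
    return [(op, dst, src)]

# one documented CISC instruction -> two internal micro-ops
uops = decode(("add", "r1", "[r2]"))
# uops == [("load", "t0", "[r2]"), ("add", "r1", "t0")]
```

The point above is that only the tuple on the left is architecturally visible; the micro-op list on the right stays private to the chip.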

The end state of all this could be that we end up with RISC IA (probably ARM just because of the momentum behind it right now) and the CISC gateway moving out of microcode and into a hypervisor.

[#] Tue Jan 26 2021 11:44:10 EST from ParanoidDelusions

I hate the terms people come up with in technology. 

Why did they pick the name "hypervisor," and what exactly does it mean? It sounds very cyberpunk - you hear people throw it around all the time. People in tech love having a reason to use it. 

Isn't it just a Virtual Machine engine? 

Mon Jan 25 2021 15:42:36 EST from IGnatius T Foobar


The end state of all this could be that we end up with RISC IA (probably ARM just because of the momentum behind it right now) and the CISC gateway moving out of microcode and into a hypervisor.

[#] Tue Jan 26 2021 15:57:37 EST from IGnatius T Foobar

It can also be called a "virtual machine monitor". The word "hypervisor" seems to be at least 50 years old [ https://tinyurl.com/y2ujron7 ]. On a regular computer, the kernel or the userland manager (init, systemd, etc) is often known as the "supervisor", and "hyper" is the next adjective up from "super". <shrug>

[#] Tue Jan 26 2021 21:11:21 EST from ParanoidDelusions

Oh, I knew it would be logical and witty. That is one thing we have in abundance in the tech sector: logical, witty people. :) 

It just makes me picture a woman in a tennis skirt with a VERY over-active demi-hat on. 

Tue Jan 26 2021 15:57:37 EST from IGnatius T Foobar
It can also be called a "virtual machine monitor". The word "hypervisor" seems to be at least 50 years old [ https://tinyurl.com/y2ujron7 ]. On a regular computer, the kernel or the userland manager (init, systemd, etc) is often known as the "supervisor", and "hyper" is the next adjective up from "super". <shrug>

[#] Sun Feb 07 2021 18:45:40 EST from IGnatius T Foobar

And since we're being logical and witty, if there's ever something that operates at a higher level of supervision than the hypervisor, it would be called an "ultravisor", and the next one up after that would be the "jumbovisor".

[#] Fri Feb 12 2021 08:45:54 EST from ParanoidDelusions

Except in Asia - where it will become a Super-Mega-Ultravisor, be 60 feet tall, and look like a robot. 

Sun Feb 07 2021 18:45:40 EST from IGnatius T Foobar
And since we're being logical and witty, if there's ever something that operates a higher level of supervision than the hypervisor, it would be called an "ultravisor", and the next one up after that would be the "jumbovisor".

[#] Thu Feb 18 2021 12:16:09 EST from Nurb432

Cool. Lights went out in our data center ( and possibly the chillers ). But not the equipment ( servers, network, mainframe ). Get your flashlights out, boys!

Everyone is freaking out.

Gotta love watching the chat traffic: "the power went out, is anyone out there?" LOL. About as bad as "email is down, I should email someone to check the email server" or "customer's Outlook is not working, I emailed for them to call us."

[#] Thu Feb 18 2021 14:55:54 EST from Nurb432

"The best we can figure out is someone in the basement turned off the wrong breaker"

lol

Thu Feb 18 2021 12:16:09 EST from Nurb432

Cool. Lights went out in our data center.  ( and possibly the chillers ). But not the equipment. ( servers, network, mainframe ).   Get your flashlights out boys!

Everyone is freaking out.  

Gotta love watching the chat traffic " the power went out, is anyone out there "  LOL   about as bad as "email is down, i should email someone to check the email server"  or " customer's outlook is not working, i emailed for them to call us "

[#] Sat Feb 20 2021 21:11:37 EST from IGnatius T Foobar

Nothing quite matches that brief moment of panic when the lights go out in the data center, and you have to look around and realize that the equipment is still running. Then it's usually a quick trot over to the switchgear room to verify that yes, the generators are starting up and will take over before the UPS runs out.

War stories of data center disasters happening due to the company owner being a cheapskate ... some of those might get someone sued if they got out, so I'll have to wait a few more decades before telling them :)

[#] Sun Feb 21 2021 09:20:28 EST from Nurb432

Early 90s I was out in the plant most of the day doing 'stuff'. My office was attached to the data center. It was midsummer, so 90ish ( 120+ in the plant ).

Walked into my office looking forward to cooling down, and it was melting... and all I heard was beeping from around the corner. "Ackk..." I ran into the main room and the chillers were dead. 90% of the room had shut down due to the heat. I don't know how the guys out in the office area didn't notice ( we were not as 'networked' on the PCs as we are now, mostly email and printers, but still ). 

Of course panic ensued as I yelled out the door to get help shutting off what was left *now*.

The cooling guys got the chillers working again ( I forget why they quit, but it took them several hours ), the room finally cooled down, and we started turning stuff back on.

  • PC print servers.. lost like 3 out of perhaps 50 ( mostly PS/2 70s ) 
  • PC OS/2 file servers.. survived ( mostly PS/2 95s )
  • SGI.. back up
  • VAXes.. back up
  • network gear ( token ring and coax, depending on where in the plant it terminated ).. all alive

Went to the HP - I think it was a 900 if I remember right, but I could be totally wrong. It was a mini, with 4 extension cabinets, 2 external disk drive units, and 2 RS-232 end caps for serial terminals, and it had red flashing lights on it. We didn't even know those lights could turn colors, as they were normally disk access indicators... Called them, read the error code. "Oh, you got what again? Um, hang on..." wait... wait... "Ya, that is a 5v power supply. We have 2; we will get both on planes and be there at 7am with one of them." They had 2 spares on the planet, one in CA and another in Japan.

I figured, it's just a 5v supply? Geesh, I can rig that up to get us by for the afternoon/evening - I have plenty of things to scavenge in the PC room ( I did PC hardware service then too, IBM stuff ). We went over and opened the case to see what I was working with... The bus plane was like 1/2" thick solid copper. "Um, no, we are not doing this." And put it back together :) It was like 20 amps... Nope, not going to happen.

Sort of like around '87 or so ( GM this time, not Ford like the first case ), we had a tornado go through town. Went to the plant, saw all the truck ramps underwater... went into the data center area and the doors were all open and fans going... 1' of water in the raised floor. Tripped everything. Pumped the water out, got everything dry and turned back on and running, then went to the network front end ( IBM network cabinets to talk back to the real data centers, I forget the series ) so we could reconnect to the world, and the power switch would not reset. IBM guy: "Well, that is weird, never seen one trip before either." Got on the phone: "um, ya, um, oh really.. ok.."

Turns out once tripped you CAN'T reset it. For safety, you have to take it out, tear it apart, and rebuild it.

 

Or the time we lost the data center in NC during hurricane season one spring. Most of us just sat around and twiddled our thumbs for 2 days up here in Indiana. :) Of course I had work to do, as I also worked with the local plant equipment, but many did not. Normally all you hear is keyboards on IBM terminals clicking away, but walk into the main office and hear voices and 50+ IT people moving around, and you know something is broke ( back when keyboards were made of steel and good weapons if you get in a fight ).

Fun times. I miss the plants. Man, I feel old today tho.

[#] Sun Feb 21 2021 09:28:06 EST from Nurb432

Nope, it was a 3000, series 64. 

Lights were below the name plate there. 

[#] Mon Feb 22 2021 12:01:24 EST from ParanoidDelusions

Your weather-related DC disaster recovery stories remind me of working in Ohio, and make me glad most of my career was spent in California. 
