Early B550 boards are starting to show up, but the price needs to be around $100. That was supposed to be the whole point.
https://www.newegg.com/p/pl?N=100007625%20601292786%20601352805
Feh. I give up.
I bought a "TESmart 18 Gbps HDMI 2.0 4-port KVM" for my computers. It worked for about a day, while all of the computers and the monitor were happy with each other and still sort of remembered how everything was working. This morning, I sat down to use the computers and only got occasional flickers of the screen, eventually coming up and settling at 1024x768.
I bought it from Shamazon instead of eBay to make sure I could return it if I needed to, and I do.
I think I'm stuck with a USB switch and the input select on the monitor.
It's not like these things are hard to build; the one on my surround receiver works fine (although it only supports 4K@30 because it's a little older; newer Sony products support 4K@60).
Maybe I got that wrong; it might support 4K@60, just not HDR. I never use it because it does have limitations, so I use the TV to select inputs.
What I need is a dumb switch rather than something that tries to spoof the
signal in both directions. I've got half a mind to find an old 25-pin parallel
printer switch and solder HDMI cables onto it. My monitor uses an uncommon
resolution (3440x1440) so the switches just don't know how to deal with it.
If the display is *already running* it manages to do the right thing, but if any component disconnects and it has to re-negotiate the video mode, I get half a minute of flickering followed by it settling in at 1024x768.
I'd probably have better luck with a "true 4K" screen at 3840x2160, but I wanted an ultrawide.
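If you want to see what a switch is actually presenting to the GPU, the EDID it forwards can be inspected from Linux. Here's a minimal sketch that dumps the detailed timing modes advertised on each connected output, assuming a DRM driver exposing connectors under /sys/class/drm (connector names vary per machine); if 3440x1440 isn't in the forwarded EDID, the fallback to 1024x768 is no surprise:

    # Sketch: list the detailed timing modes in each connected display's EDID.
    from pathlib import Path

    def detailed_modes(edid: bytes):
        # The base EDID block holds four 18-byte detailed timing
        # descriptors at bytes 54..125.
        for off in range(54, 126, 18):
            d = edid[off:off + 18]
            pixel_clock = d[0] | (d[1] << 8)   # units of 10 kHz
            if pixel_clock == 0:               # not a timing descriptor
                continue
            h = d[2] | ((d[4] & 0xF0) << 4)    # horizontal active pixels
            v = d[5] | ((d[7] & 0xF0) << 4)    # vertical active lines
            yield h, v

    for edid_path in Path("/sys/class/drm").glob("card*-*/edid"):
        edid = edid_path.read_bytes()
        if len(edid) >= 128:                   # empty file = nothing connected
            modes = ", ".join(f"{w}x{h}" for w, h in detailed_modes(edid))
            print(f"{edid_path.parent.name}: {modes or 'no detailed timings'}")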
True 4K displays are super cheap these days. I have one at the office (which I haven't been to in months because of Covid). I also work off of 4K at home: I sit down in a butterfly chair, with my laptop in my lap, connected via a USB-C multiport hub to the HDMI input on my 55" LG C9.
The laptop is an Ice Lake running Ubuntu, and the USB-C negotiation still isn't sorted out, which I've posted about ad nauseam in Linux>, but it's working acceptably well at this point.
Anyway, I think my office monitor was probably like a $250 item. My OLED at home, an order of magnitude more, but that's another story.
Now that I think about it, I probably could have gotten a 4K (3840x2160) for
the same cost as my ultrawide 2K (3440x1440). I really do love my ultrawide
though. It has a gentle curve, it hovers over the desk with the pole-mount
stand I got for it, and it's just a pleasure to spend the day looking at it.
I don't find myself wishing I had more screen space, like I did with my 1920x1200 (16:10) monitor. I could use the laptop screen as a second display but I don't need it.
So really the only problem is that the oddball resolution continues to confound any KVM switch that I throw at it.
What I really want is for ${work} to get us moved onto VDI so I don't need the work computer except when traveling.
The typical 32" 4K display probably requires higher DPI font settings than your ultrawide, for effectively less screen real estate. If I buy a 32" 16:9 display for the home, it would be a 1440p display. Anything 16:9 bigger than 32" is a TV, not a monitor.
Yeesssss. She earned it!!!
And while Lisa Su is flying high, MSI CEO Charles Chiang took a flight in the opposite direction, plummeting from the seventh floor of the company's building in Taiwan. MSI's PR people said he "passed away due to personal health factors."
I guess you can say that each in their own way, Su and Chiang both had an impact.
Is the decline of Intel continuing?
It's well known by now that their rollout of 10nm technology was delayed by years as they went down some design dead ends, and their 7nm process was going to be what turned it around. But now the 7nm chips are delayed by a year, possibly two, and word on the street is that they may have to use third-party foundries to get anything to market at all.
Some of those foundries are already tooling up for 5nm, and already taking bookings from Apple, AMD, and other fabless clients.
The next couple of years are going to be interesting. Between Intel's missteps, AMD's current successes, and Apple's switch to ARM, we could be looking at quite a shuffle.
I don't know; I was waiting for desktop Ice Lake to come out to give Intel a chance, but I finally pulled the trigger on an AMD build. This multicore monster will make quick work of kernel compiles.
The deciding factor was not so much that I need multicore performance, but the creeping obsolescence of my Haswell rig. Anything older than Kaby Lake/Coffee Lake has issues with security or performance or both. I have been enabling Memory Integrity on Windows on this box recently, and I learned that it can take a big performance hit: the feature runs on top of Hyper-V, and since the processor lacks Mode Based Execution Control, Windows has to emulate that protection in software.
I've got all the parts I need for this build sitting here... except for a few screws and motherboard standoffs. Ordered those a day or two late when I remembered that they had gone missing. I'm going from MicroATX to ATX, so I need 3 more. Damn you, 2020 shipping delays.
Finishing building my AMD box.
Used brute force and ignorance with a motherboard standoff, and it's put together OK, but it's not going to be easy to get this motherboard out without an act of violence. I used the wrong type of screw for one particular standoff (which was not originally supplied with the case), and it's stuck now.
Anyway, the system works great, but it crashes on boot if you try to enable Windows' Memory Integrity feature in the Windows Security dialog. And that feature was one of my major reasons for the upgrade, because it's virtualization-based security and it performs better on a modern processor with MBEC.
10 minute kernel compile instead of 30.
It's a 3900X.
How often are you compiling kernels that this makes a difference in your world?
(Genuinely curious.....)
2020-08-11 13:45 from Ragnar Danneskjold
How often are you compiling kernels that this makes a difference in
your world?
(Genuinely curious.....)
Occasionally, for bug reporting and debugging purposes. I also have an Ice Lake laptop that has some annoying kernel problems that need to be sorted out, so lately I've been doing the occasional build from the "drm-tip" branch to test the current development codebase and coordinate with the upstream devs on troubleshooting.
I don't really need 12 cores, but they were reasonably priced and they run cool.
Creeping obsolescence of my old system was the main reason for the upgrade.
Solved the Windows BSOD. There is a BIOS setting that affects it:
https://www.reddit.com/r/Amd/comments/hm23sd/does_anyone_else_get_a_blue_screen_at_boot_when/
This all became clear when I installed Debugging Tools for Windows, analyzed the crash dump, and googled the result, finding that smarter people than me had already worked through it.
Tested Eco Mode on the 3900X, and it does seem to produce only a small performance loss in the kernel compile: on the order of 20 seconds over a 10-minute compile, so under 5%. I don't have any devices to measure actual power consumption at the wall, so I assume this test case jibes with the big reduction in Cinebench power draw that the geek sites are publishing. So, no reason not to crank that sucker down if you just want to run closer to the sweet spot of the exponential power curve.
Glad I didn't buy one of the 65W models, because Eco Mode on the 105W parts is 65W; I'm basically running it at the 3900 OEM-part spec, and these limits may be better for long-term reliability of the silicon, no?
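For anyone who wants to reproduce that comparison, here's a rough sketch of timing a clean kernel build in each power mode; the kernel tree path and job count are assumptions, not gospel:

    # Sketch: time a clean parallel kernel build for before/after comparisons.
    import os, subprocess, time

    tree = os.path.expanduser("~/src/linux")   # assumed: configured kernel tree
    jobs = os.cpu_count()

    subprocess.run(["make", "-C", tree, "clean"], check=True,
                   stdout=subprocess.DEVNULL)
    start = time.monotonic()
    subprocess.run(["make", "-C", tree, f"-j{jobs}"], check=True,
                   stdout=subprocess.DEVNULL)
    print(f"-j{jobs} build took {(time.monotonic() - start) / 60:.1f} min")

Run it once in normal mode and once with Eco Mode enabled in the BIOS; a 20-second delta on a 600-second build is about 3%, consistent with the under-5% figure above.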