unnamednewbie13
Moderator
+1,847|5533|USA

I'm not really in the store-camping crowd but I can see how it could be a social adventure for some people. Just not...in pandemic weather, you know?

The 5900X closes the gap with the 10900K, with similar price and performance, and those are two of the processors I'd consider in an ever-evolving build list in case my PC with its ancient CPU/RAM/motherboard kicks the bucket. If it were just for gaming, I'd go for a Ryzen 3000 series or a 9600K and put that money towards whatever's coming out after Nvidia's 30x0s. Either way, it's a massive performance bump across the board, and maybe some years of relief from MS trying to decide what kind of drivers go best with a 12-year-old component.
SuperJail Warden
Gone Forever
+412|2481
I love seeing old hardware and machinery worked into the ground rather than dumped. Reduce, reuse, recycle. Keep using your old machine until an MS security patch overwrites your BIOS.

Also, what are your specs?
uziq
Member
+351|2213
massive performance gains in processors mean relatively little in 95% of games. you're talking an extra 5 fps.
unnamednewbie13
Moderator
+1,847|5533|USA

SuperJail Warden wrote:

I love seeing old hardware and machinery worked into the ground rather than dumped. Reduce, reuse, recycle. Keep using your old machine until an MS security patch overwrites your BIOS.

Also, what are your specs?
The legendary 2600K, 16GB DDR3, on an Asus board. Don't worry, I wouldn't "dump" it. I like having backup rigs.

The driving force behind replacement is obviously the age of the components and ever-increasing compatibility issues, thanks to both MS mismanagement and manufacturers dropping support. Second to that, I'd like to not experience an out-of-memory crash for at least another 8-ish years, and it would be nice for my GPU to not be bottlenecked.

Having a working computer available while ordering parts and building a new one is nicer than being forced into a build because the old one is out of commission. The next-newest machine I have is a thing from 2004 running Windows 7, which I don't relish ordering anything on. But I've been on the fence about it ad nauseam since last November, so if something happens, that's on me. I do feel like my allotted "I need a new PC" posts have expired.

Anyway, my point, I guess, was that it's nice to see the new AMD chips draw even with Intel's farted-out 10900, and that not everyone would be upgrading from last gen.
Larssen
Member
+47|649
My biggest concern with older machines would be security, to be honest. First off, I'd thoroughly erase/factory-reset anything that ends up in the 'spare' category. But I'd also be hesitant to hook something like that up to the internet again, esp. a Windows 7 machine.

Just get yourself a cheaper Chromebook of sorts if you need a backup machine. A few hundred dollars total, and they're usually supported for quite a few years. Works fine for most anything except gaming, of course.
unnamednewbie13
Moderator
+1,847|5533|USA

Right, like I mentioned with not wanting to shop for parts on the one from 2004 running Windows 7. It's also been an endless battle in my business arguing for replacements of older PCs for security's sake. I just end up unilaterally replacing them myself and dealing with the arguments later (one guy has a bulky old Compaq laptop somehow updated to Windows XP that I won't authorize for network use).

If I got a laptop as a backup machine, I'd just get a Macbook. I don't particularly want a Chromebook that I'd barely touch unless something goes wrong or I'm on an extended trip.

Anyway, I'd be unsurprised if the 2600K outlasted my motherboard or RAM. It's not aggressively overclocked by any means (I don't really want to break it) and has a pretty hefty Noctua keeping it at low temps.

@zeek it's true that some games won't see much of a performance boost, but tests I've encountered in articles revisiting the 2600K (for instance) in recent years seem to emphasize the low gain for gaming at 4K. One (albeit run with a GTX 1080) shows a much starker contrast at 1080p than at 1440p, and especially at 4K. I won't deny that's somewhat disappointing, but gaming isn't all you can be doing with a PC, I guess?
uziq
Member
+351|2213
if you're doing video rendering, coding work, virtual desktops and number crunching, of course it's a huge gain.

i'm similarly interested in what apple's new silicon chip is all about. it sort of merges CPU/GPU/memory with a machine learning engine. very interesting stuff and seems to promise huge gains right across the board (especially in pretty woeful mobile GPUs).
unnamednewbie13
Moderator
+1,847|5533|USA

I'm actually going over articles and eyeballing the 13" right now. Might spring for the mini, if nothing else, for my first personal Apple anything, but the quick portability of a laptop is a nice perk that might be worth the added cost.

But it may be a bit premature until more universal apps are available. I'd like to find a timeline for Adobe and MS Office so I don't have to run those through emulation.
uziq
Member
+351|2213
yep probably not until next year. i am never very keen to public beta test a first-generation device.

i would really like a new 16" though and feel salty about buying the current one if a mini-LED/apple silicon version is on the way.
unnamednewbie13
Moderator
+1,847|5533|USA

Your current one should at least remain relevant for some years. And yes I would have much liked to see the M1 in that wonderful 16" chassis. For myself, I'd definitely look at external storage solutions. Uptiering even the mini puts lots of lols on the price tag.
SuperJail Warden
Gone Forever
+412|2481
It is nice that Intel is launching a GPU division. More competition is good for the blah blah blah. All a little pointless though anyway. I can't remember the last time I saw an advancement in graphics that blew me away. Mirror's Edge in 2009? Yikes, 2009 was 11 years ago.
unnamednewbie13
Moderator
+1,847|5533|USA

I've seen some shader makeovers that impressed me since then. It's weird seeing Minecraft sometimes get better lighting than simulators. Mirror's Edge was fun, but I don't think I finished it or tried the sequel. I think the pathways were too limited and, iirc, not enough of the game revolved around the parkour.
SuperJail Warden
Gone Forever
+412|2481
Has anyone seen a triple panel monitor for sale? One solid unit.
uziq
Member
+351|2213
buy them separately. you can take advantage of multiple uses: a high-refresh one for gaming, a nice colour-reproduction one for work, etc.
SuperJail Warden
Gone Forever
+412|2481
I plan to connect an HDTV to the PC too. My GPU has the plugs for 3 monitors plus an HDMI port, but I feel like managing 4 displays is going to be a software nightmare compared to just managing 2.
uziq
Member
+351|2213
the GPU will struggle, as iirc you didn't buy a high-end one but rather a smart price/performance choice (like i did).

still not sure why you specifically need 3 separate monitors. are you running 24/7 CCTV feeds or something? an ultra-wide gives you a lot of real estate to work with. plus windows has virtual desktops you can flick between with a single shortcut (win+ctrl+left/right). if i were you, and you're intending to play games on an HDTV and have a PC on at the same time, i'd think about consolidating my PC display to one screen at least.
SuperJail Warden
Gone Forever
+412|2481
I haven't played any games in weeks, and the games I do play are older. The RTX card I have is just fine for what I need. I have been using my dad's home office PC to better focus on work, and that thing has a triple-monitor setup running off a GT 710 without any issues at all.

I need a triple monitor display for two reasons. When I am hosting virtual class, I work in 3 windows: the virtual class, the attendance system, and the system that lets me watch the kids' Chromebooks. So I will actually use 3 windows 5 days a week. When it comes to grading school work, 3 windows is great too, since I can keep the gradebook open in one, the classwork open in another, and YouTube open in the third. Not having to switch between tabs makes things a lot easier when grading. I don't like ultrawide displays, and I like how Win 10 treats each separate monitor as its own space with a taskbar.

I am going to mount the HDTV on the angled wall. That way I can lie in bed on my back, drugged up, and watch zombie movies and pornography.
unnamednewbie13
Moderator
+1,847|5533|USA

I'll double down on the suggestion for three separate monitors. The RTX 2060 can even handle 4, per its specs.
uziq
Member
+351|2213
ah, if you're not playing demanding games on that set-up, then of course. yes, a 2005-spec GPU could handle the pixel output alone. it gets a bit more complicated if you want to run large monitors at 144/240Hz or whatever. i assumed you would be driving games through multiple monitors.
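
a rough back-of-the-envelope on the bandwidth side (assuming 24-bit colour and ignoring blanking intervals and compression, so ballpark only):

1920 x 1080 x 60 Hz x 24 bit ≈ 3 Gbit/s per screen, so three 1080p/60 monitors are trivial for any modern output
3840 x 2160 x 144 Hz x 24 bit ≈ 28.7 Gbit/s for a single 4K/144 screen, which already exceeds HDMI 2.0's 18 Gbit/s and pushes toward DisplayPort 1.4's 32.4 Gbit/s

so it's the per-port resolution/refresh combo that bites, not the monitor count.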

mounted HDTV above the monitors is a trusty and pretty nice set-up.
SuperJail Warden
Gone Forever
+412|2481
I decided to get this triple monitor mount instead of a solid single unit. I do like the idea of being able to move around and space out the monitors rather than having to adjust myself to a single unit's layout. This unit also has USB and audio passthrough. I will probably not use the audio, but the USB is a nice addition.
https://i.imgur.com/1RZ5GrD.png
RTHKI
mmmf mmmf mmmf
+1,687|5498|Oxferd Ohire
Do you plan on playing any simulators?
https://i.imgur.com/tMvdWFG.png
SuperJail Warden
Gone Forever
+412|2481
No, just work.
