you won't notice speed increases because you're using 60Hz tech from tesla's day, anyway.
I don't play modern or graphics-intensive games. The games I play are two sets of numbers subtracting from each other over maps of Europe.
refresh rate actually makes the biggest tangible difference for me in day-to-day use. i made the exact same objections as you ... before taking the plunge on a good high-refresh monitor.
the 'mouse' moves across the screen in a much smoother manner. windows move around better. pages scroll better. it's hard to explain the qualitative effect. your monitor is literally refreshing 2x or 1.5x as fast. of course it looks a lot quicker and smoother on just about every task.
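if you want rough numbers on why it feels smoother, the frame-time arithmetic is a quick sketch (the refresh rates below are just examples, not anyone's actual panels):

# each refresh holds a frame on screen for 1000/hz milliseconds
for hz in (60, 75, 120, 144):
    print(f"{hz} Hz -> a new frame every {1000 / hz:.1f} ms")
# 60 Hz redraws every ~16.7 ms; 144 Hz every ~6.9 ms, so cursor and
# scrolling motion gets sampled more than twice as often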
Oh well too late now. I am still happy with what I have.
of course. 4k is still nice regardless. but skip the mouse with lights.
I kept an eye on GPU usage while hosting virtual class today. It actually spiked to over 60% at times. I was very happy to see this. I host virtual class on one monitor, while on the second I have software that streams the screens of up to a dozen students straight to my PC. All of those streams put the GPU to work. It's nice seeing all of the horsepower of the GPU actually put to use instead of doing nothing.
You should be able to get some sort of credit for your multi-screen setup since you're basically using it for your job. Then again, teachers having to buy their own supplies isn't exactly new.
I think teachers get a $600 tax deduction opportunity. I know the Republicans wanted to eliminate it in the 2017 tax cuts. Ken probably knows more about how the taxes all work. Maladjusted people cannot do taxes.
remember when qq was a thing?
qq
i'm not a teacher and I don't talk to my teacher friends (really wives of my friends) about taxes so I don't know how the $600 tax credit for teachers works
It seems to be only $250 for a single filer. I'm not sure if that would qualify, either.
IIRC, when I did HRBlock (or TurboTax) last year I just ticked the box and didn't have to provide any proof. I think as long as you don't hit the standard deduction you don't need to itemize everything. I have a degree in political science. You would think I would understand how taxes worked a little better without needing to read a book about it. I blame the tax system for being too convoluted.
Also
I finally opened one of my favorite games, jacked up the resolution and settings, and played a little. Steam screenshot compression doesn't really do it much justice.
I was again very pleased by the 75% GPU usage spikes.
I finally have a setup that would justify some top-of-the-line GPU, though $1,500 on an RTX 3090 is not something I would ever buy.
Do you remember the 8800 GTX from 2007 that was considered crazy good back then? It was literally the largest GPU ever made at the time of its release. It retailed for $599, inflation-adjusted to $751.79. Nvidia has literally doubled their highest price points.
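Rough math behind that claim, using only the figures above (the inflation factor here is just the one implied by that $751.79 number, not an official CPI lookup):

launch_price = 599.00        # 8800 GTX MSRP at release
adjusted = 751.79            # inflation-adjusted figure quoted above
rtx_3090 = 1500.00           # roughly Nvidia's 2020 flagship price

print(f"implied inflation factor: {adjusted / launch_price:.2f}x")   # ~1.26x
print(f"3090 vs adjusted 8800 GTX: {rtx_3090 / adjusted:.2f}x")      # ~2.0x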
It would be funny if that animal was actually a fox that screamed, cringed, and peed all over your helicopter's interior.
i had 2x 8800GTXs in SLI. i won one at a LAN back when i was a teenager playing with a team full of adults, and the pinnacle of 'competitive gaming' was basically winning a graphics card or a processor at an Intel-sponsored event. almost zero money or cash prizes. what an era.
also macbeth the CPU usage in games is still woeful. i did mention this to you when you were adamant about getting a fancy motherboard/chipset that you'll never use. i see you haven't mentioned CPU usage in almost any of your daily posts about WFH/media/gaming. i bet your CPU is still woefully under-utilised.
Reminds me of an argument recently with one of those "40 years in electronics engineering" guys (where whatever you've done, they have a STEM achievement to one-up it, ex: "my physics professor was a Nobel laureate and we had to do space travel calculations with a slide rule and paper!"). He insisted the highly unoptimized game we were playing could "definitely support 128 players if only the server owner put it on a Threadripper!" Incidentally, that's more than twice the player count of any of the game's populated servers. If true, I'm sure at least one of the many servers running on that kind of hardware would have picked up on it …
But honestly, I dunno. Just how much processing power can Unity take advantage of?
which will still be spread over 3/4 cores, typically, rather than 12.
chucking a lot of money into processors is still the best investment if you're doing lots of video/graphics processing, creative work/rendering, programming and running virtual test environments, etc. no question about it. just i feel that lots of gamers take on sort of 2007-era assumptions about budgeting for a processor when they build machines nowadays. you really don't need a beefy $500+ processor to run modern games. it's not like BF2, which was a huge bump in processor demand, i recall, from smaller-scale quake/unreal engine games.
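to put rough numbers on that: if only part of a frame's work parallelises, extra cores fall off fast. a quick amdahl's-law sketch (the 70% parallel fraction is a made-up illustrative figure, not a measurement from any real engine):

def amdahl_speedup(parallel_fraction, cores):
    # theoretical speedup when only part of the work scales with core count
    return 1 / ((1 - parallel_fraction) + parallel_fraction / cores)

p = 0.70  # assume 70% of the per-frame work can run in parallel
for cores in (2, 4, 6, 12):
    print(f"{cores} cores -> {amdahl_speedup(p, cores):.2f}x")
# 4 cores already gets ~2.1x; 12 cores only manages ~2.8x,
# which is why a 12-core chip sits mostly idle in games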
It helped the shit show known as Lightroom run a lot faster. SSD and new GPU helped too, but those came later.
The only game besides Star Citizen that used a lot of CPU was AC Odyssey. And once I got a new video card, usage went down 15% or so.
The i5 was good when I built in 2011, so I never thought about choosing an i7, i9, or AMD equivalent when building this time.
The CPU isn't putting in any work, sure. Chrome is using a gigabyte and a half of RAM though when you look at the "Background processes" too. Pretty happy that about 6 gigabytes of RAM is being used by the system + 2 GB of VRAM in use too.
I really hate that there are 86 vague background "Service Host" and other processes going on, and you have no clue what any of them is doing. They don't use any processing power, and I have the RAM to spare, but they do eat up a lot of it. I installed Win 10 on a PC with only 4 GB of RAM once, and a fresh install of idle Win 10 used like 3.5 GB of it.
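If anyone wants to see where the RAM actually goes, a short script like this lists the biggest resident-memory offenders (needs the psutil package; names and totals will obviously differ per machine):

import psutil  # pip install psutil

rows = []
for p in psutil.process_iter(["name", "memory_info"]):
    mem = p.info["memory_info"]
    if mem is not None:
        rows.append((mem.rss, p.info["name"] or "?"))

# top 15 processes by resident memory, biggest first
for rss, name in sorted(rows, reverse=True)[:15]:
    print(f"{name:<30} {rss / 1024**2:8.1f} MB")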
Do you not like my work from home posts? These are the posts the apex bully Ken Jennings would prefer from me.
chrome is horrendously inefficient with RAM, notoriously so. i have 32GB RAM and that's definitely the main 'redundancy' in my system at present: i could pretty much only ever go over 16GB if i purposefully switched to using RAM-hungry chrome for the sheer sake of it (i use waterfox).
your work from home posts are good. i am happy to see you have a nice new set-up. it definitely makes the long grind and monotony much easier to deal with. i have no harsh or mean-spirited criticism.
I have heard that Chrome is so RAM hungry to increase stability. They treat each tab as an entirely new instance of Chrome so that no single tab can take down your entire browser. Very rarely do I see Chrome take a shit. Win 10 is fairly stable too. It has nightmarish update patterns, awful compatibility with old games, and eats RAM, but I don't see the sorts of crashes and lock-ups that I used to see on Win 98, XP, Vista, or even 7.
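The stability trade-off is basically process isolation: every tab gets its own OS process, so one dying renderer doesn't drag the rest down. A toy sketch of the pattern (just the general idea, nothing to do with Chrome's real architecture or code):

import multiprocessing as mp

def tab(url, crash=False):
    # stand-in for a renderer process; one of them blows up on purpose
    if crash:
        raise RuntimeError(f"renderer for {url} died")
    print(f"{url} rendered fine")

if __name__ == "__main__":
    tabs = [mp.Process(target=tab, args=("news.example", False)),
            mp.Process(target=tab, args=("badplugin.example", True)),
            mp.Process(target=tab, args=("mail.example", False))]
    for t in tabs:
        t.start()
    for t in tabs:
        t.join()
    # the crashed child exits non-zero, but the other tabs and the
    # parent "browser" process keep running
    print("exit codes:", [t.exitcode for t in tabs])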
Win 7, now that's a name I haven't written in a while. After Windows brought back that button in the lower left-hand corner and got rid of the tiles, Win 7 looks a lot less hot in hindsight.
My Win 10 first-gen Ryzen work PC has randomly reset more than any computer I have ever owned. The first two Ryzen generations had serious issues with video compatibility, especially with dual monitors or when using another video port. I'm glad AMD finally smoothed out those errors.
Win 7 will no longer receive security updates, so our MIS department is mandating Win 10 upgrades. Most people see this as a good thing because "new PC!", but now I have to guide all the tech-illiterate people on using a newer UI. It's such a pain.
When we switched from locally hosted storage to an internal cloud-based solution, I had to mount the drive in Windows so it was viewable as a folder in Windows Explorer, because people couldn't understand the concept of uploading/downloading to the cloud unless the look and feel was the same as opening or dragging and dropping to a folder, just like before.
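For reference, the 'mount' was nothing fancy, just mapping the storage gateway's SMB share to a drive letter so it shows up like any other folder. Roughly this (the share path and drive letter are made up, and it assumes the cloud product exposes an SMB/UNC endpoint):

import subprocess

# map Z: to the (hypothetical) internal share and have Windows
# remember the mapping across reboots
subprocess.run(
    ["net", "use", "Z:", r"\\cloudgateway.corp.local\shared", "/persistent:yes"],
    check=True,
)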