Hazzard
@Hazzard@lemmy.zip
Migrated over from Hazzard@lemm.ee
- Comment on DLSS Multi-Frame Generation Is Now Easier To Enable On Steam Deck, And It Makes Gameplay Worse 5 days ago:
Yeah, that’s reasonable. I think it’s pretty cool tech, even if my own priorities and my display prevent me from using it as well.
The only place I really take issue with it is when someone like Capcom pushes it hard in a game like MH: Wilds to reach 60FPS. Generating 30 up to 60 adds 33ms of input lag, in an action game, reaching a level of input lag we haven’t seen in the mainstream since N64 games that couldn’t push past 15-20FPS.
Once you’re at least at 60FPS native, you’re only adding 16ms of input lag, and that begins to feel like a pretty reasonable trade if you really like that smooth look.
- Comment on DLSS Multi-Frame Generation Is Now Easier To Enable On Steam Deck, And It Makes Gameplay Worse 6 days ago:
What? Your numbers are right: if you were running the game at 100FPS, it would take 10ms to render a frame, plus your 10ms of additional latency from holding the frame. 10ms + 10ms is 20ms.
If you were running the game natively at 50FPS, it would take 20ms to render a frame. That’s the same number; the total input lag from rendering is identical. Add in the slowdown from your GPU rendering the in-betweens and it’s even worse.
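To make that comparison concrete, here’s a quick sketch of the arithmetic, under the idealized model from the comment (generating the in-between itself costs nothing, and the only added delay is holding one real frame); the function name is just for illustration:

```python
def total_lag_ms(native_fps: float, frame_gen: bool = False) -> float:
    """Input lag from rendering alone, in milliseconds (idealized model)."""
    frame_time = 1000 / native_fps
    # With frame gen, the finished frame is held for one extra frame time
    # so an interpolated in-between can be displayed first.
    return frame_time * 2 if frame_gen else frame_time

print(total_lag_ms(100, frame_gen=True))  # 20.0 -- 100FPS with frame gen
print(total_lag_ms(50))                   # 20.0 -- 50FPS native: identical
```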
- Comment on DLSS Multi-Frame Generation Is Now Easier To Enable On Steam Deck, And It Makes Gameplay Worse 1 week ago:
This doesn’t surprise me. Raw math, frame gen makes no sense to me unless you’re already hitting 120 FPS natively, and therefore you need at minimum a 240Hz display to make use of it.
Basic math: to generate an in-between, you must already have the next real frame ready. That means displaying each frame is delayed by one frame, so your input lag is equivalent to running natively at half your actual framerate. And this assumes flawless, instant frame generation. All for “motion smoothness”, a vague, not-all-that-important element of game feel, IMO.
So, crunch some numbers. Natively running at 60? Neat, you can have the “motion smoothness” of 120 for the input lag of 30. Not worth it IMO, 30 feels pretty rough when you’re used to 60.
Native 120? Alright, the input lag hit is way smaller: dropping to the lag of 60 only adds 8ms, which is tolerable, and with 4x frame gen you can drive a 480Hz monitor. Pretty good, and the time gap between real frames is small enough you’ll have minimal visible errors in the generated frames. The question of course being… do you own a 480Hz monitor? Not to mention 120 has solid motion smoothness already, so it’s still kind of a questionable trade. I’d still personally prefer native 120, but it’s at least reasonable.
A debatable sweet spot might be 80-100 native: the resulting input lag of 40-50FPS is more than halfway from 30 to 60 (in milliseconds), and you can multiply into more reasonable monitors than 480Hz. A 360Hz display, enough to fully leverage 4x frame gen from 90FPS native, is something you’re more likely to actually own.
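Crunching those numbers in one place, under the same idealized model (instant generation, one real frame held back; the names here are my own, not any vendor’s API):

```python
def framegen_tradeoff(native_fps: float, factor: int = 2) -> dict:
    """Smoothness vs. input lag for frame generation at a given native rate."""
    frame_time = 1000 / native_fps
    return {
        "output_hz": native_fps * factor,      # display rate needed to show it all
        "added_lag_ms": round(frame_time, 1),  # one real frame held back
        "lag_feels_like_fps": native_fps / 2,  # same lag as half the native rate
    }

# Native 30, 60, 120 with 4x generation:
for fps in (30, 60, 120):
    print(fps, framegen_tradeoff(fps, factor=4))
```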
End of the day though, my core takeaway is that frame gen is incredibly niche. You either need to be obsessive about motion smoothness without caring about input lag, have a hella fast monitor and great performance, or uh… most likely, not understand any of this and want framerate go bigger.
- Comment on Steam Controller 4 months ago:
Been using the 8BitDo TMR sticks for a while, and they’re great so far. There’s no super long-term data yet (the technology isn’t old enough for anyone to have 10+ years of real-world usage), but it’s fantastic hardware thus far, and I have faith it’ll last as long as I want to use these controllers, unlike, say, my Switch Pro Controllers (which I promptly replaced with 8BitDo TMR controllers after discovering how bad the drift was on my OG ones).
- Comment on Baldur's Gate 3 introducing a native Steam Deck build that improves performance by reducing CPU load and memory usage 6 months ago:
Haha, you’re not wrong. Ours tend to ebb and flow with whatever urgent priority upper management has set as well, and it tends to take slack alongside our tech debts. Our management is listening and getting better though, I’m hopeful that in a few years we truly will catch up on our tech debts and have all our managed products in good shape at once.
That said, even in that environment, we’ve had some pretty incredible 20% success stories. Some of my own experiments from when I’ve had the time have become proper released features, although I mostly use it to skip the bureaucracy and address my pet peeve tech debts, which isn’t the point but is nice to be able to do. And one of our major internal products, with a large dedicated team and roadmap, began as one developer’s 20% project a few years back.
- Comment on Baldur's Gate 3 introducing a native Steam Deck build that improves performance by reducing CPU load and memory usage 6 months ago:
The best companies will do something like “20% time”, leaving one day a week or something to work on whatever, which is fantastic for stuff like this. Some of your employees almost certainly have the best ideas, if you just trust them with the space to prove them out.
Great way to get cool stuff like this without unpaid labour.
- Comment on Tech to protect images against AI scrapers can be beaten, researchers show 8 months ago:
Amen to that, here’s to hoping.
- Comment on Tech to protect images against AI scrapers can be beaten, researchers show 8 months ago:
Mhm, fair enough, I suppose this is a difference in priorities then. Personally, I’m not nearly as worried about small players, like hobbyists, who wouldn’t’ve already developed something like this in house.
And I keep bringing up “security through obscurity” because, frankly, I’m somewhat optimistic this can work out like encryption did: tons of open source research went into both encryption and decryption until we arrived at standards anyone can run at home, unbreakable before the heat death of the universe with current server farms.
Many of the people releasing decryption methods would’ve been considered villains, because they made hacking previously private data easy and accessible, but that research was the only way to get where we are. I’m hopeful that one day we actually could make an unbeatable AI poison, so I’m happy to support research that pushes us toward that end.
I’m just not satisfied preventing Bill down the street from AI training on art without permission while knowingly leaving Google and OpenAI an easy way to bypass it.
- Comment on Tech to protect images against AI scrapers can be beaten, researchers show 8 months ago:
Exactly, it is an arms race. But if a few students can beat our current best weapons, it’d be terribly naive to think that multiple multi-billion-dollar companies, sinking their entire futures into this and already amoral enough to steal content en masse from the entire internet, hadn’t already cracked it and locked everyone involved into serious NDAs.
Better to know what your enemy has than to just cross your fingers and hope that maybe they didn’t notice, and have just been letting us poison the precious AI models they’re sinking billions of dollars into.
- Comment on Tech to protect images against AI scrapers can be beaten, researchers show 8 months ago:
Eh, it’s a fair point. Not trying something like this is essentially “security by obscurity”, which has been repeatedly proven to be a mistake.
Wouldn’t surprise me if OpenAI or someone else already had something like this behind closed doors, but now the developers of tools like Nightshade can begin to work on developing AI poison that’s more resilient against these kinds of “cleanup” tools.