

A little of column A, a bit from column B, I suppose.
The same thing seems to be happening with regard to regular citizens and ICE raiders, and I’m here for it.
Agreed, I love to see it.
I’m a homebody and live in bumfuck where thankfully almost nothing ever happens, but I’ve signed up for an action newsletter for news and information on how I can help, and I do my part by donating locally to my food bank and other causes I care about. Dunno that it helps with the current state of things, but it’s something. Hopefully there will be more opportunities to help in the future, as I hate this draconian shit going down.
Apologies for misunderstanding your comment, that’s on me.
Holy shit, my brain did the 10 year skip again. Fuck, it’s been a long, long mess.
You know what? That’s entirely fair.
Ah, gotcha. Weirdly vague comment, but in that case, I agree.
What does that have to do with anything? Terrorists hijacked a plane 14 years ago and now we’re here.
The terrorists today are domestic, running around in tacticool gear and working for our government.
Is this provided via T-Mobile or did he just rip off T-Mobile’s logo font?
You can’t have roving mobs destroying your city every night.
No, the curfews are for L.A. residents and civilians, not ICE and the L.A.P.D. The roving mobs are still out.
True, but even that is higher than the latency was on the original systems on CRT. My previous comments were specific to display tech, but there’s more to it.
Bear in mind I can’t pinpoint the specific issue for any given game but there are many.
Modern displays, even the fastest ones, have frame buffers for processing and displaying the color channels. That’s one link in the latency chain. Even if the output were otherwise as fast as a CRT’s, this would add latency in 100% of cases, since a CRT was an analogue technology with no buffers.
Your GPU has a frame buffer that’s essentially never less than one frame, and often more.
I mentioned TVs above re: post processing.
Modern games sometimes add delays to synchronize data between the CPU and GPU.
Older consoles were simpler and didn’t have shaders, frame buffers, or anything of that nature. In some cases the game’s display output would literally race the beam, altering display output mid-“frame.”
Modern hardware is much more complex and despite the hardware being faster, the complexity in communication on the board (CPU, GPU, RAM) and with storage can contribute to perceived latency.
Those are some examples I can think of. None of them alone would be that much latency, but in aggregate, it can add up.
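If it helps, here’s a rough back-of-envelope of how those small contributions stack up. Every number below is a made-up assumption for illustration, not a measurement of any specific setup:

```python
# Illustrative latency budget for a modern setup.
# All values are assumed placeholders, not measured figures.
contributions_ms = {
    "wireless controller polling": 4.0,         # assumed average wait
    "game engine CPU/GPU sync": 8.0,            # assumed ~half a frame at 60 fps
    "GPU frame buffer (1 frame @ 60 fps)": 1000 / 60,  # ~16.7 ms
    "display processing/buffer": 5.0,           # assumed for a fast monitor
}

total = sum(contributions_ms.values())
for source, ms in contributions_ms.items():
    print(f"{source}: {ms:.1f} ms")
print(f"total: {total:.1f} ms")  # well above a CRT's near-zero chain
```

No single item is huge, but the sum lands in the tens of milliseconds, which is the “in aggregate, it adds up” point.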
F-Zero X ran at 60 fps. Also Yoshi’s Story, Mischief Makers, and probably a few others.
Also, the PS1 had many games that ran at 60 fps, too many to list here in a comment.
I don’t understand all the technicals myself, but it has to do with the way every pixel in an OLED is individually self-lit. Pixel transitions can be essentially instant, but due to the lack of any ghosting whatsoever, low-frame-rate motion can look very stilted.
Also, the inherent LCD latency thing is a myth; modern gaming monitors add little to no latency even at 60 Hz, and at high refresh rates they are faster than 60 Hz CRTs.
That’s a misunderstanding. CRTs technically don’t have refresh rates, outside of the speed of the beam. Standards were settled on based on power frequencies, but CRTs were equally capable of 75, 80, 85, 120Hz, etc.
The LCD latency has to do with input polling and timing, since input sampling is often tied to the display’s refresh and polling rates. There’s also added latency from things like wireless controllers.
The actual frame rate of the game isn’t necessarily relevant: if you run a 60 fps game on a 120 Hz display and enable black frame insertion, you get reduced input latency even at 60 fps, because doubling the display’s refresh rate also increases the polling rate, which is tied to frame timing. Black frame insertion or frame doubling doubles the refresh, cutting input delay roughly in half (not quite that, because of overhead, but hopefully you get the idea).
This is why, for example, the Steam Deck OLED has lower input latency than the original Steam Deck: it can run at up to 90 Hz instead of 60, and even at lowered refresh rates it has reduced input latency.
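To make the “doubling the refresh roughly halves the wait” idea concrete, here’s a simplified model that assumes input is sampled once per display refresh (a common setup, but not universal, and it ignores the overhead mentioned above):

```python
def avg_input_wait_ms(refresh_hz):
    # If input is sampled once per refresh, the average wait before the
    # next sample is half a refresh period (simplified model).
    period_ms = 1000 / refresh_hz
    return period_ms / 2

print(avg_input_wait_ms(60))   # ~8.3 ms average wait at 60 Hz
print(avg_input_wait_ms(120))  # ~4.2 ms at 120 Hz, roughly half
```

Same game frame rate, but the faster-polling display shaves a few milliseconds off every input.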
Also, regarding LCD, I was more referring to TVs, since we’re talking about old games (I assumed consoles). Modern TVs do a lot of post-processing compared to monitors, and in a lot of cases there’s going to be some delay because it’s not always possible to turn it all off. The lowest-latency TVs I know of are LG’s, as low as 8 or 9 ms, while Sony tends to be awful, between 20 and 40 ms even in “game mode” with processing disabled.
My experience with Code Vein was briefly playing it on game pass, but couldn’t get past the weeb bait waifu chick. Like seriously, first cutscene and her boobs are waving in the breeze while she’s standing still, like they’re fucking flags or something. It was downhill from there when the gameplay was mediocre and I was supposed to somehow connect with and protect said waifu as my motivation.
Uninstalled in under an hour. Wife and I jokingly refer to the game as “Code Titty-Flap.”
I hate that. I had my home built to spec a few years ago. The exterior siding is cedar shake stained a chocolatey brown with forest green trim. The interior is white walls with natural wood trim, pale golden laminate wood flooring, two-tone hickory cabinets, and unpainted natural wood interior doors.
I’ve leaned into the wood aesthetic with my DIY standing desk and custom pine desktop stained a dark red oak color, among various other earth tone color hints, and splashes of brighter decoration here and there.
Was going for “cozy cabin/cottage” and I think we nailed it. It’s very rustic.
I really hate the modern trends of white, black, steel, and glass.
A few things. First, frame timing is critical, and modern games aren’t programmed as close to the hardware as older games were.
Second is the shift from CRT to modern displays. LCDs have inherent latency that is exacerbated by lower frame rates (again, related to frame timing).
Lastly, with the newest displays like OLED, because of the way the screen updates, lower frame rates can look really jerky. It’s why TVs have all that post-processing and why there are no “dumb” TVs anymore. Removing the post-processing improves input delay, but it also removes everything that makes the image smoother, so higher frame rates are your only option there.
Sounds great. I’m in my 40s with myopia, astigmatism, and more recently, presbyopia.
Progressive lenses don’t work for me, and needing two pairs of glasses is not ideal, even if it mostly works. Plus I can’t even just buy reading glasses off the shelf, even my short range office lenses need a prescription and are expensive as hell.
Autofocusing lenses sound like an awesome alternative.
As a former VMware employee this is just sad.
VMware was a great place to work, with a lot of people who cared about what they were building and supporting, and now it’s just a hollowed out vulture capitalist’s pump and dump. Anybody with any sense is migrating to alternatives, if they haven’t already.
Nearly finished re-reading Murderbot as well, currently on book 7. After that I’m planning to check out the Warhammer 40k Dark Imperium series on my wife’s recommendation (she’s listening to Dark Imperium via audiobook now).
This is a weirdly aggressive take without considering variables. Almost petulant seeming.
6” readers are relatively cheap no matter the brand, but cost goes up with size. $250 to $300 is what a 7.8” or 8” reader costs, but there’s not a single one I know of at 6” at that price.
There are 10” and 13” models. Are you saying they should cost the same as a Kindle?
Not to mention, regarding Kindle, Amazon spent years building the brand while selling the devices at cost, or possibly even at a loss, since they make money on the book sales. Companies that can’t do that tend to charge more.
Lastly, it’s not “feature creep” to improve the devices over time; many changes are quality of life. Larger displays for those who want them. Frontlit displays, and later the addition of warm lighting. Displays essentially doubled their resolution, allowing crisper fonts and custom fonts to render well. Higher-contrast displays with darker blacks for text. More recently, color displays as an option.
This is all progress, but it’s not free. Also, inflation is a thing and generally happens at a rate of 2% to 3% annually or thereabouts during “normal” times, and we’ve hardly been living in normal times over the last decade and a half.
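For a rough sense of scale, here’s what compounding an assumed 2.5% annual rate (the midpoint of that 2–3% range) over 15 years does to prices:

```python
def cumulative_inflation(annual_rate, years):
    # Compound growth: prices multiply by (1 + rate) each year.
    return (1 + annual_rate) ** years

# Assumed 2.5% a year over ~15 "normal" years:
factor = cumulative_inflation(0.025, 15)
print(f"{factor:.2f}x")  # roughly 1.45x overall
```

So even at a tame, steady rate, a device that cost $200 fifteen years ago would be expected to run close to $290 today before any hardware improvements are factored in.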
Can’t speak for everyone, but I exercise and drink a lot of chamomile tea.