
I guess the RX480 & GTX970 are old now (granted, for a lot of that time the GPU market has been broken by two different crypto distortions & a pandemic shock) but I'm still not used to seeing them listed as minimum specs for a new release. "Wait, those are nice, if old, GPUs!"
But we're talking 4-5TFLOPS & 200GB/s memory bandwidth. I guess for a 1080p30 target, with less capable workload scheduling than new cards & limited upscaling boosts, you need that.

We claim to process colours all day (for display on monitors, seen by human eyes), but then a question like "what's the colour halfway between these two other colours?" leads, after a short period of people discovering they are wrong, to everyone crying or talking about how their eyes lie.
This is before we even get started on limitations of our visual systems (creating optical illusions) & if you embrace them all or try to suppress undesirable results.

Devs who automatically use Xbox button names when detecting a gamepad on XInput & PlayStation button names for DirectInput are doing it correctly. If you're giving me "Xbox but in the wrong order" (i.e. Nintendo) by default for anything you don't know for sure is a Nintendo controller, then everyone will just get confused; keep that for an option to be enabled in the settings.

Finally gave GreedFall a go (sounds like it did quite well, even if the sequel is appearing with a new publisher so maybe somewhat turbulent negotiating there).
It really is Dragon Age II-2 (because we sure aren't getting that from EA any time soon). Right down to the uncomfortable subject matter from using a historical-fantasy setting (the Elven ghettos & reserves in that conquered land were not exactly subtle references in DA either, if ever so slightly more fantastic a setting).

Some possibly legacy advice (especially if using a modern API where you may expect to iterate all possible devices, including understanding perf expectations in latest updates) on ensuring you render on the right GPU:
Some of the info is definitely outdated (per-app settings are now handled in Windows, not the vendor control panel), but exposing the symbols is probably still something to consider doing if not using the latest rendering APIs.


Steam hwsurvey VRAM: 12GB: 5%; 10/11GB: 5%; 8GB: 25%; 6GB: 21%; 4GB: 17%.
Slightly surprised AAA games don't seem to test much on lower VRAM configs. A few oddities:
Dirt 5 has no way of asking for lower-res textures in the settings? But it does tell you you're low on VRAM when the benchmark loads up, then proceeds to run at 10fps even in 720p low mode.
Halo Infinite simply refuses to use more than 75% of VRAM (Windows compositor/apps run on the iGPU so don't eat into dGPU VRAM) no matter the settings. Very low quality.

One thing that having several usable GPUs inside my laptop has gotten me to change is thinking of visual settings as having one set per computer.
My settings file now contains settings per adaptor, so that if someone flips between an iGPU on battery & dGPU or xGPU at a desk then they only have to configure things once (and no risk of trying to load a settings block that contradicts feature level then resetting everything to a "safe" default on failure).

I suspect this is the first time I've ever had duelling GPUs from a purchase (laptops have previously been powered by an iGPU that couldn't come close to any contemporary dGPU; desktop upgrades always demolish the previous card in every aspect).
Interesting to contrast something like Far Cry 4 (1070 clearly superior) with Avengers (3050 may not be perfect in static shots but in motion the dynamic DLSS allows a locked framerate target at settings not on the table with a Pascal desktop card).

I hadn't loaded up EA Desktop (the not-Origin that hooks into GamePass PC) for quite a while but browsing through the archive of games, I notice all those deals EA did to get non-EA games onto EA PlayPro have been allowed to expire so now they just have a rather random assortment of smaller games on sale on their store (outside the rental system). Enough to look a bit weird because it's definitely not so many as to look like a normal store that sells a lot of different games.

I thought maybe I was doing something weird with some little DX12 test apps that was causing some builds to load on my iGPU while others loaded on the dGPU (Windows has heuristics to pick) but, having gone through the various DX12 samples, this seems to be typical for rather skeletal apps. Even just going through the DX12 Hello World samples, some are picked up as demanding "high performance" while others are flagged for "power saving".

I realise that at a certain point of corporate ownership you have zero ability to make decisions around content sponsorship, but if I worked for a US gaming-related company who have made queer-inclusivity part of their brand in recent years, I probably would have done anything to delay announcing a new series sponsored by State Farm.

RDNA2 CUs inside various chips:
Xbox Series S: 20 [customised]
Radeon RX 6300M (bottom of the dGPU stack): 12
Rembrandt R7/9 iGPU: 12
Steam Deck: 8
Rembrandt (budget) laptop: 6
Samsung 2200 phone SoC: 6 [?]
The new Mendocino APU that AMD are confidently claiming will be right at home inside a laptop that costs up to $700 this holiday: 2 [!]
I just... why?

So Creative Assembly are changing all their user account branding from "Total War". I wonder if Sega have something to announce about their future projects soon.
Alien: Isolation becoming a bit of a cult classic in the decade since release probably hasn't hurt the chances that CA get to have another go at making something other than just more Total War in the future.

Interesting that people are questioning fab-based significant CPU [or now GPU] improvements (with Intel & now AMD demanding significantly more power to feed their new designs; nVidia pumping the power) at a moment where I'm finding a 5800H (2021) using up to 50W is more than a match for my old desktop 2700X (2018) at up to 140W. Both monolithic designs, similar cache, but a die shrink & a couple of generational updates to the Zen design allowing a major efficiency win (in high perf scenarios).

One thing I'm yet to figure out about Optimus is whether you can actually force an app to use a certain GPU (by making the other one invisible to it). Since I last used a laptop with two GPUs, Windows (not the nVidia CP) has taken over controlling preferences. But it's not a forced thing (if apps probe for GPUs & specifically target one), so currently the Epic Store app is set to prefer the iGPU (it's just a web browser) but will still spin up the dGPU to render on & I have no idea if I can actually stop it doing that.

Jess Birch boosted

Another cool thing that I learned when doing this work is that humans see red in MUCH HIGHER RESOLUTION than blues.

Reds always grab your attention first because our eyes have many more red-sensitive cones than blue (64% vs 2% (!))


Amazing how well the Apple PR line "Retina display = you biologically only need this many pixels" has survived in the imagination of nerds.
Someone just said the "eye's visual acuity" means anything higher than 4K 27" at 60cm is totally pointless. And let me tell you, typing this on a 16" 3840x2400 laptop that's 60cm from my eyes, that is not remotely true. But then my glasses bring me up to 20/12 & I see with both my eyes at once, so I don't really care about one unmoving 20/20 eye's angular acuity.

So apparently you could make an absolute fortune by offering retreats explaining to the executives at big game companies (Sony, EA so far) what the paradox of tolerance actually means. It is categorically not "inclusive" to be welcoming of reactionary views which deny the basic rights of marginalised employees (as just "another point of view").

We can all agree that hardware 3D is definitely here when you're talking Glide or OpenGL fixed pipeline; it's dominant with consumer hardware T&L. It's definitely not present in the polygons of Another World (CPU-driven on every platform).
But what about the area in the middle? Are the 3DO & Saturn fully 3D accelerated, or very clever use of primarily 2D co-processors? Is Virtua Racing hw 3D on the Genesis (SVP chip) but software on the 32X? Is the line stuff like the PS1, or does that fall just short?

It's the PS5, Sony get a do-over with a new console & the launch of a new service (tiered PS+). Yet we still get emulated PAL PS1 ports dumped onto the international audience, despite 50Hz display technology being an endangered species (does the PS5 even auto-switch when plugged into a TV that supports 50Hz input?). This needs to be better.

Gamedev Mastodon

Mastodon server focused on game development and related topics.