5 months later, no chance of buying a new GPU (or Ryzen CPU) in sight.

I think I'm going to participate in GitHub's month-long "Game Off" game jam this November. I've never participated in a game jam & it seems like a good opportunity to kick my ass into gear & actually work on something.

Planning on using Godot (+ Rust? Not sure yet), Blender & Aseprite as my tools.

Anyone else planning on participating?

@ComedyReflux finally, a game that lets me stab myself in the eye!

@vnen That *does* look fun! I wish I could play it without subscribing to Nintendo's online service - I'd pay $10 to own it.

Microsoft buying ZeniMax is huge, although exclusivity deals are not something I'm a fan of.

At least Game Pass will get an even bigger library? I've been meaning to play Doom Eternal for a while.

@shivoa Darn, that seems to have been the story all over. What a silly "launch" from Nvidia. I'm seeing some 3080s on local classified sites going for $700 over retail price & screenshots of people ordering like 20 FEs directly from Nvidia with some bot help. Looks like most of us will have to wait.

Attempting to get my hands on a 3080 this morning turned out to be a colossal failure. No anti-bot measures at any Canadian retailer = no stock within seconds of it going up. Did anyone manage to get one?

I'm now writing Scala as part of my day job & so far... I'm not a huge fan. Implicit method parameters & variables are a little whack.

Why work on game mechanics when I can spend hours looking for the perfect sprites & tiles to prototype with & inevitably replace?

@Kimimaru one great thing about Godot is that if you find certain code paths are too slow due to GDScript, you should be able to rewrite that module in C#/C++/Rust for a speed boost!

I'm working on a prototype for a top-down action game (so unique, I know!).
This game will allow the player to jump (e.g.: Terranigma).

I'm curious what the best way to implement this would be: keep everything 2D and implement the jumping there, or switch to 3D, where implementing the jump is easier, and just "fake" the 2D look?
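For the 2D route, one common trick is to track the jump height as a separate `z` value with its own velocity and gravity, and draw the sprite at its ground position offset by that height. A minimal Rust sketch (all names and constants are my own, not from any engine API):

```rust
// "Fake" jump in pure 2D: gameplay stays on the ground plane, while a
// separate `z` simulates height. Rendering subtracts `z` from the ground
// position so the sprite visibly rises and falls.

const GRAVITY: f64 = -30.0; // units per second^2, tuned to taste

#[derive(Debug)]
struct Jumper {
    world_y: f64, // position on the 2D ground plane (used for collisions)
    z: f64,       // height above the ground; 0.0 = grounded
    z_vel: f64,   // vertical velocity
}

impl Jumper {
    /// Start a jump only when grounded.
    fn jump(&mut self, impulse: f64) {
        if self.z == 0.0 {
            self.z_vel = impulse;
        }
    }

    /// Integrate the vertical motion each frame.
    fn update(&mut self, dt: f64) {
        self.z_vel += GRAVITY * dt;
        self.z += self.z_vel * dt;
        if self.z <= 0.0 {
            // Landed: clamp back to the ground.
            self.z = 0.0;
            self.z_vel = 0.0;
        }
    }

    /// Where to draw the sprite: ground position offset up by the height.
    fn screen_y(&self) -> f64 {
        self.world_y - self.z
    }
}

fn main() {
    let mut j = Jumper { world_y: 100.0, z: 0.0, z_vel: 0.0 };
    j.jump(12.0);
    j.update(0.1);
    // One tick in: airborne, so the sprite draws above its ground position.
    assert!(j.z > 0.0);
    assert!(j.screen_y() < j.world_y);
    println!("z = {:.2}, screen_y = {:.2}", j.z, j.screen_y());
}
```

Collision and movement stay purely 2D; only rendering (and maybe a shadow sprite left at `world_y`) reads `z`, which is roughly how Terranigma-style jumps read without going 3D.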

@shivoa Maybe that will happen sooner rather than later.

Maybe it will be worth it to wait for one of these? The more I read online, the more I feel 10GB might be a bottleneck within a couple of years.

@shivoa It would be cool if the 3080 came with 16GB of VRAM. The 3090 is just too expensive for someone who just games and doesn't plan on doing machine learning or rendering.

I'm sure a 3080 Ti will come out in 6 months with more VRAM. My 1060 can't handle 4K though, so I might just deal with the 3080 for now.

@scott ack, and I thought we had it bad here...
The 3080 is $699 US, which is around $910 CDN, so I'm expecting that number to be rounded up to $1K... Not too terrible (although that's still a lot of money for a GPU).

I just checked the AUS conversion and it seems to be around 950. Hopefully that cost isn't inflated more for you folks!

I am bracing myself for the cost of Nvidia Ampere in $CDN.

I've been looking for a new laptop that:
- Plays nice with Linux
- Will be used for game/graphics dev/Blender
- Has an AMD discrete GPU (or something powerful enough for my use cases without an Nvidia dGPU, since the open-source Nvidia drivers don't support newer cards)

I haven't been able to find anything that hits all these points. Any suggestions, or am I screwed?

I opened Blender with the goal of creating a Corgi... I think I failed?

Also, does anyone happen to know of a good Blender 2.8 tutorial series? xD

@shivoa that is a good point. I hadn't realized that it has already been a year since release!

Gamedev Mastodon
