Malix

joined 2 years ago
[–] Malix@sopuli.xyz 12 points 1 day ago* (last edited 1 day ago) (8 children)

about the spacebar:

  • you might want to swap the switch on that; could be it's a wonky one.
  • unless I'm horribly wrong, that spacebar is the standard 6.25u width. Finding a keyboard with a shorter one could turn out to be nearly impossible unless there's some unicorn one-of-a-kind layout out there, or unless you're looking at ergo/split/40%/other-weirdo boards.

about via:

if you can flash it with a firmware which supports vial, it might provide a better customization experience. edit: linky: https://get.vial.today/
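
(rough sketch of the usual vial-qmk route, assuming the board already has a QMK port and a vial keymap exists for it - "your_keyboard" below is just a placeholder, not the actual board name:)

# grab the vial fork of QMK, pull its submodules, then build & flash the vial keymap:
% git clone https://github.com/vial-kb/vial-qmk
% cd vial-qmk
% make git-submodule
% make your_keyboard:vial:flash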

[–] Malix@sopuli.xyz 1 points 1 day ago

tbh, that's quite a bit more than what I would have expected - especially for a game that I thought was more or less a Counter-Strike variant where the whole point is the PvP experience.

[–] Malix@sopuli.xyz 13 points 2 days ago (2 children)

So... I'm a complete outsider to this game (and series), and... is there actually much lore/story to these characters - or better yet, lore/story that anyone cares about or that matters for the game?

I get that the date is a bit unfortunate all things considered, but... why do the characters need to have a stated birthday anyway? Are there in-game birthday parties or what?

[–] Malix@sopuli.xyz 20 points 4 days ago* (last edited 4 days ago)

zstd is generally stupidly fast and quite efficient.

probably not exactly how steam does it, or even close, but as a quick & dirty comparison: I compressed and decompressed a random CD .iso (~375 MB) I had lying about, using zstd and lzma with a 1 MB dictionary:

test system: Arch Linux (btw, as is customary) laptop with an AMD Ryzen 7 PRO 7840U CPU.

used commands & results:

Zstd:

# compress (--maxdict 1048576 - sets the used compression dictionary to 1MB) :
% time zstd --maxdict 1048576 < DISC.ISO > DISC.zstd
zstd --maxdict 1048576 < DISC.ISO > DISC.zstd  1,83s user 0,42s system 120% cpu 1,873 total

# decompress:
% time zstd -d < DISC.zstd > /dev/null
zstd -d < DISC.zstd > /dev/null  0,36s user 0,08s system 121% cpu 0,362 total
  • resulting archive was 229 MB, ~61% of original.
  • ~1.9s to compress
  • ~0.4s to decompress

So, pretty quick all around.

Lzma:

# compress (the -1e argument selects a preset which uses a 1 MB dictionary size):
% time lzma -1e < DISC.ISO > DISC.lzma
lzma -1e < DISC.ISO > DISC.lzma  172,65s user 0,91s system 98% cpu 2:56,16 total

# decompress:
% time lzma -d < DISC.lzma > /dev/null
lzma -d < DISC.lzma > /dev/null  4,37s user 0,08s system 98% cpu 4,493 total
  • ~179 MB archive, ~48% of the original.
  • ~3min to compress
  • ~4.5s to decompress

This one felt like forever to compress.

So, my takeaway here is that the time cost of compressing is high enough that it's worth wasting a bit of disk space for the sake of speed.

and lastly, just because I was curious, I ran zstd with a heavier compression setting (-9) too:

% time zstd --maxdict 1048576 -9 < DISC.ISO > DISC.2.zstd
zstd --maxdict 1048576 -9 < DISC.ISO > DISC.2.zstd  10,98s user 0,40s system 102% cpu 11,129 total

% time zstd -d < DISC.2.zstd > /dev/null 
zstd -d < DISC.2.zstd > /dev/null  0,47s user 0,07s system 111% cpu 0,488 total

~11s compression time, ~0.5s decompression, archive size was ~211 MB.

deemed it wasn't necessary to spend the time compressing the archive with lzma's max settings.
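
(if someone wants to repeat/extend this at home, a lazy sweep over a few zstd levels would look something like the below - an untested sketch, using the same iso as above:)

# untested sketch: compress the same iso at a few levels and compare the sizes
% for lvl in 1 9 19; do
    time zstd -$lvl < DISC.ISO > DISC.$lvl.zst
    du -h DISC.$lvl.zst
  done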

Now I'll be taking notes when people start correcting me & explaining why these "benchmarks" are wrong :P

edit:

goofed a bit with the heavier-compression run; added the same dictionary size.

edit 2: one of the reasons for the change might be syncing files between their servers. IIRC zstd archives can be made "rsync compatible", allowing partial file syncs instead of re-syncing the entire file, saving bandwidth. Not sure if lzma does the same.
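
(for reference, the flag I had in mind - no idea if this is anywhere near what valve actually does, just an untested sketch:)

# --rsyncable trades a tiny bit of compression ratio for output where unchanged
# parts of the input stay byte-identical between runs, so rsync-style delta
# transfers only ship the changed blocks:
% zstd --rsyncable DISC.ISO -o DISC.rsyncable.zst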

[–] Malix@sopuli.xyz 15 points 4 days ago (1 children)

I'll preface this with the fact that I don't know the game, but looking at the system requirements on steam, both of those systems are below minimum spec.

Also, it would probably help if the MacBook actually was on Fedora 41 - it reads 40 in the screenshot. So maybe try upgrading that one, since the game runs on the system that actually has Fedora 41?

[–] Malix@sopuli.xyz 5 points 1 week ago (1 children)

names of games I didn't expect to hear today. I only had the Major Stryker shareware as a kid and played it a lot, along with Duke Nukum and some other old shareware games.

But those games on a massive TV? Pixels the size of cats.

Were you using any CRT shaders or such?

[–] Malix@sopuli.xyz 45 points 1 week ago (1 children)

doesn't seem like it was all that popular a game to begin with, so all things considered the number of affected users is likely to be, like, 10 at best? And some of those are likely affiliated with the dev/publisher anyway.

But it sucks that these things can still get through the sieve to begin with :/

[–] Malix@sopuli.xyz 2 points 1 week ago

eyyyy, World Class Leaderboard, one of the first games I ever had on PC. Got fond memories of that - but the rose-tinted glasses didn't survive a revisit in DOSBox x)

This game looks neat tho. Might pick it up!

[–] Malix@sopuli.xyz 1 points 1 week ago

heh, who would have thought that anything free gains attention. Oh well.

[–] Malix@sopuli.xyz 1 points 1 week ago (2 children)

...aww. Seems to be region-locked; none of the movies are available here (Finland).

[–] Malix@sopuli.xyz 1 points 2 weeks ago

Wobbly Life? https://store.steampowered.com/app/1211020/Wobbly_Life/

It's essentially a co-op 3D platformer about doing random jobs because your granny doesn't like you lying about her house. There are all kinds of "jobs" to do, like racing, fishing, disposing of nuclear waste...

 

Hi

So, the thing I want to accomplish is to add .png images, combine them and then transform the combined monstrosity (move/scale, etc).

But the thing is, if I "alphaover" the images with some offset, for example:

the image laid over the other gets cut off, as the overlay can't reach outside the dimensions of the underlying one.

I know I can just:

  • use e.g. GIMP and combine the images there, but I'd rather keep my workflow entirely in Blender.
  • add transparent padding of ~a billion pixels around the decal as a workaround, but that sounds silly and like "bruteforcing" the concept.

How would I go about getting all the overlaid images to display in full in such a case? I've tried different options on the "alpha over" and "color mix" nodes without results, but it's entirely possible that I just missed some critical combination.

So, thoughts?

0
submitted 5 months ago* (last edited 5 months ago) by Malix@sopuli.xyz to c/blender@lemmy.world
 

(also: an album: https://imgur.com/a/H4EZCdV - I changed the topic link to show the render, for obvious visibility reasons. The album includes the node setups for the geonodes and the gist of what I use for most materials.)

Hi all, I'm an on/off Blender hobbyist. I started this "project" because a friend of mine baited me into it a bit, so I went with it. The idea is to make a "music video" of sorts: gloomy music, a camera fly/walkthrough of a spoopy house, all that cheesy stuff.

It started basically with the geonodes (as shown in the imgur album) - it's a simple setup that generates walls/floorlists around a floor mesh and applies the given materials to them. Nothing fancy, but it allows me to quickly prototype the building layout.

The scene uses some assets from blendswap:

  • https://blendswap.com/blend/14139 - furniture; I redid the materials as they were way too bright for the direction I intended to go. But the modeling on the furniture is top notch, if a bit low-poly - nothing a subsurf modifier can't fix.
  • https://blendswap.com/blend/25115 - the dinner on the table. Also tweaked the material quite a bit; the initial one was way too shiny and lacked SSS.

Thanks to the authors of these blendswaps <3

edit: the images in the paintings on the wall are some spooky paintings I found on Google image search, but damn it, I can't recall the search term to give props. I'm a failure.

The sources for the wall/floor textures are lost to time; I've had them in my stash for like a decade. Wish I could make these on my own :/
