"Some might say "just get a better computer". This is why getting a better computer is bad:
1. Affordability: A lot of people, especially in third-world countries, are very poor and can't afford to buy hardware to run Turbobloat.
2. E-waste: Producing computer chips is very bad for the environment. If modern software weren't Turbobloated, you would buy new hardware only when the previous hardware broke and wasn't repairable.
3. Not putting up with Turbobloat: Why spend money on another computer if you already have one that works perfectly fine? Just because of someone else's Turbobloat? You could buy 1000 cans of Dr. Pepper instead."
Took the words from my mouth. What a great project. Please keep posting your progress.
"Screen resolutions from 320x200 to 800x600."
Still, higher resolutions were not just invented because of Turbobloat.
Important:
This was just a joke from the site, which I actually took seriously!
There is no 800x600 limit.
But also a convenient excuse to sell more RAM and disk space 'for the textures'.
Hard to know how to respond to that. This could be applied to virtually all technology changes that benefit users but also make money for someone else.
I assume you use a refrigerator and not a hole in the ground with ice. Have you been manipulated into giving money to Big Appliance?
To an absolute hardliner for appropriate technology, probably -- but simplicity isn't necessarily all-or-nothing, and (IMO) helping people pull off cool things with simpler tools isn't so bad.
Sure, but we're not talking about how to irrigate a field here, we're talking about being limited to 800x600 resolution when playing a game.
Some people were teenagers when that was the best you could get, so I'm guessing they see it as a "good old days" baseline that they can be principled about while indulging their nostalgia.
I can see that, but I think calling it just nostalgia-driven is judging a book by its cover.
First off, I want to say you can totally have a design ethos that covers game engines as much as irrigation systems -- Lee Felsenstein explicitly cited Ivan Illich's notion of 'convivial technology' as an influence on his modems. And Illich mostly talked about bicycles.
What I see in this project is a specific kind of appropriate technology -- 'toaster compatibility' -- mixed with conscious adoption of old methods and aesthetics to serve and signal that end. Which is cool, IMO.
HTMX uses similar techniques in trying to 'bring back' hypermedia and reduce dependencies, although I think they're after a different kind of simplicity. And of course, their Hypermedia Systems book makes similar nods to 90s-software aesthetics: https://hypermedia.systems/
I remember when that was the best I could get, and I was thrilled with it at the time. But then I was even more thrilled when Far Cry came out. Then Crysis... why would I go back? Now you can certainly argue that nowadays creativity has taken a back seat to just more textures, but I like to have both.
Still, for a simple game, limiting it to 800x600 for performance and dev reasons - why not? But it means I see no use case for it myself.
Is it enough to make gameplay the main challenge?
You probably missed it in another subthread, but that limit was a joke on their website, not an actual limit.
There is no such resolution limit. That was a joke.
Somebody in rural Africa once told me, "one advantage you have living in a colder area is that you don't have to run your fridge for half the year!" I honestly didn't have any good answer for him as to why I do anyway.
Off topic but I always wanted a fridge that uses cold outside air to cool in the winter.
That's actually kind of a "cool" idea. It would likely reduce bills significantly with some kind of external HVAC connection, like your dryer's, that pulls in cold air from a shaded overhang on the side away from solar input (or maybe from underground).
This paper [1] has some discussion of testing differences between 16 °C, 25 °C, and 31 °C ambient conditions. It's actually a fairly significant difference under testing: roughly 0.35, 0.70, and 1.05 kWh per 24 h at 16 °C, 25 °C, and 31 °C respectively. The refrigerators in the experiments were kept at ~5 °C (approx. 600 tests).
[1] https://d1wqtxts1xzle7.cloudfront.net/82169783/j.ijrefrig.20...
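Those figures make the effect of ambient temperature easy to annualise. A quick sketch (the electricity price is a made-up illustration; the per-day figures are the approximate values quoted above):

```python
# Approximate daily consumption figures quoted from the cited paper:
# ~0.35, 0.70, and 1.05 kWh per 24 h at 16, 25, and 31 degC ambient.
daily_kwh = {16: 0.35, 25: 0.70, 31: 1.05}

def annual_kwh(ambient_c):
    """Annualised consumption for a given ambient temperature."""
    return daily_kwh[ambient_c] * 365

# Hypothetical electricity price, purely for illustration (EUR/kWh):
PRICE = 0.30

for t in (16, 25, 31):
    kwh = annual_kwh(t)
    print(f"{t} degC ambient: {kwh:.0f} kWh/yr, ~{kwh * PRICE:.0f} EUR/yr")
```

So between a cool cellar and a hot kitchen, the same fridge can differ by a factor of three in running cost, under these test conditions.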
That sounds either really difficult to make and maintain, or like an absolutely fridge-industry-destroying innovation. Given weather and such, I fear the first. Sick idea though. I know nothing of fridge engineering beyond the basics, so I could be way off.
Depending on what "colder" means, some days it'll still be too warm outside, or some days it will be freezing, or both. Neither is good for many foods or drinks you keep in your fridge.
Of course this might still be micro-optimization from a rural Africa point of view. And a part of the reason for running the fridge is still just convention and convenience.
In rural places they will often also use alternative ways to keep things good besides keeping them cold, because it's cheaper or more easily available than using a fridge: drying things, salting (pickling? not sure of the term, sorry), etc. So they have fewer use cases for a fridge than us (lazy?) people who just throw a fridge at any such food-preservation problem.
A fridge in winter isn't wasteful. All the energy consumed goes towards heating.
Haha, this... I had a similar experience :'). For the ice cubes? Haha
I would argue refrigerators provide a lot more utility for most people than high poly counts.
I think I've gained more utility from being able to look at 3 spreadsheets at once than I've gained from my refrigerator (though not if we're counting the refrigeration of the supply chain for food and medicine; that wins out by a landslide).
Most people don't need 3 monitors. Pretty much everyone needs or has a fridge, except for the least fortunate in society. He said most people, so you just fall into a much, much smaller minority, with a bit of a questionable claim. Like, if you had to give up one, would it really be your fridge over the monitors? The utility of the monitors runs out when you have to spend time getting fresh ingredients every other day.
Fake Optimization in Modern Graphics (And How We Hope To Save It):
Dude is pitching and wanting funding for THEIR solution from the vids I saw, not a general industry change or free fix.
Also their AI upscaling makes it look like the guy is wearing foundation and makes it hard to take seriously lol.
>Dude is pitching and wanting funding for THEIR solution from the vids I saw, not a general industry change or free fix.
Terrible
A higher rendering resolution doesn't require higher resolution textures, and a higher source resolution for textures is what would require more storage and more RAM. (I think a higher rendering resolution does require more video RAM though.)
Of course after some point a higher rendering resolution starts giving diminishing returns if the resolution for the source material isn't also increased.
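To make the distinction concrete, here is a rough back-of-the-envelope memory estimate (the byte sizes, buffer counts, and texture counts are illustrative assumptions; real engines use mipmaps, texture compression, and many more buffers):

```python
# Rough VRAM estimate: a higher rendering resolution mainly costs
# framebuffer memory, while higher-resolution textures cost texture
# memory. Assumes 4 bytes per pixel/texel; mipmaps and compression
# are ignored, and the buffer/texture counts are made up.

def framebuffer_bytes(width, height, buffers=3):
    # e.g. colour + depth + one post-processing target
    return width * height * 4 * buffers

def texture_bytes(size, count):
    # 'count' square textures of size x size texels
    return size * size * 4 * count

MIB = 1024 * 1024
print(framebuffer_bytes(800, 600) / MIB)    # ~5.5 MiB
print(framebuffer_bytes(3840, 2160) / MIB)  # ~95 MiB
print(texture_bytes(2048, 100) / MIB)       # 1600 MiB
```

Even under these crude assumptions, a 4K framebuffer is tens of MiB while a modest library of 2K textures is over a GiB, which is why texture resolution, not rendering resolution, dominates storage and RAM.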
>But also a convenient excuse to sell more ramm and disk space 'for the textures'.
Except different companies sell different things. This is like the conspiracy that women's pants don't have pockets to sell more purses.
"This is like the conspiracy that women's pants don't have pockets to sell more purses."
Oh my god, this explains everything!
(BTW, I recently learned that the 9/11 inside-job conspiracy theory has evolved. Nowadays the standard theory is that there were no planes in the first place, just bombs and smoke.)
I can't tell if you're on the side of the conspiracy or not, but you are correct that no plane crashed into Building 7. Debris fell from 1 and 2 and set the building on fire, and since there was no fire suppression, it all went up pretty badly.
No, sorry. Claiming a whole big city is part of a conspiracy to cover up that no planes actually crashed into the towers is weirder.
But if you were to call for a proper, neutral investigation of the whole story, then I would support that.
The Bush Jr administration got reelected, in my mind that's a bit larger than a single-city conspiracy.
Mohs: Aluminum 2.75, Steel 4
Rockwell: Aluminum 25, Steel 60
Brinell: Aluminum 15, Steel 120
Yield strength: Aluminum 79-570 N/mm², Steel 250-1000 N/mm²
Textures are bad, but screen resolution is good.
Ya gonna just leave empty pixels on display?
Shaded, of course
Is that a hard-wired limit? I know nothing about game engines, so I'm a bit in the dark about why it would only support up to that resolution. Is this about optimized code in terms of CPU-cache-aligned instruction pipelines, etc.?
"Is this about optimized code in terms of CPU-cache-aligned instruction pipelines, etc.?"
That is what I would assume, but so far I have not found a reason explaining the limit. It might also just be that way because the author likes it like that.
The author stated in the thread that the limit doesn't exist. It's just a joke.
They say that but the engine seems to require an OpenGL 4 GPU while the graphics look like something that could be done on a Voodoo card.
It requires a 15-year-old card (so, 2010). Six years after Half-Life 2, but it looks like Half-Life 1, which shipped with a software renderer (no GPU needed at all!)
I fear the turbobloat is still with us.
OK, so on the one hand we have one of the most universally acclaimed PC games in history, made by a team of amazing programmers and artists with a 40-million-dollar development budget, which represented the cutting edge of what was possible at the time in terms of squeezing every bit of performance out of a machine. On the other, we have a one-person hobbyist project that is trying to make a statement about consumerist expectations of more, more, more.
If you're sincere about that comparison then I think you're missing the point.
Being able to run something on fifteen-year-old machines is still plenty anti-turbobloat. And I suspect the 2010 requirement has more to do with the fact that it's pretty difficult to debug software for 1990s hardware that you don't have (or lack proper emulation for).
And if you go back far enough, you reach a tipping point where supporting old hardware can get in the way of something running on new hardware, especially if we're talking about games, unless we're really careful about what we're doing and test on real hardware all the time. Not very realistic for a one-person side project.
That six-year gap between HL2 and 2010 is considerable, so I don't think I'm being terribly unfair. Also, the article invited the Half-Life comparison.
What is ‘turbobloat’?
From context, I interpret it to be ‘graphics tech I don’t like’, but I’m not sure what counts as turbobloat.
The whole post is tongue-in-cheek; it just means "features the game you're making doesn't need (like modern graphics with advanced shaders and super high resolutions requiring the latest graphics cards)".
If you're making a game that needs those features, obviously you'll need to bloat up. If you're not, maybe this SDK will be enough and be fast and small as well.
Manufacturing and shipping a new computer can be worth it long term. Improvements in performance and energy consumption can offset the environmental impact after some time.
Of course for entertainment it's difficult to judge, especially when you may have more fun on an old Game Boy than on a brand-new 1000 W gaming PC.
> after some time.
This is doing a lot of heavy lifting in this sentence.
What you're talking about is called the embodied energy of a product [0]. In the case of electronic hardware, it is pretty staggeringly high, if I'm not mistaken.
Yes, it can be. Last time I did the maths for one of my use cases, it was a matter of a few years, when replacing a few old amd64 boxes with a single Mac mini.
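For anyone who wants to redo that kind of estimate, a minimal sketch of the payback calculation (every number here is an illustrative assumption, not a measurement of any real machine):

```python
# Back-of-the-envelope payback: years until the energy saved by a more
# efficient machine offsets the embodied (manufacturing) energy of the
# new one. Every figure below is an illustrative assumption.

EMBODIED_KWH = 3000      # assumed manufacturing energy of the new machine
OLD_WATTS = 150          # e.g. a few old amd64 boxes at ~50 W each
NEW_WATTS = 20           # e.g. one small modern box
HOURS_PER_YEAR = 24 * 365

def payback_years(embodied_kwh, old_w, new_w):
    saved_kwh_per_year = (old_w - new_w) * HOURS_PER_YEAR / 1000
    return embodied_kwh / saved_kwh_per_year

print(round(payback_years(EMBODIED_KWH, OLD_WATTS, NEW_WATTS), 1))  # ~2.6 years
```

Note the sketch assumes 24/7 operation; for a machine that only runs a few hours a day, the payback period stretches proportionally.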