[Discussion] Browser gaming vs Web3.0?

Started by
23 comments, last by tillymacdonald 1 year, 10 months ago

You make an interesting observation when you say the AAA and indie worlds are diverging far more nowadays than they were, for example, in the mid 2000s. Back then, AAA and C-class titles didn't differ that much, and we also had a strong B class (in between the two worlds), which we barely see any more.

If you want a more proper argument for 64x64 textures / 200 polygons / game sizes of a few tens to hundreds of megabytes: the human eye has a limited number of receptors, and human perception is logarithmic. When you increase the polygon count of a game from 10k polygons per screen to 100k polygons per screen, that's a very big and impressive jump in graphics quality. When you increase it from 100k to 1 million, it's not that impressive, and not as clearly noticeable as the 10k to 100k jump, even though you increased the polygon count by 10x in both cases. The same is true for texture sizes, screen resolution, sound files and so on. The AAA guys will hit a wall very soon (or they have probably already hit it and checkmated themselves) where it no longer matters whether they increase texture sizes and polygon counts; they can't make the visuals any better, no matter how much money they are willing to spend. The whole approach seems like a dead end.
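The diminishing-returns argument can be sketched with a toy model. This is my own illustrative assumption, not a calibrated psychophysics formula: the `perceived_quality` function just takes the base-10 logarithm of the polygon count, in the spirit of the Weber-Fechner law.

```python
import math

def perceived_quality(polygons):
    # Hypothetical Weber-Fechner-style model: perception grows with the
    # logarithm of the stimulus, not with the stimulus itself.
    return math.log10(polygons)

jump1 = perceived_quality(100_000) - perceived_quality(10_000)
jump2 = perceived_quality(1_000_000) - perceived_quality(100_000)

# Each 10x step buys the same perceptual increment...
print(jump1, jump2)
# ...but the second step spends 10x as many extra polygons
# (900k vs. 90k) for that same unit of perceived gain.
print(90_000 / jump1, 900_000 / jump2)
```

Under this model the cost per unit of perceived improvement grows by an order of magnitude with every step, which is the wall being described.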

So personally, if they started biting my ears, demanding higher resolutions and refusing to pay/download/play unless I complied… then I would simply do it, as there is plenty of headroom. Moving the game size from around 50 MB to 150 MB still leaves an acceptable and realistic game size, and the increase wouldn't have an impact on adoption.


Geri said:
You make an interesting observation when you say the AAA and indie worlds are diverging far more nowadays than they were, for example, in the mid 2000s. Back then, AAA and C-class titles didn't differ that much, and we also had a strong B class (in between the two worlds), which we barely see any more.

Yeah. Not sure if there was an indie scene in the 2000s at all, but currently it looks like something in between is missing. The same sweet spot I miss in hardware offerings, I also miss in games themselves. I don't want to download 150 GB just to find out I don't like the game and refund it. I don't want 200 hours of gameplay either, because there is life and work. And I want some content made for grown-ups, not a teen power fantasy. Occasionally there is an indie game that I really like, but I'm sure the demand from my >40 age group is higher than the supply.

Geri said:
If you want a more proper argument for 64x64 textures / 200 polygons / game sizes of a few tens to hundreds of megabytes: the human eye has a limited number of receptors, and human perception is logarithmic. When you increase the polygon count of a game from 10k polygons per screen to 100k polygons per screen, that's a very big and impressive jump in graphics quality. When you increase it from 100k to 1 million, it's not that impressive, and not as clearly noticeable as the 10k to 100k jump, even though you increased the polygon count by 10x in both cases.

This argument ignores two points. In 3D games, the scale of content on screen varies wildly, so if you want an immersive and realistic impression, you need close-up detail. Otherwise, seeing huge texels, or hexagons instead of cylinders, breaks your immersion.
The other is the size of the game's levels or worlds. Even if we settle on a low-poly art style, we may still need a lot of storage for all the content, and 100 MB won't cut it any more, even by the lowest standards I could imagine.

We could sure use some progress on compression. Either that, or we'll move to the cloud to solve the storage problem. But I don't believe in moving back in time to older standards.

Geri said:
The AAA guys will hit a wall very soon (or they have probably already hit it and checkmated themselves) where it no longer matters whether they increase texture sizes and polygon counts; they can't make the visuals any better, no matter how much money they are willing to spend. The whole approach seems like a dead end.

It looks like that to me too, but I'm not sure. Actually, the standard seems pretty consistent, and the next generation has not yet pushed fidelity to impractical levels. Most games are cross-gen. The studios surely do adapt to the current situation and are not stupid.
We'll see what happens with the first UE5 games. If they run at 20 fps like the city demo, but all AAA studios use the engine regardless, then I'll agree about the checkmate. I won't buy a 1000 W GPU just to run it at 40 fps.
However, such problems would be temporary. I'm sure we can achieve a next-gen look with fluid fps on affordable hardware after some time.

JoeJ said:
and 100 MB won't cut it any more, even by the lowest standards i could imagine.

To put that into perspective…

Cartridges were always a bane for size. Even old Atari 2600 games relied on bank switching to address more memory than was officially supported. The largest mainstream cartridges hit 32 megabytes, the most expensive tier for the N64, although a few games like RE2 in 1998 paid a small fortune for a custom 64 MB cartridge.

The CD drive for the Sega CD, Neo Geo CD, and a few others opened up graphics to a level that was amazing at the time, at about 700 MB. Then games quickly grew to 2-disc, 3-disc, and even larger collections.

The PS2 (released in early 2000, 22 years ago) and later the Xbox supported DVDs, about 4.7 GB of media. It took a few years, but even those started to go multi-disc late in the generation's lifetime.

The PS3 (released in late 2006, 16 years ago) supported Blu-ray, while the Xbox 360 went with HD DVD. Later, even those weren't enough for some games, which started downloading gigabytes to hard drives.

So if you're staying below 100 megabytes, you're looking at 1990s-level technology. That's the era of old Pentiums and 486 processors, dial-up 33.6K and 28.8K modems, and VGA or, if you could afford it, Super VGA graphics. A graphics card with more than 256 colors was a luxury only just being adopted. 3D graphics cards weren't much of a thing until the tail end of that time frame, and even then they were cutting edge, rendering extremely blocky low-polygon models. For a small game that never needs those resources, that's one thing; and for people who like retro styles or technical challenges, it's a choice they can make if they decide to constrain themselves.

Just recognize that with that decision, you're choosing to stay more than two full decades behind the industry, and nearly three full decades behind the cutting edge. When you start talking about Web 3.0, the metaverse, and crypto, it doesn't make a lot of sense to tie them to 1990s tech.

Frob:

I think you are referring to experimental high-end tech with a lot of your examples; some of them reflect special cases (some of them bad decisions), and some weren't utilized even by the newest AAA games of the given era. For example, the Xbox 360, which you mentioned, had no HD DVD drive; there was an optional external HD DVD player unit, and the console could play movies from it, but it didn't support games from HD DVD. No game was ever released on HD DVD for that platform.

This means Microsoft simply went with DVD until around Christmas 2013 (less than 10 years ago). The typical PC user still has no Blu-ray drive, only a DVD drive (or none at all; sadly, the newest cases don't even have 5.25" bays any more). With online releases this is less of a problem, but then there are a lot of people with only a few Mbit/s of internet access (because faster connections are not available in their region), and they are not going to be very happy with very large games. To come back a little to the original topic, users will not be happy if the browser tries to download gigabytes of data when they click on the new game's menu (I am pretty sure I would close the window after about 5 seconds if the game was still not running). Something whispers to me that users would salt the earth instantly if a game started storing gigabytes of data in the browser profile.
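To put rough numbers on that point, here is a back-of-the-envelope sketch; the sizes and line speeds are made-up examples, not figures from the thread:

```python
def download_seconds(size_gb, speed_mbit):
    # Gigabytes -> bits, divided by line speed in bits per second.
    # Ignores protocol overhead, so real downloads are a bit slower.
    return size_gb * 8 * 1000 ** 3 / (speed_mbit * 1000 ** 2)

# A 50 MB browser game on a 5 Mbit/s line: 80 seconds.
print(download_seconds(0.05, 5))
# A 4 GB download on the same line: 6400 seconds, roughly 1.8 hours.
print(download_seconds(4, 5))
```

At a few Mbit/s the difference between "playable after a coffee sip" and "come back after dinner" is exactly the 100 MB vs. multi-gigabyte gap being argued about.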

Geri said:
To come back a little to the original topic, users will not be happy if the browser tries to download gigabytes of data when they click on the new game's menu

Maybe that's the current state of some limitations. But the post was not about a limited present state; it was about storage sizes increasing with time. This has always happened on any platform that lived long enough to evolve.
So the same thing will happen to browser games. They will become larger as people solve the related problems, e.g. user-manageable browser caches and devs optimizing for background downloads.

@Alex_prfct Check out Ember Sword. It does both things you ask for: blockchain (for NFTs) and Web 3.0.

I don't know much about the technologies, but to me it seems like Web 3.0 is still very much a beta thing.

This topic is closed to new replies.
