5K resolution, Quad-SLI and 10-core support?

Feedback on past, current, and future development.
psi29a
Posts: 5361
Joined: 29 Sep 2011, 10:13
Location: Belgium
Gitlab profile: https://gitlab.com/psi29a/

Re: 5K resolution, Quad-SLI and 10-core support?

Post by psi29a »

My wife works for a technology concern dealing in future technologies. They've been doing work with AM (additive manufacturing; they'll laugh at you for saying "3D printing") for more than two decades. I'm going to ask them if they could render a terrain for me on their systems.

If they say yes...maybe I could push it a bit and see what I can really do and at what level of detail. ;)
Neargoth
Posts: 2
Joined: 30 Jul 2016, 22:12

Re: 5K resolution, Quad-SLI and 10-core support?

Post by Neargoth »

This thread turned into gold, thank you. :D

BTW, using machine learning to improve game AI does not seem that far off. Since most games nowadays are connected to the internet, you just need to collect data from all play sessions, crunch it at your central data center, and push out small AI changes to all connected games.

But obviously the developers need to focus on real-time 3D printing first. It is UNACCEPTABLE that this is not even on the 1.0 roadmap. Right now I use modelling clay to 3D-print in real time (works best with abot's real-time travel mods), but the rig I will soon get can easily print all of Tamriel before I even install the data files. And I don't even use shaders. Stop this shortsighted DISCRIMINATION of rich people. I demand that you stop it NOW!
Greywander
Posts: 119
Joined: 04 Dec 2014, 07:01

Re: 5K resolution, Quad-SLI and 10-core support?

Post by Greywander »

aesylwinn wrote:Damn, I wish I had a supercluster of 64 raspberry pies.
Ah, let me clarify: I don't actually have a supercluster of RPi's. My current rig is about five years old: a Sandy Bridge i3 and a Radeon HD 6870. (Somehow I have 20 GB of RAM. I can't remember how that happened.) It was a budget (but still capable) gaming rig when I got it; now it's a fossilized potato. I'm starting a new job, though, so I'm hoping to build a new rig with a 1080 or 1070 some time within the next year or so (got some debt to pay off first).
aesylwinn wrote:I really like your idea about using your cluster to simulate multiple intelligent AIs. The idea of having NPCs that actually do things is pretty cool. That's a task that can be scaled fairly well and would benefit from more processing cores. If you were to implement some networking code, you may even be able to use your pie cluster.
Again, RPi's probably wouldn't be the optimal choice for building a cluster, although they'd probably be the cheapest. At $35 a pop, you could build a cluster of 8 Pi's for about the cost of a GTX 1060. Each Pi has a quad-core 1.2 GHz CPU and 1 GB RAM; get a bunch of those and you have quite a lot of processing power on your hands. My only real concern is that the Pi's Ethernet shares a single USB 2.0 bus with the USB ports, which puts an upper limit on how fast data could be transferred to, from, and within the cluster. For lighter use it shouldn't be a problem, though.

I expect this area to evolve naturally as long as someone pushes the envelope. If someone writes some software that allows a PC to automatically detect and utilize the cluster, and someone starts taking advantage of that to design more powerful AI, it should create a feedback loop where better software demands better hardware, which in turn enables better software, etc. Kind of similar to how better graphics have demanded better GPUs, which enable better graphics, and so on.
ezze wrote:Or perhaps it's time to accept that games are not about using more and more resources, but about fun, and that we had pretty much the same fun cracking the difficult levels of Bubble Bobble as playing a game with each hair independently animated.

Think about the Wii: less powerful than the other consoles, and a big success for that reason.
You're not wrong, but I think we also need to talk about how little things like AI have advanced, something that could be revolutionized if cluster computing became the standard. Artificial Intelligence in gaming is anything but; it's really just complex scripting, essentially a massive chain of if/then statements rather than a real decision-making process. I'd like to see NPCs that are actually capable of both acting intelligently and learning.
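
To put "complex scripting" in concrete terms, here's a tiny C++ caricature (all names made up, not from any real game): the NPC's entire "decision process" is one hard-coded if/else chain, with every threshold picked by a designer and nothing learned.

    enum class Action { Flee, Attack, Patrol };

    // The whole "decision process" is one fixed if/else chain.
    Action chooseAction(float health, float distanceToPlayer, bool playerHostile)
    {
        if (health < 20.0f)
            return Action::Flee;    // designer-picked threshold, never adapts
        if (playerHostile && distanceToPlayer < 10.0f)
            return Action::Attack;  // no weighing of alternatives
        return Action::Patrol;      // catch-all default
    }

A learning NPC would adjust those constants from experience; a scripted one carries them to the grave.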

There's also the fact that Bubble Bobble has vastly different gameplay than Morrowind does. Try playing a retro top-down (or side-scrolling) demake of Mirror's Edge and tell me you get the same experience as the original game. I love my retro games, but some gameplay concepts require advanced tech, whether it's 3D graphics, physics, AI, or what have you. If all you want is to make a fun game, then you're better off using GameMaker, RPG Maker, Ren'Py, or some other easy-to-use 2D engine, but if you have a specific concept in mind you might need something more specialized and more advanced.

I'm not someone that really cares a lot about graphics. I think the most important part of graphics is animation, not polygons or textures or godrays or whatever. This is why older games all the way back to the N64 and PS1 can still look great despite their age (while others, on the other hand...). Even in 2D games animation plays a major role in the aesthetic appeal. Think about the sprite animations in FF6 or Chrono Trigger. It's not about using more resources, it's about developing an area of gaming that's been sadly neglected (specifically, AI). And this will require more resources. And besides, I'm not going to turn up my nose at a game because it has pretty graphics; at most I'll lament that they could have spent that time and effort (and money) improving other parts of the game, but only if it needed it. *coughStarCitizencough*

(As an aside, I'd be happy to play an Elder Scrolls style game that used more cartoony graphics, which would theoretically be both easier to make and less resource-intensive. I'm not saying go full Wind Waker, but I wouldn't object to that either. Unfortunately, the more realistic style as seen in most ES games seems to be the default when people are designing OpenMW assets.)

I don't think the OP is asking for as much as you might think. It's all about scalability, optimization, and future-proofing:
  • As far as I'm aware, OpenMW doesn't care what resolution your monitor is, so it should scale to any resolution you like, even ones that don't exist on any current monitor but may in the future. Although now I wonder if multiple monitors are supported... And of course VR is yet to be supported, and will likely be a highly requested feature.
  • Cell preloading helps a lot with reducing load times, but how well does OpenMW scale with RAM? If someone did have a monstrous 128 GB of RAM, I'd expect them to never see a loading screen, as the game should be able to load anything they'd ever need long before they'd need it.
  • Regarding quad-SLI, even nVidia isn't supporting it anymore. You can do two-way SLI, and possibly use a third card for PhysX. There's a small amount of optimization that could be done to get benefits from more than two cards, but honestly I don't think it's worth the effort. Optimizing the graphical processes to use a single card more efficiently would provide much more benefit to everyone. I'd be more concerned about making sure things still scale well when the next card comes out than about optimizing for multiple cards.
  • As far as multi-core support goes, I think all we really need to do is make sure OpenMW runs a task in a separate thread whenever it's efficient to do so. That way you'll automatically utilize your extra cores as much as possible, even if you don't end up using all of them. I think there's room for things like AI and some scripts to run in their own threads, theoretically allowing an arbitrary number of threads and thus a benefit from more cores (see the sketch after this list).
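
To illustrate that last bullet, here's a minimal sketch in plain standard C++ (not how OpenMW actually structures its loop, just the shape of the idea): the AI update fans out across however many cores the machine reports, so a 10-core box simply gets more workers.

    #include <algorithm>
    #include <future>
    #include <thread>
    #include <vector>

    struct Npc
    {
        float health = 100.f;
        void updateAi(float dt) { health = std::min(100.f, health + dt); } // stand-in for real decision logic
    };

    void updateAiParallel(std::vector<Npc>& npcs, float dt)
    {
        const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
        const std::size_t chunk = (npcs.size() + workers - 1) / workers;
        std::vector<std::future<void>> jobs;
        for (std::size_t begin = 0; begin < npcs.size(); begin += chunk)
        {
            const std::size_t end = std::min(begin + chunk, npcs.size());
            // Each worker owns a disjoint slice of the NPC list, so no locking is needed here.
            jobs.push_back(std::async(std::launch::async, [&npcs, begin, end, dt] {
                for (std::size_t i = begin; i < end; ++i)
                    npcs[i].updateAi(dt);
            }));
        }
        for (auto& job : jobs)
            job.get(); // join before the main thread touches shared state
    }

Code written in this shape scales from two cores to twenty without modification, which is really all I'm asking for.
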
We're not asking you to deliver a premium gaming experience; we're just asking that the tools be designed in such a way that they could deliver a premium gaming experience, if someone took the time to build such a game with them and the player had the hardware to run it on the desired settings. But hey, what do I know? I've barely touched C++, and wouldn't really call myself a programmer. Maybe this is a lot harder than it seems.

I know most people don't have the cash to build shiny new PCs, but I think many of us like to imagine that some day we will. It's nice to have something to look forward to. (Also if you literally haven't been able to afford upgrading your computer at all in the last 2 or 3 years, you might want to consider getting a better job. Either that or taking a serious look at where you're spending your money.)
taipanroxy
Posts: 4
Joined: 29 Jul 2016, 16:55

Re: 5K resolution, Quad-SLI and 10-core support?

Post by taipanroxy »

i have absolutely 0 intention of spending 33,000 dollars on a gaming pc that will be outdated in 3 years, and i imagine the developers don't have any intention of doing this either. all declarative programming is complex if/then statements in the end, and there's no reason to expect this to change just because you have vastly more powerful hardware, unless it's of a completely different nature (and therefore you'd need to spend a bunch of time learning it all over again), and of course, that's not even getting into the challenges of AI. the idea of making ai that is *actually* intelligent and *actually* learns, for a game like morrowind, is... like, it's not an option. i don't care how many saltlake qubits you're running

much less important than additive manufacturing of 250 loaded exterior cells at once anyway
Neargoth wrote:BTW, using machine learning to improve game AI does not seem that far off. Since most games nowadays are connected to the internet, you just need to collect data from all play sessions, crunch it at your central data center, and push out small AI changes to all connected games.
yes, but it would be ridiculous for morrowind, while not so ridiculous for something else. frankly, by the point where you add machine learning to morrowind's ai, the actor segments of the engine must look... completely, totally different
Greywander
Posts: 119
Joined: 04 Dec 2014, 07:01

Re: 5K resolution, Quad-SLI and 10-core support?

Post by Greywander »

taipanroxy wrote:i have absolutely 0 intention of spending 33,000 dollars on a gaming pc that will be outdated in 3 years and i imagine the developers don't have any intention of doing this either.
This is just silly. No one is asking you or the devs to buy expensive gaming rigs (which will more realistically run you $2000-$4000, not $33k), only that OpenMW, which, remember, is a tool to make and run games, not the game itself, be able to scale well to take advantage of such a system. And I would hope that in the next 5 to 10 years the hardware would be available at an affordable price with all of the features previously mentioned in this thread. You limit your thinking if you only consider what's available right now instead of thinking about what we might have in the future.
taipanroxy wrote:the idea of making ai that is *actually* intelligent and *actually* learns, for a game like morrowind, is... like, it's not an option. i don't care how many saltlake qubits you're running
Again, you're limiting your thinking if you only consider Morrowind. Morrowind has always been the measuring stick to determine the progress of OpenMW, not the end goal. The end goal is to build an engine capable of creating newer and better games than Morrowind. In fact, OpenMW isn't even limited to open-world RPGs; you could create a variety of different types of games.
taipanroxy wrote:yes, but it would be ridiculous for morrowind, while not so ridiculous for something else. frankly, by the point where you add machine learning to morrowind's ai, the actor segments of the engine must look... completely, totally different
As has been established many times on this website, OpenMW is not Morrowind, it is merely an engine that is capable of running Morrowind. By your own admission, advanced AI is not ridiculous for other games that aren't Morrowind, and I can definitely see such games being created in OpenMW.

There's been a lot of discussion in this thread, as well as a lot of joking around. I want to make it clear that there's a vast difference between expecting the OpenMW team to implement some feature and expecting OpenMW to be designed in such a way that a game developer could implement said feature into their game. OpenMW doesn't need to implement advanced AI, it just needs to lay the framework that would allow advanced AI to be implemented by a game dev. This is far less complicated, and something we could realistically see implemented in the immediate future, even though it might not be taken advantage of for several years.

If your game engine is only designed around current hardware, then it is already obsolete. A well designed engine will scale infinitely (in theory), and be capable of running brand new games 20 years from now with very few changes. The only real changes you should need to make are to take advantage of new technology, and yes, I mean new technology, not just more powerful hardware. If all you've done is add a faster core, or another core, but it's essentially the same core you've been using, you shouldn't need to make any changes to the engine.

All that's really being asked is that:
  • OpenMW uses as many threads as is efficient in order to maximize performance with multiple cores.
  • OpenMW intelligently anticipates what data might be needed and preloads it into RAM, as long as free RAM is available (see the preloading sketch after this list).
  • OpenMW is graphically optimized and supports the latest graphical features, which may or may not be implemented in specific OpenMW games.
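
As a sketch of what that preloading bullet means (standard C++ only; CellData and loadCellFromDisk are hypothetical stand-ins, not OpenMW's real types): cells the player might enter soon get loaded on a background thread, and get() only blocks if the guess was wrong or the load hasn't finished yet.

    #include <future>
    #include <map>
    #include <string>
    #include <vector>

    using CellData = std::vector<char>; // stand-in for parsed cell contents

    // Hypothetical slow, blocking disk load.
    CellData loadCellFromDisk(const std::string& id)
    {
        return CellData(id.begin(), id.end()); // placeholder for real ESM/terrain loading
    }

    class CellPreloader
    {
        std::map<std::string, std::shared_future<CellData>> mCache;
    public:
        // Kick off an asynchronous load for a cell the player may enter soon.
        void preload(const std::string& id)
        {
            if (!mCache.count(id))
                mCache[id] = std::async(std::launch::async, loadCellFromDisk, id).share();
        }
        // Blocks only if the background load hasn't finished yet.
        const CellData& get(const std::string& id)
        {
            preload(id); // a cache miss falls back to a normal (blocking) load
            return mCache.at(id).get();
        }
    };

With 128 GB of RAM you'd simply preload more aggressively; the structure doesn't change.
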
I don't expect this to come before 1.0, but that doesn't mean it shouldn't be on the roadmap.

Morrowind was a good game, but it's also an old game. I'm far more excited to see what sorts of new games could be made in OpenMW. Let's live in the future, not the past.
FiftyTifty
Posts: 63
Joined: 15 Oct 2014, 21:02

Re: 5K resolution, Quad-SLI and 10-core support?

Post by FiftyTifty »

Asking for more robust graphics is a bit wanky, considering the antiquated visuals the game ships with, but having the game be parallelized can only be a good thing.

Let's have the AI processed in the background, with all of yer cores workin' on it. Why have that? So we can have buttloads of NPCs doing their own stuff, akin to "Radiant AI", where you'll happen upon NPCs in the middle of some routine because they're being constantly processed, but without some low performance/memory cap.

Let's have the game utilize a robust object batching implementation, ta boost performance even further (a rough sketch of the idea follows). OGRE used some generic static batching, but 'ey, why not batch everythin' as much as possible? More perf is always good, and since it's dealing with models (vertices 'n' textures), it'd be easy to do in parallel.
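
Roughly the idea, as a wee self-contained C++ sketch (nothin' from OGRE or OSG, just the principle): meshes sharin' one material get merged into a single vertex/index buffer, so the whole lot costs one draw call instead of dozens.

    #include <cstdint>
    #include <vector>

    struct Vertex { float x, y, z; };

    struct Mesh
    {
        std::vector<Vertex> vertices;
        std::vector<std::uint32_t> indices;
    };

    // Merge static meshes that share a material into one mesh,
    // turning N draw calls into a single one.
    Mesh batchStatic(const std::vector<Mesh>& meshes)
    {
        Mesh batched;
        for (const Mesh& m : meshes)
        {
            const auto base = static_cast<std::uint32_t>(batched.vertices.size());
            batched.vertices.insert(batched.vertices.end(), m.vertices.begin(), m.vertices.end());
            for (std::uint32_t i : m.indices)
                batched.indices.push_back(base + i); // re-offset indices into the merged buffer
        }
        return batched;
    }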

Why not chuck in deferred rendering? It's ruddy awful that the entire scene gets rendered again for every light. Got four lit torches? Good game, bub. Deferred rendering makes that pretty much a non-issue, and when that's coupled with decent mesh batching, ya could chuck in ruddy hundreds of lights (see the wee numbers below).
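
The wee numbers behind that, as a toy C++ back-of-the-envelope (real costs depend on overdraw 'n' bandwidth, but the shape holds): multi-pass forward lighting re-draws the scene once per light in the worst case, while deferred draws the geometry once and then does one screen-space pass per light.

    #include <cstdio>

    int main()
    {
        const long objects = 2000; // draw calls for one pass over the scene
        const long lights = 100;

        const long forwardDraws = objects * lights;  // scene re-drawn per light (worst case)
        const long deferredDraws = objects + lights; // one geometry pass + one pass per light

        std::printf("forward: %ld draws, deferred: %ld draws\n", forwardDraws, deferredDraws);
        return 0;
    }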

Let's also have all the meshes, all the sounds, all the textures and everythin' in-between chucked in RAM. It ain't 2003, 8GB of RAM seems to be the norm these days, so let's make use of it. Screw loadin' stuff frae the hard drive. Worst case scenario, it's made into an on/off thingymajigger.


Nae point sticking to single threaded workloads 'n' "good as vanilla" performance, when the whole point of OpenMW is to supersede the original, outdated 'n' right bollocks Gamebryo engine.
AnyOldName3
Posts: 2673
Joined: 26 Nov 2015, 03:25

Re: 5K resolution, Quad-SLI and 10-core support?

Post by AnyOldName3 »

A few quick points about SLI/Crossfire:
  • I'm pretty sure it's OSG's job to support SLI, not OpenMW's.
  • NVidia are only phasing out SLI itself, not multi-GPU in general; it's still possible, in a different way, in DirectX 12 and Vulkan.
  • OSG's creators are planning to create a similar library for Vulkan, in which case OpenMW will probably eventually utilise it.
  • That makes explicit multi-adapter a possibility for OpenMW in the future, so we probably shouldn't write off multi-GPU systems entirely, as was suggested earlier.
greetasdf
Posts: 20
Joined: 12 Jul 2016, 21:35

Re: 5K resolution, Quad-SLI and 10-core support?

Post by greetasdf »

Greywander wrote:
taipanroxy wrote:i have absolutely 0 intention of spending 33,000 dollars on a gaming pc that will be outdated in 3 years and i imagine the developers don't have any intention of doing this either.
This is just silly. No one is asking you or the devs to buy expensive gaming rigs (which will more realistically run you $2000-$4000, not $33k), only that OpenMW, which, remember, [...]
@Greywander: You have serious writing skills! But hey, I understand you. After scrupulously reading your posts, I'll summarize what you want (TL;DR version):
  1. Some sort of artificial neural networks integrated into OpenMW (optimally, and obviously, multicore GPU-optimized libraries)
  2. Endemic genetics for creatures/mobs/NPCs (e.g. resistances)
  3. 3D printing of a cell
  4. Give everyone a good laugh about the weird, outlandish requests that *some people* keep elaborating on :o
  5. to have more humour ;)
*sitting back and getting some popcorn*
Dasher42
Posts: 11
Joined: 22 Apr 2015, 16:41

Re: 5K resolution, Quad-SLI and 10-core support?

Post by Dasher42 »

And here I thought playing games was about gameplay! :D

Back when I was on the Sacrifice of Angels modding team for Homeworld as the scripting guy, alongside a team of mesh and texture artists, we made things polygon-efficient, we got the most out of textures, we made sure shared texture memory was allocated well, and we LODded the hell out of things to make sure our polygon count per ship stayed under a thousand and no texture was bigger than it needed to be. Why? Because the game was about having anywhere from one to hundreds of ships engaged in mass fleet action in 3D real-time strategy, not a 5 fps slideshow of a studio-grade model.

Seems to me this whole OpenMW project is here because people remember what's great about a game over a decade old. They're not rewriting the graphics demos of 2002.

Hey, don't lecture a bunch of open-source volunteers with an awesome project on why they need to buy $4000 rigs to code - unless you're willing to buy them those fancy machines, and to create a bunch of models and textures that actually benefit from them. It will win them over if you do, I'm sure. :lol:
Dasher42
Posts: 11
Joined: 22 Apr 2015, 16:41

Re: 5K resolution, Quad-SLI and 10-core support?

Post by Dasher42 »

taipanroxy wrote:some of us load 250 exterior cells at once and if the api cannot account for this and print all of morrowind, tamriel rebuilt, skyrim, and cyrodiil, and then use machine learning and neural networks to extrapolate and auto make elseweyr and argonia, and print those, and then take me to akavir, in real life (which is VERY realistic with technology in the next years or so) you will be outdated to those of us with good computers in a flash.
All that and the Khajiit will still walk like something's stuck up their arse. ;)