PROGRAMMING (Engine, API, Hardware, etc)

  • sailor67

    Posts: 12175

    Posted:
    @ABrown_CIG

    Just watched the latest ATV. Erin ( @Addis ) was showing the new ship control system. Can you or one of your teammates explain how someone using a stick/HOTAS is supposed to interact with that? I genuinely don't know how that can work without a TrackIR or equivalent.

  • ABrown_CIG

    Staff

    Posted:
    Edited: by ABrown_CIG
    Hi everyone,

    There are quite a few questions here about HUD / MFDs, controller mapping, and networking. You'll have noticed that Ben is our most prolific poster, and I try my best to jump in where I can, but we're both part of the graphics team, so we don't have the knowledge to answer those questions. I'll poke some of the other programming leads to see if they have time to answer a few, but I just wanted to let you know that we're not ignoring you!

    I would love to know why both DirectX and Vulkan are still options for multiplatform games.
    When SC is ready to be ported over to Linux, would it not be easier if you already used Vulkan on Windows and didn't bother implementing DirectX 12?
    You would not spend resources fixing or implementing DX rendering code.

    Hi hyper159,

    Years ago we stated our intention to support DX12, but with the introduction of Vulkan, which offers the same feature set and performance advantages, it seemed a much more logical rendering API to use: it doesn't force our users to upgrade to Windows 10, and it opens the door to a single graphics API that could be used on Windows 7, 8 and 10 as well as Linux. As a result, our current intention is to support only Vulkan and eventually drop support for DX11, as this shouldn't affect any of our backers. DX12 would only be considered if we found it gave us a specific and substantial advantage over Vulkan. The APIs really aren't that different though; 95% of the work for these APIs is changing the paradigm of the rendering pipeline, which is the same for both.
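To illustrate the "single API" point: a renderer typically hides the platform graphics API behind a thin backend interface, so dropping DX11 later only removes one implementation. A minimal sketch (all names hypothetical, not CIG's actual engine code):

```python
from abc import ABC, abstractmethod

class RenderBackend(ABC):
    """Thin abstraction over the platform graphics API (hypothetical)."""
    @abstractmethod
    def name(self) -> str: ...

class VulkanBackend(RenderBackend):
    # Works on Windows 7/8/10 and Linux alike
    def name(self) -> str:
        return "Vulkan"

class DX11Backend(RenderBackend):
    # Transitional fallback; deleting this class is the whole "drop DX11" step
    def name(self) -> str:
        return "DX11"

def pick_backend(platform: str, prefer_vulkan: bool = True) -> RenderBackend:
    """Linux only has Vulkan; on Windows prefer Vulkan, keep DX11 as fallback."""
    if platform == "linux" or prefer_vulkan:
        return VulkanBackend()
    return DX11Backend()

print(pick_backend("linux").name())           # Vulkan
print(pick_backend("windows", False).name())  # DX11
```

The game code above the `RenderBackend` line never mentions a specific API, which is what makes a single-API future cheap to reach.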

    What's the technical reason for the low FPS in the near-sun QDrive warp in today's 10FTC as seen here:
    https://gfycat.com/UglyLinearDog
    Just a lot of objects on the screen at the same time causing issues?

    Hi Hawxy,

    One thing to remember is that, as Erin mentioned, this video represents our 'visual target' rather than a specific star system; it was an artist-led exploration of the type of gas clouds and color themes the art team wants to achieve to add visual interest to the star systems. As a result it wasn't made to be efficient and hasn't been profiled by any tech-artists or programmers. It also wasn't using our volumetric gas cloud system; instead it was manually put together to show one example of what they want to achieve more easily and robustly with the gas cloud system. Our existing use-cases for gas clouds have been quite different from the example shown here, but based on this and other artist reference work we're about to start a major upgrade of our volumetric tech in order to achieve this at a hopefully silky-smooth frame-rate!

    In terms of the number of objects: using our dedicated systems for asteroids & debris we can already handle over 100,000 individually moving objects on screen at well over 60fps, and we intend to use an imposter system to handle the visualization of millions more in the background, so there are no concerns there :-)
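As a rough illustration of how an imposter scheme keeps costs manageable, here's a sketch (names and threshold invented for the example) that splits objects into full-detail meshes and cheap imposters by camera distance:

```python
def classify_objects(objects, imposter_distance):
    """Split (name, distance) pairs into full-detail meshes (near)
    and flat imposters (far). Illustrative only."""
    meshes, imposters = [], []
    for name, distance in objects:
        (meshes if distance < imposter_distance else imposters).append(name)
    return meshes, imposters

scene = [("asteroid_a", 120.0), ("asteroid_b", 9500.0), ("debris_c", 80.0)]
near, far = classify_objects(scene, imposter_distance=1000.0)
print(near)  # ['asteroid_a', 'debris_c']
print(far)   # ['asteroid_b']
```

Only the near list pays full mesh and animation cost each frame; the far list can be drawn as billboards in bulk, which is how "millions in the background" stays affordable.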

    [Question]: Why does it seem like CIG is avoiding the most important questions? I see and hear all the progress on easy stuff that could wait until after the game has been released, for example graphics detailing, animation improvements and gameplay improvements. If all of this can be done, then why don't we have a working release of the game? I see no description of progress on the programming development of the game. I don't care how small the progress is, I just want to know what was done. Telling us what is going to be done is great, but we need to know what was actually done each week. You are working on our money, so in theory we should be told how much progress was made each week on the game.
    (What has been done in the week, not what will be done in the future, like everyone talks about.)

    Hi Calgen,

    We do our best to update the community on the work we're doing, which can be seen in the monthly reports and Around The Verse (which takes a significant effort to produce). It's also worth remembering that our programming team has many specialized departments: game code, animation, audio, graphics, engine, tools, UI, AI & networking, and it's not like they can just jump into each other's code to help out. But I can assure you that all the teams are working hard and none of their work is "easy".

    Here are some links showing our recent updates:

    https://robertsspaceindustries.com/comm-link/transmission/15790-Monthly-Studio-Report
    https://robertsspaceindustries.com/comm-link/transmission/15704-Monthly-Studio-Report
    https://robertsspaceindustries.com/comm-link/transmission/15786-Around-The-Verse
    https://robertsspaceindustries.com/comm-link/transmission/15778-Around-The-Verse

    So, for example, in my January report you can see the Graphics team started work on area lights; in the February report they had made a great deal of progress, which was then shown in this week's Around The Verse. The progress of the other programming teams can equally be seen, so I'm not sure which programming department you're looking for more information on.

    Of course the community often also wants to know what we *intend* to do in the future, as a good chunk of our previous work is already evident in our publicly released alpha, so we try to strike a balance between describing past and future work.

    Cheers,

    Ali Brown - Director of Graphics Engineering

  • VesperTV

    Posts: 5

    Posted:
    Edited: by VesperTV
    Hello, I have two questions for you guys:

    1: What is being done about collision in the game?
    {
    It's so easy to glitch through a wall at the moment. You just walk into one of those "toilet doors", and two seconds later you're in EVA outside of Port Olisar.

    Or you enter a ship, say an Aegis Avenger Titan Renegade, and you have a 30% chance of glitching outside of the ship when trying to enter the buffer zone before the pilot seat.
    }

    2: Are you planning on adding default mappings for your French users?
    {
    The AZERTY keyboard layout is totally different from QWERTY. When you save settings, the game stores the "physical" location of the key on the keyboard rather than the actual key. (For example, it took me a while to realize that to accept a party invitation with "[", I actually had to press the "^" key.)
    E.g.: https://upload.wikimedia.org/wikipedia/commons/b/b9/KB_France.svg (Belgian AZERTY follows QWERTY mapping more closely)
    }
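For what it's worth, the behaviour described above is what happens when bindings are stored by physical scancode instead of by the character the key produces. A toy illustration (the tables are simplified stand-ins, not the game's actual data):

```python
# Simplified tables: which character sits at a given physical key position.
KEY_AT_POSITION = {
    "qwerty": {0x10: "q", 0x11: "w", 0x1E: "a", 0x1A: "["},
    "azerty": {0x10: "a", 0x11: "z", 0x1E: "q", 0x1A: "^"},
}

def key_for_binding(scancode: int, layout: str) -> str:
    """A binding saved by scancode lands on whatever character sits at
    that physical position in the user's layout."""
    return KEY_AT_POSITION[layout][scancode]

# A binding displayed as "[" to a QWERTY user...
print(key_for_binding(0x1A, "qwerty"))  # [
# ...requires pressing the "^" key on a French AZERTY keyboard.
print(key_for_binding(0x1A, "azerty"))  # ^
```

Scancode bindings are great for keeping WASD-style movement in the same physical place on every layout, but the on-screen labels then have to be translated through the active layout, which is exactly the mismatch VesperTV hit.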

    Keep up the good work!
  • Paldren

    Posts: 2295

    Posted:

    Years ago we stated our intention to support DX12, but with the introduction of Vulkan, which offers the same feature set and performance advantages, it seemed a much more logical rendering API to use: it doesn't force our users to upgrade to Windows 10, and it opens the door to a single graphics API that could be used on Windows 7, 8 and 10 as well as Linux. As a result, our current intention is to support only Vulkan and eventually drop support for DX11, as this shouldn't affect any of our backers. DX12 would only be considered if we found it gave us a specific and substantial advantage over Vulkan. The APIs really aren't that different though; 95% of the work for these APIs is changing the paradigm of the rendering pipeline, which is the same for both.

    I am really excited by this, considering the immoral ways Microsoft has been going with Windows 10 ... but does this mean that mGPU will only be an option under Windows 10?


    Source: https://www.khronos.org/assets/uploads/press_releases/2017-rel149-vulkan-update.pdf

    Native multi-GPU support for NVIDIA SLI and AMD Crossfire platforms
    – WDDM must be in “linked display adapter” mode

  • Notavi

    Posts: 7

    Posted:
    Edited: by Notavi

    I would love to know why both DirectX and Vulkan are still options for multiplatform games.
    When SC is ready to be ported over to Linux, would it not be easier if you already used Vulkan on Windows and didn't bother implementing DirectX 12?
    You would not spend resources fixing or implementing DX rendering code.

    Hi hyper159,

    Years ago we stated our intention to support DX12, but with the introduction of Vulkan, which offers the same feature set and performance advantages, it seemed a much more logical rendering API to use: it doesn't force our users to upgrade to Windows 10, and it opens the door to a single graphics API that could be used on Windows 7, 8 and 10 as well as Linux. As a result, our current intention is to support only Vulkan and eventually drop support for DX11, as this shouldn't affect any of our backers. DX12 would only be considered if we found it gave us a specific and substantial advantage over Vulkan. The APIs really aren't that different though; 95% of the work for these APIs is changing the paradigm of the rendering pipeline, which is the same for both.
    Excellent. I've been avidly awaiting news about how the Vulkan work is going (proud member of the Grounded Linux Navy here). I occasionally see glimpses of news about it in the monthly reports, but since it seems to be proceeding on a when-it's-done timescale it doesn't make it into the weekly reports. That's fair enough, since from what I understand you're not yet ready to commit it to a particular release, though it would be lovely to have a section in that report discussing other background work that might be happening.

    The last news was a few months ago, and indicated that the team was mainly busy doing the engine re-organisation work needed to make the best use of Vulkan. Is that still where you're at, or is there some news to share on this front?
  • Dranor-Zylander

    Posts: 1359

    Posted:
    ABrown_CIG, will the first public Linux build have a code name? I'd like to suggest Sprinty Leopard lol.

    Someone should let Richard Stallman know.
  • BParry_CIG

    Developer

    Posted:


    Hi Dictator,

    Are you sure you don't actually work here? This summary is pretty comprehensive, even highlighting the major assumptions & flaws in each piece of tech :-)

    Haha. I guess my years of meticulously obsessing over pixels has had some benefits! *Cough Cough* If you guys are ever looking for an external graphics QA... *Cough Cough* ;)

    Seconding the impressed-ness with the summary. I'm meant to be writing something like that up before we make changes... I might just use this.

    -snip'd-

    That sounds pretty sound. Since you guys would then be making use of the LY screen-aligned voxels, would you also use that for planetary cloud rendering (non-gas-cloud-type clouds)?
    While the exact unification plan isn't nailed down, our rough view is that any system we have needs an answer for when it's inside the Froxel Fog's range and an answer for when it's outside. We basically end up with three categories:
    1) In some cases the answer will be that the effect is small enough to disappear once it's beyond the range; this is the way standard LY deals with fog volumes, for instance.
    2) In other cases, where Froxel Fog would work better or another system that uses it may be present (e.g. will fog volumes exist at cloud altitude?), we'd like the long-distance solution to be able to export its own data into the froxel buffers, so that there's no discontinuity where the two solutions switch over.
    3) Finally, there may be cases where the "distant" solution looks as good as or better than Froxel Fog, and in that case the systems don't need to interact at all.

    Since planetary clouds are going to be visible at very long distances, we know they can't be solved only with Froxel Fog, so they'll fall into either category 2 or 3.
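The three categories above boil down to a simple decision; a sketch purely to illustrate the logic Ben describes (parameter names are made up):

```python
def froxel_category(effect_range: float, froxel_far: float,
                    distant_quality_ok: bool) -> int:
    """Classify a volumetric effect per the three cases:
    1 - small enough to fade out at the froxel far plane,
    2 - must export distant data into the froxel buffers to hide the seam,
    3 - the distant solution stands alone."""
    if effect_range <= froxel_far:
        return 1
    return 3 if distant_quality_ok else 2

# Planetary clouds: visible far beyond the froxel range, so never category 1.
print(froxel_category(effect_range=1e6, froxel_far=500.0,
                      distant_quality_ok=False))  # 2
```

Category 2 is the interesting engineering case, since two independent renderers have to agree at the handover distance to avoid a visible seam.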

    By the way, I'm avoiding the word 'nebula' when discussing space volumetrics, as that suggests a ridiculous scale that is pointless to represent as a volume: even at light speed you wouldn't see any parallax.

    Yeah, perhaps just "space dust" fits well enough ;)

    Looking at the most recent monthly report, I definitely loved seeing @BParry_CIG 's work on the rectangular area lights.
    It looks a lot better on character faces than the last model, especially the way diffuse light propagates over small discrete features like the underside of eyelids. Is some of this work based on the Eric Heitz work / Unity stuff that has been going around? If so, the most recent paper also showed off how to make textured rectangular area lights. That could be interesting given the amount of signage in SC atm.
    I may have posted the video of Heitz's LTC paper to Ali's Facebook page under the heading "ALI. ALI LOOK" the moment I heard about it. As it happens, though, we're not using it: while it's miles ahead of the pack in performance/quality tradeoff, it's still a huge cost (approx 1ms per fullscreen light on a GTX 980), and we would then have no way to downscale on lower-spec machines, given how different the scenes would look with those lights disabled.
    Instead, the diffuse component is closely based on Sébastien Lagarde's work for planar lights in the Frostbite engine, and the specular component is somewhat based on the version we got from CryEngine; the reworking mostly consisted of taking it apart to find out which approximations led to artefacts, replacing them with alternatives, and exhaustively testing them for new issues.
    One thing that did come out of this rework, however, is that the planar lights now live in their own rendering pass. Given the time, it would be interesting to see if we could add some kind of "ultra mode" that replaces this pass with a version that uses LTC lights.
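Since the planar lights live in their own pass, an "ultra mode" could in principle be a pure pass-swap. A sketch with invented pass names, only to show how isolated the change would be:

```python
def select_planar_light_pass(quality: str, ltc_supported: bool) -> str:
    """Pick the shader pass used for planar area lights.
    'planar_lights_ltc' and 'planar_lights_approx' are hypothetical names;
    the point is that the quality swap touches exactly one render pass."""
    if quality == "ultra" and ltc_supported:
        return "planar_lights_ltc"   # highest quality, ~1ms per fullscreen light
    return "planar_lights_approx"    # cheaper analytic approximation

print(select_planar_light_pass("ultra", True))  # planar_lights_ltc
print(select_planar_light_pass("high", True))   # planar_lights_approx
```

Because both passes consume the same light list and write the same targets, lower-spec machines keep an identical-looking scene, just with the cheaper approximation; that addresses the "can't disable them" downscaling problem Ben mentions.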


    I am curious then: what kind of work, if any, will be done for occlusion or shadows from such lights? For example, one shouldn't see the specular highlight across a character's eyes when the hood of their eyelid would occlude it. Obviously that is an open area of research in video game graphics, but unshadowed lights in general (even area lights) tend to look rather gamey in the end, especially given that the human face usually needs good occlusion and shadows to cross the uncanny valley.

    I am curious as well because some outdoor planetside shots from the most recent AtV got me thinking about specular occlusion in SC; it seemed like some edges were highlighted by probes even though you would imagine that SSDO would directionally occlude that.

    Specular occlusion and a better SSDO term are definitely on our radar. Unfortunately shadow maps from large area lights, as you mention, are a huge open research topic, so we're focusing on improving softer screen space techniques to take up the slack.


    And last question, I swear. Given the prevalence of space helmets with glass in SC, how will this figure into DOF? At the moment DOF seems to act as if the glass is not there and blends it into the background or foreground depth of field, even though from a realism standpoint it should not (like below).
    I know Ryse had some funky way of making transparency avoid this problem ("depth fixup"). Is a similar idea going to be used in SC when you make use of LY's post-processing for motion blur and depth of field?

    Anyway, thanks for answering any questions if you guys do.
    Best!

    We have two solutions to this! First, as you say, there's depth-fixup. We're already using that on hair, and (I think) on particles. In general though, the trick is to do depth fixup only on the parts that are opaque enough that they dominate the image, which is less easy to do for things like glass and holograms.
    The second solution is that we simply sort objects into two lists based on whether they're beyond a certain (situationally-varying) distance. Things beyond that distance are drawn before the DOF and motion blur calculations, things nearer than that plane are left blur-free. While this clearly isn't a perfect solution, nearby transparencies tend to be more problematic when they blur with what's behind them, and more distant transparencies are problematic if they remain crisp when objects around them are blurred. This second solution is already working at the render end, but we still need to hook it up to the systems that will control it.
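The second solution reduces to partitioning transparent draws around the DOF pass. A minimal sketch (per the above, the logic controlling the split distance is the part still to be hooked up; object names are invented):

```python
def split_for_dof(transparents, split_distance: float):
    """Partition transparent (name, distance) draws around the DOF pass:
    distant ones draw before DOF/motion blur (so they get blurred),
    near ones draw after (so they stay crisp). Illustrative only."""
    pre_dof  = [name for name, dist in transparents if dist >= split_distance]
    post_dof = [name for name, dist in transparents if dist <  split_distance]
    return pre_dof, post_dof

draws = [("helmet_glass", 0.3), ("ship_canopy", 40.0), ("distant_hologram", 300.0)]
pre, post = split_for_dof(draws, split_distance=50.0)
print(pre)   # ['distant_hologram']
print(post)  # ['helmet_glass', 'ship_canopy']
```

The helmet glass centimetres from the camera stays sharp while the distant hologram blurs with its surroundings, matching the tradeoff Ben describes.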

    Keep 'em coming!
    Ben
    Programmer - Graphics Team
  • Senkan

    Posts: 57

    Posted:
    Hi,

    Thanks for answering so many questions in this thread. I was wondering about global illumination. The last time you talked about it, it wasn't locked in as a feature, but you were starting to really need it. Have your thoughts on it changed? From what you said, the current implementation samples very sparsely placed points, I believe at planetary scale. What implementations and approaches are you leaning toward, if applicable? What benefits are you looking for?

    I'm just trying to pick your brains on this topic which is pretty interesting to me :).

    The best to you all, thanks for your great work.
  • MrBobarian

    Posts: 1

    Posted:
    Hi! We saw a sneak peek of your ID masking and general masking system in a recent ATV. I would love to hear a bit more about how that works. I assume you make the masks outside the engine and import them. Also, to my knowledge you could either use an ID texture mask to mask out different materials or save the material sets from your 3D package; how do you go about this?
    Thanks in advance ;)
  • elec

    Posts: 16

    Posted:
    3 short questions:
    - Will the game have explicit multi-GPU support later on? (It should, when we think about big simulator screens and such in the future.)

    - Will we be able to change the simulation speed of single-player parts like SQ42 or Arena Commander on the fly? (To make epic-looking videos together with the advanced camera options and TrackIR.)

    - Will the game support NVIDIA Ansel? (A super-resolution screenshot/render tool; here are some examples from GR: Wildlands.)

    thank you :)