PROGRAMMING (Engine, API, Hardware, etc)

  • BParry_CIG

    Developer

Posted:


That's awesome too, but I actually mean the pitch ladder in the center of the HUD, as you can see in this video @ 6:15.


    It adjusts if you're over a hill or whatever so you can align your ship to the terrain all without having to look in another direction.

Ah, I see it. Yeah, the UI folks there really thought through what they needed to communicate. I don't know what our plans are in that area, since whatever they are, they don't seem to need any weird shaders or rendering pipelines.

    Programmer - Graphics Team
  • BParry_CIG

    Developer

Posted:
Edited: by BParry_CIG

Hey, I was curious: is the team making any use of screenspace shadows currently? Particularly with some of the shadowing detail issues on character faces, they would probably help clear up some of those problems. I'm not sure which CryEngine base you're working from, though, since varying versions of the feature have appeared in CryEngine in the past.

Shadowing detail is something that @geoffbirch has put some time into during the last couple of months (we now call him Shadow Man). The upshot is that we're not yet using screenspace shadows, though we've discussed it and possibly will in the future; whether that would be done by integration or not depends on whether they've made them work for anything other than the sun.
One thing with any screenspace effect is that it immediately breaks when the important features are off-screen, so we're determined that screenspace shadows should add the final touches to the image, not fix any glaring problems. That's why, in the shorter term, we've thrown out all the old shadow bias calculations (since they were mostly arbitrary, unexplained numbers being added and multiplied into things that shouldn't directly affect the outcome) and started over with sensible calculations that take into account slope, texel size, softness, and the other factors that are actually relevant.
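As a rough illustration of that kind of principled bias (a minimal C++ sketch, not CIG's actual code; the names and constants are assumptions), the bias can be derived from the receiver's slope, the shadow-map texel footprint, and the filter softness rather than from magic numbers:

```cpp
// Illustrative sketch of a principled shadow depth bias (not CIG's code;
// names and constants are assumptions).
#include <algorithm>
#include <cmath>

// worldTexelSize : world-space width of one shadow-map texel at the receiver
// NdotL          : cosine of the angle between surface normal and light direction
// filterRadius   : softness of the shadow filter, in texels
float ComputeShadowBias(float worldTexelSize, float NdotL, float filterRadius)
{
    // Depth error across a texel grows with the receiver's slope relative to the light.
    float cosAngle = std::clamp(NdotL, 0.05f, 1.0f);
    float slope    = std::sqrt(1.0f - cosAngle * cosAngle) / cosAngle; // tan(angle)

    // A wider (softer) filter samples a wider footprint, so it needs more slack.
    float slopeBias = worldTexelSize * slope * (1.0f + filterRadius);

    // Small constant term to cover depth-buffer precision.
    const float kConstantBias = 1e-4f;
    return kConstantBias + slopeBias;
}
```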
    Programmer - Graphics Team
  • MCorbetta_CIG

    Developer

Posted:

I have a question about the potential large water surfaces on a planet. I know planetary water is at an early stage of development, as mentioned in AtV 3.11.

I think some form of this technique could enhance large water areas a lot.

Hi, yes, we have plans to change water color and wave strength based on water depth and other properties, and to reduce tiling at distance.
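As a rough illustration of the kind of depth-based blending described here (a hypothetical sketch, not CIG's shader; colours and constants are assumptions):

```cpp
// Hypothetical sketch of depth-based water shading (illustrative only,
// not CIG's shader; colours and constants are assumptions).
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 Lerp(Vec3 a, Vec3 b, float t)
{
    return { a.x + (b.x - a.x) * t,
             a.y + (b.y - a.y) * t,
             a.z + (b.z - a.z) * t };
}

// waterDepth: metres of water below the surface at the shaded point.
Vec3 WaterColour(float waterDepth)
{
    const Vec3 shallow = {0.10f, 0.45f, 0.40f}; // greenish near the shore
    const Vec3 deep    = {0.02f, 0.08f, 0.20f}; // dark blue offshore
    float t = 1.0f - std::exp(-waterDepth * 0.15f); // absorption-style falloff
    return Lerp(shallow, deep, t);
}

// Waves damp out as the water gets shallow, so the shoreline stays calm.
float WaveAmplitude(float waterDepth, float openWaterAmplitude)
{
    return openWaterAmplitude * std::min(1.0f, waterDepth * 0.25f);
}
```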

    Cheers,
    Marco

Going off of this answer: have you looked into, or seen, the work that has gone into this particular engine (Unigine) with regard to water simulation? Though I understand that just because one engine can do something well, it doesn't mean another can.


    https://developer.unigine.com/en/devlog/20160317-unigine-2.2

Hi, the water in the video looks great. As mentioned, we will be improving our planetary water and shorelines.

    Cheers,
    Marco

  • BParry_CIG

    Developer

Posted:

    I wish my eyes could unsee these things... but in the recent Star Marine footage... This damnable motion blur inconsistency still pops up! :D
    vlcsnap-2016-11-19-15tfs0s.png
    vlcsnap-2016-11-19-15kgsqn.png
    Only the helmet and gun! hehe

What do you guys think causes that, exactly, @geoffbirch / @BParry_CIG? Interestingly, motion blur does apply to the view-model hands/legs/etc. in the first-person camera; only the third-person camera and other character models seen in first person have this issue. Are those things not generating motion vectors? Is that damnable "no motion blur" flag still checked for them? I do wonder... as the motion blur is rather sublime looking on the objects it actually applies to.


How about the velocity of limbs relative to the main body? Clearly they are moving independently of, and probably in addition to, the overall movement.

It is hard to say what it is exactly, but it seems to be localised to tagged objects - discrete models. I presume the gun and helmet are loaded separately and attached to the player model differently than the armour overlays and the fleshy parts.
    jump_motionblur234sk4.gif

Like, even though there is obvious movement and large, exaggerated motions with big arcs, you can see it only applies to the helmet and the gun, even though everything should be rather motion blurred given the amount of movement. You should see it super obviously on his arms and legs as he vaults... and then all over his body as he falls. Yet... it's just on the helmet and gun model.

Yet at the same time, motion blur is applied to those model parts as first-person assets (which are the same as those in third person, given how the game uses one universal rig). Like in the .gif below, you can see it applying to first-person hands, where it does not in the .gif above.
    reload0068fws2j.gif


Yet at the same time, other player models (those other than your own) seen from the first-person view have the same aforementioned issue, where motion blur only applies to hands and helmets.

    Offhand, I wouldn't like to guess. Something's obviously wrong, and it's wrong at or before the point that motion vectors are drawn to the screen. Next time I pass something that's doing it, I'll try to get a RenderDoc capture of it, since that'll tell us if it's writing no vectors, doubling them somehow, or whatever. Trying to guess which it is from the end result just tends to result in confirmation bias.
    Programmer - Graphics Team
  • BParry_CIG

    Developer

Posted:

This effect is some sort of legacy of the original CryEngine... there it was happening all the time.

You can find it in other places - I find this blurring "disgusting".
It happens in quantum travel and when wheels spin... everything that exceeds a certain "speed" starts to blur.
I just hate it. -.-

I suspect it was put into CryEngine to hide precision issues when dealing with large momentum changes,
probably because of networking issues that arose when predicting the position of attached items - especially when player-controlled.
(In my experience it never happens to the main object... always to its attachments.)

    Hi @Mavy2143!
    I think I know the effect you mean in quantum travel. In truth, that tends to be a case of things being told not to motion blur, oddly enough. The game knows the camera is moving at ludicrous speed, so if an object doesn't record a vector to say that it, too, is travelling at ludicrous speed, the renderer assumes it must be a stationary object that you're hurtling past. I imagine this was less noticeable in games where you were largely walking or driving around on a stationary landscape.
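As a rough sketch of the mechanism being described (illustrative C++ only, not CIG's renderer; the names are assumptions): each object writes a per-pixel velocity as the difference between its current and previous screen positions, and anything that skips that write is blurred as if it were static scenery.

```cpp
// Illustrative sketch of per-object motion vectors (not CIG's renderer;
// names and types are assumptions).
struct Float2 { float x, y; };
struct Float4 { float x, y, z, w; };

// clipPosCurr / clipPosPrev: the same vertex transformed by this frame's and
// last frame's world-view-projection matrices.
Float2 ComputeMotionVector(const Float4& clipPosCurr, const Float4& clipPosPrev)
{
    // Perspective divide to normalised device coordinates, then take the delta.
    Float2 ndcCurr = { clipPosCurr.x / clipPosCurr.w, clipPosCurr.y / clipPosCurr.w };
    Float2 ndcPrev = { clipPosPrev.x / clipPosPrev.w, clipPosPrev.y / clipPosPrev.w };
    return { ndcCurr.x - ndcPrev.x, ndcCurr.y - ndcPrev.y };
}

// Any object that skips this write falls back to the camera-only reprojection
// delta, so during quantum travel it blurs like scenery streaking past rather
// than like an object travelling with you.
```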

The skin of the characters in Star Citizen still seems, in some cases, to have a plasticky or glossy look.
Is it an issue with subsurface scattering?

    skinrdw4hen23l.jpg

I have seen a video about Separable Subsurface Scattering for video games, which gives really natural results.



    The link to the documentation:

    https://users.cg.tuwien.ac.at/zsolnai/gfx/separable-subsurface-scattering-with-activision-blizzard/

    Hi again, @Speedbeat. You'll be glad to hear that CryEngine already had a screenspace subsurface scattering pass when we got it. Ali's looked inside it and I don't know if he made any changes in the end.
    The reason some characters look a bit too shiny is... probably that they're actually a bit too shiny. When I last turned on our verification overlay, several characters had drastically different specular levels and gloss on them. We've talked to art about it, and thankfully we took single-light reference photos of a few of the actors, so we've got a ground truth image to home in on.

    With the recent move towards showing us the current production milestones / progress into the development of Star Citizen, I noticed that they haven't mentioned much about the Engine / Vulkan work up on the page: https://robertsspaceindustries.com/schedule-report

Is this just because it's not aligned with a particular release (i.e. it's a "when it's ready" feature)? Can we look forward to hearing about how this work is progressing in future reports?

    Hi @Notavi,
    Sorry to say I've got no further info on the Vulkan work. The guy driving it is heavily involved in the view culling tech (as well as a bunch of other vital systems) so I'm sure you can imagine that takes priority.
    Programmer - Graphics Team
  • BParry_CIG

    Developer

Posted:

I was watching the movie "Silent Running" (1972), and in an exterior shot showing the protagonist's spaceship you could see the sun in the distance; however, it looked to me as if the ship was dimly lit (it was orbiting Saturn). Maybe it was just that one scene, but...

    I am wondering: Will the illumination on objects coming from the sun appear to be stronger closer to the sun, and weaker when objects are much farther away from the sun?

Admittedly, I have no idea what might be realistic in terms of level of luminosity. Presumably the sun would appear dimmer from a great distance, out at the edge of a star system.

    Hi @realspacehobo,
    It's certainly technologically possible to vary the brightness correctly by distance. When you run the numbers though, you tend to suddenly realise that the sun's disappointingly dull out near Saturn, and way brighter than you really wanted it, even at Earth orbit, let alone closer. So people tend to throw some artistic license at it, some custom curve that makes things a bit darker in the outer system, a bit brighter in the inner, but nothing that will wreck everything.
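For a sense of the numbers (a back-of-envelope sketch; the artistic remapping curve is purely an assumption, not CIG's): physical irradiance falls off with the inverse square of distance, so Saturn at roughly 9.5 AU receives about 1/90th of the sunlight Earth does.

```cpp
// Back-of-envelope sketch: inverse-square sunlight vs. an artistic remapping
// (the remapping curve is an assumption, not CIG's).
#include <cmath>
#include <cstdio>

int main()
{
    const float kEarthIrradiance = 1361.0f; // W/m^2 at 1 AU (the solar constant)

    for (float au : {0.4f, 1.0f, 5.2f, 9.5f}) // ~Mercury, Earth, Jupiter, Saturn
    {
        float physical = kEarthIrradiance / (au * au);

        // Hypothetical artistic curve: compress the huge physical range so the
        // outer system is only "a bit darker" and the inner system "a bit brighter".
        float artistic = kEarthIrradiance * std::pow(1.0f / au, 0.75f);

        std::printf("%4.1f AU: physical %6.0f W/m^2, artistic %6.0f\n",
                    au, physical, artistic);
    }
    return 0;
}
```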

So this year there are new games coming to PC with a sort of upscaling rendering method; will SC implement such a method?


Quoting myself to re-elaborate my question: will SC implement any kind of checkerboard rendering or similar tech? Is it even under consideration?

    Checkerboard isn't something I'd heard of @napoleonic, thanks for bringing it to my attention!
    Though temporal techniques can be a pain, and cause a lot of fiddly bugs (looking at the Rainbow Six Siege presentation, they had to impose quite a few art and effects limitations to make it play nice), their results are hard to ignore.
Counterbalancing this, we probably couldn't use precisely their tech: one of the reasons they probably got good results is that they use a clustered renderer rather than a tiled one. This means that the bulk of their lighting work is done by the pixel shader, rather than as a postprocess. We could maybe do a similar thing at the postprocess stage, but I couldn't say offhand what the tradeoffs would be.
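For context, here is the core idea of checkerboard rendering in isolation (a minimal sketch; the pattern choice and the omitted reconstruction step are assumptions, not a description of Ubisoft's or CIG's implementation):

```cpp
// Minimal sketch of the checkerboard selection itself (illustrative only;
// real implementations also reproject and validate last frame's samples).
#include <cstdint>

// Shade pixels where (x + y + frame) is even; the other half is filled in from
// the reprojected previous frame, which shaded the complementary set.
inline bool ShadeThisFrame(uint32_t x, uint32_t y, uint32_t frameIndex)
{
    return ((x + y + frameIndex) & 1u) == 0u;
}

// Roughly half the shading cost per frame; the hard part (not shown) is
// detecting disocclusion and fast motion where the old sample is invalid.
```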

I've read a lot recently about AI being better than humans at:
Superhuman picture recognition http://www.businessinsider.com/googles-photo-recognition-system-has-superhuman-abilities-2016-2?r=UK&IR=T&IR=T
Beating human speech recognition http://www.technewsworld.com/story/84013.html
Beating humans at lip reading https://www.technologyreview.com/s/602949/ai-has-beaten-humans-at-lip-reading/
Is it possible to optimize games with AI, or isn't the AI quite there yet?

    Caveat: I'm not an AI programmer.
While we've seen a lot of AI breakthroughs recently, they're mostly centred around improving pattern recognition to the point where the computer can reason about the world. With game AI, we're generally much more constrained on processing budget, but we can take shortcuts around the pattern recognition step: show an in-game AI an object and it can just query a table to find out what the object's name is and what it can do with it.
So, short form: probably not useful for Star Citizen, because we don't need to do what they're getting better at.
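To illustrate that shortcut (a hypothetical sketch, not CIG's AI code; all names and fields are made up): the game can hand the AI ground-truth object data directly, so there is nothing to recognize.

```cpp
// Hypothetical sketch of the "query a table" shortcut (not CIG's AI code;
// all names and fields are made up).
#include <string>
#include <unordered_map>

struct ObjectInfo { std::string name; bool isCover; bool isWeapon; };

// Ground-truth data the game already has -- no perception model required.
static const std::unordered_map<int, ObjectInfo> kKnowledgeTable = {
    {1001, {"Crate", true,  false}},
    {1002, {"Rifle", false, true }},
};

const ObjectInfo* Identify(int entityId)
{
    auto it = kKnowledgeTable.find(entityId);
    return it != kKnowledgeTable.end() ? &it->second : nullptr;
}
```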
    Programmer - Graphics Team
  • Sunrick

    Posts: 3

Posted:
We know AWS servers have been available in the US, but when will AWS servers be available in Asia? For now, ping is much, much higher than before.
Thanks.
  • Mad_Syringe

    Posts: 170

Posted:
    Can we get a reset button somewhere in game or on the website?

    Let me elaborate. There are several Item system related bugs in game atm.

    - The dreaded gun attachment missing bug.
    - Medpen showing on hand bug
    - Additional pistol showing on wrist bug
    - Gun mag on front of barrel bug

    And so on.
Specifically, the missing gun attachment bug is very annoying, since it carries over from the PU to Star Marine, which is obviously a big issue for that game mode. All these bugs are permanent, since they seem to be related to data in the item database of the account.

    So can we please have a button (ingame or on the website), that resets the progress in the PU, similar to a reset that you guys perform for our accounts with each major patch?

    This would solve several problems.
    First, customer service would have less demands for resets.
Second, we could repeatedly try to reproduce the issue, thus giving you more input on what is going wrong.
This might also help with other issues in the future, since there might be more bugs that occur once and are not reproducible if you can't revert to the state before the bug happened.

All in all, this should not be too much of a task to implement, but it would be a great relief for the many players suffering from these issues.

    Thank you very much in advance for any consideration.
    Cheers and keep up the good work.

  • Inventors

    Posts: 12

Posted:
This is a copy/paste of my discussion from Q&A because I'm not sure where to put it.


    Greetings

If nowadays fighter jets can get a lock on fast-moving targets from almost 100 km, why is it impossible to get a lock on a non-moving target in a spaceship from 20 km?
If you fire an AMRAAM (AIM-120C) missile from 40 km at a non-moving target, you will have a 100% success rate. But somehow, if you fire a missile in a spaceship (which uses much better tech) at another non-moving target from 10 km, you miss it? What the heck?

(This video proves it; he fires a torpedo, which should have even better guidance than usual missiles...)
You are unable to get a lock on a target that is 50 km away, as if 50 km were a big distance in space... If you can travel so easily from planet to planet, thousands of km away, why would you be unable to do something that today's jets can do?
I know, it is because you want to make a game, not a simulation. You want people to have some fun, to make it more like Star Wars, not like real-life situations.
And you want more close combat because you think it has more fun in it.
But then please do not say that you are making a sim.

What if I tell you that it is much more fun to fire a missile in BVR (beyond visual range) mode while working on tactics (with your wingman and other groups): how to avoid other missiles and outsmart your opponents, where to position yourself, in what formation to fly, which type of radar to use, which type of missiles, when to fire them, at which opponent (the closer but slower ones, or the further but faster one that has more dangerous weapons), how to communicate (#1 "Alpha 1, Alpha 2, enemy spike 2 o'clock" or #2 "Charlie, unknown contact, BRA 120/70, 60 clicks, 2 angels above you. Can you get a positive ID? Over." "Alpha, this is Charlie, confirming enemy contact. Requesting permission to engage. Over." "Charlie, Alpha, negative, wait for further instructions. Over and out." In close combat it's like "f**k I'm hit. And again. Bro, somebody's on your 6. ahhh respawning...") etc. There are many things you have to take into account. And just training them, if you want to be a fighter pilot in a space sim, is fun.

If you are somebody who calls your buddy fighters from your org for help, it would be fun just to listen to them while they are coming to defend you.

    Please consider this.

    Thank you and goodbye

  • Jorseal

    Posts: 52

Posted:
    First off, Happy Holidays to you all.

In addition to what was said above about AI, and with the new Lumberyard networking code, would this perhaps make it possible in the future to enable a live-connected "Server Client"?
This would then act as an addition to the live universe by calculating background AI behaviour and actions (subsumption support?), universe physics (planets/asteroids/space stations), or maybe some other use of the CPU time that participants could be offering - that little extra push to make more things possible in the 'verse.

For example, each "Server Client" would need a valid user account with an active game package, and you would enable it by requesting it on the website, much like activating a PTU account.
After that, with the new launcher yet to be made, you could select to play Star Citizen, Squadron 42, or the newly available "Server Client".
With the "Server Client", once connected to the live service, you would share an amount or percentage of RAM and CPU time. The live server would then check your hardware, bandwidth, and ping, and decide what you could be crunching as you become part of an available resource pool.
Though I would guess that if a "Server Client" suddenly disconnected, it should not cause live server issues.
Maybe let the calculations be done redundantly on more than one computer; the first response gets processed, and the redundant answer is used as verification or dropped.

    Maybe a silly idea and completely impractical or impossible...
    I just thought to share it because I would love to support in any way possible, even once live.
  • Dexio

    Posts: 834

Posted:
Edited: by Dexio
    Hi Programmers;
I follow the making of the game Rust and their engine, Unity.
They discovered an interesting behaviour in the tooling.
See this link: Optimization Tool in Rust
The relevant part is the section about optimization.

    Also see this link: https://trello.com/b/UW0P1eK7/rust-optimization

My question is: how does Lumberyard (CryEngine) do this? Or does CIG have its own tool, like the one the Rust devs use?

    Cheers
    //Harri
  • Valdore

    Posts: 167

Posted:
Edited: by Valdore
    Hello Devs! :D

A new sync technology (FreeSync 2) is on the horizon, and it looks very promising in terms of efficiency and getting rid of a whole bunch of latency/input lag, while giving game developers the ability to calibrate their game for the monitor instead of leaving it to the user (which 99% of users don't do anyway).

    A short summary can be found here.

    Do you plan on adopting such technology?

    Cheers!
  • Sentient_Mind

    Posts: 3

Posted:
    Regarding M.2 Drives with high transfer Rates.

We are now getting M.2 drives with theoretical transfer rates of 2,000+ MB/s, compared to the ~500 MB/s of standard SATA3 SSDs.

Disappointingly, however, we are not currently seeing significant gains in game load times between M.2 and SATA3 SSDs.

I am curious whether Star Citizen's code could be optimized to better take advantage of M.2's faster transfer rates. Is this something already in development?
  • Johan_26

    Posts: 51

Posted:
    Heya, rendering question.

Since Star Citizen now runs on Lumberyard, we know that Amazon added "true" HDR video output.

And I am not talking about the engine's internal renderer, which is already HDR.

What is the current statement regarding that point?
You obviously know the benefits of having a 10-bit image in very dark places, which are quite common in space.

Is the game already displayed in HDR/10-bit in 2.6, or do we need to wait?
I know HDR monitors are still rare, but since this game aims for the best graphics, this is obviously a crucial point.
  • B1acksh33p

    Posts: 5

Posted:
    Will there be an API in the future to access trading data?

I bet websites will spring up to plan trading routes once the universe becomes larger. The question is whether CIG will support these by providing an API to access trading market data, or whether they will have to collect the required data on their own (e.g. with a client parsing screenshots of trading screens using OCR).

Or does CIG plan to provide such tools itself?
  • Klexmoo

    Posts: 18

Posted:
Edited: by Klexmoo
Are there plans to add a congestion control layer to the launcher, apart from manually setting download speeds (choices which really don't make much sense - there are very few of them)?

Since the download is done over UDP, it easily congests a 1 Gbit network because no congestion control is done.

    Protocols like the Micro Transport Protocol (µTP) could be used much like with torrent clients, which utilize the available bandwidth very well without creating packet loss for the user.

The algorithm (LEDBAT) is introduced here: https://tools.ietf.org/html/rfc6817
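For reference, the core of that approach is a delay-based window update; below is a simplified sketch of the RFC 6817 idea (illustrative only, not the launcher's code; TARGET and GAIN follow the RFC's suggested values, and the packet size is an assumption).

```cpp
// Simplified sketch of LEDBAT-style congestion control (RFC 6817);
// illustrative only, not the launcher's implementation.
#include <algorithm>

constexpr double kTargetDelayMs = 100.0;  // queuing delay the sender may add
constexpr double kGain          = 1.0;
constexpr double kMssBytes      = 1452.0; // assumed packet payload size

struct LedbatState {
    double cwndBytes   = 2.0 * kMssBytes; // congestion window
    double baseDelayMs = 1e9;             // lowest one-way delay observed
};

void OnAck(LedbatState& s, double oneWayDelayMs, double bytesAcked)
{
    s.baseDelayMs = std::min(s.baseDelayMs, oneWayDelayMs);

    // Queuing delay = measured delay minus the base (propagation) delay.
    double queuingDelayMs = oneWayDelayMs - s.baseDelayMs;

    // Grow while below target, shrink while above; growth is at most one MSS
    // per RTT, so the transfer yields to TCP and to interactive traffic.
    double offTarget = (kTargetDelayMs - queuingDelayMs) / kTargetDelayMs;
    s.cwndBytes += kGain * offTarget * bytesAcked * kMssBytes / s.cwndBytes;
    s.cwndBytes  = std::max(s.cwndBytes, 2.0 * kMssBytes);
}
```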
  • Viscerous

    Posts: 51

Posted:
Edited: by Viscerous
Are there any plans for efficient temporal rendering methods? This probably ties in slightly with anti-aliasing; however, Ubisoft managed to get a nearly 'free' performance increase of up to 60% simply by implementing checkerboard/temporal interlaced rendering.

    Nvidia wrote a bit about it in their Watch Dogs 2 performance guide HERE, along with image comparisons and performance figures.

    They also link a presentation from Ubisoft on an older implementation of it for Rainbow Six: Siege, HERE (page 45-65).

    VeytBUQ.jpg

This seems like a good thing to have, considering the game's resource-heavy budgets.
  • T-Rah

    Posts: 18

Posted:
Might be a silly question, but will the Oculus Rift/Touch be usable alongside a HOTAS? Oculus Rift/Touch for out of the cockpit, Rift/HOTAS when in your ship. Probably a bit off your main track right now, but I wanted to ask.
  • Gr1m_Reaper

    Posts: 2

Posted:
Edited: by Gr1m_Reaper
Hi,

I would like to know if eye tracking will be supported, especially for controlling gimbal-mounted weapons along with a joystick/HOTAS. In detail: today's attack helicopters use eye or head tracking to define the alignment of the turret-mounted weapon while the stick is used for flight manoeuvring. Is it planned to support simultaneous use of a HOTAS/stick and eye tracking? For example, you would control your ship with your stick, aim your gimbal-mounted guns with eye tracking, and fire your guns with the fire button on your stick.

    Best regards!
  • Duke_Dirty_Work

    Posts: 304

Posted:
Hello, I was wondering if there are plans for algorithmic river and lake propagation in the planetary tech? No Man's Sky didn't have any rivers or any bodies of water other than a global ocean sphere at sea level.
  • Valdore

    Posts: 167

Posted:
Edited: by Valdore
Hello dear devs! Maybe you know the following channel, maybe not - but it may be worth a look for the coders building water effects and all-around fluid simulations:



    My question: Are you guys interested in such papers, or do you already have a performance/quality solution that fits your needs?

    Cheers,
    the tooth-fairy.
  • BParry_CIG

    Developer

Posted:
Edited: by BParry_CIG

    How's the giant gas cloud tech going? Is it done, tabled, or is work ongoing? It has been a while since we've heard about it.

    Hi @alienwar!
    There has been a bit of progress with the gas clouds, but it's been tabled for a bit due to some really hard lighting tech problems I'm working on instead.


    Thanks Ben,

Are some lighting improvements planned, for instance regarding global illumination?
I like the natural look of the Frostbite engine in the Battlefront spaceship battles.




    Edit:

The Big Guns of the UEE trailer was made in CineBox or a tool like it, because the lighting can't be in-game, right? ;)

    Hi again, @Speedbeat. There's been a lot of discussion about improving the global illumination beyond the very sparse sample points we have at the moment. I wouldn't call it a guaranteed feature at this point, but it's definitely well above "would be nice" and moving into "we really need this".
    As for the Big Guns trailer, as far as I know we do everything in-engine, rather than in CineBox. If you see any CIG-specific tech (or bugs, I guess :D) in a video, you can be pretty sure we couldn't have done it in CineBox. Eg. if you see a planetary fly-over that looks like the ones we've shown, that's pretty much got to be in-engine.

    Just read an article on "NaturalVision ✪ Photorealistic GTA V 2.0" https://gta5-mods.com/misc/naturalvision-photorealistic-gtav

    kFxwpo7.jpg

    Just wondering if this could eventually be applied to Star Citizen

    Hi @CooperJCW,
    The page you linked to lists a bunch of tools I'm not familiar with, but they're all essentially tools to allow third-parties to replace textures and shaders in an already-shipped game. So I guess, in theory, someone will want to replace our shaders and will make that possible.
    In terms of the things they're actually putting in there, there's nothing in particular that they call out as unusual tech.

Hey, I was curious: is the team making any use of screenspace shadows currently? Particularly with some of the shadowing detail issues on character faces, they would probably help clear up some of those problems. I'm not sure which CryEngine base you're working from, though, since varying versions of the feature have appeared in CryEngine in the past.

Shadowing detail is something that @geoffbirch has put some time into during the last couple of months (we now call him Shadow Man). The upshot is that we're not yet using screenspace shadows, though we've discussed it and possibly will in the future; whether that would be done by integration or not depends on whether they've made them work for anything other than the sun.
One thing with any screenspace effect is that it immediately breaks when the important features are off-screen, so we're determined that screenspace shadows should add the final touches to the image, not fix any glaring problems. That's why, in the shorter term, we've thrown out all the old shadow bias calculations (since they were mostly arbitrary, unexplained numbers being added and multiplied into things that shouldn't directly affect the outcome) and started over with sensible calculations that take into account slope, texel size, softness, and the other factors that are actually relevant.
Fair enough! All reasonable things to keep in mind, certainly. Have you considered using screenspace shadows for more organic use cases, particularly where casting only from the sun isn't as big an issue - like planetary environments, especially those with high detail and large amounts of vegetation, etc.?

I noticed SonicEther (of Minecraft shader modding fame) recently showed off some screenspace shadow integration work he was doing in Unity 5, and in environments with lots of vegetation, grass, etc., the improvements in detail, depth, and overall scene quality are truly remarkable. His version actually appears to work with POM too - which SC is unusual (and awesome) in making liberal use of - so you get even more detail there.

    se-screen-space-shadows-grass-off.jpg
    se-screen-space-shadows-grass-on.jpg
    sessshadows1off.jpg
    sessshadows1on.jpg

    POM Results:

    sesssUBER0off.jpg
    sesssUBER0on.jpg
    sesssUBER1off.jpg
    sesssUBER1on.jpg

    Forum Page discussing his WIP results: https://forum.unity3d.com/threads/wip-se-screen-space-shadows.441909/
    Shader listing on his website: http://www.sonicether.com/screenspaceshadows/
    Hi @MittenFacedLad, if we were to use screenspace shadows, I'd expect it to be applied as a post-process against everything on screen that has opaque pixels, so you'd expect it to affect organics, characters, etc equally.

Not too long ago, I think I heard in a video from CIG that anti-aliasing didn't currently work with parallax occlusion mapping (edit: in-game, I mean). I wonder, is there some progress on this issue?

Note: I suppose this might already have been fixed. I'm unsure exactly when I heard about this.

    Hi @realspacehobo,
    While it's true that POM screws up traditional multisample AA (because it antialiases triangle edges, and POM edges don't have triangles), there's never really been much chance of getting MSAA working due to the additional memory cost and shading complications. The AA we do have is a post-process effect, so it should be able to antialias POM just as well as non-POM.
    That said, heavy use of POM, when viewed from a shallow angle, does tend to suffer from worse aliasing to begin with, so it might just generate too much sparkle for the AA system to make any sense of, in some cases.
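To make the distinction concrete, a post-process AA pass classifies edges purely by image contrast, so a POM silhouette counts just as much as a triangle edge. A minimal, FXAA-style sketch of that edge test (illustrative only, not CIG's implementation):

```cpp
// Illustrative, FXAA-style edge test (not CIG's post-process AA): it looks
// only at contrast in the final image, so POM edges and triangle edges are
// treated identically.
#include <algorithm>

struct Float3 { float x, y, z; };

// Perceptual luminance of a colour; applied to the centre pixel and its
// four neighbours before calling IsEdgePixel.
static float Luma(const Float3& c) { return 0.299f * c.x + 0.587f * c.y + 0.114f * c.z; }

bool IsEdgePixel(float lumaCenter, float lumaN, float lumaS, float lumaE, float lumaW)
{
    float lumaMin = std::min({lumaCenter, lumaN, lumaS, lumaE, lumaW});
    float lumaMax = std::max({lumaCenter, lumaN, lumaS, lumaE, lumaW});
    // High local contrast means "edge", regardless of what produced it.
    return (lumaMax - lumaMin) > std::max(0.0312f, lumaMax * 0.125f);
}
```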

With Lumberyard being our supporting tech going forward, is this going to change how Star Citizen is delivered? I'm thinking specifically in terms of where the client game is processed and run. I hope we're still going for a 50/50 split in game and client authority, with the game running off my CPU and GPU, rather than changing to a streaming "game by remote view" model. The whole reason I backed Star Citizen was because the game is pushing the hardware I own rather than streaming a video of someone else's.

    Don't worry, @Paulus1978, it's very unlikely we'd suddenly go down that streaming video route.
    Programmer - Graphics Team
  • BParry_CIG

    Developer

Posted:

    Hi:

Will there be support for ultra-HD/ultrawide screens? I have a 3440x1440 display, and some ship, Star Marine, and racing interfaces don't appear completely.
It's curious that sometimes the aspect ratio works correctly, but only about 1 time in every 10. Is this normal? Will it get better in the next patches?
And I was playing at 40-50 fps with CryEngine; now I'm playing below 30. Is this normal?

    Thanks a lot =)

    Hi @Gwyr, if your aspect ratio is wonky on one run of the game, but then it's right on another, that's definitely a bug (one we're unaware of, I think) and you should report it. There's no reason that a wide ratio monitor should be losing stuff.

    @BPARRY_CIG

Rb6 isn't what I had in mind, though; I was thinking about more graphics-heavy games like Quantum Break and Watch Dogs 2, the latter of which seems to have gained praise for its use of temporal reconstruction, both for quality and for performance.

    Thanks, @napoleonic, I'll definitely take a look at those too. It'd be nice to have one piece of tech that lets us hit performance targets on both low-spec and high-spec-at-4k machines.

    (...)

    Caveat: I'm not an AI programmer.
While we've seen a lot of AI breakthroughs recently, they're mostly centred around improving pattern recognition to the point where the computer can reason about the world. With game AI, we're generally much more constrained on processing budget, but we can take shortcuts around the pattern recognition step: show an in-game AI an object and it can just query a table to find out what the object's name is and what it can do with it.
So, short form: probably not useful for Star Citizen, because we don't need to do what they're getting better at.

    maybe the following article makes some interesting background reading:

    source article

    New artificial intelligence beats tactical experts in combat simulation

    Artificial intelligence recently won out during simulated aerial combat against U.S. expert tacticians. Importantly, it did so using no more than the processing power available in a tiny, affordable computer (Raspberry Pi) that retails for as little as $35.
    -snip-

    ALPHA and its algorithms require no more than the computing power available in a low-budget PC in order to run in real time and quickly react and respond to uncertainty and random events or scenarios.

    -snip-

    To reach its current performance level, ALPHA’s training has occurred on a $500 consumer-grade PC.

    Ah, @ElkarDyn, this is pretty sweet to hear about. Unfortunately, it's wandered well outside my area of expertise so I now can't give you any good answers :(

With the new Lumberyard engine and its easier-to-use integration functions, will we be seeing more talk of possibly moving back toward VR support for this game? My dream is Star Citizen, a VR treadmill, and a VR headset.

    Hi @SaturnSquared. Sorry to say, do not hold your breath for this. Ignoring the render tech for VR itself (which given the work we've done, would definitely be a read-and-rewrite job, not a merge-this-file job), making a game properly VR compliant takes a lot of work at the design and testing level regardless of the engine used. We'd probably need to get the framerate up a bit higher too, come to think of it.

    Hi @realspacehobo,
    It's certainly technologically possible to vary the brightness correctly by distance. When you run the numbers though, you tend to suddenly realise that the sun's disappointingly dull out near Saturn, and way brighter than you really wanted it, even at Earth orbit, let alone closer. So people tend to throw some artistic license at it, some custom curve that makes things a bit darker in the outer system, a bit brighter in the inner, but nothing that will wreck everything.

From what I know, in most games and engines the sun is a single directional light that affects the entire scene, and then there are point lights strewn across the map for lamps etc. Is the sun in the Star Engine now a point light with a massive radius, or is it a directional light whose angle and intensity are determined by the position of the sun object in relation to the camera?
Hi @Oberscht, most games do indeed do that, and for the moment so does our engine. It's something I want to change once you're getting near enough that the sun's a fairly big disc on screen, since at the moment all the highlights from it will be too thin. Luckily, in this case, it's only a handful of lines of code that would need to be swapped over.
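As a rough illustration of the swap being described (a hypothetical sketch, not CIG's code): a directional light uses one fixed vector everywhere, while a positional sun derives the light vector per shaded point.

```cpp
// Hypothetical sketch of directional vs. positional sun lighting (not CIG's code).
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 Normalize(Vec3 v)
{
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len, v.y / len, v.z / len };
}

// Directional sun: the same light vector for every point in the scene.
Vec3 SunDirectional(Vec3 globalSunDir, Vec3 /*shadedPoint*/)
{
    return globalSunDir;
}

// Positional sun: the light vector (and, with distance, the intensity and the
// apparent disc size) varies per shaded point -- which is what starts to matter
// once the sun is a large disc on screen.
Vec3 SunPositional(Vec3 sunPos, Vec3 shadedPoint)
{
    return Normalize({ sunPos.x - shadedPoint.x,
                       sunPos.y - shadedPoint.y,
                       sunPos.z - shadedPoint.z });
}
```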
    Programmer - Graphics Team
  • BParry_CIG

    Developer

Posted:
Edited: by BParry_CIG

    Wait. Did you just confirm that the $12m goal is cancelled?

    Sorry for any misunderstanding, my point was that some of the key obstacles to VR support aren't about whether the engine has the technical capability for it. That kind of thinking leads to, well, this guy explains it better than I do. I'd prefer we don't accidentally and permanently ruin anyone's ability to enjoy VR.
    Programmer - Graphics Team
  • BParry_CIG

    Developer

Posted:

    A key point, though, is that if proper functioning in VR has not been part of the design process, doing it later will suck -- i.e., either be very costly or very crappy. I'm getting the impression that, despite the $12m goal, CIG has not been incorporating VR design into its work. If so, which will it be?

    a) Massive costs are coming down the line (better get cracking on chapters 2 and 3 of S42 to cover the cost).
    b) Lousy implementation is coming, eventually (details TBD, but hold onto your lunch).
    c) I'm misreading your message. Instead, "It's all good -- VR design is built in; we just have some more work to do that does not make sense to do at this point in development."
    d) VR is canceled (oops). We just don't want to admit it yet.

    Is there an 'e'?

    e) Many of the problems you find with VR, from my experience at least, are only detectable once you have the technological components in place. Some you kick yourself for not spotting at the design stage, but others are just... something looks unexpectedly flat, something else sends you crosseyed when you try to focus on it. Some piece of camera work reliably makes people queasy, while another sequence that breaks all the best-practice guidelines somehow doesn't.
    Programmer - Graphics Team
  • BParry_CIG

    Developer

Posted:
ZET | ZET said:

    In other words "Game should be finished first or in a finalized stage because implementing VR that early can be a waste of time or resources and possibly bad overall support of that device and experience for all VR players.

    I wouldn't say finished. It's a balance. It certainly adds an extra % time tax to every feature that involves drawing things or player interaction, for specialised testing and the inevitable bug fixing.
    Programmer - Graphics Team
  • BParry_CIG

    Developer

Posted:

    Wait. Did you just confirm that the $12m goal is cancelled?

    Sorry for any misunderstanding, my point was that some of the key obstacles to VR support aren't about whether the engine has the technical capability for it. That kind of thinking leads to, well, this guy explains it better than I do. I'd prefer we don't accidentally and permanently ruin anyone's ability to enjoy VR.
    @BParry_CIG

This relates directly to the question and discussion in these two threads (also posted above):

    https://forums.robertsspaceindustries.com/discussion/358446/devs-it-s-time-to-discuss-head-tracking-in-sq42

    https://ptu.cloudimperiumgames.com/spectrum/community/SC/forum/4/thread/so-will-sq42-release-w-o-any-form-of-head-tracking

    I second this question, especially considering that in the past CIG has said that headtracking would be part of the VR development process. If VR is far off in the future, can we expect separate progress on this?
Hi @Krel, I'm not sure where we're at with head tracking. It doesn't have quite the same need for deep integration with every part of the game as VR does, so it doesn't depend on VR. It also doesn't involve drawing stuff on the screen, though, so it's not something I'm in the loop for.
    Programmer - Graphics Team
  • Senkan

    Posts: 57

Posted:
Edited: by Senkan


    Hi again, @Speedbeat. There's been a lot of discussion about improving the global illumination beyond the very sparse sample points we have at the moment. I wouldn't call it a guaranteed feature at this point, but it's definitely well above "would be nice" and moving into "we really need this".
    As for the Big Guns trailer, as far as I know we do everything in-engine, rather than in CineBox. If you see any CIG-specific tech (or bugs, I guess :D) in a video, you can be pretty sure we couldn't have done it in CineBox. Eg. if you see a planetary fly-over that looks like the ones we've shown, that's pretty much got to be in-engine.

That sounds very exciting and promising. Is there any way I could get you to elaborate on your and the team's thoughts on possible GI solutions at this point? Nothing definite, just the kind of thoughts you've had, such as what qualities you were looking for, what you found superfluous, and so on. That's a pretty interesting topic to me, especially considering the single-bounce technique you already have on planets. So if you have time to spare, I'd appreciate the insight :).

    I also had a question about the animation systems and motion matching which I posted in the relevant thread but I trust that's not really your area.
  • BParry_CIG

    Developer

Posted:

    Hi Ben,
Oculus Rift support (for at least a limited portion of the game) was an early, and highly anticipated, stretch goal for Star Citizen ($12,000,000). Can you please clarify what you meant by "Sorry to say, do not hold your breath for this"? As a deep-into-concierge supporter, I am surprised and disappointed that this may be the stance CIG is now taking on the issue, and I would appreciate a more detailed response. Unfortunately, your quote below does nothing to actually divulge what you meant...


    Wait. Did you just confirm that the $12m goal is cancelled?

    Sorry for any misunderstanding, my point was that some of the key obstacles to VR support aren't about whether the engine has the technical capability for it. That kind of thinking leads to, well, this guy explains it better than I do. I'd prefer we don't accidentally and permanently ruin anyone's ability to enjoy VR.
Are you suggesting that you think CIG would do such a poor job that it will not be part of the process going forward? If so, I would expect this is something Mr. Roberts should have considered prior to making it an explicit stretch goal, or discussing its integration publicly on numerous occasions afterwards (see, e.g., http://www.roadtovr.com/star-citizen-to-refocus-on-vr-support-in-early-2016/), not to mention stating it would be supported on the ORIGINAL Kickstarter page (https://www.kickstarter.com/projects/cig/star-citizen/description): "Virtual Reality is here! We have backed Oculus Rift and will support it in Star Citizen / Squadron 42. Who doesn't want to sit in their cockpit, hands on your joystick and throttle, swiveling your head, to track that enemy fighter that just blew by?"

To the extent that you are correct on this issue, and that CIG will NOT be implementing this feature (or will only implement it in a severely limited or partial form), I think that CIG and Mr. Roberts should promptly, directly, and formally inform the backers, including myself.
    Hi there @Grindstone,
    I really did just mean "don't assume that just because of a bullet point engine feature, that it's suddenly right around the corner". Similar to if someone had asked whether the engine change means we might suddenly start releasing builds for consoles or something.
I'm a big fan of VR too, and as and when it gets to the front of the schedule, I hope I get to work on it.
    Programmer - Graphics Team
  • BParry_CIG

    Developer

Posted:
Edited: by BParry_CIG

Ben, what is confusing and concerning is that you also specifically admitted that "making a game properly VR compliant takes a lot of work at the design and testing level regardless of the engine used"... In fact, John Carmack (Oculus) has said as much, as have numerous other development articles on the subject (as does common sense). You also noted that doing it badly can end up ruining VR for the average user.

Now, despite the numerous statements by CR that this Kickstarter promise/stretch goal was going to be specifically addressed beginning in 2016, we're being told that this isn't being done, isn't even a priority... and in fact would "definitely be a read-and-rewrite job, not a merge-this-file job."

This isn't a question of whether the engine has the "capability" to do VR... Of course some version of CryEngine has the capability... The question is whether CIG is actually planning on implementing it in CIG's engine, and has been working in this "design and testing" stage to do so, so that the VR experience isn't disastrous for the average user.

    I apologize if I am holding my breath for this development, but that's what I was told was coming... repeatedly. As a lead, please address this issue internally.




    Hi there @Grindstone,
    I really did just mean "don't assume that just because of a bullet point engine feature, that it's suddenly right around the corner". Similar to if someone had asked whether the engine change means we might suddenly start releasing builds for consoles or something.
I'm a big fan of VR too, and as and when it gets to the front of the schedule, I hope I get to work on it.

    Hi @Grindstone,
One thing to get out of the way first - I'm not actually a lead, I'm just a noisy senior. Ali just doesn't like wading into discussions like this for some reason.
I should also stress that what the render team is working on is always in the monthly report. If we do some work on stereoscopic rendering, then we're going to tell you about it; generally the only stuff that doesn't go in there is stuff that's too boring to mention, like bug fixes.
Most of the stuff we're working on right now would also block other goals (stretch and non-stretch alike) if we didn't work on it, and we try to prioritise things that have a long tail of other work depending on them (so, for instance, a change to particle tech might affect everything the VFX team can do from that point onwards). In many cases, though, we're working on win-win stuff, e.g. things that'll clear out a big framerate sink; reworking things for picture-in-picture support will also tidy the render pipeline code up to the point where it's a lot easier to sensibly add a stereoscopic rendering feature.
The other teams, I expect, are balancing things similarly, but I can't (and won't) speak for them; it's not my place to do so. Similarly, while I'd love to be the guy to promise when any given feature will get worked on, I'm not the guy who decides that.


Many people have been saying for a while that fixed camera views during canned animations are going to be a problem, but SC still hasn't let us unlock the head and look around as we're climbing into or out of our ships. Having put a decent chunk of my career into testing ATP flight sims (not the games, the actual full-motion sims), there is nothing quite as nauseating as having a disconnect between what you're feeling and what you're seeing, and being unable to turn your view to re-orient. The only times I've gotten motion sick were when this happened in the sim.

Did E:D get lucky, or did they build it correctly from the beginning? Displays are legible, menus are nested accordingly, and the operating view from the cockpit is clear of obstacles (paging @RadiantFlux). A lot of us have been calling for fixes here for things we can see will be problems for gameplay in general and for VR specifically. It feels like these issues aren't being taken into consideration, and that can only mean either a kludgy implementation, one that requires a huge overhaul to accommodate, or the promise of VR being thrown out.

    I'd like to know which way we're heading.

Definitely agree that locked cameras in canned animations would ruin people; it's one of the things I'm pretty sure the animation team knows they'll have to fix. E:D got some things right by diligence, others by chance, and some others by regular nagging, as far as I remember.

    @BParry_CIG

Some of us who are VR users have been using every workaround we can find to play the current iteration(s) of Star Citizen with our VR headsets right now. For those who don't tend to have VR sickness issues, Star Citizen, even in its current unoptimized, buggy, laggy state, is still a wonderful thing to behold. For me, even the jerky seat entrance animations aren't a deal breaker.

Over time we have lost things like the stereoscopic console commands that make VR much easier to set up. Having to rely on frame shifting to get stereoscopic 3D is at best pretty crappy, but I still prefer to play that way when I can than to play on a monitor.

Is there any way that some of the console commands can be switched back on for users like us? At least the stereoscopic ones? I understand not wanting people to have a bad experience in VR, and I can see how not making it plug-and-play right now is for the best, but for those who are already willing to do whatever legwork is necessary to get a less-than-optimal experience in the game right now, isn't there something that can be done to make things better?

@lmac, I'll double-check about the stereoscopic console commands tomorrow. I think it's likely that they're broken right now; the rendering pipeline was in a very fragile state when we got it, so getting them working again would probably require someone to spend some dedicated time working out a clean solution. I don't know that for certain, though - they might just not be on the whitelist.
    Programmer - Graphics Team