PROGRAMMING (Engine, API, Hardware, etc)

  • BParry_CIG

    Developer

Posted:

    What are your thoughts on Nvidia's new "Fast Sync?" There have been reports of people experiencing the benefits of Vsync without any noticeable input lag. Could it be useful for Star Citizen?

    More info (scroll down a bit): http://www.guru3d.com/articles_pages/nvidia_geforce_gtx_1080_review,4.html

    "The experience that FAST SYNC delivers, depending on frame rate, is roughly equal to the clarity of V-SYNC ON combined with the low latency of V-SYNC OFF."

    Hi @Fushko,
    Sounds cool. Especially cool since it's probably a feature that doesn't mean any work for me.

    "Odds are that by the end of it, Mr Parry will have skipped the gloves and gone straight to semi-automatic weaponry."
  • BParry_CIG

    Developer

Posted:
Edited: by BParry_CIG

Here are some examples of realtime computer-generated clouds; I was wondering if the clouds you're currently researching will look similar to this.

The researcher who developed this is Antoine Bouthors.

    I've never seen realtime clouds in games look like this before! So beautiful!

    Thanks for this, I hadn't seen this one and the thesis looks meaty. A lot to go through, and no doubt a bunch of it won't be compatible with what we want, somehow, but it looks like it has a thorough treatment of scattering that might improve some of our calculations.

    Hello @BParry_CIG

I was wondering how exactly the current POM self-shadowing functions and what plans there are for it, given its now-ubiquitous use on basically every game object.

How exactly does POM work in the engine such that it can only self-shadow in direct sunlight and not under other shadow-casting lights? And is there a way for it to also self-shadow under indoor point-light or spot-light shadow casters? I imagine it could be a serious quality win given how many objects and assets in Star Citizen use it (the insides of ships especially...).

I remember Crysis 1 had self-shadowing POM from point lights... but that is ancient history in engine terms (all-forward rendering... DX9... etc.).

    Thanks for any response!

    POM self-shadow in its default implementation does some ray marching per pixel inside the material shader, and outputs a specific sun-shadow term into our G-Buffer. This has the upside that it's very high resolution where it works, and the downside that it doesn't scale to more than one light. We've been talking about a more generalised "do the shadow maps right" type solution, which would probably be more appropriate given how much of the game is not sunlit.
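The per-pixel ray march Ben describes can be sketched in miniature: from the surface point POM found, step toward the light through the heightfield and check whether anything rises above the ray. This is an illustrative toy (a 1D heightfield slice, invented names), not engine code:

```python
def pom_sun_shadow(heightfield, x, surface_h, light_dx, light_dh, steps=16):
    """March from the POM surface point toward the light through a 1D
    heightfield slice; return 0.0 (shadowed) if any sample rises above
    the ray, else 1.0 (lit). The real shader marches a 2D height texture
    and writes the result into the G-Buffer's sun-shadow term."""
    ray_h, xi = surface_h, x
    for _ in range(steps):
        xi += light_dx          # step across the heightfield toward the light
        ray_h += light_dh       # the ray climbs as it heads for the sun
        i = int(round(xi))
        if i < 0 or i >= len(heightfield):
            return 1.0          # left the local patch without being blocked
        if heightfield[i] > ray_h:
            return 0.0          # a bump sits between this point and the sun
    return 1.0

hf = [0.0, 0.0, 1.0, 0.0, 0.0, 0.0]         # a single tall ridge at index 2
print(pom_sun_shadow(hf, 4, 0.0, -1, 0.2))  # behind the ridge: 0.0
print(pom_sun_shadow(hf, 0, 0.0, -1, 0.2))  # clear line to the light: 1.0
```

Note how a single fixed light direction is baked into the march: that is exactly why, as Ben says, this approach doesn't scale past one light.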

Hello, I saw in the latest ATV (2.37), in the cloth creation segment, that you use Substance Painter. Have you integrated Substance support into your version of CryEngine?

    I do not believe so.

  • BParry_CIG

    Developer

Posted:

I have been thinking about network LODs, as they are getting mentioned a lot lately. Since network LOD is proximity-based network data transfer, similar to proximity-based graphics LODs, will there be frustum culling involved in network LODs as there is in graphics, or are there pitfalls to using it in a network setup? Thanks.

    Hi @endeaverse, I doubt we'd use view frustum culling for network LODs, because that would force clients to constantly send camera information to the server, and would be very fragile if someone whipped their view direction around faster than it could keep up. It's much more likely to involve sending no information about people inside sealed rooms, or updating more distant objects at a slower rate (maybe we do some of that already too? I don't keep track of that side of things).
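Ben's "sealed rooms and slower distant updates" answer can be sketched as a simple tier table. Every threshold and rate below is invented for illustration; they are not CIG's values:

```python
def net_update_hz(distance_m, same_sealed_room=True):
    """Pick how often (per second) to send updates for an entity, based on
    its distance from the player. Entities in a different sealed room send
    nothing at all; distant ones update at a trickle. All numbers invented."""
    if not same_sealed_room:
        return 0.0                  # no information about people behind doors
    for max_dist, hz in [(50.0, 30.0), (200.0, 10.0), (1000.0, 2.0)]:
        if distance_m <= max_dist:
            return hz
    return 0.5                      # beyond all tiers: occasional keep-alive

print(net_update_hz(10.0))                           # 30.0
print(net_update_hz(500.0))                          # 2.0
print(net_update_hz(10.0, same_sealed_room=False))   # 0.0
```

Unlike a view frustum, distance and room membership change slowly and can't be defeated by whipping the camera around, which is the robustness property Ben is after.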

    Hey again @BParry_CIG !

    Thanks for answering my question!


    POM self-shadow in its default implementation does some ray marching per pixel inside the material shader, and outputs a specific sun-shadow term into our G-Buffer. This has the upside that it's very high resolution where it works, and the downside that it doesn't scale to more than one light. We've been talking about a more generalised "do the shadow maps right" type solution, which would probably be more appropriate given how much of the game is not sunlit.

Interesting. If you guys ended up fixing things up, unifying, etc. so that you "do the shadow maps (and all shadow maps!) right", would that mean POM self-shadowing would inherit the resolution of the shadow map from the local lights? Meaning it would technically not be as precise (perhaps suffering from acne, undersampling, or bias) but would now apply to everything?

    Or are there ways to more smartly apply shadow maps so that they are generally always very high res and precise within x distance of the camera?
Dead on. By "doing it right" I mean guaranteeing that all POM translates into true scene depth, and that the shadow maps of things in the player's view have that depth information adjusted too. That probably means less than one shadow texel per pixel, so you'd see undersampling. The current shadow resolution has room for improvement, though; the assignment of shadow memory isn't very smart on a frame-to-frame basis, and I think some sort of smart manager for that is in the schedule.

If there is one thing I would have you guys not "fix" :D it would be the method of filtering shadows that is the default in many scenes so far in the game (albeit, oddly enough, not for sun shadows in Port Olisar):
    http://abload.de/img/starcitizen_2016_01_059kj1.png

Here the filtering in the above scene creates an extremely convincing bokeh'd pinhole effect that actually occurs with real-world shadows, and since it is based upon the light source's surface area and shape, it can even take on exotic shapes in extreme circumstances. There is even a nice umbra and penumbra being emulated as a result (I think CE shadows use PCF to simulate that?). It looks fantastic.

    Unfortunately shadow filtering tends to be a bit of a fluke when it looks nice. Working out how far away casters are (to work out how widely to blur them) unfortunately means reading all the texels you'd need to read to do the widest blur at the highest quality. The plant there is probably looking good now, but other objects in the same map would look too soft where they meet the floor. The bokeh effect you linked to is excellent, and for distant sources like the sun would probably be fakeable (to be technically correct, you'd need a different shadow map for each ray). Like depth-of-field effects though, you'd probably end up needing a few passes and some intelligent shortcuts to make it cheap enough to run, so it's not exactly trivial.
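The cost Ben describes, where working out how far away casters are means reading the whole widest-blur footprint, is the blocker-search step of percentage-closer soft shadows (PCSS); the blur width then falls out of similar triangles with the light's area. A toy sketch of that estimate, with invented numbers:

```python
def penumbra_width(receiver_depth, blocker_depths, light_size):
    """PCSS-style estimate: average the depths of the shadow-map samples
    that actually block the receiver, then scale the blur width by how far
    the blocker sits from the receiver (similar triangles with the light)."""
    blockers = [d for d in blocker_depths if d < receiver_depth]
    if not blockers:
        return 0.0                       # fully lit: hard (zero-width) edge
    avg_blocker = sum(blockers) / len(blockers)
    return (receiver_depth - avg_blocker) * light_size / avg_blocker

# A caster close to the floor gives a narrow contact shadow...
print(penumbra_width(10.0, [9.5, 9.6, 10.0], 2.0))
# ...while a caster far above the floor gives a wide, soft one.
print(penumbra_width(10.0, [2.0, 2.5, 10.0], 2.0))
```

The expense is hidden in `blocker_depths`: to know the average blocker distance you must already have sampled every texel the widest possible blur could touch, which is exactly the problem Ben points out.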

I was thinking about the particle lighting and volumetric shadow work that Ali recently did (based on your tiled lighting work). We know that it will be able to create shadows on particles in a volumetric manner, à la the most recent DOOM:
    http://abload.de/img/doomx64_2016_06_20_23o1zkx.png

But since those are just layered alpha-billboarded particles, when you switch angles they just turn toward the camera. So the shadows are volumetric, but the particles themselves are essentially slices, and the illusion breaks (you also get a strange macro effect if you look at the particle while turning: you see it turning with you, and the shadow casting through makes it almost more obvious!):
    http://abload.de/img/doomx64_2016_06_20_237xahv.png

Are there plans to negate or reduce particle billboarding in an obvious manner? A while back Ali said he was looking into something to remedy that, I think (if I read it correctly):

    So it's great for the sun, not so great for interiors. One alternative is voxel-grids which can give you good volumetric lighting from many lights, but will suffer from much lower resolution shadows.
    I've done some initial R'n'D to improve our explosions by giving them more accurate lighting and depth to avoid the flat-billboard look. I'm hoping to pick this up again at some point as most games suffer from very flat/fake looking explosions, and I want our cap-ship deaths to be as impressive as possible!

    Or you mentioning once:

    Where a particle is using a plane to really represent a volume, though, things could get hairy. Maybe there are enough planes that it still looks volumetric? Maybe we'd have to ray-march a little within each to make it look right? We'll have to see what we have before we know what we can make.

    Thanks once again for responding if you do.
    Best,
    I guess you'll have to wait and see how well the solution in question holds up from different angles. That "ray-march a little" that I suggested is probably close to a description of how it works, but I didn't know much about how we handle vertex lighting and tessellation on particles, so it was able to do much more than I was imagining at the time. Ali talked a bit about how to do smart voxel grids which might have given equal or better results, but decided that it makes more sense to use data structures that are already in place than to plan out a theoretically superior system and then have no time to write it.

  • BParry_CIG

    Developer

Posted:

Hello, I saw in the latest ATV (2.37), in the cloth creation segment, that you use Substance Painter. Have you integrated Substance support into your version of CryEngine?

    I do not believe so.

Curious: whatever happened to the joint effort between CIG and Warhorse Studios on that multi-layered clothing tech for CryEngine?
    I never heard about that, so it may have happened before I joined. What I do know is that we have a multi-layer clothing system, and it isn't directly wired to Substance as far as I know. This doesn't stop people from using it to produce the source data for the blends, though.
  • BParry_CIG

    Developer

Posted:

    Hi Ben,

I want to ask when we can expect a fix for the connection issues when joining Port Olisar or ArcCorp?

    Sorry, @FjuryX, connection's not something I know a lot about. It's not even something I do regularly, since most graphics work can be done in single player.

    I hope you all appreciate Ben and the other devs taking their time to answer us all on their days off. This shows how much love and dedication they put into this project.

    Thanks for the thanks, @Cael817. I don't make any posts during work hours, because that wouldn't be working. A lot of these posts are just what I do when it's late but I've messed up my sleep schedule somehow.

  • BParry_CIG

    Developer

Posted:
Edited: by BParry_CIG

    I've been having a look at the game and am wondering about the overhead in Cryengine.
    I have an optimised config for my GTX750Ti (no shadows, no motion blur, triple buffered, ambient occlusion off, modest texture and pool memory etc) with a balance of image quality / speed.
In a test where I am just in the hangar (no ship visible), looking down to get a full view, I get ~30 FPS. In Battlefield 4 with everything turned on except AA, I get ~60 FPS looking toward the horizon on a map with 64 players and a heap of effects going on.
Same for other games - e.g. in Guild Wars 2, inside a massively detailed city with ~40 players onscreen, FPS is around the 52 mark.

    I am wondering if development is falling into the dollar trap, where every developer @ Cloud Imperium has twin Titan X cards and not noticing what the rest of the world is experiencing :)

    Moreover, if it carries on this way it will lose out on VR gaming without twin GTX 980's, for example - unless the Kickstarter fund will provide $700 worth of graphics cards to users ;)

    Hi there @Space-Turd,
    The dollar trap, as you call it, is always a risk I agree. I mean, who wants to take out the 1080 once it's in there? On the other hand, it's likely that you're being limited by something else. If you turn things like AO back on, do you actually see a reduction in performance? If so, there's still a good chance that it's some single stupid thing torching a bunch of performance, and we'll eventually get to it and squash it. It's kind of horrifying to see how many milliseconds get saved in the final hours before a game goes to market.

    Did 2.4.1 get some sort of Anti-Alias treatment? I could swear it looks a lot smoother than before.

    Er... I'll have to check. 2.4.1 is not likely to have anything 2.4.0 didn't, but if you mean 2.4 vs 2.3, I've lost track of all the little tweaks and fixes that went in, could be a few things here and there.

    Hi! Thanks for answering questions!

    I know that the server runs on Linux, but I was wondering if the client is built for Linux regularly as well? Even if it has terrible performance or has game breaking bugs, I'd be interested to know if it even builds! Along those lines, are you trying to avoid middle-ware that's Linux incompatible? Are considerations being taken to ensure a smoother porting process in the future?

    Thanks!

    Hi @JoseJX, I'm unaware of us running anything seriously Linux-unfriendly besides Direct3D 11. Our pre-checkin TryBuild definitely confirms that a bunch of our code will build for Linux, I'm not sure whether that's only server code or whether it's a client with a headless renderer. I think it might be the latter, though.

  • BParry_CIG

    Developer

Posted:
    Right, back from holidays, where were we?


    One thing to try on Cryengine - which is quite eye-opening performance-wise - is with a modest graphics card (say a GTX750Ti or equivalent Radeon) go to the hangar and just face a wall so only that wall is visible onscreen. That is running at <30FPS; that's quite shocking - pretty much every other game engine I know will do that at over 60FPS...

    Honestly, I find that pretty good news. If you were seeing 100FPS looking at a wall, then 15 looking across a room, it would suggest we've made the game hideously expensive to run. If you're sub-30 looking at nothing, it suggests we've got a few big problems that we can kick over and get a decent improvement.

    @BParry_CIG
    Thanks for answering my questions! I definitely look forward to seeing the new particle shading and shadowing to see how thick / volumetric it ends up looking. The same with POM shadows and shadows in general :D

I was just playing Black Mesa (an amazing fan remake... thanks, Mark Abent! :D) and I was reminded of the damage shader / how decals are applied to objects in SC (either POM decals or otherwise).
In Black Mesa, when you shoot a gun in close proximity to an enemy, their blood flies back onto the view model like so (just a diffuse decal):
[screenshot: 362890_20160705222013r0sv7.png]

Then you have enemy weapon effects causing persistent yet fading decals to apply to the view model (here an emissive decal... which is kinda neat):
[screenshot: 362890_20160705222423rfs85.png]

My question based on this would be: can we expect similar decals being applied in close-range combat (Vanduul blood and viscera caking the weapon / our combat armour)? I imagine that would be a decal (albeit a better-looking one than in Black Mesa: with normals, higher resolution, and perhaps even POM!).

    From a tech standpoint, it's highly feasible to splat decals onto anything that's using deferred shading, so doing it to the view model is a totally viable place for them to go. Whether there's plans to use them for that, I don't know - I think it would be an LA thing.
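The reason it's "highly feasible" to splat decals onto anything deferred is that the G-Buffer lets you reconstruct each covered pixel's position and test it against the decal's volume. A minimal sketch of that box test (the function names and the identity transform are illustrative only):

```python
def decal_uv(world_pos, decal_inv_transform):
    """Transform a reconstructed world-space position into a decal's local
    unit box; return (u, v) texture coordinates if the pixel is inside the
    decal volume, else None. decal_inv_transform maps world space into the
    decal's local [-0.5, 0.5]^3 box."""
    lx, ly, lz = decal_inv_transform(world_pos)
    if max(abs(lx), abs(ly), abs(lz)) > 0.5:
        return None                  # pixel lies outside the decal volume
    return (lx + 0.5, ly + 0.5)      # project along local z onto the surface

# A unit decal centred at the origin, axis aligned (identity transform).
ident = lambda p: p
print(decal_uv((0.25, -0.25, 0.0), ident))   # inside -> (0.75, 0.25)
print(decal_uv((2.0, 0.0, 0.0), ident))      # outside -> None
```

Because the test runs against whatever depth is in the G-Buffer, it works equally well whether the geometry underneath is a wall, a ship hull, or a first-person view model.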


In the same vein... will the damage shader for ships ever be ported to personal armour / human bodies? In that case - and since SC / SQ42 uses merged 3rd- and 1st-person assets and animations - I imagine it would also be perfectly visible on both 1st- and 3rd-person assets.

i.e. Bullet hole / scuff / wound directly on the forearm or hand after being shot? Or looking down and seeing a ruined limb or combat damage on your chest plate?

If so, what are the challenges of transferring said ship damage shader (in a theoretical capacity) to skinned geometry? Furthermore, are the extensions to the damage shader mentioned in the GDC presentation still on the table (geo deformation, shrunken assets / tessellation, GPU particles)?

    Best and thanks for answering these hypothetical questions if you can!
    -Dictator

    I don't know much about the technicalities of the damage shader, but I know a man who does. I'll try to summon him.
    @geoffbirch. @geoffbirch. @geoffbirch.
    Now, we wait.
  • geoffbirch

    Posts: 9

Posted:
Ok, 1st: getting the damage tech working on the characters isn't quite so simple, as it was created specifically for metals: we have a thickness, burn, denting and temperature set of values we work with to evaluate how to render it, and characters don't work in the same way. However, character impacts are normally rendered with decals (projected, I'm guessing), and that usually ends up looking pretty nasty; rendering the character damage into an object-space map would certainly work and wouldn't look so nasty, so that's a good possibility. The character crew are super hard at work at the moment, but they have got a unique UV set which could be used for damage stuff, and they use a layer-blend shader already, so we could do something like increasing wear and dirt values and localising that using a damage-map system.

2nd: (In relation to the GDC talk) When we started the damage tech we were working with a very talented vehicle artist, Neil McKnight, who was working on the Gladius (still my favourite ship), and we were moving away from a damage-state system to a more procedural system. However, there was still debate as to whether we needed any form of state in there, or more specifically a sort of 100%-destroyed ship carcass, which would be an artist-created asset; we decided against it in the end because it added unnecessary work for the artist, as the procedural damage system was more than capable of creating the look we were going for. Because we had this carcass at the time of development, we had access to its data, so we could do vertex manipulation to move the standard asset's vertices towards those of the carcass when damage was taken. In the end we settled on using a screen-space perturbed-normal technique, which gives a great result.

3rd: (Again, in relation to the GDC talk) The idea was that GPU particles rendered off when we take damage could also link with our concept of the removal of metal (the thickness level in the damage map): the more metal lost, the greater the amount and size of GPU particles we could spawn to signify how much metal came off. And because we use an object-space position map, we could spawn them at the impact location without having to feed damage-visualisation information back to the CPU to produce the effect and spawn location we want. We should probably get around to making a GPU particle system first, though; I'm sure that's on the VFX list of requests.

4th: (Finally, in relation to the GDC talk) There is a new piece of GPU technology we could make use of with DX12/Vulkan: tiled resources. There's a possibility we could use this to split up the damage maps and position/bone maps and stream the data in when needed. However, since the GDC article we have made plans to improve the damage tech system to get around some mesh-streaming issues we have, and also to make the position/bone maps quicker to render so that we don't need to store them on the GPU while they're not being used (we only need them when rendering the damage into the damage map) and can render them just in time when an impact is pushed to the GPU. The plan is to use a proxy mesh to render these position/bone maps instead of LOD0 or LOD1 of the actual geometry; that will make them cheap to render and equally cheap in terms of memory, PLUS they'll always be available in system memory (which fixes our streaming issue), unlike the current geometry, which gets streamed in and out as required. So yeah, the damage tech is cheap in performance terms (it barely even registers as a GPU performance dip) and uses substantially less VRAM than the old damage-state system (so you can stop calling it Damage States, because there are no states any more :P), but we can reduce the VRAM usage by half with this new technique of just-in-time rendering of one of the maps.

AND FINALLY, we have recently (not that recent, but stuff becomes a blur after a while, so I can't remember when any more) done a little bit of fix-up for the damage tech to get the glow to sit a bit better, so now the cool-down decay is more reminiscent of the laws of thermodynamics. (I said reminiscent because in reality it's nothing like real thermodynamics calculations; I only had 8 bits to do the glow calculations with, alright! I'm no physicist, but I don't think it's possible to do any thermodynamics calculation using only 8 bits of storage. What it is reminiscent of is: hot stuff cools down quicker than warm stuff. I'm sure you can tell we're super technical here in the render team :D) While doing this improvement we noticed that as the glow goes down towards the cool end of the spectrum and blends in with the metal colour, stuff gets a little, well... erm... green! So yeah, we have to fix that at some point, because it'll be super noticeable on cap ships when they take damage. To put it simply, damage tech never sleeps, because we've got too many ideas :P
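Geoff's 8-bit "hot stuff cools down quicker than warm stuff" behaves like multiplicative decay quantised back into a byte; the decay rate below is invented purely for illustration:

```python
def cool_down(glow_u8, rate=0.9):
    """One cool-down tick on an 8-bit glow value: multiply and re-quantise.
    Hot values shed many units per tick, warm ones only a few, which is
    about all the 'thermodynamics' 8 bits of storage will buy you."""
    return max(0, min(255, int(glow_u8 * rate)))

print(255 - cool_down(255))   # a hot pixel sheds many units in one tick
print(40 - cool_down(40))     # a warm pixel sheds only a few
```

The same quantisation is also a plausible culprit for the "stuff gets a little green" artefact: at the cool end, tiny 8-bit steps in the glow value map onto visible colour steps when blended with the metal.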

    Right, now I need sleep!
  • BParry_CIG

    Developer

Posted:
Edited: by BParry_CIG
    It worked!
    It even made Reddit!
    Next mission: Tempt non-render programmers to this thread somehow.
  • BParry_CIG

    Developer

Posted:
Edited: by BParry_CIG

    In monthly report you talked about CryEngine's VisArea:

    To allow us to use VisAreas on planets, we had to implement support for rotation. This was implemented by keeping the 2.5D representation but adjusting all the checks to be in VisAreas space, hence transform each point into the space of the visarea before it is checked.

    So the problem is that the local reference frame where the VisArea is located is rotated, but not the hosting reference frame/other coordinate systems? So you need to transform the points _outside_ the VisArea that are not in the local reference frame prior to being able to check their visibility through the portals of the VisArea?
    Hi @RoiDanton,
    The problem was that VisAreas are only defined as a 2D shape + height, kind of like maps in the original Doom were, and came with an implicit assumption that they'd never be rotated. We could probably have supported this by putting a Zone around any world area, thus creating a local coordinate system where they're flat (this is how they already work in ships), but the Frankfurt guys had some reasons they didn't like that as a solution. Instead, VisAreas "rotate" by transforming anything that needs testing against them into their own local coordinates first.

    @BParry_CIG
    Just a small reminder towards my question.
It would be really cool if you could answer it, because it's the kind of information I really need to develop my controller.


I'm actually building my own Star Citizen controller and want to add small displays to show some in-game stats like ship speed, ammo count, shield status, etc.
    At least we know that the official HOTAS will get some kind of display.

    Is there any kind of API for this (now or in the future) that we can use for pushing the ingame stats data to our DIY display hardware or tablets/smartphones?

Please tell us more about what you have planned and have already achieved in this department.

    Best regards
    Buncan

    Sorry, @Buncan, was avoiding it because it's not my department at all, pretty much the furthest end of the engine from anything I'll have ever touched.


4th: (Finally, in relation to the GDC talk) There is a new piece of GPU technology we could make use of with DX12/Vulkan: tiled resources.

Very interesting! Is this in a new version of AMD's CodeXL?
    Please tell us more about this, @geoffbirch (or @Bparry_CIG).
    Tiled Resources, aka partially-resident textures, are a feature that allows one to create a large texture, but only have selected parts of it really exist in memory.
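The idea can be sketched with a page table: the texture's virtual address space is huge, but a lookup only succeeds for tiles that have actually been mapped to physical memory. A toy model (tile size and names invented; real tile dimensions are fixed by the graphics API):

```python
TILE = 128  # texels per tile edge (illustrative only)

class SparseTexture:
    """Toy partially-resident texture: a huge virtual extent backed by a
    dict of resident tiles. Sampling an unmapped tile reports a miss, so
    a caller could fall back to a lower mip or stream the tile in."""
    def __init__(self, width, height):
        self.width, self.height = width, height
        self.tiles = {}                     # (tx, ty) -> tile payload

    def map_tile(self, tx, ty, payload):
        self.tiles[(tx, ty)] = payload      # commit "physical memory" here

    def sample(self, x, y):
        return self.tiles.get((x // TILE, y // TILE))  # None == residency miss

tex = SparseTexture(16384, 16384)           # 16k virtual, almost nothing resident
tex.map_tile(0, 0, "damage map tile (0,0)")
print(tex.sample(5, 5))                     # resident tile: returns the payload
print(tex.sample(9000, 9000))               # unmapped tile: None
```

This is why Geoff finds it attractive for damage and position/bone maps: the maps can be declared at full size while only the tiles near recent impacts ever occupy memory.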

  • BParry_CIG

    Developer

Posted:

    He's back! Here's a question I was wondering lately: A year ago Counter Strike introduced this technology to prevent players from clipping through walls and being seen by enemies that have no business seeing them. Would such a thing be possible for Star Citizen? Could it perhaps even be used to prevent modular equipment from clipping into each other?

    Hi, @Oberscht.
    I've honestly never seen this tech in action before, very cool. I imagine it works by clipping the model to one arbitrary plane, or by laying down some kind of stencil mask before each character is drawn. Either way, it's feasible technologically for us because it wouldn't interact harmfully with anything else we're doing, but it would probably take quite a lot of designer time to mark up all the surfaces that should occlude like this.

Hello Ben, I was wondering if you can tell me whether Star Citizen is doing any kind of cloud computing? The game is massive in scope, with a lot of data that needs to be processed constantly, and I feel the local CPUs may not have the guts to handle it all by themselves. Will the game off-load some of the processing duty from the clients to the servers?

    Thanks

    Hi, @Rocker1m1!
Well, the game's going to run on a server, so at least some of the work is up there. If you mean offloading player-local calculations to cloud servers, I'd expect that's unlikely; it sounds like we'd just end up with a new kind of lag, and less bandwidth for the player to use for critical data.

Regarding the network rework, I'm wondering (and I might be wrong :p) whether currently every item on a ship or character has its own ID and X/Y/Z position in the universe sent per packet? If so, is the plan to have a compiled entity ID of some sort instead, where, say, a ten-digit/letter sequence could contain all the statistics/items of a ship/player and be sent instead, like the old NES RPG "saves"? There's more to this, I'm sure, and I'm mainly curious why it's not done, or not done that way.

    Hi @Beolith. If I follow what you're describing, you're asking if all the relevant information could be crammed together into some denser data format? If so, that's already effectively being done, there's a layer (that I frankly know nothing about) that packs information into a sending-efficient format. It can't be as small as the ten-letter combination though: a player's position is 24 bytes big, and sending it as anything less than 24 bytes means you've sent a different less correct number. I don't know much about NES saves, but I'm guessing they'd have used, say, the first six letters for six numbers between 0 and 255, and the next four for 32 true/false flags about things you'd done in the game.
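The 24 bytes Ben cites are three 64-bit doubles (the 64-bit world coordinates discussed elsewhere in the thread). Packing them, and the lossiness of anything smaller, can be shown with Python's struct module; the example position is of course made up:

```python
import struct

pos = (1234567.891, -42.0, 9.25)          # a 64-bit world-space position

packed = struct.pack("<ddd", *pos)        # three little-endian doubles
print(len(packed))                        # 24 bytes, as Ben says

# Squeeze the same position into three 32-bit floats and you have sent
# a different, less correct number -- exactly the point Ben makes.
lossy = struct.unpack("<fff", struct.pack("<fff", *pos))
print(len(struct.pack("<fff", *pos)))     # 12 bytes...
print(lossy[0] == pos[0])                 # ...but the value no longer matches
```

Fields with naturally small ranges (flags, ammo counts, quantised angles) are where a packing layer wins; full-precision positions are irreducible.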

I have been keeping track of all the various media CIG puts out, and one of the most common things I've seen mentioned is changes of one shape or another to CryEngine. At what point does it go from a CryEngine renderer to a custom renderer?

    Hi @Karu, that's a real "Ship of Theseus" type question! Right now there are miles and miles of code that we've barely changed at all, but even if we did, and we eventually replaced every single line with new CIG-written code, I think you'd still be able to see CryEngine in its overall structure. Only in areas where the structure was actively harmful to development are you likely to see stuff get paved over.

    Star Citizen would really benefit from changes to the lighting contrast curve by using Abs split for both Luma and Chroma.

    I am not totally clear what these things are that you have described, @Azaral. At a guess, is this SweetFX terminology?

Hello, have you thought about a Suckerpunch bigger than size 1?
Like an energy-rifle Suckerpunch S3.
I think it would be more interesting.

    Hello, @mephi65, you may want to ask this over in a design thread. While we programmers occasionally create monstrous overpowered weapons, they're usually to test features locally and get deleted afterwards.

  • BParry_CIG

    Developer

Posted:

As a huge fan of SC, and a game development student in college, I have to ask: have you considered the new Stingray from Autodesk? The Lua formatting is cool, as I used Lua with Love2D to make some basic games when I started learning to code three years back. The engine is C++ with gameplay in Lua, the workflow for artists is in Maya, and there's the ability to seamlessly transfer art; all good points. I know multi-threading seems like it will be a major setback, though. Would Stingray ever be a viable replacement for the modified CryEngine, or is the source/API way too deep in Cry to even consider switching? Last of all, what is your opinion of the former Bitsquid engine? I'm a heavy Unity dev myself, but the Maya LT options are amazing, and the shaders never alter... meaning it's the first engine where what you see is what you get, and if you need to make changes it's instant, not hours of program-hopping before winding back to the engine. Thoughts, people?

    Hi @Novafox88,
    Short answer: no way in hell.
Long answer: your experience with designed-for-modding engines may have given you a mistaken perspective on what SC development looks like. Even transferring from, say, Unity to Stingray would probably be a terrifying amount of work, but at least there's a clear distinction between the engine and the work you've done. With CryEngine, a huge amount of the work we've done is actually getting inside CryEngine, ripping out the parts that don't work for us, and replacing them with fresh code.
Besides this, almost all our code is C++ (thankfully; Lua seems to become a monster beyond a certain level of project complexity), and we also have levels, flowgraphs, materials and more in CryEngine-only formats.
Finally, I think if we *did* have to change engines for some reason, one key thing we'd require at bare minimum is a source code license. Without a source code license, if any part that "just works" turns out not to, or works well but too slowly, there's no solution and the project just has to drop the feature. A good example is the 64-bit conversion we did here - every system needed to be touched at a low level to make it work.
    "Odds are that by the end of it, Mr Parry will have skipped the gloves and gone straight to semi-automatic weaponry."
  • BParry_CIG

    Developer

    Posted:
    Edited: by BParry_CIG
    [hide]

    Hey @BParry_CIG
    The new monthly report had some awesome stuff regarding what the graphics team have done, namely:

    On the shading side we’ve improved our specular reflection model based on recent research in the field, and this should give us brighter and more accurate reflections from lights as well as being cheaper (which was actually one of our main motivations for the change). We’ve also improved the physical accuracy of how rough or glossy materials look when you zoom in/out from them.

    What I know of the CE PBS model is from slides 8-11 in this PDF from Crytek's Nicolas Schulz, so which improvements to which areas of the shading model have you made (a new specular BRDF? A new Fresnel term? A change to the way cubemaps / SSR convolve? etc.)? I would love to know!

    Lastly, Ali's report about the new tiled lighting and volumetric shading / shadowing of particles was really interesting. I just had a tiny question there: how exactly are the shadows of such high quality, as in this screenshot here:
    [screenshot omitted]
    Other games that have particle shadows from arbitrary light sources (Doom 2016, Alien: Isolation, or The Division) usually cannot capture such small detail and/or are aliased to hell and back. How are the shadows so high quality? And, if you do not mind my asking, are there any plans, as previously mentioned, to give particle billboards even more depth (i.e. not billboarding any more, but actually being volumes)?
    Best,
    Hello, frequent flyer!
    Off the top of my head, there were two big changes on the shading algorithm:
    1) There was a line in the top of a function saying "clamp roughness because we don't have area lights yet" which was making high gloss surfaces darker. We have area lights, we removed that.
    2) The "hotness remapping" as presented in those slides seemed wonky to us and we couldn't find a justification. Burley used a * 0.5 + 0.5, which keeps everything nicely in the [0,1] range, whereas a * 0.5 + 0.8 clearly doesn't. We've decided to take John Hable's advice instead, because he made a pretty cunning argument for it.
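    For the curious, the difference between the two remappings is easy to see numerically. A tiny sketch (illustrative only - this is not CIG's actual shader code):

```cpp
#include <cassert>

// Disney/Burley-style "hotness remapping" of artist roughness before it
// feeds the specular BRDF. Burley's version stays inside [0,1]; the
// variant from the slides can exceed 1 for rough materials.
float remap_burley(float a) { return a * 0.5f + 0.5f; }
float remap_slides(float a) { return a * 0.5f + 0.8f; }
```

    Feeding an out-of-range roughness into BRDF terms that assume [0,1] is exactly the kind of thing that produces hard-to-justify results.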

    The detail in the particles is just a question of appropriate levels of tessellation - there's some automation but in the end it's in the hands of the VFX team to balance quality against cost. Ali's made it so that if something would alias, it should fade out before it does.
    I doubt we'd ditch particle billboards for volumes as we don't have a tool pipeline to author them. Also, while billboards aren't technically correct, they can give the impression of correctness while being quite high resolution. Switching fully to voxels would probably incur a serious resolution penalty.

    "Odds are that by the end of it, Mr Parry will have skipped the gloves and gone straight to semi-automatic weaponry."
  • BParry_CIG

    Developer

    Posted:
    [hide]

    [hide]

    Hello, frequent flyer!

    Hello @BParry_CIG ! :D
    [hide]

    Off the top of my head, there were two big changes on the shading algorithm:
    1) There was a line in the top of a function saying "clamp roughness because we don't have area lights yet" which was making high gloss surfaces darker. We have area lights, we removed that.

    IIRC this roughness clamp had something to do with large areas of flat land, back before the sun was an area light. Interesting!
    Speaking of area lights:

    we’ve finally made the sun into an actual object that you can fly around. Previously in CryEngine, the Sun was always 10km away in an artist specified direction no matter where you were in the universe, but now it’s a hot glowing sphere that casts light and shadows in all directions as you would expect, and has a physically accurate reflection which grows as you get closer.

    Is the sun then a special type of area light now? Or does it use something similar to the bulb size parameter for sphere area lights? And if so, does that mean such sphere / bulb-size area lights can now cast correct area light shadows (i.e. not just from a point)?
    I would love to hear how something like that works at all. This means it is no longer a faked directional light with directional shadows, but coming from some sort of massive area light billions of km away...
    It's still a directional light, actually. There wouldn't be much code to change to make it an omni; we just haven't yet. Shadows are a real annoyance, though - area lights we may have, but true area shadows just aren't something you can do in a single shadow map. Likely the sun's going to keep using a cascade of orthographic shadow maps for a long time yet. Shading-wise it's a really big sphere really far away; so far we've not exploded the floating-point precision, but we're keeping an eye on it.
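    A back-of-envelope calculation shows why the directional approximation holds so well: the angular radius of a Sun-sized sphere at one AU is only about a quarter of a degree, so its rays are nearly parallel. A sketch with illustrative numbers (not engine code):

```cpp
#include <cmath>

// Angular radius (in radians) of a sphere of radius r seen from distance d.
// For the Sun (r ~ 6.957e8 m) at 1 AU (d ~ 1.496e11 m) this is about
// 0.00465 rad, i.e. roughly 0.27 degrees.
double angular_radius(double r, double d) { return std::asin(r / d); }
```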
    [hide]


    Related to area lights as well:
    Just Friday Nathan Dearsley made mention of how he used bulb size area lights to recreate the reflection of the holo-globe in the idris' command and control room which then combines with SSR. He mentions at the end something about "hologram tech" being used to replace this bulb size area light method / whatever when it comes online. If you do not mind my asking, what hologram tech is that and what is the relation to area lights and reflections?

    I've frequently not got any idea what Nate's talking about; this is one of those times.
    [hide]

    [hide]

    2) The "hotness remapping" as presented in those slides seemed wonky to us and we couldn't find a justification. Burley used a * 0.5 + 0.5, which keeps everything nicely in the [0,1] range, whereas a * 0.5 + 0.8 clearly doesn't. We've decided to take John Hable's advice instead, because he made a pretty cunning argument for it.

    People on other forums were speculating that was exactly what you guys did. hah. Cool to hear!
    [hide]

    The detail in the particles is just a question of appropriate levels of tessellation - there's some automation but in the end it's in the hands of the VFX team to balance quality against cost. Ali's made it so that if something would alias, it should fade out before it does.
    I doubt we'd ditch particle billboards for volumes as we don't have a tool pipeline to author them. Also, while billboards aren't technically correct, they can give the impression of correctness while being quite high resolution. Switching fully to voxels would probably incur a serious resolution penalty.

    So if I set "r_ParticlesTessellationTriSize = XXX (e.g. 1)" in my user.cfg, would that global variable override any artist-set tessellation level for any given particle? Thus giving high-quality shading and shadowing everywhere, in spite of performance considerations.
    The low resolution of froxels and whatnot is something that would definitely hurt the game's look in the end, in spite of their greater "physical correctness". That, and you would need a way to author them, as you mention... though I do wonder if there is a way to overlay the two types to create a holistic effect of sorts...

    Thanks again as always for answering my inane line of questioning, if you do!
    I'm not sure what we've done with the particle tessellation CVars, it may be that we've swapped them out for something with a similar yet different name, or interpreted them differently, or straight up ignored them. I think maybe we treat that as a maximum, so you may well be able to ruin your performance that way.
    "Odds are that by the end of it, Mr Parry will have skipped the gloves and gone straight to semi-automatic weaponry."
  • BParry_CIG

    Developer

    Posted:
    [hide]

    So I'm not sure if this is an Art, Design, or Programming related question but it has to do with clouds!

    I'm interested in whether there are ways to implement different types of clouds, and if we might see them in the future, or at all.

    Here's a really neat vid on some strangely majestic clouds.

    Hi, @THEMIKEBERG,
    We've been talking about clouds a bit recently as it happens. I think it's most likely that we'll be trying to get something coherent and stable going on with a fairly limited set of cloud types (like just cumulus and cirrus or something). If we extend to other types it's hard to say what we'll include, there's that weird problem where a lot of those real world examples don't look very realistic...
    "Odds are that by the end of it, Mr Parry will have skipped the gloves and gone straight to semi-automatic weaponry."
  • BParry_CIG

    Developer

    Posted:
    [hide]

    [hide]


    It's still a directional light, actually. There wouldn't be much code to change to make it an omni, we just haven't yet. Shadows are a real annoyance though - area lights we may have, but true area shadows just aren't something you can do in a single shadow map. Likely the sun's going to keep using a cascade of orthographic shadow maps for a long time yet. Shading-wise it's a really big sphere really far away, so far we've not exploded the floating point precision but we're keeping an eye on it.

    So, what about planet shadows? Right now, the sun's light is cast right through planets. In general, something like an eclipse would be a giant shadow being cast by a giant object onto another giant object. Would this still be compatible with the current shadow tech?
    Speaking of such things, have you ever taken a look at Space Engine? If so, have you seen anything interesting there, on the graphics programming side? For me it's still the epitome of procedural space stuff... for now.
    One of the first things I worked on when I got here was an entity I named the Analytic Shadow, based on the idea that you don't need a huge shadow map if you know the caster's a sphere. We used it to give Yela a shadow; if planets aren't using them, then it's probably just a setup issue or some temporary incompatibility with the atmosphere tech (I gave it a soft height falloff, and you probably don't want both reducing the light at the same time).
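    The core of an analytic sphere shadow is just a ray/sphere occlusion test. A minimal hard-shadow sketch of the idea (a guess at the principle only - the real entity also handles soft penumbrae and the height falloff mentioned above):

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };
static double dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3 sub(Vec3 a, Vec3 b) { return Vec3{a.x-b.x, a.y-b.y, a.z-b.z}; }

// Is point p occluded from a directional light (unit vector L pointing
// towards the light) by a sphere with centre c and radius r?
// Plain ray/sphere intersection: no shadow map required.
bool sphere_shadows_point(Vec3 p, Vec3 L, Vec3 c, double r) {
    Vec3 oc = sub(c, p);             // from the point towards the sphere centre
    double t = dot(oc, L);           // closest approach along the light ray
    if (t < 0.0) return false;       // sphere is behind the point
    double d2 = dot(oc, oc) - t * t; // squared ray-to-centre distance
    return d2 <= r * r;
}
```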
    [hide]

    No answer for the specs question? Am I asking in the wrong section, or is it something not settled yet?

    As far as I know the goal is that both should hit the same requirements, given we're using mostly the same tech and developing both at the same time.
    [hide]

    [hide]

    [hide]

    Hi! Thanks for answering questions!

    I know that the server runs on Linux, but I was wondering if the client is built for Linux regularly as well? Even if it has terrible performance or has game breaking bugs, I'd be interested to know if it even builds! Along those lines, are you trying to avoid middle-ware that's Linux incompatible? Are considerations being taken to ensure a smoother porting process in the future?

    Thanks!

    Hi @JoseJX, I'm unaware of us running anything seriously Linux-unfriendly besides Direct3D 11. Our pre-checkin TryBuild definitely confirms that a bunch of our code will build for Linux, I'm not sure whether that's only server code or whether it's a client with a headless renderer. I think it might be the latter, though.

    Are you able to provide any more information on this? The impression I've formed thus far is that it's just the lack of a graphics backend (with CryEngine's OpenGL implementation needing too much work to get up to speed, and Vulkan not yet supported) that's holding the Linux version back. There have even been sightings of a Linux launcher in a screenshot from BugSmashers.

    Would this be an accurate understanding of the situation (and how's the next-gen API work going, BTW)?
    Hi @Notavi, unfortunately the graphics API part is the only thing I have any knowledge of; I've never really put much thought into what else might keep a game from running on Linux.
    The next-gen API stuff is mostly a Frankfurt affair, so I couldn't say what their roadmap looks like.

    "Odds are that by the end of it, Mr Parry will have skipped the gloves and gone straight to semi-automatic weaponry."
  • BParry_CIG

    Developer

    Posted:
    [hide]

    Hello. Not sure if it's design or programming, possibly both:

    I've seen some discussions on Reddit about skyboxes and how performance-friendly they are, compared to the need to render every single object off the planet and process all the light/shadow/reflection/etc. properties. Now, how is Star Citizen doing it? Do we have a "generated skybox" based on the surroundings (like Elite does it), or can't we do that because it would break other stuff (like real atmospheric-entry landing, dynamic object visibility, etc.), or is there an even less performance-heavy approach to render everything without a skybox?

    Also, even if it may be a little engine tech: will the game take advantage of the HDR (High Dynamic Range) output that current GPUs support?

    Hi there @Valdore.
    Your question might need a little unpacking I think. First off, Elite's skybox rendering is (as far as I know) just a case of them baking down all the background galaxy, nebula, etc into a skybox. Comparatively, SC is currently running something effectively similar in terms of performance - some arbitrary 2D graphics on planes in the distance.
    Outside of the skybox situation, the plan is just a lot of dynamic level of detail, hiding objects when they get too small etc. Trying to identify objects that are stationary enough to bake out to a skybox would probably take more work than it would save, and there'd likely be a visible pop when something went from one approach to the other.
    Re: HDR, almost every game on the market uses HDR at this point, even games that don't dynamically adjust their exposure tend to calculate lighting with a little extra headroom for the sake of quality. Up until now the lighting has tended to be a little flat, but coming in 2.5 you should definitely see a lot more (actually probably too much) variance in brightness, backed up with the improved optics tech.
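    For readers unfamiliar with HDR pipelines: the engine computes lighting with values well above 1.0, and a tonemap operator then squashes them back into displayable range. The classic Reinhard operator sketches the idea (shown purely for illustration - it is not the operator Star Citizen uses):

```cpp
// Simple Reinhard tonemap: maps HDR luminance in [0, inf) into [0, 1),
// which is why lighting can be computed with "extra headroom" and still
// end up displayable.
float reinhard(float hdr) { return hdr / (1.0f + hdr); }
```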
    "Odds are that by the end of it, Mr Parry will have skipped the gloves and gone straight to semi-automatic weaponry."
  • BParry_CIG

    Developer

    Posted:

    AcheronAttacks:
    [hide]

    CryEngine 5 - It seems like the latest update to CryEngine overcomes a lot of the hurdles that were present in the version you have 'hacked', for want of a better word. Are you too far down the path with the version you have modified to move to this new version of CryEngine? Or is it possible to port this new code into the version you are working with, to tackle some blockers?

    @BParry_CIG I made a video illustrating what I've come to understand as one of the blockers with the modded CryEngine 3: the think time the system takes when transferring between "zones". I don't know if that's the right term, but e.g. the internal zone of the ship vs the universe proper, or transferring from zero-g in the universe proper to, say, a pad with gravity. Just wondering how big an issue this is, what it hinders while unfixed, and how it can be fixed.

    https://youtu.be/BjouQPwFpk0


    Hi @AcheronAttacks,
    I'm not really an expert on what is or isn't hard to do with zones, but as far as I know there wouldn't be a zone transition at that point. I'd hope it's just some performance spike (let's say something about the seat makes the engine suddenly want to load and parse a huge block of mostly unnecessary XML), and we'd fix it by just watching our "most expensive functions" list, then poking around in whatever hotspots it showed.

    Re: CryEngine 5, I'm not actually certain what the license situation is there, whether we're 3.8.X only? 5 as well? Lumberyard? Obviously someone knows and I'd check before I started rooting around in the wrong codebase. When it comes to integrations, I'd imagine we're past the point where we could click a button and then work through the conflicts, but doing that was always fraught with difficulty because it can (for example) introduce bugs in systems that haven't officially been changed according to the release notes. While it's more time consuming up front, it's better to compare their old files to their new ones, make sure you understand them, then make the same changes to your own codebase.
    [hide]

    I saw a video where someone was approaching Grim-Hex. There I noticed that the asteroids were brightly lit, and when he got closer they faded to being in dark shadow. Is this a limitation of the shadow tech (some sort of shadow LOD), and do you think it can be solved?

    Hi, rocker_lx!
    Yes, there are some limitations in the shadow tech right now - we're still using the standard CryEngine approach where you have a bunch of progressively lower-detail dynamic maps, and a single huge static map for distant scenery. We've already extended things a little to make the engine understand that there might be more than one static map, and to use whichever's nearest, but we have work scheduled to make that system a little more adaptive. For instance in the asteroid field, maybe each asteroid or cluster could register for a smaller static map, so we'd not be wasting pixels on the gaps in between.
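    The "use whichever's nearest" selection can be sketched like this (hypothetical structure and names - the real logic lives inside CryEngine's shadow cache code):

```cpp
#include <vector>
#include <cstddef>

// Each registered static shadow map caches the centre of the region it
// covers; at render time we pick the map closest to the camera.
struct StaticShadowMap { double cx, cy, cz; /* plus texture, matrices, ... */ };

std::size_t nearest_map(const std::vector<StaticShadowMap>& maps,
                        double px, double py, double pz) {
    std::size_t best = 0;
    double bestD2 = 1e300;
    for (std::size_t i = 0; i < maps.size(); ++i) {
        double dx = maps[i].cx - px, dy = maps[i].cy - py, dz = maps[i].cz - pz;
        double d2 = dx*dx + dy*dy + dz*dz; // squared distance avoids a sqrt
        if (d2 < bestD2) { bestD2 = d2; best = i; }
    }
    return best;
}
```

    A more adaptive version, as described above, would let each asteroid cluster register its own map rather than stretching one map over the gaps.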
    [hide]

    Will we ever see our own reflection in the game, on mirrors or other objects? We have vampires right now...

    [hide]

    Torque Game Engine (Tribes 2 engine) had a mirror implementation.
    Could probably be implemented using a camera.. but a fully reflective material would be more taxing.

    It's one of those big, irritating problems in modern game rendering: the more advanced we get, the more we screw up the ability to do mirrors, especially since the move to deferred. On older engines it's pleasingly easy, especially with engines that think in portals, since a mirror just becomes a portal that messes with the camera to look back into the room at you. (This, by the way, is something I love about the original Prey. They started out in the Build Engine, and as far as I can see all those sci-fi travel portals would have been naturally-occurring engine features put to clever use.)
    Doing it in CryEngine wouldn't be impossible, but it'd either be a picture-in-picture affair or would need specially-written cheats in all the different scene-wide or screen-wide effects, e.g. explaining to lights that they can usually shine through doors, but no, that door's actually a mirror, and also, here's a whole bunch of mirror-universe lights that follow the opposite rule... Basically it's a "nice to have" result with the price tag of a major engine overhaul.
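    The portal-style mirror trick boils down to reflecting the camera across the mirror plane and rendering the scene again from there. A minimal sketch of that reflection (illustrative only, not CryEngine code):

```cpp
// Reflect a point across a mirror plane with unit normal n, where the
// plane satisfies dot(n, x) = d. Reflecting the camera position (and its
// orientation vectors) like this yields the mirror camera.
struct V3 { double x, y, z; };

V3 reflect_point(V3 p, V3 n, double d) {
    double dist = p.x*n.x + p.y*n.y + p.z*n.z - d; // signed distance to plane
    return V3{ p.x - 2.0*dist*n.x, p.y - 2.0*dist*n.y, p.z - 2.0*dist*n.z };
}
```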

    "Odds are that by the end of it, Mr Parry will have skipped the gloves and gone straight to semi-automatic weaponry."
  • BParry_CIG

    Developer

    Posted:
    [hide]


    Hmmm... PiP has other uses as well: security cameras, MFDs, and "holographic windows", off the top of my head. If you assume a relatively flat mirror, PiP plus tracking the angle of incidence would be a dirty cheat, but mostly effective, though it would effectively double render demand while the mirror is visible. Animated textures fed from a camera would logically be the simple approach.

    Agreed that PiP is useful for other things too, and that if we had it we could theoretically use it for mirrors, but we'd then have to be very sparing with actually putting mirrors anywhere, since seeing one would activate a whole recursive render detour to fill it in.
    [hide]

    When the new netcode is out, can we have a window that says whether the GPU or CPU is bottlenecking, and what percentage of each is being used?

    Or do you just recommend GPU-Z for that?

    Or will it be too early for that, since the game is still in development?

    We have something like that internally, but I'm not sure if the release build includes the tracking markers that it relies on. External tools are probably more trustworthy anyway.
    [hide]

    Hello Ben,

    So the engine is creeping more and more towards low-level API support, eating away obsolete DX9-11 tech in the process.
    Any information regarding multi-GPU in this?
    Is it a desired/planned feature?
    The improved scaling over SLI/Crossfire, as well as the ability to mix and match GPUs (across brands!), would be a pretty big deal.

    Hi Zed! I heard you were dead, clearly not.
    Yes, multi-GPU is a desired feature. We've been trying to keep our D3D11 code as MGPU-friendly as possible so far, and it would be wasted work to just toss it out. We're most likely to focus development and testing on traditional setups though, so don't be surprised if putting two vastly different GPUs in a machine has some unintended results.
    [hide]

    I really must ask this.

    As we are currently using old-gen APIs... DirectX 11, the draw calls are significantly limited compared to the next-gen APIs.
    Is Star Citizen going to keep DirectX 11 as an option when the next-gen APIs are supported? Wouldn't this impact the potential number of objects in a given area? A bit like how a fleet is only as fast as its slowest ship... the benefits of next-gen APIs would be negated by continued support of DirectX 11.

    Is there any point in supporting DirectX 11 in the future, when Vulkan can service all the OS compatibility needs?

    Why, then, is CIG still developing DirectX 11 code? Isn't it time to fork away from it?

    Hi @Nianfur, I absolutely agree that in the long term we'll want to retire D3D11.
    In the short term, the fundamental reason that we're still developing under D3D11 is that the 12 and Vulkan transition is not ready yet, and the people who need to work on it have other tasks to balance against it. Once it comes, though, I don't expect we'll be held back by D3D11 performance limits, what's more likely is that the moment we can afford to do more, you'll start seeing more divergence between high and low graphics settings. My personal hope is that once Vulkan arrives, it services everything we need, and we won't need to maintain multiple APIs... but we'll see.

    "Odds are that by the end of it, Mr Parry will have skipped the gloves and gone straight to semi-automatic weaponry."
  • BParry_CIG

    Developer

    Posted:
    [hide]

    Hello Mr Parry,

    I was wondering if there are any thoughts you could share with us about the new planned launcher.
    It has been said before, I believe, that the current launcher/patcher is not doing so... uhm, optimally... with differential updates, resulting in very large patches, not only for you internally but also for each client, even if only a single txt file inside a pak file has been changed.
    This reminds me of how another MMO used to patch in the past. (Forgive me for the reference.)

    I was wondering if the engine that all of you are creating now would also be able to support stream patching in the future.
    I'm not sure if it would also be possible for stream patching to get the game running at an early stage; for example, the core skeleton of the engine and base UI needing 600 MB to 1 GB before you can start the game in a "suboptimal" state, with all textures, sounds, and models then downloaded by priority based on your location, and the rest downloaded after the high-demand objects.
    But I guess this is quite difficult to realize, as it also requires the engine to create invisible walls to stop you from going somewhere that has not finished downloading yet (which could cause clipping issues), and to create placeholders until each object has finished.

    And who knows what other features you can add to the launcher/patcher; 2-factor authentication is already included.

    TL;DR
    Is there perhaps anything you could share with us about the future planned launcher/patcher or what the idea/goal for it is?

    Hi @Jorseal, sorry to say that all the patcher stuff is done by enigmatic Texans, I think, so I know nothing of their plans.
    The idea is a promising one, since we do a very similar thing when streaming assets from disk to memory: things like physics data and the roughest textures and models are loaded immediately, followed by all the quality stuff.
    I've no idea what difficulties you'd run into when trying this in real life, though.
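    The "roughest assets first" streaming idea described above is essentially a priority queue. A toy sketch (all names hypothetical - this is not the engine's streaming system):

```cpp
#include <queue>
#include <string>
#include <vector>

// Coarse, gameplay-critical data (physics, lowest mips) is queued ahead
// of high-quality assets, so the world is usable before it is pretty.
struct StreamRequest {
    int priority;          // lower value = load sooner
    std::string asset;
    bool operator>(const StreamRequest& o) const { return priority > o.priority; }
};

// Drain all requests in priority order (min-heap via std::greater).
std::vector<std::string> drain_in_order(std::vector<StreamRequest> reqs) {
    std::priority_queue<StreamRequest, std::vector<StreamRequest>,
                        std::greater<StreamRequest>> q(reqs.begin(), reqs.end());
    std::vector<std::string> order;
    while (!q.empty()) { order.push_back(q.top().asset); q.pop(); }
    return order;
}
```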
    "Odds are that by the end of it, Mr Parry will have skipped the gloves and gone straight to semi-automatic weaponry."
  • BParry_CIG

    Developer

    Posted:
    [hide]

    So, according to Hannes in the current ATV, PiP is actually planned. Wouldn't that be a job for the German studio anyway? Considering how fundamental a change it is, and Frankfurt seems to be the go-to guys for such things.

    I'm not sure whose schedule it's on, but likely it would be Frankfurt, Wilmslow, or a combination of the two. The team here is well suited to do things like plumbing render targets, shuffling of the pipeline etc, but making sure that things like visibility, streaming etc properly understand the idea of multiple viewpoints is something that the guys over there are better at.
    Hannes has also done some cool test renders for a kind of "lightweight PiP", so characters can pop up with messages without the engine going "Oh no! Stream and render that character's entire ship!" and ruining your framerate for the sake of a two-second line.
    Nuclearshotgun:
    [hide]

    Hello there, I have a quick question about flora and fauna, and really just anything to do with the kinds of assets planet surfaces will be covered in. Will Star Citizen use middleware such as SpeedTree, or is Chris Roberts planning on implementing some sort of photogrammetry technology, such as that used in UE4 or Frostbite, to develop more photorealistic assets in-house, like the forests and snowy/icy landscapes of Battlefront 3?

    Hi @Nuclearshotgun. I know we're at least evaluating the viability of photogrammetry. Macbeth charts have been placed and photos have been taken, we'll have to wait and see what comes of it.
    "Odds are that by the end of it, Mr Parry will have skipped the gloves and gone straight to semi-automatic weaponry."
  • geoffbirch

    Posts: 9

    Posted:
    [hide]

    [hide]

    So, according to Hannes in the current ATV, PiP is actually planned. Wouldn't that be a job for the german studio, anyway? Considering how fundamental of a change it is, and Frankfurt seems to be the go-to guys for such things.

    I'm not sure whose schedule it's on, but likely it would be Frankfurt, Wilmslow, or a combination of the two. The team here is well suited to do things like plumbing render targets, shuffling of the pipeline etc, but making sure that things like visibility, streaming etc properly understand the idea of multiple viewpoints is something that the guys over there are better at.
    Erm... I sit behind you; how do you not know that I'm the one CURRENTLY working on PiP for Hannes?


  • geoffbirch

    Posts: 9

    Posted:
    [hide]

    Hi devs, I know you have a lot of work and you do it well. But PLEASE, look at issue report n° SC-23069.
    The problem is still there!! Nothing has been done since patch 2.3.1. I downloaded 2.5 today and I still cannot play, and I think I'm not the only one.
    I did not pay for a starter pack, ships, etc. just to watch other people's videos on YouTube ^^
    I love this project and I do not believe it is a scam!!
    So PLEASE AGAIN, this needs to be fixed...

    Best regards from France :) Paci

    I don't use the backer bug tracking system; do you have a link for the bug? We don't directly get the bugs from you guys; they get filtered through by QA, who put them through a clearing system, repro, and then triage... or something like that, hence why I'm not familiar with the system.

    If someone can get me a link then I might be able to conveniently drop it in someone's lap :P
  • geoffbirch

    Posts: 9

    Posted:
    [hide]

    [hide]

    [hide]

    issue report n° SC-23069

    I don't use the backer bug tracking system, do you have a link for the bug? We don't directly get the bugs from you guys; they get filtered through by QA who put them through a clearing system, repo and then triage...or something like that, hence the reason I'm not familiar with the system.

    If someone can get me a link then I might be able to conveniently drop it in someone's lap :P
    Here you go: SC-23069 - Game Crashes Graphics Drivers
    Yeah, it's for the render team, so I'll drop it in, erm... @BParry_CIG 's lap, I suppose :P. It's a driver crash, meaning the game is attempting to do something it can't, and the driver is crashing out because it can't accommodate the commands it has received. It doesn't seem to be tied to specific hardware or drivers, but I'd suggest keeping your graphics drivers up to date and removing any third-party graphics software you might be running alongside the game, to see if that fixes it. I'm sure we'll have a chat about it next week; if our QA were getting this issue we'd be all over it, so I'll see if they know of any hot-fixes and make sure we've got the hardware to test on (pretty sure we're covered in that regard, tbh).
  • BParry_CIG

    Developer

    Posted:
    [hide]

    [hide]

    [hide]

    So, according to Hannes in the current ATV, PiP is actually planned. Wouldn't that be a job for the german studio, anyway? Considering how fundamental of a change it is, and Frankfurt seems to be the go-to guys for such things.

    I'm not sure whose schedule it's on, but likely it would be Frankfurt, Wilmslow, or a combination of the two. The team here is well suited to do things like plumbing render targets, shuffling of the pipeline etc, but making sure that things like visibility, streaming etc properly understand the idea of multiple viewpoints is something that the guys over there are better at.
    Erm....I sit behind you, how do you not know that I'm the one CURRENTLY working on PiP for Hannes?


    Eh, you're noodling about with render-to-texture stuff; I thought it was some other game-wants-to-draw-on-a-screen feature. Now I understand why you were so surprised to find it had reserved ten slots!
    [hide]

    [hide]

    [hide]

    [hide]

    issue report n° SC-23069

    I don't use the backer bug tracking system, do you have a link for the bug? We don't directly get the bugs from you guys; they get filtered through by QA who put them through a clearing system, repo and then triage...or something like that, hence the reason I'm not familiar with the system.

    If someone can get me a link then I might be able to conveniently drop it in someone's lap :P
    Here you go: SC-23069 - Game Crashes Graphics Drivers
    Yeah, it's for the render team, so I'll drop it in, erm... @BParry_CIG 's lap, I suppose :P. It's a driver crash, meaning the game is attempting to do something it can't, and the driver is crashing out because it can't accommodate the commands it has received. It doesn't seem to be tied to specific hardware or drivers, but I'd suggest keeping your graphics drivers up to date and removing any third-party graphics software you might be running alongside the game, to see if that fixes it. I'm sure we'll have a chat about it next week; if our QA were getting this issue we'd be all over it, so I'll see if they know of any hot-fixes and make sure we've got the hardware to test on (pretty sure we're covered in that regard, tbh).
    Given that it's a 2.4 bug, I'd suggest waiting until 2.5. I remember a bunch of GPU crashes going round the studio around 2.4 time that we had trouble nailing down for ages, but I'm pretty sure we've not had any recently (or people just stopped telling me about them I guess). Especially if it's tiled lighting, since I turned all the numbering round the other way so errors make it stop instead of running forever.
    [hide]

    https://robertsspaceindustries.com/community/issue-council/star-citizen-alpha/SC-25652-_2_5_0__Black_Graphical_Error_-_Self-Land_Hangar
    I've also seen this when flying around Crusader when the view is at a particular angle. Is this tracked?

    Hi @HerrMatthias, I've not seen that one reach us, but that probably just means it's filtering through QA's queue, or if we're really lucky it's been assigned straight to the art team.
    What you're seeing there is roughly the shape of a lensflare, but the colour it's trying to blur is NaN (Not a Number). NaNs have the interesting property that any operation involving them also results in NaN, so even the softest blur becomes a solid brick of screen-killing doom, and any attempt to draw over it just becomes NaN as well.
    This is almost certainly because that object by the door is either brighter than the sun, or somehow has a spot of negative brightness. When it gets to us we'll run it through RenderDoc, find which pixel caused the problem, and go bother whoever made it that way.
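    For the curious, NaN's all-consuming behaviour is easy to demonstrate. Here's a minimal sketch (not our actual blur code) where a single NaN pixel poisons every blurred pixel whose kernel window touches it, and repeated passes spread it across the whole row:

    ```cpp
    #include <cassert>
    #include <cmath>
    #include <cstddef>
    #include <vector>

    // Minimal 3-tap box blur over a row of pixel intensities. Any kernel
    // window that overlaps a NaN produces NaN, because NaN propagates
    // through every arithmetic operation.
    std::vector<float> boxBlur3(const std::vector<float>& src) {
        std::vector<float> dst(src.size());
        for (std::size_t i = 0; i < src.size(); ++i) {
            float sum = 0.0f;
            int count = 0;
            for (long j = static_cast<long>(i) - 1; j <= static_cast<long>(i) + 1; ++j) {
                if (j < 0 || j >= static_cast<long>(src.size())) continue;
                sum += src[static_cast<std::size_t>(j)];
                ++count;
            }
            dst[i] = sum / static_cast<float>(count); // NaN in => NaN out
        }
        return dst;
    }
    ```

    One pass turns the NaN pixel's neighbours to NaN; enough passes turn the whole image into the solid block of doom described above.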

    "Odds are that by the end of it, Mr Parry will have skipped the gloves and gone straight to semi-automatic weaponry."
  • BParry_CIG

    Developer

    Posted:
    [hide]

    Given that it's a 2.4 bug, I'd suggest waiting until 2.5.

    But... 2.5 is already here...
    My mistake, I thought it was still in PTU.
    "Odds are that by the end of it, Mr Parry will have skipped the gloves and gone straight to semi-automatic weaponry."
  • BParry_CIG

    Developer

    Posted:
    [hide]

    Thanks @BParry_CIG for the detailed answer.

    Found another broken lens flare effect when an explosion happens.
    Is it the yellow colour from the explosion that adds to the lens flare and produces the yellow overglow effect?
    I've seen more of these with different colours too.
    [screenshot]

    Yeah, that's probably just a nutty brightness on something too. Until the new optics system went in, it wasn't immediately obvious if something was a hundred times too bright, because it still looked roughly OK. Now that we're doing bloom and flare, those over-bright objects stand out straight away.
    [hide]

    Hello Ben, longer question to elaborate:

    I've seen the procedural tech in quite a few examples now (other games + the demo in Cologne) and I've stumbled across something: hard edges. If I understand it correctly, we will spawn "regions" of places that are, for example, "ocean" or "woodland" or "desert" or even "city" etc. Watching the procedural tech in general, I've seen a lot of hard edges where you can clearly see the edge of a desert and then the forest beginning without much of a fade-in/fade-out. In the real world, the region between the forest and the ocean would be: dense forest / sparse forest / a couple of trees / land without trees, like grass / beginning of the beach / beach / ocean.

    Another thing I see in the procedural tech is that the spawning regions seem to be similar in size. So while there may be something like [big] --> mountains // [medium] --> single mountain // [small] --> individual rocks, the sizes seem pretty fixed, which causes a great deal of the "repetitive feeling".

    There might be even more to cover to get the "real" feeling, but I'll try to keep it short and just focus on those two (fading regions and dynamic region sizes)

    Is there a "plan" on how to work those issues out? And will be incorporate advanced seeding logic like: "Hey, this mountain-line is hindering clouds to pass through, so behind it should be a desert" (just a rough example) or "Hey, the climatic properties only allow for specific seeds here"

    Thanks! :)

    As I understand it, the new planet version will have pretty hard edges between areas, but for a soft gradient they might place a layer of sparse trees between the plains and the forest, for instance. I don't think there's any intention to build in advanced seeding logic like you describe, but because we're going down the authored-procedural path, those kinds of broad-stroke decisions could be made by hand or by tool before the planet is released, in the same way as the decisions of where to put seas, major mountain ranges etc. are made up front.
    I agree that there's a risk of same-size tiles making things look like everything's made of same-size tiles, but hopefully Marco's got a plan in mind for how to break up that pattern.
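    As a hypothetical illustration (names and numbers invented here, not CIG's actual terrain code), that sparse-trees transition band could be as simple as easing vegetation density across the biome boundary instead of switching it on and off:

    ```cpp
    #include <cassert>

    // Classic smoothstep easing, clamped to [0, 1].
    float smoothstep01(float edge0, float edge1, float x) {
        float t = (x - edge0) / (edge1 - edge0);
        t = t < 0.0f ? 0.0f : (t > 1.0f ? 1.0f : t);
        return t * t * (3.0f - 2.0f * t);
    }

    // Tree density in [0, 1] as a function of signed distance from the
    // forest/plains boundary (negative = plains side). transitionWidth
    // is the width of the sparse-forest band, in metres.
    float treeDensity(float distanceIntoForest, float transitionWidth) {
        return smoothstep01(-0.5f * transitionWidth,
                             0.5f * transitionWidth,
                             distanceIntoForest);
    }
    ```

    Deep in the plains the density is 0, deep in the forest it is 1, and across the band it ramps smoothly, giving exactly the dense/sparse/scattered gradient described above.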

    "Odds are that by the end of it, Mr Parry will have skipped the gloves and gone straight to semi-automatic weaponry."
  • BParry_CIG

    Developer

    Posted:
    [hide]

    As I understand it, the new planet version will have pretty hard edges between areas, but for a soft gradient they might place a layer of sparse trees between the plains and the forest, for instance. I don't think there's any intention to build in advanced seeding logic like you describe, but because we're going down the authored-procedural path, those kinds of broad-stroke decisions could be made by hand or by tool before the planet is released, in the same way as the decisions of where to put seas, major mountain ranges etc. are made up front.
    I agree that there's a risk of same-size tiles making things look like everything's made of same-size tiles, but hopefully Marco's got a plan in mind for how to break up that pattern.

    Thank you very much for the reply! I have a very strong gut feeling that those crucial elements should be in some sort of "plan" - even if not initially feasible, they might make up a core element of the procedural tech's quality. Maybe it's a far-out call, but better safe than sorry - could you forward it to Marco? (I feel dirty asking that - shame... shame... shame *whips himself*)
    Don't worry, it's been discussed at least once already, and he is confident in his solutions.
    "Odds are that by the end of it, Mr Parry will have skipped the gloves and gone straight to semi-automatic weaponry."
  • BParry_CIG

    Developer

    Posted:
    [hide]

    The new bloom/flare tech is awesome, but is it normal that in some cases, the reflection from the light source can be brighter than the light source itself?

    It happens a lot with the sun:
    [screenshot]

    But also with some light sources:
    [screenshot]

    The reflection is so much brighter that it actually looks like the light source here. To give you some perspective, here it is from another angle:
    [screenshot]

    Hi Frank,
    Yeah, the issue with the sun is something we know about - at the moment the sun source is an arbitrary brightness, because we know just how over the top it'll look at true brightness, but then you get reflections that are just as bright. A slightly arbitrary fix went into the main stream last week, so you'll probably see at least some change in 2.6.
    The case with the local light source is probably a result of the firefly-reduction stage, where we try to prevent aliasing by discarding isolated bright pixels - it's probably under-blooming an over-bright light source. Or, if that's a rectangle light, it may just be outputting too much light; the light falloff curve on rectangle lights is offensively wrong even in the base SDK, and we've been trying to find the time (and a plan) to make it right.
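    For anyone wondering what a firefly-reduction stage does: a common technique (a generic sketch, not necessarily our exact implementation) is to weight each sample in the bloom downsample by the inverse of its luminance, so a single super-bright pixel can't dominate the average:

    ```cpp
    #include <cassert>

    struct Rgb { float r, g, b; };

    // Rec. 709 luminance.
    float luma(const Rgb& c) {
        return 0.2126f * c.r + 0.7152f * c.g + 0.0722f * c.b;
    }

    // Luminance-weighted average: each sample gets weight 1/(1+luma),
    // so isolated fireflies are heavily attenuated instead of flickering
    // as bright single-pixel blobs after the blur.
    Rgb fireflyFilteredAverage(const Rgb* samples, int n) {
        Rgb sum{0.0f, 0.0f, 0.0f};
        float wSum = 0.0f;
        for (int i = 0; i < n; ++i) {
            float w = 1.0f / (1.0f + luma(samples[i]));
            sum.r += samples[i].r * w;
            sum.g += samples[i].g * w;
            sum.b += samples[i].b * w;
            wSum += w;
        }
        return {sum.r / wSum, sum.g / wSum, sum.b / wSum};
    }
    ```

    The flip side is exactly the artefact described: a genuinely bright light source also gets its contribution squashed, i.e. it under-blooms.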

    "Odds are that by the end of it, Mr Parry will have skipped the gloves and gone straight to semi-automatic weaponry."
  • geoffbirch

    Posts: 9

    Posted:
    [hide]

    Hi everyone,
    I have a suggestion regarding the launcher. My girlfriend and I play a lot of Star Citizen. My question is:

    Could we get an alternative launcher that can be put on a Linux server for local redistribution of updates? The torrent doesn't seem to detect local clients that well, and it would also help redistribute the update so we don't download it twice.

    My idea would be to set up an SC launcher on my Raspberry Pi that runs when I'm away, downloads the updates and redistributes them to others, stops seeding when I'm back from work, and from that point only shares the updates it made locally until I leave again the next day. I'm already doing this to share Manjaro ISOs and selected package updates. I'm sure lots of people would participate, as it means faster updates for everyone.

    I hope you find my suggestion pertinent and wish you guys a good day !

    ps: after thinking about it for a while, I found that if I got the right torrent file I could set it up on my own.

    This is a good idea, I'll see if I can send it out to the DevOps guys in Texas; I'm sure they're up for the challenge. I mean, once we get a Linux client for the engine sorted, this will be required too, right? They're probably already working on it tbh, it might even exist in dev-only form atm, but you never know till you ask. Leave it with me.