Your point about Mega-City, the future of ECS/DOTS, and HDRP's 'poly' count limit is relevant, and so is the SSGI. I guess all the issues we thought we were having, like terrain not having any detail shaders for grass/foliage, or needing to buy third-party assets to have any chance at baking a complex lightmap in under 12 hours, don't exist, or we are just too amateur to understand that they aren't problems at all. I suppose anybody who dares to come into the Unity Forum to casually discuss issues can expect a near-instantaneous snide, condescending reply.

It's probably using geometry images, as per the Brian Karis blog, which allow for virtualization in the form of sparse textures. So there is no (traditional vertex-based) mesh, it's images, and the Eurogamer interview tells you it's SOFTWARE RASTERIZATION done in compute. Unity won't have that for a while, because it means changing the whole pipeline. It also bypasses the small-triangle-per-pixel inefficiency that plagues hardware rasterizers.

There is a lot we can infer from that about how it's done. They probably unified the whole structure: the voxel octree is probably used to serve the virtual paging, occlusion culling, GI, etc., i.e. each leaf contains a distance field and a geometry image. Meshes are automatically LODed by subsampling the geometry image, and since it uses virtual paging, only the right size of image is served at the right distance.

For the rasterization, they probably sort data into screen tiles front to back, which means they can use the voxel octree to serve a list of potential geometry-image patches; then, reading the geometry image, they can discard back-facing and frustum-culled samples per pixel, and keep a cache that tracks tile coverage to stop patch processing once a tile is fully covered on screen. Unity would need some tooling to catch up, especially non-RTX things like updating signed distance fields.
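The "LOD by subsampling the geometry image" idea above can be sketched in a few lines. This is a hedged illustration of the concept, not Epic's actual implementation: a geometry image stores vertex positions as texels of a 2D array, so taking every second texel per level yields a coarser mesh, much like a mip chain. The surface function and resolution here are made up.

```python
import numpy as np

def make_geometry_image(size=8):
    """Toy geometry image: a (size x size) grid of 3D positions
    sampled from a simple dome-shaped height field."""
    u = np.linspace(0.0, 1.0, size)
    v = np.linspace(0.0, 1.0, size)
    uu, vv = np.meshgrid(u, v)
    zz = np.sin(uu * np.pi) * np.sin(vv * np.pi)  # dome height field
    return np.stack([uu, vv, zz], axis=-1)        # shape (size, size, 3)

def subsample_lod(geom_img, levels=1):
    """Coarser LOD = take every 2nd texel per level, mip-style.
    A real system would filter rather than point-sample, but the
    idea is the same: geometry resolution falls with distance."""
    for _ in range(levels):
        geom_img = geom_img[::2, ::2]
    return geom_img

lod0 = make_geometry_image(8)   # 8x8 grid = 64 vertices
lod1 = subsample_lod(lod0, 1)   # 4x4 grid = 16 vertices
lod2 = subsample_lod(lod0, 2)   # 2x2 grid = 4 vertices
print(lod0.shape, lod1.shape, lod2.shape)
```

With virtual paging, only the mip level matching the on-screen size would ever be streamed in, which is what makes the scheme cheap at a distance.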
According to the Eurogamer interview, that is. The Unreal GI solution is a hybrid: voxel-based tracing for the large scale, distance fields for the medium scale, and screen space for details, basically a combination of all the techniques they have used so far (VXGI, distance field ambient occlusion and shadows, SSGI).

Are you even real developers capable of an argument-based discussion? Sorry guys, some posts are just PR advertisement without any arguments.

While I like the Unreal 5 demo, it's nothing new. It was already shown by the Uncharted series (the Naughty Dog engine!) and Havok Physics (which you can also use in Unity). So they pushed the graphics because the PS5 can, with realtime raytracing cores and faster streaming thanks to the SSD; so what? Probably Unity got access from Sony to the same PS5 API tech that Epic already has. Nevertheless, Unreal is tied to the same limits of user hardware and the current state of the art as any other game engine in the solar system, so please do not overreact. Fast reaction times and high-quality bounced GI will require an expensive hardware setup.

Obviously Epic has the money to pump everything into expensive scanned assets. It's because they charge you 5% of your earnings, you know, and they are lucky to be milking Fortnite. But features cannot be developed from air and love; they need money. Unity raised the subscription by $5, probably because they need money to invest for a demanding user base requesting high-quality features, but at the same time everybody is ranting about the higher price. Are Unity devs willing to pay so that Unity can invest more? Probably most of the user base wants as much as possible for free, yet somebody has to pay for the costs of free high-quality assets and overworked graphics and lighting developers.

So yes, no fancy temple assets, because that was not the demo's theme. Also, the Megascans from "Book of the Dead" are free to use. But could you make them, or pay for them, yourselves?
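The hybrid GI described above can be sketched as a per-ray dispatch: use the cheapest technique that works at each scale. This is a minimal, hypothetical illustration of the idea only; the function name and distance thresholds are invented, not Epic's code.

```python
def pick_gi_technique(hit_distance, ss_max=2.0, sdf_max=40.0):
    """Hypothetical dispatch for a hybrid GI tracer, per the
    description above: screen-space rays for fine nearby detail,
    signed-distance-field rays at medium range, and voxel-based
    tracing for the large scale. Thresholds (in meters) are
    made-up assumptions for illustration."""
    if hit_distance <= ss_max:
        return "screen_space"       # detail: reuse on-screen depth/color
    if hit_distance <= sdf_max:
        return "distance_field"     # medium: sphere-trace mesh SDFs
    return "voxel_trace"            # large scale: coarse voxel/cone trace

for d in (0.5, 10.0, 200.0):
    print(d, pick_gi_technique(d))
```

The point of the cascade is that each technique's weakness (screen-space misses off-screen geometry, SDFs lose fine detail, voxels are coarse) is covered by another at the scale where it matters least.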
Unity already proved with "Book of the Dead" that they can handle Megascans, so with the same Megascans assets you will probably achieve an equal look. The quality heavily depends on the assets. Unity also already has heavy instancing with the Hybrid Renderer; the tech is proven with the "Mega City" demo, and I don't see why Unity would not be able to render the exact same amount of polygons. And at least they offered Enlighten for free for years when Unreal did not!

But you're right that Unity is slow and late on that point. It's true HDRP lacks a modern realtime GI at the moment. At least there is a new branch on GitHub with a screen-space GI approach for HDRP. Also, Unity has access to the same hardware realtime raytracing tech as Epic has, so why not. (At the same time, the same hardware limits also apply to Unity.) But nevertheless: please calm down and wait until you can test the actual release in 2021.