Aces High Bulletin Board
General Forums => The O' Club => Topic started by: Ramesis on August 23, 2018, 02:06:13 PM
-
I just read an article on ray tracing, and it is apparently the coming
thing in graphics cards, although the technique has been around a while.
The latest is the Nvidia RTX 2080 series of graphics cards :aok
I'm not quite sure about the technology, but it is apparently another
way of rendering light and shadows for even more realistic graphics.
I'm sure it will be quite expensive at first, but the future looks bright
for gaming.
:salute
-
Not really. It is a proprietary feature of NVidia. Most proprietary features eventually die. They are usually good for some short-term marketing gain, and that is about it.
-
I have to disagree with you, Skuzzy. Ray tracing has been used for decades to render high-end models and scenes. This modern hardware implementation of ray tracing is akin to the addition of hardware transform and lighting back in the '90s/early '00s. It will become standard and be used in all games in time (if the promise of playable frame rates is kept, which is what all this hubbub is about).
-
I am very familiar with ray tracing and the programs that have been around for over 20 years.
This is an NVidia-only feature. It will not become a standard unless they open it up and allow other video card manufacturers to support it. Then it has a shot.
Right now it is a nice marketing tool, and there will be some games coming along to support it, but in the long run it will meet the same fate as every other proprietary solution anyone has come up with.
No one has proven it will be usable in games yet. Right now it is more hype than substance.
-
I don't understand, Skuzzy. If this method has been used for so long, and I know they can't do it in real time yet, if Nvidia could get it to run in real time, how would it just be a passing fad? Isn't it like the holy grail of rendering?
-
Ahhh, memories of POV-Ray.
LOL, multi-day final renders.... woot.
-
I don't understand, Skuzzy. If this method has been used for so long, and I know they can't do it in real time yet, if Nvidia could get it to run in real time, how would it just be a passing fad? Isn't it like the holy grail of rendering?
It does not matter how good it is. Not many companies will be willing to support multiple rendering methods in their product. Right now, they hire other companies to do console ports, or vice-versa for a PC port because they do not want to do it themselves. Supporting disparate rendering methods takes a lot of time and money.
As long as it is kept proprietary, it will eventually fail or find limited support in a niche market, no matter how good it is.
-
I'd seriously question anyone claiming to do true ray tracing in real time. It takes huge amounts of math processing. There are lots of "ray tracing" systems that use a lot of shortcuts, i.e. not true full-blown ray tracing, so I'd expect this to be one of those.
Ray tracing has been available to consumers for nearly 30 years; my first dabble with it was Imagine 3D on the Amiga: https://en.wikipedia.org/wiki/Imagine_(3D_modeling_software)
-
Well, if they are only using one ray, or only one bounce.....
It really depends on the configuration. Unlike a CPU, which has a few floating-point units, video cards can do hundreds of calculations in parallel. Get enough of them going and it becomes possible to do real-time ray tracing.
There are shortcuts available. Static objects would only need to be ray-traced once for a scene, until the lighting changed, though it would take a lot of memory to keep those pre-calculated pixels.
It will not be long before results pour in.
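To make the one-ray, one-bounce budget concrete, here is a minimal sketch in Python (purely illustrative; the single-sphere scene, the light position, and all the names are invented for the example). It casts one primary ray per pixel and one shadow ray per hit, which is the cut-down style of ray tracing being described:

```python
# Minimal one-ray, one-bounce ray tracer (illustrative sketch only;
# scene, light, and names are invented). Each pixel gets a single
# primary ray; each hit gets a single shadow ray toward the light.
import math

SPHERES = [((0.0, 0.0, -3.0), 1.0)]   # one sphere: (center, radius)
LIGHT = (5.0, 5.0, 0.0)               # point light position

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def norm(a):
    length = math.sqrt(dot(a, a))
    return tuple(x / length for x in a)

def hit_sphere(origin, direction, center, radius):
    """Distance t along a unit-direction ray to the sphere, or None."""
    oc = sub(origin, center)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

def trace(origin, direction):
    """One primary ray, one shadow bounce; returns a grey level 0..1."""
    for center, radius in SPHERES:
        t = hit_sphere(origin, direction, center, radius)
        if t is None:
            continue
        point = tuple(o + t * d for o, d in zip(origin, direction))
        normal = norm(sub(point, center))
        to_light = norm(sub(LIGHT, point))
        # The single "bounce": one shadow ray, nudged off the surface.
        shadow_org = tuple(p + 1e-4 * n for p, n in zip(point, normal))
        for c2, r2 in SPHERES:
            if hit_sphere(shadow_org, to_light, c2, r2) is not None:
                return 0.0                       # light is occluded
        return max(0.0, dot(normal, to_light))   # Lambertian shading
    return 0.1                                   # background

# One ray per pixel over a tiny 16x16 "image", printed as ASCII.
for y in range(16):
    row = ""
    for x in range(16):
        d = norm(((x - 7.5) / 8.0, (7.5 - y) / 8.0, -1.0))
        row += " .:-=+*#%"[min(8, int(trace((0.0, 0.0, 0.0), d) * 8))]
    print(row)
```

A GPU implementation would run thousands of those trace() calls in parallel, one per pixel, which is exactly the point above about video cards doing hundreds of calculations at once.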
-
It does not matter how good it is. Not many companies will be willing to support multiple rendering methods in their product. Right now, they hire other companies to do console ports, or vice-versa for a PC port because they do not want to do it themselves. Supporting disparate rendering methods takes a lot of time and money.
As long as it is kept proprietary, it will eventually fail or find limited support in a niche market, no matter how good it is.
Unfortunately, NVidia is very good at getting the industry to accept its proprietary solutions and then dominating the market.
- In modern industry, virtually all GPU computing is done using proprietary CUDA, even though OpenCL has existed for a long time and works very well.
- The shiny new AI and deep learning field is fully controlled by NVidia through their cuDNN libraries and CUDA-based frameworks; only a limited number of DL frameworks support AMD GPUs, and they are not mainstream by any means.
So you should be very careful with the assumption that NVidia can't start dominating a specific area...
-
I did not see where anyone said "can't". I believe he was saying unlikely.
Of course most anything is possible.
-
Unfortunately, NVidia is very good at getting the industry to accept its proprietary solutions and then dominating the market.
- In modern industry, virtually all GPU computing is done using proprietary CUDA, even though OpenCL has existed for a long time and works very well.
- The shiny new AI and deep learning field is fully controlled by NVidia through their cuDNN libraries and CUDA-based frameworks; only a limited number of DL frameworks support AMD GPUs, and they are not mainstream by any means.
So you should be very careful with the assumption that NVidia can't start dominating a specific area...
Never said they could not dominate any specific area. Right now, this is more hype than substance. Preliminary assessments by a couple of game companies have ranged from not so good to so-so, so far. It is still too early to make the call.
That does not mean it will not find support in non-real-time render farms. It probably will, if it is faster than the current render technology and produces results as good as, if not better than, what they are using today.
By the way, you do know why CUDA gets supported? It is because NVidia cards are not very good at OpenCL compared to AMD. Most applications support both and run whatever is most efficient for the application.
It is far more complex and costly to support multiple render paths in a game.
Time will tell.
-
I watched a couple of clips (from Tomb Raider and Battle-something) rendered with the current ray-tracing algorithm, and they were quite impressive...
:x
-
Skuzzy is correct IMO.
Time will tell how well the new Turing/RTX cards perform outside of the ray-tracing hype. Early vids of Tomb Raider at 1080p running at sub-40 fps were out there "scaring" the internet into believing that the new cards will be poor performers, but IMO that's bunk, and it's far too early to tell what their actual performance will be. It's accurate that only a few companies have released anything regarding RT performance, and it's a very short list of current games and apps that will even support it in the near future. Typical nVidia IMO. Same deal with the deep-learning super-sampling (DLSS): a very short list of supported games.
I think the performance increase the PC saw going from Maxwell to Pascal will be larger than that of Pascal to Turing, but I'm willing to be pleasantly surprised and proven wrong on that. I still have a couple of EVGA 2080 Tis on pre-order; I'll be upgrading all our systems anyway, unless the previews show them to be shockingly poor performers for the $, which I doubt. Ray tracing all aside, I believe most PC gamers are interested in the RTX cards' overall performance across a wide variety of game types, including games with no current or planned RT support, far more than in any "magic new tech" like ray tracing; at least I am. VR with units like the Odyssey (much higher power required than the Rift/Vive) and the newer Vive needs a lot of single-GPU horsepower right now. I picked up a 4K 144 Hz Asus LCD, and it too is bottlenecked by current CPUs/GPUs, and the upcoming 65" 4K 120 Hz G-Sync nVidia big screens will require even more power.
One other thing of interest is that nVidia claims to be focusing on SLI again and working with developers for more support in this area. As a long-time SLI user (going back to the old 12-meg card from Creative that I bought in SLI for F15E Strike Eagle), right up until the 1080 Ti, I wish this to be true, although I have strong doubts about it ever actually happening, as SLI has essentially been abandoned in my experience, with 5 of the 6 games I play most frequently on PC not having a decent SLI profile thanks to nVidia's (and others') neglect of that issue. Again, hopefully I (we) will be surprised in this area, but it'll be a big surprise.
-
One other thing of interest is that nVidia claims to be focusing on SLI again and working with developers for more support in this area. As a long-time SLI user (going back to the old 12-meg card from Creative that I bought in SLI for F15E Strike Eagle), right up until the 1080 Ti, I wish this to be true, although I have strong doubts about it ever actually happening, as SLI has essentially been abandoned in my experience, with 5 of the 6 games I play most frequently on PC not having a decent SLI profile thanks to nVidia's (and others') neglect of that issue. Again, hopefully I (we) will be surprised in this area, but it'll be a big surprise.
There is recent news/speculation that the only SLI-capable cards will be the 2080 and above; no more SLI with the 2070 or lower.
-
I watched a couple of clips (from Tomb Raider and Battle-something) rendered with the current ray-tracing algorithm, and they were quite impressive...
:x
Nvidia paid Square Enix a lot of money to be in Nvidia's partnership program, which is why the new Tomb Raider supports it.
-
Nvidia paid Square Enix a lot of money to be in Nvidia's partnership program, which is why the new Tomb Raider supports it.
I don't have a dog in this hunt.
I only brought ray tracing up because it is akin to how CT scanners create images of the inside of the body. I spent about 34 years as a field engineer working with CT and MRI.
CT uses a form of ray tracing called projections: an X-ray tube rotates around the body, and because the X-rays penetrate the body in a given number of projections from different angles or positions, the internal structures of the body can be reconstructed through a process known as back projection. This is essentially how ray tracing works to create an image (as I understand it).
The math involved is quite extensive, but CT has been doing it since the '70s.
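For the curious, here is a toy sketch of that back-projection idea in Python with NumPy/SciPy (purely illustrative, not clinical CT; the phantom, the angles, and all the names are invented, and real scanners use filtered back projection). Each 1-D projection is smeared back across the image at the angle it was taken, and the smears sum into a reconstruction:

```python
# Toy unfiltered back projection (illustrative only; real CT filters
# each projection first and models the geometry far more carefully).
import numpy as np
from scipy.ndimage import rotate

def project(image, angle_deg):
    """Simulate one projection: rotate the slice, sum along columns."""
    return rotate(image, angle_deg, reshape=False, order=1).sum(axis=0)

def back_project(projections, angles_deg, size):
    """Smear each 1-D projection back across the image at its angle."""
    recon = np.zeros((size, size))
    for proj, angle in zip(projections, angles_deg):
        smear = np.tile(proj, (size, 1))      # spread the profile out
        recon += rotate(smear, -angle, reshape=False, order=1)
    return recon / len(angles_deg)

size = 64
# Invented phantom: a dim disc with a bright square inside it.
yy, xx = np.mgrid[:size, :size] - size / 2
phantom = (xx**2 + yy**2 < (size / 2.5)**2) * 0.5
phantom[28:36, 28:36] = 1.0

angles = np.linspace(0.0, 180.0, 60, endpoint=False)
sinogram = [project(phantom, a) for a in angles]
recon = back_project(sinogram, angles, size)
# recon is now a blurred version of the phantom; high-pass filtering
# each projection before smearing (filtered back projection) is what
# sharpens it into a diagnostic image.
```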
:aok
-
See Rule #4
-
I don't have a dog in this hunt.
I only brought ray tracing up because it is akin to how CT scanners create images of the inside of the body. I spent about 34 years as a field engineer working with CT and MRI.
CT uses a form of ray tracing called projections: an X-ray tube rotates around the body, and because the X-rays penetrate the body in a given number of projections from different angles or positions, the internal structures of the body can be reconstructed through a process known as back projection. This is essentially how ray tracing works to create an image (as I understand it).
The math involved is quite extensive, but CT has been doing it since the '70s.
:aok
I used to work in the X-ray industry, working with Astrophysics etc. I won't get on a plane again.
-
I used to work in the X-ray industry, working with Astrophysics etc. I won't get on a plane again.
That explains your glowing personality.
-
Here's Zack hard at work:
(https://www.highlandmirror.com/wp-content/uploads/2018/06/X-ray-Inspection-Systems-in-Food-Industry-Market.jpg)