On Sun, 12 Jan 2014 01:30:34 +1000 David Seikel <[email protected]> said:
> On Sun, 12 Jan 2014 00:14:43 +0900 Christophe Sadoine
> <[email protected]> wrote:
>
> > On 11 January 2014 23:35, David Seikel <[email protected]> wrote:
> > > BTW, in case it's not so obvious, when I say "big 3D game", I mean
> > > one where the 3D world the game is set in is big, not a 3D game
> > > made by a big company. Just in case that's confusing anyone. 3D
> > > Tetris is not a big 3D game. Skyrim is. I'm sure you are perfectly
> > > correct when it comes to things like 3D Tetris, but I'm talking
> > > about a much bigger scale.
> > >
> > > On Sat, 11 Jan 2014 21:53:31 +0900 Carsten Haitzler (The Rasterman)
> > > <[email protected]> wrote:
> > >
> > >> On Sat, 11 Jan 2014 19:07:06 +1000 David Seikel <[email protected]>
> > >> said:
> > >>
> > >> this is way too long. i just skimmed.
> > >
> > > You really should read it all. B-)
> > >
> > >> i'm going to cut it down to the gl supporting bits of efl you
> > >> need/want for ALL cases of gl rendering with efl - virtual worlds
> > >> or not. it's not relevant.
> > >
> > > It's entirely relevant, coz you are coming from the wrong
> > > perspective. 3D performance is very important in big 3D games, coz
> > > this sort of app has HUGE 3D performance needs. Do you have any
> > > idea how many triangles are in an entire world? Running the gears
> > > demo is an entirely different kettle of fish compared to running an
> > > entire 3D world, which has a little more than three gears in it.
> > > Scaling is the issue; I want a solution that scales to "shit,
> > > there's an entire world in there", but this sort of thing seems to
> > > be beyond your grasp.
> > >
> > >> 1. use the preferred engine set. end of story. it isn't broken. it
> > >> works. i have a 3d modelling app from christophe proving that it
> > >> works. it works now. today. check it out. go google for "slime" on
> > >> github. you have messed up somehow in an unknown way to me - but
> > >> it WORKS. look at slime. (and nice work christophe!)
> > >
> > > See Christophe's reply; it seems he is having trouble too. He says
> > > himself slime is not suitable for virtual worlds - "far from being
> > > usable", his words - in the very example you say proves it just
> > > works.
> >
> > I meant my application is "far from usable" because it is still in
> > development and hasn't reached a usable state.
> > It's not due to EFL or Elementary.
>
> Fair enough.
>
> > My only trouble is this: https://phab.enlightenment.org/T518
>
> I noticed the garbage-on-resize issue as well.
>
> > I am also concerned I might not be able to use desktop opengl in
> > the future, but for now I don't need it.
>
> As Raster said, let's hope someone adds it. I won't; my plate is
> full, and getting fuller if I have to port a 3D graphics engine to
> EFL as Raster seems to be saying.

i won't be adding it. see my other mail to christophe.

> > With efl 1.8.4 and elementary 1.8.3, on my machine and with my
> > application, if the engine is set to software in elementary_config
> > and I use elm_config_preferred_engine_set("opengl_x11") then it is
> > working; I have hardware acceleration.
> > If it's not working on your machine, maybe you should open a bug?
>
> This is the part that was not working, though it worked in EFL 1.7.
> Perhaps it works after I installed osmesa, but that just sounds silly
> to me.

it has nothing to do with osmesa. engine choice is made long before that. my
guess is some other part of your code is defeating it. try slime - it has it
there. it works. if i comment it out it nicely uses the default engine
(whatever i have it configured to) or uses the ELM_ENGINE env var as
requested.

> > Like cedric said (I think), it would be nice if it was something
> > like elm_config_hardware_accelerated_engine_request() so it is also
> > more portable than asking for opengl_x11.
>
> For now I'm just gonna ignore that part and force opengl_x11 for
> testing purposes. I'm sure Cedric will fix this up eventually one way
> or another; he said he would.
> Failing that, by the time I need this to start being real and not
> just a playground, I'll add my own probing for OS and engine. I don't
> think it will take too many lines of code, but the point is I
> shouldn't have to. If you are using Evas_GL, you should get the best
> GL support the system can provide automatically.

you can't, as it is not possible to predict before engine setup time what you
will do with it later. using gl comes with a major memory footprint cost. i
wish it didn't. it shouldn't, but it does. you pay between 5-50mb (ballpark -
depending on driver/platform) for using gl at all. that's for the context,
cmd buffers, shaders and more. there is also the cost of very slow resizes
(mostly due to x and gl interaction and how buffers are handled), and other
"real life" problems that are not just prevalent, but the default status quo
when you have "real gl". if "real gl" came with minimal extra cost vs
software i'd damned well make it a default choice to use, if available, but
that simply is not the case.

-- 
------------- Codito, ergo sum - "I code, therefore I am" --------------
The Rasterman (Carsten Haitzler)    [email protected]

_______________________________________________
enlightenment-devel mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/enlightenment-devel
