LiveCode, 3D, Unity, and all that
Posted: Mon Aug 29, 2011 3:53 pm
Hi,
I would like to develop an app that is basically 2D for its GUI and for 90% of the content. However, on some of the cards I also need to render and display 3D content, with lighting, texture mapping, and so on. The app is to be deployed on iPad.
I've read enough docs and websites on this that it has done my head in, and I remain confused as to the best way forward. I think I have several options:
1. Continue to use LiveCode, and write externals to handle OpenGL.
2. Use a third-party 3D package (Unity looks like the job) and work out how to integrate Unity with LiveCode.
3. Abandon LiveCode, switch to Unity, do the 2D GUI in Unity, and incorporate 3D where required.
The problem with (1) is that externals would have to be written to handle the OpenGL calls. On top of that, quite a lot of externals would probably be needed, and all of this sits on top of the learning curve of OpenGL itself.
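To make the scale of that concrete, here is a purely hypothetical sketch of what the LiveCode side of option (1) might look like. None of these gl* commands exist; each one would have to be written in C against the externals SDK, and this only covers a static lit, textured model with no camera controls or animation:

local sModelPath

on preOpenCard
   put specialFolderPath("engine") & "/models/engine.obj" into sModelPath
   -- every command below is imaginary and would need its own external
   glCreateContext the rect of graphic "viewport3D"   -- "viewport3D" is a placeholder control
   glSetLight 1, "position", "0,10,5"
   glSetLight 1, "diffuse", "255,255,255"
   glLoadTexture "metal", specialFolderPath("engine") & "/textures/metal.png"
   glLoadModel sModelPath
   glSetCamera "0,2,-8", "0,0,0"   -- eye position, look-at point
   glRender
end preOpenCard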
(2) is the ideal solution, as it makes the building and rendering so much easier, but I would need a method in LiveCode that does this:
"load the 3D model from an external file and use Unity (or something) to fly around it"
(3) would obviously work, but it seems a shame to have to abandon all the advantages that LiveCode has to offer and switch completely to a different package.
There have been a few postings (and only a few) in these forums over the years, including the suggestion of doing it all in revBrowser. That still leaves the problem of needing a plugin like the Unity web player, which may well violate Apple's restrictions on an app loading external executable code.
Ideally I would like to do the 3D work in Unity (it has a good visual drag-and-drop workflow, and seems to do for 3D what LiveCode does for 2D) and then have an Apple-compliant way of rendering into a LiveCode object. I am happy for the 3D view to occupy the entire screen of the iPad (any buttons etc. can be done in Unity), with the whole screen perhaps being treated by LiveCode as a card?
Or is a browser method the way forward? Unity has a web player that can be loaded and driven with JavaScript, but I'm not sure Apple would allow an app to call the Unity player, nor am I sure the LiveCode browser could deal with it, as it seems happy to point at URLs but not to load plugins.
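For what it's worth, the mechanics of the browser route on the LiveCode side would look roughly like this. This is from memory of the revBrowser API, so treat the property names as assumptions and check the dictionary; note also that revBrowser is a desktop external, so on iOS you would need the native browser control instead, and the plugin question is exactly the sticking point:

local sBrowserId

on preOpenCard
   -- point an embedded browser at a page hosting the Unity web player
   put revBrowserOpen(the windowId of this stack, "http://www.example.com/myScene.html") into sBrowserId
   revBrowserSet sBrowserId, "rect", the rect of graphic "browserArea"   -- "browserArea" is a placeholder
   revBrowserSet sBrowserId, "visible", true
end preOpenCard

on closeCard
   if sBrowserId is not empty then
      revBrowserClose sBrowserId
      put empty into sBrowserId
   end if
end closeCard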
Is there anybody out there who has integrated Unity (or similar) with LiveCode in such a way that it works under iOS and keeps Apple happy? Or is there anybody who has built a 3D model in any package and had it render within LiveCode in an Apple-compliant way?
Thanks
Kevin