SceneCapture:GenerateTexture
complete
SyedMuhammad
complete
Added an EncodeToBase64 method to SceneCapture!
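A minimal usage sketch of the new method (hedged: the exact signature of EncodeToBase64, the SceneCapture constructor arguments, and the event name are assumptions for illustration):

```lua
-- Hypothetical sketch: capture a scene and ship it to a WebUI as base64.
-- Constructor arguments and event names are illustrative only.
local scene_capture = SceneCapture(Vector(0, 0, 0), Rotator(0, 0, 0))

-- EncodeToBase64 is assumed here to return the captured frame as a base64 string
local base64_image = scene_capture:EncodeToBase64()

-- Forward it to an existing WebUI so the page can inline it as a data URI
my_webui:CallEvent("UpdatePreview", base64_image)
```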
SyedMuhammad
under review
SyedMuhammad
Hey, it would be awesome if you posted some usage examples! That helps me a lot in designing the right solution!
For example, "generate texture" doesn't mean much on its own: should it return a new "texture" entity? How would that texture be used? Who would need it? Other materials? How could that texture be accessed?
Otherwise I might design a solution around a usage that doesn't fit your needs, because I didn't think of it your way 😉
Timmy
SyedMuhammad: I have several ideas in mind; the first is being able to display my 3D character in a 2D WebUI (HUD, character menu, etc.).
The same goes for some of my gamemode items that I would like to be able to "inspect", or why not generate item icons to use in a WebUI as well.
Another use could be to display the texture in a Canvas. I don't know if generating a texture file would be optimal, since it would require a lot of disk reads to keep WebUIs updated, but I don't see any other solution.
SyedMuhammad
Timmy: Got it!
Recently I had the idea of replacing the current methods SetMaterialFromWebUI, SetMaterialFromCanvas, etc. with a new format: instead, you call the usual SetMaterial() but pass a new pattern, e.g. webui://my-webui-name or canvas://my-canvas-name.
This way we can set Materials referencing WebUIs, Canvases and SceneCaptures, which would also allow accessing them through HTML/JS in the future!
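To make the idea concrete, a sketch of how the pattern could look in practice (the SetMaterialFrom* methods are the existing ones; the URI scheme itself is the proposal, and the names after :// are hypothetical):

```lua
-- Today: one dedicated method per source type
my_paintable:SetMaterialFromWebUI(my_webui)
my_paintable:SetMaterialFromCanvas(my_canvas)

-- Proposed: a single SetMaterial() that resolves a URI-like pattern
my_paintable:SetMaterial("webui://my-webui-name")
my_paintable:SetMaterial("canvas://my-canvas-name")
my_paintable:SetMaterial("scenecapture://my-scene-capture-name")
```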
Timmy
SyedMuhammad: Wouldn't it make the API more complicated? Just wondering
Anyway if we're able to send textures to WebUIs it would be really noice
SyedMuhammad
Timmy: It depends 🤷‍♂️. If there's another idea, I'm open to it.
DKFN
SyedMuhammad: Dropping the patterns would probably make things easier and more flexible.
The naming below is likely bad.
Assume that each entity able to return a material would have methods like these:
```lua
local mat = mySceneCapture:GetMaterialScreenshot()
local mat = myWebUI:GetMaterialScreenshot()
local mat = myCanvas:GetMaterialScreenshot()
```
Where "mat" is the actual image content in some format (a bitmap or similar?); it would then be very easy to use this material in whatever context is needed.
In a WebUI you will be able to use base64 to inline the image directly in HTML like this:
```html
<img style='display:block; width:100px; height:100px;' id='base64image'
     src='data:image/jpeg;base64,[......]' />
```
It is also possible to do the same on the JavaScript side for WebUIs.
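As a sketch of that JavaScript side (the event name and wiring are hypothetical; the point is only the data-URI construction):

```javascript
// Build a data URI from a base64-encoded image so it can be assigned
// directly to an <img> src without touching the disk.
function toDataURI(base64, mime) {
  return "data:" + mime + ";base64," + base64;
}

// Hypothetical wiring inside a WebUI page: the Lua side would send the
// base64 string through an event, e.g.:
//   Events.Subscribe("UpdatePreview", (b64) => {
//     document.getElementById("base64image").src = toDataURI(b64, "image/jpeg");
//   });
```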
For Canvas, since there is already a "DrawTexture" method that takes a path as parameter, maybe it is possible to create a "DrawTextureFromMaterial" that, instead of taking a path and loading the file, directly receives the material generated from the screenshot.
Finally, for BasePaintable, the same logic as for the Canvas can be applied: set the material from the content directly, without having to rely on a file.
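A sketch of how those pieces could fit together (every method name here is a proposal or a placeholder, not existing API; the draw-position arguments are illustrative):

```lua
-- Proposed primitive: grab the rendered content as an in-memory material
local mat = my_scene_capture:GetMaterialScreenshot()

-- Canvas: proposed DrawTextureFromMaterial draws from memory instead of a file path
my_canvas:DrawTextureFromMaterial(mat, Vector2D(0, 0), Vector2D(256, 256))

-- BasePaintable: hypothetical setter taking the content directly, no file involved
my_paintable:SetMaterialFromContent(mat)
```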
What I like about this approach is that it doesn't have to break anything; all the existing functions can still exist:
my_paintable:SetMaterialFromCanvas or my_paintable:SetMaterialFromSceneCapture
can keep working, implemented by calling mySceneCapture:GetMaterialScreenshot() at the refresh rate requested by the user and then calling myCanvas:DrawTextureFromMaterial.
This would allow very good interoperability with all the existing rendering methods, while keeping the useful helper functions in place to get things done quickly, and it would enable even more powerful features, like on-the-fly image manipulation (Instagram-style filters on an RP telephone or something, who knows, we can work with the raw image! :D)
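Under that approach, the existing helpers become thin wrappers over the new primitives. A hypothetical sketch (the primitives, the helper signature, and the periodic-refresh mechanism are all assumptions here):

```lua
-- Hypothetical: keep SetMaterialFromSceneCapture as a wrapper that polls
-- the proposed primitives at the refresh rate requested by the user.
function BasePaintable:SetMaterialFromSceneCapture(scene_capture, refresh_ms)
  Timer.SetInterval(function()
    local mat = scene_capture:GetMaterialScreenshot()  -- proposed primitive
    self:SetMaterialFromContent(mat)                   -- proposed primitive
  end, refresh_ms)
end
```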
Let me know what you think about it.
SyedMuhammad
DKFN: The screenshot thing could be useful, but it is really heavy: not only on the Unreal side to generate it (it's not that straightforward to generate a texture from a material), it would also mean something like 8 MB of data passing through (you could see some FPS spikes).
Also, this "generated" thing is not a material anymore; it would be better called a Texture instead.
Creating a new "Class" for materials or textures could help with generalization, yes, but on the other hand scripters would have to take much more responsibility for managing its lifecycle, which adds a lot of complexity.
And this still doesn't allow "dynamic" materials to render inside WebUIs; it's indeed a tough topic 😅
DKFN
SyedMuhammad: Yes, it's expensive, but it should be acceptable for a one-time operation, or one debounced across several seconds, e.g. if you need a one-time preview of a prop/character/vehicle in a customization screen using the screenshot method.
However, for a dynamic live preview, I don't see many other solutions than spawning a 3D WebUI in the background, actually rendering the scene in front of the UI, and using perspective to make it look like it is integrated into the WebUI, given the constraints you explained.
Maybe some better idea will be found by that time tho :)