@mana - the "stars" brush strokes around the middle of the flower - can you erase them or something to see if it fixes it? that's where Blender stops loading the scene.
https://x.com/willeastcott/status/1821506505956331748 @andybak per our chat about Web 3D editors. This one is of course specific to Gaussian splats, so not a great fit, but I'll share a few more generic examples in this thread
After 10 months of development, it's finally here! SuperSplat 1.0 The open source editor for working with 3D Gaussian Splats. TRY NOW: https://t.co/ZpCz2o7GpB GITHUB: https://t.co/fmjd2oDRMc
I couldn't figure out how to download it. I actually recorded one the other day, then waited about 30 hours for it to process. It seemed there used to be a way. Maybe you have to have a subscription?
Luma app for iPhone or droid allows video upload. Usually takes 5-10 min to process. No subscription needed. Can view via their website & download, then load in Unity or Unreal via plugin. Highly recommend. Workflow, compression & quality improve constantly… https://apps.apple.com/us/app/luma-ai/id1615849914
I'm on android. I was able to upload and process the video (currently takes 1-2 days if you're not a subscriber). What I want to do, if possible, is download the ply/splat/model, whatever is produced. I thought this used to be an option, but it looks like you can only download the video now?
Depends on what level of the creative process you want to work with. It typically costs a fortune and requires a huge number of people to make a feature-length film. If my goal is to tell a story, then I'm perfectly happy editing clips together. People build scenes using models all the time, rather than painstakingly crafting them piece by piece, because their interest lies in the composition or final output rather than working out mesh topology for each one.
Also, some people enjoy randomness. For those who are more commercially inclined, it comes down to whether the client is willing to accept long turnaround times and quality over speed. Unfortunately this has adverse impacts on the industry. From a competitive lens it's obviously not a great situation for artists. From a purely creative lens, I can see why some people feel differently about it.
(but I'll just say that generative and procedural techniques have been part of art since the first cave painting used pigment blown through a straw - it's a difference in degree not in kind...)
Oh wow what a pain… feel free to share the video with me & I'll process & export in your chosen formats when I have time. See attached screencap showing options.
I suspect it's because I'm not a paid subscriber? I systematically cancelled everything over the last couple of months. Almost everything is something I use once in a while; hard to justify keeping them all active.
That looks really cool. I often feel primitive in android world. Those scan extractions look great though. Might take you up on that, going to look at the mesh you sent when I fire up the headset
Cool, feel free - also keep in mind that mesh is high poly - there are also medium & low poly options. You may have more luck with web upload & processing. Luma used to allow it, though I haven't checked recently. Also if interested see below. @andybak no doubt you've seen these & I think we discussed them a while back, but resurfacing with all the Icosa progress of late - as well as increasing public interest in nerfs/splats/3D scans :) https://github.com/aras-p/UnityGaussianSplatting https://github.com/antimatter15/splat
That's exactly what Icosa Gallery is for - we're a few weeks away from a release with that functionality (well - you can already manually upload and download .tilt files from your own account, but I suspect you want to make them available to other people as well?)
is an (open) 3D file(format) for XR experiences. The XR Fragments (XRF) spec is about (implicit+explicit) metadata (found in a 3D file) which hints to the viewer application (icosagallery e.g.) to enhance XR experiences, for example:
navigate in-VR to another 3D file by making objects with URL-metadata clickable
load/project a video onto a plane with videoURL-metadata
switch camera/position/animation by clicking an object
In theory, the XRF js-parser can be included in THREE.js to leverage those features (it just scans the `.userData`-fields of a loaded `.tilt|glb|obj` 3D file). (I'm writing an NLnet proposal atm, and am thinking of making an XRF>Icosagallery PR a milestone.)
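To make the idea concrete, here's a minimal sketch of what scanning `.userData` for those hints could look like. This is not the actual XRF parser - the function name `collectXrfHints` and the metadata keys `href`/`videoURL` are illustrative assumptions - but it shows the general shape: walk a loaded scene graph (as three.js exposes it after loading a glb/obj) and collect objects whose metadata marks them as links or video surfaces.

```javascript
// Hypothetical sketch of the "scan .userData fields" idea - not the real
// XRF parser. Keys `href` and `videoURL` are assumed metadata names.
function collectXrfHints(node, hints = []) {
  const data = node.userData || {};
  if (data.href) {
    // An object with URL-metadata: a viewer could make this clickable
    // and navigate in-VR to the linked 3D file.
    hints.push({ object: node.name, kind: 'link', url: data.href });
  }
  if (data.videoURL) {
    // An object with videoURL-metadata: a viewer could project the
    // video onto this plane.
    hints.push({ object: node.name, kind: 'video', url: data.videoURL });
  }
  for (const child of node.children || []) {
    collectXrfHints(child, hints);
  }
  return hints;
}

// Example scene graph shaped like what a three.js loader produces:
const scene = {
  name: 'scene',
  userData: {},
  children: [
    { name: 'portal', userData: { href: 'other-world.glb' }, children: [] },
    { name: 'screen', userData: { videoURL: 'clip.mp4' }, children: [] },
  ],
};

console.log(collectXrfHints(scene));
```

In a real three.js integration you'd run the same check inside `scene.traverse()` after loading, and wire the `link` hints into a raycaster's click handling.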
OK. I think there's an important distinction between "editing a scene" and "viewing/interacting with a scene". We do try to cover both in Open Brush, but the lines are a bit blurry and the "view only" side of things is a bit unfinished.
I think Icosa Gallery is a better target for everything related to "viewing/interacting with" - I'm currently wondering if it's worth incorporating parts of Spoke (the Mozilla Hubs scene composer/editor) or the three.js editor into Icosa.
These thoughts make sense. I actually studied Spoke & the three.js editor while drafting the XR Fragments spec. The idea was to create a fileformat-agnostic version (at first I wanted to bet on Spoke, until it was abandoned by Mozilla). The texture-scrolling and link features (of XR Fragments) were inspired by Spoke extensions, and can now be used by other fileformats too. The deeplinking/teleporting-via-URL aspects of XR Fragments were something I've always longed for with Spoke/Hubs.