I couldn't figure out how to download it. Actually recorded one the other day, then waited about 30 hours for it to process. Seemed there used to be a way. Maybe have to have a subscription?
Luma app for iPhone or droid allows video upload. Usually takes 5-10 min to process. No subscription needed. Can view via their website & download, then load in Unity or Unreal via plugin. Highly recommend. Workflow, compression & quality improve constantly… https://apps.apple.com/us/app/luma-ai/id1615849914
I'm on Android. I was able to upload and process the video (currently takes 1-2 days if not a subscriber). What I want to do, if possible, is download the ply/splat/model, whatever is produced. I thought this used to be an option, but it looks like I can only download the video now?
Depends on what level of the creative process you want to work with. Making a feature-length film typically costs a fortune and requires a huge number of people. If my goal is to tell a story, then I'm perfectly happy editing clips together. People build scenes using models all the time, rather than painstakingly crafting them piece by piece, because their interest lies in the composition or final output rather than working out mesh topology for each one.
Also some people enjoy randomness. For those who are more commercially inclined it comes down to whether the client is willing to accept long turnaround times, valuing quality over speed. Unfortunately this has adverse impacts on the industry. From a competitive lens it's obviously not a great situation for artists. From a purely creative lens, I can see why some people feel differently about it.
(but I'll just say that generative and procedural techniques have been part of art since the first cave painting used pigment blown through a straw - it's a difference in degree not in kind...)
Oh wow what a pain… feel free to share the video with me & I’ll process & export in your chosen formats when I have time. See attached screencap showing options.
I suspect it's because I'm not a paid subscriber? I systematically cancelled everything over the last couple of months. Almost everything is something I use once in a while; hard to justify keeping them all active.
That looks really cool. I often feel primitive in android world. Those scan extractions look great though. Might take you up on that, going to look at the mesh you sent when I fire up the headset
Cool feel free - also keep in mind that mesh is high poly - there are also medium & low poly options. You may have more luck with web upload & processing. Luma used to allow it though I haven't checked recently. Also if interested see below. @andybak no doubt you've seen these & think we discussed a while back but resurfacing with all the Icosa progress of late - as well as increasing public interest in nerfs/splats/3D scans:)
https://github.com/aras-p/UnityGaussianSplatting
https://github.com/antimatter15/splat
That's exactly what Icosa Gallery is for - we're a few weeks away from a release with that functionality (well - you can already manually upload and download .tilt files from your own account but I suspect you want to make them available for other people as well?)
is an (open) 3D file(format) for XR experiences. The XR Fragments (XRF) spec is about (implicit+explicit) metadata (found in a 3D file) which hints to the viewer-application (icosagallery e.g.) how to enhance XR experiences, for example:
navigate in-VR to another 3D file by making objects with URL-metadata clickable
load/project video on a plane with videoURL-metadata
switch camera/position/animation by clicking an object

In theory, the XRF js-parser can be included in THREE.js to leverage those features (it just scans the `.userData`-fields of a loaded `.tilt|glb|obj` 3D file). (I'm writing an NLnet proposal atm, and am thinking of making an XRF>Icosagallery PR a milestone)
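To make the `.userData`-scan idea concrete, here's a minimal sketch (not the actual XRF parser) of walking a loaded object graph and collecting metadata. The field names `href` and `src` are my assumptions about the metadata keys, and the mock scene stands in for a loaded `.glb`:

```javascript
// Sketch only: walk a THREE.js-like object graph (nodes with .children
// and .userData) and collect XRF-style metadata. Key names are assumed.
function scanXRFMetadata(root) {
  const found = { links: [], media: [] };
  const walk = (node) => {
    const meta = node.userData || {};
    if (typeof meta.href === 'string') {
      // navigation metadata: this object should become clickable
      found.links.push({ name: node.name, href: meta.href });
    }
    if (typeof meta.src === 'string') {
      // embed metadata: e.g. a video URL to project onto this plane
      found.media.push({ name: node.name, src: meta.src });
    }
    (node.children || []).forEach(walk);
  };
  walk(root);
  return found;
}

// Mock scene graph standing in for a loaded 3D file:
const scene = {
  name: 'root', userData: {}, children: [
    { name: 'door', userData: { href: 'other-world.glb' }, children: [] },
    { name: 'screen', userData: { src: 'intro.mp4' }, children: [] },
  ],
};
console.log(scanXRFMetadata(scene));
```

The nice thing about this approach is that it is loader-agnostic: any format THREE.js can load ends up as the same object graph, so the scan works on `.glb`, `.obj`, etc. alike.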
OK. I think there's an important distinction between "editing a scene" and "viewing/interacting with a scene". We do try and cover both in Open Brush but the lines are a bit blurry and the "view only" side of things is a bit unfinished
I think the Icosa Gallery is a better target for everything related to "Viewing/Interacting with" - I'm currently wondering if it's worth incorporating parts of Spoke (the Mozilla Hubs scene composer/editor) or the three.js editor into Icosa.
These thoughts make sense. I actually studied Spoke & Three.js editor during drafting the XR Fragments spec. The idea was to create a fileformat-agnostic version (first I wanted to bet on Spoke, until it got abandoned by mozilla). The texturescrolling- and link-features (of XR Fragments) were inspired by Spoke extensions, and can be used by other fileformats too now. The deeplinking/teleporting-via-URL aspects of XR Fragments were something I've always longed for with Spoke/Hubs.
Btw. is there a drop-in THREEjs viewer-library of files? I'd love to add support for -files on https://xrfragment.org The fact that it is an open portable format is important for XR Fragments to promote.
There is a tilt viewer for threejs but it's very rudimentary. There is far too much magic baked into our brushes for tilt files to be easily rendered outside of the app.
indeed, I noticed a very cool scene in icosagallery which had some sort of illuminating fog/nebula. I figured that perhaps some sort of shadermagic was going on.
Yep. Currently only the more basic brushes look correct in other apps (unless they can directly use our Unity or WebGL shaders).
I am trying to improve the export (with the help of @Nick ) to get closer to what is theoretically possible. The lack of additive blending in GLTF is a hindrance but I want to propose an extension to cover this and try and get a few other apps to implement it.
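For context on the blending gap: core glTF materials only offer `alphaMode` of `OPAQUE`, `MASK` or `BLEND` (standard alpha blending), so an extension would have to carry the missing blend equation. The extension name and shape below are entirely invented for illustration, not the actual proposal:

```javascript
// Hypothetical sketch: what a material carrying an additive-blend
// extension might look like. "EXT_blend_additive" is an invented name,
// a stand-in for whatever the real proposal would be called.
const material = {
  name: 'light_brush',
  alphaMode: 'BLEND',            // the best the core spec can express
  extensions: {
    EXT_blend_additive: {        // invented, illustrative only
      blendMode: 'ADD',          // src + dst, i.e. additive
    },
  },
};

// Viewers that don't know the extension fall back to plain BLEND,
// which is why alphaMode stays set to a valid core value above.
const json = JSON.stringify(material);
console.log(json);
```

Keeping a sensible core `alphaMode` as the fallback is the usual pattern for glTF extensions: unaware viewers degrade gracefully instead of failing to load.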
btw. once icosagallery has a search-feature, I can implement a searx-plugin for it, to include in this opensource WebXR searchengine (which I also maintain) : https://searxr.me/
Bear in mind that https://icosa.gallery/ hasn't been touched in several years - the new site is still in development although we're fairly close to launching a beta
We are going to implement oEmbed as it's fairly simple. Activitypub is something we're looking into but in general I'm interested in supporting as many open formats and protocols as possible.
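oEmbed really is simple to implement: for a "rich" response the spec requires `type`, `version`, `html`, `width` and `height`, and everything else is optional. A minimal sketch of what an Icosa response could look like (the asset URL and `/embed` iframe path are made up for illustration):

```javascript
// Sketch of a minimal oEmbed "rich" response for a hypothetical asset.
// Required fields per the oEmbed spec: type, version, html, width, height.
function oembedResponse(assetUrl, maxwidth = 640, maxheight = 480) {
  return {
    type: 'rich',
    version: '1.0',
    provider_name: 'Icosa Gallery',
    // the "/embed" path is invented; the real embed URL may differ
    html: `<iframe src="${assetUrl}/embed" width="${maxwidth}" height="${maxheight}"></iframe>`,
    width: maxwidth,
    height: maxheight,
  };
}

console.log(oembedResponse('https://icosa.gallery/view/example-id'));
```

A consumer (Mastodon, Discourse, etc.) discovers the endpoint via a `<link rel="alternate" type="application/json+oembed">` tag on the asset page and drops the returned `html` straight into its embed.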
A minimal integration could be done on various levels:
**EASY:** render an AFRAME div-playerframe for non-`.tilt` uploaded assets (basically .glb/.usdz etc) with the XR Fragments AFRAME plugin (https://xrfragment.org/#%F0%9F%A7%B0%20AFRAME) like in this codepen: https://codepen.io/coderofsalvation/pen/yLwedvX
**Pros**: drop-in solution including VR/AR & hand-controls (I can do an example PR next week)
**Cons**: no cons
**LESS EASY:** integrate the XRF THREE-plugin into the icosa-viewer code: https://xrfragment.org/#%F0%9F%A7%B0%20THREE.js which adds metadata-listeners onto THREE loaders (and enhances objects accordingly, making them clickable e.g.).
**Pros**: would enable XRF-features/metadata for `.tilt` assets too
**Cons**: extra layer of complexity, no hand-control interactions, no AR-mode
**HARDCORE**: integrate the XRF parser-library into the icosa-viewer, which only scans/parses/validates the metadata to aid further implementation: https://xrfragment.org/#%F0%9F%94%A8%20XR%20Fragments%20parser
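The "metadata-listeners onto THREE loaders" idea from the LESS EASY option can be sketched as a wrapper around any THREE-style loader. The wrapper shape and the `userData.href` key are my own illustration, not the real XRF plugin API; the fake loader stands in for e.g. GLTFLoader:

```javascript
// Sketch: wrap a loader whose load(url, onLoad) yields an object graph,
// and fire a handler for each node carrying link metadata, so the caller
// can make those objects clickable (via raycasting etc.).
function withXRFListeners(loader, onLink) {
  return {
    load(url, onLoad) {
      loader.load(url, (root) => {
        const walk = (node) => {
          if (node.userData && node.userData.href) {
            onLink(node); // caller attaches click/raycast behaviour here
          }
          (node.children || []).forEach(walk);
        };
        walk(root);
        onLoad(root);
      });
    },
  };
}

// Usage with a fake loader standing in for a real THREE loader:
const fakeLoader = {
  load: (url, cb) => cb({
    name: 'root', userData: {}, children: [
      { name: 'portal', userData: { href: 'next.glb' }, children: [] },
    ],
  }),
};
const clickable = [];
withXRFListeners(fakeLoader, (n) => clickable.push(n.name))
  .load('scene.glb', () => {});
console.log(clickable); // names of nodes that should become clickable
```

Because the wrapper only touches the loaded graph, it layers on top of the existing icosa-viewer without changing its loaders, which is the "extra layer of complexity" trade-off mentioned above.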
we're pretty wedded to our threejs based viewer at the moment - although it's pretty lacking in XR navigation features.
A transition to A-Frame is interesting but I'm not sure how much work it would be. Is this something you might want to help with? We could put some of the NLNet funding towards it, if that would help.
Or at least maybe advise on - a few hours consultancy might save me a ton of time.
yes that's definitely possible. TBH it's an interesting conversation to have. AFRAME is basically THREE.js with batteries-included XR goodies. Last year I had no time to get into the nitty-gritty of THREE.js XR controls/navigation/local teleporting e.g., so I quickly wrapped the XRF THREE.js code into an AFRAME component. I don't know if much has changed, but last year I could not really find community-maintained XR goodies for THREE.js (other than AFRAME)