so, somehow i ended up reading a long back-and-forth from december in this channel about client/server io protocols, which reminded me: i've been bouncing the idea around for some time that it might be beneficial from a compatibility and design perspective to implement (or modify) the kitty keyboard protocol and/or the (in-progress) ghostty terminal api for stardust? in addition to the fact that these protocols seem likely to become de facto standards, it might be both a good way for people to feel at-ease in the stardust environment and a launchpad for extending the functionality of the client landscape as-is.
Also, in case it's not obvious, i'm not suggesting a literal keyboard layout but putting some creativity behind how to implement these protocols in order to interact with existing software more thoroughly.
What do y'all think? I think it would def require some careful thought to get right (and would likely require some flexibility regarding implementation for different contexts/clients), but i like the constraints/direction it would put on what we're trying to do here. Thoughts? (btw Nova i just caught your talk from a few years back in the pinned posts... was excellent to hear the vision laid out so clearly)
166 Replies
schlich
schlichOP2w ago
PS i like the fact that the kitty protocol is progressive, so we can start with baby steps if it's a good idea. Also, i think i read that you already have some of those keybindings implemented at the lower level?
Nova
Nova2w ago
i don't quite understand, what is the kitty protocol... for? also stardust doesn't work like a terminal
schlich
schlichOP2w ago
it's a protocol for an input device to talk to a server using established escape codes/patterns that have established semantics. for example, in that long back and forth i think there was some debate on how we might know a client is 'listening' for input; the protocol aims to take care of that. it should also standardize some "keybindings" that folks are used to, like ctrl + a key being some action. i'm curious to hear what you mean by stardust doesn't work like a terminal, this is why i bring it up, cuz it's entirely possible this abstraction completely breaks down at a lower level and just ends up square peg round hole-ing it
> Nova — 12/9/24, 3:16 PM: anyway, i'm debating what to make the interface... do i make it have a separate set_keymap method from key that tells a key press/release
this is the type of problem these protocols are designed to address, with the bonus of being battle tested and compatible with some low level processes widely in use. I know one of the stated goals of ghostty is to have a "headless" terminal that you can slap a UI on as you please, and it would handle most of the communication headaches
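for concreteness, here is a minimal sketch of the kind of unambiguous key encoding the kitty protocol defines. this is simplified (the real protocol also carries event types, alternate keysyms, and associated text); the function name is invented:

```rust
// Minimal sketch of kitty-style CSI-u key encoding (simplified; the real
// protocol also carries event types, alternate keysyms, and text).
// The modifier field is 1 + a bitmask: shift=1, alt=2, ctrl=4.
fn encode_key(codepoint: u32, shift: bool, alt: bool, ctrl: bool) -> String {
    let mods = 1 + (shift as u32) + ((alt as u32) << 1) + ((ctrl as u32) << 2);
    // CSI is ESC [ (bytes 0x1b 0x5b); the trailing 'u' marks a key event
    format!("\x1b[{};{}u", codepoint, mods)
}
```

so ctrl plus the 'a' key (codepoint 97) comes out as the bytes ESC [ 9 7 ; 5 u, which a client can parse without guessing at ambiguous legacy codes.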
Nova
Nova2w ago
but that's 2D input in a CLI. stardust is 2D input given to specific clients. also i am overwhelmed with stuff to do today so will have to think about this more later, plz dump your info here and i will look at it when i can
Schmarni
Schmarni2w ago
as far as i can tell, the kitty protocol and the protocol for stardust have completely different goals. the kitty one is for terminal apps and communicates using stdin/stdout; stardust doesn't use stdin/stdout at all tho. the keyboard (and in the future mouse) stuff in stardust needs to be spatial, and stardust also doesn't have a terminal built into the server. i don't see a reason why stardust should use this instead of letting a 2d terminal emulator implement that on top of wayland, which in stardust is implemented on top of the stardust keyboard protocol
schlich
schlichOP2w ago
what is the stardust keyboard protocol?
Schmarni
Schmarni2w ago
a dbus interface for sending keypresses to other clients
schlich
schlichOP2w ago
yes, that was what i was reading about. Are we confident in that implementation? Does it map well to existing UIs? genuine questions
Nova
Nova2w ago
it literally forwards raw key events over
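a rough sketch of what "forwards raw key events" amounts to. all names here are invented for illustration (the actual stardust dbus interface isn't documented in one place), but the shape is: no shortcut semantics, just press/release plus a keymap string sent up front:

```rust
// Hypothetical sketch (names invented, not the actual stardust interface)
// of raw key-event forwarding: no shortcut semantics at all, just
// press/release events plus an xkb keymap string sent once up front.
#[derive(Debug, Clone, Copy, PartialEq)]
enum KeyState {
    Pressed,
    Released,
}

#[derive(Debug, Clone, Copy)]
struct KeyEvent {
    keycode: u32, // raw code; interpretation is entirely up to the receiver
    state: KeyState,
}

trait KeyboardHandler {
    fn set_keymap(&mut self, keymap: &str); // xkb keymap, sent once
    fn key(&mut self, event: KeyEvent);     // raw event stream after that
}
```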
Schmarni
Schmarni2w ago
implementing the kitty protocol wouldn't gain us any client support, because you can't directly run cli apps on top of stardust. you would run them inside something like kitty or alacritty, which implement the protocol on top of wayland
Nova
Nova2w ago
the alternative would be sending keycodes and keysyms separately, but the STUPID XKB SYSTEM does not like that. i hate xkb i hate xkb i hate xkb i hate xkb i hate xkb i hate xkb i hate xkb i hate xkb i hate xkb i hate xkb i hate xkb i hate xkb i hate xkb i hate xkb. they forced you to use it in wayland to get any keysym data out and it's a hot garbage mess
schlich
schlichOP2w ago
well, again, ghostty is aiming to be a headless API
Nova
Nova2w ago
but afaik i cannot plug that into wayland. that requires raw keycodes and a keymap to send to apps, then the apps take that keycode and parse it with the keymap to get out the keysyms
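the flow being described, as a toy model. a real client would use libxkbcommon and a compiled keymap, not a flat table; this is purely illustrative:

```rust
use std::collections::HashMap;

// Toy model of the wayland/xkb flow: the compositor hands clients a keymap
// and after that only raw keycodes; each client resolves keysyms itself.
// (Real keymaps are compiled xkb descriptions; this table is illustrative.)
fn resolve_keysym(keymap: &HashMap<u32, char>, keycode: u32) -> Option<char> {
    keymap.get(&keycode).copied()
}

fn toy_keymap() -> HashMap<u32, char> {
    // evdev-style keycodes, chosen arbitrarily for the sketch
    HashMap::from([(38, 'a'), (54, 'c')])
}
```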
schlich
schlichOP2w ago
why would we "have" to run them inside a terminal emulator? cant that just be the job of stardust clients?
Schmarni
Schmarni2w ago
what does that even mean? stdin/stdout (which as far as i can tell is what they use) is already headless. or do you mean background processes, like what zellij/tmux/screen does?
Nova
Nova2w ago
stdin isn't the thing that worries me, it's hardcoding cli common key shortcuts into 3D clients, and wayland clients too
Schmarni
Schmarni2w ago
well if we had a client that runs cli stuff using those new apis and displaying the text output, that would be a terminal emulator, just not one on top of wayland but on top of the stardust protocol
schlich
schlichOP2w ago
i'm honestly not really thinking about this as a text interface. the whole point of escape codes is to communicate non-textual keypresses between client and server, and ultimately i'm interested in things such as keybindings that can aid in making the new frontier of XR UI a little more intuitive/friendly/familiar
Nova
Nova2w ago
most people don't use cli tho
schlich
schlichOP2w ago
ok, but everyone knows what ctrl+c does
Nova
Nova2w ago
no
schlich
schlichOP2w ago
(ironically... unless you're in a terminal)
Nova
Nova2w ago
[image]
Nova
Nova2w ago
as an optional thing, sure
schlich
schlichOP2w ago
lol my point is that the utility of keybindings are not restricted to text interfaces
Schmarni
Schmarni2w ago
also those escape codes are used to escape from textual representation, i just don't see any point in using the protocol instead of doing it custom
Nova
Nova2w ago
sure but they should be implemented on the interface not on the protocol
schlich
schlichOP2w ago
can you explain that? or, i know you said you were busy lol so maybe that's something i can chew on
Nova
Nova2w ago
well think about it, if i have a 3D git client how do half those commands make sense? can you really extend those keybindings to every 3D object ever and make them compatible with wayland?
schlich
schlichOP2w ago
like i said totally willing to concede this is not the right tool for the job, I just think it could provide some battle tested solutions to old problems
Nova
Nova2w ago
it just doesn't make sense
Schmarni
Schmarni2w ago
i mean if you really want global shortcuts (and i and nova both can't really think of good uses) you would implement https://github.com/flatpak/xdg-desktop-portal/pull/711
schlich
schlichOP2w ago
i dont understand what git has to do with keyboard protocols
Nova
Nova2w ago
a git GUI in 3D. i think schmarni gets it so i'll let them explain. like again, this is a good idea for stuff that knows how to use it, but it's a choice for the client dev
Schmarni
Schmarni2w ago
okay think about it this way: what would a 3d client gain from this protocol being implemented at the core stardust level? how would you generically map specific key combinations in 3d space? remember, stardust doesn't (and shouldn't) need a physical keyboard. if it's about having some generic actions that can be mapped to key inputs, sure, but i feel like they should be mapped by the client that brings the keyboard input into 3d, or a client that provides some virtual keyboard
schlich
schlichOP2w ago
yeah, i mean that's more or less what i'm trying to suss out here. i think there is a benefit to thinking about the way the effective aspects of 2D ui translate into 3D problems. not saying "let's implement this protocol exactly", more like "let's use these solutions to help us think about established solutions to common problems when we interact with software." attached are a few examples from the ghostty protocol, and it's worth going over the whole list if you wanna go through the thought experiment. say we have an arrangement of objects in 3D space, like a git tree in this hypothetical git UI. do we have a simple, universally understood way to iterate between objects? to scroll through them? to multiselect? have we even thought about the concept of a cursor that isn't just wherever one of your fingers/controllers is pointing? some of these actions are a stretch, for sure, but i think we could get a lot of mileage just going through stuff and seeing what sticks and what can be thrown out.
Ghostty
Reference - Terminal API (VT)
A reference of all VT sequences supported by Ghostty.
Nova
Nova2w ago
that i will agree to, but you can't just directly apply it, gotta reinterpret it and adapt the intent to the new medium
schlich
schlichOP2w ago
100%. the key aspect here is that a client should be able to know "these bytes" iterate the cursor to the "next" object. and yeah, what that means totally depends on the client, but that's the way it already is
Nova
Nova2w ago
no disagree
Schmarni
Schmarni2w ago
fun fact: the server -> client input relation breaks in the case of stardust. only input methods (things like xr hands and xr controllers) are handled through the server; non-spatial input like keyboard and mouse is sent in an inter-client way. one client like non-spatial-input/simular gets the input from the hardware and then sends it to other clients using a dbus based protocol. each object that wants to receive keyboard input is exposed through dbus, together with a spatial and some field, and the input spatializer (in this case simular) gets to select the objects it sends input to in any way it wants
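the routing side of that description, sketched with invented names: handlers advertise themselves over d-bus along with spatial data, and the spatializer client alone decides who gets input, by whatever policy it likes:

```rust
// Sketch of the inter-client routing described above (all names invented).
// Each keyboard handler is a D-Bus object with some spatial data attached;
// the spatializer picks who receives input, in any way it wants.
struct HandlerInfo {
    dbus_path: String,
    distance: f32, // stand-in for the real spatial transform + field data
}

struct Spatializer {
    handlers: Vec<HandlerInfo>,
}

impl Spatializer {
    /// Trivial policy for the sketch: route to the nearest handler.
    fn route(&self) -> Option<&HandlerInfo> {
        self.handlers
            .iter()
            .min_by(|a, b| a.distance.total_cmp(&b.distance))
    }
}
```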
Nova
Nova2w ago
that's too close to implementation in the old medium, we need to keep the intent and remake the interface to what makes sense. because that way the server only worries about spatial stuff. separation of concerns
schlich
schlichOP2w ago
that's good info and gives me a bit to chew on. can someone link to the dbus implementation?
Nova
Nova2w ago
uhhh we haven't exactly centralized it yet :blobcatgoogly: actually @Schmarni i want to fix the keyboard impl and just YEEET keymaps to the curb
schlich
schlichOP2w ago
one thing i definitely agree on is i never want to touch a keyboard in vr lol
Nova
Nova2w ago
so like, i want to develop a keymap that maps keycodes to keysyms with the same exact code for compatibility, then copy windows and send keycode and keysym over the protocol
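a sketch of that idea (names invented): resolve the keysym once, server-side, with a fixed keymap, then send both keycode and keysym, Windows-style, so receiving clients never have to parse an xkb keymap themselves:

```rust
// Sketch of the proposal above (names invented): one fixed server-side
// keymap resolves keysyms, and events carry keycode AND keysym so clients
// can skip xkb entirely.
struct ResolvedKeyEvent {
    keycode: u32, // raw code, kept for apps that want it
    keysym: u32,  // pre-resolved symbol
    pressed: bool,
}

fn resolve(keycode: u32, pressed: bool) -> ResolvedKeyEvent {
    // stand-in for the fixed server-side keymap
    let keysym = match keycode {
        38 => 0x61, // 'a'
        54 => 0x63, // 'c'
        _ => 0,     // unmapped
    };
    ResolvedKeyEvent { keycode, keysym, pressed }
}
```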
Schmarni
Schmarni2w ago
molecules no?
Nova
Nova2w ago
oh right yea :blobcatgoogly:
schlich
schlichOP2w ago
what confuses me about this response is that i'm 100% suggesting using xr hands and controllers to implement this hypothetical protocol. ultimately the keyboard protocol is not about keyboards but about sending bytes that signify common actions. ESCAPE is just a label
Nova
Nova2w ago
this is even more confusing. wait a moment, you're just talking about the openxr input aren't you. @Schmarni does this sound like input actions to you? the problem with those is that they flatten intent, they do not make it contextual
Schmarni
Schmarni2w ago
kinda? this would have a tiny spatial aspect tho because of keyboard handlers, so as spatial as the current one ig? also it maps closer to concepts than buttons? ig think the datamap for input methods? at least i am working under the expectation that we would keep the keyboard handlers
Nova
Nova2w ago
this sounds good in theory, but like, every time you try to do it in practice you end up restricting the types of apps that can be built
Schmarni
Schmarni2w ago
oh no no no no no no, like i don't hate the OpenXR actions system, but for stardust? NO
Nova
Nova2w ago
it globalizes the set of actions, whereas stardust, given there are tons of devs independently making stuff, localizes it. this is why i like protocols: implement all the protocols you want, bind them to interfaces on either end
Schmarni
Schmarni2w ago
(to add onto this) like how do you select an object that receives input? would it just be global?
Nova
Nova2w ago
provide as much context as you can. in stardust you just switch which input handler you send input to, because the keyboard (or in non-spatial-input, the spatializer) chooses
Schmarni
Schmarni2w ago
yeah, but not if you do OpenXR style global actions
Nova
Nova2w ago
yeah if you want a standard set of actions, add it to the toolkit and make it trivial to add where relevant just like many UX conventions are
schlich
schlichOP2w ago
what do you mean by openXR style global actions, and how is what i'm suggesting restrictive?
Nova
Nova2w ago
give me a practical example of this please
Schmarni
Schmarni2w ago
but in current stardust stuff yeah, tho the only global things in stardust i am aware of are a SkyBox and SkyLighting, both of which i want to remove soon and replace with spatial alternatives
Nova
Nova2w ago
from the high level action the user wants to do, and then how you see it implemented. it'll help me understand and critique better
schlich
schlichOP2w ago
ok, let's stick with the git UI example** and say our main goal is just to navigate and inspect the DAG, maybe perform a merge by dragging and dropping a node onto another branch, etc. and let's say the DAG is represented with a simple ball and stick model. we want the designer of the client to be able to map user actions to actions on the git tree. having a protocol that implements established byte sequences can make it easier for the designer to implement common functionality, like "jumping to the beginning of the tree" (such as what the "home" button might do) or "skipping 5 nodes at a time" (such as what tab might do). IMO these aren't implementation specific or restrictive (the actual actions can be whatever we want, but are represented consistently in the machine code!), just a client-friendly way to not need to reinvent the wheel on some things, some of the most common things that people will want to do, at that. **this example might very well break down with the way your client/server model is implemented, which i'm very much naive about and trying to learn more about; the focus here is demonstrating what an established, published so-called "VT" protocol can bring to the table instead of the specifics of client/server interaction and the textual representations of the bytes
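to make the git example concrete, here's a sketch of the mapping being described. all the sequence-to-action choices are hypothetical; the point is that the byte representation is shared while the meaning stays client-defined:

```rust
// Hypothetical sketch of the git-DAG example: well-known escape sequences
// name actions, but what each action *does* is decided by the client.
#[derive(Debug, PartialEq)]
enum DagAction {
    JumpToRoot,       // what "home" might do
    SkipNodes(usize), // what "tab" might do, per the example above
    NextNode,         // what "down arrow" might do
}

fn action_for(bytes: &[u8]) -> Option<DagAction> {
    match bytes {
        b"\x1b[H" => Some(DagAction::JumpToRoot), // home key sequence
        b"\t" => Some(DagAction::SkipNodes(5)),   // tab
        b"\x1b[B" => Some(DagAction::NextNode),   // down arrow sequence
        _ => None, // unmapped input is simply ignored by this client
    }
}
```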
Schmarni
Schmarni2w ago
hmmm, so the first thing i see is: why wouldn't you just do direct interaction with the DAG? also how would you map that onto something like hands, ideally without having to use a big keyboard style object? like, this DAG would be represented as multiple connected nodes, why treat it as one object/concept?
Nova
Nova2w ago
just make a scrollbar and make the root commit the bottom, then make it have momentum. user just grabs the scrollbar and flicks up fast and voom, they're there
schlich
schlichOP2w ago
in a way that's what i want to figure out! like, how can we take principles that we've learned about interacting with our computers and apply them in XR in a way that actually makes intuitive sense. there are 10 buttons at our disposal, 10 fingers if we're going controllerless. our arms have about 6 additional degrees of freedom we can tap into as well. can we design a system that provides some coherence to the way our input components interact to do the things that we all know how to do in 2D, or at least provide some way for client designers to make these customizable?
Nova
Nova2w ago
that's not really how that works in xr, fingers aren't float values in practice to a designer
schlich
schlichOP2w ago
not really my point
Nova
Nova2w ago
it is infinitely customizable. you get raw input in clients for hands and controllers and so on, relative to the input handler's field
Schmarni
Schmarni2w ago
> how can we take principles that we've learned about interacting with our computers and apply them in XR
i think that this is the complete wrong question to ask; rather we should ask how we can take what we have learned irl and apply that to how we interact with computers using xr
schlich
schlichOP2w ago
yes and i'm saying "infinitely customizable" is not inherently a good thing especially when it comes to design and adoption
Nova
Nova2w ago
but it's infinitely customizable while local
Schmarni
Schmarni2w ago
as the base layer, "infinitely customizable" is the correct amount of customizability, as long as you can avoid conflicts, which we can (using the SUIS)
Nova
Nova2w ago
the protocol determines the most customizable it can be barring straight-up exploits
schlich
schlichOP2w ago
i feel like this glosses over the fact that we are inherently working with software. if i'm designing a new Photoshop or whatever in stardust, it would be helpful both to me as a designer and to my potential customers that they have at least some idea how to perform certain actions
Schmarni
Schmarni2w ago
as a more high-level designer you would probably use some premade elements and glue them together; the elements are the correct level to "restrict" things
schlich
schlichOP2w ago
i wanna learn more about the SUIS too, on my to-do list
Nova
Nova2w ago
i really need a good explainer on this stuff but i suck at writing things well-formatted. tho i can do cool graphs, but i'd need to show graphs over time and blrgh
Schmarni
Schmarni2w ago
i feel like using better abstractions like a pen or brush would make more sense than trying to do things the way a keyboard would. having to learn esoteric gestures is not more intuitive than picking up a brush, or a select tool where you grab 2 corners and place and scale it that way
schlich
schlichOP2w ago
so yeah this isn't far from my perspective i think, i think we might just differ in providing a more opinionated framework vs a more open field approach
Nova
Nova2w ago
we can do soft standards by choosing what's in the toolkit
schlich
schlichOP2w ago
yes that's exactly what i'm saying and have tried to repeat several times, this isn't about keyboards or text
Nova
Nova2w ago
make that the standard by convenience
schlich
schlichOP2w ago
text is just bytes
Nova
Nova2w ago
yes but like, to a designer that doesn't matter this is a UX design thing, not a technical one
Schmarni
Schmarni2w ago
xr is a completely new medium, it being software is just an implementation detail
schlich
schlichOP2w ago
trust me, you are preaching to the choir. but i think it being a new medium just necessitates all the more providing a framework that brings out intuitive flow instead of impeding it. i don't want to be too restrictive either, i'm honestly trying to open things up, because vr is so, so restrictive right now
lea
lea2w ago
my question is, why the kitty keyboard protocol lol
schlich
schlichOP2w ago
And i think these common actions/shortcuts can provide a bridge for people
Schmarni
Schmarni2w ago
yes but (as far as i can tell) you are saying we should use this kitty protocol at the low stardust protocol level. implementing something like it on a framework level makes sense, even exposing it in a more generic way through dbus might make sense, but the base protocol is the wrong layer for this
Nova
Nova2w ago
for the record, i support making d-bus interfaces for every widget like, dials should expose programmatic control from d-bus and all
schlich
schlichOP2w ago
where i see the kitty protocol potentially helping is with things like having a systematic way to delineate between different types of input
lea
lea2w ago
what do you mean by this, give a concrete example?
schlich
schlichOP2w ago
like they make a distinction between normal input, escape input, event input, etc in a way that has become industry standard for good reasons
Schmarni
Schmarni2w ago
only some tho probably (oh no, do we need to implement components in asteroids through dbus?)
Nova
Nova2w ago
no i mean just hooking up dbus into asteroids elements, like, make the elements expose a d-bus object to control. anyway this isn't relevant. could you give me even more examples of what you want to do @schlich? from a user perspective
Schmarni
Schmarni2w ago
an industry standard for text input based cli apps, sure, but that is not even close to what stardust is, or how pretty much anything using stardust works
schlich
schlichOP2w ago
that's not really a helpful distinction for me, sorry. here's an excerpt from the kitty protocol:
If you are an application or library developer just interested in using this protocol to make keyboard handling simpler and more robust in your application, without too many changes, do the following:
1. Emit the escape code CSI > 1 u at application startup if using the main screen, or when entering alternate screen mode if using the alternate screen. All key events will now be sent in only a few forms to your application, that are easy to parse unambiguously.
2. Emit the escape sequence CSI < u at application exit if using the main screen, or just before leaving alternate screen mode if using the alternate screen, to restore whatever the keyboard mode was before step 1.
Key events will all be delivered to your application either as plain UTF-8 text, or using the following escape codes, for those keys that do not produce text (CSI is the bytes 0x1b 0x5b):
Try to forget everything having to do with UTF and keyboards, the protocol is about input processing in a way that disambiguates different implementations from clients
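the handshake in that excerpt, sketched in code. this is simplified to the single "disambiguate" flag and ignores the protocol's event types and alternate keys:

```rust
// The push/pop handshake from the excerpt, plus a parser for the reports.
// Simplified sketch: only the "disambiguate" flag (1), no event types.
const ENTER_KITTY_MODE: &str = "\x1b[>1u"; // CSI > 1 u, emitted at startup
const LEAVE_KITTY_MODE: &str = "\x1b[<u";  // CSI < u, emitted at exit

/// Parse a CSI-u key report, ESC [ code [; mods] u, into (codepoint, mods).
fn parse_csi_u(seq: &str) -> Option<(u32, u32)> {
    let body = seq.strip_prefix("\x1b[")?.strip_suffix('u')?;
    match body.split_once(';') {
        Some((code, mods)) => Some((code.parse().ok()?, mods.parse().ok()?)),
        None => Some((body.parse().ok()?, 1)), // no modifier field means 1
    }
}
```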
Nova
Nova2w ago
just make more protocols over d-bus or swap out UI controls, does the same thing if i get you correctly
Schmarni
Schmarni2w ago
i don't think it makes sense to use one "event stream" for things like selecting stuff in stardust. instead you would want to tag spatial things as selectable, then other clients (or your own one of course) can select things, and your client can decide how to handle that select request: should it select that object alongside others? should it select the object and unselect the other ones? i don't think it's necessarily a good idea to treat one client as one continuous object/thing that you control. trying to push everything through one api surface doesn't make sense and adds unneeded complexity
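that tag-and-request idea as a sketch (names and the policy enum are invented): objects are tagged selectable, other clients send select requests, and the owning client decides what selection even means:

```rust
// Sketch of the tag-and-request alternative (names invented): objects are
// tagged selectable, other clients send select *requests*, and the owning
// client decides what selection means for it.
#[derive(PartialEq)]
enum SelectPolicy {
    Exclusive, // selecting one object unselects the rest
    Additive,  // selections accumulate
}

struct Object {
    id: u64,
    selected: bool,
}

struct OwningClient {
    policy: SelectPolicy,
    objects: Vec<Object>,
}

impl OwningClient {
    fn handle_select_request(&mut self, id: u64) {
        if self.policy == SelectPolicy::Exclusive {
            for o in &mut self.objects {
                o.selected = false;
            }
        }
        if let Some(o) = self.objects.iter_mut().find(|o| o.id == id) {
            o.selected = true;
        }
    }
}
```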
schlich
schlichOP2w ago
yeah, sure, i just want to like, actually do the thing in a way that will (hopefully) free us up to make more interesting apps
Nova
Nova2w ago
it's closer to compare stardust hands to touchscreen input and an xr controller to a wiimote, even though it can do both seamlessly and more. seriously tho, this would help a ton. tho i need end user cases, then i can figure out the best way to do that in an xr friendly manner like this, and don't include specifics of keyboards or such
schlich
schlichOP2w ago
word. i mean i brought up the photoshop thing, but for real the whole reason i'm here is to try and develop some kick ass creativity suites, and i just don't think that's possible without some kind of systematic way to interpret user input. if i wanted to make Ableton in XR, how would we replicate that "pro user" flow in a way that's accessible and intuitive to noobs
Nova
Nova2w ago
ableton live?
schlich
schlichOP2w ago
yeah
Nova
Nova2w ago
ok idk that interface well can you explain it a bit more or give me another one that i might know?
schlich
schlichOP2w ago
i mean, substitute your favorite creative suite. How do we bridge power and functionality with control, simplification and ease of use
Nova
Nova2w ago
is blender a good comparison?
schlich
schlichOP2w ago
obviously we want to make 3D skeuomorphisms or whatever, but we still need an intricate way to like, access and switch between objects and actions. yes
Nova
Nova2w ago
ohhhh yes ok so switching tools
schlich
schlichOP2w ago
yeah, and patterned output
Nova
Nova2w ago
Meta Quest
YouTube
Introducing Google Blocks—Available Now on Rift!
Today, we’re excited to share that Google Blocks is now available on the Oculus Store. The latest VR app from our neighbors in Mountain View, Blocks lets you build 3D models in virtual space. Check out vr.google.com/objects (http://vr.google.com/objects) to see what people are making, and share your own on social with #madewithblocks. https:...
schlich
schlichOP2w ago
how do i ctrl+c, ctrl+d etc
Nova
Nova2w ago
what do you need ctrl+c for? there's no modes to exit from
schlich
schlichOP2w ago
ctrl+c is just a stand in for a pattern of input that reliably maps to an action
Nova
Nova2w ago
Open Blocks
YouTube
Open Blocks Trailer
Open Blocks Early Access released for Quest! The most intuitive 3D modelling app is now available for standalone headsets. Free and open source forever. Full "publish and share" solution coming soon. Store page: https://www.meta.com/en-gb/experiences/open-blocks/8043509915705378/ Steam: https://store.steampowered.com/app/3077230/Open_Blocks/ ...
Nova
Nova2w ago
yes but can you put that in a practical example of what it's used for in a bunch of applications? i think this overall pattern is generally just... not how XR works at all, it's a very 2D computer centric method entirely, so you can do the end goal without using it. like here, there you switch tools by snapping them into your controller, and you don't select, you just... move things directly. for blender, in XR i'd use a similar thing
schlich
schlichOP2w ago
i mean, blender is a particularly spatial app. that video does actually portray some of the functionality i'd like to see in stardust, but it kinda breaks down when you need to introduce abstractions, like you would when you're making music instead of 3D visual art. Regardless of what you are doing, it will be massively helpful to have a defined way to iterate through items, group items, duplicate items, advance through layers of hierarchy, etc
Nova
Nova2w ago
duplicating is pulling the items apart when they're unresizeable. advance through layers... what do you mean?
schlich
schlichOP2w ago
yes and what i'm interested in is all the details around making that happen and providing some infrastructure for people who want to do this across domains
Nova
Nova2w ago
also btw the action system you're proposing is not possible in xr
schlich
schlichOP2w ago
and ideally in a way that's customizable
Nova
Nova2w ago
because you have 50 types of controllers with varying controls, not like a keyboard with a mostly common key layout and every key being the same as the others
schlich
schlichOP2w ago
doesn't openXR constrain that some? is that what the rant was about earlier?
Nova
Nova2w ago
yes, and in practice that is difficult for everyone involved. some games just do not support your controller, or do not have all actions mapped, or have them in weird combos
schlich
schlichOP2w ago
that's what the protocol would help with imo
Nova
Nova2w ago
idk how it'd help. it's a classic problem: when you have a common set of actions and a fixed interface, you can't always bind them
schlich
schlichOP2w ago
that's EXACTLY what the kitty protocol was designed to address
Nova
Nova2w ago
it has a keyboard to work with. key combos work on keyboards, they don't work on controllers. button combos suck
schlich
schlichOP2w ago
i get that it's not about the keyboard
Schmarni
Schmarni2w ago
what? afaict it literally forwards specific buttons, it's less generic than even openxr?
schlich
schlichOP2w ago
i'm getting to the point where i feel like i'm being intentionally misunderstood, so i'm gonna put a bow on this for today... genuinely enjoyed the conversation and the critical thought though, and i'm excited to be here. i'm gonna go touch some grass haha
Nova
Nova2w ago
definitely not intentional, i think we're getting wayyyy too caught up in specifics
schlich
schlichOP2w ago
also that wasnt meant to be a direct reply
Nova
Nova2w ago
you just wanna know how to do these actions with audio, right?
schlich
schlichOP2w ago
or programming, or anything that involves complex creativity
Nova
Nova2w ago
given they use keyboard shortcuts?
Schmarni
Schmarni2w ago
yeah def not intentional
Nova
Nova2w ago
you have to reinterpret the interface before it gets to that point. like, don't do selection
schlich
schlichOP2w ago
totally agree
Nova
Nova2w ago
as for the actual app case, it's per app how that interface is best. i already made a little sketch of a DJ board made for XR, but this is a reinterpreted one:
Nova
Nova2w ago
SoundStage VR
YouTube
SoundStage: VR Music Maker
Available now on Steam Early Access: http://store.steampowered.com/app/485780 SoundStage is a virtual reality music sandbox built specifically for room-scale VR. Whether you’re a professional DJ creating a new sound, or a hobbyist who wants to rock out on virtual drums, SoundStage gives you a diverse toolset to express yourself. If you look ...
Nova
Nova2w ago
just expand outward. don't bother with jumping to things, just make the window encompass it all and move around. then have a way to unfold complexity contextually
schlich
schlichOP2w ago
we're on the same page on the interface side, i agree the best interface is per app, but ultimately my point is that it would probably be beneficial to stardust and the ecosystem as a whole if we put some thought into what kind of commonalities we can provide so people don't need to worry about the plumbing so much
Nova
Nova2w ago
i think that's just something that will take time to develop in practical usage. pulling things apart to reveal their contents is one i'm fond of for hierarchy
schlich
schlichOP2w ago
absolutely. and my open question is "can we borrow useful concepts from well-established solutions and re-envision them for the new medium". i think the answer is not a clear yes or no, but i'm interested in pursuing it as a sort of socratic method
Nova
Nova2w ago
Ultraleap
YouTube
Experience Aurora | Ultraleap
Welcome to a whole new world of interaction with Aurora, Ultraleap's latest VR experience! Aurora is a virtual space created by Ultraleap that's all about hand interaction. Aurora is made up of three islands, each hosting a different experience for you to explore. Navigate around the world of Aurora without needing to physically move yourself b...
Nova
Nova2w ago
like, see the exploded view headset? idk, i think the answer is no, every time it's been tried it falls flat, but i'm open to being wrong there
Schmarni
Schmarni2w ago
honestly yeah, as far as i can tell the answer is probably somewhere around "some tiny parts maybe", also open to being wrong tho
schlich
schlichOP2w ago
and on my end i fully admit to being green about implementation details in the code and challenges unique to XR... just trying to think about how to carve a way forward for what i think we can all agree are pretty exciting and appealing end goals. i believe in your vision Nova, and thank you for providing this space!
Nova
Nova2w ago
no problem hope i didn't shut down any of your stuff btw
schlich
schlichOP2w ago
not at all
Nova
Nova2w ago
i want everything to be possible in stardust, but how that's done is what i disagree on
schlich
schlichOP2w ago
unfortunately i'm a perennial ideas guy, so it's honestly nice to get critical pushback. just wish y'all could see what i see 😉 it's important we get it right!
Nova
Nova2w ago
perennial ideas?
schlich
schlichOP2w ago
read: i'm always the ideas guy
Nova
Nova2w ago
ahh yea
schlich
schlichOP2w ago
i blame the adhd
Nova
Nova2w ago
ideas are good but you gotta know the medium to know how they're implemented
schlich
schlichOP2w ago
fo sho. but i'm also a scientist, taking abstractions and applying them to new mediums is the name of the game there
Nova
Nova2w ago
i'm a systems and UX designer/dev, i do the same haha. but i know wayy faster if something doesn't work in a new medium because i can identify its architecture and what fits in well faster
schlich
schlichOP2w ago
valuable skill, i wasted 10 years on a doctorate sooooo 🙂
Nova
Nova2w ago
haha i'm sure you got something out of it i didn't go to college tho
schlich
schlichOP2w ago
my mom is proud of me at least haha
Nova
Nova2w ago
that's good mine just doesn't get what i'm doing
