Discussion about this post

Neural Foundry

The PIP project sounds fascinating - I love seeing XR used to explore altered states of perception rather than just slapping a headset on existing content. The semantic visualization in Aletheia reminds me of some early attempts at debugging transformer attention patterns, but taking it into immersive space adds a whole different dimension. One challenge I've run into with embodied AI interfaces is that the uncanny valley gets much worse in VR than on flat screens - curious how you're handling that with AI JOE's viseme system. The MIT Spatial Sound collabs make sense for this kind of work; I'm not sure whether binaural rendering helps bridge that gap or just makes it more obvious when something feels off.
