You’re doing Mixed Reality wrong. – Noteworthy — The Journal Blog

[Image caption: From a promo video for Microsoft’s new Mixed Reality headset (emoji added by me)]

Laptops were designed for convenience: to be carried, slid into a bag, popped open on an airplane tray table. But as we have found, in hunched backs, jutting chins, and a market saturated with ergonomic contraptions desperate to reverse the widespread health effects of these computers, convenience-centrism leads down a pretty crap road.

Mixed, Virtual, Augmented… I don’t care what you call it. Immersive wearable computing is still in a malleable phase, and I refuse to let the next era of computational interfaces mature into yet another body-blind, brain-draining, anti-social tool.

Here’s how: to make immersive wearable computing an all-brain, all-body tool, we have to take on design problems both inside and outside the headset. We are not just creating a new piece of software but a new way of working with computers with our bodies. I call this new way the studio metaphor, finally replacing the desktop metaphor for good.

Outside

This is my model office of the future. Let me give you a tour.

The first thing to notice is that there are no chairs. The floor is covered in a 2 cm thick blue foam mat, no shoes allowed. Green and purple roll-out mats offer more padding for extra-sensitive knees and give the floor extra stick for vigorous activities. The space is strewn with orange cushions and foam blocks that can be endlessly and easily reconfigured throughout the day. Dog recommended but not required.

Work here can look like this:

This space balances mobility and stability. The floor is safe to crawl, fall, walk, and bump around on. The “furniture” can build structures of support around you no matter your position or range of motion.
(Transitioning to using your body effectively in this kind of space takes more practice and education for some than others, so if you are looking to ditch the desktop-metaphor environment, give me a shout and we can have a one-on-one.)

And this balance lets you, a human being with a body made of connective tissue and feelings, move fluidly with your immersive wearable. I call this fluidity Variable Viewpoint Computing.

The desktop metaphor is Fixed Viewpoint Computing. It keeps our bodies locked in fixed, forward-facing, industrialized relationships to screens and keyboards and mice and chairs and desks and on and on. Even your precious phone, which gets to come with, is still haunted by the hunched back of fixed viewpoint. But no more! I have a software paradigm to cure all that ails us! Onward! Or actually… inward!

Inside

By way of example, let’s first take a look at my model of a podcast app.

[Embedded video: podcast app demo]

The first thing to notice is that instead of windows and drop-downs to organize functionality, this paradigm uses three different scales: wearable, holdable, and habitable.

The wearable scale is small and usually kept on the avatar. It gives you glance-level access to information and quick-access storage, has only a few easy means of interaction, and can modify your senses like glasses or earbuds.

The holdable scale comprises tool-sized objects, body-scale interactions, and the highest level of interactivity. These are the things we use to get things done: to create and find and navigate and discover.

The habitable scale comprises the functionalities sorted into the navigable environment.
Anything best placed at different stations to help you focus on the flow of a process, storage with slower accessibility, and information that challenges the limits of short-term memory all belong in this scale.

Not only is this UX paradigm made to be used in any body position, not just standing or stuck in a swivel chair, it is also designed to offload interaction demands from the prefrontal cortex onto the motor cortex and hippocampus, distributing load more evenly across brain and body systems and making the use of computational tools less fatiguing overall. This works by calling on place cells in the hippocampus to recall spatially organized information in the habitable scale of the software, while using the motor cortex’s proprioception and orienteering to recall information organized at the holdable and wearable scales.

[Diagram: the nested habitable, holdable, and wearable scales]

Rethinking software and operating system design with this avatar-centric approach has a bunch of huge advantages. Since multiple wearable and holdable instances of different pieces of software can all coexist inside the same habitable scale, we can start to think of the UX as an ecosystem. In the current system, each piece of software is essentially a bucket with its own special import and export procedures. The user extracts data to the desktop in a special format called a saved file, then re-imports it elsewhere to share or further manipulate it. But in our studio-metaphor ecosystem, data transitions in situ from scale to scale.

For example: your favorite podcast is holding a competition for a new 3D logo design. So you go into the habitable scale of your favorite 3D modeling app and whip up a fun new logo you think will be a winner. When it’s done, you pull the wearable-scale podcast app off your avatar’s ear and it magically inflates into the holdable scale, where you can submit your creation.
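In software terms, that scale transition can be sketched as a tiny state model. This is a minimal illustration, not any real framework: the class names, the one-step inflate/deflate rule, and the payload field are all my hypothetical choices; the point is only that an app instance changes scale while its data stays inside it, with no export/import step.

```python
from enum import Enum

class Scale(Enum):
    """The three interaction scales of the studio metaphor."""
    WEARABLE = 1   # glance-level, kept on the avatar (ear, wrist, glasses)
    HOLDABLE = 2   # tool-sized, highest level of interactivity
    HABITABLE = 3  # station-sized, sorted into the navigable environment

class App:
    """A hypothetical app instance that moves between scales in situ."""
    def __init__(self, name: str, scale: Scale):
        self.name = name
        self.scale = scale
        self.payload = None  # data rides along inside the app across transitions

    def inflate(self):
        """Grow one scale up, e.g. wearable -> holdable."""
        if self.scale is Scale.HABITABLE:
            raise ValueError("already at habitable scale")
        self.scale = Scale(self.scale.value + 1)

    def deflate(self):
        """Shrink one scale down, e.g. holdable -> wearable."""
        if self.scale is Scale.WEARABLE:
            raise ValueError("already at wearable scale")
        self.scale = Scale(self.scale.value - 1)

# The podcast example: pull the wearable app off the avatar's ear,
# it inflates to holdable, accepts the logo submission, then goes back.
podcast = App("podcast", Scale.WEARABLE)
podcast.inflate()                 # now holdable, ready to accept a submission
podcast.payload = "new-3d-logo"   # the logo moves in situ, no saved file
podcast.deflate()                 # back on the ear, payload still inside
```

Note that there is no save/export step anywhere in the transition; that absence is the whole point of the ecosystem framing above.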
Still in your habitable 3D modeling app, you slot your submission into place and send it off. Then pop the podcast app back on your ear.

What have we learned?

☑️ Variable Viewpoint: Both our physical environments and our mixed reality software must work in any body position. No more locking people into fixed relationships with their computers because a designer thought they knew best. I have used so many VR and MR interfaces that only work standing or at chair-sitting height. Sit on the floor! Crawl around! Lie on your side! Prop yourself up on some cushions! Throw your legs up the wall! Dance! Play! Move!

☑️ Three-scale UX design: With the scaffold of wearable, holdable, and habitable scales to frame the design of software and operating systems going forward, we can throw out the last lingering shades of the desktop metaphor (we have all seen them in VR and MR demos: the floating rectangle menus, the forward-facing biases, the magic-window browsers…) and create a new era of embodied computing.

☑️ All-Brain, All-Body, All-Bodies: Where does all this distributed cognitive load and long-term physical well-being work get us? We can steer computation, one of the most powerful tools humans have ever invented, toward a future in which it engages with and empowers a wider range of human capacities, allowing us to create entirely new forms of thought and action we cannot currently imagine.

Thanks for reading,
M Eifler

Endnotes: For the last four years my work was supported by eleVR at HARC/YCR, an immersive wearable research team that released all of its work on an open-source basis. Unfortunately, funding for that lab is no more and I’m now independent. If you are looking for a speaker, consultant, or hire in this field, check out my site. You can read more about how I researched and developed my model office for wearable computing and its operating system paradigm in my posts “The Office of the Future” and “Studio Metaphor: An Embodied Software Paradigm”.
