Mixed Reality Blog articles https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/bg-p/TheMixedRealityBlog Mixed Reality Blog articles Fri, 30 Sep 2022 09:37:14 GMT TheMixedRealityBlog 2022-09-30T09:37:14Z Building Volumetric UI with MRTK3 https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/building-volumetric-ui-with-mrtk3/ba-p/3631764 <P>MRTK3 represents a significant step forward in the maturity of our user interface design tooling in MRTK. Over the last year (and more) we've invested significant resources into modernizing our design systems for UI in mixed reality, as well as overhauling the component libraries and tooling for building out these UI designs in Unity. If you've had experience with MRTK in the past, you'll know that building beautiful, modern user interfaces for mixed reality applications has never been an easy task. High-quality volumetric UI requires unique tools and systems, and organizing all of it under a cohesive design language is even harder. 
Throughout the course of developing more mature design systems, we've run into and overcome several categories of UI tooling challenges with our existing setup, ranging from the human challenges (usability, workflow, keeping a large design team consistent) to the engineering challenges (layout engines, 3D volumetric interactions, analog input, rendering/shaders).</P> <P>&nbsp;</P> <CENTER><VIDEO src="https://zee2.github.iohttps://techcommunity.microsoft.com/images/canvas/Microinteractions.mp4" autoplay="autoplay" loop="loop" muted="" width="700" height="300"></VIDEO></CENTER> <P>&nbsp;</P> <P>In the next generation of UI tooling for MRTK3, we've sought to significantly improve the developer experience with more powerful engineering systems, improve the designer experience with more modern design features, and improve the user experience with more delightful, immersive, and "delicious" microinteractions and design language.</P> <P>&nbsp;</P> <CENTER><VIDEO src="https://zee2.github.iohttps://techcommunity.microsoft.com/images/canvas/spinning.mp4" autoplay="autoplay" loop="loop" muted="" width="881" height="400"></VIDEO></CENTER> <P>&nbsp;</P> <H2>Variant Explosion</H2> <P>&nbsp;</P> <P>In previous versions of MRTK, designing 3D UI often meant manual calculations, back-of-the-napkin math for alignments and padding, and a lot of hand-placed assets that couldn't respond to changes in layout or dimensions. Most of these limitations were due to the fact that the main way of building UI in MRTK2 didn't use the typical UI tooling available in Unity. Later versions of MRTK2 explored building 3D UI with Canvas and RectTransform layouts, but it was never the preferred/primary way to build UI in MRTK.</P> <P>Internally at Microsoft, as we've built bigger and more ambitious applications for MR devices, we've hit the scale where we need more modern design tooling, workflows, and methods for managing highly complex UI layouts. 
When you have 100+ engineers, designers, and PMs, keeping design language and layouts consistent is a significant challenge! If we were still using the manual methods of aligning, sizing, and designing UI, we'd quickly hit a wall of hundreds of slightly misaligned buttons, off-by-a-millimeter issues, exponentially exploding numbers of prefab variants and assets... a true nightmare.</P> <P>&nbsp;</P> <CENTER><IMG src="https://zee2.github.iohttps://techcommunity.microsoft.com/images/canvas/Prefabs.png" border="0" /></CENTER> <P>&nbsp;</P> <P>External customers of MRTK in the past might have experienced miniature versions of this problem, notably from the huge number of prefabs and prefab variants required to describe all possible configurations and sizes of UI controls. We had an exponentially-nasty number of variants, where we required variants for every permutation of button style, configuration, layout, and even *size*! (That last one was particularly frustrating...)</P> <P>&nbsp;</P> <P>With MRTK3, we've drastically reduced the number of prefabs. Instead of needing prefab variants for sizes and configurations, we can use a single prefab and allow the user to freely resize the controls, add and remove their own sub-features within the control, and even expand button groups/lists/menus with dynamic layout.</P> <P>&nbsp;</P> <CENTER><IMG src="https://zee2.github.iohttps://techcommunity.microsoft.com/images/canvas/new_ui.png" border="0" /></CENTER> <P>&nbsp;</P> <P>(Nearly) every button you generally work with in MRTK3 will be the same prefab. All UI elements are resizable and can be dynamically fit both to their surrounding containers and to the content they wrap. 
None of this is really an MRTK-specific invention; we're just ensuring that all of our UX is built to the same standards as Unity's own UI, with compatibility for all of the existing layout groups, constraints, and alignment systems.</P> <P>&nbsp;</P> <CENTER><IMG src="https://zee2.github.iohttps://techcommunity.microsoft.com/images/canvas/layout_demo.gif" border="0" width="371" height="358" /></CENTER><CENTER></CENTER> <P>&nbsp;</P> <P>Every single button you see on this UI tearsheet is, in fact, the exact same prefab:</P> <P>&nbsp;</P> <CENTER><IMG src="https://zee2.github.iohttps://techcommunity.microsoft.com/images/canvas/tearsheet.png" border="0" width="724" height="271" /></CENTER> <P>&nbsp;</P> <P>The speed at which designers can build UI templates is drastically accelerated. It's actually *fun*, now, to build UI in MRTK... simple building blocks, combined together, to form more complex layouts.</P> <P>&nbsp;</P> <H2>Measurements</H2> <P>&nbsp;</P> <P>Another problem with working with UI at scale is that there are very specific requirements from the design language/library for measurements, both for usability concerns (minimum touch targets, readability, etc.) as well as design (padding, margin, corner radii, branding). This was one of the most critical areas where our design in mixed reality had departed from the typical workflow that most designers are used to in 2D contexts. In the past, we had not only specified everything in absolute coordinates without any sort of flex or alignment parameters, but we used physical, real-world units for <EM>everything</EM>. All UI was designed in millimeters; branding guidelines were in millimeters; margin, padding, gutter, spacing, all in millimeters. Even fonts were specified in millimeters!</P> <P>&nbsp;</P> <P>This had some advantages: notably, we were working in a real, physical environment. 
UI in mixed reality isn't just some abstract piece of information on a screen somewhere; it's essentially a real, physical object that exists in the real, physical world! It must have a defined physical size with physical units, at some point in the development process. We also have very strict requirements for usability with holograms; we have user research telling us that certain touch target sizes (32mm x 32mm, or 24mm at the absolute smallest) are acceptable for performing 3D volumetric pressing interactions with our fingers.</P> <P>&nbsp;</P> <P>However, this attachment to physical units also had drawbacks. Primarily, this is an alien working environment to typical front-end designers, who are used to non-physical design units like <CODE class="language-plaintext highlighter-rouge">em</CODE>, <CODE class="language-plaintext highlighter-rouge">rem</CODE>, <CODE class="language-plaintext highlighter-rouge">%</CODE>, <CODE class="language-plaintext highlighter-rouge">vh</CODE>, or “physical” units that aren’t even really <EM>physical</EM> to begin with (<CODE class="language-plaintext highlighter-rouge">px</CODE>, <CODE class="language-plaintext highlighter-rouge">pt</CODE>). Traditional 2D design has a concept of DPI, or screen density, as well; but in mixed reality, there’s a much closer relationship between design and the physical world. (I like to think about it as something closer to industrial design, or physical product design: you’re building real, physical affordances that sit on a real, physical object!)</P> <P>&nbsp;</P> <P>The most direct drawback was that in this system of everything being specified in absolute physical units, there was zero room at all for <EM>scaling</EM>. In mixed reality, there are still some circumstances where the entire UI layout or design should be scaled up or down; this is common when reusing UI layouts for faraway, large objects (or reusing UI layouts for near or far interaction!) 
For UI elements like stroke, outline, and corner radius, we use shader-driven rendering techniques. When these are specified only in absolute physical units (like, say, a "1mm" stroke, or a "5mm" corner radius), there is no way for these measurements to remain <EM>proportionally consistent</EM>&nbsp;with the rest of your design. If your 32mm x 32mm button is scaled up 5x, your 1mm and 5mm design elements will still remain 1mm thick and 5mm wide. They will be <EM>proportionally</EM>&nbsp;incorrect, despite being specified in <EM>absolute</EM>&nbsp;units.</P> <P>&nbsp;</P> <P>This gets pretty confusing, but the core of the issue is this: without RectTransforms or Canvas, there is no such thing as a <EM>dimension</EM>. There is only <EM>scale</EM>. For elements like stroke, or corner radius, we had to specify them "absolutely" so that they were consistent across the <EM>scaling</EM>&nbsp;operations used to adjust their size and shape. However, when the overall UI layout needed to be scaled up or down, those absolute measurements would become proportionally incorrect.</P> <P>&nbsp;</P> <P>Here, let's take a look at some visual examples to make this a bit less confusing. First, let's see what happens to a non-RectTransform-based Dialog control when we want to "scale it up":</P> <P>&nbsp;</P> <CENTER><IMG src="https://zee2.github.iohttps://techcommunity.microsoft.com/images/canvas/scaledemo.png" border="0" width="811" height="253" /></CENTER><CENTER></CENTER><CENTER></CENTER> <P>&nbsp;</P> <P>You can see that all of the strokes, corner radii, etc, that were absolutely specified in their physical units stayed "physically correct"; i.e., the strokes were always exactly 1mm. However, as the rest of the design scaled up and down, they became out of proportion!</P> <P>You might say "just specify all of the elements as *relative* to the overall design scale... 
" <STRONG>The issue is that, without RectTransform/Canvas, there is nothing to be relative <I>to</I>!</STRONG> If everything is just scaling operations on top of scaling operations, there's no way to define any sort of true relative measurement. Every relative measurement would be relative to its parent, which would have any number of destructive scaling operations applied to it. There is no "root" of the design, and no "DPI" that could be used to specify a relative measurement unit.</P> <P>&nbsp;</P> <P>How do you solve this? The answer is <I>non-physical</I> design units, with a certain "scale factor" (similar to a display's DPI, except now we're applying this to the relative size of a hologram to the physical world!). Non-physical units and design scale factors are only possible with Canvas and RectTransform layout, where the Canvas itself serves as a "design root", and individual UI elements are not <I>scaled</I>, but instead <I>sized</I>.</P> <P>&nbsp;</P> <P>UI is designed in an arbitrary, non-physical unit. Let’s call it <CODE class="language-plaintext highlighter-rouge">u</CODE> for now, for lack of a better name! (Internally, we generally call it <CODE class="language-plaintext highlighter-rouge">px</CODE>, but that’s quite the overloaded term… it’s not pixels, not on any actual physical device!)</P> <P>&nbsp;</P> <P>We’ll also define a scale factor, or metric. MRTK3’s component library uses a <CODE class="language-plaintext highlighter-rouge">1mm</CODE> to <CODE class="language-plaintext highlighter-rouge">1u</CODE> scale metric by default, but other component libraries at Microsoft have used other metrics, like <CODE class="language-plaintext highlighter-rouge">1mm</CODE> to <CODE class="language-plaintext highlighter-rouge">3u</CODE>. In MRTK3, our trusty 32mm button is now measured <CODE class="language-plaintext highlighter-rouge">32u x 32u</CODE>. 
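</P> <P>&nbsp;</P> <P>The arithmetic behind this unit scheme is simple enough to sketch in a few lines. This is an illustration only (the function name is ours, not an MRTK3 API): a measurement in design units, multiplied by the scale metric, yields a physical size.</P>

```python
# Illustrative sketch of the design-unit scheme described above.
# A control is *sized* in non-physical design units (u); a single
# scale metric on the design root maps u to millimeters.

def physical_size_mm(design_units: float, mm_per_unit: float = 1.0) -> float:
    """Convert a measurement in design units to millimeters."""
    return design_units * mm_per_unit

# At the default 1mm : 1u metric, a 32u x 32u button hits the
# recommended 32mm touch target:
assert physical_size_mm(32) == 32.0

# Changing the metric rescales *every* measurement together, so a 1u
# stroke and a 32u button shrink in proportion:
assert physical_size_mm(32, mm_per_unit=0.5) == 16.0
assert physical_size_mm(1, mm_per_unit=0.5) == 0.5
```

<P>Because stroke widths, corner radii, and touch targets are all expressed in the same non-physical unit, changing the single metric rescales every measurement together, which is exactly what keeps a design proportionally intact.</P> <P>&nbsp;</P> <P>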
At the default scale (of <CODE class="language-plaintext highlighter-rouge">1mm : 1u</CODE>) we get our standard, recommended 32mm touch target! However, most critically, we also have the freedom to rescale our measurements whenever we want, so we can <EM>scale up and down entire designs while maintaining design integrity.</EM></P> <P>&nbsp;</P> <P>Here's a RectTransform-based Dialog control, showing how even when we scale it from 0.5x to 2x, all of the branding and visual elements remain proportionally correct.</P> <P>&nbsp;</P> <CENTER><IMG src="https://zee2.github.iohttps://techcommunity.microsoft.com/images/canvas/canvasscaledemo.png" border="0" width="686" height="394" /></CENTER> <P>&nbsp;</P> <P>Now, when designers build layouts in external tools like Figma, they can deliver redlines to engineers or tech designers that can actually be implemented! By using design units rather than physical units, along with powerful realtime layout, flex, and alignment systems, we can implement much more modern and robust designs without resorting to manual placement and napkin math.</P> <P>&nbsp;</P> <H2>Volumetric UI in a Flat World</H2> <P>&nbsp;</P> <P>Interacting with 3D user interfaces, while at the same time following traditional 2D interface "metaphors" like clipping and scrolling is difficult. Our UI controls in MRTK3 are fundamentally real, solid objects, with depth, volume, and thickness.</P> <P>&nbsp;</P> <CENTER><VIDEO src="https://zee2.github.iohttps://techcommunity.microsoft.com/images/canvas/Carousel.mp4" autoplay="autoplay" loop="loop" muted="" width="750" height="350"></VIDEO></CENTER> <P>&nbsp;</P> <P>Containing these objects within UI metaphors like "scroll views" can be tricky; what does "clipping" look like in a volumetric context? 
Normally, UI has raycast/hit targets that are represented as 2D rectangles (or even alpha-tested bitmaps/hitmaps) that can overlay and intersect, and can be clipped by a 2D clipping rectangle.</P> <P>&nbsp;</P> <P>With 3D volumetric UI, what does that even look like? Unity UI generally functions with image-based raycast hit testing, as described above; that generally doesn't cut it for our volumetric UI, as we need full, physicalized colliders for our 3D pressing interactions and free-form 3D layout. Colliders can't easily be "clipped" like an image-based raycast target, right?</P> <P>&nbsp;</P> <P>As part of our effort to adopt existing Unity UI constructs like LayoutGroups and RectTransform hierarchy-based clipping, we've developed systems to clip volumetric UI and its corresponding physical colliders in a component-for-component compatible way with the existing Unity UI scroll view system. Colliders are clipped in a depth-correct (planar) way that allows 3D UI with thickness and volume to accurately conform to the bounds of a Unity UI scrollview/clipping rectangle, even when objects and colliders are partially clipped, or intersecting the edges of the clipping region.</P> <P>&nbsp;</P> <CENTER><VIDEO src="https://zee2.github.iohttps://techcommunity.microsoft.com/images/canvas/collider_clipping.mp4" autoplay="autoplay" loop="loop" muted="" width="650" height="350"></VIDEO></CENTER> <P>&nbsp;</P> <P>In previous iterations of MRTK, we've simply enabled or disabled colliders as they leave the footprint of the clipping region. This resulted in users accidentally pressing buttons that were 90% invisible/clipped, and buttons that were still visible being unresponsive. By accurately clipping colliders precisely to the bounds of the clipping region, we can have millimeter-accurate 3D UI interactions at the edge of a scroll view. 
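</P> <P>&nbsp;</P> <P>The core idea can be sketched as a simple clamp: the collider's footprint is clipped against the 2D clipping rectangle while its depth is preserved. This is an illustrative sketch, not MRTK3's actual implementation (which operates on Unity colliders inside the scroll view hierarchy):</P>

```python
# Simplified sketch of planar collider clipping (an illustration of the
# idea, not MRTK3's actual code). The collider's x/y footprint is clamped
# to the 2D clipping rectangle; depth (z) is left untouched, so partially
# clipped 3D buttons stay pressable only where they are visible.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Box:
    min_x: float
    min_y: float
    max_x: float
    max_y: float
    min_z: float  # depth bounds are never clipped
    max_z: float

def clip_collider(box: Box, rect: Box) -> Optional[Box]:
    """Clamp a collider's footprint to the clip rect (rect's z is ignored)."""
    min_x = max(box.min_x, rect.min_x)
    min_y = max(box.min_y, rect.min_y)
    max_x = min(box.max_x, rect.max_x)
    max_y = min(box.max_y, rect.max_y)
    if min_x >= max_x or min_y >= max_y:
        return None  # fully outside the scroll view: disable the collider
    return Box(min_x, min_y, max_x, max_y, box.min_z, box.max_z)

# A 32mm button straddling the top edge of a 100x100 clip region keeps
# only the visible sliver of its collider:
view = Box(0, 0, 100, 100, 0, 0)
clipped = clip_collider(Box(0, 90, 32, 122, 0, 8), view)
```

<P>A collider whose footprint falls entirely outside the rectangle is disabled outright; one that straddles the edge is shrunk to exactly its visible portion, with its depth intact.</P> <P>&nbsp;</P> <P>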
Even better, this is all based on the existing Unity UI layout groups and scroll components, so all of your scrolling physics remains intact, and is simultaneously compatible with traditional 2D input like mouse scroll wheels, multitouch trackpads, and touchscreens.</P> <P>&nbsp;</P> <H2>Input</H2> <P>&nbsp;</P> <P>Our 3D volumetric UI sits at the intersection of a huge number of input methods and platforms. Just in XR, you have</P> <UL> <LI>Gaze-pinch interaction (eye gaze targeted, hand tracking pinch and commit), with variable/analog pinching input</LI> <LI>Hand rays, with variable/analog pinching input</LI> <LI>Pressing/poking with hand tracking (any number of fingers!), with volumetric displacement</LI> <LI>Gaze-speech ("See-It-Say-It")</LI> <LI>Global speech (keyword-based)</LI> <LI>Motion controller rays (laser pointer), with analog input</LI> <LI>Motion controller poke (same volumetric displacement as hands!)</LI> <LI>Gaze dwell</LI> <LI>Spatial mouse (a la HoloLens 2 shell, Windows Mixed Reality shell)</LI> </UL> <P>The list expands even further when you consider flat-screen/2D input...</P> <UL> <LI>Touchscreen/multitouch</LI> <LI>2D mouse</LI> <LI>Gamepad</LI> <LI>Accessibility controllers</LI> <LI>Keyboard navigation</LI> </UL> <P>Unity's UI systems are great at 2D input. They offer out-of-the-box touchscreen, mouse, and gamepad input. They even offer rudimentary point-and-click-based XR input. However, when you look at the diversity and richness of the XR input space we try to solve with MRTK, basic Unity UI input is unfortunately inadequate.</P> <P>&nbsp;</P> <P>Unity UI input is fundamentally two-dimensional. The most obvious gap is with volumetric 3D pressing/poking interactions; sure, pointer events could be emulated from the intersection of a finger with a Canvas plane, but XR input is so much richer than that! 
Your finger can be halfway <EM>through</EM>&nbsp;a button, or your gaze-pinch interaction could be halfway-pinched, or maybe any number of combinations of hands, controllers, gaze, speech... When UI is a real, physical object with physical characteristics, dimensions, and volume, you need a richer set of interaction systems.</P> <P>&nbsp;</P> <P>Thankfully, MRTK3 leverages the excellent XR Interaction Toolkit from Unity, which is an incredibly flexible framework for describing 3D interaction and manipulation. It's more than powerful enough to describe all of our complex XR interactions, like poking, pressing, pinching, and gazing... but, it's not nearly as equipped to handle traditional input like mice, touchscreens, or gamepads. Hypothetically, we could re-implement a large chunk of the gamepad or mouse input that Unity UI already provides, but that sounds pretty wasteful! What if we could combine the best parts of XRI's flexibility with the out-of-the-box power of Unity's UI input?</P> <P>&nbsp;</P> <P>We do just that with our component library, in a delicate dance of adapters and conversions. Each MRTK3 UI control is <STRONG>both</STRONG> a UnityUI <CODE class="language-plaintext highlighter-rouge">Selectable</CODE> and an XRI <CODE class="language-plaintext highlighter-rouge">Interactable</CODE>, simultaneously! This gives us some serious advantages vs only being one or the other.</P> <P>&nbsp;</P> <P>Our XRI-based interactors can perform rich, detailed, 3D interactions on our UI controls, including special otherwise-impossible behaviors like analog "selectedness" or "pressedness" (driven by pinching, analog triggers, or your finger pressing the surface of the button!). 
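</P> <P>&nbsp;</P> <P>The dual Selectable/Interactable approach can be reduced to a small pattern. The sketch below is illustrative Python (the real implementation is C# on top of Unity UI and XRI, and every name here is hypothetical): UnityUI-style pointer events are translated into XRI-style hover/select operations on a proxy, so every input path converges on the same interactable events.</P>

```python
# Illustrative reduction of the dual Selectable/Interactable pattern (the
# real MRTK3 implementation is C# on Unity UI + XRI; all names here are
# hypothetical). UnityUI-style pointer events are translated by a proxy
# interactor into XRI-style hover/select, so touchscreen, mouse, gamepad,
# and hand input all converge on the same interactable events.

class Interactable:
    """Stand-in for the XRI interactable: the single source of truth."""
    def __init__(self):
        self.events = []

    def hover_enter(self):
        self.events.append("hover_enter")

    def select_enter(self):
        self.events.append("select_enter")

    def select_exit(self):
        # In XRI-style interaction, a completed select is a "click".
        self.events.append("clicked")

class ProxyInteractor:
    """Translates UnityUI pointer events into operations on the interactable."""
    def __init__(self, target: Interactable):
        self.target = target

    # UnityUI-side entry points (OnPointerEnter/Down/Up equivalents):
    def on_pointer_enter(self):
        self.target.hover_enter()

    def on_pointer_down(self):
        self.target.select_enter()

    def on_pointer_up(self):
        self.target.select_exit()

# A touchscreen tap produces the same event stream a hand-tracked poke
# would, because both paths end at the interactable:
button = Interactable()
tap = ProxyInteractor(button)
tap.on_pointer_enter()
tap.on_pointer_down()
tap.on_pointer_up()
```

<P>With this shape, a touchscreen tap, a gamepad press, and a hand-tracked poke all end at the same <CODE class="language-plaintext highlighter-rouge">Interactable</CODE> events, so UI code only ever subscribes in one place.</P> <P>&nbsp;</P> <P>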
At the same time, however, we get touchscreen input, mouse input, and even directional gamepad navigation and input as well, without needing to implement any of the input handling ourselves.</P> <P>&nbsp;</P> <P>We achieve this by translating incoming UnityUI events (from the <CODE class="language-plaintext highlighter-rouge">Selectable</CODE>) into instructions for XRI. The XRI <CODE class="language-plaintext highlighter-rouge">Interactable</CODE> is the final source of truth in our UI; the only “click” event developers need to subscribe to is the <CODE class="language-plaintext highlighter-rouge">Interactable</CODE> event. However, using a proxy interactor, we translate UnityUI events (like <CODE class="language-plaintext highlighter-rouge">OnPointerDown</CODE>) into an equivalent operation on the proxy interactor (like an XRI Select or Hover). That way, any UnityUI input, such as a touchscreen press or gamepad button press, is translated into an equivalent XRI event, and all codepaths converge to the same <CODE class="language-plaintext highlighter-rouge">OnClicked</CODE> or <CODE class="language-plaintext highlighter-rouge">OnHoverEnter</CODE> result.</P> <P>&nbsp;</P> <P>The flow of how these events get propagated through the different input systems and interactors is detailed in this huge diagram... feel free to open in a new tab to read closer!</P> <CENTER><IMG src="https://zee2.github.iohttps://techcommunity.microsoft.com/images/canvas/unification_diagram.svg" border="0" /></CENTER> <P>What this means for your UI is that you can use the exact same UI prefabs, the exact same events, and the exact same UI layouts across a staggeringly huge number of devices and platforms, all while retaining the rich, volumetric, delightful details that MRTK UX is known for.</P> <P>&nbsp;</P> <H2>Wrap-up</H2> <P>&nbsp;</P> <P>This post could go on for several dozen more pages about all of the new MRTK3 UX tooling and features… but, the best way to explore it is to go build! 
<A href="#" target="_blank" rel="noopener">Check out our documentation for the new systems here</A>, and learn how to set up your own new MRTK3 project <A href="#" target="_blank" rel="noopener">here!</A> Alternatively, you can directly check out our sample project by cloning this git repository at the <CODE class="language-plaintext highlighter-rouge">mrtk3</CODE> branch <A href="#" target="_blank" rel="noopener">here</A>.</P> <P>&nbsp;</P> <P>In the future, I’ll be writing some more guides, teardowns, and tutorials on building volumetric UI with the new tooling. In the meantime, you can <A href="#" target="_blank" rel="noopener">check out my talk at MR Dev Days 2022</A>, where I go over most of the topics in this post, plus a breakdown of building a real UI layout.</P> <P>&nbsp;</P> <P>I hope you've enjoyed this deep dive into some of the new UI systems we've built for MRTK3, and we can't wait to see what you build with our new tools. Personally, I love gorgeous, rich, expressive, and delightful UI, and I can't wait to see all of the beautiful things that the community can cook up.</P> <P>&nbsp;</P> <P>&nbsp;</P> <CENTER><VIDEO src="https://zee2.github.iohttps://techcommunity.microsoft.com/images/canvas/handmenu.mp4" autoplay="autoplay" loop="loop" muted="" width="674" height="374"></VIDEO></CENTER> Tue, 20 Sep 2022 18:36:46 GMT https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/building-volumetric-ui-with-mrtk3/ba-p/3631764 FinnSinclair 2022-09-20T18:36:46Z Create Low-Code MR Apps with Power Apps https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/create-low-code-mr-apps-with-power-apps/ba-p/3629444 <P><SPAN data-contrast="auto">For the past month, our Cloud Advocacy team has shared projects with you all that could be created with the Mixed Reality Toolkit. 
For this month, we're taking a bit of a detour to showcase a Mixed Reality project that can be created with Microsoft Power Apps!</SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P>&nbsp;</P> <P><SPAN data-contrast="auto">Power Apps is a suite of apps, services, and a data platform that provides a rapid, low-code development environment for building custom, organization-specific applications. Apps built with Power Apps can encode rich business logic that transforms manual business processes into automated ones; they run in the browser or on mobile devices, and they can be created without writing traditional code.</SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-contrast="auto"> </SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-contrast="auto">Power Apps provides an extensible platform for users with minimal coding experience to develop applications with rich business logic. In addition, creators find it easy to interact with data and metadata while using Power Apps.</SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-contrast="auto"> </SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-contrast="auto">Through Power Apps, you can create canvas applications. 
Microsoft Power Apps allows you to build business apps from a canvas with minimal coding. Creating a canvas app in Power Apps is as easy as dragging and dropping components onto the canvas, much like designing a PowerPoint presentation. You can also integrate business logic through various data sources.</SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-contrast="auto"> </SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-contrast="auto">For this week, we put Power Apps to the test to create an app that displays 3D models both directly on screen as well as in Mixed Reality! The result? An awesome learning experience that leverages the Smithsonian 3D API and open-source models! Joining me this week is Daniel Laskewitz, Sr. Cloud Advocate within our Power/Fusion Cloud Advocacy team. 
We partnered together to create this project and we're excited to tell you more about it!</SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P>&nbsp;</P> <P><STRONG><SPAN data-contrast="auto">The Idea</SPAN></STRONG><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-contrast="auto">In my last post, I shared with you all a </SPAN><A href="https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/apollo-11-vr-exhibit-with-azure-text-to-speech-amp-mrtk/ba-p/3606561" target="_blank"><SPAN data-contrast="none">VR museum experience</SPAN></A><SPAN data-contrast="auto"> that leveraged the Apollo 11 mission models provided by the </SPAN><A href="#" target="_blank"><SPAN data-contrast="none">Smithsonian 3D Digitization</SPAN></A><SPAN data-contrast="auto"> project. I wanted to explore other collections and thus landed on the </SPAN><A href="#" target="_blank"><SPAN data-contrast="none">Coral Collection</SPAN></A><SPAN data-contrast="auto">. As I thought about how I could turn this collection into a learning opportunity, I thought it'd be cool to bring the models into my own space and learn more about coral both in-app and in Mixed Reality! What I envisioned would be an app that provides a selection of coral to both learn facts and view the models in 'real life' without the need to visit the museum. 
The added convenience of bringing the museum experience to me was the icing on the cake.</SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-contrast="auto"> </SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-contrast="auto">One of the cool parts about the Power Platform is that there are many data sources to connect to. There are already more than 800 available out of the box! But, even when your data source is not one of those 800 connectors, you can create your own connector. The Smithsonian 3D Digitization project also has an API available which offers the possibility to search for 3D objects. If you could combine that with the idea of the app with the collection of corals, it would be a real killer app.</SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P>&nbsp;</P> <P><STRONG><SPAN data-contrast="auto">Finding Inspiration</SPAN></STRONG><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-contrast="auto">I'm a huge fan of using Dribbble and Pinterest to find inspiration for creating projects. For this project, I went down the rabbit hole of searching for </SPAN><I><SPAN data-contrast="auto">AR Museum Apps</SPAN></I><SPAN data-contrast="auto">. Once I exhausted that option, I began to search for</SPAN><I><SPAN data-contrast="auto"> AR Learning Apps</SPAN></I><SPAN data-contrast="auto">. 
Having this variety in form and function provided insight into the various ways creators are designing UI for AR learning experiences.</SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P>&nbsp;</P> <P><SPAN data-contrast="auto">After browsing the work of many creative designers, I settled on the following design for the app:</SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="A mockup of the app." style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/405726i44E75D0453704888/image-size/large?v=v2&amp;px=999" role="button" title="mock-up.png" alt="A mockup of the app." /><span class="lia-inline-image-caption" onclick="event.preventDefault();">A mockup of the app.</span></span></P> <P>&nbsp;</P> <P>&nbsp;</P> <P><STRONG><SPAN data-contrast="auto">Creating the Main Screen</SPAN></STRONG><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-contrast="auto">If this app were ever to be created as a full-fledged experience, I'd imagine that there'd be an introduction for each collection available in the app. Keeping that in mind, I decided to create an introduction screen for the Corals and Coral Reefs collection. I leveraged a very beautiful image on </SPAN><A href="#" target="_blank"><SPAN data-contrast="none">Unsplash by Scott Web</SPAN></A><SPAN data-contrast="auto"> for the background and configured some basic Power Apps components. I wanted to maintain the vibrancy of the sea urchins while also ensuring that the text on the screen would be legible. 
With that said, I added a black rectangle component behind the collection description and lowered the opacity so that the white text would stand out more without taking away from the beauty of the sea urchins. Another thing I made sure to do was try to incorporate roundness as best as possible. The default Power Apps button is round; however, I desired more curvature. Therefore, I increased the border radius to 50 and got exactly what I wanted!</SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Introduction screen for Corals and Coral Reefs collection." style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/405728i41D866A4EC6AA717/image-size/large?v=v2&amp;px=999" role="button" title="intro-screen.png" alt="Introduction screen for Corals and Coral Reefs collection." /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Introduction screen for Corals and Coral Reefs collection.</span></span></P> <P>&nbsp;</P> <P><SPAN data-contrast="auto"> </SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><STRONG><SPAN data-contrast="auto">Storing Models on OneDrive</SPAN></STRONG><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-contrast="auto">With the </SPAN><I><SPAN data-contrast="auto">Introduction</SPAN></I><SPAN data-contrast="auto"> screen complete, I needed to find a place to store the models. We ran into a bit of a hiccup while trying to render the models in Mixed Reality using the API; therefore, we needed an alternative solution. 
Fortunately, the models could be saved to OneDrive, and a data connector could then be made in Power Apps to reference them. I'll admit, this part of the project took quite a bit of setup because I had to both download the models and properly structure everything within the OneDrive folder. When you reference models from OneDrive, you do so from an Excel spreadsheet that contains relative links to the models within the overall project folder. </SPAN></P> <P>&nbsp;</P> <P><SPAN data-contrast="auto">I'll break it down for you:</SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="The folder structure for the project assets." style="width: 842px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/405734i8F81C2E0CCDCF596/image-size/large?v=v2&amp;px=999" role="button" title="folder-structure.jpg" alt="The folder structure for the project assets." /><span class="lia-inline-image-caption" onclick="event.preventDefault();">The folder structure for the project assets.</span></span></P> <P><SPAN data-contrast="auto">Essentially, I had to first create a folder for the models and a folder for the photos of the coral. I made sure to follow a simple naming convention that could be repurposed for naming models and photos - this proved to be beneficial for my memory! Then in the Excel spreadsheet, I created a row for each coral which included its species, a description, a photo, and a model. 
The photo and model columns contain a relative link to the location of the coral's model and photo.</SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-contrast="auto"> </SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Excel spreadsheet table for the coral." style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/405731i30C6D4CE3E3D186B/image-size/large?v=v2&amp;px=999" role="button" title="table.jpg" alt="Excel spreadsheet table for the coral." /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Excel spreadsheet table for the coral.</span></span></P> <P>&nbsp;</P> <P><SPAN data-contrast="auto">After all the corals were added to the spreadsheet, I had to create a table of the data - this part (like all parts) is crucial because Power Apps pulls data from the table.</SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P>&nbsp;</P> <P><STRONG><SPAN data-contrast="auto">Creating the Galleries</SPAN></STRONG><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-contrast="auto">After making the data connection in Power Apps to OneDrive, I was ready to create the galleries. After assigning the table as the data source, I configured the formula for the gallery components to display the image for the coral. 
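Since Power Apps pulls everything from that table, a broken relative link only surfaces later as an empty gallery tile. As a side note, the link convention can be sanity-checked locally before uploading to OneDrive with a small script. This is a hypothetical sketch, not part of the project; the folder names and row format are assumptions:

```python
from pathlib import Path

def find_broken_links(project_root, rows):
    """rows: (species, photo_link, model_link) tuples as entered in the
    Excel table, where links are relative to the project folder.
    Returns the (species, link) pairs that don't resolve to a file."""
    root = Path(project_root)
    broken = []
    for species, photo, model in rows:
        for link in (photo, model):
            if not (root / link).is_file():
                broken.append((species, link))
    return broken
```

Running something like this against the downloaded assets catches naming-convention typos before the data connector ever sees them.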
I didn't want to take up too much space on the screen with words - especially since species names are relatively long. Instead, I opted to show just the image. I also added more roundness to the UI by increasing the border radius of the images so that their corners were rounded. It really gave the gallery a different look and feel compared to the default straight edges and corners.</SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P>&nbsp;</P> <P><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}"><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Gallery screen for the app." style="width: 562px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/405729i55E1DC10031EBFD0/image-size/large?v=v2&amp;px=999" role="button" title="gallery.png" alt="Gallery screen for the app." /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Gallery screen for the app.</span></span></SPAN></P> <P>&nbsp;</P> <P>&nbsp;</P> <P><STRONG><SPAN data-contrast="auto">Creating the Information Screens</SPAN></STRONG><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-contrast="auto">The final step for this portion of the project was to add in the 3D and Mixed Reality components. Power Apps provides a variety of Mixed Reality controls. 
For this project, I chose to integrate View in 3D and View in Mixed Reality:</SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-contrast="auto"> </SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><STRONG><SPAN data-contrast="auto">View in 3D</SPAN></STRONG><SPAN data-contrast="auto"> - The View in 3D control enables you to view 3D content in the app. You can rotate and zoom into the model with simple gestures.</SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-contrast="auto"> </SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><STRONG><SPAN data-contrast="auto">View in Mixed Reality</SPAN></STRONG><SPAN data-contrast="auto"> - The View in MR control enables you to see how a particular item might fit within a specified space. The control creates a button in your app. 
When the button is pressed, an overlay of the selected 3D model (in .glb, .stl, or .obj file formats) displays onto the live camera feed of the device.</SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-contrast="auto"> </SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-contrast="auto">After adding the necessary components to the screen, I dragged and dropped in the Mixed Reality components to add the wow-factor to the app. And honestly, it was like magic! At most, I needed to reference the model selected in the gallery in the Power Apps formula bar - I promise it was very easy! Specifically: </SPAN><STRONG><SPAN data-contrast="auto">galCoral.Selected.'3DModel'</SPAN></STRONG><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Information screen for the app." style="width: 553px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/405730i3957A283B81BBDD4/image-size/large?v=v2&amp;px=999" role="button" title="info-screen.png" alt="Information screen for the app." 
/><span class="lia-inline-image-caption" onclick="event.preventDefault();">Information screen for the app.</span></span></P> <P>&nbsp;</P> <P>&nbsp;</P> <P><SPAN data-contrast="auto">I'll hand things over to Daniel now to share more about the work he did for the custom connector - Daniel, take it away!</SPAN></P> <P><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <H2><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">Creating a Custom Connector</SPAN></H2> <P>&nbsp;</P> <P><SPAN data-contrast="auto">I have been working on the </SPAN><A href="#" target="_blank"><SPAN data-contrast="none">Independent Publisher Connectors program</SPAN></A><SPAN data-contrast="auto"> for over a year now. This program enables you to create connectors for the Power Platform and make them available for every user of the Power Platform, without having to be the owner of the API. If you would like to build a connector for a service that you use, you’re welcome to submit a connector for that in the </SPAN><A href="#" target="_blank"><SPAN>Power Platform Connectors GitHub repository</SPAN></A><SPAN data-contrast="auto">.</SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P>&nbsp;</P> <P><SPAN data-contrast="auto">I love building connectors for the Power Platform. So, when I heard about the API that’s available from the Smithsonian 3D Digitization project, I wanted to build a connector for that API immediately. 
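As an aside, a search API with a single operation is about the simplest thing a connector can wrap. Here is a hedged Python sketch of the shape of such a call; the endpoint URL and parameter names are hypothetical placeholders for illustration, not taken from the actual Smithsonian 3D API definition:

```python
from urllib.parse import urlencode
from urllib.request import urlopen
import json

# Hypothetical endpoint - check the real API definition before use.
SEARCH_ENDPOINT = "https://example.org/3d-api/file/search"

def build_search_url(query, max_rows=10):
    """Build the URL for the API's single file-search operation."""
    return f"{SEARCH_ENDPOINT}?{urlencode({'q': query, 'rows': max_rows})}"

def search(query, max_rows=10):
    """Perform the search and decode the JSON response."""
    with urlopen(build_search_url(query, max_rows)) as resp:
        return json.load(resp)
```

A connector definition essentially captures this one request shape (host, path, query parameters, response schema) so makers can call it declaratively from their apps.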
The Smithsonian 3D API has only one operation – the File Search operation – so that makes it easy to develop a connector for it!&nbsp;</SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P>&nbsp;</P> <P><SPAN data-contrast="auto">When you build a connector, always think about the person who will use it later. In the Power Platform, it could very well be that other app makers also want to include your connector in their apps, so why not make it user friendly?</SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P>&nbsp;</P> <P><SPAN data-contrast="auto">On top of building the connector, I also created a search screen to search for 3D objects and a screen where you can look at and interact with the 3D object.</SPAN></P> <P>&nbsp;</P> <P><SPAN data-contrast="auto"><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Smithsonian 3D Search UI" style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/405722i3B0202460E675004/image-size/large?v=v2&amp;px=999" role="button" title="MicrosoftTeams-image (5).png" alt="Smithsonian 3D Search UI" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Smithsonian 3D Search UI</span></span></SPAN></P> <P>&nbsp;</P> <P>&nbsp;</P> <P><STRONG><SPAN data-contrast="auto">Workshop</SPAN></STRONG><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-contrast="auto">Oh - before we go, we have to tell you something exciting! We turned this entire project into a self-led workshop! 
To help you create and host your own Power Apps workshop, we collaborated to create an entire workshop equipped with a slide deck and complete instructions from start to finish. You can find the workshop by visiting: </SPAN><A href="#" target="_blank"><SPAN><STRONG>https://aka.ms/mr-power-platform</STRONG></SPAN></A> <SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-contrast="auto"> </SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-contrast="auto">The workshop consists of 5 labs and all the assets you'll need to create your own version of the app we created. We've even provided the full Power Apps solutions as well. You're welcome to swap out the models for your own! If you have any questions, feel free to submit a GitHub Issue and we'll follow up with you in the repository.</SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}">&nbsp;</SPAN></P> Fri, 16 Sep 2022 16:57:41 GMT https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/create-low-code-mr-apps-with-power-apps/ba-p/3629444 April_Speight 2022-09-16T16:57:41Z Apollo 11 VR Exhibit with Azure Text to Speech &amp; MRTK https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/apollo-11-vr-exhibit-with-azure-text-to-speech-amp-mrtk/ba-p/3606561 <P><EM>Made with MRTK is a monthly series where we share sample projects that leverage the Mixed Reality Toolkit (MRTK). You’re welcome to leverage the resources provided to create your own experiences or iterate on the samples provided. 
For those new to MRTK, the toolkit provides a set of components and features used to accelerate cross-platform MR development in Unity. To learn more, visit<SPAN>&nbsp;</SPAN><A href="#" target="_blank" rel="noopener noreferrer">aka.ms/mrtk</A>!</EM></P> <P>&nbsp;</P> <P>Growing up in the DC area, I loved everything there was to love about visiting the Smithsonian Air and Space Museum! Upon entering the museum, you’re greeted by massive aircraft and the chance to see so many awesome artifacts in-person. I’ve personally always been fond of viewing the space exploration exhibits. Over the years, I’ve had my fair share of space exploration encounters – from visiting NASA, to <A href="#" target="_self">interviewing Astronaut Stanley G. Love</A> for Microsoft Build, and even a <A href="#" target="_self">Q&amp;A with Bill Nye the Science Guy</A> where we discussed The Planetary Society – his organization in which I’m a proud member!</P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="April standing in front of airplanes at NASA Armstrong" style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/397965iC29CD6788BD28467/image-size/large?v=v2&amp;px=999" role="button" title="IMG_20200107_114106_Original.jpg" alt="April standing in front of airplanes at NASA Armstrong" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">April standing in front of airplanes at NASA Armstrong</span></span></P> <P>&nbsp;</P> <P>For this week’s <EM>Made with MRTK</EM> project, I decided to revisit my love of space exploration and bring the educational experience into VR. 
I created my very own interactive Apollo 11 exhibit with 3D models courtesy of the Smithsonian 3D Digitization project and NASA.</P> <P>&nbsp;</P> <H2><FONT size="5">Models</FONT></H2> <P>&nbsp;</P> <P>I discovered <A href="#" target="_blank" rel="noopener">Smithsonian 3D Digitization</A> a few years ago when I first began exploring educational experiences for AR and VR. This resource provides open-source models of artifacts across the Smithsonian museums. Since I wanted to create a space exploration exhibit, I headed over to their collections and saw that there was already a small collection of models for <A href="#" target="_blank" rel="noopener">Apollo 11</A>. Not only do they provide the models, but there’s also a description to accompany the models as well. This turned out to be really helpful as I was able to leverage the description provided when creating the text for the exhibit.</P> <P>&nbsp;</P> <P>The other resource I used for models is <A href="#" target="_blank" rel="noopener">NASA Astromaterials 3D</A>. I honestly just learned about this resource while working on the project. I had thought to myself “How cool would it be to have an exhibit of Moon rocks?”. Lo and behold, after a quick search online, I came across this website. There’s an entire <A href="#" target="_blank" rel="noopener">Apollo Lunar Collection</A> which consists of rocks from various Apollo missions. The files can be pretty heavy, so I recommend using the low-resolution files which are just a few megabytes. Likewise, NASA also provides a detailed description for the models.</P> <P>&nbsp;</P> <H2>Exhibit Inspiration</H2> <P>&nbsp;</P> <P>The layout of museum exhibits has always fascinated me. They’re all so different in design with regards to the colors, lighting, signage, and surfaces for artifacts. I headed over to Pinterest for some inspiration to help me decide how to lay out the exhibits within the experience. 
I took a scroll through the results for <EM>Space Museum Exhibits</EM> to get an idea of how various museums lay out and design their own exhibits. A common theme I encountered was that the rooms were relatively dark with just the right amount of mood lighting – which made sense given that it's pretty dark in outer space. I also went on a deep dive in search of examples for exhibit signage. Given that there’s so much to read when viewing a museum exhibit, I wanted to ensure the written parts of the exhibit were both eye-catching and legible.</P> <P>&nbsp;</P> <P>Since I wanted to feature the models in the Apollo 11 collection and some of the lunar rocks, I narrowed down my exhibit layout to the following:</P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Exhibit layout" style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/398351i25B20BA06A9610AB/image-size/large?v=v2&amp;px=999" role="button" title="ExhibitLayout.png" alt="Exhibit layout" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Exhibit layout</span></span></P> <P>&nbsp;</P> <H2>Designing the Environment</H2> <P>&nbsp;</P> <P>Lately, I’ve been diving into world building and wanted to push myself outside my comfort zone for this project. Typically, whenever I create a simple proof of concept, I don’t put much effort into creating a captivating environment. However, I truly wanted to emulate the feeling of being in a museum. And not just any museum – I wanted it to feel like being in a museum in outer space! I swapped out the usual default sunny horizon in Unity for a <A href="#" target="_self">space skybox</A> that I found in the Unity Asset Store. The skybox I used provides a view of Earth from space!</P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Skybox for the environment is a view of Earth." 
style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/397974i7F878A356AEC005C/image-size/large?v=v2&amp;px=999" role="button" title="skybox.jpg" alt="Skybox for the environment is a view of Earth." /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Skybox for the environment is a view of Earth.</span></span></P> <P>Although there’s not a lot of light in space, I wanted to ensure that everything within the exhibit was still visible. Rather than brightening the entire environment with one bright directional light, I opted for gallery lights – or at least the essence of gallery lights! From my search on Pinterest, I saw that most exhibits rely on strategically placed spotlights to brighten the artifacts. I found a <A href="#" target="_self">model of gallery lights on Sketchfab</A> and paired a spotlight in Unity with each light so that it could look as though the light was coming from the gallery lights.</P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Gallery lights above the exhibits." style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/397975i113552DA304F3F96/image-size/large?v=v2&amp;px=999" role="button" title="gallery-lights.jpg" alt="Gallery lights above the exhibits." /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Gallery lights above the exhibits.</span></span></P> <P>&nbsp;</P> <P>As for the exhibit platforms, I used Unity primitives and modified the shapes based on examples I had seen on Pinterest. I headed into Blender to create the exhibit titles so that there would be some depth to the titles in comparison to the exhibit descriptions. This <A href="#" target="_self">YouTube video</A> helped me with creating the text as I had never created 3D text in Blender before. 
It’s trickier than it sounds given that the geometry tends to get a bit messy.</P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="'Neil Armstrong' written in 3D" style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/397977i5B3F1AAD191D11DE/image-size/large?v=v2&amp;px=999" role="button" title="neil-armstrong.jpg" alt="'Neil Armstrong' written in 3D" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">'Neil Armstrong' written in 3D</span></span></P> <P>&nbsp;</P> <P>As for the images within the exhibit, the Smithsonian had plenty to leverage across their websites. For example, the ones used for the timeline are courtesy of their very own <A href="#" target="_blank" rel="noopener">Apollo 11 Timeline</A> which is part of the National Air and Space Museum website.</P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Photos within the Apollo 11 Timeline" style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/397978i51884745153BE3B6/image-size/large?v=v2&amp;px=999" role="button" title="timeline-photos.jpg" alt="Photos within the Apollo 11 Timeline" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Photos within the Apollo 11 Timeline</span></span></P> <P>&nbsp;</P> <H2>Interacting with Artifacts</H2> <P>&nbsp;</P> <P>Adding an element of interactivity to a museum always makes the experience <EM>that</EM> much more exciting for me! In the physical world, it’s not often that you can touch and grab the artifacts in a museum. You often must admire from afar. Since I was creating this experience in VR, I decided to break the rules a bit and make it so that you <EM>could</EM> interact with the artifacts. 
I chose to add interactivity to the Extra-Vehicular Gloves.</P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Extra-Vehicular Gloves" style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/397979i94573644B581FDF8/image-size/large?v=v2&amp;px=999" role="button" title="gloves.jpg" alt="Extra-Vehicular Gloves" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Extra-Vehicular Gloves</span></span></P> <P>&nbsp;</P> <P>Using MRTK’s <A href="#" target="_self">Object Manipulator</A> and <A href="#" target="_self">Near Interaction Grabbable</A> scripts, I made it so that you could pick up and view the gloves from either a near or far distance. I’m honestly glad I did because there’s a lot of little detail that would’ve been missed if the gloves were stationary. For example, Armstrong is written on the inside of each glove – a detail that would surely be missed if a person couldn’t pick up the glove to see.</P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="'Armstrong' is written on the inside of the gloves" style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/397982iB287A96A2A451FC0/image-size/large?v=v2&amp;px=999" role="button" title="inside-glove.png" alt="'Armstrong' is written on the inside of the gloves" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">'Armstrong' is written on the inside of the gloves</span></span></P> <P>&nbsp;</P> <H2>Azure Text to Speech</H2> <P>&nbsp;</P> <P>I’m always a big fan of leveraging our <A href="#" target="_self">Azure Cognitive Services</A> in my projects – especially when it involves our <A href="#" target="_self">Speech services</A>! Whether it’s to make an experience more accessible or to provide an auditory layer to the experience, leveraging this service is always a favorite of mine. 
I chose to include an audio transcription of the exhibit descriptions to accompany the Neil Armstrong and Command Module exhibits. With <A href="#" target="_self">Azure Text to Speech</A>, I was able to send a string of text to the service, which returned an audio clip of the transcribed speech. We have a <A href="#" target="_self">Speech SDK for Unity</A> that needs to be imported into the project - and you'll need to create a <A href="#" target="_self">Speech resource in the Azure Portal</A>!</P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Pressing the button plays an audio clip of the written text" style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/397984i0FE91B8CCE815F70/image-size/large?v=v2&amp;px=999" role="button" title="transcription.jpg" alt="Pressing the button plays an audio clip of the written text" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Pressing the button plays an audio clip of the written text</span></span></P> <P>&nbsp;</P> <P>I created a script which contains a method to trigger the workflow. To trigger the method, I created a custom button for each exhibit. When pressed, the method executes and, in less than a second, the audio clip returned by Azure plays in the scene. The custom button is configured with the MRTK Pressable Button and Interactable scripts. Configuring this button was a little tricky at first given that I had the wrong collider on the button. I used a cylinder primitive as the base of the button, which gave it a capsule collider. When creating a custom button with MRTK, the Pressable Button script provides layers which represent the distance values configured in the Press Settings. The capsule collider placed the layers in the wrong direction, so I had to swap it out for a box collider.</P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Press Setting Layers" style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/397985i807AF30629BCC511/image-size/large?v=v2&amp;px=999" role="button" title="button-layers.jpg" alt="Press Setting Layers" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Press Setting Layers</span></span></P> <P>&nbsp;</P> <H2>Demo</H2> <P>&nbsp;</P> <P>I recorded a demo of the experience viewed directly from my Meta Quest – viewable via the Unity Editor in Play mode. Feel free to check out the demo:</P> <P>&nbsp;</P> <P><LI-VIDEO vid="https://www.youtube.com/watch?v=H5a1kmKHmoo" align="center" size="large" width="600" height="338" uploading="false" thumbnail="https://i.ytimg.com/vi/H5a1kmKHmoo/hqdefault.jpg" external="url"></LI-VIDEO></P> <P>&nbsp;</P> <P>I’ve also created a <A href="#" target="_self">GitHub repository</A> for the project. 
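The text-to-speech round trip described above is easiest to see in the REST form of the service. Below is a minimal Python sketch of that flow; the endpoint shape and headers follow the Azure Text to Speech REST API as I understand it (verify against the current docs), the voice name is just an example, and the project itself uses the Speech SDK for Unity in C# rather than this code:

```python
from urllib.request import Request, urlopen

def build_ssml(text, voice="en-US-JennyNeural"):
    """Wrap plain exhibit text in the SSML envelope the service expects."""
    return (
        "<speak version='1.0' xml:lang='en-US'>"
        f"<voice name='{voice}'>{text}</voice></speak>"
    )

def synthesize(text, key, region):
    """POST SSML to the TTS endpoint; returns the audio bytes (WAV here)."""
    req = Request(
        f"https://{region}.tts.speech.microsoft.com/cognitiveservices/v1",
        data=build_ssml(text).encode("utf-8"),
        headers={
            "Ocp-Apim-Subscription-Key": key,
            "Content-Type": "application/ssml+xml",
            "X-Microsoft-OutputFormat": "riff-24khz-16bit-mono-pcm",
        },
    )
    with urlopen(req) as resp:
        return resp.read()
```

In the exhibit, the equivalent SDK call runs inside the custom button's pressed handler, and the returned clip is then played in the scene.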
As-is, the experience can be viewed on a Meta Quest while in Play mode within Unity by using the Quest<SPAN>&nbsp;Link cable. If you intend to view the experience for yourself, be sure to follow the instructions provided in the README as you’ll need to create an Azure Speech resource and add your own Key and Region into the Unity project to generate the transcribed audio clips.</SPAN></P> <P>&nbsp;</P> <H2>Conclusion</H2> <P>&nbsp;</P> <P>Creating VR museum experiences such as the Apollo 11 exhibit shines a light on how many resources are available for us to create our own educational experiences from scratch. Resources such as Smithsonian 3D Digitization and NASA Astromaterials 3D provide the assets that are often hardest to source when creating VR experiences – the models! It’s great that each resource also provides accompanying descriptions to help bring more context to an experience. I hope that this project inspires you to create your own educational experiences - not just for VR but for AR as well! 
In a future post, I’ll showcase a similar project for AR that leverages Microsoft Power Apps.</P> <P>&nbsp;</P> <P>Until next time, happy creating!</P> Tue, 23 Aug 2022 23:23:08 GMT https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/apollo-11-vr-exhibit-with-azure-text-to-speech-amp-mrtk/ba-p/3606561 April_Speight 2022-08-23T23:23:08Z MR Dev Days Hackathon Winners - explore projects made with MRTK3 Public Preview & StereoKit https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/mr-dev-days-hackathon-winners-explore-projects-made-with-mrtk3/ba-p/3600378 <P data-unlink="true">&nbsp;</P> <P data-unlink="true">At <A href="#" target="_blank" rel="noopener">Mixed Reality Dev Days</A> in June, we announced the availability of <A href="https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/mrtk3-public-preview/ba-p/3556892" target="_blank" rel="noopener">MRTK3 Public Preview</A>, <A href="https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/point-cloud-support-in-azure-remote-rendering/ba-p/3484253" target="_blank" rel="noopener">point cloud support in Azure Remote Rendering</A>, and the kickoff of a month-long online hackathon with the theme "A New Way to Solve an Old Problem".&nbsp; Check out videos of all the qualifying projects in the hackathon <A href="#" target="_blank" rel="noopener">project gallery</A>.</P> <P>&nbsp;</P> <P>We are so proud of the hackers who participated and thankful for their valuable feedback. 
The issues, feature requests, and use cases surfaced will help make MRTK3 Public Preview and StereoKit even better for all developers.</P> <P>&nbsp;</P> <P>And the winners are….</P> <P>&nbsp;</P> <P><STRONG>In 3rd Place ($2,500 prize) - </STRONG><A href="#" target="_blank" rel="noopener"><STRONG>MRToolbelt</STRONG></A> <STRONG>by Aristides Staikos</STRONG></P> <P>&nbsp;</P> <TABLE style="border-style: none; width: 100%;" border="1" width="100%"> <TBODY> <TR> <TD width="50%"><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Side of table measured with 2 fingers" style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/396032iB64BA478A99A94DA/image-size/large?v=v2&amp;px=999" role="button" title="toolbelt fingers.png" alt="Side of table measured with 2 fingers" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Side of table measured with 2 fingers</span></span></TD> <TD width="50%"><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Area of floor being estimated with blue box" style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/396033iFEA53C30BBAC2877/image-size/large?v=v2&amp;px=999" role="button" title="toolbelt area.png" alt="Area of floor being estimated with blue box" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Area of floor being estimated with blue box</span></span></TD> </TR> </TBODY> </TABLE> <P>&nbsp;</P> <P>If you’ve ever needed to take and record a measurement, but didn’t have the right tools, <A href="#" target="_blank" rel="noopener">MRToolbelt</A> can help. 
Measure length with your fingers or estimate area or volume, then record it with photos, videos, or voice notes.</P> <P>&nbsp;</P> <P>To build MRToolbelt, Aristides leveraged spatial mapping and meshing provided in AR Foundation as well as the hand tracking subsystem, speech subsystem, and phrase recognition capabilities in MRTK3 Public Preview. Aristides also learned a lot about the XR Interaction Toolkit while building MRToolbelt and plans to add in object recognition and the ability to generate step-by-step instructions in the future.</P> <P>&nbsp;</P> <P><STRONG>In 2nd Place ($5,000 prize) - </STRONG><A href="#" target="_blank" rel="noopener"><STRONG>Xposure Therapy</STRONG></A><STRONG> by Flavius Lador, David-Bobola Ojoawo, and Jack Daus</STRONG></P> <P>&nbsp;</P> <TABLE style="border-style: none; width: 100%;" border="1" width="100%"> <TBODY> <TR> <TD width="33.333333333333336%"><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Menu and two white balls roughly representing spider" style="width: 700px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/396040i1B4C9069BC65BEFB/image-size/large?v=v2&amp;px=999" role="button" title="therapy1.png" alt="Menu and two white balls roughly representing spider" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Menu and two white balls roughly representing spider</span></span></TD> <TD width="33.333333333333336%"><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Menu and two white balls with insect legs" style="width: 698px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/396041iF05248D9B1E37C1A/image-size/large?v=v2&amp;px=999" role="button" title="therapy2.png" alt="Menu and two white balls with insect legs" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Menu and two white balls with insect legs</span></span></TD> <TD width="33.333333333333336%"><span 
class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Menu and digital hand touching spider" style="width: 702px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/396042iCE797F91B27C5402/image-size/large?v=v2&amp;px=999" role="button" title="therapy3.png" alt="Menu and digital hand touching spider" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Menu and digital hand touching spider</span></span></TD> </TR> </TBODY> </TABLE> <P>&nbsp;</P> <P>Exposure therapy is a proven method for helping patients overcome phobias by gradually teaching them to reduce their fear response. Flavius, David-Bobola and Jack worked to make this important therapy tool more convenient and accessible. In their project, <A href="#" target="_blank" rel="noopener">Xposure Therapy</A>, patients are provided with a list of objectives to complete, starting with very rough representations of their phobia. They then progress little by little to interacting with more realistic and immersive representations.</P> <P>&nbsp;</P> <P>Xposure Therapy was built with StereoKit in C# using Visual Studio. The team used StereoKit’s simulator for fast iterations and Quest + Link with hot reloads for quick deployments to Meta Quest 2. They were excited by how much they were able to learn and leverage while building their first mixed reality application, including model loading, animation, sounds, and physics. They also offered up an important tip they learned that could help others starting out: make all your asset names lowercase when doing cross-platform development!
Files in the Asset folder are not case-sensitive in Windows, but they are on Android.</P> <P><STRONG>&nbsp;</STRONG></P> <P><STRONG>1st Place ($10,000 prize) - </STRONG><A href="#" target="_blank" rel="noopener"><STRONG>MasterPlan XR</STRONG></A><STRONG> by Brett Jackson</STRONG></P> <P>&nbsp;</P> <TABLE style="border-style: none; width: 100%;" border="1" width="100%"> <TBODY> <TR> <TD width="100%"> <P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Grid lines and 3D project plan" style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/396044i494E3719152C92D5/image-size/large?v=v2&amp;px=999" role="button" title="projectPlan.png" alt="Grid lines and 3D project plan" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Grid lines and 3D project plan</span></span></P> </TD> </TR> </TBODY> </TABLE> <P>&nbsp;</P> <P>In <A href="#" target="_blank" rel="noopener">MasterPlanXR</A>, Brett reimagined project planning by going beyond the 2D Gantt chart to make efficient use of 3D space. Teams can collaborate in a persistent project room where they can present to each other and interact with plans by adding, updating, dragging and dropping tasks. The 3D interface makes it easier to identify dependencies and conflicts, and brings additional project clarity across team members.</P> <P>&nbsp;</P> <P>Tasks in MasterPlanXR make use of StereoKit handles for task placement and a custom shader to surface additional information to users on the task blocks. Project assets and details are saved to the cloud using Azure, and peer-to-peer networking is enabled via Epic Online Services.
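</P> <P>&nbsp;</P> <P>For a sense of what StereoKit handles look like in code, here is a minimal, hypothetical sketch (not Brett’s actual implementation – the identifiers and asset name are made up):</P>

```csharp
// Illustrative sketch of StereoKit's handle API; Step() runs inside an SK.Run loop.
using StereoKit;

class TaskBlock
{
    Pose  pose  = new Pose(0, 0, -0.5f, Quat.Identity);
    Model model = Model.FromFile("task_block.glb"); // note: lowercase asset name

    public void Step()
    {
        // UI.Handle makes everything drawn relative to 'pose' grabbable,
        // so the user can pick up and place the task block.
        UI.Handle("task", ref pose, model.Bounds);
        model.Draw(pose.ToMatrix());
    }
}
```

<P>&nbsp;</P> <P>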
In the future, Brett plans to add PDF import capabilities, data visualizations, Meta Quest support, enhanced security and permissioning, and more.</P> <P>&nbsp;</P> <P><STRONG>Thank you, hackers!</STRONG></P> <P>&nbsp;</P> <P data-unlink="true">We again want to thank all of the tremendously talented hackers who participated in the Mixed Reality Dev Days 2022 Hackathon and congratulate the winners! To learn more about <A href="#" target="_blank" rel="noopener">MRTK3 Public Preview</A>&nbsp;and <A href="#" target="_blank" rel="noopener">StereoKit</A>, check out the <A href="#" target="_blank" rel="noopener">on-demand sessions from Mixed Reality Dev Days</A>. To be the first to know about upcoming events and hackathons, join the <A href="#" target="_blank" rel="noopener">Mixed Reality Developer Program</A> today.</P> <P>&nbsp;</P> <P>The Mixed Reality Dev Days Planning Team</P> Mon, 15 Aug 2022 21:15:05 GMT https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/mr-dev-days-hackathon-winners-explore-projects-made-with-mrtk3/ba-p/3600378 Desiree Lockwood 2022-08-15T21:15:05Z Creating an Interactive Art Gallery Wall with Vuforia Image Targets & MRTK https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/creating-an-interactive-art-gallery-wall-with-vuforia-image/ba-p/3595590 <P><EM>Made with MRTK is a monthly series where we share sample projects that leverage the Mixed Reality Toolkit (MRTK). You’re welcome to leverage the resources provided to create your own experiences or iterate on the samples provided. For those new to MRTK, the toolkit provides a set of components and features used to accelerate cross-platform MR development in Unity. To learn more, visit <A href="#" target="_blank" rel="noopener">aka.ms/mrtk</A>!</EM></P> <P>&nbsp;</P> <P>I’ve spent a lot of time lately looking into ways to enhance the experience of viewing art in a gallery. 
While most galleries and museums these days have either a tour led by a docent or QR codes that can be scanned to view more information on a mobile device, I thought it’d be awesome to push things one step further and introduce augmented reality (AR) into the mix. There’s a wealth of opportunity that awaits once the door to immersive experiences is opened!</P> <P>&nbsp;</P> <H2><STRONG>Research</STRONG></H2> <P>&nbsp;</P> <P>I initially checked out examples of gallery and museum AR concepts on <A href="#" target="_blank" rel="noopener">Dribbble</A> for inspiration. Dribbble is my favorite go-to resource whenever I’m in need of ideas for projects to create with AR. Every so often I’ll come across an idea that a creator has fully prototyped using AR. However, in most cases the examples I find are concepts put together in programs such as Photoshop, Illustrator, or video editing software. So where does that leave me? Well, it leads me to finding out just how I can turn someone’s idea into a proof of concept!</P> <P>&nbsp;</P> <P>After an hour of scrolling through some very interesting submissions from the community, I was able to narrow down the gallery/museum AR experience into two categories:</P> <UL> <LI>Use AR to display additional information about the artwork</LI> <LI>Display the artwork itself in AR</LI> </UL> <P>The latter is a cool concept to see in person. I first saw art in AR at Refinery 29’s 29 Rooms. I’ll cover that concept in a future post! However, for now I chose to focus on using AR to display additional information about the artwork.</P> <P>&nbsp;</P> <H2><STRONG>My Idea</STRONG></H2> <P>&nbsp;</P> <P>Since the goal was to create a proof of concept, I didn’t find it necessary to venture over to LACMA for this project. Granted, that would’ve been a cool reason to visit the galleries! Instead, I decided to leverage the artwork that I currently have in my home office. Most of the artwork I own is created by art duo Oliver Gal.
The two sisters have an aesthetic that resonates well with my luxury fashion background and the various fashion décor of my office. I wanted to take 3 of my canvas prints and overlay AR elements on top to mimic the look and feel of what one would expect to see in a gallery – thus creating my own at-home gallery of art.</P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Oliver Gal wall art in my office." style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/394617i74F8DCE56AD898C9/image-size/large?v=v2&amp;px=999" role="button" title="wall-art.jpg" alt="Oliver Gal wall art in my office." /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Oliver Gal wall art in my office.</span></span></P> <P>&nbsp;</P> <P>&nbsp;</P> <P>For starters, I needed to add a placard that listed the name of the print, the artist, and information about the art material. Next, I wanted a button that could be pressed to view more information about the artists. Rather than display everything at once, I wanted the additional artist information to only be visible upon button press. Otherwise, the wall would be cluttered with too much digital content. The only thing that now stood between me and my experience was finding a way for the app to recognize the artwork and display the correct information upon recognition.</P> <P>&nbsp;</P> <H2><STRONG>Recognizing the Artwork</STRONG></H2> <P>&nbsp;</P> <P>I decided to go with Vuforia to track images. Vuforia uses what is known as <A href="#" target="_blank" rel="noopener">Image Targets</A> to recognize and augment digital content onto a recognized image. What’s even more awesome is that Vuforia supports using multiple image targets simultaneously!
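</P> <P>&nbsp;</P> <P>Vuforia’s pre-wired DefaultObserverEventHandler normally takes care of showing and hiding content for you, but under the hood it boils down to reacting to a target’s tracking status. Here’s a hedged sketch of that idea against the Vuforia Engine 10.x API – the class and field names are illustrative:</P>

```csharp
// Illustrative sketch – the sample's DefaultObserverEventHandler does this for you.
using UnityEngine;
using Vuforia;

public class ArtworkInfoToggle : MonoBehaviour
{
    [SerializeField] private ObserverBehaviour imageTarget; // the Image Target
    [SerializeField] private GameObject infoContent;        // placard, button, etc.

    private void OnEnable()  => imageTarget.OnTargetStatusChanged += OnStatusChanged;
    private void OnDisable() => imageTarget.OnTargetStatusChanged -= OnStatusChanged;

    private void OnStatusChanged(ObserverBehaviour behaviour, TargetStatus status)
    {
        // Show the AR overlay only while the artwork is being tracked.
        bool tracked = status.Status == Status.TRACKED ||
                       status.Status == Status.EXTENDED_TRACKED;
        infoContent.SetActive(tracked);
    }
}
```

<P>&nbsp;</P> <P>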
This worked out great for my use case given that I wanted to use 3 pieces of artwork for my home gallery.</P> <P>&nbsp;</P> <P>Although Vuforia supports this functionality on mobile devices, I decided to use my HoloLens so that I could have direct hand interactions with the MRTK features. Fortunately, Vuforia provides a <A href="#" target="_blank" rel="noopener">HoloLens 2 Sample</A> that can be imported into Unity and configured to track whichever images you desire. This cuts out a lot of the manual setup that you’d encounter if you were to start from scratch. There’s also an <A href="#" target="_blank" rel="noopener">article</A> available which details how to configure and try the sample on the device. If I may suggest, be sure to check out the MRTK documentation on adding MRTK to a Unity project with the <A href="#" target="_blank" rel="noopener">Mixed Reality Feature Tool</A> so that you have the full picture of how to get started. The MRTK packages do not ship with the sample and therefore the toolkit needs to be manually added to the project via the Mixed Reality Feature Tool.</P> <P>&nbsp;</P> <P>In short, once you have the sample open and the MRTK foundation package imported, you can swap out the images provided in the sample with your own. The first thing you’ll&nbsp;need to do is upload your images to be tracked. Within the Target Manager, you’ll need to add a database for your images and upload an image for each real-world image to be tracked.</P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Vuforia database and its targets." style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/394592i85183EE58705BBF3/image-size/large?v=v2&amp;px=999" role="button" title="vuforia-databases.png" alt="Vuforia database and its targets."
/><span class="lia-inline-image-caption" onclick="event.preventDefault();">Vuforia database and its targets.</span></span></P> <P>&nbsp;</P> <P>&nbsp;</P> <P>&nbsp;</P> <P>Since I was using artwork that Oliver Gal currently sells, I went onto the website and saved the corresponding product images. I had a tiny hiccup while uploading – my images weren’t in the correct color format. Targets must be either 8-bit grayscale or 24-bit RGB. I resolved this by opening the images in Adobe Photoshop, selecting <STRONG>Export &gt; Save for Web (Legacy)</STRONG> and choosing <STRONG>PNG-24</STRONG>.</P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Photoshop Export Settings for PNG-24." style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/394593i1FEEFBEBC89D58EE/image-size/large?v=v2&amp;px=999" role="button" title="ken-photoshop.png" alt="Photoshop Export Settings for PNG-24." /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Photoshop Export Settings for PNG-24.</span></span></P> <P>&nbsp;</P> <P>&nbsp;</P> <P>After the images are uploaded, you'll need to download the database for the Unity Editor. Back over in Unity, you’ll import the Unity package for the database. Once imported, you’ll be able to use the image targets that were uploaded in the portal.</P> <P>&nbsp;</P> <P>Within the <STRONG>2-ImageTargets</STRONG> scene is a <STRONG>VuforiaContent</STRONG> GameObject that contains each Image Target within the sample. To swap out the images, select one of the targets and within the Inspector, change the <STRONG>Database</STRONG> to your database and select the appropriate <STRONG>Image Target</STRONG> from the list.</P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Select the Database imported to access the image target."
style="width: 705px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/394594iA7D9ACF9854D62D4/image-size/large?v=v2&amp;px=999" role="button" title="select-database.png" alt="Select the Database imported to access the image target." /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Select the Database imported to access the image target.</span></span></P> <P>&nbsp;</P> <P>&nbsp;</P> <P>Once you have your images selected, you can add objects to display upon recognition as children of the image.</P> <P>&nbsp;</P> <H2><STRONG>Creating 2D Images</STRONG></H2> <P>&nbsp;</P> <P>I used both Adobe Illustrator and Canva to create the images for the gallery. I have a love for rounded corners and knew it’d be easiest for me to create assets with rounded corners outside of Unity. Starting with the placard, I created the rounded corner background image in Illustrator. Although setup in Illustrator is minimal, you’ll want to ensure that you’re configuring the 2D sprite properly in Unity before you begin scaling to your preferred size. I found the <A href="#" target="_blank" rel="noopener">Master Rounded Corners for your UI | Unity UI Tutorial</A> on YouTube to be very helpful for figuring out the configuration. As for the text, I added it to the placard using the Unity Canvas and TextMeshPro.</P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Placard for the artwork." style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/394595i707633F7C19FD6F1/image-size/large?v=v2&amp;px=999" role="button" title="placard.png" alt="Placard for the artwork."
/><span class="lia-inline-image-caption" onclick="event.preventDefault();">Placard for the artwork.</span></span></P> <P>&nbsp;</P> <P>&nbsp;</P> <P>As for the additional information about the artists, I found an image of the sisters online and removed the background of the original image in Canva. Afterwards, I brought the new image of the sisters into Illustrator and created a background behind the sisters and a blank white surface just below to provide a place for their bio. Once I imported the image into Unity, I used the Unity Canvas and TextMeshPro once more to add in the text.</P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="2D image used for the artist's bio." style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/394596iE1814628F279090B/image-size/large?v=v2&amp;px=999" role="button" title="artist-bio.png" alt="2D image used for the artist's bio." /><span class="lia-inline-image-caption" onclick="event.preventDefault();">2D image used for the artist's bio.</span></span></P> <P>&nbsp;</P> <P>&nbsp;</P> <H2><STRONG>Creating a Custom MRTK Button</STRONG></H2> <P>&nbsp;</P> <P>Although MRTK comes equipped with button prefabs, I had a particular look and feel that I wanted to achieve which required me to hop into Blender and create my own buttons – I really wanted rounded corners! While MRTK 2.8 has pre-configured button prefabs, you can also leverage the toolkit to create your own buttons from scratch while preserving the integrity of an MRTK button. We have documentation available on how to <A href="#" target="_blank" rel="noopener">Make a Button from Scratch</A>.</P> <P>&nbsp;</P> <P>I will admit, I got a little mixed up while adding in visual states for the button.
Although this worked well when using a cube primitive, I hadn’t realized that I had assigned the incorrect GameObject as the <STRONG>Target</STRONG> for the profile on the <STRONG>Interactable</STRONG> script. With that said, be sure that you’re targeting the correct object when setting up your custom button. Another tricky part was adjusting the <STRONG>Press Settings</STRONG> on the <STRONG>NearInteractionTouchable</STRONG> script. I initially started with my own arbitrary numbers but soon realized that I couldn’t quite see the button being pressed. Instead, I opened the <STRONG>MRTK UX</STRONG> <STRONG>PressableButtonExample </STRONG>scene (within the Mixed Reality Toolkit Examples package) and mimicked the configuration from one of the custom buttons in the sample. That resolved my issue in no time!</P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="About the Artist button." style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/394597i7662E10AF690D40E/image-size/large?v=v2&amp;px=999" role="button" title="about-button.png" alt="About the Artist button." /><span class="lia-inline-image-caption" onclick="event.preventDefault();">About the Artist button.</span></span></P> <P>&nbsp;</P> <P>&nbsp;</P> <H2><STRONG>Adding a Hand Coach</STRONG></H2> <P>&nbsp;</P> <P>The last thing I thought would be cool to add was the Hand Coach. I hadn’t used this feature until creating this project. Essentially, it’s an animated hand that appears to guide the user on how to interact with an object in the scene. There are various animations available, which is great given that you’re not limited in how you can integrate this feature into your own experience. I chose to go with a hand that demonstrates the press of a button. The <STRONG>HandCoachExample</STRONG> scene (within the Mixed Reality Toolkit Examples package) provides examples of how the hand coach can be configured.
I drew inspiration from the <STRONG>Near Select</STRONG> example for my own project. I personally have some more experimenting to do with this feature. However, I think this came out well for first-time use.</P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Hand Coach is used to provide guidance." style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/394599iE1E7448B811439F9/image-size/large?v=v2&amp;px=999" role="button" title="hand-coach.jpg" alt="Hand Coach is used to provide guidance." /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Hand Coach is used to provide guidance.</span></span></P> <P>&nbsp;</P> <P>&nbsp;</P> <H2><STRONG>Conclusion</STRONG></H2> <P>&nbsp;</P> <P>Overall, it took me roughly 2 days to bring this project to life! I’m quite proud of how this turned out. Check out the video and images below to view the experience in all its glory.</P> <P>&nbsp;</P> <P><LI-VIDEO vid="https://youtu.be/gTZpNqLFDCU" align="center" size="large" width="600" height="338" uploading="false" thumbnail="https://i.ytimg.com/vi/gTZpNqLFDCU/hqdefault.jpg" external="url"></LI-VIDEO></P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Glam Gentleman Doll artwork and its AR elements." style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/394600iA13BB03B5B73793D/image-size/large?v=v2&amp;px=999" role="button" title="20220809_173514_HoloLens.jpg" alt="Glam Gentleman Doll artwork and its AR elements." /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Glam Gentleman Doll artwork and its AR elements.</span></span></P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="No Time for This artwork and its AR elements."
style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/394601iCE073E29755D9D6C/image-size/large?v=v2&amp;px=999" role="button" title="20220809_173456_HoloLens.jpg" alt="No Time for This artwork and its AR elements." /><span class="lia-inline-image-caption" onclick="event.preventDefault();">No Time for This artwork and its AR elements.</span></span></P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Oh How Cute artwork and its AR elements." style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/394602iAF1C852D3F4A3EB2/image-size/large?v=v2&amp;px=999" role="button" title="20220809_173547_HoloLens.jpg" alt="Oh How Cute artwork and its AR elements." /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Oh How Cute artwork and its AR elements.</span></span></P> <P>&nbsp;</P> <P>&nbsp;</P> <P>Referring to the configurations within the MRTK Examples package was a time-saver. I’d highly recommend importing that package into a Unity project if you’re unsure how a particular MRTK UX building block works or how to configure one. I’d love to see this project brought to life in a real gallery – even if it’s not with a head-mounted device! As I mentioned earlier, creating such an experience is not limited to just HoloLens. You can also try this out on mobile as well. Imagine hosting your own AR art gallery exhibit virtually anywhere that your art is mounted. It certainly saves a lot of money on materials!
If you do create your own experience, be sure to let us know in the comments or on Twitter at <A href="#" target="_self">@mxdrealitydev</A>!</P> <P>&nbsp;</P> <P>Until next time, happy creating!</P> <P>&nbsp;</P> <P>&nbsp;</P> Wed, 10 Aug 2022 03:56:06 GMT https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/creating-an-interactive-art-gallery-wall-with-vuforia-image/ba-p/3595590 April_Speight 2022-08-10T03:56:06Z (Guest Blog) From MRTK2 to MRTK3 - going cross platform with HoloLens 2 and Quest 2 https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/guest-blog-from-mrtk2-to-mrtk3-going-cross-platform-with/ba-p/3577325 <P><EM><STRONG>This is a Guest Blog Post from Joost&nbsp;van Schaik, MVP.</STRONG></EM></P> <P>&nbsp;</P> <P>Back in 2016, I published my first real HoloLens app in the Microsoft Store, back then for HoloLens 1: <A href="#" target="_blank" rel="noopener">AMS HoloATC</A>. Since then, this app has been my litmus test for everything I tried to do in Mixed Reality, as it is relatively simple, yet uses almost all aspects of Mixed Reality. So, when a new platform/toolkit/device arrives, I take this app and try to adapt it for that first. When the Windows Mixed Reality headsets came along, I made it run on those as well. Then came Mixed Reality Toolkit, Mixed Reality Toolkit 2, and I converted the app to work on those – and the last step gave me a pretty easy conversion to HoloLens 2 when it popped up. In fact, I made the app run on that device without even having access to one. I only had a few minutes with a prototype of it, yet the app was ready before HoloLens 2 was released.</P> <P>&nbsp;</P> <P>Fast forward to earlier this year, when MVPs were invited to a preview of the Mixed Reality Toolkit 3 – a radical departure from the past. And I went at it the usual way: after a bit of mucking around I took my AMS HoloATC app, yanked the MRTK2 from it and put in the MRTK3.
Anyone who knows anything about coding knows that this approach tends to break a lot of things. Visual Studio reported a lot of errors – not quite surprising when you pull the rug out from under the app’s feet. But the fun thing about this head-first approach is: it shows you clearly where things break or are changed, yet there is a clearly limited set of things you want to achieve. It is basically reconnecting a lot of wires that you know <EM>must</EM> go somewhere. How do you deal with the Spatial map here? How does interaction work here? How can you build a user interface with these new components? That is where my inner geek kicks in - with years of experience. In a couple of days’ worth of finicking, I had my app working again. Mind you – nearly without documentation, just peeking into the code of the MRTK3 and the few samples that came with it.</P> <P>&nbsp;</P> <P>So then came a defining moment: at the Mixed Reality Developer Days 2022, Microsoft showed an interesting app: <A href="#" target="_blank" rel="noopener">Zappy’s Playground</A>. If you have not seen it – it’s basically a simple kind of game, where you have to provide a robot with enough power to operate. You do this by picking up, placing, and orienting small wind generators. Now this may sound a bit weird and even a bit lame, but the main point of the game is not the game <EM>itself</EM>, but to demonstrate how to implement all kinds of typical techniques, interactions and other ‘building blocks’ in a Mixed Reality application using MRTK3. And as every developer knows: ten lines of example code working in context, with all nuts and bolts around it in place, <EM>are worth more than a thousand lines of documentation</EM>. The most intriguing thing about the app was that, on stage, it ran on both HoloLens <EM>and</EM> the Meta Quest 2. Unchanged. This, while HoloLens is Windows-based and the Quest Android-based. Now, was this true or a stage trick?
I have a Quest 2 - I had bought it for a WebXR development experiment early this year, which was completed successfully. After that… it was basically sitting there, gathering dust.</P> <P>&nbsp;</P> <P>The first thing to find out was whether Microsoft's claim about Zappy's cross-platform play was the real thing. So, I dusted off the Quest, to see if I could make it work on that device. Spoiler alert: it turned out the most challenging part of the whole operation was actually getting the Quest 2 into ‘developer mode’. Although it’s not complicated once you know what to do, there are several steps required. First you need to set the device to developer mode using the Oculus app on your <EM>phone</EM>; this requires you to create an ‘organization’ to associate your account with, your account needs to be verified, drivers need to be installed... it's quite a process, sometimes a bit confusing, but in the end, I apparently had the Quest 2 in developer mode. I had already cloned the Zappy’s Playground code from GitHub to run it on HoloLens. I needed to install the Android tools alongside the UWP tools, then I opened the app in Unity, switched to the Android build platform, hit “build and run” … and basically, that was it. It ran on the Oculus, without any changes. The stage demo had definitely not been staged – it was real! Before my very eyes. I flexed my knuckles. It was time to let good old AMS HoloATC make the jump to Quest 2!</P> <P>&nbsp;</P> <P>I repeated the procedure. Switched to Android build, copied over all the Android settings from the Zappy app, and deployed. Yay! It ran. In a flat floating window, instead of as a VR app. I double-checked my settings - I had forgotten one.<BR />Take 2: now my app appeared as a VR app, I saw the airport around me and planes appearing… but I could not interact with anything.
I could not touch airplanes, nor could I look at them and see the pictures.</P> <P>&nbsp;</P> <P>After some careful studying, it looked like the MRTK3 developers had created a lot of extra settings for the so-called “Input settings” of the XR Interaction Toolkit, on which MRTK3 is built. This is a concept I am still getting familiar with, but I could determine which files were involved. I copied those over as well from Zappy to my app. I checked and double checked. Everything was in order. And then it was time for deploy 3.</P> <P>&nbsp;</P> <P>… And everything worked. I could gaze at airplanes and see their pictures, and select them with the controller ray - and this while, about 4 hours earlier, I had never deployed a native app on a Quest 2. In that time, I had learned how to put it into developer mode, how to deploy apps to it, and what I needed to do to adapt a working MRTK3 HoloLens app into a Quest 2 app. Literally only configuration. Zero lines of extra code. For a native app. Color me impressed.</P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="20220720_094431_HoloLens .jpg" style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/390649iF0D9A4091C64A1BF/image-size/large?v=v2&amp;px=999" role="button" title="20220720_094431_HoloLens .jpg" alt="20220720_094431_HoloLens .jpg" /></span></P> <P> </P> <P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="com.LocalJoost.HoloATC3-20220720-094844.jpg" style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/390651i4B68D1D4C6F379DE/image-size/large?v=v2&amp;px=999" role="button" title="com.LocalJoost.HoloATC3-20220720-094844.jpg" alt="com.LocalJoost.HoloATC3-20220720-094844.jpg" /></span></P> <P> </P> <P> </P> <P>Now, I realize that for more complex apps, some more device-specific code may be needed. 
In fact, shortly after the first run I added a few platform-specific lines of code - to make stuff appear bigger. The Quest 2 is a VR device – basically a screen close to your eyes with not <EM>nearly</EM> the resolution of a HoloLens. If you show 10 cm (about 4”) airplanes, they tend to look pretty crappy and pixelated. My rule of thumb is to scale everything up 3.5x in VR with respect to HoloLens. To that end, I could simply re-use code I wrote for Windows Mixed Reality headsets. So, although I did add some platform-specific code, it was not any <EM>new</EM> code.</P> <P>&nbsp;</P> <P>Long story short: there is some work to do to move your app from MRTK2 to MRTK3. Depending on how you have built your app, this may be a bit of work or quite a lot. Especially if you have built a lot of user interface using buttons and panels from MRTK2, you will have some work to do. <EM>However</EM>, if you have used some kind of halfway decent architecture, porting functionality is quite doable. After all, most of it is Unity and plain C#. In addition, there is a lot less custom code in MRTK3 – it is very much built on standard <EM>Unity</EM> stuff. And <EM>once</EM> you are on MRTK3, the step to Oculus Quest is ridiculously small.</P> <P>&nbsp;</P> <P>I think this is the big prize. There are still some pretty rough edges – MRTK3 is in preview after all, yet already in much better shape than when I first saw it – but it seems MRTK3 is aimed at making the promise of OpenXR come true: a generalized, somewhat device-abstracted way of building cross-platform Mixed Reality apps. A kind of MAUI for Mixed Reality, built on top of Unity. This is great news for developers who want to build business apps, but whose customers feel that buying a fleet of HoloLens devices is just too steep of a hill to climb all at once – for now. It also paves the way for future devices which may be more advanced and show HoloLens-like capabilities, like the Meta Cambria. 
Or whatever Microsoft itself, or other companies, may have up their sleeves.&nbsp;However this pans out, the area of Mixed Reality will be even more exciting in the near future than it already is today!</P> <P>&nbsp;</P> Tue, 26 Jul 2022 16:41:17 GMT https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/guest-blog-from-mrtk2-to-mrtk3-going-cross-platform-with/ba-p/3577325 LocalJoost 2022-07-26T16:41:17Z HoloLens 2 launches in the United Arab Emirates https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/hololens-2-launches-in-the-united-arab-emirates/ba-p/3577850 <P><STRONG><SPAN data-contrast="auto"><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="HLS19_AEC1Hologram_001.jpg" style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/389643i2B33F63FD09D89D4/image-size/large?v=v2&amp;px=999" role="button" title="HLS19_AEC1Hologram_001.jpg" alt="HLS19_AEC1Hologram_001.jpg" /></span></SPAN></STRONG></P> <P>&nbsp;</P> <P><STRONG><SPAN data-contrast="auto">Microsoft launches HoloLens 2 in the UAE, empowering organizations with the innovation of mixed reality&nbsp;</SPAN></STRONG><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><I><SPAN data-contrast="auto">The ergonomic, untethered, self-contained holographic mixed reality HoloLens 2 headset provides out-of-the-box value with enterprise-ready applications and is backed by the reliability, security, and scalability of Microsoft’s Cloud and AI Services.</SPAN></I><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559740&quot;:256}">&nbsp;</SPAN></P> <P>&nbsp;</P> <P><STRONG><SPAN data-contrast="auto">20 July, 2022; Dubai, United Arab Emirates –</SPAN></STRONG><SPAN data-contrast="auto"> Microsoft today announced the availability of its industry-leading HoloLens 2 mixed reality headset in 
the UAE. HoloLens 2 provides the most comfortable, intuitive, and immersive mixed reality experience available. It delivers enterprise value across key sectors with Dynamics 365 business applications and industry ISV solutions and is backed by the reliability, security, and scalability of Microsoft Azure.</SPAN><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P>&nbsp;</P> <P><SPAN data-contrast="auto">Microsoft’s comprehensive mixed reality platform draws on augmented and virtual reality to blend the physical and digital worlds in a unique way. By extending computing beyond two-dimensional screens, mixed reality increases productivity and optimizes operations. Thousands of leading organizations across the globe, in industries such as manufacturing, construction, healthcare, retail, and education, are using HoloLens 2 and Azure mixed reality services to transform their businesses. The benefits may include things like reduced energy consumption and operational emissions, improved learning and retention, enhanced delivery of patient treatment, greater employee and customer satisfaction, and significant cost savings.</SPAN></P> <P>&nbsp;</P> <P><SPAN data-contrast="auto">“Once again, Microsoft demonstrates its commitment to the UAE as we continue to invest and bring innovative solutions to market,” said Ihsan Anabtawi, COO and CMO, Microsoft UAE. “HoloLens 2 is a cutting-edge device that's ahead of its time, with AI-powered holograms that respond to commands and interact with real-world surfaces in real time. 
We are confident that enterprises in the UAE and beyond will leverage it to accelerate their digital transformation journeys and contribute to sustainable economic growth by innovating with confidence and collaborating without boundaries.”&nbsp;</SPAN><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P>&nbsp;</P> <P><SPAN data-contrast="auto">A Forrester Total Economic Impact Study commissioned by Microsoft showed that HoloLens 2 offers a 177 percent return on investment (ROI) over three years, as well as improvements to employee health and safety, business continuity, customer experience, and customer outcomes.</SPAN><SPAN data-contrast="auto"> “As we shift to the next computing paradigm, Microsoft is excited to upskill every business and every developer to build secure, collaborative metaverse experiences using best-in-class hardware, intelligent cloud services and cross-platform tools,” said Ksenia Ternavskykh, Mixed Reality Product Marketing Lead, Microsoft Middle East and Africa. “By starting with mixed reality, organizations can derive significant and quantifiable impact today and build necessary capabilities to </SPAN><SPAN data-contrast="auto">be ready for the future.”&nbsp;</SPAN><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P>&nbsp;</P> <P><SPAN data-contrast="auto">HoloLens 2 enables organizations to empower their workforce from day one with mixed reality apps from Microsoft Dynamics 365. There are over 200 available applications from a rich partner ecosystem that spans Independent Software Vendors, System Integrators and Digital agencies; these apps address a wide range of unique industry-specific use cases. 
HoloLens 2 will act as a business catalyst and empower partners to create new value, increase revenue, and improve customer relationships by offering mixed reality capabilities.</SPAN><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P>&nbsp;</P> <P><SPAN data-contrast="auto">Redington has been appointed as the sole distributor for HoloLens 2 in the UAE and has been fully onboarded together with a number of partners.</SPAN><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559740&quot;:240}">&nbsp;</SPAN><SPAN data-contrast="auto">“As a market leader in broadline and value-added distribution, boasting a wide portfolio of world-class brands, Redington Gulf is uniquely positioned to offer end-to-end solutions with Microsoft HoloLens 2. With the combined strengths of both our business divisions – Volume and Value – our extensive partner network can provide creative and customized solutions with Microsoft HoloLens 2 to add</SPAN> <SPAN data-contrast="auto">unparalleled value to their customers’ operations. 
We have seen a high demand for mixed reality across a variety of industries and, together with our strong partner ecosystem, we are primed to help customers leverage mixed reality solutions and become more innovative than they ever thought possible,” said Viswanath Pallasena, CEO, Redington Gulf.</SPAN><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-contrast="auto">-ENDS-</SPAN><SPAN data-ccp-props="{&quot;335551550&quot;:6,&quot;335551620&quot;:6}">&nbsp;</SPAN></P> <P>&nbsp;</P> <P><STRONG><SPAN data-contrast="none">About Microsoft</SPAN></STRONG><SPAN data-ccp-props="{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559738&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-contrast="none">Microsoft (Nasdaq “MSFT” <LI-USER uid="41501"></LI-USER>) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more. 
Microsoft opened its Dubai-based headquarters in 1991; today it oversees operations across the region.</SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559738&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-ccp-props="{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559738&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><STRONG><SPAN data-contrast="none">About Redington</SPAN></STRONG><SPAN data-ccp-props="{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559738&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-contrast="auto">Redington Gulf is the largest distributor of IT products in the Middle East and Africa. The business model is purely channel-oriented; it empowers channel partners with technology through trainings and joint engagement. These solutions span technology domains such as Networking, Voice, Servers, Storage, Software, Security and Infrastructure and emerging technology brackets such as Hyper Convergence, Cloud Computing and Big Data Analytics. Redington continues to be a pioneer in understanding, helping, and assisting partners and customers to digitally transform themselves by leveraging Cloud technologies. With increasing vendor relationships in various parts of the Middle East and Africa, we ensure that all the requirements of the channel are met under one roof. 
To learn more, visit </SPAN><A href="#" target="_blank" rel="noopener"><SPAN data-contrast="none">http://www.redingtongroup.com/mea/</SPAN></A><SPAN data-contrast="auto">&nbsp;</SPAN><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><STRONG><SPAN data-contrast="none">For more information (Press only):</SPAN></STRONG><SPAN data-ccp-props="{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559738&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-contrast="none">Magdalena Stepien, Microsoft UAE</SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559738&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-contrast="none">E : </SPAN><A href="https://gorovian.000webhostapp.com/?exam=mailto:mastepie@microsoft.com" target="_blank" rel="noopener"><SPAN data-contrast="none">mastepie@microsoft.com</SPAN></A><SPAN data-ccp-props="{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559738&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-contrast="none">M: +971 56 114 2948</SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559738&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-contrast="none"> </SPAN><SPAN 
data-ccp-props="{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559738&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-contrast="none">Husain Gandhi, ProGlobal Media</SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559738&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-contrast="none">E: </SPAN><A href="https://gorovian.000webhostapp.com/?exam=mailto:h.gandhi@proglobal.ae" target="_blank" rel="noopener"><SPAN data-contrast="none">h.gandhi@proglobal.ae</SPAN></A><SPAN data-ccp-props="{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559738&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> <P><SPAN data-contrast="none">T: +971 56 534 8764</SPAN><SPAN data-ccp-props="{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559738&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}">&nbsp;</SPAN></P> Wed, 20 Jul 2022 19:12:25 GMT https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/hololens-2-launches-in-the-united-arab-emirates/ba-p/3577850 SeanKerawala 2022-07-20T19:12:25Z MRTK3 Public Preview! 
https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/mrtk3-public-preview/ba-p/3556892 <P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Mixed Reality Toolkit Logo" style="width: 964px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/384263iCDC5F23AEE0A1DC7/image-size/large?v=v2&amp;px=999" role="button" title="gracehsu.png" alt="Mixed Reality Toolkit Logo" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Mixed Reality Toolkit Logo</span></span></P> <P>&nbsp;</P> <P>MRTK3 Public Preview is now available on the MR Feature Tool as well as GitHub! MRTK3 offers many improvements with its architecture, UI tools, interaction system, and other new features.</P> <UL> <LI>Download <A href="#" target="_self">MRTK3 Public Preview</A></LI> <LI>Learn more: <A href="#" target="_blank" rel="noopener">Introducing MRTK3 – Shaping the future of the MR Developer Experience</A></LI> <LI>Join us on July 14th, 2022: <A href="#" target="_self">MRTK3 Public Preview&nbsp;Ask us Anything session </A></LI> </UL> <H1>Modular Packaging</H1> <P>We’ve separated MRTK into many modular packages that you can independently install and update. You can pick and choose the MRTK packages that suit your needs, significantly reducing the amount of MRTK code needed to take advantage of our features.</P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="List of all packages in the toolkit" style="width: 958px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/384271iA004BE7D8B7CAF7D/image-size/large?v=v2&amp;px=999" role="button" title="ScreenShot1.png" alt="List of all packages in the toolkit" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">List of all packages in the toolkit</span></span></P> <P><SPAN>MRTK3 is all-in on OpenXR and Unity’s XRI Toolkit. 
One of the goals with MRTK3 was to take everything we've learned from the start of MRTK2 back in early 2018, combine it with the work that's been done by our industry partners across OpenXR and Unity since then, and come out the other side with a strong, extensible foundation that allows MRTK to focus more on providing differentiators and an overall improved user (and developer!) experience.</SPAN></P> <P>&nbsp;</P> <H1>OpenXR</H1> <P>OpenXR is an open royalty-free API standard from Khronos, providing engines with native access to a range of devices across the mixed reality spectrum. Unity ships an OpenXR plugin providing integration of the core spec for features like rendering, head tracking, and controller input. Microsoft ships the Mixed Reality OpenXR Plugin, which builds on Unity’s plugin and adds support for additional features relevant to devices like HoloLens 2. Microsoft-specific additions include spatial mapping and planes, spatial anchors, locatable camera, and tracked hand mesh. It also adds support for hand joint tracking via a cross-vendor OpenXR extension, which means it’s supported on more devices than just Microsoft’s. This is the piece MRTK3 depends on for features like articulated hand joint tracking and the poke pointer.</P> <P>&nbsp;</P> <UL> <LI>Learn more:&nbsp;<A href="#" target="_self">#Open - Deploy Everywhere with OpenXR and MRTK3</A></LI> </UL> <H1>XR Interaction Toolkit</H1> <P>MRTK3 uses the Unity XR Interaction Toolkit (XRI) as the foundation for input abstractions, interaction, and manipulation. Most of MRTK3 is simply a very thin layer atop the interaction primitives provided by XRI; as a result, MRTK3 is no longer a bespoke interaction and input system, but merely a well-behaved “citizen” of the XRI framework and community. 
In fact, Unity’s XRI was built with the MRTK team’s input, and we’ve been fortunate enough to have frequent syncs with the XRI team to give feedback, help with bugs, and ensure that the MRTK systems are built in the spirit of the XRI framework. We’ve also been able to help shape what XRI looks like in terms of future plans and architectural decisions.</P> <P>&nbsp;</P> <P>One of the biggest advantages of this approach is that MRTK3 is highly compatible with other XR applications and interaction systems, if they also build on top of XRI. For instance, existing unmodified XRI projects and applications can utilize MRTK3 interactables like volumetric UI controls, ObjectManipulator, and BoundsControl. On the other hand, MRTK3 projects can use existing XRI-based interactables, such as those that ship out of the box with the XRI framework itself. Even the XR rig itself is highly flexible and compatible, and custom rigs can be used with no MRTK code whatsoever.</P> <P>&nbsp;</P> <P>Most importantly, we can rely on the rock-solid interaction systems that XRI provides, and focus on what MRTK does best, like new ways to manipulate holograms, collections of useful prefabs, volumetric UI/UX, and other mixed-reality-specific building blocks. Encouragingly, XRI has already seen wide adoption across the industry and has generated a fair bit of excitement in the XR space. We’re proud to be working with Unity on this and we’re excited to see what opportunities this collaboration will unlock, especially as we expand MRTK’s out-of-the-box compatibility to a much wider range of XR applications.</P> <P>&nbsp;</P> <H1>Volumetric UI on Canvas</H1> <P>MRTK3 introduces volumetric UI integrated with Unity's RectTransform + Canvas system. While RectTransform and Canvas have historically been primarily used for 2D flat UI, they’re also fully capable of rendering and laying out volumetric 3D UI. 
MRTK3 includes systems and prefabs for full-parity canvas-based 3D UI, accelerating design iteration and raising the bar for fidelity and detail when designing volumetric user interfaces.</P> <P>&nbsp;</P> <P>Previously in MRTK2, building UI was difficult, especially if any dynamic layout or collections were required. Building on Unity’s RectTransform-based UI tooling allows for a completely re-invented UI workflow. Unity’s layout groups can be used for dynamic and flexible arrangements of content within your UI, in the same way that they are used for traditional flat UI. In addition, many pieces of basic functionality that designers expect from modern design tools like resizing, responsive layouts, alignment, anchoring, margin, and padding are all included by default.</P> <P>&nbsp;</P> <P>Instead of building UI manually with hand-calculated dimensions, lossy scaling operations, and fragile static layouts, UI can be constructed much in the same way that designers use other modern presentation frameworks. Layouts are built from horizontal, vertical, and grid-like groups, and are fully composable and responsive. The sizes of panels, windows, and slates can be inflated from a dynamic number of child elements, and the children can be edited, reordered, and removed at runtime as the layout system responds immediately.</P> <P>&nbsp;</P> <P>Another convenient benefit is that by unifying our UX controls under the Canvas/UnityUI system, we can take advantage of Unity’s cross-platform UI input system to enable directional navigation and gamepad input by default. Out of the box, you’ll get rich gamepad support on the same volumetric UI controls that you can touch with hand-tracking. It’s also a great win for accessibility, because many accessibility inputs and controllers require these traditional 2D gamepad-type input modalities. 
Architecturally, these traditional 2D inputs (including gamepad, mouse, touchscreen, and more) invoke the exact same events and appear completely identical to advanced XR interactions like poke and gaze-pinch, which means that there’s a single codepath for every single method of interaction with your UI.</P> <P>&nbsp;</P> <P>As a result, building production-grade UI is vastly easier and more maintainable, and can be scaled across larger teams and more mature designs. This has been critical for us internally: as we build larger and more complex mixed reality applications, we’ve needed more mature, robust, and maintainable UI systems. In addition, it’s allowed us to reduce the quantity and churn of prefab variants, as we no longer need a huge number of prefabs for each permutation of UI control size or configuration. These improvements have let much larger teams work together while keeping our design language consistent and polished across large and complex layouts.</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="New Volumetric UI" style="width: 998px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/384273i89DB94F0F950C8A6/image-size/large?v=v2&amp;px=999" role="button" title="NewVolumetricUI.png" alt="New Volumetric UI" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">New Volumetric UI</span></span></P> <P>&nbsp;</P> <UL> <LI>&nbsp;Learn more: <A href="#" target="_blank" rel="noopener">Building Rich UI for MR in MRTK3</A></LI> </UL> <H1>New Mixed Reality Design Language</H1> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="New Mixed Reality Design Language" style="width: 998px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/384275iEFB6301923EC2045/image-size/large?v=v2&amp;px=999" role="button" title="NewMixedRealityDesignLanguage.png" alt="New Mixed Reality Design Language" /><span 
class="lia-inline-image-caption" onclick="event.preventDefault();">New Mixed Reality Design Language</span></span></P> <P>&nbsp;</P> <P>In MRTK3, we are providing refreshed UI building blocks based on the new design language that we introduced in the Mesh app for HoloLens 2 last year. Throughout the past several years, the team has learned a lot from many different projects and experiments; the new Mixed Reality Design Language is the result of many design iterations by our designers, researchers, and engineers, and we hope you’ll love it.</P> <P>&nbsp;</P> <P>Here are some of the updates:</P> <UL> <LI><STRONG>Rounded corner geometry</STRONG> for more approachable and friendly experiences</LI> <LI><STRONG>Updated visual system</STRONG> (grids and modules) to support various types of Mixed Reality UI scenarios</LI> <LI><STRONG>Improved visual feedback</STRONG> for multi-modal input such as eye-gaze with pinch gesture input</LI> <LI><STRONG>Modular backplate system</STRONG> for building complex layouts that remain clear and usable</LI> <LI><STRONG>Updated bounding box visuals</STRONG> to reduce visual noise and enable fluid gaze-powered interactions<BR /> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Examples of modular UI building blocks and improved visual feedback for multi-modal input" style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/384055iD97EB4A0C6AF6AE5/image-size/large?v=v2&amp;px=999" role="button" title="MRTK3_MRDL_VisualSystem2.gif" alt="Examples of modular UI building blocks and improved visual feedback for multi-modal input" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Examples of modular UI building blocks and improved visual feedback for multi-modal input</span></span></P> <P>&nbsp;</P> <P>&nbsp;</P> </LI> </UL> <P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Improved visuals for new multi-modal 
interactions" style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/384276i18323E0DDDA17A16/image-size/large?v=v2&amp;px=999" role="button" title="multi-modal-interactions.png" alt="Improved visuals for new multi-modal interactions" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Improved visuals for new multi-modal interactions</span></span>&nbsp;</P> <P>&nbsp;</P> <P>&nbsp;</P> <H1>Advanced Interactions</H1> <P>&nbsp;</P> <P><STRONG>Gaze in MRTK3</STRONG></P> <P>MRTK3 aims to make it as easy as possible for users to use gaze to interact with objects. To accomplish this, we’ve dramatically improved our gaze targeting implementation, as well as introduced a new kind of first-class interaction type: <STRONG>gaze-pinch manipulation.</STRONG></P> <P>&nbsp;</P> <P><STRONG>Gaze targeting improvements:</STRONG></P> <P>Throughout most of MRTK2, gaze targeting was a more limited feature. You could adjust the minimum and maximum gaze distance, and the user’s eye gaze was captured as a single raycast which followed the user’s tracked eye gaze. While this implementation was functional, it had difficulties targeting smaller objects and required users to keep a focused, steady gaze to maintain targeting.</P> <P>&nbsp;</P> <P>In MRTK3, we’ve dramatically improved this implementation. The user’s eye gaze is now represented by several spherecasts, each of increasing precision. The coverage of the spherecasts can then be further refined to a cone by specifying a cone angle. This allows objects which are not directly aligned with the user’s eye gaze to be targeted, greatly reducing the effort required to target and maintain targeting on smaller objects. 
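<P>To make the cone idea concrete, here is a small engine-agnostic sketch of cone-constrained gaze targeting with a simple intent score. This is illustrative only, not MRTK3 source; the function name and the 0.7/0.3 score weights are invented for this example:</P>

```python
import math

def gaze_score(eye_pos, gaze_dir, target_pos, cone_angle_deg):
    """Score a candidate target for cone-based gaze targeting.

    Returns None when the target lies outside the gaze cone, otherwise
    a score in (0, 1] where higher means "more likely intended".
    gaze_dir is assumed to be a unit vector.
    """
    to_target = tuple(t - e for t, e in zip(target_pos, eye_pos))
    dist = math.sqrt(sum(c * c for c in to_target))
    if dist == 0.0:
        return 1.0  # target exactly at the eye; trivially "hit"
    unit = tuple(c / dist for c in to_target)
    # Angle between the gaze ray and the direction to the target.
    cos_angle = sum(g * u for g, u in zip(gaze_dir, unit))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
    if angle > cone_angle_deg:
        return None  # outside the cone entirely
    # Blend angular alignment with proximity; these weights are
    # invented for this sketch, not taken from MRTK3.
    angle_score = 1.0 - angle / cone_angle_deg
    dist_score = 1.0 / (1.0 + dist)
    return 0.7 * angle_score + 0.3 * dist_score
```

<P>In a real implementation, the spherecasts would gather the candidate set, each candidate would be scored with something along these lines, and the highest-scoring one wins.</P>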
In conjunction with this, we’ve also developed a smarter algorithm which scores all potential targets captured by the user's eye gaze, using different factors like distance and angle to determine what the user is most likely trying to select.</P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="precisegizmos.gif" style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/384203iF0844EC90DF7DD49/image-size/large?v=v2&amp;px=999" role="button" title="precisegizmos.gif" alt="precisegizmos.gif" /></span></P> <P>&nbsp;</P> <P>All of these parameters can be tweaked by a developer at design time and run time, providing developers with the flexibility to tune gaze in the way that best suits their needs.</P> <P>&nbsp;</P> <P><STRONG>Gaze-Pinch </STRONG></P> <P>Eye gaze isn’t just a great way to passively target and select objects; it’s also a great way to determine the user’s intent and focus, and is especially powerful when paired with hand-tracking-based manipulation gestures. We’ve developed a delightful and powerful new way for users to manipulate objects through a combination of gaze targeting and subtler hand gestures, aiming to reduce both mental and physical fatigue during interactions.</P> <P>&nbsp;</P> <P>Gaze-pinch leverages the new fuzzy gaze algorithm described above to first determine which object the user intends to manipulate. Then, the user can use their fingers and hands to manipulate the object as if it were nearby, using pinch, grab, and other familiar input gestures, even while the object may be far out of reach. In this input modality, we can relax the constraints of what we consider an “upright hand” to be, enabling subtler hand manipulations (instead of pointing a ray at the object!)</P> <P>&nbsp;</P> <P>Dwell-based gaze interactions, like you might be familiar with from MRTK2, make a return in MRTK3. 
You can still use the same dwell-based interactions as you did in previous versions of MRTK, even alongside the new gaze-pinch system. Dwell is a great solution for enabling more accessible UI in your app, and it’s even better in MRTK3 with our improvements to the gaze detection system.</P> <P>&nbsp;</P> <P><STRONG>Multi-handed by default</STRONG></P> <P>In MRTK2, users enjoyed intuitive multi-handed interactions, both when manipulating holograms directly and when using hand rays. We’ve done quite a lot of interaction design iteration to perfect multi-handed interactions, and we’re glad to have been able to share our work in the past in MRTK2.</P> <P>&nbsp;</P> <P>In MR environments, users expect to be able to use both of their hands at minimum, making <STRONG>multi-input interaction</STRONG> more important than ever. In MRTK3, all interactables can now be interacted with by multiple interactors by default, thanks to our collaboration with Unity’s XRI framework. Using XRI’s clearly defined <A href="#" target="_blank" rel="noopener">interaction states and events</A>, developers can design interactables with multi-input in mind. This means a button can now be pressed by multiple input sources at once, and it remains pressed until the last of those input sources ends its selection.</P> <P>&nbsp;</P> <P>In MRTK3, we’ve taken multi-handed interactions to the next level. Friendly APIs for dealing with an arbitrary number of participating interactors make it easy to build more complex multi-handed interactions. In MRTK2, multi-handed interactions were mostly used with ObjectManipulator and BoundsControl; now, this capability is included by default in every interactable, and users can expect all objects to react reasonably to any number of inputs, simultaneously. 
Buttons, sliders, and even drawing surfaces respond to an arbitrary number of interactions, without requiring the developer to keep track of each input manually.</P> <P>&nbsp;</P> <P><STRONG>Variable Selectedness</STRONG></P> <P>Mixed reality is fundamentally a highly “analog” input scenario; most ways that users can interact with holograms are variable, analog, or even sometimes subjectively determined from the user’s body. As a result, we are faced with a set of highly “fuzzy” inputs from the user, along with a collection of interactables and behaviors that must respond to these variable inputs. Motivated by this challenge, MRTK3 introduces the concept of <STRONG>variable selectedness</STRONG>, allowing users to select objects partially and with a wide range of input modalities, such as partial button presses, partial pinches, analog triggers on controllers, and more.</P> <P>&nbsp;</P> <P>Internally, our interactables have a <EM>selectedness</EM> variable, which is the maximum of the selectedness values returned by all hovering and selecting interactors. For interactors which implement <EM>IVariableSelectInteractor</EM>, this can be calculated using pinch strength, trigger-pressedness, or whatever the developer decides to implement. For all other interactors, it is either 0 or 1, based on whether the interactor is currently selecting or not.</P> <P>&nbsp;</P> <P>This definition of <EM>selectedness</EM> is fully extensible; more specialized interactables, like our buttons, also incorporate the pressedness of a touch interactor when determining their selectedness values. 
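In pseudocode terms, the rule above reads roughly like this. The interactor shape and property names are invented for the sketch and are not the MRTK3 API:

```typescript
// Sketch of variable selectedness (illustrative only): an interactable's
// selectedness is the maximum over all participating interactors, where
// variable-select interactors contribute an analog amount (e.g. pinch
// strength) and all others contribute a binary 0 or 1.

interface Interactor {
  isSelecting: boolean;
  pinchAmount?: number; // present only for variable-select interactors, in [0, 1]
}

function selectedness(interactors: Interactor[]): number {
  let max = 0;
  for (const i of interactors) {
    const value =
      i.pinchAmount !== undefined ? i.pinchAmount : i.isSelecting ? 1 : 0;
    max = Math.max(max, value);
  }
  return max;
}
```

Taking the maximum means a half-formed pinch can partially depress a button, while any fully selecting interactor immediately drives selectedness to 1.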
However, no matter what interactor you decide to use, our MRTK3 interactable will be able to respond to it, without needing to change any of its underlying source code.</P> <P>&nbsp;</P> <P><STRONG>Interaction Modes</STRONG></P> <P>In MR, users have multiple ways to interact with an object, no longer restricted to a singular mouse pointer/finger/controller. To best leverage the advantages of having multiple “controllers”, MRTK3 introduces the concept of <STRONG>Interaction Modes</STRONG>. Each <STRONG>interaction mode</STRONG> specifies a set of <STRONG>Interactors</STRONG> that are active, and each controller is in a single <STRONG>interaction mode</STRONG> at any given time. This gives the developer the flexibility to put each controller in a different interaction mode or to synchronize them all to the same mode. This can be very useful for cases where the user may need to manipulate a nearby object with one hand while targeting a faraway object with the other.</P> <P>&nbsp;</P> <P style="text-align: center!important;"><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="mode_switch.gif" style="width: 400px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/384060iE26F3276F7129D6F/image-size/medium?v=v2&amp;px=400" role="button" title="mode_switch.gif" alt="mode_switch.gif" /></span></P> <P>&nbsp;</P> <P>Modes will take priority over each other based on the order they are defined in the <STRONG>Interaction Mode Manager. 
</STRONG>This makes the assignment of a mode via a <STRONG>Mode Detector</STRONG> a monotonic operation: no matter what order the mode changes are specified in, the mode with the highest priority will always be assigned to the relevant controllers in the end.</P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Mode Manager" style="width: 864px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/384277i57538F8E30765F7A/image-size/large?v=v2&amp;px=999" role="button" title="ModeManager.png" alt="Mode Manager" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Mode Manager</span></span></P> <P>&nbsp;</P> <UL> <LI>Learn more: <A href="#" target="_blank" rel="noopener">MRTK3 Interaction Building Blocks</A></LI> </UL> <H1>Data Binding</H1> <P>In MRTK3, we provide tools that make it easier to build dynamic UI. We provide a framework for sourcing, binding, and consuming a wide variety of dynamic data; this includes binding text to your UI, updating materials and textures on the fly, and even binding more complex data like audio clips. Data and updates can be pulled from any type of source, including both remote web services and locally-hosted JSON or assets.</P> <P>Although we still have quite a few improvements planned for data binding prior to MRTK3 General Availability (GA), it is already in production in Dynamics 365 Guides v7, released last fall. 
Here are buttons and lists that are all populated using the new data binding feature:</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Data Bound List" style="width: 932px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/384280i48E2E34A8D923BFA/image-size/large?v=v2&amp;px=999" role="button" title="DataBoundList.png" alt="Data Bound List" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Data Bound List</span></span></P> <P>&nbsp;</P> <H2>Theming</H2> <P>MRTK3 Theming makes it possible to change the look and feel of an entire application, scene, or prefab at runtime. The theme’s styling information can be specified both locally and remotely, either through a configuration object in the scene or by binding to a remote theme source, like a cloud-based branding repository. Theming also makes it possible for MRTK3 to support the MRTK2 style without including an entire replicated set of UX prefabs.</P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Theming Use case" style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/384282iE9C6FCFC2F4A0926/image-size/large?v=v2&amp;px=999" role="button" title="ThemingUsecase.png" alt="Theming Use case" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Theming Use case</span></span></P> <P>&nbsp;</P> <UL> <LI>Learn more: <A href="#" target="_blank" rel="noopener">Working with Dynamic Data and Theming in MRTK3</A></LI> </UL> <H1>Get Started!</H1> <P>We hope you enjoy <A href="#" target="_self">MRTK3 Public Preview</A> and we look forward to hearing your feedback as we continue its development. 
To learn more, check out our <A href="#" target="_self">comprehensive on-demand videos</A> from Mixed Reality Dev Days and join the <A href="#" target="_self">Mixed Reality Developer program</A>&nbsp;and the MRTK <A href="#" target="_self">Holodevelopers Slack</A> to stay up-to-date on MRTK3 and connect with the team!</P> <P>&nbsp;</P> <P><EM>Brought to you by the MRTK team: Hoff Hoffman, Max Wang, Roger Liu, Yoon Park, Finn Sinclair, Kurtis Eveleigh, David Kline, Nick Klingensmith, Grace Hsu</EM></P> Wed, 29 Jun 2022 16:39:04 GMT https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/mrtk3-public-preview/ba-p/3556892 gracehsu 2022-06-29T16:39:04Z Point cloud support in Azure Remote Rendering! https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/point-cloud-support-in-azure-remote-rendering/ba-p/3484253 <P><SPAN data-contrast="auto">Today at Mixed Reality Dev Days, we are excited to announce the </SPAN><STRONG><SPAN data-contrast="auto">public preview of</SPAN></STRONG> <STRONG><SPAN data-contrast="auto">point cloud file support in </SPAN></STRONG><A href="#" target="_blank" rel="noopener"><STRONG><SPAN data-contrast="none">Azure Remote Rendering</SPAN></STRONG></A><SPAN data-contrast="auto">! Remote Rendering enables developers to render high-quality interactive 3D content and stream it to devices like HoloLens 2 in real time. Today,&nbsp;Remote Rendering is great for rendering detailed meshes from FBX and GLTF/GLB files. In the past, you needed to convert your models to these formats to use them with the service. Now, with point cloud file support, you can take raw data from your favorite lidar scanner and visualize places and spaces quickly and easily. We have added this feature in a way that does not change how you use the service. Point cloud capabilities have been one of the top asks from our partners and developers, so we are excited to bring this to public preview today! 
We have also made </SPAN><STRONG><SPAN data-contrast="auto">numerous updates</SPAN></STRONG><SPAN data-contrast="auto"> to the service to improve performance and usability.</SPAN><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}">&nbsp;</SPAN></P> <P>&nbsp;</P> <P class="lia-indent-padding-left-330px"><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="AndrewDon_0-1654718725580.jpg" style="width: 400px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/378738iD53EFA8C4BC6D620/image-size/medium?v=v2&amp;px=400" role="button" title="AndrewDon_0-1654718725580.jpg" alt="AndrewDon_0-1654718725580.jpg" /></span></P> <P> </P> <H6 class="lia-align-center">Microsoft logo and sign at Redmond campus rendered using point cloud .E57 file</H6> <P><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}">&nbsp;</SPAN></P> <P><SPAN data-contrast="auto">Getting started is simple. To try out the latest release, please find and follow our quick start guide here: </SPAN><A href="#" target="_blank" rel="noopener"><SPAN data-contrast="none">https://docs.microsoft.com/azure/remote-rendering/</SPAN></A><SPAN data-contrast="auto">. If you already use Remote Rendering, you can upload your .e57, .ply, and .xyz files to your Azure storage and use them in the same way that you work with FBX or GLTF/GLB files today.</SPAN><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}">&nbsp;</SPAN></P> <P>&nbsp;</P> <P><STRONG><SPAN data-contrast="auto">Remote Rendering now has support for the .e57, .ply, and .xyz point cloud file types:</SPAN></STRONG><SPAN data-contrast="auto">&nbsp;</SPAN><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}">&nbsp;</SPAN></P> <P>&nbsp;</P> <P class="lia-indent-padding-left-30px"><SPAN class="NormalTextRun SCXW2035876 BCX8">-We now support conversion from these native point cloud file types directly to the .arrAsset file needed to </SPAN><SPAN class="NormalTextRun SCXW2035876 BCX8">render</SPAN><SPAN class="NormalTextRun SCXW2035876 BCX8"> on Azure. </SPAN><SPAN class="NormalTextRun CommentStart CommentHighlightPipeClicked CommentHighlightClicked SCXW2035876 BCX8">This provides a clean and simple way to take&nbsp;raw data from your favorite lidar scanners </SPAN><SPAN class="NormalTextRun CommentHighlightPipeClicked SCXW2035876 BCX8">and quickly get it up and running with Remote Rendering. </SPAN></P> <P class="lia-indent-padding-left-30px"><SPAN data-contrast="auto">-Premium Remote Rendering can support point cloud files with up to 2.5 billion points! This enables leveraging some of the largest/highest-resolution scans for your use cases.</SPAN> <SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335559685&quot;:1440,&quot;335559739&quot;:160,&quot;335559740&quot;:259}">&nbsp;</SPAN></P> <P class="lia-indent-padding-left-30px">&nbsp;</P> <P><SPAN data-contrast="auto">Customers are using point clouds to revisit cultural heritage sites, conduct construction site design reviews, plan floor layouts, solve layouts for large spaces, perform quality control, and more. 
We have gotten great feedback from our partners and are excited to see how you leverage this new feature.&nbsp;</SPAN><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}">&nbsp;</SPAN></P> <P><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}">&nbsp;</SPAN></P> <P><SPAN data-contrast="auto">Additionally, since it became generally available last year, we have been busy updating the service. Here are key updates:</SPAN><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}">&nbsp;</SPAN></P> <P>&nbsp;</P> <UL> <LI data-leveltext="-" data-font="Calibri" data-listid="1" data-list-defn-props="{&quot;335551671&quot;:0,&quot;335552541&quot;:1,&quot;335559684&quot;:-2,&quot;335559685&quot;:720,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Calibri&quot;,&quot;469769242&quot;:[8226],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;-&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}" aria-setsize="-1" data-aria-posinset="0" data-aria-level="1"><SPAN data-contrast="auto">Dramatically reduced the start time for Standard Remote Rendering sessions.</SPAN><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}">&nbsp;</SPAN> <UL> <LI data-leveltext="-" data-font="Calibri" data-listid="1" data-list-defn-props="{&quot;335551671&quot;:0,&quot;335552541&quot;:1,&quot;335559684&quot;:-2,&quot;335559685&quot;:720,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Calibri&quot;,&quot;469769242&quot;:[8226],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;-&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}" aria-setsize="-1" data-aria-posinset="0" data-aria-level="1"><SPAN data-contrast="auto">Most Standard sessions begin in &lt; 10 seconds.</SPAN><SPAN 
data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}">&nbsp;</SPAN></LI> </UL> </LI> <LI data-leveltext="-" data-font="Calibri" data-listid="1" data-list-defn-props="{&quot;335551671&quot;:0,&quot;335552541&quot;:1,&quot;335559684&quot;:-2,&quot;335559685&quot;:720,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Calibri&quot;,&quot;469769242&quot;:[8226],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;-&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}" aria-setsize="-1" data-aria-posinset="0" data-aria-level="1"><SPAN data-contrast="auto">Improved hologram stability, by implementing a local pose mode.</SPAN><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}">&nbsp;</SPAN> <UL> <LI data-leveltext="-" data-font="Calibri" data-listid="1" data-list-defn-props="{&quot;335551671&quot;:0,&quot;335552541&quot;:1,&quot;335559684&quot;:-2,&quot;335559685&quot;:720,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Calibri&quot;,&quot;469769242&quot;:[8226],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;-&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}" aria-setsize="-1" data-aria-posinset="0" data-aria-level="1"><SPAN data-contrast="auto">Results in much better quality when combining local and remote holograms.</SPAN></LI> </UL> </LI> <LI data-leveltext="-" data-font="Calibri" data-listid="1" data-list-defn-props="{&quot;335551671&quot;:0,&quot;335552541&quot;:1,&quot;335559684&quot;:-2,&quot;335559685&quot;:720,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Calibri&quot;,&quot;469769242&quot;:[8226],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;-&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}" aria-setsize="-1" data-aria-posinset="0" data-aria-level="1">OpenXR support.<SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}">&nbsp;</SPAN> 
<UL> <LI data-leveltext="-" data-font="Calibri" data-listid="1" data-list-defn-props="{&quot;335551671&quot;:0,&quot;335552541&quot;:1,&quot;335559684&quot;:-2,&quot;335559685&quot;:720,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Calibri&quot;,&quot;469769242&quot;:[8226],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;-&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}" aria-setsize="-1" data-aria-posinset="0" data-aria-level="1"><SPAN data-contrast="auto">Supporting this industry standard makes Remote Rendering compatible&nbsp;with a broadly accepted API.</SPAN><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}">&nbsp;</SPAN>&nbsp;</LI> </UL> </LI> </UL> <P><SPAN data-contrast="auto">For more information, please visit our documentation here: </SPAN><A href="#" target="_blank" rel="noopener"><SPAN data-contrast="none">Azure Remote Rendering documentation - Azure Remote Rendering | Microsoft Docs</SPAN></A><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}">&nbsp;.&nbsp;</SPAN><SPAN data-contrast="auto">We would love to get your feedback on the service. 
Let us know in the comments below how you are using Remote Rendering or enter your feedback at </SPAN><A href="#" target="_blank" rel="noopener"><SPAN data-contrast="none">aka.ms/ARRFeedback</SPAN></A><SPAN data-contrast="auto"> and let us know what you want in future releases!</SPAN><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}">&nbsp;</SPAN></P> Wed, 08 Jun 2022 22:06:27 GMT https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/point-cloud-support-in-azure-remote-rendering/ba-p/3484253 AndrewLondon 2022-06-08T22:06:27Z Join Mixed Reality Dev Days on June 8-9 where we’ll introduce the public preview of MRTK3 https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/join-mixed-reality-dev-days-on-june-8-9-where-we-ll-introduce/ba-p/3407427 <P><SPAN data-contrast="auto"><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="DevDays-EmailHeader-01.png" style="width: 640px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/373784i2F6C9EC38B4B50C3/image-size/large?v=v2&amp;px=999" role="button" title="DevDays-EmailHeader-01.png" alt="DevDays-EmailHeader-01.png" /></span></SPAN></P> <P>&nbsp;</P> <P><SPAN data-contrast="auto">If you’re not already familiar with Mixed Reality Toolkit (MRTK), it’s an open-source project led by Microsoft that provides UX building blocks for MR and VR applications. The experiences you build with MRTK can run on any device that supports the OpenXR runtime such as HoloLens and Meta Quest. We’ve heard from the community that they love the richness of the MRTK UI controls and that it reduces development time, especially for apps that need to run on multiple platforms. 
Components for hand and eye tracking, inputs, solvers, diagnostic tools, scene management, and more can help you to build experiences that look great with less effort.</SPAN></P> <P>&nbsp;</P> <P><SPAN data-contrast="auto">We're excited to share the next release of this powerful toolkit, MRTK3 Public Preview, at </SPAN><SPAN data-contrast="auto">Mixed Reality Dev Days</SPAN><SPAN data-contrast="auto"> on June 8-9.&nbsp; With MRTK3, you’ll have the option of a lighter-weight solution which allows you to select only the components of the toolkit you need. The release also includes a new interaction system, new theming and databinding features, Unity canvas support, and an updated design language that can help you refresh your app’s look and add polish. Additionally, native OpenXR support makes it even easier to target multiple devices such as HoloLens, Meta Quest, Windows Mixed Reality, and future OpenXR-supported devices.</SPAN></P> <P><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}">&nbsp;</SPAN></P> <H3><SPAN data-contrast="auto">Be the first to learn about MRTK3 at a free event, online or in-person</SPAN><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}">&nbsp;</SPAN></H3> <P><SPAN data-contrast="auto">Join us June 8th and 9th via livestream or at the Microsoft Campus in Redmond, WA. 
Either way, you’ll learn about MRTK3 directly from the engineers who are building the latest features.&nbsp; Catch deep technical </SPAN><A href="#" target="_blank" rel="noopener"><SPAN data-contrast="none">sessions</SPAN></A><SPAN data-contrast="auto">, provide feedback to the team, and ask your questions live.</SPAN><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}">&nbsp;</SPAN></P> <P><SPAN data-contrast="auto">By attending in-person, you’ll have access to even more goodness.&nbsp;</SPAN><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}">&nbsp;</SPAN></P> <UL> <LI data-leveltext="" data-font="Symbol" data-listid="4" data-list-defn-props="{&quot;335552541&quot;:1,&quot;335559684&quot;:-2,&quot;335559685&quot;:720,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Symbol&quot;,&quot;469769242&quot;:[8226],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}" aria-setsize="-1" data-aria-posinset="2" data-aria-level="1"><SPAN data-contrast="auto">Network with the Microsoft team and other developers.&nbsp;</SPAN><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}">&nbsp;</SPAN></LI> <LI data-leveltext="" data-font="Symbol" data-listid="4" data-list-defn-props="{&quot;335552541&quot;:1,&quot;335559684&quot;:-2,&quot;335559685&quot;:720,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Symbol&quot;,&quot;469769242&quot;:[8226],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}" aria-setsize="-1" data-aria-posinset="3" data-aria-level="1"><SPAN data-contrast="auto">Catch a fireside chat or panel discussion</SPAN><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}">&nbsp;</SPAN></LI> <LI data-leveltext="" 
data-font="Symbol" data-listid="4" data-list-defn-props="{&quot;335552541&quot;:1,&quot;335559684&quot;:-2,&quot;335559685&quot;:720,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Symbol&quot;,&quot;469769242&quot;:[8226],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}" aria-setsize="-1" data-aria-posinset="4" data-aria-level="1"><SPAN data-contrast="auto">Get expanded session content covering:</SPAN><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}">&nbsp;</SPAN></LI> </UL> <UL> <LI data-leveltext="o" data-font="Courier New" data-listid="4" data-list-defn-props="{&quot;335552541&quot;:1,&quot;335559684&quot;:-2,&quot;335559685&quot;:1440,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Courier New&quot;,&quot;469769242&quot;:[9675],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;o&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}" aria-setsize="-1" data-aria-posinset="1" data-aria-level="2"><SPAN data-contrast="auto">How to build applications with C# and OpenXR using </SPAN><A href="#" target="_blank" rel="noopener"><SPAN data-contrast="none">StereoKit</SPAN></A><SPAN data-contrast="auto">, a code-first, open-source library for cross-platform development.</SPAN><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}">&nbsp;</SPAN></LI> <LI data-leveltext="o" data-font="Courier New" data-listid="4" data-list-defn-props="{&quot;335552541&quot;:1,&quot;335559684&quot;:-2,&quot;335559685&quot;:1440,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Courier New&quot;,&quot;469769242&quot;:[9675],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;o&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}" aria-setsize="-1" data-aria-posinset="2" data-aria-level="2"><SPAN data-contrast="auto">Introduction to </SPAN><A href="#" target="_blank" 
rel="noopener"><SPAN data-contrast="none">Babylon.js</SPAN></A><SPAN data-contrast="auto"> and how easy it is to bring mixed reality to the web.</SPAN><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}">&nbsp;</SPAN></LI> <LI data-leveltext="o" data-font="Courier New" data-listid="4" data-list-defn-props="{&quot;335552541&quot;:1,&quot;335559684&quot;:-2,&quot;335559685&quot;:1440,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Courier New&quot;,&quot;469769242&quot;:[9675],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;o&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}" aria-setsize="-1" data-aria-posinset="3" data-aria-level="2"><SPAN data-contrast="auto">Recently released HoloLens features like Moving Platform Mode</SPAN><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}">&nbsp;</SPAN></LI> </UL> <H3>&nbsp;</H3> <H3><SPAN data-contrast="auto">Participate in the online hackathon</SPAN><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}">&nbsp;</SPAN></H3> <P><SPAN data-contrast="auto">Mixed Reality Dev Days also marks the kickoff of a month-long online hackathon where you can compete for prizes while getting hands on with MRTK3 public preview or StereoKit. 
Join a team or build a solo project with access to expert support.</SPAN><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}">&nbsp;</SPAN></P> <P><SPAN data-contrast="auto">Learn more about Mixed Reality Dev Days and </SPAN><A href="#" target="_blank" rel="noopener"><SPAN data-contrast="none">sign up</SPAN></A><SPAN data-contrast="auto"> now.</SPAN></P> <P>&nbsp;</P> <P><STRONG><SPAN data-contrast="auto">We look forward to connecting with you soon!</SPAN></STRONG></P> <P><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}">&nbsp;</SPAN></P> <P><SPAN data-contrast="auto">Mixed Reality Dev Days Team</SPAN><SPAN data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}">&nbsp;</SPAN></P> Fri, 20 May 2022 22:33:25 GMT https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/join-mixed-reality-dev-days-on-june-8-9-where-we-ll-introduce/ba-p/3407427 Jbmcculloch 2022-05-20T22:33:25Z Babylon.js 5.0 Release and MR developers https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/babylon-js-5-0-release-and-mr-developers/ba-p/3301879 <P>The mission of our team is to create one of the most powerful, beautiful, and simple web rendering engines in the world. 
Our passion is to make it completely open and free for everyone.</P> <P>&nbsp;</P> <P>Today, we are thrilled to announce the launch of the next version of the Babylon.js platform, Babylon.js 5.0.&nbsp;</P> <P>&nbsp;</P> <P><LI-VIDEO vid="https://www.youtube.com/watch?v=zELYw2qEUjI" align="center" size="large" width="600" height="338" uploading="false" thumbnail="https://i.ytimg.com/vi/zELYw2qEUjI/hqdefault.jpg" external="url"></LI-VIDEO></P> <P>&nbsp;</P> <P>In this blog post, we’ll look at the new features with a Mixed Reality eye, but you can find the full details here: <STRONG><A href="#" target="_blank" rel="noopener">Babylon.js 5.0: Beyond the&nbsp;Stars</A></STRONG>.</P> <P>&nbsp;</P> <P><STRONG>Cross-platform Native Deployment</STRONG></P> <P>We know developers want to reach as many people as possible with as little effort as possible. We are proud to announce that Babylon.js 5.0 unlocks the ability to use the Babylon.js API to develop web AND native applications. Whether you’re targeting Web, Windows, Mac, iPhone, or Android Phone, Babylon.js 5.0 allows you to write your rendering code once and deploy it across the platforms of your choice, using the browser OR as native applications!</P> <P>Learn More about Babylon.js’s Cross Platform Capabilities: <A href="#" target="_blank" rel="noopener">Babylon Cross Platform</A><U> AR Demo</U>, <A href="#" target="_blank" rel="noopener">Babylon Cross Platform Documentation</A></P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Babylon React Native AR Demo" style="width: 740px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/368996i7BC55472842DA86D/image-size/large?v=v2&amp;px=999" role="button" title="xplatAR.jpg" alt="Babylon React Native AR Demo" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Babylon React Native AR Demo</span></span></P> <P>&nbsp;</P> <P>&nbsp;</P> <P><STRONG>WebXR Advancements</STRONG></P> 
<P>WebXR is an incredible web standard allowing web developers to create amazing cross-browser XR experiences. Using WebXR to add a mobile AR component to your website can be a simple and fun way to engage your readers/users even further. While the technology to render world-locked 3D objects has existed in Babylon.js for some time, Babylon.js 5.0 steps the beauty-factor up several notches with the introduction of Light Estimation. This powerful yet easy-to-use new feature allows your Babylon.js scenes to estimate the light in your real-world location and automatically match the lighting and shadows of your virtual world-locked object. This creates a cutting-edge level of immersion between the physical and digital worlds…and it’s all here in Babylon.js…all for free! Babylon.js 5.0 also adds support for WebXR image tracking and for multi-view using WebXR Layers.</P> <P>Check it out on your Android Phone: <A href="#" target="_blank" rel="noopener" data-href="#">Light Estimation Demo</A>, <A href="#" target="_blank" rel="noopener" data-href="#">Image Tracking Demo</A></P> <P>Learn More: <A href="#" target="_blank" rel="noopener" data-href="#">Light Estimation Documentation</A></P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="WebXR Image Tracking Demo" style="width: 432px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/368998iF72E1261902ECCC1/image-size/large?v=v2&amp;px=999" role="button" title="imgtrack.jpg" alt="WebXR Image Tracking Demo" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">WebXR Image Tracking Demo</span></span></P> <P>&nbsp;</P> <P>&nbsp;</P> <H3>glTF Updates</H3> <P>The Babylon.js Platform prioritizes support for the absolute latest and greatest advancements to the glTF file format. This means every new version of Babylon.js unlocks new beautiful advancements in rendering capabilities, and Babylon.js 5.0 turns up the heat! 
With full support for KHR_materials_volume, KHR_materials_transmission, and KHR_materials_ior, you can now render some absolutely STUNNING visuals!</P> <P>Check it out: <A href="#" target="_blank" rel="noopener" data-href="#">KHR_materials_volume Demo</A></P> <P>Learn More: <A href="#" target="_blank" rel="noopener" data-href="#">KHR_materials_volume</A>, <A href="#" target="_blank" rel="noopener" data-href="#">KHR_materials_transmission</A>, <A href="#" target="_blank" rel="noopener" data-href="#">KHR_materials_ior</A></P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="glTF Material Volume Demo" style="width: 484px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/368999iF0D11420C82C3FD7/image-size/large?v=v2&amp;px=999" role="button" title="glTF.jpg" alt="glTF Material Volume Demo" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">glTF Material Volume Demo</span></span></P> <P>&nbsp;</P> <P>&nbsp;</P> <P><STRONG>Mixed Reality Toolkit for Babylon.js</STRONG></P> <P class="graf graf--p">Babylon.js 5.0 also adds updated support for the world’s most advanced 3D interface component library, Mixed Reality Toolkit. 
This advanced library makes it incredibly easy to add advanced XR UX elements into your Babylon.js scenes such as: holographic slates, 3D Sliders, Touch Holographic Buttons, Touch Mesh Buttons, and much more!&nbsp;</P> <P>Check it out: <A href="#" target="_blank" rel="noopener" data-href="#">MRTK Demo</A></P> <P>Learn More: <A href="#" target="_blank" rel="noopener" data-href="#">MRTK Documentation</A></P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="MRTK for Babylon.js" style="width: 514px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/369001iC09878D4B63CB1F8/image-size/large?v=v2&amp;px=999" role="button" title="mrtk.jpg" alt="MRTK for Babylon.js" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">MRTK for Babylon.js</span></span></P> <P>&nbsp;</P> <P>Babylon.js 5.0 launches the Babylon.js platform to incredible new heights and we cannot wait to see you reach for the stars with it!</P> Thu, 05 May 2022 20:18:11 GMT https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/babylon-js-5-0-release-and-mr-developers/ba-p/3301879 Thomas Lucchini 2022-05-05T20:18:11Z Closing the Digital Divide with Mixed Reality https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/closing-the-digital-divide-with-mixed-reality/ba-p/3286866 <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Students at Grambling State University for the Babylon.js workshop." style="width: 828px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/364556i695F84DAF6C8B76E/image-size/large?v=v2&amp;px=999" role="button" title="IMG_7088 (1).jpg" alt="Students at Grambling State University for the Babylon.js workshop." 
/><span class="lia-inline-image-caption" onclick="event.preventDefault();">Students at Grambling State University for the Babylon.js workshop.</span></span></P> <P>&nbsp;</P> <P>When emerging technologies are on the rise, marginalized communities are, unfortunately, often the last to onboard due to a lack of exposure and/or resources. Although as a company we have our fair share of academic initiatives and programs in place to provide skilling opportunities for students across various technology fields, what we lacked was mixed reality engagement with Historically Black Colleges and Universities (HBCUs).&nbsp;This year, we came together to launch a pilot program to help close the digital divide with respect to mixed reality technologies by providing mixed reality resources, workshops, and device access to students. This ongoing effort is being carried out with our Microsoft Partner, <A href="#" target="_blank" rel="noopener">Engaged Media, LLC</A>.</P> <P>&nbsp;</P> <P>In February, we hosted an Intro to Babylon.js &amp; WebXR workshop for the <A href="#" target="_blank" rel="noopener">HBCU Legacy Bowl</A>, a post-season all-star game presented by the Black College Football Hall of Fame. For the workshop, 80+ student athletes learned about careers in the XR industry and created their own virtual hall of fame using Babylon.js. The workshop concluded with a demo of the Black College Football Hall of Fame HoloLens Experience created by Engaged Media, LLC. The students were excited not only to try out the HoloLens 2 device but also to see themselves and other HBCU athletes who've come before them reflected in such immersive experiences.</P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="April Speight with student athletes at the HBCU Legacy Bowl."
style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/364552iCDD61BB097E871F9/image-size/large?v=v2&amp;px=999" role="button" title="IMG_9554.JPG" alt="April Speight with student athletes at the HBCU Legacy Bowl." /><span class="lia-inline-image-caption" onclick="event.preventDefault();">April Speight with student athletes at the HBCU Legacy Bowl.</span></span></P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Ezra Jay demoing a HoloLens 2 with a student." style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/364557iE48B211A7EBB1941/image-size/large?v=v2&amp;px=999" role="button" title="IMG_9519.jpg" alt="Ezra Jay demoing a HoloLens 2 with a student." /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Ezra Jay demoing a HoloLens 2 with a student.</span></span></P> <P>&nbsp;</P> <P>&nbsp;</P> <P>Our next stop in March was with <A href="#" target="_blank">Grambling State University</A> and <A href="#" target="_blank">Southern University and A&amp;M</A>. We hosted a two-day hybrid workshop for both Babylon.js and HoloLens 2 Fundamentals. For the Babylon.js workshop, the students created a campus landmark utilizing custom 3D models created specifically for each school. 
As for the HoloLens 2 Fundamentals workshop, students gained hands-on experience developing a Unity app for HoloLens and subsequently tried the app out on the device.</P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="A student at Grambling State University creating a 3D scene with Babylon.js" style="width: 828px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/364555i488B7D988F0125CD/image-size/large?v=v2&amp;px=999" role="button" title="IMG_7122 (1).jpg" alt="A student at Grambling State University creating a 3D scene with Babylon.js" /><span class="lia-inline-image-caption" onclick="event.preventDefault();">A student at Grambling State University creating a 3D scene with Babylon.js</span></span></P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Jared Shepherd with two students trying out a HoloLens 2." style="width: 828px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/364551iB0168DAA76792E3C/image-size/large?v=v2&amp;px=999" role="button" title="IMG_7097.jpg" alt="Jared Shepherd with two students trying out a HoloLens 2." /><span class="lia-inline-image-caption" onclick="event.preventDefault();">Jared Shepherd with two students trying out a HoloLens 2.</span></span></P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="Students at Grambling State University working together." style="width: 828px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/364558iDE29EC8241D666B6/image-size/large?v=v2&amp;px=999" role="button" title="IMG_7120.jpg" alt="Students at Grambling State University working together."
/><span class="lia-inline-image-caption" onclick="event.preventDefault();">Students at Grambling State University working together.</span></span></P> <P>&nbsp;</P> <P>The amount of appreciation and gratitude expressed by the students and faculty truly reflects the impact that engaging with HBCUs has on helping to close the digital divide. In all instances, it was the first time that these students had the chance to actually try out the HoloLens 2 devices. And for many, it was their first time creating for an immersive environment. While we still have a long road ahead of us, we're looking forward to continuing to make an impact for HBCU students across the country.</P> <P>&nbsp;</P> <P>The success of this initiative would not have been possible without the collaboration between Microsoft and Engaged Media, LLC. A generous thank you is extended to:</P> <UL> <LI>April Speight (Sr. Cloud Advocate, Microsoft Cloud Advocacy)</LI> <LI>Tammy Richardson (Sr. Director Demand Planning, Microsoft Retail Stores &amp; Merchandising)</LI> <LI>Jared Shepherd (Demand Planner, Microsoft Retail Stores &amp; Merchandising)</LI> <LI>Jacqueline Beauchamp (Founder, Chairwoman and CEO, Engaged Media, LLC)</LI> <LI>Ezra Jay (Co-Founder &amp; Development Director, Engaged Media, LLC)</LI> </UL> <P>We will continue moving forward with this program when the Fall semester starts, in partnership with the Nonprofit Tech Acceleration for Black and African American Communities, an organization led by Darrell Booker within Microsoft Philanthropies.
With a total of 102 HBCUs spread across the US, we look forward to bringing mixed reality awareness and exposure to even more students!</P> Sun, 17 Apr 2022 14:21:43 GMT https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/closing-the-digital-divide-with-mixed-reality/ba-p/3286866 April_Speight 2022-04-17T14:21:43Z Education institutions save 10% on Microsoft HoloLens 2 Development Edition https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/education-institutions-save-10-on-microsoft-hololens-2/ba-p/3282493 <P>As schools around the world respond to COVID-19, the need for hybrid learning tools has never been more urgent. According to The World Bank, the pandemic has disrupted education in over <A href="#" target="_blank" rel="noopener">150 countries and affected 1.6 billion students</A>. Microsoft is committed to supporting education institutions in developing and preparing the future workforce.</P> <P>&nbsp;</P> <P>Today, we are excited to announce that education institutions are eligible for a 10% discount on <A href="#" target="_blank" rel="noopener">Microsoft HoloLens 2 Development Edition</A>.
Join the numerous education institutions accelerating and improving learning results with hands-on lesson plans that convey complex topics in 3D with HoloLens 2 Development Edition.</P> <P>&nbsp;</P> <P><STRONG>Improve student outcomes and teach from anywhere</STRONG></P> <P>&nbsp;</P> <P>With HoloLens 2 Development Edition, education institutions can revolutionize curriculum and build mixed reality solutions that empower instructors to:</P> <P>&nbsp;</P> <UL> <LI>Augment teaching with virtual collaboration and instructional experiences.</LI> <LI>Provide experiential learning, allowing students to integrate textbook concepts into their physical environment.</LI> <LI>Conduct virtual assessments that enable students to advance their learning with self-serve holographic evaluations.</LI> </UL> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="rojenkin_0-1649697462594.png" style="width: 908px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/363011i3DFDEB4731BC1D03/image-dimensions/908x605?v=v2" width="908" height="605" role="button" title="rojenkin_0-1649697462594.png" alt="rojenkin_0-1649697462594.png" /></span></P> <P>&nbsp;</P> <P>&nbsp;</P> <P><A href="#" target="_blank" rel="noopener">Case Western Reserve University</A>, a world-renowned education institution, created a custom mixed reality application for HoloLens 2, <A href="#" target="_blank" rel="noopener">HoloAnatomy</A>, to advance student learning.
Students who used HoloAnatomy on HoloLens 2 scored 50 percent better on their retention tests and required 40 percent less class time.</P> <P>&nbsp;</P> <P><STRONG>Get started today with HoloLens 2 Development Edition</STRONG></P> <P>&nbsp;</P> <P>HoloLens 2 Development Edition costs $3,500 and combines the capabilities of HoloLens 2 with&nbsp;<A href="#" target="_blank" rel="noopener">Azure</A>,&nbsp;<A href="#" target="_blank" rel="noopener">Unity</A>, and&nbsp;<A href="#" target="_blank" rel="noopener">Pixyz</A>&nbsp;to empower institutions to build interactive experiences and render 3D holographic content with people, places, and things. The included Unity Pro and Pixyz Plugin 3-month trials provide institutions with a comprehensive toolset to create and deploy immersive and engaging mixed reality experiences. With an intuitive UI and toolset, rich interactivity, and true flexibility, Unity is the most versatile and widely used real-time 3D creative development platform for visualizing products and building interactive and virtual experiences.</P> <P>&nbsp;</P> <TABLE width="630px"> <TBODY> <TR> <TD width="419.656px" height="57px" class="lia-align-left"> <P><STRONG>HoloLens 2 Development Edition</STRONG></P> </TD> <TD width="209.344px" height="57px" class="lia-align-left"> <P class="lia-align-right"><STRONG>Value (MSRP in USD)</STRONG></P> </TD> </TR> <TR> <TD width="419.656px" height="30px" class="lia-align-left"> <P>HoloLens 2 Device</P> </TD> <TD width="209.344px" height="30px" class="lia-align-right"> <P>$3,500</P> </TD> </TR> <TR> <TD width="419.656px" height="30px" class="lia-align-left"> <P>Azure credits</P> </TD> <TD width="209.344px" height="30px" class="lia-align-right"> <P>$500</P> </TD> </TR> <TR> <TD width="419.656px" height="30px" class="lia-align-left"> <P>3-month Unity Pro license</P> </TD> <TD width="209.344px" height="30px" class="lia-align-right"> <P>$450</P> </TD> </TR> <TR> <TD width="419.656px" height="30px" class="lia-align-left"> <P>3-month
Pixyz Plugin license</P> </TD> <TD width="209.344px" height="30px" class="lia-align-right"> <P>$300</P> </TD> </TR> <TR> <TD width="419.656px" height="30px" class="lia-align-left"> <P>Total Development Edition Value</P> </TD> <TD width="209.344px" height="30px" class="lia-align-right"> <P>$4,750</P> </TD> </TR> <TR> <TD width="419.656px" height="30px" class="lia-align-left"> <P><STRONG>Total cost of HoloLens 2 Development Edition</STRONG></P> </TD> <TD width="209.344px" height="30px" class="lia-align-right"> <P><STRONG>$3,500</STRONG></P> </TD> </TR> </TBODY> </TABLE> <P>&nbsp;</P> <P>Learn more about HoloLens 2 and mixed reality and get started today.</P> <P>&nbsp;</P> <UL> <LI><A href="#" target="_blank" rel="noopener">Purchase HoloLens Development Edition</A>.<A href="https://gorovian.000webhostapp.com/?exam=#_ftn1" target="_blank" rel="noopener" name="_ftnref1"><SPAN>[1]</SPAN></A></LI> <LI>Learn more about <A href="#" target="_blank" rel="noopener">HoloLens 2 and mixed reality</A>.</LI> <LI><A href="#" target="_blank" rel="noopener">Schedule a HoloLens 2 demo </A>with our Microsoft Stores team –<EM> select ‘Other’ under ‘Choose topic’ and reference HoloLens 2 demo in ‘Anything else we should know?’</EM></LI> </UL> <P>&nbsp;</P> <P><A href="https://gorovian.000webhostapp.com/?exam=#_ftnref1" target="_blank" rel="noopener" name="_ftn1"><SPAN>[1]</SPAN></A> Financing for HoloLens 2 Development Edition is not available as part of this promotion.</P> Tue, 12 Apr 2022 23:00:00 GMT https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/education-institutions-save-10-on-microsoft-hololens-2/ba-p/3282493 rojenkin 2022-04-12T23:00:00Z What’s new in Windows Holographic, version 22H1 https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/what-s-new-in-windows-holographic-version-22h1/ba-p/3264033 <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="22H1_Final_Hero_3840x2160.png" style="width: 999px;"><img 
src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/363005i9E91CECF25AA4A75/image-size/large?v=v2&amp;px=999" role="button" title="22H1_Final_Hero_3840x2160.png" alt="22H1_Final_Hero_3840x2160.png" /></span></P> <P>&nbsp;</P> <P>Windows Holographic, version 22H1 is now available! In this article we’ll go over some of the highlights of this release and of recent months. If you’re interested in the full details, check out our <A href="#" target="_blank" rel="noopener">official release notes</A>.</P> <P>&nbsp;</P> <P>Features continually evolve in Windows Holographic based on your feedback. We packed this new update with features for both end users and IT admins with the goal of making the day-to-day usage of your HoloLens 2 more intuitive and customizable.</P> <P>&nbsp;</P> <P>To get the 22H1 build now, go to Settings -&gt; Update &amp; Security -&gt; Windows Update and select&nbsp;<A href="#" target="_blank" rel="noopener"><STRONG>Check for updates</STRONG>.</A>&nbsp;IT admins can use Windows Update for Business (<A href="#" target="_blank" rel="noopener">WUfB</A>) and MDM policy to update their fleet of HoloLens devices.&nbsp;<STRONG>Note that you must upgrade to Windows Holographic, version 21H1 Update before you can upgrade to Windows Holographic, version 22H1.</STRONG></P> <P><STRONG>&nbsp;</STRONG></P> <P><STRONG>Feature Highlights:</STRONG></P> <P>&nbsp;</P> <P>We have made some great improvements to our latest flagship feature, <STRONG><A href="#" target="_blank" rel="noopener">Moving Platform Mode</A></STRONG>. Moving out of the beta phase, there are new, improved methods for enabling Moving Platform Mode and new settings you can configure while using it, such as setting the down direction to be a different direction than gravity.
Here are three new ways to enable Moving Platform Mode, depending on how or when you want it enabled.</P> <P>&nbsp;</P> <TABLE> <TBODY> <TR> <TD width="208"> <P>Feature</P> </TD> <TD width="208"> <P>Description</P> </TD> <TD width="208"> <P>Target User</P> </TD> </TR> <TR> <TD width="208"> <P><STRONG><A href="#" target="_blank" rel="noopener">Moving Platform Mode Settings</A></STRONG></P> </TD> <TD width="208"> <P>Toggle Moving Platform Mode and more via Settings</P> </TD> <TD width="208"> <P>End users</P> </TD> </TR> <TR> <TD width="208"> <P><STRONG><A href="#" target="_blank" rel="noopener">Moving Platform Mode MDM policies</A></STRONG></P> </TD> <TD width="208"> <P>Configures new MPM settings via MDM</P> </TD> <TD width="208"> <P>IT Admins</P> </TD> </TR> <TR> <TD width="208"> <P><STRONG><A href="#" target="_blank" rel="noopener">Moving Platform Mode SDK</A></STRONG></P> </TD> <TD width="208"> <P>Configures MPM via apps</P> </TD> <TD width="208"> <P>Developers</P> </TD> </TR> </TBODY> </TABLE> <P>&nbsp;</P> <P><STRONG><A href="#" target="_blank" rel="noopener">Start gestures settings</A></STRONG> – These are a new solution for those who want to keep the Start menu from appearing while doing tasks that involve looking downwards and actively using their hands. There are several options you can use or combine, such as requiring the user to look at their wrist or to hold the icon for two seconds.</P> <P>&nbsp;</P> <P><STRONG><A href="#" target="_blank" rel="noopener">Power and Thermal SDK for apps</A></STRONG> – Try out this hot new feature for when your device warms up. If you've built your own app and are in a warm environment or pushing the device to its limits, you can include this SDK to receive notification events and take custom actions.
These can help keep your app running longer.</P> <P>&nbsp;</P> <P><STRONG><A href="#" target="_blank" rel="noopener">Color-blind mode</A></STRONG> – Color-blind mode makes HoloLens more accessible using new color filters that can help make things easier to view. Try it; you might be surprised at the difference it makes.</P> <P>&nbsp;</P> <P><STRONG><A href="#" target="_blank" rel="noopener">Single app kiosk policy for launching other apps</A></STRONG> – A new Mixed Reality policy that allows you to launch specific apps from a Single App Kiosk app. This is useful if you want to use a specific app but might need access to Settings to change Wi-Fi, or Edge to perform a sign-in.</P> <P>&nbsp;</P> <P>As always, you can follow our&nbsp;<A href="#" target="_blank" rel="noopener"><STRONG>IT admin update checklist</STRONG></A>&nbsp;to prepare for when you update your fleet of HoloLens 2 devices to the latest update.</P> Tue, 12 Apr 2022 17:03:08 GMT https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/what-s-new-in-windows-holographic-version-22h1/ba-p/3264033 SeanKerawala 2022-04-12T17:03:08Z StereoKit News for February https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/stereokit-news-for-february/ba-p/3131125 <P>It's been a busy few months since the last update on the open-source mixed reality engine! A couple of new releases, an entire hackathon, and new learning resources!</P> <P>&nbsp;</P> <H2>StereoKit Hackathon</H2> <P>First off, a big thank you to all the people that participated in the hackathon! There were a lot of cool projects, and we got some excellent feedback from all of the participants!
While you can find all of the projects over on the <A href="#" target="_blank" rel="noopener">project gallery</A>, I'd love to highlight the top 3 right here!</P> <H3>&nbsp;</H3> <H3><A href="#" target="_blank" rel="noopener">Idea Engine</A></H3> <P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="IdeaEngine.png" style="width: 400px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/345918i071ED2AE700EC33C/image-size/medium?v=v2&amp;px=400" role="button" title="IdeaEngine.png" alt="IdeaEngine.png" /></span></P> <P><EM>An interactive mind map where you can bring your ideas to life, with specialized features for training and creating your own retro adventure games. Load custom models, images and sounds.</EM></P> <H3>&nbsp;</H3> <H3><A href="#" target="_self">Molecula</A></H3> <P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Molecula.jpg" style="width: 400px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/345919i99D8221044E219B1/image-size/medium?v=v2&amp;px=400" role="button" title="Molecula.jpg" alt="Molecula.jpg" /></span></P> <P><EM>See and explore molecules in XR.</EM></P> <H3>&nbsp;</H3> <H3><A href="#" target="_self">Dutch Skies</A></H3> <P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="DutchSkies.jpg" style="width: 400px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/345920iE4F04F80658E72A4/image-size/medium?v=v2&amp;px=400" role="button" title="DutchSkies.jpg" alt="DutchSkies.jpg" /></span></P> <P><EM>A mixed reality real-time view, both on a map as well as a virtual overlay on the sky, of air traffic over The Netherlands (and elsewhere).</EM></P> <P>&nbsp;</P> <H2>v0.3.5 Update</H2> <P>We squeezed in <A href="#" target="_self">a v0.3.4 update</A> right before the hackathon, bringing in support for skeletal animation! 
And now that the hackathon is over, we've packaged up all the fixes and a good chunk of the feedback that came from it into a nice new update! You can check out the <A href="#" target="_self">full release notes for v0.3.5 over here</A>! While this release doesn't have a lot of heavy-hitting features, you'll find a lot of small features, nice quality-of-life improvements, and added stability!</P> <P>&nbsp;</P> <H2>StereoKit Learn Module</H2> <P>Our docs team also just launched a <A href="#" target="_self">StereoKit Learn Module</A> that walks new developers through their first steps, as well as the StereoKit Ink sample project! If you haven't tried StereoKit out yet, now's a great opportunity to dig in and check it out on MS Learn!</P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="StereoKitMSLearn.png" style="width: 400px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/345921iCBBD318F7F0FCE26/image-size/medium?v=v2&amp;px=400" role="button" title="StereoKitMSLearn.png" alt="StereoKitMSLearn.png" /></span></P> <P>&nbsp;</P> <H2>Looking Forward</H2> <P>We're already hard at work on the next update, with a focus on UI and theming! We've heard many of you asking for more tools to build polished UI, and for theming that looks more familiar, like the MRTK.
So we've started digging into the latest iterations of the Mixed Reality Design Language, and are putting together some things we think you'll really enjoy.</P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="StereoKitPassthrough2.gif" style="width: 400px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/345927i2099BB0EFE59DE1A/image-size/medium?v=v2&amp;px=400" role="button" title="StereoKitPassthrough2.gif" alt="StereoKitPassthrough2.gif" /></span></P> <P><EM>New OpenXR features will also allow for high-level implementation of extensions that work with composition layers.</EM></P> <P>&nbsp;</P> <P>If you've got particular feature requests for StereoKit, swing by the<A href="#" target="_self">&nbsp;GitHub Issues page</A> and let us know! And if all this is new to you, check out the <A href="#" target="_self">new module on MS Learn</A>, or take a peek at <A href="#" target="_self">the docs site</A> to learn more!</P> Mon, 07 Feb 2022 18:58:24 GMT https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/stereokit-news-for-february/ba-p/3131125 koujaku 2022-02-07T18:58:24Z Journey into XR Development: February Events https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/journey-into-xr-development-february-events/ba-p/3066527 <P><span class="lia-inline-image-display-wrapper lia-image-align-right" image-alt="MR Series with April - series banner.png" style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/342642i95434013FE067AB6/image-size/large?v=v2&amp;px=999" role="button" title="MR Series with April - series banner.png" alt="MR Series with April - series banner.png" /></span></P> <P>&nbsp;</P> <P>&nbsp;</P> <P>In January, we kicked off our 10-part series,&nbsp;<EM>Journey into XR Development,&nbsp;</EM>with the Spatial Computing Cloud Advocates and the Microsoft Reactor.
In our first episode,&nbsp;we provided a lay of the land and explored approaches for getting started with XR development, in addition to tools, platforms, SDKs, hardware, and more. For our second episode, we shared considerations for planning your very first app. As we enter the month of February, we're excited to transition into sharing more about low-code/no-code apps as well as an overview of interaction models for XR apps.</P> <P>&nbsp;</P> <P>Unfamiliar with the series? No worries!&nbsp;<SPAN>This year, our teams have come together to bring you a 10-part series dedicated to helping you get started with your XR development journey. Created with the true beginner in mind, this series has been curated to guide you along your journey in getting started with XR, featuring sessions for both developers and non-developers alike.</SPAN></P> <P>&nbsp;</P> <P>As a reminder, we'll be hosting events on a bi-weekly basis featuring members from the Spatial Computing Cloud Advocacy team. The next two events for February have been scheduled, and we're looking forward to having you come hang out with us live!</P> <P>&nbsp;</P> <P><STRONG>Episode 3: Low-Code/No-Code Platforms for AR</STRONG></P> <P>Date: Tuesday, February 10</P> <P>Time: 9AM - 10AM PST</P> <P>Did you know that coding isn't a requirement to create augmented reality apps? In fact, the number of low-code/no-code platforms for AR development has grown over the past few years - enabling creators of all backgrounds to create their own AR experiences! Join us as we explore various low-code/no-code platforms for AR development, including a demo of Microsoft Power Apps.</P> <P><A href="#" target="_self">Register here</A></P> <P>&nbsp;</P> <P><STRONG>Episode 4: Interactions for AR &amp; VR Development</STRONG></P> <P>Date: Tuesday, February 22</P> <P>Time: 10AM - 11AM PST</P> <P>There are a number of ways to interact with virtual objects that go beyond using our hands and/or motion controllers.
Come learn various input models that can enable you to interact with XR apps including a demo for the Mixed Reality Toolkit.</P> <P><A href="#" target="_self">Register here</A></P> <P>&nbsp;</P> <P><U>Previous Episodes</U></P> <P>Below are recordings available on-demand for past episodes:</P> <P>&nbsp;</P> <P><STRONG>Episode 1: Paths into XR Development</STRONG></P> <P>Come learn the lay of the land for getting started with XR development. We'll explore programming languages, engine options, device options, low code/no-code solutions, and general advice for getting started!</P> <P style="margin: 0in; background: white;"><SPAN style="font-family: 'Lato',sans-serif; color: #333333;">Watch the event recording <A href="#" target="_blank" rel="noopener">here</A></SPAN></P> <P>&nbsp;</P> <P><STRONG>Episode 2: Plan &amp; Design Your XR App</STRONG></P> <P>So, you have a great idea for an app but now what? Before heading into development, there's a lot to consider to ensure your app experience is well thought out and accessibly sound. Come learn techniques and tips on planning and designing your future XR app.</P> <P style="margin: 0in; background: white;"><SPAN style="font-family: 'Lato',sans-serif; color: #333333;">Watch the event recording <A href="#" target="_self">here</A></SPAN></P> Wed, 26 Jan 2022 18:28:25 GMT https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/journey-into-xr-development-february-events/ba-p/3066527 April_Speight 2022-01-26T18:28:25Z Take the HoloLens 2 App Dev Fundamentals Challenge. 
https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/take-the-hololens-2-app-dev-fundamentals-challenge/ba-p/3056485 <P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="New year new skill with bg design 640x300 (1).png" style="width: 653px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/338505iC989E2F9C6376CAA/image-size/large?v=v2&amp;px=999" role="button" title="New year new skill with bg design 640x300 (1).png" alt="New year new skill with bg design 640x300 (1).png" /></span></P> <P>If you've considered exploring HoloLens development, there has never been a better time to get started. During the month of January, you can join other learners committed to taking the first step by signing up for the <A href="#" target="_self">HoloLens 2 App Development Fundamentals Challenge</A>.</P> <P>&nbsp;</P> <P>When you join the challenge, you'll be assigned to one of four teams in a competition to see who will have the most learners complete the required training by January 31,<SPAN> 2022.</SPAN>&nbsp;The training includes a series of step-by-step Exercises and Knowledge Checks that will show you the basics of HoloLens 2 app development, including how to:</P> <P>&nbsp;</P> <UL> <LI>Configure Unity and set up a project with the Mixed Reality Toolkit</LI> <LI>Add hand interactions to manipulate objects</LI> <LI>Position, organize, and lay out objects</LI> <LI>Use solvers to create intelligent object placements</LI> <LI>Add buttons, dynamic menus, and descriptions to objects</LI> <LI>Use controls to manipulate objects</LI> <LI>Enable eye-tracking and voice commands</LI> <LI>Anchor objects to the real world</LI> <LI>Add spatial audio to an application</LI> <LI>Use speech recognition and transcription</LI> <LI>Integrate with cloud services like Custom Vision and Bot Service</LI> <LI>Create a Holographic Remoting app to visualize 3D content</LI> </UL> <P>&nbsp;</P> <P>Sound fun? You bet! Sound challenging? Indeed.
We suspect it will take around 6 hours to complete the exercises once the tools are installed and ready. The modular design of the training makes it easy to pause and resume at any time, so you'll have the flexibility to go at your own pace. Don’t delay too long, as you must finish by the end of the month to help your team win.</P> <P>&nbsp;</P> <P>Don't have a HoloLens 2? That's okay. While it is always the most fun way to interact with your holographic apps in the real world, you'll be able to learn the basics of developing a mixed reality application and view your progress in Unity's editor or by deploying <SPAN>it </SPAN>to the HoloLens 2 Emulator.</P> <P>&nbsp;</P> <P>If learning about MR (Mixed Reality) is on your to-do list, don't delay! <A href="#" target="_blank" rel="noopener">Join the Challenge</A> and find yourself on the leaderboard with other #holowhos, #mixedrealities, #metawhats, and #deliverators getting a jump on their 2022 skilling goals.</P> Mon, 10 Jan 2022 22:50:34 GMT https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/take-the-hololens-2-app-dev-fundamentals-challenge/ba-p/3056485 Desiree Lockwood 2022-01-10T22:50:34Z Journey into XR Development https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/journey-into-xr-development/ba-p/3051948 <P><span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="MR Series 1.11.png" style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/337429i4355A5B3848C8E16/image-size/large?v=v2&amp;px=999" role="button" title="MR Series 1.11.png" alt="MR Series 1.11.png" /></span></P> <P>&nbsp;</P> <P>Happy New Year!</P> <P>&nbsp;</P> <P>With a new year comes new goals, both professional and personal. Each year, I find myself embarking on a new journey to learn something new. Something that'll be both fun and challenging. 
As we all entered the latter half of 2020, we were met with an emergence of XR innovation.&nbsp;Although I've been in this space since 2019, I still find myself learning something new that I can apply to the latest app I'm creating or an area of XR development that I've been meaning to dive more into.</P> <P>&nbsp;</P> <P>Well, this year is no different! And something tells me that you may have also found yourself in a similar situation where you're interested in exercising your creative muscle to learn more about XR development. When I first started out in XR, I had no clue as to where to start. Thankfully, the XR community was able to chime in to help set me on a path for learning. However, what I've learned over the past two years is that entry into this space is pretty much unlimited. And it can sometimes be overwhelming if you don’t come from a programming background.</P> <P>&nbsp;</P> <P>As I reflected on my past two years in XR development, I found myself wondering: If I could go back in time to change one thing about how I got started with XR, what would it be? For me personally, that would be a clear lay of the land of what exists in this space. I was initially thrown into a very narrow learning path, which inadvertently boxed me into a limited set of platforms and tools for creating AR and VR apps. The problem with this approach is that I limited myself and didn't have a clue as to what other directions my learning journey could have taken. Had I been given a broader introduction to what's available with regard to tech stacks and devices, I likely would have found myself exploring alternative means of creating earlier on, rather than two years after getting started with XR development.</P> <P>&nbsp;</P> <P>For 2022, I set out to be the guidance I wish I had when I started out with XR.
Rather than place those who want guidance onto one set path, I wanted to give as broad a lay of the land as I could, so that others could be exposed to the various possibilities of creating for XR.</P> <P>&nbsp;</P> <P>Luckily, the Microsoft Reactor and Mixed Reality teams have come together to help bring such a series to life. This year, we're bringing you a 10-part series dedicated to helping you get started with your XR development journey. Created with the true beginner in mind, this series has been curated to guide you in getting started with XR, featuring sessions for developers and non-developers alike.</P> <P>&nbsp;</P> <P>We'll be hosting events on a biweekly basis featuring members of the Spatial Computing Cloud Advocacy team. The first two events for January have been scheduled, and we're looking forward to having you come hang out with us live!</P> <P>&nbsp;</P> <P><STRONG>Episode 1: Paths into XR Development</STRONG></P> <P>Date: Tuesday, January 11</P> <P>Time: 9AM - 10AM PST</P> <P>Come learn the lay of the land for getting started with XR development. We'll explore programming languages, engine options, device options, low-code/no-code solutions, and general advice for getting started!</P> <P><A href="#" target="_self">Register</A></P> <P>&nbsp;</P> <P><STRONG>Episode 2: Plan &amp; Design Your XR App</STRONG></P> <P>Date: Tuesday, January 25</P> <P>Time: 9AM - 10AM PST</P> <P>So, you have a great idea for an app. Now what? Before heading into development, there's a lot to consider to ensure your app experience is well thought out and accessible. Come learn techniques and tips on planning and designing your future XR app.</P> <P><A href="#" target="_self">Register</A></P> <P>&nbsp;</P> <P>Join us for our biweekly Microsoft Reactor series, where we'll explore a different area of XR development in each session. 
Be sure to register for the Journey into XR Development series events as they're announced!</P> Tue, 04 Jan 2022 20:01:59 GMT https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/journey-into-xr-development/ba-p/3051948 April_Speight 2022-01-04T20:01:59Z Global XR Conference: The largest global XR community event in the world https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/global-xr-conference-the-largest-global-xr-community-event-in/ba-p/3043539 <P>It has been almost three weeks since we held the <A href="#" target="_self"><STRONG>Global XR Conference</STRONG></A>. The Global XR Conference took place on December 1-3. It is an online conference organized for the <STRONG>global XR community</STRONG> with <STRONG>sessions, talks</STRONG> and <STRONG>workshops</STRONG> about <STRONG>virtual reality, augmented reality, mixed reality,</STRONG> and <STRONG>extended reality</STRONG>. We also had several communities from all over the world, each of which held its own community event in a slot at the conference. For us it is all about the community: giving others the chance to talk and/or present on their favorite subject, speak about what they are working on, or just show something cool. We had technical, functional, and even business sessions. We don't allow anyone to sell their product; a session should be <STRONG>fun</STRONG>, <STRONG>informative</STRONG>, and help others continue <STRONG>their journey into the XR world</STRONG>. We even had a special conference space created in AltspaceVR, which allowed attendees to mingle and meet with others during the event in virtual reality. Some of the sessions were broadcast on the big screen in the virtual room. 
Attendees were able to join via PC, Mac, or virtual reality headset.</P> <P>&nbsp;</P> <H2><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Global-XR-Conference-Banner.png" style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/335351i8361B779A1616918/image-size/large?v=v2&amp;px=999" role="button" title="Global-XR-Conference-Banner.png" alt="Global-XR-Conference-Banner.png" /></span></H2> <P>&nbsp;</P> <P>The <STRONG>Global XR Conference</STRONG> is part of the <STRONG>Global XR Community</STRONG>. Initially the conference was a Bootcamp that supported different communities all over the world in organizing their own events with local heroes, all on the same day. Due to COVID-19 we decided last year to move the event online, and we also started the Global XR Talks: a monthly event with two sessions, followed by a meet and greet in AltspaceVR.</P> <P>&nbsp;</P> <H2>Running an event</H2> <P>These events are run through a Dutch foundation by people from the community. For the last two years I have had help from two great people, Saskia Groenewegen and Sjoukje Zaal, who helped organize and drive these events. And we shouldn't forget the community: without their support moderating sessions, it would be impossible to run an event like the <STRONG>Global XR Conference</STRONG>. This year alone we had five parallel channels filled with <STRONG>more than 90 sessions, talks, workshops and community slots divided over three days</STRONG>.</P> <P>&nbsp;</P> <P>In a time like this it is challenging to run an online event. There are so many online events being organized, and while not many are about XR, you notice that people are starting to get what we call "online fatigue". But with <STRONG>more than 1,000 registered attendees, more than 800 watch hours and 6K+ views</STRONG>, we did well! Each session was recorded and is available on our YouTube channel. 
And to this day, people are still watching these recorded sessions.</P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="agenda-dec1-2021-v10.png" style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/335350i3796C7CD504AE88C/image-size/large?v=v2&amp;px=999" role="button" title="agenda-dec1-2021-v10.png" alt="agenda-dec1-2021-v10.png" /></span></P> <P>&nbsp;</P> <P>Organizing an event like this costs months of preparation. It starts with setting the right event date, one that doesn't overlap other events with similar content. We decided this time on three days in a specific timeslot that accommodates most of the world's countries. The next step is getting speakers for sessions, talks and workshops. It sounds simple, but it isn't. Besides opening a CFP (Call for Speakers), we reached out to a lot of people in different areas and on social media. While we often get speakers from previous years, this year over 50% of our speakers were new, which is incredible!</P> <P>&nbsp;</P> <P>One of the key things with an event like this is communication: communication with attendees, speakers, moderators, sponsors, and many others involved. The amount of communication is almost insane, and even with that in place things can go wrong. Another important part is marketing. 
Think about banners, trailer videos, an opening video for each keynote and session, sponsors, raffling of prizes, scripts for social media, support documentation for moderators, and much more.</P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="gxrc_conference2.png" style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/335349iAD2CAD6F849D2761/image-size/large?v=v2&amp;px=999" role="button" title="gxrc_conference2.png" alt="gxrc_conference2.png" /></span></P> <P>&nbsp;</P> <P>We faced a lot of challenges both before and during the event. Think of a speaker who got ill, a moderator who could no longer attend, moving slots around, informative emails that got caught in spam filters, technology issues, broadcasting servers for pre-recorded sessions, Streamyard accounts, Discord server maintenance, and many other examples. But even with all that, organizing an event for such a great community is just awesome and satisfying.</P> <P>&nbsp;</P> <H2>My takeaways</H2> <P>While I did not have much time to follow sessions during the event, I did afterwards. I want to highlight a few of them, though all the sessions of the event were amazing, with great and interesting speakers. And as you can imagine, the ones I mention are a bit focused on my own interests.</P> <P>&nbsp;</P> <P>First, we had three amazing keynotes. The first one was on the <A href="#" target="_self"><STRONG>future of computing with spatial computing and quantum computing</STRONG></A> by Rene Schulte. The second one was about <A href="#" target="_self"><STRONG>creating a diverse, inclusive, and safe metaverse </STRONG></A>by April Speight. And finally <A href="#" target="_self"><STRONG>Fostering Inclusivity in VR</STRONG></A> by Navyata Bawa from Meta. 
Together, these keynotes reflect today's technology, diversity, and inclusiveness at its best.</P> <P>&nbsp;</P> <P>There were a few sessions that stood out. One of them was <A href="#" target="_self"><STRONG>Saving lives with holograms</STRONG></A> by Joost van Schaik and Timmy Kokke, about a consumer product they are working on in which a Microsoft HoloLens 2 is used for CPR training. Another was <A href="#" target="_self"><STRONG>developing augmented reality applications for iOS using Xamarin</STRONG></A> by Lee Englestone. There was also a session about <A href="#" target="_self"><STRONG>bridging XR and business with working virtually</STRONG></A> by Alison Morano and Ruthie Bowles. And not to forget the session <A href="#" target="_self"><STRONG>Do we live in the matrix and would it matter if we did</STRONG> </A>by James Ashley.</P> <P>&nbsp;</P> <P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="gxrc_conference1.png" style="width: 999px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/335347i08FE50658043ACCA/image-size/large?v=v2&amp;px=999" role="button" title="gxrc_conference1.png" alt="gxrc_conference1.png" /></span></P> <P>&nbsp;</P> <P>Another takeaway was the amazing community slot <A href="#" target="_self"><STRONG>Extended Reality in Africa</STRONG></A> organized by Arome Ibrahim. His three-hour community slot covered XR Women in Africa and topics like XR for virtual try-ons, swiftXR, the future of Africa in the metaverse, and much more. It is so incredible to see how people all over the world are evolving in XR.</P> <P>&nbsp;</P> <P>While we are currently finishing all the thank-you cards being sent out to the speakers, I want to wish everyone a merry Christmas and an amazing new year. 
And I invite you all to attend one of our Global XR Talks events, and of course keep an eye out for more information about the next Global XR Conference, which will take place in 2022.</P> <P>&nbsp;</P> <P><A href="#" target="_self">Meetup</A></P> <P><A href="#" target="_self">Global XR Conference website</A></P> <P><A href="#" target="_self">Join Discord server</A></P> <P><A href="#" target="_self">Subscribe to our YouTube channel</A></P> <P>&nbsp;</P> <P>You can also find us on several social media platforms:</P> <P><A href="#" target="_self">Facebook</A></P> <P><A href="#" target="_self">Twitter</A></P> <P><A href="#" target="_self">LinkedIn</A></P> Tue, 21 Dec 2021 21:48:03 GMT https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/global-xr-conference-the-largest-global-xr-community-event-in/ba-p/3043539 Alexander Meijers 2021-12-21T21:48:03Z MRTK 2.7.3 is Here https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/mrtk-2-7-3-is-here/ba-p/3039827 <P><span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="ksemple_0-1639615639167.png" style="width: 530px;"><img src="https://techcommunity.microsoft.com/t5/image/serverpage/image-id/334256i39D7E287D9DE3A8B/image-dimensions/530x253?v=v2" width="530" height="253" role="button" title="ksemple_0-1639615639167.png" alt="ksemple_0-1639615639167.png" /></span></P> <P>&nbsp;</P> <P> </P> <P>We are happy to announce that <STRONG>MRTK 2.7.3</STRONG> is now available for download via GitHub and the <A href="#" target="_self">Mixed Reality Feature Tool</A>. This update contains a number of engine and plugin compatibility updates for Unity 2020 and Unity 2021, bug fixes, minor asset changes, docs updates, and other small changes. 
For a full list of issues and PRs covered in this release, check out the milestone list on GitHub&nbsp;<A href="#" target="_self">here</A>.</P> <P>&nbsp;</P> <P>If you would like to read the release notes online, check out the <A href="#" target="_self">release page</A>.</P> <P>&nbsp;</P> <P>The MRTK team has been hard at work on the next major iteration of the toolkit, MRTK v3. We are very excited to share a public preview with you in 2022. For information on our upcoming releases, including the v3 public preview, check out the <A href="#" target="_self">roadmap page</A>.</P> <P>&nbsp;</P> <P>Want to learn more about the Mixed Reality Toolkit? Check out our docs at <A href="#" target="_self">aka.ms/mrtk</A>.</P> Thu, 16 Dec 2021 21:25:16 GMT https://gorovian.000webhostapp.com/?exam=t5/mixed-reality-blog/mrtk-2-7-3-is-here/ba-p/3039827 ksemple 2021-12-16T21:25:16Z