Made with MRTK is a monthly series where we share sample projects that leverage the Mixed Reality Toolkit (MRTK). You’re welcome to use the resources provided to create your own experiences or iterate on the samples. For those new to MRTK, the toolkit provides a set of components and features used to accelerate cross-platform MR development in Unity. To learn more, visit aka.ms/mrtk!
I’ve spent a lot of time lately looking into ways to enhance the experience of viewing art in a gallery. While most galleries and museums these days have either a tour led by a docent or QR codes that can be scanned to view more information on a mobile device, I thought it’d be awesome to push things one step further and introduce augmented reality (AR) into the mix. There’s a wealth of opportunity that awaits once the door to immersive experiences is opened!
I initially checked out examples of gallery and museum AR concepts on Dribbble for inspiration. Dribbble is my favorite go-to resource whenever I’m in need of ideas for projects to create with AR. Every so often I’ll come across an idea that a creator has fully prototyped using AR. However, in most cases the examples I find are concepts put together in programs such as Photoshop, Illustrator, or video editing software. So where does that leave me? Well, it leads me to finding out just how I can turn someone’s idea into a proof of concept!
After an hour of scrolling through some very interesting submissions from the community, I was able to narrow down the gallery/museum AR experience into two categories:

- Using AR to display additional information about physical artwork
- Using AR to display the art itself
The latter is a cool concept to see in person. I first saw art in AR at Refinery29’s 29Rooms. I’ll cover that concept in a future post! For now, however, I chose to focus on using AR to display additional information about the artwork.
Since the goal was to create a proof of concept, I didn’t find it necessary to venture over to LACMA for this project. Granted, that would’ve been a cool reason to visit the galleries! Instead, I decided to leverage the artwork that I currently have in my home office. Most of the artwork I own was created by the art duo Oliver Gal. The two sisters have an aesthetic that resonates well with my luxury fashion background and the various fashion décor of my office. I wanted to take three of my canvas prints and overlay AR elements on top to mimic the look and feel of what one would expect to see in a gallery, thus creating my own at-home gallery of art.
For starters, I needed to add a placard that listed the name of the print, the artist, and information about the art material. Next, I wanted a button that could be pressed to view more information about the artists. Rather than display everything at once, I wanted the additional artist information to be visible only upon button press; otherwise, the wall would be cluttered with too much digital content. The only thing that now stood between me and my experience was finding a way for the app to recognize the artwork and display the correct information upon recognition.
I decided to go with Vuforia to track images. Vuforia uses what are known as Image Targets to recognize real-world images and augment them with digital content. What’s even more awesome is that Vuforia supports using multiple Image Targets simultaneously! This worked out great for my use case, given that I wanted to use three pieces of artwork for my home gallery.
Although Vuforia supports this functionality on mobile devices, I decided to use my HoloLens so that I could have direct hand interactions with the MRTK features. Fortunately, Vuforia provides a HoloLens 2 Sample that can be imported into Unity and configured to track whichever images you desire. This cuts out a lot of the manual setup that you’d encounter if you were to start from scratch. There’s also an article available which details how to configure and try the sample on the device. If I may suggest, be sure to check out the MRTK documentation on adding MRTK to a Unity project with the Mixed Reality Feature Tool so that you have the full picture of how to get started. The MRTK packages do not ship with the sample and therefore the toolkit needs to be manually added to the project via the Mixed Reality Feature Tool.
In short, once you have the sample open and the MRTK foundation package imported, you can swap out the images provided in the sample with your own. The first thing you’ll need to do is upload your images to be tracked. Within the Target Manager, you’ll need to add a database for your images and upload an image for each real-world image to be tracked.
Since I was using artwork that Oliver Gal currently sells, I went onto the website and saved the corresponding product images. I had a tiny hiccup while uploading – my images weren’t in the correct color format. Targets must be either 8-bit grayscale or 24-bit RGB. I resolved this by opening the images in Adobe Photoshop, selecting Export > Save for Web (Legacy) and choosing PNG-24.
After the images are uploaded, you'll need to download the database for the Unity Editor. Back over in Unity, you’ll import the Unity package for the database. Once imported, you’ll be able to use the image targets that were uploaded in the portal.
Within the 2-ImageTargets scene is a VuforiaContent GameObject that contains each Image Target within the sample. To swap out the images, select one of the targets and within the Inspector, change the Database to your database and select the appropriate Image Target from the list.
Once you have your images selected, you can add objects to display upon recognition as children of the image.
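In the sample, each Image Target comes with an event handler that shows and hides its children as tracking is gained and lost. If you’d rather control that behavior yourself (for example, to keep the placard hidden until the artwork is recognized), here’s a minimal sketch of how it could look. This assumes the Vuforia Engine 10.x API (`ObserverBehaviour.OnTargetStatusChanged`); the script and field names are my own, not from the sample:

```csharp
using UnityEngine;
using Vuforia;

// Hypothetical helper: shows this target's AR content (placard, button)
// only while Vuforia reports the image as tracked. Attach it to the same
// GameObject as the Image Target; assign the content root in the Inspector.
public class ArtworkContentToggle : MonoBehaviour
{
    [SerializeField] private GameObject content; // e.g. placard + info button root

    private ObserverBehaviour observer;

    private void Awake()
    {
        observer = GetComponent<ObserverBehaviour>();
        if (observer != null)
        {
            observer.OnTargetStatusChanged += OnStatusChanged;
        }
        // Hide everything until the artwork is actually recognized.
        content.SetActive(false);
    }

    private void OnDestroy()
    {
        if (observer != null)
        {
            observer.OnTargetStatusChanged -= OnStatusChanged;
        }
    }

    private void OnStatusChanged(ObserverBehaviour behaviour, TargetStatus status)
    {
        // Treat both direct and extended tracking as "visible".
        bool tracked = status.Status == Status.TRACKED ||
                       status.Status == Status.EXTENDED_TRACKED;
        content.SetActive(tracked);
    }
}
```

The sample’s built-in handler already covers the common case, so a custom script like this is only worth it when you want per-artwork logic beyond simple show/hide.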
I used both Adobe Illustrator and Canva to create the images for the gallery. I have a love for rounded corners and knew it’d be easiest for me to create assets with rounded corners outside of Unity. Starting with the placard, I created the rounded corner background image in Illustrator. Although setup in Illustrator is minimal, you’ll want to ensure that you’re configuring the 2D sprite properly in Unity before you begin scaling to your preferred size. There’s a Master Rounded Corners for your UI | Unity UI Tutorial on YouTube that I found very helpful for figuring out the configuration. As for the text, I added it onto the placard using the Unity canvas and TextMesh Pro.
As for the additional information about the artists, I found an image of the sisters online and removed its background in Canva. Afterwards, I brought the new image of the sisters into Illustrator and created a background behind them, with a blank white surface just below to provide a place for their bio. Once I imported the image into Unity, I used the Unity canvas and TextMesh Pro once more to add in the text.
Although MRTK 2.8 comes equipped with pre-configured button prefabs, I had a particular look and feel that I wanted to achieve, which required me to hop into Blender and create my own buttons – I really wanted rounded corners! Fortunately, you can leverage the toolkit to create your own buttons from scratch while preserving the integrity of an MRTK button. We have documentation available on how to Make a Button from Scratch.
I will admit, I got a little mixed up while adding visual states for the button. The states worked well when I was using a cube primitive, but I hadn’t realized that I had assigned the incorrect GameObject as the Target for the profile on the Interactable script. With that said, be sure that you’re targeting the correct object when setting up your custom button. Another tricky part was adjusting the Press Settings on the PressableButton script (which works together with the NearInteractionTouchable component). I initially started with my own arbitrary numbers but soon realized that I couldn’t quite see the button being pressed. Instead, I opened the MRTK UX PressableButtonExample scene (within the Mixed Reality Toolkit Examples package) and mimicked the configuration from one of the custom buttons in the sample. That resolved my issue in no time!
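Once the button itself behaves correctly, the show-more-info behavior is just a matter of wiring something to the OnClick event that the MRTK 2.x Interactable component exposes in the Inspector. Here’s a hedged sketch of the kind of toggle you could hook up; the script and panel names are hypothetical, not from the original project:

```csharp
using UnityEngine;

// Hypothetical toggle for the "more about the artists" panel.
// Wire ToggleInfo() to the OnClick event of the MRTK Interactable
// component on the custom button (MRTK 2.x).
public class ArtistInfoToggle : MonoBehaviour
{
    [SerializeField] private GameObject artistInfoPanel; // assumed panel root

    private void Start()
    {
        // Keep the wall uncluttered until the button is pressed.
        artistInfoPanel.SetActive(false);
    }

    public void ToggleInfo()
    {
        artistInfoPanel.SetActive(!artistInfoPanel.activeSelf);
    }
}
```

Using a toggle rather than a one-way reveal means a second press hides the panel again, which keeps the scene from accumulating clutter as the user moves between artworks.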
The last thing I thought would be cool to add was the Hand Coach. I hadn’t used this feature until creating this project. Essentially, it’s an animated hand that appears to guide the user on how to interact with an object in the scene. There are various animations available, which is great given that you’re not limited in how you integrate this feature into your own experience. I chose a hand that demonstrates the press of a button. The HandCoachExample scene (within the Mixed Reality Toolkit Examples package) provides examples of how the hand coach can be configured. I drew inspiration from the Near Select example for my own project. I personally have some more experimenting to do with this feature; however, I think it came out well for first-time use.
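One refinement worth experimenting with is stopping the hint once the user has actually pressed the button, so the animated hand doesn’t keep looping. A rough sketch, assuming the HandInteractionHint script from the MRTK examples package with its StartHintLoop()/StopHintLoop() methods (the namespace and member names may differ across MRTK versions, so treat this as an outline rather than drop-in code):

```csharp
using UnityEngine;
// Namespace assumed; adjust to wherever HandInteractionHint lives in your MRTK version.
using Microsoft.MixedReality.Toolkit.UI;

// Hedged sketch: start the hand coach when the scene loads and stop it
// after the first successful button press, wired via the button's OnClick.
public class HandCoachController : MonoBehaviour
{
    [SerializeField] private HandInteractionHint hint; // hand coach near the button

    private void Start()
    {
        // Begin looping the "press" hint animation.
        hint.StartHintLoop();
    }

    // Wire this to the button's OnClick so the coach disappears after use.
    public void OnButtonPressed()
    {
        hint.StopHintLoop();
    }
}
```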
Overall, it took me roughly 2 days to bring this project to life! I’m quite proud of how this turned out. Check out the video and images below to view the experience in all its glory.
Referring to the configurations within the MRTK Examples package was a time-saver. I’d highly recommend importing that package into a Unity project if you’re unsure how a particular MRTK UX building block works or how to configure one. I’d love to see this project brought to life in a real gallery – even if it’s not with a head-mounted device! As I mentioned earlier, creating such an experience is not limited to HoloLens; you can try this out on mobile as well. Imagine hosting your own AR art gallery exhibit virtually anywhere that your art is mounted. It certainly saves a lot of money on materials! If you do create your own experience, be sure to let us know in the comments or on Twitter at @mxdrealitydev!
Until next time, happy creating!