Making mixed reality: a conversation with Alexandros Sigaras and Sophia Roshal

Dr. Olivier Elemento (left), alongside his Ph.D. students Neil Madhukar and Katie Gayvert, analyzes medical network data (photo courtesy of the Englander Institute for Precision Medicine)
Welcome! This is Making mixed reality, a series celebrating the passionate community creating apps and experiences with Windows Mixed Reality. Here, developers, designers, artists (and more!) share how and why they got started, as well as their latest tips. We hope this series inspires you to join the community and get building!
Meeting Alexandros Sigaras and Sophia Roshal was a lot like mixed reality: a digital-physical fusion. It first happened through a flurry of tweets and emails as Alexandros, a senior research associate at Weill Cornell Medicine (WCM), and Sophia, a WCM software engineer, rapidly prototyped a Microsoft HoloLens application to advance the Englander Institute for Precision Medicine’s “cancer moonshot,” a promise to empower better and faster cancer research, data collaboration, and accessible care. Soon after, I was lucky enough to demo their project in person. It’s now in the Windows Store as Holo Graph, an app that lets researchers bring their own network data into the real world to explore it, manipulate it, and collaborate with other researchers in real time, be they in the same room or on the other side of the planet.
Find out what makes this team tick, and how they make big data approachable with Windows Mixed Reality.
Sophia Roshal looks at a graph of medical data (photo courtesy of the Englander Institute for Precision Medicine)
Why HoloLens, and why Windows Mixed Reality?
Sophia: It’s the logical next step. You’re constantly switching from window to window [on a PC]. With HoloLens, you stay in one place. You can just point at something; you don’t have to use your mouse. It’s just so much more of a natural environment, which is great.
The best part of mixed reality for me is seeing other people try it for the first time. They are surprised how well interactions between the real world and holograms work, and are excited to see new updates. The most exciting part is to see the endless possibilities of mixed reality. From games to medical research, there are still many applications of mixed reality to explore.
Alexandros: One of the key questions we get every single time we show HoloLens to someone who is already an avid developer is, “Why HoloLens, and not a 2D screen? Why does this revolutionize our work?” The key answer is simplicity: connecting the dots. The amount of high-quality data you can parse through with holograms is significantly more than what you could put in a table and fuse together in your brain! Tangibility and collaboration are the biggest improvements. It’s like saying the mouse and the keyboard are absolutely great, but phone touch screens are a better user interface. We treat HoloLens as a technology that allows us to go to a higher level, make things more tangible, and remove the challenge of making connections in your brain, because you actually see and manipulate them.

Using @Hololens for realtime collaborative & interactive visualization on metabolomic networks @RoshalSophia @ElementoLab @ksuhre pic.twitter.com/JP60UH8Wrs
— Alex Sigaras (@AlexSigaras) August 14, 2017

Who uses Holo Graph?
Alexandros: In a nutshell, the end users for Holo Graph are computational biologists, clinicians, and oncologists. Instead of looking at “big data” in a two-dimensional structure, they immerse themselves in it and explore and focus on their areas of interest in 3D. There are two scenarios we currently use Holo Graph for: one is cancer research and genomics, and the other is metabolomics.
For cancer research, it’s for drug discovery. We want to find how specific drugs relate to specific genes. With our app, I can upload my network that has all of this correlating information, and I can explore it, manipulating and changing the ways I look at the data. If I click on a hologram of a drug, I’ll see the drug’s most up-to-date information pulled directly from the Food and Drug Administration (FDA) through an API. If I click on a gene, gene cards will tell me more about that specific gene.
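The FDA lookup Alexandros mentions could work roughly like the sketch below: a Unity/C# coroutine that queries openFDA’s public drug-label endpoint when a drug node is selected. This is only an illustration under assumptions – the exact openFDA query syntax and the `OnDrugNodeSelected`/`ShowDrugCard` names are hypothetical, not Holo Graph’s actual implementation.
```csharp
// Minimal sketch only: Holo Graph's actual FDA integration is not public.
// Assumes openFDA's public drug-label endpoint; the query syntax may need adjusting.
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

public class DrugInfoLookup : MonoBehaviour
{
    // Called when the user selects a drug node in the network (hypothetical hook).
    public void OnDrugNodeSelected(string drugName)
    {
        StartCoroutine(FetchLabel(drugName));
    }

    private IEnumerator FetchLabel(string drugName)
    {
        // Hypothetical openFDA query: latest label for the selected drug.
        string url = "https://api.fda.gov/drug/label.json?search=openfda.brand_name:%22"
                     + UnityWebRequest.EscapeURL(drugName) + "%22&limit=1";

        using (UnityWebRequest request = UnityWebRequest.Get(url))
        {
            yield return request.SendWebRequest();

            if (!string.IsNullOrEmpty(request.error))
            {
                Debug.LogWarning("FDA lookup failed: " + request.error);
                yield break;
            }

            // Hand the raw JSON to a (hypothetical) floating info card for display.
            ShowDrugCard(drugName, request.downloadHandler.text);
        }
    }

    private void ShowDrugCard(string drugName, string labelJson)
    {
        // Placeholder: parse the JSON and populate a holographic info panel here.
        Debug.Log("Label data for " + drugName + ": " + labelJson.Length + " bytes");
    }
}
```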
The other use case, which we’re working on with our colleague Dr. Karsten Suhre from Weill Cornell Medicine in Qatar, is metabolomics. Dr. Suhre has identified connections between metabolites, genes, and diseases such as Crohn’s disease and diabetes. Using Holo Graph, he can browse the network and identify unexpected paths through it. One of the latest videos we shared showed collaborative sharing and manipulation of this very network on Crohn’s disease, looking for unexpected connections to other diseases.
The Holo Graph team comes from the Englander Institute for Precision Medicine at Weill Cornell Medicine and Weill Cornell Medicine in Qatar. From left: Sophia Roshal, Dr. Karsten Suhre, Dr. Olivier Elemento, Dr. Andrea Sboner, and Alexandros Sigaras. (Photo courtesy of the Englander Institute for Precision Medicine)
When did you get started building and designing for Windows Mixed Reality? Any tips for others just beginning? 
Alexandros: We became interested in the platform about two years ago and were delighted to be included in the first wave of HoloLens devices that shipped. Whether on an online forum or at a meetup, there are a lot of talented people happy to help you get there and share their experience. Don’t try to reinvent the wheel. You will be surprised how many questions have already been answered before! As far as tips go, download and try out apps from the Windows Store, and make sure to reach out to the community with any questions.
Sophia: Fragments was the biggest inspiration for me because the virtual characters sit on actual chairs. We recently updated our app to include avatars that do the same thing, and it was really cool to see that! Our avatars follow the person’s movement, and we also use spatial mapping to find the floor, because the only point of reference on HoloLens is the head. There are MixedRealityToolkit scripts that find planes; you can take the lowest one as the floor. Then you calculate the height between the head and the floor and map the avatar from there.
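Sophia’s floor-finding trick can be sketched in a few lines of Unity/C#. The snippet below is a rough illustration, not Holo Graph’s code: it assumes a spatial-mapping script (such as the MixedRealityToolkit plane helpers she mentions) has already produced the heights of the horizontal planes it found, picks the lowest one as the floor, and measures the user’s height from the head (the camera) down to it.
```csharp
// Rough sketch of the approach Sophia describes; assumes a plane-finding script
// (e.g., the MixedRealityToolkit surface-to-planes helpers) has already run and
// handed us the world-space Y positions of the horizontal planes it found.
using System.Collections.Generic;
using System.Linq;
using UnityEngine;

public class AvatarFloorPlacement : MonoBehaviour
{
    public Transform avatar; // The remote collaborator's avatar.

    public void PlaceAvatar(List<float> planeHeights, Vector3 remoteHeadPosition)
    {
        if (planeHeights == null || planeHeights.Count == 0)
        {
            return; // No spatial-mapping data yet.
        }

        // The lowest detected plane is taken to be the floor.
        float floorY = planeHeights.Min();

        // On HoloLens the only tracked point is the head (the camera),
        // so the user's height is head height minus floor height.
        float headY = Camera.main.transform.position.y;
        float userHeight = headY - floorY;

        // Stand the avatar on the floor under the remote user's head position,
        // scaled to that user's height (the 1.7 m reference is an arbitrary assumption).
        avatar.position = new Vector3(remoteHeadPosition.x, floorY, remoteHeadPosition.z);
        avatar.localScale = Vector3.one * (userHeight / 1.7f);
    }
}
```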
Alexandros: The Holograms 240 tutorial, with its avatars, and Mixed Reality 250, with sharing across devices, are excellent examples of these capabilities.
Sophia: For someone just beginning, the Mixed Reality Academy is by far the easiest way to start building apps. MixedRealityToolkit is the main tool I use. I write my own scripts most of the time, but if you’re just starting, you have to get it!
What inspires you?
Sophia: The impact on our patients’ lives. Seeing your code make a positive impact on the life of someone battling cancer is one of the most rewarding experiences ever. The Englander Institute for Precision Medicine is using cutting-edge technology to go through a sea of data and provide our care team and their patients with better treatment options. We believe that AI and devices such as HoloLens are just beginning to show their true potential, and we look forward to what’s yet to come.
Alexandros: And it’s all about the power of people. The headset doesn’t save someone’s life; the clinician does. But mixed reality helps them see the patterns and get there. With HoloLens we want to answer, “If I were to show you this before you made the call, would that change anything? Would it make your decision and response faster? Would it give you more data?” And every person we ask nods their head and says, “Yes, it’s right there. It’s almost like I can touch it.” Clinicians are used to reviewing genomic reports that can span hundreds of pages, requiring significant time and effort. This doesn’t have to be the case, though. HoloLens can act as a catalyst by significantly reducing that review time and making an impact at scale when combined with other tools, such as the AI, machine learning, and deep learning work we also do at the Institute.
Holo Graph can help researchers identify patterns in networks (photo courtesy of The Englander Institute for Precision Medicine)
How are you getting data into your application? Any tips for those who want to do data visualization with HoloLens?
Sophia: The easiest way to load dynamic data into an application is through a cloud integration such as OneDrive or Dropbox. When you share data across the network with other users, you need to consider secure transfer and adopt standard formats. Holo Graph currently supports .csv and XML/GraphML formats on OneDrive, and we tend to share data across the network using JSON.
OneDrive loading has been a great surprise. We used to add files to the backend; with OneDrive, anyone who wants to can now load their own data into the experience.
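If you want to try something similar, here is one minimal way to represent a node-link network and serialize it to JSON with Unity’s JsonUtility for sharing between devices. The field names are made up for illustration and are not Holo Graph’s actual schema.
```csharp
// Illustrative only: a hypothetical node-link schema, not Holo Graph's actual format.
using System;
using System.Collections.Generic;
using UnityEngine;

[Serializable]
public class GraphNode
{
    public string id;     // e.g., a gene symbol or drug name
    public string type;   // e.g., "gene", "drug", "metabolite"
}

[Serializable]
public class GraphEdge
{
    public string source; // id of the source node
    public string target; // id of the target node
    public float weight;  // e.g., correlation strength
}

[Serializable]
public class NetworkGraph
{
    public List<GraphNode> nodes = new List<GraphNode>();
    public List<GraphEdge> edges = new List<GraphEdge>();

    // JsonUtility handles [Serializable] classes with public fields,
    // which keeps the payload simple to send to other HoloLens users.
    public string ToJson()
    {
        return JsonUtility.ToJson(this);
    }

    public static NetworkGraph FromJson(string json)
    {
        return JsonUtility.FromJson<NetworkGraph>(json);
    }
}
```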
Alexandros: Data and data privacy are of the utmost importance to the Institute. The real value of Holo Graph is not just about its looks; it’s about empowering researchers to get their real data in securely. As far as visualizing the data goes, my tip is to let your users break out of the 2D window and put their data in their environment, on their own terms.
I’m ending with a favorite quote from our conversation. Spoiler alert: mixed reality’s got game!
Alexandros: The way I explain mixed reality is this: Imagine you have a virtual basketball. If you’re throwing it onto the real ground, it bounces off because it knows where the ground is, and there’s “friction.” You can repeat this 100 times and it would happen the same way – you can expect it. It’s literally bringing digital content into real life, allowing you to bend the rules.
Sophia and Alexandros are seriously inspiring. You can connect with Sophia and Alex on Twitter @RoshalSophia and @AlexSigaras.
Want to get started #MakingMR? You can always find code examples, design guides, documentation, and more at Windows Mixed Reality Dev Center. Want more? Check out mixed reality design insights on Microsoft Design Medium. Inspiration abounds!

Windows Mixed Reality holiday update

We are on a mission to help empower every person and organization on the planet to achieve more, and one of the ways we are doing that is through the power of mixed reality. Since January 21, 2015, when we announced Microsoft HoloLens, we have watched developers, partners, and customers innovate in ways we have never seen before. As a creator, it is inspiring to see the world embrace mixed reality; to see organizations and developers stretch the boundaries of what we can do with technology. Together we have created the most vibrant mixed reality community out there, and it has been phenomenal to share in this journey with our community.
A little over ten months ago our mixed reality journey took a leap forward when we announced that the world’s largest PC makers were working with us to democratize virtual reality this holiday with Windows Mixed Reality headsets.  
This week at IFA we will be sharing more details on Windows Mixed Reality headsets and I encourage everyone to tune into the news that will be coming out of Berlin.  
As we get ready for a big week with our partners, I would like to share with you some exciting details about our product, including the first wave of content experiences you can immerse yourself in this holiday.
Windows Mixed Reality: easy setup, affordable gear, a range of hardware choices, and immersive experiences 
Easy setup
Existing high-end VR headsets with external cameras are cumbersome to set up. For more people to experience VR, one of the barriers that needed to be removed was the need for external markers. That is why the Windows Mixed Reality headsets coming this holiday will be the first to deliver VR experiences with built-in sensors that track your physical position, without requiring you to purchase and install external sensors in your room. You don’t need to spend hours setting up a single room in your house with a large play space; just plug and play. This also means these experiences are portable – whether you are traveling for work or visiting friends and family, just pack your PC and headset and you can share the magic of mixed reality.
Affordable gear
A variety of Windows Mixed Reality headsets and motion controllers will be available this holiday from HP, Lenovo, Dell, and Acer. Headset and motion controller bundles will start as low as $399 and will be compatible with existing and new PC models starting at $499. Along with our partners, we are committed to making mixed reality affordable.
Range of hardware choices
When it comes to deciding which hardware is right for you, we know that our customers value choice in brand, industrial design, and features. That is why we created Windows Mixed Reality as a platform for you to enjoy experiences across multiple devices that meet your mobility and performance needs.  
We have talked a lot about the headsets and motion controllers; now let’s talk about the PCs that will be compatible with Windows Mixed Reality. This holiday, customers can choose the PC that’s right for them – a Windows Mixed Reality PC or a Windows Mixed Reality Ultra PC. Here is some context on what makes the two experiences different:
Windows Mixed Reality PCs: will consist of desktops and laptops with integrated graphics.  When plugged into these devices, our immersive headsets will run at 60 frames per second.  
Windows Mixed Reality Ultra PCs: will consist of desktops and laptops with discrete graphics. When plugged into these devices, our immersive headsets will run at 90 frames per second.   
Both configurations will support today’s immersive video and gaming experiences such as traveling to a new country, exploring space, swimming with dolphins, or shooting zombies.  Use your Windows Mixed Reality motion controllers to enjoy a world of discovery and imagination this holiday.  
Immersive experiences
We are working with an incredible set of partners to bring the most immersive experiences to the Windows Store. First, we are excited to announce the first wave of content partners coming to Windows Mixed Reality. Second, it’s my pleasure to let you know that we are working with 343 Industries to bring future Halo experiences into mixed reality.  We are not providing specifics right now, but it is going to be a lot of fun to work with them. 

In addition, I am thrilled to announce that Steam content will also run on Windows Mixed Reality headsets. Virtual reality enthusiasts know that Steam is a great place to enjoy cutting edge immersive experiences. We can’t wait to bring their content to you.   
Here is a sneak peek into what you can expect this holiday. We are just getting started and we are honored to work with world class creators and developers.  

Mixed reality is the future, and we want everyone on the journey with us. For customers, we are creating the best, most affordable mixed reality headsets and bringing you immersive experiences that you will love. For developers, we are making it easy to create great content spanning from simple augmented reality to virtual reality and, of course, holograms.
There is more in store for this holiday and I look forward to sharing more details with you in the coming weeks! In the meantime, don’t hesitate to get in touch with me on Twitter, and please continue to share your feedback with us. 
Alex

Making mixed reality: a conversation with Lucas Rizzotto

I first met Lucas Rizzotto at a Microsoft HoloLens hackathon last December, where he and his team built a holographic advertising solution. Fast forward to August, and he’s now an award-winning mixed reality creator, technologist, and designer with two HoloLens apps in the Windows Store: MyLab, a chemistry education app, and CyberSnake, a game that makes the most of spatial sound…and holographic hamburgers. Little did I know, Lucas had no idea how to code when he started. Today, he shares how he (and you!) can learn to design for mixed reality, as well as some tips for spatial sound. Dig in!

Why HoloLens, and why Windows Mixed Reality?
It’s the future! Having the opportunity to work with such an influential industry in its early days is a delightful process – not only is it incredibly creatively challenging, you can really have a say in what digital experiences and computers will look like 10 or 20 years from now – so it’s packed with excitement, but also responsibility. We are designing the primary way most people will experience the world in the future, and the HoloLens is the closest thing we’ve got to that today.
The community of creators around this technology right now is also great – everyone involved in this space is in love with the possibilities and wants to bring their own visions of the future to light. Few things beat working with people whose primary fuel is passion.
How did you get started developing for mixed reality?
I come mostly from a design background and didn’t really know how to code until two years ago – so I started by teaching myself C# and Unity to build the foundation I’d need to make the things I really wanted to make. Having that development knowledge today really helps me understand my creations at a much deeper level, but the best part is how it gives me the ability to test crazy ideas really quickly and independently – which is extremely useful in a fast-paced industry like MR.
HoloLens-wise, the HoloLens Slack community is a great place to be – it’s very active and full of people who’ll be more than happy to point you in the right direction, and most people involved in MR are part of the channel. Other than that, the HoloLens forums are also a good resource, especially if you want to ask questions directly to the Microsoft engineering team. Also, YouTube! It has always been my go-to for self-education. It’s how I learned Unity and how I learned a ton of the things I know about the world today. The community of teachers and learners there never ceases to amaze me.
Speaking of design, how do you design in mixed reality? Is anything different?
MR is a different beast that no one has figured out quite yet – but one of the key things I learned is that you need to give up a little bit of control in your UX process and design applications to be more open-ended. We’re working with human senses now, and preferences vary wildly from person to person. We can’t micro-manage every single aspect of the UX like we do on mobile – some users will prefer voice commands, others will prefer hand gestures; some users get visually overwhelmed quickly, while others thrive in the chaos. Creating experiences that can suit the whole spectrum is increasingly essential in the immersive space.
3D user interfaces are also a new challenge and quite a big deal in MR. Most of the UI we see in immersive experiences today (mine included!) is still full of buttons, windows, tabs, and other visual metaphors left over from our 2D era. Coming up with new 3D metaphors that are visually engaging and more emotionally meaningful is a big part of the design process.
Also, experiment. A lot. Code up interactions that sound silly, and see what they feel like once you perform them. I try to do that even if I’m building a serious enterprise application. Not only is this a great way to find and create wonder in everything you build, it will usually give you a bunch of new creative and design insights that you would never be able to stumble upon otherwise.
An example – recently I was building a prototype for a spiritual sequel to CyberSnake in which the player is a Cybernetic Rhinoceros, and I had to decide what the main menu looked like. The traditional way to set it up would be to have a bunch of floating buttons in front of you that you can air tap to select what you want to do – but that’s a bit arbitrary, and you’re a Rhino! You don’t have fingers to air tap. So instead of pressing buttons from a distance, players are prompted to bash their heads against the menu options and break them into a thousand pieces.
This interaction fulfills a number of roles. First of all, it’s fun, and people always smile in surprise the first time they destroy the menu. Secondly, it introduces them to a main gameplay element (in the game, players must destroy a number of structures with their head), which serves as practice. Thirdly, it’s in character! It plays into the story the app is trying to tell, and the player immediately becomes aware of what they are from that moment forward and what their goal is. With one silly idea, we went from a bland main menu to something new that’s true to the experience and highly emotionally engaging.
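For anyone curious how an interaction like that can be wired up, the sketch below is one generic Unity/C# approach, not Lucas’s code: the menu option carries a trigger collider, a small collider tagged “Head” rides on the main camera, and smashing into the option swaps it for a pre-fractured prefab and fires the selection. The component, tag, and prefab names are all assumptions.
```csharp
// Hypothetical sketch of a "head-bash" menu option. Unity only sends trigger
// events if one of the colliders involved has a Rigidbody; a kinematic Rigidbody
// on the head collider (attached to the main camera and tagged "Head") works.
using UnityEngine;
using UnityEngine.Events;

[RequireComponent(typeof(Collider))]
public class HeadBashMenuOption : MonoBehaviour
{
    public GameObject shatteredVersion; // Pre-fractured prefab to swap in.
    public UnityEvent onSelected;       // e.g., start the game, open options.

    private void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("Head"))
        {
            return; // Only the player's head can smash the menu.
        }

        // Swap the intact option for its shattered counterpart so the pieces fly.
        if (shatteredVersion != null)
        {
            Instantiate(shatteredVersion, transform.position, transform.rotation);
        }

        onSelected.Invoke(); // Treat the smash as the menu selection.
        Destroy(gameObject);
    }
}
```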
HoloLens offers uniquely human inputs like gaze, gesture, and voice. So different from the clicks and taps we know today! Do you have a favorite HoloLens input?
Gazing is highly underestimated and underused – it implies user intention, and there’s so much you can do with it. A healthy combination of voice, hand gestures, and gaze can make experiences incredibly smooth, with contextual menus that pop in and out whenever the user stares at something meaningful. This will be even truer once eye-tracking becomes the standard in the space.
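Lucas’s gaze-plus-dwell idea can be illustrated with a generic Unity sketch: ray-cast along the head’s forward vector every frame, and once the gaze has rested on the same object for a moment, spawn a contextual menu next to it. The dwell time and the `contextMenuPrefab` are placeholder assumptions, not code from his apps.
```csharp
// Generic gaze-dwell sketch: raycast from the head each frame and show a
// contextual menu once the user has stared at the same object long enough.
// A real app would also track whether a menu is already open for the target.
using UnityEngine;

public class GazeDwellMenu : MonoBehaviour
{
    public float dwellSeconds = 1.0f;    // How long a stare counts as intent.
    public GameObject contextMenuPrefab; // Hypothetical menu to spawn.

    private GameObject currentTarget;
    private float dwellTimer;

    private void Update()
    {
        Transform head = Camera.main.transform;
        RaycastHit hit;

        if (Physics.Raycast(head.position, head.forward, out hit, 10.0f))
        {
            if (hit.collider.gameObject == currentTarget)
            {
                dwellTimer += Time.deltaTime;
                if (dwellTimer >= dwellSeconds)
                {
                    ShowMenuFor(currentTarget);
                    dwellTimer = 0f;
                }
            }
            else
            {
                // Gaze moved to a new object; restart the dwell timer.
                currentTarget = hit.collider.gameObject;
                dwellTimer = 0f;
            }
        }
        else
        {
            currentTarget = null;
            dwellTimer = 0f;
        }
    }

    private void ShowMenuFor(GameObject target)
    {
        // Place the menu just above whatever the user was looking at, facing the head.
        Vector3 toTarget = target.transform.position - Camera.main.transform.position;
        Instantiate(contextMenuPrefab,
                    target.transform.position + Vector3.up * 0.1f,
                    Quaternion.LookRotation(toTarget));
    }
}
```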
What do you want to see more of, design wise?
I want to be more surprised by the things MR experiences make me do and feel challenged by them! Most of the stuff being done today is still fairly safe – people seem to be more focused on trying to find ways to make the medium monetizable instead of discovering its true potential first. I live for being surprised, and want to see concepts and interactions that have never crossed my mind and perfectly leverage the device’s strengths in new creative ways.
Describe your process for building an app with Windows Mixed Reality.
I try to have as many playful ideas as I possibly can on a daily basis, and whenever I stumble upon something that seems feasible in the present, I think about it more carefully. I write down the specifics of the concept with excruciating detail so it can go from an abstraction into an actual, buildable product, then set the goals and challenges I’ll have to overcome to make it happen – giving myself a few reality checks on the way to make sure I’m not overestimating my abilities to finish it in the desired time span.
I then proceed to build a basic version of the product – just the essential features and the most basic functionality. Here I usually get a sense of whether the idea works at the most basic level and whether it’s something I’d like to keep pursuing. If it seems promising, then the wild experimentation phase begins. I test out new features, approach the same problem from a variety of angles, try to seize any opportunities for wonder, and make sure I know the “Why?” behind every single design decision. Keep doing this until you have a solid build to test with others, but without spending too much time on this phase; otherwise projects never get done.
In user testing, you can get a very clear view of what you have to improve, and I pay close attention to the emotional reactions of users. Whenever you see a positive reaction, write it down and see if you can intensify it even further in development. If users show negative emotional reactions, find out what’s wrong and fix it. If they’re neutral through and through, then reevaluate certain visual aspects of your app to find out how you can put a positive emotion on their face. Reiterate, polish, finish – and make a release video of it so the whole world can see it. Not everyone has access to an immersive device yet, but most people sure do have access to the internet.

CyberSnake’s audio makes players hyper-aware of where they are in the game. Can you talk about how you approached sound design? After all, spatial sound is part of what makes holograms so convincing.
Sound is as fundamental to the identity of your MR experience as anything else, and this is a relatively new idea in software development (aside from games). Developers tend not to pay too much attention to sound because it has been, for the most part, ignored in the design process of websites and mobile applications. But now we’re dealing with sensory computing and sound needs to be considered as highly as visuals for a great experience.
CyberSnake uses spatial audio in a number of useful ways – whenever the user’s head gets close to their tail, for example, the tail emits an electric buzz that gets louder and louder, signaling both the danger and where it’s coming from. Whenever you’re close to a burger, directional audio also reinforces the location of the collectibles and where the user should be moving their head. These bits of audio help the user move and give them a new level of spatial awareness.
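The tail buzz Lucas describes maps naturally onto a spatialized, looping AudioSource whose volume rises as the head approaches. The snippet below is a generic Unity approximation with arbitrary distance values, not CyberSnake’s tuning; on HoloLens you would also enable the Microsoft HRTF spatializer on the AudioSource for true spatial sound.
```csharp
// Generic approximation of a proximity warning buzz: a spatialized, looping
// AudioSource on the snake's tail that gets louder as the player's head nears it.
// Distance thresholds are arbitrary illustration values, not CyberSnake's tuning.
using UnityEngine;

[RequireComponent(typeof(AudioSource))]
public class ProximityBuzz : MonoBehaviour
{
    public float maxDistance = 2.0f; // Beyond this, the buzz is silent.
    public float minDistance = 0.2f; // At this range, the buzz is at full volume.

    private AudioSource buzz;

    private void Start()
    {
        buzz = GetComponent<AudioSource>();
        buzz.loop = true;
        buzz.spatialBlend = 1f; // Fully 3D so the sound carries direction.
        buzz.Play();
    }

    private void Update()
    {
        // The head is the camera on HoloLens.
        float distance = Vector3.Distance(Camera.main.transform.position, transform.position);

        // Louder the closer the head gets; InverseLerp clamps to [0, 1],
        // so the buzz is silent outside maxDistance.
        buzz.volume = Mathf.InverseLerp(maxDistance, minDistance, distance);
    }
}
```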
Sound is an amazing way to reinforce behaviour – a general rule of thumb is to always have a sound react to anything the user does, and to make sure that the “personality” of that sound also matches, thematically, the action the user is performing. If you’re approaching sound correctly, the way something looks and moves will be inseparable from the way it sounds. In the case of CyberSnake, there was some good effort to make sure that the sounds fit the visuals, the music, and the general aesthetic – I think it paid off!
Spending time designing your own sounds sounds like a lot of work, but it really isn’t. Grab a MIDI controller and some virtual instruments and dabble away until you find something that seems to fit the core of what you’re building. Like anything else, it all comes down to experimentation.
What’s next for you?
A number of things! I’m starting my own Mixed Reality Agency in September to continue developing MR projects that are both wondrous and useful at a larger scale. I’m also finishing my Computer Science degree this year and completing a number of immersive art side projects that you’ll soon hear about – some of which you may see at a couple of major film festivals. So stay in touch – good things are coming!
As always, I’m impressed and inspired by Lucas’s work. You can connect with Lucas on Twitter @_LucasRizzotto and his website, where you’ll find nuggets of gold like his vision for mixed reality and AI in education. And maybe even his awesome piano skills.
Learn more about building for Windows Mixed Reality at the Windows Mixed Reality Developer Center.
Lucas is right about spatial sound—it adds so much to an experience—so I asked Joe Kelly, Microsoft Audio Director working on HoloLens, for the best spatial sound how-tos. He suggests using the wealth of resources on Windows Mixed Reality Developer Center. They’re linked below—peruse and use, and share what you make with #MakingMR!
Spatial sound overview
Designing/implementing sounds
Unity implementation
Programming example video (AudioGraph)
GitHub example (XAudio2)