Tag Archives: mixed reality

Announcing Babylon.js 3.0

Babylon.js is an open source framework that allows you to create stunning 3D experiences in your browser. Built with simplicity and performance in mind, it is the engine that fuels experiences such as the Remix3D site and the Xbox Design Lab.

Today, I’m thrilled to announce the availability of Babylon.js’s biggest version ever: Babylon.js v3.0. This version is packed with incredible features, but before listing some of them, I want to thank the awesome community behind this release. I’m humbled by the number of external contributors (120+) who dedicated their time to help build this great framework. They made this 46th release of Babylon.js possible.

Here is a quick overview of a few features included in this release of the framework.

Support for WebGL 2

WebGL 2 is a great step forward for 3D developers, as it allows more control over the GPU. Support for WebGL 2 is completely transparent with Babylon.js 3.0: the engine automatically uses WebGL 2 if it is available and falls back to WebGL 1 if not. More details can be found here.
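The fallback can be pictured as trying for a WebGL 2 context first and only then settling for WebGL 1. Here is a minimal sketch of that selection logic (illustrative only, not Babylon.js's actual source; the getContext parameter stands in for canvas.getContext so the logic can run outside a browser):

```javascript
// Illustrative sketch of a transparent WebGL 2 fallback (not Babylon.js's
// actual source). getContext stands in for canvas.getContext.
function pickBestContext(getContext) {
  // Try WebGL 2 first, then fall back to WebGL 1 variants.
  for (const name of ["webgl2", "webgl", "experimental-webgl"]) {
    const ctx = getContext(name);
    if (ctx) return { name, ctx };
  }
  throw new Error("WebGL is not supported in this environment");
}

// Example: a browser without WebGL 2 support still gets a WebGL 1 context.
const webgl1Only = (name) => (name === "webgl" ? {} : null);
const picked = pickBestContext(webgl1Only); // picked.name === "webgl"
```

Application code never sees this choice; it simply gets the most capable context the browser offers.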

Support for WebVR 1.1

With a single line of code, you can now unleash the power of virtual reality directly in your browser. Babylon.js 3.0 supports all VR devices, including the new Windows Mixed Reality headsets. Babylon.js can also transparently use WebVR 1.0 if your device does not support the latest version of the specification (Gear VR for instance). It also supports using device orientation events to provide virtual reality on mobile.

You can visit the Build 2017 website to watch a complete session about Babylon.js and WebVR.

You can find the Sponza demo here.

Support for glTF 2.0

glTF is a transmission format for 3D scenes and models designed for GL APIs. With Babylon.js 3.0, we added complete support for loading glTF 2.0 files (including physically based rendering materials).

This version of the format was recently ratified by the Khronos Group. glTF will help the 3D ecosystem enable new ways to create, share, and consume 3D content.
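For context, here is what material data looks like inside a glTF 2.0 file: a hypothetical material (name invented for illustration) using the spec's metallic-roughness parameterization, shown as a fragment of a larger glTF asset:

```json
{
  "materials": [
    {
      "name": "brushedGold",
      "pbrMetallicRoughness": {
        "baseColorFactor": [1.0, 0.84, 0.0, 1.0],
        "metallicFactor": 1.0,
        "roughnessFactor": 0.35
      }
    }
  ]
}
```

A loader with glTF 2.0 support maps these factors directly onto its physically based material, which is what makes assets portable across engines.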

Improved physically based rendering (PBR)

The PBRMaterial used to render physically based objects was completely rewritten. It is now more accurate and better aligned with the glTF 2.0 specification. This material can be used to simulate real-life lighting and produce photorealistic scenes.
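As a rough illustration of what alignment with glTF 2.0 means: the metallic-roughness model derives a shader's diffuse color and specular reflectance (F0) from just a base color and a metallic factor. A sketch of the spec's conversion (illustrative only, not Babylon.js internals):

```javascript
// Sketch of the glTF 2.0 metallic-roughness conversion (illustrative, not
// Babylon.js source). DIELECTRIC_SPECULAR is the ~4% reflectance the spec
// assumes for common non-metals.
const DIELECTRIC_SPECULAR = 0.04;
const lerp = (a, b, t) => a + (b - a) * t;

// Metals reflect with their base color; dielectrics reflect ~4% white.
function specularF0(baseColor, metallic) {
  return baseColor.map((c) => lerp(DIELECTRIC_SPECULAR, c, metallic));
}

// Metals have no diffuse term; dielectrics diffuse most of the base color.
function diffuse(baseColor, metallic) {
  return baseColor.map((c) => c * (1 - DIELECTRIC_SPECULAR) * (1 - metallic));
}
```

Because both inputs are physically meaningful, a material authored once looks consistent under very different lighting environments.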

You can find a live demo here.

Introducing Babylon.GUI

The Babylon.js GUI library is an extension you can use to build interactive user interfaces. It relies on hardware acceleration to provide a fast and lightweight way to handle user interaction.

The Babylon.GUI extension can be helpful with VR scenarios when you cannot display HTML elements. It can also be used to project your UI in 3D. In this case, the UI will be textured on a 3D object but will remain functional.

Support for morph targets

Morph targets are a great way to animate objects by morphing between different targets. This technique is widely used to animate character heads, for instance.
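Conceptually, morph blending computes each final vertex position as the base position plus weighted offsets toward each target. A minimal sketch of the general technique (not Babylon.js internals; positions are flattened as [x, y, z, ...]):

```javascript
// Minimal sketch of morph-target blending (the general technique, not
// Babylon.js internals). Positions are flat arrays: [x0, y0, z0, x1, ...].
function blendMorphTargets(base, targets, influences) {
  return base.map((v, i) => {
    let out = v;
    for (let t = 0; t < targets.length; t++) {
      // Each influence scales the offset from the base toward that target.
      out += influences[t] * (targets[t][i] - v);
    }
    return out;
  });
}

// Example: influence 0.5 lands halfway between the base and the target.
const blended = blendMorphTargets([0, 0, 0], [[1, 2, 3]], [0.5]);
// blended is [0.5, 1, 1.5]
```

Animating the influence values over time, typically on the GPU, is what produces smooth transitions such as facial expressions.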

You can find a technical demo here.

Support for live textures using WebCam

You can now project webcam content onto any texture in your world. This can be used to simulate a mixed reality experience or to apply fun effects, like in this ASCII art demo.

Developer story

Version 3.0 is not only about features. It is also about enabling developers to achieve more with less code.

To fulfill this goal, we introduced a new version of our documentation where you can search for tutorials and class documentation, as well as playground samples.

The playground is a central tool for Babylon.js where you can learn by experimenting with a live coding editor. Using the code panel on the left, you can discover the features of Babylon.js at your own pace, thanks to our integrated code completion helper:

With more than 150,000 samples in the playground, we are sitting on top of an immense knowledge base for developers.

Therefore, we added the playground search section to our documentation to let you search for live code samples (by sample title, description, tags, or code):

We also thought about advanced WebGL developers with Spector.js, a fully functional WebGL (1 and 2) debugger. This tool is a must-have if you deal with WebGL in depth, as it exposes all the inner details of the rendering of every frame in an easy-to-read way:

We really hope you will find this new version useful and exciting. If you want to know more or just want to experiment with our latest demo, please visit www.babylonjs.com.

If you would like to contribute, please join us on GitHub!

Calling all game devs: The Dream.Build.Play 2017 Challenge is Here!

Dream.Build.Play is back! The long-running indie game development contest was on hiatus for a few years, so it’s high time for it to make a resounding return.

The Dream.Build.Play 2017 Challenge is the new contest: it launched on June 27 and challenges you to build a game and submit it by December 31 in one of four categories. We’re not super-picky: you can use whatever technology you choose, so long as your game fits one of the challenges and you publish it as a Universal Windows Platform (UWP) game. It’s up to you to build a quality game that people will line up to play.

The four categories are:

Cloud-powered game – Grand Prize: $100,000 USD

Azure Cloud Services hands you a huge amount of back-end power and flexibility, and we think it’s cool (yes, we’re biased). So here’s your shot at trying Azure out and maybe even winning big. Build a game that uses Azure Cloud Services on the back end, such as Service Fabric, Cosmos DB, containers, VMs, storage, and analytics. Judges will give higher scores to games that use multiple services in creative ways, and will award bonus points for Mixer integration.

PC game – Grand Prize: $50,000 USD

Building on Windows 10, for Windows 10? This is the category for you. Create your best UWP game that lives and breathes on Windows 10 and is available to the more than 450 million users through the Windows Store. It’s simple: Create a game with whatever technology you want and publish it in the Windows Store. We’ll look favorably on games that add Windows 10 features such as Cortana or Inking because we really want to challenge you.

Mixed Reality game – Grand Prize: $50,000

Oh, so you want to enhance this world you live in with something a little…augmented? Virtual? Join us in the Mixed Reality challenge and build a volumetric experience that takes advantage of 3D content in a virtual space. You’ll need to create your game for Windows Mixed Reality, but you can use technology like Unity to get you kickstarted. Oh, and don’t forget the audio to really immerse us in your world.

Console game – Grand Prize: $25,000

Console gamers unite! Want to try your hand at building a game for Xbox? This category is your jam. Your UWP game will be built for the Xbox One console family and must be published through the Xbox Live Creators Program, with at least Xbox Live presence. Consideration will be given to games that incorporate more Xbox Live services, such as leaderboards and statistics.

There are some important dates to be aware of:

  • June 27: Competition opens for registration
  • August 2: Team formation and game submission period opens
  • December 31: Game submission period closes
  • January 2018: Finalists announced
  • March 2018: Winners awarded

We have big things planned for you. Maybe some additional contests and challenges, maybe some extra-cool prizes for the finalists, maybe some extra-cool interviews and educational materials. Once you register, we’ll keep you updated via email, but also keep an eye on our Windows Developer social media accounts.

As I mentioned earlier, you can pretty much use whatever technology you want. Create something from the ground up in JavaScript or XAML or C++ and DirectX. Leverage one of our great middleware partners like Unity, GameMaker, Cocos2D or Monogame. Or do a bit of both – do your own thing and incorporate Mixer APIs into it, Vungle or any one (or more) of our other partners. The biggest thing we want from you is a fun game that’s so enjoyable for us to play that we forget we’re judging it!

Speaking of that, you might be wondering how we judge the games. We have four “big bucket” criteria for you to aim for:

  • Fun Factor – 40%: Bottom line – your game needs to be fun. That doesn’t mean it has to be cutesy or simple. Fun comes in many forms, but we can’t forget what we’re aiming for here – a great game. Take us for a ride!
  • Innovation – 30%: And while you’re taking us on that ride, surprise us! We’re not looking for a clone of an existing game or a tired theme that has been done a bazillion times before. Mash-up two genres. Take a theme and turn it on its head. Don’t restrict your innovation to the game, but also the technology you’re using and how you’re using it. Think outside the box when you incorporate Windows features, or how you can creatively use a service like Mixer.
  • Production Quality – 20%: Games have to be fun and we want them to be innovative, but if they don’t run, then they’re just not ready to be called a game. This scoring criterion is all about making sure your framerate is right, you have audio where you should, you’ve catered for network instability and more. Give us every opportunity to get to your game and enjoy it the way you intended.
  • Business Viability/Feasibility – 10%: And of course, what’s your plan to engage your gaming customers? Do you have a good revenue-generating plan (e.g., in-app purchases, premium charges, marketing, rollouts, etc.)? That’s stuff you might not normally think about, but we’re gonna make you. Because we care.

If you want to get started with UWP game development, you can try our Game Development Guide.

Want more? Check out the introductory .GAME episode here:

So, what are you waiting for? Get in there and register!

We’re expanding the Mixed Reality Partner Program

Just over a year ago at Build 2016, we welcomed 10 digital and creative agencies to develop mixed reality solutions as part of the HoloLens Agency Readiness Partner Program. As part of the program, we provided these partners with technical readiness training to extend their design competencies and help them deliver compelling mixed reality solutions across industries, including education, healthcare, architecture, engineering, construction, and design.

With the expansion of Microsoft HoloLens to more developers and commercial customers around the globe this last March — including those in Australia, Ireland, France, Germany, New Zealand, the United Kingdom, China, and Japan — we welcomed new partners to the program to create the future of mixed reality experiences for customers in those regions. Now more than 30 HoloLens Agency Readiness Partners are producing tangible results such as proofs of concept (POCs), pilots, and deployments of world-class mixed reality solutions, leading the digital transformation for customers like Boeing, Cirque du Soleil, Cleveland Cavaliers, Cylance, Lowe’s, Jabil, Paccar, PGA Tour, Real Madrid, and Stryker.

Today, we are excited to announce that due to growing demand from partners and customers, we have created the Mixed Reality Partner Program, which expands the agency readiness program to welcome systems integrators (SIs) and digital agencies around the world. All existing HoloLens Agency Readiness Partners will be grandfathered into the Mixed Reality Partner Program.

We’ve learned that successful mixed reality solutions are built on great experiences — and those experiences require both a creative design component and a strong competency in application and infrastructure integration and deployment. SIs around the world already know how to build, support, integrate, and extend Microsoft technologies to meet their customers’ business and IT goals. As members of the Mixed Reality Partner Program, these SIs, and digital/creative agencies will play a critical role in building 3D and mixed reality experiences for enterprise commercial customers.

So how can you get involved? Eligible partners will begin a multi-week readiness program that consists of in-depth technical training on mixed reality solutions as well as sales and marketing readiness. Upon completion of the program and a successful customer POC, partners can qualify to receive a wide range of benefits, such as direct access to Microsoft engineering support and mentorship, as well as marketing and sales assistance. Qualified partners that are accepted into the program will participate in joint business planning with the Mixed Reality extended team, which includes engineers, product managers, and field sales and marketing leads.

Partners who already have deep expertise in designing and deploying mixed reality solutions will have the option to take a fast track and immediately work with the Microsoft team on plans to engage customer accounts.

We’re excited for new partners around the globe to join us on this mixed reality journey, and we’re excited to see the mixed reality solutions that our partners create!

Partners can learn about and apply to join the Mixed Reality Partner Program, and if you’re attending Microsoft Inspire this week, I encourage you to join me for a breakout session on July 12 to learn more about the program.


Connecting with partners to empower the modern workplace

This is one of my favorite weeks of the year! It’s a week where Microsoft leaders connect with thousands of partners from around the world to talk about what’s new and the opportunity in front of us in the coming year. Since our last partner conference a year ago, we’ve been working together to better serve customers whose demands of technology are continuously evolving.

We’re on a mission to empower every person and organization on the planet to achieve more in the Modern Workplace. A big part of that mission is delivering a new way for customers to transform their business with modern Microsoft products and services that help make employees more productive, creative and secure.

Just last week we announced new security and management features in Windows 10 that will arrive in the Windows 10 Fall Creators Update.


Today, we’re excited to announce Microsoft 365, a new set of offerings that includes Office 365, Windows 10, and Enterprise Mobility + Security, creating a complete, intelligent, and secure solution that empowers everyone to be creative and work together securely.

We introduced two Microsoft 365 solutions:

  • Microsoft 365 Enterprise includes Office 365 Enterprise, Windows 10 Enterprise, and Enterprise Mobility + Security and is offered in two plans – Microsoft 365 E3 and Microsoft 365 E5.  Both plans provide customers with a complete set of productivity and security capabilities, while Microsoft 365 E5 provides the latest and most advanced innovations in security, compliance, analytics, and collaboration.
  • Microsoft 365 Business is designed for small-to-medium sized businesses (SMB) and includes Office 365 Business Premium, security and management features for Office apps and Windows 10 devices, upgrade rights to Windows 10, and a centralized IT console. It will be available in public preview starting August 2.

For our partners, Microsoft 365 offers exciting new opportunities – from the ability to modernize a customer’s environment through managed services, to the ability to differentiate their offerings with advanced enterprise services. We believe Microsoft 365 will be a further catalyst to drive customer creativity, security and simplicity in their desktop management.

In addition to having end-to-end solutions with Windows 10, Office 365, and Enterprise Mobility + Security, we know customers need the ability to decide how to operate. Today we are announcing that Windows 10 E3 and E5 customers will now have the option to add virtualization use rights to Windows subscriptions in the CSP program starting in September.


Now is the best time to be a Surface partner

Since last year when we first introduced Surface as a Service at Inspire, the program has grown from one partner in the channel to more than 50 partners in 15 markets worldwide. The Surface channel is well equipped to handle the demand that will come from Microsoft 365 customers worldwide and we cannot wait for customers to get the best of the complete Microsoft stack in their hands.

On top of the momentum Microsoft Surface partners have worldwide, Surface is highlighting two new opportunities for partners to get more value out of being a Surface partner: a new Services and Support opportunity as well as a new Surface Reseller Alliance. The partner program for Services and Support was successfully piloted and is now active in 10 countries: the U.S., U.K., Germany, France, Australia, New Zealand, Denmark, Sweden, Norway, and Finland. Japan and other Surface markets will follow over the next few months. Surface also announced a partnership with IBM Technology Support Services (TSS) – one of the world’s leading support providers – to enhance our Microsoft Complete extended warranty offerings and deliver technology services and support for Surface devices.


The Surface Reseller Alliance training program includes a revamped online portal with training modules and live webinars for newly launched Surface products, and will provide incentives for partner sellers to complete the training curriculum. You can read more about it on the Devices Blog.

Expanding the Mixed Reality Partner Program

We’ve also been working to expand our partnership program for Windows Mixed Reality around the world. Today, we are excited to announce the creation of the Mixed Reality Partner Program, welcoming both system integrators (SIs) and digital agencies around the world.


The Mixed Reality Partner Program is an expansion of the HoloLens Agency Readiness Partner Program which was announced just over a year ago at //build 2016. At that time, we welcomed 10 digital and creative agencies to develop mixed reality solutions. As we have expanded Microsoft HoloLens to more developers and commercial customers around the globe, we added new Europe-focused partners to the program. Now with more than 16 HoloLens Agency Readiness Partners, we’re creating the future of mixed reality experiences with partners and customers.

As the technology landscape continues to advance at unprecedented rates, Microsoft and the Windows and Devices Group will be at the forefront with new ideas to enhance the way our partners work with customers.

Hearing from our partners is extremely important and we look forward to talking with you during a great week at Inspire!

Announcing Microsoft Build Tour 2017

Figure 1: Sign up at http://buildtour.microsoft.com

On the heels of the Build conferences these last few years, we have had the pleasure of meeting thousands of developers around the world. Their feedback and technical insight have helped us continue the tradition and explore more technical depth.

Today, I’m excited to announce the Microsoft Build Tour 2017, coming to cities around the globe this June! The Microsoft Build Tour is an excellent way to experience Microsoft Build news first-hand, and also work directly with Microsoft teams from Redmond and your local area.

This year, we’re bringing the Microsoft Build Tour to these locations:



  • June 5-6: Shanghai, China
  • June 8-9: Beijing, China
  • June 12: Munich, Germany *
  • June 13-14: Seoul, Republic of Korea
  • June 14-15: Helsinki, Finland
  • June 19-20: Warsaw, Poland
  • June 21-22: Hyderabad, India
  • June 29-30: Sydney, Australia

The Microsoft Build Tour is for all developers using the Microsoft platform and tools. We will cover a breadth of topics across Windows, Cloud, AI, and cross-platform development. We will look at the latest news around .NET, web apps, the Universal Windows Platform, Win32 apps, Mixed Reality, Visual Studio, Xamarin, Microsoft Azure, Cognitive Services, and much more.

We also want to go deeper into code, so this year we’re bringing the tour as a two-day* event. You can decide to attend just the sessions on the first day, or sign up for a deep (and fun!) hands-on experience on the second day.

  • Day 1: Full day of fast-paced, demo-driven sessions, focusing primarily on new technology that you can start using immediately in your projects, with a bit of forward-looking awesomeness for inspiration.
  • Day 2: Full-day hackathon where you’ll use the latest technologies to build a fun client, cloud, and mobile solution that meets the requirements of a business case given at the beginning of the day. Seating is limited for Day 2, so be sure to register soon!

In most locations, on both days, we’ll also have a Mixed Reality experience booth where you’ll be able to sign up for scheduled hands-on time with Microsoft HoloLens and our latest Windows Mixed Reality devices.

To learn more and register, visit http://buildtour.microsoft.com. Can’t make it in person? Select cities will be live-streamed regionally to give you a chance to view the sessions and participate in the Q&A.

We can’t wait to see you on the tour!

*Munich is a single day, session-focused event.

Windows Mixed Reality Dev Kits available for pre-order

Anyone following the excitement around virtual reality and augmented reality over the past year is aware of the anticipation surrounding Microsoft’s new Windows Mixed Reality headsets. You understand that a rapidly expanding mixed reality market is just waiting for developers like you to get involved. During Alex Kipman’s Build keynote, we announced that Windows Mixed Reality dev kits from Acer and HP are now available for pre-order through the Microsoft Store for developers in the US (Acer, HP) and Canada (Acer, HP) —please sign up here so we can notify you once dev kits are available in additional countries.

The Acer Windows Mixed Reality Headset Developer Edition is priced at $299 USD and the HP Windows Mixed Reality Headset Developer Edition is priced at $329 USD. The headsets use state-of-the-art, inside-out tracking, so you don’t need to set up external cameras or IR emitters to have a truly immersive experience as you move with six degrees of freedom (6DoF) in mixed reality. You’ll be ready to code new mixed reality experiences out of the box with a headset and a Windows 10 Creators Update PC that meets our recommended hardware specifications for developers. We invite developers to join the Windows Insider Program to receive the latest mixed reality experiences from Microsoft each week.

Acer and HP built new mixed reality headsets with different industrial designs to capture the spirit of more personal computing and creativity in Windows. Developers can choose the bright and lightweight headset from Acer or the modern and industrial look of the HP headset with a common set of display and audio features across both headsets:

Acer Windows Mixed Reality Headset Developer Edition (pre-order in the US or Canada):

  • Two high-resolution liquid crystal displays at 1440 x 1440
  • Front hinged display
  • 95 degree horizontal field of view
  • Display refresh rate up to 90 Hz (native)
  • Built-in audio out and microphone support through 3.5mm jack
  • Single cable with HDMI 2.0 (display) and USB 3.0 (data) for connectivity
  • Inside-out tracking
  • 4.0 meter cable

HP Windows Mixed Reality Headset Developer Edition (pre-order in the US or Canada):

  • Two high-resolution liquid crystal displays at 1440 x 1440
  • Front hinged display
  • 95 degree horizontal field of view
  • Display refresh rate up to 90 Hz (native)
  • Built-in audio out and microphone support through 3.5mm jack
  • Single cable with HDMI 2.0 (display) and USB 3.0 (data) for connectivity
  • Inside-out tracking
  • 4.0m/0.6m removable cable
  • Double-padded headband and easy adjustment knob for all day comfort

As a developer, you can start preparing your machine to build immersive experiences today. Visit the Windows Dev Center to view documentation, download tools and join the emerging community of Windows Mixed Reality developers. Download Unity 3D, the most widely-used developer platform for creating immersive applications. Also download the free Visual Studio 2017 Community edition to package and deploy your immersive apps to the Windows Store. Additionally, you should check to make sure your workstation meets the recommended specifications for developers:

System Recommendations for App Developers

Processor:

  • Desktop: Intel Desktop Core i7 (6+ Core) OR AMD Ryzen 7 1700 (8 Core, 16 threads)

Graphics:

  • Desktop: NVIDIA GTX 980/1060, AMD Radeon RX 480 (8GB) equivalent or greater DX12 and WDDM 2.2 capable GPU
  • Drivers: Windows Display Driver Model (WDDM) 2.2
  • Thermal Design Power: 15W or greater

Display:

  • Headset connectors: 1x available graphics display port for headset (HDMI 1.4 or DisplayPort 1.2 for 60Hz headsets; HDMI 2.0 or DisplayPort 1.2 for 90Hz headsets)
  • Resolution: SVGA (800×600) or greater
  • Bit depth: 32 bits of color per pixel

Memory:

  • 16 GB of RAM or greater

Storage:

  • >10 GB additional free space

Connectivity:

  • 1x available USB port for headset (USB 3.0 Type-A); USB must supply a minimum of 900mA
  • Bluetooth 4.0 (for accessory connectivity)

The Windows Mixed Reality headsets are priced to lower the barriers to create immersive experiences. Mixed reality is now open to you as a developer—and if your Windows PC already meets the minimum specs, you don’t really need anything more to start building games and enterprise apps for the rapidly expanding mixed reality market. We can’t wait to see what you build!

Get all the updates for Windows Developers from Build 2017 here.

Windows Developer Awards: Honoring Windows Devs at Microsoft Build 2017

As we ramp up for Build, the Windows Dev team would like to thank you, the developer community, for all the amazing work you have done over the past 12 months. Because of your efforts and feedback, we’ve managed to add countless new features to the Universal Windows Platform and the Windows Store in an ongoing effort to constantly improve. And thanks to your input on the Windows Developer Platform Backlog, you have helped us to prioritize new UWP features.

In recognition of all you have done, this year’s Build conference in Seattle will feature the first-ever Windows Developers Awards given to community developers who have built exciting UWP apps in the last year and published them in the Windows Store. The awards are being given out in four main categories:

  • App Creator of the Year – This award recognizes an app leveraging the latest Windows 10 capabilities. Some developers are pioneers, the first to explore and integrate the latest features in Windows 10 releases. This award honors those who made use of features like Ink, Dial, Cortana, and other features in creative ways.
  • Game Creator of the Year – This award recognizes a game by a first-time publisher in Windows Store. Windows is the best gaming platform–and it’s easy to see why. From Xbox to PCs to mixed reality, developers are creating the next generation of gaming experiences. This award recognizes developers who went above and beyond to publish innovative, engaging and magical games to the Windows Store over the last year.
  • Reality Mixer of the Year – This award recognizes the app demonstrating a unique mixed reality experience. Windows Mixed Reality lets developers create experiences that transcend the traditional view of reality. This award celebrates those who choose to mix their own view of the world by blending digital and real-world content in creative ways.
  • Core Maker of the Year – This award recognizes a maker project powered by Windows. Some devs talk about the cool stuff they could build – others just do it. This award applauds those who go beyond the traditional software interface to integrate Windows into drones, Pis, gardens, and robots to get stuff done.

In addition to these, a Ninja Cat of the Year award will be given as special recognition. Selected by the Windows team at Microsoft, this award celebrates the developer or experience that we believe most reflects what Windows is all about, empowering people of action to do great things.

Here’s what we want from you: we need the developer community to help us by voting for the winners of these four awards on the awards site, so take a look and tell us who you think has created the most compelling apps. Once you’ve voted, check back anytime to see how your favorites are doing. Voting ends on 4/27, so get your Ninja votes in quickly.

Windows Developers at Microsoft Build 2017

Microsoft Build 2017 kicks off on May 10 in Seattle, with an expected capacity crowd of over 5,000 developers—plus countless more online. Join us for the live-streamed keynotes, announcements, technical sessions and more. You’ll be among the first to hear about new developments that will help you engage your users, keep their information safe and reach them in more places. Big things have been unveiled and promoted at Microsoft Build over the years and this year’s conference won’t be any different!

There will be quite a bit of content specifically relevant to Windows developers:

  • Improvements that help you immediately engage your users with beautiful UI and natural inputs
  • Team collaboration and connectedness to streamline and improve your development experience
  • Services that make it easier to reach customers and learn what they want from your software
  • Connected screens and experiences that make your end-to-end experience stickier and more engaging
  • Mixed reality and creating deeply immersive experiences

Sign up for a Save-the-Date reminder on our Build site for Windows Developers and we’ll keep you in the loop as new details and information come in. When you sign up, you’ll also gain the ability to:

  • Save sessions for later viewing
  • Create and share content collections
  • Discuss what you’ve seen and heard with other developers
  • Upvote content you like and track trending sessions

You’ll find sign-up, sessions and content here.

Building a Telepresence App with HoloLens and Kinect

When does the history of mixed reality start? There are lots of suggestions, but 1977 always shows up as a significant year. That’s the year millions of children – many of whom would one day become the captains of Silicon Valley – first experienced something they wouldn’t be able to name for another decade or so.

It was the plea of an intergalactic princess, the moment that set off a Star Wars film franchise still going strong today: “Help me, Obi-Wan Kenobi. You’re my only hope.” It’s a fascinating validation of Marshall McLuhan’s dictum that the medium is the message. While the content of Princess Leia’s message is what we have an emotional attachment to, it is the medium of the holographic projection – today we would call it “augmented reality” or “mixed reality” – that we remember most vividly.

While this post is not going to provide an end-to-end blueprint for your own Princess Leia hologram, it will provide an overview of the technical terrain, point out some of the technical hurdles and point you in the right direction. You’ll still have to do a lot of work, but if you are interested in building a telepresence app for the HoloLens, this post will help you get there.

An external camera and network connection

The HoloLens is equipped with inside-out cameras. In order to create a telepresence app, however, you are going to need a camera that can face you and take videos of you – in other words, an outside-in camera. This post is going to use the Kinect v2 as an outside-in camera because it is widely available, very powerful and works well with Unity. You may choose to use a different camera that provides the features you need, or even use a smartphone device.

The HoloLens does not allow third-party hardware to plug into its mini-USB port, so you will also need some sort of networking layer to facilitate inter-device communication. For this post, we’ll be using the HoloToolkit’s sharing service – again, because it is just really convenient to do so and even has a dropdown menu inside of the Unity IDE for starting the service. You could, however, build your own custom socket solution as Mike Taulty did or use the Sharing with UNET code in the HoloToolkit Examples, which uses a Unity provided networking layer.

In the long run, the two choices that will most affect your telepresence solution are what sort of outside-in cameras you plan to support and what sort of networking layer you are going to use. These two choices will determine the scalability and flexibility of your solution.

Using the HoloLens-Kinect project

Many telepresence HoloLens apps today depend in some way on Michelle Ma’s open-source HoloLens-Kinect project. The genius of the app is that it glues together two libraries, the Unity Pro plugin package for Kinect and the HoloToolkit sharing service, and uses them in unintended ways to arrive at a solution.

Even though the Kinect plugin for Unity doesn’t work in UWP (and the Kinect cannot be plugged into a HoloLens device in any case), it can still run when deployed to Windows or when running in the IDE (in which case it is using the .NET 3.5 framework rather than the .NET Core framework). The trick, then, is to run the Kinect integration in Windows and then send messages to the HoloLens over a wireless network to get Kinect and the device working together.

On the network side, the HoloToolkit’s sharing service is primarily used to sync world anchors between different devices. It also requires that a service be instantiated on a PC to act as a communication bus between different devices. The sharing service doesn’t have to be used as intended, however. Since the service is already running on a PC, it can also be used to communicate between just the PC and a single HoloLens device. Moreover, it can be used to send more than just world anchors – it can really be adapted to send any sort of primitive values – for instance, Kinect joint positions.
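That last point is worth dwelling on: a Kinect joint position is just a Vector3, which flattens to three floats on the wire. Here is a minimal sketch of the write/read helper pair; the names are illustrative, though the HoloToolkit sharing example defines very similar helpers:

```csharp
// Write a Vector3 into an outgoing message as three floats...
void AppendVector3(NetworkOutMessage msg, Vector3 vector)
{
    msg.Write(vector.x);
    msg.Write(vector.y);
    msg.Write(vector.z);
}

// ...and read it back on the other side in the same order.
Vector3 ReadVector3(NetworkInMessage msg)
{
    return new Vector3(msg.ReadFloat(), msg.ReadFloat(), msg.ReadFloat());
}
```

The same pattern works for any value you can decompose into primitives, which is all the sharing service really needs.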

To use Ma’s code, you need two separate Unity projects: one for running on a desktop PC and the other for running on the HoloLens. You will add the Kinect plugin package to the desktop app. You will add the sharing prefab from the HoloToolkit to both projects. In the app intended for the HoloLens, add the IP address of your machine to the Server Address field in the Sharing Stage component.

The two apps are largely identical. On the PC side, the app takes the body stream from the Kinect and sends the joint data to a script named BodyView.cs. BodyView creates spheres for each joint when it recognizes a new body and then repositions these joints whenever it gets updated Kinect data.

private GameObject CreateBodyObject(ulong id)
{
    GameObject body = new GameObject("Body:" + id);
    for (int i = 0; i < 25; i++)
    {
        GameObject jointObj = GameObject.CreatePrimitive(PrimitiveType.Sphere);

        jointObj.transform.localScale = new Vector3(0.3f, 0.3f, 0.3f);
        jointObj.name = i.ToString();
        jointObj.transform.parent = body.transform;
    }
    return body;
}

private void RefreshBodyObject(Vector3[] jointPositions, GameObject bodyObj)
{
    for (int i = 0; i < 25; i++)
    {
        Vector3 jointPos = jointPositions[i];

        Transform jointObj = bodyObj.transform.FindChild(i.ToString());
        jointObj.localPosition = jointPos;
    }
}

As this is happening, another script called BodySender.cs intercepts this data and sends it to the sharing service. On the HoloLens device, a script named BodyReceiver.cs gets this intercepted joint data and passes it to its own instance of the BodyView class that animates the dot man made up of sphere primitives.
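The receiving side is mostly plumbing. Here is a rough sketch of what BodyReceiver might do, assuming CustomMessages2 exposes a message-handler dictionary the way the HoloToolkit sharing example does (the handler and field names are assumptions, not the sample's actual code):

```csharp
void Start()
{
    // Route incoming BodyData messages from the sharing service to our handler.
    CustomMessages2.Instance.MessageHandlers[CustomMessages2.TestMessageID.BodyData] = OnBodyData;
}

void OnBodyData(NetworkInMessage msg)
{
    ulong trackingID = (ulong)msg.ReadInt64();   // which tracked body this update belongs to
    var jointPositions = new Vector3[25];        // one entry per Kinect v2 joint
    for (int i = 0; i < 25; i++)
    {
        jointPositions[i] = new Vector3(msg.ReadFloat(), msg.ReadFloat(), msg.ReadFloat());
    }
    // Hand the joints to the local BodyView instance to create or refresh the sphere rig.
}
```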

The code used to adapt the sharing service for transmitting Kinect data is contained in Ma’s CustomMessages2 class, which is really just a straight copy of the CustomMessages class from the HoloToolkit sharing example with a small modification that allows joint data to be sent and received:

public void SendBodyData(ulong trackingID, Vector3[] bodyData)
{
    // If we are connected to a session, broadcast our info
    if (this.serverConnection != null && this.serverConnection.IsConnected())
    {
        // Create an outgoing network message to contain all the info we want to send
        NetworkOutMessage msg = CreateMessage((byte)TestMessageID.BodyData);

        // Identify which tracked body this data belongs to
        msg.Write(trackingID);

        foreach (Vector3 jointPos in bodyData)
        {
            AppendVector3(msg, jointPos);
        }

        // Send the message as a broadcast
        this.serverConnection.Broadcast(
            msg,
            MessagePriority.Immediate,
            MessageReliability.UnreliableSequenced,
            MessageChannel.Avatar);
    }
}
Moreover, once you understand how CustomMessages2 works, you can pretty much use it to send any kind of data you want.

Be one with The Force

Another thing the Kinect is very good at is gesture recognition. HoloLens currently supports a limited number of gestures and is constrained by what the inside-out cameras can see – mostly just your hands and fingers. You can use the Kinect-HoloLens integration above, however, to extend the HoloLens’ repertoire of gestures to include the user’s whole body.

For example, you can recognize when a user raises her hand above her head simply by comparing the relative positions of these two joints. Because this pose recognition only requires the joint data already transmitted by the sharing service and doesn’t need any additional Kinect data, it can be implemented completely on the receiver app running in the HoloLens.

private void DetectGesture(GameObject bodyObj)
{
    string HEAD = "3";
    string RIGHT_HAND = "11";

    // detect gesture involving the right hand and the head
    var head = bodyObj.transform.FindChild(HEAD);
    var rightHand = bodyObj.transform.FindChild(RIGHT_HAND);

    // if right hand is half a meter above head, do something
    if (rightHand.position.y > head.position.y + .5)
    {
        // react to the pose here
    }
}

In this sample, a hidden item is shown whenever the pose is detected. It is then hidden again whenever the user lowers her right arm.

The Kinect v2 has a rich literature on building custom gestures and even provides a tool for recording and testing gestures called the Visual Gesture Builder that you can use to create unique HoloLens experiences. Keep in mind that while many gesture solutions can be run directly in the HoloLens, in some cases, you may need to run your gesture detection routines on your desktop and then notify your HoloLens app of special gestures through a further modified CustomMessages2 script.

As fun as dot man is to play with, he isn’t really that attractive. If you are using the Kinect for gesture recognition, you can simply hide him by commenting out much of the code in BodyView. Another way to go, though, is to use your Kinect data to animate a 3D character in the HoloLens. This is commonly known as avateering.
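If hiding dot man is all you need, a lighter-weight alternative to commenting out code is to disable just his renderers, so the joint transforms keep updating for gesture detection. This is a minimal sketch, assuming the `body` GameObject created in BodyView:

```csharp
// Hide the sphere rig without losing joint data: disable only the
// renderers so the transforms (and any gesture detection) keep updating.
foreach (var sphereRenderer in body.GetComponentsInChildren<MeshRenderer>())
{
    sphereRenderer.enabled = false;
}
```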

Unfortunately, you cannot use joint positions for avateering. The relative sizes of a human being’s limbs are often not going to be the same as those on your 3D model, especially if you are trying to animate models of fantastic creatures rather than just humans, so the relative joint positions will not work out. Instead, you need to use the rotation data of each joint. Rotation data, in the Kinect, is represented by an odd mathematical entity known as a quaternion.


Quaternions are to 3D programming what midichlorians are to the Star Wars universe: They are essential, they are poorly understood, and when someone tries to explain what they are, it just makes everyone else unhappy.

The Unity IDE doesn’t expose quaternions directly. Instead it uses rotations around the X, Y and Z axes (pitch, yaw and roll) when you manipulate objects in the Scene Viewer. These are also known as Euler angles.

There are a few problems with this, however. Using the IDE, if I try to rotate the arm of my character using the yellow drag line, it will actually rotate both the green axis and the red axis along with it. Somewhat more alarmingly, as I try to rotate along just one axis, the Inspector windows show that my rotation around the Z axis is also affecting the rotation around the X and Y axes. The rotation angles are actually interlocked in such a way that even the order in which you make changes to the X, Y and Z rotation angles will affect the final orientation of the object you are rotating. Another interesting feature of Euler angles is that they can sometimes end up in a state known as gimbal lock.
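To see the order dependence concretely, here is a small illustrative Unity sketch composing the same two rotations in opposite orders:

```csharp
Quaternion a = Quaternion.Euler(90, 0, 0);  // 90 degrees of pitch
Quaternion b = Quaternion.Euler(0, 90, 0);  // 90 degrees of yaw

// Rotation composition is not commutative: pitch-then-yaw produces a
// different final orientation than yaw-then-pitch.
Debug.Log(a * b == b * a);  // false
```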

These are some of the reasons that avateering is done using quaternions rather than Euler angles. To better visualize how the Kinect uses quaternions, you can replace dot man’s sphere primitives with arrow models (there are lots you can find in the asset store). Then, grab the orientation for each joint, convert it to a quaternion type (quaternions have four fields rather than the three in Euler angles) and apply it to the rotation property of each arrow.

private static Quaternion GetQuaternionFromJointOrientation(Kinect.JointOrientation jointOrientation)
{
    return new Quaternion(jointOrientation.Orientation.X, jointOrientation.Orientation.Y, jointOrientation.Orientation.Z, jointOrientation.Orientation.W);
}

private void RefreshBodyObject(Vector3[] jointPositions, Quaternion[] quaternions, GameObject bodyObj)
{
    for (int i = 0; i < 25; i++)
    {
        Vector3 jointPos = jointPositions[i];

        Transform jointObj = bodyObj.transform.FindChild(i.ToString());
        jointObj.localPosition = jointPos;
        jointObj.rotation = quaternions[i];
    }
}

These small changes result in the arrow man below who will actually rotate and bend his arms as you do.

For avateering, you basically do the same thing, except that instead of mapping identical arrows to each rotation, you need to map specific body parts to these joint rotations. This post is using the male model from Vitruvius avateering tools, but you are welcome to use any properly rigged character.

Once the character limbs are mapped to joints, they can be updated in pretty much the same way arrow man was. You need to iterate through the joints, find the mapped GameObject, and apply the correct rotation.

private Dictionary<int, string> RigMap = new Dictionary<int, string>()
{
    {0, "SpineBase"},
    {1, "SpineBase/SpineMid"},
    {2, "SpineBase/SpineMid/Bone001/Bone002"},
    // etc ...
    {22, "SpineBase/SpineMid/Bone001/ShoulderRight/ElbowRight/WristRight/ThumbRight"},
    {23, "SpineBase/SpineMid/Bone001/ShoulderLeft/ElbowLeft/WristLeft/HandLeft/HandTipLeft"},
    {24, "SpineBase/SpineMid/Bone001/ShoulderLeft/ElbowLeft/WristLeft/ThumbLeft"}
};

private void RefreshModel(Quaternion[] rotations)
{
    for (int i = 0; i < 25; i++)
    {
        if (RigMap.ContainsKey(i))
        {
            Transform rigItem = _model.transform.FindChild(RigMap[i]);
            rigItem.rotation = rotations[i];
        }
    }
}

This is a fairly simplified example, and depending on your character rigging, you may need to apply additional transforms on each joint to get them to the expected positions. Also, if you need really professional results, you might want to look into using inverse kinematics for your avateering solution.
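As a sketch of what such a correction might look like, the Kinect rotation can be composed with a per-joint offset captured from the model's bind pose at startup. RefreshModelWithOffsets and _jointOffsets are hypothetical names for illustration, not part of the sample:

```csharp
private void RefreshModelWithOffsets(Quaternion[] rotations)
{
    for (int i = 0; i < 25; i++)
    {
        if (RigMap.ContainsKey(i))
        {
            Transform rigItem = _model.transform.FindChild(RigMap[i]);
            // _jointOffsets holds a hypothetical per-joint correction captured
            // from the model's bind pose; without it, bones rigged with
            // different rest orientations will bend the wrong way.
            rigItem.rotation = rotations[i] * _jointOffsets[i];
        }
    }
}
```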

If you want to play with working code, you can clone Wavelength’s Project-Infrared repository on github; it provides a complete avateering sample using the HoloToolkit sharing service. If it looks familiar to you, this is because it happens to be based on Michelle Ma’s HoloLens-Kinect code.

Looking at point cloud data

To get even closer to the Princess Leia hologram message, we can use the Kinect sensor to send point cloud data. Point clouds are a way to represent depth information collected by the Kinect. Following the pattern established in the previous examples, you will need a way to turn Kinect depth data into a point cloud on the desktop app. After that, you will use shared services to send this data to the HoloLens. Finally, on the HoloLens, the data needs to be reformed as a 3D point cloud hologram.
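On the desktop side, the Kinect v2 SDK's CoordinateMapper can do the heavy lifting of turning a depth frame into camera-space points. A rough sketch follows; the frame-acquisition code around it is omitted and the variable names are assumptions:

```csharp
// Map a full 512x424 depth frame into camera space (units are meters).
ushort[] depthData = new ushort[512 * 424];
CameraSpacePoint[] cameraPoints = new CameraSpacePoint[512 * 424];
depthFrame.CopyFrameDataToArray(depthData);
sensor.CoordinateMapper.MapDepthFrameToCameraSpace(depthData, cameraPoints);

// Keep only valid points; the mapper marks unmappable pixels with -infinity.
var points = new List<Vector3>();
foreach (var p in cameraPoints)
{
    if (!float.IsNegativeInfinity(p.Z))
    {
        points.Add(new Vector3(p.X, p.Y, p.Z));
    }
}
```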

The point cloud example above comes from the Brekel Pro Point Cloud v2 tool, which allows you to read, record and modify point clouds with your Kinect.

The tool also includes a Unity package that replays point clouds, like the one above, in a Unity for Windows app. The final step of transferring point cloud data over the HoloToolkit sharing service to the HoloLens is left as an exercise for the reader.

If you are interested in a custom server solution, however, you can give the open source LiveScan 3D – HoloLens project a try.

HoloLens shared experiences and beyond

There are actually a lot of ways to orchestrate communication for the HoloLens, of which, so far, we’ve mainly discussed just one. A custom socket solution may be better if you want direct HoloLens-to-HoloLens communication without going through a PC-based broker like the sharing service.

Yet another option is to use a framework like WebRTC for your communication layer. This has the advantage of being an open specification, so there are implementations for a wide variety of platforms such as Android and iOS. It is also a communication platform that is used, in particular, for video chat applications, potentially giving you a way to create video conferencing apps not only between multiple HoloLenses, but also between a HoloLens and mobile devices.

In other words, all the tools for doing HoloLens telepresence are out there, including examples of various ways to implement it. It’s now just a matter of waiting for someone to create a great solution.

ICYMI – Your weekly TL;DR

Busy coding weekend ahead? Before you go heads-down, get the latest from this week in Windows Developer below.

Getting Started with a Mixed Reality Platformer Using Microsoft HoloLens

The platform game genre has undergone many revolutions – and with mixed reality and HoloLens, we all have the opportunity to expand the platform game yet again. What will you build?

Windows 10 SDK Preview Build 15042 Released!

A new Windows 10 Creators Update SDK Preview was released this week! Read about what’s new in 15042.

Announcing the Xbox Live Creators Program

The Xbox Live Creators Program was announced at GDC on Wednesday, starting with an Insider Preview that gives any developer the opportunity to publish Xbox Live-enabled games on Windows 10 PCs along with Xbox One consoles. Get the details here.

Just Released – Windows Developer Evaluation Virtual Machines – February 2017 Build

And last but not least – the February 2017 edition of evaluation Windows developer virtual machines on Windows Dev Center was just released. The VMs come in Hyper-V, Parallels, VirtualBox and VMWare flavors. Get ‘em all!

Download Visual Studio to get started.

The Windows team would love to hear your feedback. Please keep the feedback coming using our Windows Developer UserVoice site. If you have a direct bug, please use the Windows Feedback tool built directly into Windows 10.