Nate Mars – Music Producer, Sound Designer and Immersive Content Creator
Meet Nate Mars: a producer and music technologist based in New York City. He has released original music on several labels in addition to having songs licensed for documentary films and television. Nate is passionate about artist-to-artist music education, specifically helping inspire other producers to achieve their music production goals through collaboration and technology. In addition to his work as a recording artist, Nate works as a content marketing consultant and strategist. For several years, he held the position of Marketing Director at music production & DJ school Dubspot. Nate has also worked in the tech startup world with companies such as music production platform/app creator Splice and Virtual Reality startup, SpaceoutVR. He has been the host of the Decibel 2015 conference and has been a guest speaker on panels at Sundance Film Festival 2014 and more.
Nate planned and produced unique activations for the Surface team this past year at two North American festivals, Moogfest (Durham, North Carolina) and MUTEK (Montreal, Canada), and received a Surface Book from Microsoft to use for music production and demos.
Let’s hear more from Nate on his career journey and love of music and technology.
Can you talk about what got you interested in pursuing music? Was there an “aha” moment? How did you get into it?
I have been interested in music ever since I was very little. I used to record sounds on a children’s tape recorder and listen for hours to what I had recorded. Around the age of 5, I would also play little melodies on a Casio keyboard and mumble along, trying to emulate songs I heard on pop radio stations. The “aha” moment came later. In high school, while practicing electric bass in my room, I would record onto a 4-track tape recorder, and when I realized I could overdub and create a bunch of parts (other than bass) myself, something clicked and drew me deeply into music production. Technology like Surface has continued to emerge, making the recording process even easier and more fun over the years!
Where do you get your inspiration?
I get inspiration from everywhere. Sometimes, on a long bus ride, I’ll look out the window and imagine the soundtrack for the landscape/moment going by. Sometimes, I’ll be at a concert, absorbing what other musicians are creating and drawing inspiration from colleagues. Science fiction books and movies are also a huge inspiration. I love the way they can open your mind to think differently about reality or what might be possible in the future. I try to translate that same feeling to music… or write the music to a scene from a sci-fi novel in my mind.
What’s been a moment or project you’ve done that you’re proudest of?
Recently, I completed a sound design project for a TV show where boxing champion Evander Holyfield was a special guest. I’m a huge boxing fan and never imagined working with him because (obviously) we are in very different industries but the places music can take you sometimes are crazy. I also have two music production courses I created using Logic Pro as well as Ableton Live & Push 2 for Lynda.com. You can check those out online if you are looking to learn some workflow methods for music production.
You’re passionate about technology and video – how have you incorporated those into your work over time?
In addition to my work in music, I also work in content marketing, using technology and video to tell a brand’s story. I think it is very exciting where technology is headed, particularly in the Virtual Reality and 360° video world. I’ve been working in that emerging area for a little while now and have produced a lot of (standard) video content in the past. There will be many new challenges, but I believe today’s tools are where the phonograph once was for recording. We’re at the very beginning of something huge!
Nate uses Bitwig in conjunction with Ableton Live on Surface Book in his studio.
What do you like about creating on Surface Book? Do you use pen, touch, or a combination?
One of my favorite things about creating on Surface Book is that your software just opens up and you can dive right into any parameter faster than you can hit a combination of key commands or map parameters on a MIDI controller. I have also recently run a full live set, along with a visual component, on the same computer sent to a projector, and everything ran smoothly.
What experiences have stood out to you?
DAWs like Bitwig and Ableton Live work so well with Surface. It is exciting to use music creation software that is optimized for touch interaction AND has a powerful processor to run everything.
What’s something recent you’re working on with Surface now that you’re really excited about?
I’m currently working on a brand new live set with a vocalist I collaborate with. We have a show coming up in a month and I’m planning on using Surface to run visuals in the background as well as triggering effects on vocals and playing different elements in the songs. Instead of mapping a controller, I’m excited to be able to just trigger clips and edit audio via touch interaction on the fly.
As promised, we are continuing the wave of sharing what we kicked off last week during the “App Dev on Xbox” event. Every week for the next two months, we will be releasing a new blog post where we will focus on a specific topic around UWP app development where Xbox One will be the hero device. With each post, we will be sharing a demo app experience around that topic, and open sourcing the source code on GitHub so anyone can download it and learn from it.
For the first blog post, we thought we’d continue the conversation around premium experiences, or better said, how to take your beautiful app and tailor it for the TV in order to delight your users.
The Premium Experience
Apps built with the Anniversary Update SDK “just work” on Xbox, and your users are able to download your apps from the Store the same way they do on the desktop and the phone. Even if you have not tested your app on Xbox, the experience in most cases is acceptable. Yet acceptable is not always enough, and as developers we want our apps to deliver the best experience possible: the premium experience. In that spirit, let’s cover seven things to consider when adapting your app for the premium TV experience.
Fourth Coffee is a sample news application that works across the desktop, phone, and Xbox One and offers a premium experience that takes advantage of each device’s strengths. We will use Fourth Coffee as the example to illustrate the default experience and what it takes to tailor it to get the best experience. All code below comes directly from the app and each code snippet is linked directly to the source in the repository which is open to anyone.
1. Optimize for the Gamepad
The first time you launch your app on Xbox, you will notice a pointer that you can move freely with the controller and use to click on buttons and do much of what you can do with a mouse. This is called mouse mode, and it is the default experience for any UWP app on Xbox. This mode works with virtually any layout, but it’s not always the experience your users will expect, as they are used to directional focus navigation. Luckily, switching to directional navigation is as simple as one line of code:
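The snippet itself did not survive in this copy of the post; a minimal sketch of that line, set in the App constructor (the surrounding constructor is assumed boilerplate; RequiresPointerMode is the documented UWP API):

```csharp
public App()
{
    this.InitializeComponent();

    // Opt out of mouse mode: use directional (XY) focus navigation by default,
    // letting individual controls request a pointer only when they need one
    this.RequiresPointerMode = ApplicationRequiresPointerMode.WhenRequested;
}
```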
Note: WhenRequested allows a control to request mouse mode if needed when it is focused or engaged to provide the best experience.
In general, if your app is usable through the keyboard, it should work reasonably well with the controller. The great thing about the UWP platform is that the work that enables directional navigation and gamepad support on Xbox also enables the same on other UWP devices running the Anniversary Update. So you can just plug in a controller to your desktop machine and try it out immediately. Try it out with some of the default applications such as the Store or even the Start Menu. It’s really handy on the go; no need to bring your Xbox on the plane just for development.
Navigating the UI with a controller is intuitive, and the platform has already mapped existing keyboard input behaviors to gamepad and remote control input. However, there are buttons on the controller and the remote that have not been mapped by default, and you might want to consider using them to accelerate navigation. For example, many developers map search to the Y button, and users have already started to identify Y with search. As an example, here is the code from Fourth Coffee where we’ve mapped the X button to jump to the bottom of the page, accelerating navigation for the user:
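The original snippet isn’t included in this copy; a hedged sketch of that mapping, using a page-level KeyDown handler (MainScrollViewer is a hypothetical name for a ScrollViewer wrapping the page content; VirtualKey.GamepadX is the real Anniversary Update key value):

```csharp
private void Page_KeyDown(object sender, KeyRoutedEventArgs e)
{
    if (e.Key == Windows.System.VirtualKey.GamepadX)
    {
        // Jump straight to the bottom of the page instead of scrolling item by item
        MainScrollViewer.ChangeView(null, MainScrollViewer.ScrollableHeight, null);
        e.Handled = true;
    }
}
```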
2. Size Your UI for the TV
Even though apps on Xbox One generally run on much larger screens than on the desktop, users sit much farther away, so everything needs to be a bit bigger, with plenty of room to breathe. The effective working size of every app on the Xbox is 960 x 540 (1920 x 1080 with 200% scaling applied) in a XAML app (150% scaling is applied in HTML apps). In general, if elements are appropriately sized for other devices, they will be appropriate for the TV.
Note: you can disable the automatic scaling if you choose to by following these instructions.
3. Focus Navigation and Visuals
In some cases, you will find that the existing layout for your app does not work as well with directional navigation as it does with mouse or touch. The user might end up scrolling through an entire 500-item list just to reach the button at the bottom. In those cases, it’s best to change the layout of the items, or to use focus engagement on the ListView so that it acts as a single focusable item until the user engages it.
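Focus engagement is enabled with a single property on the control; a sketch (the ArticleList name and Articles binding are hypothetical; IsFocusEngagementEnabled is the real Anniversary Update property):

```xml
<!-- The ListView behaves as one focusable item until the user engages it,
     so directional navigation can skip past it in a single press -->
<ListView x:Name="ArticleList"
          IsFocusEngagementEnabled="True"
          ItemsSource="{x:Bind Articles}" />
```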
In cases where the focus algorithm does not prioritize the correct element, the developer can override the default behavior by using the XYFocus properties on focusable elements. For example, with the following code from Fourth Coffee:
RelatedGridView.XYFocusUp = PlayButton;
when the user presses Up to move focus above the RelatedGridView, the PlayButton will always receive focus next, no matter where it is.
When first working with directional navigation, you might run into focus misbehavior. In those cases, it’s always helpful to take advantage of the FocusManager API and keep track of which element has focus. For example, you can log focus information to the output window while debugging with this code snippet:
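The exact Fourth Coffee snippet isn’t reproduced in this copy; a sketch of such a snippet, wired up in a page constructor (FocusManager.GetFocusedElement and Debug.WriteLine are the real APIs; the handler shape is an assumption):

```csharp
// Log the currently focused element every time focus moves within the page
this.GotFocus += (object sender, RoutedEventArgs e) =>
{
    FrameworkElement focused = FocusManager.GetFocusedElement() as FrameworkElement;
    if (focused != null)
    {
        Debug.WriteLine("Focus moved to: " + focused.Name + " (" + focused.GetType().Name + ")");
    }
};
```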
When users are interacting with your app using the keyboard or controller, they need to be able to easily identify the element that currently has focus. With the Anniversary Update SDK, the existing focus visuals (the border around an element when in focus) have been updated to be more prominent. In the majority of cases, developers won’t need to do anything, and the focus visuals will look great.
Of course, in other cases, you might want to modify the focus visuals to apply your own color or style, or even completely change their shape or behavior. For example, the default focus visuals for the VideoButton in the DetailsPage are not very visible against the bright background, so we changed them:
To take it even further, in MainPage the default focus visual for the navigation pane items has been turned off and replaced with a custom template.
For a more in-depth look at focus visuals, check out the guidelines.
4. TV Safe Area
Unlike computer monitors, some TVs cut off the edge of the display which can cause content at the edge to be hidden. When you run your app on Xbox, by default, you might notice obvious margins from the edge of the screen.
In many cases, using a dark page background with the dark theme (or a white page background with the light theme) is exactly what is needed to get the optimal experience, as the page background always extends to the edge. However, if you are using full-bleed images as in Fourth Coffee, you will notice obvious borders around your app. Luckily, there is a way to disable this behavior with one line of code:
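A sketch of that line (ApplicationView.SetDesiredBoundsMode is the documented API for extending the app window underneath the TV-safe margins):

```csharp
// Ask the system to let the app draw all the way to the edge of the screen
Windows.UI.ViewManagement.ApplicationView.GetForCurrentView()
    .SetDesiredBoundsMode(Windows.UI.ViewManagement.ApplicationViewBoundsMode.UseCoreWindow);
```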
Once you have your content drawing to the edge of the screen, you will need to make sure that any focusable content is not extending outside of the TV-safe area, which in general is 48 pixels from the sides and 27 pixels from the top and bottom.
To learn more about TV safe area, make sure to check out the guidelines.
5. TV Safe Colors
Not all TVs show color the same way, and that can impact how an app looks. In general, RGB values in the 16–235 range are considered safe for the majority of TVs; if your app depends on subtle differences in color, or on high intensities, colors could look washed out or have unexpected effects. To optimize the color palette for TV, the recommendation is to clamp the colors to the TV-safe range. In some cases, clamping alone could cause distinct colors to collide, and if that is the case, scaling the colors after clamping will give the best results.
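As an illustration of the difference between the two approaches (not code from Fourth Coffee; the helper names are hypothetical):

```csharp
// Clamping preserves mid-range values unchanged but collapses everything
// outside 16–235, so two distinct dark colors (e.g. 0 and 10) both become 16.
static byte ClampToTvSafe(byte channel)
{
    return (byte)Math.Min(Math.Max((int)channel, 16), 235);
}

// Scaling maps the full 0–255 input range linearly onto 16–235,
// keeping distinct inputs distinct at the cost of slightly compressed contrast.
static byte ScaleToTvSafe(byte channel)
{
    return (byte)(16 + channel * (235 - 16) / 255);
}
```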
The default colors for XAML controls are designed around black or white and are not automatically adjusted for the TV. One way to make sure the default colors are TV safe on Xbox is to use a resource dictionary that overrides the default colors when running on the Xbox. For example, we use this code in Fourth Coffee to apply the resource dictionary when the app is running on the Xbox:
// use TV color-safe values
Source = new Uri("ms-appx:///TvSafeColors.xaml")
There is a resource dictionary that you can use in your app here; also make sure to check out the guidance on TV-safe colors.
6. Snapped system apps
UWP apps by definition should be adaptive, responding to provide the best experience at any size and on any supported device. Just like on any UWP device, apps on Xbox will need to account for changes in size. The majority of the time, apps on Xbox will be running full screen, but the user can at any moment snap a system app (such as Cortana) next to your app, effectively changing its width.
One method for responding to size changes is to use AdaptiveTriggers directly in your XAML, which apply changes to the XAML depending on the current size. Check out the use of AdaptiveTriggers in Fourth Coffee on GitHub.
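The GitHub snippet isn’t reproduced here; a generic sketch of the pattern (the state names, widths, and the HeaderText target are hypothetical; AdaptiveTrigger and VisualState setters are the real XAML APIs):

```xml
<VisualStateManager.VisualStateGroups>
    <VisualStateGroup>
        <!-- Applied when a system app is snapped next to the app, narrowing it -->
        <VisualState x:Name="NarrowView">
            <VisualState.StateTriggers>
                <AdaptiveTrigger MinWindowWidth="0" />
            </VisualState.StateTriggers>
            <VisualState.Setters>
                <Setter Target="HeaderText.FontSize" Value="20" />
            </VisualState.Setters>
        </VisualState>
        <!-- Applied when the app has the full width of the screen -->
        <VisualState x:Name="WideView">
            <VisualState.StateTriggers>
                <AdaptiveTrigger MinWindowWidth="960" />
            </VisualState.StateTriggers>
            <VisualState.Setters>
                <Setter Target="HeaderText.FontSize" Value="32" />
            </VisualState.Setters>
        </VisualState>
    </VisualStateGroup>
</VisualStateManager.VisualStateGroups>
```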
7. System Media Transport Controls
OK, so this one is not relevant for all apps, but since the majority of apps on Xbox are media apps, it is absolutely worth including in this list. Media applications should respond to media controls initiated by the user no matter which method is used, and there are several ways a user can initiate media commands:
Via buttons in your application
Via Cortana (typically through speech)
Via the System Media Transport Controls in the multi-tasking tab even if your app is in the background (if implemented)
Via the Xbox app on other devices
If your application is already using the MediaPlayer class for media playback, this integration is automatic. But there are a few scenarios where you may need to implement manual control, and implementing the System Media Transport Controls allows you to respond to all the ways the user could be interacting with your app. If you use your own media controls, make sure to integrate with SystemMediaTransportControls.
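A minimal sketch of wiring up SystemMediaTransportControls manually (the playback callbacks are placeholders for your own player code; GetForCurrentView and ButtonPressed are the real APIs):

```csharp
var smtc = SystemMediaTransportControls.GetForCurrentView();
smtc.IsEnabled = true;
smtc.IsPlayEnabled = true;
smtc.IsPauseEnabled = true;

// Respond to play/pause no matter where the command came from:
// app buttons, Cortana, the multi-tasking tab, or the Xbox app
smtc.ButtonPressed += (SystemMediaTransportControls sender,
                       SystemMediaTransportControlsButtonPressedEventArgs args) =>
{
    switch (args.Button)
    {
        case SystemMediaTransportControlsButton.Play:
            // resume playback in your own player here
            break;
        case SystemMediaTransportControlsButton.Pause:
            // pause playback in your own player here
            break;
    }
};
```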
Now that you’re familiar with what it takes to optimize your app experience for the TV, it’s time to get your hands dirty and try it out. Check out the app source on our official GitHub page, watch the event if you missed it, and let us know what you think in the comments below or on Twitter.
This is just the beginning. Next week we will release another app experience and go in depth on how to create even richer app experiences with XAML and Unity. We will also cover how to create app services and extensions for your apps so other developers can build experiences that complement your apps.