Tag Archives: team

A brother and sister team are rowing 3,000 miles across the Atlantic Ocean – their equipment includes Microsoft Teams

A brother and sister team taking part in a 3,000-mile race across the Atlantic Ocean have stayed in touch with family and friends by using Microsoft Teams, despite being hundreds of miles from land.

Anna and Cameron McLean have used the Microsoft tool to contact loved ones, and receive weather and race updates from a crew on shore during the Talisker Whisky Atlantic Challenge.

Known as “the world’s toughest row”, the race sees participants spend 60 days at sea in a small boat, braving 40-foot waves, sharks, illness and a schedule of sleeping and rowing in two-hour shifts as they make their way from La Gomera in the Canary Islands to Antigua. To put the gruelling race in context, fewer people have rowed the Atlantic than have reached the summit of Everest.

While many mixed-sex teams have completed the challenge, Anna and Cameron believe they are the first brother and sister to take part.

Speaking via Teams on the 35th day of their journey, Anna said the Microsoft tool had been crucial for receiving messages of support that have kept the siblings going.

  • Part of the Microsoft Teams call between Anna McLean, in the Atlantic, and Andy Trotman, in the UK

“We can use Teams to communicate with anyone in the world from the middle of the Atlantic Ocean. That’s been essential,” she said. “Teams has been such a dream because we’ve been able to maintain a two-way dialogue with our family and friends back home, as well as our sponsors. We have been able to share real-time updates and pictures, and get information such as the weather forecast. That’s been a big contributing factor to the success and speed of our crossing. Teams has helped us navigate the best and most direct course.

“It’s been easy to set up, too. We connect to the internet via a satellite, and then open up the Teams app on my phone. That’s it.”

Anna, 25, and Cameron, 32, are currently third in the pairs race, in a field of 34 vessels. They are each burning 10,000 calories a day and fighting against sleep deprivation, exhaustion, blisters and bruises. Meals consist of “space food” that has to be mixed with water and left on deck so the sun can warm it up. Sea water is filtered for drinking, and they aim to drink at least 10 litres a day.

Even though they are experienced rowers, having competed at university, nothing could prepare them for a race of this magnitude.

Anna McLean rowing across the Atlantic
Anna and Cameron are spending 60 days at sea in a small boat, braving 40-foot waves, sharks, illness and a schedule that sees them sleep and row in two-hour shifts

“The nights are brutal,” said Anna, who works for Microsoft partner AlfaPeople. “With a lack of moonlight, the nights are so dark that you can’t see your hand in front of your face or the waves that crash over the side of the boat and threaten to capsize you. The sea was so rough one night that we broke an oar.

“Then, each new day brings new challenges. Our water maker and autohelm broke, and we have been followed by what I estimate to be a 14-foot shark. But we have no choice but to overcome those challenges through strength and perseverance.”

Anna and Cameron are rowing to raise money for UN Women, an organisation dedicated to gender equality and the empowerment of women. “The impact they have for women and girls everywhere is just phenomenal,” Anna added.

The pair have around 300 miles to go before they reach the finish line, and Anna is already looking forward to some simple luxuries.

“I can’t wait to see my mum and dad, and give them a big hug,” she said. “I’m also looking forward to a hot shower and eating fresh fruit and vegetables.”

  • Subscribe to the UK News Centre to learn more about Anna and Cameron’s challenge in an upcoming feature


Go to Original Article
Author: Microsoft News Center

How Microsoft re-envisioned the data warehouse with Azure Synapse Analytics

About four years ago, the Microsoft Azure team began to notice a big problem troubling many of its customers. A mass migration to the cloud was in full swing, as enterprises signed up by the thousands to reap the benefits of flexible, large-scale computing and data storage. But the next iteration of that tech revolution, in which companies would use their growing stores of data to get more tangible business benefits, had stalled.

Technology providers, including Microsoft, had built a variety of systems to collect, retrieve and analyze enormous troves of information, promising to uncover market trends and insights and pave the way toward a new era of improved customer service, innovation and efficiency.

But those systems were built independently by different engineering teams and sold as individual products and services. They weren’t designed to connect with one another, and customers would have to learn how to operate them separately, wasting time, money and precious IT talent.

“Instead of trying to add more features to each of our services, we decided to take a step back and figure out how to bring their core capabilities together to make it easy for customers to collect and analyze all of their increasingly diverse data, to break down data silos and work together more collaboratively,” said Raghu Ramakrishnan, Microsoft’s chief technology officer for data.

At its Ignite conference this week in Orlando, Florida, Microsoft announced the end result of a yearslong effort to address the problem: Azure Synapse Analytics, a new service that merges the capabilities of Azure SQL Data Warehouse with new enhancements such as on-demand query as a service.

Microsoft said this new offering will help customers put their data to work much more quickly, productively and securely by pulling together insights from all data sources, data warehouses and big data analytics systems. And, the company said, with deeper integration between Power BI and Azure Machine Learning, Azure Synapse Analytics can reduce the time required to process and share that data, speeding up the insights that businesses can glean.

What’s more, it will allow many more businesses to take advantage of game-changing technologies like data analytics and artificial intelligence, which are helping scientists to better predict the weather, search engines to better understand people’s intent and workers to more easily handle mundane tasks.

This newest effort to break down data silos also builds on other Microsoft projects, such as the Open Data Initiative and Azure Data Share, which allows customers to share data from multiple sources, even with other organizations.

Microsoft said Azure Synapse Analytics is also designed to support the increasingly popular DevOps strategy, in which development and operations staff collaborate more closely to create and implement services that work better throughout their lifecycles.


A learning process

Azure Synapse Analytics is the result of a lot of work, and a little trial and error.

At first, Ramakrishnan said, the team developed high-level guidelines showing customers how to glue the systems together themselves. But they quickly realized that was too much to ask.

“That required a lot of expertise in the nitty-gritty of our platforms,” Ramakrishnan said. “Customers made it overwhelmingly clear that we needed to do better.”

So, the company went back to the drawing board and spent an additional two years revamping the heart of its data business, Azure SQL Data Warehouse, which lets customers build, test, deploy and manage applications and services in the cloud.

A breakthrough came when the company realized that customers need to analyze all their data in a single service, without having to copy terabytes of information across various systems to use different analytic capabilities – as has traditionally been the case with data warehouses and data lakes.

With the new offering, customers can use their data analytics engine of choice, such as Apache Spark or SQL, on all their data. That’s true whether it’s structured data, such as rows of numbers on spreadsheets, or unstructured data, such as a collection of social media posts.

This project was risky. It involved deep technical surgery: completely rewriting the guts of the SQL query processing engine to optimize it for the cloud and make it capable of instantly handling big bursts of work as well as very large and diverse datasets.

It also required unprecedented integration among several teams within Microsoft, some of whom would have to make hard choices. Established plans had to be scrapped. Resources earmarked for new features would be redirected to help make the entire system work better.

“In the beginning, the conversations were often heated. But as we got into the flow of it, they became easier. We began to come together,” Ramakrishnan said.

Microsoft also had to make sure that the product would work for any company, regardless of employees’ technical expertise.

“Most companies can’t afford to hire teams of 20 people to drive data projects and wire together multiple systems. There aren’t even enough skilled people out there to do all that work,” said Daniel Yu, director of product marketing for Azure Data and Artificial Intelligence.

Making it easy for customers

Customers can bring together various sources of data into a single feed with Azure Synapse Analytics Studio, a console – or “single pane of glass” – that allows a business professional with minimal technical expertise to locate and collect data from multiple sources such as sales, supply chain, finance and product development. They can then choose how and where to store that data, and they can use it to create reports through Microsoft’s popular Power BI analytics service.

In a matter of hours, Azure Synapse will deliver useful business insights that used to take days or even weeks and months, said Rohan Kumar, corporate vice president for Azure Data.

“Let’s say an executive wants a detailed report on sales performance in the eastern U.S. over the last six months,” Kumar said. “Today, a data engineer has to do a lot of work to find where that data is stored and write a lot of brittle code to tie various services together. They might even have to bring in a systems integrator partner. With Azure Synapse, there’s no code required. It’s a much more intuitive experience.”

The complexity of the technical problems Azure Synapse addressed would be hard to overstate. Microsoft had to meld multiple independent components into one coherent form factor, while giving a wide range of people – from data scientists to line of business owners – their preferred tools for accessing and using data.

“With Azure Synapse, there’s no code required. It’s a much more intuitive experience.”

~ Rohan Kumar, corporate vice president for Azure Data

That includes products like SQL Server, the open source programming interface Apache Spark, Azure Data Factory and Azure Data Studio, as well as notebook interfaces preferred by many data professionals to clean and model data.

“Getting all those capabilities to come together fluidly, making it run faster, simpler, eliminating overlapping processes – there was some scary good stuff getting done,” Ramakrishnan said.

The result is a data analytics system that will be as easy to use as a modern mobile phone. Just as the smartphone replaced several devices by making all of their core capabilities intuitively accessible in a single device, the Azure Synapse “smartphone for data” now allows a data engineer to build an entire end-to-end data pipeline in one place. It also enables data scientists and analysts to look at the underlying data in ways that are natural to them.

And just as the phone has driven waves of collaboration and business innovation, Azure Synapse will free up individuals and companies to introduce new products and services as quickly as they can dream them up, Microsoft said.

“If we can help different people view data through a lens that is natural to them, while it’s also visible to others in ways natural to them, then we will transform the way companies work,” Ramakrishnan said. “That’s how we should measure our success.”

Top photo: Rohan Kumar, corporate vice president for Azure Data, says Azure Synapse will deliver useful business insights that used to take days or even weeks and months. Photo by Scott Eklund/Red Box Pictures.


Go to Original Article
Author: Microsoft News Center

Announcing our new podcast: Artificial Intelligence in Education | Microsoft EDU

Dan Bowen and Ray Fleming, from our Microsoft Australia Education team, have put their voices together to create a new podcast series, the Artificial Intelligence (AI) in Education Podcast. Over the coming weeks they’ll be talking about what artificial intelligence is, and what it could be used for in schools, colleges and universities. It isn’t a technical conversation, but is intended for educators who want to learn more about this often-discussed technology wave.

With “the robots are coming to steal our (students’) jobs!” being a hot topic at many education conferences (and many business conferences too), Dan and Ray keep their feet on the ground, talking about what’s really happening and explaining what the technical language means in plain English. As the series progresses, Ray and Dan will talk about scenarios more specific to their areas of expertise – Dan is a schools specialist, and Ray is a specialist in higher education. Along the way they will also interview specialists in other areas – such as personalising learning and accessibility – and discuss how artificial intelligence is intersecting with them and bringing innovation.

Your podcast hosts

  • Ray Fleming is the Higher Education Director for Microsoft Australia, and has spent his career working within the education ICT industry. From working with tertiary education organisations at global and national level, Ray brings insights into the rapid pace of change being seen as digital disruption occurs in other industries, and what might happen next in Australia’s universities.
    In the past Ray’s been an award-winning writer, and columnist for the Times Higher Education. He’s also a regular speaker at higher education conferences in Australia on data-led decision making, the implications of AI for the higher education sector and today’s students, and on digital transformation of industries.
  • Dan is a Technology Strategist working with schools and school systems. He has worked in education as a teacher, governor and school inspector. He also worked in higher education as a blended learning advisor, before moving into Microsoft where he was the product manager for Office 365 in Education, STEM education lead and managed the Minecraft portfolio for Australia (to the delight of his kids). In his current role he is working across Azure, Windows and devices and looking to support schools to drive educational transformation. His interest in Artificial Intelligence comes from both the use cases and implementation as well as the education and enablement of IT to drive this technology. He is interested in the social and ethical uses of AI in Education.

You can find the podcast on Spotify, Apple Podcasts and Google Podcasts (search for “AI in Education” in your normal podcast app), or listen directly on the podcast website at http://aipodcast.education


Go to Original Article
Author: Microsoft News Center

Behind the Design: Surface Headphones

Meet the Surface design team who built our first smart headphones

Vivian Nguyen

It’s 2 PM, and you need to finish a project by end of day. Coworkers in your open office space are chatting about the newest ramen spot. While you’d like to expound on the difference between Hokkaido- versus Tokyo-style noodles, you need to focus. You put on your Surface Headphones, and Cortana greets you with a quick update:

“Hi, you’ve got about 8 hours of battery left. You’re connected to [your device name].”

As your favorite Odesza song plays, you start work immediately.


Composed with the design trinity of audio quality, comfort, and seamless integration, the Surface Headphones help you create a personal space in the modern workplace and in your modern life. The idea is that, when you wear them, you escape into another world that lets you focus on yourself and what you need to get done. The Surface design team wanted to give you — the actual user — control over how plugged in (or not!) you want to be to your immediate environment. Check out the tech specs here.

And you can see that thoughtful approach in the hardware. Designing comfortable earmuffs was paramount because they’re the one part that touches you all the time. The team initially considered a traditional donut shape, but with inclusive design at the heart of everything Microsoft does, they wanted to accommodate a diverse set of ear shapes. The earmuffs now fit ears of all shapes and sizes comfortably, with pressure distributed evenly for a secure fit.

Tactile design wasn’t the only consideration. They set out to craft a device that’s both functional and beautiful. Creating a smooth seam on the earmuff, for example, was surprisingly difficult. See how the team wouldn’t take no for an answer in the video below:

See how the Surface design team wove together elegant hardware design, rich audio, and an intelligent assistant. Click here for the audio description version.

Every decision about the Surface Headphones keeps real people in mind — including the writing they don’t see or touch.

To create a holistic and seamless voice experience, Senior Writer Matt Lichtenberg, who focuses on hardware, and Senior Personality Lead for AI Chris O’Connor, who shapes the voice for intelligent experiences, fused their complementary skills. Because Cortana delivers the instructions, Matt and Chris needed to collaborate and bring together the what (instructions) and the how (Cortana).

“Words contribute to the whole experience,” said Matt, “and we wanted the headphones to be almost invisible to people while they’re wearing them. They shouldn’t have to think about them much.”

“I like to think of it as, we’re helping people achieve more with less energy,” said Chris. “How do they get the most out of this device with the least amount of effort? It’s the idea that design stays out of your way — it’s minimal and there to help you get stuff done.”


From the outset, the design team wanted to understand how people naturally use headphones in a variety of vignettes. They developed a series of scenarios to answer key questions about how people interact with the headphones.

For instance, when customers initially turn on the headphones, would they want to pair and go? Or would they download the Cortana app first?

As it turns out, most want to pair and go.

Most Bluetooth devices need to be put into pairing mode when you turn them on for the first time. The Surface Headphones, by contrast, start in pairing mode, and Cortana greets you with, “Hello, you’re ready to pair.”

You connect your device, and Cortana confirms with, “You’re paired to [device name].”

“It’s a challenge to create a rich and enjoyable out-of-the-box experience,” said Chris. “If it’s boring and tedious, people blow right through it. But if it’s enjoyable and people understand the value, they’ll reach an optimal state before carrying on.”

Design is an iterative process, and we’re constantly listening to feedback. We’ve heard customers ask for more device control to turn settings on or off, including the “Hey, Cortana” voice activation, touch controls, and voice prompts. So, we delivered.

The latest firmware update on the Cortana app can help you personalize your headphone settings, like reducing the number or duration of voice prompts. That means you can change your settings so a simple “Hello” plays when you initially turn on your headphones. The app gives you more control of your device, ensuring you get the best experience possible.

“It’s amazing how long it feels to say a few words, so you need to make them count,” said Matt.

Unlike computers, which require constant interaction, the Surface Headphones almost disappear into the background while you work, helping you focus while eliminating outside distractions. To help people achieve this, the voice writing team designed the voice prompts to avoid interruptions unless they’re critical, like letting you know when your battery is low.

“How do you thread the needle between being a voice prompt, a robot, and a conversational entity, but still get out of the way?” asked Chris. “This was one of the first areas where we had to practice design differently and pull back on personality to allow things to be shorter and faster.”


Some interactions don’t even need words.

When the headphones are charging, for example, the LED light flashes. In this context, a visual cue is more intuitive. You don’t need to pick them up or put them on to know what’s happening.

In times when words feel unnatural, sound itself can communicate information. When you turn the left dial on the Surface Headphones forward, you hear a low-pitched beep to indicate maximum noise cancellation. Conversely, a high-pitched beep plays when you turn the dial in the opposite direction. This confirms the headphones are now amplifying ambient sound.

Inspired by the volume knobs of hi-fi stereos, which turn with a certain slowness, the hardware design team added headset dials to adjust volume, noise cancellation, or sound amplification. Rotating the dial is an intuitive motion that lets people choose the precise level of sound they want (or don’t want).

Our design anticipates different modes of communication contingent on how someone wants to use or interact with the headphones. But whether it’s audio or visual, each interaction remains succinct.


The Surface Headphones are the first ambient device from Microsoft with an assistant. The Surface design team had a groundbreaking opportunity to radically reimagine headphones as more than just headphones.

In the past, people often confused or conflated digital assistants with voice control. But with increased investments in personality design and the future of interaction, Microsoft is experimenting with giving Cortana added dimension and awareness to help customers get the most out of a digital assistant.

“We decided to use the human metaphor for a digital assistant, because a real-life assistant isn’t just voice control. They don’t just take dictation. They understand what’s important to you, your family, your priorities, your goals,” explained Chris.

As we continue to infuse intelligence across our products and services, teams throughout the company are beginning to explore the potential for what a digital assistant could be.

“The headphones sparked a whole new area of thinking — one that we’re using to think through the same problem from other endpoints as we move on to work for the Office 365 apps,” said Chris.

And who knows? Maybe one day, when you slip on your Surface Headphones, Cortana can chime in with her favorite kind of ramen, too.

Go to Original Article
Author: Microsoft News Center

New VR Garage project Microgravity Lab takes students to space – Microsoft Garage

Virtual reality can transport us to new lands that are near, far, or imagined. As a team of Garage interns found while partnering with the Microsoft Hacking STEM and NASA STEM on Station teams, it can also demonstrate physics concepts and spark an interest in STEM careers. For the back-to-school season, we’re excited to announce the opportunity to try Microgravity Lab, a Microsoft Garage project. The VR experience for Windows Mixed Reality and corresponding lesson plan equip teachers with an engaging tool for teaching physics concepts by simulating microgravity. Interested educators can request an invite to try the VR application and corresponding lesson plans. Be sure to include your school name and your plan for using the application in the form.

Bringing space into the classroom via Windows Mixed Reality

The Garage Internship is a unique, startup-style program in which teams of interns build projects in response to challenges pitched by Microsoft engineering teams. When this Vancouver intern team heard that the Microsoft Education team was looking for a creative new way to illustrate the concept of microgravity through VR, they jumped at the opportunity to work on the project.

Microgravity Lab title screen, displaying five different experiences, settings, and other options.

An often-misunderstood concept, microgravity is difficult to simulate and understand in Earth’s gravity-laden environment. It is best explained through experiential learning. The Microgravity Lab VR experience for Windows Mixed Reality and its accompanying lessons give teachers the tools to bring this experiential learning to their students.

As NASA Education Specialist Matthew E. Wallace shared, “The concept of microgravity is often misunderstood by students who learn about astronauts on the International Space Station. Providing a virtual reality world for them to explore the phenomena of life on orbit is an excellent way to engage students and solidify their comprehension of concepts related to force, mass and gravitational acceleration.”

Sabrina Ng, Design Intern for the project noted, “When I think of microgravity, I think of it as something you feel, not what you see per se. Thinking about how to visualize and communicate such an abstract concept without stimulating the physical senses was a really cool challenge.”

Microgravity Lab joins a collection of eight middle school lesson plans developed in partnership with NASA to celebrate 20 years of humans living in and working on the International Space Station.

Experiencing microgravity to understand Newton’s 2nd & 3rd Law

Microgravity Lab is designed for grades 6-8. Students can explore three VR modules to understand these physics principles in the context of microgravity on the moon:

  • Conservation of momentum
  • Newton’s 2nd Law
  • Newton’s 3rd Law

The team worked closely with teachers to develop the project, testing early versions of Microgravity Lab with 7th and 8th grade classes. They refined and updated the experience based on the classroom feedback.

Implementing feedback from teachers and students, the interns added a feature to enable live Microgravity data analysis via Excel. “This project gives students the experience and the fun aspects of VR, but with Excel, we found a way to expose them to Data Analysis. Data is a very important part of our world and this is a great way to introduce it to them,” shared Rébecca Vézina-Côté, the Program Manager Intern for Microgravity Lab.


Hacking STEM to engage students

Microgravity Lab joins the Hacking STEM portfolio, a collection of hands-on, inquiry-driven, real-world lesson plans created by teachers for teachers. The standards-aligned, interdisciplinary lessons teach 21st-century technical skills in the context of existing curricula. The portfolio now includes 22 middle and high school lesson plans on topics ranging from circuits and robotic hands to how sharks swim, and now, microgravity.

“There are companies moving towards commercializing space travel and package delivery, a project like this might give students an idea of what life might be like on a space station, and hopefully inspire them to want to go further with it and see it as a future path for them as an area of interest or a future career,” shared Adrian Pang, a Software Engineer Intern with the project.

The Microgravity Lab experience makes science more engaging and introduces these concepts to students in a way that inspires lifelong learning and passionate curiosity about the world around them.

The impact of VR in the classroom

Microgravity Lab team photo

The Microsoft Education team has provided materials to enable a seamless introduction of VR to the classroom. When immersive technologies are deployed correctly and in a pedagogically consistent manner, they have the potential to support and expand curriculum, enhancing learning outcomes in ways that haven’t previously been affordable or scalable. Read more in this white paper detailing the impact of VR in the classroom.

Based on their own experience learning VR and Windows Mixed Reality, Garage interns have suggestions on how teachers can get started with VR. “Windows Mixed Reality does a great job of walking users through setting up the headset, then it’s just finding the app on the Microsoft Store, downloading it and installing it,” shared Rébecca. Crystal Song, another Software Engineering Intern continues, “I’d encourage teachers and school administrators to not see the tech as just a toy, but something that can teach. VR has a unique ability to teach through discovery, so allowing space and time for students to explore is key.”

James Burke, a longtime Hacking STEM developer partner who worked with the interns to test the project, encourages fellow educators to think outside the box to engage and challenge students. “Kids can do a lot more than people give them credit for.” In Burke’s engineering lab at Tyee Middle School, students work on project-based learning modules that can resemble college-level multidisciplinary assignments. With future-ready equipment and real-world projects to tackle, his award-winning classroom engages with students at every level. VR is just another way to spark that passion in students.

Request an invitation to try the project

To get started with Microgravity Lab for your classroom, request an invite to try the VR application. Include your school name and your plan for using the application in the form.

More lesson plans and classroom materials are available at the Hacking STEM website.

Go to Original Article
Author: Microsoft News Center

How to Create and Manage Hot/Cold Tiered Storage

When I was working in Microsoft’s File Services team around 2010, one of the primary goals of the organization was to commoditize storage and make it more affordable to enterprises. Legacy storage vendors offered expensive products, often consuming a majority of the IT department’s budget, and they were slow to make improvements because customers were locked in. Since then, every release of Windows Server has included storage management features which were previously only provided by storage vendors, such as deduplication, replication, and mirroring. These features can be used to manage commodity storage arrays and disks, reducing costs and eliminating vendor lock-in. Windows Server now offers a much-requested feature: the ability to move files between different tiers of “hot” (fast) storage and “cold” (slow) storage.

Managing hot/cold storage is conceptually similar to a computer’s memory cache, but at enterprise scale. Files which are frequently accessed are kept on the hot storage, such as faster SSDs, while files which are infrequently accessed are pushed to cold storage, such as older or cheaper disks. These lower-priority files also take advantage of file compression techniques like data deduplication to maximize storage capacity and minimize cost. Identical or varying disk types can be used because the storage is managed as a pool with Windows Server’s Storage Spaces, so you do not need to worry about managing individual drives. File placement is controlled by the Resilient File System (ReFS), which rotates data between the “hot” and “cold” storage tiers in real time based on usage. However, tiered storage is only recommended for workloads where much of the data is not regularly accessed; if you have permanently running VMs or you are actively using all the files on a given disk, there would be little benefit in allocating some of the disk to cold storage. This blog post will review the key components required to deploy tiered storage in your datacenter.
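As an illustration of the building blocks described above, a pool and its two tiers can be defined with the Storage Spaces cmdlets. This is a minimal sketch for a standalone server; the pool name, tier names, and media types here are illustrative assumptions, not required values:

```powershell
# Gather all local disks that are eligible for pooling
$disks = Get-PhysicalDisk -CanPool $true

# Create a single pool from those disks (pool name is illustrative)
New-StoragePool -FriendlyName "TieredPool" `
    -StorageSubSystemFriendlyName (Get-StorageSubSystem).FriendlyName `
    -PhysicalDisks $disks

# Define a fast "hot" tier on the SSDs and a "cold" capacity tier on the HDDs
New-StorageTier -StoragePoolFriendlyName "TieredPool" -FriendlyName "Performance" -MediaType SSD
New-StorageTier -StoragePoolFriendlyName "TieredPool" -FriendlyName "Capacity" -MediaType HDD
```

Once a tiered volume is created on the pool, the file system handles rotation between the tiers automatically.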

Overview of Resilient File System (ReFS) with Storage Tiering

The Resilient File System was first introduced in Windows Server 2012 with support for limited scenarios, and it has been greatly enhanced through the Windows Server 2019 release. It was designed to be efficient, support multiple workloads, avoid corruption and maximize data availability. More specific to tiering, ReFS automatically divides the pool of storage into two tiers: one for high-speed performance and one for maximizing storage capacity. The performance tier receives all writes on the faster disks for better performance. If those new blocks of data are not frequently accessed, the files are gradually moved to the capacity tier. Reads usually happen from the capacity tier, but can also happen from the performance tier as needed.

Storage Spaces Direct and Mirror-Accelerated Parity

Storage Spaces Direct (S2D) is one of Microsoft’s enhancements designed to reduce costs by allowing servers with Direct Attached Storage (DAS) drives to support Windows Server Failover Clustering. Previously, highly-available file server clusters required some type of shared storage on a SAN or used an SMB file share, but S2D allows for small local clusters which can mirror the data between nodes. Check out Altaro’s blog on Storage Spaces Direct for in-depth coverage on this technology.

With Windows Server 2016 and 2019, S2D offers mirror-accelerated parity, which is used for tiered storage; it is generally recommended for backups and less frequently accessed files rather than heavy production workloads such as VMs. To use tiered storage with ReFS, you will use mirror-accelerated parity. This provides good storage capacity by combining mirroring with parity to help prevent and recover from data loss. In the past, mirroring and parity would conflict, and you usually had to select one or the other. Mirror-accelerated parity works with ReFS by taking writes and mirroring them (hot storage), then using parity to optimize their storage on disk (cold storage). By switching between these storage optimization techniques, ReFS gives admins the best of both worlds.

Creating Hot and Cold Tiered Storage

When configuring hot and cold storage, you define the ratio between the two tiers. For most workloads, Microsoft recommends allocating 20% to hot and 80% to cold. If you are running high-performance workloads, consider allocating more hot storage to support more writes. On the flip side, if you have a lot of archival files, allocate more cold storage. Remember that with a storage pool you can combine multiple disk types under the same abstracted storage space. The following PowerShell cmdlets show you how to configure a 1,000 GB disk to use 20% (200 GB) for performance (hot storage) and 80% (800 GB) for capacity (cold storage).
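The original code sample was not preserved here; the following is a minimal sketch of the documented `New-Volume` pattern, assuming an S2D pool exposing the default “Performance” and “Capacity” tier templates, and using a hypothetical volume name:

```powershell
# Create a 1,000 GB mirror-accelerated parity volume on the S2D pool:
# 200 GB mirrored (hot/performance) and 800 GB parity (cold/capacity).
New-Volume -StoragePoolFriendlyName "S2D*" `
    -FriendlyName "TieredVolume" `
    -FileSystem CSVFS_ReFS `
    -StorageTierFriendlyNames Performance, Capacity `
    -StorageTierSizes 200GB, 800GB
```

The tier sizes are absolute values, so the 20:80 ratio is simply expressed as 200 GB and 800 GB of the 1,000 GB total.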

Managing Hot and Cold Tiered Storage

If you want to increase the performance of your disk, allocate a greater percentage of the disk to the performance (hot) tier. In the following example, we use PowerShell cmdlets to create a 30:70 ratio between the tiers:
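Again, the original cmdlets were not preserved; a sketch using `Resize-StorageTier`, where the per-volume tier friendly names (“TieredVolume_Performance”, “TieredVolume_Capacity”) are assumptions you should confirm against your own pool:

```powershell
# List the volume's tiers to confirm their actual friendly names and sizes.
Get-VirtualDisk -FriendlyName "TieredVolume" |
    Get-StorageTier | Select-Object FriendlyName, Size

# Shift the split on the 1,000 GB volume from 20:80 to 30:70
# (300 GB hot, 700 GB cold).
Get-StorageTier -FriendlyName "TieredVolume_Performance" | Resize-StorageTier -Size 300GB
Get-StorageTier -FriendlyName "TieredVolume_Capacity"    | Resize-StorageTier -Size 700GB
```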

Unfortunately, this resizing only changes the ratio between the tiers; it does not change the size of the partition or volume, so you will likely also want to adjust those with the Resize-Partition cmdlet.

Optimizing Hot and Cold Storage

Based on the types of workloads you are running, you may wish to further tune when data is moved between hot and cold storage, known as the “aggressiveness” of the rotation. By default, the hot storage waits until it is 85% full before it begins to send data to the cold storage. If a lot of write traffic goes to the hot storage, reduce this value so that performance-tier data gets pushed to cold storage sooner. If you have fewer write requests and want to keep data in hot storage longer, increase it. Since this is an advanced configuration option, it must be configured via the registry on every node in the S2D cluster, and the change requires a restart. Here is a sample script to run on each node if you want to change the aggressiveness so that data is rotated when the performance tier reaches 70% capacity:
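The original script was not preserved; a sketch based on the registry value Microsoft documents for mirror-accelerated parity rotation:

```powershell
# Lower the rotation threshold so ReFS begins destaging data from the
# performance tier to the capacity tier at 70% full (default is 85%).
# A restart of the node is required for the change to take effect.
Set-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Policies" `
    -Name DataDestageSsdFillRatioThreshold `
    -Value 70 -Type DWord
```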

You can apply this setting cluster-wide by using the following cmdlet:
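The original cmdlet was not preserved; one way to fan the same registry change out to every node is to wrap it in `Invoke-Command`, as sketched below (the threshold value and registry name match the per-node script above):

```powershell
# Push the rotation-aggressiveness setting to every node in the cluster.
# Remember to reboot the nodes one at a time afterwards.
Get-ClusterNode | ForEach-Object {
    Invoke-Command -ComputerName $_.Name -ScriptBlock {
        Set-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Policies" `
            -Name DataDestageSsdFillRatioThreshold `
            -Value 70 -Type DWord
    }
}
```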

NOTE: If this is applied to an active cluster, make sure that you reboot one node at a time to maintain service availability.


Now you should be fully equipped with the knowledge to optimize your commodity storage using the latest Windows Server storage management features. You can pool your disks with Storage Spaces, use Storage Spaces Direct (S2D) to eliminate the need for a SAN, and use ReFS to optimize the performance and capacity of those drives. By understanding the tradeoffs between performance and capacity, your organization can save significantly on storage management and hardware costs. Windows Server has made it easy to centralize and optimize your storage so you can reallocate your budget to a new project – or to your wages!

What about you? Have you tried any of the features listed in the article? Have they worked well for you? Have they not worked well? Why or why not? Let us know in the comments section below!

Go to Original Article
Author: Symon Perriman

Built by Garage Interns, find the best movie, powered by the Microsoft Recommenders collection – Microsoft Garage

Challenged with rethinking how to build a movie recommendation experience, a team of Garage interns based in Cambridge, MA created a sample app and corresponding documentation that shows how to use recommendation algorithms in an app experience. Today, we’re excited to share their project ahead of its debut at the RecSys’19 conference next week: Recommenders Engine Example Layout, a Microsoft Garage project. The work joins a larger collection of best practices and tools for recommendation engines; explore both on the Recommenders GitHub repository and the Recommenders Engine Example Layout GitHub repository, respectively.

Bringing recommendation tools to apps

Recommenders Engine Example Layout focuses on recommendation algorithm experiences that take place in apps and provides a detailed breakdown of how developers can leverage the work of the sponsoring team. The Azure AI Customer Advisory Team, or AzureCAT AI, works with customers such as ASOS to incorporate enhanced recommender algorithms into their solutions. The team was inspired to partner with a team of Garage interns after continued feedback that expanding their popular Recommenders repository with a focus on apps would be helpful to customers who already have an app infrastructure.

“The key thing we wanted to demonstrate was showing the recommenders we have in a real-world setting that’s relevant to apps,” shares Scott Graham, the Senior Data Scientist on Azure AI CAT who oversaw the project. “Often when we work with customers, they already have complex infrastructure and want to see how they can incorporate these algorithms into an app. This was a great opportunity to illustrate and document this.”

The Garage project documents how to build a sample app powered by the Recommenders algorithms, featuring the MovieLens dataset, one of the largest open source collections of movie ratings. As Bruce Gatete, Program Manager Intern for the project, put it: “It provides an end-to-end demonstration of how developers can build fully functional cross-platform applications that use these algorithms.”

Sample app key features

The sample app that developers can build includes a wide variety of features:

  • Browse a large dataset of movies
  • Select and view movie descriptions
  • Create your own personalized favorites list
  • Switch between different recommender algorithms
  • Switch between pre-generated personas or create your own

The Azure AI CAT team also has an ongoing goal of accelerating the speed with which they’re able to partner with customers on this solution. Scott continues, “This was a great proof point that we could do this in the space of a summer timeframe: can we use these algorithms to quickly put together an app? And we did!”

In addition to trying this project, developers can explore the original Recommenders repository which has recommendation algorithm tools and best practices such as a popular deep dive into the SAR model or an example of deploying these models in a production setting.

Built using Xamarin.Forms

The Recommenders Engine app is built using Xamarin.Forms and supports iOS, Android, and UWP platforms. “It was really great leveraging Xamarin Forms to be able to deploy this across all these platforms so quickly. I was really impressed with the speed of that coming together,” shared Scott Graham, who oversaw the project from the sponsoring team.

Try it Out

Recommenders Engine Example Layout and the Recommenders collection are available on GitHub worldwide. Try them out, create your own apps and experiences, and share feedback with the team.

Become a Garage Intern. We’re hiring

The Garage is hiring for the 2020 Winter & Summer seasons! Here you can learn more details about the internship and how to apply.

Why become a Garage intern? The Garage opens doors to interesting and challenging projects and collaborative partners. Michelle shares her favorite part about the internship: “I enjoyed getting to know my team very well over the summer – there’s an incredible amount of talent on our team, and it’s awesome getting to work with a team of people my age and truly have a say in the design and final result of our product.”

Check out past Garage internship projects such as: Seeing AI, Web TS, Ink to Code, or Earth Lens.

Go to Original Article
Author: Microsoft News Center

Google delays Hangouts Chat migration until mid-2020

Google delayed plans to force G Suite customers to begin using the team messaging app Hangouts Chat after many businesses objected to the proposed timeline.

Google had planned to begin switching businesses from an older messaging app to Hangouts Chat in October, but the company now says it will compel users to change no sooner than June 2020.

The debacle underscores how difficult it can be for software vendors to convince users to abandon technology they have grown accustomed to, even if a newer app offers more advanced features than the previous version.

The tech giant launched Hangouts Chat in early 2018 in response to the growing popularity of apps like Slack. For Google’s business customers, the newer app will replace Classic Hangouts, a messaging service that supports basic one-to-one and group chats.  

Google designed Hangouts Chat to keep pace with a trend towards more robust, team-centered messaging apps that integrate with business workflows. The product is Google’s counter to Microsoft Teams, a messaging and meetings app included with Office 365.

Microsoft dominated the traditional business productivity market for decades, but Google has been attempting to steal some of those customers as they move to the cloud.

Google provides Hangouts Chat to businesses subscribed to G Suite, a cloud-based bundle of email, calendar, word-processing and document-storage apps that competes with Office 365. G Suite also includes the video conferencing app Hangouts Meet.

In a blog post, Google said many of its customers had requested more time to prepare users for the switch from Classic Hangouts to Hangouts Chat. Businesses wishing to make the switch today can apply for an invitation to an accelerated transition program.

Google has yet to deliver on some critical features for Hangouts Chat. For example, users can’t chat with external parties or from within Gmail, both of which are possible through Classic Hangouts. Those features, now in beta, are scheduled to launch during the first half of 2020.

Google will continue to support Classic Hangouts for consumers, although the company previously said it would eventually move those users to free versions of Hangouts Chat and Hangouts Meet. Google has not provided a timeline for that change.

Microsoft has faced similar pushback from customers in its campaign to migrate users from Skype for Business to Teams. Microsoft is giving those using Skype for Business Online, a cloud-based version of the app included with Office 365, until July 2021 to adopt Teams.

Google has taken a slightly different approach to enterprise collaboration than Microsoft. It maintains separate apps for calling, messaging and meetings — capabilities Microsoft has chosen to combine in Teams.

Google appears to be grappling with how to respond to a trend away from email.

Research shows that adoption of apps like Slack and Teams results in workers sending fewer emails, said Irwin Lazar, analyst at Nemertes Research. Google, however, has invested heavily in Gmail and continues to roll out new AI technologies to improve the service.

“Gmail is still bread and butter for them, but [email is] just not the ideal way to collaborate,” Lazar said. At some point, he said, Google may need to commit to making Hangouts Chat its primary interface for enterprise collaboration. “That’s a big leap for them to make.”

Go to Original Article

Chief transformation officer takes digital one step further

There’s a new player on the block when it comes to the team leading digital efforts within a healthcare organization.

Peter Fleischut, M.D., has spent the last two years leading telemedicine, robotics and robotic process automation and artificial intelligence efforts at New York-Presbyterian as its chief transformation officer, a relatively new title that is beginning to take form right alongside the chief digital officer.

Fleischut works as part of the organization’s innovation team under New York-Presbyterian CIO Daniel Barchi. Formerly the chief innovation officer for New York-Presbyterian, Fleischut described his role as improving care delivery and providing a better digital experience.

“I feel like we’re past the age of innovating. Now it’s really about transforming our care model,” he said.

What is a chief transformation officer?

The chief transformation officer is “larger than a technology or digital role alone,” according to Barchi.

Indeed, Laura Craft, analyst at Gartner, said she’s seeing healthcare organizations use the title more frequently to indicate a wider scope than, say, the chief digital officer.

The chief digital officer, a title that emerged more than five years ago, is often described as taking an organization from analog to digital. The digital officer role is still making inroads in healthcare today. Kaiser Permanente recently named Prat Vemana as its first chief digital officer for the Kaiser Foundation Health Plan and Hospitals. In the newly created role, Vemana is tasked with leading Kaiser Permanente’s digital strategy in collaboration with internal health plan and hospital teams, according to a news release.

A chief transformation officer, however, often focuses not just on digital but also emerging tech, such as AI, to reimagine how an organization does business.

“It has a real imperative to change the way [healthcare] is operating and doing business, and healthcare organizations are struggling with that,” Craft said. 

Barchi, who has been CIO at New York-Presbyterian for four years, said the role of chief transformation officer was developed by the nonprofit academic medical center to “take technology to the next level” and scale some of the digital programs it had started. The organization sought to improve not only back office functions but to advance the way it operates digitally when it comes to the patient experience, from hospital check-in to check-out.

I feel like we’re past the age of innovating. Now it’s really about transforming our care model.
Peter Fleischut, M.D., chief transformation officer, New York-Presbyterian

Fleischut was selected for the role due to his background as a clinician, as well as the organization’s former chief innovation officer. He has been in the role for two years and is charged with further developing and scaling New York-Presbyterian’s AI, robotics and telemedicine programs.

The organization, which has four major divisions and comprises 10 hospitals, invested deeply in its telemedicine efforts and built a suite of services about four years ago. In 2016, it completed roughly 1,000 synchronous video visits between providers and patients. Now, the organization expects to complete between 500,000 and 1,000,000 video visits by the end of 2019, Fleischut said during his talk at the recent mHealth & Telehealth World Summit in Boston.

One of the areas where New York-Presbyterian expanded its telemedicine services under Fleischut’s lead was in emergency rooms, offering low-acuity patients the option of seeing a doctor virtually instead of in-person, which shortened patient discharge times from an average baseline of two and a half hours to 31 minutes.

The healthcare organization has also expanded its telemedicine services to kiosks set up in local Walgreens, and has a mobile stroke unit operating out of three ambulances. Stroke victims are treated in the ambulance virtually by an on-call neurologist.  

“At the end of the day with innovation and transformation, it’s all about speed, it’s all about time, and that’s what this is about,” Fleischut said. “How to leverage telemedicine to provide faster, quicker, better care to our patients.”

Transforming care delivery, hospital operations  

Telemedicine is one example of how New York-Presbyterian is transforming the way it interacts with patients. Indeed, that’s one of Fleischut’s main goals — to streamline the patient experience digitally through tools like telemedicine, Barchi said.

“The way you reach patients is using technology to be part of their lives,” Barchi said. “So Pete, in his role, is really important because we wanted someone focused on that patient experience and using things like telemedicine to make the patient journey seamless.” 

But for Fleischut to build a better patient experience, he also has to transform the way the hospital operates digitally, another one of his major goals.

As an academic medical center, Barchi said the organization invests significantly in advanced, innovative technology, including robotics. Barchi said he works with one large budget to fund innovation, information security and electronic medical records.

One hospital operation Fleischut worked to automate using robotics was food delivery. Instead of having hospital employees deliver meals to patients, New York-Presbyterian now uses large robots loaded with food trays that are programmed to deliver patient meals.

Fleischut’s work, Barchi said, will continue to focus on innovative technologies transforming the way New York-Presbyterian operates and delivers care.

“Pete’s skills from being a physician with years of experience, as well as his knowledge of technology, allow him to be truly transformative,” Barchi said.

In his role as chief transformation officer, Fleischut said he considers people and processes the most important part of the transformation journey. Without having the right processes in place for changing care delivery and without provider buy-in, the effort will not be a success, he said.

“Focusing on the people and the process leads to greater adoption of technologies that, frankly, have been beneficial in other industries,” he said.

Go to Original Article