Tag Archives: team

How Microsoft re-envisioned the data warehouse with Azure Synapse Analytics

About four years ago, the Microsoft Azure team began to notice a big problem troubling many of its customers. A mass migration to the cloud was in full swing, as enterprises signed up by the thousands to reap the benefits of flexible, large-scale computing and data storage. But the next iteration of that tech revolution, in which companies would use their growing stores of data to get more tangible business benefits, had stalled.

Technology providers, including Microsoft, have built a variety of systems to collect, retrieve and analyze enormous troves of information that would uncover market trends and insights, paving the way toward a new era of improved customer service, innovation and efficiency.

But those systems were built independently by different engineering teams and sold as individual products and services. They weren’t designed to connect with one another, and customers would have to learn how to operate them separately, wasting time, money and precious IT talent.

“Instead of trying to add more features to each of our services, we decided to take a step back and figure out how to bring their core capabilities together to make it easy for customers to collect and analyze all of their increasingly diverse data, to break down data silos and work together more collaboratively,” said Raghu Ramakrishnan, Microsoft’s chief technology officer for data.

At its Ignite conference this week in Orlando, Florida, Microsoft announced the end result of a yearslong effort to address the problem: Azure Synapse Analytics, a new service that merges the capabilities of Azure SQL Data Warehouse with new enhancements such as on-demand query as a service.

Microsoft said this new offering will help customers put their data to work much more quickly, productively and securely by pulling together insights from all data sources, data warehouses and big data analytics systems. And, the company said, with deeper integration between Power BI and Azure Machine Learning, Azure Synapse Analytics can reduce the time required to process and share that data, speeding up the insights that businesses can glean.

What’s more, it will allow many more businesses to take advantage of game-changing technologies like data analytics and artificial intelligence, which are helping scientists to better predict the weather, search engines to better understand people’s intent and workers to more easily handle mundane tasks.

This newest effort to break down data silos also builds on other Microsoft projects, such as the Open Data Initiative and Azure Data Share, which allows you to share data from multiple sources and even other organizations.

Microsoft said Azure Synapse Analytics is also designed to support the increasingly popular DevOps strategy, in which development and operations staff collaborate more closely to create and implement services that work better throughout their lifecycles.


A learning process

Azure Synapse Analytics is the result of a lot of work, and a little trial and error.

At first, Ramakrishnan said, the team developed high-level guidelines showing customers how to glue the systems together themselves. But they quickly realized that was too much to ask.

“That required a lot of expertise in the nitty gritty of our platforms,” Ramakrishnan said. “Customers made it overwhelmingly clear that we needed to do better.”

So, the company went back to the drawing board and spent an additional two years revamping the heart of its data business, Azure SQL Data Warehouse, the cloud service customers use to store, manage and analyze large volumes of business data.

A breakthrough came when the company realized that customers need to analyze all their data in a single service, without having to copy terabytes of information across various systems to use different analytic capabilities – as has traditionally been the case with data warehouses and data lakes.

With the new offering, customers can use their data analytics engine of choice, such as Apache Spark or SQL, on all their data. That’s true whether it’s structured data, such as rows of numbers on spreadsheets, or unstructured data, such as a collection of social media posts.

This project was risky. It involved deep technical surgery: completely rewriting the guts of the SQL query processing engine to optimize it for the cloud and make it capable of instantly handling big bursts of work as well as very large and diverse datasets.

It also required unprecedented integration among several teams within Microsoft, some of whom would have to make hard choices. Established plans had to be scrapped. Resources earmarked for new features would be redirected to help make the entire system work better.

“In the beginning, the conversations were often heated. But as we got into the flow of it, they became easier. We began to come together,” Ramakrishnan said.

Microsoft also had to make sure that the product would work for any company, regardless of employees’ technical expertise.

“Most companies can’t afford to hire teams of 20 people to drive data projects and wire together multiple systems. There aren’t even enough skilled people out there to do all that work,” said Daniel Yu, director of product marketing for Azure Data and Artificial Intelligence.

Making it easy for customers

Customers can bring together various sources of data into a single feed with Azure Synapse Analytics Studio, a console – or single pane of glass – that will allow a business professional with minimal technical expertise to locate and collect data from multiple sources like sales, supply chain, finance and product development. They can then choose how and where to store that data, and they can use it to create reports through Microsoft’s popular Power BI analytics service.

In a matter of hours, Azure Synapse will deliver useful business insights that used to take days, weeks or even months, said Rohan Kumar, corporate vice president for Azure Data.

“Let’s say an executive wants a detailed report on sales performance in the eastern U.S. over the last six months,” Kumar said. “Today, a data engineer has to do a lot of work to find where that data is stored and write a lot of brittle code to tie various services together. They might even have to bring in a systems integrator partner. With Azure Synapse, there’s no code required. It’s a much more intuitive experience.”

The complexity of the technical problems Azure Synapse addressed would be hard to overstate. Microsoft had to meld multiple independent components into one coherent form factor, while giving a wide range of people – from data scientists to line of business owners – their preferred tools for accessing and using data.


“With Azure Synapse, there’s no code required. It’s a much more intuitive experience.”

~ Rohan Kumar, corporate vice president for Azure Data


That includes products like SQL Server, the open source programming interface Apache Spark, Azure Data Factory and Azure Data Studio, as well as notebook interfaces preferred by many data professionals to clean and model data.

“Getting all those capabilities to come together fluidly, making it run faster, simpler, eliminating overlapping processes – there was some scary good stuff getting done,” Ramakrishnan said.

The result is a data analytics system that will be as easy to use as a modern mobile phone. Just as the smartphone replaced several devices by making all of their core capabilities intuitively accessible in a single device, the Azure Synapse “smartphone for data” now allows a data engineer to build an entire end-to-end data pipeline in one place. It also enables data scientists and analysts to look at the underlying data in ways that are natural to them.

And just as the phone has driven waves of collaboration and business innovation, Azure Synapse will free up individuals and companies to introduce new products and services as quickly as they can dream them up, Microsoft said.

“If we can help different people view data through a lens that is natural to them, while it’s also visible to others in ways natural to them, then we will transform the way companies work,” Ramakrishnan said. “That’s how we should measure our success.”

Top photo: Rohan Kumar, corporate vice president for Azure Data, says Azure Synapse will deliver useful business insights that used to take days, weeks or even months. Photo by Scott Eklund/Red Box Pictures.


Author: Microsoft News Center

Announcing our new podcast: Artificial Intelligence in Education | | Microsoft EDU

Dan Bowen and Ray Fleming, from our Microsoft Australia Education team, have put their voices together to create a new podcast series, the Artificial Intelligence (AI) in Education Podcast. Over the coming weeks they’ll be talking about what artificial intelligence is, and what it could be used for in schools, colleges and universities. It isn’t a technical conversation, but is intended for educators who are interested in learning more about this often discussed technology wave.

With “the robots are coming to steal our (students’) jobs!” being a hot topic at many education conferences (and at many business conferences, too), Dan and Ray have their feet on the ground and spend their time talking about what’s really happening, and what the technical language means in plain English. As the podcast series progresses, Ray and Dan will talk about scenarios that are more specific to their areas of expertise – Dan’s a schools specialist, and Ray’s a specialist in higher education. Along the way they will also interview specialists in other areas – such as personalising learning and accessibility – to explore how artificial intelligence is intersecting with them and bringing innovation.

Your podcast hosts

  • Ray Fleming is the Higher Education Director for Microsoft Australia, and has spent his career working within the education ICT industry. From working with tertiary education organisations at global and national level, Ray brings insights into the rapid pace of change being seen as digital disruption occurs in other industries, and what might happen next in Australia’s universities.
    In the past Ray’s been an award-winning writer, and columnist for the Times Higher Education. He’s also a regular speaker at higher education conferences in Australia on data-led decision making, the implications of AI for the higher education sector and today’s students, and on digital transformation of industries.
  • Dan is a Technology Strategist working with schools and school systems. He has worked in education as a teacher, governor and school inspector. He also worked in higher education as a blended learning advisor, before moving to Microsoft, where he was the product manager for Office 365 in Education, the STEM education lead, and manager of the Minecraft portfolio for Australia (to the delight of his kids). In his current role he is working across Azure, Windows and devices, looking to support schools to drive educational transformation. His interest in artificial intelligence spans both use cases and implementation, as well as the education and enablement of IT to drive this technology. He is interested in the social and ethical uses of AI in education.

You can find the podcast on Spotify, Apple Podcasts and Google Podcasts – search in your normal podcast app for “AI in Education” – or listen directly on the podcast website at http://aipodcast.education


Author: Microsoft News Center

Behind the Design: Surface Headphones

Meet the Surface design team who built our first smart headphones

Vivian Nguyen

It’s 2 PM, and you need to finish a project by end of day. Coworkers in your open office space are chatting about the newest ramen spot. While you’d like to expound on the difference between Hokkaido- versus Tokyo-style noodles, you need to focus. You put on your Surface Headphones, and Cortana greets you with a quick update:

“Hi, you’ve got about 8 hours of battery left. You’re connected to [your device name].”

As your favorite Odesza song plays, you start work immediately.

YOUR WORKPLACE ACCOMPANIMENT

Composed with the design trinity of audio quality, comfort, and seamless integration, the Surface Headphones help you create a personal space in the modern workplace and in your modern life. The idea is that, when you wear them, you escape into another world that lets you focus on yourself and what you need to get done. The Surface design team wanted to give you — the actual user — control over how plugged in (or not!) you want to be to your immediate environment. Check out the tech specs here.

And you can see that thoughtful approach in the hardware. Designing comfortable earmuffs was paramount because it’s the one part that touches you all the time. The team initially considered a traditional donut shape, but with inclusive design at the heart of everything Microsoft does, they wanted to accommodate a diverse set of ear shapes. The earmuffs now fit ears of all shapes and sizes comfortably, with even pressure points for a secure fit.

Tactile design wasn’t the only consideration. They set out to craft a device that’s both functional and beautiful. Creating a smooth seam on the earmuff, for example, was surprisingly difficult. See how the team wouldn’t take no for an answer in the video below:

See how the Surface design team wove together elegant hardware design, rich audio, and an intelligent assistant. Click here for the audio description version.

Every decision about the Surface Headphones keeps real people in mind — including the writing they don’t see or touch.

To create a holistic and seamless voice experience, Senior Writer Matt Lichtenberg, who focuses on hardware, and Senior Personality Lead for AI Chris O’Connor, who shapes the voice for intelligent experiences, fused their complementary skills. Because Cortana delivers the instructions, Matt and Chris needed to collaborate and bring together the what (instructions) and the how (Cortana).

“Words contribute to the whole experience,” said Matt, “and we wanted the headphones to be almost invisible to people while they’re wearing them. They shouldn’t have to think about them much.”

“I like to think of it as, we’re helping people achieve more with less energy,” said Chris. “How do they get the most out of this device with the least amount of effort? It’s the idea that design stays out of your way — it’s minimal and there to help you get stuff done.”

THINKING OUT OF THE BOX

From the outset, the design team wanted to understand how people naturally use headphones in a variety of vignettes. They developed a series of scenarios to answer key questions about how people interacted with the headphones.

For instance, when customers initially turn on the headphones, would they want to pair and go? Or would they download the Cortana app first?

As it turns out, most want to pair and go.

When you turn on other Bluetooth devices for the first time, you’ll need to put the device in pairing mode. With the Surface Headphones, they’re immediately in pairing mode and Cortana greets you with, “Hello, you’re ready to pair.”

You connect your device, and Cortana confirms with, “You’re paired to [device name].”

“It’s a challenge to create a rich and enjoyable out-of-the-box experience,” said Chris. “If it’s boring and tedious, people blow right through it. But if it’s enjoyable and people understand the value, they’ll reach an optimal state before carrying on.”

Design is an iterative process, and we’re constantly listening to feedback. We’ve heard customers ask for more device control to turn settings on or off, including the “Hey, Cortana” voice activation, touch controls, and voice prompts. So, we delivered.

The latest firmware update on the Cortana app can help you personalize your headphone settings, like reducing the number or duration of voice prompts. That means you can change your settings so a simple “Hello” plays when you initially turn on your headphones. The app gives you more control of your device, ensuring you get the best experience possible.

“It’s amazing how long it feels to say a few words, so you need to make them count,” said Matt.

Unlike computers, which require constant interaction, the Surface Headphones almost disappear into the background while you work, helping you focus while eliminating outside distractions. To help people achieve this, the voice writing team designed the voice prompts to avoid interruptions unless they’re critical, like letting you know when your battery is low.

“How do you thread the needle between being a voice prompt, a robot, and a conversational entity, but still get out of the way?” asked Chris. “This was one of the first areas where we had to practice design differently and pull back on personality to allow things to be shorter and faster.”

COMMUNICATING WITHOUT WORDS

Some interactions don’t even need words.

When the headphones are charging, for example, the LED light flashes. In this context, a visual cue is more intuitive. You don’t need to pick them up or put them on to know what’s happening.

In times when words feel unnatural, sound itself can communicate information. When you turn the left dial on the Surface Headphones forward, you hear a low-pitched beep to indicate maximum noise cancellation. Conversely, a high-pitched beep plays when you turn the dial in the opposite direction. This confirms the headphones are now amplifying ambient sound.

Inspired by the volume knobs of hi-fi stereos, which turn with a certain slowness, the hardware design team added headset dials to adjust volume, noise cancellation, or sound amplification. Rotating the dial is an intuitive motion that lets people choose the precise level of sound they want (or don’t want).

Our design anticipates different modes of communication contingent on how someone wants to use or interact with the headphones. But whether it’s audio or visual, each interaction remains succinct.

THE NEXT MOVEMENT IN VOICE DESIGN

The Surface Headphones are the first ambient device from Microsoft with an assistant. The Surface design team had a groundbreaking opportunity to radically reimagine headphones as more than just headphones.

In the past, people often confused or conflated digital assistants with voice control. But with increased investments in personality design and the future of interaction, Microsoft is experimenting with giving Cortana added dimension and awareness to help customers get the most out of a digital assistant.

“We decided to use the human metaphor for a digital assistant, because a real-life assistant isn’t just voice control. They don’t just take dictation. They understand what’s important to you, your family, your priorities, your goals,” explained Chris.

As we continue to infuse intelligence across our products and services, teams throughout the company are beginning to explore the potential for what a digital assistant could be.

“The headphones sparked a whole new area of thinking — one that we’re using to think through the same problem from other endpoints as we move on to work for the Office 365 apps,” said Chris.

And who knows? Maybe one day, when you slip on your Surface Headphones, Cortana can chime in with her favorite kind of ramen, too.

Author: Microsoft News Center

New VR Garage project Microgravity Lab takes students to space – Microsoft Garage

Virtual reality can transport us to new lands that are near, far, or imagined. As a team of Garage interns found while partnering with the Microsoft Hacking STEM and NASA STEM on Station teams, it can also demonstrate physics concepts and spark an interest in STEM careers. For the back-to-school season, we’re excited to announce the opportunity to try Microgravity Lab, a Microsoft Garage project. The VR experience for Windows Mixed Reality and corresponding lesson plan equip teachers with an engaging tool for teaching physics concepts by simulating microgravity. Interested educators can request an invite to try the VR application and corresponding lesson plans. Be sure to include your school name and your plan for using the application in the form.

Bringing space into the classroom via Windows Mixed Reality

The Garage Internship is a unique, startup-style program in which teams of interns build projects in response to challenges pitched by Microsoft engineering teams. When this Vancouver intern team heard that the Microsoft Education team was looking for a creative new way to illustrate the concept of microgravity through VR, they jumped at the opportunity to work on the project.

Microgravity Lab title screen, displaying five different experiences, settings, and other options.

An often-misunderstood concept, microgravity is difficult to simulate and understand in Earth’s gravity-laden environment. It is best explained through experiential learning. The Microgravity Lab VR experience for Windows Mixed Reality and its accompanying lessons give teachers the tools to bring this experiential learning to their students.

As NASA Education Specialist Matthew E. Wallace shared, “The concept of microgravity is often misunderstood by students who learn about astronauts on the International Space Station. Providing a virtual reality world for them to explore the phenomena of life on orbit is an excellent way to engage students and solidify their comprehension of concepts related to force, mass and gravitational acceleration.”

Sabrina Ng, Design Intern for the project noted, “When I think of microgravity, I think of it as something you feel, not what you see per se. Thinking about how to visualize and communicate such an abstract concept without stimulating the physical senses was a really cool challenge.”

Microgravity Lab joins a collection of eight middle school lesson plans developed in partnership with NASA to celebrate 20 years of humans living in and working on the International Space Station.

Experiencing microgravity to understand Newton’s 2nd & 3rd Law

Microgravity Lab is designed for grades 6-8. Students can explore three VR modules to understand these physics principles in the context of microgravity on the moon:

  • Conservation of momentum
  • Newton’s 2nd Law
  • Newton’s 3rd Law

The team worked closely with teachers to develop the project, testing early versions of Microgravity Lab with 7th and 8th grade classes. They refined and updated the experience based on the classroom feedback.

Implementing feedback from teachers and students, the interns added a feature to enable live microgravity data analysis via Excel. “This project gives students the experience and the fun aspects of VR, but with Excel, we found a way to expose them to data analysis. Data is a very important part of our world and this is a great way to introduce it to them,” shared Rébecca Vézina-Côté, the Program Manager Intern for Microgravity Lab.


Hacking STEM to engage students

Microgravity Lab joins the Hacking STEM portfolio, which is created by teachers for teachers to offer hands-on, inquiry-driven, real-world lesson plans. The standards-aligned, interdisciplinary lessons teach 21st century technical skills in the context of existing curricula. The portfolio now includes 22 middle and high school lesson plans on topics ranging from circuits and robotic hands to learning how sharks swim, and now, microgravity.

“There are companies moving towards commercializing space travel and package delivery – a project like this might give students an idea of what life might be like on a space station, and hopefully inspire them to want to go further with it and see it as a future path for them as an area of interest or a future career,” shared Adrian Pang, a Software Engineer Intern with the project.

The Microgravity Lab experience makes science more engaging and introduces these concepts to students in a way that inspires lifelong learning and passionate curiosity about the world around them.

The impact of VR in the classroom

The Microsoft Education team has provided materials to enable a seamless introduction of VR to the classroom. When immersive technologies are deployed correctly and in a pedagogically consistent manner, they have the potential to support and expand curriculum, enhancing learning outcomes in ways that haven’t been previously affordable or scalable. Read more in this white paper detailing the impact of VR in the classroom.

Based on their own experience learning VR and Windows Mixed Reality, Garage interns have suggestions on how teachers can get started with VR. “Windows Mixed Reality does a great job of walking users through setting up the headset, then it’s just finding the app on the Microsoft Store, downloading it and installing it,” shared Rébecca. Crystal Song, another Software Engineering Intern continues, “I’d encourage teachers and school administrators to not see the tech as just a toy, but something that can teach. VR has a unique ability to teach through discovery, so allowing space and time for students to explore is key.”

James Burke, a longtime Hacking STEM developer partner who worked with the interns to test the project, encourages fellow educators to think outside the box to engage and challenge students. “Kids can do a lot more than people give them credit for.” In Burke’s engineering lab at Tyee Middle School, students work on project-based learning modules that can resemble college-level multidisciplinary assignments. With future-ready equipment and real-world projects to tackle, his award-winning classroom engages with students at every level. VR is just another way to spark that passion in students.

Request an invitation to try the project

To get started with Microgravity Lab for your classroom, request an invite to try the VR application. Include your school name and your plan for using the application in the form.

More lesson plans and classroom materials are available at the Hacking STEM website.

Author: Microsoft News Center

How to Create and Manage Hot/Cold Tiered Storage

When I was working in Microsoft’s File Services team around 2010, one of the primary goals of the organization was to commoditize storage and make it more affordable to enterprises. Legacy storage vendors offered expensive products that often consumed a majority of the IT department’s budget, and they were slow to make improvements because customers were locked in. Since then, every release of Windows Server has included storage management features which were previously only provided by storage vendors, such as deduplication, replication, and mirroring. These features could be used to manage commodity storage arrays and disks, reducing costs and eliminating vendor lock-in. Windows Server now offers a much-requested feature: the ability to move files between different tiers of “hot” (fast) storage and “cold” (slow) storage.

Managing hot/cold storage is conceptually similar to a computer memory cache, but at an enterprise scale. Files which are frequently accessed can be optimized to run on the hot storage, such as faster SSDs. Meanwhile, files which are infrequently accessed will be pushed to cold storage, such as older or cheaper disks. These lower-priority files will also take advantage of file compression techniques like data deduplication to maximize storage capacity and minimize cost. Identical or varying disk types can be used because the storage is managed as a pool using Windows Server’s Storage Spaces, so you do not need to worry about managing individual drives. The file placement is controlled by the Resilient File System (ReFS), a file system which is used to optimize and rotate data between the “hot” and “cold” storage tiers in real time based on their usage. However, using tiered storage is only recommended for workloads that are not regularly accessed. If you have permanently running VMs or you are using all the files on a given disk, there would be little benefit in allocating some of the disk to cold storage. This blog post will review the key components required to deploy tiered storage in your datacenter.

Overview of Resilient File System (ReFS) with Storage Tiering

The Resilient File System was first introduced in Windows Server 2012 with support for limited scenarios, but it has been greatly enhanced through the Windows Server 2019 release. It was designed to be efficient, support multiple workloads, avoid corruption and maximize data availability. More specific to tiering, though, ReFS divides the pool of storage into two tiers automatically, one for high-speed performance and one for maximizing storage capacity. The performance tier receives all the writes on the faster disk for better performance. If those new blocks of data are not frequently accessed, the files will gradually be moved to the capacity tier. Reads will usually happen from the capacity tier, but can also happen from the performance tier as needed.

Storage Spaces Direct and Mirror-Accelerated Parity

Storage Spaces Direct (S2D) is one of Microsoft’s enhancements designed to reduce costs by allowing servers with Direct Attached Storage (DAS) drives to support Windows Server Failover Clustering. Previously, highly-available file server clusters required some type of shared storage on a SAN or used an SMB file share, but S2D allows for small local clusters which can mirror the data between nodes. Check out Altaro’s blog on Storage Spaces Direct for in-depth coverage on this technology.

With Windows Server 2016 and 2019, S2D offers mirror-accelerated parity, which is used for tiered storage, but it is generally recommended for backups and less frequently accessed files, rather than heavy production workloads such as VMs. In order to use tiered storage with ReFS, you will use mirror-accelerated parity. This provides decent storage capacity by using both mirroring and a parity drive to help prevent and recover from data loss. In the past, mirroring and parity would conflict and you would usually have to select one or the other. Mirror-accelerated parity works with ReFS by taking writes and mirroring them (hot storage), then using parity to optimize their storage on disk (cold storage). By switching between these storage optimization techniques, ReFS provides admins with the best of both worlds.

Creating Hot and Cold Tiered Storage

When configuring hot and cold storage, you define the ratio between the two tiers. For most workloads, Microsoft recommends allocating 20% to hot and 80% to cold. If you are running high-performance workloads, consider allocating more hot storage to support more writes. On the flip side, if you have a lot of archival files, then allocate more cold storage. Remember that with a storage pool you can combine multiple disk types under the same abstracted storage space. The following PowerShell cmdlets show you how to configure a 1,000 GB disk to use 20% (200 GB) for performance (hot storage) and 80% (800 GB) for capacity (cold storage).
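A minimal sketch of that configuration, assuming a Storage Spaces Direct pool with the default Performance and Capacity tier templates (the volume name and pool wildcard below are placeholders, not values from a real deployment):

    # Create a 1,000 GB mirror-accelerated parity volume:
    # 200 GB lands on the mirrored performance (hot) tier and
    # 800 GB on the parity capacity (cold) tier.
    New-Volume -FriendlyName "TieredVolume01" `
        -StoragePoolFriendlyName "S2D*" `
        -FileSystem CSVFS_ReFS `
        -StorageTierFriendlyNames Performance, Capacity `
        -StorageTierSizes 200GB, 800GB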

Managing Hot and Cold Tiered Storage

If you want to increase the performance of your disk, you can allocate a greater percentage of it to the performance (hot) tier. In the following example, we use PowerShell cmdlets to change the ratio between the tiers to 30:70:
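A sketch of what that resize could look like, reusing the hypothetical volume from the previous example (shrinking an in-use tier is not always supported, so treat this as illustrative):

    # Move the 1,000 GB volume to a 30:70 split: shrink the capacity (cold)
    # tier to 700 GB, then grow the performance (hot) tier to 300 GB.
    Get-VirtualDisk -FriendlyName "TieredVolume01" | Get-StorageTier |
        Where-Object FriendlyName -Like "*Capacity*" |
        Resize-StorageTier -Size 700GB

    Get-VirtualDisk -FriendlyName "TieredVolume01" | Get-StorageTier |
        Where-Object FriendlyName -Like "*Performance*" |
        Resize-StorageTier -Size 300GB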

Unfortunately, this resizing only changes the ratio between the tiers but does not change the size of the partition or volume, so you will likely also want to adjust those with the volume-resizing cmdlets (such as Resize-Partition).

Optimizing Hot and Cold Storage

Based on the types of workloads you are using, you may wish to further optimize when data is moved between hot and cold storage, which is known as the “aggressiveness” of the rotation. By default, the hot storage will wait until 85% of its capacity is full before it begins to send data to the cold storage. If you have a lot of write traffic going to the hot storage, then you want to reduce this value so that performance-tier data gets pushed to the cold storage sooner. If you have fewer write requests and want to keep data in hot storage longer, then you can increase this value. Since this is an advanced configuration option, it must be configured via the registry on every node in the S2D cluster, and it also requires a restart. Here is a sample script to run on each node if you want to change the aggressiveness so that it swaps files when the performance tier reaches 70% capacity:
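A hedged sketch of that per-node change, assuming the DataDestageSsdFillRatioThreshold value under HKLM\SYSTEM\CurrentControlSet\Policies is the rotation threshold being described:

    # Lower the rotation threshold from the default of 85% to 70% on this node.
    # The node must be restarted for the change to take effect.
    Set-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Policies" `
        -Name "DataDestageSsdFillRatioThreshold" `
        -Value 70 -Type DWord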

You can apply this setting cluster-wide by using the following cmdlet:
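One way to express that, again assuming the same registry value and looping over every node with the failover clustering cmdlets:

    # Push the 70% threshold to every node in the S2D cluster.
    Get-ClusterNode | ForEach-Object {
        Invoke-Command -ComputerName $_.Name -ScriptBlock {
            Set-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Policies" `
                -Name "DataDestageSsdFillRatioThreshold" `
                -Value 70 -Type DWord
        }
    }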

NOTE: If this is applied to an active cluster, make sure that you reboot one node at a time to maintain service availability.

Wrap-Up

Now you should be fully equipped with the knowledge to optimize your commodity storage using the latest Windows Server storage management features. You can pool your disks with Storage Spaces, use Storage Spaces Direct (S2D) to eliminate the need for a SAN, and use ReFS to optimize the performance and capacity of these drives. By understanding the tradeoffs between performance and capacity, your organization can significantly save on storage management and hardware costs. Windows Server has made it easy to centralize and optimize your storage so you can reallocate your budget to a new project – or to your wages!

What about you? Have you tried any of the features listed in the article? Have they worked well for you? Have they not worked well? Why or why not? Let us know in the comments section below!


Author: Symon Perriman

Built by Garage Interns, find the best movie, powered by the Microsoft Recommenders collection – Microsoft Garage

Challenged with rethinking how to build a movie recommendation experience, a team of Garage interns based out of Cambridge, MA created a sample app and corresponding documentation that shows how to use recommendation algorithms in an app experience. Today, we’re excited to share their project ahead of its debut at the RecSys’19 conference next week: Recommenders Engine Example Layout, a Microsoft Garage project. This work joins a collection of best practices and tools for recommendation engines available on the larger Recommenders GitHub. Explore both on the Recommenders GitHub repository and the Recommenders Engine Example Layout GitHub repository, respectively.

Bringing recommendation tools to apps

Recommenders Engine Example Layout focuses on recommendation algorithm experiences that take place in apps and provides a detailed breakdown of how developers can leverage the work by the sponsoring team. The Azure AI Customer Advisory Team, or AzureCAT AI, works with such customers as ASOS to incorporate enhanced recommender algorithms into their solutions. The team was inspired to partner with a team of Garage interns upon continued feedback that expanding on their popular Recommenders repository with a focus on apps would be helpful to customers who already have an app infrastructure.

“The key thing we wanted to demonstrate out of this was showing the recommenders we have in a real-world setting that’s relevant to apps,” shares Scott Graham, the Senior Data Scientist on Azure AI CAT who oversaw the project. “Often when we work with customers, they already have complex infrastructure and want to see how they can incorporate these algorithms into an app. This was a great opportunity to illustrate and document this.”

The Garage project documents how to build a sample app powered by the Recommenders algorithms, featuring the MovieLens dataset, one of the largest open source collections of movie ratings. As Bruce Gatete, Program Manager Intern for the project, put it: “It provides an end-to-end demonstration of how developers can build fully functional cross-platform applications that use these algorithms.”

Sample app key features

The sample app that developers can build offers a wide variety of features, including:

  • Browse a large dataset of movies
  • Select and view movie descriptions
  • Create your own personalized favorites list
  • Switch between different recommender algorithms
  • Switch between pre-generated personas or create your own

The Azure AI CAT team also has a continuous goal to accelerate the speed with which they’re able to partner with customers on this solution. Scott continues, “This was a great proof point that we could do this in the space of a summer timeframe: can we use these algorithms to quickly put together an app? And we did!”

In addition to trying this project, developers can explore the original Recommenders repository which has recommendation algorithm tools and best practices such as a popular deep dive into the SAR model or an example of deploying these models in a production setting.

Built using Xamarin.Forms

The Recommenders Engine app is built using Xamarin.Forms and supports iOS, Android, and UWP platforms. “It was really great leveraging Xamarin Forms to be able to deploy this across all these platforms so quickly. I was really impressed with the speed of that coming together,” shared Scott Graham, who oversaw the project from the sponsoring team.

Try it Out

Recommenders Engine Example Layout and the Recommenders collection are available on GitHub worldwide. Try them out, create your own apps and experiences, and share feedback with the team.

Become a Garage Intern. We’re hiring

The Garage is hiring for the 2020 Winter & Summer seasons! Here you can learn more details about the internship and how to apply.

Why become a Garage intern? The Garage opens doors to interesting and challenging projects and collaborative partners. Michelle shares her favorite part about the internship: “I enjoyed getting to know my team very well over the summer – there’s an incredible amount of talent on our team, and it’s awesome getting to work with a team of people my age and truly have a say in the design and final result of our product.”

Check out past Garage internship projects such as: Seeing AI, Web TS, Ink to Code, or Earth Lens.

Author: Microsoft News Center

Google delays Hangouts Chat migration until mid-2020

Google delayed plans to force G Suite customers to begin using the team messaging app Hangouts Chat after many businesses objected to the proposed timeline.

Google had planned to begin switching businesses from an older messaging app to Hangouts Chat in October, but the company now says it will compel users to change no sooner than June 2020.

The debacle underscores how difficult it can be for software vendors to convince users to abandon technology they have grown accustomed to, even if a newer app offers more advanced features than the previous version.

The tech giant launched Hangouts Chat in early 2018 in response to the growing popularity of apps like Slack. For Google’s business customers, the newer app will replace Classic Hangouts, a messaging service that supports basic one-to-one and group chats.  

Google designed Hangouts Chat to keep pace with a trend towards more robust, team-centered messaging apps that integrate with business workflows. The product is Google’s counter to Microsoft Teams, a messaging and meetings app included with Office 365.

Microsoft dominated the traditional business productivity market for decades, but Google has been attempting to steal some of those customers as they move to the cloud.

Google provides Hangouts Chat to businesses subscribed to G Suite, a cloud-based bundle of email, calendar, word-processing and document-storage apps that competes with Office 365. G Suite also includes the video conferencing app Hangouts Meet.

In a blog post, Google said many of its customers had requested more time to prepare users for the switch from Classic Hangouts to Hangouts Chat. Businesses wishing to make the switch today can apply for an invitation to an accelerated transition program.

Google has yet to deliver on some critical features for Hangouts Chat. For example, users can’t chat with external parties or from within Gmail, both of which are possible through Classic Hangouts. Those features, now in beta, are scheduled to launch during the first half of 2020.

Google will continue to support Classic Hangouts for consumers, although the company previously said it would eventually move those users to free versions of Hangouts Chat and Hangouts Meet. Google has not provided a timeline for that change.

Microsoft has faced similar pushback from customers in its campaign to migrate users from Skype for Business to Teams. Microsoft is giving those using Skype for Business Online, a cloud-based version of the app included with Office 365, until July 2021 to adopt Teams.

Google has taken a slightly different approach to enterprise collaboration than Microsoft. It maintains separate apps for calling, messaging and meetings — capabilities Microsoft has chosen to combine in Teams.

Google appears to be grappling with how to respond to a trend away from email.

Research shows that adoption of apps like Slack and Teams results in workers sending fewer emails, said Irwin Lazar, analyst at Nemertes Research. Google, however, has invested heavily in Gmail and continues to roll out new AI technologies to improve the service.

“Gmail is still bread and butter for them, but [email is] just not the ideal way to collaborate,” Lazar said. At some point, he said, Google may need to commit to making Hangouts Chat its primary interface for enterprise collaboration. “That’s a big leap for them to make.”


Chief transformation officer takes digital one step further

There’s a new player on the block when it comes to the team leading digital efforts within a healthcare organization.

Peter Fleischut, M.D., has spent the last two years leading telemedicine, robotics and robotic process automation and artificial intelligence efforts at New York-Presbyterian as its chief transformation officer, a relatively new title that is beginning to take form right alongside the chief digital officer.

Fleischut works as part of the organization’s innovation team under New York-Presbyterian CIO Daniel Barchi. Formerly the chief innovation officer for New York-Presbyterian, Fleischut described his role as improving care delivery and providing a better digital experience.

“I feel like we’re past the age of innovating. Now it’s really about transforming our care model,” he said.

What is a chief transformation officer?

The chief transformation officer is “larger than a technology or digital role alone,” according to Barchi.

Indeed, Laura Craft, analyst at Gartner, said she’s seeing healthcare organizations use the title more frequently to indicate a wider scope than, say, the chief digital officer.

The chief digital officer, a title that emerged more than five years ago, is often described as taking an organization from analog to digital. The digital officer role is still making inroads in healthcare today. Kaiser Permanente recently named Prat Vemana as its first chief digital officer for the Kaiser Foundation Health Plan and Hospitals. In the newly created role, Vemana is tasked with leading Kaiser Permanente’s digital strategy in collaboration with internal health plan and hospital teams, according to a news release.

A chief transformation officer, however, often focuses not just on digital but also emerging tech, such as AI, to reimagine how an organization does business.

“It has a real imperative to change the way [healthcare] is operating and doing business, and healthcare organizations are struggling with that,” Craft said. 

Barchi, who has been CIO at New York-Presbyterian for four years, said the role of chief transformation officer was developed by the nonprofit academic medical center to “take technology to the next level” and scale some of the digital programs it had started. The organization sought to improve not only back office functions but to advance the way it operates digitally when it comes to the patient experience, from hospital check-in to check-out.

I feel like we’re past the age of innovating. Now it’s really about transforming our care model.
~ Peter Fleischut, M.D., chief transformation officer, New York-Presbyterian

Fleischut was selected for the role due to his background as a clinician and his experience as the organization’s former chief innovation officer. He has been in the role for two years and is charged with further developing and scaling New York-Presbyterian’s AI, robotics and telemedicine programs.

The organization, which has four major divisions and comprises 10 hospitals, deeply invested in its telemedicine efforts and built a suite of services about four years ago. In 2016, it completed roughly 1,000 synchronous video visits between providers and patients. Now, the organization expects to complete between 500,000 and 1,000,000 video visits by the end of 2019, Fleischut said during his talk at the recent mHealth & Telehealth World Summit in Boston.

One of the areas where New York-Presbyterian expanded its telemedicine services under Fleischut’s lead was in emergency rooms, offering low-acuity patients the option of seeing a doctor virtually instead of in-person, which shortened patient discharge times from an average baseline of two and a half hours to 31 minutes.

The healthcare organization has also expanded its telemedicine services to kiosks set up in local Walgreens, and has a mobile stroke unit operating out of three ambulances. Stroke victims are treated in the ambulance virtually by an on-call neurologist.  

“At the end of the day with innovation and transformation, it’s all about speed, it’s all about time, and that’s what this is about,” Fleischut said. “How to leverage telemedicine to provide faster, quicker, better care to our patients.”

Transforming care delivery, hospital operations  

Telemedicine is one example of how New York-Presbyterian is transforming the way it interacts with patients. Indeed, that’s one of Fleischut’s main goals — to streamline the patient experience digitally through tools like telemedicine, Barchi said.

“The way you reach patients is using technology to be part of their lives,” Barchi said. “So Pete, in his role, is really important because we wanted someone focused on that patient experience and using things like telemedicine to make the patient journey seamless.” 

But for Fleischut to build a better patient experience, he also has to transform the way the hospital operates digitally, another one of his major goals.

Barchi said that, as an academic medical center, the organization invests significantly in advanced, innovative technology, including robotics. He works with one large budget to fund innovation, information security and electronic medical records.

One hospital operation Fleischut worked to automate using robotics was food delivery. Instead of having hospital employees deliver meals to patients, New York-Presbyterian now uses large robots loaded with food trays that are programmed to deliver patient meals.

Fleischut’s work, Barchi said, will continue to focus on innovative technologies transforming the way New York-Presbyterian operates and delivers care.

“Pete’s skills from being a physician with years of experience, as well as his knowledge of technology, allow him to be truly transformative,” Barchi said.

In his role as chief transformation officer, Fleischut said he considers people and processes the most important part of the transformation journey. Without having the right processes in place for changing care delivery and without provider buy-in, the effort will not be a success, he said.

“Focusing on the people and the process leads to greater adoption of technologies that, frankly, have been beneficial in other industries,” he said.


No one likes waiting on the phone for a GP appointment. So why do we still do it?

The team behind the services are experts in healthcare, as they also run Patient.Info, one of the most popular medical websites in the UK. More than 100 million people logged on to the site in 2018 to read articles about healthcare, check symptoms and learn to live a healthier life, and more than 60% of GPs in England have access to it.

They also produce a newsletter that’s sent to 750,000 subscribers, as well as around 2,000 leaflets on health conditions and 850 on medicines.

People can access Patient.Info 24 hours a day, seven days a week. It’s the same for Patient Access but web traffic spikes every morning when people want to book appointments to see their GP. To handle that demand, Patient Access runs on Microsoft’s Azure cloud platform. As well as being reliable and stable, all patient data is protected by a high level of security – Microsoft employs more than 3,500 dedicated cybersecurity professionals to help protect, detect and respond to threats, while segregated networks and integrated security controls add to the peace of mind.

“About 62% of GP practices use Patient Access,” says Sarah Jarvis MBE, the Clinical Director behind the service. “They’re using it to manage their services, manage appointments, take in repeat medications, consolidate a patient’s personal health record and even conduct video consultations.

“Just imagine your GP being able to conduct video consultations. If you’re aged 20 to 39 you might not want or need to have a relationship with a GP because you don’t need that continuity of care.

“But imagine you are elderly and housebound, and a district nurse visits you. They phone your GP and say: ‘Could you come and visit this patient’, but the GP is snowed under and can’t get there for a couple of hours. The district nurse is also very busy and must visit someone else.

“Now, with Patient Access, a Duty Doctor can look at someone’s medical record and do a video consultation in five minutes. If the patient needs to be referred, the GP can do it there and then from inside the system. The possibilities are endless, and older people, especially, have so much to gain from this.”

Author: Microsoft News Center

Expanding Xbox Voice Commands to Hundreds of Millions of Smart Devices: Xbox Now Connects with Cortana and Alexa-Enabled Devices – Xbox Wire

Here at Team Xbox, we’ve had a long history in offering voice controls as a way to interact with your Xbox console through Kinect and headsets. Today, starting with select Xbox U.S. Insiders, we’re expanding voice support by introducing the Xbox Skill, which enables you to navigate and interact with Xbox One using voice commands through your Cortana and Alexa-enabled devices.

With the Xbox Skill, you can use voice commands to power your Xbox One console, adjust volume, launch games and apps, start and stop broadcasts on Mixer, capture screenshots, and more. It’s the fastest way to get into your games and one of the easiest ways to interact with your console for everyday tasks. For example, if you have the skill enabled on your Echo and you’re a part of the Insider preview, just say “Alexa, start Rocket League,” and this command will automatically turn on your console, sign you in, and launch your game.

The Xbox Skill integrates with your Cortana and Alexa-enabled device such as a Windows 10 PC, Amazon Echo, Harman Kardon Invoke, Sonos One, or Cortana and Alexa apps on iOS and Android, enabling voice commands to control your Xbox One console.

For Xbox Insiders* in the U.S. who want to try the Xbox Skill with Cortana or Alexa, here’s how to get started:

If you use Cortana:

  1. Sign into the Xbox you want to control.
  2. On your Windows 10 PC, click here and sign in with your Microsoft account to link the skill.
  3. Try your first command! “Hey Cortana, tell Xbox to open Netflix.”

If you use Alexa:

  1. Sign into the Xbox you want to control.
  2. Click here, sign in with your Amazon Account, and click Enable.
  3. Sign in with your Microsoft account to link the skill.
  4. Let Alexa discover your console, then follow the instructions to pair your console with Alexa.
  5. Try your first command! “Alexa, start Rocket League.”

Wondering what else the Xbox Skill can do? Just say “Ask Xbox what can I say?” to discover more commands for your console. For a full list of commands, troubleshooting assistance, and to give the team feedback and ideas, you can visit the Xbox Insider Subreddit.

As always, your feedback is important to us and our partners as we continue to evolve this experience and grow our voice integration across devices, digital assistants and voice services.

*Note to Xbox Insiders: We will be rolling the Xbox Skill out to Xbox Insider rings gradually. If the Digital Assistant setting is visible on your console in Settings -> Devices, then you are currently eligible to test the Xbox Skill. If it doesn’t appear, then please be patient as we are working quickly to add more Insider rings to the beta.