Tag Archives: Microsoft’s

Do you know the difference between the Microsoft HCI programs?

While similar in name, Microsoft’s Azure Stack and Azure Stack HCI products are substantially different product offerings designed for different use cases.

Azure Stack brings Azure cloud capabilities into the data center for organizations that want to build and run cloud applications on localized resources. Azure Stack HCI operates on the same Hyper-V-based, software-driven compute, storage and networking technologies but serves a fundamentally different purpose. This new Microsoft HCI offering is a hyper-converged infrastructure product that combines vendor-specific hardware with Windows Server 2019 Datacenter edition and management tools to provide a highly integrated and optimized computing platform for local VM workloads.

Azure Stack gives users a way to employ Azure VMs for Windows and Linux, Azure IoT and Event Hubs, Azure Marketplace, Docker containers, Azure Key Vault, Azure Resource Manager, Azure Web Apps and Functions, and Azure administrative tools locally. This functionality gives an organization the benefits of Azure cloud operation while satisfying regulations that require workloads to run in the data center.

Azure Stack HCI offers optional connections to an array of Azure cloud services, including Azure Site Recovery, Azure Monitor, Azure Backup, Azure Update Management, Azure File Sync and Azure Network Adapter. However, these services themselves remain in the Azure cloud. Also, there is no way to convert this Microsoft HCI product into an Azure Stack deployment.

Windows Server Software-Defined products still exist

Azure Stack HCI evolved from Microsoft’s Windows Server Software-Defined (WSSD) HCI offering. The WSSD program still exists; the main difference on the software side is that hardware in the WSSD program runs the Windows Server 2016 OS rather than Windows Server 2019.

WSSD HCI is similar to Azure Stack HCI with a foundation of vendor-specific hardware, the inclusion of Windows Server technologies — Hyper-V, Storage Spaces Direct and software-defined networking — and Windows Admin Center for systems management. Azure Stack HCI expands on WSSD through improvements to Windows Server 2019 and tighter integration with Azure services.


How to Create and Manage Hot/Cold Tiered Storage

When I was working in Microsoft’s File Services team around 2010, one of the organization’s primary goals was to commoditize storage and make it more affordable for enterprises. Legacy storage vendors offered expensive products that often consumed a majority of the IT department’s budget, and they were slow to make improvements because customers were locked in. Since then, every release of Windows Server has included storage management features that were previously only available from storage vendors, such as deduplication, replication and mirroring. These features can be used to manage commodity storage arrays and disks, reducing costs and eliminating vendor lock-in. Windows Server now offers another much-requested feature: the ability to move files between tiers of “hot” (fast) storage and “cold” (slow) storage.

Managing hot/cold storage is conceptually similar to a computer’s memory cache, but at enterprise scale. Files that are frequently accessed can be kept on the hot storage, such as faster SSDs, while files that are infrequently accessed are pushed to cold storage, such as older or cheaper disks. These lower-priority files also take advantage of file compression techniques such as data deduplication to maximize storage capacity and minimize cost. Identical or varying disk types can be used, because the storage is managed as a pool using Windows Server’s Storage Spaces, so you do not need to worry about managing individual drives. File placement is controlled by the Resilient File System (ReFS), which optimizes and rotates data between the “hot” and “cold” tiers in real time based on usage. Note that tiered storage is only recommended for workloads that are not regularly accessed; if you have permanently running VMs or you actively use all the files on a given disk, there is little benefit in allocating some of the disk to cold storage. This blog post reviews the key components required to deploy tiered storage in your datacenter.

Overview of Resilient File System (ReFS) with Storage Tiering

The Resilient File System was first introduced in Windows Server 2012 with support for limited scenarios, and it has been greatly enhanced through the Windows Server 2019 release. It was designed to be efficient, support multiple workloads, avoid corruption and maximize data availability. More specific to tiering, ReFS automatically divides the pool of storage into two tiers: one for high-speed performance and one for maximizing storage capacity. The performance tier receives all writes on the faster disks for better performance. If those new blocks of data are not frequently accessed, the files are gradually moved to the capacity tier. Reads usually happen from the capacity tier, but they can also happen from the performance tier as needed.

Storage Spaces Direct and Mirror-Accelerated Parity

Storage Spaces Direct (S2D) is one of Microsoft’s enhancements designed to reduce costs by allowing servers with Direct Attached Storage (DAS) drives to support Windows Server Failover Clustering. Previously, highly-available file server clusters required some type of shared storage on a SAN or used an SMB file share, but S2D allows for small local clusters which can mirror the data between nodes. Check out Altaro’s blog on Storage Spaces Direct for in-depth coverage on this technology.

With Windows Server 2016 and 2019, S2D offers mirror-accelerated parity, which is used for tiered storage; it is generally recommended for backups and less frequently accessed files rather than heavy production workloads such as VMs. To use tiered storage with ReFS, you use mirror-accelerated parity. This provides decent storage capacity by using both mirroring and parity to help prevent and recover from data loss. In the past, mirroring and parity would conflict, and you usually had to select one or the other. Mirror-accelerated parity works with ReFS by taking writes and mirroring them (hot storage), then using parity to optimize their storage on disk (cold storage). By switching between these storage optimization techniques, ReFS gives admins the best of both worlds.

Creating Hot and Cold Tiered Storage

When configuring tiered storage, you define the ratio between the hot and cold tiers. For most workloads, Microsoft recommends allocating 20% to hot and 80% to cold. If you are running high-performance workloads, consider allocating more hot storage to absorb more writes. On the flip side, if you have a lot of archival files, allocate more cold storage. Remember that with a storage pool you can combine multiple disk types under the same abstracted storage space. The following PowerShell cmdlets show you how to configure a 1,000 GB volume to use 20% (200 GB) for performance (hot storage) and 80% (800 GB) for capacity (cold storage).
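
A minimal sketch, assuming an S2D storage pool whose name matches S2D* and the built-in Performance and Capacity tier definitions; the volume name is illustrative:

# Create a 1,000 GB mirror-accelerated parity volume: 200 GB mirrored (hot) and 800 GB parity (cold)
New-Volume -StoragePoolFriendlyName "S2D*" -FriendlyName "TieredVolume" -FileSystem CSVFS_ReFS -StorageTierFriendlyNames Performance, Capacity -StorageTierSizes 200GB, 800GB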

Managing Hot and Cold Tiered Storage

If you want to increase the performance of your disk, allocate a greater percentage of it to the performance (hot) tier. In the following example, we use PowerShell cmdlets to create a 30:70 ratio between the tiers:
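
One possible sketch, assuming the tier friendly names that follow from the volume created earlier; run Get-StorageTier first to confirm the actual names and sizes on your system:

# List the current tiers, then adjust them toward a 30:70 split
Get-StorageTier | Select-Object FriendlyName, Size
Get-StorageTier -FriendlyName "TieredVolume-Performance" | Resize-StorageTier -Size 300GB
# Shrinking an in-use tier may not be supported; growing a tier from free pool space is the safer path
Get-StorageTier -FriendlyName "TieredVolume-Capacity" | Resize-StorageTier -Size 700GB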

Unfortunately, this resizing only changes the sizes of the underlying storage tiers; it does not change the size of the partition or volume, so you will likely also want to grow those with the Resize-Partition cmdlet.

Optimizing Hot and Cold Storage

Based on the types of workloads you are running, you may wish to further optimize when data is moved between hot and cold storage, which is known as the “aggressiveness” of the rotation. By default, the hot storage waits until it is 85% full before it begins to send data to the cold storage. If a lot of write traffic is going to the hot storage, reduce this value so that performance-tier data gets pushed to cold storage sooner. If you have fewer write requests and want to keep data in hot storage longer, increase this value. Since this is an advanced configuration option, it must be set in the registry on every node in the S2D cluster, and it also requires a restart. Here is a sample script to run on each node if you want to change the aggressiveness so that data rotates when the performance tier reaches 70% capacity:
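
A sketch of that script, based on the DataDestageSsdFillRatioThreshold registry value documented in Microsoft’s mirror-accelerated parity tuning guidance; verify the value name against current documentation before using it in production:

# Rotate data to the capacity tier once the performance tier is 70% full
Set-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Policies" -Name DataDestageSsdFillRatioThreshold -Type DWord -Value 70

# The new threshold takes effect after the node restarts
Restart-Computer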

You can apply this setting cluster-wide by using the following cmdlet:
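
A sketch, assuming the same registry value as above; Get-ClusterNode and Invoke-Command fan the change out to every node from a single elevated session:

Get-ClusterNode | ForEach-Object {
    Invoke-Command -ComputerName $_.Name -ScriptBlock {
        Set-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Policies" -Name DataDestageSsdFillRatioThreshold -Type DWord -Value 70
    }
}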

NOTE: If this is applied to an active cluster, make sure that you reboot one node at a time to maintain service availability.

Wrap-Up

Now you should be fully equipped with the knowledge to optimize your commodity storage using the latest Windows Server storage management features. You can pool your disks with Storage Spaces, use Storage Spaces Direct (S2D) to eliminate the need for a SAN, and use ReFS to optimize the performance and capacity of those drives. By understanding the tradeoffs between performance and capacity, your organization can save significantly on storage management and hardware costs. Windows Server has made it easy to centralize and optimize your storage so you can reallocate your budget to a new project – or to your wages!

What about you? Have you tried any of the features listed in the article? Have they worked well for you? Have they not worked well? Why or why not? Let us know in the comments section below!



Analyzing data from space – the ultimate intelligent edge scenario – The Official Microsoft Blog

Space represents the next frontier for cloud computing, and Microsoft’s unique approach to partnerships with pioneering companies in the space industry means together we can build platforms and tools that foster significant leaps forward, helping us gain deeper insights from the data gleaned from space.

One of the primary challenges for this industry is the sheer amount of data available from satellites and the infrastructure required to bring this data to ground, analyze the data and then transport it to where it’s needed. With almost 3,000 new satellites forecast to launch by 2026¹ and a threefold increase in the number of small satellite launches per year, the magnitude of this challenge is growing rapidly.

Essentially, this is the ultimate intelligent edge scenario – where massive amounts of data must be processed at the edge – whether that edge is in space or on the ground. Then the data can be directed to where it’s needed for further analytics or combined with other data sources to make connections that simply weren’t possible before.

DIU chooses Microsoft and Ball Aerospace for space analytics

To help with these challenges, the Defense Innovation Unit (DIU) just selected Microsoft and Ball Aerospace to build a solution demonstrating agile cloud processing capabilities in support of the U.S. Air Force’s Commercially Augmented Space Inter Networked Operations (CASINO) project.

With the aim of making satellite data more actionable more quickly, Ball Aerospace and Microsoft teamed up to answer the question: “what would it take to completely transform what a ground station looks like, and downlink that data directly to the cloud?”

The solution involves placing electronically steered flat panel antennas on the roof of a Microsoft datacenter. These phased array antennas don’t require much power and need only a couple of square meters of roof space. This innovation can connect multiple low earth orbit (LEO) satellites with a single antenna aperture, significantly accelerating the delivery rate of data from satellite to end user with data piped directly into Microsoft Azure from the rooftop array.

Analytics for a massive confluence of data

Azure provides the foundational engine for Ball Aerospace algorithms in this project, processing worldwide data streams from up to 20 satellites. With the data now in Azure, customers can direct that data to where it best serves the mission need, whether that’s moving it to Azure Government to meet compliance requirements such as ITAR or combining it with data from other sources, such as weather and radar maps, to gain more meaningful insights.

Steve Smith, Vice President and General Manager of Systems Engineering Solutions at Ball Aerospace, called this type of data processing system, which leverages Ball phased array technology and imagery exploitation algorithms in Azure, “flexible and scalable – designed to support additional satellites and processing capabilities. This type of data processing in the cloud provides actionable, relevant information quickly and more cost-effectively to the end user.”

With Azure, customers gain access to advanced analytics capabilities such as Azure Machine Learning and Azure AI. These services enable end users to build models and make predictions based on a confluence of data coming from multiple sources, including multiple concurrent satellite feeds. Customers can also harness Microsoft’s global fiber network to rapidly deliver the data to where it’s needed using services such as ExpressRoute and ExpressRoute Global Reach. In addition, ExpressRoute now enables customers to ingest satellite data from several new connectivity partners to address the challenges of operating in remote locations.

For tactical units in the field, this technology can be replicated to bring information to where it’s needed, even in disconnected scenarios. As an example, phased array antennas mounted to a mobile unit can pipe data directly into a tactical datacenter or Data Box Edge appliance, delivering unprecedented situational awareness in remote locations.

A similar approach can be used for commercial applications, including geological exploration and environmental monitoring in disconnected or intermittently connected scenarios. Ball Aerospace specializes in weather satellites, and now customers can more quickly get that data down and combine it with locally sourced data in Azure, whether for agricultural, ecological, or disaster response scenarios.

This partnership with Ball Aerospace enables us to bring satellite data to ground and cloud faster than ever, leapfrogging other solutions on the market. Our joint innovation in direct satellite-to-cloud communication and accelerated data processing provides the Department of Defense, including the Air Force, with entirely new capabilities to explore as they continue to advance their mission.

  1. https://www.satellitetoday.com/innovation/2017/10/12/satellite-launches-increase-threefold-next-decade/



Microsoft PowerApps pricing proposal puts users on edge

BOSTON — Microsoft’s proposed licensing changes for PowerApps, the cloud-based development tools for Office 365 and Dynamics 365, have confused users and made them fearful the software will become prohibitively expensive.

Last week, at the SPTechCon user conference, some organizations said the pricing changes, scheduled to take effect Oct. 1, were convoluted. Others said the new pricing — if it remains as previewed by Microsoft earlier this summer — would force them to limit their use of the mobile app development tools.

“We were at the point where we were going to be expanding our usage, instead of using it for small things, using it for larger things,” Katherine Prouty, a developer at the nonprofit Greater Lynn Senior Services, based in Lynn, Mass., said. “This is what our IT folks are always petrified of; [the proposed pricing change] is confirmation of their worst nightmares.”


Planned apps the nonprofit group might have to scrap if the pricing changes take effect include those for managing health and safety risks for its employees and clients in a regulatory-compliant way, and protecting the privacy of employees as they post to social media on behalf of the organization, Prouty said.

Developers weigh in

The latest pricing proposal primarily affects organizations building PowerApps that tap data sources outside of Office 365 and Dynamics 365. People connecting to Salesforce, for example, would pay $10 per user, per month, unless they opt to pay $40 per user, per month for unlimited use of data connectors to third-party apps.

The new pricing would take effect even if customers were only connecting Office 365 to Dynamics 365 or vice versa. That additional cost for using apps they’re already paying for does not sit well with some customers, while others find the pricing scheme perplexing. 

“It’s all very convoluted right now,” said David Drever, senior manager at IT consultancy Protiviti, based in Menlo Park, Calif.

Manufacturing and service companies that create apps using multiple data sources are among the businesses likely to pay a lot more in PowerApps licensing fees, said IT consultant Daniel Christian of PowerApps911, based in Maineville, Ohio.

Annual PowerApps pricing changes

However, pricing isn’t the only problem, Christian said. Microsoft’s yearly overhaul of PowerApps fees also contributes to customer handwringing over costs.

“Select [a pricing model] and stick with it,” he said. “I’m OK with change; we’ll manage it and figure it out. It’s the repetitive changes that bug me.”

Microsoft began restricting PowerApps access to outside data sources earlier this year, putting into effect changes announced last fall. The new policy required users to purchase a special PowerApps plan to connect to popular business applications such as Salesforce Chatter, GotoMeeting and Oracle Database. The coming changes as presented earlier this summer would take that one step further by introducing per-app fees and closing loopholes that were available on a plan that previously cost $7 per user per month.

Matt Wade, VP of client services at H3 Solutions Inc., based in Manassas, Va., said customers should watch Microsoft’s official PowerApps blog for future information that might clarify costs and influence possible tweaks to the final pricing model. H3 Solutions is the maker of AtBot, a platform for developing bots for Microsoft’s cloud-based applications.

“People who are in charge of administering Office 365 and the Power Platform need to be hyper-aware of what’s going on,” Wade said. “Follow the blog, comment, provide feedback — and do it respectfully.”


‘Thank you for helping us make history’: Microsoft’s new London flagship store opens to the public

Microsoft’s new flagship store in London has opened its doors to the public for the first time, with people waiting hours to be among the first to set foot inside.

The first physical retail store for Microsoft in the UK, which is located on Oxford Circus and covers 22,000 square feet over three floors, was officially unveiled to the crowd at 11am.

Chris Capossela, Microsoft’s Chief Marketing Officer, Cindy Rose, UK Chief Executive, and Senior Store Manager John Carter welcomed the public by giving speeches in front of the doors on Regent Street.

Rose said the store was a “symbol of Microsoft’s enduring commitment to the UK”, which allows people to “experience the best the company has to offer”. “Thank you for helping us make history today,” she added.

People had started queuing along Regent Street from 7am to see Microsoft’s Surface devices, HoloLens, Xbox Gaming Lounge and sit in the McLaren Senna on the ground floor.

Store associates welcome customers to the store

One customer, Blair, had started queuing at 7:30am after travelling from Wiltshire by bus. “I’m a Microsoft fan but I especially love Xbox. I heard there would be a few games from [videogame event] E3 here,” he said. “I really want to have a go in the McLaren Senna.”

Denise, from Sutton, was interested in seeing the Surface devices. “I want to see the latest technology and products that Microsoft has in there. I might buy a new laptop today.”

Callum, from London, also wanted to sit in the McLaren Senna. “I play a lot of Forza, so I want to experience the car and the Xbox Gaming Lounge,” he said.

James, from Reading, wanted to see how the store could help businesses. “I’m excited to see what it’s like,” he said. “I want to see what they can offer businesses. The outside of the store looks incredible; it’s a masterpiece of architecture.”

Microsoft handed out free T-shirts and Xbox Game Pass codes to people in the queue, while the first 100 visitors to buy a Surface Pro 6 were also given a free limited edition Liberty Surface Type Cover.

Staff clap as customers enter the store

At 11am, the curtains in the store windows dropped to reveal excited store staff, dressed in red, green, yellow and blue shirts – the colours of Microsoft’s logo – jumping up and down and cheering.

The customers walked into a store with a modern feel, with lots of space and wood and glass surfaces. They were greeted by staff standing in front of a large video wall and Surface devices on tables, with the McLaren on their right and the HoloLens mixed-reality headset to their left. A wooden spiral staircase or lifts took them to the first floor, where they could play the latest Xbox and PC titles in high-quality gaming chairs and professional pods in the Gaming Lounge, purchase third-party laptops and accessories, and get tech support, training, repairs and advice from the Answer Desk, or visit the Community Theatre, where coding workshops were taking place. Visitors could also create a personalised Surface Type Cover in the Surface Design Lab, choosing from a range of designs that can be etched directly onto the cover, or take photos in the Selfie Area.

The enterprise area on the second floor is a place to support, train and grow businesses no matter where they are on their digital transformation journey. From small companies and educational institutions to enterprise customers, the Product Advisors and Cloud Technical Experts will help customers discover, deploy and use Microsoft 365 and other resources to address business needs such as AI, data security, collaboration and workplace efficiency. This floor also contains an area for hosting events, as well as meeting rooms and a Showcase space for demonstrating how customers, including Carlsberg and Toyota, are digitally transforming.

It is also the most accessible store Microsoft has ever opened, with store associates collectively speaking 45 languages, buttons to open doors, lower desks to help those in wheelchairs and Xbox Adaptive Controllers available for gamers with restricted movement.

Tags: , , , , ,


Explore the Cubic congestion control provider for Windows

Administrators may not be familiar with the Cubic congestion control provider, but Microsoft’s move to make this the default setting in the Windows networking stack means IT will need to learn how it works and how to manage it.

When Microsoft released Windows Server version 1709 in its Semi-Annual Channel, the company introduced a number of features, such as support for data deduplication in the Resilient File System and support for virtual network encryption.

Microsoft also made the Cubic algorithm the default congestion control provider for that version of Windows Server. The most recent preview builds of Windows 10 and Windows Server 2019 (Long-Term Servicing Channel) also enable Cubic by default.

Microsoft added Cubic to Windows Server 2016 as well, but it calls that implementation an experimental feature. Given that caveat, administrators should learn how to manage Cubic in case unexpected behavior occurs.

Why Cubic matters in today’s data centers

Congestion control mechanisms improve performance by monitoring packet loss and latency and making adjustments accordingly. Traditional TCP/IP starts with a small congestion window and then gradually increases its size over time; this process stops when the maximum receive window size is reached or packet loss occurs. However, this method hasn’t aged well with the advent of high-bandwidth networks.

For the last several years, Windows has used Compound TCP as its standard congestion control provider. Compound TCP increases the size of the receive window and the volume of data sent.

Cubic, which has been the default congestion provider for Linux since 2006, is a protocol that improves traffic flow by keeping track of congestion events and dynamically adjusting the congestion window.

A Microsoft blog on the networking features in Windows Server 2019 said Cubic performs better over a high-speed, long-distance network because it accelerates to optimal speed more quickly than Compound TCP.

Enable and disable Cubic with netsh commands

Microsoft added Cubic to later builds of Windows Server 2016. You can use the following PowerShell command to see if Cubic is in your build:

Get-NetTCPSetting | Select-Object SettingName, CongestionProvider

Technically, Cubic is a TCP/IP add-on. Because PowerShell does not support Cubic yet, admins must enable it in Windows Server 2016 with the netsh utility, run from an elevated command prompt.

Netsh uses the concepts of contexts and subcontexts to configure many aspects of Windows Server’s networking stack. A context is similar to a mode. For example, the netsh firewall command places netsh in a firewall context, which means that the utility will accept firewall-related commands.

Microsoft added Cubic-related functionality into the netsh interface context. The interface context — abbreviated as INT in some Microsoft documentation — provides commands to manage the TCP/IP protocol.

Prior to Windows Server 2012, admins could make global changes to the TCP/IP stack by referencing the desired setting directly. For example, an administrator who wanted to use the Compound TCP congestion control provider, which had been the default since Windows Vista and Windows Server 2008, could use the following command:

netsh int tcp set global congestionprovider=ctcp

Newer versions of Windows Server use netsh and the interface context, but Microsoft made some syntax changes in Windows Server 2012 that carried over to Windows Server 2016. Rather than setting values directly, Windows Server 2012 and Windows Server 2016 use supplemental templates.

In this example, we enable Cubic in Windows Server 2016:

netsh int tcp set supplemental template=internet congestionprovider=cubic

This command launches netsh, switches to the interface context, selects the internet supplemental template and sets that template’s congestion control provider to Cubic. Similarly, we can switch from Cubic back to the default Compound provider with the following command:

netsh int tcp set supplemental template=internet congestionprovider=compound
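
After switching templates, you can confirm which congestion provider each supplemental template is using (the exact output format varies by Windows build):

netsh int tcp show supplemental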

60 seconds with … Cambridge Research Lab Director Chris Bishop

What is your current job?

I joined Microsoft’s Research Lab in Cambridge UK back when the lab was first opened in 1997, before being named Lab Director two-and-a-half years ago, so I’ve been involved in growing and shaping the lab for more than two decades. Today my role includes leadership of the MSR Cambridge lab, as well as coordination of the broader Microsoft presence in Cambridge. I am fortunate in being supported by a very talented leadership team and a highly capable and motivated team of support staff.

What were your previous jobs?

My background is in theoretical physics. After graduating from Oxford, I did a PhD in quantum field theory at the University of Edinburgh, exploring some of the fundamental mathematics of matter, energy, and space-time. After my PhD I wanted to do something that would have potential for practical impact, so I joined the UK’s national fusion research lab to work on the theory of magnetically confined plasmas as part of a long-term goal to create unlimited clean energy. It was during this time that there were some breakthroughs in the field of neural networks. I was very inspired by the concept of machine intelligence, and the idea that computers could learn for themselves. Initially I started applying neural networks to problems in fusion research, and we became the first lab to use neural networks for real-time feedback control of a high-temperature fusion plasma.

In fact, I found neural networks so fascinating that, after about eight years working on fusion research, I took a rather radical step and switched fields into machine learning. I became a Professor at Aston University in Birmingham, where I set up a very successful research lab. Then I took a sabbatical and came to Cambridge for six months to run a major, international programme called “Neural Networks and Machine Learning” at the Isaac Newton Institute. The programme started on July 1, 1997, on the very same day that Microsoft announced it was opening a research lab in Cambridge, its first outside the US. I was approached by Microsoft to join the new lab, and have never looked back.

What are your aims at Microsoft?

My ambition is for the lab to have an impact on the real world at scale by tackling very hard research problems, and by leveraging the advantages and opportunities we have as part of Microsoft. I often say that I want the MSR Cambridge lab to be a critical asset for the company.

I’m also very passionate about diversity and inclusion, and we have introduced multiple initiatives over the last year to support this. We are seeing a lot of success in bringing more women into technical roles in the lab, across both engineering and research, and that’s very exciting to see.

What’s the hardest part of your job?

A core part of my job is to exercise judgment in situations where there is no clear right answer. For instance, in allocating limited resources I need to look at the risk, the level of investment, the potential for impact, and the timescale. At any one time there will be some things we are investing in that are quite long term but where the impact could be revolutionary, along with other things that have perhaps been researched for several years which are beginning to get real traction, all the way to things that have had real-world impact already. The hardest part of my job is to weigh up all these factors and make some difficult decisions on where to place our bets.

What’s the best part of your job?

The thing I enjoy most is the wonderful combination of technology and people. Those are two aspects I find equally fascinating, yet they offer totally different kinds of challenges. We, as a lab, are constantly thinking about technology, trends and opportunities, but also about the people, teams, leadership, staff development and recruitment, particularly in what has become a very competitive talent environment. The way these things come together is fascinating. There is never a dull day here.

What is a leader?

I think of leadership as facilitating and enabling, rather than directing. One of the things I give a lot of attention to is leadership development. We have a leadership team for the lab and we meet once a week for a couple of hours. I think about the activities of that team, but also about how we function together. It’s the diversity of the opinions of the team members that creates a value that’s greater than the sum of its parts. Leadership is about harnessing the capabilities of every person in the lab and allowing everyone to bring their best game to the table. I therefore see my role primarily as drawing out the best in others and empowering them to be successful.

What are you most proud of?

Last year I was elected a Fellow of the Royal Society, and that was an incredibly proud moment. There’s a famous book I got to sign, and you can flip back and see the signatures of Isaac Newton, Charles Darwin, Albert Einstein, and pretty much every scientist you’ve ever heard of. At the start of the book is the signature of King Charles II who granted the royal charter, so this book contains over three-and-a-half centuries of scientific history. That was a very humbling but thrilling moment.

Another thing I’m very proud of was the opportunity to give the Royal Institution Christmas Lectures. The Royal Institution was set up more than 200 years ago – Michael Faraday was one of the early directors – and around 14 Nobel prizes have been associated with the Institution, so there is a tremendous history there too. These days it’s most famous for the Christmas Lectures, which were started by Faraday. Ever since the 1960s these lectures have been broadcast on national television at Christmas, and I watched them as a child with my mum and dad. They were very inspirational for me and were one of the factors that led me to choose a career in science. About 10 years ago, I had the opportunity to give the lectures, which would have been inconceivable to me as a child. It was an extraordinary moment to walk into that famous iconic theatre, where Faraday lectured many times and where so many important scientific discoveries were first announced.

One Microsoft anecdote that relates to the lectures is that getting selected was quite a competitive process. It eventually came down to a shortlist of five people, and I was very keen to be chosen, especially as it was the first time in the 200-year history of the lectures that they were going to be on the subject of computer science. I was thinking about what I could do to get selected, so I wrote to Bill Gates, explained how important these lectures were and asked him whether, if I was selected, he would agree to join me as a guest in one of the lectures. Fortunately, he said yes, and so I was able to include this in my proposal to the Royal Institution. When I was ultimately selected, I held Bill to this promise, and interviewed him via satellite on live television during one of the lectures.

Chris Bishop is elected a Fellow of the Royal Society

What inspires you?

I love the idea that through our intellectual drive and curiosity we can use technology to make the world a better place for millions of people. For example, the field of healthcare today largely takes a one-size-fits-all approach that reactively waits until patients become sick before responding, and which is increasingly associated with escalating costs that are becoming unsustainable. The power of digital technology offers the opportunity to create a data-driven approach to healthcare that is personalised, predictive and preventative, and which could significantly reduce costs while also improving health and wellbeing. I’ve made Healthcare AI one of the focal points of the Cambridge lab, and I find it inspiring that the combination of machine learning, together with Microsoft’s cloud, could help to bring about a much-needed transformation in healthcare.

What is your favourite Microsoft product?

A few years ago, the machine learning team here in Cambridge built a feature, in collaboration with the Exchange team, called Clutter. It sorts the emails you should pay attention to now from the ones that can be left to, say, a Friday afternoon. I love it because it’s used by tens of millions of people, and it has some very beautiful research ideas at its heart – something called a hierarchical Bayesian machine learning model. This gives it a nice out-of-the-box experience, a sort of average that does OK for everybody, but as you engage with it, it personalises and learns your particular preferences for what constitutes urgent versus non-urgent email. The other reason I’m particularly fond of it is that when I became Lab Director, the volume of email in my inbox quadrupled. That occurred just as we were releasing the Clutter feature, so it arrived just in time to save me from being overwhelmed.

What was the first bit of technology that you were excited about?

When I was a child I was very excited about the Apollo moon landings. I was at an age where I could watch them live on television and knew enough to understand what an incredible achievement they were. Just think of that Saturn launch vehicle that’s 36 storeys high, weighs 3,000 tonnes, is burning 15 tonnes of fuel a second, and yet it’s unstable. So, it must be balanced, rather like balancing a broom on your finger, by pivoting those massive engines backwards and forwards on hydraulic rams in response to signals from gyroscopes at the top of the rocket. It’s that combination of extreme brute force with exquisite precision, along with dozens of other extraordinary yet critical innovations, that made the whole adventure just breath-taking. And the filtering algorithms used by the guidance system are an elegant application of Bayesian inference, so it turns out that machine learning is, literally, rocket science.


M12 announces $4 million global competition for women entrepreneurs – Stories

Microsoft’s venture fund, M12, partners with EQT Ventures and SVB Financial Group to accelerate funding for women leaders

REDMOND, Wash. — July 26, 2018 — M12, Microsoft Corp.’s venture fund, in collaboration with the EQT Ventures fund and SVB Financial Group, on Thursday announced the Female Founders Competition, seeking to accelerate funding for top women-led startups focused on enterprise technology solutions. Two winners will share $4 million in venture funding, as well as access to technology resources, mentoring and more.

Women entrepreneurs receive a disproportionately small amount of venture funding, with only 2.2 percent of the total invested in 2017 going to women-founded startups. Studies have shown that investing in companies founded by women delivers significantly higher returns than the market average. By shining a light on this highly talented, but underfunded group of entrepreneurs, M12 and its partners seek to not only fund innovative female entrepreneurs, but to spotlight the funding gap that exists and the benefits of more equitable distribution of capital.

“We formed M12 to make smart bets on innovative people and their ideas, and the Female Founders Competition is an extension of that mandate,” said Peggy Johnson, executive vice president of Business Development at Microsoft. “This isn’t about checking a box; it’s an opportunity to remind the VC community that investing in women is more than just good values, it’s good business.”

“The EQT Ventures team is all about backing founders with the ambition, drive and vision to build a global success story,” said Alastair Mitchell, partner and investment advisor at EQT Ventures. “This competition reflects this and offers women entrepreneurs a great platform from which to launch their business, providing them with access to capital and mentorship. It also raises awareness of the funding gap between male and female founders, and the EQT Ventures team wants to play an active role in bridging that gap.”

Submissions will be accepted from July 26, 2018, to Sept. 30, 2018, and open across three regions: Europe, Israel, and North America (U.S., Canada and Mexico). Companies will be eligible to apply if they have at least one woman founder, have raised less than $4 million in combined equity funding and/or loans at day of application, and offer or intend to release a product, service or platform that addresses a critical business problem.

“At SVB, we strive to help innovative companies succeed,” said Tracy Isacke, head of Corporate Venture at Silicon Valley Bank. “Research tells us diverse teams are more successful. We believe this is true for our business, our clients’ businesses and the innovation economy at large. Our partnership with Microsoft has created a great opportunity for SVB to engage in this competition and is one of the many ways we are supporting diverse representation in the global innovation ecosystem.”

Up to 10 finalists will pitch in person for the chance to be one of the two startups that earn a $2 million investment as well as access to technology resources, mentoring and additional support. The competition also seeks to drive greater awareness for both finalists and winners, with the potential for future funding from the broader VC community. Full guidelines and contest information can be found on M12’s application page.

About EQT Ventures

EQT Ventures is a European VC fund with commitments of just over €566 million. The fund is based in Luxembourg and has investment advisors stationed in Stockholm, Amsterdam, London, San Francisco and Berlin. Fueled by some of Europe’s most experienced company builders, EQT Ventures helps the next generation of entrepreneurs with capital and hands on support. EQT Ventures is part of EQT, a leading investment firm with approximately EUR 50 billion in raised capital across 27 funds. EQT funds have portfolio companies in Europe, Asia and the US with total sales of more than EUR 19 billion and approximately 110,000 employees.

About SVB Financial Group

For 35 years, SVB Financial Group (NASDAQ: SIVB) and its subsidiaries have helped innovative companies and their investors move bold ideas forward, fast. SVB Financial Group’s businesses, including Silicon Valley Bank, offer commercial and private banking, asset management, private wealth management, brokerage and investment services and funds management services to companies in the technology, life science and healthcare, private equity and venture capital, and premium wine industries. Headquartered in Santa Clara, California, SVB Financial Group operates in centers of innovation around the world. Learn more at svb.com.

About M12

As the corporate venture arm for Microsoft, M12 (formerly Microsoft Ventures) invests in enterprise software companies in the Series A through C funding stage. As part of its value-add to portfolio companies, M12 offers unique access to strategic go-to-market resources and relationships globally. Visit https://m12.vc/ to learn more.

For more information, press only:

Microsoft Media Relations, WE Communications for Microsoft, (425) 638-7777, [email protected]

Lucy Wimmer, PR for EQT Ventures, +44(0) (755) 128-9177, [email protected]

Julia Thompson, PR for Silicon Valley Bank, (415) 764-4707, [email protected]

Note to editors: For more information, news and perspectives from Microsoft, please visit the Microsoft News Center at http://news.microsoft.com. Web links, telephone numbers and titles were correct at time of publication but may have changed. For additional assistance, journalists and analysts may contact Microsoft’s Rapid Response Team or other appropriate contacts listed at http://news.microsoft.com/microsoft-public-relations-contacts.

Microsoft’s Council for Digital Good calls on US policymakers to promote digital civility – Microsoft on the Issues

In an open letter to U.S. law- and policy-makers, Microsoft’s Council for Digital Good is calling on government to address digital-world realities like cyberbullying and “sextortion” by modernizing laws and promoting in-school education to encourage positive online behaviors.

“As young people who have encountered some of these problems firsthand, our goal as the Council for Digital Good is to provide strategies, solutions and resources for other young people in these situations,” council members wrote. “For our sake and for that of future generations, it is imperative that we amplify discussions about making the internet a more productive, civil, and safe place.”

The letter, shared last week at an event featuring the 15-member council at Microsoft’s Innovation and Policy Center in Washington, D.C., touts the benefits of raising awareness of digital risks. The council also recommends that in-school online safety and behavioral education be supported and prioritized, and requests that laws be updated and brought into the digital age. The letter and its recommendations to policymakers are the culmination of the council’s work after 18 months of assignments, activities, learning and fun. In addition to the council members and the parent or chaperone who accompanied each of them, the young people also hosted leaders from other technology companies, non-governmental organizations and D.C.-area influentials.

Youth shine in the nation’s capital
The event, “Is there a place for civility in our digital future? Conversations with Microsoft’s Council for Digital Good,” featured two panel discussions, comprised of teens sharing their work and views, and two sets of three adult panelists, each responding and reacting to the young people’s presentations. The first panel focused on the state of online civility today and included Christina W., Jazmine H., Judah S. and Miosotis R. These four young people, ages 14 to 17, went above and beyond their regular council assignments, taking it upon themselves to speak in their schools and communities on or around international Safer Internet Day this past February. They then brought those learnings to this panel discussion.

From left, Judah S., Miosotis R., Christina W. and Jazmine H. following their panel discussion.

Christina spoke of the rewarding experience it was to see parents interact with one another after hearing her guidance for staying safer online; Jazmine noted the importance of awareness-raising and education among all groups; and Judah highlighted the importance of respecting age requirements on social media. Miosotis talked about her peer-to-peer outreach in both Florida and Puerto Rico. The adult respondents from Google, Born This Way Foundation and Columbia University were impressed by the young people’s drive, determination and knowledge of the issues.

The second panel focused on building and growing a culture of digital civility. Indigo E., Jacob S. and Sierra W. presented the cohort’s written manifesto for life online, first released in January, while Bronte J., Rees D. and William F. unveiled the open letter. Adult respondents from Snap Inc., the Tyler Clementi Foundation and UNICEF posed some provocative and important questions and offered instructive advice for reaching policymakers with their message.

Jacqueline Beauchere summing up after a second panel with Council for Digital Good members and adult respondents.

Erin R., Robert B. and Isabella W. showcased their individual art projects, while Katherine C. and Champe S. shared highlights from their council experiences and assisted me in opening and closing the event, respectively. These 11 council members range in age from 14 to 18.

“The CDG council members are impressive and inspiring,” said retired U.S. Ambassador Maura Harty, president and CEO of the International Center for Missing and Exploited Children, who attended the event. “Their kindness and maturity are matched by their desire for effectiveness. With their manifesto, they have provided a well-considered road map and a path to greater digital civility for all of us. Emphasizing awareness, skills, and yes, ethics and etiquette, they have modeled the behavior we all should emulate.”

Program highlights importance of the youth voice
We assembled this impressive group as part of a pilot program in the U.S., launched in January 2017. The council served as a sounding board for Microsoft’s youth-focused, online safety policy work. Prior to last week’s event, the council met for a two-day summit last August where they each drafted an individual manifesto for life online. They were then tasked with creating an artistic or visual representation of those written works. The written cohort manifesto and a creative cohort manifesto followed, all leading up to the crafting of the open letter and the youth assuming a more visible role as a full group.

As I’ve mentioned before, we thought the in-person portion of the program would conclude after the August summit. But after meeting these youth, we knew it would be a missed opportunity not to bring them together again and in a more public way. We wanted others to appreciate their passion and perspectives and to hear from them in their own words. Indeed, for us at Microsoft, the program underscores the importance of the youth voice and the need for young people to have a say in policy matters – be they governmental or corporate – that affect them. We shared a lot and we’ve learned even more from these youth. I’m planning a more reflective account of the full program soon.

Following the D.C. event, first lady Melania Trump met with the council members, and spent time with each teen personally to learn about their individual creative projects and to hear about the cohort’s 15 online safety tenets.

Afterward, we held a brief capstone event, where we honored each council member for his or her unique contributions to this pilot program. We are excited to learn that many council members want to stay involved in these issues and to remain in contact with us at Microsoft and many of our partner organizations.

As the youth concluded in their open letter: “Now is the time for action, and we need your help in the push for change in online culture. If we gain the ability to always harness the internet in a positive and productive way, we will be able to use our generation’s signature swiftness, effectiveness, and global platform to make a difference.”

Learn more
Read the council’s full open letter here; view all of their individual, creative projects at this link, and learn more about digital civility by visiting www.microsoft.com/digitalcivility. Look for our latest digital civility research releases leading up to Safer Internet Day 2019 in February and, until then, follow the Council for Digital Good on our Facebook page and via Twitter using #CouncilforDigitalGood. To learn more about online safety generally, visit our website and resources page; “like” us on Facebook and follow us on Twitter.


Five lessons on reaching 1 billion people living with disabilities

Microsoft’s mission is to empower every person and organization on the planet to achieve more. Whether or not we succeed depends on our ability to create an inclusive company culture, deliver inclusive products for our customers and show up to the world in an inclusive way.

Recently I spoke at Microsoft’s Ability Summit about five lessons we’ve learned (so far) on our journey to inclusive and accessible marketing. I’m sharing them here in hopes they will inspire your own thinking. To learn more about a couple of employee-driven accessibility projects that came out of Microsoft’s One Week Hackathon, I encourage you to check out The Ability Hacks, which we published today.

1. Recognize the values case and the business case

People typically think about the values case for accessibility, which makes sense — empowering people with disabilities makes the world work better for everyone. But the business case for accessibility is equally important. According to the World Health Organization, more than 1 billion people worldwide experience some form of disability. In the US alone, that’s nearly 1 in 5 people in 1 in 3 households. If our products don’t work for a billion people, we’re not only failing in our mission, we’re also missing an enormous business opportunity.

2. Proximity powers empathy

We’ve learned the incredible value of investing in programs that bring us closer to customers of different backgrounds. We learn so much and do our best work when we commit to seeing the world from their perspectives. For instance, back at our 2015 Hackathon, a team of Microsoft engineers pitched a project with the lofty ambition of making gaming more accessible to gamers with limited mobility, and so began the journey of the Xbox Adaptive Controller. From the earliest moments, the development team reached out to nonprofits like Warfighter Engaged and AbleGamers to partner and learn how the product of their dreams could address the broadest set of needs in the real world. The team increased community engagement at every milestone, from product design and engineering, to prototype testing with gamers living with disabilities, to designing final retail packaging. The empathy we gained forged the path to a product we’re very proud of, that we hope gamers everywhere love when it arrives this September.

3. Accessibility for few becomes usability for many

We see time and again that our accessibility work starts out focused on enabling a specific set of customers but ends up benefiting everyone. For instance, Microsoft events are a major marketing investment each year, so it’s important our events meet the needs of every attendee, including people living with disabilities. A few years ago, we began live-transcribing event keynotes with the goal of helping attendees who are deaf or hard of hearing more easily follow along with keynotes. To our surprise, we ended up getting far more feedback from attendees who speak English as a second language – live transcription helped them navigate highly technical discussions and fast-paced product demos. Now we provide live transcription services in keynotes at all large Microsoft events and open captioning (and in many cases audio description) in company videos. The positive responses we’ve received speak to the broader, unexpected benefits of embracing accessibility.

If you find a Microsoft video missing captions, please contact us via our Disability Answer Desk.

4. All marketing should be inclusive marketing

There’s value in audience-specific marketing programs, but we’ve learned we get the best results when mainstream marketing programs feature people from a range of audiences, backgrounds and life experiences. For instance, in our most recent AI ad we tell three different customer stories – one on preserving ancient architecture, one on sustainable farming and one on audio visualization AI – all woven together seamlessly as cool examples of how AI is improving lives for people today.

Pro tip: Make your presentations more accessible by adding live subtitles with the Presentation Translator add-in for PowerPoint.

5. Real people, real stories

A few years back, we shifted our marketing approach to show technology empowering real people to do real things. As a result, we’ve seen far stronger return on investment than we would have by hiring actors to depict the stories of others. The video below is a powerful example – it features real students from Holly Springs Elementary in Georgia talking about how Microsoft Learning Tools help them overcome obstacles to reading.

Not only is the story more credible coming from real students, it makes the core empowerment message relatable to more people. This shift in philosophy now guides decisions on who represents Microsoft in our ads, on our website and at our events. In each case, real people sharing real stories is the most effective way to bring the impact of technology to life.

These are just five of many lessons we’ve learned, and our work is only beginning. We’re energized to keep learning and sharing our biggest lessons, because there’s tremendous value in embracing inclusion and accessibility – for your people, your bottom line, your customers and the world.