
Databricks bolsters security for data analytics tool

One of the biggest challenges with data management and analytics efforts is security.

Databricks, based in San Francisco, is well aware of the data security challenge, and recently updated its Unified Analytics Platform with enhanced security controls to help organizations minimize their data analytics attack surface and reduce risks. Alongside the security enhancements, new administration and automation capabilities make the platform easier to deploy and use, according to the company.

Organizations are embracing cloud-based analytics for the promise of elastic scalability, supporting more end users, and improving data availability, said Mike Leone, a senior analyst at Enterprise Strategy Group. That said, greater scale, more end users and different cloud environments create myriad challenges, with security being one of them, Leone said.

“Our research shows that security is the top disadvantage or drawback to cloud-based analytics today. This is cited by 40% of organizations,” Leone said. “It’s not only smart of Databricks to focus on security, but it’s warranted.”

He added that Databricks is extending foundational security consistently across environments and making it easier to proactively simplify administration.


“As organizations turn to the cloud to enable more end users to access more data, they’re finding that security is fundamentally different across cloud providers,” Leone said. “That means it’s more important than ever to ensure security consistency, maintain compliance and provide transparency and control across environments.”

Additionally, Leone said that with its new update, Databricks provides intelligent automation to enable faster ramp-up times and improve productivity across the machine learning lifecycle for all involved personas, including IT, developers, data engineers and data scientists.

Gartner said in its February 2020 Magic Quadrant for Data Science and Machine Learning Platforms that Databricks Unified Analytics Platform has had a relatively low barrier to entry for users with coding backgrounds, but cautioned that “adoption is harder for business analysts and emerging citizen data scientists.”

Bringing Active Directory policies to cloud data management

Data access security is handled differently on-premises compared with how it needs to be handled at scale in the cloud, according to David Meyer, senior vice president of product management at Databricks.

Meyer said the new updates to Databricks enable organizations to more efficiently use their on-premises access control systems, like Microsoft Active Directory, with Databricks in the cloud. A member of an Active Directory group becomes a member of the same policy group within the Databricks platform. Databricks then maps the right policies into the cloud provider as a native cloud identity.
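To make the mapping concrete, here is a minimal sketch, assuming the Databricks SCIM 2.0 Groups endpoint (/api/2.0/preview/scim/v2/Groups), of creating a workspace group that mirrors an Active Directory group. The workspace URL, token and group snapshot are placeholders, and in practice this synchronization is usually handled by an identity provider integration rather than hand-written code.

```python
# Sketch only: mirror an Active Directory group as a Databricks workspace group
# via the SCIM 2.0 Groups API. URL, token and member IDs are placeholders.
import requests

WORKSPACE = "https://example.cloud.databricks.com"  # placeholder workspace URL
TOKEN = "replace-with-admin-token"                  # placeholder access token

# Pretend this snapshot came from Active Directory:
# group display name -> Databricks user IDs of its members.
ad_group = {"displayName": "finance-analysts", "member_ids": ["1001", "1002"]}

payload = {
    "schemas": ["urn:ietf:params:scim:schemas:core:2.0:Group"],
    "displayName": ad_group["displayName"],
    "members": [{"value": uid} for uid in ad_group["member_ids"]],
}

resp = requests.post(
    f"{WORKSPACE}/api/2.0/preview/scim/v2/Groups",
    json=payload,
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()
print("Created group with id:", resp.json().get("id"))
```

Once a group like this exists in the workspace, access policies attached to it can be mapped to a native cloud identity, as Meyer describes.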

Databricks uses the open source Apache Spark project as a foundational component and provides more capabilities, said Vinay Wagh, director of product at Databricks.

“The idea is, you, as the user, get into our platform, we know who you are, what you can do and what data you’re allowed to touch,” Wagh said. “Then we combine that with our orchestration around how Spark should scale, based on the code you’ve written, and put that into a simple construct.”

Protecting personally identifiable information

Beyond just securing access to data, many organizations also need to comply with privacy and regulatory policies that protect personally identifiable information (PII).

“In a lot of cases, what we see is customers ingesting terabytes and petabytes of data into the data lake,” Wagh said. “As part of that ingestion, they remove all of the PII data that they can, which is not necessary for analyzing, by either anonymizing or tokenizing data before it lands in the data lake.”
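As a rough illustration of that ingestion step, the PySpark sketch below hashes two likely-PII columns and drops the raw values before the data lands in the lake. The column names, paths and salt are illustrative assumptions, not details from the article.

```python
# Illustrative PySpark sketch: tokenize likely-PII columns on ingestion so only
# hashed tokens land in the data lake. Paths, columns and the salt are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pii-tokenize-on-ingest").getOrCreate()

raw = spark.read.json("/mnt/landing/customer_events/")  # hypothetical landing zone

SALT = "replace-with-secret"  # in practice, pull this from a secret store
tokenized = (
    raw
    .withColumn("email_token", F.sha2(F.concat_ws("|", F.col("email"), F.lit(SALT)), 256))
    .withColumn("ssn_token", F.sha2(F.concat_ws("|", F.col("ssn"), F.lit(SALT)), 256))
    .drop("email", "ssn")  # drop the raw identifiers before writing
)

# Writing to Delta assumes a Databricks or Delta Lake-enabled environment.
tokenized.write.format("delta").mode("append").save("/mnt/lake/customer_events/")
```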

In some cases, though, there is still PII that can get into a data lake. For those cases, Databricks enables administrators to perform queries to selectively identify potential PII data records.
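A simple version of such a scan might look like the sketch below, which flags records whose free-text column still contains email-shaped strings; the table path, column names and regular expression are assumptions for illustration, not Databricks' actual detection logic.

```python
# Illustrative scan for residual PII (email-like strings) in a lake table.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pii-scan").getOrCreate()

events = spark.read.format("delta").load("/mnt/lake/customer_events/")  # placeholder path

EMAIL_RE = r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}"

suspect = events.filter(F.col("free_text_comment").rlike(EMAIL_RE))
print(f"Records containing email-like strings: {suspect.count()}")
suspect.select("event_id", "free_text_comment").show(20, truncate=False)
```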

Improving automation and data management at scale

Another key set of enhancements in the Databricks platform update involves automation and data management.

Meyer explained that historically, each of Databricks’ customers had basically one workspace in which they put all their users. That model doesn’t really let organizations isolate different users or maintain different settings and environments for various groups, however.

To that end, Databricks now enables customers to have multiple workspaces to better manage and provide capabilities to different groups within the same organization. Going a step further, Databricks now also provides automation for the configuration and management of workspaces.

Delta Lake momentum grows

Looking forward, the most active area within Databricks is the company’s Delta Lake and data lake efforts.

Delta Lake is an open source project started by Databricks and now hosted at the Linux Foundation. The core goal of the project is to enable an open standard around data lake connectivity.

“Almost every big data platform now has a connector to Delta Lake, and just like Spark is a standard, we’re seeing Delta Lake become a standard and we’re putting a lot of energy into making that happen,” Meyer said.

Other data analytics platforms ranked similarly by Gartner include Alteryx, SAS, Tibco Software, Dataiku and IBM. Databricks’ security features appear to be a differentiator.


IoT Signals energy report: Embracing transparent, affordable, and sustainable energy

The increased use of renewables, resiliency challenges, and sustainability concerns are all disrupting the energy industry today. New technologies are accelerating the way we source, store, and distribute energy. With IoT, we can gain new insights about the physical world that enable us to optimize and create more efficient processes, reduce energy waste, and track specific consumption. This is a great opportunity for IoT to support power and utilities (P&U) companies across grid assets, electric vehicles, energy optimization, load balancing, and emissions monitoring.

We’ve recently published a new IoT Signals report focused on the P&U industry. The report provides an industry pulse on the state of IoT adoption to help inform us how to better serve our partners and customers, as well as help energy companies develop their own IoT strategies. We surveyed global decision-makers in P&U organizations to deliver an industry-level view of the IoT ecosystem, including adoption rates, related technology trends, challenges, and benefits of IoT.

The study found that while IoT is almost universally adopted in P&U, it comes with complexity. Companies are commonly deploying IoT to improve the efficiency of operations and employee productivity, but can be challenged by skills and knowledge shortages, privacy and security concerns, and timing and deployment issues. To summarize the findings:

Top priorities and use cases for IoT in power and utilities

  1. Optimizing processes through automation is critical for P&U IoT use. Top IoT use cases in P&U include automation-heavy processes such as smart grid automation, energy optimization and load balancing, smart metering, and predictive load forecasting. In support of this, artificial intelligence (AI) is often a component of energy IoT solutions, and they are often budgeted together. Almost all adopters have either already integrated AI into an IoT solution or are considering integration.
  2. Using IoT to improve both data security and employee safety is a top priority. Almost half of decision-makers we talked to use IoT to make their IT practices more secure. Another third are implementing IoT to make their workplaces safer, as well as improve the safety of their employees.
  3. P&U companies also leverage IoT to secure their physical assets. Many P&U companies use IoT to secure various aspects of their operations through equipment management and infrastructure maintenance.
  4. The future is bright: IoT adoption will continue to focus on automation, with growing adoption of use cases related to optimizing energy and creating more efficient maintenance systems.

Decision-makers cite cost savings and reduced human error as the greatest benefits of IoT adoption; 63 percent and 57 percent, respectively, say these are the most valuable ways they use IoT in their businesses.

Today, customers around the world are telling us they are heavily investing in four common use cases for IoT in the energy sector:

Grid asset maintenance

Visualize your grid’s topology, gather data from grid assets, and define rules to trigger alerts. Use these insights to predict maintenance and provide more safety oversight. Prevent failures and avoid critical downtime by monitoring the performance and condition of your equipment.
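As a minimal sketch of what such an alert rule could look like, the example below applies simple threshold checks to grid-asset telemetry; the field names and limits are assumptions rather than part of any specific Azure IoT offering.

```python
# Illustrative threshold rules over grid-asset telemetry; values are made up.
from dataclasses import dataclass

@dataclass
class Reading:
    asset_id: str
    transformer_temp_c: float
    vibration_mm_s: float

# Rule set: alert whenever a reading exceeds its limit.
RULES = {
    "transformer_temp_c": 95.0,  # example temperature limit in degrees Celsius
    "vibration_mm_s": 7.1,       # example vibration limit in mm/s
}

def evaluate(reading: Reading) -> list:
    """Return alert messages for every rule the reading violates."""
    alerts = []
    for field, limit in RULES.items():
        value = getattr(reading, field)
        if value > limit:
            alerts.append(f"{reading.asset_id}: {field}={value} exceeds limit {limit}")
    return alerts

for alert in evaluate(Reading("substation-7/transformer-2", 101.3, 4.2)):
    print(alert)  # would feed a maintenance work order or notification system
```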

Energy optimization and load balancing

Balance energy supply and demand to alleviate pressure on the grid and prevent serious power outages. Avoid costly infrastructure upgrades and gain flexibility by using distributed energy resources to drive energy optimization.

Emissions monitoring and reduction

Monitor emissions in near real-time and make your emissions data more readily available. Work towards sustainability targets and clean energy adoption by enabling greenhouse gas and carbon accounting and reporting.

E-mobility

Remotely maintain and service electric vehicle (EV) charging points that support various charging speeds and vehicle types. Make it easier to own and operate electric vehicles by incentivizing ownership and creating new visibility into energy usage.

Learn more about IoT for energy

Read about the real-world customers doing incredible things with IoT for energy, where you can learn how market leaders like Schneider Electric are making remote asset management easier using predictive analytics.

“Traditionally, machine learning is something that has only run in the cloud … Now, we have the flexibility to run it in the cloud or at the edge—wherever we need it to be.” Matt Boujonnier, Analytics Application Architect, Schneider Electric.

Read the blog where we announced Microsoft will be carbon negative by 2030 and discussed our partner Vattenfall delivering a new, highly transparent 24/7 energy matching solution, a first-of-its-kind approach that gives customers the ability to choose the green energy they want and ensure their consumption matches that goal using Azure IoT.

We are committed to helping P&U customers bring their vision to life with IoT, and this starts with simplifying and securing IoT. Our customers are embracing IoT as a core strategy to drive better outcomes for energy providers, energy users, and the planet. We are heavily investing in this space, committing $5 billion in IoT and intelligent edge innovation by 2022, and growing our IoT and intelligent edge partner ecosystem.
 
When IoT is foundational to a transformation strategy, it can have a significantly positive impact on the bottom line, customer experiences, and products. We are invested in helping our partners, customers, and the broader industry to take the necessary steps to address barriers to success. Read the full IoT Signals energy report and learn how we’re helping power and utilities companies embrace the future and unlock new opportunities with IoT.


Microsoft for Healthcare: Empowering our customers and partners to provide better experiences, insights and care

At Microsoft, our goal within healthcare is to empower people and organizations to address the complex challenges facing the healthcare industry today. We help do this by co-innovating and collaborating with our customers and partners as a trusted technology provider. Today, we’re excited to share progress on the latest innovations from Microsoft aimed at helping address the most prevalent and persistent health and business challenges:

  • Empower care teams with Microsoft 365: Available in the coming weeks, the new Bookings app in Microsoft Teams will empower care teams to schedule, manage and conduct virtual visits with remote patients via video conference. Also coming soon, clinicians will be able to target Teams messages to recipients based on the shift they are working. Finally, healthcare customers can support their security and compliance requirements with the HIPAA/HITECH assessment in Microsoft Compliance Score.
  • Protect health information with Azure Sphere: Microsoft’s integrated security solution for IoT (Internet of Things) devices and equipment is now widely available for the development and deployment of secure, connected devices. Azure Sphere helps securely personalize patient experiences with connected devices and solutions. And, to make it easier for healthcare leaders to develop their own IoT strategies, today we’re launching a new IoT Signals report focused on the healthcare industry that provides an industry pulse on the state of IoT adoption and helpful insights for IoT strategies. Learn more about Microsoft’s IoT offerings for healthcare here.
  • Enable personalized virtual care with Microsoft Healthcare Bot: Today, we’re pleased to announce that Microsoft Healthcare Bot, our HITRUST-certified platform for creating virtual health assistants, is enriching its healthcare intelligence with new built-in templates for healthcare-specific use cases, and expanding its integrated medical content options. With the addition of Infermedica, a cutting-edge triage engine based on advanced artificial intelligence (AI) that enables symptom checking in 17 languages, Healthcare Bot is empowering providers to offer global access to care.
  • Reimagine healthcare using new data platform innovations: With the 2019 release of Azure API for FHIR, Microsoft became the first cloud provider with a fully managed, enterprise-grade service for health data in the Fast Healthcare Interoperability Resources (FHIR) format. We’re excited to expand those offerings with several new innovations around connecting, converting and transforming data. The first is Power BI FHIR Connector, which makes it simple and easy to bring FHIR data into Power BI for analytics and insights. The second, IoMT (Internet of Medical Things) FHIR Connector, is now available as open source software (OSS) and allows for seamless ingestion, normalization and transformation of Protected Health Information data from health devices into FHIR. Another new open source project, FHIR Converter, provides an easy way to convert healthcare data from legacy formats (i.e., HL7v2) into FHIR. And lastly, FHIR Tools for Anonymization is now offered via OSS and enables anonymization and pseudonymization of data in the FHIR format, including capabilities for redaction and date shifting in accordance with the HIPAA privacy rule. (A brief sketch of querying FHIR data follows this list.)
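To show what working with data in the FHIR format looks like, here is a minimal sketch of a standard FHIR REST search for Patient resources; the service URL and access token are placeholders, and the request shape follows the FHIR specification rather than any code from the announcements above.

```python
# Minimal FHIR REST search sketch. The endpoint and token are placeholders.
import requests

FHIR_BASE = "https://example-fhir-service.azurehealthcareapis.com"  # placeholder
TOKEN = "replace-with-azure-ad-access-token"                        # placeholder

resp = requests.get(
    f"{FHIR_BASE}/Patient",
    params={"birthdate": "ge2000-01-01", "_count": 10},
    headers={"Authorization": f"Bearer {TOKEN}", "Accept": "application/fhir+json"},
)
resp.raise_for_status()

bundle = resp.json()  # FHIR searches return a Bundle resource
for entry in bundle.get("entry", []):
    patient = entry["resource"]
    print(patient["id"], patient.get("birthDate"))
```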

Frictionless exchange of health information in FHIR makes it easier for researchers and clinicians to collaborate, innovate and improve patient care. As we move forward working with our customers and partners and others across the health ecosystem, Microsoft is committed to enabling and improving interoperability and required standards to make it easier for patients to manage their healthcare and control their information. At the same time, trust, privacy and compliance are a top priority – making sure Protected Health Information (PHI) remains under control and custodianship of healthcare providers and their patients.

We’ve seen a growing number of healthcare organizations not only deploy new technologies, but also begin to develop their own digital capabilities and solutions that use data and AI to transform and innovate healthcare and life sciences in profoundly positive ways. Over the past year, together with our customers and partners, we’ve announced new strategic partnerships aimed at empowering this transformation.

For example, to enable caregivers to focus more on patients by dramatically reducing the burden of documenting doctor-patient visits, Nuance has released Nuance Dragon Ambient eXperience (DAX). This ambient clinical intelligence (ACI) technology is enriched by AI and cloud capabilities from Microsoft, including the ambient intelligence technology EmpowerMD, which is coming to market as part of Nuance’s DAX solution. The solution aims to transform the exam room by deploying ACI to capture, with patient consent, interactions between clinicians and patients so that clinical documentation writes itself.

Among health systems, Providence St. Joseph Health is using Microsoft’s cloud, AI, productivity and collaboration technologies to deploy next-generation healthcare solutions while empowering their employees. NHS Calderdale is enabling patients and their providers to hold appointments virtually via Microsoft Teams for routine and follow-up visits, which helps lower costs while maintaining the quality of care. The U.S. Veterans Affairs Department is embracing mixed reality by working with technology providers Medivis, Microsoft and Verizon to roll out its first 5G-enabled hospital. And specifically for health consumers, Walgreens Boots Alliance will harness the power of our cloud, AI and productivity technologies to empower care teams and deliver new retail solutions to make healthcare delivery more personal, affordable and accessible.

Major payor, pharmaceutical and health technology platform companies are also transforming healthcare in collaboration with us. Humana will develop predictive solutions for personalized and secure patient support, and by using Azure, Azure AI and Microsoft 365, they’ll also equip home healthcare workers with real-time access to information and voice technology to better understand key factors that influence patient health. In pharmaceuticals, Novartis will bring Microsoft AI capabilities together with its deep expertise in life sciences to address specific challenges that make the process of discovering, developing and delivering new medicines so costly and time-consuming.

We’re pleased to showcase how together with our customers and partners, we’re working to bring healthcare solutions to life and positively impact the health ecosystem.

To keep up to date with the latest announcements visit the Microsoft Health News Room.

About the authors:
As Corporate Vice President of Health Technology and Alliances, Dr. Greg Moore leads the dedicated research and development collaborations with our strategic partners, to deliver next-generation technologies and experiences for healthcare.

Vice President and Chief Medical Officer Dr. David Rhew recently joined Microsoft’s Worldwide Commercial Business Healthcare leadership team and provides executive-level support, engaging in business opportunities with our customers and partners.

As Corporate Vice President of Healthcare, Peter Lee leads the Microsoft organization that works on technologies for better and more efficient healthcare, with a special focus on AI and cloud computing.



Public cloud vendors launch faulty services as race heats up

The public cloud services arena has turned a corner, introducing new challenges for customers, according to the latest edition of “Technology Radar,” a biannual report by global software consultancy ThoughtWorks. Competition has heated up, so top public cloud vendors are creating new cloud services at a fast clip. But in their rush to market, those vendors can roll out flawed services, which opens the door for resellers to help clients evaluate cloud options.

Public cloud has become a widely deployed technology, overcoming much of the resistance it had seen in the past. “Fears about items like security and sovereignty have been calmed,” noted Scott Shaw, director of technology for Asia Pacific region at ThoughtWorks. “Regulators have become more comfortable with the technology, so cloud interest has been turning into adoption.”

The cloud market shifts

With the sales of public cloud services rising, competition has intensified. Initially, Amazon Web Services dominated the market, but recently Microsoft Azure and Google Cloud Platform have been gaining traction among enterprise customers.


One ripple effect is that the major public cloud providers have been trying to rapidly roll out differentiating new services. However, in their haste to keep pace, they can deliver services with rough edges and incomplete feature sets, according to ThoughtWorks.

Customers can get caught in this quicksand. “Corporations adopting public cloud have not had as much success as they had hoped for,” Shaw said.

Businesses try to deploy public cloud services based on the promised functionality but frequently hit roadblocks during implementations. “The emphasis on speed and product proliferation, through either acquisition or hastily created services, often results not merely in bugs but also in poor documentation, difficult automation and incomplete integration with vendors’ own parts,” the report noted.

[Chart: top public cloud vendors' global market share in 2019]

Testing is required

ThoughtWorks recommended that organizations not assume all public cloud vendors’ services are of equal quality. They need to test out key capabilities and be open to alternatives, such as open source options and multi-cloud strategies.

Resellers can act as advisors to help customers make the right decisions as they consider new public cloud services, pointing out the strengths and flaws in individual cloud options, Shaw said.

To serve as advisors, however, resellers need in-depth, hands-on experience with the cloud services. “Channel partners cannot simply rely on a feature checklist,” Shaw explained. “To be successful, they need to have worked with the service and understand how it operates in practice and not just in theory.”


Major storage vendors map out 2020 plans

The largest enterprise storage vendors face a common set of challenges and opportunities heading into 2020. As global IT spending slows and storage gets faster and frequently handles data outside the core data center, primary storage vendors must turn to cloud, data management and newer flash technologies.

Each of the major storage vendors has its own plans for dealing with these developments. Here is a look at what the major primary storage vendors did in 2019 and what you can expect from them in 2020.

Dell EMC: Removing shadows from the clouds

2019 in review: Enterprise storage market leader Dell EMC spent most of 2019 bolstering its cloud capabilities, in many cases trying to play catch-up. New cloud products include VMware-orchestrated Dell EMC Cloud Platform arrays that integrate Unity and PowerMax storage, coupled with VxBlock converged and VxRail hyper-converged infrastructure.

The new Dell EMC Cloud gear allows customers to build and deploy on-premises private clouds with the agility and scale of the public cloud — a growing need as organizations dive deeper into AI and DevOps.

What’s on tap for 2020: Dell EMC officials have hinted at a new Power-branded midrange storage system for several years, and a formal unveiling of that product is expected in 2020. Then again, Dell initially said the next-generation system would arrive in 2019. Customers with existing Dell EMC midrange storage likely won’t be forced to upgrade, at least not for a while. The new storage platform will likely converge features from Dell EMC Unity and SC Series midrange arrays with an emphasis on containers and microservices.

Dell will enhance its tool set for containers to help companies deploy microservices, said Sudhir Srinivasan, the CTO of Dell EMC storage. He said containers are a prominent design feature in the new midrange storage.

“Software stacks that were built decades ago are giant monolithic pieces of code, and they’re not going to survive that next decade, which we call the data decade,” Srinivasan said. 

Hewlett Packard Enterprise’s eventful year

2019 in review: In terms of product launches and partnerships, Hewlett Packard Enterprise (HPE) had a busy year in 2019. HPE Primera all-flash storage arrived in late 2019,  and HPE expects customers will slowly transition from its flagship 3PAR platform. Primera supports NVMe flash, embedding custom chips in the chassis to support massively parallel data transport on PCI Express lanes. The first Primera customer, BlueShore Financial, received its new array in October.

HPE bought supercomputing giant Cray to expand its presence in high-performance computing, and made several moves to broaden its hyper-converged infrastructure options. HPE ported InfoSight analytics to HPE SimpliVity HCI, as part of the move to bring the cloud-based predictive tools picked up from Nimble Storage across all HPE hardware. HPE launched a Nimble dHCI disaggregated HCI product and partnered with Nutanix to add Nutanix HCI technology to HPE GreenLake services while allowing Nutanix to sell its software stack on HPE servers.

It capped off the year with HPE Container Platform, a bare-metal system to make it easier to spin up Kubernetes-orchestrated containers on bare metal. The Container Platform uses technology from recent HPE acquisitions MapR and BlueData.

What’s on tap for 2020: HPE vice president of storage Sandeep Singh said more analytics are coming in response to customer calls for simpler storage. “An AI-driven experience to predict and prevent issues is a big game-changer for optimizing their infrastructure. Customers are placing a much higher priority on it in the buying motion,” helping to influence HPE’s roadmap, Singh said.

It will be worth tracking the progress of GreenLake as HPE moves towards its goal of making all of its technology available as a service by 2022.

Hitachi Vantara: Renewed focus on traditional enterprise storage

2019 in review: Hitachi Vantara renewed its focus on traditional data center storage, a segment it had largely conceded to other array vendors in recent years. Hitachi gave the Hitachi Virtual Storage Platform (VSP) flash array a major refresh in 2019. The VSP 5000 SAN arrays scale to 69 PB of raw storage, and capacity extends higher with hardware-based deduplication in its Flash Storage Modules. By virtualizing third-party storage behind a VSP 5000, customers can scale capacity to 278 PB.

What’s on tap for 2020: The VSP5000 integrates Hitachi Accelerated Fabric networking technology that enables storage to scale out and scale up. Hitachi this year plans to phase in the networking to other high-performance storage products, said Colin Gallagher, a Hitachi vice president of infrastructure products.

“We had been lagging in innovation, but with the VSP5000, we got our mojo back,” Gallagher said.

Hitachi arrays support containers, and Gallagher said the vendor is considering whether it needs to evolve its support beyond a Kubernetes plugin, as other vendors have done. Hitachi plans to expand data management features in Hitachi Pentaho analytics software to address AI and DevOps deployments. Gallagher said Hitachi’s data protection and storage as a service is another area of focus for the vendor in 2020.

IBM: Hybrid cloud with cyber-resilient storage

2019 in review: IBM brought out the IBM Elastic Storage Server 3000, an NVMe-based array packaged with IBM Spectrum Scale parallel file storage. Elastic Storage Server 3000 combines NVMe flash and containerized software modules to provide faster time to deployment for AI, said Eric Herzog, IBM’s vice president of world storage channels.

In addition, IBM added PCIe-enabled NVMe flash to Versastack converged infrastructure and midrange Storwize SAN arrays.

What to expect in 2020: Like other storage vendors, IBM is trying to navigate the unpredictable waters of cloud and services. Its product development revolves around storage that can run in any cloud. IBM Cloud Services enables end users to lease infrastructure, platforms and storage hardware as a service. The program has been around for two years, and will add IBM software-defined storage to the mix this year. Customers thus can opt to purchase hardware capacity or the IBM Spectrum suite in an OpEx model. Non-IBM customers can run Spectrum storage software on qualified third-party storage.

“We are going to start by making Spectrum Protect data protection available, and we expect to add other pieces of the Spectrum software family throughout 2020 and into 2021,” Herzog said.

Another IBM development to watch in 2020 is how its $34 billion acquisition of Red Hat affects either vendor’s storage products and services.

NetApp: Looking for a rebound

2019 in review: Although spending slowed for most storage vendors in 2019, NetApp saw the biggest decline. At the start of 2019, NetApp forecast annual sales at $6 billion, but poor sales forced NetApp to slash its guidance by around 10% by the end of the year.

NetApp CEO George Kurian blamed the revenue setbacks partly on poor sales execution, a failing he hopes will improve as NetApp institutes better training and sales incentives. The vendor also said goodbye to several top executives who retired, raising questions about how it will deliver on its roadmap going forward.

What to expect in 2020: In the face of the turbulence, Kurian kept NetApp focused on the cloud. NetApp plowed ahead with its Data Fabric strategy to enable OnTap file services to be consumed, via containers, in the three big public clouds.  NetApp Cloud Data Service, available first on NetApp HCI, allows customers to consume OnTap storage locally or in the cloud, and the vendor capped off the year with NetApp Keystone, a pay-as-you-go purchasing option similar to the offerings of other storage vendors.

Although NetApp plans hardware investments, storage software will account for more revenue as companies shift data to the cloud, said Octavian Tanase, senior vice president of the NetApp OnTap software and systems group.

“More data is being created outside the traditional data center, and Kubernetes has changed the way those applications are orchestrated. Customers want to be able to rapidly build a data pipeline, with data governance and mobility, and we want to try and monetize that,” Tanase said.

Pure Storage: Flash for backup, running natively in the cloud

2019 in review: The all-flash array specialist broadened its lineup with FlashArray//C SAN arrays and denser FlashBlade NAS models. FlashArray//C extends the Pure Storage flagship with a model that supports Intel Optane DC SSD-based MemoryFlash modules and quad-level cell NAND SSDs in the same system.

Pure also took a major step on its journey to convert FlashArray into a unified storage system by acquiring Swedish file storage software company Compuverde. It marked the second acquisition in as many years for Pure, which acquired deduplication software startup StorReduce in 2018.

What to expect in 2020: The gap between disk and flash prices has narrowed enough that it’s time for customers to consider flash for backup and secondary workloads, said Matt Kixmoeller, Pure Storage vice president of strategy.

“One of the biggest challenges — and biggest opportunities — is evangelizing to customers that, ‘Hey, it’s time to look at flash for tier two applications,'” Kixmoeller said.

Flexible cloud storage options and more storage in software are other items on Pure’s roadmap. Cloud Block Store, which Pure introduced last year, is just getting started, Kixmoeller said, and is expected to generate lots of attention from customers. Most vendors support Amazon Elastic Block Storage by sticking their arrays in a colocation center and running their operating software on EBS, but Pure took a different approach. Pure reengineered the backend software layer to run natively on Amazon S3.


Developing a quantum-ready global workforce

At Microsoft Quantum, our ambition is to help solve some of the world’s most complex challenges through the world’s most scalable quantum system. Recently, we introduced Azure Quantum to unite a diverse and growing quantum community and accelerate the impact of this technology. Whether it’s algorithmic innovation that improves healthcare outcomes or breakthroughs in cryogenic engineering that enable more sustainable systems design, these recent advancements across the stack are bringing the promise of quantum to our world, right now.

In December 2018, the United States Congress passed the National Quantum Initiative Act – an important milestone for investing the resources needed to continue advancing the field. As recognized by the Act, education on quantum information science and engineering needs to be an area of explicit focus, as the shortage of quantum computing talent worldwide poses a significant challenge to accelerating innovation and fully realizing the impact quantum can have on our world.

Leaders across both public and private sectors need to continue working together to develop a global workforce of quantum engineers, researchers, computer and materials scientists, and other industry experts who will be able to carry quantum computing into the future. Microsoft has been collaborating with academic institutions and industrial entities around the world to grow this quantum generation and prepare the workforce for this next technological revolution.

Empowering the quantum generation through education

Earlier this year, Microsoft partnered with the University of Washington to teach an introductory course on quantum computing and programming. The course, led by Dr. Krysta Svore, General Manager of Microsoft Quantum Systems, focused on the practical implementation of quantum algorithms.

Students were first introduced to quantum programming with Q# through a series of coding exercises followed by programming assignments. For their final project, student teams developed quantum solutions for specified problems – everything from entanglement games and key distribution protocols to quantum chemistry and a Bitcoin mining algorithm. Several students from this undergraduate course joined the Microsoft Quantum team for a summer internship, further developing their new skillsets and delivering quantum impact to organizations around the world.

On the heels of this hands-on teaching engagement, Microsoft has established curriculum partnerships with more than 10 institutions around the world to continue closing the skills gap in quantum development and quantum algorithm design. This curriculum is circling the globe, from the University of California, Los Angeles (UCLA) to the Indian Institute of Technology (IIT) in Roorkee and Hyderabad, India.

Partner universities leverage Q#, Microsoft’s quantum programming language and associated Quantum Development Kit, to teach the principles of quantum computing to the next generation of computer engineers and scientists.

“The course material extended to us by Microsoft is concise and challenging. It covers the necessary mathematical foundations of Quantum Computing. Simulation on Q# is quite straightforward and easy to interpret. Collaboration with Microsoft has indeed captivated students of IIT Roorkee to get deeper insights into Quantum Technology.”

Professor Ajay Wasan of IIT Roorkee, Department of Physics

Q# integrates with familiar tools like Visual Studio and Python, making it a very approachable entry point for undergraduate and graduate students alike.

 “I integrated Microsoft’s Q# into my UCLA graduate course called Quantum Programming.  My students found many aspects of Q# easy to learn and used the language to program and run four quantum algorithms. Thus, the curriculum partnership with Microsoft [has] helped me teach quantum computing to computer science students successfully.”

– Professor Jens Palsberg of UCLA, Computer Science Department

Microsoft has also partnered with Brilliant to bring quantum computing to students and professionals around the world via a self-serve e-learning environment.

[Animation: Microsoft's Brilliant quantum computing curriculum]

This interactive Quantum Computing course introduces students to quantum principles and uses Q# to help people learn to build quantum algorithms, simulating a quantum environment in their browsers. In the last six months, more than 40,000 people have interacted with the course and started building their own quantum solutions.

Accelerating quantum innovation through cross-industry collaboration

Recently, Microsoft joined the Quantum Economic Development Consortium (QED-C), which aims to enable and grow the United States quantum industry.

QED-C was established with support from the National Institute of Standards and Technology (NIST) as part of the federal strategy for advancing quantum information science. Through the QED-C, Microsoft partners with a diverse set of business and academic leaders to identify and address gaps in technology, standards, and workforce readiness facing the quantum industry.

We look forward to continuing our academic and cross-industry collaborations in developing a quantum workforce to tackle real-world scenarios and bring this revolutionary technology to fruition.

Request to be an early adopter of Azure Quantum and incorporate Q# and the QDK in your quantum curriculum.

Are you currently a student interested in joining Microsoft Quantum as an intern? Apply to our open research intern positions today!



Windows 10 issues top list of most read stories for IT pros

Windows 10 — and the challenges posed to IT professionals by its updates — dominated the enterprise desktop discussion in 2019. Troubleshooting and understanding the eccentricities of 2019’s Windows 10 issues comprised many of our top 10 most popular stories this year.

With the sunset of Windows 7 scheduled for the first month of 2020, interest in other Microsoft OSes, including Windows 10, may intensify in the coming year.

Below is a countdown of the top ten most-read SearchEnterpriseDesktop stories, based on page views.

  10. Micro apps, AI to power new version of Citrix Workspace

Citrix announced a new version of Citrix Workspace, which enables IT admins to provide employees with virtual access to an organization’s desktop and applications, at May’s Citrix Synergy event in Atlanta. The company cited micro apps, or small, task-based applications, as a key feature, saying they would handle complicated tasks more efficiently by bringing them into a unified work feed. The addition of micro apps was made possible through a $200 million acquisition of Sapho in 2018.

  9. Lenovo to launch ThinkBook brand, next-gen ThinkPad X1

Lenovo started a new subbrand — called ThinkBook — this past spring, with two laptops aimed at younger employees in the workforce. The 13- and 14-inch laptops were intended to incorporate a sleek design with robust security, reliability and support services. The company also launched a laptop for advanced business users, ThinkPad X1 Extreme Gen 2, and the ultrasmall desktop ThinkCentre M90n-1 Nano in the same time frame.

  8. Learn about the device-as-a-service model and its use cases

The device-as-a-service model, in which a vendor leases devices to a business, may help IT admins fulfill their responsibility to support, maintain and repair equipment. The model has its pros and cons. It can provide a single point of contact for troubleshooting and enable more frequent hardware refreshes, but it can also limit an organization’s device choices and pose complications for a company’s BYOD plan.

  7. Lenovo powers new ThinkPad series with AMD Ryzen Pro processors

Lenovo released three Windows 10 laptops with AMD processors this past spring, the first time it has used non-Intel chips in its higher-end ThinkPad T and X series devices. The company hoped its T495, T495s and X395 computers would provide better performance and security at a lower cost; the company said the AMD-powered T and X series laptops saw an 18% increase over the previous generation.

  6. Windows 10 security breach highlights third-party vulnerabilities

Microsoft detected a security vulnerability in Windows 10, introduced through Huawei PCManager driver software. Microsoft Defender Advanced Threat Protection, a feature that finds and blocks potential compromises, found the problem before the vulnerability could cause serious damage, but industry professionals said the incident highlighted the risks posed by third-party kernels such as device drivers and the importance of working with trusted companies.

  5. Samsung Notebook 9 Pro 2-in-1 impresses with specs and looks

Samsung released a redesign of its flagship Windows 10 laptop this year, opting for an aluminum chassis in place of the plastic from previous iterations. The device offered comparable specs to other high-end laptop offerings, with a slate of features including a backlit keyboard, a variety of inputs and the Samsung Active Pen.

  4. With the new Windows 10 OS update, trust but verify

Dave Sobel, senior director and managed services provider at SolarWinds in Austin, Texas, expounded on the then-forthcoming May 2019 Windows 10 update a month before its scheduled release. Sobel acknowledged the security importance of patching systems but stressed that IT professionals remain vigilant for complications — notable, as the Windows 10 update came in the wake of an October 2018 patch that deleted files of users who work with Known Folder redirection.

  3. Citrix CEO David Henshall addresses Citrix news, sale rumors

In a Q&A, Citrix CEO David Henshall talked about the future of the 30-year-old company, downplaying rumors that it would be sold. Henshall spoke of the venerable firm’s history of connecting people and information on demand and saw the coming years as a time when Citrix would continue to simplify and ease that connection to encourage productivity.

  2. Latest Windows 10 update issues cause more freezing problems

The April 9 Windows 10 update caused device freezing upon launch. Those in IT had already noted freezing in devices using Sophos Endpoint Protection; after a few days, they learned that the patch was clashing with antivirus software, causing freezing both during startup and over the course of regular operation of the computer. Microsoft updated its support page to acknowledge the issue and provided workarounds shortly thereafter.

  1. IT takes the good with the bad in Windows 10 1903 update

After experiencing problems with previous Windows 10 updates, June’s 1903 version came with initial positive — but wary — reception. Microsoft’s Windows-as-a-service model drew complaints for the way it implemented updates. Among its changes, 1903 enabled IT professionals to pause feature and monthly updates for up to 35 days. Also new was Windows Sandbox, providing IT with the ability to test application installations without compromising a machine. The new version of Windows 10 did not launch bug-free, however; issues with Wi-Fi connectivity, Bluetooth device connection and USB devices causing the rearrangement of drive letters were reported.


Microsoft for Startups and NVIDIA Inception join forces to accelerate AI startups

Startups, especially in the AI space, have a multitude of unique, daily challenges, from selecting the right technology systems to improving their algorithms to building a robust sales pipeline.

That’s why today at Slush we announced that we are teaming with NVIDIA to give cutting-edge startups developing AI technologies fewer things to worry about by providing them preferred access to the Microsoft for Startups and NVIDIA Inception programs. Now, eligible startups active in our respective programs can receive preferred access and reciprocal benefits, including free or discounted technology, go-to-market support and access to technical experts.


Eligible NVIDIA Inception AI startups can access Microsoft for Startups’ premium offer, providing:

· Free access to Microsoft technologies, including up to $120k of free Azure cloud.

· Dedicated go-to-market resources to help startups sell alongside our global sales teams and partner channel.


Eligible Microsoft for Startups AI members can access NVIDIA Inception benefits including:

· Free credits for NVIDIA Deep Learning Institute online courses, such as the Fundamentals of Deep Learning for Computer Vision, Accelerating Data Science, and Image Classification.

· Access to go-to-market NVIDIA Inception Connect events and marketing support.

· Unlimited access to DevTalk—a forum built for technical inquiries and community engagement.

· Guidance on which GPU applications and hardware are best suited for your needs.

· Discounts on NVIDIA DGX systems, NVIDIA GPU accelerators, NVIDIA Quadro pro graphics, and NVIDIA TITAN GPUs for deep learning.

This partnership will allow us to accelerate AI startups with NVIDIA’s deep technical expertise and market-leading GPU technology on Microsoft Azure, combined with both companies’ ability to connect startups with customers.

Launched in February 2018, Microsoft for Startups is a comprehensive global program designed to support startups as they build and scale their companies. Since we launched, companies active with Microsoft for Startups are on track to drive $1B in pipeline opportunity by the end of 2020. To find out more about the Microsoft for Startups and to apply for the program, click here.

NVIDIA Inception is a virtual accelerator program that supports startups harnessing GPUs for AI and data science applications during critical stages of product development, prototyping and deployment. Since its launch in 2016, the program has expanded to over 5,000 companies. To find out more about NVIDIA Inception, click here.


Forus Health uses AI to help eradicate preventable blindness

Big problems, shared solutions

Tackling global challenges has been the focus of many health data consortiums that Microsoft is enabling. The Microsoft Intelligent Network for Eyecare (MINE) – the initiative that Chandrasekhar read about – is now part of the Microsoft AI Network for Healthcare, which also includes consortiums focused on cardiology and pathology.

For all three, Microsoft’s aim is to play a supporting role to help doctors and researchers find ways to improve health care using AI and machine learning.

“The health care providers are the experts,” said Prashant Gupta, Program Director in Azure Global Engineering. “We are the enabler. We are empowering these health care consortiums to build new things that will help with the last mile.”

In the Forus Health project, that “last mile” started by ensuring image quality. When members of the consortium began doing research on what was needed in the eyecare space, Forus Health was already taking the 3nethra classic to villages to scan hundreds of villagers in a day. But because the images were being captured by minimally trained technicians in areas open to sunlight, close to 20% of the images were not high quality enough to be used for diagnostic purposes.

“If you have bad images, the whole process is crude and wasteful,” Gupta said. “So we realized that before we start to understand disease markers, we have to solve the image quality problem.”

Now, an image quality algorithm immediately alerts the technician when an image needs to be retaken.
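As a purely hypothetical illustration of how such a check can work (this is not Forus Health's or Microsoft's actual algorithm), a common first-pass quality gate is a variance-of-Laplacian sharpness test that prompts a retake when the capture is too blurry:

```python
# Hypothetical image-quality gate: flag blurry captures using OpenCV's
# variance-of-Laplacian sharpness measure. Threshold and path are placeholders.
import cv2

BLUR_THRESHOLD = 100.0  # assumed cutoff; would be tuned on real fundus images

def needs_retake(image_path: str) -> bool:
    """Return True when the image looks too blurry to be diagnostically useful."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        return True  # unreadable file also triggers a retake
    sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()
    return sharpness < BLUR_THRESHOLD

if needs_retake("fundus_capture_001.png"):
    print("Image quality too low - please retake the scan.")
```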

The same thought process applies to the cardiology and pathology consortiums. The goal is to see what problems exist, then find ways to use technology to help solve them.

“Once you have that larger shared goal, when you have partners coming together, it’s not just about your own efficiency and goals; it’s more about social impact,” Gupta said.

And the highest level of social impact comes through collaboration, both within the consortiums themselves and when working with organizations such as Forus Health who take that technology out into the world.

Chandrasekhar said he is eager to see what comes next.

“Even though it’s early, the impact in the next five to 10 years can be phenomenal,” he said. “I appreciated that we were seen as an equal partner by Microsoft, not just a small company. It gave us a lot of satisfaction that we are respected for what we are doing.”

Top image: Forus Health’s 3nethra classic is an eye-scanning device that can be attached to the back of a moped and transported to remote locations. Photo by Microsoft. 

Leah Culler edits Microsoft’s AI for Business and Technology blog.


Beyond overhead: What drives donor support in the digital era

One of the greatest challenges to running a successful nonprofit organization has always been that donors look at nonprofits’ stewardship of funds as a primary way to assess impact. While there is no doubt that nonprofits must use donor funds responsibly, tracking whether a nonprofit maintains the highest possible ratio of spending on programs to spending on overhead is a poor proxy for understanding how effective a nonprofit truly is. In fact, the imperative to limit overhead has forced many organizations to underinvest in efforts to improve efficiency. Ironically, this has long prevented nonprofits from utilizing innovative digital technologies that could help them be more efficient and effective.

Now more than ever, cloud-based technology can have a transformative effect on how nonprofit organizations increase impact and reduce costs. The same technologies that give for-profit businesses insights about customers and markets, create operational efficiencies and speed up innovation can also help nonprofits target donors and raise funds more strategically, design and deliver programming more efficiently, and connect field teams with headquarters more effectively. This means smart investments in digital tools are essential to every nonprofit’s ability to make progress toward its mission.

The good news is that a major shift is underway. As part of our work at Microsoft Tech for Social Impact to understand how nonprofits can use technology to drive progress and demonstrate impact, we recently surveyed 2,200 donors, volunteers and funding decision-makers to learn how they decide which organizations to support, what their expectations are for efficiency and effectiveness, and how they feel about funding technology infrastructure at the nonprofits they support.

The results, which we published recently in the white paper “Beyond overhead: Donor expectations for driving impact with technology,” make clear that people donate to organizations they trust and that donors are increasingly looking at data beyond the ratio of program spending to overhead spending to measure impact. We also found that those who support nonprofits now overwhelmingly recognize the critical role technology plays in driving impact and delivering value. Nearly four out of five supporters (a group that includes both donors and volunteers) and more than nine out of 10 funding decision-makers told us they support directing donations to improve technology at a nonprofit. An overwhelming majority — 85 percent of supporters and 95 percent of funding decision-makers — are more likely to contribute to organizations that can show they are using technology to improve how they run programs.

At the same time, the survey found that most people expect organizations to use donations more efficiently and to advance the causes they work for more effectively than in the past. Among supporters, for example, 79 percent believe nonprofits should be better at maximizing funding than they were 10 years ago. Just over 80 percent of funding decision-makers believe nonprofits should be more effective at achieving their goals and advancing the causes they work for now than in the past.

To give you a better sense of what potential donors are looking for as they consider where to target their nonprofit contributions, and how heavily technology factors into their thinking, we have developed a tool using Power BI so you can look at the data in greater detail. Within the tool, you can see how people responded to questions about overall effectiveness and efficiency, the importance of technology as a driver of success, how likely they are to support organizations that use technology to demonstrate impact, and their willingness to fund technology improvements at the nonprofits they support.

To make the tool as useful as possible for your organization, you can sort the data by supporters and funding decision-makers, and you can explore how responses varied by region. As you move through the data, you will see how these critical groups of supporters and funders think about these important questions in the region where your organization operates.

The ultimate goal of this survey was to get a clearer picture of what motivates people to contribute to an organization and how technology can help nonprofits meet supporters’ expectations. Overall, I believe our research provides some important insights that can help any organization be more successful. Fundamentally, we found that people donate to organizations that are perceived to be trustworthy, and that trust is achieved though operational transparency and effective communications. More than ever before, donors recognize that using data to measure and demonstrate impact is the foundation for trust.

I encourage you to read the full report and learn more about Microsoft’s commitment to support nonprofits.
