Tag Archives: applications

Google joins bare-metal cloud fray

Google has introduced bare-metal cloud deployment options geared for legacy applications such as SAP, for which customers require high levels of performance along with deeper virtualization controls.

“[Bare metal] is clearly an area of focus of Google,” and one underscored by its recent acquisition of CloudSimple for running VMware workloads on Google Cloud, said Deepak Mohan, an analyst at IDC.


IBM, AWS and Azure have their own bare-metal cloud offerings, which allow them to support an ESXi hypervisor installation for VMware, and Bare Metal Solution will apparently underpin CloudSimple’s VMware service on Google, Mohan added.

But Google will also be able to support other workloads that can benefit from bare metal availability, such as machine learning, real-time analytics, gaming and graphical rendering. Bare-metal cloud instances also avert the “noisy neighbor” problem that can crop up in virtualized environments as clustered VMs seek out computing resources, and do away with the general hit to performance known commonly as the “hypervisor tax.”

Google’s bare-metal cloud instances offer a dedicated interconnect to customers and tie into all native Google Cloud services, according to a blog post. The hardware has been certified to run “multiple enterprise applications,” including ones built on top of Oracle’s database, Google said.

Oracle, which lags far behind in the IaaS market, has sought to preserve some of those workloads as customers move to the cloud.


Earlier this year, it formed a cloud interoperability partnership with Microsoft, pushing a use case wherein customers could run enterprise application logic and presentation tiers on Azure infrastructure, while tying back to an Oracle database running on bare-metal servers or specialized Exadata hardware in Oracle’s cloud.

Not all competitive details laid bare

Overall, bare-metal cloud is a niche market, but by some estimates it is growing quickly.

Among hyperscalers such as AWS, Google and Microsoft, the battle is still in its early days, with AWS only making its bare-metal offerings generally available in May 2018. Microsoft has mostly positioned bare metal for memory-intensive workloads such as SAP HANA, while also offering it underneath CloudSimple’s VMware service for Azure.

Meanwhile, Google’s bare-metal cloud service is fully managed by Google, provides a set of provisioning tools for customers, and will have unified billing with other Google Cloud services, according to the blog.

How smoothly this all works together could be a key differentiator for Google in comparison with rival bare-metal providers. Management of bare-metal machines can be more granular than traditional IaaS, which can mean increased flexibility as well as complexity.

Google’s Bare Metal Solution instances are based on x86 systems that range from 16 cores with 384 GB of DRAM, to 112 cores with 3,072 GB of DRAM. Storage comes in 1 TB chunks, with customers able to choose between all-flash or a mix of storage types. Google also plans to offer custom compute configurations for customers that need them.

It also remains to be seen how price-competitive Google is on bare metal compared with competitors such as Packet, CenturyLink and Rackspace.

The company didn’t immediately provide costs for Bare Metal Solution instances, but said the hardware can be purchased via monthly subscription, with the best deals for customers that sign 36-month terms. Google won’t charge for data movement between Bare Metal Solution instances and general-purpose Google Cloud infrastructure if it occurs in the same cloud region.

Go to Original Article
Author:

Extending the power of Azure AI to business users

Today, Alysa Taylor, Corporate Vice President of Business Applications and Industry, announced several new AI-driven insights applications for Microsoft Dynamics 365.

Powered by Azure AI, these tightly integrated AI capabilities will empower every employee in an organization to make AI real for their business today. Millions of developers and data scientists around the world are already using Azure AI to build innovative applications and machine learning models for their organizations. Now business users will also be able to directly harness the power of Azure AI in their line of business applications.

What is Azure AI?

Azure AI is a set of AI services built on Microsoft’s breakthrough innovation from decades of world-class research in vision, speech, language processing, and custom machine learning. What I find particularly exciting is that Azure AI provides our customers with access to the same proven AI capabilities that power Xbox, HoloLens, Bing, and Office 365.

Azure AI helps organizations:

  • Develop machine learning models that can help with scenarios such as demand forecasting, recommendations, or fraud detection using Azure Machine Learning.
  • Incorporate vision, speech, and language understanding capabilities into AI applications and bots, with Azure Cognitive Services and Azure Bot Service.
  • Build knowledge-mining solutions to make better use of untapped information in their content and documents using Azure Search.

Bringing the power of AI to Dynamics 365 and the Power Platform

The release of the new Dynamics 365 insights apps, powered by Azure AI, will enable Dynamics 365 users to apply AI in their line of business workflows. Specifically, they benefit from the following built-in Azure AI services:

  • Azure Machine Learning, which powers personalized customer recommendations in Dynamics 365 Customer Insights, analyzes product telemetry in Dynamics 365 Product Insights, and predicts potential failures in business-critical equipment in Dynamics 365 Supply Chain Management.
  • Azure Cognitive Services and Azure Bot Service, which enable natural interactions with customers across multiple touchpoints with Dynamics 365 Virtual Agent for Customer Service.
  • Azure Search, which allows users to quickly find critical information in records such as accounts, contacts, and even in documents and attachments such as invoices and faxes in all Dynamics 365 insights apps.

Furthermore, since Dynamics 365 insights apps are built on top of Azure AI, business users can now work with their development teams using Azure AI to add custom AI capabilities to their Dynamics 365 apps.

The Power Platform, composed of three services – Power BI, PowerApps, and Microsoft Flow – also benefits from Azure AI innovations. While each of these services is best-of-breed individually, their combination as the Power Platform is a game-changer for our customers.

Azure AI enables Power Platform users to uncover insights, develop AI applications, and automate workflows through low-code, point-and-click experiences. Azure Cognitive Services and Azure Machine Learning empower Power Platform users to:

  • Extract key phrases in documents, detect sentiment in content such as customer reviews, and build custom machine learning models in Power BI.
  • Build custom AI applications that can predict customer churn, automatically route customer requests, and simplify inventory management through advanced image processing with PowerApps.
  • Automate tedious tasks such as invoice processing with Microsoft Flow.

The tight integration between Azure AI, Dynamics 365, and the Power Platform will enable business users to collaborate effortlessly with data scientists and developers on a common AI platform that not only has industry leading AI capabilities but is also built on a strong foundation of trust. Microsoft is the only company that is truly democratizing AI for businesses today.

And we’re just getting started. You can expect even deeper integration and more great apps and experiences that are built on Azure AI as we continue this journey.

We’re excited to bring those to market and eager to tell you all about them!

Author: Microsoft News Center

Sigfox network provides cheap, efficient connectivity for IoT

Forget 5G. The key to implementing IoT applications may lie in “zero G.”

Cellular networks like 5G (or 2G, 3G and 4G before it) can quickly send large amounts of data for streaming applications. But those networks are overly powerful for IoT devices, which produce and transmit small bits of data and don’t need to operate in real time. Scale IoT devices to the millions, and the ideal network for carrying lots of IoT-generated data may be the Sigfox “0G” network.

The Sigfox 0G network enables companies to connect IoT devices at a fraction of the cost and power consumption needed by broadband networks, according to Ajay Rane, vice president of business development for Sigfox, which is based in Labège, France.

A network like this has many uses for Industry 4.0 applications, including supply chain and logistics, industrial IoT (IIoT), smart cities and smart buildings, Rane said.

“[The Sigfox 0G network] can’t do high-speed data, but it works well for the market that we’re targeting,” he said. “We’re not about to replace cellular or any other technology; we have a spot at the bottom of the pyramid of IoT technologies and there are a lot of devices at the bottom of that pyramid which require low power, low cost connectivity.”

The Sigfox network is a Low Power Wide Area Network (LPWAN) that connects devices over large distances without consuming a lot of power. IoT sensors send data through the Sigfox network to a gateway called a Sigfox base station, which posts the messages to the Sigfox cloud at least every 10 minutes. The Sigfox cloud then pushes the messages to client applications.
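That hub-and-spoke flow ends with the Sigfox cloud pushing each message to a client application, typically as an HTTP callback carrying the device ID, a timestamp and the raw payload as hex. As a rough sketch, here is how a client might decode such a callback in Python; the JSON field names mirror Sigfox’s callback variables, but the payload layout (battery millivolts plus temperature) is a hypothetical example, not a Sigfox standard.

```python
import json
import struct

def handle_sigfox_callback(body: str) -> dict:
    """Parse a Sigfox-style callback and unpack its hex payload.

    The JSON field names (device, time, data) mirror Sigfox's callback
    variables; the payload layout itself (uint16 battery millivolts plus
    int16 temperature in tenths of a degree) is a hypothetical example.
    """
    msg = json.loads(body)
    raw = bytes.fromhex(msg["data"])          # payload arrives hex-encoded
    battery_mv, temp_tenths = struct.unpack(">Hh", raw[:4])
    return {
        "device": msg["device"],
        "timestamp": int(msg["time"]),
        "battery_v": battery_mv / 1000,
        "temp_c": temp_tenths / 10,
    }

# A callback as the Sigfox cloud might POST it to a client application
example = json.dumps({"device": "1FE2D4", "time": "1573000000", "data": "0e74ff9c"})
print(handle_sigfox_callback(example))
```

Because payloads are so small, the entire decoding step stays trivial; the real work is deciding what to pack into those few bytes on the device side.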

Keeping the costs low

Asset tracking applications have taken off as LPWAN network availability has increased in the last 18 months, and Sigfox has been able to deliver immediate ROI because it’s simpler to implement and its device costs are comparatively low, according to Adarsh Krishnan, principal analyst at ABI Research, which is based in Oyster Bay, New York. Krishnan covers IoT connectivity and LPWAN technologies.

“When you bring the cost of individual asset trackers down to that level, an enterprise looking to deploy these in large volumes can justify such an investment, and Sigfox has devices that can last many years and track across multiple regions,” he said.  “The initial capital investment is much lower because the cost of connectivity itself is very low, and then the cost of devices becomes low because you’re sending very small amounts of data infrequently, which lowers the device cost because you’re optimizing battery use on the devices.”

Battery costs are some of the biggest expenses in asset tracking applications, Krishnan explained, and it can break a business justification to continually replace batteries or pay maintenance costs on thousands of devices.

As IIoT applications become more feasible, other LPWAN options such as LoRa, as well as other connectivity methods such as cellular networks, are also emerging, according to Krishnan. However, the Sigfox network’s tight limits (a maximum payload of just 12 bytes and no more than 140 messages per day) keep cost and power consumption low, which makes it attractive for IIoT applications.

“Their idea is less is more and they’re addressing very specific use cases within what we call massive IoT use cases, where the data requirements are very small with small packets of data being transmitted from the devices less frequently,” he said. “It’s not real-time tracking — the data transmission may be every half hour — so battery or power efficiency becomes the biggest requirement in some of these use cases.”
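Working within a 12-byte payload usually means fixed-point packing rather than verbose formats like JSON. A minimal sketch, assuming a hypothetical tracker frame (fixed-point GPS coordinates, battery level and temperature); the field layout is invented for illustration:

```python
import struct

# Sigfox uplink payloads are capped at 12 bytes
SIGFOX_MAX_PAYLOAD = 12

def pack_reading(lat: float, lon: float, battery_mv: int, temp_c: float) -> bytes:
    """Pack one tracker reading into a 12-byte frame.

    Hypothetical layout: int32 lat/lon in 1e-5 degree fixed point (8 bytes),
    uint16 battery millivolts, int16 temperature in hundredths of a degree.
    """
    frame = struct.pack(
        ">iiHh",
        int(lat * 1e5),
        int(lon * 1e5),
        battery_mv,
        int(temp_c * 100),
    )
    assert len(frame) <= SIGFOX_MAX_PAYLOAD
    return frame

frame = pack_reading(45.764, 4.8357, 3600, 21.5)
print(len(frame), frame.hex())
```

At 140 messages per day, a device sending frames like this transmits under 2 KB daily, which is why battery life stretches to years.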

Sigfox enables supply chain track-and-trace

Safecube, a startup based in Lyon, France, that provides supply chain track-and-trace applications, was able to scale its business globally after connecting to the Sigfox network.

Safecube’s service enables shippers to have direct, near real-time access to data about a shipment’s location and condition through IoT sensors that transmit data on the Sigfox network. The network’s growth in coverage in the last two years is the main reason why Safecube uses it, according to Waël Cheaib, Safecube CEO.

“Now the network covers between 60 and 70 countries in the world, so they can say that they have a global footprint,” Cheaib said. “They’ve been also developing a technical feature that allows trackers to work worldwide. Until recently, it was not possible for any low power network to have something working in Europe, the U.S., South America and Africa.”

The Sigfox network is suited to Safecube’s application because the data needs to be precise about things such as shipment location, but it’s not a lot of data, Cheaib explained. The network also has to work globally.

“In order to send this information, you don’t need a 5G network. You need a network that is designed to communicate small loads of data, so there are very limited connectivity costs,” he said. “The other resources that are available are long range networks — 2G, 3G, 4G, 5G, which are very costly. But Sigfox is the only low-power network that’s able to provide a global solution.”


Building cloud-native applications with Azure and HashiCorp

With each passing year, more and more developers are building cloud-native applications. As developers build more complex applications, they are looking to innovators like Microsoft Azure and HashiCorp to reduce the complexity of building and operating these applications. HashiCorp and Azure have worked together on a myriad of innovations. Examples of this innovation include tools that connect cloud-native applications to legacy infrastructure and tools that secure and automate the continuous deployment of customer applications and infrastructure. Azure is deeply committed to being the best platform for open source software developers like HashiCorp to deliver their tools to their customers in an easy-to-use, integrated way. Azure innovation like the managed applications platform that powers HashiCorp’s Consul Service on Azure is a great example of this commitment to collaboration and a vibrant open source startup ecosystem. We’re also committed to the development of open standards that help these ecosystems move forward and we’re thrilled to have been able to collaborate with HashiCorp on both the CNAB (Cloud Native Application Bundle) and SMI (Service Mesh Interface) specifications.

Last year at HashiConf 2018, I had the opportunity to share how we had started to integrate Terraform and Packer into the Azure platform. I’m incredibly excited to get the opportunity to return this year to share how these integrations are progressing and to share a new collaboration on cloud native networking. With this new work we now have collaborations that help customers connect and operate their applications on Azure using HashiCorp technology.

Connect — HashiCorp Consul Service on Azure

After containers and Kubernetes, one of the most important innovations in microservices has been the development of the concept of a service mesh. Earlier this year we partnered with HashiCorp and others to announce the release of Service Mesh Interface, a collaborative, implementation-agnostic API for the configuration and deployment of service mesh technology. We collaborated with HashiCorp to produce an implementation of the traffic access control (TAC) rules using Consul Connect. Today we’re excited that Azure customers can take advantage of HashiCorp Consul Service on Azure, powered by the Azure Managed Applications platform. HashiCorp Consul provides a solution to simplify and secure service networking, and with this new managed offering, our joint customers can focus on the value of Consul while remaining confident that the experts at HashiCorp are taking care of the management of the service, reducing complexity for customers and enabling them to focus on cloud-native innovation.

Provision — HashiCorp Terraform on Azure

HashiCorp Terraform is a great tool for doing declarative deployment to Azure. We’re seeing great momentum with adoption of HashiCorp Terraform on Azure as the number of customers has doubled since the beginning of the year – customers are using Terraform to automate Azure infrastructure deployment and operation in a variety of scenarios. 

The momentum is fantastic on the contribution front as well, with nearly 180 unique contributors to the Terraform provider for Azure Resource Manager. The involvement from the community with our increased three-week cadence of releases (currently at version 1.32) ensures more coverage of Azure services by Terraform. Additionally, after customer and community feedback regarding the need for additional Terraform modules for Azure, we’ve been working hard at adding high-quality modules and have now doubled the number of Azure modules in the Terraform Registry, bringing it to over 120 modules.

We believe all these additional integrations enable customers to manage infrastructure as code more easily and simplify managing their cloud environments. Learn more about Terraform on Azure.

Microsoft and HashiCorp are working together to provide integrated support for Terraform on Azure. Customers using Terraform on Microsoft’s Azure cloud are mutual customers, and both companies are united to provide troubleshooting and support services. This joint entitlement process provides collaborative support across companies and platforms while delivering a seamless customer experience. Customers using the Terraform provider for Azure can file support tickets with Microsoft support, and customers with Terraform on Azure support can file tickets with either Microsoft or HashiCorp.

Deploy — Collaborating on Cloud Native Application Bundles specification

One of the critical problems solved by containers is the hermetic packaging of a binary into a package that is easy to share and deploy around the world. But a cloud-native application is more than a binary, and this is what led to the co-development, with HashiCorp and others, of the Cloud Native Application Bundle (CNAB) specification. CNABs allow you to package images alongside configuration tools like Terraform and other artifacts so that a user can seamlessly deploy an application from a single package. I’ve been excited to see the community work together to build the specification to a 1.0 release that shows CNAB is ready for all of the world’s deployment needs. Congratulations to the team on the work and the fantastic partnership.

If you want to learn more about the ways in which Azure and HashiCorp collaborate to make cloud-native development easier, please check out the links below:

Author: Microsoft News Center

The use of technology in education has pros and cons

The use of technology in education continues to grow, as students turn to AI-powered applications, virtual reality and internet searches to enhance their learning.

Technology vendors, including Google, Lenovo and Microsoft, have increasingly developed technology to help pupils in classrooms and at home. That technology has proved popular with students in elementary education and higher education, and has been shown to benefit independent learning efforts, even as critics have expressed worry that it can lead to decreased social interactions.

Lenovo, in a recent survey of 15,000 technology users across 10 countries, reported that 75% of U.S. parents who responded said their children are more likely to look something up online than ask for help with schoolwork. In China, that number was 85%, and in India, it was 89%.

Taking away stress

According to vendors, technology can augment the schoolwork help busy parents give their children.

“Parenting in general is becoming a challenge for a lot of the modern families as both parents are working and some parents may feel overwhelmed,” said Rich Henderson, director of global education solutions at Lenovo, a China-based multinational technology vendor.

If children can learn independently, that can take pressure and stress off of parents, Henderson continued.

Independent learning can include searching for information on the web, querying using a virtual assistant, or using specific applications.

About 45% of millennials and younger students find technology “makes it much easier to learn about new things,” Henderson said.

Many parents, however, said in the survey that they felt the use of technology in education, while beneficial to their children’s learning, also led to decreases in social interactions. Using the technology to look up answers, instead of consulting parents, teachers or friends, raised concerns that “their children may be becoming too dependent on technology and may not be learning the necessary social skills they require,” according to the survey.

At the same time, however, many parents felt that the use of technology in education would eventually help future generations become more independent learners.


“Technology has certainly helped [children learn] with the use of high-speed internet, more automated translation tools. But we can’t ignore the fact that we need students to improve their social skills, also,” Henderson said. “That’s clearly a concern the parents have.”

Yet, despite the worries, technology vendors have poured more and more money into the education space. Lenovo itself sells a number of hardware and software products for the classroom, including infrastructure to help teachers manage devices in a classroom, and a virtual reality (VR) headset and software to build a VR classroom.

The VR classroom has benefited students taking online classes, giving them a virtual classroom or lab to learn in.

Google in education

Meanwhile Google, in an Aug. 15 blog post, promoted Socratic, a mobile learning application it had quietly acquired last year. The AI-driven application, released for iOS, can automatically solve mathematical and scientific equations by taking photos of them. The application can also search for answers to questions posed in natural language.

The use of technology in education provides benefits and challenges for students.

Also, Socratic features reference guides to topics frequently taught in schools, including algebra, biology and literature.

Microsoft, whose Office suite is used in many schools around the world, sells a range of educational and collaborative note-taking tools within its OneNote product. The tool, which includes AI-driven search functions, enables students to type in math equations, which it will automatically solve.
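The equation-solving features in tools like Socratic and OneNote are, at their core, automations of familiar procedures. As a toy illustration (not how either product is actually implemented), a few lines of Python can apply the quadratic formula the way a student would:

```python
import math

def solve_quadratic(a: float, b: float, c: float):
    """Solve ax^2 + bx + c = 0, returning real roots in ascending order.

    A toy sketch of the kind of step-by-step solving that tools like
    Socratic and OneNote automate; their real engines are far more general.
    """
    if a == 0:
        raise ValueError("not a quadratic equation")
    disc = b * b - 4 * a * c          # discriminant decides the root count
    if disc < 0:
        return []                     # no real roots
    root = math.sqrt(disc)
    return sorted({(-b - root) / (2 * a), (-b + root) / (2 * a)})

print(solve_quadratic(1, -3, 2))  # x^2 - 3x + 2 = 0 -> [1.0, 2.0]
```

The commercial tools add the harder parts: recognizing the equation in a photo, classifying it, and explaining each step in natural language.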

While apparently helpful, the increased use of technology in education, as well as the prevalence of AI-powered software for students, has sparked some criticism.

The larger implications

Mike Capps, CEO of AI startup Diveplane, which sells auditable, trainable, “transparent” AI systems, noted that the expanding use of AI and automation could make basic skills obsolete.

Many basic skills, including typing and driving, could eventually end up like Latin — learnable, potentially useful, but unnecessary.

AI systems could increasingly help make important life decisions for people, Capps said.

“More and more decisions about kids’ lives are made by computers, like college enrollment decisions and what car they should buy,” Capps said.


Get to know data storage containers and their terminology

Data storage containers have become a popular way to create and package applications for better portability and simplicity. Seen by some analysts as the technology to unseat virtual machines, containers have steadily gained more attention as of late, from customers and vendors alike.

Why choose containers and containerization over the alternatives? Containers work on bare-metal systems, cloud instances and VMs, and across Linux and select Windows and Mac OSes. Containers typically use fewer resources than VMs and can bind together application libraries and dependencies into one convenient, deployable unit.

Below, you’ll find key terms about containers, from technical details to specific products on the market. If you’re looking to invest in containerization, you’ll need to know these terms and concepts.

Getting technical

Containerization. With its roots in partitioning, containerization is an efficient data storage strategy that virtually isolates applications, enabling multiple containers to run on one machine but share the same OS. Containers run independent processes in a shared user space and are capable of running on different environments, which makes them a flexible alternative to virtual machines.

The benefits of containerization include reduced overhead on hardware and portability, while concerns include the security of data stored on containers. With all of the containers running under one OS, if one container is vulnerable, the others are as well.

Container management software. As the name indicates, container management software is used to simplify, organize and manage containers. Container management software automates container creation, destruction, deployment and scaling and is particularly helpful in situations with large numbers of containers on one OS. However, the orchestration aspect of management software is complex and setup can be difficult.

Products in this area include Kubernetes, an open source container orchestration software; Apache Mesos, an open source project that manages compute clusters; and Docker Swarm, a container cluster management tool.

Persistent storage. In order to be persistent, a storage device must retain data after being shut off. While persistence is essentially a given when it comes to modern storage, the rise of containerization has brought persistent storage back to the forefront.

Containers did not always support persistent storage, which meant that data created with a containerized app would disappear when the container was destroyed. Luckily, storage vendors have made enough advances in container technology to solve this issue and retain data created on containers.

Stateful app. A stateful app saves client data from the activities of one session for use in the next session. Most applications and OSes are stateful, but because stateful apps didn’t scale well in early cloud architectures, developers began to build more stateless apps.

With a stateless app, each session is carried out as if it was the first time, and responses aren’t dependent upon data from a previous session. Stateless apps are better suited to cloud computing, in that they can be more easily redeployed in the event of a failure and scaled out to accommodate changes.

However, containerization allows files to be pulled into the container during startup and persist somewhere else when containers stop and start. This negates the issue of stateful apps becoming unstable when introduced to a stateless cloud environment.
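The difference is easy to see in code. In this small sketch, the stateful counter keeps its session data in memory, so the data dies with the container, while the stateless version takes its state in and hands the new state back, letting any replica serve any request:

```python
# Stateful: the session counter lives inside the instance, so it is lost
# if the process (or container) holding it is destroyed and redeployed.
class StatefulCounter:
    def __init__(self):
        self.count = 0

    def hit(self) -> int:
        self.count += 1
        return self.count

# Stateless: every call carries its state in and returns the new state.
# Any replica can serve any request, which is why stateless apps redeploy
# and scale out so easily; the state itself persists elsewhere, such as a
# database or a volume pulled into the container at startup.
def stateless_hit(count: int) -> int:
    return count + 1

counter = StatefulCounter()
print(counter.hit(), counter.hit())        # state hidden inside the object
print(stateless_hit(stateless_hit(0)))     # state passed explicitly
```

This is a deliberately minimal illustration of the concept, not a pattern from any particular container platform.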

Container vendors and products

While there is one vendor undoubtedly ahead of the pack when it comes to modern data storage containers, the field has opened up to include some big names. Below, we cover just a few of the vendors and products in the container space.

Docker. Probably the name most synonymous with data storage containers, Docker is even credited with bringing about the container renaissance in the IT space. Docker’s platform is open source, which enables users to register and share containers over various hosts in both private and public environments. In recent years, Docker has made containers accessible and now offers various editions of its containerization technology.

When you refer to Docker, you likely mean either the company itself, Docker Inc., or the Docker Engine. Initially developed for Linux systems, the Docker Engine has been extended through version updates to run natively on both Windows and Apple OSes. The Docker Engine supports tasks and workflows involved in building, shipping and running container-based applications.

Container Linux. Originally referred to as CoreOS Linux, Container Linux by CoreOS is an open source OS that deploys and manages the applications within containers. Container Linux is based on the Linux kernel and is designed for massive scale and minimal overhead. Although Container Linux is open source, CoreOS sells support for the OS. Acquired by Red Hat in 2018, CoreOS develops open source tools and components.

Azure Container Instances (ACI). With ACI, developers can deploy data storage containers on the Microsoft Azure cloud. Organizations can spin up a new container via the Azure portal or command-line interface, and Microsoft automatically provisions and scales the underlying compute resources. ACI also supports standard Docker images and Linux and Windows containers.

Microsoft Windows containers. Windows containers are abstracted and portable operating environments supported by the Microsoft Windows Server 2016 OS. They can be managed with Docker and PowerShell and support established Windows technologies. Along with Windows Containers, Windows Server 2016 also supports Hyper-V containers.

VMware vSphere Integrated Containers (VIC). While VIC can refer to individual container instances, it is also a platform that deploys and manages containers within VMs from within VMware’s vSphere VM management software. Previewed under the name Project Bonneville, VMware’s play on containers comes with the virtual container host, which represents tools and hardware resources that create and control container services.


Automated transcription services for adaptive applications

Automated transcription services have a variety of applications. Enterprises frequently use them to transcribe meetings, and call centers use them to transcribe phone calls into text to more easily analyze the substance of each call.

The services are widely used to aid the deaf, by automatically providing subtitles to videos and television shows, as well as in call centers that enable the deaf to communicate with each other by transcribing each person’s speech.

VTCSecure and Google

VTCSecure, a several-years-old startup based in Clearwater, Fla., uses Google Cloud’s Speech-to-Text services to power a transcription platform that is used by businesses, non-profits, and municipalities around the world to aid the deaf and hard of hearing.

The platform offers an array of capabilities, including video services that connect users to a real-time sign-language interpreter, and deaf-to-deaf call centers. The call centers, enabling users to connect via video, voice or real-time text, build on Google Cloud’s Speech-to-Text technology to provide users with automatic transcriptions.

Google Cloud has long sold Speech-to-Text and Text-to-Speech services, which provide developers with the data and framework to create their own transcription or voice applications. For VTCSecure, the services, powered in part by speech technologies developed by parent company Alphabet Inc.’s DeepMind division, were easy to set up and adapt.

“It was one of the best processes,” said Peter Hayes, CEO of VTCSecure. He added that his company has been happy with what it considers a high level of support from Google.
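For developers evaluating the same services, Cloud Speech-to-Text’s v1 REST API accepts a JSON body with a `config` section and base64-encoded audio. The sketch below builds such a request body offline; the field names follow Google’s public API, but the helper function itself is illustrative, and a real call still requires an authenticated POST to the `speech:recognize` endpoint.

```python
import base64
import json

def build_recognize_request(audio_bytes: bytes, language_code: str = "en-US") -> str:
    """Build the JSON body for a Cloud Speech-to-Text v1 recognize call.

    Field names follow Google's public REST API; this helper is an
    illustrative sketch, and sending it still requires an authenticated
    POST to https://speech.googleapis.com/v1/speech:recognize.
    """
    body = {
        "config": {
            "encoding": "LINEAR16",        # raw 16-bit PCM samples
            "sampleRateHertz": 16000,
            "languageCode": language_code,
        },
        # Short audio is sent inline as base64; longer audio is usually
        # referenced via a Cloud Storage URI instead.
        "audio": {"content": base64.b64encode(audio_bytes).decode("ascii")},
    }
    return json.dumps(body)

request_json = build_recognize_request(b"\x00\x01" * 8000)
print(len(request_json))
```

Google also ships client libraries that wrap this request shape, which is typically the more convenient route in production code.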

Speech-to-text

Hayes said Google provides technologies, as well as development support, for VTCSecure and for his newest company, TranslateLive.

Hayes also runs the platform on Google Cloud, after doing a demo for the FTC that he said lagged on a rival cloud network.

Google Cloud’s Speech-to-Text and Text-to-Speech technology, as well as the translation technologies used for TranslateLive, constantly receive updates from Google, Hayes said.

Startup Verbit provides automated transcription services that it built in-house. While only two years old, the startup considers itself a competitor to Google Cloud’s transcription services, even releasing a blog post last year outlining how its automated transcription services can surpass Google’s.

Automatic transcription services from companies like Verbit are used by the deaf and hard of hearing

Transcription startup

Verbit, unlike Google, adds humans to the transcription loop, explained Tom Livne, co-founder and CEO of the Israel-based startup. It relies on its homegrown models for an initial transcription, then passes the draft to remote human transcribers, who review it and make edits to fine-tune the result.

The combined process produces high accuracy, Livne said.
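The human-in-the-loop flow Livne describes can be sketched as a two-stage pipeline: a machine pass produces a draft with per-segment confidence, and low-confidence segments are routed to a human reviewer. The confidence threshold and both stub functions below are illustrative assumptions, not Verbit’s actual system.

```python
from typing import Callable, List, Tuple

def machine_transcribe(audio: bytes) -> List[Tuple[str, float]]:
    """Stand-in ASR pass: returns (segment_text, confidence) pairs."""
    return [("the witness stated", 0.95), ("uh inaudible term", 0.40)]

def human_review(segment: str) -> str:
    """Stand-in for a remote human transcriber correcting one segment."""
    return segment.replace("inaudible term", "habeas corpus")

def transcribe(audio: bytes,
               asr: Callable[[bytes], List[Tuple[str, float]]] = machine_transcribe,
               review: Callable[[str], str] = human_review,
               threshold: float = 0.85) -> str:
    """Machine draft first; humans fine-tune only low-confidence segments."""
    final = []
    for text, confidence in asr(audio):
        final.append(text if confidence >= threshold else review(text))
    return " ".join(final)
```

Routing only uncertain segments to people is what lets a hybrid service keep accuracy high without paying for a full human pass on every minute of audio.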

A lawyer, Livne initially started Verbit to specifically sell to law firms. However, the vendor moved quickly into the education space.

“We want to create an equal opportunity for students with disabilities,” Livne said. Technology, he noted, has long been able to aid those with disabilities.

We want to create an equal opportunity for students with disabilities.
Tom LivneCo-founder and CEO, Verbit

George Mason University, a public university in Fairfax, Va., relies on Verbit to automatically transcribe videos and online lectures.

“We address the technology needs of students with disabilities here on campus,” said Korey Singleton, assistive technology initiative manager at George Mason.

After trying out other vendors, the school settled on Verbit largely because of its competitive pricing, Singleton said. Because most of its captioning and transcription work comes from developing online courses, the school doesn’t need a quick turnaround, which let Verbit offer a lower price.

“We needed to find a vendor that could do everything we needed to do and provide us with a really good rate,” Singleton said. Verbit provided that.

Moving forward, George Mason is looking for a way to integrate transcripts with the courses automatically. Today, putting them together is a manual process, but Singleton said he aims to automate it with APIs and other technologies.
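An integration like the one Singleton describes might start with something as simple as pairing each course video with its transcript file by a shared identifier, so an LMS upload step can run without manual matching. The file layout and naming convention below are assumptions for illustration, not George Mason’s setup.

```python
from pathlib import Path
from typing import Dict, List

def match_transcripts(video_ids: List[str],
                      transcript_dir: Path) -> Dict[str, Path]:
    """Pair each course video ID with a transcript file of the same stem.

    For example, video "lecture01" matches transcript_dir/"lecture01.vtt".
    Videos without a transcript are omitted so they can be flagged for
    follow-up rather than silently published uncaptioned.
    """
    pairs = {}
    for video_id in video_ids:
        candidate = transcript_dir / f"{video_id}.vtt"
        if candidate.exists():
            pairs[video_id] = candidate
    return pairs
```

From a mapping like this, a script could call the course platform’s API to attach each caption file to its video, replacing the manual step.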


Announcing new AI and mixed reality business applications for Microsoft Dynamics – The Official Microsoft Blog

Today, I had the opportunity to speak to press and analysts in San Francisco about our vision for business applications at Microsoft. In addition, I had the privilege to make two very important announcements: the upcoming availability of new Dynamics 365 AI applications, and our very first mixed reality business applications: Dynamics 365 Remote Assist and Dynamics 365 Layout.

Our vision for business applications at Microsoft

We live in a connected world where companies are challenged every day to innovate so they can stay ahead of emerging trends and repivot business models to take advantage of new opportunities to meet growing customer demands.

To innovate, organizations need to reimagine their processes. They need solutions that are modern, enabling new experiences for how they can engage their customers while making their people more productive. They need unified systems that break data silos, so they have a holistic view of their business, customers and employees. They need pervasive intelligence threaded throughout the platform, giving them the ability to reason over data, to predict trends and drive proactive intelligent action. And with adaptable applications, they can be nimble, allowing them to take advantage of the next opportunity that comes their way.

Two years ago, when we introduced Dynamics 365 we started a journey to tear down the traditional silos of customer relationship management (CRM) and enterprise resource planning (ERP). We set out to reimagine business applications as modern, unified, intelligent and adaptable solutions that are integrated with Office 365 and natively built on Microsoft Azure.

With the release of our new AI and mixed reality applications we are taking another step forward on our journey to help empower every organization on the planet to achieve more through the accelerant of business applications. Specifically, today we are making the following announcements:

Dynamics 365 + AI

First, I am happy to announce the coming availability of a new Dynamics 365 AI offering — a new class of AI applications that will deliver out-of-the-box insights by unifying data and infusing it with advanced intelligence to guide decisions and empower organizations to take informed actions. And because these insights are easily extensible through the power of Microsoft Power BI, Azure and the Common Data Service, organizations will be able to address even the most complex scenarios specific to their business.

Dynamics 365 AI for Sales: AI can help salespeople prioritize their time to focus on deals that matter most, provide answers to the most common questions regarding the performance of sales teams, offer a detailed analysis of the sales pipeline, and surface insights that enable smarter coaching of sales teams.

Dynamics 365 AI for Customer Service: With Microsoft’s AI and natural language understanding, customer service data can surface automated insights that help guide employees to take action and can even leverage virtual agents to help lower support costs and enable delightful customer experiences, all without needing in-house AI experts and without writing any code.

Dynamics 365 AI for Market Insights: Helps empower your marketing, social media and market research teams to make better decisions with market insights. Marketers can improve customer relationships with actionable web and social insights to engage in relevant conversations and respond faster to trends.

To help bring this to life, today we released a video with our CEO, Satya Nadella, and Navrina Singh, a member of our Dynamics 365 engineering team, showing examples of ways we’re bringing the power of AI to customer service organizations.

Dynamics 365 + Mixed Reality

Our second announcement of the day centers on the work we are doing to bring mixed reality and business applications together.

Since the release of Microsoft HoloLens over two years ago, the team has learned a lot from customers and partners. The reception HoloLens has received in the commercial space has been overwhelmingly positive, supported by increased demand and deployment from some of the world’s most innovative companies.

We recognize that many employees need information in context to apply their knowledge and craft: not just on a 2-D screen, but information and data in context, at the right place and at the right time, so employees can produce even greater impact for their organizations. Mixed reality is a technology uniquely suited to do exactly that.

This is a whole new kind of business application. And that’s precisely what we’re introducing today, Dynamics 365 Remote Assist and Dynamics 365 Layout.

Today, we also showcased for the first time how Chevron is deploying HoloLens to take advantage of Dynamics 365 mixed reality business applications.

Chevron is already achieving real, measurable results with its global HoloLens deployment. Previously, it had to fly an inspector from Houston to a facility in Singapore once a month to inspect equipment. Now it can perform just-in-time inspections using Dynamics 365 Remote Assist and identify issues or provide approvals immediately.

In addition, remote collaboration and assistance have helped the company operate more safely in a better work environment, serving as a connection point between firstline workers and remote experts, while cutting down on employee travel and the risks that come with it.

Here is a peek into the work Chevron is doing with mixed reality:

Unlock what’s next with the Dynamics 365 October 2018 release

Next week at Microsoft Ignite and Microsoft Envision we’ll be in Orlando talking with thousands of customers, partners, developers, and IT and business leaders about our October 2018 release for Dynamics 365 and the Power platform that will be generally available Oct. 1. The wave of innovation this represents across the entire product family is significant, with hundreds of new capabilities and features.

We will have a lot more to talk about in the weeks and months ahead. We look forward to sharing more!


The new business imperative: A unified cloud security strategy

As more businesses begin to explore the benefits of moving on-premises data and applications to the cloud, they’re having to rethink their traditional approaches to data security. Not only are cybercriminals developing more sophisticated attacks, but the growing number of employees and users who can access, edit, and share data has increased the risk of breaches. In fact, Gartner indicates* that “through 2022, 95 percent of cloud security failures will be the customer’s fault. CIOs can combat this by implementing and enforcing policies on cloud ownership, responsibility and risk acceptance. They should also be sure to follow a life cycle approach to cloud governance and put in place central management and monitoring planes to cover the inherent complexity of multicloud use.”

Instead of relying on a patchwork of third-party security solutions that don’t always speak to each other, potentially leaving systems vulnerable to attack, companies are now adopting a unified, end-to-end cloud security defense. This typically involves choosing a cloud provider that can integrate security controls right into existing corporate systems and processes. When these controls span the entire IT infrastructure, they make it easier to protect data and maintain user trust by offering increased compatibility, better performance, and more flexibility.

Protection that’s always compatible

A holistic, cloud-supported threat warning and detection system can be designed to work seamlessly across every asset of an IT environment. For instance, built-in security management solutions can give IT teams the ability to constantly monitor the entire system from a centralized location, rather than manually evaluating different machines. This allows them to sense threats early, provide identity monitoring, and more—all without any compatibility issues.

Container shipping company Mediterranean Shipping Company (MSC) has gone this route. As in many businesses, MSC’s IT environment is spread across a variety of locations, networks, and technologies, such as container ships, trucking networks, and offices. Its previous security strategy employed a mixture of third-party solutions that often ran into compatibility issues between different components, giving attackers a large surface area to probe. This made MSC vulnerable to threats such as fileless attacks, phishing, and ransomware. However, after transitioning to a unified cloud security solution, it has been able to guard against attacks using protection that integrates effortlessly into its existing environment.

Reliable performance, more efficiently

The more complex an IT environment gets, the more time employees spend testing, maintaining, and repairing third-party security solutions. A unified cloud security approach improves performance by not only providing a consistent, layered defense strategy, but by also automating it across the entire IT infrastructure. At MSC, software and security updates are now done automatically and deployed without delay across the cloud. Information about possible threats and breaches can quickly be shared across devices and identities, speeding up response and recovery times so that employees can focus on other issues.

Security with flexibility to grow

Scalability is another factor driving adoption. A cloud environment can easily scale to accommodate spikes in traffic, additional users, or data-intensive applications. A patchwork of third-party security solutions tends not to be so nimble. At MSC, security controls are integrated into multiple levels of the existing IT infrastructure—from the operating system to the application layer—and can be dynamically sized to meet new business needs. For example, continuous compliance controls can be established to monitor regulatory activities and detect vulnerabilities as they grow.

A unified security approach: becoming the standard

The best security solutions perform quietly in the background, protecting users without them noticing. Unified cloud security does this while also reducing the resources required to keep things running smoothly. “Once you have true defense in depth, there’s less chance of having to single out a user and impact their productivity because you have to reimage an infected machine,” said Aaron Shvarts, chief security officer at MSC Technology North America.

After moving its workloads to Azure and upgrading its previous third-party security solutions to the native protection of Windows Defender, MSC now has a defense strategy that suits the complexity of its business. Learn more about Azure security solutions and how Microsoft can help you implement unified security across your cloud.

To stay up to date on the latest news about Microsoft’s work in the cloud, bookmark this blog and follow us on Twitter, Facebook, and LinkedIn.

*Gartner, Smarter with Gartner, Is the Cloud Secure?, 27 March 2018, https://www.gartner.com/smarterwithgartner/is-the-cloud-secure/

VMware Project Dimension to deliver managed HCI, edge networking

VMware is developing a managed edge appliance that has compute and storage for running applications and a software-defined WAN for connecting to the data center and public clouds.

The upcoming offering is in technical preview under the name Project Dimension. The product is a lightweight hyper-converged infrastructure system that includes the vSphere infrastructure compute stack and the vSAN software-defined storage product.

For networking, Project Dimension uses VMware’s NSX SD-WAN by VeloCloud, which VMware acquired last year. The VeloCloud SD-WAN provides connectivity to the corporate data center, SaaS or applications running on IaaS.

Project Dimension is essentially the branch version of VMware’s Cloud Foundation, which merges compute, storage and network provisioning to simplify application deployment in the data center and the Amazon and Microsoft Azure public clouds. Companies could use Project Dimension to run IoT and other software in retail stores, factories and oil rigs, according to VMware. Actual hardware for the system would come from VMware partners.

Companies already using Cloud Foundation could apply their policies and security to applications running on Project Dimension.

“There’s a lot of potential for operational simplicity. There’s the potential for improved multi-cloud management, and there’s the potential for faster time to market [for users’ applications],” said Stephen Elliot, an analyst at IDC.

But Project Dimension’s hybrid cloud approach — which lets companies run some applications at the edge, while also connecting to software running in the cloud — could eventually make it a “niche product,” said Andrew Froehlich, president of computer consultancy West Gate Networks, based in Loveland, Colo.

“While hybrid architectures are extremely common today, most businesses are looking to get to a 100% public cloud model as soon as they can,” he said. “Thus, it’s an interesting concept — and one that some can use — but I don’t see this making a significant impact long term.”

How Project Dimension works as a managed service

VMware plans to offer Project Dimension as a managed service. A company would order the service by logging into the VMware Cloud and going to its Edge Portal, where the business would choose a Project Dimension resource cluster and a service-level agreement.

Businesses would then upload the IP addresses of the edge locations, where VMware would send technicians to install the Project Dimension system. Each system would appear as a separate cluster in the Edge Portal.

VMware plans to use its cloud-based lifecycle management system to fix failures and handle infrastructure firmware and software updates. As a result, companies could focus on developing and deploying business applications without having to worry about infrastructure maintenance.

VMware, which introduced Project Dimension last week at the VMworld conference in Las Vegas, did not say when it would release the product. Also, the company did not disclose pricing.