Tag Archives: Cloud

What admins need to know about Azure Stack HCI

Despite all the promise of cloud computing, it remains out of reach for administrators who, for various reasons, cannot migrate workloads out of the data center.

Many organizations still grapple with concerns, such as compliance and security, that weigh down any aspirations to move workloads from on-premises environments. For these organizations, hyper-converged infrastructure (HCI) products have stepped in to approximate some of the perks of the cloud, including scalability and high availability. In early 2019, Microsoft entered this market with Azure Stack HCI. While it was a new name, it was not an entirely new concept for the company.

Some might see Azure Stack HCI as a mere rebranding of the existing Windows Server Software-Defined (WSSD) program, but there are some key differences that warrant further investigation from shops that might benefit from a system that integrates with the latest software-defined features in the Windows Server OS.

What distinguishes Azure Stack HCI from Azure Stack?

When Microsoft introduced its Azure Stack HCI program in March 2019, there was some initial confusion from many in IT. The company already offered a similarly named product, Azure Stack, which runs a version of the Azure cloud platform inside the data center.

Microsoft developed Azure Stack HCI for local VM workloads that run on Windows Server 2019 Datacenter edition. While not explicitly tied to the Azure cloud, organizations that use Azure Stack HCI can connect to Azure for hybrid services, such as Azure Backup and Azure Site Recovery.

Azure Stack HCI offerings use OEM hardware from vendors such as Dell, Fujitsu, Hewlett Packard Enterprise and Lenovo that is validated by Microsoft to capably run the range of software-defined features in Windows Server 2019.

How is Azure Stack HCI different from the WSSD program?

While Azure Stack is essentially an on-premises version of the Microsoft cloud computing platform, its approximate namesake, Azure Stack HCI, is more closely related to the WSSD program that Microsoft launched in 2017.

Microsoft made its initial foray into the HCI space with its WSSD program, which utilized the software-defined features in the Windows Server 2016 Datacenter edition on hardware validated by Microsoft.

For Azure Stack HCI, Microsoft uses the Windows Server 2019 Datacenter edition as the foundation of this product with updated software-defined functionality compared to Windows Server 2016.

Windows Server gives administrators the virtualization layers necessary to avoid the management and deployment issues related to proprietary hardware. Windows Server’s software-defined storage, networking and compute capabilities enable organizations to more efficiently pool the hardware resources and use centralized management to sidestep traditional operational drawbacks.

For example, Windows Server 2019 expands pooled storage in Storage Spaces Direct to 4 petabytes (PB), compared with 1 PB in Windows Server 2016. Microsoft also updated the clustering feature in Windows Server 2019 for improved workload resiliency and added data deduplication, which yields an average of 10 times more usable storage capacity than Windows Server 2016.

What are the deployment and management options?

The Azure Stack HCI product requires the use of the Windows Server 2019 Datacenter edition, which the organization might get from the hardware vendor for a lower cost than purchasing it separately.

To manage the Azure Stack HCI system, Microsoft recommends using Windows Admin Center, a relatively new GUI tool developed as the potential successor to Remote Server Administration Tools, Microsoft Management Console and Server Manager. Microsoft tailored Windows Admin Center for smaller deployments, such as Azure Stack HCI.

[Image: The Windows Admin Center drive dashboard, which administrators can use to check drive performance for latency issues or failed drives.]

Windows Admin Center encapsulates a number of traditional server management utilities for routine tasks, such as registry edits. It also handles more advanced functions, such as deploying and managing Azure services, including Azure Network Adapter, which companies can use to set up encryption for data transmitted between offices.

Companies that purchase an Azure Stack HCI system get Windows Server 2019 for its virtualization technology, which pools storage and compute resources from two nodes up to 16 nodes to run VMs on Hyper-V. Microsoft positions Azure Stack HCI as an ideal system for multiple scenarios, such as remote office/branch office and virtual desktop infrastructure (VDI), and for use with data-intensive applications, such as Microsoft SQL Server.

How much does it cost to use Azure Stack HCI?

The Microsoft Azure Stack HCI catalog features more than 150 models from 20 vendors. A general-purpose node will cost about $10,000, but the final price will vary depending on the level of customization the buyer wants.

There are multiple server configuration options that cover a range of processor models, storage types and networking. For example, some nodes have ports with 1 Gigabit Ethernet, 10 GbE, 25 GbE and 100 GbE, while other nodes support a combination of 25 GbE and 10 GbE ports. Appliances optimized for performance with all-flash storage will cost more than units with slower, traditional spinning disks.

On top of the hardware price are annual maintenance and support fees, which are typically a percentage of the purchase price of the appliance.

If a company opts to tap into the Azure cloud for certain services, such as Azure Monitor, which analyzes application data to determine whether a problem is about to occur, then additional fees come into play. Organizations that keep their Azure Stack HCI system strictly on premises will avoid these extra costs.


For Sale – Synology DS218j 2 Bay NAS

I have for sale a mint condition Synology NAS drive (HDD drives not included).

In perfect working order with Original packaging.
Selling as I have purchased a 4 bay NAS.

I’ll post some images in the next couple of days once I’ve had a chance to remove it from my system and set up the new NAS.
In the meantime I’ve posted a link above for product details on Synology’s website.

Location
Wimblington, Cambridgeshire
Price and currency
£100
Delivery cost included
Delivery is included
Prefer goods collected?
I prefer the goods to be collected
Advertised elsewhere?
Not advertised elsewhere
Payment method
Bank Transfer


KPMG expects to invest US$5 billion in digital strategy and expand Microsoft alliance to accelerate professional services transformation

New innovations built on Microsoft cloud and AI technologies help clients achieve greater accuracy and decision-making capabilities, increased productivity, and cost efficiencies.

AMSTELVEEN, Netherlands and REDMOND, Wash. — Dec. 5, 2019 — KPMG and Microsoft Corp. are strengthening their global relationship through a five-year agreement to accelerate digital transformation for KPMG member firms and mutual clients. As part of its announced plan to invest significantly in technology, people and innovation, KPMG is modernizing its workplace using the Microsoft 365 suite of cloud-based collaboration and productivity tools, including Microsoft Teams. KPMG is also utilizing Microsoft Azure and its AI capabilities as the backbone for a new common, global cloud-based platform. The platform will strengthen KPMG’s range of digital offerings with new innovations in cloud-based audit capabilities, tax solutions and risk management. Clients in all sectors, including those in highly regulated industries, benefit from globally consistent and continuous service delivery that enables greater speed of deployment while adhering to industry-leading compliance and security standards.

“Together with KPMG, we’re accelerating digital transformation across industries by bringing the latest advances in cloud, AI and security to highly regulated workloads in tax, audit and advisory,” said Satya Nadella, CEO, Microsoft. “KPMG’s deep industry and process expertise, combined with the power of our trusted cloud — spanning Azure, Dynamics 365 and Microsoft 365 — will bring the best of both organizations together to help customers around the world become more agile in an increasingly complex business environment.”

New business-critical solutions

As organizations expand to new geographies, develop new products and recruit new talent, processes can become increasingly complex and harder to scale. Market forces, such as evolving data protection laws, currency fluctuations and geopolitical tensions, increase the complexity and require a greater responsiveness for systems and tools.

The strong portfolio of KPMG and Microsoft alliance offerings can help address these challenges more quickly by building applications on demand, automating manual processes, and continuously analyzing information to minimize the risk of errors and increase the ability to make smart decisions.

“Our alliance with Microsoft has become a critical component in helping us deliver industry-leading solutions and services to clients. Our significant multiyear investment continues to empower our people to work more efficiently and collaboratively while maximizing the power of a workforce that blends people and technology,” said Bill Thomas, Global Chairman, KPMG International. “By harnessing Microsoft’s intelligent and trusted cloud, we aim to help clients be at the forefront of change and better prepared for a digital-first future.”

Combining industry expertise with advanced technology

Through a jointly funded incubator, KPMG and Microsoft are co-developing a robust portfolio of solutions and managed services in the areas of cloud business transformation, intelligent business applications and smart workplace solutions.

For example, KPMG in the U.S. and Microsoft are working together to bring the power of Azure to the healthcare and life sciences industries. This collaboration is enabling organizations within this highly regulated sector to maximize their clinical, operational and financial performance with an easily scalable solution that helps improve deployment speed, accelerate ROI and increase data-driven insights.

In addition, KPMG in the Netherlands has developed risk management, compliance and internal audit solutions that leverage discovery tools to enable the digitization of risk and compliance processes across domains such as finance, legal and IT. Designed by KPMG and built on Microsoft Azure, the solutions provide seamless and cost-efficient policy and controls automation, putting smart analytics directly in the hands of business and IT operators so they can make timely, corrective actions when deviations occur.

Smart audit platform

KPMG, with the launch of its smart audit platform KPMG Clara in 2017, became the first of the Big Four to take its audit workflow to the cloud. Based on Microsoft Azure, KPMG Clara is an automated, agile, intelligent and scalable platform that allows KPMG professionals to work smarter, bringing powerful data and analytics capabilities into one interface, while allowing clients to interact on a real-time basis with the audit process.

By enriching the audit mandate with AI, KPMG enables its professionals to make decisions based on real-time data. This further reinforces KPMG’s commitment to maintaining and enhancing audit quality and building a future where technology continually enriches the audit through the introduction of new digital advancements.

KPMG Clara will integrate with Microsoft Teams, providing a platform for audit professionals to centrally manage and securely share audit files, track audit-related activities, and communicate using chat, voice and video meetings. This helps simplify the auditors’ workflow, enabling them to stay in sync throughout the entire process and drive continuous communication with the client.

Empowering workforce transformation

Through its common, global cloud platform, KPMG will create a set of cloud-based capabilities ranging from hosting infrastructure based on Microsoft Azure to more than 50 advanced solutions, such as AI, cyber and robotic process automation (RPA). KPMG will further empower its global workforce of over 207,000 employees across 153 countries with Microsoft 365, including Teams, to better meet the needs of clients through increased collaboration, improved productivity and data-driven insights. In addition, more than 30,000 KPMG professionals across 17 member firms have been equipped with Dynamics 365, a suite of powerful customer relationship applications.

To read more about the KPMG and Microsoft alliance, visit the Microsoft Transform blog.

About KPMG International 

KPMG is a global network of professional services firms providing Audit, Tax and Advisory services. We operate in 153 countries and territories and have 207,000 people working in member firms around the world. The independent member firms of the KPMG network are affiliated with KPMG International Cooperative (“KPMG International”), a Swiss entity. Each KPMG firm is a legally distinct and separate entity and describes itself as such.

About Microsoft

Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

For more information, press only:
Microsoft Media Relations, WE Communications for Microsoft, (425) 638-7777,
[email protected]
Mark Walters, KPMG International, (646) 943-2115, [email protected]

Note to editors: For more information, news and perspectives from Microsoft, please visit the Microsoft News Center at http://news.microsoft.com. Web links, telephone numbers and titles were correct at time of publication but may have changed. For additional assistance, journalists and analysts may contact Microsoft’s Rapid Response Team or other appropriate contacts listed at http://news.microsoft.com/microsoft-public-relations-contacts.


SageMaker Studio makes model building, monitoring easier

LAS VEGAS — AWS launched a host of new tools and capabilities for Amazon SageMaker, AWS’ cloud platform for creating and deploying machine learning models. Drawing the most notice was Amazon SageMaker Studio, a web-based integrated development environment (IDE).

In addition to SageMaker Studio, the IDE for building, using and monitoring machine learning models, the other new AWS products aim to make it easier for non-expert developers to create models and to make those models more explainable.

During a keynote presentation at the AWS re:Invent 2019 conference here Tuesday, AWS CEO Andy Jassy described five other new SageMaker tools: Experiments, Model Monitor, Autopilot, Notebooks and Debugger.

“SageMaker Studio along with SageMaker Experiments, SageMaker Model Monitor, SageMaker Autopilot and Sagemaker Debugger collectively add lots more lifecycle capabilities for the full ML [machine learning] lifecycle and to support teams,” said Mike Gualtieri, an analyst at Forrester.

New tools

SageMaker Studio, Jassy claimed, is a “fully-integrated development environment for machine learning.” The new platform pulls together all of SageMaker’s capabilities, along with code, notebooks and datasets, into one environment. AWS intends the platform to simplify SageMaker, enabling users to create, deploy, monitor, debug and manage models in one environment.

Google and Microsoft have similar machine learning IDEs, Gualtieri noted, adding that Google plans for its IDE to be based on DataFusion, its cloud-native data integration service, and to be connected to other Google services.

SageMaker Notebooks aims to make it easier to create and manage open source Jupyter notebooks. With elastic compute, users can create one-click notebooks, Jassy said. The new tool also enables users to more easily adjust compute power for their notebooks and transfer the content of a notebook.

Meanwhile, SageMaker Experiments automatically captures input parameters, configuration and results of developers’ machine learning models to make it simpler for developers to track different iterations of models, according to AWS. Experiments keeps all that information in one place and introduces a search function to comb through current and past model iterations.

[Image: AWS CEO Andy Jassy talks about new Amazon SageMaker capabilities at re:Invent 2019.]

“It is a much, much easier way to find, search for and collect your experiments when building a model,” Jassy said.
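
In code, that tracking workflow can be sketched roughly as follows with boto3, assuming a boto3 release that includes the Experiments operations announced at re:Invent 2019; the experiment and trial names here are purely illustrative.

```python
# Minimal sketch of SageMaker Experiments tracking via boto3, assuming a
# boto3 release that includes the Experiments operations announced at
# re:Invent 2019. Experiment and trial names are illustrative.
import boto3

sm = boto3.client("sagemaker", region_name="us-east-1")

# An experiment groups related trials (model iterations) in one place.
sm.create_experiment(
    ExperimentName="churn-prediction",
    Description="Iterations of the customer churn model",
)

# Each trial represents one iteration; training jobs are associated with
# a trial as trial components, so parameters, configuration and results
# remain searchable across current and past runs.
sm.create_trial(
    ExperimentName="churn-prediction",
    TrialName="churn-xgboost-run-1",
)
```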

As the name suggests, SageMaker Debugger enables users to debug and profile their models more effectively. The tool collects and monitors key metrics from popular frameworks, and provides real-time metrics about accuracy and performance, potentially giving developers deeper insights into their own models. It is designed to make models more explainable for non-data scientists.

SageMaker Model Monitor also tries to make models more explainable by helping developers detect and fix concept drift, which refers to the evolution of data and data relationships over time. Unless models are updated in near real time, concept drift can drastically skew the accuracy of their outputs. Model Monitor constantly scans the data and model outputs to detect concept drift, alerting developers when it detects it and helping them identify the cause.
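
Concept drift itself can be demonstrated without any AWS tooling. The following is a minimal, self-contained sketch in plain NumPy, not Model Monitor’s actual engine, that flags drift when a live feature distribution diverges from its training baseline:

```python
# Illustrative concept-drift check, independent of SageMaker Model Monitor:
# compare live data against the training baseline with the population
# stability index (PSI). The 0.25 threshold is a conventional rule of thumb.
import numpy as np

def psi(baseline, live, bins=10):
    """Population stability index between two 1-D samples."""
    edges = np.histogram_bin_edges(baseline, bins=bins)
    base_pct = np.histogram(baseline, bins=edges)[0] / len(baseline)
    live_pct = np.histogram(live, bins=edges)[0] / len(live)
    # Clip empty buckets to avoid division by zero and log(0).
    base_pct = np.clip(base_pct, 1e-6, None)
    live_pct = np.clip(live_pct, 1e-6, None)
    return float(np.sum((live_pct - base_pct) * np.log(live_pct / base_pct)))

rng = np.random.default_rng(0)
baseline = rng.normal(0.0, 1.0, 10_000)  # feature distribution at training time
live = rng.normal(0.4, 1.2, 10_000)      # the live distribution has shifted

score = psi(baseline, live)
print(f"PSI = {score:.3f}")
if score > 0.25:
    print("Drift detected: investigate inputs or retrain the model.")
```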

Automating model building

With Amazon SageMaker Autopilot, developers can automatically build models without, according to Jassy, sacrificing explainability.

Autopilot is “AutoML with full control and visibility,” he asserted. AutoML is essentially the automation of machine learning model development.

The new Autopilot module automatically selects the correct algorithm based on the available data and use case and then trains 50 unique models. Those models are then ranked by accuracy.
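
Starting such a job programmatically looks roughly like the boto3 sketch below; the S3 paths, IAM role ARN and job name are placeholders, and the request shape should be verified against the SDK version in use.

```python
# Rough sketch of starting a SageMaker Autopilot job with boto3. The S3
# paths, IAM role ARN and job name are placeholders; check the request
# shape against the boto3 version in use.
import boto3

sm = boto3.client("sagemaker", region_name="us-east-1")

sm.create_auto_ml_job(
    AutoMLJobName="churn-autopilot",
    InputDataConfig=[{
        "DataSource": {"S3DataSource": {
            "S3DataType": "S3Prefix",
            "S3Uri": "s3://example-bucket/churn/train/",
        }},
        "TargetAttributeName": "churned",  # the column Autopilot should predict
    }],
    OutputDataConfig={"S3OutputPath": "s3://example-bucket/churn/output/"},
    RoleArn="arn:aws:iam::123456789012:role/ExampleSageMakerRole",
)

# Once the job finishes, the top-ranked candidate can be inspected.
best = sm.describe_auto_ml_job(AutoMLJobName="churn-autopilot").get("BestCandidate")
```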

“AutoML is the future of ML development. I predict that within two years, 90 percent of all ML models will be created using AutoML by data scientists, developers and business analysts,” Gualtieri said.


“SageMaker Autopilot is a must-have for AWS, but it probably will help” other vendors also, including such AWS competitors as DataRobot because the AWS move further legitimizes the automated machine learning approach, he continued.

Other AWS rivals, including Google Cloud Platform, Microsoft Azure, IBM, SAS, RapidMiner, Aible and H2O.ai, also have automated machine learning capabilities, Gualtieri noted.

However, according to Nick McQuire, vice president at advisory firm CCS Insight, some of the new AWS capabilities are innovative.

“Studio is a great complement to the other products as the single pane of glass developers and data scientists need and its incorporation of the new features, especially Model Monitor and Debugger, are among the first in the market,” he said.

“Although AWS may appear late to the game with Studio, what they are showing is pretty unique, especially the positioning of the IDE as similar to traditional software development with … Experiments, Debugger and Model Monitor being integrated into Studio,” McQuire said. “These are big jumps in the SageMaker capability on what’s out there in the market.”

Google also recently released several new tools aimed at delivering explainable AI, plus a new product suite, Google Cloud Explainable AI.


IBM Cloud Pak for Security aims to unify hybrid environments

IBM this week launched Cloud Pak for Security, which experts say represents a major strategy shift for Big Blue’s security business.

The aim of IBM’s Cloud Pak for Security is to create a platform built on open-source technology that can connect security tools from multiple vendors and cloud platforms in order to help reduce vendor lock-in. IBM Cloud Paks are pre-integrated and containerized software running on Red Hat OpenShift, and previously IBM had five options for Cloud Paks — Applications, Data, Integration, Automation and Multicloud Management — which could be mixed and matched to meet enterprise needs.

Chris Meenan, director of offering management and strategy at IBM Security, told SearchSecurity that Cloud Pak for Security was designed to tackle two “big rock problems” for infosec teams. The first aim was to help customers get data insights through federated search of their existing data without having to move it to one place. Second was to help “orchestrate and take action across all of those systems” via built-in case management and automation. 

Meenan said IT staff will be able to take actions across a multi-cloud environment, including “quarantining users, blocking IP addresses, reimaging machines, restarting containers and forcing password resets.”

“Cloud Pak for Security is the first platform to take advantage of STIX-Shifter, an open-source technology pioneered by IBM that allows for unified search for threat data within and across various types of security tools, datasets and environments,” Meenan said. “Rather than running separate, manual searches for the same security data within each tool and environment you’re using, you can run a single query with Cloud Pak for Security to search across all security tools and data sources that are connected to the platform.” 
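
STIX-Shifter ships as an open-source Python package, so the translation step Meenan describes can be sketched as follows; the target module and pattern are illustrative, and the call signature may vary between stix-shifter releases, so treat this as approximate.

```python
# Sketch of translating a STIX pattern into a native query with the
# open-source stix-shifter package (pip install stix-shifter). The target
# module and pattern are illustrative, and the translate() signature may
# differ between releases, so treat this as approximate.
from stix_shifter.stix_translation import stix_translation

translation = stix_translation.StixTranslation()

# One STIX 2 pattern; Cloud Pak for Security fans a query like this out
# to every connected tool, but here we translate for a single module.
pattern = ("[ipv4-addr:value = '198.51.100.1'] "
           "START t'2019-11-01T00:00:00.000Z' STOP t'2019-11-08T00:00:00.000Z'")

result = translation.translate(
    "splunk",   # target data source module
    "query",    # translation direction: STIX pattern -> native query
    "{}",       # data source description (not needed for this step)
    pattern,
)
print(result)   # the native Splunk search generated from the STIX pattern
```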

Meenan added that Cloud Pak for Security represented a shift in IBM Security strategy because of its focus on delivering “security solutions and outcomes without needing to own the data.”

“That’s probably the biggest shift — being able to deliver that to any cloud or on-premise the customer needs,” Meenan said. “Being able to deliver that without owning the data means organizations can deploy any different technology and it’s not a headwind. Now they don’t need to duplicate the data. That’s just additional overhead and introduces friction.”

One platform to connect them all

Meenan said IBM was “very deliberate” to keep data transfers minimal, so at first Cloud Pak for Security will only take in alerts from connected vendor tools and search results.

“As our Cloud Pak develops, we plan to introduce some capability to create alerts and potentially store data as well, but as with other Cloud Paks, the features will be optional,” Meenan said. “What’s really fundamental is we’ve designed a Cloud Pak to deliver applications and outcomes but you don’t have to bring the data and you don’t have to generate the alerts. Organizations have a SIEM in place, they’ve got an EDR in place, they’ve got all the right alerts and insights, what they’re really struggling with is connecting all that in a way that’s easily consumable.”

In order to create the connections to popular tools and platforms, IBM worked with clients and service providers. Meenan said some connectors were built by IBM and some vendors built their own connectors. At launch, Cloud Pak for Security will include integration for security tools from IBM, Carbon Black, Tenable, Elastic, McAfee, BigFix and Splunk, with integration for Amazon Web Services and Microsoft Azure clouds coming later in Q4 2019, according to IBM’s press release.

Ray Komar, vice president of technical alliances at Tenable, said that from an integration standpoint, Cloud Pak for Security “eliminates the need to build a unique connector to various tools, which means we can build a connector once and reuse it everywhere.”

“Organizations everywhere are reaping the benefits of cloud-first strategies but often struggle to ensure their dynamic environments are secure,” Komar told SearchSecurity. “With our IBM Cloud Pak integration, joint customers can now leverage vulnerability data from Tenable.io for holistic visibility into their cloud security posture.”

Jon Oltsik, senior principal analyst and fellow at Enterprise Strategy Group, based in Milford, Mass., told SearchSecurity that he likes this new strategy for IBM and called it “the right move.”

“IBM has a few strong products but other vendors have much greater market share in many areas. Just about every large security vendor offers something similar, but IBM can pivot off QRadar and Resilient and extend its footprint in its base. IBM gets this and wants to establish Cloud Pak for Security as the ‘brains’ behind security. To do so, it has to be able to fit nicely in a heterogeneous security architecture,” Oltsik said. “IBM can also access on-premises data, which is a bit of unique implementation. I think IBM had to do this as the industry is going this way.”

Martin Kuppinger, founder and principal analyst at KuppingerCole Analysts AG, based in Wiesbaden, Germany, said Cloud Pak for Security should be valuable for customers, specifically “larger organizations and MSSPs that have a variety of different security tools from different vendors in place.”

“This allows for better incident response processes and better analytics. Complex attacks today might span many systems, and analysis requires access to various types of security information. This is simplified, without adding yet another big data lake,” Kuppinger told SearchSecurity. “Obviously, Security Cloud Pak might be perceived competitive by incident response management vendors, but it is open to them and provides opportunities by building on the federated data. Furthermore, a challenge with federation is that the data sources must be up and running for accessing the data — but that can be handled well, specifically when it is only about analysis; it is not about real-time transactions here.”

The current and future IBM Security products

Meenan told SearchSecurity that Cloud Pak for Security would not have any special integration with IBM Security products, which would “have to stand on their own merits” in order to be chosen by customers. However, Meenan said new products in the future will leverage the connections enabled by the Cloud Pak.

“Now what this platform allows us to do is to deliver new security solutions that are naturally cross-cutting, that require solutions that can sit across an EDR, a SIEM, multiple clouds, and enable those,” Meenan said. “When we think about solutions for insider threat, business risk, fraud, they’re very cross-cutting use cases so anything that we create that cuts across and provides that end-to-end security, absolutely the Cloud Pak is laying the foundation for us — and our partners and our customers — to deliver that.”

Oltsik said IBM’s Security Cloud Pak has a “somewhat unique hybrid cloud architecture” but noted that it is “a bit late to market and early versions won’t have full functionality.”

“I believe that IBM delayed its release to align it with what it’s doing with Red Hat,” Oltsik said. “All that said, IBM has not missed the market, but it does need to be more aggressive to compete with the likes of Cisco, Check Point, FireEye, Fortinet, McAfee, Palo Alto, Symantec, Trend Micro and others with similar offerings.”

Kuppinger said that from an overall IBM Security perspective, this platform “is rather consequent.”

“IBM, with its combination of software, software services, and implementation/consultancy services, is targeted on such a strategy of integration,” Kuppinger wrote via email. “Not owning data definitely is a smart move. Good architecture should segregate data, identity, and applications/apps/services. This allows for reuse in modern, service-oriented architectures. Locking-in data always limits that reusability.”


Google joins bare-metal cloud fray

Google has introduced bare-metal cloud deployment options geared for legacy applications such as SAP, for which customers require high levels of performance along with deeper virtualization controls.

“[Bare metal] is clearly an area of focus of Google,” and one underscored by its recent acquisition of CloudSimple for running VMware workloads on Google Cloud, said Deepak Mohan, an analyst at IDC.


IBM, AWS and Azure have their own bare-metal cloud offerings, which allow them to support an ESXi hypervisor installation for VMware, and Bare Metal Solution will apparently underpin CloudSimple’s VMware service on Google, Mohan added.

But Google will also be able to support other workloads that can benefit from bare metal availability, such as machine learning, real-time analytics, gaming and graphical rendering. Bare-metal cloud instances also avert the “noisy neighbor” problem that can crop up in virtualized environments as clustered VMs seek out computing resources, and do away with the general hit to performance known commonly as the “hypervisor tax.”

Google’s bare-metal cloud instances offer a dedicated interconnect to customers and tie into all native Google Cloud services, according to a blog post. The hardware has been certified to run “multiple enterprise applications,” including ones built on top of Oracle’s database, Google said.

Oracle, which lags far behind in the IaaS market, has sought to preserve some of those workloads as customers move to the cloud.


Earlier this year, it formed a cloud interoperability partnership with Microsoft, pushing a use case wherein customers could run enterprise application logic and presentation tiers on Azure infrastructure, while tying back to an Oracle database running on bare-metal servers or specialized Exadata hardware in Oracle’s cloud.

Not all competitive details laid bare

Overall, bare-metal cloud is a niche market, but by some estimates it is growing quickly.

Among hyperscalers such as AWS, Google and Microsoft, the battleground is in early days, with AWS only making its bare-metal offerings generally available in May 2018. Microsoft has mostly positioned bare metal for memory-intensive workloads such as SAP HANA, while also offering it underneath CloudSimple’s VMware service for Azure.

Meanwhile, Google’s bare-metal cloud service is fully managed by Google, provides a set of provisioning tools for customers, and will have unified billing with other Google Cloud services, according to the blog.

How smoothly this all works together could be a key differentiator for Google in comparison with rival bare-metal providers. Management of bare-metal machines can be more granular than traditional IaaS, which can mean increased flexibility as well as complexity.

Google’s Bare Metal Solution instances are based on x86 systems that range from 16 cores with 384 GB of DRAM, to 112 cores with 3,072 GB of DRAM. Storage comes in 1 TB chunks, with customers able to choose between all-flash or a mix of storage types. Google also plans to offer custom compute configurations to customers with that need.

It also remains to be seen how price-competitive Google is on bare metal compared with competitors, which includes providers such as Packet, CenturyLink and Rackspace.

The company didn’t immediately provide costs for Bare Metal Solution instances, but said the hardware can be purchased via monthly subscription, with the best deals for customers that sign 36-month terms. Google won’t charge for data movement between Bare Metal Solution instances and general-purpose Google Cloud infrastructure if it occurs in the same cloud region.


SAP Data Warehouse Cloud may democratize data analytics

SAP Data Warehouse Cloud is the latest market entrant designed to democratize data analytics.

SAP Data Warehouse Cloud is a data warehouse as a service product that became generally available in October. It connects a variety of data sources but enables organizations to decide where and how they want to keep and access the data.

It includes some features that should make it attractive to business users, particularly for those already running SAP systems, but SAP Data Warehouse Cloud faces a crowded market. Still, it’s a product SAP clearly needs to remain competitive in the data warehouse and data analytics game, according to one analyst.


Making data analytics more applicable and accessible to the business is the main goal of SAP Data Warehouse Cloud, said Dan Lahl, SAP vice president of product marketing.

“In SAP Data Warehouse Cloud we’ve added new things to the data warehouses that allow customers to define their own data sets, and either virtualize or pull in that data so they can make decisions on exactly the data they want to look at,” Lahl said. “Usually you get five guys with four spreadsheets and everybody’s arguing over who’s got the data and the decision of record. SAP Data Warehouse Cloud will either virtualize or move data, specifically to the information the customer needs, and then the business user decides what dashboards or what reports they want to use on that data. It’s getting to business decisions more quickly for business users.”

One feature, Spaces, enables business users to organize data in ways that align with their business or process needs.

“Data warehouses used to take a long time to build. It was very expensive, it took a number of people to build them, and by the time you got to build the reports that you wanted, nobody wanted them anymore,” Lahl said. “This is kind of the antithesis of that, getting the business user to look and access the data they want and then build reporting and visualizations off of that.”

SAP Data Warehouse Cloud also includes SAP Analytics Cloud, which provides built-in reporting and dashboard capabilities.

Opening up analytics

Velux Group, a global manufacturer of windows based in Horsholm, Denmark, has been evaluating SAP Data Warehouse Cloud as a key part of the evolution of its data analytics program. The company has 27 production sites in 10 countries, sales offices in 40 countries, and 11,500 employees.


Velux Group is a longtime SAP customer, running SAP ECC for ERP and BusinessObjects BI and Business Warehouse (BW) for data analytics and storage. However, the company has run into some limitations with BusinessObjects and is looking to overcome those limitations with SAP Data Warehouse Cloud, said Andreas Madsen, senior data and analytics partner at Velux.

Velux is in the process of developing a new business model to sell more directly to consumers, in addition to the traditional channel of installers and resellers.

“Normally, it’s the installers and dealers that actually sell our windows — we sell through them. So we don’t know that much about our end users, but that’s changing as we’re moving into the online market as well,” Madsen said.

Direct selling requires more flexibility than what Velux’s current systems offer.

“There’s a transition in what we need to know about our end users and how we use data,” he said. “And although we have a very good, well-functioning ECC system, BW, and some classic reporting, it’s really hard to navigate in, it’s really hard to explore data when everything is structured in the data warehouse.”

The ultimate strategic goal is for Velux to become more customer-centric, and Madsen said that in order to do this, the company needs a more open data platform that enables business users to link the data as needed.

“There is a world outside of our corporate enterprise systems. We cover all the business processes in Velux, but if you look at marketing, they might have 50 or 60 different databases that they use data from, and they need to be able to cater to that data as well,” Madsen said. “It’s important data, but it’s not something that we are in control of, so we need to give them a platform where they can operate and then combine that with the enterprise data.”

SAP’s needed reaction to data analytics market

SAP Data Warehouse Cloud is a necessary evolution of SAP’s HANA-based data analytics approach, especially given the crowded, competitive analytics market, said Dana Gardner, president and principal analyst with Interarbor Solutions LLC.


“There are lot of companies out there like Qlik, Tableau, and others that are making inroads and we’re seeing more of this analytics as a service and the democratization of data,” Gardner said. “SAP is recognizing that they need to compete better in all aspects of data analytics and not just the enterprise systems of record integration part of it. But it needs to be noted that SAP is reacting to the market rather than making the market.”


The SAP Data Warehouse Cloud approach focuses on the democratization of data analytics and makes it simple and automated behind the scenes, so that more business users get the types of analytics they need based on their business role and what work they are doing, Gardner said.

“You don’t want to make data analytics just available to data scientists. It’s time to break down the ivory tower,” he said. “The more people that use more analytics in your organization, the better you’re going to be as a company, so it’s important that SAP gets aggressive and out on front on this.”


AWS, Azure and Google peppered with outages in same week

AWS, Microsoft Azure and Google Cloud all experienced service degradations or outages this week, an outcome that suggests customers should accept that cloud outages are a matter of when, not if.

In AWS’s Frankfurt region, EC2, Relational Database Service, CloudFormation and Auto Scaling were all affected Nov. 11, with the issues now resolved, according to AWS’s status page.

Azure DevOps services for Boards, Repos, Pipelines and Test Plans were affected for a few hours in the early hours of Nov. 11, according to its status page. Engineers determined that the problem had to do with identity calls and rebooted access tokens to fix the system, the page states.

Google Cloud said some of its APIs in several U.S. regions were affected, and others experienced problems globally on Nov. 11, according to its status dashboard. Affected APIs included those for Compute Engine, Cloud Storage, BigQuery, Dataflow, Dataproc and Pub/Sub. Those issues were resolved later in the day.

Google Kubernetes Engine also went through some hiccups over the past week, in which nodes in some recently upgraded container clusters experienced high rates of kernel panics. Known colloquially by terms such as the “blue screen of death,” a kernel panic is a condition in which a system’s OS can’t recover from an error quickly or easily.

The company rolled out a series of fixes, but as of Nov. 13, the status page for GKE remained in orange status, which indicates a small number of projects are still affected.

AWS, Microsoft and Google have yet to provide the customary post-mortem reports on why the cloud outages occurred, although more information could emerge soon.

Move to cloud means ceding some control

The cloud outages at AWS, Azure and Google this week were far from the worst experienced by customers in recent years. In September 2018, severe weather in Texas caused a power surge that shut down dozens of Azure services for days.


Cloud providers have aggressively pursued region and zone expansions to help with disaster recovery and high-availability scenarios. But customers must still architect their systems to take advantage of the expanded footprint.

Still, customers have much less control when it comes to public cloud usage, according to Stephen Elliot, an analyst at IDC. That reality requires some operational sophistication.


“Networks are so interconnected and distributed, lots of partners are involved in making a service perform and available,” he said. “[Enterprises] need a risk mitigation strategy that covers people, process, technologies, SLAs, etc. It’s a myth that outages won’t happen. It could be from weather, a black swan event, security or a technology glitch.”


This fact underscores why more companies are experimenting with and deploying workloads across hybrid and multi-cloud infrastructures, said Jay Lyman, an analyst at 451 Research. “They either control the infrastructure and downtime with on-premises deployments or spread their bets across multiple public clouds,” he said.

Ultimately, enterprise IT shops can weigh the challenges and costs of running their own infrastructure against public cloud providers and find it difficult to match, said Holger Mueller, an analyst at Constellation Research.

“That said, performance and uptime are validated every day, and should a major and longer public cloud outage happen, it could give pause among less technical board members,” he added.


Google to unveil post-Chronicle cloud cybersecurity plans

Google is set to reveal how cloud cybersecurity technologies developed by Chronicle have been worked into its portfolio for large enterprise customers.

In June, Google Cloud announced it had acquired Chronicle, a startup launched within parent company Alphabet in 2015. Integration work has proceeded since then, and details will be shared at the Cloud Next ’19 UK conference, which begins in London on Nov. 20.

A recent report on Chronicle from Vice’s Motherboard publication painted a bleak picture of the company post-Google acquisition, with key executives including its founder and CEO departing, and dismal morale in the product-development trenches.

“People keep quitting. Sales doesn’t know what to do, since there’s no real product roadmap anymore. Engineering is depressed for the same reason,” an unnamed Chronicle employee told the site.

Asked for comment, a Google spokeswoman pointed to the company’s blog post on the upcoming announcements at Cloud Next UK, and did not address the claims of unrest at Chronicle.

Google plans to announce “multiple new native capabilities” for security, as well as planned new features for Backstory, Chronicle’s main cloud cybersecurity product, according to the blog.

Backstory can ingest massive amounts of security telemetry data and process it for insights. It is geared toward companies that have a wealth of this information but lack the staff or resources to analyze it in-house.

Customers upload their telemetry data to a private repository on Google Cloud infrastructure, where it is indexed and analyzed by Chronicle’s software engine. The engine compares the customer’s data against threat intelligence signals mined from many sources and looks for problematic correlations.
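
Chronicle has not published the engine’s internals, but the correlation idea can be illustrated with a purely hypothetical sketch: telemetry events are checked against a set of threat intelligence indicators, and any match is surfaced for analysts.

```python
# Purely hypothetical illustration of correlating telemetry against
# threat intelligence indicators; this is not Chronicle's engine or API,
# only the general idea of flagging events that match known-bad indicators.
threat_intel = {
    "198.51.100.7": "known command-and-control server",
    "evil.example.com": "reported phishing domain",
}

telemetry = [
    {"host": "web-01", "dest": "198.51.100.7", "ts": "2019-11-20T10:02:11Z"},
    {"host": "web-02", "dest": "203.0.113.9", "ts": "2019-11-20T10:02:14Z"},
]

def correlate(events, indicators):
    """Yield (event, reason) pairs for events whose destination matches an indicator."""
    for event in events:
        reason = indicators.get(event["dest"])
        if reason:
            yield event, reason

for event, reason in correlate(telemetry, threat_intel):
    print(f"{event['ts']} {event['host']} -> {event['dest']}: {reason}")
```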

Backstory will compete with both on-premises security information and event management platforms and cloud cybersecurity systems, such as Sumo Logic and Splunk. Rival cloud providers have responded as well, with one prominent case being Azure Sentinel, which Microsoft launched this year.

Beyond performance and results, pricing may be a key factor for Backstory. Chronicle has made much of the fact that it won’t be priced according to data volume, but the exact nature of the business model still isn’t clear. Microsoft uses a tiered, fixed-fee pricing scheme for Azure Sentinel based on daily data capacity.

Backstory’s biggest opportunity may be outside Google Cloud


While Chronicle’s staff would have enjoyed more freedom if kept independent from Google Cloud, there’s no evidence to suggest it’s being held back at this point, according to Jon Oltsik, senior principal analyst for cybersecurity at Enterprise Strategy Group.


“The Google Cloud management team needs to give Chronicle the latitude to innovate and compete against a strong and dynamic market,” he said. “This should be the model moving forward and I’ll be monitoring how it proceeds.”

There is an emerging market for security analytics and operations tools built specifically for monitoring the security of cloud-based workloads, which aligns well with Google Cloud, Oltsik added. But the bigger opportunity lies with customers who aren’t necessarily Google Cloud users, he said.


Microsoft and Salesforce expand strategic partnership to accelerate customer success

Salesforce names Microsoft Azure as its public cloud provider for Salesforce Marketing Cloud to help customers scale and grow; new integration between Salesforce Sales and Service Clouds with Microsoft Teams will boost productivity

REDMOND, Wash., and SAN FRANCISCO — Nov. 14, 2019 — Microsoft Corp. (Nasdaq: MSFT) and Salesforce (NYSE: CRM) on Thursday announced plans to expand their strategic partnership to help customers meet the evolving needs of their businesses and boost team productivity. Salesforce has named Microsoft Azure as its public cloud provider for Salesforce Marketing Cloud. Salesforce will also build a new integration that connects Salesforce’s Sales Cloud and Service Cloud with Microsoft Teams.

“At Salesforce, we’re relentlessly focused on driving trust and success for our customers,” said Marc Benioff and Keith Block, co-CEOs, Salesforce. “We’re excited to expand our partnership with Microsoft and bring together the leading CRM with Azure and Teams to deliver incredible customer experiences.”

“In a world where every company is becoming a digital company, we want to enable every customer and partner to build experiences on our leading platforms,” said Satya Nadella, CEO, Microsoft. “By bringing together the power of Azure and Microsoft Teams with Salesforce, our aim is to help businesses harness the power of the Microsoft Cloud to better serve customers.”

Comments on the news

“Marriott has more than 7,200 properties spanning 134 countries and territories, so driving efficiency and collaboration is critical,” said Brian King, global officer, Digital, Distribution, Revenue Strategy, and Global Sales, Marriott International. “The combination of Salesforce and Microsoft enables our teams to work better together to enhance the guest experience at every touchpoint.”

“With 400 brands and teams in 190 countries, we are always looking for ways to scale more efficiently and strengthen collaboration,” said Jane Moran, chief technology advisor, Unilever. “The powerful combination of Salesforce and Microsoft enables us to be more productive and connect with each other and our customers like never before.”

Salesforce names Microsoft Azure as its public cloud provider for Marketing Cloud

With Salesforce Marketing Cloud, marketers are empowered to know their customers, personalize marketing with Einstein, engage with them across any channel, and analyze the impact to improve campaign performance. Bringing its Marketing Cloud workload to Azure, Salesforce joins the over 95% of Fortune 500 companies benefitting from an Azure infrastructure offering the most global regions of any cloud provider.

Through this partnership, Salesforce will move its Marketing Cloud to Azure — unlocking new growth opportunities for customers. By moving to Azure, Salesforce will be able to optimize Marketing Cloud’s performance as customer demand scales. This will reduce customer onboarding times and enable customers to expand globally more quickly with Azure’s global footprint and help address local data security, privacy and compliance requirements.

Salesforce and Microsoft Teams integration will boost productivity

As teamwork becomes a driving force in the workplace, people want to bring workflows and frequently used apps into their collaboration workspace environments. Sales and customer service are highly collaborative, team-centric functions, and many companies actively use both Salesforce CRM and Microsoft Teams. As part of this agreement, Salesforce will build a new integration that gives sales and service users the ability to search, view and share Salesforce records directly within Teams. The new Teams integration for Salesforce Sales and Service Clouds will be made available in late 2020.

Building on a commitment to customer success

These new integrations will build on existing solutions that enable mutual customers to be more productive, including the hundreds of thousands of monthly active users using Salesforce’s Microsoft Outlook integration to create, communicate and collaborate.

About Salesforce

Salesforce is the global leader in Customer Relationship Management (CRM), bringing companies closer to their customers in the digital age. Founded in 1999, Salesforce enables companies of every size and industry to take advantage of powerful technologies—cloud, mobile, social, internet of things, artificial intelligence, voice and blockchain—to create a 360° view of their customers. For more information about Salesforce (NYSE: CRM), visit: www.salesforce.com.

About Microsoft

Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

For more information, press only:

Microsoft Media Relations, WE Communications for Microsoft, (425) 638-7777, [email protected]

Stephanie Barnes, Salesforce PR, (415) 722-0883, [email protected]

Note to editors: For more information, news and perspectives from Microsoft, please visit the Microsoft News Center at http://news.microsoft.com. Web links, telephone numbers and titles were correct at time of publication but may have changed. For additional assistance, journalists and analysts may contact Microsoft’s Rapid Response Team or other appropriate contacts listed at http://news.microsoft.com/microsoft-public-relations-contacts.
