Tag Archives: accounts

Druva expands on AWS backup, moves S3 snaps across regions

Customers with multiple Amazon accounts now have a way to manage backup policies for all of them.

Druva is adding a global policy management tool to its CloudRanger software, alongside other AWS backup features. Originally, CloudRanger only allowed backup policy setting within individual accounts. The update allows users to create backup policies first, then select or exclude the Amazon accounts to apply them to.

Druva’s vice president of product David Gildea said there has been an increase in the number of enterprises that hold multiple accounts. He said Druva designed the new CloudRanger feature around the idea that customers have thousands of accounts and multiple resources, and it gives the customer a “broad stroke” approach to backup policy management.

“[Amazon] S3 is one of the biggest and most important data sources in the world now,” Gildea said, highlighting the need to protect and manage the data within it.

S3 backup is one of Druva’s key new AWS backup features. Customers can back up S3 snapshots across regions, protecting them from a regional outage. In addition, users can move EBS snapshots to S3 storage, including Glacier and Glacier Deep Archive tiers for greater cost efficiency.
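Druva drives these operations through the AWS APIs. As a rough illustration of what a cross-region snapshot copy and an age-based tiering policy look like at the API level, here is a minimal boto3 sketch; the function names, regions and the 180-day cutoff are illustrative assumptions, not Druva's implementation:

```python
def archive_storage_class(age_days, deep_after_days=180):
    """Pick an S3 storage class for an archived snapshot by age.
    GLACIER and DEEP_ARCHIVE are real S3 storage class names; the
    180-day cutoff is an illustrative policy, not Druva's.
    """
    return "DEEP_ARCHIVE" if age_days >= deep_after_days else "GLACIER"

def copy_snapshot_cross_region(snapshot_id, source_region, dest_region):
    """Copy an EBS snapshot into another region for regional-outage protection."""
    import boto3  # deferred so the helper above stays usable without the SDK

    dest_ec2 = boto3.client("ec2", region_name=dest_region)
    resp = dest_ec2.copy_snapshot(
        SourceRegion=source_region,
        SourceSnapshotId=snapshot_id,
        Description=f"DR copy of {snapshot_id} from {source_region}",
    )
    return resp["SnapshotId"]
```

For example, `copy_snapshot_cross_region("snap-0123abcd", "us-east-1", "us-west-2")` would place a copy of the snapshot in us-west-2, out of reach of a us-east-1 outage.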

Druva CloudRanger is a management tool for AWS workloads and automated disaster recovery. The total list of Amazon workloads CloudRanger protects now includes EBS, S3, Redshift, RDS, EC2, DocumentDB and Neptune. Along with AWS backup, Druva also has products for on-premises data center, endpoint and SaaS application protection.

Druva is not alone in the AWS backup space. Clumio recently extended its backup as a service to support EBS, and Veeam recently launched a cloud-native EC2 protection product in AWS Marketplace.

Screenshot of Druva CloudRanger
Druva CloudRanger now lets customers apply backup policies by account.

Druva’s new AWS backup capabilities are available immediately to early access customers and are expected to become generally available in the first quarter of 2020.

Gildea said customers who have built apps on Amazon and use them at a large scale have a large amount of off-premises data that may not be under the protection of a business’s traditional backup. Druva’s AWS backup saves these customers the trouble of scripting and developing custom backup, which typically does not scale and needs to be continually maintained with every Amazon update.

There is a growing adoption of hybrid cloud infrastructure, said Steven Hill, a senior analyst at 451 Research. Backup vendors have products for protecting on-premises workloads, as well as offerings for the cloud. However, Hill said the challenge for vendors is eliminating the complexity that comes with managing separate environments, one of which is off premises.

Hill said as businesses push more critical workloads to the cloud, the cost of backup will be minor compared to the potential loss of data. He said some businesses have to learn this the hard way through a data loss incident before they buy in.

“Data protection is a bit like buying insurance — it’s optional,” Hill said.

Hill said over time, businesses will learn that cloud workloads need the same quality of backup and business continuity and disaster recovery (BC/DR) as on-premises environments. However, monitoring off-premises systems is an additional challenge. Therefore, he believes the future of BC/DR lies in automation and flexibility through policy-based management, regardless of whether an environment is on or off premises.


Oracle cloud ERP migration pays off for Caesars

SAN FRANCISCO — Ask Michael Mann, director of finance and accounts at Caesars Entertainment, about the distinction between traditional on-premises ERP and the cloud-based ERP systems that are becoming all the rage. He’ll tell you how it’s like the contrast between buying a house that’s move-in-ready and having to relocate a bathroom and remodel the kitchen before moving in.

Mann should know: He’s just overseen the first phase of a move to the Oracle cloud ERP suite, a complex undertaking for a company that operates 47 casinos and 34,000 hotel rooms at properties in five countries. The fact that Caesars didn’t have to do extensive customizations or rebuild its own processes was a prime attraction.

“We’re moving into the house, and we’re going to pick out the drapes and the carpet,” Mann said during a well-attended session at the Oracle OpenWorld 2017 conference. “They built the right house.”

Oracle cloud ERP enabled close partnership

Caesars’ motivations for finding a new house, to borrow Mann’s vernacular, were driven in large part by the fact that the company was relying on 30-year-old legacy green-screen technology to run its accounting and HR systems. Caesars’ leadership wanted to simplify and shrink that footprint in the process.

But moving to the cloud is about more than picking a vendor, Mann told OpenWorld attendees. It’s one thing to buy a product to run a business; it’s quite another to hand over your company’s data and many of its IT responsibilities. In doing so, the relationship becomes much more than a customer-vendor dynamic, as Caesars discovered.

“If you think about it, you’re really replacing your IT department and system with Oracle,” Mann said. “They’re not really a vendor — they’re a partner. It was so collaborative, and we worked so well together.”

The collaboration was critical because of the project’s inherent complexity. Numerous back-office systems had to be integrated into the Oracle cloud ERP environment. The new system had to replace a largely dispersed data-entry model that was challenged by issues of consistency and accuracy, and Oracle’s approach to data relationship management allowed more consistent, centralized data entry. Perhaps most importantly, Caesars wanted the system to serve up real-time reporting based on its improved transaction-processing capabilities.

We’re really going down this integrated ERP system path to make things work better, faster, cheaper and smarter.
Michael Mann, director of finance and accounts at Caesars Entertainment

The deployment took a year; Mann said it could have been completed in nine months had Caesars made a few different decisions, such as investing in a dedicated person to lead testing, an oversight that caused some delays. Even so, the company saw the system’s potential immediately. Mann said that during August, the first month the system was live, 1,100 users accessed Caesars’ Oracle cloud ERP, processing 2.5 million journal lines, 440,000 accounts payable lines and 28,000 payments without a glitch.

While the company hasn’t yet computed any hard benefits it’s realized from the Oracle cloud ERP system, Mann said he expects significant gains in efficiency across the board — from sales and service processes to hospitality and loyalty programs to the payment of invoices.

“We’re really going down this integrated ERP system path to make things work better, faster, cheaper and smarter,” he said.

Extensive change management, project organization were essential

It helps to go about implementation in a level-headed fashion, and according to Bill Behen, a principal at Chicago-based Oracle partner Grant Thornton, which served as the system integrator on the project, Caesars had all the right strategic pieces in place. That included a desire to take an incremental, phased approach to rolling out the various pieces of the software — which is officially called Oracle ERP Cloud — and buy-in from senior leadership.

“I’m sure everyone has heard that over and over again, that you need support from the top,” Behen told the OpenWorld audience. In the case of Caesars, “it wasn’t just talk. They had senior people very involved in the project.”

Behen advised Caesars to adopt the Oracle Unified Method for SaaS applications, which helped ensure that core team members were trained on the methodology, that project roles and responsibilities were clearly defined and that there were sufficient dedicated internal resources.

Doing so helped Caesars overcome numerous challenges, such as limited internal expertise with cloud applications, aversion to risk, a distributed user base that was reliant upon antiquated legacy technology, and high numbers of integrations and transactions.

Caesars also worked with Grant Thornton to set up a change network, which essentially functioned as a way to provide local change management and support to a highly distributed user base.

Even successful transitions to the cloud are filled with valuable lessons, and Caesars’ experience was no different. Mann said the things he realized during the implementation included the following:

  • the need to dedicate full-time resources for project roles;
  • the danger of underestimating the importance of validating data conversions;
  • the importance of engaging both internal and external audit teams to ensure expectations are met; and
  • the importance of engaging the network team to make sure the network is optimized to support cloud apps.

Ultimately, Mann had two recommendations for Oracle OpenWorld 2017 attendees: Make sure to discuss any network maintenance that might be planned during the implementation to reduce potential conflict, and don’t be tempted to work with offshore outsourcers, which can result in delays as long as a day or two just to get responses to basic questions via email.

That said, however a company goes about its move from a legacy ERP system to SaaS ERP, there’s one happy outcome that no mistake during deployment can prevent.

“You’re not going to find yourself 30 years from now working on a green screen,” Mann said.

Use Azure Storage Explorer to manage Azure storage accounts

You might have used third-party tools to manage Azure storage accounts — including managing storage blobs, queues


and table storages — and VM files in the past. But there’s another option: Microsoft developed an Azure storage management tool that can manage multiple Azure storage accounts, which helps increase productivity. Meet a few requirements to install the tool, and you can realize its other benefits, such as performing complete Azure storage operational tasks from your desktop in a few simple steps.

Azure Storage Explorer was released in June 2016. Although it is still in preview, many organizations use it to efficiently manage Azure storage accounts. Several versions have shipped; the latest reliable, production-ready release is 0.8.16.

Benefits of using Azure Storage Explorer

One of the main benefits of using Azure Storage Explorer is that you can perform Azure storage operations-related tasks — copy, delete, download, manage snapshots. You can also perform other storage-related tasks, such as copying blob containers, managing access policies configured for blob containers and setting public access levels, from a single GUI installed on your desktop machine.

Another benefit of using this tool is that if you have Azure storage accounts created in both Azure Classic and Azure Resource Manager modes, the tool allows you to manage Azure storage accounts for both modes.

You can also use Azure Storage Explorer to manage storage accounts from multiple Azure subscriptions. This helps you track storage sizes and accounts from a single UI rather than logging into the Azure portal to check the status of Azure storage for a different Azure subscription.
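The operations Storage Explorer performs from the GUI can also be scripted. A minimal sketch with the azure-storage-blob Python SDK follows; the connection string, container and blob names are placeholders:

```python
def account_blob_url(account, container, blob_name):
    """Build the public URL for a blob in the standard Azure endpoint format."""
    return f"https://{account}.blob.core.windows.net/{container}/{blob_name}"

def upload_blob(connection_string, container, blob_name, data):
    """Upload (or overwrite) a blob -- the same upload task the GUI offers."""
    from azure.storage.blob import BlobServiceClient  # deferred import

    service = BlobServiceClient.from_connection_string(connection_string)
    service.get_container_client(container).upload_blob(
        name=blob_name, data=data, overwrite=True
    )
```

Pointing `upload_blob` at connection strings from different subscriptions gives a scripted counterpart to the multi-subscription view described above.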

Azure Storage Emulator, which must be downloaded separately, allows you to test code and storage without an Azure storage account. Apart from managing storage accounts created on Azure, Azure Storage Explorer can also connect to storage accounts hosted on sovereign clouds and Azure Stack.

Requirements and installing Azure Storage Explorer

Azure Storage Explorer requires minimal resources on the desktop and can be installed on Windows client, Windows Server, Mac and Linux platforms. All you need to do is download the tool and install it; the process is quite simple. Just proceed through the onscreen steps. When you launch the tool for the first time, it asks you to connect to an Azure subscription, but you can cancel and add a subscription later if you first want to explore the tool's options. For example, you might want to modify the proxy settings before connecting to Azure subscriptions.

Configuring proxy settings

Azure Storage Explorer requires a working internet connection, and many production environments route internet access through a proxy server. If yours does, modify the proxy settings in Azure Storage Explorer by navigating to the Edit menu and clicking Configure Proxy, as shown in Figure A below:

Azure Storage Explorer proxy server settings
Figure A. Launching the proxy server settings page

When you click on Configure Proxy, the tool will show you the Proxy Settings page as shown in Figure B below. From there, you can enter the proxy settings and then click on OK to save the settings.

Proxy setting configuration
Figure B. Configuring proxy settings in Azure Storage Explorer

When you configure proxy settings in Azure Storage Explorer, the tool doesn't check whether the settings are correct; it just saves them. If you run into connection issues, make sure the proxy settings are correct and that you have a reliable internet connection.
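Because the tool saves proxy settings without validating them, a quick reachability check of the proxy endpoint can save debugging time. A small stdlib sketch, with an illustrative host and port:

```python
import socket

def proxy_url(host, port, user=None, password=None):
    """Assemble an HTTP proxy URL in the usual user:pass@host:port form."""
    auth = f"{user}:{password}@" if user else ""
    return f"http://{auth}{host}:{port}"

def proxy_reachable(host, port, timeout=3.0):
    """Basic TCP connect test; Storage Explorer itself performs no such check."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

Running `proxy_reachable("proxy.corp.example", 8080)` before entering those values in the tool confirms the endpoint accepts connections at all, which narrows down whether a later failure is the proxy or the settings.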

How to use Azure Storage Explorer

If you’ve worked with third-party Azure storage management tools, you’re already familiar with storage operational tasks, such as uploading VHDX files and working with blob containers, tables and queues. Azure Storage Explorer provides the same functionality, though the interface might differ from the third-party tools you’ve used so far.

The first step is to connect to an Azure account by clicking the Manage Accounts icon and then clicking Add an Account. Once connected, Azure Storage Explorer retrieves all the subscriptions associated with the Azure account. To work with storage accounts in a particular Azure subscription, select the subscription and click Apply. Azure Storage Explorer then retrieves all of the storage accounts hosted in that subscription. Once the storage accounts have been retrieved, you can work with blob containers, file shares, queues and tables from the left navigation pane, as shown in Figure C below:

Storage accounts in Azure Storage Explorer
Figure C. Working with storage accounts in Azure Storage Explorer

If you have several Azure storage accounts, you can find a particular one by typing in the search box at the top of the left pane, as shown in Figure C above. Azure Storage Explorer also makes blob container management easy. You can perform most blob container-related tasks, including creating a blob, setting up public access for a blob and managing access policies configured for blob containers. By default, a blob container has public access disabled. To enable public access, right-click the blob container in the left navigation pane and then click Set Public Access Level… to display the Set Container Public Access Level page shown in Figure D below.

Blob container public access level
Figure D. Setting public access level for a blob container
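The same action the Set Public Access Level… dialog performs can be scripted with the azure-storage-blob SDK. A hedged sketch, where the connection string and container name are placeholders:

```python
# Valid container access levels; None means private (the default).
VALID_LEVELS = {"off": None, "blob": "blob", "container": "container"}

def normalize_access_level(level):
    """Map a user-supplied level string to the value the SDK expects."""
    key = level.strip().lower()
    if key not in VALID_LEVELS:
        raise ValueError(f"unknown access level: {level!r}")
    return VALID_LEVELS[key]

def set_container_public_access(connection_string, container, level):
    """Scripted equivalent of the GUI's Set Public Access Level... dialog."""
    from azure.storage.blob import BlobServiceClient  # deferred import

    service = BlobServiceClient.from_connection_string(connection_string)
    service.get_container_client(container).set_container_access_policy(
        signed_identifiers={},  # keep no stored access policies
        public_access=normalize_access_level(level),
    )
```

Choosing "blob" exposes only blob data anonymously, while "container" also allows anonymous listing of the container's contents, so "blob" is the more conservative choice when public access is needed at all.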

Next Steps

Learn more about different Azure storage types

Navigate expanded Microsoft Azure features

Enhance cloud security with Azure Security Center