Salesforce users will be able to keep working even when Salesforce goes down, thanks to a new offering from Odaseva.
Odaseva ultra high availability (UHA) works much like high availability (HA) in a non-SaaS environment. If there's a Salesforce outage, such as planned maintenance or an unexpected failure, a customer's Salesforce account fails over to an emulated Salesforce account in Odaseva. Users can continue to view, edit and update the emulated records as normal. When Salesforce is back up, Odaseva re-synchronizes the two environments, performing what is essentially a failback.
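The failover/failback pattern described here can be sketched in a few lines. This is an illustrative toy, not Odaseva code: a primary store stands in for Salesforce, a replica stands in for the emulated environment, and only records changed during the outage are re-synchronized on failback.

```python
# Toy failover/failback sketch (illustrative only, not Odaseva code).
class FailoverStore:
    def __init__(self):
        self.primary = {}      # stands in for Salesforce records
        self.replica = {}      # stands in for the emulated environment
        self.failed_over = False
        self._dirty = set()    # record IDs changed while failed over

    def failover(self):
        self.replica = dict(self.primary)  # serve from the emulated copy
        self.failed_over = True

    def write(self, record_id, value):
        if self.failed_over:
            self.replica[record_id] = value
            self._dirty.add(record_id)
        else:
            self.primary[record_id] = value

    def read(self, record_id):
        store = self.replica if self.failed_over else self.primary
        return store[record_id]

    def failback(self):
        for record_id in self._dirty:      # re-sync only what changed
            self.primary[record_id] = self.replica[record_id]
        self._dirty.clear()
        self.failed_over = False

store = FailoverStore()
store.write("acct-1", "open")
store.failover()                 # Salesforce "goes down"
store.write("acct-1", "closed")  # users keep working on the replica
store.failback()                 # outage ends; changes flow back
print(store.read("acct-1"))      # -> closed
```

In practice the re-synchronization step also has to detect and resolve edits that conflict with changes made on the primary side, which this sketch ignores.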
Odaseva UHA is in early access and will be released as an add-on to the Odaseva platform in early 2020. Pricing is not yet available.
Salesforce has become so mission-critical to some organizations that they can’t afford any downtime. Odaseva CEO Sovan Bin said Odaseva UHA isn’t strictly necessary for smaller businesses that can shrug off a small Salesforce outage, but there are places such as call centers that need Salesforce access 100% of the time. These organizations stand to lose hundreds of thousands of dollars for every hour they can’t conduct business, while suffering from lost opportunities and damage to their brand.
“The real damage is because you’ve stopped doing business,” Bin said.
Odaseva provides backup and data governance for Salesforce data. Developed by Salesforce certified technical architects — the highest Salesforce expertise credential — Odaseva Data Governance Cloud offers archiving and automated data compliance on top of data protection features. Odaseva claims its compliance and data governance tools differentiate it from Salesforce backup competitors such as OwnBackup and Spanning.
Data protection and backup address only the integrity of data; HA addresses its availability and accessibility. Christophe Bertrand, senior analyst at IT analyst firm Enterprise Strategy Group (ESG), said HA is lacking for SaaS application data, and that he didn't know of any other vendor with a similar product or feature.
“Not only is it unique, other vendors aren’t even exploring HA for Salesforce,” Bertrand said.
Bertrand added that other SaaS applications such as Office 365, Box and ServiceNow have the same availability gap, even as they become mission-critical to businesses. When these services go down, companies may have to stop working. Bertrand estimated the cost of downtime at more than $300,000 per hour for most enterprises. Although many vendors provide backup, none has yet provided a failover/failback offering.
“Ninety-nine-point-whatever percent uptime is not enough. That’s still 15 hours of downtime per year,” Bertrand said.
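Bertrand's figures are easy to sanity-check. A short Python calculation (illustrative only) converts an uptime percentage into hours of downtime per year; roughly 99.83% uptime corresponds to the 15 hours he cites, which at his estimated $300,000 per hour works out to several million dollars a year.

```python
# Back-of-the-envelope downtime math behind Bertrand's point.
HOURS_PER_YEAR = 365 * 24  # 8,760 hours

def downtime_hours_per_year(uptime_pct: float) -> float:
    """Hours of downtime per year implied by an uptime percentage."""
    return HOURS_PER_YEAR * (1 - uptime_pct / 100)

for pct in (99.0, 99.8, 99.9, 99.99):
    print(f"{pct}% uptime -> {downtime_hours_per_year(pct):.1f} hours/year down")
```

Even "three nines" (99.9%) still leaves almost nine hours of downtime per year.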
Odaseva also introduced new capabilities to its platform this week. It is now integrated with Salesforce Marketing Cloud, which allows users to back up emails, leads, contact information and marketing campaign files stored in Marketing Cloud. Before this integration, customers had to develop a backup mechanism for Marketing Cloud themselves, including complex processes for extracting and replicating the data.
Odaseva also extended its compliance automation applications beyond GDPR. Odaseva's data privacy applications automatically perform anonymization, right-of-access, right-of-erasure and other privacy tasks to keep organizations compliant with GDPR. Automated compliance now also covers CCPA, HIPAA and a number of privacy regulations outside the U.S., such as PIPA (Japan), PIPEDA (Canada) and POPIA (South Africa).
The Salesforce Marketing Cloud integration and compliance automation extensions are available immediately.
Bin said Odaseva will focus on DevOps next. Salesforce Full Sandbox environments can be natively refreshed every 29 days. To help customers accelerate development, Bin said Odaseva will come up with a way to work around that limit and enable more frequent refreshes in a future release.
The robust market of tools to help users of the Redis database manage their systems just got a new entrant.
Redis Labs announced the availability of its RedisInsight tool, a graphical user interface (GUI) for database management and operations.
Redis is a popular open source NoSQL database that is also increasingly used in cloud-native Kubernetes deployments as users move workloads to the cloud. Open source database use is growing quickly, according to recent reports, as flexible, open systems that can meet different needs have become a common requirement.
Among the challenges often associated with databases of any type is ease of management, which Redis is trying to address with RedisInsight.
“Database management will never go out of fashion,” said James Governor, analyst and co-founder at RedMonk. “Anyone running a Redis cluster is going to appreciate better memory and cluster management tools.”
Governor noted that Redis Labs is following a tested approach by building out more management tools for users. Enterprises are willing to pay for better manageability, he said, and RedisInsight aims to deliver it.
RedisInsight based on RDBTools
The RedisInsight tool, introduced Nov. 12, is based on the RDBTools technology that Redis Labs acquired in April 2019. RDBTools is an open source GUI for users to interact with and explore data stores in a Redis database.
Over the last seven months, Redis Labs added more capabilities to the RDBTools GUI, expanding the product's coverage for different applications, said Alvin Richards, chief product officer at Redis Labs.
One of the core pieces of extensibility in Redis is the ability to introduce modules that contain new data structures or processing frameworks. So for example, a module could include time series, or graph data structures, Richards explained.
“What we have added to RedisInsight is the ability to visualize the data for those different data structures from the different modules,” he said. “So if you want to visualize the connections in your graph data for example, you can see that directly within the tool.”
RDBTools is just one of many different third-party tools that exist for providing some form of management and data insight for Redis. There are some 30 other third-party GUI tools in the Redis ecosystem, though lack of maturity is a challenge.
“They tend to sort of come up quickly and get developed once and then are never maintained,” Richards said. “So, the key thing we wanted to do is ensure that not only is it current with the latest features, but we have the apparatus behind it to carry on maintaining it.”
How RedisInsight works
For users, getting started with the new tool is relatively straightforward. RedisInsight is a piece of software that needs to be downloaded and then connected to an existing Redis database. The tool ingests all the appropriate metadata and delivers the visual interface to users.
RedisInsight is available for Windows, macOS and Linux, and also as a Docker container. Redis Labs doesn't have a RedisInsight-as-a-service offering yet.
“We have considered having RedisInsight as a service and it’s something we’re still working on in the background, as we do see demand from our customers,” Richards said. “The challenge is always going to be making sure we have the ability to ensure that there is the right segmentation, security and authorization in place to put guarantees around the usage of data.”
Datrium plans to open its new cloud disaster recovery as a service to any VMware vSphere users in 2020, even if they’re not customers of Datrium’s DVX infrastructure software.
Datrium released disaster recovery as a service with VMware Cloud on AWS in September for DVX customers as an alternative to potentially costly professional services or a secondary physical site. DRaaS enables DVX users to spin up protected virtual machines (VMs) on demand in VMware Cloud on AWS in the event of a disaster. Datrium takes care of all of the ordering, billing and support for the cloud DR.
In the first quarter, Datrium plans to add a new Datrium DRaaS Connect for VMware users who deploy vSphere infrastructure on premises and do not use Datrium storage. Datrium DRaaS Connect software would deduplicate, compress and encrypt vSphere snapshots and replicate them to Amazon S3 object storage for cloud DR. Users could set backup policies and categorize VMs into protection groups, setting different service-level agreements for each one, Datrium CTO Sazzala Reddy said.
A second Datrium DRaaS Connect offering will enable VMware Cloud users to automatically fail over workloads from one AWS Availability Zone (AZ) to another if an Amazon AZ goes down. Datrium stores deduplicated vSphere snapshots on Amazon S3, and the snapshots are replicated to three AZs by default, Datrium chief product officer Brian Biles said.
Speedy cloud DR
Datrium claims system recovery can happen on VMware Cloud within minutes from the snapshots stored in Amazon S3, because it requires no conversion from a different virtual machine or cloud format. Unlike some backup products, Datrium does not convert VMs from VMware’s format to Amazon’s format and can boot VMs directly from the Amazon data store.
“The challenge with a backup-only product is that it takes days if you want to rehydrate the data and copy the data into a primary storage system,” Reddy said.
Although the “instant RTO” that Datrium claims to provide may not be important to all VMware users, reducing recovery time is generally a high priority, especially to combat ransomware attacks. Datrium commissioned a third party to conduct a survey of 395 IT professionals, and about half said they experienced a DR event in the last 24 months. Ransomware was the leading cause, hitting 36% of those who reported a DR event, followed by power outages (26%).
The Orange County Transportation Authority (OCTA) information systems department spent a weekend recovering from a zero-day malware exploit that hit nearly three years ago on a Thursday afternoon. The malware came in through a contractor’s VPN connection and took out more than 85 servers, according to Michael Beerer, a senior section manager for online system and network administration of OCTA’s information systems department.
Beerer said the information systems team restored critical applications by Friday evening and the rest by Sunday afternoon. But OCTA now wants to recover more quickly if a disaster should happen again, he said.
OCTA is now building out a new data center with Datrium DVX storage for its VMware VMs and possibly Red Hat KVM in the future. Beerer said DVX provides an edge in performance and cost over alternatives he considered. Because DVX disaggregates storage and compute nodes, OCTA can increase storage capacity without having to also add compute resources, he said.
Datrium cloud DR advantages
Beerer said the addition of Datrium DRaaS would make sense because OCTA can manage it from the same DVX interface. Datrium’s deduplication, compression and transmission of only changed data blocks would also eliminate the need for a pricy “big, fat pipe” and reduce cloud storage requirements and costs over other options, he said. Plus, Datrium facilitates application consistency by grouping applications into one service and taking backups at similar times before moving data to the cloud, Beerer said.
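The "transmit only changed blocks" idea Beerer refers to can be illustrated with a toy Python sketch. This is not Datrium's implementation: it simply splits data into fixed-size blocks, hashes each block, and compresses and sends only blocks the remote side hasn't already seen.

```python
# Toy changed-block replication: hash fixed-size blocks and send only
# blocks the remote side lacks. Illustrative only; real systems add
# encryption, variable-size chunking and far larger block sizes.
import hashlib
import zlib

BLOCK_SIZE = 4  # tiny for demonstration; real systems use kilobytes

def blocks_to_send(data: bytes, remote_hashes: set) -> list:
    """Return (offset, compressed_block) pairs the remote side lacks."""
    out = []
    for off in range(0, len(data), BLOCK_SIZE):
        block = data[off:off + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in remote_hashes:
            remote_hashes.add(digest)  # remote now has this block
            out.append((off, zlib.compress(block)))
    return out

remote = set()
first = blocks_to_send(b"AAAABBBBCCCC", remote)   # initial full send
second = blocks_to_send(b"AAAAXXXXCCCC", remote)  # one block changed
print(len(first), len(second))  # -> 3 1
```

Because unchanged blocks are never retransmitted, the second snapshot costs one block of bandwidth instead of three, which is the effect that eliminates the need for a "big, fat pipe."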
Datrium’s “Instant RTO” is not critical for OCTA. Beerer said anything that can speed the recovery process is interesting, but users also need to weigh that benefit against any potential additional costs for storage and bandwidth.
“There are customers where a second or two of downtime can mean thousands of dollars. We’re not in that situation. We’re not a financial company,” Beerer said. He noted that OCTA would need to get critical servers up and running in less than 24 hours.
Reddy said Datrium offers two cost models: a low-cost option with a 60-minute window and a “slightly more expensive” option in which at least a few VMware servers are always on standby.
Pricing for Datrium DRaaS starts at $23,000 per year, with support for 100 hours of VMware Cloud on-demand hosts for testing, 5 TB of S3 capacity for deduplicated and encrypted snapshots, and up to 1 TB per year of cloud egress. Pricing was unavailable for the upcoming DRaaS Connect options.
Other cloud DR options
Jeff Kato, a senior storage analyst at Taneja Group, said the new Datrium options would open up to all VMware customers a low-cost DRaaS offering that requires no capital expense. He said most vendors that offer DR from their on-premises systems to the cloud force customers to buy their primary storage.
George Crump, president and founder of Storage Switzerland, said data protection vendors such as Commvault, Druva, Veeam, Veritas and Zerto also can do some form of recovery in the cloud, but it’s “not as seamless as you might want it to be.”
“Datrium has gone so far as to converge primary storage with data protection and backup software,” Crump said. “They have a very good automation engine that allows customers to essentially draw their disaster recovery plan. They use VMware Cloud on Amazon, so the customer doesn’t have to go through any conversion process. And they’ve solved the riddle of: ‘How do you store data in S3 but recover on high-performance storage?’ “
Scott Sinclair, a senior analyst at Enterprise Strategy Group, said using cloud resources for backup and DR often means either expensive, high-performance storage or lower cost S3 storage that requires a time-consuming migration to get data out of it.
“The Datrium architecture is really interesting because of how they’re able to essentially still let you use the lower cost tier but make the storage seem very high performance once you start populating it,” Sinclair said.
Office 365 admins must sacrifice some degree of control as Microsoft allows end users to purchase certain capabilities themselves for Power Platform products.
Microsoft Power Platform includes Power BI, PowerApps and Microsoft Flow, which have business intelligence, low-code development and workflow capabilities, respectively. These applications are included in most Office 365 enterprise subscriptions. Previously, only administrators could purchase licensing for an organization.
On Oct. 23, Microsoft announced that it would roll out self-service purchasing to U.S. cloud customers starting Nov. 19.
Widespread adoption of the SaaS model has already caused significant communication gaps between IT and end users, said Reda Chouffani, vice president of development at Biz Technology Solutions, a consulting firm in Mooresville, N.C.
“Now introducing this and knowing that Microsoft has over 140 million business subscribers that are empowered to make purchasing decisions on certain apps within the suite … that will make it where more of these [communication issues] will occur, and IT is not going to take it lightly,” he said.
Users with non-guest user accounts in a managed Azure Active Directory tenant will be able to make purchases directly with a credit card, according to a recent Microsoft FAQ. IT administrators can turn off the self-service purchasing policy through PowerShell, however, according to an update this week from Microsoft. Microsoft also extended the rollout date to Jan. 14, 2020, to give admins more time to prepare for the change.
The decision to allow IT to disable the capability likely came about from customer pushback about security concerns, said Willem Bagchus, messaging and collaboration specialist at United Bank, based in Parkersburg, W.Va.
IT admins may still be deterred by the self-service purchasing capability, because some may not be aware they can turn it off via PowerShell, Bagchus said.
“For a small-business IT admin who does everything by themselves or depends on the web only for [PowerShell] functions, it’ll be a bit of a challenge,” he added.
Security, licensing and support concerns
Security remains a top concern for many Office 365 customers, said Doug Hemminger, director of Microsoft services at SPR, a technology consulting firm in Chicago. Midsize and large businesses will be scrambling to turn the self-service purchasing capability off, he said.
“A lot of companies are worried about the data access issues that those users may inadvertently expose their company to,” Hemminger said. “Monitoring is a key part of implementing a certain environment and making sure that governance is in place, so many companies that I work with don’t want to give their employees the ability to go out and buy their own licenses.”
Office 365 admins can apply data management and access policies to Microsoft self-service purchases, which may alleviate some security concerns. End users do not need administrator approval before purchasing an application with a credit card, however.
“Most users will not think twice before purchasing something if it’s going to help them, which means that security may not necessarily be top of mind,” Chouffani said. “That can make it very difficult, because now everybody can pick their product of choice without truly doing some sort of due diligence and evaluation.”
“Microsoft has proved to me that they’re very serious about security,” Bagchus said. “Anything that may happen from a security perspective, [Microsoft] will be on top of it right away.”
Self-service purchasers can access a limited view of the Microsoft 365 admin center and assign licenses to other end users, according to the Microsoft FAQ.
“Licensing is the least of our worries,” said Daniel Beato, director of technology at TNTMAX, an IT consultancy based in Wyckoff, N.J. “The user can do their own licensing; they will pay with their own credit card or even the company credit card.”
Employees will likely be held responsible for company purchases, however, when an organization reviews its finances, Beato said.
It is also unclear who is expected to provide end-user support when an application fails, Chouffani said.
Microsoft will provide standard support for self-service purchasers, according to the company.
A ‘smart decision for Microsoft’
Microsoft’s self-service policy is a smart one for the company, said Mark Bowker, a senior analyst at Enterprise Strategy Group in Milford, Mass.
“In the world we live in today, employees need access to applications to get their jobs done,” he said. “Today’s application environment is very, very dynamic.”
Unlike other Office 365 products, such as Word and Excel, Power Platform applications aren’t widely used, Bowker said. Instead, they are used mainly by niche employees such as corporate developers and data analytics professionals.
“I think overall this will be a good thing,” Bagchus said. “More users and more installations will improve a product.”
Communication is key
No matter their personal feelings on the Microsoft self-service policy, Office 365 admins should be prepared for the changes and adjust accordingly.
Admins should have a good relationship with their organization’s Microsoft sales representative and keep in regular contact with a point person for updates, Bagchus said.
“That way you won’t get blindsided,” he said. “You can evolve with it.”
IT should also collaborate with end users to understand the needs of the business and to be a part of the solution, Chouffani said.
Among the key points in the study is that most organizations have not included data and analytics as part of documented corporate strategies.
“The primary challenge is that data and data insights are not a central part of business strategy,” Rollings said.
Often, data and data analytics are actually just byproducts of other activities, rather than being the core focus of a formal data-driven architecture, he said. In Rollings’ view, data and analytics should be considered assets that can be measured, managed and monetized.
“When we talk about measuring and monetizing, we’re really saying, do you have an intentional process to even understand what you have,” he said. “And do you have an intentional process to start to evaluate the opportunities that may exist with data, or with analysis that could fundamentally change the business model, customer experience and the way decisions are made.”
Data transformation challenges
The struggle to make the data useful is a key challenge, said Hoshang Chenoy, senior manager of marketing analytics at San Francisco-based LiveRamp, an identity resolution software vendor.
Among other data transformation challenges is that many organizations still have siloed deployments, where data is collected and remains in isolated segments.
“In addition to having siloed data within an organization, I think the biggest challenge for enterprises to make their data ready for analytics are the attempts at pulling in data that has previously never been accessed, whether it’s because the data exists in too many different formats or for privacy and security reasons,” Chenoy said. “It can be a daunting task to start on a data management project but with the right tech, team and tools in place, enterprises should get started sooner rather than later.”
How to address the challenges
With data warehouse and data lake technologies, the early promise was making it easier to use data.
But despite technology advances, there’s still a long way to go to solving data transformation challenges, said Ed Thompson, CTO of Matillion, a London-based data integration vendor that recently commissioned a survey on data integration problems.
The survey of 200 IT professionals found that 90% of organizations see making data available for insights as a barrier. The study also found a rapid rate of data growth of up to 100% a month at some organizations.
When an executive team starts to get good quality data, what typically comes back is a lot of questions that require more data. The continuous need to ask and answer questions is the cycle that is driving data demand.
“The more data that organizations have, the more insight that they can gain from it, the more they want, and the more they need,” Thompson said.
Microsoft Azure users will get a hosted version of the HashiCorp Consul service mesh as multi-platform interoperability becomes a key feature for IT shops and cloud providers alike.
Service mesh is an architecture for microservices networking that uses a sidecar proxy to orchestrate and secure network connections among complex ephemeral services. HashiCorp Consul is one among several service mesh control planes available, but its claim to fame for now is that it can connect multiple VM-based or container-based applications in any public cloud region or on-premises deployment, through the Consul Connect gateway released last year.
HashiCorp Consul Service on Azure (HCS), released to private beta this week, automatically provisions clusters that run Consul service discovery and service mesh software within Azure. HashiCorp site reliability engineers also manage the service behind the scenes, but it’s billed through Azure and provisioned via the Azure console and the Azure Managed Applications service catalog.
The two companies unveiled this expansion to their existing partnership this week at HashiConf in Seattle, and touted their work together on service mesh interoperability, which also includes the Service Mesh Interface (SMI), released in May. SMI defines a set of common APIs that connect multiple service mesh control planes such as Consul, Istio and Linkerd.
Industry watchers expect such interconnection — and coopetition — to be a priority for service mesh projects, at least in the near future, as enterprises struggle to make sense of mixed infrastructures that include legacy applications on bare metal along with cloud-native microservices in containers.
“The only way out is to get these different mesh software stacks to interoperate,” said John Mitchell, formerly chief platform architect at SAP Ariba, a HashiCorp Enterprise shop, and now an independent digital transformation consultant who contracts with HashiCorp, among others. “They’re all realizing they can’t try to be the big dog all by themselves, because it’s a networking problem. Standardization of that interconnect, that basic interoperability, is the only way forward — or they all fail.”
Microsoft and HashiCorp talked up multi-cloud management as a job for service mesh, but real-world multi-cloud deployments are still a bleeding-edge scenario at best among enterprises. However, the same interoperability problem faces any enterprise with multiple Kubernetes clusters, or assets deployed both on premises and in the public cloud, Mitchell said.
“Nobody who’s serious about containers in production has just one Kubernetes cluster,” he said. “Directionally, multiplatform interoperability is where everybody has to go, whether they realize it yet or not.”
The tangled web of service mesh interop
For now, Consul has a slight edge over Google and IBM's open source Istio service mesh control plane, in the maturity of its Consul Connect inter-cluster gateway and its ability to orchestrate VMs and bare metal in addition to Kubernetes-orchestrated containers. Clearly, it's pushing this edge with HashiCorp Consul Service on Azure, but it won't be long before Istio catches up. The Istio Gateway and Istio Multicluster projects both emerged this year, and the ability to integrate virtual machines is also in development. Linkerd, meanwhile, has arguably the best production-use bona fides in VM-based service mesh orchestration. Consul and Istio both use the same Envoy data plane, which will make differentiation between them even more difficult in the long term.
“Service mesh will become like electricity, just something you expect,” said Tom Petrocelli, an analyst at Amalgam Insights in Arlington, Mass. “The vast majority of people will go with what’s in their preferred cloud platform.”
HCS could boost Consul's profile, given Microsoft's strength as a cloud player, but its success will depend more on how the two companies market it than on its technical specifications, Petrocelli said. At this stage, Consul doesn't interoperate with Azure Service Fabric, Microsoft's original hosted service mesh, which Petrocelli views as important for widespread adoption.
“I’m not really going to get excited about something in Azure that doesn’t take advantage of Azure’s own fabric,” he said.
Without Service Fabric integration to widen Consul's appeal to Azure users, it's unlikely the market for HCS will pull in many new customers, Petrocelli said. Also, whether Microsoft positions HCS as its service mesh of choice for Azure, or makes it one among many hosted service mesh offerings, will determine how widely used it becomes, in his estimation.
“If [HCS] is one of many [service mesh offerings on Azure], it’s nice if you happen to be a HashiCorp customer that also uses Azure,” Petrocelli said.
An integration unveiled this week will make it easier for Slack users to launch and join Fuze meetings. Zoom Video Communications Inc. rolled out a similar integration with Slack over the summer.
Slack is increasingly making it clear that it intends to incorporate voice and video capabilities into its team messaging app through integrations and partnerships, rather than by attempting to build the technology on its own.
Fuze Inc.’s announcement also underscores how big a player Slack has become in the business collaboration industry. Fuze, a cloud unified communications (UC) provider, opted to partner with Slack, even though it sells a team messaging app with the same core capabilities.
The integration lets users launch Fuze meetings by clicking on the phone icon in Slack, instead of typing a command. They will also see details about an ongoing meeting, such as how long it’s been going on and who’s participating.
Furthermore, Slack’s Microsoft Outlook and Google Calendar apps will let users join scheduled Fuze meetings with one click. Slack previously announced support for that capability with Zoom, Cisco Webex and Skype for Business.
No formal partnership
Slack gave Fuze special access to the set of APIs that made the latest integrations possible, said Eric Hanson, Fuze’s vice president of marketing intelligence. But the companies later clarified there was no formal partnership between them.
The vendors apparently miscommunicated about how to frame this week’s announcement. Within hours on Tuesday, Fuze updated a blog post to remove references to a “partnership” with Slack, instead labeling it as an “integration.”
In contrast, Slack and Zoom signed a contract to align product roadmaps and marketing strategies earlier this year.
In the future, Fuze hopes to give users the ability to initiate phone calls through Slack. Previously, Slack said it would enable such a feature with Zoom Phone, the video conferencing provider’s new cloud calling service.
Slack declined to comment on any plans to expand the Fuze integration.
“There are still some things that Slack hasn’t made available through this set of APIs yet,” Hanson said. “They have a roadmap in terms of where they want to take this.”
Making it easier for users to pick and choose
The voice and video capabilities natively supported in Slack are far less advanced than those available from main rival Microsoft Teams, an all-in-one suite for calling, messaging and meetings. But users want to be able to easily switch between messaging with someone and talking to them in real time.
By integrating with cloud communications vendors like Fuze and Zoom, Slack can focus on what it does best — team-based collaboration — while still connecting to the real-time communications services that customers need, said Mike Fasciani, analyst at Gartner.
“One of Slack’s advantages over Microsoft Teams is its ability and willingness to integrate with many business and communications applications,” Fasciani said.
Fuze also competes with Microsoft Teams. Integrations with Slack should help cloud UC providers sell to Slack's rapidly expanding customer base. Slack now has more than 100,000 paid customers, including 720 enterprises that each contribute more than $100,000 per year in revenue.
“Even though Fuze has its own [messaging] app, it doesn’t have anywhere near the market share of Slack,” said Irwin Lazar, analyst at Nemertes Research. “I think this shows Slack’s continued view that they don’t want to compete directly with the voice/meeting vendors.”
Oracle’s strategy going into 2020 is to support users wherever they are, while not-so-subtly urging them to move onto Oracle cloud services – particularly databases.
In fact, some say it's Oracle's legacy as a database vendor that may be the key to the company's long-term success as a major cloud player.
To reconcile the Oracle cloud persona of today with the database-giant identity the company still holds, it helps to look back at key milestones in Oracle's history over the past 20 years, beginning with the database releases at the turn of the century.
Oracle releases Database 8i, 9i
Two major versions of Oracle’s database arrived in 1998 and 2001. Oracle Database 8i was the first written with a heavy emphasis on web applications — the “i” stood for Internet.
Then Oracle 9i introduced the Real Application Clusters (RAC) feature for high-availability scenarios. RAC remains a popular and lucrative database option, and one Oracle has held very close: to date, RAC is supported and certified for use only on Oracle's own cloud service.
With the 9i update, Oracle made a concerted effort to improve the database’s administrative tooling, said Curt Monash, founder of Monash Research in Acton, Mass.
“This was largely in reaction to growing competition from Microsoft, which used its consumer software UI expertise to have true ease-of-administration advantages versus Oracle,” Monash said. “Oracle narrowed the gap impressively quickly.”
Oracle acquires PeopleSoft and Siebel
Silicon Valley is littered with the bones of once-prominent application software vendors that either shut down or got swallowed up by larger competitors. To that end, Oracle’s acquisitions of PeopleSoft and Siebel still resonate today.
The company launched what many considered a hostile takeover of PeopleSoft, then the second-largest enterprise applications vendor after SAP, in 2003, and ultimately succeeded with a $10.3 billion bid the following year. Soon after the deal closed, Oracle laid off more than half of PeopleSoft’s employees in a widely decried move.
Oracle also gained J.D. Edwards, known for its manufacturing ERP software, through the PeopleSoft purchase.
The PeopleSoft deal, along with Oracle’s $5.8 billion acquisition of Siebel in 2005, reinvented the company as a big player in enterprise applications and set up the path toward Fusion.
Oracle realized that to catch up to SAP in applications, it needed acquisitions, said Holger Mueller, an analyst with Constellation Research in Cupertino, Calif., who worked in business and product development roles at Oracle during much of the 2000s.
“To cement ownership within complex CRM, they needed Siebel,” Mueller said. Those Siebel customers largely remain in the fold today, he added. While rival HCM software vendor Workday has managed to poach some of Oracle’s PeopleSoft customers, Salesforce hasn’t had the same luck converting Siebel users over to its CRM, according to Mueller.
Oracle’s application deals were as much or more about acquiring customers as they were about technology, said Frank Scavo, president of IT consulting firm Strativa in Irvine, Calif.
“Oracle had a knack for buying vendors when they were at or just past their peak,” he said. “PeopleSoft was an example of that.”
The PeopleSoft and Siebel deals also gave Oracle the foundation, along with its homegrown E-Business Suite, for a new generation of applications in the cloud era.
Oracle’s Fusion Applications saga
Oracle first invoked the word “Fusion” in 2005, under the promise it would deliver an integrated applications suite that comprised a superset of functionality from its E-Business Suite, PeopleSoft and Siebel software, with both cloud and on-premises deployment options.
The company also pledged that Fusion apps would deliver a consumer-grade user experience and business intelligence embedded throughout processes.
Fusion Applications were supposed to become generally available in 2008, but Oracle didn’t deliver general availability to all customers until 2011.
It’s been suggested that Oracle wanted to take its time and had the luxury of doing so, since its installed base was still weathering a recession and had little appetite for a major application migration, no matter how useful the new software was.
Fusion Applications’ sheer scope was another factor. “It takes a long time to build software from scratch, especially if you have to replace things that were strong category leaders,” Mueller said.
Oracle’s main shortcoming with Fusion Applications was its inability to sell very much of them early on, Mueller added.
Oracle acquires Hyperion and BEA
After its applications shopping spree, Oracle eyed other areas of software. First, it bought enterprise performance management vendor Hyperion in 2007 for $3.3 billion to bolster its financials and BI business.
“Hyperion was a smart acquisition to get customers,” Mueller said. “It helped Oracle sell financials. But it didn’t help them in the move to cloud.”
In contrast, BEA and its well-respected WebLogic application server did. The $8.5 billion deal, completed in 2008, also gave Oracle access to a large customer base and many developers, Mueller added.
BEA’s products also gave a boost to Oracle’s existing Fusion Middleware portfolio, said John Rymer, an analyst at Forrester. “At the time, Oracle’s big competitor in middleware was IBM,” he said. “[Oracle] didn’t have credibility.”
Oracle launches Exadata
Exadata packs servers, networking and storage, along with Oracle database and other software, into preconfigured racks. Oracle also created storage processing software for the machines, which its marketing arm initially dubbed “engineered systems.”
With the move, Oracle sought to gain a stronger foothold in the data warehousing market against the likes of Teradata and Netezza, which was subsequently acquired by IBM.
Exadata was a huge move for Oracle, Monash said.
“They really did architect hardware around software requirements,” he said. “And they attempted to change their business relationship with customers accordingly. … For context, recall that one of Oracle’s top features in its hypergrowth years in the 1980s was hardware portability.”
In fact, it would have been disastrous if Oracle didn’t come up with Exadata, according to Monash.
“Oracle was being pummeled by independent analytics DBMS vendors, appliance-based or others,” he said. “The competition was more cost-effective, naturally, but Exadata was good enough to stem much of the bleeding.”
Exadata and its relatives are foundational to Oracle’s IaaS, and the company also offers the systems on-premises through its Cloud at Customer program.
“We offer customers choice,” said Steve Daheb, senior vice president of Oracle Cloud. “If customers want to deploy [Oracle software] on IBM or HP [gear], you could do that. But we also continue to see this constant theme in tech, where things get complicated and then they get aggregated.”
Oracle buys Sun Microsystems
Few Oracle acquisitions were as controversial as its $7.4 billion move to buy Sun Microsystems. Critics of the deal bemoaned the potential fate of open-source technologies such as the MySQL database and the Java programming language under Oracle’s ownership, and the deal faced serious scrutiny from European regulators.
Oracle ultimately made a series of commitments about MySQL, which it promised to uphold for five years, and the deal won approval in early 2010.
Sun’s hardware became a platform for Exadata and other Oracle appliances. MySQL has chugged along with regular updates, contrary to some expectations that it would be killed off.
But many other Sun-related technologies faded into obscurity, such as Solaris and Sun’s early version of an AWS-style IaaS. Oracle also moved Java EE to the Eclipse Foundation, although it maintains a tight hold over Java SE.
The Sun deal remains relevant today, given how it ties into Ellison’s long-term vision of making Oracle the IBM for the 21st century, Mueller said.
If realized, that aspiration would make Oracle a “chip-to-click” technology provider, spanning silicon to end-user applications, he added. “The verdict is kind of still out over whether that is going to work.”
Oracle Database 12c
The company made a slight but telling change to its database naming convention with the 2013 release of 12c, swapping the “g” that stood for grid computing for a “c” denoting cloud.
Oracle’s first iteration of 12c had multitenancy as a marquee feature. SaaS vendors at the time predominantly used multitenancy at the application level, with many customers sharing the same instance of an app. This approach makes it easier to apply updates across many customers’ apps, but is inherently weaker for security, Ellison contended.
Oracle 12c’s multi-tenant option provided an architecture where one container database held many “pluggable” databases.
Oracle later rolled out an in-memory option to compete with SAP’s HANA in-memory database. SAP hoped its customers, many of which used Oracle’s database as an underlying store, would migrate onto HANA.
2016: Oracle acquires NetSuite
Oracle’s $9.3 billion purchase of cloud ERP vendor NetSuite came with controversy, given Ellison’s large personal financial stake in the vendor. But on a strategic level, the move made plenty of sense.
NetSuite at the time had more than 10,000 customers, predominantly in the small and medium-sized business range. Oracle, in contrast, had 1,000 or so customers for its cloud ERP aimed at large enterprises, and not much presence in SMB.
Thus, the move plugged a major gap for Oracle. It also came as Oracle and NetSuite began to compete with each other at the margins for customers of a certain size.
Oracle’s move also gave it a coherent two-tier ERP strategy, wherein a customer that opens new offices would use NetSuite in those locations while tying it back to a central Oracle ERP system. This is a practice rival SAP has used with Business ByDesign, its cloud ERP product for SMBs, as well as Business One.
The NetSuite acquisition was practically destined from the start, said Scavo of Strativa.
“I always thought Larry was smart not to do the NetSuite experiment internally. NetSuite was able to develop its product as a cloud ERP system long before anyone dreamed of doing that,” Scavo said.
NetSuite customers stand to benefit as the software moves onto Oracle’s IaaS, if the promised gains in performance and elasticity materialize; NetSuite has grappled with both at times, Scavo added. “I’m looking forward to seeing some evidence of that.”
Oracle launches its second-generation IaaS cloud
The market for hyperscale IaaS has largely coalesced around three players: AWS, Microsoft and Google. Other large companies such as Cisco and HPE tried something similar, but conceded defeat and now position themselves as neutral middle players keen to help customers navigate and manage multi-cloud deployments.
Oracle, meanwhile, came to market with an initial public IaaS offering based in part on OpenStack, but it failed to gain much traction. It subsequently made major investments in a second-generation IaaS, called Oracle Cloud Infrastructure, which offers many advancements at the compute, network and storage layers over the original.
Oracle has again shifted gears, evidenced by its partnership with Microsoft to boost interoperability between Oracle Cloud Infrastructure and Azure. One expected use case is for IT pros to run their enterprise application logic and presentation tiers on Azure, while tying back to Oracle’s Autonomous Database on the Oracle cloud.
“We started this a while back and it’s something customers asked for,” Oracle’s Daheb said. There was significant development work involved and given the companies’ shared interests, the deal was natural, according to Daheb.
“If you think about this world we came from, with [on-premises software], we had to make it work with everybody,” Daheb said. “Part of it is working together to bring that to the cloud.”
Oracle Autonomous Database marks the path forward
Ellison will unveil updates to Oracle Database 19c, which runs both on premises and in the cloud, in a talk at OpenWorld. While details remain under wraps, it is safe to assume the news will involve the autonomous management and maintenance capabilities Oracle first discussed in 2017.
Oracle database customers typically wait a couple of years before upgrading to a new version, preferring to let early adopters work through any remaining bugs and stability issues. Version 19c arrived in January, but is more mature than the name suggests. Oracle moved to a yearly naming convention and update path in 2018, and thus 19c is considered the final iteration of the 12c release cycle, which dates to 2013.
Oracle users should be mindful that autonomous database features have been a staple of database systems for decades, according to Monash.
But Oracle has indeed accomplished something special with its cloud-based Autonomous Database, according to Daheb. He referred to an Oracle marketing intern who was able to set up databases in just a couple of minutes on the Oracle Cloud version. “For us, cloud is the great democratizer,” Daheb said.
Bluescape has launched its newest mobile app to enable users to access their content on the go.
The app, available in the Apple App Store and Google Play store, connects to Bluescape workspaces from mobile devices, such as cellphones or tablets. According to the vendor, it enables users to give presentations without a laptop by launching a Bluescape session from the app onto larger touchscreens.
Users can also access, search and view their content and workspaces anytime, from anywhere. According to Bluescape, the app provides a visual collaboration workspace that integrates day-to-day applications, content and tools.
The Bluescape platform is cloud-based software designed for collaboration in the workplace. It supports settings that include mobile and personal workstations, huddle rooms, innovation centers, collaboration suites, conference rooms, training rooms, executive briefing centers, command centers and control centers. Search, messaging and file sharing are also built into the platform.
Bluescape lists professionals in fields such as architecture, consulting, design, filmmaking, marketing and product development as ideal users for its product, as these groups often work collaboratively and visually.
Bluescape is among the vendors offering visual collaboration software, which works hand in hand with digital collaborative whiteboards. Vendor Mural provides separate workspaces for teams and enables scaling for companywide processes, with frameworks for Agile, Lean and Design Thinking methods. Custom frameworks are also available.
Competitor Miro touts its product development, user experience research and design, and Lean and Agile capabilities, as well as its enterprise-grade security. Available integrations include Google Drive, Box, Dropbox, Slack, OneDrive and Microsoft Teams.
An Office 365 migration can improve an end user’s experience by making it easier to work in a mobile environment while also keeping Office 365 features up to date. But if the migration is done without the end users in mind, it can lead to headaches for IT admins.
At a Virtual Technology User Group (VTUG) event in Westbrook, Maine, about 30 attendees piled into a Westbrook Middle School classroom to hear tips on how to transition to Office 365 smoothly.
Office 365 is Microsoft’s subscription-based line of Office applications, such as Word, PowerPoint, Outlook, Teams and Excel. Rather than being tied to a single PC, Office 365 apps run in the cloud, enabling users to access their files wherever they are.
“As IT admins, we need to make the digital transformation technology seem easy,” said Jay Gilchrist, business development manager for Presidio Inc., a cloud, security and digital infrastructure vendor in New York and a managed service provider for Microsoft. Gilchrist and his Presidio colleague, enterprise software delivery architect Michael Cessna, led the session, outlining lessons they’ve learned from previous Office 365 migrations.
Importance of communication and training
Their first lessons covered communicating with end users, keeping a tight migration schedule and training users properly.
“You want to make it clear that you’re not just making a change for change’s sake,” Gilchrist said. “Communicate these changes as early as possible and identify users who may need a little more training.”
One practical tip he offered is to reserve the organization’s name in Office 365 early to ensure it’s available.
Conducting presentations, crafting targeted emails and working to keep the migration transparent can help IT admins keep end users up to date and enthused about the transition.
“End users are not information professionals,” Cessna said. “They don’t understand what we understand and these changes are a big deal to them.”
Cessna and Gilchrist said that if IT admins want end users to adopt apps in Office 365, they’ll need to provide the right level of training. IT admins can do that by providing internal training sessions, using external resources such as SharePoint Training Sites, as well as letting users work with the apps in a sandbox environment. Training will help end users get used to how the apps work and address questions end users may have in real time, thereby reducing helpdesk tickets once the Office 365 migration is completed.
Governance and deployment
Before an Office 365 migration, IT admins need to have application governance and a deployment plan in place.
“Governance built within Microsoft isn’t really there,” Cessna said. “You can have 2,000 users and still have 4,500 Team sessions and now you have to manage all that data. It’s good to take care of governance at the beginning.”
Deployment of Office 365 is another aspect that IT admins need to tackle at the start of an Office 365 migration. They need to determine what versions are compatible with the organization’s OS and how the organization will use the product.
“It’s important to assess the digital environment, the OSes, what versions of Office are out there and ensure the right number of licenses,” Cessna said.
Securing and backing up enterprise data
One persistent concern for organizations migrating from on-premises environments to the Office 365 cloud is security.
Microsoft provides tools that can help detect threats and secure an organization’s data. Microsoft offers Office 365 Advanced Threat Protection (ATP), a cloud-based email filtering service that helps protect against malware; Windows Defender ATP, an enterprise-grade tool to detect and respond to security threats; and Azure ATP, which accesses the on-premises Active Directory to identify threats.
Microsoft has also added emerging security capabilities such as passwordless login, single sign-on and multifactor authentication to ensure data and files don’t get compromised or stolen during an Office 365 migration.
Regulated organizations such as financial institutions that need to retain data for up to seven years will need to back up Office 365 data, as Microsoft provides limited data storage capabilities, according to Cessna.
Microsoft backs up data within Office 365 for up to two years in some cases, and only for one month in other cases, leaving the majority of data backup to IT.
“[Microsoft] doesn’t give a damn about your data,” he said. “Microsoft takes care of the service, but you own the data.”
Picking the right license
Once the organization is ready for the migration, it’s important to choose the right Office 365 license, according to Gilchrist.
There are several ways for an organization to license an Office 365 subscription. Gilchrist said choosing the right one depends on the size of the organization and the sophistication of the organization’s IT department.
Smaller businesses can choose license options for 300 or fewer users, with add-ons such as a desktop version of Office and advanced security features. The cost of enterprise licenses differs depending on their scope and the number of licenses needed; educational and nonprofit discounts are offered as well.
Other licensing options include Microsoft 365 bundles, which combine Office 365 with a Windows 10 deployment. Organizations can also buy through the Cloud Solution Provider program and have a provider handle the heavy lifting of the Office 365 migration.
“There are different ways to do it. You just have to be aware of the best way to license for your business,” Gilchrist said.
Measuring success and adoption
Once the migration is completed, IT still has one more objective: proving the worth of the Office 365 move.
“This is critical and these migrations aren’t cheap,” Cessna said. “You want to show back to the business the ROI and what this new world looks like.”
To do that, IT admins will have to circle back to their end users. They can use tools such as Microsoft’s standard Office 365 usage reports, Power BI adoption reports or other application measurement software to pin down end-user adoption and usage rates, and provide additional training if necessary.
“Projects fail because the end users aren’t happy,” Cessna said. “We don’t take them into account enough. Our end users are our customers and we need to make sure they’re happy.”