IBM said Wednesday it reached an agreement to acquire a Brazilian RPA vendor in a move that will add RPA to IBM’s suite of automation tools.
The company, WDG Soluções Em Sistemas E Automação De Processos LTDA, is informally called WDG Automation.
“IBM doesn’t currently have an RPA offering (beyond partnerships), so this plugs a hole for them,” said Alan Pelz-Sharpe, founder of market advisory and research firm Deep Analysis. “Frankly, it looks like a smart move; they have acquired a firm with good basic technology at most likely a modest price.”
Based in São José do Rio Preto, Brazil, WDG Automation primarily sells robotic automation systems, chatbots and automation software to enterprises in Latin America. Its low-code/no-code platform is designed to enable business users to easily create bots.
IBM said it plans to integrate more than 600 prebuilt RPA functions from WDG Automation into its Cloud Pak products on Red Hat OpenShift, beginning with Cloud Pak for Automation.
Integrating WDG Automation technology into Cloud Pak for Automation, a platform for building and running automation applications, will enable IBM customers to deploy bots faster and automate more workflow processes.
Eventually, IBM aims to integrate WDG Automation technology into its other products, including Watson AIOps, a tool meant to automate processes that help CIOs detect, diagnose and respond to IT anomalies, IBM said.
Commvault and Microsoft are joined at the hip with a new partnership deal combining development, marketing and sales efforts for Commvault’s Metallic and Microsoft Azure.
Commvault’s multiyear strategic agreement with Microsoft builds on the two companies’ previous relationship. Metallic’s SaaS-based backup is already hosted on the Microsoft Azure cloud, and the new partnership laid down a roadmap to deepen that integration. That roadmap includes building a new SaaS offering of Metallic Cloud Storage on Azure Blob Storage, but other integrations with native Azure services are also in the works.
The partnership also focuses on making Metallic easy to discover, purchase and use. As a result of the agreement between Commvault and Microsoft, Metallic Backup & Recovery for Office 365 was introduced to the Azure Marketplace as a featured application. As a native service, charges for Metallic will appear on a customer’s Azure bill, and customers will be able to use Metallic through Azure’s control panel.
Data protection vendor Commvault launched Metallic SaaS backup in late 2019 as a separate brand and business unit. Manoj Nair, general manager of Metallic, said the demand for SaaS and cloud has accelerated as a result of COVID-19, and customers today have a wide range of vendors to choose from. He said as a native Azure service, Metallic can capture Microsoft customers migrating to Office 365 and shopping for backup.
“There’s a need for trust today, and we’re a trusted solution from two vendors,” Nair said.
Nair also mentioned that this partnership feeds the global launch efforts of Metallic, as it allows Metallic to go to any market where Azure is available. Metallic launched only in the U.S. in 2019 and became available in Canada last month. With this agreement in place, Metallic will soon be launching in New Zealand and Australia, followed by the European market.
Chris Powell, chief marketing officer of Commvault, said Metallic started with Office 365 backup because there is an immediate market need. Businesses were already gravitating toward cloud and SaaS, slowly and methodically as they considered how to best optimize costs during their transition. However, Powell said COVID-19 was like a “punch in the mouth,” and organizations were scrambling as they quickly found themselves in a world with more remote workers and more ransomware attacks.
“We were seeing demand even before putting Metallic in Azure Marketplace, but customers are even more cost-conscious now,” Powell said. He also added that Metallic’s other products, Core and Endpoint backup, will be in Azure Marketplace in the future.
Nate Hauenstein, global infrastructure manager at Chart Industries, said the Metallic and Azure partnership comes at a good time. Chart manufactures cryogenic equipment, from Yeti can coolers to large shipping containers found on trucks and barges. Chart also has a hand in the cannabis market, flash-freezing extracted cannabis oil, and recently experienced an uptick in orders for cryogenic freezers for medical research labs.
Six years ago, Hauenstein united Chart’s data protection onto Commvault. As a result of multiple mergers and acquisitions, Chart has more than 30 sites across the globe, and Hauenstein said uniting them all on a single platform was no small task. However, he said the cost savings from heavily virtualizing IT, reducing the physical footprints of each site and bringing every site to a single vendor proved to be a winning argument with his superiors.
But Hauenstein’s task of reducing Chart’s hardware footprint went beyond data protection, and four years ago, he made the company cloud-first by bringing most of its workloads to Azure. He said Chart doesn’t have a single large data center and described its IT as decentralized and full of continually moving parts.
Hauenstein took part in Metallic’s beta and decided that, once his Commvault license was up for renewal in early 2020, he would onboard all of his on-premises backup to Metallic. However, Chart was in the middle of heavy merger and acquisition activity, and Hauenstein felt Chart’s Azure cloud architecture wasn’t mature enough. He said he wanted tight control over where his data was stored in Azure. As a result, he had to pause the Metallic launch.
Hauenstein said the partnership between Commvault and Microsoft will help him a great deal because it lines up perfectly with what he’s working on. He is already in the process of refining his Azure architecture, so wrapping Metallic into that discussion will let him complete both IT projects.
“This partnership is in direct alignment with my objectives. It will make that transition easier,” Hauenstein said.
Technology partnerships between backup vendors and cloud providers aren’t new. Azure Marketplace includes a host of backup products, including those from Acronis and Veeam. Actifio GO for Google Cloud Platform (GCP) has similar tight integration between the two companies, with fees for Actifio showing up on a customer’s GCP bill.
Christophe Bertrand, senior analyst at Enterprise Strategy Group, said the difference in the partnership between Commvault and Microsoft is its level of integration. It taps into both companies’ channel ecosystems and incentivizes salespeople on both sides, aligning go-to-market between the two. Bertrand sees the deal as more than “just a logo on a website,” which is how he describes most technology partnerships.
Bertrand expects the two companies to build out beyond backup and recovery, with a roadmap leading toward intelligent data management. Bertrand added that the timing of this partnership is also fortuitous, given the increased demand for SaaS and cloud.
“There’s a lot of cloud adoption accelerated by the current situation,” Bertrand said. “This has all the ingredients for success.”
Oracle is further blurring the lines between public and private cloud infrastructure with the launch of Dedicated Region Cloud at Customer, which can run the company’s Autonomous Database, SaaS applications and other software inside an enterprise’s data center.
The company has offered Cloud at Customer for a couple of years now in a few flavors, including ones for its Exadata platform and a Big Data version. The systems use a combination of software and hardware to duplicate services running in Oracle’s public cloud. Dedicated Regions extends the concept to the full range of Oracle services, including its Fusion SaaS applications and Autonomous Database.
Oracle Cloud at Customer Dedicated Region pricing starts at $500,000 per month, which indicates the offering is aimed at the company’s largest customers. Many of these companies have regulatory and data-residency requirements that necessitate on-premises deployments, but they may also want to benefit from the scalability, elasticity and subscription pricing model that public cloud infrastructure offers.
“[It’s] an easy way to switch from managing your own infrastructure to getting it as a service with exactly the same compliance and data residency and security as if you did it yourself,” said Deepak Mohan, an analyst at IDC.
Cloud at Customer subscriptions include the required hardware, control plane software and a support gateway that ties a customer’s implementation to remote Oracle operations staff that handle daily tasks and maintenance in the same way Oracle runs its own data centers, the company said.
Dedicated Regions bear some resemblance to AWS Outposts and Microsoft’s Azure Stack, but there are differences. Azure Stack is available as a managed service, but customers can also self-manage the system and buy hardware from multiple vendors. Both Outposts and Azure Stack are also targeting smaller-scale workloads, whereas Dedicated Regions are meant for large, steady-state use cases in heavily regulated industries such as finance and healthcare, Mohan said.
Still, the trend toward expanding public cloud infrastructure outside of actual public cloud data centers is red-hot, he added. “It’s exploded in the last year.”
Dedicated Region Cloud at Customer is oriented around Oracle software, but that doesn’t mean it amounts to vendor lock-in, according to Steve Daheb, senior vice president of Oracle Cloud. “If you want to run non-Oracle workloads on it, you can absolutely do that,” he said.
Autonomous Database goes on-premises
Oracle first introduced Autonomous Database at OpenWorld 2018. Since then, it’s been offered in multi-tenant and single-tenant form through Oracle Cloud Infrastructure. Now Oracle has added support for Autonomous Database on Exadata Cloud at Customer, either in standalone form or as part of a Dedicated Region.
The system uses machine learning and other techniques to automate tasks otherwise left to database administrators, such as patching, tuning and provisioning. Oracle’s Exadata hardware platform can be used to consolidate large fleets of Oracle database instances and provides superior performance over traditional racks, the company said.
In a nod to customers with on-premises legacy software that may be difficult to migrate away from, Oracle has also certified its Siebel, PeopleSoft and JD Edwards applications for CRM, human capital management (HCM) and ERP for Autonomous Database. This should have some appeal among Oracle’s installed base, according to Doug Henschen, an analyst at Constellation Research.
“Even if customers are still using on-premises apps, they can now take advantage of some as-a-service automation capabilities through Autonomous Database without having to migrate to a SaaS-based application,” Henschen said.
Oracle has yet to announce support for its Autonomous Data Warehouse on Cloud at Customer, but this, too, would be welcome, he added.
“It would presumably open up options to support Oracle Analytics for Applications through Cloud at Customer,” Henschen said. The Oracle Analytics integrations now available for Fusion ERP and Fusion HCM include services-based data warehouse instances and data integration, but they are not yet supported by Cloud at Customer.
“Here, too, the benefit would be consistent use of a services-based approach even where internal or external governance or policy requirements demand that data is managed on premises,” Henschen said.
Microsoft has seized control of several malicious domains that were used in COVID-19-themed phishing attacks against its customers in 62 countries around the world.
Last month, the technology giant filed a complaint with the U.S. District Court for the Eastern District of Virginia in order to stop cybercriminals from “exploiting the pandemic by attempting to obtain personal access and confidential information of its customers.” The court documents were unsealed on Tuesday as Microsoft secured control of the domains, which were used in a variety of phishing and business email compromise (BEC) attacks.
In a blog post Tuesday, Tom Burt, corporate vice president of customer security and trust at Microsoft, wrote that the “civil case resulted in a court order allowing Microsoft to seize control of key domains in the criminals’ infrastructure so that it can no longer be used to execute cyberattacks.”
Microsoft’s Digital Crimes Unit first observed a new phishing scheme in December 2019, which was designed to compromise customers’ Office 365 accounts. While efforts to block the sophisticated scheme were successful, Microsoft recently observed renewed attempts by the same threat actors, this time with a COVID-19 lure.
“Specifically, defendants in this action are part of an online criminal network whose tactics evolved to take advantage of global current events by deploying COVID-19 themed phishing campaign targeting Microsoft customers around the world. This sophisticated phishing campaign is designed to compromise thousands of Microsoft customer accounts and gain access to customer email, contact lists, sensitive documents and other personal information,” Microsoft wrote in the complaint.
Microsoft seized six primary domains, five of which contained the word “Office”; the sixth domain was mailitdaemon[.]com, which was used to receive forwarded mail from compromised Office 365 accounts.
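Lookalike domains of this kind, which embed a trusted brand name, can often be flagged with a simple heuristic. The Python sketch below is purely illustrative; the keyword and allow lists are hypothetical examples, not Microsoft’s actual detection logic:

```python
# Illustrative heuristic: flag domains that embed a trusted brand keyword
# but are not on an allow list of legitimate domains.
# Keyword and allow lists here are hypothetical examples.
SUSPECT_KEYWORDS = ("office", "o365", "microsoft")
LEGITIMATE_DOMAINS = {"office.com", "microsoft.com", "office365.com"}

def looks_like_lookalike(domain: str) -> bool:
    """Return True if the domain embeds a brand keyword yet is not
    an allow-listed legitimate domain."""
    domain = domain.lower().rstrip(".")
    if domain in LEGITIMATE_DOMAINS:
        return False
    return any(kw in domain for kw in SUSPECT_KEYWORDS)

print(looks_like_lookalike("office-hrweb.com"))  # True
print(looks_like_lookalike("office.com"))        # False
```

A real defense would combine such keyword checks with certificate-transparency monitoring and newly registered domain feeds rather than rely on substring matching alone.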
Burt wrote in the blog post that BEC threats have “increased in complexity, sophistication and frequency in recent years.” As BEC rises, threat actors have become equipped with new tactics that take impersonation to the next level. “These phishing emails are designed to look like they come from an employer or trusted source,” Microsoft wrote in the complaint.
In these coronavirus phishing emails, threat actors included messages with a COVID-19 theme to lure in victims, playing on the fear and uncertainty caused by the pandemic. For example, threat actors do this by “using terms such as ‘COVID-19 bonus,’” Burt wrote.
According to the FBI, half of all cybercrime losses in 2019 came from BEC attacks alone. Some experts say BEC attacks have led to as many cyberinsurance payments as ransomware, and in some cases more.
Microsoft isn’t alone in seizing coronavirus-related malicious domains. In April, the Department of Justice announced the disruption of hundreds of online COVID-19 related scams, through public and private sector cooperative efforts.
“As of April 21, 2020, The FBI’s Internet Crime Complaint Center has received and reviewed more than 3,600 complaints related to COVID-19 scams, many of which operated from websites that advertised fake vaccines and cures, operated fraudulent charity drives, delivered malware or hosted various other types of scams,” the DOJ wrote in the announcement.
Like many security vendors, Microsoft said it has observed cybercriminals adapting their lures this year to take advantage of current events such as COVID-19. The company recommended several steps to prevent credential theft, including implementing two-factor authentication on all business and personal accounts.
“While the lures may have changed, the underlying threats remain, evolve and grow,” Burt wrote.
Not only is Information Builders now IBI, but the vendor is also trying to modernize its analytics suite by adding augmented intelligence capabilities and expanding beyond data visualizations to become an end-to-end platform.
Information Builders was founded in 1975 and is based in New York. For decades, the vendor’s analytics platform was considered to offer some of the more vibrant business intelligence tools. In recent years, however, newer vendors such as Tableau, Qlik and ThoughtSpot have developed innovations that pushed the capabilities of their platforms beyond those of the older analytics purveyors.
Information Builders, however, has responded in recent years to changes in the market. Frank Vella took over as CEO in January 2019, replacing founder Gerry Cohen, and since then has led an overhaul of Information Builders’ capabilities with AI and the cloud as focal points along with a target audience of small and midsize businesses.
During its virtual conference last month, the vendor not only further advanced its platform with the introduction of four new features but also debuted a revamped website and revealed that it will now be known as IBI rather than Information Builders.
“It’s more than just a new logo,” said Keith Kohl, senior vice president of product management. “One of the things we always talked about as Information Builders was that we were a BI and analytics company. But the fact is, we’re a data and analytics company.”
That, Kohl continued, means IBI has customers whose needs go well beyond analyzing data and who use its platform for data management as well.
“It’s a rebirth of the company with new messaging,” Kohl said.
As far as the new features IBI added to its analytics platform in late June — three of which are now generally available — both the cloud and AI play a prominent role.
Automated Insights is a new tool built on AI and machine learning that is designed to help users derive insights from their data more quickly and easily. Open Data Visualizations uses IBI’s Open Data Platform to help customers connect to new data sources and embed applications so they can access data in real time and make data-driven decisions. Omni-HealthData Cloud Essentials, a SaaS-only offering that will be available in August, will enable midmarket healthcare providers to make better use of patient data. Finally, a new partnership between IBI and ASG, a provider of IT management technology, will enable customers to see key metrics and data lineage information in dashboards and reports to support their data governance efforts.
Taken as a whole, though none of the new features are going to revolutionize analytics, they further demonstrate that IBI’s platform is again modern. More importantly, they will benefit the vendor’s customers, according to Mike Leone, senior analyst at Enterprise Strategy Group.
“Many of their customers and end users are at a tipping point when it comes to leveraging next-generation technology like AI and ML,” he said. “Combined with the need to support more end users, IBI has a recipe for success. They’ve been successful to date at easily meeting the large-scale demands of growing business and streaming data sets. Now it’s about enabling faster ramp up of next-gen technology to fuel better, faster data storytelling and eventual decision-making for more end users.”
The response from users, meanwhile — not only to the new features but also to the evolution of the IBI analytics platform over the course of many months — has been positive.
Sound Credit Union, based in Tacoma, Wash., began using Information Builders for its BI needs about two years ago, according to Martin Walker, the credit union’s vice president of digital experience and innovation. When the credit union chose the vendor’s platform, it was looking for a product that would empower end users to do their own analysis without having to go through its IT department every time they needed a report.
In addition, the credit union wanted a platform that would provide data management capabilities in addition to BI capabilities.
“With a lot of the other solutions we saw, we really would have needed to have two solutions,” Walker said. “Some were very good at presenting the data and visualizations but didn’t have the data warehouse component. We would have had to build that out separately or hire a consultant to do that for us, and with IBI we essentially got the whole puzzle.”
But while Information Builders provided the data management and self-service capabilities Sound Credit Union was seeking, the credit union couldn’t foresee the evolution into IBI that was to come or the capabilities the platform would enable.
“It was good in 2018, and it is fantastic today,” Walker said. “They’ve led the way for us to understand what is possible. We were in our infancy in terms of understanding how we could leverage our data, and IBI has taken us a lot further than we thought in the last two-and-a-half years.”
Regarding the rebranding of the company name after 45 years, both Walker and Leone said it is a positive development.
“It almost feels natural that it comes with the changing of the guard that took place a year and a half ago,” Leone said. “Their existing customers will continue to see value, and a new wave of customers will look to achieve the same if not better successes.”
Meanwhile, Walker, who spent 20 years in marketing, including 10 years with the NBA’s Seattle SuperSonics before they left for Oklahoma City, said “it’s pretty cool.”
Looking toward future IBI analytics platform updates, Kohl said IBI’s areas of focus will be adding more AI capabilities, including automating repetitive tasks in the short term and more complex ones in the long term; adding more features to help nontechnical users apply machine learning models to specific use cases; improving ease of use; and adding more cloud services to make the platform easier to adopt.
“I believe IBI is headed in the right direction,” Leone said. “I think as they look to place a heavy focus on enabling the masses to better achieve data-driven success, it will be well-received by both existing and potential customers alike.”
Impediments to connecting and managing disparate data sources are many. The new SODA Foundation offers the promise of more interoperability in open source data management so users can connect to applications and data, whether on premises or in the cloud.
SODA stands for SODA Open Data Autonomy. The group was introduced at the Linux Foundation’s Open Source Summit North America virtual conference on June 29.
The nascent foundation then held its own mini summit on July 2. Participating members and users outlined the goals and components that make up the open source data effort, which is an evolution of the Linux Foundation’s OpenSDS (open software-defined storage) project that started in 2016. Among members of the group are Huawei, Fujitsu, NTT Communications and Sony.
The OpenSDS project leadership realized in recent years that managing storage is only part of the challenge for organizations, which also need to manage data wherever it resides.
Many data silos exist at Toyota and other organizations, and it’s not easy to bring them all together, said Yuji Yazawa, principal engineer at Toyota Motor Corporation and chair of the SODA Foundation’s end user advisory committee, during a recorded session at the Open Source Summit.
Yazawa noted that in his experiences at Toyota and previously at Yahoo Japan, the lack of standardized interfaces for data management tends to lead to lock-in. He said that’s why he’s interested in the SODA Foundation’s mission to foster an ecosystem of open source data management tools and capabilities.
“SODA is an open source unified autonomous framework for data mobility from edge to core to cloud,” Yazawa said.
Moving from OpenSDS to SODA Foundation
In a keynote session during the SODA Foundation mini summit, Rakesh Jain, co-chair of the SODA Foundation technical steering committee and senior technical staff member at IBM, outlined the open data fabric architecture approach that the group is taking.
The SODA Foundation integrates projects that provide core elements needed to enable a unified data framework, including an infrastructure manager, controller and multi-cloud plugins, Jain said.
He noted that the core projects enable users to access different data storage repositories including VMware on premises as well as public cloud with AWS, Azure and Google. There are also components for data lifecycle management, governance, security and analytics.
Beyond the core project, Jain noted that the SODA Foundation is fostering an ecosystem of projects that help to expand the idea of an open data fabric.
Creating a single data framework with the SODA Foundation
In the same keynote session, Sanil Divakaran, a member of the technical steering committee, noted that the foundation is aiming to define a single data network framework, in which any application can potentially connect to any data or storage back end in an interoperable approach.
Each type of application deployment approach, whether VMware virtual machines, Kubernetes and containers, or public cloud, has its own method of connecting to data storage back ends and enabling data management. SODA’s open data framework enables an abstraction, so a user will transparently be able to connect and manage the different data sources, regardless of the underlying deployment approach.
“We want to provide key features like data lifecycle and data protection in a unified framework,” Divakaran said. “So the application framework can focus on application business logic and the storage can simply focus on the storage, so we connect between the two and provide a unified interface.”
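The abstraction Divakaran describes can be pictured as a thin layer that gives every back end the same interface, so applications never code against a vendor-specific API. The following Python sketch is a hypothetical illustration of that idea, not actual SODA Foundation code; all class and method names are invented:

```python
from abc import ABC, abstractmethod

class StorageBackend(ABC):
    """Hypothetical unified interface: each back end (cloud object store,
    on-premises array, edge device) implements the same operations."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryBackend(StorageBackend):
    """Stand-in back end; a real one would wrap S3, Azure Blob, a SAN, etc."""
    def __init__(self):
        self._store = {}

    def put(self, key, data):
        self._store[key] = data

    def get(self, key):
        return self._store[key]

class DataFabric:
    """Routes application requests to the appropriate back end, keeping the
    application code independent of the underlying deployment approach."""
    def __init__(self, backends):
        self.backends = backends

    def put(self, backend_name, key, data):
        self.backends[backend_name].put(key, data)

    def get(self, backend_name, key):
        return self.backends[backend_name].get(key)

fabric = DataFabric({"edge": InMemoryBackend(), "cloud": InMemoryBackend()})
fabric.put("edge", "sensor-1", b"reading")
print(fabric.get("edge", "sensor-1"))  # b'reading'
```

The point of the sketch is the shape of the design: adding a new back end means implementing the interface once, and every application that talks to the fabric gains access to it without changes.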
The COVID-19 pandemic has created massive disruptions across all industries, causing demand to plummet for some products and skyrocket for others.
SAP Business One, an ERP for SMBs that includes functionality of SAP’s enterprise systems but is easier to manage, has been vital in helping two small companies deal with dramatic changes in demand and may have helped them stay in business.
One such company, Sagamore Spirit, used SAP Business One ERP to successfully switch from producing whiskey to producing hand sanitizer.
A small rye whiskey distillery in Baltimore, Sagamore Spirit feared it would have to shut down production when the COVID-19 crisis hit, shuttering the restaurant industry, which brought in more than a third of the company’s revenue. However, Sagamore received a call from Johns Hopkins Medicine, which was in desperate need of hand sanitizer and hoped that Sagamore could help by converting its distillery, said Drew Thorn, vice president of finance and operations at Sagamore Spirit.
Moving from spirits to sanitizer overnight
Sagamore Spirit immediately began work with Johns Hopkins to determine the formula for the hand sanitizer. It also needed to adjust its supply chain to integrate new suppliers and ingredients. Not only that, but the distillery’s staff was reduced from 50 to 12 employees. The SAP Business One ERP, which Sagamore uses in conjunction with distillery software for process manufacturing enhancements, was a main reason why the adjustment could be pulled off, according to Thorn.
“From a systems perspective, our implementation of SAP Business One for whiskey production directly translates into making sanitizer,” Thorn said. “All ingredients are inventoried items within SAP Business One and then are assembled through the production module into finished sanitizer product inventory.”
The switch to making hand sanitizer was a lifeline for Sagamore Spirit as the demand for craft distilled spirits took an abrupt and substantial hit in the wake of the COVID-19 shutdowns, he said. More than 35% of Sagamore’s business came from restaurants and bars that were shuttered overnight. The company also had to shut down its visitor center and tasting rooms, which hosted more than 40,000 visitors annually prior to the crisis.
“We partnered with Johns Hopkins Medicine to produce sanitizer and set a company goal to fully supply their organization of approximately 30,000 employees, and we were shipping within 10 days,” Thorn said. “SAP Business One provided the foundation to fold in a whole new set of vendors, completely new ingredients, packaging materials, production process and recipes overnight. From there, we were able to understand our costs, set up and invoice new customers, manage inventory and update reporting.”
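The production flow Thorn describes, where inventoried ingredients are assembled into finished product through a production module, follows a standard bill-of-materials pattern. A minimal sketch of that pattern is below; the item names, quantities and recipe ratios are hypothetical, not Sagamore’s actual formula or SAP Business One code:

```python
# Minimal bill-of-materials sketch for a production run.
# Item names, stock levels and ratios are hypothetical examples.
inventory = {"ethanol_l": 1000.0, "glycerol_l": 50.0, "peroxide_l": 20.0}

# BOM: ingredient quantity consumed per liter of finished sanitizer.
bom_per_unit = {"ethanol_l": 0.83, "glycerol_l": 0.015, "peroxide_l": 0.0042}

def run_production(units: float) -> float:
    """Consume ingredients per the BOM and return finished units produced."""
    needed = {item: qty * units for item, qty in bom_per_unit.items()}
    if any(inventory[item] < qty for item, qty in needed.items()):
        raise ValueError("insufficient ingredient inventory")
    for item, qty in needed.items():
        inventory[item] -= qty
    return units

produced = run_production(500)  # 500 L of finished sanitizer
print(produced, round(inventory["ethanol_l"], 1))  # 500 585.0
```

An ERP adds much more on top of this core loop (costing, lot tracking, purchasing, invoicing), but the inventory-consuming production order is the piece that made swapping recipes overnight possible.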
Sagamore Spirit runs SAP Business One in an on-premises deployment but has users working remotely because of the crisis. Despite the shift to remote work, the finance and operations teams haven’t skipped a beat, Thorn said.
“We were able to move our finance and corporate teams remote on one day’s notice with full continuity,” he said. “Like most manufacturers, our finance team and operations team are interconnected, and by being able to keep them physically separate but still fully functional, we have been able to continue operations in our facility in a way that optimizes safety by eliminating unnecessary human contact.”
Increased demand leads to increased challenges
Sagamore Spirit was faced with a decline in demand due to COVID-19, but that hasn’t been the case for all companies. Ellsworth Foods LLC, an all-natural, farm-to-table food service that sources its products from local farmers, ranchers, growers and culinary artisans, saw demand doubling almost overnight.
An increase in demand is usually good for a business, but such a sudden and dramatic increase can tax a company’s employees and IT systems. Ellsworth Foods’ implementation of SAP Business One was able to help it manage the sudden shift, said Karleigh Jackson, director of marketing at Ellsworth Foods.
Based in Tifton, Ga., Ellsworth Foods delivers products to customers across the southeastern U.S., including all-natural beef, chicken, and pork, vine-ripened vegetables and North American seafood. Ellsworth’s brands include Blue Ribbon Foods and Southern Foods At Home.
On March 14, the day after a national emergency was declared in the U.S., reorder sales at Ellsworth Foods rose dramatically, Jackson said. New sales followed suit and by the next week had risen by 200%. The challenge for the company was that more sales led to increased customer service needs, including item swaps and fulfilling backorders.
“The entire company individually made the unspoken decision to just get it done, whatever it took,” she said. “Every single person was working long hours, every single day. We basically couldn’t breathe because there was just so much to do, but when you experience sudden, overwhelming business shifts like this, your processes and systems become your saving grace.”
SAP Business One’s Inventory Management module enabled Ellsworth Foods to keep up with the new inventory demands, Jackson said.
“We knew exactly how many items we needed to pull in to handle what was already sold,” she said. “Then it even allowed us to forecast orders, so we could start getting ahead of inventory needs. Imagine trying to do that without a system to help you when your business has more than doubled, but SAP made it so we could.”
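The calculation Jackson describes, covering committed orders and then forecasting ahead of inventory needs, can be illustrated in a few lines. The figures and the simple moving-average forecast below are hypothetical examples, not SAP Business One internals:

```python
# Hypothetical sketch: units to pull in = committed (already-sold) demand
# plus a simple moving-average forecast, minus stock on hand.
def units_to_order(on_hand: int, committed: int, recent_weekly_sales: list,
                   weeks_ahead: int = 2) -> int:
    """Return how many units to order to cover committed and forecast demand."""
    forecast = sum(recent_weekly_sales) / len(recent_weekly_sales) * weeks_ahead
    return max(0, round(committed + forecast - on_hand))

# Demand doubled: weekly sales jumped from roughly 100 to roughly 200 units.
print(units_to_order(on_hand=150, committed=220,
                     recent_weekly_sales=[180, 200, 220]))  # 470
```

Production systems use more sophisticated forecasting, but even this simple structure shows why a system beats manual tallies when volume suddenly doubles: the same formula scales with no extra effort.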
The rules and award categories have changed this year, so please read the full Best of VMworld 2020 U.S. Awards rules, judging criteria and category descriptions before filling out the nomination form below.
If you cannot access the nomination form below, please use this link.
All product nomination forms must include a link to a public announcement or press release containing the official product general availability date. Nominations must also include the name and contact information for at least one customer reference.
Submissions that fail to follow rules and regulations will be disqualified.
In previous years, only vendors with a physical booth presence were eligible to participate in the Best of VMworld Awards. This year, current sponsors of the digital event and vendors that had contracted with VMware to participate in the live VMworld 2020 U.S. and Europe events prior to their cancellation will be eligible to participate in the awards.
TechTarget is now accepting nominations for the Best of VMworld 2020 Awards. The nomination window will remain open until 5 p.m. Pacific Time on Wednesday, July 29, 2020. The winners will be virtually announced by SearchServerVirtualization. Before nominating a product, please read the official rules and awards criteria.
A team of expert judges — consisting of editors, independent analysts, consultants and users — will evaluate the nominated products and select winners in the following categories:
Virtualization and Cloud Infrastructure
DevOps and Automation
Networking
Resilience and Recovery
Security
Digital Workspace
See the full category descriptions and judging criteria below.
A Best of Show winner will also be selected from the individual category winners; nomination forms cannot be submitted for this category. If you submit nominations for multiple products, complete one form for each product. The same product may not be entered in multiple categories.
Only products that have shipped and are available between July 25, 2019, and July 29, 2020, will be considered for this year’s awards. The product must be generally available before the submission period closes. Products that are generally available after July 29, 2020, will be eligible for next year’s awards. All product nomination forms must include a link to a public announcement or press release containing the official product general availability date.
Nominations must also include the name and contact information for at least one customer reference. This customer must have access to the exact version of the generally available product for which the nomination is submitted (i.e., if the nomination is for version 1.2 of the product, the customer must have access to 1.2). Customer references may be contacted by judges, but their names and contact information will not be shared or published.
In previous years, only vendors with a physical booth presence were eligible to participate in the Best of VMworld Awards. This year’s event will be digital; vendors must be a sponsor of the digital event or have arranged a contracted presence at VMworld prior to the cancellation of the physical event. Not all nominees will be contacted directly by judges or interviewed in person.
Submissions that fail to follow rules and regulations will be disqualified.
If you have questions about your eligibility or the Best of VMworld Awards nomination process, email [email protected].
Best of VMworld 2020 Awards judging criteria
Judges will evaluate products in each category based on the following areas:
Innovation. Does the product introduce new capabilities or significant improvements? Does it break new ground?
Performance. Does the product perform to a degree that it could improve overall data center operation?
Ease of integration into environment. How easily does the product integrate with other products? Can the product operate effectively in heterogeneous environments?
Ease of use and manageability. Is the product easy to install? Are the product’s functions clear and easy to learn and run? Will the product scale to accommodate growth?
Functionality. Does the product deliver as promised? Does it provide greater or more useful functionality than others in its category?
Value. Does the product represent a cost-effective solution? Can its return on investment be easily justified?
Fills a market gap. What needs does the product uniquely meet? What problems does it solve?
Best of VMworld 2020 Awards categories
Please review the category descriptions carefully before nominating products. Remember, a product can only be entered in one category. If you believe a product is eligible for more than one category, use your best judgment in choosing a category, drawing from the examples included in the category descriptions and customer expectations. Judges have the authority to reassign products entered into the wrong category.
Virtualization and Cloud Infrastructure. Eligible entrants include hardware products designed to enable organizations to build virtual infrastructures, including compute and storage hardware. Examples consist of storage arrays, private and hybrid cloud infrastructures, and hyper-converged infrastructure appliances, as well as software products designed to manage or virtualize hardware, such as software-defined storage.
DevOps and Automation. Eligible entrants include products that help operations teams deploy and support applications in VMs and containers on premises and in the cloud, as well as products that monitor, track and manage on-premises or cloud-based workloads, or that enable workload migration across cloud platforms. Examples include products that monitor performance, troubleshoot workload availability, and automate workload deployment, scaling or configuration.
Networking. Eligible entrants are hardware and software technologies that enhance networking in virtual or cloud infrastructures and/or enable and optimize virtualized networks. Examples include network switches, routers, and software or services that enhance networking in a virtual or cloud environment, such as software-defined networking and software-defined WANs.
Resilience and Recovery. Eligible entrants include software products or cloud services — such as disaster recovery as a service — that are designed to back up, restore and replicate data and/or achieve fault tolerance in a virtual server or cloud infrastructure.
Security. Eligible entrants monitor and protect hypervisors, cloud workloads, guest operating systems, and virtual networks and enforce security best practices.
Digital Workspace. Eligible entrants include products that secure mobile devices, applications and content while enabling mobile productivity, or software and hardware platforms that deliver or enhance the delivery of desktops and applications to various endpoints.
To submit a product for consideration, please fill out the nomination form.
IBM is expected to unveil several new and updated storage offerings this week to help large businesses build infrastructure that supports AI-optimized software for analyzing and organizing data.
The centerpiece is the new IBM Elastic Storage System (ESS) 5000, a data lake system capable of delivering up to 55 GBps from a single eight-disk enclosure node. IBM said the ESS 5000 scales to configurations of up to 8 yottabytes. The new IBM storage system is particularly suited for data collection and longer-term storage, according to IBM product documents viewed by SearchStorage.
The forthcoming products underscore the growing use of object storage as a target for AI and high-density analytics. IBM also built an enhanced version of its IBM Elastic Storage System 3000 that allows access and data movement between IBM Spectrum Scale and object storage, both on premises and in the cloud. The disk-based system adds the Data Acceleration for AI feature in IBM Spectrum Scale software. IBM claims the feature lowers costs by eliminating the need for an extra cloud copy of the data, moving data through automatic or controlled acceleration of lower-cost object storage.
IBM’s AI and analytics offerings form a cornerstone of its overall corporate strategy, said Steve McDowell, a senior analyst at Moor Insights & Strategy, based in Austin, Texas.
“The ESS 5000 is a product that is only designed to solve big data problems for big data customers,” McDowell said. “There are only a handful of IT shops in the world today who need the combination of 55 GBps performance that is scalable to yottabyte capacities. Those that do need it are almost all IBM customers.”
The underpinnings of the 2U ESS 3000 building block stem from IBM’s long expertise in mainframes and traditional high-end storage, McDowell said. ESS 3000 systems are based on the IBM FlashSystem NVMe flash platform.
“The ESS 3000 building block addresses the kind of performance required for enterprise AI workloads, and the data lakes that emerge around those workloads, for organizations building out those capabilities,” McDowell said.
IBM also upgraded its Cloud Object Storage system, adding support for shingled magnetic recording hard disk drives and expanding its capacity to 1.9 petabytes (PB) in a 4U enclosure.
Aside from the storage hardware, McDowell said the Spectrum Scale enhancements include some compelling features, such as the new Data Acceleration for AI, which helps balance data between storage tiers.
“One of the biggest challenges of hybrid cloud is keeping data where you need it, when you need it. It’s also a costly challenge, as the egress charges encountered when moving data out of public cloud can become very expensive,” McDowell said.
Greater flexibility to move data across different storage tiers should appeal to corporate IT shops that need to keep sensitive data on premises or perhaps in a hybrid cloud.
“No one wants to leave all their sensitive or strategic data in the cloud,” said one analyst familiar with the company’s plans. “If you are coming up with the next vaccine for the coronavirus that could end up being worth $3 billion, you are not going to put that up in anyone’s public cloud. Especially massive data sets that can be hard to manage across multiple environments.”
IBM has supported data movement to other vendors’ storage for years, and recently added support for Dell EMC PowerScale and NetApp filers to its IBM Spectrum Discover metadata management software.
The AI software IBM has added makes it easier to locate and manage data spread across multiple vendors’ clouds, and makes a difference in the way large enterprises build object storage and discover information, one analyst said.
IBM also upgraded its Spectrum Discover Policy Engine to optimize data migration to less expensive archive tiers.
IBM enhances Red Hat storage
Along with IBM Elastic Storage hardware, IBM also debuted the Storage Suite for IBM Cloud Paks, which combines open source Red Hat storage with IBM Spectrum Storage software.
Red Hat is a key part of IBM’s cloud strategy. IBM acquired Red Hat in a $34 billion deal in 2019, vowing to run it as an independent engineering arm.
Offerings in the new bundle include Red Hat Ceph Storage, Red Hat OpenShift Container Platform, IBM Spectrum Virtualize, IBM Spectrum Scale, IBM Cloud Object Storage and IBM Spectrum Discover.
IBM claims Spectrum Discover can search billions of files or objects in 0.5 seconds and automatically deploy that data on Red Hat OpenShift. The product is intended to improve users’ insights into data to eliminate rescanning. A storage data catalog can be integrated with the IBM Cloud Pak for Data with one click.
Some of the AI-driven capabilities built into the new or enhanced offerings ease installation and maintenance. Integration with existing infrastructure will be a factor in convincing users to adopt the products in what figures to be challenging economic times this year and likely into next.
“Adoptability will be key with this,” said another analyst familiar with the company’s plans. “But the Fortune 50 to Fortune 100 companies are watching pennies these days and could be reluctant to spend money until they have a better idea of what the returns are going to be. With this virus, no one knows what they will need, or need over the long term.”