Tableau is continuing its focus on enterprise functionality, rolling out several new features that the company hopes will make its data visualization and analytics software more attractive as an enterprise tool and broaden its appeal beyond its existing base of line-of-business users.
In particular, the new Tableau 10.5 release, launched last week, includes the long-awaited Hyper in-memory compute engine. Company officials said Hyper will bring vastly improved speeds to the software and support new Tableau analytics use cases, such as internet of things (IoT) analytics.
The faster speeds will be particularly noticeable, they said, when users refresh Tableau data extracts, which are in-memory snapshots of data from a source file. Extracts can reach large sizes, and refreshing larger files took time with previous releases.
“We extract every piece of data that we work with going to production, so we’re really looking forward to [Hyper],” Jordan East, a BI data analyst at General Motors, said in a presentation at Tableau Conference 2017, held in Las Vegas last October.
East works in GM’s global telecom organization, which supports the company’s communications needs. His team builds BI reports on the overall health of the communications system. The amount of data coming in has grown substantially over the past year, and keeping up with the increasing volume has been a challenge, he said.
Extracting the data, rather than connecting Tableau to live data, helped improve report performance. East said he hopes the extra speed of Hyper will enable dashboards to be used in more situations, like live meetings.
Faster extracts mean fresher analytics
The Tableau 10.5 update also includes support for running Tableau Server on Linux, new governance features and other additions. But Hyper is getting most of the attention. Potentially, faster extract refreshes mean customers will refresh extracts more frequently and be able to do their Tableau analytics on fresher data.
“If Hyper lives up to demonstrations and all that has been promised, it will be an incredible enhancement for customers that are struggling with large complex data,” said Rita Sallam, a Gartner analyst.
Sallam’s one caveat was that customers who are doing Tableau analytics on smaller data sets will see less of a performance upgrade, because their extracts likely already refresh and load quickly. She said she believes the addition of Hyper will make it easier to analyze data stored in a Hadoop data lake, which was typically too big to efficiently load into Tableau before Hyper. This will give analysts access to larger, more complex data sets and enable deeper analytics, Sallam said.
Focus on enterprise functionality risky
Looking at the bigger picture, though, Sallam said there is some risk for Tableau in pursuing an enterprise focus. She said moving beyond line-of-business deployments and doubling down on enterprise functionality was necessary to attract and retain customers. But, at the same time, the company risks falling behind on analytics functionality.
Sallam said the features in analytics software that will be most important in the years ahead will be things like automated machine learning and natural language querying and generation. By prioritizing the nuts and bolts of enterprise functionality, Tableau hasn’t invested as much in these types of features, Sallam said.
“If they don’t [focus on enterprise features], they’re not going to be able to respond to customers that want to deploy Tableau at scale,” Sallam said. “But that does come with a cost, because now they can’t fully invest in next-generation features, which are going to be the defining features of user experience two or three years from now.”
I am sure everyone enjoyed Computer Science Education Week and its amazing focus on enabling the students of today to create the world of tomorrow. We live in an amazing time of technological progress. Every aspect of our lives is being shaped by digital transformation. However, with transformation comes disruption. There’s growing concern over job growth, economic opportunity, and the world we are building for the next generation. So, the real question is: How can technology create more opportunity not for a few, but for all?
This week we would love to focus on how to bring applied computer science through robotics into the classroom. The skill of programming is fundamental for structured, logical thinking and enables students to bring technology to life and make it their own. Oftentimes this can be a lofty goal when resources are limited, but there is room for a grounded, everyday approach.
Code Builder for Minecraft: Education Edition is an extension that allows educators and students to explore, create, and play in an immersive Minecraft world – all by writing code. Because it connects to learn-to-code packages like ScratchX, Tynker, and Microsoft MakeCode, players start with familiar tools, templates and tutorials. Minecraft: Education Edition is available free to qualified education institutions with any new Windows 10 device. You can check out our Minecraft: Education Edition sign-up page to learn how you can receive a one-year, single-user subscription for Minecraft: Education Edition for each new Windows 10 device purchased for your K-12 school.
OhBot is an educational robotics system that has been designed to stretch pupils’ computational thinking and understanding of computer science, and explore human/robot interaction through a creative robotic head that students program to speak and interact with their environment.
Another key area we are supporting is simulation solutions for robotics, which enable lower-cost access and better design practices in the classroom. With these programs, educators can teach robotic coding without a physical robot.
Daniel Rosenstein, a volunteer robotics coach at the elementary, middle and high school levels, firmly believes that simulation illustrates the connection between computer science and best practices in engineering design. Simulation makes the design process uniquely personal, because students are encouraged to build digital versions of their physical robot, and to try their programs in the simulator before investing in physical tools. The simulation environment, similar to a video game, creates a digital representation of the robot and its tasks, and allows for very quick learning cycles through design, programming, trial and error.
The Virtual Robotics Toolkit (VRT) is a good example. It’s an advanced simulator designed to enhance the LEGO MINDSTORMS experience. An excellent learning tool for classroom and competitive robotics, the VRT is easy to use and is approved by teachers and students.
This looks set to be another year of great new apps in the Microsoft Store, and we are excited to welcome Synthesis: An Autodesk Technology to the Store shortly. The app is built for design simulation and will enable students to work together to design, test and experiment with robotics, without having to touch a piece of physical hardware.
We look forward to connecting with you on this and more soon!
Apps, Einstein and Quip are expected to be the focus at Dreamforce, with Salesforce keeping any new clouds it may be building under wraps.
For its first 18 years, Salesforce focused inward, building its clouds and the infrastructure to support them. This year, with many business processes covered by one cloud product or another, Salesforce is turning its attention outward — to the applications side of the aisle — hoping that building out its community of developers will help propel new growth.
Salesforce applications will be a big focus at the start of Dreamforce, the company’s annual conference, which is expected to draw more than 170,000 attendees, according to a recently published report by USA Today. In its initial announcements ahead of Dreamforce, Salesforce focused on existing products and how to improve the user experience, including a bevy of app-building tools.
Einstein apps and bots
With the release of myEinstein, which allows users to create custom AI models, AI can now be embedded into the company’s apps. Salesforce Einstein bots can also employ artificial intelligence to assist with customer service workflows, and Einstein Prediction Builder enables admins to craft AI models that predict business outcomes.
Salesforce Einstein AI was the big reveal at last year’s Dreamforce — the culmination of spending more than $1 billion on AI-centric companies. And while no new clouds or platform-wide products were unveiled this year, some analysts see this year’s Dreamforce as a Part Two to last year’s Part One.
“It’s an evolution from what [Salesforce] talked about last year,” said John Bruno, principal analyst at Forrester Research. “Right now, Einstein is still in the early adopter phase. That being said, the stuff Salesforce has done has matured [Einstein] over the past year.”
Apps extend to Apple, Google stores
One key example of that, according to Bruno, is the availability of Einstein Prediction Builder, which allows companies to embed AI functionality into their own business processes.
“Prediction Builder is Salesforce stepping out and saying, ‘Everything you’ve known Salesforce to be as a platform is in the past,'” Bruno said. “Prediction Builder is the next generation of that. Salesforce placed its bets on AI being the future, and, if that’s the case, you can’t rely on the first-party capabilities you put out there.”
Beyond improving and building out Einstein, Salesforce released several other upgrades, many of which focused on building Salesforce applications and company branding.
The Salesforce mobile application will go from Salesforce1 to mySalesforce, allowing employees at all levels to build custom Salesforce applications. App builders can also publish Salesforce applications to the App Store or Google Play with a Listing Wizard capability. Lightning also received an upgrade with myLightning, which includes better branding capabilities and an improved App Builder.
Quip makes collaboration push
Quip also received an application-centric upgrade, with Live Apps embedding real-time collaboration and document viewing, a calendar app that can be used to track projects, and workflow templates for quick document and spreadsheet use for specific industries and projects.
The added collaboration features for Quip raise the question of whether Salesforce is positioning itself to challenge the likes of Slack and Microsoft Teams. Salesforce denies any posturing, saying that Slack remains a partner.
“Slack and Quip are allies in changing the way people work, and Slack continues to be a great partner of ours,” said Rafael Alenda, vice president of marketing at Quip. “Slack has seen success in communication, while the Quip Collaboration Platform is focused on document collaboration and, in the end, transforming the enterprise culture into something much more modern, less reliant on emails and less reliant on meetings.”
Alenda added that with an open API, Quip could be embedded into other document-based tools that customers use.
While Salesforce continues to play nice with Slack, others see it as the company subtly positioning itself into the growing collaboration market.
“I think they’re essentially working to make Salesforce the ‘hub’ for all the work you do as an employee,” said Bill Quinn, director of customer experience solutions at Tata Consultancy Services, based in Mumbai, India. “Salesforce wants to be the one place where employees conduct all of their work-related activities. It started with Chatter but has grown with Quip.”
To help companies with development of Salesforce applications, Trailhead has also been expanded into myTrailhead. The move allows customers to create custom learning pages with their own content and branding to assist with onboarding and company-specific skills.
More information about these and other upcoming features will be released throughout the week at Dreamforce. Be sure to check back with SearchSalesforce for updates.
Though Microsoft’s focus of late has been on advanced mobile, cloud and application development products, a recent debut of a remote server management tool shows the company has not forgotten to fortify the Windows Server admin’s tool chest.
Microsoft last month released a technical preview of Project Honolulu, a free browser-based utility aimed at small and medium-sized businesses. The utility combines tools that manage and monitor Microsoft systems in one interface.
Project Honolulu offers flexibility to IT pros on the go, as well as IT pros who work on non-Windows devices. An admin on an iPhone or a Linux desktop can log into Project Honolulu running on a gateway server to check on a Windows Server system if an urgent help desk ticket arrives.
Smaller organizations that see Microsoft embrace various open source projects and steer admins to PowerShell command-line management might think the company has moved away from the familiar GUI. Some features in Windows Server 2016, such as Storage Spaces Direct, require an admin with PowerShell expertise to perform certain tasks. But not every IT shop works with PowerShell on a regular basis, and a cmdlet can’t visualize the performance of a VM to help pinpoint a slowdown.
“I think customers were telling Microsoft they needed something to help them and were saying, ‘If you’re going to take away our GUI, then why don’t we just go to Linux anyway?'” said Jim Gaynor, research analyst at Directions on Microsoft in Kirkland, Wash.
Microsoft says Project Honolulu is ideal for managing a single rack of servers that run Windows Server 2012 and newer. Honolulu also connects to Windows Server VMs that run in public clouds, such as Amazon Web Services.
Although Microsoft will support the Windows Server 2008 and 2008 R2 operating systems until 2020, Honolulu cannot manage these OSes, because its functionality relies on PowerShell cmdlets that came after those releases.
“Anywhere there’s a Windows Server running, you can use Honolulu to manage it,” said Jeff Woolsey, principal program manager for Windows Server at Microsoft. The official release for Honolulu will be sometime in 2018.
Richard Hooper, an IT project analyst with Utilitywise, a U.K.-based energy cost management firm, said Project Honolulu’s unified interface helps him work with his infrastructure of just over 20 physical servers and more than 200 VMs. He uses PowerShell for a number of daily tasks, but appreciates the efficiency of Honolulu’s streamlined layout, which makes it easy to dive into a VM to make a configuration adjustment or troubleshoot a problem.
“You just need one tool instead of the 20-odd tools I used before,” Hooper said. “Before, to manage Hyper-V, you would have to open Hyper-V Manager. But then, to manage the cluster, you needed to open Failover Cluster Manager, as well.”
Why admins need new server management tools today
Windows Server is more than 20 years old and has accumulated new features with each release. Roles such as Active Directory, file services and Hyper-V are not enabled by default, and each requires its own snap-in for the Microsoft Management Console. Out of this necessity, there are now dozens of tools administrators use to add or change roles, depending on the server’s purpose.
Then, there’s server maintenance. If a machine underperforms, the admin might go into the Task Manager to see the CPU usage or run the Performance Monitor to gather more data. If an issue recurs, the admin checks the logs in the Event Viewer for further insights. This juggle between multiple tools becomes second nature for many administrators, but it’s time-consuming, as well as a distraction. Project Honolulu attempts to bring order to the chaos.
“[With Honolulu,] I can go to the registry. I can add roles. I can add features. I can uninstall things. I can configure services. I can configure storage VMs. All of this is happening within the context of a web browser, so no more switching between 10 different tools,” Microsoft’s Woolsey said.
The groundwork for Project Honolulu came with the release of Windows Server 2012, which included a significant increase in the PowerShell cmdlets used to manage Windows systems. Microsoft added more than 2,000 cmdlets in Windows Server 2012 and another 1,000 cmdlets in 2012 R2, according to Woolsey.
“What that did was make Windows Server a completely automatable, PowerShell-able platform for just about everything you need,” he said.
Project Honolulu first appeared in early 2016, when it was known as Server Management Tools (SMT), a free Azure-based service to manage servers remotely through a web browser. While the tool was appreciated by some, others disliked the need to connect to Microsoft’s cloud to manage on-premises machines.
“When you are troubleshooting, you need some tools where you can actually get on the servers without a network connection,” Gaynor said. “SMT is useless then.”
Microsoft’s remote server management tool consolidates a number of admin tools in a web browser interface.
Microsoft incorporated that feedback and removed the Azure requirement with Project Honolulu, which features new versions of Server Manager, Failover Cluster Manager and Hyper-Converged Cluster Manager. Microsoft plans to publicly release a software development kit, so vendors and users can construct their own plug-ins to extend the tool’s functionality. And because Honolulu uses PowerShell cmdlets to execute tasks, there is the potential for it to manage other platforms, such as Linux, now that Microsoft is developing PowerShell Core — the open source, cross-platform version of Windows PowerShell.
The Honolulu team expects to deliver product updates every four to six weeks and takes feature requests at the product’s user voice forum.
Tracking Project Honolulu
While it covers many important server-based tasks, Project Honolulu does not completely replace Remote Server Administration Tools (RSAT), the standard suite of utilities admins use for remote server management. One shortcoming of RSAT is that it must be updated with every new release of the Windows client. As an HTML5-based application, Project Honolulu could mark the beginning of the end for RSAT.
“Microsoft will have to bring Honolulu up to parity with RSAT,” Gaynor said. “Microsoft needs to kill RSAT, because there’s no way that you can have this compiled set of desktop tools that keeps up with the cadence of Windows 10.”
Gaynor said he also sees Honolulu as a catalyst for companies that are interested in the Windows Server Semi-Annual Channel releases, which are geared toward organizations that want the latest app development features.
“You’ll see Microsoft prioritize support for any showcase feature that comes out in a future Semi-Annual Channel release. … If it’s something they want to promote to improve adoption, it’s going to be in Honolulu,” Gaynor said.
Microsoft has been retooling its collaboration roadmap to focus on Microsoft Teams, the cloud and interoperability. As a result, many of the vendor’s partners are capitalizing on Microsoft’s roadmap with new integrations and services.
At Microsoft Ignite, several Microsoft partners unveiled Skype for Business and Microsoft Teams integrations centered on video conferencing, with an emphasis on interoperability and user experience.
Polycom said its video and audio portfolio will integrate with Microsoft Teams. The Microsoft Teams integrations will allow users and administrators to have the Teams user interface, workflow and functionality on Polycom devices.
The integration extends to Polycom’s RealPresence Group Series, a Microsoft-certified standards-based group video conferencing system, the RealConnect interoperability service, and the Polycom Trio conferencing phones.
At Ignite, Polycom also announced RealConnect Hybrid, which connects on-premises Skype for Business users with existing video devices. When users are ready to move to Office 365 and Skype for Business Online, they can update their subscription to RealConnect. The hybrid interoperability service will be available in October.
The Polycom MSR Series, its next-generation Skype for Business room system, is also available for pre-order. The MSR Series includes a Surface Pro tablet with the Skype for Business interface, an MSR dock to connect to existing room displays and peripherals, the Polycom Trio 8500/8800 speakerphone and an EagleEye camera for medium-to-large meeting rooms.
Pexip develops Microsoft Teams video interoperability
Pexip said it will develop and deliver standards-based video conferencing interoperability with Microsoft Teams, which will allow traditional video conferencing users to join Microsoft Teams video calls and meetings.
Pexip offers a similar service for Skype for Business in its Infinity Fusion gateway. Pexip’s Microsoft Teams integration will use the platform to provide video, audio and content sharing capabilities between Microsoft Teams users and video conferencing users. Organizations can extend Microsoft Teams video meetings to legacy video meeting room services.
The gateway service will offer a native user experience for both Teams users and legacy video conferencing users. Video conferencing systems joining a Microsoft Teams meeting can be managed like a Teams participant, while users on the video conferencing system will have the standard video conferencing experience.
The Infinity Fusion gateway for Microsoft Teams is currently in development. It is designed for any size organization and can accommodate any number of users and simultaneous meetings.
Videxio offers Skype for Business conferencing gateway
Videxio introduced a video conferencing gateway service within the Microsoft Azure cloud that enables users on dedicated video conferencing devices to join Skype for Business meetings.
The company said the gateway service addresses interoperability challenges between Skype for Business and video conferencing devices from vendors such as Cisco and Huawei. The service allows video conferencing and Skype for Business users to hold meetings with their respective native user experience.
Tom-Erik Lia, Videxio CEO, said in a statement that the number of Skype for Business users connecting with third-party video systems on the Videxio service has increased significantly over the past two years, indicating a growing need for interoperability.
The service was created in partnership with Pexip’s Skype for Business Server-certified gateway and will be deployed in Microsoft Azure. The gateway service will work with Skype for Business Online, as well as on-premises and hybrid Skype for Business deployments. It will be available in the fourth quarter for video conferencing systems hosted in Videxio’s cloud.
LAS VEGAS — As VMware intensifies its focus on security and support for multiple clouds, its NSX networking software continues to grow in importance, and it now underpins many of the company’s upcoming initiatives.
At VMworld here this week, Pat Gelsinger, VMware CEO, said NSX will serve a number of key functions to tie together multiple clouds from VMware, Amazon Web Services (AWS) and its network of business partners, as well as segment out selected capabilities of monolithic applications to share across multiple clouds.
“With micro-segmentation, we can extend out NSX to serve as the connective tissue to IoT devices,” Gelsinger told TechTarget after his keynote. “There are a ton of IoT devices out there responsible for collecting, storing and sharing critical information that are unprotected and this is a way to provide them with more security.
“It will be the secret sauce behind all of what we do — it is that important,” he said.
Tom Hull, CTO with the Moffitt Cancer Center, agrees with that assessment about the strategic importance of NSX networking. Moffitt is updating its disaster recovery (DR) architecture built around proprietary server hardware and software, and has gravitated toward a software-defined networking (SDN) approach, where NSX will play a key role.
Moffitt’s largest use of NSX will be to gain more freedom for the center’s research domain without affecting the security of its clinical information. “Instead of a monolithic DR in a remote location that gets spun up once a year and tested for audit purposes, we can bring it into our active environment so DR looks more like business continuity,” Hull said.
VMware underscored its commitment to NSX with expanded VMware NSX support for networking and security for both clouds and cloud-native applications. The new support is intended to help NSX administrators manage and troubleshoot larger-scale NSX deployments.
The company also introduced VMware NSX Cloud, a service designed to offer more consistent networking and security for applications running in multiple private and public clouds, through a single management console and common API. The new offering is supposed to simplify and help scale operations, improve standardization and compliance and lower OPEX for applications that run in public clouds. A micro-segmentation security policy can be defined just once and applied to workloads that run across multiple clouds, according to company officials.
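The "define a micro-segmentation policy once, apply it to workloads in any cloud" model described above can be sketched in a few lines of Python. This is purely illustrative; the class and field names below are invented for the sketch and are not the NSX Cloud API.

```python
# Illustrative sketch of define-once, apply-everywhere policy management.
# Names (SegmentationPolicy, Workload, apply_policy) are hypothetical.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class SegmentationPolicy:
    name: str
    allowed_ports: frozenset    # ports the workload may accept traffic on
    allowed_sources: frozenset  # groups allowed to connect to the workload

@dataclass
class Workload:
    name: str
    cloud: str                  # e.g. "aws", "azure", "on-prem"
    policies: list = field(default_factory=list)

def apply_policy(policy, workloads):
    """Attach the same policy object to every workload, regardless of cloud."""
    for w in workloads:
        w.policies.append(policy)
    return workloads

# One policy definition...
web_policy = SegmentationPolicy("web-tier", frozenset({443}), frozenset({"lb-group"}))

# ...applied uniformly to workloads running in three different environments.
fleet = [Workload("web-1", "aws"), Workload("web-2", "azure"), Workload("web-3", "on-prem")]
apply_policy(web_policy, fleet)

assert all(w.policies == [web_policy] for w in fleet)
```

The point of the sketch is the single source of truth: because every workload references one shared policy object, there is no per-cloud copy to drift out of compliance.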
The State of Louisiana’s Division of Administration/OTS plans to use the VMware Cloud on AWS, released this week, to better leverage NSX to extend to the public cloud across a common operating environment, said Michael Allison, the division’s CTO. This will give his organization “public cloud agility and economics” with a more proven virtualized infrastructure, he said.
One systems engineer with a large telecommunications company in Hayward, Calif., is evaluating NSX, but he said he is concerned about the costs to replace his older proprietary server hardware, as well as the older remote devices that would be connected to NSX.
“[NSX] is good technology, but to take full advantage of it, I’d have to replace many of the local servers and upgrade the quality of my network and the remote devices we use to collect data and monitor traffic,” he said.
Startup E8 Storage has sharpened the focus of its nonvolatile memory express all-flash arrays, adding support for parallel file systems in a bid to boost scalability and shared flash storage.
The upgrade allows users to scale capacity beyond a single appliance by allowing host machines to access multiple E8 Storage appliances. The enhanced E8 Storage software supports shared writable volumes, which the vendor claims allow 96 clustered hosts to read and write to the same volume in parallel at line speed. That feature is geared initially to organizations running IBM Spectrum Scale — formerly IBM General Parallel File System — and Oracle Real Application Cluster (RAC) environments, although shared flash has implications for any parallel file system used in technical computing.
The vendor this week also previewed E8-X24 block arrays at the Flash Memory Summit in Santa Clara, Calif. The X-24 is a companion to the flagship E8 Storage D-24 rack-scale flash system that it launched last year. The X-24 will allow customers to mix and match NAND flash and storage-class memory in the same box. E8 Storage said X-24 proofs of concept are underway at cloud providers, financial services and travel industry firms. The X-24 array is expected to be generally available in the fourth quarter.
“The focus of this release is to increase the agility of our system for application acceleration. We’re supporting more parallel file architectures to help customers get the most processing power and move away from serial access to data,” said Julie Herd, director of technical marketing for E8 Storage.
Shared writable volumes connect multiple hosts to back end
The nonvolatile memory express (NVMe) host controller interface is designed to speed data transfer between host systems and flash media. The NVMe protocol transmits the packets across the PCI Express interconnect, bypassing the traditional network hops between networking components.
The E8 Storage shared flash block system uses dual-ported server and rack hardware from OEM AIC Inc. It supports 24 drives of 7.68 TB each, scaling to 140 TB of usable flash per rack. Drives connect via a Remote Direct Memory Access over Converged Ethernet high-performance fabric. E8 client software handles dynamic LUN provisioning, RAID 6 schemes and thin provisioning.
Although the concept of sharing a volume isn’t a new idea, supporting it with block storage is a challenge. It requires vendors to enable software capabilities in the storage layer, particularly a locking mechanism to allow clustered servers to simultaneously read and write results to the same volume, without interfering with one another.
In its rack-scale deployment, each server sees E8 Storage servers as local block storage. A parallel file system writes data to those servers at the host level. The E8 agent responds to lock calls to prevent data collisions, as multiple hosts attempt to access the volume in real time.
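The locking idea behind shared writable volumes can be illustrated with a minimal sketch: a lock manager grants per-block locks so that many clustered hosts can write to one volume without colliding. All names here are hypothetical, assumed for the example; this is not E8's actual agent implementation.

```python
# Minimal sketch of per-block locking for a shared writable volume.
# VolumeLockManager and SharedVolume are illustrative names, not a real API.
import threading

class VolumeLockManager:
    """Hands out one lock per block, so concurrent writers to the same
    block are serialized while writers to different blocks proceed freely."""
    def __init__(self):
        self._guard = threading.Lock()
        self._block_locks = {}   # block number -> threading.Lock

    def lock_for(self, block: int) -> threading.Lock:
        with self._guard:
            return self._block_locks.setdefault(block, threading.Lock())

class SharedVolume:
    def __init__(self, manager: VolumeLockManager):
        self._mgr = manager
        self.blocks = {}         # block number -> last value written

    def write(self, host: str, block: int, value: str):
        # The "lock call": a host must hold the block's lock before writing,
        # which prevents two hosts from interleaving writes to the same block.
        with self._mgr.lock_for(block):
            self.blocks[block] = f"{host}:{value}"

mgr = VolumeLockManager()
vol = SharedVolume(mgr)

# Simulate 96 clustered hosts contending over four shared blocks.
threads = [threading.Thread(target=vol.write, args=(f"host{i}", i % 4, "data"))
           for i in range(96)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Each contended block holds exactly one clean, uncorrupted write.
assert len(vol.blocks) == 4
```

A real block-storage implementation also has to handle lock ownership across node failures and distributed lock state, which this single-process sketch deliberately omits.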
“This was one of the early-on requests we had from customers: the ability to have read and write access to shared flash. We’ve had it in test with IBM Spectrum Scale for a couple months. Now, we’re ready to launch,” Herd said.
Eric Burgener, a storage analyst with IT firm IDC, said E8 Storage offers a potential alternative to the Oracle Exadata in-memory product that supports large Oracle RAC deployments, which require underlying high-performance storage. Oracle does not have an end-to-end NVMe implementation for Exadata.
“For a company the size of E8 Storage, selling even 10 systems in a year into Oracle RAC environments would be a pretty big deal. They have a better performance than Oracle Exadata and cost about one-third less. Now is the time for E8 to get into those environments that will be looking to refresh every quarter,” Burgener said.
Other potential use cases for E8 to pursue involve parallel file-system-based technical computing for big data, fraud detection, life sciences, online transaction processing and seismic processing, Burgener said.
Choose between flash and SCM, with dedicated RAID
Herd said E8 Storage is testing the forthcoming X-24 array with Intel’s Optane-based storage-class memory SSDs. The Optane drives provide a persistent memory cache designed to mimic the performance of an in-memory database.
Rather than having an in-memory cluster access storage across a network, E8 said its architecture provides better scalability by eliminating the need for dedicated storage inside the servers. Dedicated network links ensure each tier of storage gets sufficient bandwidth.
One feature lacking is dynamic tiering between shared flash and storage-class memory. Herd said E8 Storage customers will have to determine which database apps require in-memory-like performance.
The upgrade allows hosts to access multiple E8 Storage appliances. Initially, customers could connect 96 host servers to a single appliance. The new configuration allows NAND flash and Intel Optane SSDs to be shared across D-24 and X-24 arrays. Instead of one large RAID configuration, customers can create multiple smaller RAID groups and dedicate each to a specific cluster.
E8 Storage is among a handful of startup vendors trying to sell fast, scalable shared flash storage built on off-the-shelf NVMe drives. Other entrants include Apeiron Data Systems and software-defined Excelero. Two other hopefuls, Pavilion Data Systems and Vexata, have yet to formally unveil their storage gear.