Tag Archives: market

Adobe acquisition of Marketo could shake up industry

The potential Adobe acquisition of Marketo could unsettle the customer experience software market and give Adobe, which is mainly known for its B2C products, a substantial network of B2B customers from Marketo.

Adobe is in negotiations to acquire marketing automation company Marketo, according to reports.

“It’s a trend that B2B customers are trying to become more consumer-based organizations,” said Sheryl Kingstone, research director for 451 Research. “Marketo is maybe throwing in the towel in being a lead marketing vendor on its own.”

But, reportedly, talks between Adobe and Marketo’s holding company may not lead to a deal.

Ray Wang, founder of Constellation Research, said the leaks could be coming from Vista Equity Partners Management, the firm that bought Marketo in 2016 and took it private, and could be intended to draw another bidder into the race for Marketo.

“If people think Adobe would buy Marketo, maybe it would get SAP to think about it,” Wang said. “The question is, who needs marketing automation or email marketing? And when you think about the better fit at this moment, it’s SAP.”

Adobe declined to comment, saying it does not comment on acquisition rumors or speculation.

Adobe expanding to B2B

Marketo said it had roughly 4,600 customers when it was acquired by Vista Equity. It’s unclear whether Adobe and Marketo have much overlap between customer bases, but there could be product overlap between the software vendors.

Marketo is maybe throwing in the towel in being a lead marketing vendor on its own.
Sheryl Kingstone, research director, 451 Research

Adobe has its Marketing Cloud system, and both vendors offer basic martech features, like lead scoring, lead segmentation, web tracking, SMS marketing, personalized web content and predictive analytics. But an Adobe acquisition of Marketo would allow Adobe to expand into a wider B2B market, while allowing Marketo to offer its users the ability to market more like a B2C vendor using Adobe’s expertise.

“It’s a huge benefit for Marketo when you look at Adobe,” Kingstone said.

“Marketo has struggled in a B2B sense when its customers try to implement an ABM [account-based marketing] strategy,” she said.

Despite any potential overlap with its own products’ marketing capabilities, Adobe could find the chance to break into a pool of nearly 5,000 B2B customers compelling.

“There’s a lot of value in Marketo, and Adobe has been gun shy about entering B2B,” Wang said.

Adobe’s alliance

If the Adobe acquisition reports prove accurate, a deal would cap what has already been a busy year for the vendor. In May, Adobe acquired commerce platform Magento for a reported $1.7 billion.

A Reuters report on the potential deal said the price would likely well exceed the $1.8 billion Vista paid when it took Marketo private.

Over the past few years, industry-leading companies in the CRM and customer experience spaces have sought to build alliances with other vendors.

Adobe and Microsoft have built a substantial partnership and have even gone to market together with products, while Salesforce and Google unveiled their partnership and product integrations last year at Salesforce’s annual Dreamforce conference.

Marketo has been one of the few major martech vendors without an alliance. Combining its technologies with Adobe’s creative suite and potentially Microsoft’s B2B breadth could make a significant imprint on the industry.

“If this is real, then it means Adobe has gotten serious about B2B,” Wang said.

Editor’s note: TechTarget offers ABM and project intelligence data and tools services.

Microsoft Ignite 2018 conference coverage

Introduction

Microsoft continues to gain market momentum, fueled in part by an internal culture shift and the growing popularity of the Azure cloud platform that powers the company’s Office 365 product.

When CEO Satya Nadella took the helm in 2014, he made a concerted effort to turn the company away from its proprietary background to win over developers and enterprises with cloud and DevOps ambitions.

To reinforce this new agenda, Microsoft acquired GitHub, the popular software development platform, for $7.5 billion in June and expanded its developer-friendly offerings in Azure — from Kubernetes management to a Linux-based distribution for use with IoT devices. But many in IT have long memories and don’t easily forget the company’s blunders, which can wipe away any measure of good faith at a moment’s notice.

PowerShell, the popular automation tool, continues to experience growing pains after Microsoft converted it to an open source project that runs on Linux and macOS systems. As Linux workloads on Azure continue to climb — around 40% of Azure’s VMs run on Linux according to some reports — and Microsoft releases Linux versions of on-premises software, PowerShell Core is one way Microsoft is addressing the needs of companies with mixed OS environments.

While this past year solidified Microsoft’s place in the cloud and open source arenas, Nadella wants the company to remain on the cutting edge and incorporate AI into every aspect of the business. The steady income from Azure and Office 365, which has more than 135 million users, along with the company’s digital transformation agenda, has paid off so far. So what’s in store for 2019?

This Microsoft Ignite 2018 guide gives you a look at the company’s tactics over the past year along with news from the show to help IT pros and administrators prepare for what’s coming next on the Microsoft roadmap. 

1. Latest news on Microsoft

Recent news on Microsoft’s product and service developments

Stay current on Microsoft’s new products and updated offerings before and during the Microsoft Ignite 2018 show.

2. A closer look

Analyzing Microsoft’s moves in 2018

Take a deeper dive into Microsoft’s developments with machine learning, DevOps and the cloud with these articles.

3. Glossary

Definitions related to Microsoft products and technologies

Wanted – 1080ti – prefer EVGA

Hi everyone,

As per the title.

I’m in the market for a GeForce 1080 Ti graphics card. Original receipt and warranty. Not too fussed if it’s been mined on, so long as it’s not overclocked or de-lidded. Prefer EVGA as the warranty is transferable.

Let me know what you have.

Location: Nottingham

No-code and low-code tools seek ways to stand out in a crowd

As market demand for enterprise application developers continues to surge, no-code and low-code vendors seek ways to stand out from one another in an effort to lure professional and citizen developers.

For instance, last week’s Spark release of Skuid’s eponymous drag-and-drop application creation system adds on-premises, private data integration, a new Design System Studio, and new core components for tasks such as creation of buttons, forms, charts and tables.

A suite of prebuilt application templates aims to help users build and customize bespoke applications, such as salesforce automation, recruitment and applicant tracking, HR management and online learning.

And a native mobile capability enables developers to take the apps they’ve built with Skuid and deploy them on mobile devices with native functionality for iOS and Android.

“We’re seeing a lot of folks who started in other low-code/no-code platforms move toward Skuid because of the flexibility and the ability to use it in more than one type of platform,” said Ray Wang, an analyst at Constellation Research in San Francisco.

“People want to be able to get to templates, reuse templates and modify templates to enable them to move very quickly.”

Skuid — named for an acronym, Scalable Kit for User Interface Design — was originally an education software provider, but users’ requests to customize the software for individual workflows led to a drag-and-drop interface to configure applications. That became the Skuid platform and the company pivoted to no-code, said Mike Duensing, CTO of Skuid in Chattanooga, Tenn.

Quick Base adds Kanban reports

Quick Base Inc., in Cambridge, Mass., recently added support for Kanban reports to its no-code platform. Kanban is a scheduling system for lean and just-in-time manufacturing. The system also provides a framework for Agile development practices, so software teams can visually track and balance project demands with available capacity and ease system-level bottlenecks.

The Quick Base Kanban reports enable development teams to see where work is in process. They also let end users interact with their work and update its status, said Mark Field, Quick Base director of products.

Users drag and drop progress cards between columns to indicate how much work has been completed on software delivery tasks to date. This lets them track project tasks through stages or priority, opportunities through sales stages, application features through development stages, team members and their task assignments and more, Field said.
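
The mechanics Field describes come down to a simple data model: a board holds stage columns, and moving a card to another column is what updates its status. Below is a minimal Python sketch of that idea, using the four stages a Quick Base customer describes later in this article as sample data; it illustrates the general Kanban concept only and is not Quick Base’s implementation or API.

```python
from dataclasses import dataclass, field

# Minimal, hypothetical model of a Kanban board: moving a card to a
# new column is what updates the card's stage -- no separate status edit.

@dataclass
class Card:
    title: str
    stage: str

@dataclass
class KanbanBoard:
    stages: list
    cards: list = field(default_factory=list)

    def add_card(self, title: str, stage: str) -> Card:
        if stage not in self.stages:
            raise ValueError(f"Unknown stage: {stage}")
        card = Card(title, stage)
        self.cards.append(card)
        return card

    def move_card(self, card: Card, new_stage: str) -> None:
        # Dragging a card to another column simply reassigns its stage.
        if new_stage not in self.stages:
            raise ValueError(f"Unknown stage: {new_stage}")
        card.stage = new_stage

    def column(self, stage: str) -> list:
        return [c for c in self.cards if c.stage == stage]


if __name__ == "__main__":
    board = KanbanBoard(stages=["plan", "execute", "complete", "invoice"])
    rollout = board.add_card("Site 42 hardware rollout", "plan")  # hypothetical task
    board.move_card(rollout, "execute")
    print([c.title for c in board.column("execute")])  # ['Site 42 hardware rollout']
```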

Datatrend Technologies, an IT services provider in Minnetonka, Minn., uses Quick Base to build the apps that manage technology rollouts for its customers, and finds the Kanban reports handy.

A lot of low-code/no-code platforms allow you to get on and build an app but then if you want to take it further, you’ll see users wanting to move to something else.
Ray Wang, analyst, Constellation Research

“Quick Base manages that whole process from intake to invoicing, where we interface with our ERP system,” said Darla Nutter, senior solutions architect at Datatrend.

Previously, Datatrend kept work-in-progress data for its four stages (plan, execute, complete and invoice) in a table report with no visual representation; with the Kanban reports, users can see what they have to do at any given stage and prioritize work accordingly, she said.

“You can drag and drop tasks to different columns and it automatically updates the stage for you,” she said.

Like the Quick Base no-code platform, the Kanban reports require no coding or programming experience. Datatrend’s typical Quick Base users are project managers and business analysts, Nutter said.

For most companies, however, the issue with no-code and low-code systems is how fast users can learn them and then build on them, Constellation Research’s Wang said.

“A lot of low-code/no-code platforms allow you to get on and build an app but then if you want to take it further, you’ll see users wanting to move to something else,” Wang said.

OutSystems sees AI as the future

OutSystems plans to add advanced artificial intelligence features to its products to increase developer productivity, said Mike Hughes, director of product marketing at OutSystems in Boston.

“We think AI can help us by suggesting next steps and anticipating what developers will be doing next as they build applications,” Hughes said.

OutSystems uses AI in its own tool set, as well as links to publicly available AI services, to help organizations build AI-based products. To facilitate this, the company launched Project Turing, named after Alan Turing, who is considered the father of AI, and opened an AI Center of Excellence in Lisbon, Portugal.

The company also will commit 20% of its R&D budget to AI research and partner with industry leaders and universities for research in AI and machine learning.

Panasas storage roadmap includes route to software-defined

Panasas is easy to overlook in the scale-out NAS market. The company’s products don’t carry the name recognition of Dell EMC Isilon, NetApp NAS filers and IBM Spectrum Scale. But CEO Faye Pairman said her team is content to fly below the radar — for now — concentrating mostly on high-performance computing, or HPC.

The Panasas storage flagship is the ActiveStor hybrid array with the PanFS parallel file system. The modular architecture scales performance in a linear fashion, as additional capacity is added to the system. “The bigger our solution gets, the faster we go,” Pairman said.
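
To make the linear-scaling claim concrete, the short sketch below contrasts a system whose aggregate throughput grows in proportion to node count with one whose per-node contribution erodes as the cluster grows, which is the "sluggish at scale" behavior Pairman contrasts herself with later in the interview. All figures are invented for illustration and are not Panasas benchmarks.

```python
# Hypothetical illustration of linear vs. degrading throughput scaling.
# per_node_gbps and the overhead factor are made-up numbers.

def linear_throughput(nodes: int, per_node_gbps: float) -> float:
    # Aggregate throughput grows in proportion to node count.
    return nodes * per_node_gbps

def contended_throughput(nodes: int, per_node_gbps: float, overhead: float = 0.02) -> float:
    # Each added node loses a fraction of its contribution to coordination overhead.
    return sum(per_node_gbps * max(0.0, 1.0 - overhead * n) for n in range(nodes))

for n in (4, 8, 16, 32):
    print(n, round(linear_throughput(n, 10.0), 1), round(contended_throughput(n, 10.0), 1))
```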

Panasas founder Garth Gibson launched the object-based storage architecture in 2000. Gibson, a computer science professor at Carnegie Mellon University in Pittsburgh, was one of the developers of the RAID storage taxonomy. He serves as Panasas’ chief scientist.

Panasas has gone through many changes over the past several years, marked by varying degrees of success to broaden into mainstream commercial NAS. That was Pairman’s charter when she took over as CEO in 2010. Key executives left in a 2016 management shuffle, and while investors have provided $155 million to Panasas since its inception, the last reported funding was a $52.5 million venture round in 2013.

As a private company, Panasas does not disclose its revenue, but “we don’t have the freedom to hemorrhage cash,” Pairman said.

We caught up with Pairman recently to discuss Panasas’ growth strategy, which could include offering a software-only license option for PanFS. She also addressed how the vendor is moving to make its software portable and why Panasas isn’t jumping on the object-storage bandwagon.

Panasas storage initially aimed for the high end of the HPC market. You were hired to increase Panasas’ presence in the commercial enterprise space. How have you been executing on that strategy?

Faye Pairman: It required looking at our parallel file system and making it more commercially ready, with features added to improve stability and make it more usable and reliable. We’ve been on that track until very recently.

We have an awesome file system that is very targeted at the midrange commercial HPC market. We sell our product as a fully integrated appliance, so our next major objective — and we announced some of this already — is to disaggregate the file system from the hardware. The reason we did that is to take advantage of commodity hardware choices on the market.

Once the file system is what we call ‘portable,’ meaning you can run it on any hardware, there will be a lot of new opportunity for us. That’s what you’ll be hearing from us in the next six months.

Would Panasas storage benefit by introducing an object storage platform, even as an archive device?

Pairman: You know, this is a question we’ve struggled with over the years. Our customers would like us to service the whole market. [Object storage] would be a very different financial profile than the markets we serve. As a small company, right now, it’s not a focus for us.

We differentiate in terms of performance and scale. Normally, what you see in scale-out NAS is that the bigger it gets, the more sluggish it tends to be. We have linear scalability, so the bigger our solution gets, the faster we go.

That’s critically important to the segments we serve. It’s different from object storage, which is all about being simple and the ability to get bigger and bigger. And performance is not a consideration.

Which vendors do you commonly face off with in deals? 

Pairman: Our primary competitor is IBM Spectrum Scale, with a file system and approach that is probably the most similar to our own and a very clear target on commercial HPC. We also run into Isilon, which plays more to commercial — meaning high reads, high usability features, but [decreased] performance at scale.

And then, at the very high end, we see DataDirect Networks (DDN) with a Lustre file system for all-out performance, but very little consideration for usability and manageability.

The niche is in the niche. We target very specific markets and very specific workloads.
Faye Pairman, CEO, Panasas

Which industry verticals are prominent users of Panasas storage architecture? Are you a niche within the niche of HPC?

Pairman: The niche is in the niche. We target very specific markets and very specific workloads. We serve all kinds of application environments, where we manage very large numbers of users and very large numbers of files.

Our target markets are manufacturing, which is a real sweet spot, as well as life sciences and media and entertainment. We also have a big practice in oil and gas exploration and all kinds of scientific applications, and even some manufacturing applications within the federal government.

Panasas storage is a hybrid system, and we manage a combination of disk and flash. With every use case, while we specialize in managing very large files, we also have the ability to manage the file size that a company does on flash.

What impact could DDN’s acquisition of open source Lustre exert on the scale-out sector, in general, and Panasas in particular?

Pairman: I think it’s a potential market-changer and might benefit us, which is why we’re keeping a close eye on where Lustre ends up. We don’t compete directly with Lustre, which is more at the high end.

Until now, Lustre always sat in pretty neutral hands. It was in a peaceful place with Intel and Seagate, but they both exited the Lustre business, and Lustre ended up in DDN’s hands. It remains to be seen what that portends. But there is a long list of vendors that depend on Lustre remaining neutral, and now it’s in the hands of the most aggressive competitor in that space.

What happens to Lustre is less relevant to us if it stays the same. If it falters, we think we have an opportunity to move into that space. It’s potentially a big shakeup that could benefit vendors like us who build a proprietary file system.

HiveIO seeks to create buzz in HCI market

Newcomer HiveIO Inc. is trying to make it in the already crowded hyper-converged infrastructure market by touting a software-only application that it claims uses AI for resource management.

HiveIO this week released Hive Fabric 7.0, its hyper-converged application. The vendor, based in Hoboken, N.J., has actually been around since 2015 and shipped its first version of Hive Fabric that same year, but has kept a low profile until now. HiveIO’s co-founders Kevin McNamara and Ofer Bezalel came out of JP Morgan Chase’s engineering team. HiveIO CTO McNamara said the goal was to create an infrastructure that consisted of one platform, was simple to use and was inexpensive.

“They thought about a single product, single vendor, hyper-converged fabric out of the box that just deploys and just works and reduces the complexity of the data center,” said HiveIO CEO Dan Newton, who joined HiveIO last April from Rackspace. “Our team comes from an operational background, and we’re focused on making our product operationally very easy, yet very stable. We try to make the technology work for the customers. We don’t want the customers to have to work to make it work.”

Newton said HiveIO has about 400 customers, including those it picked up by acquiring the assets of HCI software vendor Atlantis Computing in July 2017. HiveIO also inherited Atlantis’ OEM deal with Lenovo, which packaged Atlantis’ HCI software on its servers. However, HiveIO has no other hardware partnerships for Hive Fabric.

Newton said the goal is to provide HCI software that can deploy in 20 minutes on three nodes and requires little training to use.

We put the Message Bus into appliances and use machine learning to manage the appliances.
Kevin McNamara, CTO, HiveIO

HiveIO describes Hive Fabric as a “zero-layer, hardware-agnostic” hyper-converged platform that runs on any x86 server or in the cloud. Hive Fabric includes a free kernel-based virtual machine hypervisor, although it can also run with VMware and Microsoft hypervisors. Hive Fabric manages storage, compute, virtualization and networking across HCI clusters through its Message Bus. It includes a REST API and Universal Rest Interface to support third-party and customer applications.

McNamara called the artificial intelligence-driven Hive Fabric Message Bus “unique to the industry.” He said the Message Bus relies on AI and metadata to format data in real time and provide predictive analytics to prevent potential performance and capacity problems.

“It’s all integrated into the stack,” McNamara said. “We can see everything in the hardware, everything in the stack, everything in the guest server and everything in the application layers. We put the Message Bus into appliances and use machine learning to manage the appliances. You can move workloads across appliances.”

Newton added, “Every piece of data point all comes through the Message Bus.”
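
As a rough illustration of the pattern described here, the sketch below shows a toy message bus in which nodes publish capacity metrics and a subscriber fits a naive linear trend to flag when a storage pool will fill up. HiveIO has not published how its Message Bus or its AI works; the class names, topic, and forecasting logic below are hypothetical.

```python
from collections import defaultdict

# Toy publish/subscribe bus plus a capacity-trend subscriber.
# Purely illustrative; not HiveIO's Message Bus.

class MessageBus:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, message):
        for handler in self.subscribers[topic]:
            handler(message)

class CapacityWatcher:
    """Predicts days until a storage pool is full from recent usage samples."""
    def __init__(self, capacity_tb):
        self.capacity_tb = capacity_tb
        self.samples = []  # (day, used_tb)

    def on_metric(self, message):
        self.samples.append((message["day"], message["used_tb"]))
        if len(self.samples) >= 2:
            (d0, u0), (d1, u1) = self.samples[0], self.samples[-1]
            rate = (u1 - u0) / (d1 - d0)  # TB per day, naive linear trend
            if rate > 0:
                days_left = (self.capacity_tb - u1) / rate
                print(f"pool full in ~{days_left:.0f} days at current growth")

bus = MessageBus()
watcher = CapacityWatcher(capacity_tb=100)
bus.subscribe("storage.metrics", watcher.on_metric)
bus.publish("storage.metrics", {"day": 0, "used_tb": 40})
bus.publish("storage.metrics", {"day": 7, "used_tb": 47})
# -> pool full in ~53 days at current growth
```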

The new release simplifies resource management through a Cluster Resource Scheduler (CRS). The CRS uses AI to monitor resource allocation across the cluster and moves guest virtual machines between servers to improve operational efficiency. Hive Fabric 7.0 also allows customers to run multiple mixed-application workloads.
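
The rebalancing idea behind a cluster resource scheduler can be sketched with a simple greedy heuristic: repeatedly move a small VM from the busiest host to the least busy one until the load spread narrows. HiveIO describes its CRS as AI-driven, so the Python below only illustrates the general concept; all names, loads and thresholds are hypothetical.

```python
from dataclasses import dataclass, field

# Greedy rebalancing sketch; not HiveIO's CRS.

@dataclass
class Host:
    name: str
    vms: dict = field(default_factory=dict)  # VM name -> load (arbitrary units)

    @property
    def load(self) -> float:
        return sum(self.vms.values())

def rebalance(hosts, max_spread=10.0):
    """Return (vm, source, destination) migrations that narrow the load spread."""
    moves = []
    while True:
        busiest = max(hosts, key=lambda h: h.load)
        idlest = min(hosts, key=lambda h: h.load)
        gap = busiest.load - idlest.load
        if gap <= max_spread:
            return moves
        # Candidate: the smallest VM on the busiest host, to limit disruption.
        vm = min(busiest.vms, key=busiest.vms.get)
        if busiest.vms[vm] >= gap:
            return moves  # no single migration would improve the spread
        idlest.vms[vm] = busiest.vms.pop(vm)
        moves.append((vm, busiest.name, idlest.name))

if __name__ == "__main__":
    cluster = [
        Host("node-a", {"db1": 40, "web1": 15, "web2": 10}),
        Host("node-b", {"web3": 12}),
        Host("node-c", {}),
    ]
    for vm, src, dst in rebalance(cluster):
        print(f"migrate {vm}: {src} -> {dst}")
```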

HiveIO’s Hive Fabric 7 management dashboard.

Forrester Research senior analyst Naveen Chhabra said HiveIO will need to prove its AI capabilities to make it in an HCI field that includes at least 15 vendors.

“A number of companies already have proven technology — including Nutanix, Cisco, Dell EMC, VMware,” Chhabra said. “HiveIO can do the same, but they must deliver at least table stakes technology, and then find out what innovations they can come up with. They talk about the interconnect fabric with artificial intelligence. It’s a transport layer for sending bits and bytes from one node to another. What kind of artificial intelligence does it have? Is it artificial intelligence or just AI washing like you hear from other vendors? And they have to find a strong use case for that artificial intelligence, even if it’s just one use case.”

HiveIO executives claim their early customers’ workloads include general server virtualization, virtual desktops, databases, log analysis and test/dev.

Hive Fabric is sold as a monthly subscription based on the number of physical servers with no restrictions on memory, storage or cores.

HiveIO promises to support Atlantis Computing hyper-converged and virtual desktop infrastructure software through 2022. Newton said HiveIO will offer Atlantis customers an upgrade path to Hive Fabric. He said HiveIO hired some Atlantis employees but is not using its technology in Hive Fabric.

HiveIO has 30 employees in the U.S. and U.K. It has completed two funding rounds and lists El Dorado Ventures, Rally Ventures, Osage Venture Partners and Citrix as investors but does not disclose its total funding.

Recruiting platforms see large VC investments

Recruiting platforms have long led the HR venture capital market, and this year they seem to be attracting some big funding rounds.

Recruiting platform Greenhouse recently received $50 million in new funding, and Hired recently received $30 million. Earlier this year, Scout Exchange gained $100 million in funding. Applicant-tracking systems and recruiting platforms typically lead the HR market in venture capital funding, industry analysts have reported.

These platforms each approach the problem of recruiting in different ways, and their methods illustrate the complexity of filling jobs with high-quality candidates.

With its latest funding round, Greenhouse, a recruiting platform and applicant-tracking system provider, has now raised $110 million. Raising $100 million or more is not unusual for recruiting platforms.

Greenhouse believes recruiting is a companywide responsibility, and its platform is built with this approach in mind, said Daniel Chait, CEO of Greenhouse, based in New York. Recruiting “involves everyone in the company, every single day,” doing all kinds of things, such as interviewing and finding candidates, he said. The Greenhouse platform uses data to help consider candidates “in a fair and objective way,” and to ensure a good candidate experience, he said.

Employees don’t like interviewing job candidates

Preparing employees to take part in candidate interviews is an important aspect of Greenhouse’s platform, Chait said. It provides everyone conducting an interview with the available information on the candidate and helps users develop questions to ask the candidates.

“Employees generally don’t like doing interviews. They are stressful, and they don’t know what questions to ask,” he said.

Scout Exchange runs a marketplace recruiting platform that matches recruiters with job searches based on their expertise and “their actual track record of filling positions,” said Ken Lazarus, CEO of Scout, based in Boston.

Scout enables employers to tap into one or more recruiters with the best record for filling a particular type of job, Lazarus said.
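
Conceptually, that kind of matching can be as simple as ranking recruiters by their historical fill rate in a given job category. Scout Exchange has not published its matching algorithm, so the sketch below is purely illustrative, with made-up data and function names.

```python
from collections import defaultdict

# Illustrative ranking of recruiters by historical fill rate per job category.

def fill_rates(placements):
    """placements: iterable of (recruiter, category, filled: bool)."""
    attempts = defaultdict(int)
    fills = defaultdict(int)
    for recruiter, category, filled in placements:
        attempts[(recruiter, category)] += 1
        fills[(recruiter, category)] += int(filled)
    return {key: fills[key] / attempts[key] for key in attempts}

def top_recruiters(placements, category, limit=3):
    rates = fill_rates(placements)
    ranked = [(rec, rate) for (rec, cat), rate in rates.items() if cat == category]
    return sorted(ranked, key=lambda pair: pair[1], reverse=True)[:limit]

history = [  # hypothetical placement history
    ("Acme Search", "data engineer", True),
    ("Acme Search", "data engineer", False),
    ("Beacon Talent", "data engineer", True),
    ("Beacon Talent", "data engineer", True),
    ("Beacon Talent", "sales rep", False),
]
print(top_recruiters(history, "data engineer"))
# [('Beacon Talent', 1.0), ('Acme Search', 0.5)]
```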

Meanwhile, Hired has created a talent pool and the technology to help match candidates with employers. If an employer believes a candidate has the right skills, it sends an interview request to the candidate. The firm said it has raised more than $130 million to date.

Setting the right salary level

Knowing what to pay candidates helped to drive Salary.com’s just-announced acquisition of Compdata Surveys & Consulting.

Salary.com gets its compensation data from surveys purchased from other providers, as well as what it gathers in its own surveys. The acquisition of Compdata, which is predominantly a survey firm, gives Salary.com the platform, analytics and the data it needs, said Alys Scott, chief marketing officer at Salary.com, based in Waltham, Mass., although the firm will still buy some third-party surveys.

The low unemployment rate and the retirements of baby boomers are putting pressure on firms to have good compensation models, Scott said. The “No. 1 motivator” around retaining, attracting and engaging talent is compensation, she said.

VR in real estate has mainstream potential for IT resellers

Channel firms targeting the real estate market are likely to encounter growing customer interest in emerging VR and AR technology.

That’s according to a recent podcast from distributor Ingram Micro, which explored the benefits of AR and VR in real estate. Up to now, the technology has mostly been used experimentally in high-end real estate, such as virtual walkthroughs of New York luxury lofts or West Coast mansions. But as the cost of the hardware decreases, channel partners can expect to see VR and AR technology move downstream.

“I would say that [VR in real estate] hasn’t trickled all the way down yet, and that’s mainly because of the cost of the hardware associated” with it, said Sam Alt, technical support specialist at Ingram Micro, in the podcast. Hardware would include VR headsets and 3D camera equipment.

The benefits of VR in real estate are clear, Alt said. Agents could use VR to perform numerous house tours from one location rather than have to drive with their clients to physically tour the locations. “You could go to one location and you could view multiple houses in an afternoon versus only a few,” he said. While house buyers would eventually want to visit a prospective real estate purchase in person, VR could help them weed through the options.

Alt also pointed to a role for augmented reality. Architectural firms could use AR to walk clients through model homes and, using an AR helmet, “swipe through what types of kitchens they could provide,” he said. “I think that’s a really easy way to … get a person who’s looking to … build a brand-new home really, really excited and be able to showcase that the end result is going to look exactly like … [what you can see] in this AR helmet, versus what it would look like on a piece of paper.”

“I think that VR and AR really do this market justice because it just brings in an entire new level of detail to what [firms] previously could provide,” he added.

CompTIA seeks tech stories

In an effort to encourage young people to enter the IT industry, CompTIA has launched a #MyTechStory initiative, in which current industry personnel tell the story of how they got started in technology.

Todd Thibodeaux, CEO at CompTIA, invited attendees at ChannelCon 2018 to participate, but the program is open to tech workers worldwide. Three- to five-minute videos may be tweeted to @CompTIA using #MyTechStory. Videos may also be emailed to kstone@comptia.org. Thibodeaux said his road to IT started with Lincoln Logs and Legos.

Other news

  • AppDynamics, a Cisco business unit specializing in app performance monitoring software, expanded its partner program with a new Pioneer partner tier. Dedicated to regional partners with domain expertise in applications, the Pioneer tier adds to the AppDynamics program’s existing Alliance and invitation-only Titan tiers, acting essentially as a promotion path to Titan status. Pioneer partners can access support from channel account managers and channel sales engineers, training and enablement programs, and semiannual business planning sessions, AppDynamics said.
  • Cloud distributor Pax8 will offer Anchor and Cloudfinder to MSPs under a new agreement with Axcient/eFolder, which provides data protection and business continuity offerings.
  • Xerox introduced a marketing toolkit to help partners promote the vendor’s managed print services and ConnectKey portfolio. New resources include social media syndication, redesigned partner badges and tools for hosting on-site customer events.
  • Collabrance, a provider of products and services for managed service providers (MSPs), said it expanded its Master Managed Security Services Provider portfolio. The portfolio now features security information event management and vulnerability and penetration testing, Collabrance said.

Market Share is a news roundup published every Friday.

NetApp AI storage packages OnTap all-flash FAS A800, Nvidia

NetApp is jumping into the AI storage market with a platform that combines OnTap-powered All Flash FAS arrays with Nvidia supercomputers.

NetApp AI OnTap Converged Infrastructure is a validated architecture combining NetApp’s FAS A800 all-flash NVMe array for NFS storage with integrated Nvidia DGX-1 servers and graphics processing units (GPUs). NetApp said the reference design verified four DGX servers to one FAS A800, although customers can start with a 1-1 ratio and nondisruptively scale as needed.

 “The audience for this type of architecture is data scientists,” said Octavian Tanase, senior vice president of NetApp’s Data OnTap system group, during a live webcast this week. “We want to make it simple for them. We want to eliminate complexity (and give) a solution that can be integrated and deployed with confidence, from the edge to the core to the cloud. We believe they will be able to adopt this with a lot of confidence.”

The product is intended to help companies implement data analytics that bridge a core data center, edge computing and cloud environments, said Jim McHugh, a vice president and general manager at Nvidia Corp. He said Nvidia DGX systems build on the foundation of Nvidia CUDA GPUs developed for general-purpose computing.

“Every industry is figuring ‘we need better insights,’ but better insights means a new computing block,” McHugh said. “Data is really the new source code. When you don’t spend time writing all the features and going through QA, you’re letting data drive the solutions. That takes an incredible amount of parallel computing.”

The joint NetApp-Nvidia product reflects a surge in AI and machine learning, which requires scalable storage to ingest reams of data and highly powerful parallel processing to analyze it.

In April, NetApp rival Pure Storage teamed with Nvidia to launch an AI storage platform with DGX-1 GPUs and Pure’s scale-out NAS FlashBlade array.

Capacity and scaling of NetApp AI OnTap

The NetApp FAS A800 system supports 30 TB NVMe SSDs with multistream write capabilities, scaling to 2 PB of raw capacity in a 2U shelf. The system scales from 364 TB in two nodes to 74 PB in 24 nodes. NetApp said a 24-node FAS A800 cluster delivers up to 300 gigabits per second of throughput and 11.4 million IOPS. It supports 100 Gigabit Ethernet and 32 Gbps Fibre Channel network connectivity.
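
As a quick back-of-the-envelope check, dividing the cited 24-node cluster figures evenly across nodes gives the rough per-node averages below, assuming the roughly linear scaling NetApp describes; these derived numbers are illustrative, not NetApp specifications.

```python
# Derived per-node averages from the 24-node cluster figures cited above.
nodes = 24
cluster_throughput_gbps = 300   # gigabits per second, per the article
cluster_iops = 11_400_000       # 11.4 million IOPS, per the article

per_node_gbps = cluster_throughput_gbps / nodes
per_node_iops = cluster_iops / nodes

print(f"~{per_node_gbps:.1f} Gbps and ~{per_node_iops:,.0f} IOPS per node on average")
# ~12.5 Gbps and ~475,000 IOPS per node on average
```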

The NetApp AI storage platform is tested to minimize deployment risks, the vendors said. A NetApp AI OnTap cluster can scale to multiple racks with additional network switches and storage controller pairs. The product integrates NetApp Data Fabric technologies to move AI data between edge, core and cloud environments, Tanase said.

NetApp AI OnTap is based on OnTap 9.4, which handles enterprise data management, protection and replication. Each DGX server packs eight Nvidia Tesla V100 GPUs, configured in a hybrid cube-mesh topology to use Nvidia’s NVLink network transport as high-bandwidth, low-latency fabric. The design is intended to eliminate traffic bottlenecks that occur with PCIe-based interconnects.

DGX-1 servers support multinode clustering via remote direct memory access (RDMA)-capable fabrics.

Enterprises struggle to size, deploy AI projects

AI storage is a hot topic among enterprise customers, said Scott Webb, who manages the global storage practice at World Wide Technologies (WWT) in St. Louis, a NetApp technology partner.

“In our customer workshops, AI is now a main use case. Customers are trying to figure out the complexity. DGX and AI on a NetApp flash back end is a winning combination. It’s not only the performance, but the ability for a customer to start small and [scale] as their use cases grow,” Webb said.

John Woodall, a vice president of engineering at systems integrator Integrated Archive Systems, based in Palo Alto, Calif., cited NetApp Data Fabric as a key enabler for AI storage deployments.

“The speeds and feeds are very important in AI, but that becomes a game of leapfrog. With the Data Fabric, I think NetApp has been able to give customers more control over where to apply their assets,” Woodall said.