
For Sale – Billion BiPAC 8800AXL Dual Band Wireless Router

For sale is a used Billion BiPAC 8800AXL dual-band wireless AC 1300 Mbps VDSL2/ADSL2+ 3G/4G LTE router in mint condition. It comes with all original accessories and still has the protective film on the front. Includes the original box but no sleeve.

It has been used in a smoke-free home and is in full working order.



Investments in data storage vendors topped $2B in 2019

Data storage vendors received $2.1 billion in private funding in 2019, according to a SearchStorage.com analysis of data from websites that track venture funding. Not surprisingly, startups in cloud backup, data management and ultrafast scale-out flash continue to attract the greatest interest from private investors.

Six private data storage vendors closed funding rounds of more than $100 million in 2019, all in the backup/cloud sector. It’s a stretch to call most of these companies startups — all but one have been selling products for years.

A few vendors with disruptive storage hardware also got decent chunks of money to build out arrays and storage systems, although these rounds were much smaller than the data protection vendors received.

According to a recent report by PwC/CB Insights MoneyTree, 213 U.S.-based companies closed funding rounds of at least $100 million last year. The report pegged overall funding for U.S. companies at nearly $108 billion, down 9% year over year but well above the $79 billion total from 2017.

Despite talk of a slowing global economy, data growth is expected to accelerate for years to come. And as companies mine new intelligence from older data, data centers need more storage and better management than ever. The funding is flowing more to vendors that manage that data than to systems that store it.

“Investors don’t lead innovation; they follow innovation. They see a hot area that looks like it’s taking off, and that’s when they pour money into it,” said Marc Staimer, president of Dragon Slayer Consulting in Beaverton, Ore.

Here is a glance at the largest funding rounds by storage companies in 2019, starting with software vendors:

Kaseya Limited, $500 million: Investment firm TPG will help Kaseya further diversify the IT services it can offer to managed service providers. Kaseya has expanded into backup in recent years, adding dark web monitoring software ID Agent last year. That deal followed earlier pickups of Spanning Cloud Apps and Unitrends.

Veeam Software, $500 million: Veeam pioneered backup of virtual machines and serves many Fortune 500 companies. Insight Partners invested half of a billion dollars in Veeam in January 2019, and followed up by buying Veeam outright in January 2020 for a $5 billion valuation. That may lead to an IPO. Veeam headquarters are shifting to the U.S. from Switzerland, and Insight plans to focus on landing more U.S. customers.

Rubrik, $261 million: The converged storage vendor has amassed $553 million since launching in 2014. The latest round of Bain Capital investment reportedly pushed Rubrik’s valuation north of $3 billion. Flush with investment, Rubrik said it’s not for sale — but is shopping to acquire hot technologies, including AI, data analytics and machine learning.

Clumio, $175 million: Sutter Hill Ventures provided $40 million in April, on top of an $11 million 2017 round. It then came back for another $135 million bite in November, joined by Altimeter Capital. Clumio is using the money to add cybersecurity to its backup as a service in Amazon Web Services.

Acronis, $147 million: Acronis was founded in 2003, so it’s well into its second decade. But the veteran data storage vendor has a new focus: backup blended with cybersecurity and privacy, similar to Clumio. The Goldman Sachs-led funding helped Acronis acquire 5nine to manage data across hybrid Microsoft clouds.

Druva, $130 million: Viking Global Investors led a six-participant round that brought Druva money to expand its AWS-native backup and disaster recovery beyond North America to international markets. Druva has since added low-cost tiering to Amazon Glacier, and CEO Jaspreet Singh has hinted Druva may pursue an IPO.

Notable 2019 storage funding rounds

Data storage startups in hardware

Innovations in storage hardware underscore the ascendance of flash in enterprise data centers. Although fewer in number, the following storage startups are advancing fabrics-connected devices for high-performance workloads.

Over time, these data storage startups may mature to deliver hardware that blends low latency, high IOPS and manageable cost, emerging as competitors to leading array vendors. For now, these products have a market limited to companies that need petabytes (PB) or more of storage, but the technologies bear watching due to their speed, density and performance potential.

Lightbits Labs, $50 million: The Israel-based startup created the SuperSSD array for NVMe flash. The Lightbits software stack converts generic in-the-box TCP/IP into a switched Ethernet fabric, presenting all storage as a single giant SSD. SuperSSD starts at 64 PB before data reduction. Dell EMC led Lightbits’ funding, with contributions from Cisco and Micron Technology.

Vast Data, $40 million: Vast’s Universal Storage platform is not for everyone. Minimum configuration starts at 1 PB. Storage class memory and low-cost NAND are combined for unified block, file and object storage. Norwest Venture Partners led the round, with participation from Dell Technologies Capital and Goldman Sachs.

Honorable mentions in hardware include Pavilion Data Systems and Liqid. Pavilion is one of the last remaining NVMe all-flash startups, picking up $25 million in a round led by Taiwania Capital and RPS Ventures to flesh out its Hyperparallel Flash Array.

Liqid is trying to break into composable infrastructure, a term coined by Hewlett Packard Enterprise to signify the ability for data centers to temporarily lease capacity and hardware by the rack. Panorama Point Partners provided $28 million to help the startup build out its Liqid CI software platform.


AppExchange, acquisitions key to the future of Salesforce

If numbers such as $13.28 billion fiscal 2019 revenue or 171,000 Dreamforce attendees last month are any indication, Salesforce nailed the tech side of building a wildly loyal customer following for its sales, marketing, customer service and e-commerce clouds during its first 20 years.

For the next two decades, it will take continuous technology innovation, especially in the areas of cloud integration, AI and voice, to prevent those customers from defecting to Adobe, Microsoft, SAP and Oracle platforms. Far more important to the future of Salesforce, employees, customers and analysts said, is growing a Salesforce talent pool beyond the company’s control: partners, developers, admins and consultants.

To woo partners, Salesforce opened its platform. It hosts the AppExchange, a third-party marketplace similar to the Apple App Store or Google Play. Lightning Platform, a low-code application development environment launched in 2009 as Force.com, enables individual users to create apps and integrations themselves. Finally, Trailhead, a free, self-paced Salesforce training site, debuted in 2014; it has attracted 1.7 million people to learn developer, admin and consultant skills.

Yet it’s not enough. Salesforce developer and admin talent is in short supply. The shortage will only deepen if the company realizes CEO and founder Marc Benioff’s oft-stated revenue targets of $20 billion by 2022 and $60 billion by 2034 as more customers come to Salesforce.

“Salesforce’s biggest innovation is building this open community, whether it’s admins and recognizing how crucial they are, or creating Force.com and encouraging other developers to come in and develop on their platform,” said Nicole France, an analyst at Constellation Research. “Going forward, the challenge will be keeping up with the pace of innovation — it’s a lot harder when you’re a behemoth company.”

Salesforce's 20-year boom

AppExchange, Dreamforce built over many years

When Salesforce first started, what we call cloud companies today were referred to as application service providers. Salesforce’s big innovation was building an entire platform in the cloud instead of just one app, said Michael Fauscette, an analyst at G2.


“Salesforce, and NetSuite, too, really had this idea of scaling infrastructure up and down really quickly with multi-tenancy, according to need,” Fauscette said, which found a different buying audience. “When Salesforce first got into the enterprise, they didn’t go in the traditional way. IT bought tech — except for Salesforce automation. It came in through the sales guy. They could just start using Salesforce immediately.”

Quickly, though, Salesforce knew it couldn’t keep up with every individual customer’s tech needs, especially integrations with outside business applications. So, in 2006, it threw open its platform to third-party developers by introducing the AppExchange, which provided sales teams with tools to integrate Salesforce with applications such as calendars, email, accounting, HR and ERP. Today, AppExchange hosts 3,400 apps.

Force.com, now called Lightning Platform, came along two years later, and enabled individual developers or even nondevelopers to build their own apps and connectors among Salesforce and other apps.

The AppExchange evolved into a Salesforce revenue generator in several ways, said Woodson Martin, executive vice president and general manager of Salesforce AppExchange. First, Salesforce earns revenue when an app is sold. Second, AppExchange enables customers to use Salesforce to grow their companies and, in turn, increase their Salesforce subscription. Third, it generates new leads for Salesforce when a developer creates a connector to a vertical-specific app.

“We think of AppExchange as the hub of the Salesforce ecosystem,” Martin said. “In some cases, apps are the tip of the spear for new industry verticals.”

G2’s Fauscette said that shuttling data between clouds, and between clouds and on-premises systems, will require more and more integrations between Salesforce and outside applications for at least the next decade. That makes AppExchange a crucial part of the future of Salesforce.

Acquisitions give partners new opportunities

Moving forward, AppExchange will expand into new domains, Martin said, as Salesforce integrates features and capabilities from companies it acquired, including Tableau and MuleSoft, into its platform. That will create opportunities for developers to create new customizations for data visualizations and data integrations.

Martin also said that Salesforce closely watches technology trends in the consumer retailing and e-commerce space — personalization and AI are two recent examples — to bring to its B2B platform. That’s what customers want, he said: a B2B buying experience that works as well as Amazon does at home.

But it takes outside developers to buy into the AppExchange concept, and so far, they seem upbeat about the future of Salesforce. AppExchange partners such as configure-price-quote (CPQ) provider Apttus generally believe there’s room for developers of all stripes to grow their own franchises, even when Salesforce adds native overlapping features that directly compete.

That happened when Salesforce acquired Apttus competitor SteelBrick and added Salesforce-native CPQ three years ago, said Eric Carrasquilla, senior vice president of product at Apttus. That’s because Salesforce has hundreds of thousands of CRM customers now — and the number keeps increasing.

“Salesforce is a force of nature,” Carrasquilla said, adding that Apttus and Salesforce CPQ have roughly 3,500 customers combined. “That’s still a fraction of a fraction of a fraction of the opportunity within the CRM market. It’s a very deep pool, businesswise, and there’s more than enough for everyone in the ecosystem.”

Read how Trailblazers also figure heavily into the future of Salesforce in the second part of this story.


With Time on its hands, Meredith drives storage consolidation

After Meredith Corp. closed its $2.8 billion acquisition of Time Inc. in January 2018, it adopted the motto “Be Bold. Together.”

David Coffman, Meredith’s director of enterprise infrastructure, took that slogan literally. “I interpreted that as ‘Drive it like you stole it,'” said Coffman, who was given a mandate to overhaul the combined company’s data centers that held petabytes of data. He responded with an aggressive backup and primary storage consolidation.

The Meredith IT team found itself with a lot of Time data on its hands, and in need of storage consolidation because the combined company used storage from a variety of vendors. Meredith was upgrading its own Des Moines, Iowa, data center at the time, and Coffman’s team standardized technology across legacy Time and Meredith. It dumped most of its traditional IT gear and added newer technology developed around virtualization, convergence and the cloud.

Although Meredith divested some of Time’s best-known publications, it now publishes People, Better Homes and Gardens, InStyle, Southern Living and Martha Stewart Living. The company also owns 17 local television stations and other properties.

The goal is to reduce its data centers to two major sites in New York and Des Moines with the same storage, server and data protection technologies. The sites can serve as disaster recovery (DR) sites for each other. Meredith’s storage consolidation resulted in implementing Nutanix hyper-converged infrastructure for block storage and virtualization, Rubrik data protection and a combination of Nasuni and NetApp for file storage.

“I’ve been working to merge two separate enterprises into one,” Coffman said. “We decided we wanted to go with cutting-edge technologies.”

At the time of the merger, Meredith used NetApp-Cisco FlexPod converged infrastructure for primary storage and Time had Dell EMC and Hitachi Vantara in its New York and Weehawken, N.J. data centers. Both companies backed up with Veritas NetBackup software. Meredith had a mixture of tape and NetBackup appliances and Time used tape and Dell EMC Data Domain disk backup.

By coincidence, both companies were doing proofs of concept with Rubrik backup software on integrated appliances and were happy with the results.

Meredith installed Rubrik clusters in its Des Moines and New York data centers as well as a large Birmingham, Alabama office after the merger. They protect Nutanix clusters in all those sites.

“If we lost any of those sites, we could hook up our gear to another site and do restores,” Coffman said.

Meredith also looked at Cohesity and cloud backup vendor Druva while evaluating Rubrik Cloud Data Management. Coffman and Michael Kientoff, senior systems administrator of data protection at Meredith, said they thought Rubrik had the most features and they liked its instant restore capabilities.

Coffman said Cohesity was a close second, but he didn’t like that Cohesity includes its own file system and bills itself as secondary storage.

“We didn’t think a searchable file system would be that valuable to us,” Coffman said. “I didn’t want more storage. I thought, ‘These guys are more data on premises when I’m already getting yelled at for having too much data on premises.’ I didn’t want double the amount of storage.”

Coffman swept out most of the primary storage and servers from before the merger. Meredith still has some NetApp for file storage, and Nasuni cloud NAS for 2 PB of data that is shared among staff in different offices. Nasuni stores data on AWS.

Kientoff is responsible for protecting the data across Meredith’s storage systems.

“All of a sudden, my world expanded exponentially,” he said of the Time aftermath. “I had multiple NetBackup domains all across the world to manage. I was barely keeping up on the NetBackup domain we had at Meredith.”

Coffman and Kientoff said they were happy to be rid of tape, and found Rubrik’s instant restores and migration features valuable. Instead of archiving to tape, Rubrik moves data to AWS after its retention period expires.

Rubrik’s live mount feature can recover data from a virtual machine in seconds. This comes in handy when an application running in a VM dies, but also for migrating data.

However, that same feature is missing from Nutanix. Meredith is phasing out VMware in favor of Nutanix’s AHV hypervisor to save money on VMware licenses and to have, as Coffman put it, “One hand to shake, one throat to choke. Nutanix provided the opportunity to have consolidation between the hypervisor and the hardware.”

The Meredith IT team has petitioned for Nutanix to add a similar live mount capability for AHV. Even without it, though, Kientoff said backing up data from Nutanix with Rubrik beats using tapes.

“With a tape restore, calling backup tapes from off-site, it might be a day or two before they get their data back,” he said. “Now it might take a half an hour to an hour to restore a VM instead of doing a live mount [with VMware]. Getting out of the tape handling business was a big cost savings.”

The Meredith IT team is also dealing with closing smaller sites around the country to get down to the two major data centers. “That’s going to take a lot of coordinating with people, and a lot of migrations,” Coffman said.

Meredith will back up data from remote offices locally and move it across the WAN to New York or Des Moines.

Kientoff said Rubrik’s live restore is a “killer feature” for the office consolidation project. “That’s where Rubrik has really shone for us,” he said. “We recently shut down a sizeable office in Tampa. We migrated most of those VMs to New York and some to Des Moines. We backed up the cluster across the WAN, from Tampa to New York. We shut down the VM in Tampa, live mounted in New York, changed the IP address and put it on the network. There you go — we instantly moved VMs from one office to another.”


Microsoft releases 18M building footprints in Africa to enable AI Assisted Mapping

In the last ten years, 2 billion people were affected by disasters, according to the World Disasters Report 2018. In 2017, 201 million people needed humanitarian assistance and 18 million were displaced due to weather-related disasters. Many of these disaster-prone areas are literally “missing” from the map, making it harder for first responders to prepare and deliver relief efforts.

Since the inception of Tasking Manager, the Humanitarian OpenStreetMap Team (HOT) community has mapped at an incredible rate, with 11 million square kilometers mapped in Africa alone. However, large parts of Africa with populations prone to disasters still remain unmapped — 60% of the continent’s 30 million square kilometers.

Under Microsoft’s AI for Humanitarian Action program, Bing Maps together with Microsoft Philanthropies is partnering with HOT on an initiative to bring AI Assistance as a resource in open map building. The initiative focuses on incorporating design updates, integrating machine learning, and bringing new open building datasets into Tasking Manager.

The Bing Maps team has been harnessing the power of computer vision to identify map features at scale. Building upon their work in the United States and Canada, Bing Maps is now releasing countrywide open building footprint datasets for Uganda and Tanzania. These will be among the first open building datasets in Africa and will be available for use within OpenStreetMap (OSM).

In Tasking Manager specifically, the dataset will be used to help in task creation with the goal of improving task completion rates. Tasking Manager relies on ML Enabler to connect with building datasets through an API. This API-based integration makes it convenient to access not just the Africa building footprints, but all open building footprint datasets from Bing Maps through ML Enabler, and thus the OpenStreetMap ecosystem.
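As a rough sketch of what that API-based flow can look like, here is a minimal Python client pulling building predictions for a bounding box. The host, route, query parameters and response shape are all invented for illustration; this is not ML Enabler’s documented API.

```python
# Hypothetical sketch only: the host, route, parameters and response shape
# below are assumptions for illustration, not ML Enabler's documented API.
import requests

ML_ENABLER_URL = "https://ml-enabler.example.org"  # placeholder host


def fetch_building_footprints(bbox):
    """Fetch predicted building footprints for a (min_lon, min_lat,
    max_lon, max_lat) bounding box as GeoJSON features."""
    resp = requests.get(
        f"{ML_ENABLER_URL}/v1/predictions",  # assumed route
        params={"bbox": ",".join(map(str, bbox)), "type": "building"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["features"]  # assumed GeoJSON FeatureCollection payload


# Example: candidate footprints around Musoma, Tanzania, for task creation
features = fetch_building_footprints((33.75, -1.55, 33.85, -1.45))
print(f"{len(features)} candidate building footprints")
```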

“Machine learning datasets for OSM need to be open. We need to go beyond identifying roads and buildings and open datasets allow us to experiment and uncover new opportunities. Open Building Dataset gives us the ability to not only explore quality and validation aspects, but also advance how ML data assists mapping.”
– Tyler Radford (Executive Director, Humanitarian OpenStreetMap Team)

Africa presented several challenges: stark differences in landscape from the United States or Canada, unique settlements such as Tukuls, dense urban areas with connected structures, imagery quality and vintage, and a lack of training data in rural areas. The team identified areas with poor recall by leveraging population estimates from CIESIN. Subsequent targeted labeling efforts across Bing Maps and HOT improved model recall, especially in rural areas. A two-step process of semantic segmentation followed by polygonization resulted in 18M building footprints — 7M in Uganda and 11M in Tanzania.

Extractions in Musoma, Tanzania
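To make the two-step pipeline concrete, here is a minimal sketch of the polygonization step, assuming a segmentation mask has already been produced by the model. The toy mask and the use of rasterio and shapely are illustrative assumptions, not the Bing Maps team’s actual tooling.

```python
# Minimal polygonization sketch: turn a semantic segmentation mask
# (1 = building pixel) into vector footprints. The mask is a toy stand-in
# for real CNN output; rasterio/shapely are assumed tools, not the team's own.
import numpy as np
import rasterio.features
from shapely.geometry import shape

# Toy 0/1 mask standing in for model inference output
mask = np.zeros((100, 100), dtype=np.uint8)
mask[20:40, 30:55] = 1  # one "building"
mask[60:75, 10:25] = 1  # another

footprints = [
    shape(geom).simplify(1.0)  # vectorize, then lightly simplify edges
    for geom, value in rasterio.features.shapes(mask)
    if value == 1  # keep building regions, drop background
]
print(f"extracted {len(footprints)} footprint polygons")
```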

Bing Maps is making this data open for download free of charge and usable for research, analysis and, of course, OSM. OpenStreetMap currently contains 14M building footprints in Uganda and Tanzania (as of our team’s last count). We are working to determine overlaps.

We will make the data available for download on GitHub. The CNTK toolkit developed by Microsoft is open source and available on GitHub, as is the ResNet3 model. The Bing Maps computer vision team will be presenting the work in Africa at the annual International State of the Map conference in Heidelberg, Germany, and at the HOT Summit.

– Bing Maps Team


SuccessFactors customers to see big Qualtrics impact

LAS VEGAS — At the SAP SuccessFactors customer conference, SAP’s $8 billion Qualtrics acquisition seemed like the tail wagging the dog. Employee experience was such a central theme that SuccessFactors may rebrand HCM as HXM — Human Experience Management.

It may have been a lot for SuccessFactors customers to take in.

Some SuccessFactors customers are measuring employee experience with deeper analysis of employee behavior, such as time to complete certain tasks. But others, who were not Qualtrics users, were still assessing its capabilities.

What SAP made clear is that Qualtrics is important to the future of SuccessFactors.

Qualtrics “allows us now to really rethink almost every transaction in every application that we’re investing in,” Greg Tomb, president of SAP SuccessFactors, said at a meeting with press and analysts at SuccessConnect 2019.

Qualtrics sells an “experience management” or XM platform. It captures and measures employee experience (EX), product experience (PX), customer experience (CX) and brand experience (BX). The platform can combine experience data with a company’s operational data. 
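What “combining” the two data types means in practice can be as simple as a join on a shared key. The sketch below is purely illustrative, with invented departments and figures; it is not Qualtrics’ data model.

```python
# Illustration only: invented figures showing the idea of joining "X"
# (experience) data with "O" (operational) data for analysis.
import pandas as pd

x_data = pd.DataFrame({  # "X" data: survey-based engagement scores
    "department": ["sales", "support", "engineering"],
    "engagement_score": [7.9, 6.1, 8.4],
})
o_data = pd.DataFrame({  # "O" data: operational outcomes from HR systems
    "department": ["sales", "support", "engineering"],
    "attrition_rate": [0.12, 0.21, 0.08],
})

combined = x_data.merge(o_data, on="department")
# e.g., ask whether low engagement tracks higher attrition
print(combined.corr(numeric_only=True))
```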

The use of sophisticated employee experience measuring was illustrated by Hernan Garcia, vice president of talent and culture at Tecnológico de Monterrey, a university in Mexico. Garcia’s team studies employee experience as well as the efficiency of a process, including the time it takes to complete something.

“We measure both how they feel, how they interact, but also how much time, how many clicks, how many people they need to touch” to complete something, Garcia said during a press and analyst meeting. The school can improve the experience of employees by directly making changes to processes that affect it, he said.

On Tuesday, the university was awarded SAP’s annual 2019 Klaus Tschira HR Innovation Award, which is named after an SAP co-founder. The university has about 31,000 employees and 160,000 students.

SuccessFactors is delivering some Qualtrics integrations, such as with employee records. It is also building capability to integrate with SAP Analytics Cloud so that companies can include both “X” or experience data and “O” or operational data in their analytics programs, said Amy Wilson, head of products and application engineering at SuccessFactors.

The SuccessFactors and Qualtrics integration work will continue into next year. For now, SuccessFactors and Qualtrics are separate applications, but “seamless,” Wilson said. SAP’s ultimate plan is to embed Qualtrics into SuccessFactors, she said.

But the employee experience discussion can’t just focus on X and O data. It must reconcile with the major workforce changes looming, said Vera Cuevas, a SuccessFactors user and HRIS senior manager at a technology firm she asked not to be named.

“There’s probably going to be a lot of jobs across a number of different industries that might go away, that might be automated,” Cuevas said. “It will be interesting to see how you retain that employee engagement while at the same time you are moving employees in different jobs, or in some cases eliminating industries.”

Another attendee, Catrena Hairston, a senior learning professional at a U.S. government agency, said the ability to use both experience and operational data makes sense and may be useful. But she will have to see it in action. “I’m not into vaporware, so I’ll have to see if it works with our data,” Hairston said.


U.S. spends more on AI as AI in China continues to grow

The Trump administration’s fiscal year 2020 budget includes nearly $1 billion for non-defense AI research and development, as the adoption and funding of AI in China continue to grow rapidly.

The budget request is to fund the Network and Information Technology Research and Development Program, a federal program that coordinates the activities and budgeting for research and development of several technology agencies.

U.S. AI efforts

The fiscal year 2020 budget supplement, released in September, allocates $973.5 million for AI as a new spending category to support existing White House efforts to ramp up robotic process automation and AI innovation and adoption.

Millions of dollars will go toward funding research and development for a range of AI-related projects, including machine vision, cybersecurity challenges unique to AI, and chips optimized for neural nets.

While the funding should help the U.S. government’s AI efforts, it’s unclear how it will match up with AI work in China.

Past efforts of the U.S. government, according to Rudina Seseri, founder and managing partner of venture capital firm Glasswing Ventures, have lagged considerably behind those of other countries.

“For all that is being done, it’s not even 1% of the spending that other countries have made,” Seseri said during a talk at the AI World Government 2019 conference in Washington, D.C. in June.

The U.S. government’s efforts in the global AI race are “far, far behind China,” a country that, in addition to spending more, also collects more data from its citizens, she noted.

In China, “the human rights of data [is] not a notion that exists,” she said. “The government has free control of everyone’s data.”

AI in China

That may be evident in China’s development of smart cities. The government is constructing, or plans to construct, more than 500 cities with smart city capabilities that include automatically monitoring and routing buses to optimize traffic flows, and prioritizing mobile payment systems. The urban strategy also leans heavily on the internationally controversial goal of expanding surveillance of citizens by setting up networks of security cameras with gait and facial recognition software.


Given the Chinese government’s extensive involvement in city planning, and its propensity to spy on its own citizens, these kinds of smart cities have no real equivalent in the U.S. A number of U.S. cities, however, have begun using AI and analytics to better their transportation systems and some have started to install intelligent security cameras and intelligent street lights.

Other Western countries have begun creating smart cities as well, including the U.K. and Canada. Calgary, Alberta, for example, deployed sensors to collect data on noise levels to help with noise management, and on soil conditions to improve care for gardens, among other things.

Development of AI in China has skyrocketed in recent years, with the government spending heavily on research and development. In 2017, the State Council of China released a plan laying out China’s AI goals for the next decade.

The plan details China’s ambition to become level with other world players in AI theories and technologies by 2020, and become the world leader in AI by 2030, with a domestic industry worth almost $150 billion.

The plan includes efforts to promote and support domestic AI enterprises, and encourage partnerships between them and international AI schools and research teams.

Yet, at an enterprise level, the adoption of AI in China is comparable to that in the U.S.

Chinese and U.S. enterprises

A 2018 Forrester study commissioned by Chinese tech giant Huawei surveyed 200 large and medium-sized companies and found that the vast majority of respondents see AI as a driver of innovation in their industry. About 65% of respondents said AI will play an extremely important role in their digital transformation.

Meanwhile, a little over half of the respondents lacked professional AI talent, and 70% of respondents said a lack of technology skills is slowing the adoption of AI in the enterprise intelligence market.

According to Danny Mu, a Forrester analyst based in Beijing, AI has been widely adopted by digital-native enterprises in the internet industry, even as “other traditional industries are still lagging behind.”

That’s about on par with enterprises in the U.S. Reports regularly highlight that most business leaders see the importance of AI, even as many traditional companies have yet to start using it.

The U.S. trade war with China has hampered China’s push to develop AI technologies, however, and has begun to force China to become more independent of Western hardware and software.

Some of China’s largest technology companies, including Baidu, Alibaba and Huawei, have started developing their own hardware and AI platforms to rely less on Western-developed technology.

The “trade war is a good reminder to Chinese companies to evaluate the risk from dependence,” Mu said.

While the trade war may affect Chinese companies’ ability to purchase American-made technologies, Chinese scientists are still actively conducting AI research, something which “is hard to ban by trade wars,” Mu said.

“AI is changing industries,” he said. “Companies and governments can’t afford to lag behind when it comes to AI technology.”

Go to Original Article
Author:

Splunk pricing worries users as SignalFx buy makes headlines

Splunk broke open its piggy bank and shelled out $1.05 billion for cloud monitoring vendor SignalFx this week, and its users wonder whether Splunk pricing means their IT budgets will be next.

Splunk, which began as a log analytics player, will gain expertise and software that collects metrics and distributed tracing data in cloud-native application environments with this acquisition, as it sets its sights on speedier data analytics features that support a broader swath of data sources.

The deal, valued at $1.05 billion, is the largest in recent memory in the IT monitoring space, analysts say. It also points to a future in which cloud monitoring tools for increasingly complex applications and software frameworks, such as container orchestration and service mesh, take over enterprise IT. It’s the third merger between IT monitoring and automation firms in the past week — IT infrastructure monitoring firm Virtual Instruments also acquired cloud monitoring startup Metricly, and IT automation vendor Resolve Systems bought AIOps player FixStream.

“There’s a lot more interest in monitoring these days,” said Nancy Gohring, analyst at 451 Research in New York. “I would tie that to the growing interest in cloud and cloud-native technologies, and the maturity curve there, especially from the enterprise perspective.”

Splunk pricing a potential downside of cloud monitoring updates


Splunk users are aware of emerging trends toward cloud-native application monitoring, streaming data analytics and machine learning, and see the need for fresh approaches to data collection and data analytics as they grow.

“It lowers the entry barrier for machine learning and gets people asking the right questions,” said Steve Koelpin, lead Splunk engineer for a Fortune 1000 company in the Midwest, of Splunk’s Machine Learning Toolkit. “People share their code, and then others use that as a starting point to innovate.”

But an emphasis on machine learning that includes metrics data will demand large data sets and large amounts of processing power, and that comes at a cost.

“It’s expensive to stream metric data as it’s metered at a fixed 150 bytes,” Koelpin said. “This could mean substantially [higher] license costs compared to streaming non-metric data.”

Splunk pricing starts at $150 per gigabyte, per day, with volume discounts for larger amounts of data. The costs for Splunk software licenses and data storage have already driven some users away from the vendor’s tools and toward open source cloud monitoring software such as the Elastic Stack.
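Some back-of-the-envelope math shows how those two figures compound. The sketch assumes the fixed 150-byte metering quoted above and list pricing with no volume discount, so treat the result as an illustrative upper bound, not a real quote.

```python
# Rough cost model from the two figures quoted above: metric events metered
# at a fixed 150 bytes and list pricing of $150 per GB per day. Real quotes
# include volume discounts, so this is an illustrative upper bound.
EVENT_BYTES = 150
DOLLARS_PER_GB_DAY = 150
GB = 1024 ** 3


def daily_license_gb(series: int, interval_s: int = 10) -> float:
    """Metered GB/day for `series` metrics reported every `interval_s` seconds."""
    events_per_day = series * (86_400 / interval_s)
    return events_per_day * EVENT_BYTES / GB


gb = daily_license_gb(50_000)  # e.g., 50,000 metric series at 10s resolution
print(f"{gb:.1f} GB/day -> about ${gb * DOLLARS_PER_GB_DAY:,.0f}/day at list price")
```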

Tim Tully

“We’re always evaluating pricing options, but our focus is more on making sure we build the best products we can,” said Tully, senior vice president and CTO at Splunk, when asked about users’ Splunk pricing complaints.

Splunk offers the most extensive set of log analytics features on the market, as well as good data collection and analytics performance and stability, Gohring said. However, some users chafe at paying for that full set of features when they may not use them all.

“People want to collect more and more data, and there’s always a cost associated with that,” she said. “It’s something all vendors are struggling with and trying to address.”

Splunk prepares to gobble up massive data

As enterprises adopt cloud-native technologies, they will look to vendors such as Splunk, pricing notwithstanding, for cloud monitoring tools rather than build the cheaper homegrown systems early adopters favored, Gohring said.

“Those shops are learning the hard way that you have to change your approach to monitoring in these new environments, and, thus, there’s more demand for these types of tools,” she said.

In fact, 451 Research estimated that container monitoring tools will overtake the overall market revenue share held by container orchestration and management tools over the next five years. A June 2019 market monitor report on containers by the research group estimated the total application container market at $1.59 billion in 2018, and projected growth to $5.54 billion in 2023. In 2018, management and orchestration software vendors generated 32% of that revenue, and monitoring and logging vendors generated 24%. By 2023, however, 451 Research expects management and orchestration vendors to generate 25% of overall market revenue, and monitoring and logging vendors 31%.
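Converting those 451 Research percentages into dollar figures makes the projected crossover plain; the arithmetic below uses only the numbers quoted above.

```python
# Dollar terms for the 451 Research estimates quoted above: monitoring/logging
# overtakes management/orchestration even though both segments keep growing.
totals = {"2018": 1.59, "2023": 5.54}  # total market size, $B
shares = {
    "management/orchestration": {"2018": 0.32, "2023": 0.25},
    "monitoring/logging": {"2018": 0.24, "2023": 0.31},
}
for segment, by_year in shares.items():
    for year, share in by_year.items():
        print(f"{year} {segment}: ${totals[year] * share:.2f}B")
```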

In the meantime, Splunk plans to capture its slice of that pie when beta products such as its Machine Learning Toolkit, Data Fabric Search and Data Stream Processor become generally available at its annual user conference this October. These products will boost the data collection and query performance of the Splunk Enterprise platform as it absorbs more data, and give users a framework to create their own machine learning algorithms against those data repositories under the Splunk UI. Splunk will also create a separate Kafka-like product for machine data processing based on open source software.

“We’re looking to add Apache Flink stream data processing under the Splunk UI for real-time monitoring and data enrichment, where Splunk acts as the query engine and storage layer for that data [under Data Fabric Search],” Tully said.

SignalFx has strong streaming analytics IP that will bolster those efforts in metrics and distributed tracing environments, Gohring said.


Cisco’s acquisition of Acacia bolsters service provider offerings

Cisco plans to acquire Acacia Communications for $2.6 billion, a move that would make Cisco a direct supplier of packet-optical transport systems for carrier networks and organizations that connect data centers across hundreds of miles.

Cisco announced the pending purchase on Tuesday in a joint statement with Acacia, based in Maynard, Mass. The companies expect to close the Cisco acquisition in the first half of next year.

Cisco offers Acacia’s packet-optical transport systems (P-OTS) with networking gear it sells today to carriers, cloud service providers and the largest enterprises. Cisco rivals Juniper Networks and Huawei are also Acacia customers, and analysts expect them to eventually turn to other P-OTS suppliers, such as Ciena, Inphi and Nokia.

“If I’m a Juniper or a Huawei, why would I buy from Cisco?” said Rajesh Ghai, an analyst at IDC.

Bill Gartner, general manager of Cisco’s optical systems group

Nevertheless, Acacia customers can expect from Cisco the same level of support that they receive today and equal access to products, said Bill Gartner, general manager of the vendor’s optical systems group.

“If we’re going to make this successful, we have to make sure that we’re providing the technology to third parties that they want to consume at the time they want to consume it and at the right performance and price point,” Gartner said. “I don’t think we could make this successful more broadly if we give Cisco preference on any of those parameters.”

Reasoning behind Cisco acquisition

Cisco has agreed to acquire Acacia because the company’s optical interconnect technology will let Cisco help customers design networks that can keep pace with the projected increase in data traffic. Cisco has predicted that annual global IP traffic will increase from 1.5 zettabytes in 2017 to 4.8 zettabytes by 2022. Contributors to the traffic surge include internet growth, video content delivery and emerging next-generation wireless technology to support more demanding business applications.
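A quick check shows what that forecast implies as a compound annual growth rate, using only the two figures Cisco cited:

```python
# Growth rate implied by Cisco's forecast above: annual global IP traffic
# rising from 1.5 ZB in 2017 to 4.8 ZB in 2022.
start_zb, end_zb, years = 1.5, 4.8, 2022 - 2017
cagr = (end_zb / start_zb) ** (1 / years) - 1
print(f"Implied compound annual growth: {cagr:.1%}")  # ~26.2% per year
```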

Today, Cisco’s proprietary optical transport technology ends in the data center, where analysts expect port speeds of 100 Gbps and 400 Gbps to become commonplace over the next couple of years. To meet that emerging demand, Cisco this year completed the $660 million acquisition of silicon photonics company Luxtera.

With Acacia, Cisco will also own the optical technology for service providers that need high-speed connections for metropolitan area networks or data centers as far as 1,500 miles apart.

“Our optics business today is primarily addressing what’s happening inside the data center — short-reach optics,” Gartner said during a conference call with financial analysts. “We don’t have a portfolio today that addresses what happens outside the data center for pluggables.”

Acacia’s portfolio includes pluggables, which are optical modular transceivers that vendors can sell as a plugin for a router or switch. The pluggable architecture, which is in its infancy, promises to simplify upgrading and repairing transceivers in networking gear.

John Burke, an analyst at Nemertes Research, based in Mokena, Ill., said Acacia could help Cisco “stay dominant in large data center markets long term,” while also providing some technical advantages over Arista, Juniper and Huawei.

“I suspect it will also give a boost to some smaller optical companies and trigger at least one more acquisition — perhaps by Arista,” Burke said.


Cisco bolsters cloud security with Duo acquisition

Cisco has announced the $2.35 billion acquisition of Duo Security, adding two-factor authentication services to the networking company’s cloud-based security portfolio.

Cisco said this week it expects to close the cash deal by the end of October. Following the Duo acquisition, Cisco will make Duo part of its security business under its general manager and executive vice president, David Goeckeler. Duo, which has 700 employees, will remain at its Ann Arbor, Mich., headquarters, and CEO Dug Song will continue to lead the company.

Under Cisco, Duo could grow much faster than it could on its own by gaining access to Cisco’s 800,000 customers. Duo, which was founded in 2009, has 12,000 customers.

Cisco wants to buy Duo to strengthen its cloud-based security services. Duo offers two-factor authentication that companies can integrate into websites, VPNs and cloud services. Duo services can also determine whether the user device trying to access the corporate asset poses a security risk.
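For readers unfamiliar with the mechanics of a second factor, the sketch below shows a textbook time-based one-time password (TOTP, RFC 6238) check of the kind many two-factor products build on. It uses only the Python standard library and is emphatically not Duo’s SDK or API; Duo’s push-based approvals are not modeled here.

```python
# Generic TOTP (RFC 6238) illustration using only the standard library.
# This is NOT Duo's SDK or API; it shows the kind of one-time code a
# second-factor service can generate and verify.
import base64
import hashlib
import hmac
import struct
import time


def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Derive the current one-time password from a shared base32 secret."""
    key = base64.b32decode(secret_b32)
    counter = int(time.time()) // interval  # 30-second time step
    msg = struct.pack(">Q", counter)  # counter as 8-byte big-endian
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)


shared_secret = "JBSWY3DPEHPK3PXP"  # demo secret only; never hard-code one
print("current one-time code:", totp(shared_secret))
# A verifying server derives the same code from its copy of the secret.
```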

The Duo acquisition adds another set of capabilities to those provided by Cisco’s other cloud-based security products, including OpenDNS and Stealthwatch Cloud. OpenDNS blocks malware, phishing attacks and botnets at the domain name system layer. Stealthwatch Cloud searches for threats by aggregating and analyzing telemetry drawn from public cloud infrastructures, such as AWS, Microsoft Azure and Google Cloud Platform.

Cisco’s plans following Duo acquisition

During a conference call with reporters and analysts, Goeckeler said Cisco will sell Duo as a stand-alone product, while also integrating its services into some of Cisco’s other cloud-based services. He did not provide details or a timeline, but noted that other cloud-based products Cisco has combined with one another include OpenDNS, the Viptela SD-WAN and the cloud-managed Meraki wireless LAN.

“We think we can drive [more] integrations here,” Goeckeler said of Duo. He later added Duo could bring more value to Cisco Umbrella, a cloud-based service that searches for threats in internet activity.

“Duo is another asset we can combine together with Umbrella to just increase the value of that solution to our customers,” Goeckeler said.

Cisco has been growing its security business through acquisition since at least 2013, when it bought firewall provider Sourcefire for $2.7 billion. In 2015, Cisco acquired OpenDNS for $635 million, and it bought CloudLock a year later for $293 million. CloudLock provides secure access to cloud applications, including those running on platform-as-a-service and infrastructure-as-a-service providers.

“All of these pieces are part of the larger strategy to build that integrated networking, security and identity cloud-delivered platform,” Goeckeler said.

Cisco’s acquisitions have fueled much of the growth in its security business. In the quarter ended in April, Cisco reported an 11% increase in security revenue to $583 million.