Tag Archives: could

Nextlink Internet and Microsoft closing broadband gap in central US

The agreement could bring broadband access to benefit more than 9 million people, including approximately 1 million in unserved rural areas

REDMOND, Wash. — Sept. 18, 2019 — On Wednesday, Nextlink Internet and Microsoft Corp. announced a partnership that will help close the broadband gap in Iowa, Illinois, Kansas, Nebraska, Oklahoma and Texas, bringing high-speed internet to hundreds of rural communities. The agreement will further enable Nextlink to substantially expand its coverage areas and is part of the Microsoft Airband Initiative, which is focused on addressing this national crisis, with the goal of extending broadband access to over 3 million unserved people in rural America by July 2022.

Lack of broadband connectivity is a pervasive national issue, and particularly acute in rural areas of the country. The Federal Communications Commission (FCC) reports that more than 21 million Americans lack broadband access, the vast majority of whom live in rural areas that continue to lag the national rate of broadband usage. The problem is almost certainly larger than that, though, as other studies and data sources, including Microsoft data, have found that 162 million people across the United States are not using the internet at broadband speeds, including approximately 29 million people across Iowa, Illinois, Kansas, Nebraska, Oklahoma and Texas.

“It’s time to deliver on the connectivity promises that have been made to people across the country, and this partnership will help do that for many who have been left behind and unserved in the heartland of America,” said Shelley McKinley, vice president, Technology and Corporate Responsibility at Microsoft. “In the past two years with our Airband Initiative, we’ve seen that progress is possible — particularly when the public and private sectors come together. Partnerships with regional ISPs like Nextlink that have the desire and wherewithal to provide internet connectivity are a critical part of closing the broadband gap and helping families, children, farmers, businesses and whole communities to not only survive, but thrive in the 21st century.”

Nextlink will deploy a variety of broadband connectivity technologies to bring coverage to these areas, including wireless technologies leveraging TV white spaces (i.e., unused TV frequencies) in select markets. Nextlink will continue its deployments in Texas and Oklahoma and immediately begin deployment efforts in Kansas, Nebraska, Iowa and Illinois, with rollouts planned through 2024.

Nextlink CEO Bill Baker noted, “Nextlink is tremendously excited about the opportunity to join forces with Microsoft. This agreement will accelerate the rollout of high-speed broadband access to underserved areas that are desperate for this critical service. This in turn will make those areas more attractive for employers who require high-speed broadband to operate. By itself, this project is going to generate hundreds of full-time, long-term jobs in rural communities as Nextlink builds out and services the required networks. The overall impact to rural communities in terms of job creation and increased viability for all employers is tremendous.”

“This partnership will enable the coming of precision agriculture, IoT, digital healthcare, access to higher education and overall economic growth,” said Ted Osborn, Nextlink SVP of Strategy & Regulatory Affairs. “Our experience tells us that advanced broadband access and community support can make these promises a reality in relatively short order.”

Improved connectivity will bolster economic, educational and telehealth opportunities for everyone in the region, and could be particularly impactful for farmers. Together, the states covered in part by this deal — Iowa, Illinois, Kansas, Nebraska, Oklahoma and Texas — account for more than $120 billion in annual agricultural value, or 29% of the agricultural output of the United States, according to the U.S. Department of Agriculture (USDA). With broadband access, farmers can gain better access to markets and take advantage of advancements in precision agriculture, enabling them to better monitor crops and increase their yields, which can translate into significant economic returns. The USDA estimates widespread use of connected technologies for agricultural production has the potential to unlock over $47 billion in annual gross benefit for the United States.

The partnership builds on Microsoft and Nextlink’s efforts to close the digital divide. Nextlink is familiar with the needs of rural communities and was awarded federal Connect America Fund funding to expand broadband access to unserved rural communities. The companies will also work together to ensure that, once connectivity is available in these regions, people will receive the digital skills training to help them take advantage of the economic and social benefits that come with broadband access.

About Nextlink Internet  

Nextlink Internet, LLC is a residential and commercial internet access and phone services provider based in Hudson Oaks, Texas. The company is a leading provider of broadband services to rural school districts and municipalities. Since 2013, the company has organically attracted over 36,000 broadband subscribers using solely private capital and has managed industry-leading operating metrics. Nextlink optimizes its IP-based optical-fiber and fixed wireless network with an unrelenting commitment to customer service to achieve high customer satisfaction.

About Microsoft

Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

For more information, press only:

Microsoft Media Relations, WE Communications for Microsoft, (425) 638-7777, [email protected]

Dale Curtis for Nextlink Internet, [email protected], (202) 246-5659

Note to editors: For more information, news and perspectives from Microsoft, please visit the Microsoft News Center at http://news.microsoft.com. Web links, telephone numbers and titles were correct at time of publication but may have changed. For additional assistance, journalists and analysts may contact Microsoft’s Rapid Response Team or other appropriate contacts listed at http://news.microsoft.com/microsoft-public-relations-contacts.

 

Author: Microsoft News Center

USBAnywhere vulnerabilities put Supermicro servers at risk

Security researchers discovered a set of vulnerabilities in Supermicro servers that could allow threat actors to remotely attack systems as if they had physical access to the USB ports.

Researchers at Eclypsium, based in Beaverton, Ore., discovered flaws in the baseboard management controllers (BMCs) of Supermicro servers and dubbed the set of issues “USBAnywhere.” The researchers said authentication issues put servers at risk because “BMCs are intended to allow administrators to perform out-of-band management of a server, and as a result are highly privileged components.

“The problem stems from several issues in the way that BMCs on Supermicro X9, X10 and X11 platforms implement virtual media, an ability to remotely connect a disk image as a virtual USB CD-ROM or floppy drive. When accessed remotely, the virtual media service allows plaintext authentication, sends most traffic unencrypted, uses a weak encryption algorithm for the rest, and is susceptible to an authentication bypass,” the researchers wrote in a blog post. “These issues allow an attacker to easily gain access to a server, either by capturing a legitimate user’s authentication packet, using default credentials, and in some cases, without any credentials at all.”

The USBAnywhere flaws make it so the virtual USB drive acts in the same way a physical USB drive would, meaning an attacker could load a new operating system image, deploy malware or disable the target device. However, the researchers noted the attacks would be possible only on systems where the BMCs are directly exposed to the internet or if an attacker already has access to a corporate network.

Rick Altherr, principal engineer at Eclypsium, told SearchSecurity, “BMCs are one of the most privileged components on modern servers. Compromise of a BMC practically guarantees compromise of the host system as well.”

Eclypsium said there are currently “at least 47,000 systems with their BMCs exposed to the internet and using the relevant protocol.” These systems would be at additional risk because BMCs are rarely powered off and the authentication bypass vulnerability can persist unless the system is turned off or loses power.

Altherr said he found the USBAnywhere vulnerabilities because he “was curious how virtual media was implemented across various BMC implementations,” but Eclypsium found that only Supermicro systems were affected.

According to the blog post, Eclypsium reported the USBAnywhere flaws to Supermicro on June 19 and provided additional information on July 9, but Supermicro did not acknowledge the reports until July 29.

“Supermicro engaged with Eclypsium to understand the vulnerabilities and develop fixes. Supermicro was responsive throughout and worked to coordinate availability of firmware updates to coincide with public disclosure,” Altherr said. “While there is always room for improvement, Supermicro responded in a way that produced an amicable outcome for all involved.”

Altherr added that customers should “treat BMCs as a vulnerable device. Put them on an isolated network and restrict access to only IT staff that need to interact with them.”

Supermicro noted in its security advisory that isolating BMCs from the internet would reduce the risk posed by USBAnywhere but not eliminate the threat entirely. Firmware updates are currently available for affected Supermicro systems, and in addition to updating, Supermicro advised users to disable virtual media by blocking TCP port 623.
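For administrators following that advice, a quick way to sanity-check exposure is simply to test whether anything answers on TCP port 623 at a BMC's address. The short Python sketch below is illustrative only (it is not an Eclypsium or Supermicro tool), and the host addresses and timeout are placeholder assumptions; a reachable result from outside the isolated management network would suggest the BMC is not as restricted as intended.

```python
# Illustrative check: does the virtual media service port (TCP 623) answer
# on a given BMC address? Host addresses below are placeholders (TEST-NET).
import socket

BMC_HOSTS = ["192.0.2.10", "192.0.2.11"]  # hypothetical BMC addresses
VIRTUAL_MEDIA_PORT = 623                  # the port Supermicro advises blocking


def port_is_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    for host in BMC_HOSTS:
        if port_is_reachable(host, VIRTUAL_MEDIA_PORT):
            print(f"{host}:{VIRTUAL_MEDIA_PORT} is reachable - restrict access to this BMC")
        else:
            print(f"{host}:{VIRTUAL_MEDIA_PORT} is not reachable from this network")
```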


Learn how to get started with your networking career

As the networking industry rapidly changes, so could your networking career. Maybe you’re just starting out, or you want to take your career to the next level. Or maybe you want to hit the reset button and start over in your career. Regardless of experience, knowledge and career trajectory, everybody can use advice along the way.

Network engineer role requirements vary depending on a candidate’s experience, education and certifications, but one requirement is constant: Network engineers should have the skills to build, implement and maintain a computer network that supports an organization’s required services.

This compilation of expert advice brings together helpful insights for network engineers at any point in their networking careers in any area of networking. It includes information about telecommunications and Wi-Fi careers and discusses how 5G may affect job responsibilities.

The following expert advice can help budding, transforming and still-learning network engineers in their networking career paths.

What roles are included in a network engineer job description?

Network engineers have a variety of responsibilities that fall within multiple categories and require varying skills. All potential network engineers, however, should have a general understanding of the multiple layers of network communication protocols, like IP and TCP. Engineers that know how these protocols work can better develop fundamental networking wisdom, according to Terry Slattery, principal architect at NetCraftsmen.

The role of a network engineer is complex, which is why it’s often divided into subcategories, each representing a distinct career path.

Each of these paths has different responsibilities, requirements and training. For most networking careers, certifications and job experience are comparable to advanced degrees, Slattery said. Engineers should renew their certifications every few years to ensure they maintain updated industry knowledge, he added. As of mid-2019, network engineer salaries ranged from $60,000 to $180,000 a year, though salaries vary by the candidate’s location, market, experience and certifications.

Learn more about network engineer job requirements.

What steps should I take to improve my networking career path?

As the networking industry transforms, network engineers eager to advance their networking careers have to keep up. One way to ensure engineers maintain relevant networking skills is for those engineers to get and retain essential certifications, said Amy Larsen DeCarlo, principal analyst at Current Analysis. The Cisco Certified Network Associate (CCNA) certification, in particular, provides foundational knowledge about how to build and maintain network infrastructures.

Network engineers should renew their certifications every few years, which requires a test to complete the renewal. Certifications don’t replace experience, DeCarlo said, but they assure employers that candidates have the essential, basic networking knowledge. Continuing education or specializing in a certain expertise area can also help engineers advance their networking careers, as can a maintained awareness of emerging technologies, such as cloud services.

Read more about how to advance your networking career.

Learn more about the various paths you can take in your networking career.

What are the top telecom certifications?

Different types of certifications can benefit different aspects of networking. For a telecom networking career, the three main certification categories are vendor-based, technology-based and role-based, said Tom Nolle, president of CIMI Corp. Vendor-based certifications are valuable for candidates who mostly use equipment from a single vendor. However, these certifications can be time-consuming and typically require prior training or experience.

Technology-based certifications usually encompass different categories of devices, such as wireless or security services. These include certifications from the International Association for Radio, Telecommunications and Electromagnetics and the Telecommunications Certification Organization. These certifications are best for entry-level engineers or those who want to specialize in a specific area of networking. They are also equivalent to an advanced degree, Nolle said.

Role-based certifications are more general and ideal for candidates without degrees or those who want a field technician job. Certifications can make candidates more attractive to employers, as these credentials prove the candidate has the skills and experience the employer requires. One example of this type of certification is the NCTI Master Technician, which specializes in field and craft work for the cable industry.

Dive deeper into the specifics of telecom certifications.

Why should I stay up to date with Wi-Fi training?

One of the most complicated areas of networking is wireless LAN (WLAN) — Wi-Fi, in particular. Yet, Wi-Fi is essential in today’s networking environment. Like other networking career paths, WLAN engineers should refresh their Wi-Fi training every so often to remain credible, according to network engineer Lee Badman.

The history of Wi-Fi has been complicated, and the future can be daunting. But Wi-Fi training is a helpful way to understand common issues. In the past, many issues stemmed from the lack of an identical, holistic understanding of Wi-Fi among organizations and network teams, Badman said. Without a consistent Wi-Fi education plan, Wi-Fi training was a point of both success and failure.

While some training inconsistencies still linger now, Badman recommended the Certified Wireless Specialist course from Certified Wireless Network Professionals as a starting point for those interested in WLANs. A variety of vendor-agnostic courses are also available for other wireless roles, he said.

Discover more about Wi-Fi training in networking careers.

Will 5G networks require new network engineer skills?

Mobile network generations seem to change as rapidly as Wi-Fi does, causing many professionals to wonder what 5G will mean for networking careers in the future. In data centers, job requirements won’t change much, according to John Fruehe, an independent analyst. But 5G could launch a new era for cloud-based and mobile applications and drive security changes as well.

Network engineers should watch out for gaps in network security due to this new combination of enterprise networks, cloud services and 5G, Fruehe said. However, employees working in carrier networks may already see changes in how their organizations construct and provision communication services as a result of current 5G deployments. For example, 5G may require engineers to adopt a new, fine-grained approach to programmability to manage the increased volume of services organizations plan to run on 5G.

Network engineer skills will be crucial in software-defined networking, software-defined radio access networks, network functions virtualization, automation and orchestration. Manual command-line interfaces will no longer suffice for programming devices in these environments; virtualization and automation are better suited to the task.

Explore more about 5G’s potential effect on networking careers.


Smart cloud storage tier management added to Druva cloud

IT pros could significantly reduce their monthly bills with the new intelligent data tiering feature from Druva.

Based on frequency of access, data can be considered cold, warm or hot, and Amazon Web Services (AWS) has respective storage tiers for them: Glacier Deep Archive, Glacier and S3. The hotter the tier, the more expensive it is to maintain.
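For readers unfamiliar with how aging data is expressed on the AWS side, the sketch below shows a generic S3 lifecycle rule written with boto3 that transitions objects from S3 into Glacier and then Glacier Deep Archive as they age. This is standard AWS tooling, not Druva's mechanism, and the bucket name, prefix and day thresholds are illustrative assumptions.

```python
# Generic S3 lifecycle rule (not Druva's implementation): age objects from
# S3 (hot) to Glacier (colder) and Glacier Deep Archive (coldest).
# Bucket name, prefix and thresholds are placeholders.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="example-backup-bucket",            # hypothetical bucket
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "age-out-backups",
                "Filter": {"Prefix": "backups/"},  # hypothetical key prefix
                "Status": "Enabled",
                "Transitions": [
                    {"Days": 30, "StorageClass": "GLACIER"},
                    {"Days": 180, "StorageClass": "DEEP_ARCHIVE"},
                ],
            }
        ]
    },
)
```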

Druva Cloud Platform, a software product built on AWS for cloud data protection and management, has new functionality that uses machine learning algorithms to assess which tier backup data belongs in and automatically moves it there, optimizing cloud storage costs. Customers would also have oversight of the data and be able to manually manage it if they choose.

NetApp Cloud Volumes OnTap and Hitachi Vantara have similar automated cloud tiering features to optimize data storage between on-premises environments and public clouds. Druva’s upcoming feature is different in that it works with all layers of AWS storage.

Druva is a privately held software company based in Sunnyvale, Calif., and completed a $130 million funding round earlier this year. It acquired CloudLanes around that same time, adding its on-premises data ingestion capability to Druva Cloud Platform. In 2018, Druva acquired CloudRanger, which provides data protection for AWS.

AWS supplies its customers with the tools to build tiered storage and prices the tiers competitively, according to Mike Palmer, chief product officer at Druva, but customers have to rely on their own know-how in order to stitch together a system to manage it. They’d have to build their own indexes for data visibility and develop a clear understanding of the pricing and benefits of each cloud storage tier, then come up with ways to determine where their data should go to maximize savings.

“Amazon makes the customer the systems integrator, and that’s by design. They’re providing the absolute best price and performance, but it’s your job to [put it together],” Palmer said.

Placing data in the wrong tier could be costly to a business. With deeper archives come larger penalties for pulling data out early, and there are fees for putting data into AWS and taking it out. Palmer said many enterprises understand the potential benefits of the cloud, but they are also worried that mismanagement will wipe out any cost savings. Intelligent tiering provided by Druva cloud is designed to remove the need for this level of Amazon expertise.

“We’re going to make it easy for an untrained person to be able to clearly understand what’s happening to their data, how to get it back, and how to manage the cost,” Palmer said.

From a storage standpoint, the cloud has some benefits over on-premises options, said Steven Hill, senior analyst at IT market analyst firm 451 Research. Consumption-based pricing prevents overprovisioning (although storage as a service exists for on-premises infrastructure), and there are often five or more storage tiers with a wide range of response times.

Realizing those benefits can be difficult, as more choices also mean more complexity. Druva Cloud’s new automated tiering addresses the complexity of balancing the availability of data with the cost of storing it.

“Having more choices can be really cost-efficient, provided that you have an abstraction layer capable of placing data in the appropriate tiers efficiently and automatically,” Hill said.

Druva’s intelligent tiering works not only with existing storage tiers in AWS but with future ones as well. The machine learning algorithm works behind the scenes to predict data usage patterns and compare them against the costs of the different AWS tiers.
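The article does not describe Druva's algorithm, but the tradeoff it weighs can be illustrated with a toy comparison: for each tier, the expected monthly cost is the storage price plus the expected retrieval volume times the retrieval fee, and colder tiers win only when data is rarely read. The per-gigabyte figures below are made-up placeholders, not AWS list prices.

```python
# Toy cost comparison (NOT Druva's algorithm, NOT real AWS pricing):
# expected monthly cost = GB stored * storage price + GB retrieved * retrieval fee.
TIERS = {
    # tier name: (storage $/GB-month, retrieval $/GB) -- placeholder figures
    "S3": (0.023, 0.00),
    "Glacier": (0.004, 0.03),
    "Glacier Deep Archive": (0.001, 0.05),
}


def cheapest_tier(gb_stored: float, gb_retrieved_per_month: float) -> str:
    """Return the placeholder tier with the lowest expected monthly cost."""
    def monthly_cost(prices: tuple) -> float:
        storage_price, retrieval_fee = prices
        return gb_stored * storage_price + gb_retrieved_per_month * retrieval_fee

    return min(TIERS, key=lambda tier: monthly_cost(TIERS[tier]))


# 1 TB of backups that is almost never restored lands in the coldest tier,
# while the same data restored heavily every month stays in S3.
print(cheapest_tier(gb_stored=1024, gb_retrieved_per_month=1))     # Glacier Deep Archive
print(cheapest_tier(gb_stored=1024, gb_retrieved_per_month=2048))  # S3
```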

Hill said many organizations are understandably hesitant to allow automated processes to determine storage policy, which is why Druva’s new function allows users who know their data access patterns to set tiering themselves. However, as AWS and other cloud vendors introduce more tiers to their cloud storage services, the complexity will grow to a point where automated management is the only logical step.


“A growing number of customers won’t have the time or skills needed to manually make these decisions as they evolve,” Hill said.

Hill believes over time, the automation of hybrid storage tiering will be efficient and reliable enough that, “it will gain the trust of even the most apprehensive storage or data protection administrator.”

Druva’s automated cloud tiering feature is currently in early access and will be available in September 2019. The feature will be free and can be enabled directly from Druva’s backup policy interface. After providing information about their data retention needs, customers will be presented with an adjusted bill for their existing Druva subscription that factors in the discounts from moving data to colder tiers. Customers receive a bill only from Druva, not from AWS.


IFS, Acumatica take aim at top-tier ERP vendors — together

The not-quite-merger between IFS AB and Acumatica Inc. could result in a coalition that challenges top-tier ERP vendors.

The two ERP vendors became corporate siblings in June when IFS’s owner, EQT Partners, acquired Acumatica. IFS board chairman Jonas Persson is expected to be chairman of both companies, but they will operate independently, according to Darren Roos, IFS CEO. Roos will serve on the Acumatica board.

IFS and Acumatica both sell manufacturing ERP products, but they serve different markets, which will lead to collaboration opportunities, according to Roos. The companies will not compete for the same customers.

Darren Roos, CEO, IFS

“There are loads of areas where IFS can bring value to Acumatica customers, and vice versa,” Roos said. “It’s a case of trying to sustain the growth in each of the businesses separately, while leveraging those opportunities to help each other.”

Based in Linkoping, Sweden, IFS targets large enterprises, with customers primarily in the aerospace and defense, energy and utilities, manufacturing, construction and services industries. Acumatica, based in Seattle, is a cloud-first ERP that targets mid-market companies. IFS is strongest in Europe and has a growing presence in North America, according to Roos.

Although Acumatica is about ten times smaller than IFS, each company can take advantage of the other’s strengths to grow and improve their products, according to Jon Roskill, Acumatica CEO. On the business side, IFS uses a direct sales approach while Acumatica sells through a value-added reseller channel network, Roskill explained. Acumatica can use IFS’s direct salesforce to expand into Europe, and IFS can use the Acumatica channel to grow its North American presence.

“In technology [Acumatica is] strong in cloud and cloud interfaces, and IFS is accelerating in that direction, so that’s a place where they can leverage some of Acumatica’s skills,” Roskill said. “IFS has a robust set of technologies in manufacturing and field service that are localized to many countries, so we think we can take some of that technology back to Acumatica.”

Sharing is good

Sharing resources makes sense for both companies, and the two companies are not likely to step on each other’s toes competitively, according to Cindy Jutras, president of Mint Jutras, an ERP analysis firm based in Windham, N.H.

“This is more about Acumatica than IFS. I don’t see a lot of overlap in target markets, but I think there will be some synergies here,” Jutras said. “IFS already has a strong solution in their target market — asset-intensive industries — which is not overly broad.”

Acumatica’s cloud-first origins should help IFS broaden its cloud deployment efforts, an area where it was relatively weak compared to some competitors, Jutras explained.

“As a cloud pioneer, Acumatica has developed a strong, scalable business model, while IFS lags behind many of its competitors in moving from [on-premises] solutions to delivering software as a service,” Jutras said in an analysis she wrote. “In this sense, IFS stands to gain more from Acumatica.”

Acumatica adds Phocas BI

Acumatica has added functionality through partnerships with independent software vendors (ISVs), and according to Roskill, there are more than 150 ISVs that have built integrations into Acumatica’s ERP platform. These include DocuSign for digital contract management and Smartsheet for collaborative workflow management, as well as Microsoft Power BI and Tableau for business intelligence (BI).

Acumatica recently added more BI capabilities via a partnership with Phocas Software, whose cloud-based BI application is designed specifically for manufacturing.

Jon Roskill, CEO, Acumatica

Roskill described the Phocas partnership as a good fit for Acumatica because the two companies are aimed at a similar mid-market manufacturing base. Acumatica already has partnerships with some of the larger, more well-known BI platforms including Microsoft Power BI and Tableau, but Roskill said Phocas’ ease of use is an important factor for mid-market companies that may not be able to afford data analysts.

“You can do almost anything with Tableau, but you’ve got to put a lot of emphasis into your expertise,” he said. “The difference with Phocas is that it gets very specific manufacturing and distribution oriented analytics and lets you tune that out of the box.”

The goal of Phocas’ BI software is to take industry-specific data from ERP systems like Acumatica and “make it consumable for business professionals so they can make better data-driven decisions,” according to Jay Deubler, president of Phocas’ U.S. division.

“We’ve aligned ourselves with Acumatica, which is an ERP provider, for what we would call our perfect prospect profile in the mid-market business,” Deubler said. “Because we now have a pre-written integration, the implementation becomes much easier and less expensive for customers.”


Zoom vulnerability reveals privacy issues for users

Zoom faced privacy concerns after the disclosure of a vulnerability that could allow threat actors to use the video conferencing software to spy on users.

The Zoom vulnerability, originally reported to only affect the Mac version of the software, has been found to partially affect Windows and Linux as well. Jonathan Leitschuh, software engineer at open source project Gradle, disclosed the Zoom vulnerability in a blog post earlier this week and said it “allows any website to forcibly join a user to a Zoom call, with their video camera activated, without the user’s permission.”

“On top of this, this vulnerability would have allowed any webpage to DOS (Denial of Service) a Mac by repeatedly joining a user to an invalid call,” Leitschuh added. “Additionally, if you’ve ever installed the Zoom client and then uninstalled it, you still have a localhost web server on your machine that will happily re-install the Zoom client for you, without requiring any user interaction on your behalf besides visiting a webpage.”

According to Leitschuh, it took Zoom 10 days to confirm the vulnerability. In a meeting on June 11, he told Zoom there was a way to bypass the planned fix, but the company did not address that concern when it reported the vulnerability fixed close to two weeks later. The vulnerability resurfaced on July 7, Leitschuh published his disclosure on July 8, and Zoom patched the Mac client on July 9. Zoom also worked with Apple on a silent background update for Mac users, released July 10, which removed the Zoom localhost web server from systems.

“Ultimately, Zoom failed at quickly confirming that the reported vulnerability actually existed and they failed at having a fix to the issue delivered to customers in a timely manner,” Leitschuh wrote. “An organization of this profile and with such a large user base should have been more proactive in protecting their users from attack.” 

Zoom — whose video conferencing software is used by more than 4 million users in approximately 750,000 companies around the world — downplayed the severity of the issue and disputed Leitschuh’s characterization of the company.


“Once the issue was brought to our Security team’s attention, we responded within ten minutes, gathering additional details, and proceeded to perform a risk assessment,” Richard Farley, CISO at Zoom, wrote in the company’s response. “Our determination was that both the DOS issue and meeting join with camera on concern were both low risk because, in the case of DOS, no user information was at risk, and in the case of meeting join, users have the ability to choose their camera settings.”

“To be clear, the host or any other participant cannot override a user’s video and audio settings to, for example, turn their camera on,” Farley added. 

Both the disclosure and response from Zoom portrayed the issue as only affecting the Mac client, but Alex Willmer, Python developer for CGI, wrote on Twitter that the Zoom vulnerability affected Windows and Linux as well.

“In particular, if zoommtg:// is registered as a protocol handler with Firefox then [Zoom] joins me to the call without any clicks,” Willmer tweeted. “To be clear, a colleague and I saw the auto-join/auto-webcam/auto-microphone behavior with Firefox, and Chromium/Chrome; on Linux, and Windows. We did not find any webserver on port 19421 on Linux. We didn’t check Windows for the webserver.”
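For Mac users who want to verify whether the leftover local web server described above is still present, a minimal, unofficial check is to see whether anything answers an HTTP request on localhost port 19421, the port Willmer mentions. The sketch below only shows that the port is in use; it does not prove the listener is Zoom's.

```python
# Unofficial check: does anything answer an HTTP request on 127.0.0.1:19421,
# the localhost port associated with Zoom's leftover web server on macOS?
import http.client


def probe_local_port(port: int = 19421, timeout: float = 2.0) -> None:
    conn = http.client.HTTPConnection("127.0.0.1", port, timeout=timeout)
    try:
        conn.request("GET", "/")
        response = conn.getresponse()
        print(f"Port {port} answered with HTTP status {response.status} - inspect the listener.")
    except http.client.HTTPException:
        print(f"Port {port} answered, but not with a well-formed HTTP response.")
    except OSError:
        print(f"Nothing answered on 127.0.0.1:{port}.")
    finally:
        conn.close()


if __name__ == "__main__":
    probe_local_port()
```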

Leitschuh confirmed Willmer’s discovery, but it is unclear whether Zoom is working to fix these platform clients. Leitschuh also noted in his disclosure that the issue affects a white-label version of Zoom licensed to VoIP provider RingCentral. It is unclear if RingCentral has been patched.

Leitschuh told SearchSecurity via Twitter DM that “Zoom believes the Windows/Linux vulnerabilities are the browser vendors’ to fix,” but he disagrees.

Zoom did not respond to requests for comment at the time of this post.

Tom Patterson, chief trust officer at Unisys, said the tradeoff between security and ease of use is “not always a fair trade.”

“The fact that uninstalling any app doesn’t completely uninstall all components runs counter to engendering trust. In this case, it’s an architectural decision made by the manufacturers which appears to be designed to make operations much easier for users,” Patterson told SearchSecurity. “This trust tradeoff, between making it easy and making it secure, is something that every consumer should consider.”


Adobe acquisition of Marketo could shake up industry

The potential Adobe acquisition of Marketo could unsettle the customer experience software market and give Adobe, which is mainly known for its B2C products, a substantial network of B2B customers from Marketo.

Adobe is in negotiations to acquire marketing automation company Marketo, according to reports.

“It’s a trend that B2B customers are trying to become more consumer-based organizations,” said Sheryl Kingstone, research director for 451 Research. “Marketo is maybe throwing in the towel in being a lead marketing vendor on its own.”

But, reportedly, talks between Adobe and Marketo’s holding company may not lead to a deal.

Ray Wang, founder of Constellation Research, said leaks could be coming from Vista Equity Partners Management, which bought Marketo in 2016 and took the company private, in the hopes of adding another bidder to the race to acquire Marketo.

“If people think Adobe would buy Marketo, maybe it would get SAP to think about it,” Wang said. “The question is, who needs marketing automation or email marketing? And when you think about the better fit at this moment, it’s SAP.”

When reached for comment, Adobe declined, adding that it does not comment on acquisition rumors or speculation.

Adobe expanding to B2B

Marketo said it had roughly 4,600 customers when it was acquired by Vista Equity. It’s unclear whether Adobe and Marketo have much overlap between customer bases, but there could be product overlap between the software vendors.


Adobe has its Marketing Cloud system, and both vendors offer basic martech features, like lead scoring, lead segmentation, web tracking, SMS marketing, personalized web content and predictive analytics. But an Adobe acquisition of Marketo would allow Adobe to expand into a wider B2B market, while allowing Marketo to offer its users the ability to market more like a B2C vendor using Adobe’s expertise.

“It’s a huge benefit for Marketo when you look at Adobe,” Kingstone said.

“Marketo has struggled in a B2B sense when its customers try to implement an ABM [account-based marketing] strategy,” she said.

Despite any potential overlap with its own products’ marketing capabilities, Adobe could find the chance to break into a pool of nearly 5,000 B2B customers compelling.

“There’s a lot of value in Marketo, and Adobe has been gun shy about entering B2B,” Wang said.

Adobe’s alliance

If the Adobe acquisition reports turn out to be accurate, it would amplify what has already been a busy year for the vendor. In May, Adobe acquired commerce platform Magento for a reported $1.7 billion.

A Reuters report about the Adobe acquisition of Marketo said the likely price would well exceed the $1.8 billion that Vista paid for Marketo when it took the company private.

Over the past few years, industry-leading companies in the CRM and customer experience spaces have sought to build alliances with other vendors.

Adobe and Microsoft have built a substantial partnership and have even gone to market together with products, while Salesforce and Google unveiled their partnership and product integrations last year at Salesforce’s annual Dreamforce conference.

Marketo has been one of the few major martech vendors without an alliance. Combining its technologies with Adobe’s creative suite and potentially Microsoft’s B2B breadth could make a significant imprint on the industry.

“If this is real, then it means Adobe has gotten serious about B2B,” Wang said.

Editor’s note: TechTarget offers ABM and project intelligence data and tools services.

Wanted – Dell XPS 13 9360

Hi
I did consider what you had posted; however, I too had spent some time looking into what would be a suitable laptop for my son, who is going into 6th next month. I was disappointed when the thread was archived, as I was going to put an offer in as well.

Sean had an FS thread for a couple of screens, one of which I was interested in. I asked if he would be interested in a deal for both the screen and the laptop, only to find out the screen had been sold off forum, so I pointed Sean in the direction of my wanted thread should he change his mind.

The rest can be seen above: the laptop has been paid for, and Sean should have put Windows 10 back on and posted it today for delivery tomorrow.

I wish you luck in your hunt; perhaps a wanted thread may help, and there is another XPS here for sale.

Plan to map UK’s network of heart defibrillators could save thousands of lives a year

Thousands of people who are at risk of dying every year from cardiac arrest could be saved under new plans to make the public aware of their nearest defibrillator.

There are 30,000 cardiac arrests outside of UK hospitals annually, but fewer than one in 10 of those people survive, compared with a 25% survival rate in Norway, 21% in North Holland and 20% in Seattle, in the US.

A new partnership between the British Heart Foundation (BHF), Microsoft, the NHS and New Signature aims to tackle the problem by mapping all the defibrillators in the UK, so 999 call handlers can tell people helping a cardiac arrest patient where the nearest device is.

Ambulance services currently have their own system of mapping where defibrillators are located but this is not comprehensive.


“There is huge potential ahead in the impact that technology will have in digitally transforming UK healthcare,” said Clare Barclay, Chief Operating Officer at Microsoft. “This innovative partnership will bring the power of Microsoft technology together with the incredible vision and life-saving work of BHF and the NHS. This project, powered by the cloud, will better equip 999 call handlers with information that can make the difference between life and death and shows the potential that innovative partnerships like this could make to the health of the nation.”

Cardiac arrest occurs when the heart fails to pump effectively, resulting in a sudden loss of blood flow. Symptoms include a loss of consciousness, abnormal or absent breathing, chest pain, shortness of breath and nausea. If not treated within minutes, it usually leads to death.

Defibrillators can save the life of someone suffering from a cardiac arrest by providing a high-energy electric shock to the heart through the chest wall. This allows the body’s natural pacemaker to re-establish the heart’s normal rhythm.

However, defibrillators are used in just 2% of out-of-hospital cardiac arrests, often because bystanders and ambulance services don’t know where the nearest device is located.

Owners of the tens of thousands of defibrillators in workplaces, train stations, leisure centres and public places across the country will register their device with the partnership. That information will be stored in Azure, Microsoft’s cloud computing service, where it will be used by ambulance services during emergency situations. The system will also remind owners to check their defibrillators to make sure they are in working order.

It is hoped that the partnership can evolve to enable defibrillators to self-report their condition, as well as capture heart data from cardiac arrest patients that can be sent to doctors.

Simon Gillespie, Chief Executive of the BHF, said: “Every minute without CPR or defibrillation reduces a person’s chance of surviving a cardiac arrest by around 10%. Thousands more lives could be saved if the public were equipped with vital CPR skills, and had access to a defibrillator in the majority of cases.


“While we’ve made great progress in improving the uptake of CPR training in schools, public defibrillators are rarely used when someone suffers a cardiac arrest, despite their widespread availability. This unique partnership could transform this overnight, meaning thousands more people get life-saving defibrillation before the emergency services arrive.”

Simon Stevens, Chief Executive of NHS England, added: “This promises to be yet another example of how innovation within the NHS leads to transformative improvements in care for patients.”

The defibrillation network will be piloted by West Midlands Ambulance Service and the Scottish Ambulance Service, before being rolled out across the UK.


CEO outlines Data Dynamics StorageX trends, future features

Data Dynamics CEO Piyush Mehta admitted he could not have envisioned the customer challenges his company would need to address as it marks its sixth anniversary.

The Teaneck, N.J., vendor focused on file migration when Mehta founded the company in 2012. But the Data Dynamics StorageX software has since added capabilities, such as analytics and native Amazon S3 API support, to help customers better understand and manage their data as they transition to hybrid cloud architectures. 

The StorageX software enables users to set policies to move files from one storage system to another, including on-premises or public cloud object storage, and to retrieve and manage their unstructured data. New features on the horizon include container support, NFSv4 namespace capabilities and greater integration with Microsoft’s Active Directory, according to Mehta.

Mehta said Data Dynamics StorageX currently has more than 100 customers, including 25 Fortune 100 companies and six of the world’s top 12 banks. He estimated more than 90% of customer data is stored on premises.

StorageX actually goes back to 2002, when it was developed by startup NuView for file virtualization. SAN switching vendor Brocade acquired NuView in 2006 and tried to use StorageX for a push into storage software. After Brocade’s software strategy failed, it sold the StorageX intellectual property to Data Dynamics, which relaunched the software as a file migration tool in 2013.

In a Q&A, Mehta discussed the latest customer trends, upcoming product capabilities and recent pricing changes for Data Dynamics StorageX software.

How has the primary use case for Data Dynamics StorageX software changed from the one you initially envisioned?

Data Dynamics CEO Piyush Mehta

Piyush Mehta: What ended up happening is, year over year, as customers were leveraging StorageX for large-scale migrations, we realized a consistent challenge across environments. Customers lost track of understanding the data, lost track of owners, lost track of impact when they moved the data. And we realized that there’s a business opportunity where we could add modules that can help do that.

Think of this as ongoing lifecycle management of unstructured data that just keeps growing at 30%, 40%, 50% year over year. The second aspect to that was helping them move data not just to a NAS tier, but also to object storage, both on- and off-prem.

What are the latest customer trends?

Mehta: One theme that we continue to see is a cloud-first strategy regardless of vertical; every company, every CIO, every head of infrastructure talks about how they can leverage the cloud. The challenge is very few have a clearly defined strategy of what cloud means. And from a storage perspective, the bigger challenge for them is to understand what these legacy workloads are and where they can find workloads that can actually move to the cloud.

For born-in-the-cloud workloads, with applications that were started there, it’s an easy starting point. But for the years and years of user and application data that’s been gathered and collected, all on-prem, the question becomes: How do I manage that?

The second thing is a reality check that everything’s not going to the public cloud, and everything’s not going to stay local. There’s going to be this hybrid cloud concept where certain data and certain applications will most likely — at least for the foreseeable future — reside locally. And then whatever is either not used, untouched, deep archive, those type of things can reside in the cloud.

Are customers more interested in using object storage in public or private clouds?

Mehta: It’s a mixture. We do see huge interest in AWS and Glacier as a deep archive or dark data archive tier. At the same time, we see [NetApp’s] StorageGrid, [IBM Cloud Object Storage, through its] Cleversafe [acquisition], Scality as something that customers are looking at or have deployed locally to tier large amounts of data — but, again, data that’s not necessarily active.

Do you find that people are more inclined to store inactive data than implement deletion policies?

Mehta: I still haven’t seen the customer who says, ‘It’s OK to delete.’ You’ll have the one-off exceptions where they may delete, but there’s always this propensity to save, rather than delete, because I may need it.

What you end up finding is more and more data being stored — in which case, why would I keep it on primary NAS? No matter how cheap NAS may be getting, I’d rather put it on a cheaper tier. That’s where object conversations are coming in.

Which of the new StorageX capabilities have customers shown the most interest in?

Mehta: We have seen huge interest, adoption and sale of our analytics product. Most customers don’t know their data — type, size, age, who owns it, how often it’s being accessed, etc. We’ve been able to empower them to go in and look at these multi-petabyte environments and understand that. Then, the decision becomes: What subset of this do I want to move to a flash tier? What subset do I want to move to a scale-up, scale-out NAS tier?

Then, there is what we call dark or orphan data, where a company says, ‘Anything over 18 months old can sit in the active archive tier’ — and by active, I mean something that’s disk-driven, rather than tape-driven. That’s where we’re seeing object interest come in. First, help me do the analytics to understand it. And then, based on that, set policies, which will then move the data.

Does Data Dynamics offer the ability to discover data?

Mehta: We have an analytics module that leverages what we call UDEs — universal data engines. In the old world, when we were doing migrations only, they were the ones that were doing the heavy lifting of moving the data. Now, they also have the ability to go collect data. They will go ahead and crawl the file system or file directories and capture metadata information that then is sent back into the StorageX database, which can be shared, as well as exported. We can give you aggregate information, and then you can drill on those dashboards, as needed.
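Mehta does not detail how the universal data engines are built, but the kind of crawl he describes, walking a file tree and recording per-file metadata for later analysis, can be sketched roughly as follows. The fields collected and the CSV output are assumptions for illustration, not StorageX internals.

```python
# Rough illustration of a metadata-gathering crawl: walk a directory tree and
# record per-file metadata for later analysis. Fields and CSV output are
# illustrative assumptions, not StorageX internals.
import csv
import os
import sys
import time


def crawl(root: str, out_path: str) -> None:
    """Walk `root` and write one row of metadata per file to a CSV file."""
    with open(out_path, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["path", "size_bytes", "modified_utc", "accessed_utc", "owner_uid"])
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                full_path = os.path.join(dirpath, name)
                try:
                    st = os.stat(full_path)
                except OSError:
                    continue  # skip files that vanish or cannot be read
                writer.writerow([
                    full_path,
                    st.st_size,
                    time.strftime("%Y-%m-%d %H:%M:%S", time.gmtime(st.st_mtime)),
                    time.strftime("%Y-%m-%d %H:%M:%S", time.gmtime(st.st_atime)),
                    st.st_uid,  # owner UID on POSIX; resolving a name needs an extra lookup
                ])


if __name__ == "__main__":
    crawl(sys.argv[1], sys.argv[2])
```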

Does your analytics engine work on object- and file-based data?

Mehta: It works only on file today. It’s really to understand your SMB and NFS data to help determine how to best leverage it. Most of that data — I would say north of 95% — is sitting on some kind of file tier when you look at unstructured data. It’s not sitting on object.

Where is StorageX deployed?

Mehta: The StorageX software gets deployed on a server within the customer environment, because that’s your control mechanism, along with the back-end databases. That’s within the customer’s firewalls. From an infrastructure standpoint, everything sits in central VMs [virtual machines]. We’re moving it to a container technology in our next release to make it far more flexible and versatile in terms of how you are scaling and managing it.

What other capabilities do you think you’ll need moving forward?

Mehta: More integration with Active Directory so that we can provide far more information in terms of security and access than we can today. From a technology standpoint, we are continuing to make sure that we support API integration downstream into local storage vendors — so, the latest operating systems and the latest APIs. Then, from a hyperscaler standpoint, being able to have native API integration into things like Azure Blob and Glacier are things that are being added.

Data Dynamics updated StorageX pricing this year. There’s no longer a fee for file retrieval, but the prices for analytics, replication and other capabilities increased. What drove the changes?

Mehta: The costs haven’t gone up. Before, we were giving you a traditional licensing mechanism where you had two line items: a base license cost and a maintenance cost. That was confusing customers, so we decided to just make it one single line item. Every module of ours now becomes an annual subscription based on capacity, where the cost of maintenance is embedded into it.

The other thing we learned from our customers was that when you looked at both an archive and a retrieval capability, we wanted customers to have the flexibility to manage that without budgeting and worrying about the cost constraints of what they were going to retrieve. It’s hard to predict what percentage of the data that you archive will need to be brought back. The management of the ‘bring back, send back, bring back, send back’ becomes a huge tax on the customer.

Now, the functionality of retrieval is given to you as part of your archive module, so you are not paying an incremental cost for it. It became subscription, so it’s just an auto renewal, rather than worrying about it from a Capex perspective and renewing maintenance and all of that.