
Economic worry may be impacting HR budgets

On last week’s earnings call with financial analysts, Workday Inc. CEO Aneel Bhusri was asked for his opinion on the broader economic outlook. He was both vague and definitive. His company wasn’t seeing problems in its own product pipeline, “but there is no question there is uncertainty in the air,” he said.

In HR departments, the uncertainty has turned into action, according to Gartner. In a survey of 171 HR managers, 92% said they are now “prioritizing budgeting and cost optimization initiatives.”

This means HR managers are taking specific steps to control spending, said Daniel Dirks, managing vice president at Gartner’s HR practice. The most likely effects are on department hiring and technology buying decisions, he said.

A hiring freeze would be near the top of HR budget actions, “because it is relatively easy to do,” Dirks said. HR managers are also taking a hard look at their tech vendor contracts. They are “making sure what was promised in the contract is really being delivered,” he said.

But no worries, so far, in HR tech

The concern about an economic downturn is not turning up in HR vendor spending.

On Aug. 29, Workday, for instance, reported total revenues of nearly $888 million, an increase of 32% from the same quarter a year ago.


ADP LLC recently reported a revenue increase of 6% to $14.2 billion. Lisa Ellis, a MoffettNathanson partner who leads its payments, processors, and IT services business, described the increase as “great results” on a July 31 analyst call. Ellis also noted on the call that ADP’s guidance for 2020 “implies a pretty robust outlook on the U.S. economy.”

Venture capital (VC) investments in HR remain strong, according to HRWins, which reported nearly $1.5 billion in global HR VC investment in the second quarter. It sees 2019 VC investment outpacing last year by a strong margin.

Nonetheless, Dirks said there is a “change in sentiment and in mindset” in HR because of the economy. For the last 10 years, HR priorities have focused on finding talent and investing in tech; cost optimization is now emerging as a new priority. But Dirks said the new priority isn’t necessarily emerging at the expense of HR’s other priorities.

Gartner is also advising HR managers to play a broader role in watching the economy. It recommends teaming up with peers in finance and sales, for instance, to look at broader economic data.

HR managers have expertise in the labor market and may be able to identify market shifts. This could include, for instance, an increase in part-time hiring, if firms are becoming more conservative in hiring full time.


Slack-Zoom partnership could include joint pricing

Analysts expect Slack and Zoom to eventually bundle pricing for their cloud software to undercut the appeal of all-in-one collaboration suites from Microsoft and Cisco.

Team messaging vendor Slack and online meetings provider Zoom signed a contract earlier this year to align their product roadmaps and marketing strategies. So far, the companies have focused on making it easier for businesses to use their products side by side.

A logical next step would be for the upstarts to offer bundled pricing deals to woo businesses already paying for Office 365, which includes the competing app Microsoft Teams, analysts said. It could also help the vendors counter competitors such as Cisco and Google, which sell both messaging and meetings apps.

“Neither of them can own a customer the way Microsoft or Cisco can,” said Jon Arnold, principal analyst at J Arnold and Associates. “So I think for them to get into the enterprise market, where the real dollars are, it’s just got to be a stronger offering.”

Companies have been willing to pay for Slack and Zoom up to this point because their products have offered more features and have been easier to use than many competing apps, said Irwin Lazar, analyst at Nemertes Research.

“That’s not a sustainable advantage. The other vendors are getting better,” Lazar said. “So I think, at some point, they are going to have to figure out how to have combined go-to-market offers that bundle pricing to provide discounts.”

A Zoom representative acknowledged a pricing bundle with Slack wasn’t outside the realm of possibility. The companies are already working together on joint billboards and other marketing initiatives, said Laura Padilla, who directs Zoom’s business development and channel program.

“Around bundling or packaging with pricing, that’s not something we’re willing to discuss right now, but they are considered a strategic partner, so there may be something like that in the future,” Padilla said.

Slack declined to comment.

Slack features coming to Zoom

In the meantime, businesses can expect tighter integrations between the two apps — including ways to access Slack from the Zoom interface.

Slack CEO and co-founder Stewart Butterfield, left, with Zoom CEO and founder Eric Yuan at Slack Frontiers 2019.

The pair’s integrations to date have focused on making it possible for users to join Zoom meetings from Slack. But the companies have a multi-year roadmap that includes a commitment to build integrations in the opposite direction.

“We’re going to be bringing in a lot more of the Slack features into Zoom,” Padilla said.

The Slack-Zoom partnership bore its first fruit late last month with an update to the Zoom app for Slack. In Slack, users can now view the participants in an ongoing Zoom meeting. They can also set up notifications that let them join Zoom meetings by clicking a button rather than typing a command.

In the future, users will be able to make phone calls in Slack using Zoom Phone, a new cloud calling service.

The partnership sets the stage for a showdown between two models for selling software.

Vendors such as Microsoft offer a cloud-based suite of productivity and collaboration apps able to meet virtually all of a customer’s needs. Vendors such as Slack and Zoom specialize in one type of app to make better software than anyone else in their niche.  

Slack and Zoom both began trading stock publicly early this year and must now report financial results to Wall Street every quarter. To achieve long-term profitability, they each need to sell to larger businesses, a huge chunk of which already uses Office 365. 

More than 15,000 teams have installed Slack’s integration with Zoom, which lets users launch Zoom meetings from Slack. That’s up from more than 10,000 teams in April and roughly 5,000 teams a year ago, showing that companies are increasingly using the two apps in tandem.

“Slack doesn’t have the meetings and Zoom doesn’t have the workstream collaboration,” said Mike Fasciani, analyst at Gartner. “So it really requires a pairing of these two to be able to stand up and compete against Microsoft.”


Announcing new AI and mixed reality business applications for Microsoft Dynamics

Today, I had the opportunity to speak to press and analysts in San Francisco about our vision for business applications at Microsoft. In addition, I had the privilege to make two very important announcements: the upcoming availability of new Dynamics 365 AI applications, and our very first mixed reality business applications: Dynamics 365 Remote Assist and Dynamics 365 Layout.

Our vision for business applications at Microsoft

We live in a connected world where companies are challenged every day to innovate so they can stay ahead of emerging trends and repivot business models to take advantage of new opportunities to meet growing customer demands.

To innovate, organizations need to reimagine their processes. They need solutions that are modern, enabling new experiences for how they can engage their customers while making their people more productive. They need unified systems that break data silos, so they have a holistic view of their business, customers and employees. They need pervasive intelligence threaded throughout the platform, giving them the ability to reason over data, to predict trends and drive proactive intelligent action. And with adaptable applications, they can be nimble, allowing them to take advantage of the next opportunity that comes their way.

Two years ago, when we introduced Dynamics 365 we started a journey to tear down the traditional silos of customer relationship management (CRM) and enterprise resource planning (ERP). We set out to reimagine business applications as modern, unified, intelligent and adaptable solutions that are integrated with Office 365 and natively built on Microsoft Azure.

With the release of our new AI and mixed reality applications we are taking another step forward on our journey to help empower every organization on the planet to achieve more through the accelerant of business applications. Specifically, today we are making the following announcements:

Dynamics 365 + AI

First, I am happy to announce the coming availability of a new Dynamics 365 AI offering — a new class of AI applications that will deliver out-of-the-box insights by unifying data and infusing it with advanced intelligence to guide decisions and empower organizations to take informed actions. And because these insights are easily extensible through the power of Microsoft Power BI, Azure and the Common Data Service, organizations will be able to address even the most complex scenarios specific to their business.

Dynamics 365 AI for Sales: AI can help salespeople prioritize their time to focus on deals that matter most, provide answers to the most common questions regarding the performance of sales teams, offer a detailed analysis of the sales pipeline, and surface insights that enable smarter coaching of sales teams.

Dynamics 365 AI for Customer Service: With Microsoft’s AI and natural language understanding, customer service data can surface automated insights that help guide employees to take action and can even leverage virtual agents to help lower support costs and enable delightful customer experiences, all without needing in-house AI experts and without writing any code.

Dynamics 365 AI for Market Insights: Helps empower your marketing, social media and market research teams to make better decisions with market insights. Marketers can improve customer relationships with actionable web and social insights to engage in relevant conversations and respond faster to trends.

To help bring this to life, today we released a video with our CEO, Satya Nadella, and Navrina Singh, a member of our Dynamics 365 engineering team, showing examples of ways we’re bringing the power of AI to customer service organizations.

Dynamics 365 + Mixed Reality

Our second announcement of the day centers on the work we are doing to bring mixed reality and business applications together.

Since the release of Microsoft HoloLens over two years ago, the team has learned a lot from customers and partners. The reception HoloLens has received within the commercial space has been overwhelmingly positive, supported by increased demand and deployment from some of the world’s most innovative companies.

We recognize that many employees need information in context to apply their knowledge and craft. Not only on a 2-D screen — but information and data in context, at the right place, and at the right time, so employees can produce even greater impact for their organizations. Mixed reality is a technology uniquely suited to do exactly that.

This is a whole new kind of business application, and that’s precisely what we’re introducing today: Dynamics 365 Remote Assist and Dynamics 365 Layout.

Today, we also showcased for the first time how Chevron is deploying HoloLens to take advantage of Dynamics 365 mixed reality business applications.

Chevron is already achieving real, measurable results with its global HoloLens deployment. Previously, the company had to fly an inspector from Houston to a facility in Singapore once a month to inspect equipment. Now it performs just-in-time inspections using Dynamics 365 Remote Assist and can identify issues or provide approvals immediately.

In addition, remote collaboration and assistance have helped the company operate more safely in a better work environment, serving as a connection point between firstline workers and remote experts, as well as cutting down on travel and eliminating risks associated with employee travel.

Here is a peek into the work Chevron is doing with mixed reality.

Unlock what’s next with the Dynamics 365 October 2018 release

Next week at Microsoft Ignite and Microsoft Envision we’ll be in Orlando talking with thousands of customers, partners, developers, and IT and business leaders about our October 2018 release for Dynamics 365 and the Power platform that will be generally available Oct. 1. The wave of innovation this represents across the entire product family is significant, with hundreds of new capabilities and features.

We will have a lot more to talk about in the weeks and months ahead. We look forward to sharing more!


Broadcom acquisition of CA seeks broader portfolio

The out-of-the-blue Broadcom acquisition of CA Technologies has analysts scratching their heads about how the two companies’ diverse portfolios weave together strategically, and how customers might feel the impacts — beneficial or otherwise.

CA’s strength in mainframe and enterprise infrastructure software, the latter of which is a growing but fragmented market, gives chipmaker Broadcom another building block to create an across-the-board infrastructure technology company, stated Hock Tan, president and CEO of Broadcom.

But vaguely worded statements from both companies’ execs lent little insight into potential synergies and strategic short- or long-term goals of the $18.9 billion deal.

One analyst believes the deal is driven primarily by financial and operational incentives, and whatever technology synergies the two companies create are a secondary consideration for now.

“The operating margins from mainframes are very healthy and that fits very well with Broadcom’s financial model,” said Stephen Elliot, an analyst at IDC.

The bigger issue will be Broadcom’s ability to manage the diverse software portfolio of a company the size of CA. To date, Broadcom’s acquisition strategy has focused almost exclusively on massive deals for hardware companies, in areas such as storage, wireless LAN and networking. “The question is, is this too far of a reach for them? Customers are going to have to watch this closely,” Elliot said.

The overall track record of acquisitions that combine hardware-focused companies and large software companies is not good, Elliot noted. He pointed to the failures of Intel’s acquisition of LANDesk and Symantec’s purchase of Veritas.

Broadcom’s ability to manage CA’s complex and interwoven product portfolio is another concern.


“As far as I can see, Broadcom has little or no visible prior execution or knowledge about a complicated and nuanced software and technology arena such as the one CA addresses … that includes DevOps, agile and security,” said Melinda Ballou, research director for IDC’s application life-cycle management program. “Infrastructure management would be more in their line of work, but still very different.”

Broadcom’s acquisition of CA also fills a need to diversify, particularly in the aftermath of its failed attempt to buy Qualcomm earlier this year, which was blocked by the Trump administration for national security reasons.

“They need to diversify their offerings to be more competitive given they primarily focus on chips, networking and the hardware space,” said Judith Hurwitz, president and CEO of Hurwitz & Associates LLC. “CA has done a lot of work on the operational and analytics side, so maybe [Broadcom] is looking at that as a springboard into the software enablement space.”

Hurwitz does see opportunities for both companies to combine their respective products, particularly in network management and IoT security. And perhaps this deal portends more acquisitions will follow, potentially among companies that compete directly or indirectly with CA. Both Broadcom and CA have pursued growth through numerous acquisitions in recent years.

“You could anticipate Broadcom goes on a spending spree, going after other companies that are adjacent to what CA does,” Hurwitz said. “For example, there was talk earlier this year that CA and BMC would merge, so BMC could be a logical step with some synergy there.”

Peering into the future of enterprise storage for 2018

As the end of the year approaches, analysts, vendors, consultants and anybody else who has some time on their hands are turning their attention to that age-old annual rite of making predictions about what tech to expect in the coming year.

Although I’ve ridiculed these prognostications in the past, I am not above making predictions of my own about the future of enterprise storage. Hopefully, you’ll find my prophecies a bit more fun, even if they do skimp on the facts. But, you know what, those of us who are bold enough to look into the future and tell the world what we see — no matter how silly or creepy that vision may be — can hardly be encumbered by the limitations of truth, reality and, well, sanity.

So, here it goes …

Toshiba enters a new dimension

As I peer into my SAN-attached crystal ball, I see Toshiba stumbling over a nuclear power plant to, almost literally, cause billions of IT dollars to atomize into thin air, leaving the company no choice but to sell off its most profitable, and least scary, business — its solid-state labs and fabs. Oh wait, that already happened, didn’t it? We’ve been reading about the sale of that business for what seems like six or seven years now.

My real prediction is we will finally find out that 4D flash is the reason why Western Digital has so desperately clung to its joint development deal with Toshiba. Already bored with run-of-the-mill 2D and 3D flash, Toshiba’s scientists have invented a new dimension where flash cells can be lined up left and right, stacked up and down, and spun out into alternate universes in the future of enterprise storage. With its infinite capacity and endless scalability, 4D has no need for wear leveling or program erase or any of those old-fashioned techniques used to extend the lives of chips stuck in a mere three dimensions.

Dell EMC splits

By about Q2 2018, Michael Dell will realize the honeymoon is over, and the venerable Dell EMC brand is in need of a shake-up that will have a profound effect on storage technology trends. With no shareholders to confer or contend with, Dell will hold long, animated conversations with himself — captured by the webcam on his Dell XPS 15 laptop (with 8 GB of RAM and a 256 GB SSD).

After weeks of negotiations, during which Dell storms out in a huff on more than one occasion, he announces that Dell EMC will split into two publicly traded companies. Dell will sell laptops and servers, and buy storage companies that few have heard of (and fewer have bought from). EMC will focus on storage hardware, diversifying by buying up so many smaller companies they have to create a team just to keep track of acquisitions.

Nutanix’s super-extra-mega-hyper-converged system

Nutanix, a pioneer of the hyper-converged infrastructure craze that has dominated storage and systems discussions for the last couple of years, will one-up the growing ranks of competitors by adding a new service to its HCI product line. Move over Acropolis, Prism and Enterprise Cloud and make room for Norman. While combining storage, servers, networking and virtualization in a single product helped revolutionize data center infrastructure, Norman promises to turn that on its ear in 2018.

Stuffing all that infrastructure into a single package was pretty cool and oh so convenient, but Nutanix engineers realized something was missing: someone to run the whole thing and treat users with disdain when they request a new virtual machine or some disk capacity. Enter Norman. Each new Nutanix HCI system will include a living, breathing admin, sandwiched somewhere between the storage, servers and networking gear. By the second half of 2018, expect Nutanix’s competitors to play catch-up, doing their best to erase the man-machine line with their own versions of the ultimate converged solution.

LTO-142 debuts

Faced with higher and higher HDD capacities and flash systems that can store the entire world’s knowledge on a single PCIe card, the LTO Consortium decided to speed up product development to keep pace with the competing media. Thus, instead of following up the recent rollout of LTO-8 with LTO-9, the consortium — anchored by IBM, Hewlett Packard Enterprise and Quantum — will take the bold step of shortcutting its previously laid out LTO product development roadmap, accelerating the program with the release of LTO-142 products.

The LTO-142 specs will include capacities of up to 12 yottabytes compressed (3 YB native) and transfer speeds too damned fast to measure. It remains to be seen what effect these radical developments in tape technology will have on the future of enterprise storage or if they’ll even be noticed.

Future of enterprise storage closer to home

In 2018, you will need more capacity and have less money to spend. When you start shopping for more storage, you will have a dizzying array (pun intended) of storage choices, ranging from traditional shared systems to hyper-converged gear to object to cloud and so on.

It looks like you’ll have your hands full even if my fearless forecasts about the future of enterprise storage don’t come to fruition. Happy New Year!

New Pure Storage CEO Giancarlo faces scaling challenge

With revenue approaching $1 billion for 2017, analysts and customers will watch closely to see if new Pure Storage CEO Charles Giancarlo will steer the all-flash array vendor in a different direction.

Giancarlo became the CEO of Pure Storage in August following the surprise resignation of Scott Dietzen. By all accounts, the decision to step down was Dietzen’s alone; he built Pure from a venture-backed fledgling in 2009 to a public company with the third-highest all-flash market share, according to IT analyst firm IDC.

Giancarlo takes over as Pure Storage CEO with an extensive track record as an IT executive. Giancarlo worked at Cisco from 1994 to 2008, serving variously as executive vice president, chief development officer and president of its Linksys division. He later served as interim CEO of telecom giant Avaya, and as a managing partner at equity firm Silver Lake Partners.

Giancarlo has yet to publicly disclose specific plans to grow revenue as the Pure Storage CEO.

New Pure Storage CEO Charles Giancarlo

Danh Duong, lead storage and backup administrator at the University of California, Davis, said he expects little impact from the Pure Storage CEO change. And that would be fine with him. UC Davis uses Pure Storage’s FlashArray as the data repository for an internal private cloud for managed services.

“We’ve always been very happy with Pure support,” Duong said. “They’ve made our storage management extremely easy, which is what we wanted. As long as the core management team is there and they don’t change their licensing and support, I don’t have any concerns about the CEO change.”

Jack Hogan, CTO of women’s health website Lifescript, said he was surprised when Dietzen stepped down Aug. 24. Hogan said Dietzen assuaged his concerns during a personal conversation, and he is encouraged that Dietzen will stay on as Pure’s chairman.

“I think if Scott were gone completely from Pure, the culture he built could erode,” Hogan said. “But the fact he’s staying around as chairman — the leader who says ‘always do the right thing’ — almost gives him more power” to influence Pure’s future.

“It’s not just Scott, but the entire top-level executives all the way down that do a good job of instilling a culture of putting the customer first,” Hogan said. “It’s a do-the-right-thing approach that we’ve seen handfuls and handfuls and handfuls of times.”

Analysts: What’s next after Pure Storage hits $1 billion?

Eric Burgener, a research director for storage at IDC, said tech companies commonly swap CEOs as they move from one phase of growth to another. For Pure, the $1-billion plateau marks a demarcation point.

Pure reported $407 million in revenue for the first half of 2017, and company executives project enough growth in the second half to crack $1 billion for the full year.

“One thing you hear consistently from the venture capital community: the CEO that gets you from zero to $1 billion in revenue is usually not the same CEO that gets you to $10 billion. From that point of view, Scott’s work (as CEO) at Pure is over,” Burgener said.

Pure’s revenue growth has been accompanied by huge losses, but its executives predict the flash pioneer will show a profit this year. Most of the revenue so far has come from its FlashArray, with its newer FlashBlade big data storage platform starting to pick up steam.

Giancarlo faces the task of sustaining profitability by scaling Pure Storage revenue into the multibillion-dollar range.

“That’s going to be a challenge,” Burgener said of the next step. “Pure is not going to reach $10 billion in revenue only by selling FlashArray. They have told me that FlashBlade will account for just under $100 million in revenue for this fiscal year. That’s a great start, but Charlie needs to be very successful in continuing to sell FlashArray, FlashBlade, and perhaps introducing new platforms to go after other markets.”

Although flash is growing rapidly in primary storage systems, about half of organizations do not yet deploy it in the data center, according to Enterprise Strategy Group, based in Milford, Mass. Among enterprises that use flash, it often remains a small subset of their overall capacity, ESG storage analyst Scott Sinclair said. That leaves considerable room for Pure Storage to boost revenue by selling more storage.

“One big question for the new CEO to answer is how to expand on net-new opportunities,” Sinclair said. “We are seeing the emergence of different buying personas within IT. The majority of buyers still want to buy storage infrastructure and build it into a data center. But we’re also seeing more and more customers that want to offload those responsibilities. Pure may want to decide if it wants to take advantage of that trend.”

Positive cash flow first, acquisitions could follow

Pure has started to expand the software features on FlashArray and FlashBlade models, most recently adding predictive storage analytics and multisite replication. It also has notched data protection partnerships with Cohesity, Rubrik and Zerto.

Having more integrated features could boost Pure Storage revenue by enabling it to better compete with established storage vendors, said Henry Baltazar, a research director for storage at 451 Research in San Francisco.

Giancarlo’s initial focus as Pure Storage CEO will be on achieving positive cash flow, Baltazar said. After that, he expects Pure will explore strategic acquisitions.

“There are going to be a lot of asset sales” because failed startups don’t get funding, he said. “There probably will be a lot of decent technology available that Pure could use to fill some gaps and leverage its portfolio.

“While Pure has made a great effort to expand in all-flash, it’s basically a two-product company. Right now, Pure covers an important part of the market, but it’s not the whole market.”

Confluent’s Kafka data-streaming framework gets ‘SQL-ized’

The day when armies of business analysts can query incoming data in real time may be drawing closer. Supporting such continuous interactive queries is a goal of KSQL, software put forward this week by the Kafka data-streaming software originators at Confluent Inc.

KSQL is a SQL engine that directly handles Apache Kafka data streams. As such, it can skip other big data components that bring broadly supported SQL capabilities to Hadoop and Spark, but may require intermediate data stores and batch-oriented processing.

The software is intended to bridge the gap between real-time monitoring and real-time analytics, according to Neha Narkhede, co-founder and CTO at Confluent, based in Palo Alto, Calif. At the Kafka Summit in San Francisco, she said KSQL can continuously join streaming data, such as web user clicks, with relevant table-based data, such as account information.

She also said KSQL is intended to broaden the use of Kafka beyond Java and Python, opening up Kafka programming to developers familiar with SQL. The form of SQL Confluent is using here, however, is a dialect the company has developed to deal with the unique architecture of Kafka streaming. The software is appearing first as a developer preview and will be available under an Apache 2.0 license, according to the company.

Kafka data on the move

Created at LinkedIn, Kafka began life as a publish-and-subscribe messaging system that focused on handling log files as system events. It became an Apache Software Foundation project, and it was expanded to support a fuller data-streaming architecture.

The open source version of Kafka is commonly used in Hadoop and Spark data pipelines today. That puts it at the center of much of the industry activity aimed at putting big data into motion.

“Overall, we are seeing Kafka growing across large enterprises, in startups and in job posts. Companies are looking for people with Kafka skill sets,” said Fintan Ryan, an analyst at RedMonk in Portland, Maine. “Underlying that is a drive in use of streaming data in general.”

Much of the current streaming data landscape is centered on Kafka. But a grab bag of alternatives exists.

Alternatives are found in long-standing software, such as RabbitMQ and Tibco StreamBase; in later entries, such as Amazon Kinesis and Apache Spark Streaming; and in newly emerging frameworks, such as Apache Flink, Microsoft Azure Event Grid and others. Just this month, startup Streamlio emerged from stealth mode, describing its efforts to promote enterprise streaming based on Heron — a stream processor emerging from distributed systems work at another social media mainstay, Twitter.

The goal of the newly released KSQL is to bring Kafka streaming programming directly to SQL-capable developers. For example, it’s meant to join click streams via continuous queries with table data.

Waiting for the SQL

For now, Confluent’s KSQL is programmed via a command-line interface, Ryan noted. That means opportunity, he said, for other software vendors to build drag-and-drop interfaces that tap into Kafka via KSQL. In fact, at the Kafka Summit, analytics software provider Arcadia Data said it was working with Confluent to support a visual interface for interactive queries on Kafka topics, or Kafka message containers, via KSQL.

Confluent’s KSQL scheme faces competition from a handful of players that have already been working to connect Kafka with SQL. Some of those players were on hand at the Kafka Summit with product updates.

At the conference, Striim Inc. rolled out its 3.7.4 platform release, adding more monitoring metrics and Kafka diagnostic utilities, as well as new connectors to Amazon Web Services Redshift and Simple Storage Service, Google Cloud, Azure SQL Server, Azure Storage and Azure HDInsight. Also at the summit, SQLstream launched Blaze 5.2, supporting Apache SystemML for declarative programming of machine learning applications.

Kafka SQL links and other streaming activity should not obscure the fact that big data streaming architecture is still a young discipline. That is underscored by recent word that Apache Kafka will formally reach version 1.0.0 in early October.

Software veterans recall there was a time when development managers would wait until release 2.0 before touching any software, and release 1.0 was a nonstarter. But it seems the speed of data streaming today is such that those types of caution are out the window, at least where organizations sense the potential for significant business advantage.

AI washing muddies the artificial intelligence products market

Analysts predict that by 2020, artificial intelligence technologies will be in almost every new software and service release. And if they’re not actually in them, technology vendors will probably use smoke and mirrors marketing tactics to make users believe they are.

Many tech vendors already shoehorn the AI label into the marketing of every new piece of software they develop, and it’s causing confusion in the market. To muddle things further, major software vendors accuse their competitors of egregious mislabeling, even when the products in question truly do include artificial intelligence technologies.

AI mischaracterization is one of the three major problems in the AI market, as highlighted by Gartner recently. More than 1,000 vendors with applications and platforms describe themselves as artificial intelligence products vendors, or say they employ AI in their products, according to the research firm. It’s a practice Gartner calls “AI washing,” similar to the cloudwashing and greenwashing that have become prevalent over the years as businesses exaggerate their association with cloud computing and environmentalism.

AI goes beyond machine learning

When a technology is labelled AI, the vendor must provide information that makes it clear how AI is used as a differentiator and what problems it solves that can’t be solved by other technologies, explained Jim Hare, a research VP at Gartner, who focuses on analytics and data science.


“You have to go in with the assumption that it isn’t AI, and the vendor has to prove otherwise,” Hare said. “It’s like the big data era — where all the vendors say they have big data — but on steroids.”

“What I’m seeing is that anything typically called machine learning is now being labelled AI, when in reality it is weak or narrow AI, and it solves a specific problem,” he said.

IT buyers must hold the vendor accountable for its claims by asking how it defines AI and requesting information about what’s under the hood, Hare said. Customers need to know what makes the product superior to what is already available, with support from customer case studies. Also, Hare urges IT buyers to demand a demonstration of artificial intelligence products using their own data to see them in action solving a business problem they have.

Beyond that, a vendor must share with customers the AI techniques it uses or plans to use in the product and their strategy for keeping up with the quickly changing AI market, Hare said.

The second problem Gartner highlights is that machine learning can address many of the problems businesses need to solve, but more complicated types of AI, such as deep learning, get so much hype that businesses overlook the simpler approaches.

“Many companies say to me, ‘I need an AI strategy’ and [after hearing their business problem] I say, ‘No you don’t,'” Hare said.

“Really, what you need to look for is a solution to a problem you have, and if machine learning does it, great,” Hare said. “If you need deep learning because the problem is too gnarly for classic ML, and you need neural networks — that’s what you look for.”

Don’t use AI when BI works fine

When to use AI versus BI tools was the focus of a spring TDWI Accelerate presentation led by Jana Eggers, CEO of Nara Logics, a Cambridge, Mass., company that describes its “synaptic intelligence” approach to AI as the combination of neuroscience and computer science.

BI tools use data to provide insights through reporting, visualization and data analysis, and people use that information to answer their questions. Artificial intelligence differs in that it’s capable of essentially coming up with solutions to problems on its own, using data and calculations.

Companies that want to answer a specific question or solve a known problem should use business analytics tools, Eggers said. If they don’t know what question to ask, they should use AI to explore data openly and be willing to consider the answers from many different directions. This may involve having outside and inside experts comb through the results, performing A/B testing, or even outsourcing via platforms such as Amazon’s Mechanical Turk.

With an AI project, you know your objectives and what you are trying to do, but you are open to finding new ways to get there, Eggers said.

AI isn’t easy

A third issue plaguing AI is that companies don’t have the skills on staff to evaluate, build and deploy it, according to Gartner. Over 50% of respondents to Gartner’s 2017 AI development strategies survey said the lack of necessary staff skills was the top challenge to AI adoption. That statistic appears to coincide with the data scientist supply and demand problem.

Companies surveyed said they are seeking artificial intelligence products that can improve decision-making and process automation, and most prefer to buy one of the many packaged AI tools rather than build their own. That brings IT buyers back to the first problem, AI washing: It’s difficult to know which artificial intelligence products truly deliver AI capabilities and which ones are mislabeled.

After determining a prepackaged AI tool provides enough differentiation to be worth the investment, IT buyers must be clear on what is required to manage it, Hare said: What human effort is needed to change code and maintain models over the long term? Is it hosted in a cloud service and managed by the vendor, or does the company need knowledgeable staff to keep it running?

“It’s one thing to get it deployed, but who steps in to tweak and train models over time?” he said. “[IBM] Watson, for example, requires a lot of work to stand up and you need to focus the model to solve a specific problem and feed it a lot of data to solve that problem.”

Companies must also understand the data and compute requirements to run the AI tool, he added; GPUs may be required and that could add significant costs to the project. And cutting-edge AI systems require lots and lots of data. Storing that data also adds to the project cost.

Learn the basics of PowerShell for Azure Functions

Azure Functions isn’t just for developers; several scripting languages open up new opportunities for admins and systems analysts as well.

Scripting options for Azure Functions

Azure Functions is a collection of event-driven application components that can interact with other Azure services. It’s useful for asynchronous tasks, such as data ingestion and processing; extract, transform and load (ETL) processes or other data pipelines; and microservices or cloud service integration.

In general, functions are well-suited as integration and scripting tools for legacy enterprise applications due to their event-driven, lightweight and infrastructure-free nature. The ability to use familiar languages, such as PowerShell, Python and Node.js, makes that case even stronger. Since PowerShell is popular with Windows IT shops and Azure users, the best practices below focus on that particular scripting language but apply to others as well.

PowerShell for Azure Functions

The initial implementation of PowerShell for Azure Functions uses PowerShell version 4 and only supports scripts (PS1 files), not modules (PSM1 files), which makes it best for simpler tasks and rapid development. To use PowerShell modules in Azure Functions, users can update the PSModulePath environment variable to point to a folder that contains custom modules and connect to it through FTP.
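
As a rough illustration of that workaround, the sketch below appends a module folder to PSModulePath before importing from it. The folder path and module name are assumptions for illustration only; the module folder would be uploaded over FTP ahead of time.

```powershell
# Minimal sketch of the PSModulePath workaround described above.
# The folder path and module name are hypothetical; upload the module
# folder over FTP before the function runs.
$customModules = 'D:\home\site\wwwroot\modules'   # assumed FTP upload target

# Append the folder to PSModulePath if it is not already listed.
if (($env:PSModulePath -split ';') -notcontains $customModules) {
    $env:PSModulePath = "$($env:PSModulePath);$customModules"
}

Import-Module MyCustomModule   # hypothetical module living in that folder
```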

When you use scripts, pass data to PowerShell functions through files or environment variables, because a function won’t store or cache the runtime environment. Incoming data, whether from an event trigger or an input binding, is passed as a file whose path is accessed in PowerShell through an environment variable. The same scheme works for data output. Since the input data is just a raw file, users must know what to expect and parse accordingly. Functions itself won’t format data but will support most formats, including the following (a minimal run.ps1 sketch appears after the list):

  • string;
  • int;
  • bool;
  • object/JavaScript Object Notation;
  • binary/buffer;
  • stream; and
  • HTTP
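
To make that file-based scheme concrete, here is a minimal run.ps1 sketch for an HTTP-triggered function. It assumes input and output bindings named req and res in function.json; those names, and the JSON payload shape, are illustrative assumptions rather than fixed requirements.

```powershell
# run.ps1 -- a minimal sketch of the file-based data scheme described above.
# Assumes an HTTP trigger whose input binding is named 'req' and whose output
# binding is named 'res' in function.json; both names are illustrative.

# The runtime passes incoming data as a temp file whose path is exposed
# through an environment variable named after the binding.
$requestBody = Get-Content $env:req -Raw | ConvertFrom-Json
$name = $requestBody.name

# Output works the same way: write to the path the 'res' binding supplies.
Out-File -Encoding Ascii -FilePath $env:res -InputObject "Hello, $name"
```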

PowerShell functions can be triggered by HTTP requests, an Azure service queue, such as when a message is added to a specified storage queue, or a timer (see Figure 1). Developers can create Azure Functions with the Azure portal, Visual Studio — C# functions only — or a local code editor and integrated development environment, although the portal is the easiest option.

Figure 1. Triggers for PowerShell functions

Recommendations

Azure Functions works the same whether the code is in C#, PowerShell or Python, which enables teams to use a language in which they have expertise or can easily gain it. The power of Functions stems from its integration with other Azure services and built-in runtime environments. For simple tasks, such as triggering a webhook from an HTTP request, writing a function is more efficient than creating a standalone app, as the sketch below illustrates.
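
Here is a hedged sketch of such a webhook-style task, assuming the same HTTP trigger layout as the earlier example; the endpoint URL is a placeholder assumption.

```powershell
# Hypothetical body for an HTTP-triggered function that relays the incoming
# payload to a downstream webhook. The endpoint URL is a placeholder.
$payload = Get-Content $env:req -Raw
Invoke-RestMethod -Uri 'https://example.com/hooks/notify' -Method Post `
    -Body $payload -ContentType 'application/json'

# Acknowledge the caller through the output binding.
Out-File -Encoding Ascii -FilePath $env:res -InputObject 'Forwarded.'
```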

While PowerShell is an attractive option for Windows teams, they need to proceed with caution, since support for it in Azure Functions is still a work in progress. The implementation details will likely change, though presumably for the better.

How threat actors weaponized Mia Ash for a social media attack

Who is Mia Ash?

That was the question security analysts at Dell SecureWorks found themselves pondering earlier this year while investigating a flurry of phishing attacks against targets in the Middle East. Analysts believed a sophisticated advanced persistent threat (APT) group was behind the attack, for two reasons. First, the emails contained PupyRAT, a cross-platform remote access Trojan that was first discovered in 2015 and had been used by an Iranian threat actor group Dell refers to as “Cobalt Gypsy” (also known as Threat Group 2889 or “OilRig”). And second, the email addresses used in the attacks weren’t spoofed.  

“Many of the phishing emails were coming from legitimate addresses at other companies, which led us to believe those companies had been compromised,” Allison Wikoff, intelligence analyst at Dell SecureWorks, told SearchSecurity.

The email addresses used by the attackers belonged to Saudi Arabian IT supplier National Technology Group and Egyptian IT services firm ITWorx. But as sophisticated as the phishing attacks were, the targeted companies — which included energy, telecommunications, and financial services firms, as well as government agencies in the EMEA region — were largely successful in repelling the attacks and preventing the spread of PupyRAT in their environments.

But after the unsuccessful phishing attacks, Dell SecureWorks’ Counter Threat Unit (CTU) observed something else that alarmed them. Instead of another wave of phishing emails, CTU tracked a complex social media attack that indicated a resourceful, patient and knowledgeable nation-state threat actor.

Who is Mia Ash?

On Jan. 13, after the phishing attacks had ended, an employee at one of the companies targeted by Cobalt Gypsy received a message via LinkedIn from Mia Ash, a London-based photographer in her mid-20s, who said she was reaching out to various people as part of a global exercise. The employee, who SecureWorks researchers refer to anonymously as “Victim B,” connected to the photographer’s LinkedIn profile. To Victim B or the casual observer, Ash’s profile seemed legitimate; it contained a detailed work history and had more than 500 connections to professionals in the photography field, as well as individuals in the same regions and industries as Victim B.


After about a week of exchanged messages about photography and travel, Ash requested that Victim B add her as a friend on Facebook so the two could continue their conversation on that platform. According to SecureWorks’ new report, Victim B instead moved the correspondence to WhatsApp, a messaging service owned by Facebook, as well as email. Then on Feb. 12, Ash sent an email to Victim B’s personal email account with a Microsoft Excel file that was purportedly a photography survey. Ash requested that Victim B open the file at work in his corporate environment so that the file could run properly.

Victim B honored the request and opened the Excel file on his company workstation; the file contained macros that downloaded the same PupyRAT that Cobalt Gypsy used in the barrage of phishing attacks several weeks earlier. “It was the same organization that was hit before, within a month, and that was a big red flag,” Wikoff said.

Luckily, Victim B’s company antimalware defenses blocked the PupyRAT download. But the incident alarmed the company; Dell SecureWorks was asked to investigate the matter, and the CTU team soon discovered that “Mia Ash” wasn’t a professional photographer — in fact, she likely didn’t exist at all — and that another person was targeted long before Victim B.

The now-deleted Facebook page of ‘Mia Ash’

Behind the online persona

When CTU researchers started digging into the Mia Ash online persona, they discovered more red flags. While Ash’s LinkedIn profile was populated with connections to legitimate professionals, half of the connections bore striking similarities: all male individuals, between their early 20s and 40s, who work in midlevel positions as software developers, engineers and IT administrators. In addition, these connections worked at various oil and gas, financial services and aerospace companies in countries such as Saudi Arabia, India and Israel — all of which had been targeted by the Iranian APT group Cobalt Gypsy.

“We saw a good cross section of LinkedIn connections — half of them were what looked like legitimate photographers and photography professionals, and the other half appeared to be potential targets,” Wikoff said.

This wasn’t the first time threat actors used fake social media accounts for malicious purposes, but this was one of the most complex efforts the researchers had ever seen. The CTU team discovered Mia Ash had been active long before January and that Victim B wasn’t actually the first target to fall prey to this complex social media attack. The CTU team discovered a Blogger website called “Mia’s Photography” that had been created in April 2016. They also found that two other domains apparently belonging to Ash were registered in June and September of last year using a combination of Ash’s information and that of a third party, whom CTU refers to as “Victim A.”

It’s unclear why the domains were registered — they don’t contain malware or any malicious operations — or why Victim A participated. Wikoff said there are a number of possibilities; it’s likely that either Victim A registered both domains as a friendly or romantic gesture to Ash, believing she was real, or that Victim A registered the first domain as a gift for Ash and then the attackers behind the persona registered the second on behalf of Victim A to reciprocate the gesture.

Whatever the case, it appears Victim A was used as a sort of “patient zero” from whom the attackers could establish other social media connections. Wikoff said SecureWorks made attempts to contact Victim A, who like other Mia Ash targets had worked in energy and aerospace companies in the Middle East/Asia region, but so far has not heard back from him. The ironic part is that Victim A is currently an information security manager for a large consulting company – and even he was apparently fooled by this online persona.

There was more to Mia Ash than just the LinkedIn profile and Blogger site; the persona’s Facebook account was populated with personal details (her relationship status, for example, was listed as “It’s complicated”), posts about photography and images of herself, as well as her own professional photos. However, the images were stolen from the social media accounts of a Romanian photographer (Dell SecureWorks did not disclose the woman’s identity in order to protect her privacy).

“At first pass, it looks like a legitimate Facebook profile,” Wikoff said. “The attackers spent a lot of time and effort building this persona, and they knew how to avoid detection.”

For example, Wikoff said, the threat actors rotated or flipped many of the images stolen from the Romanian woman so the pictures would not show up in a reverse image search. The attackers also kept the social media accounts active with fresh postings and content to make them appear authentic and to lure potential targets like Victim A to interact with them; in fact, Victim A interacted with Mia Ash’s Facebook page as recently as March.

Online personas as social media attacks

The CTU team determined with a high confidence level that Mia Ash was a fake online persona created by threat actors to befriend employees at targeted organizations and lure those individuals into executing malware in their corporate environments. The CTU team also believes with “moderate confidence” (according to the scale used by the U.S. Office of the Director of National Intelligence) that Mia Ash was created and managed by the Cobalt Gypsy APT group.

The Mia Ash LinkedIn account disappeared before the CTU team could contact LinkedIn; the team alerted Facebook, which removed the Mia Ash profile. The CTU team wasn’t able to determine what Cobalt Gypsy’s ultimate goal was with this social media attack; they only know the threat actors were attempting to harvest midlevel network credentials with the PupyRAT malware.

While the motive for the Mia Ash campaign is still a mystery, Wikoff said it was clear the APT group had done its homework both on the organizations it was targeting and on what was required to build and maintain a convincing online persona. In addition, the threat actors specifically targeted employees they knew had the desired network credentials and would likely respond to and engage the Mia Ash persona.

This isn’t the first time Cobalt Gypsy has used social media attacks; in 2015, SecureWorks reported the APT group used 25 fake LinkedIn accounts in a social engineering scheme. In that case, the attackers created profiles of employment recruiters for major companies like Teledyne and Northrop Grumman and used them as malicious honeypots, or “honey traps.” Once victims made contact with the fake profiles, attackers would lure them into filling out fraudulent employment applications.

The Mia Ash campaign demonstrates the evolution of such social media attacks. Instead of just composing a single LinkedIn profile, the attackers expanded their online footprint with other social media accounts. And the larger the online presence, Wikoff said, the more convincing the persona becomes.

“Cobalt Gypsy’s continued social media use reinforces the importance of recurring social engineering training,” the SecureWorks report states. “Organizations must provide employees with clear social media guidance and instructions for reporting potential phishing messages received through corporate email, personal email, and social media platforms.”

But Wikoff said awareness training isn’t enough to stop advanced social engineering attacks like the Mia Ash campaign. “You can train people with security awareness, but someone is always going to click,” she said. “And the attackers know this.”

In the case of Victim B, the campaign would have been successful if not for antimalware defenses that prevented PupyRAT (which, it should be noted, had a known malware signature) from downloading. But other organizations might not be as lucky, especially if these attacks use new malware types with no known signatures.

In addition, social media services offer an enormous opportunity for threat actors. Wikoff said attackers can easily set up accounts for LinkedIn, Facebook, Twitter and other services, free of charge, and use them for malicious purposes without running afoul of the sites’ terms of service. While the Mia Ash profiles for LinkedIn and Facebook were removed after the fact, Wikoff said it’s difficult for social media services to spot APT activity like the Mia Ash campaign before a user is victimized.

SecureWorks believes that Cobalt Gypsy has more online personas actively engaged in malicious activity, but finding them before they compromise their potential targets will be a challenge.

“It shows how much bigger the threat landscape has gotten,” Wikoff said. “It’s a case study on persistent threat actors and the effort they will go to in order to achieve their goals.”