Tag Archives: Automation

AIOps exec bets on incident response market shakeup

AIOps and IT automation have been hot topics in IT for about three years, but the ultimate vision of hands-off incident response has yet to be realized in most IT shops, says Vijay Kurkal, who was appointed CEO of Resolve Systems on Jan. 16. Kurkal had served as chief operating officer for the company since 2018.

Kurkal’s key priority in the first quarter of 2020 is the release of Resolve Insights, a platform that folds AIOps IP from the company’s August 2019 acquisition of FixStream into its IT automation software. While enterprise IT pros have been slow to trust such systems — which rely on AI and machine learning data analytics to automate common tasks such as server restarts — they have begun to find their way into production use at mainstream companies.

Vijay Kurkal

Resolve faces a crowded field of competition that includes vendors with backgrounds in IT monitoring, incident response and data analytics. SearchITOperations.com had a conversation with Kurkal this week about how the company plans to hold its own in this volatile market.

Your product pitch sounds familiar to me. I’m sure I don’t have to tell you there are many vendors out there pursuing a vision of proactive IT automation assisted by AI. How will Resolve and FixStream be different?

Vijay Kurkal: There are two ecosystems we’re playing with. There are application monitoring tools like AppDynamics, Dynatrace, New Relic, etc. The users that they are going after are the application operations team. FixStream is complementary to them. But they have limited visibility into hypervisors and deep into the network infrastructure. FixStream builds out a visual topology of every single infrastructure device that a particular application is touching, and all the events are overlaid on that. It’s [built] for the IT operation teams that are supporting critical applications.

Some of the other AIOps vendors using AI technologies, they have tons of algorithms, but any algorithm is only as good as the source data. It’s garbage in, garbage out. Our starting point is always around relationship dependency mapping and getting data in context, and prioritizing what to act on. A second differentiator is that AI/ML algorithms are all based on a probabilistic model. [They] say what they believe are the potential root causes [of an issue], but they can’t say that with certainty. Where we’re taking it is, as soon as those events trigger an alert from FixStream, Resolve automates diagnostics. Typically, that requires a network engineer. We’re already trying this out with some pilot customers and by end of Q1 are going to have a product there. Most AIOps companies identify events; they don’t resolve them.

Is there a plan for IT automation beyond diagnostics?

Kurkal: The next step, and I don’t think most customers are there yet, is, ‘I’ve done this 10 times, and I feel very comfortable, just run this [process] automatically.’ You’ll have categories of events — there’ll be 30% that are not super critical. As the organization gets comfortable, these can be completely [automated]. Then there are 50% that are critical, and we can give them diagnostics, and point them in the right direction to solve them pretty quickly. Then 10% will be outliers where no automation can help, and that’s where IT ops experts will always be very, very relevant to run the business.
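
The tiered policy Kurkal describes (fully automate routine events the organization has come to trust, attach automated diagnostics to critical ones, and escalate outliers to IT ops experts) can be sketched as a simple triage function. This is an illustrative toy, not Resolve's product; the `Alert` fields, severity labels and trust threshold are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Alert:
    name: str
    severity: str          # "routine", "critical" or "unknown" (illustrative)
    prior_successes: int   # times the automated fix has worked before

def triage(alert: Alert, trust_threshold: int = 10) -> str:
    """Decide how much automation to apply to an incoming alert."""
    if alert.severity == "routine" and alert.prior_successes >= trust_threshold:
        return "auto-remediate"        # trusted, non-critical: just run the fix
    if alert.severity in ("routine", "critical"):
        return "run-diagnostics"       # gather context, hand to an engineer
    return "escalate-to-human"         # outlier: no automation can help

print(triage(Alert("disk-full", "routine", prior_successes=12)))   # auto-remediate
print(triage(Alert("db-latency", "critical", prior_successes=0)))  # run-diagnostics
```

The `trust_threshold` encodes the "I've done this 10 times, and I feel very comfortable" criterion: a remediation is only run hands-off once it has a track record.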

Another important aspect of observability is informing the development of the product at the other end of the DevOps pipeline. How does your product work within that process?

Kurkal: The people who build the applications know exactly what stresses their application is putting on various elements [of the infrastructure]. We want to equip the DevOps team with a drag-and-drop system to write automation — to tell the IT operations team, here’s the configuration of the infrastructure I’ll need, and here’s a set of diagnostic scripts, and remediation automation that’s pre-approved. And then it’ll be a closed feedback loop where the operations teams can give feedback [to the developer]. We’re not saying we’ll solve every need of the application, but we are trying to bring together these two teams to drive automation and intelligence.

There are some tools that specifically tie outages or incidents to changes in code — could Resolve make another acquisition in that space or further build out its products to address that too?

Kurkal: For us, it’s a strong possibility in late 2020 or in 2021. It might be an organic development of our products, or potentially, an inorganic acquisition around that. But we do see that’s where the market is moving, because no one wants to be reactive, and they want to have it all together.

Go to Original Article
Author:

SaltStack infrastructure-as-code tools seek cloud-native niche

As IT automation evolves to suit the cloud-native era, earlier approaches such as configuration management and infrastructure-as-code tools must also change with the times.

SaltStack infrastructure-as-code tools, along with products from competitors such as Puppet, Chef and Ansible, must accommodate fresh IT trends, from AI to immutable infrastructure. They must also do so quickly to keep up with the rapid pace of innovation in the IT industry, something SaltStack has failed to do in recent years, company officials acknowledge.

“[Our new approach] will be an accelerant that allows us to create [new products] much more quickly than we have in the past, and in a much more maintainable way,” said Salt open source creator Thomas Hatch, who is also CTO of SaltStack, the project’s commercial backer.

That new approach is an overhauled software development process based on the principles of plugin-oriented programming (POP), first introduced to core SaltStack products in 2018. This week, the company also renewed its claim in cloud-native territory with three new open source modules developed using POP that will help it keep pace with rivals and emerging technologies, Hatch said.

The modules are Heist, which creates “dissolvable” infrastructure-as-code execution agents to better serve ephemeral apps; Umbra, which automatically links IT data streams to AI and machine learning services; and Idem, a redesigned data description language based in YAML that simplifies the enforcement of application state.

Salt open source contributors say POP has already sped up the project’s development, where previously they faced long delays between code contributions and production-ready inclusion in the main Salt codebase.

“I’m the largest contributor of Azure-specific code to the Salt open source project, and I committed the bulk of that code at the beginning of 2017,” said Nicholas Hughes, founder and CEO of IT automation consulting firm EITR Technologies in Sykesville, Md., which is also a licensed reseller of SaltStack’s commercial product.  “It was accepted into the developer branch at that point. It just showed up in the stable branch at the beginning of 2019, nearly two years later.”

The new modules, especially Idem, can also be used to modernize Salt, particularly its integrations with cloud service providers, Hughes said.

SaltStack rewrote its infrastructure-as-code tools with plugin-oriented programming, instead of a traditional object-oriented method.

SaltStack revs update engine with POP and Idem

SaltStack’s Hatch introduced the POP method three years ago. This approach is a faster, more flexible alternative to the more traditional object-oriented programming method developers previously used to maintain the project, Hatch said.

“Object-oriented programming [puts] functions and data right next to each other, and the result is … a lot of isolated silos of code and data,” he said. “Then you end up building custom scaffolding to get them all to communicate with each other, which means it can become really difficult to extend that code.”

Plugin-oriented programming, by contrast, is based on small modules that can be developed separately and merged quickly into a larger codebase.
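
As a rough illustration of the idea (a toy sketch, not Salt's actual `pop` library), plugins register small functions onto a shared "hub" namespace, so new capability merges into the larger codebase without modifying existing code:

```python
class Hub:
    """Flat namespace that independently developed plugins attach to."""
    def __init__(self):
        self._funcs = {}

    def register(self, name):
        def decorator(func):
            self._funcs[name] = func   # merge the plugin into the hub
            return func
        return decorator

    def call(self, name, *args, **kwargs):
        return self._funcs[name](*args, **kwargs)

hub = Hub()

# Each "plugin" below could live in its own module and be discovered at
# load time; the hub never needs to know about it in advance.
@hub.register("net.ping")
def ping(host):
    return f"pinging {host}"

@hub.register("disk.usage")
def usage(path):
    return f"checking {path}"

print(hub.call("net.ping", "10.0.0.1"))   # pinging 10.0.0.1
```

The mechanics of Salt's real plugin discovery differ, but the payoff sketched here is the same: each plugin can be written, tested and merged in isolation, avoiding the "custom scaffolding" Hatch describes.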

The new modules rolled out this week serve as a demonstration of how much more quickly the development of Salt and SaltStack infrastructure-as-code tools can move using POP, Hatch said. While an earlier project, Salt SSH, took one engineer two months to create a minimum viable product, and another six months to polish, Heist took one engineer a week and a half to stand up and another two weeks to refine, he said.

Similar open source projects that maintain infrastructure-as-code tools, such as HashiCorp’s Terraform, had long since broken up their codebases into more modular pieces to speed development, Hughes said. He also contributes Azure integration code to Terraform’s open source community.

Now, Hughes said he has high hopes for Idem as a vehicle to further modernize cloud provider integrations in open source Salt, and he has already ported all the Azure code he wrote for Salt into Idem using POP.

“It will allow us to move and iterate and build out those codebases much more easily, and version and handle them separately,” he said. He’d also like to see Salt’s open source AWS integrations updated to work with Idem, as well as Salt functions such as the event bus, which ties in with third-party APIs to orchestrate CI/CD and IT monitoring systems alongside infrastructure.

As the cloud working group captain for the Salt open source project, Hughes said he’s put out a call for the community to port more cloud components into Idem, but that’s still a work in progress.

Infrastructure-as-code tools ‘reaching the end of their run?’

In the meantime, the breakneck pace of cloud-native technology development waits for no one, and most of SaltStack’s traditional competitors in infrastructure-as-code tools, such as Puppet, Chef and Ansible, have a head start in the race to reinvent themselves.

Puppet has sought a foothold in CI/CD tools with its Distelli acquisition and moved into agentless IT automation, similar to Ansible’s, with Puppet Bolt. Chef overhauled its Ruby codebase using Rust to create the Chef Habitat project in 2015, years ahead of SaltStack’s POP, and expanded into IT security and compliance with Chef InSpec, which rolled out in version 1.0 in 2016.

SaltStack plans to refocus its business primarily on cloud security automation, which Hatch said accounted for 40% of the company’s new sales in 2019. It began that expansion in late 2018, and it has some potential advantages over Chef InSpec, since it can automate security vulnerability remediation without relying on third-party tools. The company also beat Red Hat Ansible to the security automation punch, a push Ansible began in earnest late last year.

Still, Ansible also has the cachet of its IBM/Red Hat backing and well-known network automation prowess.

HashiCorp’s Terraform has a long lead over Salt’s Idem-based POP modules in cloud provisioning integrations, and the company has hot projects to sustain it in other areas of IT, such as Vault secrets management in cloud security.

“SaltStack seems to be the slowest to redefine themselves, and they’re the smallest [among their competitors], in my view,” said Jim Mercer, analyst at IDC. “The Umbra plugin that could pull them through into the hot area of AI and machine learning certainly isn’t going to hurt them, but there’s only so much growth left here.” A SaltStack spokesperson expressed disagreement with Mercer’s characterization of the company.

As container orchestration tools such as Kubernetes have risen in popularity, they’ve encroached on the traditional configuration management turf of vendors such as SaltStack, Puppet and Chef, though infrastructure-as-code tools such as Terraform remain widely used to automate cloud infrastructure under Kubernetes and to tie in to GitOps workflows.

Still, the market for infrastructure-as-code tools has also begun to erode, in Mercer’s view, with the growth of function-as-a-service products such as AWS Lambda and serverless container approaches such as AWS Fargate that eliminate infrastructure management below the application container level. Even among shops that still manage infrastructure under Kubernetes, fresh approaches to IT automation have begun to horn in on infrastructure as code’s turf, such as Kubernetes Helm, Kubernetes Operators and KUDO Operators created by D2iQ, formerly Mesosphere.

“These tools had their heyday, but they’re reaching the end of their run,” Mercer said. “They’re still widely used for existing apps, but as new cloud-native apps emerge, they’ll start to go the way of the VCR.”

RPA in manufacturing increases efficiency, reduces costs

Organizations are increasingly adopting robotic process automation software to improve processes and make operations more efficient.

In manufacturing, the use cases for RPA range from reducing errors in payroll processes to eliminating unneeded processes before undergoing a major ERP system upgrade.

In this Q&A, Shibaji Das, global director of finance and accounting and supply chain management for UiPath, discusses the role of RPA in manufacturing ERP systems and how it can help improve efficiency in organizations.

UiPath, based in New York, got its start in 2005 as DeskOver, which made automation scripts. In 2012, the company relaunched as UiPath and shifted its focus to RPA. UiPath markets applications that enable organizations to examine processes and create bots, or software robots that automate repetitive, rules-based manufacturing and business processes. RPA bots are usually infused with AI or machine learning so that they can take on more complex tasks and learn as they encounter more processes.

What is RPA and how does it relate to ERP systems?

Shibaji Das: When you’re designing a city, you put down the freeways. Implementing RPA is a little like putting down those freeways with major traffic flowing through, with RPA as the last mile automation. Let’s say you’re operating on an ERP, but when you extract information from the ERP, you still do it manually to export to Excel or via email. A platform like [UiPath] forms a glue between ERP systems and automates repetitive rules-based stuff. On top of that, we have AI, which gives brains to the robot and helps it understand documents, processes and process-mining elements.
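
As a concrete illustration of that "last mile," consider the manual Excel export Das mentions: a small bot can pull the records over an API and write the spreadsheet itself. This is a hypothetical Python sketch, not UiPath's product; `fetch_open_invoices` stands in for a real ERP client.

```python
import csv
import io

def fetch_open_invoices():
    # Placeholder for an ERP API call a real bot would make.
    return [
        {"invoice": "INV-1001", "vendor": "Acme", "amount": 1250.00},
        {"invoice": "INV-1002", "vendor": "Globex", "amount": 430.50},
    ]

def export_to_csv(rows, out):
    """Write ERP records as a spreadsheet, replacing the manual export."""
    writer = csv.DictWriter(out, fieldnames=["invoice", "vendor", "amount"])
    writer.writeheader()
    writer.writerows(rows)

buf = io.StringIO()
export_to_csv(fetch_open_invoices(), buf)
print(buf.getvalue().splitlines()[0])  # invoice,vendor,amount
```

The "glue" value is that the rules-based step (extract, reshape, deliver) runs on a schedule rather than through a person's clicks; the AI layer Das describes would sit on top, interpreting documents before they reach a step like this.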

Why is RPA important for the manufacturing industry?

Shibaji Das, UiPath

Das: When you look at the manufacturing industry, the challenges are always the cost pressure of having lower margins or the resources to get innovation funds to focus on the next-generation of products. Core manufacturing is already largely automated with physical robots; for example, the automotive industry where robots are on the assembly lines. The question is how can RPA enable the supporting functions of manufacturing to work more efficiently? For example, how can RPA enable upstream processes like demand planning, sourcing and procurement? Then for downstream processes when the product is ready, how do you look at the distribution channel, warehouse management and the logistics? Those are the two big areas where RPA plus AI play an important role.

What are some steps companies need to take when implementing RPA for manufacturing?

Das: Initially, there will be a process mining element or process understanding element, because you don’t want to automate bad processes. That’s why having a thought process around making the processes efficient first is critical for any bigger digital transformation. Once that happens, and you have more efficient processes running, which will integrate with multiple ERP systems or other legacy systems, you could go to task automation. 

What are some of the ways that implementing RPA will affect jobs in manufacturing? Will it lead to job losses if more processes are automated?

Das: Will there be a change in jobs as we know them? Yes, but at the same time, there’s a very positive element that will create a net positive impact from a jobs perspective, experience perspective, cost, and the overall quality of life perspective. For example, the moment computers came in, someone’s job was to push hundreds of piles of paper, but now, because of computing, they don’t have to do that. Does that mean there was a negative effect? Probably not, in the long run. So, it’s important to understand that RPA — and RPA that’s done in collaboration with AI — will have a positive impact on the job market in the next five to 10 years.

Can RPA help improve operations by eliminating mistakes that are common in manual processes?

Das: Robots do not make mistakes unless you code it wrong at the beginning, and that’s why governance is so important. Robots are trained to do certain things and will do them correctly every time — 100%, 24/7 — without needing coffee breaks.

What are some of the biggest benefits of RPA in manufacturing?

Das: From an ROI perspective, one benefit of RPA is the cost element because it increases productivity. Second is revenue; for example, at UiPath, we are using our own robots to manage our cash application process, which has impacted revenue collection [positively]. Third is around speed, because what an individual can do, a robot can do much faster. However, this depends on the system, as a robot will only operate as fast as the mother system of the ERP system works — with accuracy, of course. Last, but not least, the most important part is experience. RPA plus AI will enhance the experience of your employees, of your customers and vendors. This is because the way you do business becomes easier, more user-friendly and much more nimble as you get rid of the most common frustrations that keep coming up, like a vendor not getting paid.

What’s the future of RPA and AI in organizations?

Das: The vision of Daniel Dines [UiPath’s co-founder and CEO] is to have one robot for every individual. It’s similar to every individual having access to Excel or Word. We know the benefits of the usage of Excel or Word, but RPA access is still a little technical and there’s a bit of coding involved. But UiPath is focused on making this as code free as possible. If you can draw a flowchart and define a process clearly through click level, our process mining tool can observe it and create an automation for you without any code. For example, I have four credit cards, and every month, I review it and click the statement for whatever the amount is and pay it. I have a bot now that goes in at the 15th of the month and logs into the accounts and clicks through the process. This is just a simple example of how practical RPA could become.

AI trends 2020 outlook: More automation, explainability

The AI trends 2020 landscape will be dominated by increasing automation, more explainable AI and natural language capabilities, better AI chips for AI on the edge, and more pairing of human workers with bots and other AI tools.

AI trends 2020 — increased automation

In 2020, more organizations across many vertical industries will start automating their back-end processes with robotic process automation (RPA), or, if they are already using automation, increase the number of processes to automate.

RPA is “one of the areas where we are seeing the greatest amount of growth,” said Mark Broome, chief data officer at Project Management Institute (PMI), a global nonprofit professional membership association for the project management profession.

Citing a PMI report from summer 2019 that compiled survey data from 551 project managers, Broome said some 21% of surveyed organizations have been affected by RPA. About 62% of those organizations expect RPA to have a moderate or high impact over the next few years.

RPA is an older technology — organizations have used RPA for decades. It’s starting to take off now, Broome said, partially because many enterprises are becoming aware of the technology.

“It takes a long time for technologies to take hold, and it takes a while for people to even get trained on the technology,” he said.

Moreover, RPA is becoming more sophisticated, Broome said. Intelligent RPA, or intelligent process automation (IPA) — RPA infused with machine learning — is becoming popular, with major vendors such as Automation Anywhere and UiPath often touting their intelligent RPA products. With APIs and built-in capabilities, IPA enables users to more quickly and easily scale up their automation use cases or carry out more sophisticated tasks, such as automatically detecting objects on a screen, using technologies like optical character recognition (OCR) and natural language processing (NLP).

Sheldon Fernandez, CEO of DarwinAI, an AI vendor focused on explainable AI, agreed that RPA platforms are becoming more sophisticated. More enterprises will start using RPA and IPA over the next few years, he said, but it will happen slowly.

In 2020, enterprises will use more RPA and chatbots, and will get more explainable AI.

AI trends 2020 — push toward explainable AI

Even as AI and RPA become more sophisticated, there will be a bigger move toward more explainable AI.

“You will see quite a bit of attention and technical work being done in the area of explainability across a number of verticals,” Fernandez said.

Users can expect two phases of effort behind explainable AI: first, vendors will make AI models more explainable for data scientists and technical users; eventually, they will make models explainable to business users.

Technology vendors will also likely move to address problems of data bias and to maintain more ethical AI practices.

“As we head into 2020, we’re seeing a debate emerge around the ethics and morality of AI that will grow into a highly contested topic in the coming year, as organizations seek new ways to remove bias in AI and establish ethical protocols in AI-driven decision-making,” predicted Phani Nagarjuna, chief analytics officer at Sutherland, a process transformation vendor.

AI trends 2020 — natural language

Furthermore, BI, analytics and AI platforms will likely get more natural language querying capabilities in 2020.

NLP technology also will continue to evolve, predicted Sid Reddy, chief scientist and senior vice president at virtual assistant vendor Conversica.

“Human language is complex, with hundreds of thousands of words, as well as constantly changing syntax, semantics and pragmatics and significant ambiguity that make understanding a challenge,” Reddy said.

“As part of the evolution of AI, NLP and deep learning will become very effective partners in processing and understanding language, as well as more clearly understanding its nuance and intent,” he continued.

Among the tech giants involved in AI, AWS in November 2019 revealed Amazon Kendra, an AI-driven search tool that will enable enterprise users to automatically index and search their business data. In 2020, enterprises can expect similar tools to be built into applications or sold as stand-alone products.

More enterprises will deploy chatbots and conversational agents in 2020 as well, as the technology becomes cheaper, easier to deploy and more advanced. Organizations won’t fully replace contact center employees with bots, however. Instead, they will pair human employees more effectively with bot workers, using bots to answer easy questions, while routing more difficult ones to their human counterparts.

“There will be an increased emphasis in 2020 on human-machine collaboration,” Fernandez said.
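
A minimal sketch of that bot-plus-human pairing: the bot answers when its confidence in a matched intent clears a threshold and hands off otherwise. The keyword-overlap scorer, canned answers and threshold below are illustrative stand-ins for a real NLP model.

```python
FAQ = {
    "reset password": "Use the self-service portal to reset it.",
    "store hours": "We are open 9 to 5, Monday through Friday.",
}

def _words(text):
    return set(text.lower().replace("?", "").replace(".", "").split())

def match_confidence(question, intent):
    """Crude keyword-overlap score standing in for a real intent model."""
    q, i = _words(question), _words(intent)
    return len(q & i) / len(i)

def route(question, threshold=0.5):
    intent, score = max(
        ((i, match_confidence(question, i)) for i in FAQ),
        key=lambda pair: pair[1],
    )
    if score >= threshold:
        return "bot", FAQ[intent]      # easy question: the bot answers
    return "human", None               # hard question: hand off to an agent

print(route("How do I reset my password?")[0])   # bot
print(route("Why was my order damaged?")[0])     # human
```

The design choice is call deflection, not replacement: tuning the threshold trades off how many questions the bot absorbs against how often it answers badly and a human must recover.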

AI trends 2020 — better AI chips and AI at the edge

To power all the enhanced machine learning and deep learning applications, better hardware is required. In 2020, enterprises can expect hardware that’s specific to AI workloads, according to Fernandez.

In the last few years, a number of vendors, including Intel and Google, released AI-specific chips and tensor processing units (TPUs). That will continue in 2020, as startups begin to enter the hardware space. Founded in 2016, the startup Cerebras, for example, unveiled a giant AI chip that made national news. The chip, the largest ever made, Cerebras claimed, is the size of a dinner plate and designed to power massive AI workloads. The vendor shipped some last year, with more expected to ship this year.

While Cerebras may have created the largest chip in the world, 2020 will likely introduce smaller pieces of hardware as well, as more companies move to do AI at the edge.

Max Versace, CEO and co-founder of neural network vendor Neurala, which specializes in AI technology for manufacturers, predicted that in 2020, many manufacturers will move toward the edge, and away from the cloud.

“With AI and data becoming centralized, manufacturers are forced to pay massive fees to top cloud providers to access data that is keeping systems up and running,” he said. “As a result, new routes to training AI that can be deployed and refined at the edge will become more prevalent.”

Microsoft Power Platform adds chatbots; Flow now Power Automate

More bots and automation tools went live on the Microsoft Power Platform, Microsoft announced today. In formally introducing them, Microsoft said the tools will make data sources flow within applications like SharePoint, OneDrive and Dynamics 365, and create more efficiencies with custom apps.

The more than 400 capabilities added to the Microsoft Power Platform focus on expanding its robotic process automation potential for users, as well as new integrations between the platform and Microsoft Teams, according to a blog post by James Phillips, corporate vice president of business applications at Microsoft.

Some of those include robotic process automation (RPA) tools for Microsoft Power Automate, formerly known as Flow, which makes AI tools easier to add into PowerApps. Also newly available are tools for creating user interfaces in Power Automate.

AI Builder adds a point-and-click means to fold common processes such as forms processing, object detection and text classification into apps — processes commonly used for SharePoint and OneDrive content curation.

Microsoft is adding these tools, as well as new security features to analytics platform Power BI, in part to coax customers who remain on premises into the Azure cloud, said G2 analyst Michael Fauscette.

PowerApps reduce the development needed to create necessary connections between systems in the cloud, such as content in OneDrive and SharePoint with work being done in Dynamics 365 CRM, Teams and ERP applications.

Microsoft Power Automate, a low-code app-design tool, is the new version of Flow.

Chatbots go live

Also announced as generally available at Microsoft Ignite are Power Virtual Agents, do-it-yourself chatbots on the Microsoft Power Platform.

They’ll likely first be used by customer service teams on Dynamics 365, said Constellation Research analyst R “Ray” Wang, but they could spread to other business areas such as human resources, which could use the bots to answer common questions during employee recruiting or onboarding.

While some companies may choose outside consultants and developers to build custom chatbots, Wang said others may try to build them internally on the Microsoft Power Platform. Large call centers employing many human agents and running on Microsoft applications would be logical candidates for piloting new bots.

“I think they’ll start coming here to build their virtual agents,” Wang said. “[Bot] training will be an issue, but it’s a matter of scale. If an agent is costing you $15 an hour and the chatbot 15 cents an hour … it’s all about call deflection.”

Microsoft Power Platform evolves

PowerApps, which launched in late 2015, originally found utility with users of Microsoft Dynamics CRM who needed to automate and standardize processes across data sets inside the Microsoft environment and connect to outside platforms such as Salesforce, said Gartner analyst Ed Anderson.

Use quickly spread to SharePoint, OneDrive and Dynamics ERP users, as they found that Flow — a low-code app-design tool — enabled the creation of connectors and apps without developer overhead. Third-party consultants and developers also used PowerApps to speed up deliverables to clients. Power BI, Power Automate and PowerApps together became known as the Microsoft Power Platform a year ago.

“PowerApps are really interesting for OneDrive and SharePoint because it lets you quickly identify data sources and quickly do something meaningful with them — connect them together, add some logic around them or customized interfaces,” Anderson said.

Gen Z in the workforce both want and fear AI and automation

For Gen Z in the workforce, AI and automation are useful and time-saving tools, but also possible threats to job security.

Typically characterized as the demographic born between the mid-1990s and mid-2000s, Generation Z is the first generation to truly grow up exclusively with modern technologies such as smartphones, social media and digital assistants.

Many Gen Z-ers first experienced Apple’s Siri, released in 2011, and then Amazon’s Alexa, introduced in 2014 alongside Amazon Echo, at a young age.

The demographic as a whole tends to have a strong understanding of the usefulness of AI and automation, said Terry Simpson, technical evangelist at Nintex, a process management and automation vendor.

Gen Z in the workforce

Most Gen Z employees have confidence in AI and automation, Nintex found in a September 2019 report about a survey of 500 current and 500 future Gen Z employees. Some 88% of the survey takers said AI and automation can make their jobs easier.

This generation understands AI technology, Simpson said, and its members want more of it in the workplace.

“For most organizations, almost 68 percent of processes are not automated,” Simpson said. Automation typically replaces menial, repetitive tasks, so lack of automation leaves those tasks to be handled by employees.

Gen Z wants more automation in the workplace, even as they fear it could affect job security.

For Gen Z in the workforce, a lack of automation can be frustrating, Simpson said, especially when Gen Z-ers are so used to the ease of digital assistants and automated programs in their personal lives. Businesses generally haven’t caught up to the AI products Gen Z-ers are using at home, he said.

Yet, even as Gen Z-ers have faith that AI and automation will help them in the workplace, they fear it, too.

Job fears

According to the Nintex report, 57% of those surveyed expressed concern that AI and automation could affect their job security.

“A lot of times you may be a Gen Z employee whose job function automation could replace, and that becomes a risk,” Simpson said.

Still, he added, automation can help an organization as a whole, and can ease the employees’ workloads.

“Everybody says I don’t want to lose my job to a robot, and then Outlook tells you to go to a meeting and you go,” said Anthony Scriffignano, chief data scientist at Dun & Bradstreet.

Jobs that can be easily automated may eventually be given to an automated system, but AI will also create jobs, Scriffignano said.

As the youngest generation in the workforce, however, Gen Z-ers may have less to fear than their older colleagues.

Younger generations are more coachable and more open to change than older generations, Scriffignano said. They will adapt better to new technologies while also helping their employers adapt.

“Gen Z have time in their career to reinvent themselves and refocus” their skills and career goals to better adapt for AI and automation, Scriffignano said.

Box AI, workflow automation strategies about to unfold

Box AI and workflow automation advancements that users are waiting for, and which are instrumental to the content services platform vendor’s future, will come into clearer focus this month, according to CEO Aaron Levie.

With Box AI tools at the hub of Box Skills, the company’s still-in-beta system for customizing Box applications with machine learning technology from Google, Microsoft or IBM, AI will permeate Box’s content management systems, Levie said.

“We want to make sure we continue to automate and bring intelligence to your digital business processes,” Levie said in an interview.

New Box AI tools

Levie said the company will make announcements around Box AI and workflow automation, and generally, about how Box plans to “advance the state of the digital workplace,” at the BoxWorks 2018 conference in San Francisco Aug. 29 to 30.

“We’re going to talk a lot about AI and the power of machine learning,” Levie said. “And you’re going to see more of a roadmap around workflow in Box as well, which we’re really excited about.”

Indeed, workflow and digital process automation have been a perennial question for Box in recent years, said Cheryl McKinnon, a Forrester analyst scheduled to speak at BoxWorks.

Workflow automation progress

McKinnon noted that Box, which started out as an enterprise file sync-and-share company, has tried to close that gap through a partnership with IBM on the Box Relay workflow automation tool and other deals with companies such as Nintex and Pegasystems. Box also recently acquired startup Progressly to improve workflow automation.

“I do expect to see deeper investment in Box’s own automation capabilities as it puts some of the expertise from recent acquisitions, such as Progressly, to work,” McKinnon said.

“Content doesn’t get created in a vacuum — embedding the content creation, collaboration and sharing lifecycle into key business processes is important to keep Box a sticky and integral part of its clients’ internal and external work activities,” she said.

In addition to Box AI and workflow automation, Levie said Box is putting a lot of emphasis on its native-cloud architecture and persuading potential customers to move from on-premises content management systems to the cloud-based content services platform model that has distinguished Box.

Box CEO Aaron Levie
Box CEO Aaron Levie speaking at the BoxWorks 2017 conference.

“We’re really trying to help them move their legacy information systems, their technology infrastructure, to the cloud,” Levie said.

Box wants “to show a better path forward for managing, securing, governing and working content and not just using the same legacy systems, not having a fragmented content management architecture that we think is not going to enable a modern digital workplace,” Levie said.

Box vs. Dropbox and bigger foes

Meanwhile, its similarly named competitor, Dropbox, completed a successful IPO this year and is angling for the enterprise market, where Box holds the lead. Dropbox’s stock price took a hit recently, but Levie said he takes the competition seriously. Box, too, sustained a decline in its stock price earlier this year, though the stock’s value has stabilized.

“I would not dismiss them as a player in this space,” Levie said of Dropbox. “But we think we serve more or less different segments of the market. They are more consumer and SMB leaning and we are much more SMB and enterprise leaning.”

Actually, Box’s most dangerous competitive threats are from cloud giants like Microsoft and Google, McKinnon said.

They are “investing significantly in their own content and collaboration platforms, and while Box partners with both of them for integration with office productivity tools and as optional cloud storage back ends, the desire to be the single source of truth for corporate content in the cloud will put them head to head in many accounts,” she said.

ONAP Beijing release targets deployment scenarios

Deployability is the name of the game with the Linux Foundation’s latest Open Network Automation Platform architecture.

Central to the ONAP Beijing release are seven identified “dimensions of deployability,” said Arpit Joshipura, general manager of networking and orchestration at the Linux Foundation. These seven deployability factors comprise usability, security, manageability, stability, scalability, performance and resilience.

By identifying these dimensions, the Linux Foundation expects to better address and answer questions regarding documentation, Kubernetes management, disruptive testing, multisite failures and lifecycle management transactions. The goal is better consistency among ONAP deployments, Joshipura said.

Other than the standardized support for external northbound APIs that face a user’s operational and business support systems, the ONAP Beijing release had only a handful of architectural changes from the previous Amsterdam architecture, according to Joshipura. To that end, the ONAP Beijing release features four relevant MEF APIs taken from MEF’s Lifecycle Service Orchestration architecture and framework.

An additional architectural tweak targeted the ONAP Operations Manager (OOM), which now works with Kubernetes and can run on any cloud provider, Joshipura said.

Arpit Joshipura, Linux Foundation

“All the projects within ONAP can become Docker containers, and Kubernetes orchestrates all of them,” he said. “It helps with management, portability and efficiencies in terms of VMs [virtual machines] needed to run them.”

The ONAP Beijing release also introduced Multi-site State Coordination Services, which ONAP dubbed MUSIC. MUSIC coordinates databases and synchronizes policies for ONAP deployments across multiple locations, geographies and countries — relevant for providers like Vodafone and Orange. The release also provided standard templates and VNF integration and validation related to information and data modeling.
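The coordination problem MUSIC tackles can be pictured with a toy sketch: keep the highest-versioned copy of each policy consistent across sites. Note that the `Site` class, the version scheme and `reconcile` function are invented for illustration; this is not MUSIC’s actual API or replication protocol.

```python
# Toy illustration of multi-site policy synchronization in the spirit of
# ONAP's MUSIC service: each site holds versioned policies, and a
# reconciliation pass propagates the newest revision of each policy
# to every site.

class Site:
    def __init__(self, name):
        self.name = name
        self.policies = {}  # policy name -> (version, body)

    def set_policy(self, name, version, body):
        self.policies[name] = (version, body)

def reconcile(sites):
    """Propagate the highest-versioned copy of each policy to every site."""
    merged = {}
    for site in sites:
        for name, (version, body) in site.policies.items():
            if name not in merged or version > merged[name][0]:
                merged[name] = (version, body)
    for site in sites:
        site.policies.update(merged)
    return merged

paris, bonn = Site("paris"), Site("bonn")
paris.set_policy("scale-out", 1, "max_vms=10")
bonn.set_policy("scale-out", 2, "max_vms=20")  # newer revision
reconcile([paris, bonn])
print(paris.policies["scale-out"])  # both sites now hold version 2
```

Real multi-site coordination also has to handle concurrent writes and partitions, which a last-writer-wins merge like this glosses over.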

Functional enhancements for ONAP Beijing

In addition to architectural changes, the ONAP Beijing release made a series of functional enhancements, including change management, hardware platform awareness and autoscaling with manual triggers. For example, the system follows policy to automatically move VNFs or add VMs if a certain location has excess compute capacity, which helps scale the VNFs appropriately, Joshipura said.
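The capacity-driven scale-out decision Joshipura describes can be sketched as a simple policy check. The data shapes, the `headroom` threshold and the `pick_scale_out_site` helper are invented for illustration; ONAP’s actual policy engine is far richer.

```python
# Hedged sketch of policy-driven scale-out: add a VM at whichever site has
# the most spare CPU, but only if it can absorb the VM while keeping a
# configured fraction of its total capacity free.

def pick_scale_out_site(sites, vm_cpu_cost, headroom=0.2):
    """Return the site with the most spare CPU that can still absorb one
    more VM of cost `vm_cpu_cost`, or None if no site qualifies."""
    candidates = [
        s for s in sites
        if (s["cpu_total"] - s["cpu_used"] - vm_cpu_cost)
           >= headroom * s["cpu_total"]
    ]
    if not candidates:
        return None  # no excess capacity anywhere; fall back to a manual trigger
    return max(candidates, key=lambda s: s["cpu_total"] - s["cpu_used"])

sites = [
    {"name": "edge-1", "cpu_total": 64,  "cpu_used": 60},
    {"name": "core-1", "cpu_total": 128, "cpu_used": 40},
]
print(pick_scale_out_site(sites, vm_cpu_cost=8)["name"])  # core-1
```

Returning `None` when nothing qualifies mirrors the article’s point that autoscaling coexists with manual triggers: the automated path declines rather than overcommitting a site.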

ONAP expects to make its next release, Casablanca, available at the end of 2018. ONAP Casablanca will continue work on operational and business support systems, in addition to adding more cross-project integration related to microservices architecture, Joshipura said. Further, ONAP Casablanca will introduce a formal VNF certification program and standardize features to support 5G and cross-cloud connectivity.

Linux Foundation reacts to Microsoft’s GitHub acquisition

Microsoft’s $7.5 billion acquisition of GitHub received a cautiously positive response from the Linux Foundation. Jim Zemlin, the Linux Foundation’s executive director, categorized the move as “pretty good news for the world of open source,” highlighting Microsoft’s expertise to make GitHub better. He did, however, stress Microsoft’s growing need to earn the trust of developers, while also acknowledging the existence of other open source developer platforms, such as GitLab and Stack Overflow.

“As we all evaluate the evolution of open source from the early days to now, I suggest we celebrate this moment,” Zemlin said about the purchase. “The multidecade progression toward the adoption and continual use of open source software in developing modern technological products, solutions and services is permanent and irreversible. The majority of the world’s economic systems, stock exchanges, the internet, supercomputers and mobile devices run the open source Linux operating system, and its usage and adoption continue to expand.”

Atomist extends CI/CD to automate the entire DevOps toolchain

Startup Atomist hopes to revolutionize development automation throughout the application lifecycle before traditional application release automation vendors catch up.

Development automation has been the elusive goal of a generation of tools, particularly DevOps tools that promise continuous integration and continuous delivery. The latest is Atomist and its development automation platform, which aims to automate as many mundane tasks as possible in the DevOps toolchain.

Atomist ingests information about an organization’s software projects and processes to build a comprehensive understanding of those projects. Then it creates automations for the environment, which use programming tools such as parser generators and microgrammars to parse and contextualize code.

The system also correlates event streams pulled from various stages of development and represents them as code in a graph database known as the Cortex. Because its founders believe the CI pipeline model falls short, Atomist models everything in an organization’s software delivery process as a stream of events; this event-driven model also lets development teams compose development flows based on events.
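The event-correlation idea can be illustrated with a minimal sketch: delivery events (push, build, deploy) arrive as a stream and are linked into a graph keyed by commit SHA. Atomist’s real Cortex is a full graph database; the event shapes, `ingest` and `delivery_trace` names here are invented.

```python
# Minimal sketch of correlating a delivery event stream into a graph:
# each commit SHA becomes a node whose edges are the events observed
# for it, so the full delivery trace of a commit can be read back.

from collections import defaultdict

graph = defaultdict(dict)  # sha -> {event kind: event payload}

def ingest(event):
    """Attach an incoming event to the node for its commit."""
    graph[event["sha"]][event["kind"]] = event

def delivery_trace(sha):
    """Return the ordered chain of event statuses seen for one commit."""
    node = graph[sha]
    return [node[k]["status"] for k in ("push", "build", "deploy") if k in node]

for ev in [
    {"kind": "push",   "sha": "abc123", "status": "received"},
    {"kind": "build",  "sha": "abc123", "status": "passed"},
    {"kind": "deploy", "sha": "abc123", "status": "live"},
]:
    ingest(ev)

print(delivery_trace("abc123"))  # ['received', 'passed', 'live']
```

Composing flows on top of such a graph is then a matter of subscribing to event kinds rather than wiring fixed pipeline stages together.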

In addition, Atomist automatically creates Git repositories, configures systems for issue tracking and continuous integration, and creates chat channels that consolidate notifications on the project and deliver information to the right people.

“Atomist is an interesting and logical progression of DevOps toolchains, in that it can traverse events across a wide variety of platforms but present them in a fashion such that developers don’t need to context switch,” said Stephen O’Grady, principal analyst at RedMonk in Portland, Maine. “Given how many moving parts are involved in DevOps toolchains, the integrations are welcome.”

Mik Kersten, a leading DevOps guru and CEO at Tasktop Technologies, has tried Atomist firsthand and calls it a fundamentally new approach to managing delivery. As delivery pipelines become increasingly complex, the sources of waste move well beyond the code and into the tools spread across the pipeline, Kersten noted.

The rise of microservices, which can put tens or hundreds of services in an environment, introduces trouble spots as developers collaborate on, deploy and monitor the lifecycle of those services, Johnson said.

This is particularly important for security, where keeping services consistent is paramount. In last year’s Equifax breach, hackers gained access through an unpatched version of Apache Struts — but with Atomist, an organization can identify and upgrade old software automatically across potentially hundreds of repositories, Johnson said.
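The fleet-wide dependency check Johnson describes can be sketched as a scan of many repositories’ manifests for a known-vulnerable artifact version. The manifest format, version scheme and `vulnerable_repos` helper are simplified inventions, not Atomist’s implementation.

```python
# Illustrative sketch: given each repository's dependency manifest, report
# which repos still carry a version of an artifact older than the fixed
# release, and so need an automated upgrade.

def vulnerable_repos(repos, artifact, fixed_version):
    """repos: {repo name: {artifact: "major.minor.patch" version}}."""
    def parse(v):
        return tuple(int(x) for x in v.split("."))
    return sorted(
        name for name, deps in repos.items()
        if artifact in deps and parse(deps[artifact]) < parse(fixed_version)
    )

repos = {
    "billing":  {"org.apache.struts:struts2-core": "2.3.31"},
    "catalog":  {"org.apache.struts:struts2-core": "2.5.13"},
    "frontend": {"left-pad": "1.3.0"},
}
# CVE-2017-5638, the flaw exploited in the Equifax breach, was fixed in
# Struts 2.3.32 on the 2.3.x line.
print(vulnerable_repos(repos, "org.apache.struts:struts2-core", "2.3.32"))
# ['billing']
```

A real scanner would also follow vendor advisories per release line rather than a single version comparison, but the shape of the automation is the same: detect everywhere, then patch everywhere.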

Atomist represents a new class of DevOps product that goes beyond CI, which is “necessary, but not sufficient,” said Rod Johnson, Atomist CEO and creator of the Spring Framework.

Rod Johnson, CEO, Atomist

Tasktop’s Kersten agreed that the approach to developer-centric automation “goes way beyond what we got with CI.” Atomist created a Slack bot that incorporates its automation facilities, driven by a development automation engine reminiscent of model-driven development or aspect-oriented programming, but with generative facilities that span not only code but also project resources and other tools, Kersten said. A notification system informs users what the automations are doing.

Most importantly, Atomist is fully extensible, and its entire internal data model can be exposed in GraphQL.

Tasktop has already explored ways to connect Atomist to Tasktop’s Integration Hub and the 58 Agile and DevOps tools it currently supports, Kersten said.

Automation built into development

As DevOps becomes more widely adopted, integrating automation into the entire DevOps toolchain is critical to help streamline the development process so programmers can develop faster, said Edwin Yuen, an analyst at Enterprise Strategy Group in Milford, Mass.

“The market to integrate automation and development will grow, as both the companies that use DevOps and the number of applications they develop increase,” he said. Atomist’s integration in the code creation and deployment process, through release and update management, “enables automation not just in the development process but also in day two and beyond application management,” he said.

Atomist joins other approaches such as GitOps and Bitbucket Pipelines that target the developer who chooses the tools used across the complete lifecycle, said Robert Stroud, an analyst at Forrester Research in Cambridge, Mass.

“Selection of tooling such as Atomist will drive developer productivity allowing them to focus on code, not pipeline development — this is good for DevOps adoption and acceleration,” he said. “The challenge for these tools is although new code fits well, deployment solutions are selected within enterprises by Ops teams, and also need to support on-premises deployment environments.”

For that reason, look for traditional application release automation vendors, such as IBM, XebiaLabs and CA Technologies, to deliver features similar to Atomist’s capabilities in 2018, Stroud said.

Intelligent information management the next wave for ECM

The introduction of AI and automation, along with an abundance of vendors, has caused a seismic shift in content management — so much so that, in 2017, Gartner rebranded the space as content services, a term that covers content services applications, platforms and components.

Content management company M-Files hopes its evolution toward intelligent information management is where the content services industry is heading. The thinking is that as information spreads across more silos, what the content is matters more than where it resides.

“What M-Files is doing is interesting, as it’s consistent with what I see in the industry now,” said John Mancini, chief evangelist at AIIM, a nonprofit research and education firm in Silver Spring, Md., focusing on information management. “Everyone is trying to figure out how to leverage its existing content and how to push as much as possible in the direction of the user so that content management becomes more user-defined, rather than a top-down approach.”

Focus on content type, not new thinking

The concept of focusing on what content is, rather than where it resides, isn’t unique to 2018. Intelligent information management is gaining in importance, however, as information is scattered across clouds, on-premises systems and mobile devices.

“We spent 15 years trying to convince people to care about what repository something is in, but they still don’t,” Mancini said. “But they do care if it’s a contract or a proposal or an internal document.”

It’s with that in mind that M-Files released M-Files 2018, which connects to market leaders in content storage — ranging from other content management services, like OpenText and Laserfiche, to more classical repositories, like on-premises network folders or SharePoint, and expanding out to systems of record, like Salesforce and Dynamics 365.

“There’s this trend of consumerization — the idea that the way you’ll be most productive is by allowing users to work the way they want to work,” said Greg Milliken, vice president of marketing at M-Files, based in Plano, Texas. “There’s the OpenText or Laserfiche type, where the structure is defined, and this is how we’re going to work. But everyone works a little bit differently.”

Finding content wherever it’s stored

M-Files touts that, with its intelligent information management platform, users can natively search for content based on its metadata, and locate and view any content no matter where it is stored.
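Repository-agnostic, metadata-first search can be modeled in a few lines: documents live in different back ends, but are found by what they are (their metadata class and fields), not where they sit. The field names and `find` helper are invented for illustration and are not M-Files’ API.

```python
# Toy model of metadata-driven search across repositories: each document
# records which back end holds it, but queries match only on metadata,
# so results can span SharePoint, network folders and Salesforce alike.

documents = [
    {"id": 1, "repo": "sharepoint",     "doc_class": "contract", "customer": "Acme"},
    {"id": 2, "repo": "network-folder", "doc_class": "proposal", "customer": "Acme"},
    {"id": 3, "repo": "salesforce",     "doc_class": "contract", "customer": "Globex"},
]

def find(**criteria):
    """Return documents matching all metadata criteria, regardless of repo."""
    return [d for d in documents
            if all(d.get(k) == v for k, v in criteria.items())]

hits = find(doc_class="contract", customer="Acme")
print([d["repo"] for d in hits])  # ['sharepoint']
```

The point of the sketch is that the query never names a repository: the storage location is an attribute of the result, not a parameter of the search.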

“It speaks to the trend we’re going toward, which is away from static silos and one-size-fits-all approach to something where content can be anywhere and it comes to you like a service,” Milliken said. “You can look into these repositories with a common lens.”

Being repository-agnostic requires some form of AI, and M-Files hopes its partnership with ABBYY will prove fruitful in building out its intelligent information management software.

“The way I think about it sometimes is upside-down content management,” Mancini said, referring to the repository-agnostic approach — and not a reference to Stranger Things. “Content management used to be a very top-down, orchestrated process. That colored how we all thought about the tools that individual knowledge workers used. And in that process, we tended to overcomplicate that from an end-user perspective.”

A list of the various vendors to which M-Files 2018 is connected includes other content management players, on-premises share folders and CRM systems. The connectors vary in price, according to M-Files.

Partnership adds AI to content management

Last year’s partnership between M-Files and ABBYY brought AI functionality that M-Files hopes will benefit users by automatically classifying and tagging documents and other content with the correct metadata, saving time and minimizing human error.

“Where you put something comes down to the company or even the individual,” Milliken said. “Someone might put a document in a customer folder; someone else might put it in a specific project folder — now, it’s derived automatically. The idea of applying AI to a meta-driven approach is what we’ve been striving for.”
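The auto-classification idea, deriving a document’s class from its content instead of asking where to file it, can be illustrated with keyword rules. Real M-Files/ABBYY classification relies on trained models; the `RULES` table and `classify` function below are only a stand-in.

```python
# Toy rule-based classifier: score a document's text against keyword lists
# per class and tag it with the best-scoring class, or "unclassified" when
# nothing matches.

RULES = {
    "contract": ("hereinafter", "party of the first part", "governing law"),
    "invoice":  ("invoice number", "amount due", "remit to"),
    "proposal": ("executive summary", "proposed solution", "pricing"),
}

def classify(text):
    text = text.lower()
    scores = {cls: sum(kw in text for kw in kws) for cls, kws in RULES.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unclassified"

print(classify("Invoice number 1042. Amount due: $99."))  # invoice
print(classify("Lunch menu for Tuesday"))                 # unclassified
```

The derived class then becomes the metadata the platform searches on, regardless of which folder or repository the document landed in.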

Mancini sees a metadata-driven, repository-agnostic approach as a potential transition for the content management market.

“What’s happened in the last couple of years and in surveys we’ve done, people and companies with long-term ECM [enterprise content management], the primary challenge they come back to us with is usability,” Mancini said. “The metadata-centric approach of M-Files is an attempt to do this through the prism of a knowledge worker and to see if that redefines content management.”

M-Files 2018, currently available for download, is a platform upgrade for existing customers. The AI metadata layer is an additional cost for new and existing customers, starting at $10,000 per year and varying with the size and scope of the company. The connectors to various repositories also cost extra, depending on the connector.