
Plattner: ‘Speed is the essence’ for SAP HANA Data Management Suite

The intelligent enterprise must be built on speed — speed like you get from the in-memory processing of the HANA database, which is also the foundation of the SAP HANA Data Management Suite, SAP co-founder Hasso Plattner said in his keynote address at Sapphire Now.

The vendor announced SAP HANA Data Management Suite at its annual user conference, which was held this month in Orlando, Fla. It’s an amalgam of SAP applications intended to allow companies to get a better grip on the various data sources flowing through the organization and extract business value, according to the company.

SAP HANA Data Management Suite consists of the SAP HANA database; SAP Data Hub, a data governance and orchestration platform; SAP Cloud Platform Big Data Services, a large-scale data processing platform based on Hadoop and Spark; and SAP Enterprise Architecture Designer, a collaborative cloud application that can let anyone in an organization participate in planning, designing and governing data analytics applications.

“When we talk about human intelligence, it’s directly correlated to speed. Every intelligence test in the world is based on how fast you can solve certain [tests]. Speed is the essence. And the faster you can do something, the faster you can simulate, the higher the probability that you reach a decent result,” Plattner said. “There are data pipelines, governance and workflows from one HANA system, and we can access all other HANA systems and even non-HANA systems. This is very important when we think about Leonardo projects we build outside the system, but can access any kind of data objects or services inside the system.”

Plattner outlined five innovations that are key to SAP HANA Data Management Suite:

  • data pipelines that allow access to data at its origin, which improves security and reduces management;
  • text and search, including natural language processing of unstructured data from sources like Hadoop;
  • spatial and graph functions that can combine business data with geographic and streaming data to enable much faster geo-enabled applications;
  • data anonymization that can be done on the fly, allowing for applications that can be in compliance with the General Data Protection Regulation in near-real time; and
  • persistent memory, which keeps data in nonvolatile storage that can greatly reduce the amount of time it takes to reload data in the event of an outage.
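
SAP has not published the mechanics of its on-the-fly anonymization, but the general idea can be sketched. The example below is a generic, hypothetical illustration (not SAP's implementation): drop direct identifiers and generalize quasi-identifiers, in the style of k-anonymity, so results can be analyzed without exposing individuals.

```python
# Illustrative sketch of on-the-fly data anonymization (not SAP's actual
# implementation): drop direct identifiers and generalize quasi-identifiers
# so individual rows can no longer be linked back to a person.

def generalize_age(age: int, bucket: int = 10) -> str:
    """Replace an exact age with a coarse range, e.g. 34 -> '30-39'."""
    low = (age // bucket) * bucket
    return f"{low}-{low + bucket - 1}"

def anonymize(record: dict) -> dict:
    """Return a copy of the record with direct identifiers removed
    and quasi-identifiers generalized."""
    out = dict(record)
    out.pop("name", None)                # drop the direct identifier
    out["age"] = generalize_age(out["age"])
    out["zip"] = out["zip"][:3] + "**"   # truncate the ZIP code
    return out

row = {"name": "Jane Doe", "age": 34, "zip": "32801", "revenue": 1200}
print(anonymize(row))  # -> {'age': '30-39', 'zip': '328**', 'revenue': 1200}
```

Applied in the query path rather than as a batch job, a transformation like this is what allows an application to stay GDPR-compliant in near-real time, as the bullet above describes.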

“Our objective is not only that we connect everything with everything else, but that we can develop enhancements to products without touching the products. We have to reduce the maintenance efforts,” he said.

Not new, but bundle may help


SAP HANA Data Management Suite is not really new, but the bundle may help SAP to market data management applications, said Holger Mueller, vice president and principal analyst at Constellation Research.

“It’s a combination of stuff they’ve already announced. So, it’s nothing new, but it’s really the glue to keep the new SAP together. They also want to simplify things; [Plattner] always thinks there are too many products and confusion with names and so on. So, why not put them together?” Mueller said. “In the field, they’re probably going to sell more Data Management Suite and Data Hub now that it’s bundled with HANA. They can throw it in together, so it really is just packaging. It’s a new product, so there’s only a few customers out there. But from what I see, there are some early projects, and it’s going.”

Embedding analytics into S/4HANA Cloud

Analytics is a major focus area for SAP, and the company announced it will begin to embed SAP Analytics Cloud functions directly in S/4HANA Cloud, the SaaS version of its newest ERP platform, allowing organizations to plan and run analytics in one system. SAP Analytics Cloud provides analytics for business intelligence — including SAP BusinessObjects, SAP Business Planning and Consolidation (BPC) and SAP Business Warehouse (BW) capabilities in one cloud-based platform, said Mike Flannagan, senior vice president of SAP Analytics and SAP Leonardo. The analytics functions are embedded in the applications themselves, not just made available in the data layer.

“We’re not just providing access to data that sits on premises; we’re also supporting things like doing planning in Analytics Cloud that allows you to write that to BPC. So, you can continue to do all of your systems of record in BPC, but access the information and do the planning in Analytics Cloud, which has advanced features and a more modern interface,” Flannagan said. “Just having access to the data is one thing, but our customers haven’t invested in BusinessObjects only because of the data that’s there. It’s also the semantic layer where they’ve made a significant investment — years of investment. So, being able to take advantage of that investment while using cloud functionality is really important.”

SAP faces a crowded competitive landscape on the analytics front, but the expansion of the SAP Analytics Cloud portfolio may help differentiate it, said Doug Henschen, vice president and principal analyst at Constellation Research.

“SAP Analytics Cloud is roughly 3 years old, but it was fairly late to the market — such that many large customers had already taken other paths to self-service analytics like Tableau, Microsoft PowerBI and Qlik, or to cloud-based planning with Anaplan, Adaptive Planning or Host Analytics. And there’s plenty of competition on the predictive analytics front, as well,” Henschen said.

“Standardization on SAC [SAP Analytics Cloud] across the SAP portfolio is a good move on SAP’s part that could help sway customers to give it a second look,” Henschen continued. “But I think SAP has to continue to deepen the capabilities on the self-service analytics, planning and predictive fronts to stand up against best-of-breed competitors in each of these niches.”

Bing adds new intelligent visual search features

As the saying goes, a picture is worth a thousand words. Microsoft’s new intelligent visual search technology allows users to discover information about objects captured in images without having to pick and choose a handful of keywords to fit into a search box.

The AI-powered visual search feature is available on Bing mobile apps.

“Sometimes, it is almost impossible to describe what you want to search for using words,” explained Vince Leung, product lead for Bing Images at Microsoft.

For example, imagine hiking through a meadow and seeing a flower that you’ve never seen before. You want to know what it is and whether you can get it at your local garden store to plant at home. Bing’s Visual Search can help you identify and find more information from your snapshot of the flower.

Or, perhaps you’re in the market for a new couch and spot one you like in a high-end home furnishing store, but the price tag is beyond your budget. By taking a picture of the couch, Bing’s Visual Search can help you find couches that match the style with prices that may meet your budget.

The visual search feature uses Microsoft’s computer vision algorithms, which are trained with datasets containing vast amounts of labeled images, as well as images from around the web. From the training images, the algorithms learn to recognize dogs from cats, for example, and roses from daisies.

What’s more, the learning process is never done; the performance of the algorithms improves as they get more data.
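
The claim that performance improves with more data can be illustrated with a toy example. The nearest-centroid classifier below stands in for the deep computer-vision models Bing actually uses; the data, class centers and noise level are all invented for the sketch.

```python
# Toy illustration of "more labeled data -> better classifier". A
# nearest-centroid model stands in for Bing's deep vision networks.
import random

random.seed(0)

def make_sample(label):
    # Class 0 clusters near (0, 0); class 1 near (3, 3), with Gaussian noise.
    center = (0.0, 0.0) if label == 0 else (3.0, 3.0)
    point = [center[0] + random.gauss(0, 1.5), center[1] + random.gauss(0, 1.5)]
    return (point, label)

def train(samples):
    # "Training" here is just estimating one centroid per class.
    cents = {}
    for lab in (0, 1):
        pts = [x for x, l in samples if l == lab]
        cents[lab] = [sum(p[i] for p in pts) / len(pts) for i in (0, 1)]
    return cents

def accuracy(cents, test):
    def predict(x):
        def d2(c):
            return (x[0] - c[0]) ** 2 + (x[1] - c[1]) ** 2
        return min(cents, key=lambda lab: d2(cents[lab]))
    return sum(predict(x) == l for x, l in test) / len(test)

test_set = [make_sample(i % 2) for i in range(200)]
small = [make_sample(i % 2) for i in range(4)]    # tiny training set
large = [make_sample(i % 2) for i in range(400)]  # 100x more data
print(accuracy(train(small), test_set), accuracy(train(large), test_set))
```

With only four training samples the centroid estimates are noisy; with hundreds, they settle near the true class centers, which is the same dynamic that makes Bing's algorithms better as they see more images.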

“While there have been strides for many years to get to this point,” noted Leung, “with the advent of cloud computing we are able to accelerate our ability to make sense out of pixels.”


John Roach writes about Microsoft research and innovation. Follow him on Twitter.

Visual Search from Bing now lets you search what you see

Today we’re launching new intelligent Visual Search capabilities that build upon the visual technology already in Bing so you can search the web using your camera. Now you can search, shop, and learn more about your world through the photos you take.
These new Visual Search capabilities are available today in the US on the Bing app for iOS and Android, and for Microsoft Launcher (Android only). They’ll also begin rolling out today for Microsoft Edge for Android, and will be coming soon to Microsoft Edge for iOS and Bing.com. Just click the camera button to get started.

For example, imagine you see a landmark or flower and want to learn more. Simply take a photo using one of the apps, or upload a picture from your camera roll. Bing will identify the object in question and give you more information by providing additional links to explore.



You can even shop from your photos for fashion and home furnishings. Let’s say you see a friend’s jacket you like, but don’t know its brand or where to purchase. Upload a pic into the app’s search box and Bing will return visually similar jackets, prices, and details for where to purchase.

We’ll be working hard over the coming months to add more capabilities to Visual Search, so your input on these features is greatly appreciated, as always. We hope you’re as excited by Visual Search as we are!

– The Bing Team

New tools unveiled to monitor, manage and optimize SAP environments


The world of the SAP intelligent enterprise requires new tools to monitor, manage and optimize SAP environments as they evolve to include new SAP platforms, integrations and advanced technologies.

SAP’s vision of the intelligent enterprise includes SAP Data Hub, which incorporates integration and data management components, and it shows the company can embrace modern open source platforms, like Hadoop and Spark, and hybrid and multi-cloud deployment, according to Doug Henschen, an analyst at Constellation Research.

This openness, along with extending cloud initiatives to Microsoft Azure, Google Cloud Platform and IBM private cloud instances, necessitated a move to bring customers hybrid and multi-cloud data management capabilities, Henschen said.

“The Data Hub, in particular, facilitates hybrid and multi-cloud data access without data movement and copying,” he said. “This is crucial in harnessing data from any source, no matter where it may be running, to facilitate data-driven decisioning.”

At SAP Sapphire Now 2018, several vendors unveiled new tools — or updates to existing ones — that address some of the challenges associated with moving SAP systems to the intelligent enterprise landscape.

  • Tricentis Tosca’s continuous testing method is designed to keep pace with modern SAP environments, unlike traditional testing methods, which were built for previous versions of SAP applications. These legacy testing systems may not always adequately support S/4HANA and Fiori 2.0, so many SAP users have to use manual testing to validate releases, according to Tricentis. Cloud-enabled Tricentis Tosca 11.2 now supports a variety of the newest SAP versions, including S/4HANA and Fiori 2.0.
  • Worksoft announced the release of Worksoft Interactive Capture 2.0, which is test automation software for SAP environments. Worksoft Interactive Capture 2.0 operates on the principle that it’s critical to keep existing SAP applications operating as new systems and applications are being developed. Worksoft Interactive Capture 2.0 allows business users and application functional experts to create automated business workflows, test documentation and test cases.
  • Virtual Forge announced its CodeProfiler for HANA can now scan the SAPUI5 programming language. CodeProfiler for HANA provides detailed information on code quality as a programmer writes code, similar to spell check on a word processor, according to Virtual Forge. This allows coders to identify and manage performance, security and compliance deficiencies early in the HANA application development process. Reducing or eliminating performance decline and application downtime is particularly critical, as HANA enables real-time business applications.
  • As more organizations move their SAP environments to S/4HANA — or plan to — it becomes important to understand how users actually interact with SAP applications. Knoa Software showed a new version of its user experience management application, Knoa UEM for Enterprise Applications, which is also resold by SAP as SAP User Experience Management by Knoa. The product allows organizations to view and analyze how users interact with SAP applications, including activities that lead to errors, applications that are never used and workarounds made necessary by poorly designed software, according to Knoa. The latest version allows companies migrating to S/4HANA to analyze usage across a range of SAP applications, including SAP Fiori, SAP Business Client, SAP Enterprise Portal and SAP GUI for Windows. It can also support SAP Leonardo application development by showing how customers actually use the applications and by grounding the business case in accurate measurements of user experience improvements in the new apps.
  • General Data Protection Regulation (GDPR) compliance is a huge issue now, and Attunity released Gold Client for Data Protection, a data governance application for SAP environments. Gold Client for Data Protection enables the identification and masking of personally identifiable information across production SAP ECC systems, according to Attunity. The software helps organizations to find PII across SAP systems, which then enables them to enforce GDPR’s “right to be forgotten” mandate.
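
Attunity has not detailed Gold Client's masking algorithm; the sketch below is a hypothetical illustration of the general technique such tools use: replace each PII field with a keyed, deterministic pseudonym, so test and analytics systems keep referential integrity (the same customer always maps to the same token) without exposing real data.

```python
# Hypothetical sketch of keyed, deterministic PII masking (not Attunity's
# actual algorithm). The same input always yields the same pseudonym, so
# joins across masked systems still line up.
import hashlib
import hmac

SECRET_KEY = b"rotate-me"   # assumption: a key managed outside the dataset

PII_FIELDS = {"name", "email", "phone"}

def pseudonymize(value: str) -> str:
    """Map a PII value to a short, stable, keyed token."""
    digest = hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()
    return digest[:12]

def mask_record(record: dict) -> dict:
    """Mask PII fields; pass all other fields through unchanged."""
    return {k: (pseudonymize(v) if k in PII_FIELDS else v)
            for k, v in record.items()}

rec = {"name": "Jane Doe", "email": "jane@example.com", "order_total": 99.5}
print(mask_record(rec))
```

Because the masking is keyed, discarding `SECRET_KEY` makes the pseudonyms irreversible, which is one practical way to honor a "right to be forgotten" request across copies of the data.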

Dig Deeper on SAP development

Edge computing helps IT execs in managing large data sets

Data was a hot topic at the “Building the Intelligent Enterprise” panel session at the recent MIT Sloan CIO Symposium in Cambridge, Mass. As panelists discussed, changing market trends, increased digitization and the tremendous growth in data usage are demanding a paradigm shift from traditional, centralized enterprise models to decentralized, edge computing models.

All the data required for an intelligent enterprise has to be collected and processed somehow and somewhere — sometimes in real time, presenting a challenge for companies.

Here, four IT practitioners break down best practices and architectures for managing large data sets and how they’re taking advantage of edge computing. This was in response to a question posed by moderator Ryan Mallory, senior vice president of global solutions enablement at data center provider Equinix.

Here is Mallory’s question to the panel: Having an intelligent enterprise means dealing with a lot of data. Can you provide some best practices for managing large data sets?

Alston Ghafourifar
CEO and co-founder of AI communication company Entefy Inc.

“I think it really depends on the use cases. We live in a multimodal world. Almost everything we do deals with multiple modalities of information. There are lots of different types of information, all being streamed to the same central areas. You can think about it almost like data lake intelligence.


“The hardest part of something like this is actually getting yourself ready for the fact that you actually don’t know what information you’re going to need in order to predict what you want to predict. In some cases, you don’t even necessarily know what you want to predict. You just know you want it to be cheaper, faster, safer — serve some cost function at the very end.

“So, what we tend to do is design the infrastructure to pool as much diverse information as possible to a centralized core and then understand when it finds something that predicts something else — and there’s a lot of techniques upon which to do that.

“But when the system is looking through this massively unstructured information, the moment it gets to something where it says, ‘Oh, I think this is reliable, since I’m getting this over and over again,’ it’ll take that and automatically pull it out and put it into production at the edge, because the edge is processing the application of information. [The edge] is processing enterprise information in transit, almost like a bus. It doesn’t have the benefit of you cleaning it properly, or of you knowing exactly what you’re looking for.


“Making that transaction and that transition automatic and intelligent is what takes an enterprise further. [An enterprise] could have petabytes of information, but could be bottlenecked in their learning by the 50 or 100 data scientists looking at it. Now, it could say, ‘I’m going to create the computing power of 5,000 data scientists to [do] that job for me,’ and just automatically push it out. It’s almost like a different type of cloud orchestration.”

Stephen Taylor
Global head of analytics, reporting, integration and software engineering at oil and natural gas exploration company Devon Energy


“Let me build on that and say the one thing that we’re starting to do is use more of what the industry calls a Lambda architecture, where we’re both streaming and storing it. It’s having something that’s pulling data out of your stream to store it in that long-term data store.

“What we’re doing in areas like northwest Texas or the panhandle of Oklahoma, where you have extremely limited communication capability, is we’re caching that data locally and streaming the events that you’re detecting back over the network. So, you’re only streaming a very small subset of the data back, caching the data locally and physically moving that data to locations, up to the cloud and doing that big processing, and then sending the small processing models back to the edge.

“One of the things I think you have to do, though, is understand that — to [Ghafourifar’s] point — you don’t know what you don’t know yet. And you don’t even know what questions you’re going to get yet, and you don’t know what business problems you’re going to have to solve yet. The more you can do to capture all of the data — so then when you do your data science work, you have it all — the better. But differentiate what you need for processing versus what you need for storage and for data science work. Those are two different workloads.”
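
The edge pattern Taylor describes can be sketched in a few lines: cache every raw reading locally, but stream back only the detected events, so a constrained link carries a small subset of the data. The class name, threshold and readings below are illustrative inventions, not Devon Energy's actual system.

```python
# Sketch of the edge-caching pattern described above: keep all raw data
# locally (to ship in bulk later) and stream only detected events over
# the limited network link. All names and values are illustrative.
from collections import deque

class EdgeNode:
    def __init__(self, threshold: float, cache_size: int = 10000):
        self.threshold = threshold
        self.local_cache = deque(maxlen=cache_size)  # full data, moved in bulk later
        self.uplink = []                             # small event stream sent now

    def ingest(self, reading: float) -> None:
        self.local_cache.append(reading)   # cache everything locally
        if reading > self.threshold:       # only events cross the network
            self.uplink.append(reading)

node = EdgeNode(threshold=90.0)
for pressure in [72.0, 88.5, 93.2, 70.1, 95.8, 85.0]:
    node.ingest(pressure)
print(len(node.local_cache), node.uplink)  # all 6 cached, only 2 streamed
```

In a Lambda-style deployment, the `uplink` events feed the real-time (speed) layer while the local cache is later loaded into the long-term store for batch processing and data science work.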

Michael Woods
Vice president of information technology at engineering and construction firm CDM Smith


“I can use construction as a primary example. We have large streams of data that we want to analyze as part of the construction process, because we want to know what’s happening in real time. We might have remotely operated vehicles driving or flying around doing LIDAR or radar activity, monitoring things from a visualization standpoint, etc., and that data maybe is getting streamed somewhere and kept. But, in real time, we just want to know what’s changing and when it’s changing — like weather patterns and other things going on. We want to analyze all that in real time.

“Now, at the end of the project, that’s when the data scientists might say, ‘We want to improve our construction process. So, what can we do with that data to help us determine what will make our next construction projects be more successful, take less time and be more cost-effective?'”

Hugh Owen
Senior vice president of product marketing at business intelligence software provider MicroStrategy


“In terms of [managing large data sets], we try and push down as much of the processing into the Hadoop data structure — into the database — as possible. So, we’re always pulling as small an amount of data back as possible, rather than push as much data as possible to the edge, which ties into some of the points we’ve already made.

“I think you should always try to optimize and reduce the amount of information that comes back. For us, we’re doing that because we want the response to come back faster.”
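
The pushdown idea Owen describes is easy to demonstrate. The sketch below uses SQLite (standing in for Hadoop or HANA) with an invented `sales` table: the aggregation runs inside the database, and only one summary row per group travels back to the client.

```python
# Sketch of query pushdown: let the database compute the aggregate and
# return a few summary rows instead of fetching every record. SQLite and
# the sales table stand in for a Hadoop/HANA back end.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("east", 100.0), ("east", 250.0), ("west", 75.0)])

# Pushed down: the GROUP BY runs in the database; one row per region returns.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('east', 350.0), ('west', 75.0)]
```

The alternative — `SELECT * FROM sales` followed by client-side summation — would move every row over the network, which is exactly the traffic this pattern exists to avoid.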

Driving opportunity for device partners in the era of the intelligent cloud and intelligent edge – The Official Microsoft Blog

Satya Nadella recently shared our vision for the future of computing – one in which the intelligent edge and intelligent cloud create experiences where trusted technology is part of the fabric of our lives. Today at Computex 2018, joined by executives from Microsoft’s engineering, marketing and research teams, I had the opportunity to share what that vision means for our device partners. The opportunity to leverage artificial intelligence (AI), ubiquitous computing and Microsoft 365 multi-device and multi-sense experiences has never been greater. Together we can create new, compelling devices and experiences in the era of the intelligent edge and intelligent cloud.

To accelerate innovation in this new era we invite all our partners to join our Intelligent Edge Partner Community. The community will help partners connect with one another to identify opportunities to collaborate on technology innovation and achieve shared business goals. In addition, community members will be able to participate in training and community events, and can participate in early-adopter programs that provide access to documentation, specs, OS builds and certification details. To sign up, simply head to http://Microsoft.com/intelligentedge.

Today, in Taipei, we announced a new category of teamwork devices: Windows Collaboration Displays. These large, interactive displays will let people experience Microsoft 365 collaboration tools: Office, Teams and Whiteboard at room scale, and include built-in sensors that can connect to Azure IoT spatial intelligence capabilities. This incredible technology will allow facility managers to utilize environmental data to make real-time decisions. A variety of collaboration displays, from Sharp and Avocor will be available later this year.

A Windows Collaboration Display from Sharp.

We also announced Windows 10 IoT Core Services. This new service offering enables partners to commercialize secure Internet of Things (IoT) devices backed by industry-leading support. It provides device makers the ability to manage updates for the OS, apps, settings and OEM-specific files, and is backed by 10 years of support.

Advances in ubiquitous computing and AI will drive the intelligent cloud and intelligent edge era

Every part of our lives, from our homes to our cars to our workplaces, is being transformed by digital technology. We are seeing this in every industry and sector of our economy. The era of the intelligent cloud and intelligent edge will be driven by advances in ubiquitous computing, artificial intelligence and multi-sense, multi-device experiences.

First, with ubiquitous computing, Azure is being built as the world’s largest computer with cloud services from 50 regions around the planet, more than any other cloud provider. Azure also has the broadest set of compliance certifications in the industry, and brands across all industries are using Azure at scale. With offerings like Azure Stack, an extension of Azure that enables a truly consistent hybrid cloud platform; Azure IoT, a broad set of services that power IoT solutions; Azure IoT Edge, which allows devices on the edge to act on their own and connect only when needed; and Azure Sphere, a new solution to secure the 9 billion microcontroller unit (MCU)-powered devices that are built and deployed every year, Microsoft provides the most comprehensive ubiquitous computing fabric that partners can use to bring intelligence to edge devices from servers to gateways, to the smallest MCU-based sensors.

Computer scientists at Microsoft have been working on AI technologies for decades. Thanks to the immense computing power of the Azure cloud, access to comprehensive and secure data spanning services such as Bing, Office and LinkedIn, and the AI breakthroughs coming out of our worldwide network of research labs, we are uniquely able to infuse AI into our core products and services. Beyond that we’re delivering AI tools and frameworks including cognitive, vision, spatial and object APIs, and the recently announced Project Brainwave, an architecture for deep neural net processing on the edge – all of which partners can use to enable next-generation AI applications and solutions that run on devices.

Modern devices amplify the power of Microsoft 365 

Using Microsoft’s programs, platforms and suite of services, our partners are bringing to life a breadth of devices at the intelligent edge that delight customers and empower them to do more.

Microsoft 365 enables people and organizations to embrace the modern culture of work, to be more creative, work together more effectively and have a more productive experience – without sacrificing protection and security. This platform opens the door for new experiences brought to life by great hardware innovations from our partner ecosystem that help users fluidly go from mouse to keyboard to touch to ink – and beyond, to multi-sense scenarios like voice and vision.

Modern devices from our partners light up Microsoft 365 features. Today at Computex, for the first time, we showed the brand-new HP ProBook x360 440 built for growing businesses and professionals on the go. Powered by Windows 10, the ultra-slim device delivers the power, security and durability businesses demand in a versatile 360-degree design. Built-in security from HP BIOSphere Gen4, a firmware ecosystem that automates protection of the BIOS, coupled with Microsoft 365 and an infrared sensor supporting Windows Hello face authentication, provides incredibly strong protection.

The new HP ProBook x360 440, built for business.

Another great device that takes advantage of Microsoft 365 is Asus ZenBook Pro 15 UX580, beautifully designed with an all-aluminum unibody in a luxurious Deep Dive Blue with Rose Gold detailing. This high-performance laptop isn’t just beautiful, it’s also powerful and able to handle the most demanding tasks with ease, powered by the latest eighth-generation Intel Core processors. The ZenBook Pro series also features Windows Hello capabilities and built-in support for Amazon Alexa voice services, giving users new, smart ways to interact with their laptop.

The Asus ZenBook Pro 15 UX580.

Last year at Computex we announced a new category of always-connected PCs that partners including Asus, HP and Lenovo are bringing to market. These devices come with incredible battery life and work like your phone with always-on connectivity. Earlier this week, Qualcomm announced that Samsung is joining us to expand this category of always-connected devices.

A new era needs a new level of trust

This new era represents tremendous opportunity for the ecosystem, and it comes with a responsibility to ensure that the technologies, devices and solutions we all create are trusted by the individuals and organizations that use them. We also need to ensure that everyone can experience technology’s benefits and that what we build is inclusive. We all need to work together to ensure privacy, protect the legal rights of people around the world, drive cybersecurity efforts to keep the world safe and take steps to ensure that AI works in ethical and responsible ways.

Every part of our lives, every industry and every sector of our economy is being digitally transformed. There are limitless opportunities for Microsoft partners – from the largest Azure servers to the smallest devices using Azure Sphere and everything in between.

I’m so inspired by the opportunity for innovation that’s made possible by the intelligent cloud, the intelligent edge and AI, and I look forward to the future we’ll build together.

SAP and Accenture collaborate on entitlement management platform

SAP and Accenture are teaming to deliver an intelligent entitlement management application intended to help companies build and deploy new business models.

Entitlement management applications help companies grant, enforce and administer customer access entitlements (usually referred to as authorizations, privileges, access rights or permissions) to data, devices and services — including embedded software applications — from a single platform.

The new SAP Entitlement Management allows organizations to dynamically change individual customer access rights and install renewal automation capabilities in applications, according to SAP. This means they can create new offerings that use flexible pricing structures.

The new platform’s entitlement management and embedded analytics integrate with SAP S/4HANA’s commerce and order management functions, which according to SAP, can help organizations create new revenue streams and get new products and services to market faster.

Accenture will provide consulting, system development and integration, application implementation, and analytics capabilities to the initiative.

“As high-tech companies rapidly transition from stand-alone products to highly connected platforms, they are under mounting pressure to create and scale new intelligent and digital business models,” said David Sovie, senior managing director of Accenture’s high-tech practice, in a press release. “The solution Accenture is developing with SAP will help enable our clients to pivot to as-a-service business models that are more flexible and can be easily customized.”

SAP and Accenture go on the defense

SAP and Accenture also unveiled a new platform that provides digital transformation technology and services for defense and security organizations.

The digital defense platform is based on S/4HANA, contains advanced analytics capabilities and enables greater use of digital applications by military personnel. It includes simulations and analytics applications intended to help defense and security organizations plan and run operations efficiently and respond quickly to changing operating environments, according to SAP and Accenture.

“This solution gives defense agencies the capabilities to operate in challenging and fast-changing geo-political environments that require an intelligent platform with deployment agility, increased situational awareness and industry-specific capabilities,” said Antti Kolehmainen, Accenture’s managing director of defense business, in a press release.

The platform provides data-driven insights intended to help leaders make better decisions, and it enables cross-enterprise data integration in areas like personnel, military supply chain, equipment maintenance, finances and real estate.

IoT integration will enable defense agencies to connect devices that can collect and exchange data. The digital defense platform technology is available to be deployed on premises or in the cloud, according to the companies.

“The next-generation defense solution will take advantage of the technology capabilities of SAP S/4HANA and Accenture’s deep defense industry knowledge to help defense agencies build and deploy solutions more easily and cost-effectively and at the same time enable the digital transformation in defense,” said Isabella Groegor-Cechowicz, SAP’s global general manager of public services, in a press release.

New application and customer experience tool for SAP environments

AppDynamics (a Cisco company) has unveiled a new application and customer experience monitoring software product for SAP environments.

AppDynamics for SAP provides visibility into SAP applications and customer experiences via code-level insights into customer taps, swipes and clicks, according to AppDynamics. This helps companies understand the performance of SAP applications and databases, as well as the code impact on customers and business applications.


“The modern enterprise is in a challenging position,” said Thomas Wyatt, AppDynamics’ chief strategy officer, in a press release. “To satisfy customer expectations, it needs to meet the demands of an agile, digital business, while also maintaining and operating essential core systems.”

AppDynamics for SAP allows companies to collaborate around business transactions, using a unit of measurement that automatically reveals customers' interactions with applications. They can then identify and map transactions flowing between each customer-facing application and systems of record, such as SAP ERP or CRM systems that include complex integration layers like SAP Process Integration and SAP Process Orchestration.
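The idea of mapping a business transaction across tiers can be illustrated with a small sketch: a shared correlation ID joins a customer-facing event to the back-end step it triggered. The data, field names and `map_transactions` helper below are hypothetical, chosen for illustration; they are not the AppDynamics product API.

```python
# Toy illustration of correlating a business transaction across tiers
# using a shared transaction ID (hypothetical data and field names).
frontend_events = [
    {"txn_id": "T1", "action": "checkout_click", "latency_ms": 120},
    {"txn_id": "T2", "action": "search", "latency_ms": 45},
]
erp_records = [
    {"txn_id": "T1", "system": "SAP ERP", "step": "create_sales_order"},
    {"txn_id": "T2", "system": "SAP CRM", "step": "query_customer"},
]

def map_transactions(front, back):
    """Join customer-facing events to back-end steps on txn_id."""
    by_id = {r["txn_id"]: r for r in back}
    return [
        {**event,
         "system": by_id[event["txn_id"]]["system"],
         "backend_step": by_id[event["txn_id"]]["step"]}
        for event in front
        if event["txn_id"] in by_id
    ]

for row in map_transactions(frontend_events, erp_records):
    print(row["txn_id"], row["action"], "->", row["system"], row["backend_step"])
```

In a real monitoring product, the correlation ID is propagated automatically through instrumentation rather than joined from logs, but the mapping principle is the same.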

AppDynamics for SAP includes ABAP code-level diagnostics and native ABAP agent monitoring that provides insights into SAP environments with code and database performance monitoring, dynamic baselines, and transaction snapshots when performance deviates from the norm. It also includes intelligent alerting to IT based on health rules and baselines that are automatically set for key performance metrics on every business transaction. Intelligent alerting policies integrate with existing enterprise workflow tools, including ServiceNow, PagerDuty and JIRA.
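Dynamic baselining of the kind described above can be sketched in a few lines: maintain a rolling window of response times per transaction and flag samples that deviate far from the baseline. This is a minimal, generic sketch of the technique, not AppDynamics' implementation; the class name and thresholds are assumptions.

```python
from collections import deque
import statistics

class BaselineAlerter:
    """Toy dynamic-baseline alerter (illustrative, not the AppDynamics
    API): flags a response-time sample that deviates more than k
    standard deviations from a rolling baseline."""

    def __init__(self, window=50, k=3.0, min_samples=10):
        self.window = deque(maxlen=window)  # rolling sample window
        self.k = k                          # deviation threshold
        self.min_samples = min_samples      # warm-up before alerting

    def observe(self, response_ms):
        """Record a sample; return True if it breaches the baseline."""
        breach = False
        if len(self.window) >= self.min_samples:
            mean = statistics.fmean(self.window)
            stdev = statistics.pstdev(self.window)
            if stdev > 0 and abs(response_ms - mean) > self.k * stdev:
                breach = True
        self.window.append(response_ms)
        return breach

alerter = BaselineAlerter()
for t in [100, 102, 98, 101, 99, 103, 97, 100, 102, 98]:
    alerter.observe(t)          # warm up the baseline around ~100 ms
print(alerter.observe(500))     # prints True: spike breaches baseline
```

A production system would keep one baseline per business transaction and route breaches to workflow tools such as ServiceNow or PagerDuty, as the article describes.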

This means companies can understand dependencies across the entire digital business, baseline normal performance, and identify and isolate the root causes of problems before they affect customers. AppDynamics for SAP also helps companies plan SAP application migrations to the cloud and monitor user experiences post-migration, according to AppDynamics.

The intelligent enterprise blends tech savvy, business smarts

The term intelligent enterprise was coined in the early 1990s by James Brian Quinn. It was a business theory holding that technology and computing infrastructure were key to improved business performance. That was before the internet was a household word, before smartphones were in every hand and before cloud computing became the default for so many.

In 2018, intelligent enterprise takes on a new significance. Ryan Mallory, an executive at data center provider Equinix, hosted a panel on the topic at the MIT Sloan CIO Symposium in Cambridge, Mass., on May 23. His definition didn’t stray far from the original: a company that uses technology to “become more efficient, more productive and have a larger impact on the success of the overall enterprise goals and objectives.”

As senior vice president for global solutions enablement at the Redwood City, Calif., company, Mallory helps companies use Equinix’s data center resources to their advantage, easing their transition to cloud and emerging technologies like AI. So, for him, talking future tech and how to prepare for it is business as usual. IT execs at the MIT event spoke to him about a major challenge they faced in getting started: the breakneck pace of development.

“They’re having a hard time even getting through their tech-dev, DevOps evaluation cycle before that product is either obsolete or it has changed its overall focus,” he said.

In an interview at the MIT conference, Mallory discussed the digital transformation efforts companies need to undertake to become an intelligent enterprise, how they're faring in deploying the necessary technologies, including cloud, and the challenges Equinix faced during its own emerging-tech initiatives. Here are edited excerpts of that conversation.

Ryan Mallory, senior vice president for global solutions enablement, Equinix

Part of your job is helping business executives with their cloud deployment strategies. What do executives get about cloud, and where do they need the most help?

Ryan Mallory: Executives get that they need cloud, period. So the financial modeling and ingestion of services in a bare-metal environment — meaning they’re using their own dollars every single year, every two years to buy servers and then do the refreshes — they realize that the economic modeling associated with that just isn’t viable any longer. They know that they need to consume services in a virtualized fashion, and they want to get there.

The biggest challenge that they have is, over the past eight or 10 years, they’ve really reduced their IT staffs — most importantly, their network staffs. And what they realize is, what they have today can’t get them to where they need to be tomorrow. They don’t have intellectual property inside, from a staffing perspective, to cross that chasm, to understand how to actually get cloudified.

You’re moderating a panel here at the MIT CIO Symposium on becoming an intelligent enterprise. Can you define that?

Mallory: What I look at as an intelligent enterprise is a company or an enterprise that has the ability to utilize technology to become more efficient, more productive and have a larger impact on the success of the overall enterprise goals and objectives.

Does a business need to go through a digital transformation to become an intelligent enterprise?

Mallory: Absolutely. It’s always multi-tiered, and it should be viewed as multiyear, and it shouldn’t ever be viewed as a start and an end point. Because with today’s technology advancements and development life cycles, we see Moore’s Law continually compressing with the ability to have better capabilities, better services, better chips, etc. So the need to advance your digital strategy aligns with that, because it can continue to get better.

I was meeting with a couple of companies here, and they were very clearly stating that their biggest challenge is, when they start something in R&D, they're having a hard time even getting through their tech-dev, DevOps evaluation cycle before that product is either obsolete or it has changed its overall focus. And their R&D dollars have to be refunded and then go back into another round of R&D.

That’s the cycle we’re seeing, especially from SaaS applications and the capabilities out there: These near-term roadmap capabilities are just evolving so quickly that there’s a frustration around, ‘How do we make sure we stay in front of this?’ And, ‘How are we looking at the current state and the “to be,” but also that the innovation ideation is taking place, as well?’

What are some of those key technologies?

Mallory: It’s AI. It’s analytics, machine learning, but it’s also the core apps that sit inside Microsoft Azure. We’ve had conversations around some of their data science applications and even some of their advanced notebooking capabilities, where companies are using them for a specific purpose.

One of the companies I was talking to this morning is an oil and gas company that was using some of the data analytics and scientific mechanisms that Microsoft had, and they were all bought in [on a Microsoft product]. Six months into their R&D cycle, they end-of-lifed that product because they were going to launch something else. And that’s not a hit on Microsoft; it’s just how fast people are developing their underlying products, and that’s coming from the hyperscalers all the way down to startups.

And we've seen this evolve. Entefy is a very hot, up-and-coming AI-machine learning company in Silicon Valley. And just in the last 18 months, we've seen their product set evolve from one unified messaging-type platform with AI capabilities, which let you combine text, voice and email into a single information interface, to an eight-product suite. That's how quickly they're evolving. That original UC model is still there; it's OK. But the deep search and deep learning capabilities are what they've really moved into. The market is just so dynamic right now; you just have to really focus on being in front of it.

So, the pace of development is a challenge. What are some other challenges companies will face?

Mallory: They're very adept at their own code and their own algorithms; they're very focused on that product development. The challenge is being able to integrate those capabilities in a usable fashion out in the market.

If we look at the marketplace from a silo perspective, what's happening with customers, what's happening with vendors, you can look at the widgets and say, 'OK, great. That's a good idea,' and, 'That's a great algorithm,' or, 'That's a great implementation.' But when you start looking at AI-machine learning and say, 'OK, we want to start looking at some models for smart cities,' what's the endpoint? Data cameras that, every time somebody goes through a toll, capture what the car looks like, how many passengers are in it and what speed it's going. Then, companies can start looking at, 'Where do we want to put charging stations? What's the uptick of electric cars? Should we sell this information to Tesla and let them put a new kiosk in a mall?'

But connecting all those ancillary dots becomes very difficult, because you’re taking technologies, and you’ve got to figure out how to integrate them, whether it’s physically or logically, and then figure out how to make sure that people can access the information. More importantly, how can they act on that info? That’s the big challenge.

Equinix has implemented a lot of emerging technologies — it spearheaded a number of AI initiatives, for example. What challenges did you experience when preparing for and executing on them?

Mallory: I think everybody assumes, when they think about AI or IoT, that there are quick-fix scenarios out there. 'OK, great. I'm IoT-enabled.' Well, if a device is IoT-enabled, that just means that endpoint has the capability to communicate more broadly. When you look at a deployment plan or a deployment scheme for trying to make things smarter or have access to that data, that means putting sensors everywhere.

The real challenge is the scope of the deployment associated with making things more intelligent. That's the hard part. The technology, and being able to define it, has been proven. And I think everybody in the technology field is very comfortable with where we're at and where we're going to go. But it's making sure that anything that's legacy, anything that was built before today, has those capabilities. That's where the challenge is. We're talking about hundreds of thousands of sensors. It takes a lot of man-hours to go in and put a sensor on every light switch.

The physical deployment.

Mallory: Absolutely. We have the aggregation mechanisms, both public and private, with private connectivity through our fabric and public connectivity through the internet or Wi-Fi, so the access and the concepts are there. But it's the time associated with deployment that I think is catching everybody right now.

Equinix's Ryan Mallory discusses his company's moves in a growing data center services market in part two of this two-part Q&A.