Microsoft bills Azure network as the hub for remote offices

Microsoft’s foray into the rapidly growing SD-WAN market could solve a major customer hurdle and open Azure to even more workloads.

All the major public cloud platforms have increased their networking functionality in recent months, and Microsoft’s latest service, Azure Virtual WAN, pushes the boundaries of those capabilities. The software-defined network acts as a hub that links with third-party tools to improve application performance and reduce latency for companies with multiple offices that access Azure.

IDC estimates the software-defined wide area network (SD-WAN) market will hit $8 billion by 2021, as cloud computing continues to proliferate and employees must access cloud-hosted workloads from various locations. So far, the major cloud providers have left that work to partners.

But this Azure network service solves a big problem for customers that make decisions about network transports and integration with existing routers, as they consume more cloud resources from more locations, said Brad Casemore, an IDC analyst.

“Now what you’ve got is more policy-based, tighter integration within the SD-WAN,” he said.

Azure Virtual WAN uses a distributed model to link Microsoft’s global network with traditional on-premises routers and SD-WAN systems provided by Citrix and Riverbed. Microsoft’s decision to rely on partners, rather than provide its own gateway services inside customers’ offices, suggests it doesn’t plan to compete across the totality of the SD-WAN market, but rather provide an on-ramp to integrate with third-party products.
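
Microsoft exposes Virtual WAN through the standard Azure Resource Manager API, so standing up the hub resource itself is an ordinary REST call. Here is a minimal sketch in Python; the resource path follows the Microsoft.Network/virtualWans type, but the api-version, resource names and token handling are illustrative assumptions rather than a documented quickstart:

```python
import os
import requests

# All names below are illustrative; the api-version is an assumption
# based on the service's 2018 preview and may differ in practice.
SUBSCRIPTION = os.environ["AZURE_SUBSCRIPTION_ID"]
TOKEN = os.environ["AZURE_ARM_TOKEN"]      # bearer token from Azure AD
RESOURCE_GROUP = "branch-networking-rg"    # hypothetical resource group
API_VERSION = "2018-04-01"

url = (
    "https://management.azure.com"
    f"/subscriptions/{SUBSCRIPTION}/resourceGroups/{RESOURCE_GROUP}"
    "/providers/Microsoft.Network/virtualWans/branch-wan"
    f"?api-version={API_VERSION}"
)

# PUT creates (or updates) the Virtual WAN resource; remote-office VPN
# sites and third-party SD-WAN devices then attach to hubs under it.
resp = requests.put(
    url,
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"location": "westus", "properties": {}},
)
resp.raise_for_status()
print(resp.json()["id"])
```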

Customers can already use various SD-WAN providers to easily link to a public cloud, but Microsoft has taken the level of integration a step further, said Bob Laliberte, an analyst at Enterprise Strategy Group in Milford, Mass. Most SD-WAN vendors are building out security ecosystems, but Microsoft already has that in Azure, for example.

This could also simplify the purchasing process, and it would make sense for Microsoft to eventually integrate this virtual WAN with Azure Stack to help facilitate hybrid deployments, Laliberte said.

The Azure Virtual WAN service is billed as a way to connect remote offices to the cloud, and also to each other, with improved reliability and availability of applications. But that interoffice linkage also could lure more companies to use Azure for a whole host of other services, particularly customers just starting to embrace the public cloud.

There are still questions about the Azure network service, particularly around multi-cloud deployments. It’s unclear if customers trust Microsoft — or any single hyperscale cloud vendor — at the core of their SD-WAN implementation, as their architectures spread across multiple clouds, Casemore said.

Azure updates boost network security, data analytics tools

Microsoft also introduced an Azure network security feature this week, Azure Firewall, with which users can create and enforce network policies across multiple endpoints. The stateful firewall protects Azure Virtual Network resources and maintains high availability without restrictions on scale.
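
Those policies take the form of rule collections on the firewall resource. As a rough illustration of the shape of such a policy (the names and addresses are hypothetical, and the field layout is paraphrased from the ARM template format as understood at the time of writing, not quoted from Microsoft's docs), a network rule collection allowing outbound DNS might look like this:

```python
# Hypothetical network rule collection in the ARM-template style; field
# names follow the Microsoft.Network/azureFirewalls schema as understood
# at the time of writing and may differ in current API versions.
allow_dns_collection = {
    "name": "allow-dns",
    "properties": {
        "priority": 200,
        "action": {"type": "Allow"},
        "rules": [
            {
                "name": "dns-out",
                "protocols": ["UDP"],
                "sourceAddresses": ["10.0.0.0/24"],       # example subnet
                "destinationAddresses": ["209.244.0.3"],  # example resolver
                "destinationPorts": ["53"],
            }
        ],
    },
}
```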

Several other updates include an expanded Azure Data Box service, still in preview, which provides customers with an appliance onto which they can upload data and ship it directly to an Azure data center. These types of devices have become a popular means to speed massive migrations to public clouds. Another option for Azure users, Azure Data Box Disk, uses SSDs to transfer up to 40 TB of data spread across five drives, or 8 TB apiece. That’s smaller than the original box’s 100 TB capacity, and better suited to collect data from multiple branches or offices, the company said.

Microsoft also doubled the query performance of Azure SQL Data Warehouse to support up to 128 concurrent queries, and waived the transfer fee for migrations to Azure of legacy applications that run on Windows Server and SQL Server 2008/2008 R2, for which Microsoft will end support in July 2019. The company also plans to add features to Power BI for ingestion and integration across BI models, similar to Microsoft customers’ experience with Power Query for Excel.

GandCrab ransomware adds NSA tools for faster spreading

With version 4, GandCrab ransomware has undergone a major overhaul, adding an NSA exploit to help it spread and targeting a larger set of systems.

The updated GandCrab ransomware was first discovered earlier this month, but researchers are just now learning the extent of the changes. The code structure of the GandCrab ransomware was completely rewritten. And, according to Kevin Beaumont, a security architect based in the U.K., the malware now uses the EternalBlue National Security Agency (NSA) exploit to target SMB vulnerabilities and spread faster.

“It no longer needs a C2 server (it can operate in airgapped environments, for example) and it now spreads via an SMB exploit – including on XP and Windows Server 2003 (along with modern operating systems),” Beaumont wrote in a blog post. “As far as I’m aware, this is the first ransomware true worm which spreads to XP and 2003 – you may remember much press coverage and speculation about WannaCry and XP, but the reality was the NSA SMB exploit (EternalBlue.exe) never worked against XP targets out of the box.”

Joie Salvio, senior threat researcher at Fortinet, based in Sunnyvale, Calif., found the GandCrab ransomware was being spread to targets via spam email and malicious WordPress sites and noted another major change to the code.

“The biggest change, however, is the switch from using RSA-2048 to the much faster Salsa20 stream cipher to encrypt data, which had also been used by the Petya ransomware in the past,” Salvio wrote in the analysis. “Furthermore, it has done away with connecting to its C2 server before it can encrypt its victims’ file, which means it is now able to encrypt users that are not connected to the Internet.”
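
The speed difference Salvio points to is easy to demonstrate: RSA-2048 is an asymmetric cipher suited to small payloads such as keys, while Salsa20 is a stream cipher that encrypts bulk data in a single fast pass. A minimal sketch using the pycryptodome library (purely illustrative; this is not GandCrab's code):

```python
from Crypto.Cipher import Salsa20
from Crypto.Random import get_random_bytes

key = get_random_bytes(32)          # Salsa20 takes a 16- or 32-byte key
data = b"example file contents" * 1000

# Encrypt: the stream cipher processes arbitrary-length data in one pass,
# which is why it is so much faster than RSA for whole files.
cipher = Salsa20.new(key=key)
ciphertext = cipher.nonce + cipher.encrypt(data)

# Decrypt using the 8-byte nonce stored alongside the ciphertext.
nonce, body = ciphertext[:8], ciphertext[8:]
plaintext = Salsa20.new(key=key, nonce=nonce).decrypt(body)
assert plaintext == data
```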

However, the GandCrab ransomware appears to specifically target users in Russian-speaking regions. Fortinet found the malware checks the system for use of the Russian keyboard layout before it continues with the infection.
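
That kind of locale check is a common targeting trick and is easy for defenders to reproduce. A rough Python equivalent of a Windows keyboard-layout test, using ctypes, is sketched below; it illustrates the technique Fortinet describes rather than the malware's actual routine. The low word of each layout handle holds the language identifier, and 0x0419 is Russian:

```python
import ctypes

RUSSIAN_LCID = 0x0419  # Windows language identifier for Russian

def russian_layout_installed() -> bool:
    """Return True if any installed keyboard layout is Russian."""
    user32 = ctypes.windll.user32
    count = user32.GetKeyboardLayoutList(0, None)  # number of layouts
    buf = (ctypes.c_void_p * count)()
    user32.GetKeyboardLayoutList(count, buf)
    # The low word of each layout handle is the language identifier.
    return any((h or 0) & 0xFFFF == RUSSIAN_LCID for h in buf)

if __name__ == "__main__":
    print(russian_layout_installed())
```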

Despite the overhaul of the GandCrab ransomware and the expanded systems being targeted, Beaumont and Salvio both said basic cyber hygiene should be enough to protect users from attack. This includes installing the EternalBlue patch released by Microsoft, keeping antivirus up-to-date and disabling SMB version 1 altogether, which is advice that has been repeated by various outlets, including US-CERT, since the initial WannaCry attacks began.
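
The SMBv1 part of that advice is simple to audit. On Windows, the server-side SMBv1 switch lives under the LanmanServer registry key; a quick Python check might look like the sketch below. One assumption worth stating: when the value is absent, the OS default applies, which on older Windows releases means SMBv1 is still enabled.

```python
import winreg

KEY = r"SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters"

def smb1_status() -> str:
    """Report the server-side SMBv1 setting from the registry."""
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY) as key:
        try:
            value, _ = winreg.QueryValueEx(key, "SMB1")
        except FileNotFoundError:
            # No explicit value: the OS default applies, which on older
            # Windows versions means SMBv1 is still enabled.
            return "not set (default applies)"
    return "disabled" if value == 0 else "enabled"

if __name__ == "__main__":
    print("SMBv1:", smb1_status())
```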

Database DevOps tools bring stateful apps up to modern speed

DevOps shops can say goodbye to a major roadblock in rapid application development.

At this time in 2017, cultural backlash from database administrators (DBAs) and a lack of mature database DevOps tools made stateful applications a hindrance to the rapid, iterative changes made by Agile enterprise developers. But now, enterprises have found both application and infrastructure tools that align databases with fast-moving DevOps pipelines.

“When the marketing department would make strategy changes, our databases couldn’t keep up,” said Matthew Haigh, data architect for U.K.-based babywear retailer Mamas & Papas. “If we got a marketing initiative Thursday evening, on Monday morning, they’d want to know the results. And we struggled to make changes that fast.”

Haigh’s team, which manages a Microsoft Power BI data warehouse for the company, has realigned itself around database DevOps tools from Redgate since 2017. The DBA team now refers to itself as the “DataOps” team, and it uses Microsoft’s Visual Studio Team Services to make as many as 15 to 20 daily changes to the retailer’s data warehouse during business hours.

Redgate’s SQL Monitor was the catalyst to improve collaboration between the company’s developers and DBAs. Haigh gave developers access to the monitoring tool interface and alerts through a Slack channel, so they could immediately see the effect of application changes on the data warehouse. They also use Redgate’s SQL Clone tool to spin up test databases themselves, as needed.

“There’s a major question when you’re starting DevOps: Do you try to change the culture first, or put tools in and hope change happens?” Haigh said. “In our case, the tools have prompted cultural change — not just for our DataOps team and dev teams, but also IT support.”

Database DevOps tools sync schemas

Redgate’s SQL Toolbelt suite is one of several tools enterprises can use to make rapid changes to database schemas while preserving data integrity. Redgate focuses on Microsoft SQL Server, while other vendors, such as Datical and DBmaestro, support a variety of databases, such as Oracle and MySQL. All of these tools track changes to database schemas from application updates and apply those changes more rapidly than traditional database management tools. They also integrate with CI/CD pipelines for automated database updates.
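
Whatever the vendor, the underlying pattern is the same: each schema change is a discrete, ordered migration recorded in the database itself, so any pipeline can apply it exactly once. A toy version of that bookkeeping in Python with SQLite (a sketch of the general pattern, not any of these products' implementations):

```python
import sqlite3

# Ordered schema migrations; in a real pipeline these would live in
# version control alongside the application code.
MIGRATIONS = [
    ("001_create_orders", "CREATE TABLE orders (id INTEGER PRIMARY KEY)"),
    ("002_add_total", "ALTER TABLE orders ADD COLUMN total REAL"),
]

def migrate(conn: sqlite3.Connection) -> None:
    """Apply any migrations not yet recorded in the tracking table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS schema_migrations (name TEXT PRIMARY KEY)"
    )
    applied = {row[0] for row in conn.execute("SELECT name FROM schema_migrations")}
    for name, ddl in MIGRATIONS:
        if name not in applied:
            conn.execute(ddl)  # apply the change exactly once
            conn.execute("INSERT INTO schema_migrations VALUES (?)", (name,))
    conn.commit()

conn = sqlite3.connect(":memory:")
migrate(conn)        # applies both migrations
migrate(conn)        # no-op: both are already recorded
```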

Radial Inc., an e-commerce company based in King of Prussia, Pa., and spun out of eBay in 2016, took a little more than two years to establish database DevOps processes with tools from Datical. In that time, the company has trimmed its app development processes that involve Oracle, SQL Server, MySQL and Sybase databases from days down to two or three hours.

“Our legacy apps, at one point, were deployed every two to three months, but we now have 30 to 40 microservices deployed in two-week sprints,” said Devon Siegfried, database architect for Radial. “Each of our microservices has a single purpose and its own data store with its own schema.”

That means Radial, a 7,000-employee multinational company, manages about 300 Oracle databases and about 130 instances of SQL Server. The largest database change log it’s processed through Datical’s tool involved more than 1,300 discrete changes.

“We liked Datical’s support for managing at the discrete-change level and forecasting the impact of changes before deployment,” Siegfried said. “It also has a good rules engine to enforce security and compliance standards.”

Datical’s tool is integrated with the company’s GoCD DevOps pipeline, but DBAs still manually kick off changes to databases in production. Siegfried said he hopes that will change in the next two months, when an update to Datical will allow it to detect finer-grained attributes of objects from legacy databases.

ING Bank Turkey looks to Datical competitor DBmaestro to link .NET developers who check in changes through Microsoft’s Team Foundation Server 2018 to its 20 TB Oracle core banking database. Before its DBmaestro rollout in November 2017, those developers manually tracked schema and script changes through the development and test stages and ensured the right ones deployed to production. DBmaestro now handles those tasks automatically.

“Developers no longer have to create deployment scripts or understand changes preproduction, which was not a safe practice and required more effort,” said Onder Altinkurt, IT product manager for ING Bank Turkey, based in Istanbul. “Now, we’re able to make database changes roughly weekly, with 60 developers in 15 teams and 70 application development pipelines.”

Database DevOps tools abstract away infrastructure headaches

Consistent database schemas and deployment scripts amid rapid application changes are an important part of DevOps practices with stateful applications, but there’s another side to that coin: infrastructure provisioning.

Stateful application management through containers and container orchestration tools such as Kubernetes is still in its early stages, but persistent container storage tools from Portworx Inc. and data management tools from Delphix have begun to help ease this burden, as well.

GE Digital put Portworx container storage into production to support its Predix platform in 2017, and GE Ventures later invested in the company.

“Previously, we had a DevOps process outlined. But if it ended at making a call to GE IT for a VM and storage provisioning, you give up the progress you made in reducing time to market,” said Abhishek Shukla, managing director at GE Ventures, based in Menlo Park, Calif. “Our DevOps engineering team also didn’t have enough time to call people in IT and do the infrastructure testing — all that had to go on in parallel with application development.”

Portworx allows developers to describe storage requirements such as capacity in code, and then triggers the provisioning at the infrastructure layer through container orchestration tools, such as Mesosphere and Kubernetes. The developer doesn’t have to open a ticket, wait for a storage administrator or understand the physical infrastructure. Portworx can arbitrate and facilitate data management between multiple container clusters, or between VMs and containers. As applications change and state is torn down, there is no clutter to clean up afterward, and Portworx can create snapshots and clone databases quickly for realistic test data sets.
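
In Kubernetes terms, describing storage requirements in code typically means submitting a PersistentVolumeClaim against a storage class that maps to the provisioner. A minimal sketch with the official Kubernetes Python client follows; the storage class name is a hypothetical stand-in, and the claim is generic Kubernetes rather than anything Portworx-specific:

```python
from kubernetes import client, config

config.load_kube_config()  # uses the local kubeconfig

# A generic claim: the developer states capacity and access mode; the
# storage class (hypothetical name here) maps it to the backing provisioner.
claim = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="orders-db-data"),
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteOnce"],
        storage_class_name="portworx-db-sc",   # assumed class name
        resources=client.V1ResourceRequirements(requests={"storage": "20Gi"}),
    ),
)

client.CoreV1Api().create_namespaced_persistent_volume_claim(
    namespace="default", body=claim
)
```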

Portworx doesn’t necessarily offer the same high-octane performance for databases as bare-metal servers, said a Portworx partner, Kris Watson, co-founder of ComputeStacks, which packages Portworx storage into its Docker-based container orchestration software for service-provider clients.

“You may take a minimal performance hit with software abstraction layers, but rapid iteration and reproducible copies of data are much more important these days than bare-metal performance,” Watson said.

Adding software-based orchestration to database testing processes can drastically speed up app development, as Choice Hotels International discovered when it rolled out Delphix’s test data management software a little more than two years ago.

“Before that, we had never refreshed our test databases. And in the first year with Delphix, we refreshed them four or five times,” said Nick Suwyn, IT leader at the company, based in Rockville, Md. “That has cut down data-related errors in code and allowed for faster testing, because we can spin up a test environment in minutes versus taking all weekend.”

The company hasn’t introduced Delphix to all of its development teams, as it prioritizes a project to rewrite the company’s core reservation system on AWS. But most of the company’s developers have access to self-service test databases whenever they are needed, and Suwyn’s team will link Delphix test databases with the company’s Jenkins CI/CD pipelines, so developers can spin up test databases automatically through the Jenkins interface.
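
Hooking that kind of self-service refresh into Jenkins can be as simple as a parameterized job that developers trigger through Jenkins’ remote access API. A sketch with the requests library follows; the server URL, job name, parameter and credentials are all hypothetical, and most real Jenkins instances additionally require a CSRF crumb:

```python
import requests

JENKINS = "https://jenkins.example.com"   # hypothetical server
AUTH = ("dev-user", "api-token")          # Jenkins user and API token

# Trigger a parameterized job that clones a masked test database;
# the job name and parameter are placeholders for illustration.
resp = requests.post(
    f"{JENKINS}/job/refresh-test-db/buildWithParameters",
    params={"SOURCE_DB": "orders-prod-masked"},
    auth=AUTH,
)
resp.raise_for_status()
print("queued:", resp.headers.get("Location"))
```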

Airtel CIO targets cutting-edge tech

A major part of every digital transformation is exploring how cutting-edge tech can facilitate the journey. Some companies, like Indian telecom giant Bharti Airtel Ltd., are more capable than others of experimenting with new technologies, affording them a wealth of opportunities for innovation.

In this video from the recent MIT Sloan CIO Symposium, Harmeen Mehta, global CIO and head of digital at Airtel, discusses some of the cutting-edge tech she’s employing at her company — everything from advanced mapping techniques and network digitization to voice computing technology and AI-driven customer offerings.

Editor’s note: This transcript has been edited for clarity and length.

What kind of cutting-edge tech are you using to speed up your company’s digital transformation process?

Harmeen Mehta: Lots of pieces. I think one of the biggest challenges that we have is mapping the intricacies and the inner lanes in India and doing far more than what even Google does. For Google, the streets are of prime importance [when it comes to mapping]. For us, the address of every single house and whether it’s a high-rise building or it’s a flat is very important as we bring different services into these homes. So, we’ve been working on finding very innovative ways to take Google’s [mapping] as a base and make it better for us to be able to map India to that level of accuracy of addresses, houses and floor plans.

Another problem that I can think of where a lot of cutting-edge tech is being used is in creating a very customized contextual experience for the consumer so that every consumer has a unique experience on any of our digital properties. The kind of offers that the company brings to them are really tailored and suited to them rather than it being a general, mass offering. There’s a lot of machine learning and artificial intelligence that’s going into that.

Another one is we’re digitizing a large part of our network. In fact, we’re collaborating with SK Telecom, who we think is one of the most innovative telcos out there, in order to do that. We’re using, again, a lot of machine learning and artificial intelligence there as well, as we bring about an entire digitization of our network and are able to optimize the networks and our investments much better.

Then, of course, I’m loving the new stream that we are creating, which is all around exploring voice as a technology. The voice assistants are getting more intelligent. It gives us a very unique opportunity to actually reach out and bring the digital transformation to a lot of Indians who aren’t as literate — to those to whom reading and writing don’t come as naturally as speaking does. It’s opening up a whole lot of new doors, and we’re really finding that a very interesting space to work in, and we’re exploring a lot in that arena at the moment.

How to muster the troops

A digital transformation journey, make no mistake, is no walk in the park. It involves major course corrections to technology, to business processes, to how people do their jobs and how they think about their roles. So, how does a company make something so radical as digital transformation part of its DNA?

Gail Evans, who was promoted in June from global CIO at Mercer to the consulting firm’s global chief digital officer, believes an important first step is getting people to see what’s in it for them, “because once you see the value, you’re all in.”

In this video recorded in May at the MIT Sloan CIO Symposium, then-CIO Evans provided some insight into how she musters the troops at Mercer, explaining that a digital transformation journey is, by nature, long and iterative, requiring people to see value all along the way.

Editor’s note: The following was edited for clarity and brevity.

What can companies do to get started on a digital transformation journey?

Gail Evans: Actually, I think there are a couple of things. I think digital transformation, at its core, is the people. At the very core of any transformation, it is about how do you inspire a team to align to this new era — this new era of different tools, different technologies that can be applied in many different, creative ways to create new business models or to drive efficiencies in your organization.

So, I think the leaders in the enterprise are ones who understand the dynamics of taking your core and moving it up the food chain. Where does it start? I think it starts with creating a beachhead, creating a platform of digital, and then allowing that to grow and swell with training and opportunities, webinars, blogs so that it becomes a part of a company’s DNA. Because I believe digital isn’t a thing — it’s a new way of doing things to create value through the application of technology and data.

Which departments at Mercer are in the vanguard of digital transformation? Who are laggards?

Evans: One would argue that marketing is already digital, right? I mean, they are already using digital technologies to drive personalized experiences on the web and have been doing that for many years. I would say that it starts in, probably, technology. Technology will embrace it, and also it needs to be infused into the business leaders.

I think the laggards are typically … I guess I wouldn’t necessarily call them ‘laggards.’ I think I would refer to them as not yet seeing the value of digital, because once you see the value, you’re all in.

Pockets of resistance

Evans: There are teams or pockets of folks who have done things the same way for a long time and there is a resistance there. It’s kind of the, ‘If it ain’t broke, don’t fix it.’ Those are pockets, but you’d find those pockets in every transformation, whether it’s digital or [moving into the] information age, whatever — you’ll find pockets of people who are not ready to go.

And so, I think there are pockets of people in our core legacy who are holding onto their technology of choice and may not have up-skilled themselves, so they are holding on and they are resisting.

And then there are business folks who have been used to one-to-one relationships and built their whole career — a very successful career — with those one-to-one relationships. And now digital is coming from a different place where some of what you might have thought was your IP in value is now in algorithms. What will you do differently and how do you manage those dynamics differently?

I think there’s education [that needs to happen] because I think it’s humans plus technology, it’s not just technology; it’s humans plus technology and new business models. That is what digital transformation is all about and it’s fun! It is a new way to just have fun. It will be something else two, three, five years from now.

Speaking to ‘hearts and minds’

What strategies do you have for getting people to sign on for that ‘fun’ digital transformation journey?

Evans: At Mercer, what I’ve done was, first, you have to create, I think, a very strong digital strategy that is not just textbook strategy, but one that speaks to the hearts and minds from the executive team down to the person who’s coding, that they can relate to and become a part of it. Many people believe, ‘What’s in it for me? Yeah, I get that technology stuff, but what is it in for me?’ [Showing that] then what is in it for the business and bringing that strategy together and having proof points along the way [is important].

It’s not a big bang approach; it’s really very agile and iterative. And so, as you iterate and show value, people will become more open to change. And as you train them, so build a strategy and inspire your team, inspire your executive leadership team because that’s where all the money is. You need the money, so they need to believe in the digital transformation [journey] and the revenue aspect and the stakeholder value that it would bring.

Basically, create a strong vision that applies to the team, create a strategy that is based on efficiencies and revenue and also create what many call a bimodal [IT approach] because you need to continue to drive the core legacy systems and optimize. They’re still the bread and butter of the company. So, you have to find a strategy that allows both to grow.

Data migration software coming for SAP CRM

ORLANDO — The goal of every major CRM vendor is to gain more market share and potentially capture customers from competitors. But doing that can prove difficult for a number of reasons, including organizations’ reliance on legacy systems, challenges with data migration and the cost associated with migration.

Along with unveiling C/4HANA, SAP’s new suite of applications that it says will provide a full 360-degree view of the customer, the company also told SearchCRM.com that data migration software to help automate that migration process can be expected from SAP later this year.

In this Q&A from Sapphire Now, Giles House, EVP and chief product officer for SAP Customer Experience, talks about the future of CRM within the SAP sphere, as well as what customers of CallidusCloud can expect from the product. SAP bought CallidusCloud earlier this year, putting the finishing touches on its C/4HANA suite. House was chief product officer and chief marketing officer for CallidusCloud before its acquisition by SAP.

Beyond tying together the front- and back-office processes of C/4HANA, SAP hopes that adding data migration software to the suite later this year will help persuade unhappy CRM customers to migrate.

After the CallidusCloud sale, can customers expect anything different with CallidusCloud? What should they look for? Has there been any concern from non-SAP customers?

Giles House: An obvious one is tighter integration with SAP — Callidus was a great partner with SAP for many years and, more recently, the last couple years, SAP rolled it out internally. The biggest thing for those customers is, through us, a lot more investment in technology and innovation.

We’ll still be open and talk with other CRMs, and the answer is absolutely. In the modern world, [we] have to recognize there are sales departments making their people suffer in other systems. We have to make sure they get the best incentives and CPQ [configure price quote] platform on the market.

How do you convince potential customers that you’re not lagging behind in CRM?

House: The intent, the acquisitions and the fact we’ve got these integrations in already two months after the sale closed starts to show progress and give people confidence. As we get through the rest of this year, you’ll see a completely different conversation happening around SAP CRM and the product itself.

The reason why is simple: there’s $1 billion-plus of churn in the CRM market and about $2 billion of resentment. Many companies want to get off their current, expensive CRM platform because it doesn’t give them that 360-degree view, and every year the sales person comes knocking for a 10% increase in licensing fee.

There’s been a desire to switch, but there hasn’t been something good to switch to because the other propositions are that same patchwork quilt — integrate it yourself, good luck on the analytics. [What’s] different for SAP is it will all be integrated and all will be running on the SAP Analytics Cloud and all running on the best cloud platform out there. Not a cloud platform that pays $1 billion to a legacy database vendor like Oracle.

There are software customers that would like to migrate, but data migration software is expensive and the process is challenging. How are you hoping to get them to actually commit to that migration?

House: I think number one is we have to lower that cost. There was a customer where they were quoted it would be eight figures to move. Under the covers, it’s not that hard because what CRM is doing today for a lot of people is not that hard. CRM is a notepad on a database. ‘Here’s what’s going on in the deal, here’s an account of the customer.’ It’s not that hard if you think about it, but we need to help migrate that and automate that migration.

Do the data mapping, make it simple, create the new fields in the new systems and help update the workflows. That can all be automated, and the automation of that migration is another thing we’re bringing out later this year.
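
The mapping step House describes is easy to picture in miniature: carry each legacy field to its equivalent in the new system and flag whatever doesn’t fit for manual review. A toy pass in Python (every field name here is hypothetical; real CRM migrations also move relationships, attachments and workflow state):

```python
# Map legacy CRM field names to their equivalents in the new system;
# every name here is a placeholder for illustration.
FIELD_MAP = {
    "acct_name": "account_name",
    "deal_stage": "opportunity_stage",
    "deal_amt": "opportunity_amount",
}

def migrate_record(legacy: dict) -> tuple[dict, dict]:
    """Return (migrated record, unmapped leftovers for manual review)."""
    migrated, leftovers = {}, {}
    for field, value in legacy.items():
        if field in FIELD_MAP:
            migrated[FIELD_MAP[field]] = value
        else:
            leftovers[field] = value
    return migrated, leftovers

record, review = migrate_record(
    {"acct_name": "Acme Corp", "deal_stage": "closed-won", "region": "EMEA"}
)
print(record)   # {'account_name': 'Acme Corp', 'opportunity_stage': 'closed-won'}
print(review)   # {'region': 'EMEA'}  needs a new field or a mapping rule
```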

So automating that process from previous CRM systems to C/4HANA with data migration software will be part of the suite?

House: It has to be. We need to automate it — whether that’s using some of the automation technology that we already have at SAP or whether it’s a whole new [data migration software] solution, we need to get the details of that ironed out, but it’s doable and it will be done.

Salesforce Interaction Studio unveiled at Connections 2018

Salesforce Interaction Studio appears to be the latest major launch in the vendor’s push to enable users to capture a more complete picture of their customers — where they choose to spend their time online, how they like vendors to market to them and which ways they prefer to communicate.

Salesforce has spent billions of dollars on acquisitions, integrations and marketing to achieve that goal. And while it’s not there yet — the targets tend to move in the rapidly changing technology landscape — Salesforce is hoping its latest integrations and new products help its customers find a more complete customer journey.

Toward that end of guiding customers more smoothly through the stages of interacting with an organization, Salesforce unveiled several new product integrations, partner integrations and new applications, including Salesforce Interaction Studio. The CRM giant made the product announcements on June 13 at its Connections 2018 conference in Chicago.

Ray Wang, a Salesforce watcher and principal analyst and founder of Constellation Research Inc., said Salesforce Interaction Studio is an effective response to the fast-increasing importance of customer experience in CRM.

“CRM is actually dead. It’s about how can you craft mass personalization at scale,” Wang said. “It’s about building that experience and journey, which has become a lot more important than CRM.”

CRM vendors such as Salesforce are moving beyond customer management and working toward improving the customer experience and journey — the process of customers interacting with a company. That progression is where Salesforce Interaction Studio comes into play.

Built out of an OEM partnership with software company Thunderhead, Salesforce Interaction Studio enables companies to analyze and manage consumer experiences. Additionally, it can recommend the next-best action for customers, depending on how they interact with that brand.

“It allows you to look at cross-channel consumer insight and see next-best action and optimize customer journey,” Wang said.

Marketing Cloud and Google Analytics 360

Salesforce Interaction Studio was just one of the products unveiled at Connections, a conference focusing on marketing, commerce and service.

Building off a partnership announced at Dreamforce last year, the integration between Salesforce Marketing Cloud and Google Analytics 360 is now generally available to Marketing Cloud customers. According to Wang, the alliance is a blow to one of Salesforce’s chief competitors, Adobe.

“The battle of analytics has been Adobe Omniture versus Google Analytics,” Wang said. “As Microsoft and Adobe have come close together, it’s natural for Salesforce and Google to become allies.”

The partnership will enable customers to merge insight from Marketing Cloud and Google Analytics 360 into a single dashboard within Marketing Cloud, while campaign data will be available within Google Analytics 360 to provide tailored web content.

Another feature, which won’t be available in beta until July, will allow users to create an audience in Google Analytics 360, activate it outside the Google platform and continue building that audience within Marketing Cloud.

While nothing prevented Marketing Cloud customers from using Google Analytics 360 to track web analytics, the lack of a deep integration with Marketing Cloud made it difficult for users of both to tie that information back to campaigns. By combining Google Analytics 360 with Salesforce’s inward-facing Einstein Analytics, a company can see a more holistic view of a customer’s data.
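
Conceptually, the combined dashboard is a join of the two data sets on a shared key such as a campaign or customer ID. A toy illustration with pandas follows; the column names are hypothetical, and the actual integration happens inside Marketing Cloud rather than in user code:

```python
import pandas as pd

# Campaign-side data, as Marketing Cloud might expose it (columns hypothetical).
campaigns = pd.DataFrame({
    "campaign_id": ["c1", "c2"],
    "emails_sent": [12000, 8000],
    "opens": [3100, 2400],
})

# Web-behavior data, as Google Analytics 360 might expose it (columns hypothetical).
web = pd.DataFrame({
    "campaign_id": ["c1", "c2"],
    "sessions": [2200, 1900],
    "avg_session_secs": [74, 51],
})

# One view per campaign: email engagement next to on-site behavior.
combined = campaigns.merge(web, on="campaign_id")
combined["open_rate"] = combined["opens"] / combined["emails_sent"]
print(combined)
```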

“Einstein takes the data inside the Salesforce platform and helps users find insights, while [the] Google Analytics 360 component is based on web analytics,” Bobby Jania, vice president of product marketing for Marketing Cloud, said in a release. “It looks at how consumers are behaving, pages they visit [and] how long they are on there for.”

Combining those analytics insights with the insights from Salesforce Interaction Studio can go a long way in helping customers track consumers’ campaigns, according to Wang.

“It’s a huge void in Marketing Cloud and most other clouds,” Wang said. “It’s about crafting the right experience and understanding where a customer is — it takes context into account.”

While personalization is the ultimate goal for Salesforce and similar companies focused on customer journeys, that pie-in-the-sky objective is still off on the horizon.

“It’s not personalization — we haven’t gotten to that point yet,” Wang said. “But it allows you to think about the type of journeys you’re creating for customers.”

B2B Commerce Cloud

Beyond Marketing Cloud and Salesforce Interaction Studio, Salesforce officially released its B2B Commerce Cloud product, formerly known as CloudCraze. Salesforce acquired the B2B e-commerce product, which was originally built on top of Salesforce software using Force.com, earlier this year.

“It will make it easier to see what a user is doing on a commerce site and connect it to Marketing Cloud,” Jania said. “B2B commerce for Salesforce will make that complex order have the same UI as B2C commerce.”

The Salesforce Interaction Studio release continues Salesforce’s long-term campaign to unify sales and marketing departments within organizations.

“You have a stack at Salesforce where marketing and commerce are a lot tighter than they were before,” Wang said.

Salesforce also unveiled integrations between Commerce Cloud and Service Cloud, allowing service reps to see customers’ buying histories during service calls and opening up opportunities to upsell or cross-sell other products.

Pricing information for Salesforce Interaction Studio, B2B Commerce Cloud and the Marketing Cloud-Google Analytics 360 integration wasn’t immediately available.

For Sale – Cheap business laptop – Dell Latitude 3330 – i5/4GB/500GB

Dell Latitude 3330 13.3″ Laptop

Full working order, no major marks or scratches – screen & keyboard are in great condition, one small mark on base of laptop. Battery holds a charge and shows health as ‘normal’.

Intel Core i5-3337u
500GB Seagate HDD
4GB DDR3 RAM (second DIMM slot empty so can easily be expanded to 8GB+)
13.3″ Matte HD Display

Please note that the hard drive in this machine has been formatted and no OS is installed. The machine has a Windows 7 Pro COA and product key associated to it that may be able to be reused, but I cannot guarantee this.

Laptop comes complete with original Dell charger. No box but will be securely packed for shipping.

Any questions, please get in touch. This is also listed on eBay.

Price and currency: 129
Delivery: Delivery cost is not included
Payment method: Bank Transfer/PPG
Location: Walsall
Advertised elsewhere?: Advertised elsewhere
Prefer goods collected?: I have no preference

Office Depot says ‘no’ to Oracle ERP Cloud customizations

Office Depot, a firm with about $11 billion in sales, is moving major applications to the Oracle ERP Cloud. In doing so, Office Depot wants to avoid any customizations as it shifts from in-house systems.

The retailer will use best practices embedded in various Oracle ERP Cloud platforms: in this case, Oracle’s Supply Chain Management Cloud, its cloud-based Human Capital Management (HCM) and Enterprise Performance Management (EPM) systems. Oracle announced Office Depot’s decision Jan. 29.

Rejecting customizations was easier for some systems than others. HR business processes lend themselves well to this change, said Damon Venger, senior director of IT applications at Office Depot.

With HR, “we are not reviewing our customizations — we are getting rid of them,” Venger said.

By not customizing its Oracle ERP applications, the retailer will simplify its IT processes, and reduce the cost of maintaining and managing them, he said.

Office Depot started selling the initiative internally last year. “It’s hard for executives in the business to say, ‘I have to do performance management in a specific way,’” Venger said. That’s the goal, at least. Supply chain will “definitely be more challenging,” he said.

Deciding on no customization is ‘trendsetting’

Office Depot uses Oracle products, including PeopleSoft, hosted in an Oracle data center, as well as Hyperion Financial Management products and a supply chain product.

The HCM and EPM migration will take about a year, and supply chain about two years. The company plans to use Agile development processes to complete the migrations.

For a company its size, Office Depot’s decision on customizations is “trendsetting,” said Seth Lippincott, an analyst at Nucleus Research. But it’s also possible because vendors are developing “what they would consider best practices in every one of their capabilities,” he said.

Some users argue that they need ERP customizations because of unique business requirements or industry-specific practices. But those arguments are waning as vendors add industry-specific capabilities, Lippincott said.

If customizations are about “letting people feel comfortable and safe in what they’re used to, it won’t help,” Lippincott argued. A firm will still go through a change management process. It makes sense for the long-term to force users into the new environment, he said.

APIs can keep customizations connected to the new cloud systems, but once a company starts down that path, problems mount.

Office Depot made ‘pragmatic’ decision

Judith Hurwitz, the CEO of Hurwitz & Associates, called Office Depot’s decision “pragmatic.”

Routine updates mean testing against the customizations, which may take months. “You are always sort of out of sync” with the latest updates, and asking a vendor for customizations can add millions to the cost, she said.

“Are your processes really so unique, so different?” Hurwitz said. For most firms, they aren’t, she said.

Venger said the decision to migrate to the cloud “was not a blind move to go.” Office Depot analyzed its real costs, including data center costs, licensing — every aspect.

Oracle ERP Cloud “came with a significant cost-savings,” and functionality upgrades, Venger said. With the on-premises system, “unless we customized it, you wouldn’t have functionality changes,” he said.

NVMe flash storage doesn’t mean tape and disk are dying

Not long ago, a major hardware vendor invited me to participate in a group chat where we would explore the case for flash storage and software-defined storage. On the list of questions sent in advance was that burning issue: Has flash killed disk? Against my better judgment, I accepted the offer. Opinions being elbows, I figured I had a couple to contribute.

I joined a couple of notable commentators from the vendor’s staff and the analyst community, who I presumed would echo the talking points of their client like overzealous high school cheerleaders. I wasn’t wrong.

Shortly after it started, I found myself drifting from the nonvolatile memory express (NVMe) flash storage party line. I also noted that software-defined storage (SDS) futures weren’t high and to the right in the companies I was visiting, despite projections by one analyst of 30%-plus growth rates over the next couple years. Serious work remained to be done to improve the predictability, manageability and orchestration of software-defined and hyper-converged storage, I said, and the SDS stack itself needed to be rethought to determine whether the right services were being centralized.

Yesterday’s silicon tomorrow

I also took issue with the all-silicon advocates, stating my view that NVMe flash storage might just be “yesterday’s silicon storage technology tomorrow,” or at least a technology in search of a workload. I wondered aloud whether NVMe — the “shiny new thing” — mightn’t be usurped shortly by capacitor-backed dynamic RAM (DRAM) that’s significantly less expensive and faster. DRAM also has much lower latency than NVMe flash storage because it’s directly connected to the memory channel rather than the PCI bus or a SAS or SATA controller.

The vendor tried to steer me back into the fold, saying “Of course, you need the right tool for the right job.” Truer words were never spoken. I replied that silicon storage was part of a storage ecosystem that would be needed in its entirety if we were to store the zettabytes of data coming our way. The vendor liked this response since the company had a deep bench of storage offerings that included disk and tape.

I then took the opportunity to further press the notion that disk isn’t dead any more than tape is dead, despite increasing claims to the contrary. (I didn’t share a still developing story around a new type of disk with a new form factor and new data placement strategy that could buy even more runway for that technology. For now, I am sworn to secrecy, but once the developers give the nod, readers of this column will be the first to know.)

I did get some pushback from analysts about tape, which they saw as completely obsolete in the next-generation, all-silicon data center. I could have pushed them over to Quantum Corp. for another view.

The back story

A few columns back, I wrote something about Quantum exiting the tape space based on erroneous information from a recently released employee. I had to issue a retraction, and I contacted Quantum and spoke with Eric Bassier, senior director of data center products and solutions, who set the record straight. Seems Quantum — like IBM and Spectra Logic — is excited about LTO-8 tape technology and how it can be wed to the company’s Scalar tape products and StorNext file system.

Bassier said Quantum was “one of only a few storage companies [in 2016] to demonstrate top-line growth and profitability,” and its dedication to tape was not only robust, it was succeeding with new customers seeking to scale out capacity. Quantum’s dense enterprise tape library, the Scalar i6000, has 11,000 or more slots, a dual robot and as many as 24 drives in a single 19-inch rack frame, all managed with web services using representational state transfer, or RESTful, API calls.

Quantum was also hitting the market with a 3U rack-mountable, scalable library capable of delivering 150 TB of uncompressed LTO-7 tape storage or 300 TB of uncompressed LTO-8 storage for backup, archive or additional secondary storage for less frequently used files and objects. Add compression and you more than double these capacity numbers. That, Bassier asserted, was more data than many small and medium-sized companies would generate in a year.
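
The arithmetic behind those capacity figures is straightforward. At LTO-7’s 6 TB and LTO-8’s 12 TB of native capacity per cartridge, the quoted totals imply a 25-slot configuration, and LTO’s rated 2.5:1 compression is what more than doubles the numbers (the cartridge count is inferred here, not supplied by Quantum):

```python
# Native capacity per cartridge (TB) for the two LTO generations.
NATIVE_TB = {"LTO-7": 6, "LTO-8": 12}
COMPRESSION = 2.5   # LTO's rated compression ratio
CARTRIDGES = 150 // NATIVE_TB["LTO-7"]   # 25, inferred from the 150 TB figure

for gen, tb in NATIVE_TB.items():
    native = CARTRIDGES * tb
    print(f"{gen}: {native} TB native, {native * COMPRESSION:.0f} TB compressed")
# LTO-7: 150 TB native, 375 TB compressed
# LTO-8: 300 TB native, 750 TB compressed
```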

Disk also has a role in Quantum’s world; its DXi product provides data deduplication that’s a significant improvement over the previous-generation model. It offers performance and density improvements through the application of SSDs and 8 TB HDDs, as well as a reduction in power consumption.

All the storage buckets

Quantum, like IBM and Spectra Logic, is articulating a product strategy that has fingers in all the popular buckets, including tape, disk and NVMe flash storage. After years of burying its story under a rock by providing OEM products to other vendors that branded them as their own, Quantum now derives 90% of its revenue from its own brand.

Bottom line: We might eventually get to an all-silicon data center. In the same breath, I could say that we might eventually get that holographic storage the industry has promised since the Kennedy administration. For planning 2018, your time is better spent returning to basics. Instead of going for the shiny new thing, do the hard work of understanding your workload, then architecting the right combination of storage and software to meet your needs. Try as you might, the idea of horizontal storage technology — one size fits most — with simple orchestration and administration, remains elusive.

That’s my two elbows.