
For Sale – Dell XPS8900

NOW JUST THE TOWER FOR SALE.

Selling my Dell desktop.

Purchased in May 2016, so just over 3 years old now. It's in great condition and hardly gets used, hence the sale.

Computer specs are as below in one of the photos.

The only thing I've added is a faster SSD (Samsung 950 PRO M.2 256GB), connected via a PCI Express card to get its full speed.

Key specs are:

Intel Core i7-6700K processor (6th gen).
Windows 10 Home edition.
24GB DDR4 2133MHz RAM.
256GB Samsung 950 PRO boot drive (as described above), plus a 2TB 7200RPM HDD.
NVIDIA GeForce GTX 960 2GB GDDR5 graphics card.
Blu-ray combo drive, plus a second DVD drive added.
802.11ac Wi-Fi.
6x USB 3.0 ports plus 4x USB 2.0 ports.

Any questions please feel free to ask.

Buyer collects in Hove.

£390


For Sale – i7-4770K 4.4GHz Overclocked Bundle (CPU/Mobo/RAM, Noctua D14)

Having just upgraded my system to Ryzen, I am selling the following. It was bought six years ago direct from OCUK as an overclocked bundle and has run all of its life at 4.4GHz on the Noctua cooler. The only change was a recent upgrade to 16GB of RAM, which cost me £113 brand new in January. I would prefer to sell as a bundle but will split.

I have set it back to stock but saved the OC profile in the BIOS.

i7-4770K – £95
MSI Z87-G45 Gaming motherboard – £45
16GB (2 x 8GB) HyperX Savage HX318C9SRK2/16 1866MHz DDR3 CL9 DIMM kit, XMP, red – £45

Noctua D14 cooler (brand new AM4 adapter included; only socket LGA 1150 and AM4 fittings) – £35

BUNDLE PRICE £200 inc.

Also selling my old RAM from before the 16GB upgrade:

8GB (2 x 4GB) Corsair Vengeance Pro DDR3 1866MHz CMY8GX3M2A1866C9R – £20

Prices are inclusive of postage.


How Microsoft re-envisioned the data warehouse with Azure Synapse Analytics

About four years ago, the Microsoft Azure team began to notice a big problem troubling many of its customers. A mass migration to the cloud was in full swing, as enterprises signed up by the thousands to reap the benefits of flexible, large-scale computing and data storage. But the next iteration of that tech revolution, in which companies would use their growing stores of data to get more tangible business benefits, had stalled.

Technology providers, including Microsoft, have built a variety of systems to collect, retrieve and analyze enormous troves of information that would uncover market trends and insights, paving the way toward a new era of improved customer service, innovation and efficiency.

But those systems were built independently by different engineering teams and sold as individual products and services. They weren’t designed to connect with one another, and customers would have to learn how to operate them separately, wasting time, money and precious IT talent.

“Instead of trying to add more features to each of our services, we decided to take a step back and figure out how to bring their core capabilities together to make it easy for customers to collect and analyze all of their increasingly diverse data, to break down data silos and work together more collaboratively,” said Raghu Ramakrishnan, Microsoft’s chief technology officer for data.

At its Ignite conference this week in Orlando, Florida, Microsoft announced the end result of a yearslong effort to address the problem: Azure Synapse Analytics, a new service that merges the capabilities of Azure SQL Data Warehouse with new enhancements such as on-demand query as a service.

Microsoft said this new offering will help customers put their data to work much more quickly, productively and securely by pulling together insights from all data sources, data warehouses and big data analytics systems. And, the company said, with deeper integration between Power BI and Azure Machine Learning, Azure Synapse Analytics can reduce the time required to process and share that data, speeding up the insights that businesses can glean.

What’s more, it will allow many more businesses to take advantage of game-changing technologies like data analytics and artificial intelligence, which are helping scientists to better predict the weather, search engines to better understand people’s intent and workers to more easily handle mundane tasks.

This newest effort to break down data silos also builds on other Microsoft projects, such as the Open Data Initiative and Azure Data Share, which allow customers to share data from multiple sources and even across organizations.

Microsoft said Azure Synapse Analytics is also designed to support the increasingly popular DevOps strategy, in which development and operations staff collaborate more closely to create and implement services that work better throughout their lifecycles.


A learning process

Azure Synapse Analytics is the result of a lot of work, and a little trial and error.

At first, Ramakrishnan said, the team developed high-level guidelines showing customers how to glue the systems together themselves. But they quickly realized that was too much to ask.

“That required a lot of expertise in the nitty-gritty of our platforms,” Ramakrishnan said. “Customers made it overwhelmingly clear that we needed to do better.”

So, the company went back to the drawing board and spent an additional two years revamping the heart of its data business, Azure SQL Data Warehouse, which lets customers build, test, deploy and manage applications and services in the cloud.

A breakthrough came when the company realized that customers need to analyze all their data in a single service, without having to copy terabytes of information across various systems to use different analytic capabilities – as has traditionally been the case with data warehouses and data lakes.

With the new offering, customers can use their data analytics engine of choice, such as Apache Spark or SQL, on all their data. That’s true whether it’s structured data, such as rows of numbers on spreadsheets, or unstructured data, such as a collection of social media posts.
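
As a rough illustration of that idea (a minimal sketch only, with hypothetical storage paths and column names, not Microsoft's sample code), a single Spark session can read a structured table and a semi-structured feed from the same data lake and query both with SQL:

```python
from pyspark.sql import SparkSession

# Minimal sketch: one Spark session querying structured and unstructured data
# side by side. Paths, container names and column names are hypothetical.
spark = SparkSession.builder.appName("synapse-style-analytics").getOrCreate()

sales = spark.read.parquet("abfss://lake@account.dfs.core.windows.net/sales/")       # structured rows
posts = spark.read.json("abfss://lake@account.dfs.core.windows.net/social_posts/")   # semi-structured JSON

sales.createOrReplaceTempView("sales")
posts.createOrReplaceTempView("posts")

# The same SQL engine sees both datasets without copying either one anywhere.
spark.sql("""
    SELECT s.region,
           SUM(s.amount)             AS revenue,
           COUNT(DISTINCT p.post_id) AS product_mentions
    FROM sales s
    LEFT JOIN posts p ON p.product_id = s.product_id
    GROUP BY s.region
""").show()
```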

This project was risky. It involved deep technical surgery: completely rewriting the guts of the SQL query processing engine to optimize it for the cloud and make it capable of instantly handling big bursts of work as well as very large and diverse datasets.

It also required unprecedented integration among several teams within Microsoft, some of which would have to make hard choices. Established plans had to be scrapped. Resources earmarked for new features would be redirected to help make the entire system work better.

“In the beginning, the conversations were often heated. But as we got into the flow of it, they became easier. We began to come together,” Ramakrishnan said.

Microsoft also had to make sure that the product would work for any company, regardless of employees’ technical expertise.

“Most companies can’t afford to hire teams of 20 people to drive data projects and wire together multiple systems. There aren’t even enough skilled people out there to do all that work,” said Daniel Yu, director of product marketing for Azure Data and Artificial Intelligence.

Making it easy for customers

Customers can bring together various sources of data into a single feed with Azure Synapse Analytics Studio, a console – or single pane of glass – that will allow a business professional with minimal technical expertise to locate and collect data from multiple sources like sales, supply chain, finance and product development. They can then choose how and where to store that data, and they can use it to create reports through Microsoft’s popular Power BI analytics service.

In a matter of hours, Azure Synapse will deliver useful business insights that used to take days or even weeks and months, said Rohan Kumar, corporate vice president for Azure Data.

“Let’s say an executive wants a detailed report on sales performance in the eastern U.S. over the last six months,” Kumar said. “Today, a data engineer has to do a lot of work to find where that data is stored and write a lot of brittle code to tie various services together. They might even have to bring in a systems integrator partner. With Azure Synapse, there’s no code required. It’s a much more intuitive experience.”

The complexity of the technical problems Azure Synapse addressed would be hard to overstate. Microsoft had to meld multiple independent components into one coherent form factor, while giving a wide range of people – from data scientists to line of business owners – their preferred tools for accessing and using data.


“With Azure Synapse, there’s no code required. It’s a much more intuitive experience.”

~ Rohan Kumar, corporate vice president for Azure Data


That includes products like SQL Server, the open source analytics engine Apache Spark, Azure Data Factory and Azure Data Studio, as well as notebook interfaces preferred by many data professionals to clean and model data.

“Getting all those capabilities to come together fluidly, making it run faster, simpler, eliminating overlapping processes – there was some scary good stuff getting done,” Ramakrishnan said.

The result is a data analytics system that will be as easy to use as a modern mobile phone. Just as the smartphone replaced several devices by making all of their core capabilities intuitively accessible in a single device, the Azure Synapse “smartphone for data” now allows a data engineer to build an entire end-to-end data pipeline in one place. It also enables data scientists and analysts to look at the underlying data in ways that are natural to them.

And just as the phone has driven waves of collaboration and business innovation, Azure Synapse will free up individuals and companies to introduce new products and services as quickly as they can dream them up, Microsoft said.

“If we can help different people view data through a lens that is natural to them, while it’s also visible to others in ways natural to them, then we will transform the way companies work,” Ramakrishnan said. “That’s how we should measure our success.”

Top photo: Rohan Kumar, corporate vice president for Azure Data, says Azure Synapse will deliver useful business insights that used to take days or even weeks and months. Photo by Scott Eklund/Red Box Pictures.

Author: Microsoft News Center

Veeam cloud backup sells back N2WS, adding AWS and Azure products

Veeam Software has sold an AWS data protection company it acquired two years ago and will launch Azure- and AWS-focused backup products as part of its own “unified cloud platform.”

About 10 months after Veeam’s acquisition of N2WS, the U.S. government requested “information regarding the transaction,” said Ratmir Timashev, co-founder and executive vice president of Veeam. He declined to provide details on the information request.

“After some discussions with the government in the first half of 2019, Veeam voluntarily made the decision to sell [N2WS] back to its original founders,” Timashev said. “And we decided to focus on building our own unified cloud platform, using our internal [research and development] resources.”

Veeam cloud backup, N2WS move forward, separately

The sale back to N2WS CEO Ohad Kritz and CTO Uri Wolloch closed in the third quarter of 2019. Veeam is not releasing terms of the sale, but Timashev called it “relatively small.”

Veeam bought N2WS and its cloud-native, enterprise backup and recovery for AWS data for $42.5 million at the end of 2017. About eight months earlier, Veeam disclosed it had invested in N2WS. Veeam, a data protection and management vendor with international headquarters in Switzerland and U.S. headquarters in Columbus, Ohio, is also no longer an investor in N2WS.

Photo: Ratmir Timashev

Timashev said he could not give much more detail about why the government’s information request led to the major step of selling back N2WS. He declined to comment on a report that the U.S. government’s interest was piqued because it is an N2WS customer and Timashev and Veeam Co-Founder Andrei Baronov are Russian. Baronov is also Veeam’s CEO.

“We feel that developing a unified cloud solution, not just AWS [backup], but that is closely integrated with our platform, was the best,” Timashev said.

The acquisition of N2WS showed that Veeam understands the importance of native backup technology for public cloud environments, said Archana Venkatraman, research manager at IDC.

“Veeam voluntarily sold the business following discussions with the U.S. government, so it was a sensible move given the federal complexities,” Venkatraman wrote in an email.

N2WS had operated as a stand-alone business under Veeam, which representatives from both companies said makes the split easier for customers.

The majority of customers who bought the Veeam-owned N2WS were looking for a point product to back up AWS, Timashev said.

“People who were using our software at the time were protecting their current data center and the purchaser of the N2WS solution was someone who was standing up infrastructure in the cloud,” said Danny Allan, vice president of product strategy at Veeam.

Ezra Charm, vice president of marketing at N2WS, said he can’t comment on what happened on the Veeam side, but noted “the issues were not N2WS issues.” The split was “amicable,” he said.

“It was really awesome being in the Veeam world,” Charm said, citing a larger marketing budget as one positive. “But the best is yet to come.”

Charm stressed that IT is still in the beginning stages of the cloud movement, as many workloads that could be in the cloud are not there yet.

“N2WS is well positioned to grow and make a difference,” Charm said.

Venkatraman said N2WS is prominent in the AWS Marketplace.

“As an independent company, it will continue growing as demand for cloud data protection continues to grow,” she wrote.

Charm acknowledged that “some of this is a little scary.” While it’s still figuring out the new budget, N2WS is a financially stable company with thousands of customers, Charm said.

N2WS has about 50 employees, including 30 in Israel at its research and development center and 20 in West Palm Beach, Fla., at its sales and marketing headquarters. The company did not let go of any employees as a result of the sale, Charm said.

Backup for AWS, Azure provides important protection

Following the sale, Veeam cloud backup will launch two new products. Veeam Backup for AWS and Veeam Backup for Microsoft Azure will be available as stand-alone point products or integrated with Veeam’s platform.

The cloud-native Azure backup will be available at the Microsoft Ignite conference next week in a technology preview. It’s slated to be generally available early next year.

The point product offering Azure to Azure backup is much cheaper than the version integrated with the Veeam platform, Timashev said.

Veeam Backup for Microsoft Azure — both free and paid versions — will be available for deployment through the Azure Marketplace for cloud-first companies, Allan said. In addition, Veeam Backup & Replication users can extend their protection to Azure-native instances.

The product also features file-level recovery of native snapshots and Veeam backups, as well as the ability to restore to an on-premises data center or any other Veeam-supported environment, Allan said.

The similar Veeam Backup for AWS will be available by the end of 2019.

“That’s why we were talking about the unified cloud platform,” Timashev said. “So, immediately, it’s integrated in our cloud platform as well as available as a [point product].”

After some discussions with the government in the first half of 2019, Veeam voluntarily made the decision to sell [N2WS] back to its original founders. And we decided to focus on building our own unified cloud platform.
Ratmir Timashev, co-founder and executive vice president, Veeam

Veeam and N2WS go from the same company to competitors in AWS backup.

“While both will serve the cloud-native AWS backup market, Veeam’s goal has always been broader and that is to deliver data management for all of our customers’ data — across clouds and on-premises data centers,” Allan said.

N2WS’ most recent product version, Backup & Recovery 2.7, added Amazon S3 Infrequent Access support and intelligent tiering. The 3.0 edition scheduled for general availability in January will feature more integration into other S3 storage tiers.
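
To illustrate the kind of S3 tiering the article refers to (this is a generic boto3 sketch with a hypothetical bucket name, not N2WS's own mechanism), a lifecycle rule can move aging backup objects into a cheaper storage class:

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical bucket holding backup objects under the "backups/" prefix.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-backup-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-aging-backups",
                "Status": "Enabled",
                "Filter": {"Prefix": "backups/"},
                # Move objects to S3 Standard-IA after 30 days; INTELLIGENT_TIERING
                # is another valid StorageClass value here.
                "Transitions": [{"Days": 30, "StorageClass": "STANDARD_IA"}],
            }
        ]
    },
)
```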

N2WS’ connection to the AWS community, transparent pricing and flexibility in allowing customers to cancel anytime help it stand out, Charm said. Competition is the sign of a “healthy market opportunity,” he said, and reinforces N2WS’ message that workloads hosted with public cloud providers need protection.

“N2WS has been focused on solving the challenge of protecting data and workloads in the public cloud since 2013,” Charm said. “It is great to see that all the major backup providers — not just Veeam — are starting to take this seriously.”

IDC research found that more than 80% of new application deployments will include cloud. A backup platform that features support for hybrid and multi-cloud environments is a top need, especially for large enterprises, and will help Veeam attract those customers, according to Venkatraman.

“But cloud focus is a top priority for its main competitors, too, and success will be driven by differentiation — in pricing, in user experience and successful unification, and in channel/go-to-market transformation,” she wrote.

Office 365 backup, NAS support and more

The Veeam cloud backup portfolio is also updating its Office 365 protection, the fastest growing product in the history of the company. While Veeam Backup for Microsoft Office 365 previously offered on-premises backup, version 4 will back up directly to the cloud to either Azure or AWS. Veeam had only been addressing half of the market needs, Timashev said.

Veeam Backup for Microsoft Office 365, which covers Exchange, SharePoint and OneDrive, will also add object storage support, including AWS S3, Azure Blob, IBM Cloud and S3-compatible providers. Version 4 will be available as a public beta on Monday with general availability expected by the end of 2019.

Further along in the roadmap, version 10 of the Veeam Availability Suite is scheduled to be available for service providers in December and the general public in January. The top feature is enhanced NAS backup, which incorporates changed file tracking, the ability to “protect from anywhere to anywhere” and snapshot support, Allan said.

The product has been in private beta since the summer.

“It’s been tested very extensively by our partners and our customers, so we are pretty confident that we are getting very close,” Timashev said.

IDC research showed that unstructured data is growing faster than structured data and organizations need enterprise-grade backup for this environment that houses sensitive data, according to Venkatraman.

“[Veeam’s offering] is a wait and watch, but there is a lot of demand for NAS backup among enterprises,” she wrote.

Veeam is also “always looking for acquisitions,” Timashev said, in areas such as cloud data management and migration, data optimization and cloud optimization.


AWS ups Java support, joins Java Community Process

Although it’s nearly 25 years old, the Java programming language has gained renewed interest lately from major cloud platform providers — namely, AWS and Microsoft.

For instance, this week AWS joined the Java Community Process (JCP), the governing body that manages the process of adding new features or specifications to the Java language and platform.

After nearly 25 years as a workhorse programming language for enterprise applications and systems development, there is a ton of Java code out in the wild. And as more organizations move their Java workloads to the cloud, cloud platform providers are vying for those organizations to bring their Java apps to these vendors’ clouds.

‘You have to play nice with the community’

Amazon itself runs thousands of Java production services. And over the last few years, AWS has been courting Java developers in earnest.

“Java has the largest developer community, with between 10 million and 15 million developers,” said Holger Mueller, an analyst at Constellation Research in San Francisco. “If you want to attract enterprise workloads, you have to play nice with the community. And then you want to influence it on doing the ‘right’ things for the cloud era.”

In 2016, the company started building its own distribution of OpenJDK, the free and open source implementation of the Java Platform, Standard Edition. Amazon uses its OpenJDK distribution, known as Corretto, to run AWS and other Amazon services, said Yishai Galatzer, manager of the Artifacts and Languages Group in the AWS Developer Tools unit, in a blog post.

Galatzer’s team builds and distributes Amazon Corretto and built the Java Development Kit (JDK) that powers Amazon’s services, he said. Last year, Amazon released Corretto as an open source project.

In addition, Amazon began to contribute its patches to the OpenJDK project and last year started to help maintain the OpenJDK 8 and OpenJDK 11 projects, Galatzer said. With a focus on security, Amazon joined the Java Vulnerability Group to help address security issues in JDK 8 and JDK 11, he said. In that same vein, Amazon released in July its Amazon Corretto Crypto Provider, which implements standard Java Cryptography Architecture interfaces and provides high-performance cryptographic operations for OpenJDK, Galatzer said.

Now, with its membership in the Java Community Process, AWS has a direct line to influencing the future of Java — a matter of particular importance, as Java evolves to suit the requirements for cloud-native computing.

Amazon is trying to engage with the developer community, so it really makes sense to jump on the trend train.
Cameron Purdy, CEO, Xqiz.it

“Amazon is trying to engage with the developer community, so it really makes sense to jump on the ‘trend train,'” said Cameron Purdy, CEO of Xqiz.it in Lexington, Mass., and former senior vice president of development at Oracle, where he oversaw key Java projects.

In recent years, AWS hired key Java experts, including Java creator James Gosling and Arun Gupta, who held core Java evangelism roles at both Sun Microsystems and Oracle after it acquired Sun in 2010.

Purdy says AWS’ interest in a role on the JCP seems quite natural for two reasons: “First, Amazon has hired some of the former Sun Java evangelism team, and they’re eager to engage with that community,” he said. “And second, Amazon now has their own distribution of the OpenJDK, so it makes sense to be involved.”

However, more cynical observers surmise that Amazon’s membership in the Java Community Process could be a condition of the company gaining the Java Technology Compatibility Kit license agreement that Amazon needs in order to claim that Corretto is a compatible OpenJDK implementation.

Moreover, Oracle gains as it gets Amazon’s brand attached to the JCP, and Amazon is now part of the patent non-aggression pact that supports Java.

Microsoft ramping up its Java presence

Meanwhile, Microsoft has made its own strong moves in the Java market, including its acquisition of jClarity in August to optimize its Azure cloud platform to run Java workloads. The jClarity acquisition also brings the core team behind the AdoptOpenJDK distribution of OpenJDK under the auspices of Microsoft.

Also, earlier in October, Microsoft and Pivotal teamed up to deliver Azure Spring Cloud, a service for Spring Boot apps designed to help developers build scalable microservices without the need to configure underlying infrastructure.


AI at the core of next-generation BI

Next-generation BI is upon us, and has been for a few years now.

The first generation of business intelligence, beginning in the 1980s and extending through the turn of the 21st century, relied entirely on information technology experts. It was about business reporting, and was inaccessible to all but a very few with specialized skills.

The second introduced self-service analytics, and lasted until just a few years ago. The technology was accessible to data analysts, and defined by data visualization, data preparation and data discovery.

Next-generation BI — the third generation — is characterized by augmented intelligence, machine learning and natural language processing. It’s open to everyday business users, and trust and transparency are important aspects. It’s also changing the direction in which data looks, becoming more predictive rather than purely retrospective.

In September, Constellation Research released “Augmented Analytics: How Smart Features Are Changing Business Intelligence.” The report, authored by analyst Doug Henschen, took a deep look at next-generation BI.

Henschen reflected on some of his findings about the third generation of business analytics for a two-part Q&A.

In Part I, Henschen addressed what marked the beginning of this new era and who stands to benefit most from augmented BI capabilities. In Part II, he looked at which vendors are positioned to succeed and where next-generation business intelligence is headed next.

In your report you peg 2015 as the beginning of next generation BI — what features were you seeing in analytics platforms at that time that signaled a new era?

Photo: Doug Henschen

Doug Henschen: There was a lot percolating at the time, but I don’t think that it’s about a specific technology coming out in 2015. That’s an approximation of when augmented analytics really became something that was ensconced as a buying criteria. That’s I think approximately when we shifted — the previous decade was really when self-service became really important and the majority of deployments were driving toward it, and I pegged 2015 as the approximate time at which augmented started getting on everyone’s radar.

Beyond the technology itself, what were some things that happened in the market around the time of 2015 that showed things were changing?

Henschen: There were lots of technology things that led up to that — Watson playing Jeopardy was in 2011, SAP acquired KXEN in 2013, IBM introduced Watson Analytics in 2014. Some startups like ThoughtSpot and BeyondCore came in during the middle of the decade, Salesforce introduced Einstein in 2016 and ended up acquiring BeyondCore in 2016. A lot of stuff was percolating in the decade, and 2015 is about when it became about, ‘OK, we want augmented analytics on our list. We want to see these features coming up on roadmaps.’

What are you seeing now that has advanced next-generation BI beyond what was available in 2015?

Anything that is proactive, that provides recommendations, that helps automate work that was tedious, that surfaces insights that humans would have a tough time recognizing but that machines can recognize — that’s helpful to everybody.
Doug Henschen, analyst, Constellation Research

Henschen: In the report I dive into four areas — data preparation, data discovery and analysis, natural language interfaces and interaction, and forecasting and prediction — and in every category you’ve seen certain capabilities become commonplace, while other capabilities have been emerging and are on the bleeding edge. In data prep, everyone can pretty much do auto data profiling, but recommended or suggested data sources and joins are a little bit less common. Guided approaches that walk you through how to cleanse this, how to format this, where and how to join — that’s a little bit more advanced and not everybody does it.

Similarly, in the other categories, recommended data visualization is pretty common in discovery and analysis, but intent-driven recommendations that track what individuals are doing and make recommendations based on patterns among people are more on the bleeding edge. It applies in every category. There’s stuff that is now widely done by most products, and stuff that is more bleeding edge where some companies are innovating and leading.

Who benefits from next-generation BI that didn’t benefit in previous generations — what types of users?

Henschen: I think these features will benefit all. Anything that is proactive, that provides recommendations, that helps automate work that was tedious, that surfaces insights that humans would have a tough time recognizing but that machines can recognize — that’s helpful to everybody. It has long been an ambition in BI and analytics to spread this capability to the many, to the business users, as well as the analysts who have long served the business users, and this extends the trend of self-service to more users, but it also saves time and supports even the more sophisticated users.

Obviously, larger companies have teams of data analysts and data engineers and have more people of that sort — they have data scientists. Midsize companies don’t have as many of those assets, so I think [augmented capabilities] stand to be more beneficial to midsize companies. Things like recommended visualizations and starting points for data exploration, those are very helpful when you don’t have an expert on hand and a team at your disposal to develop a dashboard to address a problem or look at the impact of something on sales. I think [augmented capabilities] are going to benefit all, but midsize companies and those with fewer people and resources stand to benefit more.  

You referred to medium-sized businesses, but what about small businesses?

Henschen: In the BI and analytics world there are products that are geared to reporting and helping companies at scale. The desktop products are more popular with small companies — Tableau, Microsoft Power BI, Tibco Spotfire are some that have desktop options, and small companies are turning also to SaaS options. We focus on enterprise analytics — midsize companies and up — and I think enterprise software vendors are focused that way, but there are definitely cloud services, SaaS vendors and desktop options. Salesforce has some good small business options. Augmented capabilities are coming into those tools as well.

Editor’s note: This interview has been edited for clarity and conciseness.


Cloud adoption a catalyst for IT modernization in many orgs

One of the biggest changes for administrators in recent years is the cloud. Its presence requires administrators to migrate from their on-premises way of thinking.

The problem isn’t the cloud. After all, there should be less work if someone else looks after the server for you. The arrival of the cloud has brought to light some of the industry’s outdated methodologies, which is prompting this IT modernization movement. Practices in many IT shops were not as rigid or regimented before the cloud came along because external access was limited.

Changing times and new technologies spur IT modernization efforts

When organizations were exclusively on premises, it was easy enough to add finely controlled firewall rules to only allow certain connections in and out. Internal web-based applications did not need HTTPS — just plain HTTP worked fine. You did not have to muck around with certificates, which seem to always be difficult to comprehend. Anyone on your network was authorized to be there, so it didn’t matter if data was unencrypted. The risk, a lot of us told ourselves, wasn’t worth the effort, and the users would have no idea anyway.

You would find different ways to limit the threats to the organization. You could implement 802.1X, which only allowed authorized devices on the network. This reduced the chances of a breach because the attacker would need both physical access to the network and an approved device. Active Directory could be messy; IT had a relaxed attitude about account management and cleanup, which was fine as long as everyone could do their job.

The pre-cloud era allowed for a lot of untidiness and shortcuts, because the risk of these things affecting the business in a drastic way was smaller. Administrators who stepped into a new job would routinely inherit a mess from the last IT team. There was little incentive to clean things up; just keep those existing workloads running. Now that there is increased risk with exposing the company’s systems to the world via cloud, it’s no longer an option to keep doing things the same way just to get by.

One example of how the cloud forces IT practices to change is the default configuration when you use Microsoft’s Azure Active Directory. This product syncs every Active Directory object to the cloud unless you apply filtering. The official documentation states that this is the recommended configuration. Think about that: Every single overlooked, basic password that got leaked several years ago during the LinkedIn breach is now in the cloud for use by anyone in the world. Those accounts went from a forgotten mess pushed under the rug years ago to a ticking time bomb waiting for attackers to hit a successful login as they spin through their lists of millions of username and password combos.

Back on the HTTP/HTTPS side, users now want to work from home or anywhere they might have an internet connection. They also want to do it from any device, such as their personal laptop, mobile phone or tablet. Exposing internal websites was once — and still is in many scenarios — a case of poking a hole in the firewall and hoping for the best. With an unencrypted HTTP site, all the data pushed in and out of that endpoint, from anything the user sees to anything they enter, such as a username and password, is at risk. Your users could be working from a free McDonald’s Wi-Fi connection or at any airport in the world. It’s not hard for attackers to set up fake relay access points, listen to all the traffic and read anything that is not encrypted. Look up WiFi Pineapple for more information about the potential risks.
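
A quick way to gauge that exposure in your own environment is to probe which internal sites still answer over plain HTTP. The sketch below assumes the requests library and a hypothetical list of hostnames; it is a rough check, not a complete audit tool:

```python
import requests

# Hypothetical internal application hostnames to check.
hosts = ["intranet.example.local", "timesheets.example.local", "wiki.example.local"]

for host in hosts:
    try:
        # Does the site answer over HTTPS with a certificate we trust?
        requests.get(f"https://{host}", timeout=5)
        https_ok = True
    except requests.exceptions.RequestException:
        https_ok = False

    try:
        # Does plain HTTP still serve content instead of redirecting to HTTPS?
        resp = requests.get(f"http://{host}", timeout=5, allow_redirects=False)
        plain_http_exposed = resp.status_code == 200
    except requests.exceptions.RequestException:
        plain_http_exposed = False

    print(f"{host}: https={'yes' if https_ok else 'NO'}, "
          f"plain HTTP exposed={'YES' if plain_http_exposed else 'no'}")
```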

How to accommodate your users and tighten security

As you can see, it’s easy to end up in a high-risk situation if IT focuses on making users happy instead of company security. How do you make the transition to a safer environment? At a high level, there are several immediate actions to take:

  • Clean up Active Directory. Audit accounts, disable ones not in use and organize your organizational units so they are clear and logical. Implement an account management process from beginning to end. (See the sketch after this list for one way to report stale accounts.)
  • Review your password policy. If you have no other protection, cycle your passwords regularly and enforce some level of complexity. Look at other methods for added protection, such as multifactor authentication (MFA), which Azure Active Directory provides and which can do away with password cycling. You can also combine MFA with conditional access, so a user on your trusted network or using a trusted device doesn’t even need an MFA prompt. The choice is yours.
  • Review and report on account usage. When something is amiss with account usage, you should know as soon as possible so you can take corrective action. Technologies such as the identity protection feature in Azure Active Directory issue alerts and remediate suspicious activity, such as a login from a location that is not typical for that account.
  • Implement HTTPS on all sites. You don’t have to buy a certificate for each individual site to enable HTTPS. Save money and generate them yourself if the site is only for trusted computers on which you can deploy the certificate chain. Another option is to buy a wildcard certificate to use everywhere. Once the certificate is deployed, you can expose the sites you want with Azure Active Directory Application Proxy rather than open ports in your firewall. This gives the added benefit of forcing an Azure Active Directory login to apply MFA and identity protection before the user gets to the internal site, regardless of the device and where they are physically located.
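
For the first item, stale accounts are usually found with PowerShell’s Active Directory cmdlets, but the idea can be sketched in a few lines of Python as well. This sketch assumes the ldap3 library and hypothetical server, credentials and base DN; it reports enabled accounts that have not logged on in roughly 90 days:

```python
from datetime import datetime, timedelta, timezone
from ldap3 import Server, Connection, ALL

# Hypothetical domain controller, service account and base DN.
server = Server("dc01.example.local", get_info=ALL)
conn = Connection(server, user="audit.svc@example.local", password="change-me", auto_bind=True)

# lastLogonTimestamp is a Windows FILETIME: 100-nanosecond intervals since 1601-01-01.
cutoff = datetime.now(timezone.utc) - timedelta(days=90)
filetime_cutoff = int((cutoff - datetime(1601, 1, 1, tzinfo=timezone.utc)).total_seconds() * 10_000_000)

# Match user objects that are NOT disabled (userAccountControl bit 2) and whose
# last logon replication timestamp is older than the cutoff.
ldap_filter = (
    "(&(objectCategory=person)(objectClass=user)"
    "(!(userAccountControl:1.2.840.113556.1.4.803:=2))"
    f"(lastLogonTimestamp<={filetime_cutoff}))"
)

conn.search("DC=example,DC=local", ldap_filter,
            attributes=["sAMAccountName", "lastLogonTimestamp"])

for entry in conn.entries:
    print(entry.sAMAccountName, entry.lastLogonTimestamp)
```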

These are a few of the critical aspects to think about when changing your mindset from on-premises to cloud. This is a basic overview of the areas to give a closer look. There’s a lot more to consider, depending on the cloud services you plan to use.


SAP HANA database pivotal to SAP’s past — and future

Twenty years ago, enterprises may have turned to SAP for back-office business enterprise software. But these days, SAP wants to be much more than that.

A big part of SAP’s strategy has to do with SAP HANA, an in-memory database the company initially released in 2010. It is now the gateway to what SAP calls the intelligent enterprise, where data is used to improve business processes and develop new business models.

The first part of this two-part series looks at how SAP, which has been around for 47 years, has transitioned from a company that focused primarily on back-office business enterprise software to one that endeavors to transform organizations into intelligent enterprises.

Broadening the scope

SAP’s story in the last 20 years has been one of continually broadening scope, according to Lloyd Adams, managing director of the East Region at SAP America Inc. He joined the company in 1998.

In the late 1990s and early 2000s, “we were known more as an ERP company — perhaps back office only,” Adams said. “But through the years, both through organic development and a combination of development and acquisition, we’ve positioned ourselves to bring the back office to the front office to help provide the intelligent enterprise.”

Anchored by SAP R/3, its pioneering client-server ERP platform, SAP entered a period of dramatic growth in the late 1990s. It rode the wave of Y2K fears, as businesses scrambled to consolidate IT on back-office ERP systems.

Photo: Joshua Greenbaum, principal, Enterprise Applications Consulting

“The upgrade fever that Y2K created was really enormous and a lot of folks were pushing to use Y2K as a way to rationalize IT spending,” said Joshua Greenbaum, principal at Enterprise Applications Consulting. “Also the Euro changeover was coming, and there was a lot of interest in looking at SAP because of how it could help manage European currency changes. So those two phenomena were really operative in the late 1990s, and SAP was right at the forefront of it.”

At the same time that SAP’s ERP business was growing, however, it faced threats from the rise of internet-based business systems and on-premises best-of-breed applications like Siebel Systems, which created a popular CRM product that Oracle acquired in 2005, and Ariba, which sold a procurement product that SAP eventually acquired in 2012, according to Jon Reed, co-founder of the ERP news and analysis firm Diginomica.com.

“SAP was able to weather those storms while expanding their ERP footprint by building out a serviceable CRM module, as well as an HR module with a globalized payroll function that has stood the test of time,” Reed said. “Their core manufacturing base remained loyal and … preferred SAP’s ‘one throat to choke’ approach and extensive consulting partners.”

Not all of SAP’s efforts succeeded. Its SAP NetWeaver integration platform fell short, and the company failed to see Salesforce — or anything SaaS — coming, Reed said.

One of the main keys to SAP’s success was to encourage its customers to undergo IT and business process reengineering in the 1990s, even if it was extremely complex, according to analyst Dana Gardner, president of Interarbor Solutions LLC in Gilford, N.H.

“Such IT-instigated business change was — and is — not easy, and the stumbles by many companies to catch up to the client-server world and implement ERP were legendary,” he said. “But imagine if those companies had not made the shift to being digital businesses in the 1990s? When the web and internet hit, manual processes and nonintegrated business functions had to adapt to a connected world, so IT went from being for the business to being the whole business.”

The idea that applications and the supporting IT infrastructure work collectively using distributed yet common data and pervasive networking to provide the best information and processes is a given these days, but SAP made this possible first, Gardner said.

Milestones in SAP's 20-year journey from R/3 to the intelligent enterprise.

The SAP HANA big bang

But by the end of the 2000s, the radical new in-memory database SAP HANA was about to change SAP’s direction again.

The release of the SAP HANA database in 2010 was the critical development that allowed SAP to conceive and begin to sell the concept of the intelligent enterprise, according to Adams. Without HANA, there would have been no intelligent enterprise.

Photo: Lloyd Adams, managing director, East Region, SAP America Inc.

“It truly revolutionized the company, the industry and our ability to transcend conversations from a back-office perspective, but then be able to sit down with our customers and try and understand what were the main opportunities that they were looking to exploit or problems they were looking to solve,” he said.

The development of SAP HANA was driven in large part by the rivalry between SAP and Oracle, according to Greenbaum. The SAP ERP applications ran mostly on Oracle databases, and in the 2000s Oracle began to aggressively encroach on SAP’s territory in the enterprise software space with moves like the bitter acquisition of ERP vendor PeopleSoft.

“For SAP this was a real wake up call, because of the dependency that they had on the Oracle database,” Greenbaum said. “That realization that they needed to get out from under Oracle, along with some research that had already been going on with in-memory databases inside SAP, began the hunt for an alternative, and that’s where the HANA project started to bear fruit.”

It has been a long, slow process for SAP to move its customers off of Oracle, which is still something of a problem today, Greenbaum said. But he believes HANA is now firmly established as the database of choice for customers.

Missteps with the SAP HANA database?

However, the emphasis on the SAP HANA database might have also been a distraction that took the company away from innovating on the applications that form SAP’s core user base, according to analyst Vinnie Mirchandani, founder of Deal Architect.

Photo: Vinnie Mirchandani, analyst and founder, Deal Architect

“Every few years, SAP gets enamored with platforms and tools,” Mirchandani said. “NetWeaver and HANA, in particular, distracted the company from an application focus, without generating much revenue or market share in those segments.”

SAP was fundamentally correct that in-memory technology and real-time ERP were the ways of the future, but its push into databases with HANA is still a questionable strategy, according to Reed.

“Whether SAP should have entered the database business themselves is still open to second-guessing,” he said. “You can argue this move has distracted SAP from focusing on their homegrown and acquired cloud applications. For example, would SAP be much further ahead on SuccessFactors functionality if they hadn’t spent so much time putting SuccessFactors onto HANA?”

Buying into the cloud

SAP was slow to react to the rise of enterprise cloud computing and SaaS applications like Salesforce, but it course-corrected by going on a cloud application buying spree, acquiring SuccessFactors in 2011, Ariba in 2012, Hybris in 2013, and Fieldglass and Concur in 2014.

Combining these cloud applications with SAP HANA “completely changed the game” for the company, Adams said.

“We eventually began to put those cloud line of business solutions on the HANA platform,” he said. “That’s given us the ability to tell a full intelligent enterprise story in ways that we weren’t fully poised to do [before HANA].”

SAP’s strategy of buying its way into the cloud has been largely successful, although efforts to move core legacy applications to the cloud have been mixed, Greenbaum said.

“SAP can claim to be one of the absolute leaders in the cloud enterprise software space,” he said. “It’s a legacy that is tempered by the fact that they’re still pulling the core legacy R/3 and ECC customers into the cloud, which has not worked out as well as SAP would like, but in terms of overall revenue and influence in the area, they’ve made their mark.”

Although SAP has proved to be adaptable to changing technologies and business trends, the future is in question. Part two of this series will look at the release of SAP S/4HANA (the rewriting of SAP’s signature Business Suite on HANA), the emergence of the SAP intelligent enterprise, and SAP’s focus on customer experience.
