SAP sees S/4HANA migration as its future, but do customers?

The first part of our 20-year SAP retrospective examined the company’s emerging dominance in the ERP market and its transition to the HANA in-memory database. Part two looks at the release of SAP S/4HANA in February 2015. The “next-generation ERP” was touted by the company as the key to SAP’s future, but it ultimately raised questions that in many cases have yet to be answered. The S/4HANA migration remains the initiative most critical to the company’s future.

Questions about SAP’s future have shifted in the past year, as the company has undergone an almost complete changeover in its leadership ranks. Most of the SAP executives who drove the strategy around S/4HANA and the intelligent enterprise have left the company, including former CEO Bill McDermott. New co-CEOs Jennifer Morgan and Christian Klein are SAP veterans, and analysts don’t expect the change in leadership to bring significant changes in the company’s technology and business strategy.

But they will take over the most daunting task SAP has faced: convincing customers of the business value of the intelligent enterprise, a data-driven transformation of businesses with S/4HANA serving as the digital core. As part of the transition toward intelligence, SAP is pushing customers to move off tried-and-true SAP ECC ERP systems (or the even older SAP R/3) and onto the modern “next-generation ERP” S/4HANA. SAP plans to end support for ECC by 2025.

S/4HANA is all about enabling businesses to make decisions in real time as data becomes available, said Dan Lahl, SAP vice president of product marketing and a 24-year SAP veteran.

“That’s really what S/4HANA is about,” Lahl said. “You want to analyze the data that’s in your system today. Not yesterday’s or last week’s information and data that leads you to make decisions that don’t even matter anymore, because the data’s a week out. It’s about giving customers the ability to make better decisions at their fingertips.”

S/4HANA migration a matter of when, not if

Most SAP customers see the value of an S/4HANA migration, but they are unsure how to get there, with many citing the cost and complexity of the move. This is a conundrum that SAP acknowledges.

“We see that our customers aren’t grappling with if [they are going to move], but when,” said Lloyd Adams, managing director of the East Region at SAP America. “One of our responsibilities, then, is to provide that clarity and demonstrate the value of S/4HANA, but to do so in the context of the customers’ business and their industry. Just as important as showing them how to move, we need to do it as simply as possible, which can be a challenge.”

S/4HANA is the right platform for the intelligent enterprise because of the way it can handle all the data that the intelligent enterprise requires, said Derek Oats, CEO of Americas at SNP, an SAP partner based in Heidelberg, Germany, that provides migration services.

In order to build the intelligent enterprise, customers need to have a platform that can consume data from a variety of systems — including enterprise applications, IoT sensors and other sources — and ready it for analytics, AI and machine learning, according to Oats. S/4HANA uses SAP HANA, a columnar, in-memory database, to do that and then presents the data in an easy-to-navigate Fiori user interface, he said.

“If you don’t have that ability to push out of the way a lot of the work and the crunching that has often occurred down to the base level, you’re kind of at a standstill,” he said. “You can only get so much out of a relational database because you have to rely on the CPU at the application layer to do a lot of the crunching.”

S/4HANA business case difficult to make

Although many SAP customers understand the benefits of S/4HANA, SAP has had a tough time getting its migration message across to its large customer base. The majority of customers plan to remain on SAP ECC and have only vague plans for an S/4HANA migration.

“The potential for S/4HANA hasn’t been realized to the degree that SAP would like,” said Joshua Greenbaum, principal at Enterprise Applications Consulting. “More companies are really looking at S/4HANA as the driver of genuine business change, and recognize that this is what it’s supposed to be for. But when you ask them, ‘What’s your business case for upgrading to S/4HANA?’ the answer is ‘2025.’”

One of the problems that SAP faces when convincing customers of the value of S/4HANA and the intelligent enterprise is that no simple use case drives the point home, Greenbaum said. Twenty years ago, Y2K provided an easy-to-understand reason why companies needed to overhaul their enterprise business systems, and the fear that computers wouldn’t adapt to the year 2000 led in large measure to SAP’s early growth.

“Digital transformation is a complicated problem and the real issue with S/4HANA is that the concepts behind it are relatively big and very specific to company, line of business and geography,” he said. “So the use cases are much harder to justify, or it’s much more complicated to justify than, ‘Everything is going to blow up on January 1, 2000, so we have to get our software upgraded.'”

Evolving competition faces S/4HANA

Jon Reed, analyst and co-founder of ERP news and analysis firm Diginomica.com, agrees that SAP has successfully embraced the general concept of the intelligent enterprise with S/4HANA, but struggles to present understandable use cases.

“The question of S/4HANA adoption remains central to SAP’s future prospects, but SAP customers are still trying to understand the business case,” Reed said. “That’s because agile, customer-facing projects get the attention these days, not multi-year tech platform modernizations. For those SAP customers that embrace a total transformation — and want to use SAP tech to do it — S/4HANA looks like a viable go-to product.”

SAP’s issues with driving S/4HANA adoption may not come from the traditional enterprise competitors like Oracle, Microsoft and Infor, but from cloud-based business applications like Salesforce and Workday, said Eric Kimberling, president of Third Stage Consulting, a Denver-based firm that provides advice on ERP deployments and implementations.

“They aren’t direct competitors with SAP; they don’t have the breadth of functionality and the scale that SAP does, but they have really good functionality in their best-of-breed world,” Kimberling said. “Companies like Workday and Salesforce make it easier to add a little piece of something without having to worry about a big SAP project, so there’s an indirect competition with S/4HANA.”

SAP customers are going to have to adapt to evolving enterprise business conditions regardless of whether or when they move to S/4HANA, Greenbaum said.

“Companies have to build business processes to drive the new business models. Whatever platform they settle on, they’re going to be unable to stand still,” he said. “There’s going to have to be this movement in the customer base. The question is will they build primarily on top of S/4HANA? Will they use an Amazon or an Azure hyperscaler as the platform for innovation? Will they go to their CRM or workforce automation tool for that? The ‘where’ and ‘what next’ is complicated, but certainly a lot of companies are positioning themselves to use S/4HANA for that.”


The First Wave of Xbox Black Friday Deals Has Arrived: Discounts on Sea of Thieves and Select Xbox Wireless Controllers – Xbox Wire

The holidays will be here before you know it, and to kick off the start of November, we are unveiling the first wave of Xbox Black Friday discounts. This is just a sample of our full slate of Black Friday deals – tune in via Mixer for a special episode of Inside Xbox live from X019 in London on Thursday, November 14 at 12:00 p.m. PT for the full lineup of Xbox Black Friday discounts and offers. You won’t want to miss out!

First up, we are offering a 50% discount on Sea of Thieves: Anniversary Edition, the fastest-selling first-party new IP of this generation. Join this multiplayer, shared-world adventure game featuring new modes like the story-driven Tall Tales or The Arena, a competitive multiplayer experience on the high seas. Xbox Live Gold is required to play Sea of Thieves: Anniversary Edition and is sold separately.

Fans can also save up to $20 on select Xbox Wireless Controllers, including some of the newest controllers in the Xbox collection. Snag the Night Ops Camo Special Edition, Sport Blue Special Edition, Gears 5 Kait Diaz Limited Edition controllers and many more at the lowest prices of the season.

Deals are valid starting on November 24 and run through December 2, 2019. Plus, Black Friday kicks off even earlier for Xbox Game Pass Ultimate and Xbox Live Gold members, with Early Access beginning on November 21.

Visit Xbox.com, Microsoft Store and participating retailers globally for more details on availability and pricing as deals will vary between regions and retailers. See here for more Black Friday deals from Microsoft Store.

Xbox has something for everyone on your gift list this year, and at every price point. Be sure to tune in to Inside Xbox at X019 on Thursday, November 14 at 12:00 p.m. PT for the full lineup of Xbox Black Friday deals.


HPE Cray ClusterStor E1000 arrays tackle converged workloads

Supercomputer maker Cray has pumped out revamped high-density ClusterStor storage, its first significant product advance since being acquired by Hewlett Packard Enterprise.

The new Cray ClusterStor E1000 launched this week, six months after HPE’s $1.3 billion acquisition of Cray in May. Engineering of the E1000 began before the HPE acquisition.

Data centers can mix the dense Cray ClusterStor E1000 all-flash and disk arrays to build ultrafast “exascale” storage clusters that converge processing for AI, modeling and simulation and similar data sets, said Ulrich Plechschmidt, Cray lead manager.

The E1000 arrays run a hardened version of the Lustre open source parallel file system. The all-flash E1000 provides 4.5 TB of raw storage per SSD rack, with expansion shelves that add up to 4.6 TB. The all-flash system potentially delivers up to 1.6 TB per second of throughput and 50 million IOPS per SSD rack, while an HDD rack is rated at 120 Gbps and 10 PB of raw capacity.

When fully built out, Plechschmidt said ClusterStor can scale to 700 PB of usable capacity in a single system, with throughput up to 10 PB per second.

Cray software stack

Cray ClusterStor disk arrays pool flash and disk within the same file system. ClusterStor E1000 includes Cray-designed PCIe 4.0 storage servers that serve data from NVMe SSDs and spinning disk. Cray’s new Slingshot 200 Gbps interconnect top-of-rack switches manage storage traffic.

Newly introduced ClusterStor Data Services manage orchestration and data tiering, which initially will be available as scripted tiering for manually invoking Lustre software commands. Automated data movement and read-back/write-through caching are on HPE’s Cray roadmap.

While ClusterStor E1000 hardware has massive density and low-latency throughput, Cray invested significantly in upgrading its software stack, said Steve Conway, COO and senior research vice president at Hyperion Research, based in St. Paul, Minn.

“To me, the most impressive work Cray did is on the software side. You might have to stage data in 20 different containers at the same time, each one outfitted differently. And you have to supply the right data at the right time and might have to solve the whole problem in milliseconds. That’s a very difficult orchestration process,” Conway said.

The ClusterStor odyssey

HPE is the latest in a string of vendors to take ownership of ClusterStor. Seagate Technology acquired original ClusterStor developer Xyratex in 2013, then in 2017 sold ClusterStor to Cray, which had been a Seagate OEM partner.

HPE-owned Cray released new all-flash and disk ClusterStor arrays for AI and containerized workloads.

HPE leads the high-performance computing (HPC) market in overall revenue, but it has not had a strong presence in the high end of the supercomputing market. Buying Cray allows HPE to sell more storage for exascale computing, which represents a thousandfold increase over petascale computing power. These high-powered exascale systems are priced beyond the budgets of most commercial enterprises.

Cray’s Shasta architecture underpins three large supercomputing sites at federal research labs: Argonne National Laboratory in Lemont, Ill.; Lawrence Livermore National Laboratory in Livermore, Calif.; and Oak Ridge National Laboratory in Oak Ridge, Tenn.

Cray last year won a $146 million federal contract to architect a new supercomputer at the National Energy Research Scientific Computing Center at Lawrence Berkeley National Laboratory. That system will use Cray ClusterStor storage.

Conway said Cray and other HPC competitors are under pressure to expand to address newer abstraction methods for processing data, including AI, container storage and microservices architecture.

“You used to think of supercomputers as a single-purpose steak knife. Now they have to be a multipurpose Swiss Army knife. The newest generation of supercomputers are all about containerization and orchestration of data on premises,” Conway said. “They have to be much more heterogeneous in what they do, and the storage has to follow suit.”


How to manage Server Core with PowerShell

After you first install Windows Server 2019 and reboot, you might find something unexpected: a command prompt.

You may be sure you didn’t select the Server Core option, but Microsoft now makes it the default Windows Server OS deployment for its smaller attack surface and lower system requirements. While you might remember DOS commands, those are only going to get you so far. To deploy and manage Server Core, you need to build your familiarity with PowerShell, the primary tool for operating this headless flavor of Windows Server.

To help you on your way, you will want to build your knowledge of PowerShell, and you might start with the PowerShell Integrated Scripting Environment (ISE). PowerShell ISE offers a wealth of features for the novice PowerShell user, from autocompletion of commands to context-colored syntax that steps you through the scripting process. The problem is PowerShell ISE requires a GUI, or the “full” Windows Server. To manage Server Core, you have the command window and PowerShell in its raw form.

Start with the PowerShell basics

To start, type in powershell to get into the environment, denoted by the PS before the C: prompt. A few basic DOS commands will work, but PowerShell is a different language. Before you can add features and roles, you need to set your IP and domain. It can be done in PowerShell, but this is laborious and requires a fair amount of typing. Instead, we can take a shortcut and use sconfig to complete the setup. After that, we can use PowerShell for additional administrative work.
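For reference, the manual PowerShell route looks roughly like the following sketch. The interface alias, IP addresses and domain name here are placeholder values to adapt to your environment.

# Assign a static IP address to the server's network adapter
New-NetIPAddress -InterfaceAlias "Ethernet" -IPAddress 192.168.1.50 -PrefixLength 24 -DefaultGateway 192.168.1.1

# Point the adapter at the domain's DNS server
Set-DnsClientServerAddress -InterfaceAlias "Ethernet" -ServerAddresses 192.168.1.10

# Join the domain and restart; you will be prompted for domain credentials
Add-Computer -DomainName "contoso.com" -Restart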

PowerShell commands, called cmdlets, use a verb-noun format, such as Install-WindowsFeature or Get-Help. The verbs have predefined categories that are generally clear on their function. Some examples of PowerShell verbs are:

  • Install: Use this PowerShell verb to install software or some resource to a location or initialize an install process. This would typically be done to install a Windows feature such as Dynamic Host Configuration Protocol (DHCP).
  • Set: This verb modifies existing settings in Windows resources, such as adjusting networking or other existing settings. It also works to create the resource if it did not already exist.
  • Add: Use this verb to add a resource or setting to an existing feature or role. For example, this could be used to add a scope onto the newly installed DHCP service.
  • Get: This is a resource retriever for data or contents of a resource. You could use Get to present the resolution of the display and then use Set to change it.

To install DHCP to a Server Core deployment with PowerShell, use the following commands.

Install the service:

Install-WindowsFeature -Name 'dhcp'

Add a scope for DHCP:

Add-DhcpServerv4Scope -Name "Office" -StartRange 192.168.1.100 -EndRange 192.168.1.200 -SubnetMask 255.255.255.0

Set the lease time:

Set-DhcpServerv4Scope -ScopeId 192.168.1.0 -LeaseDuration 1.00:00:00

Check the DHCP IPv4 scope:

Get-DhcpServerv4Scope
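One more step is usually needed on a domain-joined network: authorizing the new DHCP server in Active Directory so it can begin serving leases. A minimal sketch, using a hypothetical server name and address:

Add-DhcpServerInDC -DnsName "core01.contoso.com" -IPAddress 192.168.1.50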

Additional pointers for PowerShell newcomers

Each command has a purpose, and that means you have to know the syntax, which is the hardest part of learning PowerShell. Not knowing what you’re looking for can be very frustrating, but there is help. The Get-Help cmdlet displays the related commands for use with that function or role.

Part of the trouble for new PowerShell users is that memorizing all the commands can still be overwhelming, but there is a shortcut. As you start to type a command, the tab key autocompletes the PowerShell commands. For example, if you type Get-Help R and press the tab key, PowerShell will cycle through the commands, such as Remove-DhcpServerInDC; see Figure 1. When you find the command you want and hit enter, PowerShell presents additional information for using that command. Get-Help even supports wildcards, so you could type Get-Help *dhcp* to get results for commands that contain that phrase.

Figure 1. Use the Get-Help command to see the syntax used with a particular PowerShell cmdlet.
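Putting those tips together, both of the following are standard Get-Help invocations you can try on a Server Core install:

# Find every command with dhcp in the name
Get-Help *dhcp*

# Show worked examples for a specific cmdlet
Get-Help Add-DhcpServerv4Scope -Examples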

The tab function in PowerShell is a savior. While this approach is a little clumsy, it is a valuable asset in a pinch due to the sheer number of commands to remember. For example, a base install of Windows 10 includes Windows PowerShell 5.1, which features more than 1,500 cmdlets. As you install additional PowerShell modules, you make more cmdlets available.
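You can check those numbers on any machine. The first line below counts the cmdlets available in the current session; the second lists the installed modules that would add more:

# Count the cmdlets loaded in this session
(Get-Command -CommandType Cmdlet).Count

# List installed modules and the commands they expose
Get-Module -ListAvailable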

There are many PowerShell books, but do you really need them? There are extensive libraries of PowerShell code that are free to manipulate and use. Even walking through a Microsoft wizard gives the option to generate the PowerShell code for the wizard you just ran. As you learn where to find PowerShell code, writing a script becomes less a process of starting from scratch and more a matter of modifying existing code. You don’t have to be an expert; you just need to know how to manipulate the proper fields and areas.

Outside of typos, the biggest stumbling block for most beginners is not reading the screen. PowerShell does a mixed job with its error messages. The type is red when something doesn’t work, and PowerShell will give the line and character where the error occurred.

In the example in Figure 2, PowerShell threw an error due to the extra letter s at the end of the command Get-WindowsFeature. The system didn’t recognize the command, so it tagged the entire command rather than the individual letter, which can be frustrating for beginners.

Figure 2. When working with PowerShell on the command line, you don’t get precise locations of where an error occurred if you have a typo in a cmdlet name.

The key is to review your code closely, then review it again. If the command doesn’t work, you have to fix it to move forward. It helps to stop and take a deep breath, then slowly reread the code. Copying and pasting a script from the web isn’t foolproof and can introduce an error. With some time and patience, and some fundamental PowerShell knowledge of the commands, you can get moving with it a lot quicker than you might have thought.


AI at the core of next-generation BI

Next-generation BI is upon us, and has been for a few years now.

The first generation of business intelligence, beginning in the 1980s and extending through the turn of the 21st century, relied entirely on information technology experts. It was about business reporting, and was inaccessible to all but a very few with specialized skills.

The second introduced self-service analytics, and lasted until just a few years ago. The technology was accessible to data analysts, and defined by data visualization, data preparation and data discovery.

Next-generation BI — the third generation — is characterized by augmented intelligence, machine learning and natural language processing. It’s open to everyday business users, and trust and transparency are important aspects. It’s also changing the direction in which data looks, becoming more predictive.

In September, Constellation Research released “Augmented Analytics: How Smart Features Are Changing Business Intelligence.” The report, authored by analyst Doug Henschen, took a deep look at next-generation BI.

Henschen reflected on some of his findings about the third generation of business analytics for a two-part Q&A.

In Part I, Henschen addressed what marked the beginning of this new era and who stands to benefit most from augmented BI capabilities. In Part II, he looked at which vendors are positioned to succeed and where next-generation business intelligence is headed next.

In your report you peg 2015 as the beginning of next-generation BI — what features were you seeing in analytics platforms at that time that signaled a new era?

Doug Henschen: There was a lot percolating at the time, but I don’t think that it’s about a specific technology coming out in 2015. That’s an approximation of when augmented analytics really became something that was ensconced as a buying criteria. That’s I think approximately when we shifted — the previous decade was really when self-service became really important and the majority of deployments were driving toward it, and I pegged 2015 as the approximate time at which augmented started getting on everyone’s radar.

Beyond the technology itself, what were some things that happened in the market around the time of 2015 that showed things were changing?

Henschen: There were lots of technology things that led up to that — Watson playing Jeopardy was in 2011, SAP acquired KXEN in 2013, IBM introduced Watson Analytics in 2014. Some startups like ThoughtSpot and BeyondCore came in during the middle of the decade, Salesforce introduced Einstein in 2016 and ended up acquiring BeyondCore in 2016. A lot of stuff was percolating in the decade, and 2015 is about when it became about, ‘OK, we want augmented analytics on our list. We want to see these features coming up on roadmaps.’

What are you seeing now that has advanced next-generation BI beyond what was available in 2015?

Henschen: In the report I dive into four areas — data preparation, data discovery and analysis, natural language interfaces and interaction, and forecasting and prediction — and in every category you’ve seen certain capabilities become commonplace, while other capabilities have been emerging and are on the bleeding edge. In data prep, everyone can pretty much do auto data profiling, but recommended or suggested data sources and joins are a little bit less common. Guided approaches that walk you through how to cleanse this, how to format this, where and how to join — that’s a little bit more advanced and not everybody does it.

Similarly, in the other categories, recommended data visualization is pretty common in discovery and analysis, but intent-driven recommendations that track what individuals are doing and make recommendations based on patterns among people are more on the bleeding edge. It applies in every category. There’s stuff that is now widely done by most products, and stuff that is more bleeding edge where some companies are innovating and leading.

Who benefits from next-generation BI that didn’t benefit in previous generations — what types of users?

Henschen: I think these features will benefit all. Anything that is proactive, that provides recommendations, that helps automate work that was tedious, that surfaces insights that humans would have a tough time recognizing but that machines can recognize — that’s helpful to everybody. It has long been an ambition in BI and analytics to spread this capability to the many, to the business users, as well as the analysts who have long served the business users, and this extends the trend of self-service to more users, but it also saves time and supports even the more sophisticated users.

Obviously, larger companies have teams of data analysts and data engineers and have more people of that sort — they have data scientists. Midsize companies don’t have as many of those assets, so I think [augmented capabilities] stand to be more beneficial to midsize companies. Things like recommended visualizations and starting points for data exploration, those are very helpful when you don’t have an expert on hand and a team at your disposal to develop a dashboard to address a problem or look at the impact of something on sales. I think [augmented capabilities] are going to benefit all, but midsize companies and those with fewer people and resources stand to benefit more.  

You referred to medium-sized businesses, but what about small businesses?

Henschen: In the BI and analytics world there are products that are geared to reporting and helping companies at scale. The desktop products are more popular with small companies — Tableau, Microsoft Power BI, Tibco Spotfire are some that have desktop options, and small companies are turning also to SaaS options. We focus on enterprise analytics — midsize companies and up — and I think enterprise software vendors are focused that way, but there are definitely cloud services, SaaS vendors and desktop options. Salesforce has some good small business options. Augmented capabilities are coming into those tools as well.

Editor’s note: This interview has been edited for clarity and conciseness.


TigerGraph Cloud releases graph database as a service

With the general release of TigerGraph Cloud on Wednesday, TigerGraph introduced its first native graph database as a service.

In addition, the vendor announced that it secured $32 million in Series B funding, led by SIG.

TigerGraph, founded in 2012 and based in Redwood City, Calif., is a native graph database vendor whose products, first released in 2016, enable users to manage and access their data in ways different from traditional relational databases.

Graph databases simplify the connection of data points and enable them to simultaneously connect with more than one other data point. Among the benefits are the ability to significantly speed up the process of developing data into insights and to quickly pull data from disparate sources.

Before the release of TigerGraph Cloud, TigerGraph customers were able to take advantage of the power of graph databases, but they were largely on-premises users, and they had to do their own upgrades and oversee the management of the database themselves.

“The cloud makes life easier for everyone,” said Yu Xu, CEO of TigerGraph. “The cloud is the future, and more than half of database growth is coming from the cloud. Customers asked for this. We’ve been running [TigerGraph Cloud] in a preview for a while — we’ve gotten a lot of feedback from customers — and we’re big on the cloud. [Beta] customers have been using us in their own cloud.”

Regarding the servicing of the databases, Xu added: “Now we take over this control, now we host it, we manage it, we take care of the upgrades, we take care of the running operations. It’s the same database, but it’s an easy-to-use, fully SaaS model for our customers.”

In addition to providing graph database management as a service and enabling users to move their data management to the cloud, TigerGraph Cloud provides customers an easy entry into graph-based data analysis.

Some of the most well-known companies in the world, at their core, are built on graph databases.

Google, Facebook, LinkedIn and Twitter are all built on graph technology. Those companies, however, have vast teams of software developers to build their own graph databases and teams of data scientists to do their own graph-based data analysis, noted TigerGraph chief operating officer Todd Blaschka.

“That is where TigerGraph Cloud fits in,” Blaschka said. “[TigerGraph Cloud] is able to open it up to a broader adoption of business users so they don’t have to worry about the complexity underneath the hood in order to be able to mine the data and look for the patterns. We are providing a lot of this time-to-value out of the box.”

TigerGraph Cloud comes with 12 starter kits that help customers quickly build their applications. It also doesn’t require users to configure or manage servers, schedule monitoring or deal with potential security issues, according to TigerGraph.

That, according to Donald Farmer, principal at TreeHive Strategy, is a differentiator for TigerGraph Cloud.

“It is the simplicity of setting up a graph, using the starter kits, which is their great advantage,” he said. “Classic graph database use cases such as fraud detection and recommendation systems should be much quicker to set up with a starter kit, therefore allowing non-specialists to get started.”

Graph databases, however, are not better for everyone and everything, according to Farmer. They are better than relational databases for specific applications, in particular those in which augmented intelligence and machine learning can quickly discern patterns and make recommendations. But they are not yet as strong as relational databases in other key areas.

“One area where they are not so good is data aggregation, which is of course a significant proportion of the work for business analytics,” Farmer said. “So relational databases — especially relational data warehouses — still have an advantage here.”

Despite drawbacks, the market for graph databases is expected to grow substantially over the next few years.

And much of that growth will be in the cloud, according to Blaschka.

Citing a report from Gartner, he said that 68% of graph database market growth will be in the cloud, while the graph database market as a whole is forecast to have at least 100% year-over-year growth through 2022.

“The reason we’re seeing this growth so fast is that graph is the cornerstone for technologies such as machine learning, such as artificial intelligence, where you need large sets of data to find patterns to find insight that can drive those next-gen applications,” he said. “It’s really becoming a competitive advantage in the marketplace.”

As for the $32 million raised in Series B financing, Xu said it will be used to help TigerGraph expand its reach into new markets and accelerate its emphasis on the cloud.


Behind the Design: Surface Headphones

Meet the Surface design team who built our first smart headphones

Vivian Nguyen

It’s 2 PM, and you need to finish a project by end of day. Coworkers in your open office space are chatting about the newest ramen spot. While you’d like to expound on the difference between Hokkaido- versus Tokyo-style noodles, you need to focus. You put on your Surface Headphones, and Cortana greets you with a quick update:

“Hi, you’ve got about 8 hours of battery left. You’re connected to [your device name].”

As your favorite Odesza song plays, you start work immediately.

YOUR WORKPLACE ACCOMPANIMENT

Composed with the design trinity of audio quality, comfort, and seamless integration, the Surface Headphones help you create a personal space in the modern workplace and in your modern life. The idea is that, when you wear them, you escape into another world that lets you focus on you and what you need to get done. The Surface design team wanted to give you — the actual user — control over how plugged in (or not!) you want to be to your immediate environment. Check out the tech specs here.

And you can see that thoughtful approach in the hardware. Designing comfortable earmuffs was paramount because they’re the one part that touches you all the time. The team initially considered a traditional donut shape, but with inclusive design at the heart of everything we do, we wanted to accommodate a diverse set of ear shapes. The earmuffs now fit ears of all shapes and sizes comfortably, with even pressure points for a secure fit.

Tactile design wasn’t the only consideration. They set out to craft a device that’s both functional and beautiful. Creating a smooth seam on the earmuff, for example, was surprisingly difficult. See how the team wouldn’t take no for an answer in the video below:

See how the Surface design team wove together elegant hardware design, rich audio, and an intelligent assistant.

Every decision about the Surface Headphones keeps real people in mind — including the writing they don’t see or touch.

To create a holistic and seamless voice experience, Senior Writer Matt Lichtenberg, who focuses on hardware, and Senior Personality Lead for AI Chris O’Connor, who shapes the voice for intelligent experiences, fused their complementary skills. Because Cortana delivers the instructions, Matt and Chris needed to collaborate and bring together the what (instructions) and the how (Cortana).

“Words contribute to the whole experience,” said Matt, “and we wanted the headphones to be almost invisible to people while they’re wearing them. They shouldn’t have to think about them much.”

“I like to think of it as, we’re helping people achieve more with less energy,” said Chris. “How do they get the most out of this device with the least amount of effort? It’s the idea that design stays out of your way — it’s minimal and there to help you get stuff done.”

THINKING OUT OF THE BOX

From the onset, the design team wanted to understand how people naturally use headphones in a variety of vignettes. They developed a series of scenarios to answer key questions about how people interacted with the headphones.

For instance, when customers initially turn on the headphones, would they want to pair and go? Or would they download the Cortana app first?

As it turns out, most want to pair and go.

When you turn on other Bluetooth devices for the first time, you’ll need to put the device in pairing mode. With the Surface Headphones, they’re immediately in pairing mode and Cortana greets you with, “Hello, you’re ready to pair.”

You connect your device, and Cortana confirms with, “You’re paired to [device name].”

“It’s a challenge to create a rich and enjoyable out-of-the-box experience,” said Chris. “If it’s boring and tedious, people blow right through it. But if it’s enjoyable and people understand the value, they’ll reach an optimal state before carrying on.”

Design is an iterative process, and we’re constantly listening to feedback. We’ve heard customers ask for more device control to turn settings on or off, including the “Hey, Cortana” voice activation, touch controls, and voice prompts. So, we delivered.

The latest firmware update on the Cortana app can help you personalize your headphone settings, like reducing the number or duration of voice prompts. That means you can change your settings so a simple “Hello” plays when you initially turn on your headphones. The app gives you more control of your device, ensuring you get the best experience possible.

“It’s amazing how long it feels to say a few words, so you need to make them count,” said Matt.

Unlike computers, which require constant interaction, the Surface Headphones almost disappear into the background while you work, helping you focus while eliminating outside distractions. To help people achieve this, the voice writing team designed the voice prompts to avoid interruptions unless they’re critical, like letting you know when your battery is low.

“How do you thread the needle between being a voice prompt, a robot, and a conversational entity, but still get out of the way?” asked Chris. “This was one of the first areas where we had to practice design differently and pull back on personality to allow things to be shorter and faster.”

COMMUNICATING WITHOUT WORDS

Some interactions don’t even need words.

When the headphones are charging, for example, the LED light flashes. In this context, a visual cue is more intuitive. You don’t need to pick them up or put them on to know what’s happening.

In times when words feel unnatural, sound itself can communicate information. When you turn the left dial on the Surface Headphones forward, you hear a low-pitched beep to indicate maximum noise cancellation. Conversely, a high-pitched beep plays when you turn the dial in the opposite direction. This confirms the headphones are now amplifying ambient sound.

Inspired by the volume knobs of hi-fi stereos, which turn with a certain slowness, the hardware design team added headset dials to adjust volume, noise cancellation, or sound amplification. Rotating the dial is an intuitive motion that lets people choose the precise level of sound they want (or don’t want).

Our design anticipates different modes of communication contingent on how someone wants to use or interact with the headphones. But whether it’s audio or visual, each interaction remains succinct.

THE NEXT MOVEMENT IN VOICE DESIGN

The Surface Headphones are the first ambient device from Microsoft with an assistant. The Surface design team had a groundbreaking opportunity to radically reimagine headphones as more than just headphones.

In the past, people often confused or conflated digital assistants with voice control. But with increased investments in personality design and the future of interaction, Microsoft is experimenting with giving Cortana added dimension and awareness to help customers get the most out of a digital assistant.

“We decided to use the human metaphor for a digital assistant, because a real-life assistant isn’t just voice control. They don’t just take dictation. They understand what’s important to you, your family, your priorities, your goals,” explained Chris.

As we continue to infuse intelligence across our products and services, teams throughout the company are beginning to explore the potential for what a digital assistant could be.

“The headphones sparked a whole new area of thinking — one that we’re using to think through the same problem from other endpoints as we move on to work for the Office 365 apps,” said Chris.

And who knows? Maybe one day, when you slip on your Surface Headphones, Cortana can chime in with her favorite kind of ramen, too.


For Sale – Apple iMac 27″ 5K Late 2015 (i7, 16GB, SSD)

First time selling here on AvForums, hope everything is laid out as it’s meant to be. So here goes:

Selling my iMac 27 inch 5K Late 2015 model. Reason for sale is that I found work outside the UK and I’d rather sell this than go through the hassle of transporting it. If it doesn’t sell by the 15th of September I’ll archive this thread and make arrangements to take it with me. Price is non negotiable, I’ve priced this so it sells fast. If it doesn’t, like I said I’ll pack it up and take it with me. Overall the iMac is in mint condition. There is one hairline scratch on the bottom of the frame which shows only when light is directed towards that area. Apart from that, I can’t see any scratches on the aluminium frame, but there might be hairline scratches on the base of it and some very small ones here and there; I just checked and can’t see anything else.

Specs:

27 inch 5K Retina screen – No faults, no dead pixels, no scratches.
CPU: i7-6700K at 4GHz.
Memory: 16GB (2x8GB) DDR3L 1866MHz
HD: Apple proprietary 512GB SSD (Apple SMO512G), faster than your average SSD, google it.
GPU (Graphics card): AMD Radeon R9 M395X with 4GB of GDDR5 memory.
Comes with Apple wireless keyboard and Apple trackpad.


Any questions please do ask.

Price and currency: £1000
Delivery: Goods must be exchanged in person
Payment method: Bank Transfer
Location: Milton Keynes
Advertised elsewhere?: Advertised elsewhere
Prefer goods collected?: I prefer the goods to be collected



Clumio backup seeks to simplify with SaaS

As a new vendor’s first customer, the IT leader of a city wouldn’t be faulted for worrying about the product.

But Cory Smith, CIO and CTO of Davenport, Iowa, said he didn’t have concerns with using Clumio for backup and recovery. Clumio, which is based in Santa Clara, Calif., came out of stealth Aug. 13 with its cloud-based backup as a service.

Smith said the city was looking for a new backup product earlier this year when Clumio contacted him about trying out a beta version. He said he felt more at ease with the product after using it in beta and performing backups and restores. The city purchased Clumio as soon as it became generally available April 30.

Though Davenport doesn’t have a major cloud initiative, Smith said going cloud-only for backup is a goal.

“This is one of those situations where the cloud is really good for us,” Smith said.

Striving for simplicity

Clumio CEO and co-founder Poojan Kumar aims for his company to do with backup what Salesforce has done with CRM and ServiceNow with service management. He wants to deliver a true service offering that’s completely run and managed in the cloud by Clumio.

“SaaS is taking over,” Kumar said. “Our founding vision was really around going and building a data management platform on top of the public cloud.”

Kumar said he wants customers to get away from the complex nature of installing software and hardware for backup. In addition, as workloads are moving to the cloud, the practice of using multiple accounts, regions and clouds is increasing complexity.

“We saw all of this as an opportunity for simplification,” Kumar said.

To start, Clumio protects on-premises and cloud-based VMware environments on top of AWS. It also provides native AWS backup for accounts that run Elastic Compute Cloud and Elastic Block Store.

The majority of backup vendors were “born in the world of on premises,” delivering protection through software, hardware or both, which the customer has to manage, Kumar said. He said legacy backup players cannot take advantage of the public cloud “the right way” by building a cloud-native architecture and true SaaS platform.

“By SaaS, I mean a true service that is multi-tenant that frees the customer from the mundane of managing these solutions,” Kumar said.

Andrew Smith, research manager at IDC, noted that Clumio customers don’t need to use anything on premises. They can simply spin up the virtual appliance and start using Clumio. The vendor says it takes 15 minutes to get the product running.

“The way they’re approaching backup as a service as an all-inclusive platform is unique,” Smith said. “The idea is to ‘SaaSify’ the entire backup environment.”

Davenport’s Smith said even with his larger environment — about 70 VMs and 40 TB worth — getting to the cloud was not an issue.

The city, with a population of about 100,000, has to retain some data indefinitely. For example, police video — a data set that’s often large — could be critical in court 10 years from now.

“The city’s not going to go out of business,” he said. “I’ve got to keep it.”

Smith said its price is an advantage. Because Clumio charges per virtual machine rather than by the size of the VM, the cost does not rise as a VM grows larger.

Clumio, a backup-as-a-service vendor that came out of stealth Aug. 13, charges per VM.

A look at current and future features

Smith said Davenport was looking for a new backup system because its Rubrik platform wasn’t performing well enough, especially with getting data sets to the cloud. The city wanted to get away from running hardware on premises and using traditional disaster recovery, and sought a cheap cloud service.

“Clumio has kind of hit that niche,” Smith said.

He acknowledged that the product is not yet mature and Clumio is still adding functionality. He said he’s looking for the vendor to add Microsoft Exchange and SQL support. Davenport still uses old Veeam licensing for Exchange and SQL Server, but Smith said he thinks eventually the city will only use Clumio for backup. He said he finds the interface and search easy to use.

Security-wise, Clumio’s backups are encrypted in transit and at rest. Kumar said its immutability is especially important in the face of data protection threats like ransomware.

“You know that you can go back to the copy [and] it’s going to be kosher,” Kumar said. “We do a whole bunch of things automatically in the platform to make sure that it is restorable to the previous copy. It’s not just about backing it up — it’s about making sure it is restorable.”

Kumar said he expects Clumio will delve into machine learning to help look at potential issues with customer data.

Funding, founders, fighting status quo

Clumio has $51 million in funding over two rounds. Sutter Hill Ventures led the Series A round. Index Ventures drove the Series B round, which also had significant participation from Sutter Hill Ventures.

The company was founded in 2017. Kaustubh Patil, vice president of engineering, and CTO Woon Ho Jung were the other founders with Kumar. All three founders previously worked at VMware, Nutanix and PernixData. Kumar was a founder of PernixData, which was acquired by Nutanix.

Clumio has about 75 employees, Kumar said.

The product is sold exclusively through the channel.

IDC’s Smith said competition will include Veeam, Zerto, Rubrik and Cohesity, as well as the more traditional backup vendors such as Veritas, Dell EMC and Commvault. Druva and Carbonite are also leaders in cloud-based backup.

“They’re going to have to compete with everybody,” Smith said. “It’s going to be pretty difficult.”

It will be important for Clumio to attract customers moving all data to the cloud, Smith said, as well as users tackling multi-cloud environments and their increased complexity.

Kumar said his biggest competition is the status quo.

“It’s going to be about educating the market that something like this is possible,” Kumar said. “And we can give you freedom from the mundane.”
