Tag Archives: support

Google Cloud support premium tier woos enterprise customers

Google Cloud has introduced a Premium Support option designed to appeal to large enterprises through features such as 15-minute response times for critical issues.

Premium Support customers will be serviced by “context-aware experts who understand your unique application stack, architecture and implementation details,” said Atul Nanda, vice president of cloud support.

These experts will coordinate with a customer’s assigned technical account manager to resolve issues faster and in a more personalized manner, Nanda said in a blog post.

Google wanted to expand its support offerings beyond what basic plans for Google Cloud and G Suite include, according to Nanda. Other Premium Support features include operational health reviews, training, preview access to new products and more help with third-party technologies.

In contrast, Google’s other support options include a free tier that covers only billing issues; Development, which costs $100 per user per month and comes with a four-hour response time; and Production, which costs $250 per user per month and has a one-hour response time.

Premium Support carries a base annual fee of $150,000 plus 4% of the customer’s net spending on Google Cloud Platform and/or G Suite. Google is also working on add-on services for Premium Support, such as expanded technical account manager coverage and mission-critical support, which involves a site reliability engineering consulting engagement. The latter is now in pilot.
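To put those price points side by side, here is a minimal back-of-the-envelope sketch based on the figures above; the annual cloud spend and seat count are hypothetical assumptions, not Google's numbers.

```python
def premium_support_cost(annual_cloud_spend: float) -> float:
    """Premium Support: $150,000 base fee plus 4% of net GCP/G Suite spend."""
    return 150_000 + 0.04 * annual_cloud_spend

def production_support_cost(users: int) -> float:
    """Production tier: $250 per user per month."""
    return 250 * 12 * users

# Hypothetical example: $2 million annual cloud spend, 60 supported users.
print(premium_support_cost(2_000_000))  # 230000.0
print(production_support_cost(60))      # 180000
```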

Cloud changes the support equation

Customers with on-premises software licenses are used to paying stiff annual maintenance fees, which give them updates, bug fixes and technical support. On-premises maintenance fees can generate profit margins for vendors north of 90%, consuming billions of IT budget dollars that could have been spent on better things, said Duncan Jones, an analyst at Forrester.

“But customers of premium support offerings such as Microsoft Unified (fka Premier) Support and SAP MaxAttention express much higher satisfaction levels with value for money,” Jones said via email. “They are usually an alternative to similar services that the vendor’s SI and channel partners offer, so there is competition that drives up standards. Plus, they are optional extras so price/demand sensitivity keeps pricing at reasonable levels.” On the whole, Google’s move to add Premium Support is positive for customers, according to Jones.

But it’s clear why Google did it from a business perspective, said Grant Kirkwood, CTO of Unitas Global, a hybrid cloud services provider in Los Angeles. “Google is recognizing they need to move up the stack in terms of support to make further inroads into the enterprise space,” he said.

Microsoft today probably has the most robust support in terms of a traditional enterprise look-and-feel, while AWS’ approach is geared a bit more toward DevOps-centric shops, Kirkwood added.

“[Google is] taking a bit out of both playbooks,” he said. Premium Support could appeal to enterprises that have already done easier lift-and-shift projects to the cloud and are now rebuilding or creating new cloud-native applications, according to Kirkwood.

But as with anything, Google will have to prove its Premium Support option is worth the extra money.

“Successful [support] plans require great customer success management, highly trained technical account managers and AI-driven case management,” said Ray Wang, founder and CEO of Constellation Research.

Battle lines over Windows Server 2008 migration drawn

With technical support for Windows Server 2008 ending this week, the battle between Microsoft and AWS for the hearts and wallets of those corporate users is underway.

At its re:Invent conference last month, AWS introduced its appropriately named AWS End-of-Support Migration Program (EMP) for Windows Server, aimed at helping users with their Windows Server 2008 migration efforts. The program promises to make it easier to shift users’ existing Windows Server 2008 workloads over to newer versions of Windows running on servers in AWS’ data centers. The EMP technology decouples the applications from the underlying operating system, thereby allowing AWS partners to migrate mission-critical applications over to the newer versions of Windows Server.

The technology reportedly identifies whatever dependencies the application has on Windows Server 2008 and then pulls together the resources needed for the application to run on the updated version of Windows Server. The package of software includes all application files, runtimes, components and deployment tools, along with an engine that redirects API calls from your application to files within the package, the company said.

Punching back in a blog post this week, Vijay Kumar, director of Windows Server and Azure products at Microsoft, stressed the advantages of his company’s products for users undergoing Windows Server 2008 migration efforts. Users can deploy Windows Server workloads in Azure in a number of ways, he wrote, including the company’s Virtual Machines on Azure, Azure VMware Solutions and Azure Dedicated Host. Users can also apply the Azure Hybrid Benefit to use their existing Windows Server licenses in Azure.

Kumar also noted that users can take advantage of Microsoft’s Extended Security Updates program specifically aimed at Windows Server 2008/R2 users, which provides an additional three years of security updates. This can buy users more time to plan their transition paths for core applications and services, he wrote.

The battle to own Windows Server 2008 migration

AWS has long targeted Windows Server users and, in fact, has convinced more than a few IT shops to switch over to its EC2 cloud environment. It stepped up those efforts with the introduction of its AWS-Microsoft Workload Competency program for partners last fall, according to one analyst.

“[AWS] had as many as 14,000 Windows Server customers running on EC2 as of July 2019,” said Meaghan McGrath, a senior analyst at Technology Business Review. “That number is a fivefold increase over 2015.”

Microsoft has stemmed some of the bleeding, however, McGrath added. For instance, the company has convinced many of its partners to push its free migration assessment program, which gives users a more precise estimate of the total cost of ownership of keeping their SQL Server workloads in Microsoft environments compared with migrating them to AWS’ EC2. The company is also applying some financial pressure.

“As of last fall, there is a caveat in the Software Assurance contracts among [SQL Server] users that made it much more expensive for them to bring their licenses over to another vendor’s hosted environment,” McGrath said. “The other financial incentive is [Microsoft’s] Azure Hybrid Benefit program, which offers users a discount on Azure services for migrating their workloads from licensed software.”

32-bit apps snagging Windows Server 2008 migration efforts

Last summer, Microsoft officials said the operating system still represents 60% of the company’s overall server installed base — a number that’s likely so large because it’s the last 32-bit version of Windows Server. Many corporate users developed customized applications for the platform, which can be expensive and time-consuming to migrate to 64-bit platforms. Users can also have difficulty migrating a 32-bit app purchased from a third-party developer to a 64-bit environment, typically because that developer has discontinued support for the offering.

“When you are dealing with a [Windows Server] 2008 app, you can’t assume there will be a 64-bit version of that app available,” said Paul Delory, a research director at Gartner. “Users have to coordinate with all their vendors from whom they bought commercial software to know if they are supporting their app on the new OS. If not, you have to factor in the associated costs there.”

Still, the cost of adapting an existing 32-bit app on Windows Server 2008 is not nearly as high as the cost of remaining on the existing versions of the operating system and associated applications. With the product going out of technical support this week, users will have to pay for Microsoft’s Extended Support, which could double the cost of the technical support they were getting under their initial services agreement.

“You can go to extended support, which gets you three years’ worth of updates, but that requires you to have Software Assurance,” Delory said. “Extended support costs you 75% of your annual licensing costs, and SA [Software Assurance] is an additional 25%, making it twice as much.”
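To make Delory's percentages concrete, the sketch below applies them to a hypothetical annual licensing cost; the $100,000 figure is purely illustrative.

```python
annual_license_cost = 100_000  # hypothetical annual licensing cost, for illustration only

extended_support = 0.75 * annual_license_cost    # extended support: 75% of annual licensing
software_assurance = 0.25 * annual_license_cost  # Software Assurance: an additional 25%

total = extended_support + software_assurance
print(extended_support, software_assurance, total)  # 75000.0 25000.0 100000.0
```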

He said a practical and less expensive solution for users facing this situation is to consider gravitating to a SaaS-based offering such as Office 365 or a similar offering with the same capabilities.

“Something like [Office 365] will be the path of least resistance for many companies because it offers them the chance to sidestep some of these problems,” Delory said. “You can make these problems someone else’s in exchange for a reasonable monthly fee.”

Other options for users leaning away from a Windows Server 2008 migration are much less attractive. They can leave the server in place and mitigate the vulnerabilities as best they can, Delory said, or tuck it behind a firewall and whitelist only certain IP addresses or leave certain ports open.

“You can bring in an Intrusion Prevention System to detect vulnerabilities, but that system must have an understanding of Windows Server 2008 vulnerabilities and be able to maintain them across all your applications,” Delory said.

With support for Windows 7 ending, a look back at the OS

With Microsoft’s support for Windows 7 ending this week, tech experts and IT professionals remembered the venerable operating system as a reliable and trustworthy solution for its time.

The OS was launched in 2009, and its official end of life came Tuesday, Jan. 14.

Industry observers marked the end of Windows 7 by remembering the good and the bad of an OS that held its ground during the explosive rise of mobile devices and the growing popularity of web applications.

An old reliable

Stephen Kleynhans, research vice president at Gartner, said Windows 7 was a significant step forward from Windows XP, the system that had previously gained dominance in the enterprise.

“Windows 7 kind of defined computing for most enterprises over the last decade,” he said. “You could argue it was the first version of Windows designed with some level of true security in mind.”

Windows 7 introduced several new security features, including enhanced Encrypting File System protection, increased control over administrator privileges and support for multiple firewall policies on a single system.

The OS, according to Kleynhans, also provided a comfortable familiarity for PC users.

“It was a really solid platform that businesses could build on,” he said. “It was a good, solid, reliable OS that wasn’t too flashy, but supported the hardware on the market.”

“It didn’t put much strain on its users,” he added. “It fit in with what they knew.”

Eric Klein, analyst at VDC Research Group Inc., said the launch of Windows 7 was a positive move from Microsoft following the “debacle” that was Windows Vista — the immediate predecessor of Windows 7, released in 2007.

“Vista was a very big black eye for Microsoft,” he said. “Windows 7 was more well-refined and very stable.”

The fact that Windows 7 could be more easily administered than previous iterations of the OS, Klein said, was another factor in its enterprise adoption.

“So many businesses, small businesses included, really were all-in for Windows 7,” he said. “It was reliable and securable.”

Windows 7’s longevity, Klein said, was also due to slower hardware refresh rates, as companies often adopt new OSes when buying new computers. With web applications, there is less of a need for individual desktops to have high-end horsepower — meaning users can get by with older machines for longer.

“Ultimately, it was a well-tuned OS,” said Mark Bowker, senior analyst at Enterprise Strategy Group. “It worked, so it became the old reliable for a lot of organizations. Therefore, it remains on a lot of organizations’ computers, even at its end of life.”

Even Microsoft saw the value many enterprises placed in Windows 7 and responded by continuing support, provided customers pay for the service, according to Bowker. The company is allowing customers to pay for extended support for a maximum of three years past the January 14 end of life.

Early struggles for Windows 7

Kleynhans said, although the OS is remembered fondly, the switch from Windows XP was far from a seamless one.

“What people tend to forget about the transition from XP to 7 was that it was actually pretty painful,” he said. “I think a lot of people gloss over the fact that the early days with Windows 7 were kind of rough.”

The biggest issue with that transition was with compatibility, Kleynhans said.

“At the time, a lot of applications that ran on XP and were developed on XP were not developed with a secure environment in mind,” he said. “When they were dropped into Windows 7, with its tighter security, a lot of them stopped working.”

Daniel Beato, director of technology at IT consulting firm TNTMAX, recalled some grumbling about a hard transition from Windows XP.

“At first, like with Windows 10, everyone was complaining,” he said. “As it matured, it became something [enterprises] relied on.”

A worthy successor?

Windows 7 is survived by Windows 10, an OS that experts said is in a better position to deal with modern computing.

“Windows 7 has fallen behind,” Kleynhans said. “It’s a great legacy system, but it’s not really what we want for the 2020s.”

Companies, said Bowker, may be hesitant to upgrade OSes, given the complications of the change. Still, he said, Windows 10 features make the switch more alluring for IT admins.

“Windows 10, especially with Office 365, starts to provide a lot of analytics back to IT. That data can be used to see how efficiently [an organization] is working,” he said. “[Windows 10] really opens eyes with the way you can secure a desktop… the way you can authenticate users. These things become attractive [and prompt a switch].”

Klein said news this week of a serious security vulnerability in Windows underscored the importance of regular support.

“[The vulnerability] speaks to the point that users cannot feel at ease, regardless of the fact that, in 2020, Windows is a very, very enterprise-worthy and robust operating system that is very secure,” he said. “Unfortunately, these things pop up over time.”

The news, Klein said, only underlines the fact that, while some companies may wish to remain with Windows 7, there is a large community of hackers who are aware of these vulnerabilities — and aware that the company is ending support for the OS.

Beato said he still had customers working on Windows 7, but most people with whom he worked had made the switch to Windows 10. Microsoft, he said, had learned from Windows XP and provided a solid pathway to upgrade from Windows 7 to Windows 10.

The future of Windows

Klein noted that news about the next version of Windows would likely be coming soon. He wondered whether the trend toward keeping the smallest amount of data possible on local PCs would affect its design.

“Personally, I’ve found Microsoft to be the most interesting [of the OS vendors] to watch,” he said, calling attention to the company’s willingness to take risks and innovate, as compared to Google and Apple. “They’ve clearly turned the page from the [former Microsoft CEO Steve] Ballmer era.”

For Sale – 27” mid-2011 iMac i7 / 16GB for spares or repair

I was using it, then it stopped. I spoke to Apple Support and they booked me into the Cambridge Store Genius Bar. They ran a diagnostic and it passed all their tests, and he suspected it was a hard drive failure. It is classed as vintage and he said Apple would not repair it. I have already replaced it with a new one, so I want this one gone. The guy at Apple removed the hard drive for me, and that is not included. As for condition, I can see no marks, but he warned me that there may now be dust between the glass and the screen. Can take pictures if needed. It is boxed and comes with mouse only.

What should CIOs do with SAP ECC support ending in 2025?

SAP has announced the end of SAP ECC support in 2025, and that means big changes for most SAP users.

Companies using SAP ERP Central Component are faced with some major decisions. The most obvious is whether to stay on ECC or migrate their systems to S/4HANA. This is not an easy decision to make, as each option has its own set of pros and cons. No matter which choice a company makes, it will face business consequences and must prepare accordingly.

From the vendor’s perspective, it makes sense for support staff and developers to focus on the new product. As part of this, most software vendors push their clients to adopt the latest platform, partly by imposing an end-of-support deadline. And this strategy has had some success: most clients don’t want to be left with an unsupported system that might cause work delays. But moving to a new product can also be problematic.

For an SAP ECC customer, moving to S/4HANA comes with its own set of challenges and risks. Implementing the latest SAP platform does not always equate to better and faster systems, as seen in Revlon’s disastrous SAP S/4HANA implementation. Revlon experienced shipping delays and revenue losses as a result of system, operational and implementation challenges. It was also sued by shareholders.

Such failures can’t always be blamed only on the new software. Other factors that can contribute to ERP implementation failure — whether a new SAP system or another vendor’s system — include lack of operational maturity, poor leadership, lack of experienced resources and cultural challenges. These can turn a potentially successful ERP implementation into a complete disaster.

The end of SAP ECC support must be weighed against the risks of moving to S/4HANA. Companies should consider the following activities to prepare for the upcoming deadline:

  • Talk to others in the same vertical about their experience with S/4HANA.
  • Determine the costs and changes associated with a migration.
  • Evaluate the latest version of S/4HANA.
  • Identify which vendors might potentially continue to provide third-party ECC support after SAP stops it.
  • Determine any compliance concerns that could arise from not receiving updates on ECC software.
  • Reach out to other companies within the SAP user groups and discuss what some of their plans are.
  • Determine a plan for necessary patching and bug fixes.

HYCU backup for Google Cloud adds SAP HANA support

HYCU enhanced its Google Cloud Platform backup with SAP HANA support, offering it as a managed service that eases the burden on IT.

The HYCU Backup as a Service for Google Cloud is purpose-built for GCP, similar to how HYCU’s first major product was purpose-built for Nutanix data protection. It’s fully integrated into Google Cloud Identity & Access Management.

“It was built with the Google administrator in mind,” so there’s no extra training needed, said Subbiah Sundaram, vice president of products at HYCU.

Offering it as a service is critical to protecting cloud workloads natively, according to Enterprise Strategy Group senior analyst Christophe Bertrand. The firm’s research shows that IT professionals want similar features in cloud-native data protection as in their on-premises environments, but there are gaps.

“Among the key areas are enterprise-class scalability, which HYCU is addressing in this release with enhancements to cloud-native incrementals, scalability, mission-critical application support with SAP HANA and performance optimizations,” Bertrand wrote in an email. “Cloud is about scale, and this means that data protection mechanisms have to adapt.”

Protection for a ‘mission-critical application’

HYCU backup for GCP is supporting SAP HANA for the first time with this release. The support requires a special understanding of the infrastructure being protected and a mechanism to coordinate with SAP HANA to get a consistent copy, according to Sundaram.

The HYCU Backup as a Service uses Google snapshots for database-consistent, impact-free backup and recovery. It includes support for single file recovery.

The use of native storage snapshots distinguishes HYCU’s approach, according to Bertrand.

“I expect that we will see a number of HYCU customers scale their environments in time,” Bertrand wrote. “SAP HANA is a mission-critical application in many enterprises, and in combination with GCP, offers a lot of promise for scaling deployments up and out, and the ability to do analytics for business uses beyond just backup or BC/DR.”

Sundaram said Google sellers and partners asked for the SAP HANA support — they want more customers adding SAP HANA on GCP. SAP HANA, an in-memory database for processing high volumes of data in real time, is popular with large retailers.

[Screenshot: HYCU backup for Google Cloud Platform. HYCU backups use changed block tracking to enable optimized bucket storage use.]

Dive deeper into HYCU’s strategy

HYCU’s GCP backup product originally launched in July 2018. Because it is a service, HYCU takes care of the installation, management and upgrades. HYCU claims one-click backups.

Users back up to Google Cloud Storage buckets. Starting with this update, HYCU backup uses changed block tracking to enable optimized bucket storage consumption.
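Changed block tracking is a general technique: only the blocks that differ from the previous backup are sent to object storage. The sketch below is a conceptual illustration of that idea, not HYCU's implementation; the block size and hashing scheme are assumptions made for the example.

```python
import hashlib

BLOCK_SIZE = 4 * 1024 * 1024  # assumed 4 MiB blocks, purely illustrative

def block_hashes(path: str) -> list[str]:
    """Hash each fixed-size block of a file or disk image."""
    hashes = []
    with open(path, "rb") as f:
        while chunk := f.read(BLOCK_SIZE):
            hashes.append(hashlib.sha256(chunk).hexdigest())
    return hashes

def changed_blocks(current: list[str], previous: list[str]) -> list[int]:
    """Return the indexes of blocks that changed since the last backup.
    Only these blocks would need to be uploaded to the storage bucket."""
    return [i for i, h in enumerate(current)
            if i >= len(previous) or previous[i] != h]
```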

HYCU can keep costs down because the customer doesn’t pay for compute, Sundaram said. The product’s incremental backups and auto-tiering also save money.

The product does not require caching storage, according to HYCU, which means cheaper data transfer for backup and better use of cloud storage.

HYCU, which is based in Boston, Mass., has built its strategy on offering specialized services that go deep in specific environments, according to Bertrand.

“It gives them this best of breed advantage over generalists, and our research shows that IT professionals have no problem using the best cloud backup solutions for the job at hand — meaning using a new solution or an additional vendor,” Bertrand wrote. “I believe that they are well-positioned to deliver additional services beyond backup and BC/DR, such as intelligent data management based on data reuse.”

HYCU Backup as a Service for Google Cloud is available on the GCP Marketplace and through authorized partners. Cost depends on the amount of data under protection and frequency of backup.

HYCU backup was automatically updated for current customers in October.

In the coming weeks, HYCU expects to launch its Protégé product for multi-cloud disaster recovery and migration. It’s also planning a major update in early 2020 that will add another supported cloud platform.

Threat Stack Application Security Monitoring adds Python support

Threat Stack has announced Python support for its Threat Stack Application Security Monitoring product. The update comes with no additional cost as part of the Threat Stack Cloud Security Platform.

With Python support for Application Security Monitoring, Threat Stack customers who use Python with Django and Flask frameworks can ensure security in the software development lifecycle with risk identification of both third-party and native code, according to Tim Buntel, vice president of application security products at Threat Stack.

In addition, the platform also provides built-in capabilities to help developers learn secure coding practices and real-time attack blocking, according to the company.

“Today’s cloud-native applications are comprised of disparate components, including containers, virtual machines and scripts, including those written in Python, that serve as the connective tissue between these elements,” said Doug Cahill, senior analyst and group practice director for cybersecurity at Enterprise Strategy Group. Hence, the lack of support for any one layer of a stack means a lack of visibility and a vulnerability an attacker could exploit.

Application Security Monitoring is a recent addition to the Threat Stack Cloud Security Platform. Introduced last June, the platform is aimed at bringing visibility and protection to cloud-based architectures and applications. Threat Stack Cloud Security Platform touts the ability to identify and block attacks such as cross-site scripting (XSS) and SQL injection by putting the application in context with the rest of the stack. It also allows users to move with one click from the application to the container or host where it is deployed when an attack happens, according to the company.

“[Application Security Monitoring] … provides customers with full stack security observability by correlating security telemetry from the cloud management console, host, containers and applications in a single, unified platform,” Buntel said.

To achieve full stack security and insights from the cloud management console, host, containers, orchestration and applications, customers can combine Threat Stack Application Security Monitoring with the rest of the Threat Stack Cloud Security Platform, according to the company.

Cahill said customers should look for coverage of the technology stack as well as the lifecycle when looking to secure cloud-native applications, because such full stack and lifecycle support allows for threat detection and prevention capabilities “from the code level down to the virtual machine or container to be implemented in both pre-deployment stages and runtime.”

“Cloud security platforms, which integrate runtime application self-protection functionality with cloud workload protection platforms to provide full-stack and full lifecycle visibility and control, are just now being offered by a handful of cybersecurity vendors, including Threat Stack,” he added.

Threat Stack Application Security Monitoring for Python is available as of Wednesday.

Threat Stack competitors include CloudPassage, Dome9 and Sophos. CloudPassage Halo is a security automation platform delivering visibility, protection and compliance monitoring for cybersecurity risks; the platform also covers risks in Amazon Web Services and Azure deployments, according to the company. CloudGuard Dome9 is a software platform for public cloud security and compliance orchestration; the platform helps customers assess their security posture, detect misconfigurations and enforce security best practices to prevent data loss, according to the company. Sophos Intercept X enables organizations to detect blended threats that merge automation and human hacking skills, according to the company.

Google releases TensorFlow Enterprise for enterprise users

Google on Wednesday launched TensorFlow Enterprise, which promises long-term support for previous versions of TensorFlow on its Google Cloud Platform.

The new product, which also bundles together some existing Google Cloud products for training and deploying AI models, is intended to aid organizations running older versions of TensorFlow.

The product is also designed to help “customers who are working with previous versions of TensorFlow and also those where AI is their business,” said Craig Wiley, director of product management for Google Cloud’s AI Platform.

Open sourced by Google in 2015, TensorFlow is a machine learning (ML) and deep learning framework widely used in the AI industry. TensorFlow Enterprise, available on the Google Cloud Platform (GCP), provides security patches and select bug fixes for certain older versions of TensorFlow for up to three years.

Also, organizations using TensorFlow Enterprise will have access to “engineer-to-engineer assistance from both Google Cloud and TensorFlow teams at Google,” according to an Oct. 30 Google blog post introducing the product.

“Data scientists voraciously download the latest version of TensorFlow because of the steady pace of new, valuable features. They always want to use the latest and greatest,” Forrester Research analyst Mike Gualtieri said.

Yet, he continued, “new versions don’t always work as expected,” so the “dive-right-in” approach of data scientists is often in conflict with an enterprise’s standards.

“That’s why Google’s TensorFlow Enterprise support of prior versions back to three years will accelerate enterprise adoption,” Gualtieri said. “Data scientists and ML engineers can experiment with the latest and greatest, while enterprise operations professionals can insist that versions that work will continue to be available.”

TensorFlow Enterprise comes bundled with Google Cloud’s Deep Learning VMs, which are preconfigured virtual machine environments for deep learning, as well as the beta version of Google Cloud’s Deep Learning Containers.

To be considered for the initial rollout of TensorFlow Enterprise, however, organizations must have spent, or commit to spending, $500,000 annually on Google Cloud’s Deep Learning VMs, Deep Learning Containers, or AI Platform Training and Prediction products, or some combination of those systems.

Over the past several months, Google has made progress in a campaign to offer more tools on its Google Cloud Platform to train, test, and deploy AI models. In April 2019, the tech giant unveiled the Google Cloud AI Platform, a unified AI development platform that combined a mix of new and rebranded AI development products. At the time, analysts saw the release as a move to attract more enterprise-level customers to Google Cloud.

Gears 5 breaks records as biggest launch for any Xbox Game Studios game this generation – Xbox Wire

Thanks to the incredible support from our fans, Gears 5 kicked off the Holiday season strong – attracting over three million players in its opening weekend and setting new records for Xbox Game Pass with the biggest launch week of any Xbox Game Studios title this generation. The performance easily doubled the first week’s debut of Gears of War 4 and made Gears 5 the most-played Xbox Game Studios title in its first week since 2012’s Halo 4. First week performance includes the four-day early access period beginning Friday, September 6, which was exclusive to Gears 5 Ultimate Edition and Xbox Game Pass Ultimate members. On the PC, Gears 5 has nearly tripled the performance of its predecessor, becoming the biggest-ever launch for Xbox Game Pass for PC, and Xbox Game Studios’ best-ever debut on Steam.

On Tuesday, Gears 5 expanded into worldwide general availability with the launch of Gears 5 Standard Edition at retail, as well as new Xbox One X and Xbox One S console bundles. For PC fans, Gears 5 is included in a three-month Xbox Game Pass for PC membership bundled with the purchase of select AMD Ryzen™ processors and AMD Radeon™ graphics cards.

Described as “one of the best and most versatile action-game packages in recent memory” by IGN, the game is a hit with critics and fans alike, and enjoys a Metacritic user score of 8.7, the sixth best of any Xbox One game. The excitement extends to the PC version as well, with Digital Foundry declaring, “it fills every desire that I want for the PC version of a game.”

Gears 5 has also received special attention for its commitment to representation and accessibility. Accessibility features include full controller remapping, single stick movement, Xbox Adaptive Controller support, narrated UI and menus, improved subtitles, and more. It was the first AAA game to receive a perfect score from CanIPlayThat.com, “A Deaf/Hoh accessibility masterpiece.”

For more information on the Gears of War franchise, stay tuned to Xbox Wire or follow Gears on Twitter @gearsofwar.

The roots of Oracle’s cloud evolution: A 20-year review

Oracle’s strategy going into 2020 is to support users wherever they are, while not-so-subtly urging them to move onto Oracle cloud services – particularly databases.

In fact, some say it’s Oracle’s legacy as a database vendor that may be the key to the company’s long-term success as a major cloud player.

To reconcile the Oracle cloud persona of today with the identity of database giant that the company still holds, it helps to look back at key milestones in Oracle’s history over the past 20 years, beginning with Oracle database releases at the turn of the century.

Oracle releases Database 8i, 9i

Two major versions of Oracle’s database arrived in 1998 and 2001. Oracle Database 8i was the first written with a heavy emphasis on web applications — the “i” stood for Internet.

Then Oracle 9i introduced the Real Application Clusters (RAC) feature for high-availability scenarios. RAC is a widely popular and lucrative database option for Oracle, and one it has held close to this day: RAC is currently supported and certified only for use on Oracle’s own cloud service.

With the 9i update, Oracle made a concerted effort to improve the database’s administrative tooling, said Curt Monash, founder of Monash Research in Acton, Mass.

“This was largely in reaction to growing competition from Microsoft, which used its consumer software UI expertise to have true ease-of-administration advantages versus Oracle,” Monash said. “Oracle narrowed the gap impressively quickly.”

Timeline: These 10 milestones marked Oracle’s path to modern cloud computing

Oracle acquires PeopleSoft and Siebel

Silicon Valley is littered with the bones of once-prominent application software vendors that either shut down or got swallowed up by larger competitors. To that end, Oracle’s acquisitions of PeopleSoft and Siebel still resonate today.

In 2003, the company launched what many considered to be a hostile takeover of PeopleSoft, then the second-largest software vendor after SAP. It ultimately succeeded with a $10.3 billion bid the following year. Soon after the deal closed, Oracle laid off more than half of PeopleSoft’s employees in a widely decried act.

Oracle also gained J.D. Edwards, known for its manufacturing ERP software, through the PeopleSoft purchase.

The PeopleSoft deal, along with Oracle’s $5.8 billion acquisition of Siebel in 2005, reinvented the company as a big player in enterprise applications and set up the path toward Fusion.

Oracle realized that to catch up to SAP in applications, it needed acquisitions, said Holger Mueller, an analyst with Constellation Research in Cupertino, Calif., who worked in business and product development roles at Oracle during much of the 2000s.

“To cement ownership within complex CRM, they needed Siebel,” Mueller said. Those Siebel customers largely remain in the fold today, he added. While rival HCM software vendor Workday has managed to poach some of Oracle’s PeopleSoft customers, Salesforce hasn’t had the same luck converting Siebel users over to its CRM, according to Mueller.

Oracle’s application deals were as much about acquiring customers as they were about technology, if not more, said Frank Scavo, president of IT consulting firm Strativa in Irvine, Calif.

“Oracle had a knack for buying vendors when they were at or just past their peak,” he said. “PeopleSoft was an example of that.”

The PeopleSoft and Siebel deals also gave Oracle the foundation, along with its homegrown E-Business Suite, for a new generation of applications in the cloud era.

Oracle’s Fusion Applications saga

Oracle first invoked the word “Fusion” in 2005, under the promise it would deliver an integrated applications suite that comprised a superset of functionality from its E-Business Suite, PeopleSoft and Siebel software, with both cloud and on-premises deployment options.

The company also pledged that Fusion apps would deliver a consumer-grade user experience and business intelligence embedded throughout processes.

Fusion Applications were supposed to become generally available in 2008, but Oracle didn’t deliver general availability to all customers until 2011.

It’s been suggested that Oracle wanted to take its time and had the luxury of doing so, since its installed base was still weathering a recession and had little appetite for a major application migration, no matter how useful the new software was.

Fusion Applications’ sheer scope was another factor. “It takes a long time to build software from scratch, especially if you have to replace things that were strong category leaders,” Mueller said.

Oracle’s main shortcoming with Fusion Applications was its inability to sell very much of them early on, Mueller added.

Oracle acquires Hyperion and BEA

After its applications shopping spree, Oracle eyed other areas of software. First, it bought enterprise performance management vendor Hyperion in 2007 for $3.3 billion to bolster its financials and BI business.

“Hyperion was a smart acquisition to get customers,” Mueller said. “It helped Oracle sell financials. But it didn’t help them in the move to cloud.”

In contrast, BEA and its well-respected application server did. The $8.5 billion deal also gave Oracle access to a large customer base and many developers, Mueller added.

BEA’s products also gave a boost to Oracle’s existing Fusion Middleware portfolio, said John Rymer, an analyst at Forrester. “At the time, Oracle’s big competitor in middleware was IBM,” he said. “[Oracle] didn’t have credibility.”

Oracle’s hardware play

Oracle made a major strategic shift in 2008 with the introduction of Exadata, its first foray into hardware.

Exadata packs servers, networking and storage, along with Oracle database and other software, into preconfigured racks. Oracle also created storage processing software for the machines, which its marketing arm initially dubbed “engineered systems.”

With the move, Oracle sought to take a bigger hold in the data warehousing market against the likes of Teradata and Netezza, which was subsequently acquired by IBM.

Exadata was a huge move for Oracle, Monash said.

“They really did architect hardware around software requirements,” he said. “And they attempted to change their business relationship with customers accordingly. … For context, recall that one of Oracle’s top features in its hypergrowth years in the 1980s was hardware portability.”

In fact, it would have been disastrous if Oracle didn’t come up with Exadata, according to Monash.

“Oracle was being pummeled by independent analytics DBMS vendors, appliance-based or others,” he said. “The competition was more cost-effective, naturally, but Exadata was good enough to stem much of the bleeding.”

Exadata and its relatives are foundational to Oracle’s IaaS, and the company also offers the systems on-premises through its Cloud at Customer program.

“We offer customers choice,” said Steve Daheb, senior vice president of Oracle Cloud. “If customers want to deploy [Oracle software] on IBM or HP [gear], you could do that. But we also continue to see this constant theme in tech, where things get complicated and then they get aggregated.”

Oracle buys Sun Microsystems

Few Oracle acquisitions were as controversial as its $7.4 billion move to buy Sun Microsystems. Critics of the deal bemoaned the potential fate of open-source technologies such as the MySQL database and the Java programming language under Oracle’s ownership, and the deal faced serious scrutiny from European regulators.

Oracle ultimately made a series of commitments about MySQL, which it promised to uphold for five years, and the deal won approval in early 2010.

Sun’s hardware became a platform for Exadata and other Oracle appliances. MySQL has chugged along with regular updates, contrary to some expectations that it would be killed off.

But many other Sun-related technologies faded away, such as Solaris and Sun’s early version of an AWS-style IaaS. Oracle also moved Java EE to the Eclipse Foundation, although it maintains a tight hold over Java SE.

The Sun deal remains relevant today, given how it ties into Ellison’s long-term vision of making Oracle the IBM for the 21st century, Mueller said.

That aspiration realized would see Oracle become a “chip-to-click” technology provider, spanning silicon to end-user applications, he added. “The verdict is kind of still out over whether that is going to work.”

Oracle Database 12c

The company made a slight but telling change to its database naming convention with the 2013 release of 12c, swapping the trailing “g,” which stood for grid computing, for a “c” that denoted cloud.

Oracle’s first iteration of 12c had multitenancy as a marquee feature. SaaS vendors at the time predominantly used multitenancy at the application level, with many customers sharing the same instance of an app. This approach makes it easier to apply updates across many customers’ apps, but is inherently weaker for security, Ellison contended.

Oracle 12c’s multi-tenant option provided an architecture where one container database held many “pluggable” databases.

Oracle later rolled out an in-memory option to compete with SAP’s HANA in-memory database. SAP hoped its customers, many of which used Oracle’s database as an underlying store, would migrate onto HANA.

2016: Oracle acquires NetSuite

Oracle’s $9.3 billion purchase of cloud ERP vendor NetSuite came with controversy, given Ellison’s large personal financial stake in the vendor. But on a strategic level, the move made plenty of sense.

NetSuite at the time had more than 10,000 customers, predominantly in the small and medium-sized business range. Oracle, in contrast, had 1,000 or so customers for its cloud ERP aimed at large enterprises, and not much presence in SMB.

Thus, the move plugged a major gap for Oracle. It also came as Oracle and NetSuite began to compete with each other at the margins for customers of a certain size.

Oracle’s move also gave it a coherent two-tier ERP strategy, wherein a customer that opens new offices would use NetSuite in those locations while tying it back to a central Oracle ERP system. This is a practice rival SAP has used with Business ByDesign, its cloud ERP product for SMBs, as well as Business One.

The NetSuite acquisition was practically destined from the start, said Scavo of Strativa.

“I always thought Larry was smart not to do the NetSuite experiment internally. NetSuite was able to develop its product as a cloud ERP system long before anyone dreamed of doing that,” Scavo said.

NetSuite customers could benefit as the software moves onto Oracle’s IaaS if they receive the promised benefits of better performance and elasticity, which NetSuite has grappled with at times, Scavo added. “I’m looking forward to seeing some evidence of that.”

Oracle launches its second-generation IaaS cloud

The IaaS market has largely coalesced around three hyperscale players: AWS, Microsoft and Google. Other large companies such as Cisco and HPE tried something similar, but conceded defeat and now position themselves as neutral middle players keen to help customers navigate and manage multi-cloud deployments.

Oracle, meanwhile, came to market with an initial public IaaS offering based in part on OpenStack, but it failed to gain much traction. It subsequently made major investments in a second-generation IaaS, called Oracle Cloud Infrastructure, which offers many advancements at the compute, network and storage layers over the original.

Oracle has again shifted gears, evidenced by its partnership with Microsoft to boost interoperability between Oracle Cloud Infrastructure and Azure. One expected use case is for IT pros to run their enterprise application logic and presentation tiers on Azure, while tying back to Oracle’s Autonomous Database on the Oracle cloud.

“We started this a while back and it’s something customers asked for,” Oracle’s Daheb said. There was significant development work involved and given the companies’ shared interests, the deal was natural, according to Daheb.

“If you think about this world we came from, with [on-premises software], we had to make it work with everybody,” Daheb said. “Part of it is working together to bring that to the cloud.”

Oracle Autonomous Database marks the path forward

Ellison will unveil updates to Oracle Database 19c, which runs both on-premises and in the cloud, in a talk at OpenWorld. While details remain under wraps, it is safe to assume the news will involve the autonomous management and maintenance capabilities Oracle first discussed in 2017.

Oracle database customers typically wait a couple of years before upgrading to a new version, preferring to let early adopters work through any remaining bugs and stability issues. Version 19c arrived in January, but is more mature than the name suggests. Oracle moved to a yearly naming convention and update path in 2018, and thus 19c is considered the final iteration of the 12c release cycle, which dates to 2013.

Oracle users should be mindful that autonomous database features have been a staple of database systems for decades, according to Monash.

But Oracle has indeed accomplished something special with its cloud-based Autonomous Database, according to Daheb. He referred to an Oracle marketing intern who was able to set up databases in just a couple of minutes on the Oracle Cloud version. “For us, cloud is the great democratizer,” Daheb said.
