Tag Archives: support

What should CIOs do with SAP ECC support ending in 2025?

SAP has announced the end of SAP ECC support in 2025, and that means big changes for most SAP users.

Companies using SAP ERP Central Component are faced with some major decisions. The most obvious is whether to stay on ECC or migrate their systems to S/4HANA. This is not an easy decision to make, as each option has its own set of pros and cons. No matter which choice a company makes, it will face business consequences and must prepare accordingly.

From the vendor's perspective, it makes sense for support staff and developers to focus on the newest product. As part of this, most software vendors push their clients to adopt the latest platform, partly by imposing an end-of-support deadline. This strategy has some success: most clients don't want to be left with an unsupported system that might cause work delays. But moving to a new product can also be problematic.

For an SAP ECC customer, moving to S/4HANA comes with its own set of challenges and risks. Implementing the latest SAP platform does not always equate to better and faster systems, as seen in Revlon's disastrous SAP S/4HANA implementation. Revlon experienced shipping delays and revenue losses as a result of system, operational and implementation challenges, and it was also sued by shareholders.

Such failures can’t always be blamed only on the new software. Other factors that can contribute to ERP implementation failure — whether a new SAP system or another vendor’s system — include lack of operational maturity, poor leadership, lack of experienced resources and cultural challenges. These can turn a potentially successful ERP implementation into a complete disaster.

The end of SAP ECC support must be weighed against the risks of moving to S/4HANA. Companies should consider the following activities to prepare for the upcoming deadline:

  • Talk to others in the same vertical about their experience with S/4HANA.
  • Determine the costs and organizational changes associated with the migration.
  • Evaluate the latest version of S/4HANA.
  • Identify which vendors might continue to provide third-party ECC support after SAP's own support ends.
  • Determine any compliance concerns that could arise from not receiving updates on ECC software.
  • Reach out to other companies within SAP user groups and discuss their plans.
  • Determine a plan for necessary patching and bug fixes.

Go to Original Article
Author:

HYCU backup for Google Cloud adds SAP HANA support

HYCU enhanced its Google Cloud Platform backup with SAP HANA support, offering it as a managed service that eases the burden on IT.

The HYCU Backup as a Service for Google Cloud is purpose-built for GCP, similar to how HYCU’s first major product was purpose-built for Nutanix data protection. It’s fully integrated into Google Cloud Identity & Access Management.

“It was built with the Google administrator in mind,” so there’s no extra training needed, said Subbiah Sundaram, vice president of products at HYCU.

Offering it as a service is critical to protecting cloud workloads natively, according to Enterprise Strategy Group senior analyst Christophe Bertrand. The firm’s research shows that IT professionals want similar features in cloud-native data protection as in their on-premises environments, but there are gaps.

“Among the key areas are enterprise-class scalability, which HYCU is addressing in this release with enhancements to cloud-native incrementals, scalability, mission-critical application support with SAP HANA and performance optimizations,” Bertrand wrote in an email. “Cloud is about scale, and this means that data protection mechanisms have to adapt.”

Protection for a ‘mission-critical application’

HYCU backup for GCP supports SAP HANA for the first time with this release. The support requires a special understanding of the infrastructure being protected and a mechanism to coordinate with SAP HANA to get a consistent copy, according to Sundaram.

The HYCU Backup as a Service uses Google snapshots for database-consistent, impact-free backup and recovery. It includes support for single file recovery.
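
HYCU has not published the internals of that coordination, but the generic pattern for an application-consistent snapshot of SAP HANA is well established: ask the database to prepare a snapshot, take the storage-level snapshot, then confirm or abandon it. The sketch below illustrates that pattern in Python using SAP's hdbcli driver; the connection details and the take_disk_snapshot callable are assumptions, not HYCU's code.

```python
# A minimal sketch of the generic "prepare / snapshot / confirm" pattern for an
# application-consistent SAP HANA backup. This is NOT HYCU's implementation;
# the connection details and the take_disk_snapshot callable are assumptions.
from hdbcli import dbapi  # SAP's Python driver for HANA


def consistent_hana_snapshot(host, port, user, password, take_disk_snapshot):
    conn = dbapi.connect(address=host, port=port, user=user, password=password)
    cur = conn.cursor()
    try:
        # 1. Ask HANA to prepare a database snapshot (a consistent on-disk state).
        cur.execute("BACKUP DATA CREATE SNAPSHOT")

        # 2. Look up the ID of the snapshot just prepared in the backup catalog.
        cur.execute(
            "SELECT BACKUP_ID FROM M_BACKUP_CATALOG "
            "WHERE ENTRY_TYPE_NAME = 'data snapshot' "
            "ORDER BY SYS_START_TIME DESC LIMIT 1"
        )
        backup_id = cur.fetchone()[0]

        try:
            # 3. Take the storage-level snapshot (e.g., a GCP persistent disk
            #    snapshot) while HANA holds the consistent state.
            external_id = take_disk_snapshot()
            # 4. Confirm, so HANA records the snapshot as a valid backup.
            cur.execute(
                f"BACKUP DATA CLOSE SNAPSHOT BACKUP_ID {backup_id} "
                f"SUCCESSFUL '{external_id}'"
            )
        except Exception:
            # 5. On any failure, abandon the prepared snapshot.
            cur.execute(
                f"BACKUP DATA CLOSE SNAPSHOT BACKUP_ID {backup_id} "
                "UNSUCCESSFUL 'storage snapshot failed'"
            )
            raise
    finally:
        conn.close()
```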

The use of native storage snapshots is a differentiating approach, according to Bertrand.

“I expect that we will see a number of HYCU customers scale their environments in time,” Bertrand wrote. “SAP HANA is a mission-critical application in many enterprises, and in combination with GCP, offers a lot of promise for scaling deployments up and out, and the ability to do analytics for business uses beyond just backup or BC/DR.”

Sundaram said Google sellers and partners asked for the SAP HANA support — they want more customers adding SAP HANA on GCP. SAP HANA, an in-memory database for processing high volumes of data in real time, is popular with large retailers.

Screenshot: HYCU backup for Google Cloud Platform. HYCU backups use changed block tracking functionality to enable optimized bucket storage use.

Dive deeper into HYCU’s strategy

HYCU’s GCP backup product originally launched in July 2018. Because it is a service, HYCU takes care of the installation, management and upgrades. HYCU claims one-click backups.

It was built with the Google administrator in mind.
Subbiah Sundaram, Vice president of products, HYCU

Users back up to Google Cloud Storage buckets. Starting with this update, HYCU backup uses changed block tracking to enable optimized bucket storage consumption.
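
As a rough illustration of what changed block tracking buys, the toy sketch below hashes fixed-size blocks of a disk image and uploads only the blocks that changed since the previous run to a Cloud Storage bucket. It is conceptual only, not HYCU's implementation; the bucket name, paths and block size are placeholders.

```python
# Toy illustration of changed-block-tracking-style incremental backup to a GCS
# bucket: only blocks whose content hash changed since the last run are uploaded.
import hashlib
import json

from google.cloud import storage

BLOCK_SIZE = 4 * 1024 * 1024  # 4 MiB blocks


def incremental_backup(disk_path, bucket_name, manifest_path="manifest.json"):
    client = storage.Client()
    bucket = client.bucket(bucket_name)

    # Load the block-hash manifest from the previous backup, if any.
    try:
        with open(manifest_path) as f:
            old_hashes = json.load(f)
    except FileNotFoundError:
        old_hashes = {}

    new_hashes = {}
    with open(disk_path, "rb") as disk:
        index = 0
        while True:
            block = disk.read(BLOCK_SIZE)
            if not block:
                break
            digest = hashlib.sha256(block).hexdigest()
            new_hashes[str(index)] = digest
            # Upload only blocks that are new or changed since the last run.
            if old_hashes.get(str(index)) != digest:
                bucket.blob(f"blocks/{index}-{digest}").upload_from_string(block)
            index += 1

    # Persist the manifest so the next run can skip unchanged blocks.
    with open(manifest_path, "w") as f:
        json.dump(new_hashes, f)
```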

HYCU can keep costs down because the customer doesn’t pay for compute, Sundaram said. The product’s incremental backups and auto-tiering also save money.

The product does not require caching storage, according to HYCU, which means cheaper data transfer for backup and better use of cloud storage.

HYCU, which is based in Boston, Mass., has built its strategy on offering specialized services that go deep in specific environments, according to Bertrand.

“It gives them this best of breed advantage over generalists, and our research shows that IT professionals have no problem using the best cloud backup solutions for the job at hand — meaning using a new solution or an additional vendor,” Bertrand wrote. “I believe that they are well-positioned to deliver additional services beyond backup and BC/DR, such as intelligent data management based on data reuse.”

HYCU Backup as a Service for Google Cloud is available on the GCP Marketplace and through authorized partners. Cost depends on the amount of data under protection and frequency of backup.

The HYCU backup service was automatically updated for current customers in October.

In the coming weeks, HYCU expects to launch its Protégé product for multi-cloud disaster recovery and migration. It’s also planning a major update in early 2020 that will add another supported cloud platform.

Go to Original Article
Author:

Threat Stack Application Security Monitoring adds Python support

Threat Stack has announced Python support for its Threat Stack Application Security Monitoring product. The update comes at no additional cost as part of the Threat Stack Cloud Security Platform.

With Python support for Application Security Monitoring, Threat Stack customers who use Python with the Django and Flask frameworks can build security into the software development lifecycle by identifying risks in both third-party and native code, according to Tim Buntel, vice president of application security products at Threat Stack.

In addition, the platform also provides built-in capabilities to help developers learn secure coding practices and real-time attack blocking, according to the company.

“Today’s cloud-native applications are comprised of disparate components, including containers, virtual machines and scripts, including those written in Python, that serve as the connective tissue between these elements,” said Doug Cahill, senior analyst and group practice director, cybersecurity, at Enterprise Strategy Group. The lack of support for any one layer of the stack therefore means a lack of visibility and a vulnerability an attacker could exploit.

Application Security Monitoring is a recent addition to the Threat Stack Cloud Security Platform. Introduced last June, the platform is aimed at bringing visibility and protection to cloud-based architecture and applications. Threat Stack Cloud Security Platform touts the ability to identify and block attacks such as cross-site scripting (XSS) and SQL injection by putting the application in context with the rest of the stack. When an attack happens, it also allows users to move with one click from the application to the container or host where it is deployed, according to the company.
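
For context on the attack classes involved, the short example below shows SQL injection in miniature, using Python's built-in sqlite3 module: splicing untrusted input into query text leaks every row, while a bound parameter is treated purely as data. This illustrates the vulnerability class, not Threat Stack's detection logic.

```python
# SQL injection in miniature: untrusted input spliced into a query string
# becomes SQL; the same input passed as a bound parameter stays data.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

user_input = "bob' OR '1'='1"  # attacker-controlled value

# VULNERABLE: the input becomes part of the SQL text, so the OR clause
# matches every row and leaks all users.
leaked = conn.execute(
    f"SELECT * FROM users WHERE name = '{user_input}'"
).fetchall()
print(leaked)  # [('alice', 'admin'), ('bob', 'user')]

# SAFE: a bound parameter is never interpreted as SQL.
safe = conn.execute(
    "SELECT * FROM users WHERE name = ?", (user_input,)
).fetchall()
print(safe)  # [] -- no user is literally named "bob' OR '1'='1"
```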

“[Application Security Monitoring] … provides customers with full stack security observability by correlating security telemetry from the cloud management console, host, containers and applications in a single, unified platform,” Buntel said.

To achieve full stack security and insights from the cloud management console, host, containers, orchestration and applications, customers can combine Threat Stack Application Security Monitoring with the rest of the Threat Stack Cloud Security Platform, according to the company.

Cahill said customers should look for coverage of the technology stack as well as the lifecycle when looking to secure cloud-native applications, because such full stack and lifecycle support allows for threat detection and prevention capabilities “from the code level down to the virtual machine or container to be implemented in both pre-deployment stages and runtime.”

“Cloud security platforms, which integrate runtime application self-protection functionality with cloud workload protection platforms to provide full-stack and full lifecycle visibility and control, are just now being offered by a handful of cybersecurity vendors, including Threat Stack,” he added.

Threat Stack Application Security Monitoring for Python is available as of Wednesday.

Threat Stack competitors include CloudPassage, Dome9 and Sophos. CloudPassage Halo is a security automation platform delivering visibility, protection and compliance monitoring for cybersecurity risks; the platform also covers risks in Amazon Web Services and Azure deployments, according to the company. CloudGuard Dome9 is a software platform for public cloud security and compliance orchestration; the platform helps customers assess their security posture, detect misconfigurations and enforce security best practices to prevent data loss, according to the company. Sophos Intercept X enables organizations to detect blended threats that merge automation and human hacking skills, according to the company.

Go to Original Article
Author:

Google releases TensorFlow Enterprise for enterprise users

Google Wednesday launched TensorFlow Enterprise, which promises long-term support for previous versions of TensorFlow on its Google Cloud Platform.

The new product, which also bundles together some existing Google Cloud products for training and deploying AI models, is intended to aid organizations running older versions of TensorFlow.

The product is also designed to help “customers who are working with previous versions of TensorFlow and also those where AI is their business,” said Craig Wiley, director of product management for Google Cloud’s AI Platform.

Open sourced by Google in 2015, TensorFlow is a machine learning (ML) and deep learning framework widely used in the AI industry. TensorFlow Enterprise, available on the Google Cloud Platform (GCP), provides security patches and select bug fixes for certain older versions of TensorFlow for up to three years.

Also, organizations using TensorFlow Enterprise will have access to “engineer-to-engineer assistance from both Google Cloud and TensorFlow teams at Google,” according to an Oct. 30 Google blog post introducing the product.

“Data scientists voraciously download the latest version of TensorFlow because of the steady pace of new, valuable features. They always want to use the latest and greatest,” Forrester Research analyst Mike Gualtieri said.

Yet, he continued, “new versions don’t always work as expected,” so the “dive-right-in” approach of data scientists is often in conflict with an enterprise’s standards.

Google’s TensorFlow Enterprise support of prior versions back to three years will accelerate enterprise adoption.
Mike Gualtieri, Analyst, Forrester Research

“That’s why Google’s TensorFlow Enterprise support of prior versions back to three years will accelerate enterprise adoption,” Gualtieri said. “Data scientists and ML engineers can experiment with the latest and greatest, while enterprise operations professionals can insist on versions that work and will continue to be available.”
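
In practice, the compromise Gualtieri describes often comes down to pinning: data scientists experiment freely, while production jobs fail fast if the runtime drifts from a sanctioned release line. A minimal sketch follows; the approved version number is illustrative, not a Google recommendation.

```python
# One common way operations teams enforce a sanctioned framework version while
# data scientists experiment elsewhere: fail fast if the runtime drifts from
# the approved release line. The version number here is illustrative.
import tensorflow as tf

APPROVED_RELEASE_LINE = "1.15."  # e.g., a long-term-supported TF 1.x line

if not tf.__version__.startswith(APPROVED_RELEASE_LINE):
    raise RuntimeError(
        f"TensorFlow {tf.__version__} is not an approved production version; "
        f"expected a {APPROVED_RELEASE_LINE}x release."
    )
```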

TensorFlow Enterprise comes bundled with Google Cloud’s Deep Learning VMs, which are preconfigured virtual machine environments for deep learning, as well as the beta version of Google Cloud’s Deep Learning Containers.

To be considered for the initial rollout of TensorFlow Enterprise, however, organizations must have spent $500,000 annually, or commit to spending $500,000 annually on Google Cloud’s Deep Learning VMs, Deep Learning Containers, or AI Platform Training and Prediction products, or some combination of those systems.

Over the past several months, Google has made progress in a campaign to offer more tools on its Google Cloud Platform to train, test, and deploy AI models. In April 2019, the tech giant unveiled the Google Cloud AI Platform, a unified AI development platform that combined a mix of new and rebranded AI development products. At the time, analysts saw the release as a move to attract more enterprise-level customers to Google Cloud.

Go to Original Article
Author:

Gears 5 breaks records as biggest launch for any Xbox Game Studios game this generation – Xbox Wire

Thanks to the incredible support from our fans, Gears 5 kicked off the Holiday season strong – attracting over three million players in its opening weekend and setting new records for Xbox Game Pass with the biggest launch week of any Xbox Game Studios title this generation. The performance easily doubled the first week’s debut of Gears of War 4 and made Gears 5 the most-played Xbox Game Studios title in its first week since 2012’s Halo 4. First week performance includes the four-day early access period beginning Friday, September 6, which was exclusive to Gears 5 Ultimate Edition and Xbox Game Pass Ultimate members. On the PC, Gears 5 has nearly tripled the performance of its predecessor, becoming the biggest-ever launch for Xbox Game Pass for PC, and Xbox Game Studios’ best-ever debut on Steam.

On Tuesday, Gears 5 expanded into worldwide general availability with the launch of Gears 5 Standard Edition at retail, as well as new Xbox One X and Xbox One S console bundles. For PC fans, Gears 5 is included in a three-month Xbox Game Pass for PC membership bundled with the purchase of select AMD Ryzen™ processors and AMD Radeon™ graphics cards.

Described as “one of the best and most versatile action-game packages in recent memory” by IGN, the game is a hit with critics and fans alike, and enjoys a Metacritic user score of 8.7, the sixth best of any Xbox One game. The excitement extends to the PC version as well, with Digital Foundry declaring, “it fills every desire that I want for the PC version of a game.”

Gears 5 has also received special attention for its commitment to representation and accessibility. Accessibility features include full controller remapping, single stick movement, Xbox Adaptive Controller support, narrated UI and menus, improved subtitles, and more. It was the first AAA game to receive a perfect score from CanIPlayThat.com, which called it “a Deaf/HoH accessibility masterpiece.”

For more information on the Gears of War franchise, stay tuned to Xbox Wire or follow Gears on Twitter @gearsofwar.

Go to Original Article
Author: Microsoft News Center

The roots of Oracle’s cloud evolution: A 20-year review

Oracle’s strategy going into 2020 is to support users wherever they are, while not-so-subtly urging them to move onto Oracle cloud services – particularly databases.

In fact, some say it’s Oracle’s legacy as a database vendor that may be the key to the company’s long-term success as a major cloud player.

To reconcile the Oracle cloud persona of today with the database-giant identity the company still holds, it helps to look back at key milestones in Oracle’s history over the past 20 years, beginning with the database releases at the turn of the century.

Oracle releases Database 8i, 9i

Two major versions of Oracle’s database arrived in 1998 and 2001. Oracle Database 8i was the first written with a heavy emphasis on web applications — the “i” stood for Internet.

Then Oracle 9i introduced Real Application Clusters (RAC) for high-availability scenarios. RAC is a widely used and lucrative database option for Oracle, and one it has held very close: to date, RAC is supported and certified for cloud use only on Oracle’s own cloud service.

With the 9i update, Oracle made a concerted effort to improve the database’s administrative tooling, said Curt Monash, founder of Monash Research in Acton, Mass.

“This was largely in reaction to growing competition from Microsoft, which used its consumer software UI expertise to have true ease-of-administration advantages versus Oracle,” Monash said. “Oracle narrowed the gap impressively quickly.”

Timeline: These 10 milestones marked Oracle’s path to modern cloud computing

Oracle acquires PeopleSoft and Siebel

Silicon Valley is littered with the bones of once-prominent application software vendors that either shut down or got swallowed up by larger competitors. To that end, Oracle’s acquisitions of PeopleSoft and Siebel still resonate today.

The company launched what many considered to be a hostile takeover of PeopleSoft, which in 2003 was the second-largest enterprise applications vendor after SAP. It ultimately succeeded with a $10.3 billion bid the following year. Soon after the deal closed, Oracle laid off more than half of PeopleSoft’s employees in a widely decried act.

Oracle also gained J.D. Edwards, known for its manufacturing ERP software, through the PeopleSoft purchase.

The PeopleSoft deal, along with Oracle’s $5.8 billion acquisition of Siebel in 2005, reinvented the company as a big player in enterprise applications and set up the path toward Fusion.

Oracle realized that to catch up to SAP in applications, it needed acquisitions, said Holger Mueller, an analyst with Constellation Research in Cupertino, Calif., who worked in business and product development roles at Oracle during much of the 2000s.

“To cement ownership within complex CRM, they needed Siebel,” Mueller said. Those Siebel customers largely remain in the fold today, he added. While rival HCM software vendor Workday has managed to poach some of Oracle’s PeopleSoft customers, Salesforce hasn’t had the same luck converting Siebel users over to its CRM, according to Mueller.

Oracle’s application deals were as much or more about acquiring customers as they were about technology, said Frank Scavo, president of IT consulting firm Strativa in Irvine, Calif.

“Oracle had a knack for buying vendors when they were at or just past their peak,” he said. “PeopleSoft was an example of that.”

The PeopleSoft and Siebel deals also gave Oracle the foundation, along with its homegrown E-Business Suite, for a new generation of applications in the cloud era.

Oracle’s Fusion Applications saga

Oracle first invoked the word “Fusion” in 2005, under the promise it would deliver an integrated applications suite that comprised a superset of functionality from its E-Business Suite, PeopleSoft and Siebel software, with both cloud and on-premises deployment options.

The company also pledged that Fusion apps would deliver a consumer-grade user experience and business intelligence embedded throughout processes.

Fusion Applications were supposed to become generally available in 2008, but Oracle didn’t deliver them to all customers until 2011.

It’s been suggested that Oracle wanted to take its time and had the luxury of doing so, since its installed base was still weathering a recession and had little appetite for a major application migration, no matter how useful the new software was.

Fusion Applications’ sheer scope was another factor. “It takes a long time to build software from scratch, especially if you have to replace things that were strong category leaders,” Mueller said.

Oracle’s main shortcoming with Fusion Applications was its inability to sell very much of them early on, Mueller added.

Oracle acquires Hyperion and BEA

After its applications shopping spree, Oracle eyed other areas of software. First, it bought enterprise performance management vendor Hyperion in 2007 for $3.3 billion to bolster its financials and BI business.

“Hyperion was a smart acquisition to get customers,” Mueller said. “It helped Oracle sell financials. But it didn’t help them in the move to cloud.”

In contrast, BEA and its well-respected application server did. The $8.5 billion deal also gave Oracle access to a large customer base and many developers, Mueller added.

BEA’s products also gave a boost to Oracle’s existing Fusion Middleware portfolio, said John Rymer, an analyst at Forrester. “At the time, Oracle’s big competitor in middleware was IBM,” he said. “[Oracle] didn’t have credibility.”

Oracle’s hardware play

Oracle made a major strategic shift in 2008 with the introduction of Exadata, its first foray into hardware.

Exadata packs servers, networking and storage, along with Oracle database and other software, into preconfigured racks. Oracle also created storage processing software for the machines, which its marketing arm initially dubbed “engineered systems.”

With the move, Oracle sought to take a bigger hold in the data warehousing market against the likes of Teradata and Netezza, which was subsequently acquired by IBM.

Exadata was a huge move for Oracle, Monash said.

“They really did architect hardware around software requirements,” he said. “And they attempted to change their business relationship with customers accordingly. … For context, recall that one of Oracle’s top features in its hypergrowth years in the 1980s was hardware portability.”

In fact, it would have been disastrous if Oracle didn’t come up with Exadata, according to Monash.

“Oracle was being pummeled by independent analytics DBMS vendors, appliance-based or others,” he said. “The competition was more cost-effective, naturally, but Exadata was good enough to stem much of the bleeding.”

Exadata and its relatives are foundational to Oracle’s IaaS, and the company also offers the systems on-premises through its Cloud at Customer program.

“We offer customers choice,” said Steve Daheb, senior vice president of Oracle Cloud. “If customers want to deploy [Oracle software] on IBM or HP [gear], you could do that. But we also continue to see this constant theme in tech, where things get complicated and then they get aggregated.”

Oracle buys Sun Microsystems

Few Oracle acquisitions were as controversial as its $7.4 billion move to buy Sun Microsystems. Critics of the deal bemoaned the potential fate of open-source technologies such as the MySQL database and the Java programming language under Oracle’s ownership, and the deal faced serious scrutiny from European regulators.

Oracle ultimately made a series of commitments about MySQL, which it promised to uphold for five years, and the deal won approval in early 2010.

Sun’s hardware became a platform for Exadata and other Oracle appliances. MySQL has chugged along with regular updates, contrary to some expectations that it would be killed off.

But many other Sun-related technologies faded into obscurity, such as Solaris and Sun’s early version of an AWS-style IaaS. Oracle also moved Java EE to the Eclipse Foundation, although it maintains a tight hold over Java SE.

The Sun deal remains relevant today, given how it ties into co-founder Larry Ellison’s long-term vision of making Oracle the IBM for the 21st century, Mueller said.

That aspiration realized would see Oracle become a “chip-to-click” technology provider, spanning silicon to end-user applications, he added. “The verdict is kind of still out over whether that is going to work.”

Oracle Database 12c

The company made a slight but telling change to its database naming convention with the 2013 release of 12c, replacing the “g” suffix, which denoted grid computing, with a “c” for cloud.

Oracle’s first iteration of 12c had multitenancy as a marquee feature. SaaS vendors at the time predominantly used multitenancy at the application level, with many customers sharing the same instance of an app. This approach makes it easier to apply updates across many customers’ apps, but is inherently weaker for security, Ellison contended.

Oracle 12c’s multi-tenant option provided an architecture where one container database held many “pluggable” databases.
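
In practice, that architecture means a new pluggable database can be cloned from the container's seed PDB with a couple of SQL statements. The sketch below shows the idea using the cx_Oracle Python driver; the connection string, names and file paths are placeholders.

```python
# A sketch of the multitenant architecture in practice: a new pluggable
# database is cloned from the container's seed PDB with two SQL statements.
# The connection string, names and file paths are placeholders.
import cx_Oracle

conn = cx_Oracle.connect("sys", "password", "dbhost/CDB1",
                         mode=cx_Oracle.SYSDBA)
cur = conn.cursor()

# Clone the seed PDB into a new pluggable database inside the container.
cur.execute("""
    CREATE PLUGGABLE DATABASE hrpdb
      ADMIN USER hradmin IDENTIFIED BY hrpwd
      FILE_NAME_CONVERT = ('/pdbseed/', '/hrpdb/')
""")

# A newly created PDB starts in MOUNTED state; open it for use.
cur.execute("ALTER PLUGGABLE DATABASE hrpdb OPEN")
conn.close()
```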

Oracle later rolled out an in-memory option to compete with SAP’s HANA in-memory database. SAP hoped its customers, many of which used Oracle’s database as an underlying store, would migrate onto HANA.

Oracle acquires NetSuite

Oracle’s $9.3 billion purchase of cloud ERP vendor NetSuite came with controversy, given Ellison’s large personal financial stake in the vendor. But on a strategic level, the move made plenty of sense.

NetSuite at the time had more than 10,000 customers, predominantly in the small and medium-sized business range. Oracle, in contrast, had 1,000 or so customers for its cloud ERP aimed at large enterprises, and not much presence in SMB.

Thus, the move plugged a major gap for Oracle. It also came as Oracle and NetSuite began to compete with each other at the margins for customers of a certain size.

Oracle’s move also gave it a coherent two-tier ERP strategy, wherein a customer that opens new offices would use NetSuite in those locations while tying it back to a central Oracle ERP system. This is a practice rival SAP has used with Business ByDesign, its cloud ERP product for SMBs, as well as Business One.

The NetSuite acquisition was practically destined from the start, said Scavo of Strativa.

“I always thought Larry was smart not to do the NetSuite experiment internally. NetSuite was able to develop its product as a cloud ERP system long before anyone dreamed of doing that,” Scavo said.

NetSuite customers could benefit as the software moves onto Oracle’s IaaS, provided they see the promised improvements in performance and elasticity, areas NetSuite has grappled with at times, Scavo added. “I’m looking forward to seeing some evidence of that.”

Oracle launches its second-generation IaaS cloud

The market has largely coalesced around three hyperscale IaaS players: AWS, Microsoft and Google. Other large companies such as Cisco and HPE tried something similar, but conceded defeat and now position themselves as neutral middle players keen to help customers navigate and manage multi-cloud deployments.

Oracle, meanwhile, came to market with an initial public IaaS offering based in part on OpenStack, but it failed to gain much traction. It subsequently made major investments in a second-generation IaaS, called Oracle Cloud Infrastructure, which offers many advancements at the compute, network and storage layers over the original.

With [on-premises software] we had to make it work with everybody. Part of it is working together to bring that to the cloud.
Steve Daheb, Senior vice president, Oracle Cloud

Oracle has again shifted gears, evidenced by its partnership with Microsoft to boost interoperability between Oracle Cloud Infrastructure and Azure. One expected use case is for IT pros to run their enterprise application logic and presentation tiers on Azure, while tying back to Oracle’s Autonomous Database on the Oracle cloud.

“We started this a while back and it’s something customers asked for,” Oracle’s Daheb said. There was significant development work involved and given the companies’ shared interests, the deal was natural, according to Daheb.

“If you think about this world we came from, with [on-premises software], we had to make it work with everybody,” Daheb said. “Part of it is working together to bring that to the cloud.”

Oracle Autonomous Database marks the path forward

Ellison will unveil updates to Oracle Database 19c, which runs both on-premises and in the cloud, in a talk at OpenWorld. While details remain under wraps, it is safe to assume the news will involve the autonomous management and maintenance capabilities Oracle first discussed in 2017.

Oracle database customers typically wait a couple of years before upgrading to a new version, preferring to let early adopters work through any remaining bugs and stability issues. Version 19c arrived in January, but is more mature than the name suggests. Oracle moved to a yearly naming convention and update path in 2018, and thus 19c is considered the final iteration of the 12c release cycle, which dates to 2013.

Oracle users should be mindful that autonomous database features have been a staple of database systems for decades, according to Monash.

But Oracle has indeed accomplished something special with its cloud-based Autonomous Database, according to Daheb. He referred to an Oracle marketing intern who was able to set up databases in just a couple of minutes on the Oracle Cloud version. “For us, cloud is the great democratizer,” Daheb said.
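
For a sense of what "a couple of minutes" looks like, the sketch below provisions an Autonomous Database through Oracle's Python SDK for OCI. The compartment OCID, names and password are placeholders, and the exact parameters an organization needs will vary.

```python
# Roughly what provisioning "a database in a couple of minutes" looks like
# through Oracle's Python SDK for OCI. The compartment OCID, names and
# password below are placeholders.
import oci

config = oci.config.from_file()  # reads credentials from ~/.oci/config
db_client = oci.database.DatabaseClient(config)

details = oci.database.models.CreateAutonomousDatabaseDetails(
    compartment_id="ocid1.compartment.oc1..example",  # placeholder OCID
    db_name="demoadb",
    display_name="demo-autonomous-db",
    cpu_core_count=1,
    data_storage_size_in_tbs=1,
    admin_password="ExamplePassw0rd#",  # placeholder; use a vault in practice
)

response = db_client.create_autonomous_database(details)
print(response.data.lifecycle_state)  # typically PROVISIONING at first
```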

Go to Original Article
Author:

IT pros look to VMware’s GPU acceleration projects to kick-start AI

SAN FRANCISCO — IT pros who need to support emerging AI and machine learning workloads see promise in a pair of developments VMware previewed this week to bolster support for GPU-accelerated computing in vSphere.

GPUs are uniquely suited to handle the massive processing demands of AI and machine learning workloads, and chipmakers like Nvidia Corp. are now developing and promoting GPUs specifically designed for this purpose.

A previous partnership with Nvidia introduced capabilities that allowed VMware customers to assign GPUs to VMs, but not more than one GPU per VM. The latest development, which Nvidia calls its Virtual Compute Server, allows customers to assign multiple virtual GPUs to a VM.

Nvidia’s Virtual Compute Server also works with VMware’s vMotion capability, allowing IT pros to live migrate a GPU-accelerated VM to another physical host. The companies have also extended this partnership to VMware Cloud on AWS, allowing customers to access Amazon Elastic Compute Cloud bare-metal instances with Nvidia T4 GPUs.

VMware gave the Nvidia partnership prime time this week at VMworld 2019, playing a prerecorded video of Nvidia CEO Jensen Huang talking up the companies’ combined efforts during Monday’s general session. However, another GPU acceleration project also caught the eye of some IT pros who came to learn more about VMware’s recent acquisition of Bitfusion.io Inc.

VMware acquired Bitfusion earlier this year and announced its intent to integrate the startup’s GPU virtualization capabilities into vSphere. Bitfusion’s FlexDirect connects GPU-accelerated servers over the network and provides the ability to assign GPUs to workloads in real time. The company compares its GPU virtualization approach to network-attached storage because it disaggregates GPU resources and makes them accessible to any server on the network as a pool of resources.

The software’s unique approach also allows customers to assign just portions of a GPU to different workloads. For example, an IT pro might assign 50% of a GPU’s capacity to one VM and 50% to another VM. This approach can allow companies to use their investments in expensive GPU hardware more efficiently, company executives said. FlexDirect also offers extensions to support field-programmable gate arrays and application-specific integrated circuits.
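
To make the fractional idea concrete, the toy sketch below packs requests like "0.5 of a GPU" onto physical devices with a first-fit-decreasing heuristic. It is purely conceptual and is not Bitfusion's algorithm.

```python
# A conceptual illustration (not Bitfusion's algorithm) of fractional GPU
# allocation: pack requests like "0.5 of a GPU" onto physical devices using
# a first-fit-decreasing heuristic.
def assign(requests, gpu_count):
    free = [1.0] * gpu_count  # remaining fraction of each physical GPU
    placement = {}
    # Place the largest requests first to reduce fragmentation.
    for name, fraction in sorted(requests.items(), key=lambda kv: -kv[1]):
        for gpu, capacity in enumerate(free):
            if capacity >= fraction:
                free[gpu] -= fraction
                placement[name] = gpu
                break
        else:
            raise RuntimeError(f"no GPU has {fraction:.2f} free for {name}")
    return placement


print(assign({"vm-a": 0.5, "vm-b": 0.5, "vm-c": 0.75}, gpu_count=2))
# {'vm-c': 0, 'vm-a': 1, 'vm-b': 1}
```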

“I was really happy to see they’re doing this at the network level,” said Kevin Wilcox, principal virtualization architect at Fiserv, a financial services company. “We’ve struggled with figuring out how to handle the power and cooling requirements for GPUs. This looks like it’ll allow us to place our GPUs in a segmented section of our data center that can handle those power and cooling needs.”

AI demand surging

Many companies are only beginning to research and invest in AI capabilities, but interest is growing rapidly, said Gartner analyst Chirag Dekate.

“By end of this year, we anticipate that one in two organizations will have some sort of AI initiative, either in the [proof-of-concept] stage or the deployed stage,” Dekate said.

In many cases, IT operations professionals are being asked to move quickly on a variety of AI-focused projects, a trend echoed by multiple VMworld attendees this week.

“We’re just starting with AI, and looking at GPUs as an accelerator,” said Martin Lafontaine, a systems architect at Netgovern, a software company that helps customers comply with data locality compliance laws.

“When they get a subpoena and have to prove where [their data is located], our solution uses machine learning to find that data. We’re starting to look at what we can do with GPUs,” Lafontaine said.

Is GPU virtualization the answer?

Recent efforts to virtualize GPU resources could open the door to broader use of GPUs for AI workloads, but potential customers should pay close attention to benchmark testing against bare-metal deployments in the coming years, Gartner’s Dekate said.

So far, he has not encountered a customer using these GPU virtualization tactics for deep learning workloads at scale. Today, most organizations still run these deep learning workloads on bare-metal hardware.

“The future of this technology that Bitfusion is bringing will be decided by the kind of overheads imposed on the workloads,” Dekate said, referring to the additional compute cycles often required to implement a virtualization layer. “The deep learning workloads we have run into are extremely compute-bound and memory-intensive, and in our prior experience, what we’ve seen is that any kind of virtualization tends to impose overheads. … If the overheads are within acceptable parameters, then this technology could very well be applied to AI.”
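
Dekate's advice reduces to running an identical, compute-bound workload in both environments and comparing throughput. The minimal harness below shows the shape of such a test, with a NumPy matrix multiplication standing in for a real training step; acceptable overhead thresholds are for each organization to define.

```python
# A minimal harness of the kind Dekate's advice implies: run the same
# compute-bound workload on bare metal and on the virtualized stack, then
# compare throughput. A NumPy matrix multiplication stands in for a real
# training step.
import time

import numpy as np


def benchmark(steps=50, size=2048):
    a = np.random.rand(size, size).astype(np.float32)
    b = np.random.rand(size, size).astype(np.float32)
    start = time.perf_counter()
    for _ in range(steps):
        _ = a @ b  # stand-in for one training step
    elapsed = time.perf_counter() - start
    return steps / elapsed  # steps per second


rate = benchmark()  # run once per environment and compare the rates
print(f"{rate:.2f} steps/sec")
# overhead_pct = (1 - virtualized_rate / bare_metal_rate) * 100
```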

Go to Original Article
Author:

Microsoft Investigator Fellowship seeks PhD faculty submissions

August 1, 2019 | By Jamie Harper, Vice President, US Education

Microsoft is expanding its support for academic researchers through the new Microsoft Investigator Fellowship. This fellowship is designed to empower researchers of all disciplines who plan to make an impact with research and teaching using the Microsoft Azure cloud computing platform.

From predicting traffic jams to advancing the Internet of Things, Azure has continued to evolve with the times, and this fellowship aims to keep Azure at the forefront of new ideas in the cloud computing space. Similarly evolving, Microsoft fellowships have a long history of supporting researchers, seeking to promote diversity and promising academic research in the field of computing. This fellowship is an addition to this legacy that highlights the significance of Azure in education, both now and into the future.

Full-time faculty at degree-granting colleges or universities in the United States who hold PhDs are eligible to apply. This fellowship supports faculty who are currently conducting research, advising graduate students, teaching in a classroom, and who currently use or plan to use Microsoft Azure in research, teaching, or both.

Fellows will receive $100,000 annually for two years to support their research. Fellows will also be invited to attend multiple events during this time, where they will make connections with other faculty from leading universities and Microsoft. They will have the opportunity to participate in the greater academic community as well. Members of the cohort will also be offered various training and certification opportunities.

When reviewing the submissions, Microsoft will evaluate the proposed future research and teaching impact of Azure. This will include consideration of how the Microsoft Azure cloud computing platform will be leveraged in size, scope, or unique ways for research, teaching, or both.

Candidates should submit their proposals directly on the fellowship website by August 16, 2019. Recipients will be announced in September 2019.

We encourage you to submit your proposal! For more information on the Microsoft Investigator Fellowship, please check out the fellowship website.

Go to Original Article
Author: Microsoft News Center

Microsoft Office 365 now available from new South Africa cloud datacenters

As Microsoft strives to support the digital transformation of organizations and enterprises around the world, we continue to drive innovation and expand into new geographies to empower more customers with Office 365, the world’s leading cloud-based productivity solution, with more than 180 million commercial monthly active users. Today, we’re taking another step in our ongoing investment to help enable digital transformation and societal impact across Africa with the general availability of Office 365 services from our new cloud datacenters in South Africa.

Office 365, delivered from local datacenters in South Africa, helps our customers enable the modern workplace and empower their employees with real-time collaboration and cloud-powered intelligence while maintaining security, compliance, and in-country customer data residency. The addition of South Africa brings the number of geographies where Office 365 offers secure cloud productivity services combined with customer data residency to 16 across the globe, with three additional geographies also announced.

In-country data residency for core customer data helps Office 365 customers meet regulatory requirements, which is particularly important and relevant in industries such as healthcare, financial services, and government—where organizations need to keep specific data in-country to comply with local requirements. Customer data residency provides additional assurances regarding data privacy and reliability for organizations and enterprises. Core customer data is stored only in their datacenter geography (Geo)—in this case, the cloud datacenters within South Africa.

Customers like Altron and the Gauteng Provincial Government have used Office 365 to transform their workplaces. This latest development will enable them—and other organizations and enterprises adopting Office 365—to ramp up their digital transformation journey.

“Altron is committed to improving our infrastructure and embracing a strategy to become a cloud-first company to better serve our customers and empower our employees through modern collaboration. We’ve noticed a tangible difference since making the move to Office 365.”
—Debra Marais, Lead, IT Shared Services at Altron

“Office 365 is driving our modernization journey of Government ICT infrastructure and services by allowing us to develop pioneering solutions at manageable costs and create overall improvements in operations management, all while improving transparency and accountability.”
—David Kramer, Deputy Director General, ICT at Gauteng Provincial Government

Microsoft recently became the first global provider to deliver cloud services from the African continent with the opening of our new cloud datacenter regions. Office 365 joins Azure to expand the intelligent cloud service available from Africa. Dynamics 365 and Power Platform, the next generation of intelligent business applications, are anticipated to be available in the fourth quarter of 2019.

By delivering the comprehensive Microsoft cloud—which includes Azure, Office 365, and Dynamics 365—from datacenters in a given geography, we offer scalable, available, and resilient cloud services to companies and organizations while meeting customer data residency, security, and compliance needs. We have deep expertise in protecting data and empowering customers around the globe to meet extensive security and privacy requirements, including offering the broadest set of compliance certifications and attestations in the industry.

The new cloud regions in South Africa are connected to Microsoft’s other regions via our global network, one of the largest and most innovative on the planet—spanning more than 100,000 miles (161,000 kilometers) of terrestrial fiber and subsea cable systems to deliver services to customers. Microsoft is bringing the global cloud closer to home for African organizations and citizens through our trans-Arabian paths between India and Europe, as well as our trans-Atlantic systems, including Marea, the highest capacity cable to ever cross the Atlantic.

We’re committed to accelerating digital transformation across the continent through numerous initiatives and also recently announced Microsoft’s first Africa Development Centre (ADC), with two initial sites in Nairobi, Kenya and Lagos, Nigeria. The ADC will serve as a premier center of engineering for Microsoft, where world-class African talent can create solutions for local and global impact. With our new cloud datacenter regions, the ADC, and programs like 4Afrika, we believe Africa is poised to develop locally and scale for global impact better than ever before.

Learn more about Office 365 and Microsoft in the Middle East and Africa.

Go to Original Article
Author: Microsoft News Center