Tag Archives: competitive

Office 365 vs. G Suite: Google embraces UC to rival Microsoft

For Google, the unified communications market is a means to an end: keeping G Suite competitive with Microsoft’s Office 365. In 2020, Google plans to close in on the Microsoft suite’s core communication features by migrating businesses to Hangouts Chat, the messaging complement to G Suite’s calling and video conferencing apps.

In mid-2020, Hangouts Chat will replace an older, more basic chat app called Hangouts. While the new app is an improvement, Google will have to add features and build a much larger partner ecosystem to reach par with Office 365.

What’s more, Google’s strategy of maintaining separate products for core communications services is at odds with the direction of the market. Vendors like Microsoft have consolidated calling, messaging and meetings services into a single user interface. But Google is keeping Hangouts Chat distinct from the video conferencing app Hangouts Meet.

“Their challenges are more related to fundamentally who they are,” TJ Keitt, an analyst at Forrester Research, said. “They’re a company that, for a while, had struggled to indicate they understand all the things that large enterprises require.”

G Suite has trailed Office 365 for years. In particular, Google has struggled to appeal to organizations with thousands and tens of thousands of employees. Those customers often require complex feature sets, but Google likes to keep things simple.

“It’s really important for us to provide just really simple, delightful experiences that work,” Smita Hashim, manager of G Suite’s communications apps, said in December. “It’s not like we need every bell and whistle and every feature.”

In 2019, Google tackled low-hanging fruit that had been standing in the way of selling G Suite to customers with thousands of employees. Giving customers some control over where their data is stored was a significant change. Also, adding numerous IT controls and security backstops was critical to enterprises.

But Google does not appear interested in matching Office 365 feature-for-feature. Instead, analysts expect the company will seek to grow G Suite in 2020 and beyond by focusing on specific industries and kinds of companies.

“If Google plays the long game, they don’t need to really worry about whether or not they are beating Microsoft in a lot of the companies that are here right now,” Keitt said. Instead, Google can target new and adolescent companies that haven’t bought into Office 365.

Google’s targets will likely include the verticals of education and technology, as well as fast-growing businesses with a young workforce. The company has already won some big names. In 2019, G Suite added tech company Iron Mountain, with 26,000 employees, and Whirlpool, with 92,000 employees.

In 2020, Google needs to decide whether to get serious about building a communications portfolio on par with Microsoft’s. That would entail expanding the business calling service it launched this year, Google Voice for G Suite.

So far, the vendor has signaled it will keep the calling service simple. Whereas traditional telephony systems offer upwards of 200 features, Google opted for fewer than 20. The new year will likely bring only incremental changes, such as the certification of more desk phones.

“I think, incrementally, they are continuing to improve. They are trying to close the gap,” said Irwin Lazar, an analyst at Nemertes Research. “What I haven’t seen Google really try to do is leapfrog the market.”

Nevertheless, the cloud productivity market is likely still a lucrative one for Google. As of February, 5 million organizations subscribed to G Suite, some paying as much as $25 per user, per month. 

Google Cloud, a division that includes G Suite as well as the vendor’s infrastructure-as-a-service platform, was on track to generate $8 billion in annual revenue as of July.

“Being number two in a multi-billion-dollar [office productivity] market is fine,” said Jeffrey Mann, an analyst at Gartner.

For Sale – AOC AGON AG251FZ 240Hz 24.5″ LED FHD (1920×1080) Freesync 1ms Gaming monitor

Don’t have time for competitive gaming anymore. Purchased brand new from Amazon 5 months ago; I still have the original packaging and power adaptor. The DisplayPort cable that came with it was faulty, so this sale includes the replacement I bought:

Club3D CAC-2067 DisplayPort to DisplayPort 1.4/HBR3 Cable DP 1.4 8K 60Hz 1m/3.28ft, Black

I’ll be moving from Durham to Staffordshire soon, so collection is available from Durham up to the 13th of December and from Staffordshire after that date.

Spoken acquisition the highlight of Avaya Engage 2018

Avaya took a big step toward building a competitive cloud-based unified communications portfolio with the acquisition of contact-center-as-a-service provider Spoken Communications.

Avaya announced the all-cash deal this week at the Avaya Engage 2018 user conference — the first since Avaya exited bankruptcy late last year. At the show, the company also launched a desktop phone series, an all-in-one huddle-room video conferencing system and cloud-based customer support software called Ava.

Avaya plans to offer Spoken services as an option for customers who want to move to the cloud slowly. Companies using Avaya on-premises software can swap out call-center features one at a time and replace them with the Spoken cloud version.

“With the acquisition of Spoken, it’s clear that Avaya is putting more of an emphasis on building out its own hosted offerings that it can either sell direct or through channels,” said Irwin Lazar, an analyst at Nemertes Research, based in Mokena, Ill.

Avaya’s cloud strategy

Only a small percentage of Avaya’s customers use its cloud-based services, which lag behind those of rivals Cisco and Microsoft. Nevertheless, the market for contact center and UC as a service is growing much faster than on-premises software, analysts said.

“The current executive team is determined to shift Avaya’s focus to the cloud in terms of both technology development and business model,” said Elka Popova, an analyst at consulting firm Frost & Sullivan, based in San Antonio. “The team acknowledges they are a bit late to the game, and most of their cloud portfolio is a work in progress, but the determination is there.”

Since last year, Avaya has worked with Spoken on bringing contact center as a service (CCaaS) to Avaya customers through product integrations. The joint effort has led to integration between Spoken’s cloud-based services and Avaya’s on-premises Call Center Elite and Aura Communication Manager. The latter is Avaya’s UC platform.

Spoken uses speech recognition in its CCaaS offering to automate call-center processes and make customer service agents more efficient. For example, Spoken can transcribe conversations agents have with each customer, which frees customer reps from having to type notes into the system manually.

Spoken technology can also listen for keywords. If it hears the word invoice, for example, it can retrieve the customer’s bill automatically for the agent.
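
As a rough illustration of that kind of keyword-triggered retrieval, the sketch below scans each transcribed utterance and pulls up a document for the agent on a match. The lookup callable and document names are hypothetical stand-ins, not Spoken's or Avaya's actual APIs.

```python
# Illustrative keyword-spotting sketch: scan a transcribed utterance and, on a
# match, fetch a document to surface on the agent's desktop. The lookup callable
# and document names are hypothetical, not Spoken's or Avaya's APIs.

KEYWORD_DOCUMENTS = {"invoice": "latest bill", "warranty": "warranty terms"}

def documents_for_utterance(utterance: str, customer_id: str, fetch_document) -> list:
    """Return the documents an agent should see for this utterance."""
    words = {word.strip(".,!?").lower() for word in utterance.split()}
    return [fetch_document(customer_id, doc)
            for keyword, doc in KEYWORD_DOCUMENTS.items() if keyword in words]

# Example usage with a dummy document store
hits = documents_for_utterance("I have a question about my invoice",
                               customer_id="C-1042",
                               fetch_document=lambda cid, doc: f"{doc} for {cid}")
print(hits)  # ['latest bill for C-1042']
```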

Spoken has more than 170 patents and patent applications that will go to Avaya, which expects to close the transaction by the end of March. The company did not release financial details.

Other Avaya Engage 2018 announcements

In other Avaya Engage 2018 news, the vendor introduced a cloud-based messaging platform for reaching customers on social media, such as Facebook, Twitter, WeChat and Line. Avaya’s Ava can provide immediate self-service support or send customers to an agent. If the latter occurs, then all information gathered during the automated service is handed to the service rep.
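
A rough sketch of the handoff such a flow implies appears below; every field name is invented for illustration and is not Avaya's Ava schema.

```python
# Illustrative handoff structure from an automated Ava-style session to a live
# agent. All field names are invented; this is not Avaya's actual schema.
from dataclasses import dataclass, field

@dataclass
class SelfServiceSession:
    customer_id: str
    channel: str                  # e.g. "facebook", "wechat", "line"
    detected_intent: str
    transcript: list = field(default_factory=list)

    def handoff_payload(self) -> dict:
        """Bundle everything gathered so the agent never re-asks the customer."""
        return {
            "customer_id": self.customer_id,
            "channel": self.channel,
            "intent": self.detected_intent,
            "conversation_so_far": self.transcript,
        }

session = SelfServiceSession("C-1042", "facebook", "billing_question",
                             ["Hi, my invoice looks wrong."])
print(session.handoff_payload())
```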

Ava supports 34 languages and has APIs Avaya partners can use for product integration. Last year, Avaya launched an initiative called A.I.Connect to encourage other vendors to integrate products that have artificial intelligence or machine learning capabilities with Avaya communication software.

Despite its cloud focus, Avaya is still paying attention to hardware. The company announced at Engage the J series line of desktop phones. The three phones come with Bluetooth and Wi-Fi connectivity. Avaya plans to release the hardware in the second quarter.

Also, the company introduced a second Vantage touchscreen phone. Unlike the first one unveiled last year, the latest hardware comes with the option of a traditional keyboard. It also supports Avaya IP Office, which provides a combination of cloud-based and on-premises UC services.

Finally, Avaya launched the CU-360 all-in-one video conferencing system for huddle rooms, which small teams of corporate workers use for meetings. The hardware can connect to mobile devices for content sharing.

Overall, the Avaya Engage 2018 conference reflected positively on the executive team chosen by Avaya CEO Jim Chirico, analysts said. Formerly Avaya’s COO, Chirico replaced former CEO Kevin Kennedy, who retired Oct. 1.

“Overall, the event did not produce a wow effect,” Popova said. “There was nothing spectacular, but the spirits were high, and the partner and customer sentiments were mostly positive.”

AI apps demand DevOps infrastructure automation

Artificial intelligence can offer enterprises a significant competitive advantage for some strategic applications. But enterprise IT shops will also require DevOps infrastructure automation to keep up with frequent iterations.

Most enterprise shops won’t host AI apps in-house, but those that do will turn to sophisticated app-level automation techniques to manage IT infrastructure. And any enterprise that wants to inject AI into its apps will require rapid application development and deployment — a process early practitioners call “DevOps on steroids.”

“When you’re developing your models, there’s a rapid iteration process,” said Michael Bishop, CTO of Alpha Vertex, a fintech startup in New York that specializes in AI data analysis of equities markets. “It’s DevOps on steroids because you’re trying to move quickly, and you may have thousands of features you’re trying to factor in and explore.”

DevOps principles of rapid iteration will be crucial to train AI algorithms and to make changes to applications based on the results of AI data analysis at Nationwide Mutual Insurance Co. The company, based in Columbus, Ohio, experiments with IBM’s Watson AI system to predict whether new approaches to the market will help it sell more insurance policies and to analyze data collected from monitoring devices in customers’ cars that help it set insurance rates.

“You’ve got to have APIs and microservices,” said Carmen DeArdo, technology director responsible for Nationwide’s software delivery pipeline. “You’ve got to deploy more frequently to respond to those feedback loops and the market.”

This puts greater pressure on IT ops to provide developers and data scientists with self-service access to an automated infrastructure. Nationwide relies on ChatOps for self-service, as chatbots limit how much developers switch between different interfaces for application development and infrastructure troubleshooting. ChatOps also allows developers to correct application problems before they enter a production environment.
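
A minimal sketch of the ChatOps pattern described above: a chatbot parses a deploy command and calls an automation hook, so the developer never leaves the chat window. The command syntax and the deploy_service() hook are assumptions for illustration, not Nationwide's actual tooling.

```python
# Minimal ChatOps sketch: a chat command triggers an infrastructure action.
# The command syntax and deploy_service() hook are illustrative assumptions.
import shlex

def deploy_service(service: str, environment: str) -> str:
    # Placeholder for a call into a real pipeline (CI/CD or IaC API).
    return f"Deployment of {service} to {environment} queued."

def handle_chat_message(message: str) -> str:
    """Respond to messages like '/deploy quoting-api staging'."""
    parts = shlex.split(message)
    if len(parts) == 3 and parts[0] == "/deploy":
        _, service, environment = parts
        return deploy_service(service, environment)
    return "Usage: /deploy <service> <environment>"

print(handle_chat_message("/deploy quoting-api staging"))
```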

AI apps push the limits of infrastructure automation

Enterprise IT pros who support AI apps quickly find that no human can keep up with the required rapid pace of changes to infrastructure. Moreover, large organizations must deploy many different AI algorithms against their data sets to get a good return on investment, said Michael Dobrovolsky, executive director of the machine learning practice and global development at financial services giant Morgan Stanley in New York.

“The only way to make AI profitable from an enterprise point of view is to do it at scale; we’re talking hundreds of models,” Dobrovolsky said. “They all have different lifecycles and iteration [requirements], so you need a way to deploy it and monitor it all. And that is the biggest challenge right now.”

Houghton Mifflin Harcourt, an educational book and software publisher based in Boston, has laid the groundwork for AI apps with infrastructure automation that pairs Apache Mesos for container orchestration with Apache Aurora, an open source utility that allows applications to automatically request infrastructure resources.

“Long term, the goal is to put all the workload management in the apps themselves, so that they manage all the scheduling,” said Robert Allen, director of engineering at Houghton Mifflin Harcourt. “I’m more interested in two-level scheduling [than container orchestration], and I believe managing tasks in that way is the future.”
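
In the two-level scheduling model Allen describes, the application declares the resources it needs and the cluster scheduler finds room for them. A simplified Aurora job definition, written in the Python-based DSL used by .aurora files, might look roughly like this; the job name, command and resource figures are illustrative only.

```python
# Simplified Apache Aurora job definition (.aurora files use a Python-based DSL).
# The application declares its own CPU, RAM and disk needs; names and figures
# here are illustrative only.

ingest = Process(
    name='ingest',
    cmdline='python ingest.py --source s3://example-bucket/course-data')

ingest_task = Task(
    processes=[ingest],
    resources=Resources(cpu=2.0, ram=2048 * MB, disk=4096 * MB))

jobs = [Job(
    cluster='devcluster',
    role='analytics',
    environment='prod',
    name='course_data_ingest',
    task=ingest_task)]
```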

Analysts agreed application-driven infrastructure automation will be ideal to support AI apps.

“The infrastructure framework for this will be more and more automated, and the infrastructure will handle all the data preparation and ingestion, algorithm selection, containerization, and publishing of AI capabilities into different target environments,” said James Kobielus, analyst with Wikibon.

Automated, end-to-end, continuous release cycles are a central focus for vendors, Kobielus said. Tools from companies such as Algorithmia can automate the selection of back-end hardware at the application level, as can services such as Amazon Web Services’ (AWS) SageMaker. Some new infrastructure automation tools also provide governance features such as audit trails on the development of AI algorithms and the decisions they make, which will be crucial for large enterprises.
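
As an example of what app-level hardware selection looks like in practice, a SageMaker training request names its back-end instance type in the API call itself. The sketch below uses the standard boto3 call; the job name, ARN, image URI and S3 path are placeholders.

```python
# App-level hardware selection: the training job request itself names the GPU
# instance type. The job name, ARN, image URI and S3 path are placeholders.
import boto3

sagemaker = boto3.client("sagemaker")

sagemaker.create_training_job(
    TrainingJobName="demo-model-training",
    AlgorithmSpecification={
        "TrainingImage": "123456789012.dkr.ecr.us-east-1.amazonaws.com/demo:latest",
        "TrainingInputMode": "File",
    },
    RoleArn="arn:aws:iam::123456789012:role/DemoSageMakerRole",
    OutputDataConfig={"S3OutputPath": "s3://demo-bucket/output/"},
    ResourceConfig={
        "InstanceType": "ml.p3.2xlarge",   # back-end hardware chosen by the application
        "InstanceCount": 1,
        "VolumeSizeInGB": 50,
    },
    StoppingCondition={"MaxRuntimeInSeconds": 3600},
)
```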

Early AI adopters favor containers and serverless tech

Until app-based automation becomes more common, companies that work with AI apps will turn to DevOps infrastructure automation based on containers and serverless technologies.

Veritone, which provides AI apps as a service to large customers such as CBS Radio, uses Iron Functions, now the basis for Oracle’s Fn serverless product, to orchestrate containers. The company, based in Costa Mesa, Calif., evaluated AWS Lambda a few years ago, but saw Iron Functions as a more suitable combination of functions as a service and containers. With Iron Functions, containers can process more than one event at a time, and functions can attach to a specific container, rather than exist simply as snippets of code.

“If you have apps like TensorFlow or things that require libraries, like [optical character recognition], where typically you have to use Tesseract and compile C libraries, you can’t put that into functions AWS Lambda has,” said Al Brown, senior vice president of engineering for Veritone. “You need a container that has the whole environment.”

Veritone also prefers this approach to Kubernetes and Mesos, which focus on container orchestration only.

“I’ve used Kubernetes and Mesos, and they’ve provided a lot of the building blocks,” Brown said. “But functions let developers focus on code and standards and scale it without having to worry about [infrastructure].”
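
A container-packaged function in this style typically reads its input from stdin and writes its result to stdout, which is roughly how Iron Functions invokes default-format functions. The sketch below is illustrative and assumes Tesseract and pytesseract are baked into the container image.

```python
# Sketch of a container-packaged function in the Iron Functions style: input
# arrives on stdin, the result goes to stdout. Assumes pytesseract and the
# native Tesseract libraries are installed in the container image.
import json
import sys

def handle(payload: dict) -> dict:
    # Hypothetical OCR task that needs native libraries baked into the image.
    import pytesseract
    from PIL import Image

    text = pytesseract.image_to_string(Image.open(payload["image_path"]))
    return {"text": text}

if __name__ == "__main__":
    request = json.loads(sys.stdin.read() or "{}")
    print(json.dumps(handle(request)))
```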

Beth Pariseau is senior news writer for TechTarget’s Cloud and DevOps Media Group. Write to her at [email protected] or follow @PariseauTT on Twitter.

IBM cooks up a hardware architecture for tastier cloud-based services

IBM hopes to raise its competitive profile in cloud services when it introduces new hardware and cloud infrastructure by the end of this year or early 2018.

The company will add a new collection of hardware and software products that deliver artificial intelligence (AI) and cloud-based services faster and more efficiently.

Among the server-based hardware technologies are 3D Torus, an interconnection topology for message-passing multicomputer systems, and new accelerators from Nvidia, along with advanced graphics processing unit (GPU) chips. Also included is Single Large Expensive Disk technology, a traditional disk technology currently used in mainframes and all-flash-based storage, according to sources familiar with the company’s plans.

The architecture achieves sub-20-millisecond performance latencies by eliminating routers and switches, and it embeds those capabilities into chips that communicate more directly with each other, one source said.

The new collection of hardware applies some of the same concepts as IBM’s Blue Gene supercomputer, concepts that were also used to create Watson. In the model of those special-purpose machines, the new system is designed specifically to do one thing: Deliver AI-flavored cloud-based services.

These technologies, which can work with both IBM Power and Intel chips in the same box, will be used only in servers housed in IBM’s data centers. IBM will not sell servers containing these technologies commercially to corporate users. The new technologies could reach IBM’s 56 data centers late this year or early next year.

AI to the rescue for IBM’s cognitive cloud

IBM’s cloud business has grown steadily from its small base over the past three to four years to revenues of $3.9 billion in the company’s second quarter reported last month and $15.1 billion over the past 12 months. The company’s annual run rate for as-a-service revenues rose 32% from a year ago to $8.8 billion.

At the same time, sales of the company’s portfolio of cognitive solutions, with Watson at its core, took a step back, falling 1% in the second quarter after 3% growth in this year’s first quarter.

That doesn’t represent a critical setback, but it has caused some concern, because the company hangs much of its future growth on Watson.

Three years ago, IBM sank $1 billion to set up its Watson business unit in the New York City borough of Manhattan. IBM CEO Ginni Rometty has often cited lofty goals for the unit, claiming Watson would reach 1 billion consumers by the end of 2017, hit $1 billion in revenue by the end of 2018 and, eventually, generate $10 billion in revenue by an unnamed date. To achieve those goals, IBM requires a steady infusion of AI and machine learning technologies.

IBM executives remain confident, given the technical advancements in AI and machine learning capabilities built into Watson and a strict focus on corporate business users, while competitors — most notably Amazon — pursue consumer markets.

“All of our efforts around cognitive computing and AI are aimed at businesses,” said John Considine, general manager of cloud infrastructure at IBM. “This is why we have made such heavy investments in GPUs, bare-metal servers and infrastructure, so we can deliver these services with the performance levels corporate users will require.”

However, not everyone is convinced that IBM can reach its goals for cognitive cloud-based services, at least in the predicted time frames. And it will still be an uphill climb for Big Blue as it vies with cloud competitors that were faster out of the gate.

Lydia Leong, an analyst with Gartner, could not confirm details of IBM’s upcoming new hardware for cloud services, but pointed to the company’s efforts around a new cloud-oriented architecture dubbed Next Generation Infrastructure. NGI will be a new platform run inside SoftLayer facilities, but it’s built from scratch by a different team within IBM, she said.

IBM intends to catch up to the modern world of infrastructure with hardware and software more like those from competitors Amazon Web Services and Microsoft Azure, and thus deliver more compelling cloud-based services. NGI will be the foundation on which to build new infrastructure-as-a-service (IaaS) offerings, while IBM Bluemix, which remains a separate entity, will continue to run on top of bare metal.

Leong said she is skeptical, however, that any new server hardware will give the company a performance advantage to deliver cloud services.

“My expectation is IBM will not have a long-term speed advantage with this — I’m not even sure they will have a short-term one,” Leong said. “Other cloud competitors are intensely innovative and have access to the same set of technologies and tactical ideas, and they will move quickly.”

IBM has stumbled repeatedly with engineering execution in its cloud portfolio, which includes last year’s launch and demise of a new IaaS offering, OpenStack for Bluemix. “[IBM has] talked to users about this [NGI] for a while, but the engineering schedule keeps getting pushed back,” she said.

IBM now enters the cloud infrastructure market extremely late — and at a time when the core infrastructure war has been mostly won, Leong said. She suggested IBM might be better served to avoid direct competition with market leaders and focus its efforts where it has an established advantage and can differentiate with things like Watson.

Riverbed, Talari, Cisco bolster SD-WAN vendor product lines

Vendors are scrambling to make their SD-WAN product lines more competitive in a market that IDC predicts will grow by nearly 70% annually through 2021. Providers that bolstered their offerings this week to ride that growth include Cisco, Riverbed Technology and Talari Networks.

Cisco sought to improve its competitive posture in the software-defined WAN (SD-WAN) market with the completion of the Viptela acquisition announced in May. Viptela brings to Cisco customers a quicker return on investment than the vendor’s competing intelligent WAN (IWAN), analysts said.

IWAN is a WAN optimization-based product that uses Cisco’s ISR routers. Viptela is a pure-play SD-WAN product that includes virtualized routers running on less expensive x86 boxes.

Cisco said it would continue to invest in, and support, IWAN. However, the company has not said whether it would continue to advance Viptela on x86 systems. Instead, Cisco said it would incorporate Viptela technology on ISR and ASR routers and “customers will be able to migrate to the new unified solution as needed or desired.”

Cisco, Talari and Riverbed are among two dozen SD-WAN vendors battling for a slice of the crowded market, which IDC forecasts will grow from $574.2 million in 2016 to $8.05 billion in 2021. To build a broader SD-WAN platform, vendors are adding APIs that let partners tie in network and security services, such as WAN optimization, firewalls and IP address management.

WAN optimization added to Talari SD-WAN product line

Talari added WAN optimization features to its product and made them available at no extra cost. Also, it partnered with cloud-based security vendor Zscaler Inc. to provide customers with an alternative to on-premises firewalls for internet traffic.

The WAN-optimization features include congestion controls and data compression and deduplication. Talari said the features improve the efficiency of bulk data transfers from a single location.

Optimizing traffic flows is what Talari presents as its market differentiator. The vendor’s analytics factor in packet loss, latency, jitter and bandwidth in selecting the best traffic path.
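
A hedged sketch of what such path scoring could look like appears below; the weights and example figures are invented for illustration and do not reflect Talari’s actual algorithm.

```python
# Illustrative path-selection scoring across packet loss, latency, jitter and
# available bandwidth. Weights and figures are invented, not Talari's algorithm.

def path_score(loss_pct: float, latency_ms: float, jitter_ms: float,
               available_mbps: float) -> float:
    """Lower is better: penalize loss, delay and jitter; reward spare bandwidth."""
    return 100 * loss_pct + latency_ms + 2 * jitter_ms - 0.1 * available_mbps

paths = {
    "mpls":       dict(loss_pct=0.1, latency_ms=35, jitter_ms=2, available_mbps=50),
    "broadband":  dict(loss_pct=0.5, latency_ms=25, jitter_ms=8, available_mbps=200),
    "lte_backup": dict(loss_pct=1.0, latency_ms=60, jitter_ms=15, available_mbps=30),
}

best = min(paths, key=lambda name: path_score(**paths[name]))
print(f"Selected path: {best}")
```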

Riverbed SD-WAN to get wireless integration

Riverbed’s biggest differentiator among SD-WAN vendors is its roots in WAN optimization — technology at the core of the company’s business since its founding in 2002. This year, Riverbed acquired Wi-Fi vendor Xirrus Inc. to integrate its wireless LAN technology into SD-WAN.

Riverbed plans to release this year the initial integration of Xirrus access points and switches to SteelConnect Manager, which is Riverbed’s SD-WAN management console. Riverbed already provides integration between SteelConnect and SteelCentral, the vendor’s network performance monitoring tool, and SteelHead, its WAN optimization technology.

The developing Riverbed, Talari and Cisco SD-WAN products reflect market trends driving sales. Factors making SD-WAN a hot ticket among companies are the increasing adoption of cloud-based applications, mobile devices and the internet of things. The three have substantially increased the amount of data flowing from corporate networks to the internet, a scenario requiring management technology more flexible than legacy systems.
