New report outlines how businesses moving from on-premises datacenters to the Microsoft Cloud can achieve sustainable innovation
REDMOND, Wash. — May 17, 2018 — A new report issued Thursday by Microsoft Corp. in partnership with WSP shows significant energy and carbon emissions reduction potential from the Microsoft Cloud when compared with on-premises datacenters. These gains, as much as 93 percent more energy efficient and as high as 98 percent more carbon efficient, are due to Microsoft’s extensive investments in IT efficiency from chip-to-datacenter infrastructure, as well as renewable energy.
“The world is producing more data than ever, making our infrastructure decisions about how to power this digital transformation incredibly important,” said Brad Smith, president and chief legal officer, Microsoft. “Today’s report confirms what we’ve long believed — that investing in sustainability is good for business, good for customers and good for the planet.”
Specifically, the report found that cloud investments made by Microsoft in IT operational efficiency, IT equipment efficiency, datacenter infrastructure efficiency and renewable electricity were responsible for the environmental benefits. These efficiencies translate into both energy and carbon savings for Microsoft and customers using Microsoft Cloud services.
Microsoft Cloud services achieve energy and emissions reductions in comparison with on-premises deployments in every scenario assessed — Microsoft Azure Cloud Compute, Azure Storage, Exchange Online and SharePoint Online.
With more regions than any other cloud provider, Microsoft provides cloud services to customers around the world. As customers across all industries move to the cloud, sustainability and environmental responsibility are key factors in their choice of cloud provider.
“Schneider Electric chose the Microsoft Cloud to power our numerous cloud-based offerings, and it has helped us achieve our goal of becoming a global leader in sustainable energy management,” said Michael MacKenzie, vice president, EcoStruxure Technology Platform – IoT & Digital Offers, Schneider Electric. “The fact that Microsoft shares our sustainability values and focus on decreasing environmental impact makes the company a natural partner for us.”
“When organizations choose low-carbon cloud computing, they are taking an important step forward on sustainability,” said Lance Pierce, president of CDP North America. “Sustainable digital transformation, powered by a cleaner cloud, enables the creation of a sustainable and thriving economy that works for people and planet in the long term.”
Learn more about Microsoft’s investments and approach to sustainability in the cloud at https://blogs.microsoft.com/on-the-issues/?p=58951. The report can be found in full at “The Carbon Benefits of Cloud Computing: A Study on the Microsoft Cloud.”
Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.
For more information, press only:
Microsoft Media Relations, WE Communications, (425) 638-7777
Note to editors: For more information, news and perspectives from Microsoft, please visit the Microsoft News Center at http://news.microsoft.com. Web links, telephone numbers and titles were correct at time of publication, but may have changed. For additional assistance, journalists and analysts may contact Microsoft’s Rapid Response Team or other appropriate contacts listed at http://news.microsoft.com/microsoft-public-relations-contacts.
With businesses becoming more digitally dependent and IT responsibilities outpacing budgets, IT shops are being forced to evolve. This transformation requires not just a change in infrastructure technology, but in the organization of IT personnel as well — an organizational makeover that often determines the success of digital business.
As firms drive new digital initiatives, such as developing digital products and services, using analytics and investing in application development, IT services have started to have a more direct effect on revenue opportunities. As a result, IT must become more responsive in order to speed up the delivery of those new services.
To improve responsiveness, IT shops often shift personnel to work directly with line-of-business teams to better understand their demands. Companies add budget and headcount to address this increase in IT demands and support each new initiative, while simultaneously adding budget for the increased infrastructure needed to handle those initiatives. Or they can find a new way to get the same results.
The new way
Ultimately, it’s the desire to find innovative ways to dramatically reduce the cost of routine IT maintenance and management that drives demand for infrastructure transformation. The end result is an as-a-service infrastructure that frees existing personnel to cover the added responsibilities and speed delivery of IT services. Multiple emergent technologies, such as flash storage, deliver transformational benefits in terms of performance, efficiency and TCO that can help. Technologies like flash are only part of the story, however. Another possibility that’s just as beneficial is IT infrastructure automation.
Manual tasks inhibit digital business. Every hour a highly trained IT resource spends on a manual — and likely routine — task is an hour that could have been spent helping to drive a potential revenue-generating digital initiative. As businesses increase their IT infrastructure automation efforts, an emerging concept called composable infrastructure has gained interest.
With composable infrastructure, IT infrastructure is virtualized to dynamically and efficiently allocate resources to individual applications. Composable infrastructure also provides the necessary analytics to fine-tune infrastructure. Ideally, software ensures the right resources are available at the right time, new resources can be added on demand, and capacity or performance can be contracted when demand changes. Cisco, Hewlett Packard Enterprise, Kaminario and other vendors promote the composable infrastructure concept.
There are several factors to consider as composable infrastructure gains traction:
- The intelligence to drive IT infrastructure automation: Arguably the first step in any effort to automate IT is knowing what to automate, along with when and how to do it efficiently. How much performance and capacity does each application need? How much can the infrastructure provide? How will these demands change over time? Providing this information requires the right level of intelligence and predictive analytics to understand the nature of each application’s demand. Done right, this results in more efficient infrastructure design and a reduction in capital investment. An even more valuable likely benefit is in personnel resource savings, as this intelligence enables automatic tuning of the infrastructure.
- Granularity of control: Intelligence is important, but the ability to use that intelligence offers the most tangible benefits. Composable infrastructure products typically provide controls, such as APIs, to enable programmatic management. In some cases, this lets the application automatically demand resources when it identifies increasing demand. The more likely near-term scenario is that these controls will be used to automate planned manual tasks, such as standing up infrastructure for the deployment of a new application. Or, for example, you could use the controls to automate the expansion of a virtual machine environment. As IT infrastructure automation efforts expand and the number of infrastructure elements — e.g., performance and capacity — that can be automatically controlled increases, the value of composable infrastructure increases.
- Architectural scale: Every IT infrastructure option seems to be scalable these days. For composable infrastructure, capacity and even performance scalability are just part of the story. Necessary data services and data management must scale as well. In addition, for the infrastructure to support IT automation, a time element is added to that scale. So when a request for scale is made, the infrastructure must react in a timely and predictable manner. For this, composable infrastructure requires high-performing components and latency reduction across data interconnects.
Nonvolatile memory express (NVMe) plays a role here. While some view NVMe as just faster flash, the low-latency interconnect is critical to a scalable IT infrastructure effort. Data services add latency, and reducing the latency of the data path lets these data services extend to a broader infrastructure. Additionally, flexible scale isn’t just about adding resources; it’s also about freeing up resources that can be better used elsewhere.
The end goal is to deliver an infrastructure that can respond effectively to automation and reduce the number of manual tasks that must be handled by IT. Composable infrastructure isn’t the only way to achieve IT infrastructure automation, however. Software-defined storage and converged infrastructure can also help automate IT and go a long way toward eliminating the enemy of digital business, manual IT tasks.
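The control loop described above can be sketched in a few lines. This is an illustrative model only — the composer client below is hypothetical, not any vendor's real API — but it shows how programmatic controls turn a manual capacity decision into an automated one, including the often-overlooked scale-down path:

```python
# Hypothetical sketch of composable-infrastructure automation: a client that
# stands in for a vendor's management API, driven by a simple autoscale policy.

class ComposerClient:
    """Stand-in for a composable-infrastructure management API (not a real product)."""
    def __init__(self):
        self.allocated_nodes = {"app-tier": 2}

    def scale(self, pool, delta):
        # Compose or release nodes, never dropping below one.
        self.allocated_nodes[pool] = max(1, self.allocated_nodes[pool] + delta)
        return self.allocated_nodes[pool]

def autoscale(client, pool, utilization, high=0.80, low=0.30):
    """Add a node above the high-water mark, release one below the low mark."""
    if utilization > high:
        return client.scale(pool, +1)
    if utilization < low:
        return client.scale(pool, -1)
    return client.allocated_nodes[pool]

client = ComposerClient()
print(autoscale(client, "app-tier", 0.92))  # demand spike: 2 -> 3 nodes
print(autoscale(client, "app-tier", 0.10))  # demand drop:  3 -> 2 nodes
```

Note the release path: as the article says, flexible scale is also about freeing resources so they can be better used elsewhere.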
And the more manual your IT processes are, the less competitive you’ll be as a digital business. As businesses seek to build an as-a-service infrastructure, composable infrastructure is another innovative step toward creating an automated, on-demand data center.
Organizations around the world are transforming for the digital era, changing how businesses, cities and citizens work. This new digital era will address many of the problems created in the earlier agricultural and industrial eras, making society safer, more sustainable, more efficient and more inclusive.
But an infrastructure gap is keeping this broad vision from becoming a reality. Digital transformation is happening faster than expected, but only in pockets. Microsoft and its partners seek to help cities and other public organizations close the gap, with advanced technologies in the cloud, data analytics, machine learning and artificial intelligence (AI).
Microsoft’s goal is to be a trusted partner to both public and private organizations in building connected societies. This summer, an IDC survey named Microsoft the top company for trust and customer satisfaction in enabling smart-city digital transformations.
Last week at a luncheon in New York City, Microsoft and executives from three organizations participating in the digital transformation shared how they are helping to close the infrastructure gap.
TomTom NV, based in Amsterdam, traditionally focused on providing consumers with personal navigation. Now, “the need for locations surpasses the need for navigation — it’s everywhere,” said Arnold Meijer, strategic business development manager. “Managing a fleet of connected devices or ordering a ride from your phone — these things weren’t possible five years ago. We’re turning to cloud connectivity and the Internet of Things as tools to keep our maps and locations up to date.”
Sensors from devices and vehicles on the road deliver condition and usage data essential for highway planners, infrastructure managers and fleet operators to make well-informed decisions.
Autonomous driving is directly in TomTom’s sights, a way to cut down on traffic accidents, one of the top 10 causes of death worldwide, and to reduce emissions through efficient routing. “You probably won’t own a vehicle 20 years from now, and the one that picks you up won’t have a driver,” Meijer said. “If you do go out driving yourself, it will be for fun.”
With all that time freed up from driving, travelers can do something else such as relax or work. Either option presents new business opportunities for companies that offer entertainment or enable productivity for a mobile client, who is almost certainly connected to the internet. “There will be new companies coming out supporting that, and I definitely foresee Microsoft and other businesses active there,” Meijer said.
“Such greatly eased personal transport may decrease the need to live close to work or school, changing settlement patterns and reducing the societal impacts of mobility. All because we can use location and cloud technology,” he added.
The New York City Dept. of Education is using Microsoft technology extensively in a five-year, $25-million project that will tell parents their children’s whereabouts while the students are in transit, increase use of the cafeterias and provide access to information about school sports.
The city’s Office of Pupil Transportation provides rides to more than 600,000 students per day, with more than 9,000 buses and vehicles. For a preliminary version of the student-tracking system, the city has equipped its leased buses with GPS devices.
“When the driver turns on the GPS and signs in his bus, we can find out where it is at any time,” said George Pitagorsky, executive director and CIO for the department’s Office of School Support Services. If parents know what bus their child is on, they can more easily meet it at the stop or be sure to be there when the child is brought home.
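The core geometry behind that kind of parent-facing feature is straightforward. The sketch below is not the Office of Pupil Transportation's actual system — the alert radius and coordinates are illustrative — but it shows how a latest GPS fix can be tested against a student's stop:

```python
# Illustrative only: deciding whether a bus's GPS fix is within alert range of
# a stop, using the standard haversine great-circle distance formula.

from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in meters."""
    p1, p2 = radians(lat1), radians(lat2)
    dp, dl = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dp / 2) ** 2 + cos(p1) * cos(p2) * sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def bus_approaching(bus_fix, stop, alert_radius_m=500):
    """True when the bus is within the alert radius of the stop."""
    return haversine_m(*bus_fix, *stop) <= alert_radius_m

stop = (40.7128, -74.0060)                         # student's stop (hypothetical)
print(bus_approaching((40.7150, -74.0080), stop))  # roughly 300 m away
print(bus_approaching((40.7500, -74.0060), stop))  # roughly 4 km away
```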
A next step will be GPS units that don’t require driver activation. To let the system track not just the vehicle but its individual occupants, drivers will still need to register students into the GPS when they get on the bus.
“Biometrics like facial recognition that automate check-in when a student steps onto a bus — we’re most likely going to be there, but we’re not there yet,” Pitagorsky said.
Further out within the $25-million Illumination Program, a new bus-routing tool will replace systems developed more than 20 years ago, allowing the creation of more efficient routes, making course corrections to avoid problems, easily gathering vehicle-maintenance costs and identifying problem vehicles.
Other current projects include a smartphone app to advise students of upcoming meal choices in the school cafeterias, with an eye to increasing cafeteria use, enhancing students’ nutritional intake and offering students a voice in entree choices. The department has also created an app that displays all high school sports games, locations and scores.
A new customer-relations management app will let parents update their addresses and request special transport services on behalf of their children, with no more need to make a special visit to the school to do so. A mobile app will allow parents and authorized others to locate their children or bus, replacing the need for a phone call to the customer service unit. And business intelligence and data warehousing will get a uniform architecture, to replace the patchwork data, systems and tools now in place.
Fathym, a startup in Boulder, Colorado, is directly addressing infrastructure gaps through a rapid-innovation platform intended to harmonize disparate data and apps and facilitate Internet of Things solutions.
“Too often, cities don’t have a plan worked out and are pouring millions of dollars into one solution, which is difficult to adjust to evolving needs and often leads to inaccessible, siloed data,” said co-founder and chief marketing officer Christy Szoke. “Our philosophy is to begin with a small proof of concept, then use our platform to build out a solution that is flexible to change and allows data to be accessible from multiple apps and user types.” Fathym makes extensive use of Azure services but hides that complexity from customers, she said.
To create its WeatherCloud service, Fathym combined data from roadside weather stations and sensors with available weather models to create a road weather forecast especially for drivers and maintenance providers, predicting conditions they’ll find precisely along their route.
“We’re working with at least eight data sets, all completely different in format, time intervals and spatial resolutions,” said Fathym co-founder and CEO Matt Smith. “This is hard stuff. You can’t have simplicity on the front end without a complicated back-end system, a lot of math, and a knowledgeable group of different types of engineers helping to make sense of it all.”
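A tiny slice of that harmonization problem can be sketched as follows. This is not Fathym's actual pipeline — the feeds and bucket size are assumptions — but it shows the basic move: feeds reporting at different intervals are averaged onto a common time grid so they can be joined:

```python
# Hedged sketch: bucketing two sensor feeds with different reporting intervals
# onto a shared 15-minute grid, then joining them on common buckets.

from collections import defaultdict

def to_grid(readings, bucket_seconds=900):
    """Average (timestamp, value) readings that fall into the same time bucket."""
    buckets = defaultdict(list)
    for ts, value in readings:
        buckets[ts // bucket_seconds * bucket_seconds].append(value)
    return {ts: sum(vs) / len(vs) for ts, vs in buckets.items()}

def join_feeds(feed_a, feed_b):
    """Inner-join two gridded feeds on the buckets they share."""
    a, b = to_grid(feed_a), to_grid(feed_b)
    return {ts: (a[ts], b[ts]) for ts in sorted(a.keys() & b.keys())}

# Hypothetical data: road temperature every 5 minutes, wind every 15 minutes.
road_temp = [(0, 1.0), (300, 2.0), (600, 3.0), (900, 4.0)]
wind      = [(0, 30.0), (900, 35.0)]
print(join_feeds(road_temp, wind))
# {0: (2.0, 30.0), 900: (4.0, 35.0)}
```

Real road-weather work adds spatial alignment, unit conversion and model blending on top of this, which is why Smith calls it hard stuff.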
Despite the ease that cloud services have brought to application development, Smith foresees a need for experts to wrangle data even 20 years from now.
“When people say, ‘the Internet of Things is here’ and ‘the robots are going to take over,’ I don’t think they have the respect they should have for how challenging it will remain to build complex apps,” Smith said.
Added Szoke, “You can’t just say ‘put an AI on it’ or ‘apply machine learning’ and expect to get useful data. You will still need creative minds, and data scientists, to understand what you’re looking at, and that will continue to be an essential industry.”
Virtual assistant technology, popular in the consumer world, is migrating toward businesses with the hope of enhancing employee productivity and collaboration. Organizations could capitalize on the familiarity of home-based virtual assistants, such as Siri and Alexa, to boost productivity in the office and launch meetings more quickly.
Last week, Amazon announced Alexa for Business, a virtual assistant that connects Amazon Echo devices to the enterprise. Alexa for Business allows organizations to equip conference rooms with Echo devices that can turn on video conferencing equipment and dial into a conference via voice commands.
“Virtual assistants, such as Alexa, greatly enhance the user experience and reduce the complexity in joining meetings,” Frost & Sullivan analyst Vaishno Srinivasan said.
Personal Echo devices connected to the Alexa for Business platform can also be used for hands-free calling and messaging, scheduling meetings, managing to-do lists and finding information on business apps, such as Salesforce and Concur.
Overcoming privacy and security hurdles
Before enterprise virtual assistants like Alexa for Business can see widespread adoption, they must overcome security concerns.
“Amazon and other providers will have to do some evangelizing to demonstrate to CIOs and IT leaders that what they’re doing is not going to compromise any security,” Gartner analyst Werner Goertz said.
Srinivasan said organizations may have concerns about Alexa for Business collecting data and sharing it in a cloud environment. Amazon has started to address these concerns, particularly when connecting personal Alexa accounts and home Echo devices to a business account.
Goertz said accounts are sandboxed, so users’ personal information will not be visible to the organization. The connected accounts must also comply with enterprise authentication standards. The platform also includes administrative controls that offer shared device provisioning and management capabilities, as well as user and skills management.
Another key challenge is ensuring a virtual assistant device, like the Amazon Echo, responds to a user with information that is highly relevant and contextual, Srinivasan said.
“These devices have to be trained to enhance its intelligence to deliver context-sensitive and customized user experience,” she said.
Integrating with enterprise IT systems
End-user spending on virtual assistant devices is expected to reach $3.5 billion by 2021, up from $720 million in 2016, according to Gartner. Enterprise adoption is expected to ramp up by 2019.
Goertz said Amazon had to do a lot of work “under the hood” to enable the integrations with business apps and vendors such as Microsoft, Cisco, Polycom and BlueJeans. The deep integrations with enterprise IT systems are required to enable future capabilities, such as dictating and sending emails from an Echo device, he said.
Srinivasan said Alexa for Business can extend beyond conference rooms through APIs provided by Amazon’s Alexa Skills Kit for developers.
“Thousands of developers utilize these APIs and have created ‘skills’ that enable automation and increase efficiency within enterprises,” she said.
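A custom skill of the kind those developers build is, at its core, a handler that maps an intent in the request JSON to a spoken response. The sketch below follows the standard Alexa Skills Kit request/response envelope for a Lambda-hosted skill, but the intent name and the meeting lookup are hypothetical stand-ins, not a real enterprise integration:

```python
# Minimal sketch of an ASK custom-skill handler. The "NextMeetingIntent" and
# fake_next_meeting() are invented for illustration; the JSON envelope shapes
# (IntentRequest, outputSpeech) follow the standard Alexa skill interface.

def fake_next_meeting(user_id):
    # Stand-in for a real calendar or directory lookup.
    return "a 2 p.m. design review in conference room 4"

def lambda_handler(event, context=None):
    request = event["request"]
    if request["type"] == "IntentRequest" and \
            request["intent"]["name"] == "NextMeetingIntent":
        speech = "Your next meeting is " + fake_next_meeting(
            event["session"]["user"]["userId"])
    else:
        speech = "Sorry, I can't help with that yet."
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }

event = {"request": {"type": "IntentRequest",
                     "intent": {"name": "NextMeetingIntent"}},
         "session": {"user": {"userId": "u123"}}}
print(lambda_handler(event)["response"]["outputSpeech"]["text"])
```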
Taking use cases beyond productivity tools
While enterprise virtual assistants could be deployed in any type of company looking to boost productivity, Alexa for Business has already seen deployments in industries such as hospitality.
Wynn Las Vegas is equipping its rooms with Amazon Echo devices, which are managed with Alexa for Business, Goertz said. Guests of the hotel chain can use voice commands, called skills, to turn on the lights, close the blinds or order room service.
Another industry that could see adoption of virtual assistants is healthcare. Currently, Alexa for Business supports audio-only devices. But the platform could potentially support devices with a camera and display that could add video conferencing and telemedicine capabilities, Goertz said.
Alexa for Business also has the potential to disrupt the huddle room market by turning Echo devices into stand-alone conference phones, Srinivasan said.
Amazon Echo prices range from $50 to $200, and the most recent generation of devices offers improved audio quality. The built-in virtual assistant with Alexa for Business and developer ecosystem fills a gap that exists in the conference phone market, she wrote in a blog post.
“Amazon is well-positioned to grab this opportunity much ahead of Microsoft Cortana, Google Assistant and Apple’s Siri,” she said.
Digital transformation is the key IT trend driving enterprise data center modernization. Businesses today rapidly deploy web-scale applications, file sharing services, online content repositories, sensors for internet of things implementations and big data analytics. While these digital advancements facilitate new insights, streamline processes and enable better collaboration, they also increase unstructured data at an alarming rate.
Managing unstructured data and its massive growth can quickly strain legacy file storage systems that are poorly suited for managing vast amounts of this data. Taneja Group investigated the most common of these file storage limitations in a recent survey. The study found the top challenges IT faces with traditional file storage are lack of flexibility, poor storage utilization, inability to scale to petabyte levels and failure to support distributed data. These obstacles often lead to high storage costs, complex storage management and limited flexibility in unstructured data storage.
So how are companies addressing the unstructured data management challenge? As with all things IT, it’s essential to have the right architecture. For unstructured data storage, this means a highly scalable, resilient, flexible, economical and accessible secondary storage environment.
Let’s take a closer look at modern unstructured data storage requirements and examine why distributed file systems and a scale-out object storage design, or scale-out storage, are becoming a key part of modern secondary storage management.
Scalability and resiliency
Given the huge amounts of unstructured data, scalability is undeniably the most critical aspect of modern secondary storage. This is where scale-out storage shines. It’s ideal for managing huge amounts of unstructured data because it easily scales to hundreds of petabytes simply by adding storage nodes. This inherent advantage over scale-up file storage appliances that become bottlenecked by single or dual controllers has prompted several data protection vendors to offer scale-out secondary storage platforms. Notable vendors with scale-out secondary storage offerings are Cohesity, Rubrik and — most recently — Commvault.
Attaining storage resiliency is another important requirement of modern secondary storage. Two key factors are required to achieve storage resiliency. The first is high fault tolerance. Scale-out storage is ideal in this area because it uses space-efficient erasure coding and flexible replication policies to tolerate site, multiple node and disk failures.
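A toy example makes the space-efficiency point concrete. Real systems use Reed-Solomon codes to survive multiple simultaneous failures; the single-parity XOR sketch below is the simplest possible illustration of the idea, not a production scheme:

```python
# Toy erasure-coding illustration: one XOR parity block lets a three-block
# stripe survive the loss of any single block at 33% capacity overhead,
# versus the 100%+ overhead of keeping full replicas.

def xor_blocks(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

data_blocks = [b"AAAA", b"BBBB", b"CCCC"]   # imagine one block per node
parity = xor_blocks(xor_blocks(data_blocks[0], data_blocks[1]), data_blocks[2])

# Node holding block 1 fails: rebuild it from the survivors plus parity.
rebuilt = xor_blocks(xor_blocks(data_blocks[0], data_blocks[2]), parity)
print(rebuilt == data_blocks[1])  # True
```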
Rapid data recovery is the second key factor for storage resiliency. For near-instantaneous recovery times, IT managers should look for secondary storage products that provision clones from backup snapshots to recover applications in minutes or even seconds. Secondary storage products should allow administrators to run recovered applications directly on secondary storage until data is copied back to primary storage and be able to orchestrate the recovery of multi-tier applications.
Flexibility and cost
To handle multiple, unstructured data storage use cases, modern secondary storage must also be flexible. Central to flexibility is multiprotocol support. Scale-out storage should support both file and object protocols, such as NFS for Linux, SMB or CIFS for Windows and Amazon Simple Storage Service for web-scale applications. True system flexibility also requires modularity, or composable architecture, which enables multidimensional scalability and I/O flexibility. Admins must be able to quickly vary computing, network and storage resources to accommodate IOPS-, throughput- and capacity-intensive workloads.
Good economics is another requirement for modern secondary storage. Scale-out storage reduces hardware costs by enabling software-defined storage that uses standard, off-the-shelf servers. It’s also simple to maintain. Administrators can easily upgrade or replace computing nodes without having to migrate data among systems, reducing administration time and operating costs. Scale-out secondary storage also provides the option to store data in cost-effective public cloud services, such as Amazon Web Services, Google Cloud and Microsoft Azure.
Moreover, scale-out storage reduces administration time by eliminating storage silos and the rigid, hierarchical structure used in file storage appliances. It instead places all data in a flat address space or single storage pool. Scale-out secondary storage also provides built-in metadata file search capabilities that help users quickly locate the data they need.
Some vendors, such as Cohesity, offer full-text search that facilitates compliance activities by letting companies quickly find files containing sensitive data, such as passwords and Social Security numbers. Add to this support for geographically distributed environments, and it’s easy to see why scale-out storage is essential for cost-effectively managing large-scale storage environments.
The final important ingredient of modern secondary storage environments is providing easy access to services required to manage secondary data. As the amount of unstructured data grows, IT can make things easier for storage administrators and improve organizational agility by giving application owners self-service tools that automate the full data lifecycle. This means providing a portal or marketplace and predefined service-level agreement templates that establish the proper data storage parameters. These parameters include recovery points, retention periods and workload placement based on a company’s standard data policies. Secondary storage should also integrate with database management tools, such as Oracle Recovery Manager.
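The SLA-template idea above can be sketched as a small policy catalog that a self-service portal would expose. The tier names, parameters and values below are invented for illustration and do not reflect any vendor's actual schema:

```python
# Hypothetical SLA templates: predefined protection policies an application
# owner selects from a portal, instead of filing a ticket with storage admins.

SLA_TEMPLATES = {
    "gold":   {"rpo_minutes": 15,   "retention_days": 365, "placement": "on-prem"},
    "silver": {"rpo_minutes": 60,   "retention_days": 90,  "placement": "on-prem"},
    "bronze": {"rpo_minutes": 1440, "retention_days": 30,  "placement": "public-cloud"},
}

def provision(workload, tier):
    """Attach a protection policy to a workload from a predefined template."""
    policy = SLA_TEMPLATES[tier]
    return {"workload": workload, **policy}

print(provision("oracle-erp", "gold"))
# {'workload': 'oracle-erp', 'rpo_minutes': 15, 'retention_days': 365, 'placement': 'on-prem'}
```

In practice the chosen template would drive snapshot schedules, retention enforcement and workload placement automatically, which is exactly the lifecycle automation the article describes.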
Clearly, distributed file systems and scale-out object storage architectures are a key part of modern secondary storage offerings. Secondary storage product portfolios are evolving to address the immense unstructured data storage needs of modern organizations in the digital era. So stay tuned, as I expect nearly all major data protection vendors will introduce scale-out secondary storage products over the next 12 to 18 months.
FORT LAUDERDALE, Fla. — The technology that managed service providers use to run their businesses can also help customers reinvent their operations.
That’s one takeaway from this week’s Autotask Community Live 2017 conference, the IT business management software vendor’s annual MSP meetup. While automated systems enable MSPs to deliver services efficiently and profitably, those tools can also free up time to deal with customers’ digital transformation initiatives, Autotask executives suggested.
Mark Cattini, president and CEO of Autotask, said software-driven digital transformation powers enterprises from Amazon to Uber, but he noted small and medium-sized businesses — a core target market for many channel partners — are also on notice to recast themselves to remain relevant. But MSPs may need to change to help clients transform.
“Many of you are going to have to think about being a business technologist,” Cattini told MSP attendees at the conference, which wraps up Sept. 19.
He said customers are demanding digital transformation and if MSPs don’t deliver, those customers will turn elsewhere for services. Automation, however, can pave the way for MSPs to offer the more forward-looking services. In Autotask’s case, the company provides tools such as professional services automation (PSA), remote monitoring and management (RMM), file sync and share, and file backup.
Individual products, such as file sync and share, can directly contribute to a customer’s digital transformation. But the Autotask product line, as a whole, provides “a broad umbrella” that lets service providers automate and manage tactical, manual chores, so their personnel can play a more strategic role with customers, said Pat Burns, vice president of product management and strategy at the company.
Time-consuming management tasks can hinder service providers aspiring to offer higher value-added services. Indeed, Autotask’s annual IT service provider survey, dubbed Metrics that Matter, revealed many companies waste “up to 10 billable hours each week on manual processes that can be easily automated.” The survey, released at Autotask Community Live 2017, identified entering data into multiple systems and an inability to accurately capture billable hours among the top culprits. More than 1,030 service provider respondents participated in the survey.
Putting it all together
Product unification was another key theme at the conference, as it was at the 2016 event when Autotask unveiled Autotask Endpoint Backup as the fourth component of its product suite. The backup product lends channel partners the ability to offer backup services that work with other Autotask products, such as its PSA offering.
A year later, many of the company’s service provider customers have gone beyond one-product implementations. Cattini said more than half of MSPs reported using two or more Autotask products.
Users of multiple products have additional integrations on the horizon. Burns said the company’s approach is to pursue database- and interface-level integration, noting the first phase of unification focuses on PSA. “That’s because it is the most foundational piece of the platform,” he noted.
Integration initiatives, meanwhile, extend beyond the Autotask product set. Cattini said the company’s PSA software offers a significant footprint, but added there are areas outside the scope of a PSA that the company doesn’t cover. For those, Autotask continues to invest in integrations, he said, noting a total of 160 partner-built integrations.
“It’s about product adoption,” Cattini said. “We need to make it easier for you to adopt the products.”
Project UI revamp in the works
As for individual products, Autotask customers can expect the next major releases of the company’s PSA and RMM (Autotask Endpoint Management) products in early 2018, Burns said. File sync and share (Autotask Workplace) and Autotask Endpoint Backup will be up for major releases prior to the PSA and RMM updates, he said.
In another product move outlined at Autotask Community Live 2017, Autotask PSA’s project task component will get a new user interface (UI) along the lines of the much-anticipated revised ticket UI. Burns said the latest UI effort will ship considerably sooner because Autotask’s engineers will be able to take advantage of reusable frameworks and UI controls from the earlier ticket project.
Burns said the reusable components “will save a lot of time.”
MSPs can look for that UI development in 2018.
The Salesforce Chatter app enables cross-company cooperation that helps businesses drive productivity, accelerate innovation and share knowledge. Fast.
Salesforce Chatter is a forum for insights; a means to motivate and engage employees; and an easy way to exchange files, data and ideas. With it, you can track teams and projects wherever they are – in the office, on the road, at a conference, etc.
Dozens of functions and decisions can be executed right in the app, from conversations and approvals to edits and notifications, without waiting for a desk or a meeting.
So bring your company together, then move forward with Salesforce Chatter, free to download from the Windows Store.
Also, keep up with what’s hot, new and trending in the Windows Store on Twitter and Facebook.
Microsoft News Center Staff