This has had very little use as I actually received my new workstation earlier than expected. I bought it as a stop gap, and it has performed flawlessly. I really should have moved it on by now but haven’t had a chance.
I can post some pictures if needs be when I am home in two days, but it looks as it did in the previous listing.
It comes with the power supply, and I will include a small laptop bag I purchased to store it in (on a shelf) when not in use.
Any questions, just ask.
Price £730 including delivery to mainland UK, which is what I paid if I remember correctly.
Looking around for the spec and condition this seems very fair.
Intelligent Edge devices and solutions were on full display at the Microsoft CES 2020 Showcase this week. New PCs, connected storefronts and IoT solutions demonstrated the power of Microsoft’s unique and growing relationship with its partners around the world.
Working with this vibrant partner ecosystem has long been a priority for Microsoft and it is a key differentiator that’s helped make Windows 10 and Azure successful. Taking a collective, collaborative approach enables the top minds in our industry to work together, accelerate innovation, and ensure we bring great devices and experiences to market for our customers. Together with our partners, we are driving technology forward.
During the first CES of a new decade, we’ve been doing our own 10-year challenge this week, looking back at where we were in 2010 in contrast with how dramatically things have changed since then. Every year along the way has brought exciting innovation across the intelligent edge and computing as a whole, and it’s at these milestones where we can see the dramatic shifts in the tech landscape.
As we close out another great year at CES, let’s look at where we’ve been, how far we’ve come and where we’re headed.
A decade of disruptive change
In 2010, Intel launched the first six-core desktop CPU in its Core i7 line. BlackBerry held double the market share of the iPhone and triple that of smartphones running other operating systems. The iPad and Instagram launched. Office 2010 was introduced and surpassed Microsoft’s previous records for adoption, deployment and revenue.
Ten years later, we can see how some of those innovations moved entire markets. Some of them played their role then passed the baton to other creations. And new, disruptive technology trends have emerged, affecting the entire technology industry.
Today we are in an age of services, cloud computing, AI and machine learning. Next-generation Windows PCs and entirely new product categories are being built to incorporate cloud and AI services, support new experiences like virtual and augmented reality, and address our insatiable demand for new ways to work with data.
With these emerging technologies and trends, we are creating new possibilities for the modern PC that weren’t on the radar in 2010. Yet in today’s PCs we still see parallels with the innovation from a decade earlier—in 2020 as in 2010, the whole ecosystem remains focused on delivering new customer scenarios. And as we move into a more connected, intelligent era of computing, PCs today can do what PCs simply could not do a decade ago.
Partner innovations at CES
The impact of the past decade’s innovations is showing up in many ways across a spectrum of devices. Features like Windows Hello, Windows Ink and Cortana all require specific hardware to work well, and this week, our partners have demonstrated innovations that complement our software, like fingerprint readers on the power button and cameras with security features, among many other advancements large and small.
Along with the evolution of devices have come changes in the expectations of both consumers and commercial customers. People want attractive, high-quality devices with no tradeoffs between cost and quality. At CES this week, we’ve seen how our partners continue to offer attractive, no-compromise devices packed with premium features to customers at all price points.
The Windows ecosystem is growing every year, and CES 2020 showcased some of the most innovative devices we’ve seen—from incredibly thin and light laptops, convertibles and 2-in-1s, to powerful gaming rigs, always-connected devices and more. Some of the devices at the show included:
The Acer Swift 3, which supports fast-charging, enabling up to four hours of use with a 30-minute charge.
The ASUS ExpertBook B9, which offers a frameless four-sided NanoEdge display, meaning the 14-inch panel fits into a typical 13-inch laptop chassis with a 94% screen-to-body ratio.
The latest version of the popular XPS 13 from Dell, which has slimmed its InfinityEdge borders and offers a 6.8% larger 16:10 display that stretches to all four edges and is 25% brighter than before.
The HP Elite Dragonfly, the world’s first laptop with built-in Tile technology, which lets users easily find a lost notebook. It is made with ocean-bound plastic, and more than 82% of its mechanical parts are made from recycled materials.
The Lenovo Yoga Slim 7 with the exclusive Lenovo Q-Control Intelligent Cooling feature that uses AI to optimize battery life by an average of up to 20%.
The Samsung Galaxy Book Flex α, which offers an immersive QLED display capable of producing over 1 billion colors, an ultra-thin bezel, and a long-lasting battery rated for up to 17.5 hours.
The power of these products shows how Microsoft’s vibrant ecosystem continues to enable the top minds in our industry to work together, accelerate innovation and bring great experiences to market.
In the next 10 years, we expect this pace of innovation will only accelerate. The 5G networks coming online will process extremely high volumes of data at much faster speeds, driving new streaming experiences and even more services across a range of devices. Graphics will get even better as HDR displays continue to improve and 8K becomes ubiquitous. New types of devices are sure to appear, with dual screen and foldable PCs already on the horizon for 2020.
Customer expectations and demand for the most modern devices will keep rising, and the ways we use PCs will keep evolving as we expand our cloud computing capabilities and our understanding of AI and machine learning.
As we work through this new decade, Microsoft will continue to innovate with our partners to meet these demands, so everyone can achieve more. From 5G to the power of the intelligent edge, I’m excited to see what our partners will bring to market in the new “roaring twenties.”
Having a tech clearout, and this Lenovo is up for sale as it’s not being used.
This Lenovo X240 comes with two internal batteries and a working 4G LTE module for internet access where there is no Wi-Fi (options not normally found on these laptops).
It’s in good overall condition for its age (one of the better ones actually) but does show signs of wear on the plastics, as you would expect.
The trackpad and keyboard are in great condition as you can see.
It has the following spec:
12.5″ 1366×768 matte screen (not IPS; in excellent condition apart from two small pressure marks on the right-hand side – see pictures)
i5-4300U @ 1.9GHz
8GB RAM
180GB Intel SSD (SSD0E38417)
Internal battery 1: genuine Lenovo P/N 45N1108, 11.1V 1.93Ah
Internal battery 2: genuine Lenovo P/N 45N1124, 11.1V 2.09Ah 24Wh
Both batteries give a combined life of anywhere between 4 and 6 hours, maybe more depending on use
Fingerprint reader
4G LTE module
Wireless & Bluetooth
Windows 10 Pro, official licence (upgraded from the Windows 8 Pro it came with)
Comes with official Lenovo charger and has been a great workhorse over the years. Still one of the best laptop keyboards around too.
Please note… apart from the usual cosmetic marks and bumps you would expect from a laptop this old, there are two small pressure marks on the screen, bottom right. As you can see from the images, they are not visible on a black screen, but are visible on white screens. I have shown the two extremes here, so screen colours between those two will show differing variations of the impact of the pressure marks. The white screenshot shows the worst case scenario, so if you are OK with that then there will be no surprises as that is as bad as it gets.
Also, any lines you see in the images of the screen are due to the usual camera/screen refresh-rate (Hz) interaction. The screen is mint apart from the two small pressure marks, as I have tried to show.
Looking for £120 including insured delivery (was £135 then £125)
The California Consumer Privacy Act went into effect Jan. 1 but will not be enforced until July 1. Those in the customer experience realms of sales, marketing, e-commerce and customer service who’ve already created GDPR compliance plans are a good chunk of the way to CCPA compliance, experts say.
CX teams who work for companies outside California may view CCPA compliance as a lower priority than GDPR, because it only represents one U.S. state. That is the case for a majority of clients of Blue Fountain Media, a New York-based digital agency specializing in marketing, e-commerce and overall customer experience, said general manager Brian Byer.
“This particular law isn’t going to be what drives the behavior across the entire United States,” Byer said. “Being a New Yorker, California is looked upon as being a little quirky, and once this becomes a federal mandate you will see a massive consumer effect. As of today, until somebody gets a massive fine, it’s going to be something consumers aren’t as cognizant of as, say, HIPAA compliance if they’re going to the doctor every week.”
Nationally, consumer data protection proposals are under consideration in Washington and Oregon as well, prompting some companies, such as Microsoft, to make CCPA compliance their national standard as users scrutinize cloud companies’ data-privacy practices and a patchwork of state laws potentially leads to a national umbrella regulation.
Differences, similarities to GDPR
For CX teams, protecting customer privacy under CCPA is similar to the European GDPR law, which took effect in 2018, in that a core principle involves consumers’ “right to be forgotten,” or requiring a company to delete their personal data.
The differences between the two laws are born of the different mindsets of the European and California legal systems, said IDC legal analyst Ryan O’Leary. CCPA makes an exception for customer loyalty programs, which are not covered under the law, while GDPR does not. CCPA also puts more responsibility on consumers to opt out of the use of their data for commercial purposes, rather than on the company that holds the data.
Another difference with CCPA is that it gives consumers separate control over the sale of their data, the extent of which will remain somewhat “up in the air” until regulators decide what will and won’t be enforced, O’Leary added. California consumers can, in effect, tell a company to hold on to their data but not to sell it.
“Businesses have to provide a clearly visible and worded opt-out link on their websites [for data sales],” O’Leary said, adding that cloud software platforms add more legal questions about who is responsible for data-selling violations — which can add up quickly, with fines of $7,500 per violation — for selling a consumer’s data after consumers have opted out. “If you’re not selling the data, but third parties you’re working with are leveraging your consumer data and going ahead and selling it, you could be held liable.”
That said, O’Leary added that he sees companies trying to limit the number of opt-outs, and therefore the compliance load, by making them harder to submit. Tactics can include benign “are you sure?” boxes, more onerous web forms, or even requiring consumers to call a contact center to opt out over the phone. It’s all legal, fitting within CCPA’s mandate that companies offer consumers two modes of contact for opting out of personal data retention.
What companies CCPA covers
Despite fears of CCPA fines for mishandling consumer information that could intimidate digital marketing and call center teams, not every company is affected by the regulation. First, a company has to do business with Californians. Second, the law covers only companies that meet at least one of three thresholds: $25 million or more in gross annual revenue, personal information on at least 50,000 consumers, or at least 50% of annual revenue derived from selling consumers’ personal information.
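The three coverage thresholds can be written as a simple predicate. This is only a sketch of the applicability test as summarized above, not legal advice, and the function name and parameters are illustrative:

```python
def ccpa_applies(gross_revenue_usd: float,
                 consumers_with_data: int,
                 share_of_revenue_from_selling_data: float) -> bool:
    """Rough CCPA applicability check per the thresholds described above.

    Meeting any one threshold is enough. Note the separate prerequisite,
    not modeled here, that the company does business with Californians.
    """
    return (
        gross_revenue_usd >= 25_000_000
        or consumers_with_data >= 50_000
        or share_of_revenue_from_selling_data >= 0.5
    )
```

A company with $1M in revenue but records on 60,000 consumers would still be covered, since any single threshold suffices.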
Some nonprofits may be excluded, according to Jackson Lewis attorneys Joseph Lazzarotti and Jason Gavejian in their analysis of the law, which also includes which data points that the law considers personal information, such as biometric data, education records and even “audio, electronic, visual, thermal, olfactory or similar information.”
For CX teams using cloud technology platforms, complying with CCPA and other potential consumer data-protection laws coming down the pike involves unifying consumer data and breaking down data silos, something they’ve been working on already for business purposes, said IDC’s O’Leary.
“The first step in complying with these types of laws is to clean up your house and information governance practices,” O’Leary said. “We really need to stop thinking and working in silos. We need to start data mapping. There’s plenty of tools and consultants out there to help. … It will cost, but [consumer trust] is worth any cost to get a handle on your data, where it is and who has it.”
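The data-mapping advice can be made concrete with a toy example. The data map and system names below are entirely hypothetical; the point is that once such a map exists, a “right to be forgotten” request reduces to a walk over it:

```python
# Hypothetical data map: which internal system holds which consumer fields.
# In practice this would be produced by a data-mapping exercise, not hard-coded.
DATA_MAP = {
    "crm": ["name", "email", "phone"],
    "ecommerce": ["order_history", "payment_token"],
    "support": ["tickets", "call_recordings"],
}

def deletion_plan(consumer_id, data_map=DATA_MAP):
    """Turn a deletion ("right to be forgotten") request into per-system work items."""
    return [
        {"system": system, "consumer_id": consumer_id, "fields": fields}
        for system, fields in data_map.items()
    ]
```

Each work item would then be dispatched to the owning system; without the map, the request cannot even be scoped.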
If you want to set up a resilient disaster recovery (DR) solution for Windows Server and Hyper-V, you’ll need to understand how to configure a multi-site cluster, as this also provides you with local high availability. In the first post in this series, you learned about the best practices for planning the location, node count, quorum configuration and hardware setup. The next critical decision is how to maintain identical copies of your data at both sites, so that the same information is available to your applications, VMs, and users.
In this second post, we will review the different types of replication options and explain what to ask your storage vendor if you are considering a third-party storage replication solution.
Multi-Site Cluster Storage Planning
All Windows Server Failover Clusters require some type of shared storage so that an application can run on any host and access the same data. Multi-site clusters behave the same way, but they require independent storage arrays at each site, with the data replicated between them. The clustered application or virtual machine (VM) at each site should use its own local storage array; otherwise, every disk I/O operation would incur significant latency by traveling to the other location.
If you are running Hyper-V VMs on your multi-site cluster, you may wish to use Cluster Shared Volumes (CSV) disks. This type of clustered storage configuration is optimized for Hyper-V and allows multiple virtual hard disks (VHDs) to reside on the same disk while the VMs run on different nodes. The challenge when using CSV in a multi-site cluster is ensuring that each VM always writes to the disk at its own site, and not to the replicated copy. Most storage providers offer CSV-aware solutions, but you must make sure that they explicitly support multi-site clustering scenarios. Often the vendors will force writes to the primary site by making the CSV disk at the second site read-only, to ensure that the correct disks are always being used.
Understanding Synchronous and Asynchronous Replication
As you progress in planning your multi-site cluster you will have to select how your data is copied between sites, either synchronously or asynchronously. With asynchronous replication, the application will write to the clustered disk at the primary site, then at regular intervals, the changes will be copied to the disk at the secondary site. This usually happens every few minutes or hours, but if a site fails between replication cycles, then any data from the primary site which has not yet been copied to the secondary site will be lost. This is the recommended configuration for applications that can sustain some amount of data loss, and this generally does not impose any restrictions on the distance between sites. The following image shows the asynchronous replication cycle.
Asynchronous Replication in a Multi-Site Cluster
With synchronous replication, whenever a disk write occurs on the primary site, it is also copied to the secondary site, and an acknowledgment is returned from both the primary and secondary storage arrays before the write is committed. Synchronous replication ensures consistency between both sites and avoids data loss if there is a crash between replication cycles. The challenge of writing to two sets of disks in different locations is that the sites must be physically close, or the distance can affect the performance of the application. Even with a high-bandwidth, low-latency connection, synchronous replication is usually recommended only for critical applications that cannot sustain any data loss, and this should be weighed against the location of your secondary site. The following image shows the synchronous replication cycle.
Synchronous Replication in a Multi-Site Cluster
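The difference between the two modes can be sketched in a few lines. This is a toy model, not any vendor's implementation; the two dictionaries stand in for the storage arrays at each site:

```python
import queue

class ReplicatedStore:
    """Toy contrast of synchronous vs. asynchronous replication."""

    def __init__(self, mode="sync"):
        self.mode = mode
        self.primary = {}    # storage array at the primary site
        self.secondary = {}  # storage array at the secondary site
        self._pending = queue.Queue()  # replication backlog (async mode only)

    def write(self, key, value):
        self.primary[key] = value
        if self.mode == "sync":
            # Synchronous: the write is not acknowledged until the
            # secondary site has also committed it.
            self.secondary[key] = value
        else:
            # Asynchronous: acknowledge immediately, replicate later.
            self._pending.put((key, value))

    def replicate(self):
        # In async mode this runs on an interval; anything still queued
        # when the primary site fails is lost.
        while not self._pending.empty():
            key, value = self._pending.get()
            self.secondary[key] = value
```

In async mode, any write accepted between `replicate()` cycles exists only at the primary site, which is exactly the window of data loss described above.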
As you continue to evaluate different storage vendors, you may also want to assess the granularity of their replication solution. Most of the traditional storage vendors will replicate data at the block-level, which means that they track specific segments of data on the disk which have changed since the last replication. This is usually fast and works well with larger files (like virtual hard disks or databases), as only blocks that have changed need to be copied to the secondary site. Some examples of integrated block-level solutions include HP’s Cluster Extension, Dell/EMC’s Cluster Enabler (SRDF/CE for DMX, RecoverPoint for CLARiiON), Hitachi’s Storage Cluster (HSC), NetApp’s MetroCluster, and IBM’s Storage System.
There are also some storage vendors which provide a file-based replication solution that can run on top of commodity storage hardware. These providers will keep track of individual files which have changed, and only copy those. They are often less efficient than the block-level replication solutions as larger chunks of data (full files) must be copied, however, the total cost of ownership can be much less. A few of the top file-level vendors who support multi-site clusters include Symantec’s Storage Foundation High Availability, Sanbolic’s Melio, SIOS’s Datakeeper Cluster Edition, and Vision Solutions’ Double-Take Availability.
The final class of replication providers will abstract the underlying sets of storage arrays at each site. This software manages disk access and redirection to the correct location. The more popular solutions include EMC’s VPLEX, FalconStor’s Continuous Data Protector and DataCore’s SANsymphony. Almost all of the block-level, file-level, and appliance-level providers are compatible with CSV disks, but it is best to check that they support the latest version of Windows Server if you are planning a fresh deployment.
By now you should have a good understanding of how you plan to configure your multi-site cluster and your replication requirements. Now you can plan your backup and recovery process. Even though the application’s data is being copied to the secondary site, which is similar to a backup, it does not replace the real thing. This is because if the VM (VHD) on one site becomes corrupted, that same error is likely going to be copied to the secondary site. You should still regularly back up any production workloads running at either site. This means that you need to deploy your cluster-aware backup software and agents in both locations and ensure that they are regularly taking backups. The backups should also be stored independently at both sites so that they can be recovered from either location if one datacenter becomes unavailable. Testing recovery from both sites is strongly recommended. Altaro’s Hyper-V Backup is a great solution for multi-site clusters and is CSV-aware, ensuring that your disaster recovery solution is resilient to all types of disasters.
If you are looking for a more affordable multi-site cluster replication solution, only have a single datacenter, or your storage provider does not support these scenarios, Microsoft offers a few solutions. This includes Hyper-V Replica and Azure Site Recovery, and we’ll explore these disaster recovery options and how they integrate with Windows Server Failover Clustering in the third part of this blog series.
Let us know if you have any questions in the comments form below!
Amazon Web Services has a stranglehold on the public cloud market, but the company’s dominance in cloud security is facing new challenges.
The world’s largest cloud provider earned a reputation over the last 10 years as an influential leader in IaaS security, thanks to products such as AWS Identity & Access Management and Key Management Service, introduced in the earlier part of the decade, and more recent developments in event-driven security. AWS security features helped the cloud service provider establish its powerful market position; according to Gartner, AWS in 2018 earned an estimated $15.5 billion in revenue, nearly 48% of the worldwide public IaaS market.
But at the re:Invent 2019 conference last month, many of the new security tools and features announced were designed to fix existing issues, such as misconfigurations and data exposures, rather than push AWS security to new heights. “There wasn’t much at re:Invent that I’d call security,” said Colin Percival, founder of open source backup service Tarsnap and an AWS Community Hero, via email. “Most of what people are talking about as security improvements address what I’d call misconfiguration risk.”
Meanwhile, Microsoft has not only increased its cloud market share but also invested heavily in new Azure security features that some believe rival AWS’ offerings. Rich Mogull, president and analyst at Securosis, said there are two sides to AWS security — the inherent security of the platform’s architecture, and the additional tools and products AWS provides to customers.
“In terms of the inherent security of the platform, I still think Amazon is very far ahead,” he said, citing AWS’ strengths such as availability zones, segregation, and granular identity and access management. “Microsoft has done a lot with Azure, but Amazon still has a multi-year lead. But when it comes to security products, it’s more of a mixed bag.”
Microsoft has been able to close the gap in recent years with the introduction of its own set of products and tools that compete with AWS security offerings, he said. “Azure Security Center and AWS Security Hub are pretty comparable, and both have strengths and weaknesses,” Mogull said. “Azure Sentinel is quite interesting and seems more complete than AWS Detective.”
New tools, old problems
Arguably the biggest AWS security development at re:Invent was a new tool designed to fix a persistent problem for the cloud provider: accidental S3 bucket exposures. The IAM Access Analyzer, which is part of AWS’ Identity and Access Management (IAM) console, alerts users when an S3 bucket is possibly misconfigured to allow public access via the internet and lets them block such access with one click.
AWS had previously made smaller moves, including changes to S3 security settings and interfaces, to curb the spate of high-profile and embarrassing S3 exposures in recent years. IAM Access Analyzer is arguably the strongest move yet to resolve the ongoing problem.
“They created the S3 exposure issue, but they also fixed it,” said Jerry Gamblin, principal security engineer at vulnerability management vendor Kenna Security, which is an AWS customer. “I think they’ve really stepped up in that regard.”
Still, some AWS experts feel the tool doesn’t fully resolve the problem. “Tools like IAM Access Analyzer will definitely help some people,” Percival said, “but there’s a big difference between warning people that they screwed up and allowing people to make systems more secure than they could previously.”
Scott Piper, an AWS security consultant and founder of Summit Route in Salt Lake City, said “It’s yet another tool in the toolbelt and it’s free, but it’s not enabled by default.”
There are other issues with IAM Access Analyzer. “With this additional information, you have to get that to the customer in some way,” Piper said. “And doing that can be awkward and difficult with this service and others in AWS like GuardDuty, because it doesn’t make cross-region communication very easy.”
For example, EC2 regions are isolated to ensure the highest possible fault tolerance and stability for customers. But Piper said the isolation presents challenges for customers using multiple regions because it’s difficult to aggregate GuardDuty alerts to a single source, which requires security teams to analyze “multiple panes of glass instead of one.”
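Aggregating those per-region panes into one view is mostly plumbing. The sketch below assumes the per-region findings have already been fetched (in practice via the GuardDuty ListDetectors/ListFindings/GetFindings APIs, e.g. with boto3) and shows only the pure merge step:

```python
def merge_findings(findings_by_region):
    """Flatten per-region GuardDuty finding lists into one severity-sorted view.

    findings_by_region maps a region name to a list of finding dicts as
    returned by GetFindings; a "Region" key is added so analysts can tell
    the panes apart after merging.
    """
    merged = []
    for region, findings in findings_by_region.items():
        for finding in findings:
            merged.append({**finding, "Region": region})
    # Highest-severity findings first, mirroring the console's default sort.
    return sorted(merged, key=lambda f: f.get("Severity", 0), reverse=True)
```

The fetch side would loop over regions with a per-region GuardDuty client, since detectors and findings never cross region boundaries on their own.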
AWS recently addressed another security issue that became a high-profile concern for enterprises following the Capital One breach last summer. The attacker in that breach exploited an SSRF vulnerability to access the AWS metadata service for the company’s EC2 instances, which allowed them to obtain the credentials held by the service.
The Capital One breach led to criticism from security experts as well as lawmakers such as Sen. Ron Wyden (D-Ore.), who questioned why AWS hadn’t addressed SSRF vulnerabilities for its metadata service. The lack of security around the metadata service has concerned some AWS experts for years; in 2016, Percival penned a blog post titled “EC2’s most dangerous feature.”
“I think the biggest problem Amazon has had in recent years — judging by the customers affected — is the lack of security around their instance metadata service,” Percival told SearchSecurity.
In November, AWS made several updates to the metadata service to prevent unauthorized access, including the option to turn off access to the service altogether. Mogull said the metadata service update was crucial because it improved security around AWS account credentials.
But like other AWS security features, the metadata service changes are not enabled by default. Percival said enabling the update by default would’ve caused issues for enterprise applications and services that rely on the existing version of the service. “Amazon was absolutely right in making their changes opt-in since if they had done otherwise, they would have broken all of the existing code that uses the service,” he said. “I imagine that once more or less everyone’s code has been updated, they’ll switch this from opt-in to opt-out — but it will take years before we get to that point.”
Percival also said the update is “incomplete” because it addresses common misconfigurations but not software bugs. (Percival is working on an open source tool that he says will provide “a far more comprehensive fix to this problem,” which he hopes to release later this month.)
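For reference, the updated metadata service (known as IMDSv2) works by requiring a session token, fetched with a PUT request, before any metadata read. The sketch below only builds the two requests without sending them; the paths and header names follow AWS's published scheme, and the credential path shown is just an example:

```python
import urllib.request

IMDS_BASE = "http://169.254.169.254"

def build_token_request(ttl_seconds=21600):
    """IMDSv2 step 1: request a session token via PUT with a TTL header."""
    return urllib.request.Request(
        f"{IMDS_BASE}/latest/api/token",
        method="PUT",
        headers={"X-aws-ec2-metadata-token-ttl-seconds": str(ttl_seconds)},
    )

def build_metadata_request(token, path="meta-data/iam/security-credentials/"):
    """IMDSv2 step 2: read metadata, presenting the session token.

    Requiring the token defeats simple SSRF attacks, which typically cannot
    issue a PUT or set custom request headers.
    """
    return urllib.request.Request(
        f"{IMDS_BASE}/latest/{path}",
        headers={"X-aws-ec2-metadata-token": token},
    )
```

Opting an instance in (and turning off the old token-less IMDSv1 path) is done through the instance's metadata options, not through these requests themselves.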
Still, Piper said the metadata service update is an important step for AWS security because it showed the cloud provider was willing to acknowledge there was a problem with the existing service. That willingness and responsiveness hasn’t always been there in the past, he said.
“AWS has historically had the philosophy of providing tools to customers, and it’s kind of up to customers to use them and if they shoot themselves in the foot, then it’s the customers’ fault,” Piper said. “I think AWS is starting to improve and change that philosophy to help customers more.”
AWS security’s road ahead
While the metadata service update and IAM Access Analyzer addressed lingering security issues, experts highlighted other new developments that could strengthen AWS’ position in cloud security.
AWS Nitro Enclaves, for example, is a new EC2 capability introduced at re:Invent 2019 that allows customers to create isolated instances for sensitive data. The Nitro Enclaves, which will be available in preview this year, are virtual machines attached to EC2 instances but have CPU and memory isolation from the instances and can be accessed only through secure local connections.
“Nitro Enclaves will have a big impact for customers because of its isolation and compartmentalization capabilities” which will give enterprises’ sensitive data an additional layer of protection against potential breaches, Mogull said.
Percival agreed that Nitro Enclaves could possibly “raise the ceiling” for AWS security, though he was cautious about their prospects. “Enclaves are famously difficult for people to use correctly, so it’s hard to predict whether they will make a big difference or end up being another of the many ‘Amazon also has this feature, which nobody ever uses’ footnotes.”
Experts also said AWS’ move to strengthen its ARM-based processor business could have major security implications. The cloud provider announced at re:Invent 2019 that it will be launching EC2 instances that run on its new, customized ARM chips, dubbed Graviton2.
Gamblin said the Graviton2 processors are a security play in part because of recent microprocessor vulnerabilities and side channel attacks like Meltdown and Spectre. While some ARM chips were affected by both Meltdown and Spectre, subsequent side channel attacks and Spectre variants have largely affected x86 processors.
“Amazon doesn’t want to rely on other chips that may be vulnerable to side channel attacks and may have to be taken offline and rebooted or suffer performance issues because of mitigations,” Gamblin said.
Percival said he was excited by the possibility of the cloud provider participating in ARM’s work on the “Digital Security by Design” initiative, a private-sector partnership with the UK that is focused in part on fundamentally restructuring — and improving — processor security. The results of that project will be years down the road, Percival said, but it would show a commitment from AWS to once again raising the bar for security.
“If it works out — and it’s a decade-long project, which is inherently experimental in nature — it could be the biggest step forward for computer security in a generation.”
Companies at the forefront of digital transformation recognize how critical it is to enable all of their people with the right technology and tools. That’s why, in industries like retail, hospitality, and manufacturing, there’s a movement underway to digitally empower the Firstline Workforce—the more than 2 billion people worldwide who work in service- or task-oriented roles.
With Microsoft 365, the world’s productivity cloud, we’re in a unique position to help companies of all sizes and across all industries provide their employees the tools and expertise they need to do their best work, without sacrificing the security of their organization or customers’ data. Giving Firstline Workers the tools they need requires companies to address unique user experience, security and compliance, and IT management needs.
Microsoft 365 for Firstline Workers
Microsoft 365 combines intuitive best-in-class productivity apps with intelligent cloud services to empower your Firstline Workforce.
It’s inspiring to see how industry leaders, like IKEA and Mattress Firm, are driving higher levels of employee engagement and enhancing the customer experience by putting tools like Microsoft Teams into the hands of their Firstline Workforce. IKEA is connecting everyone in the organization with familiar features like chat and video calls and digitizing firstline processes such as shift management to save time and cost.
This video was created by Microsoft, with the agreement of Ingka Group.
Mattress Firm is empowering Firstline Workers with real-time access to the information, resources, and expertise they need to delight customers and provide a better shopping experience.
Ahead of next week’s National Retail Federation (NRF) trade show, we are excited to introduce new capabilities for Firstline Workers coming to Microsoft 365. Here’s a look at what’s coming soon:
New tools that make it easier for Firstline Workers to communicate and manage tasks
Walkie Talkie in Teams—This new push-to-talk experience enables clear, instant, and secure voice communication over the cloud, turning employee- or company-owned smartphones and tablets into a walkie-talkie. This functionality, built natively into Teams, reduces the number of devices employees must carry, and lowers costs for IT. Unlike analog devices on unsecured networks, Walkie Talkie removes the risk of crosstalk or eavesdropping by outsiders. And since Walkie Talkie functions over Wi-Fi or cellular data, this capability can be used across geographic locations. Walkie Talkie will be available in private preview in Teams in the first half of this year.
Intuitive push-to-talk experience to connect team members across departments and locations.
Tasks targeting, publishing, and reporting—With Tasks in Teams, customers can now drive consistent execution of store operations at scale across all of an organization’s locations. Corporate and regional leadership can send task lists targeted to the relevant locations, such as specific retail stores, and track their progress through automatic real-time reports. Managers have tools to easily direct activities within their stores, and Firstline Workers have a simple prioritized list available via their personal or company-issued device showing them exactly what to do next. Tasks targeting, publishing, and reporting is coming to Teams in the first half of this year.
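The targeting-and-reporting flow described above can be modeled in a few lines. This is an illustrative sketch only, not the Teams implementation; every class and field name here is invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class TaskList:
    """A task list published from headquarters to targeted locations."""
    name: str
    tasks: list[str]
    targets: set[str]  # location IDs the list is published to
    done: dict[str, set[str]] = field(default_factory=dict)  # location -> completed tasks

    def complete(self, location: str, task: str) -> None:
        """Record a completion; untargeted locations and unknown tasks are ignored."""
        if location in self.targets and task in self.tasks:
            self.done.setdefault(location, set()).add(task)

    def progress(self) -> dict[str, float]:
        """Completion ratio per targeted location, for the HQ report."""
        return {loc: len(self.done.get(loc, set())) / len(self.tasks)
                for loc in sorted(self.targets)}

# HQ publishes a list to two stores; store-042 finishes one of two tasks.
promo = TaskList("Spring promo", ["hang signage", "reset endcap"],
                 {"store-042", "store-107"})
promo.complete("store-042", "hang signage")
print(promo.progress())  # {'store-042': 0.5, 'store-107': 0.0}
```

The same shape scales to the real scenario: publish once, fan out to targeted locations, and aggregate per-location completion into the report leadership sees.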
Corporate headquarters can target, assign, and track tasks across locations. Firstline Workers can view tasks assigned to them and across the store.
Workforce management integrations—Customers using leading third-party workforce management systems—such as Kronos and JDA—for scheduling and time and attendance can now start integrating directly with Shifts via the Shifts Graph APIs and SDK. Supported scenarios include management of shifts, schedules, schedule groups, swap requests, time off requests, and open shift requests. The JDA connector for Shifts is open source and available on GitHub. The Kronos connector for Shifts will also be available on GitHub later this quarter.
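A connector like the ones described above ultimately writes shift data through Microsoft Graph. The sketch below only builds the JSON body for creating a shift; the field names follow the Graph shift resource, but treat the exact shape, endpoint, and permissions as assumptions to verify against the current Graph reference.

```python
import json

def build_shift_payload(user_id: str, start_iso: str, end_iso: str,
                        display_name: str = "Morning shift") -> dict:
    """Build a request body in the shape of POST /teams/{team-id}/schedule/shifts.

    Field names mirror the Microsoft Graph shift resource; confirm against
    the current Graph API reference before relying on them.
    """
    return {
        "userId": user_id,  # Azure AD object ID of the assigned worker
        "sharedShift": {
            "displayName": display_name,
            "startDateTime": start_iso,  # ISO 8601, UTC
            "endDateTime": end_iso,
            "theme": "blue",
        },
    }

# A connector mapping a Kronos/JDA schedule entry would emit one such body
# per shift, then POST it with an authorized Graph client.
payload = build_shift_payload("aad-user-guid", "2020-01-13T08:00:00Z",
                              "2020-01-13T16:00:00Z")
print(json.dumps(payload, indent=2))
```

The actual POST (authentication, team ID, error handling) is omitted here; the point is that each third-party schedule record maps to one well-formed shift resource.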
Enhanced identity and access management features that make it easier for IT pros to keep Firstline Workers productive and secure
SMS sign-in—With SMS sign-in, Firstline Workers are able to sign in to their Azure Active Directory (Azure AD) account using one-time SMS codes—reducing the need to remember usernames and passwords for all their Microsoft 365 and custom applications. Once enrolled, the user enters their phone number at sign-in and receives an SMS text containing a one-time password. SMS sign-in is a single sign-on (SSO) experience, enabling Firstline Workers to seamlessly access all the apps they are authorized to use. This new sign-in method can be enabled for select groups and configured at the user level in the My Staff portal—helping to reduce the burden on IT.
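The one-time-code mechanics behind SMS sign-in resemble a standard OTP flow. The sketch below is a generic illustration using Python's secrets module, not Azure AD's actual implementation; the code length and the five-minute validity window are assumptions.

```python
import secrets
import time

CODE_TTL_SECONDS = 300  # assumed 5-minute validity window

# phone number -> (code, issued-at timestamp); an in-memory stand-in for a real store
_pending: dict[str, tuple[str, float]] = {}

def send_code(phone: str) -> str:
    """Generate a 6-digit one-time code and record when it was issued.

    A real system would deliver the code via SMS rather than return it.
    """
    code = f"{secrets.randbelow(10**6):06d}"
    _pending[phone] = (code, time.time())
    return code

def verify_code(phone: str, code: str) -> bool:
    """Accept the code only once, only if it matches and hasn't expired."""
    entry = _pending.get(phone)
    if entry is None:
        return False
    expected, issued = entry
    if time.time() - issued > CODE_TTL_SECONDS:
        del _pending[phone]
        return False
    ok = secrets.compare_digest(expected, code)
    if ok:
        del _pending[phone]  # single use: a replayed code is rejected
    return ok
```

Note the use of secrets.compare_digest for constant-time comparison and the single-use deletion; both are standard hardening steps for any OTP verifier.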
One-time SMS codes on mobile devices to streamline the sign-in experience for Firstline Workers.
Shared device sign-out—Many Firstline Workers use a single tablet or mobile device that is shared between shifts. This can pose unique security challenges to the organization when different employees who have access to different types of data use the same device over the course of a day. With shared device sign-out, Firstline Workers will be able to log out of all their Microsoft 365 and custom applications and browser sessions with one click at the end of their shift—ensuring that neither their own data nor any access to customer data carries over to the next user of the device.
With one click, Firstline Workers can sign out of a shared Android device, logging out of all applications and browser sessions to prevent sensitive data from being exposed to the next device user.
Off-shift access controls for Teams app—IT administrators can now configure Teams to limit employee access to the app on their personal device outside of working hours. This feature helps ensure employees are not involuntarily working while not on shift and helps employers to comply with labor regulations. This feature will begin rolling out to customers this quarter.
Display a message and/or disable access to Teams app when Firstline Workers are off shift.
Delegated user management—Firstline Managers can approve password resets and enable employees to use their phone numbers for SMS sign-in, all via a single customizable portal enabled by IT for Firstline Managers. Delegated user management can give Firstline Managers access to the My Staff portal, so they can unblock staff issues—reducing the burden of identity management on IT, and keeping employees connected to the apps they need on the job.
Through the My Staff portal, delegated user management enables a Firstline Manager to manage their team’s credentials and assist with password resets.
Inbound provisioning from SAP SuccessFactors to Azure AD—Azure AD’s user provisioning service now integrates with SAP SuccessFactors, making it easier than ever to onboard and manage Firstline Workers’ identities at scale, across any application using Azure AD. This feature—in public preview—builds upon the ability to provision users to Azure AD from Workday, another popular human capital management (HCM) system, already generally available. Integrating with these systems of record helps IT to scale Firstline Workers’ onboarding and productivity from day one.
With Azure AD’s user provisioning service now integrated with SAP SuccessFactors, as well as Workday, it’s easier than ever to onboard Firstline Worker identities at scale. Shown here, you can start the provisioning cycle and use the progress bar and provisioning logs to track the provisioning process.
All of these capabilities are expected to roll out in the first half of this year except where noted.
Empowering Firstline Workers to gain a competitive advantage
New research in partnership with Harvard Business Review Analytic Services highlights the untapped potential of Firstline Workers in retail.
This is just the next step in our journey to empower every person and every organization on the planet to achieve more. We aim to build tools and experiences for the modern workplace and for workers at all levels of the organization. We’ll continue to develop and bring to market purpose-built Firstline Worker capabilities and experiences in partnership with our customers and industry leaders. And we’ll continue to innovate and build features that simplify work, bring people together, and help organizations big and small achieve more. Come see us next week at NRF 2020 in booth #4501.
IT services acquisitions got off to a fast start in 2020 with at least three transactions surfacing in the first week of the new year.
The early activity suggests the brisk pace of deals among services providers in 2019 may persist a bit longer. Consider the following:
IT services acquisitions: Deal drivers
The recent IT services acquisitions emphasize the value buyers place on vertical market expertise, technology skill sets, software development and geographic reach.
Perficient’s purchase of MedTouch, based in Somerville, Mass., has boosted Perficient’s healthcare business revenue by nearly 10%, according to vice president Ed Hoffman. The acquisition adds 50 employees to the company’s roster.
Perficient’s healthcare business, the company’s largest vertical market, delivers business optimization and customer experience services to payer, provider and pharmaceutical organizations. MedTouch adds digital marketing “solutions and services focused on patient acquisition, customer experience, patient engagement and loyalty, and physician marketing,” Hoffman said.
MedTouch also contributes to Perficient’s customer experience (CX) work. Perficient’s Sitecore practice designs, builds and delivers websites based on the Sitecore Experience Platform. MedTouch brings experience with Sitecore, Acquia and other CX platforms, Hoffman noted.
Quisitive, meanwhile, has expanded its Microsoft technology resources and its geographic reach through its acquisition of Los Altos, Calif.-based Menlo Technologies. Menlo provides capabilities in Azure, Microsoft Office 365 and Microsoft Dynamics, along with software development services.
The Menlo Technologies deal furthers Quisitive’s goal of becoming a Microsoft solutions provider spanning North America. Quisitive launched that strategy in 2018 and has since expanded to eight North American offices from its original base of operations in Dallas and Denver.
Quisitive CEO Mike Reinhart said the Menlo Technologies acquisition and its 2019 purchase of Corporate Renaissance Group “give us the fuel necessary to drive organic growth” as it builds its Microsoft business.
“These two acquisitions were critical in the sense that it added deep expertise in Microsoft Dynamics 365 capabilities, first-party SaaS IP, offshore development capability and full U.S. geographic coverage,” Reinhart said.
Ahead’s acquisition of Platform Consulting Group, based in Denver, marks the company’s fourth transaction. In October 2019, Ahead merged with Data Blue and acquired Sovereign Systems. The company also purchased Link Solutions Group last year.
Platform Consulting Group focuses on cloud-native application development and modern development frameworks, targeting enterprise clients. Customers’ efforts to rationalize and refactor existing applications, or build new ones using cloud-native patterns, influenced Ahead’s purchase of Platform.
“This was part of the core thesis for our acquisition,” said Eric Kaplan, CTO at Ahead. “In addition, the lines are blurring between infrastructure and applications and our clients are looking for help from partners who can provide an end-to-end solution.”
More deals on tap?
Industry executives suggested the current run of IT services acquisitions looks set to continue in 2020.
“We expect continued consolidation in the space in 2020 and remain in active dialogue with many firms,” Perficient’s Hoffman said.
He said Perficient is “highly selective when it comes to M&A,” emphasizing industry expertise, partner ecosystems, corporate cultures and values, and “exceptionally talented workforces.”
Reinhart also pointed to more consolidation. “Microsoft is exploding with growth in their cloud offerings and the partner ecosystem continues to be fragmented,” he said. Fragmentation, he added, opens an opportunity for Quisitive to look at regional and point-solution partners. Microsoft has said it has more than 64,000 cloud partners.
Quisitive, Reinhart said, “will be very surgical” in its acquisition approach, making sure targeted companies offer differentiated market strength or add to Quisitive’s geographic expansion and Microsoft specialization.
“I believe M&A activity will be strong, but highly correlated to overall markets,” Kaplan noted.
In Kaplan’s view, valuations in particular markets are inflated. “Segments of the market where skills are at a premium and where repeatable solutions exist will obviously command a premium,” he added.