
For Sale – Huge clearout – PCs, laptops, components, RAM/GPUs

It’s time for a serious clearout! I haven’t had time to go through each individual set of parts yet; if there’s interest, I’ll clean and test each one before sale.

I’ve been using work laptops and have been too busy with work to bother much with the hardware below! I’m still deciding what to keep and what to sell, and it will take me a little while to clear data off some machines. I do still need two PCs in the future, but I’m taking this opportunity as a hard reset of my lineup.

This list will be updated as and when I get time / locate items.

Laptops
Lenovo E570 i3-6006u, 4gb DDR4 (I have more RAM below if needed). 500gb HDD, DVD writer, 15.6” screen. £200
Lenovo Yoga 12 i7 / 8gb / 250gb ssd £300
IBM T41 (specs TBC)

PCs
Dell XPS 8700, i7 4770, 12gb, GTX 645. £420
Dell Studio 540s SFF PC (ideal as HTPC), C2Q Q8300, 4gb, discrete half-height GPU (can’t remember which but will confirm) £100
HP Media Centre PC M7000 (includes the removable HDD in the bay at the front!) £100
Mesh Q6600 Elite, C2Q Q6600, 4gb (I think), no GPU £80
Thermaltake build – C2Q Q6600, 7600GS, RAM TBC £80

GPUs
2x Gigabyte 7970 GHz edition – boxed – barely seen any use in the past 4 years as I’ve been using a work laptop. £80ea
Palit GTX 780 £80
Palit GTX 980ti £200

SSDs
240gb Kingston KC300 £30
500gb Crucial MX500 £50
1.92tb Sandisk Ultra 900 USB-C external drive, brand new boxed £600

HDDs
3tb Seagate £35
Assorted 500gb-4tb 3.5” drives (5+ drives in total)
Assorted 60-500gb 2.5” drives (10+ drives in total)

Optical drives
5.25” DVD writers
Laptop DVD writers (I’ve amassed about 25 of these, please post if needed)

RAM
1x8gb DDR4 SODIMM £25
1x16gb DDR4 SODIMM £55
4x4gb Corsair Dominator 2133mhz DDR3 £100
4x8gb Patriot DDR3 £125
Assorted DDR1/DDR2 (including ECC) e.g. 4x512mb DDR2 ECC £10

Cases
NZXT Phantom £30 (boxed, some yellowing with age but it’s easy to get this back to white)

Components
Corsair AX850 PSU with black cable set £75
Corsair red cable set for AX650/AX750/AX850 £50
Corsair AX1200 PSU with cables £150
Sabertooth X79 mobo (found the manual and driver disc but no box) £80
Rampage IV Extreme X79 mobo (boxed with OC-key and manuals) £200

Corsair H100i £60
Corsair H100 £50
i7-3960X £170
i7-3930K £60

Misc
Trendnet Powerline AV500 adapter x2 boxed, £20
Netgear EX6120 Wifi AC1200 extender £30
Plantronics Calisto P610M USB speakerphone (Skype, Zoom etc.) – brand new boxed, 2 available, £60 each.

Go to Original Article
Author:

Try an Azure Site Recovery setup for DR needs

In today’s IT world, workloads run both on premises and in the cloud. The common denominator for every location is the need to plan for disaster recovery. Azure Site Recovery is one option for administrators who need a way to cover every scenario.

Azure Site Recovery is a service used to protect physical and virtual Windows or Linux workloads outside of your primary data center and its traditional on-premises backup system. During the Azure Site Recovery setup process, you can choose either Azure or another data center for the replication target. In the event of a disaster, such as a power outage or hardware failure, your apps can continue to operate in the Azure cloud to minimize downtime. Azure Site Recovery also supports cloud failover of both VMware and Hyper-V virtual infrastructures.

One of the real advantages of this Azure service for a Windows shop is integration. All the functionality is built into the admin portal and requires little effort to configure beyond the agent installation, which can be done automatically. Offerings from other vendors, such as Zerto and Veeam, provide similar protection but require additional configuration through a management suite outside the Azure portal.

Azure Site Recovery pricing

One of the big issues for any platform is cost. Each protected instance costs $25 per month for the Azure Site Recovery license, with additional fees for Azure storage, storage transactions and outbound data transfer. Organizations interested in testing the service can use it free for the first 31 days.

As with most systems, there are caveats, including how replication and recovery are tied to specific Azure regions depending on the location of the cluster. Microsoft’s documentation lists the supported configurations.

Azure also includes the option to fail over to an on-premises location, which reduces the cost to $16 per instance. However, this option has bandwidth requirements that are not a factor in an Azure-to-Azure failover scenario.
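For budgeting purposes, the license portion of the bill is easy to estimate from the per-instance rates above. A minimal sketch in Python (the function name and the simplifications are ours; a real bill adds the storage, transaction and egress charges mentioned earlier):

```python
# Back-of-the-envelope licensing cost, using only the per-instance rates
# quoted in the article: $25/month per instance replicating to Azure,
# $16/month per instance replicating to an on-premises target, with the
# first 31 days free. Storage, transaction and egress fees are excluded.

AZURE_TARGET_RATE = 25    # USD per protected instance per month
ONPREM_TARGET_RATE = 16   # USD per protected instance per month

def monthly_asr_license_cost(instances: int, to_azure: bool = True) -> int:
    """License cost per month once the free 31-day trial ends."""
    rate = AZURE_TARGET_RATE if to_azure else ONPREM_TARGET_RATE
    return instances * rate

# Example: 10 protected VMs
print(monthly_asr_license_cost(10))                  # 250
print(monthly_asr_license_cost(10, to_azure=False))  # 160
```

As the comparison shows, an on-premises replication target trims the license cost, but only the full picture, including bandwidth and storage, decides which target is cheaper overall.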

Azure Site Recovery uses vaults to store workload dependencies

Most disaster recovery (DR) environments rely on crash-consistent applications, meaning the application fails over as a whole with all its dependencies. In Azure, you store the VM backups, their respective recovery points and the backup policies in a vault.

These vaults should contain all the servers that make up the services required for a successful failover. (Test before an emergency occurs to make sure everything functions as expected.) It is possible to fail over individual VMs within a replication group if needed; until recently, this was an all-or-nothing scenario.

How to create a Recovery Services vault

For this Azure Site Recovery setup tutorial, we’ll cover how to configure VMs for site-to-site replication between regions through the Azure portal (portal.azure.com).

As with most Azure tools, the Disaster Recovery menu is on the left-hand side with the other Azure services. Under this menu is the Recovery Services vault option. Create one by filling in the fields as shown in Figure 1.

Recovery Services vault
Figure 1. Create the Recovery Services vault by filling in the project details.

When you have entered all your specifications, click Create to build the vault. The next step is to choose the vault’s purpose: either backup or DR.

Next, add the VMs. To start, from the vault choices select Site Recovery.

From the on-premises option, click Replicate Application to open a wizard to add VMs. Next, click Review + Start replication to start the creation and replication process, which can take several minutes. For ease of access and experimentation purposes, I suggest pinning it to your dashboard. Opening the vault provides a health overview of the site and clicking on each item shows details about the replication status as shown in Figure 2.

Azure Site Recovery replication
Figure 2. The Azure Site Recovery setup process replicates the protected instances to the defined target, either an on-premises location or in Azure.

This completes the creation of a group with two protected VMs. Every VM added to that resource group automatically becomes a protected member of the vault. By default, recovery points are retained for a maximum of 24 hours. After the initial configuration, you can adjust the retention duration and snapshot frequency from the Site Recovery Policy – Retention policies page.
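To get a feel for how the retention window and snapshot frequency interact, here is a rough sketch; the five-minute and hourly frequencies below are assumed example values, not settings taken from the portal:

```python
# Rough sketch: how many recovery points fit inside a retention window
# for a given snapshot frequency. The 24-hour window matches the default
# described above; the frequencies used are assumed example values.

def recovery_points(retention_hours: int, frequency_minutes: int) -> int:
    """Number of snapshots kept within the retention window."""
    return (retention_hours * 60) // frequency_minutes

print(recovery_points(24, 5))   # 288 points at a 5-minute frequency
print(recovery_points(24, 60))  # 24 points with hourly snapshots
```

The trade-off is the usual one: more frequent snapshots give finer-grained recovery points at the cost of more storage and replication traffic.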

The last step is to create a recovery plan. From the vault, select Create recovery plan and then + Recovery plan, and give the plan a name, as shown in Figure 3; our example is called MyApplicationRecoveryPlan. Choose the source, either the on-premises location or Azure, and then the Azure target.

Azure Site Recovery plans
Figure 3. The menu displays all your configured recovery plans in Azure Site Recovery. From here, you can execute a test failover to verify your settings.

When complete, open the plan and verify it works properly by clicking Test to run a nondisruptive assessment that checks the replication in an isolated environment. This process can detect any problems with the services and connectivity the application needs to function in a failover setting.

This tutorial covers some of the basic functionality of Azure Site Recovery. For more granular control, there are many more options available to provide advanced functionality.


For Sale – 3x HP DL380 G7 2x E5620 2.40GHz Rack Servers

For Sale

3 servers in total. Spec of each server:

HP DL380 G7 Rack Server
2x E5620 2.40GHz 4 core CPU
24GB RAM
P410i RAID
8x 2.5” 146GB 10k SAS HDDs
4x NIC
1x iLO Remote Management (NIC)
Redundant PSUs

Fully working order including remote management.

Collection only from Bristol.

Used for a personal cloud/virtualization project and no longer needed.

Open to offers.

Have a few other bits, such as Cisco switches, and an ML G9 Tower.

Location
Bristol
Price and currency
£100 each server
Delivery cost included
Delivery is NOT included
Prefer goods collected?
I prefer the goods to be collected
Advertised elsewhere?
Advertised elsewhere
Payment method
BT, PPG or COD



Dawn of a Decade: The Top Ten Tech Policy Issues for the 2020s

By Brad Smith and Carol Ann Browne

For the past few years, we’ve shared predictions each December on what we believe will be the top ten technology policy issues for the year ahead. As this year draws to a close, we are looking out a bit further. This January we witness not just the start of a new year, but the dawn of a new decade. It gives us all an opportunity to reflect upon the past ten years and consider what the 2020s may bring.

As we concluded in our book, Tools and Weapons: The Promise and the Peril of the Digital Age, “Technology innovation is not going to slow down. The work to manage it needs to speed up.” Digital technology has gone longer with less regulation than virtually any major technology before it. This dynamic is no longer sustainable, and the tech sector will need to step up and exercise more responsibility while governments catch up by modernizing tech policies. In short, the 2020s will bring sweeping regulatory changes to the world of technology.

Tech is at a crossroads, and to consider why, it helps to start with the changes in technology itself. The 2010s saw four trends intersect, collectively transforming how we work, live and learn. Continuing advances in computational power made more ambitious technical scenarios possible both for devices and servers, while cloud computing made these advances more accessible to the world. Like the invention of the personal computer itself, cloud computing was as important economically as it was technically. The cloud allows organizations of any size to tap into massive computing and storage capacity on demand, paying for the computing they need without the outlay of capital expenses. 

More powerful computers and cloud economics combined to create the third trend, the explosion of digital data. We begin the 2020s with 25 times as much digital data on the planet as when the past decade began.

These three advances collectively made possible a fourth: artificial intelligence, or AI. The 2010s saw breakthroughs in data science and neural networks that put these three advances to work in more powerful AI scenarios. As a result, we enter a new decade with an increasing capability to rely on machines with computer vision, speech recognition, and language translation, all powered by algorithms that recognize patterns within vast quantities of digital data stored in the cloud.

The 2020s will likely see each of these trends continue, with new developments that will further transform the use of technology around the world. Quantum computing offers the potential for breathtaking breakthroughs in computational power, compared to classical or digital computers. While we won’t walk around with quantum computers in our pockets, they offer enormous promise for addressing societal challenges in fields from healthcare to environmental sustainability.

Access to cloud computing will also increase, with more data centers in more countries, sometimes designed for specific types of customers such as governments with sensitive data. The quantity of digital data will continue to explode, now potentially doubling every two years, a pace that is even faster than the 2010s. This will make technology advances in data storage a prerequisite for continuing tech usage, explaining the current focus on new techniques such as optical- and even DNA-based storage.

The next decade will also see continuing advances in connectivity. New 5G technology is not only 20 times faster than 4G. Its innovative approach to managing spectrum means that it can support over a thousand more devices per meter than 4G, all with great precision and little latency. It will make feasible a world of ambient computing, where the Internet of Things, or IoT devices, become part of the embedded fabric of our lives, much as electrical devices do today. And well before we reach the year 2030, we’ll be talking about 6G and making use of thousands of satellites in low earth orbit.

All of this will help usher in a new AI Era that likely will lead to even greater change in the 2020s than the digital advances we witnessed during the past decade. AI will continue to become more powerful, increasingly operating not just in narrow use cases as it does today but connecting insights between disciplines. In a world of deep subject matter domains across the natural and social sciences, this will help advance learning and open the door to new breakthroughs.

In many ways, the AI Era is creating a world full of opportunities. In each technological era, a single foundational technology paved the way for a host of inventions that followed. For example, the combustion engine reshaped the first half of the 20th century. It made it possible for people to invent not just cars but trucks, tractors, airplanes, tanks, and submarines. Virtually every aspect of civilian economies and national security issues changed as a result.

This new AI Era likely will define not just one decade but the next three. Just as the impact of the combustion engine took four decades to unfold, AI will likely continue to reshape our world in profound ways between now and the year 2050. It has already created a new era of tech intensity, in which technology is reshaping every company and organization and becoming embedded in the fabric of every aspect of society and our lives.

Change of this magnitude is never easy. It’s why we live in both an era of opportunity and an age of anxiety. The indirect impacts of technology are moving some people and communities forward while leaving others behind. The populism and nationalism of our time have their roots in the enormous global and societal changes that technology has unleashed. And the rising economic power of large companies – perhaps especially those that are both tech platforms and content aggregators – has brought renewed focus to antitrust laws.

This is the backdrop for the top ten technology issues of the 2020s. The changes will be immense. The issues will be huge. And the stakes could hardly be higher. As a result, the need for informed discussion has rarely been greater. We hope the assessments that follow help you make up your own mind about the future we need collectively to help shape.

1. Sustainability – Tech’s role in the race to address climate change

A stream of recent scientific research on climate change makes clear that the planet is facing a tipping point. These dire predictions will catapult sustainability into one of the dominant global policy issues for the next decade, including for the tech sector. We see this urgency reflected already in the rapidly evolving views of our customers and employees, as well as in many electorates around the world. In countries where governments are moving more slowly on climate issues, we’re likely to see businesses and other institutions fill the gap. And over the coming decade, governments that aren’t prioritizing sustainability will be compelled to catch up.

For the tech sector, the sustainability issue will cut both ways. First, it will increase pressure on companies to make the use of technology more sustainable. With data centers that power the cloud ranking among the world’s largest users of electricity, Microsoft and other companies will need to move even more quickly than in recent years to use more and better renewable energy, while increasing work to improve electrical efficiency.

But this is just the tip of the iceberg. Far bigger than technology’s electrical consumption are “Scope 3” emissions – the indirect carbon emissions across a company’s value chain, from the manufacturing of new devices to the production of concrete for new buildings. While this is true for every sector of the economy, it’s an area where the tech sector will likely lead, in part because it can. And should. With some of the world’s biggest income statements and healthiest balance sheets, look to Microsoft and other tech companies to invest and innovate, hopefully using the spirit of competition to bring out the best in each other.

This points to the other and more positive side of the tech equation for sustainability. As the world takes more aggressive steps to address the environment, digital data and technology will prove to be among the next decade’s most valuable tools. While carbon issues currently draw the most attention, climate issues have already become multifaceted. We need urgent and concerted action to address water, waste, biodiversity, and our ecosystems. Regardless of the issue or ultimate technology, insights and innovations will be fueled by data science and artificial intelligence. When quantum computing comes online, this will become even more promising.

By the middle or end of the next decade, the sustainability issue may have another impact that we haven’t yet seen and we’re not yet considering. This is on the world’s geopolitics. As the new decade begins, many governments are turning inward and nations are pulling apart. But sustainability is an issue that can’t be solved by any country alone. The world must unite to address environmental issues that know no boundaries. We all share a small planet, and the need to preserve humanity’s ability to live on it will force us to think and act differently across borders.

2. Defending Democracy – International threats and internal challenges

Early each New Year, we look forward to the release of the Economist Intelligence Unit’s annual Democracy Index. This past year’s report updated the data on the world’s 75 nations the Economist ranks as democracies. Collectively these countries account for almost half of the world’s population. Interestingly, they also account for 95 percent of Microsoft’s revenue. Perhaps more than any other company, Microsoft is the technology provider for the governments, businesses, and non-profits that support the world’s democracies. This gives us both an important vantage point on the state of democracy and a keen interest in democracy’s health.

Looking back at the past decade, the Economist’s data shows that the health of the world’s democracies peaked in the middle of the decade and has since declined slightly and stagnated. Technology-fueled change almost certainly has contributed in part to this trend.

As we enter the 2020s, defending democracy more than ever requires a focus on digital tech. The past decade saw nation-states weaponize code and launch cyber-attacks against the civilian infrastructure of our societies. This included the hacking of a U.S. presidential campaign in 2016, a tactic Microsoft’s Threat Intelligence Center has since seen repeated in numerous other countries. It was followed by the WannaCry and NotPetya attacks in 2017, which unleashed damage around the world in ways that were unimaginable when the decade began.

The defense of democracy now requires determined efforts to protect political campaigns and governments from the hacking and leaking of their emails. Even more important, it requires digital protection of voter rolls and elections themselves. And most broadly, it requires protection against disinformation campaigns that have exploited the basic characteristics of social media platforms.

Each of these priorities now involves new steps by tech companies, as well as new strategies for and collaboration with and among governments. Microsoft is one of several industry leaders putting energy and resources into this area. Our Defending Democracy Program includes an AccountGuard program that protects candidates in 26 democratic nations, an ElectionGuard program to safeguard voting, and support for the NewsGuard initiative to address disinformation. As we look to the 2020s, we will need continued innovation to address the likely evolution of digital threats themselves.

The world will also need to keep working to solidify existing norms and add new legal rules to protect against cybersecurity threats. Recent years have seen more than 100 leading tech companies come together in a Tech Accord to advance security in new ways, while more than 75 nations and more than 1,000 multi-stakeholder signatories have now pledged their support for the Paris Call for Trust and Security in Cyberspace. The 2020s hopefully will see important advances at the United Nations, support from global groups such as the World Economic Forum, and by 2030, work on a global compact to make a Digital Geneva Convention a reality.

But the digital threats to democracy are not confined to attacks from other nations. As the new decade dawns, a new issue is emerging with potentially profound and thorny implications for the world’s democracies. Increasingly, government officials in democratic nations are asking whether the algorithms that pilot social media sites are undermining the political health of their citizenries.

It’s difficult to sustain a democracy if a population fragments into different “tribes” that are exposed to entirely different narratives and sources of information. While diverse opinions are older than democracy itself, one of democracy’s characteristics has traditionally been broad exposure to a common set of facts and information. But over the past decade, behavior-based targeting and monetization on digital platforms has arguably created more information silos than democracy has experienced in the past. This creates a new question for a new decade: will tech companies and democratic governments alike need new approaches to address a new weakness in the world’s democracies?

3.  Journalism – Technology needs to give the news business a boost

While we look to improve the health of the world’s democracies, we need to also monitor the well-being of another system playing a vital role in free societies across the globe: the independent press. For centuries, journalists have served as watchdogs for democracies, safeguarding political systems by monitoring and challenging public affairs and government institutions. As the Victorian-era historian Thomas Carlyle wrote, “There were Three Estates in Parliament; but, in the Reporters’ Gallery yonder, there sat a Fourth Estate more important far than they all.”

It’s clear that a healthy democracy requires healthy journalism, but newspapers are ailing – and many are on life support. The decline of quality journalism is not breaking news. It has been in slow decline since the start of the 20th century, first with the advent of radio and later when television overtook the airwaves. By the turn of this century, the internet further eroded the news business as dotcoms like Craigslist disrupted advertising revenue, news aggregators lured away readers, and search engines and social media giants devoured both. While a number of bigger papers weathered the storm, most small local outlets were hard hit. According to data from the U.S. Bureau of Labor Statistics’ Occupational Employment Statistics, 37,900 Americans were employed in newspaper newsrooms in 2018, down 14 percent from 2015 and down 47 percent from 2004.

The world will be hard-pressed to strengthen its democracies if we can’t rejuvenate quality journalism. In the decade ahead, the business model for journalism will need to evolve and become healthier, which will hopefully include partnerships that create new revenue streams, including through search and online ads. And as the world experiments with business models, we can’t forget to learn from and build on the public broadcasters that have endured through the years, like the BBC in the United Kingdom and NPR in the United States.

Helping journalism recover will also include protecting journalists, as we’ve learned through Microsoft’s work with the Clooney Foundation for Justice. Around the world violence against journalists is on the rise, especially for those reporters covering conflict, human rights abuses, and corruption. According to the Committee to Protect Journalists, 25 journalists were killed, 250 were imprisoned, and 64 went missing in 2019. In the coming decade, look for digital technology like AI to play an important role in monitoring the safety of journalists, spotting threats, and helping ensure justice in the court of law. 

And lastly, it’s imperative that we use technology to protect the integrity of journalism. As the new decade begins, technologists warn that manipulated videos are becoming the purveyors of disinformation. These “deepfakes” do more than deceive the public, they call all journalism into question. AI is used to create this doctored media, but it will also be used to detect deepfakes and verify trusted, quality content. Look for the tech sector to partner with the news media and academia to create new tools and advocate for regulation to combat internet fakery and build trust in the authentic, quality journalism that underpins democracies around the world.

4. Privacy in an AI Era – From the second wave to the third

In the 2010s, privacy concerns exploded around the world. The decade’s two biggest privacy controversies redefined big tech’s relationships with government. In 2013, the Snowden disclosures raised the world’s ire about the U.S. Government’s access to data about people. The tech sector, Microsoft included, responded by expanding encryption protection and pushing back on our own government, including with litigation. Five years later, in 2018, the guns turned back on the tech sector after the Cambridge Analytica data scandal engulfed Facebook and digital privacy again became a top-level political issue in much of the world.

Along the way, privacy laws continued to spread around the world. The decade saw 49 new countries adopt broad privacy laws, adding to the 86 nations that protected privacy a decade ago. While the United States is not yet on that list, 2018 saw stronger privacy protections jump from Europe across the Atlantic and move all the way to the Pacific, as California’s legislature passed a new law that paves the way for action in Washington, D.C.

But it wasn’t just the geographic spread of privacy laws that marked the decade. With policy innovation centered in Brussels, the European Union effectively embarked on a second wave of privacy protection. The first wave was characterized by laws that required that web sites give consumers “notice and consent” rights before using their data. Europe’s General Data Protection Regulation, or GDPR, represented a second wave. It gives consumers “access and control” over their data, empowering them to review their data online and edit, move, or delete it under a variety of circumstances.

Both these waves empowered consumers – but also placed a burden on them to manage their data. With the volume of data mushrooming, the 2020s likely will see a third wave of privacy protection with a different emphasis. Rather than simply empowering consumers, we’re likely to see more intensive rules that regulate how businesses can use data in the first place. This will reach data brokers that are unregulated in some key markets today, as well as a focus on sensitive technologies like facial recognition and protections against the use of data to adversely impact vulnerable populations. We’re also likely to see more connections between privacy rules and laws in other fields, including competition law.

In short, fasten your seat belt. The coming decade will see more twists and turns for privacy issues.

5. Data and National Sovereignty – Economics meet geopolitics

When the combustion engine became the most important invention a century ago, the oil that fueled it became the world’s most important resource. With AI emerging as the most important technology for the next three decades, we can expect the data that fuels it to quickly become the 21st century’s most important resource. This quest to accumulate data is creating economic and geopolitical issues for the world.

As the 2020s commence, data economics are breeding a new generation of public policy issues. Part of this stems from the returns to scale that result from the use of data. While there are finite limits to the amount of gasoline that can be poured into the tank of a car, the desire for more data to develop a better AI model is infinite. AI developers know that more data will create better AI. Better AI will lead to even more usage for an AI system. And this in turn will create yet more data that will enable the system to improve yet again. There’s a risk that those with the most data, namely the first movers and largest companies and countries, will overtake others’ opportunity for success.

This helps explain the critical economic issues that are already emerging. And the geopolitical dynamics are no less vital.

Two of the biggest forces of the past decade – digital technology and geopolitics – pulled the world in opposite directions. Digital technology transmitted data across borders and connected people around the world. As technology brought the world together, geopolitical dynamics pulled countries apart and kindled tensions on issues from trade to immigration. This tug-of-war explains one reason a tech sector that started the decade as one of the most popular industries ended it under scrutiny and with mounting criticism.

This tension has created a new focus that is wrapped into a term that was seldom used just a few years ago – “digital sovereignty.” The current epicenter for this issue is Western Europe, especially Germany and France. With the ever-expanding ubiquity of digital technology developed outside of Europe and the potential international data flows that can result, the protection and control of national data is a new and complicated priority, with important implications for evolving concepts of national sovereignty.

The arrival of the AI Era requires that governments think anew about balancing some critical challenges. They need to continue to benefit from the world’s most advanced technologies and move a swelling amount of data across borders to support commerce in goods and services. But they want to do this in a manner that protects and respects national interests and values. From a national security perspective, this may lead to new rules that require that a nation’s public sector data stays within its borders unless the government provides explicit permission that it can move somewhere else. From an economic perspective, it may mean combining leading international technologies with incentives for local tech development and effective sovereignty protections.

All this has also created the need for open data initiatives to level the playing field. Part of this requires opening public data by governments to provide smaller players with access to larger data sets. Another involves initiatives to enable smaller companies and organizations to share – or “federate” – their data, without surrendering their ownership or control in the data they share. This in turn requires new licensing approaches, privacy protections, and technology platforms and tools. It also requires intellectual property policies, especially in the copyright space, that facilitate this work.

During the first two decades of this century, open source software development techniques transformed the economics of coding. During the next two decades, we’ll need open data initiatives that do the same thing for data and AI.

The past year has seen some of these concepts evolve from political theory to government proposals. This past October, the German Government proposed a project called GAIA-X to protect the country’s digital sovereignty. A month later, discussions advanced to propose a common approach that would bring together Germany and France.

It’s too early to know precisely how all these initiatives will evolve. For almost four centuries, the world has lived under a “Westphalian System” defined by territorial borders controlled by sovereign states. The technology advances of the past decade have placed new stress on this system. Every aspect of the international economy now depends on data that crosses borders unseen and at the speed of light. In an AI-driven economy and data-dependent world, the movement of data is raising increasingly important questions for sovereignty in a Westphalian world. The next decade will decide how this balance is struck.

6. Digital Safety – The need to constantly battle evolving threats

The 2010s began with optimism that new technology would advance online safety and better protect children from exploitation. It ended with a year during which terrorists and criminals used even newer technology to harm innocent children and adults in ways that seemed almost unimaginable when the decade began. While the tech sector and governments have moved to respond, the decade underscores the constant war that must be waged to advance digital safety.

Optimism marked the decade’s start in part because of PhotoDNA, developed in 2009 by Microsoft and Hany Farid, then a professor at Dartmouth College. The industry adopted it to identify and compare online photos to known illegal images of child exploitation. Working with key non-profit and law enforcement groups, the technology offered real hope for turning the tide against the horrific exploitation of children. And spurred on by the British Government and others, the tech sector took additional steps globally to address images of child pornography in search results and on other services.
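PhotoDNA’s actual algorithm is proprietary, but the general idea behind this family of tools – perceptual hashing – can be sketched in a few lines. The example below is a hypothetical, simplified illustration only (an “average hash” over an 8x8 grayscale grid, standing in for a downscaled image), not PhotoDNA itself: an image is reduced to a compact fingerprint that survives small edits, and fingerprints are then compared by Hamming distance against a database of known images.

```python
# Illustrative sketch only: PhotoDNA's real algorithm is proprietary and
# far more robust. This shows the general idea of perceptual hashing --
# compact fingerprints compared by Hamming distance.

def average_hash(pixels):
    """Compute a 64-bit average hash from an 8x8 grayscale grid.

    `pixels` is a list of 8 rows of 8 brightness values (0-255),
    standing in for a downscaled, grayscaled image.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming_distance(a, b):
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

# Two nearly identical "images": the second has one pixel brightened slightly.
img1 = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
img2 = [row[:] for row in img1]
img2[0][0] = min(255, img2[0][0] + 10)

h1, h2 = average_hash(img1), average_hash(img2)
print(hamming_distance(h1, h2))  # small distance => likely the same image
```

A near-duplicate image yields a hash only a few bits away from the original, so a match can be declared below some distance threshold even after re-compression or minor edits.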

Yet as the New York Times reported in late 2019, criminals have subsequently used advancing video and livestreaming technologies, as well as new approaches to file-sharing and encrypted messaging, to exploit children even more horrifically. As a result, political pressure is again pushing industry to do more to catch up. It’s a powerful lesson of the need for constant vigilance.

Meanwhile, online safety threats become more multifaceted. One of the decade’s tragic days came on March 15, 2019 in Christchurch, New Zealand. A terrorist and white supremacist used livestreaming on the internet as the stage for mass shootings at two mosques, killing 51 innocent civilians.

Led by Prime Minister Jacinda Ardern, the New Zealand Government spearheaded a global multi-stakeholder effort to create the Christchurch Call. It has brought governments and tech companies together to share information, launch a crisis incident protocol, and take other steps to reduce the possibility of others using the internet in a similar way in the future.

All of this has also led to new debate about the continued virtues of exempting social media platforms from legal liability for the content on their sites. Typified by section 230 of the United States’ Communications Decency Act, current laws shield these tech platforms from responsibilities faced by more traditional publishers. As we look to the 2020s, it seems hard to believe that this approach will survive the next decade without change.

7. Internet Inequality – A world of haves and have-nots

In 2010, fewer than a third of the world’s population had access to the internet. As this decade concludes, the number has climbed to more than half. This represents real progress. But much of the world still lacks internet access. And high-speed broadband access lags much farther behind, especially in rural areas.

In an AI Era, access to the internet and broadband have become indispensable for economic success. With public discussion increasingly focusing on economic inequality, we need to recognize that the wealth disparity in part is rooted in internet inequality.

There are many reasons to be optimistic that there will be faster progress in the decade ahead. But progress will require new approaches and not just more money.

This starts with working with better data about who currently has internet access and at what speeds. Imagine trying to restore electric power to homes after a big storm without accurate data on where the power is out. Yet that’s the fundamental reality in a country such as the United States when we discuss closing the broadband gap. The country spends billions of dollars a year without the data needed to invest it effectively. And this data gap is by no means confined to North America.

Better data can make its best contribution if it’s coupled with new and better technology. The next decade will see a world of new communications technologies, from 5G (and ultimately 6G) to thousands of low Earth orbiting satellites and terrestrial technologies like TV White Spaces. All of this is good news. But it will be essential to focus on where each technology can best be used, because there is no such thing as a one-size-fits-all approach for communications technology. For example, 5G will transform the world, but its signals travel shorter distances, making it less than optimal for many scenarios in rural areas.

With better data and new technology, it’s possible to bring high speed internet to 90 percent of the global population by 2030. This may sound ambitious, but with better data and sounder investments, it’s achievable. Internet equality calls for ambition on this level.

8. A Tech Cold War – Will we see a digital iron curtain down the Pacific?

The new decade begins with a tech question that wasn’t on the world’s radar ten years ago. Are we witnessing the start of a “tech cold war” between the United States and China? While it’s too early to know for certain, it’s apparent that recent years have been moving in this direction. And the 2020s will provide a definitive answer.

The 2010s saw China impose more constraints on technology and information access to its local market. This built on the Great Firewall of China constructed a decade before, with more active filtering of foreign content and more constraints on local technology licenses. In 2016, the Standing Committee of the National People’s Congress adopted a broad Cyber Security Law to advance data localization and enable the government to take “all necessary” steps to protect China’s sovereignty, including through a requirement to make key network infrastructure and information systems “secure and controllable.” Combined with other measures to manage digital technology that have raised human rights concerns, these policies have effectively created a local internet and tech ecosystem that is distinct from the rest of the world.

This Chinese tech ecosystem in the latter half of the decade also grew increasingly competitive. The pace and quality of innovation have been impressive. With companies such as Huawei, Alibaba, and Tencent gaining worldwide prominence, Chinese technology is being adopted more globally while its own market is less open – and at the same time that it’s subject to Chinese cyber security public policies.

As the 2010s close, the United States is responding with new efforts to contain the spread of Chinese technology. It’s not entirely different from the American efforts to contain Russian ideology and influence in the Cold War that began seven decades ago. Powered in part by American efforts to dissuade other governments from adopting 5G equipment from China, tensions heightened in 2019 when the U.S. Department of Commerce banned American tech companies from selling to Huawei components for its products.

In both Washington and Beijing, officials are entering the new decade preparing for these tensions around technology to harden. The implications are huge. Clearly, the best time to think about a Tech Cold War is before it begins. The Cold War between the United States and Soviet Union lasted more than four decades and impacted virtually every country on the planet. As we look ahead to the 2020s, the strategic questions for each country and the implications for the world are no smaller.

9. Ethics for Artificial Intelligence – Humanity needs to govern machines

For a world long accustomed to watching robots wreak havoc on the silver screen, the last few years have brought advances in artificial intelligence that still fall far short of the capabilities seen in science fiction, but are well beyond what had seemed possible when the decade began. While typically still narrow in scope, AI enters a new decade with an increasing ability to match human perception and cognition in vision, speech recognition, language translation, and machine learning based on discerning patterns in data.

In a decade that increasingly gave rise to anxiety over the impact of technology, it’s not surprising that these advances unleashed a wave of discussions focused on AI and its implications for ethics and human rights. If we’re going to empower machines to make decisions, how do we want these decisions to be made? This is a defining question not just for the decade ahead, but for all of us who are alive today. As the first generation of people to give machines the power to make decisions, we have a responsibility to get the balance right. If we fail, the generations that follow us are likely to pay a steep price.

The good news is that companies, governments, and civil society groups around the world have embraced the need to develop ethical and human rights principles for artificial intelligence. We published a set of six ethical principles at Microsoft in January 2018, and we’ve been tracking the trends. What we’re seeing is a global movement towards an increasingly common set of principles. It’s encouraging.

As we look to the 2020s, we’re likely to see at least two new trends. The first is the shift from the articulation of principles to the operationalization of ethics. In other words, it’s not sufficient to simply state what principles an organization wants to apply to its use of AI. It needs to implement this in more precise standards backed up by governance models, engineering requirements, and training, monitoring, and ultimately compliance. At Microsoft we published our first Responsible AI Standard in late 2019, spelling out many of these new pieces. No doubt we’ll improve upon it during the next few years, as we learn both from our own experience and the work of many others who are moving in a similar direction.

The second trend involves specific issues that are defining where “the rubber meets the road” for ethical and human rights concerns. The first such issue has involved facial recognition, which arguably has become a global policy issue more rapidly than any previous digital tech issue. Similar questions are being discussed about the use of AI for lethal autonomous weapons. And conversations are starting to focus on ethics and the use of algorithms more generally. This is just a beginning. By 2030, there will likely be enough issues to fill the table of contents for a lengthy book. If there’s one common theme that has emerged in the initial issues, it’s the need to bring together people from different countries, intellectual disciplines, and economic and government sectors to develop a more common vocabulary. It’s the only way people can communicate effectively with each other as we work to develop common and effective ethical practices for machines.

10. Jobs and Income Inequality in an AI Economy – How will the world manage a disruptive decade?

It’s clear that the 2020s will bring continued economic disruption as AI enables machines to replace many tasks and jobs that are currently performed by people. At the same time, AI will create new jobs, companies, and even industries that don’t exist today. As we’ve noted before, there is a lot to learn from the global economy’s transition from a horse-powered to automobile-driven economy a century ago. Like foundational technologies before it, AI will likely create something like an economic rollercoaster, with an uneven match between prosperity and distress during particular years or in specific places.

This will create many big issues, and two are already apparent. The first is the need to equip people with the new skills needed to succeed in an AI Economy. During the 2010s, technology drove globalization and created more economic opportunity for people in many developing economies around the world, perhaps especially in India and China. The resulting competition for jobs led not only to political pressure to turn inward in some developed nations, but to a recognition that economic success in the future requires more investments in education. As we saw through data published by LinkedIn, in a country like the United States there emerged a broadened interest in Europe’s approach to apprenticeships and technical skills and the pursuit of a range of post-secondary credentials. Given the importance of this trend, it’s not surprising that there was also broader political interest in addressing the educational costs for individuals pursuing these skills.

There’s every reason to believe that these trends will accelerate further in the decade ahead. If anything, expanding AI adoption will lead to additional economic ripple effects. We’re likely to see employers and governments alike invest in expanded learning opportunities. It has become a prerequisite for keeping pace.

In many ways, however, this marks the beginning rather than the conclusion of the economic debates that lie ahead. Four decades of technological change have already contributed to mounting income inequality. It’s a phenomenon that now impacts the politics of many communities and countries, with issues that range from affordable housing to tax rates, education and healthcare investments, and income redistribution.

All this raises some of the biggest political questions for the 2020s. It reminds us that history’s defining dates don’t always coincide with the start of a new decade. For example, one of the most important dates in American political history came on September 14, 1901. It was the day that Theodore Roosevelt succeeded to the United States Presidency. More than a century later, we can see that it represented the end of more than 30 years that combined advancing technology with regulatory restraint, which led to record levels of both prosperity and inequality. In important respects, it was the first day of the Progressive Era in the United States. Technology continued to progress, but in a new political age that included stronger business regulation, product liability laws, antitrust enforcement, public investment, and an income tax.

As we enter the 2020s, political leaders in many countries are debating whether to embark on a similar shift. No one has a crystal ball. But increasingly it seems like the next decade will usher in not only a new AI Era and AI Economy, but new approaches to politics and policy. As we’ve noted before, there’s a saying that “history doesn’t repeat itself, but it often rhymes.” From our vantage point, there seems a good chance that the next decade for technology and policy will involve some historical poetry.

Go to Original Article
Author: Microsoft News Center

Stepping into each other’s shoes leads to better online shopping and a newly innovative relationship for Mastercard and Microsoft | Transform

Mastercard and Microsoft walked a mile in each other’s shoes – or in an update on the old adage, spent three days hacking together – and came up with a new service to make shopping online easier and more secure around the world, not only for shoppers, but also retailers and banks.

The collaborative experience also kicked off a new way of thinking about innovation that promises to lead to even more developments to help e-commerce thrive.

New York-based Mastercard is a leading technology company in the payments space, processing about $20 billion in transactions a day across more than 210 countries and territories. And Microsoft is one of the top e-commerce merchants in the world, with online sales from the Microsoft Store, Xbox, Azure, Office 365 and more.

Both companies have felt an urgency in shifting toward online payments – especially with the increasing popularity of mobile apps and devices – that has made security more difficult even as consumers expect greater ease of use. So they brought together teams of engineers to tackle the issue at the recent Microsoft global Hackathon at Microsoft’s Redmond, Washington, headquarters.

“Both our infrastructures are used in creating online transactions, so we owe it to our customers to make them safe, secure and simple,” says Raj Dhamodharan, Mastercard’s executive vice president of channel propositions and partnerships. “Through co-innovation our customers benefit, because we’re solving a pain point that otherwise might take years to solve.”

The collaboration comes amid changing cultures at Microsoft and Mastercard that are being fostered from the top down.

“Both companies have shifted their mentality that by partnering and bringing in diverse thoughts, we build better products and work better together,” says Will White, Microsoft’s director of payments. “The benefit is you get true innovation from two companies that have radically different missions, in different industries, with different constituents.”

Mastercard provides payment services to Microsoft’s online stores, and Microsoft sells technology services back. So the Hackathon teams built on that symbiotic relationship and experimented with ways to securely store payment info, exchange credentials and authenticate identity with biometrics – using a PC to make a theoretical purchase of a game on the Microsoft Store as a trial.

Microsoft’s double role as merchant and tech company gave Mastercard engineers a better understanding of the challenges both stakeholders face, says Mohamed Abouelenin, Mastercard’s director of product development and innovation.

“That helped us push the bar in developing new services to help provide the best experience for consumers,” Abouelenin says.

It was the first time Mastercard had participated in another company’s Hackathon. The experience energized both groups and left them wanting more.

“I saw a big difference in my team when they got back, in how they approach their jobs and have a more customer-oriented perception of things now,” says Anand Mallepally, Mastercard’s vice president of cyber and intelligence solutions, whose group is based in St. Louis. Physically being together in Redmond was “a gamechanger” for the engineers as far as seeing situations from each other’s perspectives, he says. “I can foresee more and more innovative ideas now.”

That’s crucial at a time when chips on credit cards are stopping more fraud, leading criminals increasingly to focus on online fraud instead, says Mallepally, who’s been working on fraud prevention and digital platforms with Mastercard for more than 12 years.

His team has to tread carefully, however, acknowledging that security protocols can bring friction to the shopping experience. Shoppers are turned off when they have to remember passwords or go through extra verification steps; retailers sell less when transactions take extra time; and the banks that issue credit cards incur extra expenses when they have to develop and implement new safety measures. So it’s critical to consider enhancements to improve the consumer’s experience, along with additional protections.

The situation is complicated by a new regulation Europe implemented in September that requires banks to communicate with the customer for two-factor authentication before online purchases – even for recurring charges such as monthly bills for utilities or streaming services.

The bank might send a code to a credit card customer’s mobile phone or email address, for example, and the customer has to type that in on the checkout screen before a purchase can proceed. That’s expected to reduce fraud but increase friction. It’s also expected to be adopted by other markets around the world, including the U.S., in coming years.
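The mechanics of that code flow can be sketched in a few lines. The snippet below is a toy illustration, not a bank’s actual implementation (real PSD2-style strong customer authentication runs on bank infrastructure, and every name here is hypothetical): the issuer generates a short-lived 6-digit code, keeps only a digest of it server-side, and verifies the customer’s submission in constant time before the expiry window closes.

```python
# Toy sketch of a short-lived one-time code of the kind a bank might
# text or email for two-factor authentication. All names are
# illustrative; real systems differ substantially.
import hashlib
import hmac
import secrets
import time

CODE_TTL_SECONDS = 300  # codes are time-limited and expire quickly

def issue_code():
    """Generate a 6-digit code plus the record the bank keeps server-side."""
    code = f"{secrets.randbelow(10**6):06d}"
    record = {
        "digest": hashlib.sha256(code.encode()).hexdigest(),
        "expires_at": time.time() + CODE_TTL_SECONDS,
    }
    # `code` is sent to the customer; `record` stays with the bank.
    return code, record

def verify_code(submitted, record):
    """Check the customer's code in constant time, honoring expiry."""
    if time.time() > record["expires_at"]:
        return False
    digest = hashlib.sha256(submitted.encode()).hexdigest()
    return hmac.compare_digest(digest, record["digest"])

code, record = issue_code()
wrong = "000000" if code != "000000" else "000001"
print(verify_code(code, record), verify_code(wrong, record))  # True False
```

The expiry check is what makes these codes hard to use for anyone with motor or visual impairments, as the article notes later: the clock keeps running while the user hunts for and types the digits.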

But biometric authentication on mobile devices – such as a fingerprint scanner – has been approved to allow consumers to skip that step.

That got Microsoft’s White to thinking.

“How do we level the playing field between the mobile checkout experience and the PC checkout experience?” he wondered. “And why can’t we make e-commerce payments as fast and simple as we have in the physical world, where you tap or insert a card and you’re done?”

The Hackathon teams found an answer to both, with an extra measure of innovation thrown in.

They decided to leverage the infrastructure Microsoft already has with its Windows Hello technology, which allows 900 million Windows 10 users to access their devices with a fingerprint or facial recognition, instead of a password. Through their combined efforts, they came up with a new feature that screens the user’s biometrics again and then, as long as they match the Windows Hello identification, automatically authenticates the buyer and approves purchases. The new service will give banks and merchants the assurance they’re dealing with actual customers, and shoppers won’t have to go through additional steps to prove themselves.
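The flow described above – a local biometric check gating the approval of a merchant-issued challenge – can be sketched roughly as follows. This is a heavily simplified, hypothetical sketch, not the actual Windows Hello or Mastercard implementation: real Windows Hello keys are asymmetric and hardware-backed, whereas this standard-library-only toy substitutes a shared HMAC secret, and all function names are invented for illustration.

```python
# Hedged sketch of a challenge-response purchase approval, loosely in the
# spirit of the flow described in the article. Real Windows Hello uses
# asymmetric, hardware-backed credentials; this toy uses a shared HMAC
# secret purely so the example stays self-contained.
import hashlib
import hmac
import secrets

# Provisioned once, after biometric enrollment (hypothetical setup step).
device_key = secrets.token_bytes(32)

def merchant_challenge():
    """Merchant issues a fresh nonce so approvals can't be replayed."""
    return secrets.token_bytes(16)

def device_approve(challenge, biometric_ok):
    """Device signs the challenge only after a successful local biometric check."""
    if not biometric_ok:
        return None  # no match, no signature, purchase blocked
    return hmac.new(device_key, challenge, hashlib.sha256).digest()

def merchant_verify(challenge, response):
    """Merchant checks the response against the expected signature."""
    if response is None:
        return False
    expected = hmac.new(device_key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = merchant_challenge()
print(merchant_verify(challenge, device_approve(challenge, biometric_ok=True)))   # True
print(merchant_verify(challenge, device_approve(challenge, biometric_ok=False)))  # False
```

The key property is the one the article emphasizes: the biometric never leaves the device, and the merchant only ever sees a signed challenge proving that the local check succeeded.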

And the solution can be used across many types of computers, laptops and tablets, without requiring people to own or use a specific device, as the mobile-phone offerings do.

“It’s a solution that neither Mastercard nor Microsoft could have done on our own,” says Matt Rossmeissl, Microsoft’s general manager for commerce engineering operations. “We each had to bring our own expertise to the table to get this done. They’ve got the relationships with the banks, and we’ve got hundreds of millions of Windows devices out there.”

Biometric authentication is built to make online shopping easier for everyone, but it will be especially helpful for those with disabilities, says Priyanka Banerjee, a senior program manager under Rossmeissl. Entering a code for two-factor authentication is a difficult process for anyone who’s blind, for example, or can’t use their fingers to type, especially since those codes are time-limited and expire quickly. But biometric authentication removes that friction.

“Microsoft is very focused on inclusiveness and accessibility, and that’s something that hadn’t yet been thought of in this scenario” by financial services companies, Banerjee says. “What we have built can be extended to those with disabilities, with no extra setup required, and we can make the experience of everybody better.”

The collaborative process is also helping to bring the concept to market faster. The Hackathon engineers were able to accomplish in a few days together what would have taken a month or more apart, says Mallepally.

“We created a prototype in only a week’s time, and I think that will change the relationship between us and Mastercard going forward, because we’ll be more willing to try new things and go do growth hacking,” Microsoft’s Rossmeissl says. “We have at least 10 conversations in parallel going on with Mastercard now.

“If you approach a challenge with an open mind and go into it thinking that what we produce will be better if we work together and leverage our unique independent strengths, we’ll find solutions to problems that could be far better than what we could have done if we’d tried to solve them ourselves.”

All photos provided by Mastercard.

Go to Original Article
Author: Steve Clarke