
AI in mining takes root in the industry

The mining industry has long relied on technologies such as advanced machinery, satellite imagery and hypersensitive measurement tools. However, it is just beginning to adopt AI, which has the potential to save workers time and companies money.

Geospatial analysis and data science vendor Descartes Labs has many customers in the mining sector, with a few packaged products built specifically for them. Based in Santa Fe, N.M., the startup spun out of Los Alamos National Laboratory, a U.S. Department of Energy weapons research center, in 2014.

The mining sector is in the early stages of using AI technologies, said James Orsulak, senior director of enterprise sales at Descartes Labs. Still, he said, almost all of the company’s clients have small data science teams made up of highly skilled experts.

“We’re seeing a transition where there are more former geologists who went back to school to get a data science degree,” Orsulak said.

Astral imagery

The Descartes Labs platforms for mining companies combine data sets from NASA’s Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), an advanced imaging instrument on the Terra satellite, with AI and analytics.

Vendors like Descartes Labs sell AI in mining technology.

Descartes Labs ingested the entire dataset from ASTER, a process that took many CPU hours, Orsulak said. Using machine learning, Descartes Labs then removed all the structures, clouds and snow from the satellite images, leaving only a bare earth model.
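A rough sketch of what such a masking step can look like, assuming a per-pixel land-cover classifier has already been trained; the class labels and the median compositing shown here are illustrative assumptions, not Descartes Labs' actual pipeline:

```python
import numpy as np

# Hypothetical class IDs a trained land-cover classifier might emit.
BARE_EARTH, STRUCTURE, CLOUD, SNOW = 0, 1, 2, 3

def bare_earth_composite(scenes, classify_pixels):
    """Combine satellite scenes, keeping only bare-earth pixel values.

    scenes: list of 2D arrays (same shape), one per acquisition date.
    classify_pixels: function mapping a scene to an array of class IDs.
    """
    stack = np.stack(scenes)                                # (n, H, W)
    labels = np.stack([classify_pixels(s) for s in scenes])
    masked = np.where(labels == BARE_EARTH, stack, np.nan)  # hide the rest
    # The median across dates ignores the masked structure, cloud and snow
    # pixels, leaving a single bare-earth model of the surface.
    return np.nanmedian(masked, axis=0)
```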


Mining clients then bring their own data into the platform and layer other data types onto the model, including mineral classification, geochemistry and geophysics data.

The platform, among other things, can be used to find new mineral deposits with machine learning, as it can use data on known deposits to find similar, previously unknown deposits.

Manually, that can take months or years, said Lori Wickert, a geologist and principal remote sensing consultant at Newmont Corporation, a gold mining company. 

“Working with the Descartes platform is providing an opportunity to shortcut that process in a major way,” Wickert said, adding that the software has saved her countless hours of manual work.
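As a rough illustration of that known-deposit approach, here is a sketch using synthetic data and a generic classifier; the features and model choice are assumptions for illustration, not details of the Descartes Labs platform:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
# Stand-in data: one row per map cell, columns standing in for spectral
# bands and geochemistry values; label 1 marks cells with a known deposit.
features = rng.normal(size=(5000, 12))
labels = (features[:, 0] + features[:, 3] > 1.5).astype(int)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(features, labels)

# Score unexplored cells; the highest probabilities flag terrain that
# statistically resembles known deposits and merits field follow-up.
candidates = rng.normal(size=(1000, 12))
scores = model.predict_proba(candidates)[:, 1]
top_prospects = np.argsort(scores)[::-1][:20]
```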

Another style

Meanwhile, Kespry, an industrial drone software and analytics vendor, also focuses on the mining sector, but with a slightly different approach.

The 2013 startup, based in Menlo Park, Calif., flies industrial drones over mining sites and feeds the imagery into its analytics products for mine planning and inventory management, said George Mathew, CEO and chairman of Kespry.

Using drone imagery collected either by its own drones or by mining industry customers, along with its data science platform, Kespry maps daily topography changes in active areas, identifies slope stability, identifies drainage patterns and more.

The company can also use the imagery and platform to automatically measure stockpile volumes of mined materials.
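Once photogrammetry has turned that imagery into elevation grids, the volume measurement itself reduces to summing height differences over the grid. A minimal sketch, with the grid resolution and base-surface handling as assumptions rather than Kespry's actual method:

```python
import numpy as np

def stockpile_volume(surface_dem, base_dem, cell_size_m=0.1):
    """Estimate stockpile volume in cubic meters from two elevation grids.

    surface_dem: 2D array of elevations over the pile (from drone imagery).
    base_dem:    2D array of the ground surface beneath and around the pile.
    cell_size_m: edge length of one square grid cell, in meters.
    """
    heights = np.clip(surface_dem - base_dem, 0, None)  # ignore dips below base
    return float(heights.sum() * cell_size_m ** 2)
```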

For mining companies and other industrial businesses that aren’t yet using AI and machine learning, the time to start is now, Mathew said.  

“The companies that end up making those investments now, they end up with a head start,” he said.


Using Azure AD conditional access for tighter security

As is standard with technologies in the cloud, the features in Azure Active Directory are on the move.

The Azure version of Active Directory differs from its on-premises version in many ways, including its exposure to the internet. There are ways to protect your environment and be safe, but that’s not the case by default. Here are two changes you should make to protect your Azure AD environment.

Block legacy authentication

Modern authentication is Microsoft’s term for a set of rules and requirements governing how systems communicate and authenticate with Azure AD. It brings several security benefits, but it is not enforced by default on an Azure AD tenant.

Legacy authentication is used for many types of attacks against Azure AD-based accounts. If you block legacy authentication, you will block those attacks, but there’s a chance you’ll also prevent users from performing legitimate tasks.

This is where Azure AD conditional access can help. Instead of a simple off switch for legacy authentication, you can create one or more policies — a set of rules — that dictate what is and isn’t allowed under certain scenarios.

You can start by creating an Azure AD conditional access policy that requires modern authentication and blocks any sign-in attempt that doesn’t use it. Microsoft recently added a “report only” option to conditional access policies; it’s highly recommended to enable it and leave it on for a few days after deployment. This will show you which users are still using legacy authentication, so you can remediate them before you enforce the policy for real and avoid stopping users from doing their jobs.
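If you prefer to script the policy rather than build it in the Azure portal, the Microsoft Graph conditional access API can create it. A minimal sketch in Python; token acquisition is omitted, and the endpoint and required permission (Policy.ReadWrite.ConditionalAccess at the time of writing) should be verified against current Graph documentation:

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
token = "<access token with conditional access policy write permission>"

policy = {
    "displayName": "Block legacy authentication (report-only)",
    # Report-only: evaluated and logged, but not enforced yet.
    "state": "enabledForReportingButNotEnforced",
    "conditions": {
        "users": {"includeUsers": ["All"]},
        "applications": {"includeApplications": ["All"]},
        # "other" covers legacy protocols; Exchange ActiveSync is its own type.
        "clientAppTypes": ["exchangeActiveSync", "other"],
    },
    "grantControls": {"operator": "OR", "builtInControls": ["block"]},
}

resp = requests.post(
    f"{GRAPH}/identity/conditionalAccess/policies",
    headers={"Authorization": f"Bearer {token}"},
    json=policy,
)
resp.raise_for_status()
print(resp.json()["id"])
```

Switching the policy’s state from “enabledForReportingButNotEnforced” to “enabled” is what flips it from monitoring to enforcement.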

However, this change will severely limit mobile phone email applications. The only ones officially supported with modern authentication are Outlook for iOS and Android, and Apple iOS Mail.

Implement multifactor authentication

This sounds like an obvious one, but there are many ways to do multifactor authentication (MFA). Your Microsoft licensing is one of the factors that dictates your choices. The good news is that options are available to all licensing tiers — including the free one — but the most flexible options come from Azure AD Premium P1 and P2.

With those paid plans, conditional access rules can be far more flexible than simply forcing MFA all the time. For example, you might not require MFA if the user accesses a Microsoft service from an IP address at your office, or if the device is Azure AD-joined. You might even require both of those conditions to skip MFA, while other situations, such as a user seeking access on a PC not owned by the company, will prompt for extra authentication.
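In Graph API terms, those carve-outs live in the policy’s conditions and grant controls. A hedged fragment showing the shape of such a policy; the named-location ID is a placeholder you would look up in your own tenant:

```python
# Require MFA for all cloud apps, but let sign-ins from a trusted office
# IP range (a "named location" defined in the tenant) bypass the policy,
# and let a compliant or hybrid Azure AD-joined device satisfy it without MFA.
mfa_policy_fragment = {
    "conditions": {
        "users": {"includeUsers": ["All"]},
        "applications": {"includeApplications": ["All"]},
        "locations": {
            "includeLocations": ["All"],
            "excludeLocations": ["<named-location-id-for-office-ips>"],
        },
    },
    "grantControls": {
        # OR means satisfying any one control grants access.
        "operator": "OR",
        "builtInControls": ["mfa", "compliantDevice", "domainJoinedDevice"],
    },
}
```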

MFA doesn’t have to be just SMS-based authentication. Microsoft’s Authenticator app might take a few more steps to set up the first time someone registers, but it’s much easier to approve a pop-up on your mobile device as a second authentication factor than to wait for an SMS, read the six-digit number and type it into your PC.

Without MFA, you’re running the high risk of an internet-exposed authentication system against which attackers can try leaked credentials or password spray attacks until they hit a successful username and password combination.

The other common attack is credential phishing. This can be particularly successful when the threat actor uses a compromised account to send phishing emails to the person’s contacts or uses fake forms to harvest those contacts’ credentials, too. These attacks would be mostly harmless if the victims’ accounts required MFA.

Even without MFA, accounts in Azure AD lock out after 10 failed attempts, but only for a minute, with the lockout duration gradually increasing after further failed attempts. This is a good way to slow attackers down, and it’s also smart enough to block only the attacker and keep your user working away. But the attacker can simply move on to the next account and come back to the previous one later, eventually hitting a correct password.

Azure AD conditional access changes are coming

The above recommendations can be enabled through four baseline conditional access policies, which should be visible in all Azure AD tenants (still in preview), but it appears these will be removed in the future.

Microsoft plans to replace the baseline protection policies with security defaults.

The policies will be replaced by a single option called Security Defaults, found under the Manage > Properties section of Azure AD. The baseline policies let you be somewhat granular about which security features you enabled; to keep that flexibility once they’re gone, you’ll need Azure AD Premium.

Turning on Security Defaults in your Azure AD tenant will:

  • force administrators to use MFA;
  • force privileged actions, such as using Azure PowerShell, to use MFA;
  • force all users to register for MFA within 14 days; and
  • block legacy authentication for all users.
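The same toggle can be flipped programmatically. A minimal sketch against the Microsoft Graph API, assuming an app with the appropriate policy write permission; verify the endpoint against current documentation before relying on it:

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
token = "<access token with policy write permission>"

# Enable the single tenant-wide Security Defaults switch.
resp = requests.patch(
    f"{GRAPH}/policies/identitySecurityDefaultsEnforcementPolicy",
    headers={"Authorization": f"Bearer {token}"},
    json={"isEnabled": True},
)
resp.raise_for_status()
```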

I suspect uptake wasn’t high enough, which is why Microsoft is moving to a single toggle to enable these recommendations. I’d also hazard a guess that Microsoft will turn this option on by default for new tenants in the future, but there’s no need for you to wait. If you don’t have these protections on, you should be working on enabling them as soon as you can.


KPMG expects to invest US$5 billion in digital strategy and expand Microsoft alliance to accelerate professional services transformation

New innovations built on Microsoft cloud and AI technologies help clients achieve greater accuracy and decision-making capabilities, increased productivity, and cost efficiencies.

AMSTELVEEN, Netherlands and REDMOND, Wash. — Dec. 5, 2019 — KPMG and Microsoft Corp. are strengthening their global relationship through a five-year agreement to accelerate digital transformation for KPMG member firms and mutual clients. As part of its announcement to significantly invest in technology, people and innovation, KPMG is modernizing its workplace using the Microsoft 365 suite of cloud-based collaboration and productivity tools, including Microsoft Teams. KPMG is also utilizing Microsoft Azure and its AI capabilities as the backbone for a new common, global cloud-based platform. The platform will strengthen KPMG’s range of digital offerings with new innovations in cloud-based audit capabilities, tax solutions and risk management. Clients in all sectors, including those in highly regulated industries, benefit from globally consistent and continuous service delivery that enables greater speed of deployment while adhering to industry-leading compliance and security standards.

“Together with KPMG, we’re accelerating digital transformation across industries by bringing the latest advances in cloud, AI and security to highly regulated workloads in tax, audit and advisory,” said Satya Nadella, CEO, Microsoft. “KPMG’s deep industry and process expertise, combined with the power of our trusted cloud — spanning Azure, Dynamics 365 and Microsoft 365 — will bring the best of both organizations together to help customers around the world become more agile in an increasingly complex business environment.”

New business-critical solutions

As organizations expand to new geographies, develop new products and recruit new talent, processes can become increasingly complex and harder to scale. Market forces, such as evolving data protection laws, currency fluctuations and geopolitical tensions, increase the complexity and require a greater responsiveness for systems and tools.

The strong portfolio of KPMG and Microsoft alliance offerings can help address these challenges more quickly by building applications on demand, automating manual processes, and continuously analyzing information to minimize the risk of errors and increase the ability to make smart decisions.

“Our alliance with Microsoft has become a critical component in helping us deliver industry-leading solutions and services to clients. Our significant multiyear investment continues to empower our people to work more efficiently and collaboratively while maximizing the power of a workforce that blends people and technology,” said Bill Thomas, Global Chairman, KPMG International. “By harnessing Microsoft’s intelligent and trusted cloud, we aim to help clients be at the forefront of change and better prepared for a digital-first future.”

Combining industry expertise with advanced technology

Through a jointly funded incubator, KPMG and Microsoft are co-developing a robust portfolio of solutions and managed services in the areas of cloud business transformation, intelligent business applications and smart workplace solutions.

For example, KPMG in the U.S. and Microsoft are working together to bring the power of Azure to the healthcare and life sciences industries. This collaboration is enabling organizations within this highly regulated sector to maximize their clinical, operational and financial performance with an easily scalable solution that helps improve deployment speed, accelerate ROI and increase data-driven insights.

In addition, KPMG in the Netherlands has developed risk management, compliance and internal audit solutions that leverage discovery tools to enable the digitization of risk and compliance processes across domains such as finance, legal and IT. Designed by KPMG and built on Microsoft Azure, the solutions provide seamless and cost-efficient policy and controls automation, putting smart analytics directly in the hands of business and IT operators so they can make timely, corrective actions when deviations occur.

Smart audit platform

KPMG, with the launch of its smart audit platform KPMG Clara in 2017, became the first of the Big Four to take its audit workflow to the cloud. Based on Microsoft Azure, KPMG Clara is an automated, agile, intelligent and scalable platform that allows KPMG professionals to work smarter, bringing powerful data and analytics capabilities into one interface, while allowing clients to interact on a real-time basis with the audit process.

By enriching the audit mandate with AI, KPMG enables its professionals to make decisions based on real-time data. This further reinforces KPMG’s commitment to maintaining and enhancing audit quality and building a future where technology continually enriches the audit through the introduction of new digital advancements.

KPMG Clara will integrate with Microsoft Teams, providing a platform for audit professionals to centrally manage and securely share audit files, track audit-related activities, and communicate using chat, voice and video meetings. This helps simplify the auditors’ workflow, enabling them to stay in sync throughout the entire process and drive continuous communication with the client.

Empowering workforce transformation

Through its common, global cloud platform, KPMG will create a set of cloud-based capabilities ranging from hosting infrastructure based on Microsoft Azure to more than 50 advanced solutions, such as AI, cyber and robotic process automation (RPA). KPMG will further empower its global workforce of over 207,000 employees across 153 countries with Microsoft 365, including Teams, to better meet the needs of clients through increased collaboration, improved productivity and data-driven insights. In addition, more than 30,000 KPMG professionals across 17 member firms have been equipped with Dynamics 365, a suite of powerful customer relationship applications.

To read more about the KPMG and Microsoft alliance, visit the Microsoft Transform blog.

About KPMG International 

KPMG is a global network of professional services firms providing Audit, Tax and Advisory services. We operate in 153 countries and territories and have 207,000 people working in member firms around the world. The independent member firms of the KPMG network are affiliated with KPMG International Cooperative (“KPMG International”), a Swiss entity. Each KPMG firm is a legally distinct and separate entity and describes itself as such.

About Microsoft

Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

For more information, press only:
Microsoft Media Relations, WE Communications for Microsoft, (425) 638-7777,
[email protected]
Mark Walters, KPMG International, (646) 943-2115, [email protected]

Note to editors: For more information, news and perspectives from Microsoft, please visit the Microsoft News Center at http://news.microsoft.com. Web links, telephone numbers and titles were correct at time of publication but may have changed. For additional assistance, journalists and analysts may contact Microsoft’s Rapid Response Team or other appropriate contacts listed at http://news.microsoft.com/microsoft-public-relations-contacts.


Google to unveil post-Chronicle cloud cybersecurity plans

Google is set to reveal how cloud cybersecurity technologies developed by Chronicle have been worked into its portfolio for large enterprise customers.

In June, Google Cloud announced it had acquired Chronicle, a startup launched within parent company Alphabet in 2015. Integration work has proceeded since then, and details will be shared at the Cloud Next ’19 UK conference, which begins in London on Nov. 20.

A recent report on Chronicle from Vice’s Motherboard publication painted a bleak picture of the company post-Google acquisition, with key executives including its founder and CEO departing, and dismal morale in the product-development trenches.

“People keep quitting. Sales doesn’t know what to do, since there’s no real product roadmap anymore. Engineering is depressed for the same reason,” an unnamed Chronicle employee told the site.

Asked for comment, a Google spokeswoman pointed to the company’s blog post on the upcoming announcements at Cloud Next UK, and did not address the claims of unrest at Chronicle.

Google plans to announce “multiple new native capabilities” for security, as well as planned new features for Backstory, Chronicle’s main cloud cybersecurity product, according to the blog.

Backstory can ingest massive amounts of security telemetry data and process it for insights. It is geared toward companies that have a wealth of this information but lack the staff or resources to analyze it in-house.

Customers upload their telemetry data to a private repository on Google Cloud infrastructure, where it is indexed and analyzed by Chronicle’s software engine. The engine compares the customer’s data against threat intelligence signals mined from many sources and looks for problematic correlations.
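Conceptually, that correlation step is a join between event telemetry and threat intelligence indicators. A toy sketch in Python, purely illustrative since Chronicle’s actual engine and data model are proprietary:

```python
# Flag telemetry events whose destinations appear in a threat-intel feed.
# Real systems do this at massive scale, with enrichment, scoring and
# time-windowed correlation rather than a simple dictionary lookup.
threat_intel = {
    "evil.example.com": "known command-and-control domain",
    "203.0.113.7": "scanner flagged by a partner feed",
}

telemetry = [
    {"host": "laptop-42", "dest": "evil.example.com"},
    {"host": "laptop-17", "dest": "intranet.example.com"},
]

hits = [
    {**event, "reason": threat_intel[event["dest"]]}
    for event in telemetry
    if event["dest"] in threat_intel
]
for hit in hits:
    print(f"{hit['host']} -> {hit['dest']}: {hit['reason']}")
```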

Backstory will compete with both on-premises security information and event management platforms and cloud cybersecurity systems, such as Sumo Logic and Splunk. Rival cloud providers have responded as well, with one prominent case being Azure Sentinel, which Microsoft launched this year.

Beyond performance and results, pricing may be a key factor for Backstory. Chronicle has made much of the fact that it won’t be priced according to data volume, but the exact nature of the business model still isn’t clear. Microsoft uses a tiered, fixed-fee pricing scheme for Azure Sentinel based on daily data capacity.

Backstory’s biggest opportunity may be outside Google Cloud

Jon Oltsik

While Chronicle’s staff would have enjoyed more freedom if kept independent from Google Cloud, there’s no evidence to suggest it’s being held back at this point, according to Jon Oltsik, senior principal analyst for cybersecurity at Enterprise Strategy Group.


“The Google Cloud management team needs to give Chronicle the latitude to innovate and compete against a strong and dynamic market,” he said. “This should be the model moving forward and I’ll be monitoring how it proceeds.”

There is an emerging market for security analytics and operations tools built to monitor the security of cloud-based workloads, which aligns well with Google Cloud, Oltsik said. But the bigger opportunity lies with customers who aren’t necessarily Google Cloud users, he added.


Boost your ecommerce revenue with Dynamics 365 Fraud Protection

With the booming growth of online technologies and marketplaces comes a rising variety of cybersecurity challenges for businesses that conduct any part of their operations through online software and the internet. Fraud is one of the most pervasive problems of the modern online marketplace and continues to be a consistent, invasive issue for all businesses.

As the rate of payment fraud continues to rise, especially in retail ecommerce where the liability lies with the merchant, so does the amount companies spend each year to combat and secure themselves against it. Fraud and wrongful rejections already significantly impact merchants’ bottom lines, in a booming economy as well as when the economy is soft.

The impact of outdated fraud detection tools and false alarms

Customers, merchants, and banking institutions have been impacted for years by suboptimal experiences, increased operational expenses, wrongful rejections, and reduced revenue. To combat these negative business impacts, companies have been implementing layered solutions. For example, merchant risk managers are bogged down with manual reviews and analysis of their own local 30/60/90-day historical data. These narrow, outdated views of data provide a partial hindsight view of fraud trends, leaving risk managers with no real-time information to work with when creating new rules to hopefully minimize fraud loss.

One of the most common ways that fraud impacts everyday consumers and businesses is through wrongful rejections. For example, when a merchant maintains an outdated or overly strict set of transaction rules and algorithms, a customer who initiates a retail ecommerce transaction with a credit card might, because of those outdated rules, experience a wrongful rejection, known to consumers as a declined transaction. Similarly, wrongful declines can also happen when the card-issuing bank refuses to authorize the purchase due to suspicion of fraud. For all parties involved (customers, merchants and banks), these suboptimal experiences translate directly into lost credibility, security and business revenue.

Introducing Microsoft Dynamics 365 Fraud Protection

As one of the biggest technology organizations in the world, Microsoft saw an opportunity to provide software as a service that effectively and visibly helps reduce the rate and pervasiveness of fraud while simultaneously helping to reduce wrongful declined transactions and improving customer experience. Microsoft Dynamics 365 Fraud Protection is a cloud-based solution merchants can use in real-time to help lower their costs related to combatting fraud, help increase their revenue by improving acceptance of legitimate transactions, reduce friction in customer experience, and integrate easily into their existing order management system and payment stack. This solution offers a global level of fraud insights using data sets from participating merchants that are processed with real-time machine learning to detect and mitigate evolving fraud schemes in a timely manner.

Microsoft Dynamics 365 Fraud Protection houses five powerful capabilities designed to capitalize on the power of machine learning to provide merchants with an innovative fraud protection solution:

  • Adaptive AI technology continuously learns and adapts from patterns and trends and will equip fraud managers with the tools and data they need to make informed decisions on how to optimize their fraud controls.
  • A fraud protection network maintains up-to-date connected data that provides a global view of fraud activity and maintains the security of merchants’ confidential information and shoppers’ privacy.
  • Transaction acceptance booster shares transactional trust knowledge with issuing banks to help boost authorization rates.
  • Customer escalation support provides detailed risk insights about each transaction to help improve merchants’ customer support experience.
  • Account creation protection monitors account creation, helps minimize abuse and automated attacks on customer accounts, and helps to avoid incurring losses due to fraudulent accounts.


Banks worldwide can choose to participate in the Dynamics 365 Fraud Protection transaction acceptance booster feature to increase acceptance rates of legitimate authorization requests from online merchants using Dynamics 365 Fraud Protection. Merchants using the product can opt to use this feature to increase acceptance rates for authorization requests made to banks without having to make any changes to their existing authorization process.

Learn more

This week at Sibos 2019 in London, Microsoft will be showcasing its secure and compliant cloud solutions for the banking industry. Read a round-up of announcements unveiled at Sibos and view an agenda of Microsoft events and sessions at the show. Stop by our booth (Z131) for a showcase of applications relevant to banking, including Microsoft Dynamics 365 Fraud Protection, which will be generally available on October 1st, 2019. Contact your Microsoft representative to get started.


Supporting modern technology policy for the financial services industry – guidelines by the European Banking Authority

The financial services community has unprecedented opportunity ahead. With new technologies like cloud, AI and blockchain, firms are creating new customer experiences, managing risk more effectively, combating financial crime, and meeting critical operational objectives. Banks, insurers and other services providers are choosing digital innovation to address these opportunities at a time when competition is increasing from every angle – from traditional and non-traditional players alike.

At the same time, our experience is that lack of clarity in regulation can hinder adoption of these exciting technologies, as regulatory compliance remains fundamental to financial institutions using technology they trust.  Indeed, the common question I get from customers is: Will regulators let me use your technology, and have you built in the capabilities to help me meet my compliance obligations?

Dave Dadoun, assistant general counsel for Microsoft.

With this in mind, we applaud the European Banking Authority’s (EBA) revised Guidelines on outsourcing arrangements which, in part, address the use of cloud computing. For several years now we have shared perspectives with regulators on how regulation can be modernized to address cloud computing without diminishing the security, privacy, transparency and compliance safeguards necessary in a native cloud or hybrid-cloud world. In fact, cloud computing can afford financial institutions greater risk assurance – particularly on key things like managing data, securing data, addressing cyber threats and maintaining resilience.

At the core of the revised guidelines are a set of flexible principles addressing cloud in financial services. Indeed, the EBA has been clear these “guidelines are subject to the principle of proportionality,” and should be “applied in a manner that is appropriate, taking into account, in particular, the institution’s or payment institution’s size … and the nature, scope and complexity of its activities.” In addition, the guidelines set out to harmonize approaches across jurisdictions, a big step forward for financial institutions to have predictability and consistency among regulators in Europe. We think the EBA took this smart move to support leading-edge innovation and responsible adoption, and prepare for more advanced technology like machine learning and AI going forward.

Given these guidelines reflect a modernized approach that transcends Europe, we have updated our global Financial Services Amendment for customers to reflect these key changes. We have also created a regulatory mapping document which shows how our cloud services and underlying contractual commitments map to these requirements in an EU Checklist. The EU Checklist is accessible on the Microsoft Service Trust Portal. In essence, Europe offers the benchmark in establishing rules to permit use of cloud for financial services and we are proud to align to such requirements.

Because this is such an important milestone for the financial sector, we wanted to share our point-of-view on a few key aspects of the guidelines, which may help firms accelerate technology transformation with the Microsoft cloud going forward:

  • Auditability: As cloud has become more prevalent, we think it is natural to extend audit rights to cloud vendors in circumstances that warrant it. We also think that audits are not a one-size-fits-all exercise but should adapt to the use case, particularly whether it involves running core banking systems in the cloud. Microsoft has provided innovations to help supervise and audit hyper-scale cloud.
  • Data localization: We are pleased there are no data localization requirements in the EBA guidance. Rather, customers must assess the legal, security and other risks where data is stored, as opposed to mandating data be stored strictly in Europe. We help customers manage and assess such risk by providing:
    • Contractual commitments to store data at rest in a specified region (including Europe).
    • Transparency where data is stored.
    • Full commitments to meet key privacy requirements, like the General Data Protection Regulation (GDPR).
    • Flow-through of such commitments to our subcontractors.
  • Subcontractors: The guidelines address subcontractors, particularly those that provide “critical or important” functions. Management, governance and oversight of Microsoft’s subcontractors is core to what we do. Among other things:
    • Microsoft’s subcontractors are subject to a vetting process and must follow the same privacy and governance controls we ourselves implement to protect customer data.
    • We provide transparency about subcontractors who may have access to customer data and provide 180 days’ notification of any new subcontractors as well.
    • We provide customers termination rights should they conclude a subcontractor presents a material increase in risk to a critical or important function of their operations.
  • Core platforms: We welcome the EBA’s position providing clarity that core platforms may run in the cloud. What matters is governance, documenting protocols, the security and resiliency of such systems, and having appropriate oversight (and audit rights), and commitments to terminate an agreement, if and when that becomes necessary. These are all capabilities Microsoft offers to its customers and we now see movement among leading banks to put core systems into our cloud because of the benefits we provide.
  • Business continuity and exit planning: Institutions must have business continuity plans and test them periodically for critical or important functions. Microsoft helps customers meet this requirement by providing a Modern Cloud Risk Assessment toolkit and, in the Service Trust Portal, documentation on our service resilience architecture, our Enterprise Business Continuity Management (EBCM) team and a quarterly report detailing results from our recent EBCM testing. In addition, we have supported our customers in preparing exit planning documentation, and we work with industry bodies like the European Banking Federation toward further industry guidance for these new EBA requirements.
  • Concentration risk: The EBA addresses the need to assess whether concentration risk may exist due to potential systemic failures in the use of cloud services (and other legacy infrastructure). However, this must be balanced against the risks of a single point of failure in existing legacy systems and the trade-offs those systems entail. In short, financial institutions should assess the resiliency and safeguards provided with our hyper-scale cloud services, which can offer a more robust approach than the systems in place today. When making those assessments, financial institutions may decide to lean in more with cloud as they transform their businesses going forward.

The EBA framework is a great step forward in modernizing regulation to take advantage of cloud computing. We look forward to participating in ongoing industry discussion, such as new guidance under consideration by the European Insurance and Occupational Pensions Authority concerning use of cloud services, as well as assisting other regions and countries in their journey to creating more modern policy that supports innovation while protecting the integrity of critical global infrastructure.

For more information on Microsoft in the financial services industry, please go here.

Top photo courtesy of the European Banking Authority.


Azure and VMware innovation and momentum

Since announcing Azure VMware Solutions at Dell Technologies World this spring, we’ve been energized by the positive feedback we’ve received from our partners and customers who are beginning to move their VMware workloads to Azure. One of these customers is Lucky Brand, a leading retailer that is embracing digital transformation while staying true to its rich heritage. As part of their broader strategy to leverage the innovation possible in the cloud, Lucky Brand is transitioning several VMware workloads to Azure.

“We’re seeing great initial ROI with Azure VMware Solutions. We chose Microsoft Azure as our strategic cloud platform and decided to dramatically reduce our AWS footprint and 3rd Party co-located data centers. We have a significant VMware environment footprint for many of our on-premises business applications.

“The strategy has allowed us to become more data driven and allow our merchants and finance analysts the ability to uncover results quickly and rapidly with all the data in a central cloud platform providing great benefits for us in the competitive retail landscape. Utilizing Microsoft Azure and VMware we leverage a scalable cloud architecture and VMware to virtualize and manage the computing resources and applications in Azure in a dynamic business environment.

“Since May, we’ve been successfully leveraging these applications on the Azure VMware Solution by CloudSimple platform. We are impressed with the performance, ease of use and the level of support we have received from Microsoft and its partners.”

Kevin Nehring, CTO, Lucky Brand

Expanding to more regions worldwide and adding new capabilities

Based on customer demand, we are excited to announce that we will expand Azure VMware Solutions to a total of eight regions across the US, Western Europe, and Asia Pacific by end of year.

In addition to expanding to more regions, we are continuing to add new capabilities to Azure VMware Solutions and deliver seamless integration with native Azure services. One example is how we’re expanding the supported Azure VMware Solutions storage options to include Azure NetApp Files by the end of the year. This new capability will allow IT organizations to more easily run storage intensive workloads on Azure VMware Solutions. We are committed to continuously innovating and delivering capabilities based on customer feedback.

Broadening the ecosystem

It is amazing to see the market interest in Azure VMware Solutions and the partner ecosystem building tools and capabilities that support Azure VMware Solutions customer scenarios.

RiverMeadow now supports capabilities to accelerate the migration of VMware environments on Azure VMware Solutions.

“I am thrilled about our ongoing collaboration with Microsoft. Azure VMware Solutions enable enterprise customers to get the benefit of cloud while still running their infrastructure and applications in a familiar, tried and trusted VMware environment. Add the performance and cost benefits of VMware on Azure, and you have a complete solution. I fully expect to see substantial enterprise adoption over the short term as we work with Microsoft’s customers to help them migrate even the most complex workloads to Azure.”

Jim Jordan, President and CEO, RiverMeadow

Zerto has integrated its IT Resilience Platform with Azure VMware Solutions, delivering replication and failover capabilities between Azure VMware Solution by CloudSimple, Azure and any other Hyper-V or VMware environments, keeping the same on-premises environment configurations, and reducing the impact of disasters, logical corruptions, and ransomware infections.

“Azure VMware Solution by CloudSimple brings the familiarity and simplicity of VMware into the Azure public cloud. Every customer and IT pro using VMware will be instantly productive with minimal or no Azure competency. With Zerto, VMware customers gain immediate access to simple point-and-click disaster recovery and migration capabilities between Azure VMware Solutions, the rest of Azure, and on-premises VMware private clouds. Zerto, one of Microsoft’s top ISVs and an award-winning industry leader in VMware-based disaster recovery and cloud migration, delivers native support for Azure VMware Solutions.”

Peter Kerr, Vice President of Global Alliances, Zerto

Veeam Backup & Replication™ software specializes in supporting VMware vSphere environments, and Veeam’s solutions will help customers meet the backup demands of organizations deploying Azure VMware Solutions.

“As a leading innovator of Cloud Data Management solutions, Veeam makes it easy for our customers to protect their virtual, physical, and cloud-based workloads regardless of where those reside. Veeam’s support for Microsoft Azure VMware Solutions by CloudSimple further enhances that position by enabling interoperability and portability across multi-cloud environments. With Veeam Backup & Replication, customers can easily migrate and protect their VMware workloads in Azure as part of a cloud-first initiative, create an Azure-based DR strategy, or simply create new Azure IaaS instances – all with the same proven Veeam solutions they already use today.”  

Ken Ringdahl, Vice President of Global Alliances Architecture, Veeam Software

Join us at VMworld

If you plan to attend VMworld this week in San Francisco, stop by our booth to see Azure VMware Solutions in action, or sit down for a few minutes and listen to one of our mini theater presentations addressing a variety of topics such as Windows Virtual Desktop, Windows Server, and SQL Server on Azure, in addition to Azure VMware Solutions!

Learn more about Azure VMware Solutions.


Data ethics issues create minefields for analytics teams

GRANTS PASS, Ore. — AI technologies and other advanced analytics tools make it easier for data analysts to uncover potentially valuable information on customers, patients and other people. But, too often, consultant Donald Farmer said, organizations don’t ask themselves a basic ethical question before launching an analytics project: Should we?

In the age of GDPR and like-minded privacy laws, though, ignoring data ethics isn’t a good business practice for companies, Farmer warned in a roundtable discussion he led at the 2019 Pacific Northwest BI & Analytics Summit. IT and analytics teams need to be guided by a framework of ethics rules and motivated by management to put those rules into practice, he said.

Otherwise, a company runs the risk of crossing the line in mining and using personal data — and, typically, not as the result of a nefarious plan to do so, according to Farmer, principal of analytics consultancy TreeHive Strategy in Woodinville, Wash. “It’s not that most people are devious — they’re just led blindly into things,” he said, adding that analytics applications often have “unforeseen consequences.”

For example, he noted that smart TVs connected to home networks can monitor whether people watch the ads in shows they’ve recorded and then go to an advertiser’s website. But acting on that information for marketing purposes might strike some prospective customers as creepy, he said.

Shawn Rogers, senior director of analytic strategy and communications-related functions at vendor Tibco Software Inc., pointed to a trial program that retailer Nordstrom launched in 2012 to track the movements of shoppers in its stores via the Wi-Fi signals from their cell phones. Customers complained about the practice after Nordstrom disclosed what it was doing, and the company stopped the tracking in 2013.

“I think transparency, permission and context are important in this area,” Rogers said during the session on data ethics at the summit, an annual event that brings together a small group of consultants and vendor executives to discuss BI, analytics and data management trends.

AI algorithms add new ethical questions

Being transparent about the use of analytics data is further complicated now by the growing adoption of AI tools and machine learning algorithms, Farmer and other participants said. Increasingly, companies are augmenting — or replacing — human involvement in the analytics process with “algorithmic engagement,” as Farmer put it. But automated algorithms are often a black box to users.

Mike Ferguson, managing director of U.K.-based consulting firm Intelligent Business Strategies Ltd., said the legal department at a financial services company he works with killed a project aimed at automating the loan approval process because the data scientists who developed the deep learning models to do the analytics couldn’t fully explain how the models worked.


And that isn’t an isolated incident in Ferguson’s experience. “There’s a loggerheads battle going on now in organizations between the legal and data science teams,” he said, adding that the specter of hefty fines for GDPR violations is spurring corporate lawyers to vet analytics applications more closely. As a result, data scientists are focusing more on explainable AI to try to justify the use of algorithms, he said.

The increased vetting is driven more by legal concerns than data ethics issues per se, Ferguson said in an interview after the session. But he thinks that the two are intertwined and that the ability of analytics teams to get unfettered access to data sets is increasingly in question for both legal and ethical reasons.

“It’s pretty clear that legal is throwing their weight around on data governance,” he said. “We’ve gone from a bottom-up approach of everybody grabbing data and doing something with it to more of a top-down approach.”

Jill Dyché, an independent consultant who’s based in Los Angeles, said she expects explainable AI to become “less of an option and more of a mandate” in organizations over the next 12 months.

Code of ethics not enough on data analytics

Staying on the right side of the data ethics line takes more than publishing a corporate code of ethics for employees to follow, Farmer said. He cited Enron’s 64-page ethics code, which didn’t stop the energy company from engaging in the infamous accounting fraud scheme that led to bankruptcy and the sale of its assets. Similarly, he sees such codes having little effect in preventing ethical missteps on analytics.

“Just having a code of ethics does absolutely nothing,” Farmer said. “It might even get in the way of good ethical practices, because people just point to it [and say], ‘We’ve got that covered.'”

Instead, he recommended that IT and analytics managers take a rules-based approach to data ethics that can be applied to all three phases of analytics projects: the upfront research process, design and development of analytics applications, and deployment and use of the applications.


Exten Technologies releases 3.0 version of NVMe platform

Exten Technologies has released the third generation of its HyperDynamic storage software, designed to bring more resiliency, performance and manageability to data center customers.

New features in generation three include node-level resiliency with synchronous replicas, shared volumes with replicas for supporting parallel file systems, dual parity resiliency, and integrated drive management and hot swap.

Exten software is deployed on the storage target and does not require proprietary software on compute clients.

According to Exten, HyperDynamic 3.0 improves TCP performance with Solarflare TCP acceleration, approaching the performance of remote direct memory access.

It also has new features designed to simplify NVMe-oF storage management and deployment, Exten claims. These features include declustered RAID, which enables the configuration of resilient volumes that use Linux multi-path IO software to provide redundancy in both networking and storage. Exten’s interface provides node- and cluster-level telemetry. Users can also set quality-of-service limits in order to manage performance during drive or node rebuilds.

Exten Technologies is part of a batch of newer vendors making their way in the NVMe market.

Apeiron Data Systems offers a handful of NVMe storage products, including enterprise NVMe. It is NVMe over Ethernet, as opposed to over fabric, and was designed with the goal of delivering the performance and cost of server-based scale-out but with the manageability of enterprise storage.

Vendor Vexata also touts its RAID-protected NVMe and claims ultralow latency at scale. According to its website, the company was founded to provide better performance and efficiency at a lower cost than other flash storage products.

Exten Technologies’ HyperDynamic 3.0 is available now.


Analyst, author talks enterprise AI expectations

For years, promoters have made AI technologies sound like the all-encompassing technology answer for enterprises, a ready-to-use piece of software that could solve all of an organization’s data and workflow problems with minimal effort.

While AI can certainly automate parts of a workflow and save employees and organizations time and money, it rarely works without setup and integration effort, something organizations are still struggling to understand.

In this Q&A ahead of the publication of his new book, Alan Pelz-Sharpe, founder of market advisory and research firm Deep Analysis, describes enterprises’ AI expectations and helps distinguish between realistic AI goals and expectations and the AI hype.

An AI project is different from a traditional IT project, Pelz-Sharpe said, and organizations should treat it as such.

“One of the reasons so many projects fail is people do not know that,” he said.

In this Q&A, Pelz-Sharpe, who is also a part-time voice and film actor, talks about AI expectations, deploying AI, and the realities of the technology.

Have you found that business users and enterprises have an accurate description of AI?

Alan Pelz-Sharpe: No. It’s a straightforward no. I’ll give you real world examples.

Alan Pelz-Sharpe

A very large, very well-known financial services company brought in the biggest vendor. They spent six and a half million dollars. Four months later, they fired the vendor, because they had nothing to show for it. They talked to me and it was heartbreaking, because I wanted to say to them, ‘Why? Why did you ever engage with them?’

It wasn’t because they were bad people engaged in this, but because they had very specific sets of circumstances and really, really specific requirements. I said to them, ‘You’re never going to buy this off the shelf, it doesn’t exist. You’re going to have to develop this yourself.’ Now, that’s what they’re doing, and they’re spending a lot less money and having a lot more success.

AI is being so overhyped; your heart goes out to buyers, because they don’t know who to believe. In some cases, they could save a fortune, go to some small startup [that] could, frankly, give them the product and get the job done. They don’t know that.

Are these cases of enterprises having the wrong AI expectations and not knowing what they want, or cases of a vendor misleading a buyer?

It’s absolutely both. Vendors have overhyped and oversold. Then the perception is I buy this tool, I plug it in, and it does its magic. It just isn’t like that ever. It’s never like that. That’s nonsense. So, the vendors are definitely guilty of that, but when haven’t they been?

From the buyer’s perspective, I think there are two things really. One, they don’t know. They don’t understand, they haven’t been told that they have to run this project very differently from an IT project. It’s not a traditional IT project in which you scope it out, decide what you’re going to use, test it and then it goes live. AI isn’t like that. It’s a lifetime investment. You’re never going to completely leave it alone, you’re always going to be involved with it.

Technically, there’s a perception that bigger and more powerful is better. Well, is it? If you’re trying to automatically classify statements versus purchases versus invoices, the usual back office paper stuff, why do you need the most powerful product? Why not, instead, just buy something simple, something that’s designed to do exactly that?

Often, buyers get themselves into deep waters. They buy a big Mack Truck to do what a small tricycle could do. That’s actually really common. Most real-world business use cases are surprisingly narrow and targeted.

Editor’s note: This interview has been edited for clarity and conciseness.
