Wanted – 1080p Graphics card

Helping a friend build a gaming PC. He’s not fussed about running on ultra, but wants something decent. So, I’m looking for a good mid-range graphics card – something like a 960/1050/770, that kind of ballpark.

Lemme know if you’ve got one going. Would be much appreciated.

Location: Barton-upon-Humber

______________________________________________________
This message is automatically inserted in all classifieds forum threads.
By replying to this thread you agree to abide by the trading rules detailed here.
Please be advised, all buyers and sellers should satisfy themselves that the other party is genuine by providing the following via private conversation to each other after negotiations are complete and prior to dispatching goods and making payment:

  • Landline telephone number. Make a call to check that the area code and number are correct, too
  • Name and address including postcode
  • Valid e-mail address

DO NOT proceed with a deal until you are completely satisfied that all details are correct. It’s in your best interest to verify these details yourself.

Google got faster pulling bad Android apps from Play Store

Google wants to reinforce the idea that the Play Store is the safest place for Android users to get apps, backing it up with a new set of stats on how its efforts to block bad Android apps have improved.

Andrew Ahn, product manager for Google Play, said the company has “halved the probability” of users installing bad Android apps and also made the Play Store “a more challenging place for those who seek to abuse the app ecosystem for their own gain.”

“In 2017, we took down more than 700,000 apps that violated the Google Play policies, 70% more than the apps taken down in 2016. Not only did we remove more bad apps, we were able to identify and action against them earlier,” Ahn wrote in a blog post. “In fact, 99% of apps with abusive contents were identified and rejected before anyone could install them. This was possible through significant improvements in our ability to detect abuse — such as impersonation, inappropriate content, or malware — through new machine learning models and techniques.”

Liviu Arsene, senior e-threat analyst at Romania-based antimalware firm Bitdefender, said it is “commendable that Google is going to great lengths to optimize the malicious app bouncing process,” considering the more than 3.5 million apps in the Play Store.

“However, malware developers don’t necessarily have to submit ‘bad Android apps’ when they can simply create something that’s barely functional with the sole purpose of getting past the vetting process. Some apps may offer deceptive descriptions and functionalities just to get installed on devices, from which they can request all sorts of permissions for tracking users or for bombarding them with ads,” Arsene told SearchSecurity. “There have been instances where apps walk a very fine line between complying with Google’s advertising policy and spamming users with nag screens, browser redirects, and unsolicited pop-ups just for the sole purpose of generating revenue for the developer. While, granted, they don’t install malware or pilfer personal data, some of them can still be borderline legitimate.”

Will the Play Store catch all the bad apps?

A Google spokesperson told SearchSecurity that there will always be a chance for bad Android apps to slip through because “they evade detection in a sneaky way, or seem to be very borderline cases,” and in those cases Google relies on analyzing how apps are being distributed, monitoring user community flagging and reviewing data from post-install Google Play Protect scans in order to take action on a potentially harmful app.

“Apps submitted to Google Play are automatically scanned for potentially malicious code as well as spammy developer accounts before they are published on the Google Play Store. To complement that effort, we recently introduced a proactive app review process to catch policy offenders earlier in the process, while still ensuring that developers can get their apps to market as soon as possible — in a matter of hours, not days or weeks,” the spokesperson said. “During that process, apps are specifically reviewed for compliance against our Google Play Developer Content Policy and Developer Distribution Agreement, which prevents things like apps that are impersonating legitimate companies or deceptive behavior.”

Arsene applauded the work done by Google to block bad Android apps “because Android is one of the most popular operating systems.”

“Some built in app scanning features even let users know if they’ve downloaded something malicious from a third-party marketplace, which acts as an additional line of defense,” Arsene said. “However, it’s recommended that everyone owning an Android device, regardless if they install apps from official marketplaces or not, install a mobile security solution as it will have the ability to protect them from much more than just malicious apps, but also against web-based attacks and other online threats.”

New VEP Charter promises vulnerability transparency

The White House wants more transparency in how federal agencies determine whether or not to disclose software vulnerabilities, but there are early questions regarding how it might work.

The Vulnerabilities Equities Process (VEP) was designed to organize how federal agencies would review vulnerabilities and decide if a flaw should be kept secret for use in intelligence or law enforcement operations or disclosed to vendors. The new VEP Charter announced by Rob Joyce, special assistant to the President and cybersecurity coordinator for the National Security Council, aims to ensure the government conducts “the VEP in a manner that can withstand a high degree of scrutiny and oversight from the citizens it serves.”

“I believe that conducting this risk/benefit analysis is a vital responsibility of the Federal Government,” Joyce wrote in a public statement. “Although I don’t believe withholding all vulnerabilities for operations is a responsible position, we see many nations choose it. I also know of no nation that has chosen to disclose every vulnerability it discovers.”

Joyce laid out the “key tenets” of the new VEP Charter, including increased transparency and an annual report, improved standardization of the process regarding the interests of various stakeholders and increased accountability.

“We make it clear that departments and agencies with protective missions participate in VEP discussions, as well as other departments and agencies that have broader equities, like the Department of State and the Department of Commerce. We also clarify what categories of vulnerabilities are submitted to the process and ensure that any decision not to disclose a vulnerability will be reevaluated regularly,” Joyce wrote. “There are still important reasons to keep many of the specific vulnerabilities evaluated in the process classified, but we will release an annual report that provides metrics about the process to further inform the public about the VEP and its outcomes.”

Questions about the VEP Charter

The VEP has previously been criticized by experts for being optional rather than codified into law, but the VEP Charter does not include language making the process a requirement, nor does it acknowledge the PATCH Act, a bill proposed in Congress that would enforce a framework for using the VEP.

Heather West, senior policy manager and Americas principal at Mozilla, noted in a blog post that “many of the goals of the PATCH Act [are] covered in this process release, [but] our overarching goal in codifying the VEP in law to ensure compliance and permanence cannot be met by unilateral executive action.”

Early readings of the VEP Charter have revealed what some consider a conflict of interest, in that the NSA is designated as the VEP Executive Secretariat with the responsibility to “facilitate information flow, discussions, determinations, documentation, and recordkeeping for the process.”

However, the VEP Charter also states that any flaw found in NSA-certified equipment or systems should be “reported to NSA as soon as practical. NSA will assume responsibility for this vulnerability and submit it formally through the VEP Executive Secretariat.”

Additionally, some have taken issue with the following clause in the VEP Charter: “The [U.S. government’s] decision to disclose or restrict vulnerability information could be subject to restrictions by foreign or private sector partners of the USG, such as Non-Disclosure Agreements, Memoranda of Understanding, or other agreements that constrain USG options for disclosing vulnerability information.”

Edward Snowden said on Twitter that this could be considered an “enormous loophole permitting digital arms brokers to exempt critical flaws in U.S. infrastructure from disclosure” by using an NDA.

SAP boosts data integration with SAP Data Hub and Vora

SAP wants to cover as many data integration bases as possible with the recent release of the new SAP Data Hub and updates to SAP Vora.

SAP Data Hub and Vora both attempt to solve similar data integration challenges and provide businesses a means to extract value out of the reams of data they are collecting, but the specific goals for each product are quite different, according to Ken Tsai, SAP’s global vice president and head of cloud platform and data management product marketing.

The recently released Data Hub has a much broader mission, because it is intended to help organizations manage complex data landscapes by building pipelines between a variety of data sources, Tsai said. SAP Vora, which has been on the scene for two years, provides a way to get at data stored in Hadoop data lakes via an Apache Spark framework. SAP Data Hub uses Vora under the covers, but the products are not the same.

Similar products, different aims

“SAP Data Hub has a much bigger purpose in terms of building up data flow in order to ensure a more efficient data operation, rather than just doing computing, which SAP Vora aims to do,” Tsai said. “Both are very complementary, and we’re seeing good results so far from SAP Vora. And the idea for SAP Data Hub wouldn’t have come about if we hadn’t been investing in building computing solutions in the Hadoop big data space and seeing what customers needed beyond the vast computing engine directly into Hadoop.”

SAP Data Hub is important now because locating data centrally is no longer feasible, Tsai said; it centralizes data management while keeping the data in its source repositories.

“We’re targeting not only the developers, but also the enterprise architects, data scientists [and] business analysts,” he said. “The IT department today is evolving into multiple zones — the application zone, the data warehouse zone, the data lake zone — and each one of them needs to participate in a kind of data flow, as data has to flow from one zone to the other.”

Pharmaceutical distributor McKesson Corp. is one SAP customer that has deployed SAP Data Hub to consolidate data across multiple systems and derive a single source of truth from that data.

“Our work is about helping our customers improve patient care and driving efficiencies across the healthcare value chain,” Adam Fecadu, chief information architect at McKesson, based in St. Paul, Minn., said in a press release. “It starts with relentless focus on helping our customers and partners solve their toughest challenges. With numerous data sources, types and IT landscapes, we need a unified data solution across departments and business units to produce actionable insights and continuous innovation. SAP Data Hub is aligned with this vision.”

SAP Vora dives into the big data lake

Vora is designed to allow businesses to process big data from Hadoop data lakes and derive business value from the data, Tsai said. SAP Vora 2.0 has been rearchitected using Kubernetes containers to improve the scalability and reduce the complexity in deployment.

CenterPoint Energy, an electricity and natural gas utility based in Houston, is using SAP Vora along with SAP HANA to manage and analyze data that it gets from smart meters. Its application uses HANA to track and analyze the health of its infrastructure and grid in real time and moves older data into Hadoop. Vora is used to process and analyze the historical data in Hadoop to determine usage patterns and trends, and this data can be combined with the current HANA data, allowing insights that result in more proactive energy delivery and pricing, Tsai said.

Processing data where it lives

Data Hub is a good direction for SAP, because it allows users to work on data where it lives without having to move it, according to Stewart Bond, research director of data integration software at IDC.

“It’s kind of a departure from where SAP’s been in the past, where you have to pull the data into the SAP environment to be able to work with it. But it’s also similar to what we’re seeing in the rest of the market,” Bond said. “Data is getting too big to move around anymore, and people don’t want to move the data around. And the data that is getting moved is a subset. Organizations that use the big data repositories like Hadoop preprocess data before it ends up going into an enterprise data warehouse. And in the preprocessing, things get filtered out, things get cleansed, things get put into smaller shapes — data sets that are smaller than what we have in big data.”

SAP Vora is similar, but tries to solve a different problem, Bond explained.

“Vora has been about plugging into the Hadoop big data ecosystem, whereas Data Hub is more of a broader data play, with more of a variety of data sources that they want to connect to and capabilities for working with data in motion or data pipelining,” Bond said. “They’re leveraging the investment that they have in Vora by making that technology and those capabilities available in Data Hub for those times when Data Hub solutions need to tap into a Hadoop ecosystem to do something with the data. But I think they are slightly different problem spaces, and they might be going after a slightly different audience.”

These tools are important because they are creating more opportunities for businesses to solve problems with technology, said Ezra Gottheil, senior analyst with Technology Business Research, a research and analysis firm based in Hampton, N.H. This stems from a confluence of core technologies becoming more manageable, the convenience of the cloud as a platform and next-generation technologies like big data and the internet of things.

“There are more different-shaped Lego blocks that are available for those who are creating applications to build with, so everybody is extremely eager to get those tools in the hands of not just developers, but business people,” Gottheil said. “SAP makes applications, and they make some pretty specialized ones as well, but they can only begin to address all the applications that are needed. So, they’ll have to come from customers and third parties, too, but putting out the tools and promoting them is the way they get at that market.”

Need to keep up with the competition

SAP faces a steep competition curve in the market, Bond said, and needs to have a fully developed product to keep up. Oracle, for example, has the Data Integration Platform Cloud, which brings data integration, data quality and data governance to a cloud platform.

“Data Hub is doing something very similar, so the challenge that they’re up against is that they’re talking about going to market with data governance, data pipeline and data integration, but there are still parts of that three-pronged story that need to be developed,” Bond said. “The data governance piece was demonstrated in the launch, but what was demonstrated was more technical-level data governance and really wasn’t that business metadata. So, it’s going to be critical that when they go to market that they have a truly competitive product, because their competitors are going to be there as quickly as they are.”

Huawei wants to grow public cloud market share

Huawei wants to gain public cloud market share and become a dominant public cloud provider, according to Brad Shimmin, an analyst at Current Analysis in Sterling, Va. At its annual Huawei Connect event, the Chinese vendor laid out its plans to grow public cloud market share to compete directly with Google, Microsoft, Amazon and IBM. However, as Shimmin noted, Huawei’s plan is not to dominate in the same way as its competitors — instead the vendor aims to create an open platform that interoperates with other clouds.

Huawei will initially focus its attention on growing public cloud market share among telcos and in its home market, with clients such as China Telecom and China Central TV. Shimmin doubts that Huawei can match other hyperscale cloud providers in scope and scale. Although the vendor recently launched Huawei Enterprise Intelligence AI, Shimmin still sees Huawei’s machine learning ranking far behind the AI capabilities of its competitors. “In my opinion, where Huawei is most likely to succeed with its cloud play is in helping partners and customers apply Huawei’s significant hardware expertise to trenchant problems like cross-cloud security, AI-scale hardware and IoT edge computing,” Shimmin said.

Read more of Shimmin’s thoughts on Huawei.

Achieving container workload performance

Dan Conde, an analyst at Enterprise Strategy Group in Milford, Mass., said instead of fretting over competition among containers, virtual machines (VMs) and serverless computing, professionals need to focus on the architecture of new applications. Many emerging applications are geared for microservices and depend on new infrastructure to scale and interoperate.

In Conde’s view, what really matters with containers and similar technologies is the performance of the workload, not how the workload is actually run. Having choices is vital — even if it means mixing and matching containers and VMs. The traditional system of underlying platforms and operating systems has been displaced by a much more complicated set of services such as Cassandra, NATS or Apache Spark; cloud platforms; and lower-level offerings such as Apache Mesos or Red Hat OpenShift. “The old notion of a highly curated platform as a service (PaaS) is effectively dead because developers demand choices and the world is changing too rapidly to choose a narrow path. …The analogy would be the five-year plans of the old planned economies. The current world is too dynamic to go down such a narrow path,” Conde said.

Dig deeper into Conde’s thoughts on container workload performance.

Cisco emphasizes LISP for enterprise campuses

Ivan Pepelnjak, writing in ipSpace, responded to questions he received from readers asking why Cisco was pushing LISP instead of EVPN for VXLAN-based enterprise systems. While Pepelnjak admitted that he wasn’t certain of the exact reasons, he suggested that Cisco and a few other large vendors still see a need for large Layer 2 domains. “It looks like the networking industry is in another lemming rush. Everyone is rolling out VXLAN to solve large VLAN challenges, or even replacing MPLS with VXLAN for L3VPN deployments,” Pepelnjak said.

He added that every large vendor is deploying EVPN as a control plane for VXLAN, including Cumulus Networks, Juniper Networks, Cisco and Arista Networks. According to Pepelnjak, LISP is a solution in search of a problem that Cisco has chosen to deploy as an additional control plane, without any technical factors driving the decision.

Read more of Pepelnjak’s thoughts on LISP.

Windows Server 2016 changes prompt a new look at management

Microsoft wants more IT pros to get on board with its Azure platform and knows the fastest way to do that is through automation. The company took a subtle approach to engender a cloud mindset through various Windows Server 2016 changes — but it might not be as flexible in the near future.

Microsoft encourages IT admins to develop policies and Desired State Configurations that manage servers as a collective. But it hasn’t forgotten the legions of admins who spend their days — and nights — in the depths of the Microsoft Management Console. These IT workers are hands-on with each individual server. They perform manual configuration changes constantly and largely ignore anything with the suffix -aaS.
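For a concrete sense of that collective model, here is a minimal Desired State Configuration sketch; the node names and the IIS role are hypothetical, not taken from Microsoft’s documentation:

Configuration WebFleet {
    # Built-in DSC resource: ensure IIS is present on every listed node.
    Node @('WEB01', 'WEB02') {
        WindowsFeature IIS {
            Name   = 'Web-Server'
            Ensure = 'Present'
        }
    }
}
WebFleet   # Compiles one .mof per node, ready for Start-DscConfiguration.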

A shift in the server management mindset

With a few Windows Server 2016 changes to the server management model, Microsoft nudged administrators to look up from their individual servers and consider the infrastructure as a whole, not unlike a cloud provider.

In IT, there is a concerted effort to stop the micromanagement of individual servers. This trend is popularized by the pets vs. cattle analogy, which contrasts how we care for our cats and dogs with the way commercial farmers manage a herd.

The new approach is to build identical servers and handle them as a collection. This approach is business-critical for web-scale companies that manage thousands of servers. They would face skyrocketing operations costs if they stuck with the old 1:100 admin-to-server ratio. If one server malfunctions, remove and replace it with another. Problem solved.

But for certain legacy shops, this approach to managing servers gets no traction. A midsize company might have a dozen servers, each with unique applications and possibly distinct OSes. When a server fails, a business crisis follows. The recovery process typically involves various backup media, coffee and swearing. Swap out the server? That’s not an option.

Why bother building out a cattle infrastructure if server and application deployments are few and far between? And what if the staff skill sets align more closely with the features in Windows Server 2008 R2 than Windows Server 2016?

A threshold is implied here — but not defined. The real question is: How big does an IT infrastructure need to be before a move from pets to cattle is a reasonable course of action? Do you go by the number of servers, applications or administrators — or data centers? Is it a combination of these factors?

Server Manager’s capabilities tempt admins

Between these two extremes, Microsoft positioned its Windows Server 2016 changes. The company must tread carefully to keep two sets of customers happy: the DevOps devotees with their cattle and the traditional server admins with their pets. Both groups represent a large part of Microsoft’s future, despite what you might hear from the CALMS crowd.

Windows Server 2016 is a bridge to facilitate the transition from the traditional way to manage Windows servers to an automated model. Consider what you see when you first log in to a Windows Server 2016 system: the Server Manager dashboard. Here, the administrator decides to either use Server Manager and move one step closer to managing cattle — or stick with the reliable Computer Management tool and keep shopping at the pet store.

Server Manager quickly and easily creates groups of servers based on a role or an application. With this tool, admins manage servers more efficiently and do not need to change connections. It’s a shame this tool’s functionality isn’t more obvious; many server admins don’t realize Server Manager makes it almost effortless to control multiple remote servers.

Windows Server 2016 is a single platform with multiple management points intended to seduce administrators away from the Computer Management console in favor of the sleeker Server Manager.

One step closer to cloud via PowerShell automation

It’s impossible to talk about Windows Server 2016 changes without a look at PowerShell. Some describe it as a gateway drug that leads to a hardcore automation addiction and, eventually, the cloud.

Where Microsoft once gently encouraged admins to use PowerShell, it now strong-arms them toward management via the command-line interface (CLI). Admins who don’t pay close attention and click through the Windows Server 2016 installation options will find themselves staring at the blinking cursor of a PowerShell console instead of a desktop.

Microsoft’s message is clear: PowerShell is the preferred method to manage Windows Server 2016. The GUI is a consolation prize for admins who continue to scoff at scripting.
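As a taste of that preference, a few lines of PowerShell can manage a whole group of machines at once; a minimal sketch, with hypothetical server names, assuming Windows Remote Management is enabled on the targets:

# Query a service across several remote servers in a single call.
$servers = 'WEB01', 'WEB02', 'APP01'
Invoke-Command -ComputerName $servers -ScriptBlock { Get-Service -Name 'W3SVC' } |
    Select-Object PSComputerName, Status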

[Embedded video: an overview of the more cloud-friendly features in Windows Server 2016]

Here we are, with two sets of enthusiasts who aspire to apply their brand of management to the Windows world. There’s no reaching across the aisle here. Ideologies are entrenched, and very few admins show any willingness to switch sides. The PowerShell crowd wants an OS designed around Windows Remote Management that doesn’t need interactive control. The old-school admin crowd wants Windows Server 2003 R2 but with a newer look.

Microsoft is smart to cater to both crowds with its Windows Server 2016 changes. DevOps and related methodologies are not evolutions of traditional server management — they are an attempt to manage cloud-native applications at scale in a smart and efficient manner. Both techniques can coexist, and an OS vendor would be foolish to force an all-in-one approach.

Given the major shift in Microsoft’s strategy since Satya Nadella’s arrival and the breakneck pace at which Azure chases enterprise cloud customers, I expect future Windows Server releases to further blur the line between on-premises and cloud and to make that pets vs. cattle decision for their users. We’ll see PowerShell become the default method to manage servers, and administrators who currently jump through hoops to load the server GUI will finally cave to the CLI.

Configure Azure Active Directory SSO service and avoid delays

No one wants to enter the same password multiple times to use applications on a single machine. Many administrators seek single sign-on, and Microsoft’s Active Directory Federation Services (ADFS) is the traditional way to get it. But ADFS doesn’t prevent login prompts in all applications; Outlook or Skype for Business users have to look elsewhere.

Businesses have a new option for SSO. Azure Active Directory (AD) Seamless SSO registers a special computer account in AD to act as a proxy so that Integrated Windows Authentication (IWA) — which authorizes users — works against specific URLs in Azure AD to sign a user in as if the URLs were an intranet site.

Administrators can configure Azure AD Connect, which integrates an on-premises directory with Azure AD, to perform Seamless SSO; set up an Office 365 tenant to support modern authentication; and, finally, examine the client experience.

Combine Azure Active Directory SSO with modern authentication, which enables features such as multifactor authentication and certificate-based authentication, to get a full SSO without ADFS. Modern authentication uses a web browser-based sign-in within the Office applications, which enables IWA to work.

Configure Azure AD Connect

To set up the feature, start with Azure AD Connect and password synchronization in place. Launch the Azure AD Connect configuration wizard, select the User Sign-In option and choose Enable single sign on, as shown in Figure 1.

Figure 1. In the Azure AD Connect configuration wizard, click Enable single sign on to use Seamless SSO.

On the Enable single sign on page shown in Figure 2, enter the domain administrator credentials to create the special computer account for Azure AD Connect in the local AD.

Figure 2. On the Enable single sign on page, enter the domain administrator credentials to create a special computer account for Azure AD Connect.

Complete the setup wizard. Once Azure AD Connect updates the configuration, verify that the new computer account has been created. Open Active Directory Users and Computers, navigate to the Computers container and look for a new computer for Azure Active Directory SSO, named AZUREADSSOACC:

Figure 3. In Active Directory Users and Computers, verify that the new computer account named AZUREADSSOACC was created.
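The same check can be scripted; a quick sketch, assuming the ActiveDirectory PowerShell module from RSAT is installed:

# Confirm the Seamless SSO computer account exists in the local AD.
Import-Module ActiveDirectory
Get-ADComputer -Identity AZUREADSSOACC | Select-Object Name, DistinguishedName, Enabled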

Set up the Office 365 tenant

To use the Seamless SSO service with Outlook and Skype for Business applications, enable the Office 365 tenant for modern authentication.

Connect to Exchange Online PowerShell with administrative credentials, as follows:

$UserCredential = Get-Credential

$ExoSession = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $UserCredential -Authentication Basic -AllowRedirection

Import-PSSession $ExoSession

Next, use the Set-OrganizationConfig cmdlet to enable the OAuth2 Client Profile:

Set-OrganizationConfig -OAuth2ClientProfileEnabled $true
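To confirm the change took effect before moving on, the setting can be read back in the same session; for example:

Get-OrganizationConfig | Select-Object OAuth2ClientProfileEnabled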

For Skype for Business Online, download and install the Skype for Business Online Windows PowerShell module. Connect to Skype for Business Online from a PowerShell prompt:

$UserCredential = Get-Credential

$SfBSession = New-CsOnlineSession -Credential $UserCredential -Verbose

Import-PSSession $SfBSession

Finally, invoke the Set-CsOAuthConfiguration cmdlet to enable modern authentication:

Set-CsOAuthConfiguration -ClientAdalAuthOverride Allowed
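As with Exchange Online, the value can be read back to verify it; for example:

Get-CsOAuthConfiguration | Select-Object ClientAdalAuthOverride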

These steps are also common to enabling SSO with Windows 10 Azure AD-joined devices and ADFS.

If your organization uses Office 2013 with modern authentication enabled — or Office 2016, which uses modern authentication if available — then the system will prompt clients for a password until you have completed and tested the remainder of the steps.

Configure Intranet Zone settings

Azure Active Directory SSO requires an administrator to add two URLs to Internet Explorer’s Local Intranet Zone on client PCs. This indicates to the client that the specific URLs are safe to use with IWA.

The two URLs to add are:

  • https://autologon.microsoftazuread-sso.com
  • https://aadg.windows.net.nsatc.net

When you add these URLs to the Intranet Zone in Internet Explorer, Office clients (including Outlook) and Chrome inherit them.

To test the functionality, open the Internet Explorer options page, and on the Security tab, choose Local Intranet, then Sites and finally add the URLs, as shown in Figure 4.

Figure 4. Test that the two mandatory URLs for Azure AD’s SSO service function on the Internet Explorer options page.

Admins typically deploy these URLs via Group Policy. Open the Group Policy management tools for your domain, and either create or amend an existing policy for users who need SSO. Under the User Configuration section, as seen in Figure 5, navigate to Policies > Administrative Templates > Windows Components > Internet Explorer > Internet Control Panel > Security Page. Select the Site to Zone Assignment List.

Figure 5. In the Group Policy Management Editor, create or adjust a policy for users who need SSO.

Add both site URLs to the Site to Zone Assignment List, with the URL as the Value name and the Value as 1, which indicates that the URL should be added to the Intranet Zone, as seen in Figure 6.

Figure 6. In the Site to Zone Assignment List, add the value name and value for each URL to join the Intranet Zone.
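Group Policy is the right tool for broad deployment, but for a quick test on a single machine the same zone assignment can be written directly to the per-user registry. A minimal sketch, in which example.com and sso are hypothetical placeholders rather than the real SSO hosts:

# Map a host into the Local Intranet zone (zone 1) for the current user.
$zoneMap = 'HKCU:\Software\Microsoft\Windows\CurrentVersion\Internet Settings\ZoneMap\Domains'
New-Item -Path "$zoneMap\example.com\sso" -Force | Out-Null
Set-ItemProperty -Path "$zoneMap\example.com\sso" -Name 'https' -Type DWord -Value 1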

What are the caveats?

Once Seamless SSO is configured and you’ve deployed the supporting policies, the sign-in experience removes almost every point where a user would need to enter a username or password.

But in some scenarios the user needs to enter a username.

A username — typically an email address — is required to access some web-based services, including the Office 365 portal, OneDrive and SharePoint. However, after entering the username, the system won’t prompt the user for a password.

The next-generation OneDrive client, which can sign into both consumer and business OneDrive services, is similar. On first entry, the user must enter a username to sign in but will not be prompted for a password.

Next Steps

Azure AD has a lot to offer Office 365 orgs

Keep abreast of Microsoft’s Azure portal changes

Pros and cons of the Azure AD PowerShell module
