All posts by admin

NRCC email breach confirmed eight months later

The National Republican Congressional Committee experienced an email breach in April, but little is known about the incident.

A new report — confirmed by committee officials — said four senior aides had their NRCC email accounts hacked. The incident was detected in April, but the NRCC email accounts were compromised for several months, according to Politico.

Both the FBI and CrowdStrike were asked to investigate the NRCC email breach. CrowdStrike also notably investigated the Democratic National Committee email breach from 2016.

“In April 2018, CrowdStrike was asked by the NRCC to perform an investigation related to unauthorized access to NRCC’s emails,” a CrowdStrike spokesperson said. “Prior to the incident, CrowdStrike was helping to protect NRCC’s internal corporate network, which was not compromised in this incident.”

However, despite the investigation, many House Republicans — including senior officials — were not informed of the NRCC email breach until after Politico began investigating the incident.

The NRCC did not respond to requests for comment at the time of this post, and a spokesperson for the committee refused to offer details about the breach to Politico, claiming it might interfere with the investigation.

The NRCC email hack compromised thousands of messages, but committee officials claimed there have been no threats to leak the information. Even so, the NRCC hired two prominent law firms to “oversee the response to the hack.”

The NRCC email breach is the latest in a string of cyberattacks and data exposures involving political organizations. The Democratic National Committee and members of Hillary Clinton’s campaign had email messages stolen during the 2016 election cycle. Last year, the Republican National Committee misconfigured cloud storage and exposed data on nearly 200 million voters.


For Sale – MSI G45 Motherboard – i5 4670K – 16GB Corsair Vengeance RAM

Discussion in ‘Desktop Computer Classifieds‘ started by Atmos, Nov 19, 2018.

  1. Atmos

    Only upgrading due to the high demands of VR.

    I’m replacing my kit bought from new in 2016 – all boxes included.

    MSI G45 Gaming Motherboard.
    Intel i5-4670K Processor.
    16GB Corsair Vengeance DDR3

    Support For Z87-G45 GAMING | Motherboard – The world leader in motherboard design | MSI Global

    Intel® Core™ i5-4670K Processor (6M Cache, up to 3.80 GHz) Product Specifications

    VENGEANCE® Pro Series — 16GB (2 x 8GB) DDR3 DRAM 2400MHz C11 Memory Kit

    Price and currency: £350
    Delivery: Goods must be exchanged in person
    Payment method: Goods must be exchanged in person
    Location: Chester
    Advertised elsewhere?: Not advertised elsewhere
    Prefer goods collected?: I prefer the goods to be collected

    ______________________________________________________
    This message is automatically inserted in all classifieds forum threads.
    By replying to this thread you agree to abide by the trading rules detailed here.
    Please be advised, all buyers and sellers should satisfy themselves that the other party is genuine by providing the following via private conversation to each other after negotiations are complete and prior to dispatching goods and making payment:

    • Landline telephone number. Make a call to check out the area code and number are correct, too
    • Name and address including postcode
    • Valid e-mail address

    DO NOT proceed with a deal until you are completely satisfied with all details being correct. It’s in your best interest to check out these details yourself.

  2. Hi, would you consider selling the memory?

  3. Atmos

    Hi, I would consider £50 for the memory including postage if you live in the UK.


  4. Do you know if they run OK in an MSI Z97 PC Mate?

  5. Will take; PM you later with details.

  6. Atmos

    Thank you.

  7. Payment made.
    Thanks

  8. Atmos

    RAM SOLD to blackadder.

    Payment received, thanks.

  9. cabbie19

    Hi, would you sell the CPU separately?
    If yes, how much posted to IP27?

    Regards
    Jamie.

  10. Atmos

    Hi

    @cabbie

    I would post the CPU to you for £150.

  11. cabbie19

    Wow, sorry, but that’s way too dear for me. CEX do it for £85 + £5 postage including a two-year warranty. Are you willing to reconsider your price?

    Jamie.

  12. Atmos

    I’ll come down to £100 for the 4670k inc postage.

  13. cabbie19

    OK, no worries, will go to CEX. GLWS.

    Jamie.


Oracle-DataFox acquisition could expand users’ AI options

NEW YORK — In a bid to enhance its AI-powered capabilities, expand available data sources for its business intelligence software and widen users’ AI options, multinational software giant Oracle acquired AI sales enablement services company DataFox.

The Oracle-DataFox acquisition, completed Oct. 31, will likely give Oracle users access to DataFox’s cloud-based AI data engine — and to the applications the vendor has built with it — once the software is incorporated into Oracle systems. Oracle also will continue to sell the DataFox AI engine as a stand-alone product.

The acquisition made sense for both companies, according to Bastiaan Janmaat, CEO and co-founder of DataFox, based in San Francisco.

“What we were building was just very aligned with the Oracle roadmap,” Janmaat said in a joint interview with Melissa Boxer, vice president of Adaptive Intelligent Applications at Oracle, during the AI Summit.

Oracle, for its part, has been looking to expand its AI efforts.

Fire up that data engine

DataFox has a data engine that can “suck in unstructured content … and then pull out of that structured data about companies,” Janmaat explained. “Well, right now, it’s data about companies. But, long term, it could be about anything.”

That technology has enabled DataFox to create customer relationship management software that can automatically find reference and signal data about companies and distill it down to relevant, focused bits of information — “smart talking points,” as Janmaat put it — that can be used to provide sales reps with quick, useful information.

DataFox also sells tools for sales organizations that can automatically “surface companies that are a good fit based on past clients,” Janmaat said. The software also automatically prioritizes based on a customizable weighting system.

The AI tools aren’t meant to take jobs away from sales reps, Janmaat said, but to augment their abilities by automatically providing fast access to information about potential clients.

Humans plus AI


Behind the scenes, the software — still publicly available after the Oracle-DataFox acquisition — uses AI-powered tools like machine learning algorithms and natural language processing (NLP), as well as continued human input from a dedicated team from DataFox that helps customize and train the AI-based models for high accuracy.

The DataFox team tracks accuracy, which might start out in the 70-percent range, and keeps tweaking the models “until it gets to high 90s,” Janmaat said.

The DataFox engineers have strong domain knowledge, as well as machine learning expertise, Janmaat said.

They need to have that expertise, as the data engine draws in content from numerous places, including news sources, reference documents and company websites, which might not always have correct information, he said.

Companies rise, fall and change goals, so information must be constantly updated. Besides the human element, the Oracle-DataFox software also uses a blacklist to automatically block sources known to peddle fake news, such as the satirical news site The Onion.

The Oracle perspective

“We have a lot of great plans in place” for the data engine, Boxer said.

While saying she couldn’t talk about the details of the Oracle-DataFox acquisition or a potential integration timeline due to a financial quiet period, Boxer noted that Oracle will start to use data from DataFox immediately, and it will integrate the AI components with Oracle’s own applications later.

“We wanted the engine, because that will help us grow data and extend data to other domains,” she said.

“This is a very differentiated position that they are taking, which is to say we are going to take care of the third-party data for you,” Janmaat said, referring to the Oracle-DataFox acquisition.

The vendors said they don’t expect the acquisition to disrupt the ability of DataFox users to integrate the software into other platforms through DataFox’s API.

The AI Summit is Dec. 5 to Dec. 6 at the Javits Center.


For Sale – LG 27UD88-W 27″ 4K UHD IPS LED Monitor

Discussion in ‘Desktop Computer Classifieds‘ started by ascender, Nov 8, 2018.

  1. ascender

    I’m replacing my laptop at the moment so have one of these brilliant monitors to sell. It’s in excellent to as-new condition, as it’s only about 12 weeks old. It’s supplied in its original box with all manuals and cables, and it’s one of the best USB-C monitors you can buy that isn’t £1000+!

    Full specifications can be found here:

    LG 27UD88-W: 27 Class 4K UHD IPS LED Monitor (27 Diagonal) | LG USA

    Price and currency: £400
    Delivery: Delivery cost is included within my country
    Payment method: BACS or Paypal
    Location: Edinburgh
    Advertised elsewhere?: Not advertised elsewhere
    Prefer goods collected?: I have no preference

    ______________________________________________________
    This message is automatically inserted in all classifieds forum threads.
    By replying to this thread you agree to abide by the trading rules detailed here.
    Please be advised, all buyers and sellers should satisfy themselves that the other party is genuine by providing the following via private conversation to each other after negotiations are complete and prior to dispatching goods and making payment:

    • Landline telephone number. Make a call to check out the area code and number are correct, too
    • Name and address including postcode
    • Valid e-mail address

    DO NOT proceed with a deal until you are completely satisfied with all details being correct. It’s in your best interest to check out these details yourself.

  2. janmm

    Hi

    Are you willing to send the monitor first, considering your low feedback?
    What’s the lowest you’ll take?

  3. ascender

    No, I wouldn’t be willing to send the monitor, but if you check out my feedback you’ll find it’s for a number of items of higher value than this, one being a MacBook Pro.

    £400 all in is my best offer.

  4. janmm

    OK, too pricey for me. GLWS.

  5. janmm

    Would you take £300 delivered?

  6. ascender

    No, ‘fraid not.


Windows Admin Center lures IT pros with Azure integration

The Windows Admin Center offers a number of on-premises management perks, but it’s the added bonus of Azure integration that some organizations might find even more appealing.

Traditionally, Windows IT administrators use native Windows integrated tools, such as Server Manager and Microsoft Management Console, to manage their on-premises workloads unless they require additional features from the System Center Configuration Manager or a third-party alternative. Microsoft recently developed Windows Admin Center, a free, browser-based, on-premises management tool that bundles a number of administrative tools in a clean interface that offers a more streamlined way to handle everyday tasks.

Although it is not dependent on Azure, Windows Admin Center connects to Microsoft’s cloud platform to use different services such as Azure Active Directory and IaaS virtual machines. Administrators can install Windows Admin Center on either Windows 10 or Windows Server 2016 or 2019 machines.

The difference between Windows Admin Center and System Center is the ease of Azure integration and how lightweight Windows Admin Center is. System Center is a monster of an application with many different components, but it does feature a lot of great tooling and automation capabilities. Windows Admin Center seems to be better for specific troubleshooting and one-off tasks.

Windows Admin Center installation and requirements

Installing Windows Admin Center from Microsoft’s download page is a very simple process, but it’s even easier using the Chocolatey package manager. The command below installs the software locally with the default port and a self-signed certificate:

choco install windows-admin-center -y

Remote management with Windows Admin Center

One downside to Windows Admin Center is that it only manages servers running Windows Server 2012 R2 and above. Microsoft recently released limited Server 2008 R2 support, but IT pros cannot completely manage Windows 7 and Server 2008 R2 machines with this tool. However, these OSes will be out of support in early 2020, making this point moot.

Azure integration brings the cloud closer

It is somewhat surprising to see the scope of Windows Admin Center’s functionality. The ability to manage both on-premises and Azure servers from one place is a very distinct feature not found in many free utilities. Administrators might also find some use for the Azure service integrations, such as the ability to back up on-premises servers to Azure and centrally manage Windows updates with Azure Update Management.


Another great Azure integration option is authentication to the Windows Admin Center from Azure Active Directory. When using this feature, a user who authenticates must be a member of the local users group on the gateway server and in Azure Active Directory. Because Azure allows for multifactor authentication, admins can use it to tighten security around Windows Admin Center and the systems it manages.

Windows Admin Center tuned for on-premises management

While there is some Azure integration, I suspect the majority of users want Windows Admin Center to manage on-premises resources, and a good portion of its features are geared toward data center administration.

Microsoft did a great job building an application that provides a unified view of certificates, firewall configurations, networking, processes, registry and resource monitoring. Windows Admin Center uses PowerShell and Windows Management Instrumentation (WMI) for its extensive management capabilities on remote systems. Windows Admin Center even connects to the remote file system to download and upload files.

In this screenshot, Windows Admin Center connects to a remote Windows machine and opens that system’s registry.

Extensibility augments management reach

In addition to its built-in features, Microsoft provided a way to expand the functionality of Windows Admin Center with a software development kit. Developers can build mock-ups and create tools and gateway plugins using different tools and technologies, including HTML5, CSS, Angular, TypeScript, jQuery, PowerShell and WMI.

Vendors and developers who wish to use the Windows Admin Center management framework can produce their own extensions using a software development kit Microsoft provides.

New features arrive with version 1809

Microsoft released version 1809 of Windows Admin Center in September and added a slew of new features. With this release, users can see the PowerShell scripts built into Windows Admin Center that are associated with a particular action.

In the example below, the Windows Admin Center displays a list of scripts used to manage local users. Administrators can copy and modify these scripts for their own projects.

Version 1809 of the Windows Admin Center shows the underlying PowerShell scripts used to perform certain actions.

Other new features in version 1809 include functionality for scheduled tasks, file shares, Hyper-V multi-VM bulk actions and limited functionality for Windows Server 2008 R2.

The platform-specific features for Windows Server 2019 include migrating servers with the Storage Migration Service and System Insights, which is a predictive analytics feature for Windows Admin Center.


11 changemakers chosen as recipients of Microsoft and National Geographic AI for Earth Innovation Grants – Stories

New grant offers awards of more than $1.2 million to advance uses of artificial intelligence in scientific exploration and research on critical environmental challenges

WASHINGTON — Dec. 11, 2018 — Eleven changemakers have been selected to receive Microsoft and National Geographic AI for Earth Innovation Grants to apply artificial intelligence (AI) to help understand and protect the planet. Each AI for Earth Innovation Grant recipient will be awarded between $45,000 and $200,000 to support their innovative projects.

“The National Geographic Society is committed to achieving a planet in balance, and in joining forces with Microsoft on the AI for Earth Innovation Grant program, we are providing incredible potential to drive fundamental change through our unique combination of expertise in conservation, computer science, capacity building and public engagement,” said Jonathan Baillie, executive vice president and chief scientist of the Society. “We look forward to seeing these talented individuals create solutions to some of the most challenging environmental issues of the 21st century using the most advanced technologies available today.”

Eleven projects were selected from an impressive pool of more than 200 applicants. The high caliber of the applications prompted Microsoft and National Geographic to increase the funding for the 11 chosen projects from the initially planned $1 million to more than $1.28 million. This furthers the organizations’ commitment to investing in novel projects that use AI to help monitor, model and ultimately manage Earth’s natural systems for a more sustainable future.

“Human ingenuity, especially when paired with the speed, power and scale that AI brings, is our best bet for crafting a better future for our planet and everyone on it,” said Lucas Joppa, chief environmental officer at Microsoft Corp. “The caliber of the applications we received was outstanding and demonstrates the demand we’ve seen for these resources since we first launched AI for Earth. We’re looking forward to continuing our work with the National Geographic Society to support these new grantees in their work to explore, discover and improve the planet.”

The grant recipients and their project members will have the funds, computing power and technical support to advance exploration and discover new environmental solutions in the following core areas: sustainable agriculture, biodiversity, climate change and water. The diverse group hails from around the globe, originating from six countries and working in eight regions across five continents.

The 11 AI for Earth Innovation Grant recipients were announced at an event on Tuesday at National Geographic headquarters in Washington, D.C.:

  • Ketty Adoch: Geographical information systems specialist from Uganda. Her AI for Earth Innovation Grant project will detect, quantify and monitor land cover change in the area surrounding Lake Albert and Murchison Falls National Park, Uganda’s largest and oldest national park.
  • Torsten Bondo: Business development manager and senior remote sensing engineer at DHI GRAS in Denmark. With the AI for Earth Innovation Grant, his team aims to use machine learning and satellites to support irrigation development and improve crop water efficiency in Uganda together with the Ugandan geo-information company Geo Gecko. The goal is to contribute to food security, poverty alleviation and economic growth.
  • Kelly Caylor: Director of the Earth Research Institute and professor of ecohydrology in the Department of Geography and in the Bren School of Environmental Science and Management at the University of California, Santa Barbara. His team will use the AI for Earth Innovation Grant to help produce an online web map and geospatial analysis tools that will improve estimates of agricultural land use change and groundwater use.
  • Joseph Cook: Polar scientist from the United Kingdom. With his AI for Earth Innovation Grant, he aims to develop new tools that use modern techniques of machine learning and drone and satellite technology to explore the changing cryosphere.
  • Gretchen Daily: Co-founder and faculty director of the Natural Capital Project, based at Stanford University in Stanford, California. With this AI for Earth Innovation Grant, her team will develop a way of detecting dams and reservoirs around the world — most of which are hidden today, in digital terms — to quantify their impact and dependence on nature and guide investments in green growth that secure both people and the biosphere.
  • Stephanie Dolrenry: Director of Wildlife Guardians, based in Washington, D.C. Her team will use the AI for Earth Innovation Grant to help support the Lion Identification Network of Collaborators, an AI-assisted collaborative database for lion identification and interorganizational research.
  • Africa Flores: Research scientist at the Earth System Science Center at the University of Alabama in Huntsville and originally from Guatemala. Her AI for Earth Innovation Grant project will focus on developing a prototype of a harmful algal bloom (HAB) early warning system to inform Guatemalan authorities about upcoming HAB events in Lake Atitlan, a landmark of Guatemala’s biodiversity and culture.
  • Solomon Hsiang: Chancellor’s associate professor of public policy at the University of California, Berkeley, where he founded and directs the Global Policy Laboratory. With this AI for Earth Innovation Grant, his team will use 1.6 million historical aerial photographs to discern the effect of major droughts and climate change on human migration in Africa.
  • Holger Klinck: Director of the Cornell Lab’s Bioacoustics Research Program in Ithaca, New York. His team will use the AI for Earth Innovation Grant to develop a machine-learning algorithm for detecting and classifying the songs of insects in tropical rainforests to monitor species composition and spatial distribution, information that is critical for monitoring ecosystem health.
  • Justin Kitzes: Assistant professor in the Department of Biological Sciences at the University of Pittsburgh in Pennsylvania. With this AI for Earth Innovation Grant, he aims to develop the first free, open source models to allow academic researchers as well as agency, nonprofit and citizen scientists to identify bird songs in acoustic field recordings, with the goal of radically increasing global data collection on bird populations.
  • Heather J. Lynch: Quantitative ecologist and associate professor jointly appointed in the Department of Ecology and Evolution and in the Institute for Advanced Computational Science at Stony Brook University in New York. Her AI for Earth Innovation Grant project will couple AI with predictive-population modeling for real-time tracking of Antarctic penguin populations using satellite imagery.

The AI for Earth Innovation Grant program will provide award recipients with financial support, access to Microsoft Azure and AI tools, inclusion in the National Geographic Explorer community, and affiliation with National Geographic Labs, an initiative launched by National Geographic to accelerate transformative change and exponential solutions to the world’s biggest challenges by harnessing data, technology and innovation. The grants will support the creation and deployment of open source trained models and algorithms so they are available to other environmental researchers and innovators, and thereby have the potential to provide exponential global impact. The AI for Earth Innovation Grant program builds upon Microsoft’s AI for Earth program, which counts as grantees nearly 200 individuals and organizations on all seven continents, and the National Geographic Society’s 130-year history of grantmaking, supporting more than 13,000 grant projects along the way.

About National Geographic Society

The National Geographic Society is an impact-driven global nonprofit organization based in Washington, D.C. Since 1888, National Geographic has been pushing the boundaries of exploration, investing in bold people and transformative ideas to increase understanding of our world and generate solutions for a healthy, more sustainable future for generations to come. Our ultimate vision: a planet in balance. To learn more about the Society and its programs, visit https://www.nationalgeographic.org/.

About Microsoft

Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

For more information, press only:

Microsoft Media Relations, WE Communications for Microsoft, (425) 638-7777, rrt@we-worldwide.com

Kelsey Flora, National Geographic Society, (202) 807-3133, kflora@ngs.org

Note to editors: For more information, news and perspectives from Microsoft, please visit the Microsoft News Center at http://news.microsoft.com. Web links, telephone numbers and titles were correct at time of publication, but may have changed. For additional assistance, journalists and analysts may contact Microsoft’s Rapid Response Team or other appropriate contacts listed at http://news.microsoft.com/microsoft-public-relations-contacts.

 


New Microsoft Teams calling features narrow gap with Skype

Microsoft plans to add four advanced call controls to Microsoft Teams in the coming weeks, completing its roadmap for bringing the critical telephony features of Skype for Business to Teams. The announcement comes as Microsoft steps up its efforts to migrate users from Skype to Teams.

The four new Microsoft Teams calling features were previously only supported in on-premises deployments of Skype for Business. Microsoft finished adding to Teams all the telephony features of Skype for Business Online, the cloud-based version, earlier this year.

“They definitely have closed the feature gap with Skype for Business,” said Irwin Lazar, analyst at Nemertes Research in Mokena, Ill. “All in all, I think these feature announcements highlight that Teams is able to serve as an enterprise phone system.”

In October, Microsoft stopped letting small businesses sign up for Skype for Business Online, directing those customers to Teams. The vendor also began notifying cloud-based Skype customers with fewer than 500 employees of its intention to automatically migrate their users to Teams.

Technology companies are all looking to boost cloud sales. The vendors will initially forgo revenue from large one-time hardware sales, but monthly income from software subscriptions should make up for it over the long term.

Many enterprises, however, have resisted the switch to Teams, complaining that it lacks the advanced telephony features they need. The advanced calling features being added to Teams this month should alleviate many of those concerns, but large businesses still need an easier migration path to the cloud. 

Enterprises typically transition users from on-premises to cloud communications platforms in phases. That means Teams users need to be able to communicate with Skype users. But Microsoft does not support chat interoperability directly between on-premises Skype and Teams. Enterprise users expressed frustration about this hurdle at the Microsoft Ignite conference in September.

In contrast, Cisco announced this week that customers could now seamlessly link its on-premises unified communications client, Cisco Jabber, with its cloud-based collaboration app, Cisco Webex Teams, through a software update.

Microsoft appears poised to release a hybrid cloud setup similar to Cisco’s that would let on-premises customers begin using certain Teams messaging and meeting features within the Skype client. However, the vendor has yet to announce a timeline for that release.

Advanced Microsoft Teams calling features complete telephony roadmap

Three of the advanced features — group call pickup, call park and shared line appearance — should become available within the next several weeks. A fourth feature, location-based routing, is slated for release in the first quarter of 2019.

Group call pickup improves an existing feature that lets users automatically forward incoming calls to groups of colleagues. The system can ring each member of the group simultaneously or one at a time in a predetermined order. The update lets users customize the appearance and type of notifications that members of the group receive with incoming calls.

Call park is a sophisticated way to put callers on hold. Parking a call generates a code, which gets sent — in a text message, for example — to the employee the caller is attempting to reach. That employee can then answer the call in the Teams app.

Shared line appearance lets businesses create user accounts with multiple phone lines. The incoming calls to those lines are all automatically forwarded to other users. The auto-forwarding enhances the existing delegation feature whereby users give other people the ability to answer and make calls on their behalf. It should appeal to sales teams and other small groups as an alternative to an automatic call distributor.

Location-based routing is a way to establish rules for sending calls between voice over IP endpoints and public switched telephone network endpoints based on the regulatory restrictions in the locations of the callers.



Portworx update targets hybrid and multi-cloud deployments

Portworx is adding the ability to move container-based application data across clouds with an update to its storage software for Kubernetes.

Portworx PX-Enterprise 2.0 now includes PX-Motion migration and a PX-Central management console. These features build on the startup’s original focus on helping customers provision persistent storage for stateful container-based applications.

Portworx CTO Gou Rao said the PX-Enterprise update responds to the growing trend of DevOps teams moving to Linux containers, rather than virtual machines, to manage and run their application infrastructure across clouds, whether on premises or public.

New PX-Motion feature

PX-Motion enables Kubernetes users to shift applications, data and configuration information between clusters. Customers can move the data or Kubernetes objects that control an application on demand. They can also set the product to automate workflows, such as backup and recovery and blue-green deployments for stateful applications. Users run two identical production environments in a blue-green deployment, with only one live at any time, to reduce the risk of downtime.

“Once a customer enables the PX-Motion functionality, they just go back to managing their applications through Kubernetes,” Rao said.

Migrating data between clouds could be hard with monolithic applications running in a machine-centric world, where any change might require moving a virtual machine image from one cloud to another, Rao said. By contrast, applications packaged in containers are distributed and service-oriented, with a smaller footprint, paving the way for Portworx features such as PX-Motion, he said.

“It’s not like a customer in the enterprise has one big, single, large Kubernetes cluster. What we have seen happening in the past couple of years is that people are typically running multiple Kubernetes clusters for a variety of reasons,” including compliance, risk mitigation or cost, Rao said. One common use case is running multiple clusters in the same public or private cloud and treating them as distinct cloud footprints, he added.

Portworx PX-Motion migrates container-based applications, data and configuration information as part of a DevOps-style blue-green deployment.

PX-Central management console

Portworx added PX-Central to facilitate single-pane-of-glass management, monitoring and metadata services across Portworx clusters built on the Kubernetes container scheduler and orchestration engine. Users can visualize and control an ongoing migration at an application level through custom policies.

Rhett Dillingham, a senior analyst at Moor Insights & Strategy, said Kubernetes users often struggle with storage availability and data management within and across clusters. This can be the case particularly when they run stateful applications, such as databases.

He said Portworx took an innovative approach to provide high-availability storage in the first version of PX-Enterprise. Now, it’s tackling data management to help enterprises scale Kubernetes deployments across clusters, development pipeline environments and multi-cloud environments.

Portworx’s competitors in cloud-native storage for Kubernetes include startup StorageOS and open source OpenEBS. But Dillingham said many organizations aren’t yet using any tools to solve the problems Portworx addresses.

“They just use the cloud block storage service from their cloud infrastructure platform,” Dillingham said. “But, as customers grow their use of Kubernetes to address more applications across more environments, they often recognize they need more storage platform capability than what they can get from the cloud infrastructure services.”

Portworx customer Fractal Industries started using Portworx PX-Enterprise to scale its machine-learning- and AI-based analytics-as-a-service product that ingests, integrates and correlates data in real time. The Reston, Va., startup uses stateful and stateless container-based services to power its Fractal OS platform and had trouble finding a good option to help with the stateful services. Fractal tried the open source REX-Ray container storage orchestration engine, mounted volumes, local volumes and other technologies before settling on Portworx.

Sunil Pentapati, director of platform operations at Fractal, said PX-Enterprise not only helps Fractal with scaling storage volumes and backup-and-restore functionality, but it also provides a common data management layer and real-time automation capabilities. He said the product includes APIs that minimize the manual effort Fractal needs to expend. The new Portworx PX-Enterprise 2.0 update will be important, as Fractal pursues its long-term goals related to automation, Pentapati noted.

“What is very important for us as a company is cloud independence,” Pentapati said. “The way we built our product is to abstract everything in such a way that we could scale our product across cloud providers or where the customer wants. For us to achieve that independence, it is not enough just to tackle the compute layer. We also need to extend the automation all the way to the data layer, too. Having the ability to scale our automation and operate across clouds is what Portworx brings to the table.”

Subscription-based pricing for Portworx PX-Enterprise is $1,500 per virtual machine, per year. PX-Motion and PX-Central are new features bundled with PX-Enterprise at no extra cost to the customer.

PX-Enterprise integrates with Kubernetes distributions and container orchestration systems, such as Amazon Elastic Container Service for Kubernetes, Microsoft Azure Kubernetes Service, Docker Enterprise Edition, Google Kubernetes Engine, Heptio, IBM Cloud Kubernetes Service, Mesosphere DC/OS, Pivotal Container Service for Kubernetes, Rancher and Red Hat OpenShift.


How to Make an Offline Root Certificate Authority for Windows PKI in WSL

In a previous article, I talked about the concepts involved in PKI. In this article, I want to show you how to build your own PKI. I will mostly write this as a how-to, on the assumption that you read the previous article or already have equivalent knowledge. I will take a novel approach of implementing the root certification authority in Windows Subsystem for Linux. That allows you to enjoy the benefits of an offline root CA with almost none of the drawbacks.

Why Use Windows Subsystem for Linux as a Root Certification Authority?

Traditionally, building a Windows PKI with an offline CA involves Windows Server systems for all roles. Most Windows administrators find it easy — you only need to know the one operating system. I have seen some that employ a standard Linux system as the offline root. Of course, you don’t necessarily need a Windows system at all.

I use Windows Subsystem for Linux to create an offline root CA and use a Windows system for an online intermediate (subordinate) CA for these reasons:

  • Licensing: If you use a Windows Server system as the offline root, it consumes a license (physical installation) or a virtualization right (virtual installation). Since the offline root CA should be kept offline for nearly the entirety of its existence, that’s a waste.
  • Built-in components: I won’t show you anything in this article that you can’t find natively in Windows/Windows Server. You could certainly go through all the effort to obtain and install a full Linux installation to use as the root CA. You could download and install OpenSSL for Windows to mimic what I’m doing with WSL.
  • Setup ease: You can bring up the WSL instance with almost no effort. Compare to the alternatives.
  • Active Directory integration: You can certainly create a full PKI without using Windows Server at all. You lose a lot, though. You can get a lot of mileage out of features such as auto-enrollment.

Using WSL for the offline root allows us to protect it easily. Using Windows Server as the intermediate allows us maximal benefits.

There are quite a few parts to this configuration, so keep track of your progress as you work through the phases below.

Prerequisites for a WSL/Windows Server Certification Pair

You’ll need a few things to make this work:

  • Credentials for a domain account that belongs to the Enterprise Admins security group
  • An installation of Windows Server (2016 used in this article; anything 2012 or later should suffice) to use as the online intermediate (subordinate) certificate server
    • Domain-joined
    • Machine name set (you cannot change it afterward)
    • IP set
  • A Windows system capable of running Windows Subsystem for Linux (Windows 10 used in this article; a current iteration of Windows Server, Semi-Annual Channel or Windows Server LTSC version 2019+ should suffice). This article is written with the expectation that you will use a physical Windows 10 system and store vital information on a removable USB device.
    • Any installation of Linux with OpenSSL, virtual or physical, would suffice as an alternative
    • OpenSSL on a Windows installation would also suffice
  • A folder on the Windows system where files can be transferred to and from the WSL environment. Pick something easy to type (I used D:\CA in this article)
  • A DNS name where you will publish the root CA’s certificate and certificate revocation list (CRL). It must be reachable by the systems and devices that will treat your CA as authoritative. This DNS name becomes a permanent part of the issued certificates, so choose wisely.

If you want, you can use the same Windows Server system to host the online intermediate CA and the offline root. You only need to ensure that the Windows Server version can run WSL. Ordinarily, running both together would be a terrible idea. In this case, the root CA will not “exist” long enough to cause concern.

Phase 1: Prepare the Subordinate CA on the Windows Server

Counter-intuitively, we will start with the subordinate certification authority. It will reduce the amount of file shuffling necessary. Because this setup is a simple progression of wizard screens, I did not include every screen.

Install the Subordinate CA Role

We begin in Server Manager. I have not done this in PowerShell although it is possible.

  1. Start the Add roles and features wizard from within Server Manager
  2. Select Active Directory Certificate Services.
  3. You will be prompted to add the management features. You can skip this if you only want the Certificate Authority role to exist on the target server.
  4. By default, the wizard will only want to install the Certification Authority role. We will install and configure other roles later. You can select them now, but you will be unable to configure them. I recommend only taking the default at this time.

Complete the wizard and wait for the installation to finish.

Perform Initial CA Configuration

Once the wizard completes, you have the role installed but you cannot use it. Configure the subordinate CA role.

  1. In Server Manager, click the notification flag with the yellow triangle at the top right of the window, then click Configure Active Directory Certificate Services on the destination server.
  2. In the wizard, choose the enterprise admin account selected for this procedure.
  3. Choose to configure the Certification Authority only. If you had selected to install other roles, do not attempt to configure them at this time.
  4. Choose to configure the CA as an Enterprise CA (you can select Standalone CA if you prefer; you will miss out on most Active Directory features, however).
  5. Select the Subordinate CA type.
  6. Choose to Create a new private key.
    Note 1: if you were re-installing a previously existing CA, you would work through the Use existing private key branch instead.
    Note 2: If you’d like to use OpenSSL to create the key set instead, you’ll find instructions further down. You would do that if you wanted more information to appear on the CA’s certificate than the Microsoft wizard allows, such as organization and location information. However, you will need to pause here, perform the WSL steps, and then return to this point.
  7. The default Cryptography selections should suffice for most installations. Reducing any settings could cause compatibility problems; increasing them might cause compatibility and performance problems. Do not check Allow administrator interaction when the private key is accessed by the CA. That will cause credential prompts where you don’t want them.
  8. Choose an appropriate name for the CA’s common name. The default should work well enough, although I tend to choose a friendlier name. I recommend that you avoid spaces; they can be used but will cause problems in some places. You can also change the suffix, if necessary for your usage.
  9. Choose to Save a certificate request to file on the target machine. You can make the filename friendlier if you like. Make sure that you keep track of the name though because you’ll enter it in a future step.
  10. Proceed through the remainder of the wizard.

You have now completed the initial configuration of the subordinate certificate authority. Now we turn to the WSL system to set up your offline root CA.

Enable Windows Subsystem for Linux and Choose Your Distribution

You can skip this entire section if you are bringing your own method for running OpenSSL.

Enabling Windows Subsystem for Linux is slightly different depending on whether you are using a desktop or server operating system.

Enable WSL on Windows 10

As of this writing, Windows 10 provides the simplest way to install WSL. If you prefer a server SKU, skip to the next sub-section.

  1. Use Cortana to search for Turn Windows features on or off (or enough of it for the search to find it), then open the link.
  2. Choose Windows Subsystem for Linux.
  3. Once that’s complete, use the Microsoft Store to find and install the Linux distribution of your choice. I prefer Kali, but any should do (you can find starting instructions for installing a non-Store distribution on Microsoft’s blog).

Start the Kali image and follow its prompts to set up your user and password.

Enable WSL on SAC or Windows Server LTSC 2019 or Later

Be aware that not all Linux distributions will work on non-GUI servers (in any way that I know of) because they do not include a .exe launcher. For example, Kali only comes as appx and I could not get it to work. Ubuntu comes as exe and should not pose problems.

  1. In PowerShell, run  Install-WindowsFeature -Name Microsoft-Windows-Subsystem-Linux
  2. Restart the Windows Server computer.
  3. Access Microsoft’s list of Linux distributions. You have two purposes on this site: selecting your desired distribution and getting its URL. The page includes its own instructions, which do not meaningfully differ from mine.
  4. Download the selected distribution:  Invoke-WebRequest -Uri https://aka.ms/wsl-kali-linux -OutFile $env:USERPROFILE\Downloads\Kali.zip -UseBasicParsing
  5. Create a folder to run the distribution from:  mkdir C:\Distros\Kali
  6. Extract the distribution:  Expand-Archive -Path $env:USERPROFILE\Downloads\Kali.zip -DestinationPath C:\Distros\Kali
  7. Check the directory listing to see your options:

    1. For .exe, just run it directly:  C:\Distros\Ubuntu\ubuntu1804.exe
    2. For .appx, use Add-AppxPackage first (desktop experience only, apparently), then run your distribution from the Start menu:  Add-AppxPackage -Path C:\Distros\Kali\DistroLauncher-Appx_1.1.4.0_x64.appx

Follow the prompts to set up your user account and password.

Windows Subsystem for Linux Storage

This article is not about WSL. If you want to know more than I’m showing you, then you’ll need to research it elsewhere. However, we’re dealing with a root certificate authority and security is extremely vital. So I want to make one thing clear: be aware that the files for your WSL installation will be held underneath your user profile (C:\Users\your-user-account\AppData\Local\Packages\the-distro-name). If you follow my directions, you will not permanently leave sensitive information in this location. If you skip that part, then you must maintain significant security over that location.

As mentioned in the intro, I will have you use a USB device on your physical system running WSL. WSL can see your Windows file system at /mnt/DRIVELETTER. As an example, your C: drive is /mnt/c, your D: drive is /mnt/d, etc.

Before proceeding, pick a folder to use to transfer files back and forth from WSL and your Windows installation. I am using “D:\CA” (/mnt/d/CA). Copy in the CSR (the .req file) created by the wizard at the end of the Windows section above.

Creating a Root Certification Authority in Windows Subsystem for Linux

I have used Kali in WSL for all of these steps. Instructions should be the same, or at least similar, for other distributions. If you use a full installation of Linux rather than WSL, you must make modifications, primarily in transferring files between Windows and Linux. Your mileage may vary.

Configuring OpenSSL

Like most Linux programs, OpenSSL depends on a configuration file. I will include one, but OpenSSL provides a default file that you can modify. If you wish to copy that default file out to Windows for editing in something like Notepad++, I will provide the exact point at which you will perform that step. If you wish to use mine as your template, place it in the Windows-side folder that you’re using for transfer (D:\CA on my system).

To keep the file short, I used only a few comments. Notes on the configuration points appear after the listing. Check each line before using this file yourself. I believe that you will only need to modify the four items at the top, but you may disagree. I have trimmed away all items that I considered non-essential for the task at hand. You will be unable to use this file for OpenSSL operations unrelated to certification authorities and x509 certificates.
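
The listing below is a minimal sketch of such a file rather than a drop-in copy. The four items at the top are placeholders (my /ca directory, a MyRootCA name, the 210-day CRL life and an example publication URL); the section names v3_ca and v3_intermediate_ca are the ones the later signing commands reference. Check each line against your own OpenSSL build before relying on it.

# The four items you will most likely change (placeholder values shown)
dir         = /ca
rootcaname  = MyRootCA
crldays     = 210
crlurl      = http://pki.yourdomain.tld

[ ca ]
default_ca  = CA_default

[ CA_default ]
certs             = $dir/certs
crl_dir           = $dir/crl
new_certs_dir     = $dir/newcerts
database          = $dir/index.txt
serial            = $dir/serial
crlnumber         = $dir/crlnumber
private_key       = $dir/private/$rootcaname.key
certificate       = $dir/certs/$rootcaname.crt
crl               = $dir/crl/$rootcaname.crl
default_md        = sha256
default_crl_days  = $crldays
policy            = policy_loose

[ policy_loose ]
commonName              = supplied
countryName             = optional
stateOrProvinceName     = optional
localityName            = optional
organizationName        = optional
organizationalUnitName  = optional
emailAddress            = optional

[ req ]
default_bits        = 4096
default_md          = sha256
string_mask         = utf8only
distinguished_name  = req_distinguished_name
x509_extensions     = v3_ca

[ req_distinguished_name ]
commonName          = Common Name

[ v3_ca ]
subjectKeyIdentifier    = hash
authorityKeyIdentifier  = keyid:always,issuer
basicConstraints        = critical, CA:true
keyUsage                = critical, digitalSignature, cRLSign, keyCertSign
# No CRL distribution point here; nothing above the root can revoke it

[ v3_intermediate_ca ]
subjectKeyIdentifier    = hash
authorityKeyIdentifier  = keyid:always,issuer
basicConstraints        = critical, CA:true, pathlen:0
keyUsage                = critical, digitalSignature, cRLSign, keyCertSign
crlDistributionPoints   = URI:$crlurl/$rootcaname.crl
# authorityInfoAccess   = OCSP;URI:$crlurl/ocsp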

OpenSSL Configuration Notes

A few notes on the decisions that I made in this file:

  • I included points where you can configure OCSP for the root CA if you wish, but I commented them out. Since your root CA will only sign one certificate, I see no justification for OCSP.
  • You will want to change the names to match your own; the scripts that I’m about to show you expect the same names, so take care that they align.
  • I set the CRL to expire after 210 days (about 7 months). You will need to regenerate and redeploy the CRL within that time or clients will cease trusting the subordinate CA, and therefore all certificates it has signed. CRL regeneration instructions provided later.
  • It is possible to sign the subordinate CA without including CRL information. That will alleviate the need to periodically redeploy the CRL. However, it will also eliminate most of the benefit of using an offline CA. One major reason to keep the root offline is because if it is ever compromised, no authority can revoke it. If you do not configure your subordinate CA so that it can be revoked, then it will also remain valid even if compromised.
  • Some other tutorials include generating a CRL for the root (here, in the v3_ca section). That is a wasted effort. No one is above the root, so it cannot be revoked. A CA cannot revoke its own certificate.
  • I restricted the subordinate CA with “pathlen:0”. That way, if it is ever compromised, thieves cannot use it to create further CAs. You can change or remove this restriction if you need a more complex CA structure.

You are certainly welcome to make any changes that suit your situation. Just remember that the upcoming processes have item names that must completely match what’s in your openssl.cnf.

The Process for Creating a Root Certification Authority Using openssl

On my system, I placed all of these into a single file. I copied out the blocks separately and pasted them into the WSL interface. At the end of each block, the system will prompt for input of some kind. It would be possible to pre-supply many (maybe all) of these items. I do not classify the creation of a root certification authority as a worthy purpose for full automation.

Overall instructions: paste the text from these blocks directly into WSL.

Part 1: Establish Variables

In this section, replace my named items with your own. Do not modify anything in the “composite variables” section. If you restart the WSL interface for any reason, you will need to resubmit these settings.

Note that I set the duration of the root CA’s validity to about 30 years and the subordinate CA’s validity to about 10 years. A CA cannot issue a certificate with a lifespan beyond its own. If both certificates expire at the same time, you will need to replace your entire PKI simultaneously.
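A sketch of the variable block follows. The variable names are the ones that the remaining parts reference; the values are placeholders for your own:

    # named items -- replace the values with your own
    rootcaname="My Root Certification Authority"
    subcaname="subca"                 # used in file names; avoid spaces
    dir_host_transfer=/mnt/d/CA       # the Windows-side transfer folder

    # validity periods, in days
    dur_root=10950                    # about 30 years
    dur_sub=3650                      # about 10 years

    # composite variables -- do not modify
    dir_root=/ca
    dir_out=$dir_root/out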

Interim Optional Step: Copy the Default openssl.cnf into Windows for Editing

If you wish to copy out the default openssl.cnf into Windows for easy editing rather than using mine, this will help (some distributions use a different directory structure):
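On Debian-derived distributions such as Kali, the default file lives at /etc/ssl/openssl.cnf, so something like this suffices:

    cp /etc/ssl/openssl.cnf $dir_host_transfer/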

Part 2: Create the Environment

Note: before starting, and for the duration of this procedure, I recommend entering  sudo -s  to work from a root shell. Most everything requires sudo to run properly anyway.

This section will build up the folder structure that openssl expects without disrupting the built-in openssl configuration. This is not a permanent build, so there is no need to try to integrate it with the distribution’s own setup.
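A sketch of the structure; the directory names match the openssl.cnf listing above:

    mkdir -p $dir_root/certs $dir_root/crl $dir_root/newcerts $dir_root/out $dir_root/private
    chmod 700 $dir_root/private       # only root should read the private key
    touch $dir_root/index.txt         # the certificate database
    echo 01 > $dir_root/serial        # the first signed certificate becomes 01.pem
    echo 01 > $dir_root/crlnumber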

Part 3: Place Your Edited openssl.cnf File

I edited my openssl.cnf file in Windows and have written all my instructions accordingly. If you took a different approach, such as editing it within WSL in vim, then your goal in this section is simply to get the file into the $dir_root directory (/ca in my design):
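Assuming the edited file still sits in your transfer folder:

    cp $dir_host_transfer/openssl.cnf $dir_root/
    cd $dir_root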

Note: under WSL, openssl doesn’t care about Windows vs. UNIX line-endings in its cnf file. In any other configuration, make sure you have line endings set appropriately for your environment.

Part 4: Create the Root Certification Authority’s Keys

Use the following to generate new keys to use for your root certification authority:
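A sketch, assuming the key name root.key from the configuration listing (4096-bit RSA, encrypted with AES-256):

    cd $dir_root
    openssl genrsa -aes256 -out private/root.key 4096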

You will be prompted to provide a password (“pass phrase”) to protect the key. Do not lose the password or the key will be irretrievable. You will also need it several times in the remaining steps of this article.

Warning: the key you just created is the most important file in the entire procedure. If any unauthorized individual gains access to it, you should consider your root CA compromised. They would still need the password to unlock it, of course, but the key could no longer be considered secure.

Part 5: Self-sign the Root Certification Authority Certificate

The following will create a new certificate from the private/public key pair that you created in part 4.
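A sketch, using the v3_ca section and the file names from the listing above:

    openssl req -config openssl.cnf -new -x509 -sha256 -extensions v3_ca -key private/root.key -days $dur_root -out certs/root.crt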

You will first be prompted for the password to the private key. You will also be prompted for the common name of the root certification authority. I stored that in $rootcaname, but in this particular prompt I always typed it in manually.

Part 6: Sign the Subordinate CA Certificate

In this section, you’ll copy in the certificate request file that you created for the subordinate CA back in the Windows setup steps. Then, you’ll have your new root CA sign it.

Note: if you want to use openssl to create the subordinate CA’s certificate instead of using one provided by the Windows wizard, those instructions appear after this section. Perform those steps and then continue with Part 7.
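A sketch, assuming you saved the wizard’s CSR in the transfer folder as $subcaname.req (substitute your actual file name):

    cp "$dir_host_transfer/$subcaname.req" $dir_root/
    openssl ca -batch -config openssl.cnf -extensions v3_intermediate_ca -days $dur_sub -md sha256 -in "$subcaname.req"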

You will be prompted for the password to the root CA’s private key (from step 4). You might also see a complaint about a missing “index.txt.attr” file. You can ignore that message; openssl automatically creates the file with defaults.

I included -batch to suppress as much input as possible, but you will still see the certificate’s details and its base64-encoded text on-screen.

Part 7: Create the Certificate Revocation List

This step creates the certificate revocation list from the root CA. The list will be empty, but the file is absolutely necessary.
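A sketch, writing the CRL to the location named in the openssl.cnf listing:

    openssl ca -config openssl.cnf -gencrl -out crl/root.crl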

You will be prompted for the password to the root CA’s private key (from step 4).

Part 8: Separate out the Files to be Used in Windows

Certificate chains are often required. openssl can generate them easily, and you have both certificates on hand right now; you won’t find a better time. Furthermore, several of the files that you’ll want to use in your Windows environment are scattered throughout the CA file structure. We’ll place all of these files into the same output directory (/ca/out in my design, held in $dir_out). I also rename the sub CA’s certificate from newcerts/01.pem to a .crt file named after its common name:
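A sketch of that collection step; the chain file is simply the two certificates concatenated, sub CA first:

    cp certs/root.crt crl/root.crl $dir_out/
    cp newcerts/01.pem "$dir_out/$subcaname.crt"
    cat "$dir_out/$subcaname.crt" certs/root.crt > "$dir_out/$subcaname-chain.crt"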

Part 9: Transfer all Files to Windows

Now we’ll copy out the entire CA file structure into Windows wholesale:
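With the transfer folder from Part 1:

    cp -rf $dir_root/* $dir_host_transfer/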

You’ll find all of the files needed for your Windows infrastructure in the out subfolder of your targeted Windows folder. Those are all safe to use and share anywhere. The rest of the folder structure should be treated as a singular unit and secured immediately. The private key for the root CA resides in private. It is imperative that you protect the .key file.

Part 10: Cleanup

You have now completed all the necessary steps in WSL. Cleanup with these steps:

  1. Start by verifying each file in the out folder; Windows should be able to open all of them directly. If a file is damaged, it will not open. Be aware that the subordinate CA will not yet be able to validate the entire chain because the root does not yet appear in the local trusted root store.
  2. Place the contents of the out folder somewhere that you can access them in upcoming sections. Remember that all of these are public files; they do not need any particular security measures.
  3. Once you have verified that all output files are valid, transfer all files in your target Windows folder to a secured, offline location, such as an external USB device that can be placed into a safe. You do not need to include the out folder.
  4. In WSL, run  rm -r $dir_root . Note: this completely removes all of the CA data from WSL!

Remember that the CA environment that you created in WSL was intended to be temporary. You can’t really “save” it anywhere, and I’ve had multiple WSL environments fail completely for no reason that I could discern. Do not expect to keep it.

If space on your secured device is at a premium, you can delete the newcerts/01.pem file. That is your sub-CA’s signed certificate, which you kept as part of the out file listing. If space is even more precious than that, the only file that you absolutely must keep is the .key file in private. With that, you can sign new certificates, revoke compromised certificates, and regenerate the CRL. You should also have the various serial files and certificate database (index.txt, serial, crlnumber, etc.), but you could reasonably approximate those if absolutely necessary. You will not be able to perfectly reproduce the root CA’s public certificate, but hopefully, you’ll have some copies of it spread around. openssl.cnf would also be difficult to reproduce, but not impossible. To avoid problems, the best thing is to retain the entire directory structure.


Optional: Use OpenSSL to Generate the Subordinate CA’s Keys and Certificate Request

You might wish to use OpenSSL to generate your subordinate CA’s keys and its CSR. The primary reason that I can think of would be control over the informational fields that the automatic CSR does not generate, such as location and organization data.

This is a text dump of the commands involved. It expects the same environmental setup that I used in Part 1 above and the folder structure from Part 2. I did not break out the separate sections this time, to ensure that you do not blindly copy/paste the entire thing. I documented where OpenSSL will interrupt you.
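The sketch below assumes the Part 1 variables and the Part 2 folder structure; extend the [req_dn] section of openssl.cnf if you want prompts for the additional DN fields:

    cd $dir_root
    # generate the sub CA's key pair; OpenSSL will prompt for a pass phrase
    openssl genrsa -aes256 -out "private/$subcaname.key" 4096
    # build the CSR; OpenSSL will prompt for the key's pass phrase, then the DN fields
    openssl req -config openssl.cnf -new -sha256 -key "private/$subcaname.key" -out "$subcaname.req"
    # sign it as in Part 6, then optionally bundle the key and certificate;
    # OpenSSL will prompt for the key's pass phrase and then an export password
    openssl pkcs12 -export -inkey "private/$subcaname.key" -in "$dir_out/$subcaname.crt" -out "$dir_out/$subcaname.pfx"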

The pkcs12 output portion is optional, but it allows you to move your entire subordinate CA’s keys and certificate in a single package. Just remember to take special care of it, as it contains the CA’s private key.

Distributing the Root Certification Authority and Revocation List

It may seem counter-intuitive, but we’re not going to finish configuring the subordinate CA yet. You could, but it would complain about being unable to verify the root. Things go more smoothly if it can validate against the root first.

Place the Root Certificate into the Domain

I recommend distributing your root CA certificate by using group policy. If you place it into the domain policy, it will appear on all domain members at their next policy refresh. If you delete it from that policy, domain members will (usually) remove it at their next policy refresh.

I feel that the Group Policy Management Console falls under the category of “common knowledge”, so I will not give you a detailed walk-through on installing or using it. You will find it as an installable feature on Windows Server. You can read Microsoft’s official documentation.

In GPMC, follow these steps:

  1. Create a new GPO and link it to the root of the domain.
  2. Give the GPO a meaningful name.
  3. Edit the new policy. Drill down to Computer Configuration\Windows Settings\Security Settings\Public Key Policies. Right-click Trusted Root Certification Authorities and click Import.
  4. Click Next to go to the import page where you can browse for the root CA’s certificate file.
  5. Proceed through the remainder of the wizard without changing anything.

You do not need to modify the user settings; you can disable that branch if you wish.

Configuring DNS for Root Certificate and CRL Distribution

I am going to use the same system that hosts the subordinate CA to distribute the root CA’s certificate and CRL. Remember that these files are public by nature. There is no security risk. However, you do want to have the ability to separate them later if you ever need to decommission or revoke the subordinate CA. To facilitate this, I established a fully-qualified domain name just for the root CA. I’ll have DNS point that name to the subordinate certificate server where an IIS virtual directory will respond. If the situation changes in the future, it’s just a matter of retargeting DNS to the replacement web server.

We start in DNS with a simple CNAME record that points the root CA’s FQDN at the subordinate certificate server.

Configuring IIS for Root Certificate and CRL Distribution

My process only covers one possible option. You could simply install a standalone instance of IIS. You could use a different web server. You could use a Linux system. You could probably even get a container involved. You could employ a network load balancer. Whatever solution you employ, you only have one goal: ensure that the root certificate and CRL can be reached by any system that needs to validate the subordinate CA or a certificate that it signed.

In this article, I will piggyback off of the IIS installation enabled by the subordinate CA’s Certification Authority Web Enrollment role. It fits this purpose perfectly. In its current incarnation, this role has little more value than automatically publishing the CRT and CRL for the subordinate CA. It aligns with our goal of publishing the root CA’s CRT and CRL.

Install the Certification Authority Web Enrollment Feature

The role has not been installed by these instructions so far, so I’ll start with that.

  1. On the certificate server (or a management workstation connected to it), start the Add roles and features wizard in Server Manager. Step forward to the Roles page.
  2. Expand Active Directory Certificate Services and check Certification Authority Web Enrollment.
  3. The wizard will prompt you to install several components of IIS. Agree by clicking Add Features.
  4. Proceed through the remainder of the wizard, keeping all defaults.

Create an IIS Site to Publish the Root CA Certificate and CRL

We will configure the newly-installed role later. Right now, we want to set up the root CA’s information.

  1. In C:\inetpub, create a folder named “rootca”. Place the root certification authority’s CRT and CRL files inside.
  2. In Internet Information Services Manager, create a new site.
  3. Configure the new site. The Site Name is up to you; point the physical path at the rootca folder you just created. Use port 80; all of the items are digitally signed and public, so publishing them with https is counter-productive, at best. Make sure that you use the same FQDN for the Host name that you indicated in the CRL information in your openssl.cnf file.
  4. Test access to the CRL and CRT by accessing the complete URL that appears in the subordinate CA’s CRL information (a quick test sketch appears after this list).
  5. Don’t forget to test the .CRT as well.
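From WSL (or any machine with curl), a sketch of that test, assuming the FQDN and file names from the openssl.cnf listing:

    curl -fo /dev/null http://rootca.yourdomain.tld/root.crl && echo CRL OK
    curl -fo /dev/null http://rootca.yourdomain.tld/root.crt && echo CRT OK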

Complete Configuration of the Subordinate CA

Everything is in place for the subordinate CA. Follow these simple steps to finish up:

  1. Run  gpupdate /force to ensure that group policy publishes the root certificate to the subordinate server.
  2. Assuming Server 2016, use the Start menu search (Cortana) to find Manage computer certificates. On older servers, you’ll need to manually run MMC.EXE and add the Certificates snap-in.
  3. Make certain that the certificate appears in Trusted Root Certification Authorities.
  4. Start the Certification Authority tool. You can find it under Windows Administrative Tools.
  5. Right-click your authority, go to All Tasks, and select Install CA Certificate.
  6. Browse for any one of the subordinate CA’s certificate files that you generated into the out folder.
  7. Provided that everything is OK, it will run a short progress bar and then return you to the management console.
  8. Right-click on your certification authority, go to All Tasks, and click Start Service.
  9. Open Server Manager and click the notification flag with the yellow triangle at the top right of the window, then click Configure Active Directory Certificate Services on the destination server. If it is not present for some reason, then one of the recent tasks should show a link back to the Add roles and features wizard. It will start on the last page, which includes a link to the certification role configuration wizard.
  10. Proceed to the role configuration page. Check Certification Authority Web Enrollment.
  11. IIS should now have a “CertEnroll” virtual directory underneath the Default Web Site that redirects to C:\Windows\system32\CertSrv\CertEnroll. It should contain the CRT and CRL for your subordinate CA.

Congratulations! You have set up a functional public key infrastructure, complete with an offline root CA and an operational enterprise subordinate CA! If you check the certificate list on a domain member with a current policy update, you should see the sub-CA with an OK status.

You can request certificates using the Certificates MMC snap-in on a domain-joined computer. You can also instruct group policy to automatically issue certificates. Explaining such usage (and all of the other possibilities) exceeds what I want to accomplish in this article.

Root CA Maintenance and Activities

You don’t need to do much with your root CA, but ignoring it will cause problems.

I highly recommend placing all of the above activities into a single text file and storing it with your CA’s files. You can then easily reuse the portions that you need. You’ll also have them available if you have an emergency and need to rebuild a new PKI from scratch in a hurry. Append the functions in this section to the file with relevant comments.

When you need to reuse your files, spin up a WSL environment, enter  sudo -s , and then run the portion that generates the environment variables. Then, run  mkdir -p $dir_root && cp -rf $dir_host_transfer/* $dir_root  (the mkdir matters: cp will not create the target directory for you).

Updating the Root CA’s CRL

You will need to issue an updated CRL prior to the expiration of the existing one or problems will ensue. Specifically, anything checking the validity of any certificate in the chain may opt to treat the entire chain as invalid if it cannot retrieve a current CRL.

Assuming that you have performed the preceding bit about regenerating the CA structure within WSL, you can create an updated CRL very simply:
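The same command from Part 7 does the job:

    cd $dir_root
    openssl ca -config openssl.cnf -gencrl -out crl/root.crl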

To output the necessary CRL files back to the Windows environment:
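A sketch that refreshes the out folder and then copies the whole structure back, as in Part 9:

    cp crl/root.crl $dir_out/
    cp -rf $dir_root/* $dir_host_transfer/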

Remember to  rm -r /ca  to remove the CA from WSL after you’ve done this. Copy the updated CRL file out to the web location and place the CA files back into secured storage.

Revoking the Subordinate CA’s Certificate

If your subordinate CA becomes compromised, you’ll need to revoke it. That’s easily done. These instructions assume you followed the previous portion about rebuilding the CA structure and setting up the environment variables.
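A sketch, assuming the subordinate CA’s certificate still sits at newcerts/01.pem; OpenSSL will prompt for the root key’s pass phrase:

    cd $dir_root
    openssl ca -config openssl.cnf -revoke newcerts/01.pem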

Copy the index.txt file back to Windows; it contains the CA’s database and will now have a record of the revoked subordinate CA certificate.

Perform all the steps in the “Updating the Root CA’s CRL” section above. Remove the authority from Active Directory.

Renewing the Subordinate CA’s Certificate

“Renewing” a certificate is just a phrase we use. In truth, you just set up another subordinate CA, usually with the same name. If you want, you can issue another CSR from the same private/public key pair and have your CA sign it. I would not recommend reusing the same private key, though. CAs traditionally have long validity periods, and any key can be cracked given enough time. Generating a new private key resets that clock.

In the Certification Authority tool, right-click your authority, go to All Tasks and select Renew CA Certificate.


Follow the wizard to generate a new CSR. In the WSL portion above, copy in the new CSR file as shown in Part 6, then proceed from Part 6 through to the end. Wrap up by starting at step 4 of the “Complete Configuration of the Subordinate CA” section above.

Further Reading

Microsoft’s base documentation for Certificate Services: https://docs.microsoft.com/en-us/previous-versions/windows/it-pro/windows-server-2012-R2-and-2012/hh831740(v%3dws.11)

Information on the newer web services: https://social.technet.microsoft.com/wiki/contents/articles/7734.certificate-enrollment-web-services-in-active-directory-certificate-services.aspx


Want to ask a question that relates directly to your situation? Head on over to the Altaro Dojo Forums and start a discussion. I’m actively part of that community and answering questions on a daily basis.

Go to Original Article
Author: Eric Siron