Schlumberger, Chevron and Microsoft announce collaboration to accelerate digital transformation

Global organizations will work together to accelerate development of cloud-native solutions and deliver actionable data insights for the industry

MONACO, September 17, 2019 — Tuesday at the SIS Global Forum 2019, Schlumberger, Chevron and Microsoft announced the industry’s first three-party collaboration to accelerate creation of innovative petrotechnical and digital technologies.

Data is quickly emerging as one of the most valuable assets for any company, yet extracting insights from it is often difficult as information gets trapped in internal silos. As part of the collaboration, the three companies will work together to build Azure-native applications in the DELFI* cognitive E&P environment, initially for Chevron, which will enable companies to process, visualize, interpret and ultimately obtain meaningful insights from multiple data sources.

DELFI* is a secure, scalable and open cloud-based environment providing seamless E&P software technology across exploration, development, production and midstream. Chevron and Schlumberger will combine their expertise and resources to accelerate the deployment of DELFI solutions in Azure, with support and guidance from Microsoft. The parties will ensure the software developments meet the latest standards for security, performance and release management, and are compatible with the Open Subsurface Data Universe (OSDU) Data Platform. Building on this open foundation will amplify the capabilities of Chevron’s petrotechnical experts.

The collaboration will be completed in three phases starting with the deployment of the Petrotechnical Suite in the DELFI environment, followed by the development of cloud-native applications on Azure, and the co-innovation of a suite of cognitive computing native capabilities across the E&P value chain tailored to Chevron’s objectives.

Olivier Le Peuch, chief executive officer, Schlumberger, said, “Combining the expertise of these three global enterprises creates vastly improved and digitally enabled petrotechnical workflows. Never before has our industry seen a collaboration of this kind, and of this scale. Working together will accelerate faster innovation with better results, marking the beginning of a new era in our industry that will enable us to elevate performance across our industry’s value chain.”

“There is an enormous opportunity to bring the latest cloud and AI technology to the energy sector and accelerate the industry’s digital transformation,” said Satya Nadella, CEO of Microsoft. “Our partnership with Schlumberger and Chevron delivers on this promise, applying the power of Azure to unlock new AI-driven insights that will help address some of the industry’s—the world’s—most important energy challenges, including sustainability.”

Joseph C. Geagea, executive vice president, technology, projects and services, Chevron, said, “We believe this industry-first advancement will dramatically accelerate the speed with which we can analyze data to generate new exploration opportunities and bring prospects to development more quickly and with more certainty. It will pull vast quantities of information into a single source amplifying our use of artificial intelligence and high-performance computing built on an open data ecosystem.”

About Schlumberger

Schlumberger is the world’s leading provider of technology for reservoir characterization, drilling, production, and processing to the oil and gas industry. With product sales and services in more than 120 countries and employing approximately 100,000 people who represent over 140 nationalities, Schlumberger supplies the industry’s most comprehensive range of products and services, from exploration through production, and integrated pore-to-pipeline solutions that optimize hydrocarbon recovery to deliver reservoir performance.

Schlumberger Limited has executive offices in Paris, Houston, London, and The Hague, and reported revenues of $32.82 billion in 2018. For more information, visit the Schlumberger website.

About Chevron

Chevron Corporation is one of the world’s leading integrated energy companies. Through its subsidiaries that conduct business worldwide, the company is involved in virtually every facet of the energy industry. Chevron explores for, produces and transports crude oil and natural gas; refines, markets and distributes transportation fuels and lubricants; manufactures and sells petrochemicals and additives; generates power; and develops and deploys technologies that enhance business value in every aspect of the company’s operations. Chevron is based in San Ramon, Calif. More information about Chevron is available at www.chevron.com.

About Microsoft

Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

###

*Mark of Schlumberger

For further information, contact:

Moira Duff
Corporate Communication Manager – Western Hemisphere
Schlumberger
Tel: +1 281 285 4376
[email protected]

Sean Comey
Sr. Advisor, External Affairs
Chevron
Tel: +1 925 842 5509
[email protected]

Microsoft Media Relations
WE Communications for Microsoft
(425) 638-7777
[email protected]

Transition to value-based care requires planning, communication

Transitioning to value-based care can be a tough road for healthcare organizations, but creating a plan and focusing on communication with stakeholders can help drive the change.

Value-based care is a model that rewards the quality rather than the quantity of care given to patients. The model is a significant shift from how healthcare organizations have functioned, placing value on the results of care delivery rather than the number of tests and procedures performed. As such, it demands that healthcare CIOs be thoughtful and deliberate about how they approach the change, experts said during a recent webinar hosted by Definitive Healthcare.

Andrew Cousin, senior director of strategy at Mayo Clinic Laboratories, and Aaron Miri, CIO at the University of Texas at Austin Dell Medical School and UT Health Austin, talked about their strategies for transitioning to value-based care and focusing on patient outcomes.

Cousin said preparedness is crucial, as organizations can jump into a value-based care model, which relies heavily on analytics, without the institutional readiness needed to succeed.  

“Having that process in place and over-communicating with those who are going to be impacted by changes to workflow are some of the parts that are absolutely necessary to succeed in this space,” he said.

Mayo Clinic Labs’ steps to value-based care

Cousin said his primary focus as a director of strategy has been on delivering better care at a lower cost through the lens of laboratory medicine at Mayo Clinic Laboratories, which provides laboratory testing services to clinicians.

Andrew Cousin, senior director of strategy, Mayo Clinic Laboratories

That lens includes thinking in terms of a mathematical equation: price per test multiplied by the number of tests ordered equals total spend for that activity. Today, much of a laboratory’s relationship with healthcare insurers is measured by the price per test ordered. Yet data shows that 20% to 30% of laboratory testing is ordered incorrectly, which inflates the number of tests ordered as well as the cost to the organization, and little is being done to address the issue, according to Cousin.

That was one of the reasons Mayo Clinic Laboratories decided to focus its value-based care efforts on reducing incorrect test ordering.

To mitigate the errors, Cousin said the lab created 2,000 evidence-based ordering rules, which will be integrated into a clinician’s workflow. There are more than 8,000 orderable tests, and the rules provide clinicians guidance at the start of the ordering process, Cousin said. The laboratory has also developed new datasets that “benchmark and quantify” the organization’s efforts.  

To date, Cousin said the lab has implemented about 250 of the 2,000 rules across the health system, and has identified about $5 million in potential savings.

Cousin said the lab crafted a five-point plan to begin the transition. The plan was based on its experience in adopting a value-based care model in other areas of the lab. The first three steps center on what Cousin called institutional readiness, or ensuring staff and clinicians have the training needed to execute the new model.

The plan’s first step is to assess the “competencies and gaps” of care delivery within the organization, benchmarking where the organization is today and where gaps in care could be closed, he said.

The second step is to communicate with stakeholders to explain what’s going to happen and why, what criteria they’ll be measured on and how, and how the disruption to their workflow will result in improving practice and financial reimbursement.

The third step is to provide education and guidance. “That’s us laying out the plans, training the team for the changes that are going to come about through the infusion of new algorithms and rules into their workflow, into the technology and into the way we’re going to measure that activity,” he said.

Cousin said it’s critical to accomplish the first three steps before moving on to the fourth step: launching a value-based care analytics program. For Mayo Clinic Laboratories, analytics are used to measure changes in laboratory test ordering and assess changes in the elimination of wasteful and unnecessary testing.

The fifth and final step focuses on alternative payments and collaboration with healthcare insurers, which Cousin described as one of the biggest challenges in value-based care. The new model requires a new kind of language that the payers may not yet speak.

Mayo Clinic Laboratories has attempted to address this challenge by taking its data and making it as understandable to payers as possible, essentially translating clinical data into claims data.     

Cousin gave the example of showing payers how much money was saved by intervening in the over-ordering of tests. Presenting data as cost savings can be more valuable than documenting how many laboratory test orders were eliminated, he said.

How a healthcare CIO approaches value-based care

UT Health Austin’s Miri approaches value-based care from both the academic and the clinical side. UT Health Austin functions as the clinical side of Dell Medical School.

Aaron Miri, CIO at the University of Texas at Austin Dell Medical School and UT Health Austin

The transition to value-based care in the clinical setting started with a couple of elements. Miri said, first and foremost, healthcare CIOs will need buy-in at the top. They also will need to start simple. At UT Health Austin, simple meant introducing a new patient-reported outcomes program, which aims to collect data from patients about their personal health views.

UT Health Austin has partnered with Austin-based Ascension Healthcare to collect patient-reported outcomes as well as social determinants of health, or a patient’s lifestyle data. Both patient-reported outcomes and social determinants of health “make up the pillars of value-based care,” Miri said.

The effort is already showing results, such as a 21% improvement in the hip disability and osteoarthritis outcome score and a 29% improvement in the knee injury and osteoarthritis outcome score. Miri said the organization is seeing improvement because the organization is being more proactive about patient outcomes both before and after discharge.  

For the program to work, Miri and his team need to make the right data available for seamless care coordination. That means making sure proper data use agreements are established between all UT campuses, as well as with other health systems in Austin.

Value-based care data enables UT Health Austin to “produce those outcomes in a ready way and demonstrate that back to the payers and the patients that they’re actually getting better,” he said.

In the academic setting at Dell Medical School, Miri said the next generations of providers are being prepared for a value-based care world.

“We offer a dual master’s track academically … to teach and integrate value-based care principles into the medical school curriculum,” Miri said. “So we are graduating students — future physicians, future surgeons, future clinicians — with value-based at the core of their basic medical school preparatory work.”

Data integration problems a hurdle companies must overcome

As organizations try to analyze the vast amounts of information they’ve collected, they need to overcome data integration problems before they can extract meaningful insights.

Decades of data exist for enterprises that have stood the test of time, and it’s often housed in different locales and spread across disparate systems.

The scope of the business intelligence that enterprises glean from it in that haphazard form is limited. Attempting to standardize the data, meanwhile, can be overwhelming.

Enter vendors that specialize in solving data integration issues, whose service is helping other companies curate the vast amounts of information they possess and put it in a place — and in a format — where it can be accessed and used to produce meaningful BI.

Cloud data integration provider Talend is one such vendor, along with others such as Informatica and MuleSoft, which was recently acquired by Salesforce.

In the second part of a two-part Q&A, Talend CEO Mike Tuchen discusses the different data integration problems large enterprises face compared with their small and midsize brethren, as well as Talend’s strategy in helping companies address their sudden abundance of data.

In part one, Tuchen talks about the massive challenges that have developed over the last 10 to 15 years as organizations have begun to digitize and pool their data.

Are there different data integration problems a small- to medium-sized business might face compared to a large organization in terms of extracting data from a vast pool of information it has collected over the years?

Mike Tuchen, CEO of Talend

Mike Tuchen: For a small or medium-sized company, for the most part they know where their systems are. There’s a much more human-understandable set of sources where you’re going to get your data from, so for the most part cataloging for them isn’t required upfront. It’s something you can choose to do later and optionally. They can say, ‘I’m going to pull data from Salesforce and NetSuite, and HubSpot, and Zendesk for support.’ They can pull data from all those systems, make sure they have a consistent definition of who’s a customer and what they’re doing, and then can start analyzing what the most effective campaigns are, who the most likely customers to convert are, who the most likely customers to retain or upsell are, or whatever they’re trying to do with the core analytics. Since you have a small number of systems — a small number of sources — you can go directly there and it turns more into a ‘let’s drive the integration process, let’s drive the cleaning process’ and the initial cleaning process is a simpler problem.

So in essence, even though they may not have the financial wherewithal to invest in a team of data scientists, is the process of solving data integration issues actually easier for them?

Tuchen: For sure. Size creates complexity. It creates an opportunity as well, but the bigger you get, the more sources you have. Think about one end of the spectrum: you’ve got a large multinational company that has a whole bunch of different divisions spread out across the world, some of them brought in through acquisitions. Think about the plethora of different sources you have. We’re working with a customer that has a dozen different ERP systems that they’re now trying to bring data together from, and that’s just one type of data — transactional data around financial transactions. Think about that kind of complexity versus a small company.

What is the core service Talend provides?

Tuchen: Talend is a data integration company, and our core approach is to help companies collect, govern, transform and share their data. What we’re seeing is that data, more and more, is becoming a critical strategic asset. We’re seeing, worldwide, that as companies are more and more digitized they’re seeing that data managed correctly is a competitive advantage, and at the heart of every single industry is a strategic data battle that if you solve that well there’s an advantage and you’ll be out executing your competitors. With that recognition, the importance of the problem that we’re solving is going up in our customers’ minds, and that creates an opportunity for us.

How does what Talend does help customers overcome data integration problems?

Tuchen: We have a cloud-based offering called Talend Data Fabric that includes a number of different components, including a lot of the different capabilities we talked about. There’s a data catalog that solves that discovery process and the data definition issue, making sure that we have a consistent definition, lineage of where does data start and where does it end, what happens to it along the way so you can understand impact analysis, and so on. That’s one part of our offering. And we have an [application programming interface] offering that allows you to share that with customers or partners or suppliers.

As you look at where data integration and mining are headed, what is Talend’s roadmap for the next one to three years?

Tuchen: Right now we’re doubling and tripling down on the cloud. Our cloud business is exploding. It’s growing well over 100% a year. What we’re seeing is the entire IT landscape is moving to the cloud. In particular in the data analytics, data warehouses, just over the last couple of years we’ve reached the tipping point. Now we’re at the point where cloud data warehouses are significantly better than anything you can get on premises — they’re higher performance, more flexible, more scalable, you can plug in machine learning, you can plug in real-time flows to them, there’s no upfront commitment, they’re always up to date. It’s now at the point where the benefits are so dramatic that every company in the world has either moved or is planning to move and do most of their analytical processing in the cloud. That creates an enormous opportunity for us, and one that we’re maniacally focused on. We’re putting an enormous amount of effort into maintaining and extending our leadership in cloud-based data integration and governance.

Editor’s note: This interview has been edited for clarity and conciseness.

SolarWinds Discovery offers low-cost way to manage IT assets

IT asset management software vendor SolarWinds launched a software service aimed at IT organizations that seek a low-cost way to keep tabs on IT assets and improve their service delivery.

The software, called SolarWinds Discovery, is a SaaS tool designed to help IT teams locate, map and manage their software and hardware assets. It combines agent and agentless technology to provide a view into critical assets and give insights to the IT pros who manage and monitor those assets.

“IT service delivery requires managing the lifecycle of the technology that enables customers to meet their needs,” said Gartner analyst Roger Williams. “Many organizations, however, do not have visibility into everything on their network due to poor controls and an inability to keep up with the pace of change in their environment.”

The agentless SolarWinds Discovery Scanner locates and collects information on IP-connected devices, such as servers, routers, switches, firewalls, storage arrays, VMware hosts, VMs and printers, according to the company.

The SolarWinds Discovery agent can collect more than 200 data points from Windows and Apple computers and servers, as well as iOS and Android mobile devices. The software integrates with Microsoft System Center Configuration Manager, VMware vCenter and Chrome OS.

The new service integrates with SolarWinds Service Desk, enabling enterprises to focus on risks affecting IT services, as well as to comply with software licensing contracts. The tool can also import data from key configuration management sources, enabling organizations to regularly update all their asset data and make it available within SolarWinds Service Desk, the company said.

The product was launched on August 21 and is available only as part of SolarWinds Service Desk. Pricing is per agent, per month, and billed annually. A free one-month trial is available on the company’s website.

Williams said IT asset management is part of Gartner’s Software Asset Management, IT Asset Management and IT Financial Management category that saw $1.24 billion in revenue in 2018, a 23.4% increase over 2017.

He said vendors in this market are challenged to offer products with features that stand apart from what is already available, considering there are more than 100 competitors. He said SolarWinds has the largest presence of any vendor in the related network performance monitoring and diagnostics market and is used as a discovery source by many organizations.

Competitors include ManageEngine, BMC Software and IBM, to cite a few examples. ManageEngine ServiceDesk combines asset management and help desk functionalities in one platform. BMC Helix offers digital and cognitive automation technologies intended to provide efficient service management across any environment. IBM Maximo offers a tool to analyze IoT data from people, sensors and devices to gain asset visibility.

Social determinants of health data provide better care

Social determinants of health data can help healthcare organizations deliver better patient care, but the challenge of knowing exactly how to use the data persists.

The healthcare community has long recognized the importance of a patient’s social and economic data, said Josh Schoeller, senior vice president and general manager of LexisNexis Health Care at LexisNexis Risk Solutions. The current shift to value-based care models, which reward the quality rather than the quantity of care, has put a spotlight on this kind of data, according to Schoeller.

But social determinants of health also pose a challenge to healthcare organizations. Figuring out how to use the data in meaningful ways can be daunting, as healthcare organizations are already overwhelmed by loads of data.

A new framework, released last month by the not-for-profit eHealth Initiative Foundation, could help. The framework was developed by stakeholders, including LexisNexis Health Care, to give healthcare organizations guidance on how to use social determinants of health data ethically and securely.

Here’s a closer look at the framework.

Use cases for social determinants of health data

The push to include social determinants of health data into the care process is “imperative,” according to eHealth Initiative’s framework. Doing so can uncover potential risk factors, as well as gaps in care.

The eHealth Initiative’s framework outlines five guiding principles for using social determinants of health data. 

  1. Coordinating care

Determine if a patient has access to transportation or is food insecure, according to the document. The data can also help a healthcare organization coordinate with community health workers and other organizations to craft individualized care plans.

  2. Using analytics to uncover health and wellness risks

Use social determinants of health data to predict a patient’s future health outcomes. Analyzing social and economic data can help the provider know if an individual is at an increased risk of having a negative health outcome, such as hospital re-admittance. The risk score can be used to coordinate a plan of action.

  3. Mapping community resources and identifying gaps

Use social determinants of health data to determine what local community resources exist to serve the patient populations, as well as what resources are lacking.

  4. Assessing service and impact

Monitor care plans or other actions taken using social determinants of health data and how it correlates to health outcomes. Tracking results can help an organization adjust interventions, if necessary.

  5. Customizing health services and interventions

Inform patients about how social determinants of health data are being used. Healthcare organizations can educate patients on available resources and agree on next steps to take.

Getting started: A how-to for healthcare organizations

The eHealth Initiative is not alone in its attempt to move the social determinants of health data needle.

Niki Buchanan, general manager of population health at Philips Healthcare, has some advice of her own.

  1. Lean on the community health assessment

Buchanan said most healthcare organizations conduct a community health assessment internally, which provides data such as demographics and transportation needs, and identifies at-risk patients. Having that data available and knowing whether patients are willing or able to take advantage of community resources outside of the doctor’s office is critical, she said.

  2. Connect the community resource dots

Buchanan said a healthcare organization should be aware of what community resources are available to them, whether it’s a community driving service or a local church outreach program. The organization should also assess at what level it is willing to partner with outside resources to care for patients.

“Are you willing to partner with the Ubers of the world, the Lyfts of the world, to pick up patients proactively and make sure they make it to their appointment on time and get them home,” she said. “Are you able to work within the local chamber of commerce to make sure that any time there’s a food market or a fresh produce kind of event within the community, can you make sure the patients you serve have access?”

  3. Start simple

Buchanan said healthcare organizations should approach social determinants of health data with the patient in mind. She recommended healthcare organizations start small with focused groups of patients, such as diabetics or those with other chronic conditions, but that they also ensure the investment is a worthwhile one.

“Look for things that meet not only your own internal ROI in caring for your patients, but that also add value and patient engagement opportunities to those you’re trying to serve in a more proactive way,” she said.

Microsoft Office 365 now available from new South Africa cloud datacenters

As Microsoft strives to support the digital transformation of organizations and enterprises around the world, we continue to drive innovation and expand into new geographies to empower more customers with Office 365, the world’s leading cloud-based productivity solution, with more than 180 million commercial monthly active users. Today, we’re taking another step in our ongoing investment to help enable digital transformation and societal impact across Africa with the general availability of Office 365 services from our new cloud datacenters in South Africa.

Office 365, delivered from local datacenters in South Africa, helps our customers enable the modern workplace and empower their employees with real-time collaboration and cloud-powered intelligence while maintaining security, compliance, and in-country customer data residency. The addition of South Africa as a new geography for Office 365 brings the options for secure cloud productivity services combined with customer data residency to 16 geographies across the globe, with three additional geographies also announced.

In-country data residency for core customer data helps Office 365 customers meet regulatory requirements, which is particularly important and relevant in industries such as healthcare, financial services, and government—where organizations need to keep specific data in-country to comply with local requirements. Customer data residency provides additional assurances regarding data privacy and reliability for organizations and enterprises. Core customer data is stored only in their datacenter geography (Geo)—in this case, the cloud datacenters within South Africa.

Customers like Altron and the Gauteng Provincial Government have used Office 365 to transform their workplaces. This latest development will enable them—and other organizations and enterprises adopting Office 365—to ramp up their digital transformation journey.

“Altron is committed to improving our infrastructure and embracing a strategy to become a cloud-first company to better serve our customers and empower our employees through modern collaboration. We’ve noticed a tangible difference since making the move to Office 365.”
—Debra Marais, Lead, IT Shared Services at Altron

“Office 365 is driving our modernization journey of Government ICT infrastructure and services by allowing us to develop pioneering solutions at manageable costs and create overall improvements in operations management, all while improving transparency and accountability.”
—David Kramer, Deputy Director General, ICT at Gauteng Provincial Government

Microsoft recently became the first global provider to deliver cloud services from the African continent with the opening of our new cloud datacenter regions. Office 365 joins Azure to expand the intelligent cloud service available from Africa. Dynamics 365 and Power Platform, the next generation of intelligent business applications, are anticipated to be available in the fourth quarter of 2019.

By delivering the comprehensive Microsoft cloud—which includes Azure, Office 365, and Dynamics 365—from datacenters in a given geography, we offer scalable, available, and resilient cloud services to companies and organizations while meeting customer data residency, security, and compliance needs. We have deep expertise in protecting data and empowering customers around the globe to meet extensive security and privacy requirements, including offering the broadest set of compliance certifications and attestations in the industry.

The new cloud regions in South Africa are connected to Microsoft’s other regions via our global network, one of the largest and most innovative on the planet—spanning more than 100,000 miles (161,000 kilometers) of terrestrial fiber and subsea cable systems to deliver services to customers. Microsoft is bringing the global cloud closer to home for African organizations and citizens through our trans-Arabian paths between India and Europe, as well as our trans-Atlantic systems, including Marea, the highest capacity cable to ever cross the Atlantic.

We’re committed to accelerating digital transformation across the continent through numerous initiatives and also recently announced Microsoft’s first Africa Development Centre (ADC), with two initial sites in Nairobi, Kenya and Lagos, Nigeria. The ADC will serve as a premier center of engineering for Microsoft, where world-class African talent can create solutions for local and global impact. With our new cloud datacenter regions, the ADC, and programs like 4Afrika, we believe Africa is poised to develop locally and scale for global impact better than ever before.

Learn more about Office 365 and Microsoft in the Middle East and Africa.

Use PowerShell printer management for quicker setups

Even the most high-tech organizations still need printers for users to do their jobs.

While the electronic transition of business documents has greatly reduced the need to print most documents, networked printers and print servers remain important pieces of the infrastructure. For a Windows shop, the simplest approach is to use Windows Server as the foundation for printing services. Armed with PowerShell printer management know-how, administrators have an automation tool to configure and manage these print servers.

Install print services

The first step to set up a Windows print server is to add the feature to the server. We can use the Server Manager GUI, but it’s easily done with a PowerShell command:

Add-WindowsFeature -Name Print-Server

A PowerShell cmdlet adds the print feature to the Windows Server system to manage printing jobs.

A True value under the Success column indicates the feature installed and is ready for use without requiring a restart.

Add printer drivers

Starting with Windows 8 and Windows Server 2012, Microsoft added class printer drivers, also called v4 printer drivers, for various manufacturers. You can add these, as well as other v3 print drivers, to a print server with the Add-PrinterDriver cmdlet. Keep in mind that using the cmdlet requires the driver to be in the server’s DriverStore folder; you can use the PnPUtil command-line tool to add any missing drivers.

The following command installs all drivers in INF files from a folder on a local drive for an HP LaserJet M607 M608 M609 PCL 6:

pnputil.exe -i -a C:\HP_LJM607-M608-M609\HP_LJM607-M608-M609_V3\*.inf
Microsoft PnP Utility

Processing inf : hpat6b2a_x64.inf
Successfully installed the driver on a device on the system.
Driver package added successfully.
Published name : oem10.inf

Processing inf : hpat6b2a_x86.inf
Successfully installed the driver on a device on the system.
Driver package added successfully.
Published name : oem12.inf

Total attempted: 2
Number successfully imported: 2

After a successful attempt, you can add the drivers to the print server with Add-PrinterDriver by specifying the name of the driver:

Add-PrinterDriver -Name 'HP LaserJet M607 M608 M609 PCL 6'

The command does not generate any output to let you know if it worked, but the Get-PrinterDriver cmdlet will let you verify the driver’s availability:

Get-PrinterDriver -Name 'HP LaserJet M607 M608 M609 PCL 6'

Add a port for printers

Most organizations use networked printers on a Windows print server by adding a port that connects to the printer’s IP address. The following command adds a standard TCP/IP port for the IP address 172.16.26.8 with port 9100, which uses the RAW printing protocol:

Add-PrinterPort -Name '172.16.26.8' -PrinterHostAddress '172.16.26.8' -PortNumber 9100

Add print queues

To combine all these commands, let’s add a new print queue to the print server with PowerShell.

To make this tutorial easier to read, the settings are placed in a hashtable named $Settings to use splatting with the Add-Printer cmdlet. This example limits the Add-Printer parameters to a comment, driver name, location, name, port name, print processor and a switch to share the printer:

$Settings = @{
    Comment        = 'HP m607'
    DriverName     = 'HP LaserJet M607 M608 M609 PCL 6'
    Location       = 'My Office'
    Name           = 'HP M607'
    PortName       = '172.16.26.8'
    PrintProcessor = 'winprint'
    Shared         = $True
}
Add-Printer @Settings -Verbose

Run the Get-Printer command with the name HP M607 to see the printer’s settings:

Get-Printer 'HP M607' | Format-List

Name : HP M607
ComputerName :
Type : Local
ShareName : HP M607
PortName : 172.16.26.8
DriverName : HP LaserJet M607 M608 M609 PCL 6
Location : My Office
Comment : HP m607
SeparatorPageFile :
PrintProcessor : winprint

The Get-Printer command generates a detailed listing of a printer's settings.

Perform bulk changes with PowerShell printer management

One of the advantages of PowerShell scripting is speed and efficiency. When you need to make multiple changes across your infrastructure, PowerShell will save you time with these types of tasks. For example, you can use PowerShell to change the driver for many printers at once. The command below takes any printer whose name starts with HP M and changes the print driver to the HP universal print driver.

Get-printer "HP M*" | Set-Printer -DriverName 'HP Universal Printing PCL 6'

Next, to make a printer port for every IP address in a subnet, in this case 172.16.26.0/24, start by creating an array for the last octet:

$Sub = 1..255

Use the $Sub variable and perform a loop with Add-PrinterPort, changing the last octet each time:

$Sub | ForEach-Object {Add-PrinterPort -Name "172.16.26.$_" -PrinterHostAddress "172.16.26.$_" -PortNumber 9100}
This PowerShell command sets up multiple printer ports using a loop to automate the process.

PowerShell gives you a simpler way to handle printer configuration via the command line or with a script that you can save and modify for the next round of printer setups. While it’s still valid to use the Windows Server GUI to work with printers, it’s not as efficient as PowerShell.
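To take that idea further, here is a minimal sketch of a reusable helper you could save and adapt for the next batch of printer setups. The function name Add-NetworkPrinter, the default port number and the sample values are illustrative choices, not part of any built-in module.

# Hypothetical helper that creates the TCP/IP port and the shared queue in one call;
# adjust the parameters and defaults to match your own drivers and naming conventions.
function Add-NetworkPrinter {
    param(
        [Parameter(Mandatory)][string]$Name,
        [Parameter(Mandatory)][string]$IPAddress,
        [Parameter(Mandatory)][string]$DriverName,
        [string]$Location = 'Unknown'
    )

    # Create the standard TCP/IP port only if it does not already exist
    if (-not (Get-PrinterPort -Name $IPAddress -ErrorAction SilentlyContinue)) {
        Add-PrinterPort -Name $IPAddress -PrinterHostAddress $IPAddress -PortNumber 9100
    }

    # Create and share the print queue on that port
    Add-Printer -Name $Name -DriverName $DriverName -PortName $IPAddress -Location $Location -Shared
}

# Example use with the printer configured earlier in this article
Add-NetworkPrinter -Name 'HP M607' -IPAddress '172.16.26.8' -DriverName 'HP LaserJet M607 M608 M609 PCL 6' -Location 'My Office'

Wrapping the logic in a function also makes it easy to feed a CSV of printer names and addresses through ForEach-Object for bulk rollouts.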

How to plot out an Office 365 tenant-to-tenant migration

With the number of users and organizations on Office 365, circumstances will inevitably require some of them to move between tenants.

An Office 365 tenant-to-tenant migration can occur for several reasons, such as after a merger or acquisition, or when part of a company gets sold. These business events come with complicated legal maneuvers and rigid timelines. Most of these situations require completing tenant-to-tenant migrations on a schedule made by lawyers and executives with little to no regard for the time it takes to move the associated data. It’s up to the technical team to work out how to complete the migration and meet the deadline.

Whatever the reason, successfully executing a tenant-to-tenant migration within Office 365 is a complex process with some significant limitations. Let’s walk through the process to clarify what’s involved with this type of data migration process.

What data can migrate?

The biggest challenge with an Office 365 tenant-to-tenant migration is knowing what data you can and cannot move. Microsoft develops new APIs continuously to give access to different types of data stored within Office 365, but as of publication, the accessibility is still somewhat limited.

To further complicate the issue, there is no Microsoft tool to move data between tenants. While there are many third-party applications to help migrate different types of Office 365 data between tenants, they all have different capabilities. Finding the right product for your Office 365 tenant-to-tenant migration requires some research to find the tool — or tools — that move the data your organization needs.

Out of the entire Office 365 suite, Exchange Online, SharePoint Online and OneDrive for Business are the applications most commonly affected by this tenant migration effort. While there are multiple tools available to move these data types, this article can’t pick the right one for you.

The more popular and capable migration tools come from BitTitan, Quest, ShareGate and AvePoint, and they add new features and updated functionality regularly. My recommendation is to look at these tools and other similar ones to decide which one is best for your circumstances.

Different applications come with different complications

Not every application handles data the same, which further muddles the tenant migration process.

Planning a move between Office 365 tenants

For example, Skype for Business is an important communication tool for many organizations, but much of the data it uses is not stored with the application. Meeting data resides in Exchange, and buddy lists live with the Skype for Business client. Most of the time, Skype for Business is not migrated between tenants but reconfigured at the destination tenant.

Teams is another application that presents a migration challenge. As of publication, Microsoft has not finished the APIs that will let third-party tools handle these tenant migrations. At the moment, you can move the SharePoint document libraries attached to Teams or the Exchange mailboxes that hold Teams calendars, but it’s not yet possible to migrate Teams entirely.

Preparing for your migration

As you inventory all the Office 365 applications that hold user data that you want to migrate, the list might overwhelm you. If you need to move mailboxes and data from multiple web applications such as Yammer, Bookings, Planner, Delve, Flow, Forms, Power BI, StaffHub, Stream, Sway and To-Do, then that’s going to take some work to get a handle on what is involved with each service.  

To further complicate this effort, you’ll find most of the Office 365 applications don’t support any migration. If you need data from a particular application, all you can do is recreate the data in the destination tenant the same way you did in the original tenant. Identifying what data you can, and cannot, move between Office 365 tenants is an important step, but it’s just the beginning of your migration process.
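PowerShell can help with that identification step. The commands below are a rough sketch, assuming the ExchangeOnlineManagement and SharePoint Online management modules are installed, you have admin rights in the source tenant, and sourcetenant is a placeholder for your real tenant name; they export mailbox and site collection lists you can use to scope the migration.

# Connect to the source tenant; the tenant name below is a placeholder
Connect-ExchangeOnline -UserPrincipalName admin@sourcetenant.onmicrosoft.com
Connect-SPOService -Url https://sourcetenant-admin.sharepoint.com

# Export mailbox sizes to scope the Exchange Online portion of the migration
Get-Mailbox -ResultSize Unlimited |
    Get-MailboxStatistics |
    Select-Object DisplayName, TotalItemSize, ItemCount |
    Export-Csv -Path .\SourceMailboxes.csv -NoTypeInformation

# Export SharePoint Online and OneDrive for Business site collections
Get-SPOSite -Limit All -IncludePersonalSite $true |
    Select-Object Url, Owner, StorageUsageCurrent, Template |
    Export-Csv -Path .\SourceSites.csv -NoTypeInformation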

Another issue that can complicate the tenant migration timeline is bandwidth throttling. Microsoft controls the rate of speed when transferring data on Office 365 to ensure the health of the overall platform. This limitation can be very troublesome when you need to move a large amount of data in a short amount of time.

You can try to adjust the throttling on your Office 365 tenant if you open a support case with Microsoft, but there are no clear rules on how much, if at all, the throttling limits can be relaxed. In my experience, Microsoft will do everything possible to help you reach your deadline if you explain your situation.

Tenant names and vanity domains can be another difficult part of an Office 365 tenant-to-tenant migration. All Office 365 tenant names need to be unique. This can make it difficult to find a tenant name that matches the name of your organization. Furthermore, a vanity domain — @company.com — can only be associated with a single Office 365 tenant. Exchange migration tools need to connect to the source mailbox and the destination mailbox by name, so most tools recommend making that connection using the @tenant.onmicrosoft.com address for each mailbox.
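A small sketch of that kind of mapping is shown below, with sourcetenant and destinationtenant standing in for the real tenant names; it builds a CSV that pairs each mailbox’s onmicrosoft.com address in the source tenant with the expected address in the destination tenant. Most migration tools accept a mapping file along these lines, though the exact column names they expect vary by product.

# Pair each source mailbox with its expected destination address for the migration tool;
# sourcetenant and destinationtenant are placeholders for the real tenant names.
Get-Mailbox -ResultSize Unlimited | ForEach-Object {
    $onMicrosoft = ($_.EmailAddresses |
        Where-Object { $_ -like 'smtp:*@sourcetenant.onmicrosoft.com' } |
        Select-Object -First 1) -replace '^smtp:', ''

    [pscustomobject]@{
        DisplayName        = $_.DisplayName
        SourceAddress      = $onMicrosoft
        DestinationAddress = $onMicrosoft -replace 'sourcetenant', 'destinationtenant'
    }
} | Export-Csv -Path .\MailboxMapping.csv -NoTypeInformation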

The best advice I can give is to make sure you understand the tools you choose to move data between Office 365 tenants. They all have recommended configurations that will make this work go much more smoothly and bring you much better results.

Mature DevSecOps orgs refine developer security skills training

BOSTON — IT organizations that plan to tackle developer security skills as part of a DevSecOps shift have started to introduce tools and techniques that can help.

Many organizations have moved past early DevSecOps phases such as a ‘seat at the table’ for security experts during application design meetings and locked-down CI/CD and container environments. At DevSecCon 2018 here this week, IT pros revealed they’ve begun in earnest to ‘shift security left’ and teach developers how to write more secure application code from the beginning.

“We’ve been successful with what I’d call SecOps, and now we’re working on DevSec,” said Marnie Wilking, global CISO at Orion Health, a healthcare software company based in Boston, during a Q&A after her DevSecCon presentation. “We’ve just hired an application security expert, and we’re working toward overall information assurance by design.”

Security champions and fast feedback shift developer mindset

Orion Health’s plan to bring an application security expert, or security champion, into its DevOps team reflects a model followed by IT security software companies, such as CA Veracode. The goal of security champions is to bridge the gap and liaise between IT security and developer teams, so that groups spend less time in negotiations.

“The security champions model is similar to having an SRE team for ops, where application security experts play a consultative role for both the security and the application development team,” said Chris Wysopal, CTO at CA Veracode in Burlington, Mass., in a presentation. “They can determine when new application backlog items need threat modeling or secure code review from the security team.”

However, no mature DevSecOps process allows time for consultation before every change to application code. Developers must hone their security skills so they can reduce vulnerable code without input from security experts and maintain app delivery velocity.

The good news is that developer security skills often emerge organically in CI/CD environments, provided IT ops and security pros build vulnerability checks into DevOps pipelines in the early phases of DevSecOps.

Marnie Wilking, global CISO at Orion Health, presents at DevSecCon.

“If you’re seeing builds fail day after day [because of security flaws], and it stops you from doing what you want to get done, you’re going to stop [writing insecure code],” said Julie Chickillo, VP of information security, risk and compliance at Beeline, a company headquartered in Jacksonville, Fla., that sells workforce management and vendor management software.

Beeline built security checks into its CI/CD pipeline that use SonarQube, which blocks application builds if it finds major, critical or limiting application security vulnerabilities in the code, and immediately sends that feedback to developers. Beeline also uses interactive code scanning tools from Contrast Security as part of its DevOps application delivery process.
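The underlying pattern is straightforward to reproduce in most pipelines. The step below is a minimal sketch, not Beeline’s actual configuration: it polls SonarQube’s quality gate web API and fails the build when the gate is not passing. The server URL, project key and SONAR_TOKEN variable are assumptions to replace with your own values.

# Fail the pipeline if the SonarQube quality gate for this project is not passing.
# The server URL, project key and SONAR_TOKEN environment variable are placeholders.
$token   = $env:SONAR_TOKEN
$auth    = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("$($token):"))
$headers = @{ Authorization = "Basic $auth" }

$result = Invoke-RestMethod -Headers $headers -Uri 'https://sonar.example.com/api/qualitygates/project_status?projectKey=my-app'

if ($result.projectStatus.status -ne 'OK') {
    Write-Error "Quality gate status is $($result.projectStatus.status); failing the build."
    exit 1
}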

“It’s all about giving developers constant feedback, and putting information in their hands that helps them make better decisions,” Chickillo said.

Developer security training tools emerge

Application code scans and continuous integration tests only go so far to make applications secure by design. DevSecOps organizations will also use updated tools to further developer security skills training.

“Sooner or later, companies put security scanning tools in place, then realize they’re not enough, because people don’t understand the output of those tools,” said Mark Felegyhazi, CEO of Avatao.com Innovative Learning Ltd, a startup in Hungary that sells developer security skills training software. Avatao competitors in this emerging field include Secure Code Warrior, which offers gamelike interfaces that train developers in secure application design. Avatao also offers a hands-on gamification approach, but its tools also cover threat modeling, which Secure Code Warrior doesn’t address, Felegyhazi said.

Firms also will look to internal and external training resources to build developer security skills. Beeline has sent developers to off-site security training, and plans to set up a sandbox environment for developers to practice penetration testing on their own code, so they better understand the mindset of attackers and how to head them off, Chickillo said.

Higher education must take a similar hands-on approach to bridge the developer security skills gap for graduates as they enter the workforce, said Gabor Pek, CTO at Avatao, in a DevSecCon presentation about security in computer science curricula.

“Universities don’t have security champion programs,” Pek said. “Most of their instruction is designed for a large number of students in a one-size-fits-all format, with few practical, hands-on exercises.”

In addition to his work with Avatao, Pek helped create a bootcamp for student leaders of capture-the-flag teams that competed at the DEFCON conference in 2015. Capture-the-flag exercises offer a good template for the kinds of hands-on learning universities should embrace, he said, since they are accessible to beginners but also challenge experts.

Transforming IT infrastructure and operations to drive digital business

It’s time for organizations to modernize their IT infrastructure and operations to not just support, but to drive digital business, according to Gregory Murray, research director at Gartner.

But to complete that transformation, organizations need to first understand their desired future state, he added.

“The future state for the vast majority of organizations is going to be a blend of cloud, on prem and off prem,” Murray told the audience at the recent Gartner Catalyst conference. “What’s driving this is the opposing forces of speed and control.”

From 2016 to 2024, the percentage of new workloads that will be deployed through on-premises data centers is going to plummet from about 80% to less than 20%, Gartner predicts. During the same period, cloud adoption will explode — going from less than 10% to as much as 45% — with off-premises, colocation and managed hosting facilities also picking up more workloads.

IT infrastructure needs to provide capabilities across these platforms, and operations must tackle the management challenges that come with it, Murray said.

How to transform IT infrastructure and operations

Once organizations have defined their future state — and Murray urged organizations to start with developing a public cloud strategy to determine which applications will be in the cloud — they should begin modernizing their infrastructure, he told the audience at the Gartner Catalyst conference. 

“Programmatic control is the key to enabling automation and automation is, of course, critical to addressing the disparity between the speed that we can deliver and execute in cloud, and improving our speed of execution on prem,” he said. 

Organizations will also need developers with the skills to take advantage of it, he said. Another piece of the automation equation when modernizing the infrastructure to gain speed is standardization, he said.

“We need to standardize around those programmatic building blocks, either by using individual components of software-defined networking, software-defined compute and software-defined storage, or by using a hyper-converged system.”

Hyper-converged infrastructure reduces the complexity associated with establishing programmatic control and helps create a unified API for infrastructure, he said.

Organizations also need to consider how to uplevel their standardization, according to Murray. This is where containers come into play. The container is an atomic unit of deployment specific to an application, and it abstracts away many of the dependencies and complications that come with moving an application independently of its operating system, he explained.

“And if we can do that, now I have a construct that I can standardize around and deploy into cloud, into on prem, into off prem and give it straight to my developers and give them the ability to move quickly and deploy their applications,” he said.

Hybrid is the new normal

To embrace this hybrid environment, Murray said organizations should establish a fundamental substrate to unify these environments.

“The two pieces that are so fundamental that they precede any sort of hybrid integration is the concept of networks — specifically your WAN and WAN strategy across your providers — and identity,” Murray said. “If I don’t have fundamental identity constructs, governance will be impossible.”

Organizations looking to modernize their network for hybrid capabilities should look to SD-WAN, Murray said. This provides software-defined control that extends outside of the data center and allows a programmatic approach and automation around their WAN connectivity to help keep that hybrid environment working together, he explained.

But to get that framework of governance in place across this hybrid environment requires a layered approach, Murray said. “It’s a combination of establishing principles, publishing the policies and using programmatic controls to bring as much cloud governance as we can.”

Murray also hinted that embracing DevOps is the first step in “a series of cultural changes” that organizations are going to need to truly modernize IT infrastructure and operations. For those who aren’t operating at agile speed, operations still needs to get out of the business of managing tickets and delivering resources and get to a self-service environment where operations and IT are involved in brokering the services, he added.

There is also a need for a monitoring framework to gain visibility across the environment. Embracing AIOps — which uses big data, data analytics and machine learning — can help organizations become more predictive and more proactive with their operations, he added.