Coming soon: Microsoft System Center 2019! – Windows Server Blog

This blog post was authored by Vithalprasad Gaitonde, Principal PM Manager, System Center.

As customers grow their deployments in the public cloud and in on-premises data centers, management tools are evolving to meet their needs. The System Center suite continues to play an important role in managing the on-premises data center and evolving IT needs as public cloud adoption grows.

Today, I am excited to announce that Microsoft System Center 2019 will be generally available in March 2019. System Center 2019 enables deployment and management of Windows Server 2019 at a larger scale to meet your data center needs.

System Center 2019 has been in private preview with Windows Server Technical Adoption Program (TAP) customers since December 2018. A big thank-you to everyone who has given us feedback so far.

I would like to take a moment to give you an overview of the new release. System Center 2019 focuses on the following areas:

  • First-class tools to monitor and manage data centers
  • Support for managing the latest versions of Windows Server
  • Enable hybrid management and monitoring capabilities with Azure

System Center 2019 is our Long Term Servicing Channel (LTSC) release and provides the 5 years of standard support and 5 years of extended support that customers can rely on. Following the GA of System Center 2019, the suite will continue to accrue value through Update Rollup releases every six months over the 5-year mainstream support window.

System Center 2019 is designed to deliver value in the following areas:

Hybrid management and monitoring with Azure

As enterprise environments now span on-premises to the cloud, customers look to leverage the innovation in Azure services using their on-premises tools. To enable this, we have integrated System Center with a set of management services in Azure to augment the on-premises tools.

  • With Service Map integration in System Center Operations Manager (SCOM), you can automatically create distributed application diagrams in Operations Manager based on the dynamic dependency maps in Service Map.
  • With the Azure Management Pack, you can now view performance and alert metrics in SCOM, integrate with web application monitoring in Application Insights, and monitor more PaaS services, such as Azure Blob Storage and Azure Data Factory.
  • Virtual Machine Manager (VMM) 2019 enables simplified patching of VMs by integrating with Azure Update Management.

Dashboard for Azure resources in SCOM web console

Security

With security threats growing in number and sophistication, security continues to be a top priority for customers.

  • System Center products now support service logon and remove the dependency on interactive logon, in line with security best practices.
  • VMM 2019 now includes a new role, VM administrator, which provides just enough permissions for read-only visibility into the fabric of the data center, but prevents escalation of privilege to fabric administration.

Virtual machine administrator role in virtual machine manager

Software-defined data center

Hyperconverged infrastructure (HCI) is a significant trend in on-premises data centers today. Customers lower costs by using servers with high-performance local disks to run compute and storage workloads side by side.

  • With VMM 2019, you can manage and monitor HCI deployments more efficiently, from upgrading or patching Storage Spaces Direct clusters without downtime to monitoring the health of disks.
  • VMM 2019 storage optimization lets you optimize placement of VHDs across cluster shared volumes and prevents VM outages when storage runs full.

Storage Health in virtual machine manager

Modernizing operations and monitoring

Customers have come to rely on SCOM for its extensibility and the ecosystem of management packs to monitor Microsoft and third-party workloads.

  • With HTML5 dashboards and drill-down experiences in the SCOM web console, you can now use a simplified layout and extend the monitoring console with custom widgets and the SCOM REST API.
  • Email notifications have also been modernized, with support for HTML email in SCOM 2019.
  • SCOM 2019 brings a new experience for monitor-based alerts: operators can no longer simply close an alert while its underlying monitor is still unhealthy; the alert must be acted on.
  • SCOM enhances Linux monitoring by leveraging Fluentd and is now resilient to management server failover in Linux environments.
  • All the SCOM management packs will now support Windows Server 2019 roles and features.

System Center Operations Manager web console

Faster backups with Data Protection Manager 2019

Data Protection Manager (DPM) 2019 provides backups optimized for both time (faster backups) and space (less storage consumed).

  • DPM 2019 improves backup performance with a 75 percent increase in speed and enables monitoring of key backup parameters via Log Analytics.
  • DPM further supports backup of VMware VMs to tape. In addition to Windows Server 2019, DPM now provides backups for new workloads such as SharePoint 2019 and Exchange 2019.

Data Protection Manager alerts and reports using Log Analytics

Orchestrator 2019 and Service Manager 2019

Orchestrator 2019 supports PowerShell 4.0 and above, enabling you to run 64-bit cmdlets. Service Manager 2019 ships with an improved Active Directory (AD) connector that can now synchronize with a specific domain controller.

Changes to release cadence

Finally, we are making changes to the System Center release cadence to optimize the way we deliver new features. System Center has two release trains today: the Long Term Servicing Channel (LTSC) and the Semi-Annual Channel (SAC). There is also an Update Rollup (UR) release train.

Most of our customers use LTSC releases like System Center 2016 to run their data center infrastructure. LTSC provides five years of mainstream support and five years of extended support, with Update Rollups (URs) providing incremental fixes and updates. From talking to customers, we learned that LTSC works better for most System Center deployments, as the update cycles are longer and more stable.

Based on these learnings, we will focus our resources on innovation in LTSC releases and stop SAC releases. System Center 2019 will support upgrades from the two prior SAC releases, so customers running System Center 1801 or System Center 1807 will be able to upgrade to System Center 2019, just as System Center 2016 can be upgraded to System Center 2019.

System Center Configuration Manager (SCCM) is not impacted by the 2019 release change and will continue current branch release cadence of three times per year as noted in the documentation, “Support for Configuration Manager current branch versions.”

Call to action

In March, customers will have access to System Center 2019 through all the channels! We will publish a blog post to mark the availability of System Center 2019 soon. As always, we would love to hear what capabilities and enhancements you’d like to see in our future releases. Please share your suggestions, and vote on submitted ideas, through our UserVoice channels.

Frequently asked questions

Q: When will I be able to download System Center 2019?

A: System Center 2019 will be generally available in March 2019. We will update this blog when the build is available for download through the Volume Licensing Service Center (VLSC).

Q: Is there any change in pricing for System Center 2019?

A: No.

Q: Will there be a new Semi-Annual Channel release along with System Center 2019?

A: No. There will not be Semi-Annual Channel releases, but new features before the next Long-Term Servicing Channel (LTSC) release will be delivered through Update Rollups.

Go to Original Article
Author: Steve Clarke

Cryptography techniques must keep pace with threats, experts warn

Cryptography techniques work well to protect data in a hyper-connected world, but the battle to maintain the integrity of encrypted data and ensure cryptography is used wherever necessary remains a challenge for experts in the field.

One such challenge came from the Australian government, which passed a law in December requiring technology companies to work with law enforcement agencies to decrypt encrypted data when a crime may have been committed. Technology companies and security experts argue this type of measure undermines data privacy and security efforts.

“Secret backdoors are like pathogens, and governments have done a terrible job of managing them,” said Paul Kocher, a cryptographer and independent researcher, during “The Cryptographers’ Panel” keynote at RSA Conference (RSAC) this week. He pointed to the NotPetya ransomware debacle in 2017, where security exploits from the National Security Agency were weaponized.

Under Australia’s new law, developers can be imprisoned if they refuse to build backdoors in their products, or if they tell anyone they’ve done it — a strategy cryptographers like Kocher consider “100% backward.”

“If anyone should be going to prison, it is developers who sneak backdoors into their products without telling their managers and their customers that they’ve done it,” he said during the panel discussion.  

Cryptographer and security expert Whitfield Diffie shared his view that legislators should not have a role in personal privacy decisions, as technologies that chip away at privacy continue to materialize.

“I am very worried. At this moment, you still have a certain amount of privacy in your own thoughts,” Diffie told RSAC attendees. “Electronic brain interfaces may well come to the point where they can read your mind … If you look at things that are still your own, they are eroding very quickly.”

Personal data protection is increasingly important, as more value is extracted from the troves of data being collected from devices. Corporations and government organizations use personal data in AI applications, such as facial recognition, speech recognition and machine vision, and data sets inform processes from setting bail to credit risk scoring, said Shafi Goldwasser, director of the Simons Institute for the Theory of Computing at the University of California, Berkeley, during the keynote panel discussion.

“You don’t only send information, but you also process it. And this data should be kept private, because the power is in the data,” Goldwasser said. “We have to talk about private computation, making sure computation is done in such a way that data privacy is maintained — in a way that’s robust and done correctly.”

Cryptography tools and techniques protect the privacy of data throughout the computing processes, while ensuring the computation is accurate, she added.

While legislation around data privacy is controversial, GDPR and California’s 2018 privacy law will play an important role in preventing companies from abusing and misusing personal data. Tal Rabin, manager of the Cryptographic Research Group at IBM Research, described regulations as an opportunity for the security community to foster development of built-in data protection features to support the use of data in a way that also protects privacy.

Cryptography techniques use mathematical concepts and algorithms to protect information, so only the intended recipients of the data can decipher it.

Accessible cryptography techniques

Cryptography techniques continue to evolve, but there are tried-and-true cryptographic methods that have stood the test of time, which many people don’t know about, Kocher said.

Password-authenticated key exchange is an old technology that works well and should be much more widely used than it is, according to Kocher. A second is threshold and multiparty computation, which splits a computation, such as signing an SSL certificate, among multiple parties so that no single entity alone decides whether it happens.
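The threshold idea can be illustrated with a minimal sketch of Shamir secret sharing, a classic building block for splitting authority over a key. This is an illustrative toy under assumed parameters (field size, share format, and function names are my own), and it omits the verification machinery real threshold signing requires:

```python
# Toy Shamir secret sharing over a prime field: no single party holds the
# whole secret, and any t of the n shares can reconstruct it.
import random

P = 2**127 - 1  # a Mersenne prime; large enough for a 16-byte secret

def split(secret: int, n: int, t: int):
    """Split `secret` into n shares; any t of them suffice to reconstruct."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        # Evaluate the random degree-(t-1) polynomial at x, mod P.
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x=0 recovers the secret from t+ shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret
```

With fewer than t shares, every candidate secret remains equally likely, which is the information-theoretic guarantee that makes the technique attractive.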


People should also make more use of cold storage of data, according to Kocher, who explained that it is simple to keep a private key offline, put the corresponding public key on the systems that generate data, and encrypt that data as it is generated. Cold storage has found wider use in cryptocurrency, where it is also known as a cold wallet: it lets people store their cryptocurrency private keys offline, out of reach of internet-based attackers.

“This isn’t cutting-edge stuff — it is stuff you can implement now, and it would make a big difference in solving some of the problems people have on a daily basis,” he said.
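The cold-storage pattern Kocher describes can be sketched with a toy public-key scheme: the public key lives on the online machine and encrypts data as it is generated, while the private key stays offline. This is a deliberately simplified, hypothetical illustration (textbook ElGamal over a small prime), not a production design; a real deployment would use a vetted hybrid-encryption tool:

```python
# Toy cold-storage sketch: encrypt online with a public key, decrypt only
# with the private key that never leaves the offline machine.
# Illustrative only -- the prime is far too small for real security.
import secrets

P = 2**127 - 1   # toy prime modulus
G = 5            # generator for the demo

def keygen():
    sk = secrets.randbelow(P - 2) + 1      # private key: stored offline
    pk = pow(G, sk, P)                     # public key: kept on the online host
    return sk, pk

def encrypt(pk: int, m: int):
    k = secrets.randbelow(P - 2) + 1       # fresh ephemeral value per message
    return pow(G, k, P), (m * pow(pk, k, P)) % P

def decrypt(sk: int, ct):
    c1, c2 = ct
    # Divide out the shared secret (c1^sk) to recover the message.
    return (c2 * pow(pow(c1, sk, P), -1, P)) % P
```

The design point is the asymmetry: a compromise of the online host exposes only the public key, so previously encrypted data stays protected.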

While those and other cryptography technologies work well, they sit atop operating systems, processors, firmware and application codes; if those don’t work perfectly, the crypto may fail, as well, Kocher warned.

But it has proven difficult to develop hardware that supports cryptography, said Ronald Rivest, a leader of the cryptography and information security research group at MIT’s Computer Science and Artificial Intelligence Laboratory.

“I talk to my colleagues at MIT who work in hardware and architecture, and they start thinking about Spectre and Meltdown [vulnerabilities], and their eyes just go crazy,” Rivest said during the panel discussion. “It’s really hard to make hardware that supports cryptography in the way that cryptographers like to think about it in the ideal world, where people have secrets that are maintained and used securely. Having that implemented in the real world is still a challenge.”

It doesn’t help that bad things happen quickly in the cybersecurity space, and the good things seem to take an enormous amount of effort and time, Kocher said. But those positives are coming to fruition.

The transition to DNS Security Extensions (DNSSEC) has been plodding along and is making important progress. There is the TLS 1.3 effort, safer languages like Rust, and a switch from passwords to cryptographic authenticators. These developments are all difficult, but they really matter, according to Kocher.

“A lot of us are used to working on internet company time, where you get an idea and you have a mock-up, and it’s in the market six months later. These initiatives are measured in decades, but they are incredibly important, and they are moving forward.”


What’s new in Surface for Business: enabling multinational deals and new product configurations | Microsoft Devices Blog

As we discussed in our last Surface for Business blog, the Surface for Business portfolio is tailored specifically for commercial and education customers. We are constantly listening to and learning from our customers and have an amazing team dedicated to continually improving our products. Today I’m excited to announce new offerings that aim to help businesses better serve their customers, enable their employees and empower students.

Enabling more multinational deals and expanding configurations for Surface Pro and Surface Laptop
We have heard requests to enable broader market support so multinational corporations can purchase Surface on a more global basis to outfit workers across subsidiaries and regional offices. Therefore, we are authorizing commercial partners to sell select configurations of Surface Pro 6 for Business and Surface Laptop 2 for Business in 11 additional European markets: Bulgaria, Croatia, Czech Republic, Estonia, Greece, Hungary, Latvia, Lithuania, Romania, Slovenia and Slovakia. These offerings will leverage an International English keyboard (Surface Pro Type Cover sold separately) and include local languages in the operating system image, with availability beginning May 1, 2019.
There is also growing interest from commercial customers wanting to outfit mainstream productivity employees with devices containing 16GB memory. While 16GB memory options are already available on Intel Core i7 models of various Surface devices, we are now offering a configuration of 16GB memory on an Intel Core i5 model with 256GB storage for both Surface Pro 6 and Surface Laptop 2. We are offering these configurations in the U.S. and Canada this month and will evaluate opportunities to expand this configuration more broadly in the future.
Celebrating commercial and education customer success
We continue to be inspired by how our commercial and education customers are using Surface for Business devices to drive transformation within their industries and better serve their customers, employees and students. Here are a few recent examples:

The Law School Admission Council (LSAC) administers the Law School Admission Test (LSAT) – the only assessment tool accepted by every law school in the United States and Canada – to 100,000 potential law school applicants worldwide each year. As part of a broader technology initiative with Microsoft, LSAC will use Surface Go for the new Digital LSAT platform launching in July 2019. Among the reasons LSAC selected Surface Go were the numerous Windows 10 accessibility features available for test takers who need them; overall quality and reliability, which are critical in such a high-stakes exam; and the Surface Go kickstand, which gives test takers the flexibility to angle the screen for their own comfort.
Future of StoryTelling (FoST) supports a passionate community of people from the worlds of media, technology and communications who explore and transform how storytelling is evolving in the digital age. A year ago, FoST recognized the need to transform how it uses technology to operate more effectively. It replaced a combination of Apple and other devices with Surface devices, allowing employees to “select one that matches their preferences and work habits.” Interoperability was particularly important: “The way Surface devices work so smoothly with Office 365, with the applications we use, and with each other is impressive [and] makes our day-to-day jobs much easier.”
Play’n GO specializes in developing top-quality gaming platforms and software for online and land-based casinos. They have grown substantially over the last five years and recently standardized the computers they use to collaborate and create, at the same time as their switch to Microsoft 365, with Surface Laptop as the standard and Surface Studio for the art team. “For the first time in nearly 15 years, I can hold a PC in my hand that has a build quality equal to that of an Apple device. The Surface Laptop is the most reliable, problem-free laptop of any brand I’ve used,” says Christoffer Tykö, Internal IT Team Leader at Play’n GO. This standardization and the reliability of Surface devices have already helped decrease the time IT employees need to dedicate to troubleshooting company hardware, freeing the department to handle other responsibilities. The use of a unified, comprehensive solution has also paid dividends in employee productivity.

Learn more about Surface for Business or Surface for Education, check with your local reseller or contact your Microsoft representative to see which devices and prices are available for your organization or your school today.

For Sale – Virtualisation PC /media PC

Hi mate, I may have low feedback but this would be the third time I have sent something before I have received payment. The last sale was over 450 pounds' worth of stuff and then I had to chase for two days to get payment, so I would prefer not to.
I also stated that the price doesn’t include delivery, so we will have to work something out there.
The case is in good condition; I just didn’t put it on properly for the first pic, and the third was to show how the SSD was seated lol

Forgot to mention it’s got an Intel PCIe NIC card too.

One back panel is missing

No warranty except the SSD, which I bought a few months back from CeX and should have two years left; I’m happy to help with that.

Whereabouts are you based, as I travel a lot for work?

PayPal would work too. I will get the post up for the last big sale as well; I just haven’t received feedback for that yet.


Windows 10 Tip: Increase text spacing and choose themes and colors using the learning tools built into Microsoft Edge | Windows Experience Blog

March 11, 2019 9:00 am

Did you know there are now more ways to improve your reading and focus, thanks to the Windows 10 October 2018 Update? 
Microsoft Edge is the only browser with Microsoft Learning Tools built-in that help improve reading and focus. You’ll notice more customization options in reading view, such as increasing text spacing, or choosing from a wide array of background colors and themes that work best for you. 
Check it out in action: 

If you like this, check out more Windows 10 Tips. 


A progress report on digital transformation in healthcare – Microsoft Industry Blogs


It’s been an incredible year so far for the health industry. We’ve seen the dream and the opportunity of digital transformation and AI start to really take shape in the marketplace.

We saw many examples of this last month at HIMSS 2019, where many of our partners and other cloud providers showed commoditized access to complex healthcare algorithms and models to improve clinical and business outcomes.


These examples show how cloud computing and AI can deliver on the promise of digital transformation. But for health organizations to realize that potential, they have to trust the technology—and their technology partner.

Microsoft has always taken the lead in providing cloud platforms and services that help health organizations protect their data and meet rigorous security and compliance requirements. Recently, we announced HIPAA eligibility and HITRUST certification for Microsoft Cognitive Services and Office 365.

It’s crucial for health organizations to feel utterly confident not only in their technology partner’s ability to help them safeguard their data and infrastructure, and comply with industry standards, but also in their partner’s commitment to help them digitally transform their way—whatever their needs or objectives are. Our mission is to empower every person and every organization on the planet to achieve more. So whether you’re a health provider, pharmaceutical company, or retailer entering healthcare, your mission is our mission. Our business model is rooted in delivering success rather than disruption for our customers.


Another point of vital importance as we support the movement of healthcare as an industry—and healthcare data specifically—to the cloud is ensuring that we avoid the sins of the past, specifically data silos.

To that end, we jointly announced with other leading cloud providers that we are committed to healthcare data interoperability among cloud platforms and to supporting common data standards such as Fast Healthcare Interoperability Resources (FHIR). I was particularly thrilled to see the health industry's excitement in reaction to our launch last month of Azure API for FHIR and our commitment to developing open source FHIR servers. I hope you'll join the huge movement behind health interoperability fueled by FHIR and encourage your technologists to start actively using the open-source project to bring diverse data sets together, and to build systems that learn from those data sets.

As my colleague, Doug Seven, recently wrote, interoperability helps you bring together data from disparate sources, apply AI to it to gain insights, and then enrich care team and patient tools with those insights to help you achieve your mission. That’s a crucial step in the digital transformation of health.
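For technologists starting to explore FHIR, the core pattern is approachable: resources are plain JSON documents, and queries are expressed as URL search parameters against a server's base URL. A minimal sketch follows; the base URL is a placeholder (not a real endpoint), and the resource mirrors the example Patient from the FHIR specification:

```python
# Minimal sketch of the FHIR RESTful pattern: JSON resources plus
# search parameters appended to a server base URL.
from urllib.parse import urlencode

BASE = "https://example-fhir-server/fhir"  # placeholder FHIR base URL

# A minimal Patient resource, after the FHIR specification's example.
patient = {
    "resourceType": "Patient",
    "id": "example",
    "name": [{"family": "Chalmers", "given": ["Peter"]}],
    "birthDate": "1974-12-25",
}

def search_url(resource_type: str, **params) -> str:
    """Build a FHIR search URL, e.g. GET [base]/Patient?family=Chalmers."""
    return f"{BASE}/{resource_type}?{urlencode(params)}"

url = search_url("Patient", family="Chalmers", birthdate="1974-12-25")
```

Because every conforming server speaks this same resource-and-search vocabulary, the same query works against any FHIR endpoint, which is exactly the interoperability property the post describes.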


Another crucial step is supporting health teamwork. With the changing nature of care delivery, health services increasingly require coordination across multiple care settings and health professionals. So we launched a new set of capabilities to our Teams platform that provides workflows for first-line clinical workers such as doctors and nurses that they can use to access patient information and coordinate care in real time.

The end game

Why does all of this matter? To answer that question, I always come back to the quadruple aim, which all of us in the health industry strive for: enhancing both patient and caregivers’ experiences, improving the health of populations, and lowering the costs of healthcare.

Empowering care teams and patients with data insights and tools that help them coordinate care—and that they and your health organization can trust—will help bring about the desired outcomes of the quadruple aim. Not only will this systemic change improve clinical and business outcomes, but also, at an individual level, enhance the day-to-day and digital experiences of clinical workers and patients alike—creating better experiences, better insights, and better care across the delivery system.

Learn more about real-world use cases for AI in the e-book: “Breaking down AI: 10 real applications in healthcare.”


Veritas acquires Aptare for storage analytics

Data protection market leader Veritas Technologies moved to beef up its storage analytics and monitoring capabilities by acquiring Aptare.

Veritas did not disclose the acquisition price when it revealed the acquisition on Thursday. Both companies are privately held, although Veritas is a multibillion-dollar-a-year vendor, and Aptare is a minnow in the storage world.

Aptare IT Analytics is a suite of products that includes Storage Management Suite, Backup Manager, Capacity Manager, Fabric Manager, File Analytics, Replication Manager and Virtualization Manager. Its design goal is to provide predictive storage analytics and help companies meet requirements for compliance and service-level agreements.

Veritas CEO Greg Hughes said his company will sell Aptare IT Analytics as a complement to the Veritas NetBackup and Backup Exec data protection applications and InfoScale storage management for on-premises and cloud data.

Hughes said Aptare came highly recommended by some of his largest customers who already used IT Analytics alongside Veritas products.

“I found some of our customers are using a product called Aptare,” Hughes said. “It provides a single pane of glass, not only for NetBackup, but for pulling insights from other technologies.

“Our customers were saying, ‘You should really think about buying this company.’ After a few times, I started getting more interested in Aptare. I met with [Aptare CEO] Rick Clark last year and got a demo. Aptare works very well with NetBackup and some of our other products.”

Greg Hughes, Veritas CEO

He said Aptare’s capabilities go far beyond NetBackup OpsCenter, which provides basic analytics. “Aptare is a much more advanced product,” Hughes said.

Hughes praised Aptare for its storage analytics, reporting and monitoring capabilities for data on premises and in public clouds. He also said the acquisition will lead to a common reporting platform for data stored on tape, disk and across multiple clouds.

Aptare claims its products include more than 200 standard reports and allow customers to build custom reports.

Veritas’ plans for Aptare product, team


Veritas will keep the Aptare brand for IT Analytics and continue to sell it as a stand-alone product. Hughes said Veritas is still considering the long-term roadmap regarding the integration of Aptare technology inside Veritas products.

Clark founded Aptare in 1993 and remained its CEO and president until the Veritas acquisition. Hughes said Clark and the Aptare team will join Veritas. Clark will report to Scott Genereux, Veritas’ executive vice president of worldwide field operations.

Aptare, based in Campbell, Calif., claims more than 1,000 customers for IT Analytics, formerly known as StorageConsole.

Hughes said Veritas will continue Aptare’s partnerships with IT vendors. Hitachi Vantara sells Aptare IT Analytics as Hitachi Storage Viewer through an OEM deal. Aptare is also part of the Hewlett Packard Enterprise Technology Partner Program, and it sells Backup Manager Solution for ServiceNow.

Aptare is Veritas’ second acquisition since its 2016 spinout from security giant Symantec. It bought fluidOps, a Germany-based cloud data management startup, in March 2018.

Hughes, who became Veritas’ CEO in January 2018, said the company will look for more acquisitions for products that complement its portfolio.

Although Veritas is the largest stand-alone data protection vendor, it competes with well-funded, smaller companies that are also hunting for data intelligence and cloud features. Veeam Software, Rubrik, Cohesity and Actifio all picked up $100 million or more in funding over the past year. Several said they would use the money to make acquisitions.

“We’re in a market where we can’t do all the development ourselves,” Hughes said. “If we find products or technologies that are important to underpin our value, we’re looking at that.”

Predictive analytics is a hot driver for storage acquisitions. On the primary storage front, Hewlett Packard Enterprise spent $1.2 billion to acquire Nimble Storage in 2017, and has integrated Nimble’s InfoSight analytics into other HPE hardware. DataDirect Networks identified Tintri’s storage analytics as a major reason for acquiring Tintri in 2018. For secondary storage, analytics must also extend to public clouds.

“As long as systems management remains one of the major costs in IT, one of the keys to an efficient hybrid strategy lies in automation based on a combination of dynamic visibility and intelligence,” said Steven Hill, a senior analyst at 451 Research. “You simply can’t control what you can’t see.”

Christophe Bertrand, senior analyst at Enterprise Strategy Group, said Aptare IT Analytics adds “intelligent data management” to data managed by Veritas applications.

“In this case they decided to do this non-organically to accelerate time to market,” Bertrand said of the acquisition. “I expect that we will see Aptare’s IP being leveraged in other parts of the portfolio in the next few quarters.”

Symantec sold Veritas to the Carlyle Group for $7.4 billion in January 2016, more than 10 years after it acquired Veritas for $13.5 billion. Veritas executives have said the storage software vendor was never a good fit as part of Symantec and have worked to restore its strength as a stand-alone company. Veritas executives claim the company generated more than $2 billion in revenue in 2018.

Johnny Yu contributed to this story.

