HCI storage adoption rises as array sales slip

The value and volume of data keep growing, yet in 2019 most primary storage vendors reported a drop in sales.

Part of that has to do with companies moving data to the cloud. It is also being redistributed on premises, moving from traditional storage arrays to hyper-converged infrastructure (HCI) and data protection products that have expanded into data management.

That helps explain why Dell Technologies bucked the trend of storage revenue declines last quarter. A close look at Dell's results shows its gains came from areas outside of traditional primary storage arrays, a segment that has been flat or down for its rivals.

Dell’s storage revenue of $4.15 billion for the quarter grew 7% over last year, but much of Dell’s storage growth came from HCI and data protection. According to Dell COO Jeff Clarke, orders of VxRail HCI storage appliances increased 82% over the same quarter in 2018. Clarke said new Data Domain products also grew significantly, although Dell provided no revenue figures for backup.

Hyper-converged products combine storage, servers and virtualization in one box. VxRail, which relies on vSAN software from Dell-owned VMware running on Dell PowerEdge, appears to be cutting in on sales of both independent servers and storage. Dell server revenue declined around 10% year-over-year, around the same as rival Hewlett Packard Enterprise’s (HPE) server decline.

“We’re in this data era,” Clarke said on Dell’s earnings call last week. “The amount of data created is not slowing. It’s got to be stored, which is probably why we are seeing a slightly different trend from the compute side to the storage side. But I would point to VxRail hyper-convergence, where we’ll bring computing and storage together, helping customers build on-prem private clouds.”

Dell is counting on a new midrange storage array platform to push storage revenue in 2020. Clarke said he expected those systems to start shipping by the end of January.

Dell’s largest storage rivals have reported a pause in spending, partially because of global conditions such as trade wars and tariffs. NetApp revenues have fallen year-over-year each of the last three quarters, including a 9.6% dip to $1.38 billion last quarter. HPE said its storage revenue of $848 million dropped 12% from last year. HPE’s Nimble Storage midrange array platform grew 2% and Simplivity HCI increased 14% year-over-year, a sign that 3PAR enterprise arrays fell and the vendor’s new Primera flagship arrays have not yet generated meaningful sales.

Dell Technologies COO Jeff Clarke

IBM storage has also declined throughout the year, dropping 4% year-over-year to $434 million last quarter. Pure Storage’s revenue of $428 million last quarter increased 16% from last year, but Pure had consistently grown revenue at significantly higher rates throughout its history.

Meanwhile, HCI storage revenue is picking up. Nutanix last week reported a leveling of revenue following a rocky start to 2019. Related to VxRail’s increase, VMware said its vSAN license bookings had increased 35%. HPE’s HCI sales grew, while overall storage dropped. Cisco did not disclose revenue for its HyperFlex HCI platform, but CEO Chuck Robbins called it out for significant growth last quarter.

Dell/VMware and Nutanix still combine for most of the HCI storage market. Nutanix’s revenue ($314.8 million) and subscription ($380.0 million) results were better than expected last quarter, although both numbers were around the same as a year ago. It’s hard to accurately measure Nutanix’s growth from 2018 because the vendor switched to subscription billing. But Nutanix added 780 customers and its 66 deals of more than $1 million were its most ever. And the total value of its customer contracts came to $305 million, up 9% from a year ago.

Nutanix’s revenue shift came after the company switched to a software-centric model. It no longer records revenue from the servers it ships its software on. Nutanix and VMware are the dominant HCI software vendors.

“It’s just the two of us, us and VMware,” Nutanix CEO Dheeraj Pandey said in an interview after his company’s earnings call. “Hyper-convergence now is really driven by software as opposed to hardware. I think it was a battle that we had to win over the last three or four years, and the dust has finally settled and people see it’s really an operating system play. We’re making it all darn simple to operate.”


Vendor 3PM uses AI and analytics to prevent Black Friday fraud

The months and weeks leading up to Black Friday, one of the most hectic shopping days of the year, keep e-commerce intelligence vendor 3PM Solutions busy.

“This is a very important time,” said Rob Dunkel, CEO of 3PM.

More people buy products and more retailers and individuals sell products online on Black Friday than on any other day, and the number of counterfeit products listed for sale skyrockets, Dunkel said. Chicago-based 3PM, with its platform built to collect, structure and then analyze unstructured data, identifies potentially counterfeit products for its e-commerce clients so they can crack down on Black Friday fraud.

Founded in 2013, 3PM sells software that automatically combs through products and reviews to give its e-commerce clients a better snapshot of what customers are purchasing and why, as well as to protect brands and identify and take down counterfeit or misrepresented items.

The 3PM platform automatically scrapes public data off e-commerce websites, such as Amazon and eBay, Dunkel explained. Data includes customer reviews, product images and descriptions, and buyer and seller information.

No APIs are used, he said — instead, the platform collects data as it appears on e-commerce dealers’ websites using machine learning and natural language processing. The collected data is then brought into the platform and structured for its clients, some of which include major e-commerce players.

The process is continuous and encompasses billions of online product listings.

Google AI

AI and analytics help prevent online shopping fraud on Black Friday and all year

The vendor uses a host of Google Cloud products to support its platform. A few years ago, 3PM left AWS for the Google Cloud Platform, after seeing the capabilities of Google Cloud Bigtable, a scalable, fully managed NoSQL database.

The database product was in beta testing then, Dunkel said. But, with its ability to handle huge workloads, it seemed perfect for 3PM.

Also, Dunkel said, 3PM was drawn to Google for its machine learning and AI products and tools available on the cloud.

Google Cloud Vision AI, alongside Google Cloud TPU, gives 3PM the ability to automatically classify and match images, for example.

“We’re heavy users of Google AI,” Dunkel said.

Preventing Black Friday fraud

Analyzing products in search of counterfeits is particularly important around Black Friday. Due to the sudden, massive increase in buyers and sellers during this holiday period, Black Friday fraud is common.

Using its platform, 3PM can identify fraudulent products for its clients and partners generally within four hours, Dunkel claimed. He offered an example.

Game of Thrones: Season 8 comes out on DVD and Blu-ray soon. Given the popularity of the show, and the expected demand for the season, it’s inevitable that some sellers will purposely mislist similar products to make them appear to be Season 8, to trick potential buyers. They could, for example, use clever descriptors or images to pass off a poster of Season 8 for the DVD.

So, said Dunkel, “We’ve been able to train the system to understand each title” of the different products, to automatically identify from the title what the product is.

Moreover, the platform can identify and compare a product listing's image to a known image of the product, and scan for discrepancies using image recognition. Models can also read the descriptors and listing categories and compare them with other listings or with what the category is known to be. In the case of Game of Thrones: Season 8, a category might be DVD. If the product is listed as something else, 3PM issues a warning.

“We’re able to build and train our models to understand what is good and what is not,” Dunkel said.

The platform can also analyze product reviews. On certain e-commerce sites, third-party sellers can change their listings. They may have listed a specific product that racked up many positive reviews, but later changed the listing to a completely different product. Yet, the reviews stayed.

At first glance, then, the product seems to have high reviews. By reading through the reviews, it may become clear that the positive reviews were meant for a different product altogether. The 3PM platform can automatically read through reviews, and comb through the history of the listing, to detect that, Dunkel said.

While most Black Friday sellers are honest, Dunkel emphasized the importance of watching for Black Friday fraud.

“With Black Friday, with all the people shopping, consumers need to be more diligent,” he said. “Consumers need to take more steps to make sure they are buying an authentic product.”


Partners need to target SAP SMB market for cloud transition

SAP SMB customers make up a large percentage of SAP’s overall customer base. Most SMB customers rely on SAP’s extensive partner network for sales and implementation of SAP systems. And SAP, in turn, is relying on its partners to help transition its SMB customer base from traditional on-premises ERP systems to the next-generation SAP S/4HANA Cloud ERP platform.

In this Q&A conducted at the Innovate Live by SAP event in November, Claus Gruenewald, SAP global vice president of Global Partner Organization portfolio management, discusses the roles that SAP partners play in selling to and servicing the SAP SMB market. Gruenewald explains that partners who have been successful in selling on-premises SAP systems will need to change their strategy to become successful in the cloud.

Why are SAP’s partners important for the SAP SMB customer base?

Claus Gruenewald: The SMB market is one which SAP predominantly serves through and with partners. So the partner business is a very important one for SAP SMB customers. Most SMB sales are driven by partners, and most of the partners are local. Sometimes we see the global partners in the SMB space, particularly Deloitte, but we don’t see that often. It’s a very local business. These partners really know the market space; they are also trusted by their customers because the name of the brand is known in the local space.

How many partners are currently active?

Gruenewald: There are a little over 800 active selling partners for SAP S/4HANA on-premises, and there are 300 partners that actively work on cloud with around 100 partners actively closing deals for SAP S/4HANA Cloud. There’s such a difference in on-premises because the sales and service cycles are longer compared to cloud. If a customer decides to go for on-premises on purpose — and there are reasons for this — typically the partner needs a little longer in the sales cycle, and partners are able to do one, two or maybe three projects a year, depending on the size of the partner. So it’s not a volume business, it’s a value business for the partner.

Claus Gruenewald

What are some of the differences between on-premises and cloud implementations?

Gruenewald: The sales cycle and the project scope is shorter for the cloud, and it’s more often led by best practices. In on-premises, you sell an ERP on-premises license and the customer comes with precise requirements about what it wants to solve with the ERP implementation. The partners can then make a customized, on-premises ERP that’s specific to the customer, which makes the sales and implementation cycle longer. One strategy for customers is that they can differentiate in their industry with a specific customized ERP, so they may choose on-premises. However, another customer strategy is to say that almost everybody in the industry already has ERP, so the strategic differentiator … is fast rollout and using best practices in the industry, so they may choose the cloud.

What are some of the differences in the ways partners approach on-premises or cloud implementations?

Gruenewald: The on-premises partner typically doesn’t do more than three to four projects a year because it needs the resources and it only has a given amount of consultants. With the cloud, the partner is successful if it has a fast go-to-market [strategy], which means going after many customers. The cloud business model only works if a partner has four to six customers a year. The money from the cloud customer comes in quarterly fees, so the partner has to cover a cash flow dip in the beginning. But if it keeps the customer for one and a half years, the cash comes back. So the partner does well if it has four to six customers in the first year. The first year of cloud business for everyone is an investment business, but after one and a half or two years with six or seven customers, the profitability and cash flow curve is super steep. That’s if you don’t lose customers.

How can partners who have been successful with on-premises implementations focus more on cloud business?

Gruenewald: We have trainings for that but it’s also a mind shift to get into that business. Make the customer understand that it’s best to take it as is, it’s a best practice in cloud. So don’t sell feature functions, sell best practices. Once your customer accepts best practices, then it’s a cloud customer. The customer will be happy almost forever because in ERP, a customer usually doesn’t change the vendor because [ERP is] mission-critical for them. They usually don’t do it because the switching costs are simply too high, whether it’s cloud or on-premises.

What are some specific ways partners can sell their customers on the cloud?

Gruenewald: The partners understand ERP very well but if the partner just goes in with too many feature functions to a cloud-minded customer, that will not succeed. The partners have to help customers understand that SAP has a pretty good understanding of their industry, and that these are the best practices. For example, here are the best practices that matter in consumer goods or component manufacturing — and that’s pre-configured in the system. You take it to your customer with a demo system and show them the software, show them the nice [user interface], show them what has improved using machine learning and AI, show how much automation has to be put into the system. It’s not the original ERP system anymore where everything was done manually, which was nice for a professional user 20 years ago. Now, the ERP application has changed and is much more automated. It’s not made for these super professional users for only that system. This saves them time, which they can use for something else, because the system automatically gives them not a decision, but a decision proposal. It’s not just a system that you have to feed all the time with master data and transactional data, it’s basically automated now and all that process stuff is going away.


What should CIOs do with SAP ECC support ending in 2025?

SAP has promised the end of SAP ECC support in 2025, and that means big changes for most SAP users. 

Companies using SAP ERP Central Component are faced with some major decisions. The most obvious is whether to stay on ECC or migrate their systems to S/4HANA. This is not an easy decision to make, as each option has its own set of pros and cons. No matter which choice a company makes, it will face business consequences and must prepare accordingly.

From the vendor's perspective, support staff and developers should focus on the new product. As part of this, most software vendors push their clients to adopt the latest platform, partly by imposing an end-of-support deadline. This strategy has some success: most clients don't want to be left with an unsupported system that might cause work delays. But moving to a new product can also be problematic.

For an SAP ECC customer, moving to S/4HANA comes with its own set of challenges and risks. Implementing the latest SAP platform does not always equate to better and faster systems, as seen in Revlon's disastrous SAP S/4HANA implementation. Revlon experienced shipping delays and revenue losses as a result of system, operational and implementation challenges. It was also sued by shareholders.

Such failures can’t always be blamed only on the new software. Other factors that can contribute to ERP implementation failure — whether a new SAP system or another vendor’s system — include lack of operational maturity, poor leadership, lack of experienced resources and cultural challenges. These can turn a potentially successful ERP implementation into a complete disaster.

The end of SAP ECC support must be balanced against the risks of moving to S/4HANA. Companies should consider the following activities to prepare for the upcoming deadline:

  • Talk to others in the same vertical about their experience with S/4HANA.
  • Determine the costs and changes associated with the change.
  • Evaluate the latest version of S/4HANA.
  • Identify which vendors might potentially continue to provide third-party ECC support after SAP stops it.
  • Determine any compliance concerns that could arise from not receiving updates on ECC software.
  • Reach out to other companies within the SAP user groups and discuss what some of their plans are.
  • Determine a plan for necessary patching and bug fixes.


Guarding the shop: Rewind backup protects e-commerce data

It’s the most wonderful time of the year for e-commerce … that is, until your site goes down and customers can’t shop anymore.

That’s where Rewind backup comes in.

Rewind provides backup for e-commerce sites hosted on Shopify and BigCommerce.

“Most people don’t know they need a backup,” Rewind CEO Mike Potter said.

For example, an e-commerce business that uses Shopify and deletes a product or blog post is not covered just because its data is in the cloud. As with cloud-based applications such as Microsoft Office 365 and Salesforce, the provider protects its infrastructure, but not always your data.

However, in Office 365, for example, users have a place for deleted items that they can access if they delete an email by mistake. That’s not the case in a lot of e-commerce platforms where “there is no trash bin,” Potter said.

Potter, who is also a founder of Ottawa-based Rewind, said he’s lost data before, so he understands the pain. Launched four years ago, Rewind had one customer lose everything right before Christmas but restored the store to a safe point in time from before the incident.

As a way to bring the backup issue to the forefront, this holiday season Rewind is offering a free version of its data protection software. Rewind: One-Time enables retailers to conduct a free one-time backup of up to 10,000 products and related data in their online stores. The Rewind backup offer is available for BigCommerce and Shopify merchants.

After an incident, Rewind: One-Time users can restore their data to the time they installed the product.

The one-time backup for BigCommerce includes product, brand, category, option set and option data, while the Shopify backup includes products, product images, custom collections and smart collections. The backups are stored indefinitely in the Rewind Vault, which is hosted in various Amazon regions. Data is encrypted in transit and at rest.

It’s the first time Rewind has offered this one-time backup.

“There needs to be a way for everyone to have protection in this holiday season,” Potter said.

A jump forward with Rewind backup

For Crossrope, an online jump rope seller and workout provider based in Raleigh, N.C., “it’s the biggest season of the year,” said digital marketing specialist Andy Lam.

“To have Rewind as a tool for backing up, it just gives us peace of mind,” Lam said.

Before adopting Rewind, one afternoon at the end of a workday, Crossrope made a change to its theme code that broke the site. Customers couldn’t add items to their carts and the company lost out on orders and revenue in the process.

The company had a manual backup saved from 30 days prior and spent a lot of time trying to restore the site manually.

“That kickstarted trying to find a better solution,” Lam said.

Crossrope heard from BigCommerce, its e-commerce platform of choice, about Rewind backup. It was the first backup company that Crossrope contacted.

“Because they were a full-fledged cloud backup tool, it was a no-brainer,” Lam said.

Now if there are any incorrect changes like the previous incident, Crossrope can “rewind” to a known good point in time, in just a couple of clicks. The company has been using Rewind backup for about four months and hasn’t had a major incident. Rewind performs daily backups for Crossrope, which Lam said is enough.

Rewind backup enables merchants to restore their stores to a safe point in time.

“Now we feel safe,” Lam said. “I know they’re covering a lot of bases for us.”

While Rewind can restore the code in a couple of clicks, Lam said he is hoping the backup vendor can speed up product restoration.

A Rewind recap

Though e-commerce data loss can result from malicious acts and third-party integrations, human error is a common cause.

“We’ve seen everything,” Potter said. (Think of a cat jumping on a keyboard.) “You don’t get any warnings you’re going to have a disaster.”

Rewind claims more than 10,000 small and medium-sized enterprises as customers.

If they want backups more recent than the one-time protection, Rewind: One-Time users can upgrade to one of the paid options during the holiday season or beyond. Pricing ranges from $9 to $299 per month, depending on the size of the store and the number of orders. Many customers perform a daily Rewind backup, Potter said.

The Rewind: One-Time offer is available through Dec. 31, 2019. Customers who use it will have access to that backup indefinitely.

Rewind also provides backup for Mailchimp email marketing and QuickBooks Online accounting data.


Implement automated employee onboarding with PowerShell

One of the most common tasks help desk technicians and system administrators handle is provisioning all the resources needed to onboard new employees.

Depending on the organization, these tasks may include creating Active Directory user accounts, creating home folders, provisioning new Office 365 mailboxes and setting up a VoIP extension in the phone system. With a little PowerShell coding, you can put together an automated employee onboarding script that does a majority of this work in little to no time.

To automate this process, it’s essential to define all the tasks. Many companies have a document that outlines the steps to onboard a new employee, such as the following:

  • create an Active Directory user account;
  • create a user folder on a file share; and
  • assign a mobile device.

When building a PowerShell script, start by researching if the basics required for automation are available. For example, does the system you’re using to assign a mobile device have an API? If not, then it can’t be completely automated. Rather than bypass this step, you can still add something to your script that sends an email to the appropriate person to complete the setup.

For other tasks that PowerShell can automate, you can start by scaffolding out some code in your script editor.

Build the framework for the script

Start by adding some essential functions to encapsulate each task. The following code is an example of stub functions for each of the tasks we want to automate.

param(
[Parameter()]
[ValidateNotNullOrEmpty()]
[string]$CsvFilePath
)

function New-CompanyAdUser {
[CmdletBinding()]
param
(

)

}

function New-CompanyUserFolder {
[CmdletBinding()]
param
(

)

}

function Register-CompanyMobileDevice {
[CmdletBinding()]
param
(

)

}

function Read-Employee {
[CmdletBinding()]
param
(

)

}

This isn’t our final code. This is just a brainstorming exercise.

Add the code to receive input

Notice the param block at the top and the Read-Employee function. This function can read employee data from any source, such as a CSV file or a database. Because the input is encapsulated in a function, it's easy to modify the code if the input method changes.

For now, we are using a CSV file, as shown in the Read-Employee function below. By default, this function takes the CSV file path passed to the script when it runs.

function Read-Employee {
[CmdletBinding()]
param
(
[Parameter()]
[ValidateNotNullOrEmpty()]
[string]$CsvFilePath = $CsvFilePath
)

Import-Csv -Path $CsvFilePath

}

Add a Read-Employee reference below this function.

We have a CSV file from human resources that looks like this:

FirstName,LastName,Department
Adam,Bertram,Accounting
Joe,Jones,HR

We'll name the script New-Employee.ps1 and call it with the CsvFilePath parameter.

./New-Employee.ps1 -CsvFilePath './Employees.csv'

Developing the functions

Next, fill in the other functions. This is just an example, but it should give you a good idea of how to build the code for the specifics of your automated employee onboarding script. For more information on the creation of the New-CompanyAdUser function, see this blog post.

param(
[Parameter()]
[ValidateNotNullOrEmpty()]
[string]$CsvFilePath
)

function New-CompanyAdUser {
[CmdletBinding()]
param
(
[Parameter(Mandatory)]
[ValidateNotNullOrEmpty()]
[pscustomobject]$EmployeeRecord
)

## Generate a random password (GeneratePassword lives in the System.Web assembly)
Add-Type -AssemblyName System.Web
$password = [System.Web.Security.Membership]::GeneratePassword((Get-Random -Minimum 20 -Maximum 32), 3)
$secPw = ConvertTo-SecureString -String $password -AsPlainText -Force

## Generate a first initial/last name username
$userName = "$($EmployeeRecord.FirstName.Substring(0,1))$($EmployeeRecord.LastName)"

## Create the user
$NewUserParameters = @{
GivenName = $EmployeeRecord.FirstName
Surname = $EmployeeRecord.LastName
Name = $userName
AccountPassword = $secPw
}
New-AdUser @NewUserParameters

## Add the user to the department group
Add-AdGroupMember -Identity $EmployeeRecord.Department -Members $userName
}

function New-CompanyUserFolder {
[CmdletBinding()]
param
(
[Parameter(Mandatory)]
[ValidateNotNullOrEmpty()]
[pscustomobject]$EmployeeRecord
)

$fileServer = 'FS1'

$null = New-Item -Path "\\$fileServer\Users\$($EmployeeRecord.FirstName)$($EmployeeRecord.LastName)" -ItemType Directory

}

function Register-CompanyMobileDevice {
[CmdletBinding()]
param
(
[Parameter(Mandatory)]
[ValidateNotNullOrEmpty()]
[pscustomobject]$EmployeeRecord
)

## Send an email for now. If we ever can automate this, we'll do it here.
$sendMailParams = @{
'From' = '[email protected]'
'To' = '[email protected]'
'Subject' = 'A new mobile device needs to be registered'
'Body' = "Employee: $($EmployeeRecord.FirstName) $($EmployeeRecord.LastName)"
'SmtpServer' = 'smtpserver.something.local'
'Port' = '587'
}

Send-MailMessage @sendMailParams

}

function Read-Employee {
[CmdletBinding()]
param
(
[Parameter()]
[ValidateNotNullOrEmpty()]
[string]$CsvFilePath = $CsvFilePath
)

Import-Csv -Path $CsvFilePath

}

Read-Employee

Calling the functions

Once you build the functions, pass each of the employee records returned from Read-Employee to each function, as shown below.

$functions = 'New-CompanyAdUser','New-CompanyUserFolder','Register-CompanyMobileDevice'
foreach ($employee in (Read-Employee)) {
foreach ($function in $functions) {
& $function -EmployeeRecord $employee
}
}

By standardizing the function parameters to have a single parameter, EmployeeRecord, which corresponds to a row in the CSV file, you can define the functions you want to call in an array and loop over each of them.

Click here to download the code used in this article on GitHub.


Lessons learned from PowerShell Summit 2019

Most Windows administrators have at least dabbled with PowerShell to get started on their automation journey, but for more advanced practices, it helps to attend a conference, such as the PowerShell Summit.

For the second straight year, I attended the PowerShell + DevOps Global Summit in Bellevue, Wash., earlier this year. As an avid PowerShell user and evangelist, I greatly enjoy being around fellow community members to talk shop about how we use PowerShell in our jobs as well as catch the deep dive sessions.

This year was a bit different for me as I also presented a session, “Completely Automate Managing Windows Software … Forever,” to explain how I use Chocolatey with PowerShell to automate the deployment of third-party software in my full-time position.

There’s a lot of value in the sessions at PowerShell Summit. Administrators and developers who rely on PowerShell get a chance to learn something new and build their skill set. The videos from the sessions are on YouTube, but if you don’t attend in person you will miss out on the impromptu hallway discussions. These gatherings can be a great way to meet a lot of veteran IT professionals, community leads and even PowerShell team members. Jeffrey Snover, the inventor of PowerShell, was also in attendance.

In this article, I will cover my experiences at this year’s PowerShell Summit and some of the lessons learned during the week.

AWS Serverless Computing

Serverless computing is a very hot topic for many organizations that want to cut costs and reduce the work it takes to support a Windows deployment. With serverless computing, there is no need to manage a Windows Server machine and all its requisite setup and maintenance work. You can use an API to run PowerShell, and it will scale automatically.

Andrew Pearce, a senior systems development engineer at AWS, presented a session entitled "Unleash Your PowerShell With AWS Lambda and Serverless Computing." Pearce's talk covered how to use Amazon's event-driven computing with PowerShell Core.

I have not tried any sort of serverless computing, but it didn’t take much to see its potential and advantages during the demonstration. Pearce explained that a PowerShell function can run from an event, such as when an image is placed in an AWS Simple Storage Service bucket, to convert the image to multiple resolutions depending on the organization’s need. Another possibility is to run a PowerShell function in response to an IoT event, such as someone ringing a doorbell.
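For a sense of what this looks like in practice, here is a minimal sketch of a PowerShell Lambda script for an S3 event, modeled on the S3Event template from Amazon's AWSLambdaPSCore module; the module version and the processing logic are illustrative assumptions, not code from the session.

# The PowerShell Lambda runtime exposes the triggering event as $LambdaInput
# (the AWS.Tools.S3 version below is an assumed placeholder)
#Requires -Modules @{ModuleName='AWS.Tools.S3';ModuleVersion='4.0.0'}

foreach ($record in $LambdaInput.Records) {
    $bucket = $record.s3.bucket.name
    $key = $record.s3.object.key
    Write-Host "Processing s3://$bucket/$key"
    # Image conversion to multiple resolutions would go here
}

A script like this can be scaffolded with New-AWSPowerShellLambda -Template S3Event and deployed with Publish-AWSPowerShellLambda, both from the AWSLambdaPSCore module.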

Simple Hierarchy in PowerShell

Simple Hierarchy in PowerShell (SHiPS) is an area of PowerShell that looks interesting, but one that I have not had much experience with. The concept of SHiPS is similar to traversing a file system in PowerShell; with SHiPS, you can create a provider that presents its data as a hierarchical file system.

Providers have been a part of PowerShell since version 1.0 and give access to data and components not easily reached from the command line, such as the Windows certificate store. One common example is the PowerCLI datastore provider, which lets users access a datastore in vSphere.

You can see what providers you have on a system with the Get-PSProvider command.

The Get-PSProvider cmdlet lists the providers available on the system.

Another familiar use of a PowerShell provider is the Windows registry, which PowerShell can navigate like a traditional file system.
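For instance, you can browse the registry with the same cmdlets you would use on a disk drive:

# Treat the registry like a drive: list subkeys under HKLM:\SOFTWARE\Microsoft
Set-Location HKLM:\SOFTWARE\Microsoft
Get-ChildItem | Select-Object -First 5

# Read a value much like you would read a file's properties
Get-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion' -Name ProductName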

Glenn Sarti of Puppet gave a session entitled "How to Become a SHiPS Wright – Building With SHiPS." Attempting to write your own provider from scratch is a complex undertaking that ordinarily requires advanced programming skill and familiarity with the PowerShell software development kit. SHiPS bypasses this complexity and makes provider development easier by letting you write the provider code in PowerShell. Sarti explained that SHiPS reduces the amount of code needed for a module from thousands of lines to less than 20 in some instances.

In his session, Sarti showed how to use SHiPS to create an example provider using the PowerShell Summit agenda and speakers as data. Watching this session sparked an idea to create a provider that exposes a Chocolatey repository as if it were a file system.
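Based on the public SHiPS documentation, a toy provider needs only a couple of PowerShell classes. The sketch below is my own illustration rather than Sarti's demo code; the SummitDemo module name and session names are made up.

# Save as SummitDemo.psm1 (the module name is illustrative)
using module SHiPS

class Session : SHiPSLeaf {
    Session([string]$name) : base($name) {}
}

class Agenda : SHiPSDirectory {
    Agenda([string]$name) : base($name) {}
    [object[]] GetChildItem() {
        # Each leaf appears as an item in the mounted drive
        return @([Session]::new('OpeningKeynote'), [Session]::new('SHiPSWright'))
    }
}

Once the module is imported, the provider mounts like any other drive:

Import-Module SHiPS
Import-Module .\SummitDemo.psm1
New-PSDrive -Name Summit -PSProvider SHiPS -Root 'SummitDemo#Agenda'
Get-ChildItem Summit: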

PowerShell Remoting Internals

In this deep dive session, Paul Higinbotham, a member of the PowerShell team, covered how PowerShell remoting works. Higinbotham explained the five different transports, or protocols, PowerShell can use to run remote commands on another system.

In Windows, the most popular is WinRM, since PowerShell is most often used on Windows. For PowerShell Core, OpenSSH is the best option for cross-platform use, since it runs on both Windows and Linux. The advantage here is you can run PowerShell Core scripts from Windows to Linux and vice versa. Since I work in a mostly Linux environment, using Secure Shell (SSH) and PowerShell together makes a lot of sense.

This session also taught me about interprocess communication remoting in PowerShell Core. This is accomplished with the Enter-PSHostProcess command, and the main perk is the ability to debug scripts running on a remote system from your local machine.
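A quick sketch of that workflow, using a placeholder process Id:

# Discover processes that are hosting PowerShell
Get-PSHostProcessInfo

# Attach to one of them (4242 is a placeholder Id) and list its runspaces
Enter-PSHostProcess -Id 4242
Get-Runspace

# Attach the debugger to the runspace running the script of interest
Debug-Runspace -Id 1

# Detach when finished
Exit-PSHostProcess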

Toward the end of the presentation, Higinbotham shared a great diagram of the remoting architecture and went over what each component does during a remoting session.


Carbon Black acquisition is ‘compelling’

SAN FRANCISCO — VMware’s acquisition of Carbon Black is “the most compelling security story” Steve Athanas has heard in a while.

“I don’t know any other vendor in the ecosystem that has more visibility to more business transactions happening than VMware does,” said Athanas, VMware User Group president and associate CIO at the University of Massachusetts Lowell.

At its annual user conference, VMware announced new features within Workspace One, its digital workspace product that enables IT to manage virtual desktops and applications, and talked up the enhanced security features the company will gain through its $2.1 billion Carbon Black acquisition. Like Athanas, VMworld attendees welcomed the news.

At the opening keynote for VMworld, VMware CEO Pat Gelsinger speaks about the recent Carbon Black acquisition.

In this podcast, Athanas said Carbon Black could provide endpoint security across an entire organization once the technology is integrated, a promise he said he’s still thinking through.

“Are [chief security officers] going to buy into this model of wanting security from one vendor? I’ve heard CSOs in the past say you don’t do that because if one fails, you want another application to be able to detect something,” he said. “I don’t know where the balance and benefit is between being able to see more through that single view from Carbon Black or to have multiple vendors.”

Aside from the Carbon Black acquisition, Athanas was drawn to newly unveiled Workspace One features aimed at making day-to-day processes easier for end users, IT and HR admins. For IT admins, a new Employee Experience Management feature enables IT to proactively diagnose whether an end user's device has been compromised by a harmful email or cyberattack. The feature can then block the employee from accessing more company applications, preventing the spread of an attack.

Another feature is called Virtual Assistant, which can help automate some of the onboarding and device management aspects of hiring a new employee.

“The Virtual Assistant stuff is cool, but I’m going to reserve judgement on it, because there is a ton of work that needs to go into getting that AI to give you the right answer,” Athanas said.


Why You Should Be Using VM Notes in PowerShell

One of the nicer Hyper-V features is the ability to maintain notes for each virtual machine. Most of my VMs are for testing and I'm the only one who accesses them, so I often record items like an admin password or when the VM was last updated. Of course, you would never store passwords in a production environment, but you might like to record when a VM was last modified and by whom. For managing a single VM, using Hyper-V Manager isn't a big deal. But when it comes to managing notes for multiple VMs, PowerShell is a better solution.

In this post, I'll show you how to manage VM notes with PowerShell, and I think you'll also see why you should be using them. Let's take a look.

Using Set-VM

The Hyper-V module includes a command called Set-VM, which has a -Notes parameter that allows you to set a note.

Displaying a Hyper-V VM note

As you can see, it works just fine. Even at scale.

Setting notes on multiple VMs

But there are some limitations. First off, there is no way to append to existing notes. You could retrieve the existing notes, build a new value in your script and then pass it to Set-VM. To clear a note, you run Set-VM with a value of "" for -Notes. That's not exactly intuitive. I decided to find a better way.
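Here is a rough sketch of both workarounds; the VM name SRV1 is a placeholder:

# Append by reading the existing note and writing the combined value back
$vm = Get-VM -Name SRV1
Set-VM -VM $vm -Notes "$($vm.Notes)`nUpdated $(Get-Date) by $env:USERNAME"

# Clearing a note means passing an empty string
Set-VM -Name SRV1 -Notes ""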

Diving Deep into WMI

Hyper-V stores much of its configuration in WMI (Windows Management Instrumentation). You'll notice that many of the Hyper-V cmdlets have CimSession parameters. But you can also dive into the underlying classes, which live in the root/virtualization/v2 namespace. Many of the classes are prefixed with Msvm_.

Getting Hyper-V CIM Classes with PowerShell

After a bit of research and digging around in these classes, I learned that to update a virtual machine's settings, you need to get an instance of Msvm_VirtualSystemSettingData, update it and then invoke the ModifySystemSettings() method of the Msvm_VirtualSystemManagementService class. Normally, I would do all of this with the CIM cmdlets like Get-CimInstance and Invoke-CimMethod. If I already have a CimSession to a remote Hyper-V host, why not re-use it?

But there was a challenge. The ModifySystemSettings() method needs a parameter: basically, a text version of the Msvm_VirtualSystemSettingData object. However, the text needs to be in a specific format. WMI has a way to produce that format, which you'll see in a moment; unfortunately, there is no way to do it with the CIM cmdlets. Whatever Set-VM is doing under the hood is above my pay grade. Let me walk you through this using Get-WmiObject.

First, I need to get the settings data for a given virtual machine.
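Here is a sketch of that query; SRV1 is a placeholder VM name, and filtering on VirtualSystemType keeps snapshot settings out of the results:

$vmName = 'SRV1'
$data = Get-WmiObject -Namespace root/virtualization/v2 `
    -Class Msvm_VirtualSystemSettingData `
    -Filter "ElementName='$vmName' AND VirtualSystemType='Microsoft:Hyper-V:System:Realized'"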

This object has all of the virtual machine settings.

I can easily assign a new value to the Notes property.

$data.Notes = "Last updated $(Get-Date) by $env:USERNAME"

At this point, I'm not doing much more than what Set-VM does. But if I wanted to append, I could take the existing note and add my new value to the end.
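A one-line sketch of the append, reusing the value already in $data:

# Keep the existing note and add a new line to the end
$data.Notes = "$($data.Notes)`nLast updated $(Get-Date) by $env:USERNAME"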

Next, I need to turn this into the proper text format. This is the part that I can't do with the CIM cmdlets.
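This is where Get-WmiObject pays off. The System.Management.ManagementObject it returns has a GetText() method that serializes the instance in the embedded-instance format ModifySystemSettings() expects:

# Serialize the settings instance as CIM DTD 2.0 text
$text = $data.GetText([System.Management.TextFormat]::CimDtd20)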

To commit the change, I need the system management service object.
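There is a single instance of this class per host, so no filter is needed:

$vmms = Get-WmiObject -Namespace root/virtualization/v2 -Class Msvm_VirtualSystemManagementService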

I need to invoke the ModifySystemSettings() method which requires a little fancy PowerShell work.
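The call itself is short once the serialized text is in hand:

# Apply the modified settings by passing in the serialized text
$result = $vmms.ModifySystemSettings($text)
$result.ReturnValue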

Invoking the WMI method with PowerShell

A return value of 0 indicates success.

Verifying the change

The Network Matters

It isn't especially difficult to wrap these steps into a PowerShell function. But here's the challenge: using Get-WmiObject with a remote server relies on legacy networking protocols. This is why Get-CimInstance is preferred and Get-WmiObject should be considered deprecated. So what to do? The answer is to run the WMI commands over a PowerShell remoting session. I can create a PSSession to the remote server and run commands in it with Invoke-Command. The connection will use WSMan and all the features of PowerShell remoting. Inside that session, the WMI commands run locally on the remote machine, so no legacy network connection is required.
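Putting it all together, here is a condensed sketch of the pattern; HV1 and SRV1 are placeholder names:

$session = New-PSSession -ComputerName HV1
Invoke-Command -Session $session -ScriptBlock {
    param($VMName, $Note)
    # These WMI calls run locally on the Hyper-V host
    $data = Get-WmiObject -Namespace root/virtualization/v2 `
        -Class Msvm_VirtualSystemSettingData `
        -Filter "ElementName='$VMName' AND VirtualSystemType='Microsoft:Hyper-V:System:Realized'"
    $data.Notes = $Note
    $vmms = Get-WmiObject -Namespace root/virtualization/v2 -Class Msvm_VirtualSystemManagementService
    $vmms.ModifySystemSettings($data.GetText([System.Management.TextFormat]::CimDtd20)).ReturnValue
} -ArgumentList 'SRV1', "Updated $(Get-Date) by $env:USERNAME"
Remove-PSSession $session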

The end result is that I get the best of both worlds – WMI commands doing what I need over a PowerShell remoting session. By now, this might seem a bit daunting. Don’t worry. I made it easy.

Set-VMNote

In my new PSHyperVTools module, I added a command called Set-VMNote that does everything I've talked about. You can install the module from the PowerShell Gallery. If you are interested in the sausage-making, you can view the source code on GitHub at https://github.com/jdhitsolutions/PSHyperV/blob/master/functions/public.ps1. The function should make it easier to manage notes and supports alternate credentials.

Set-VMNote help

Now I can create new notes.

Creating new notes

Or easily append.

Appending notes

It might be hard to tell from this. Here’s what it looks like in the Hyper-V manager.

Verifying the notes

Most of the time the Hyper-V PowerShell cmdlets work just fine and meet my needs. But if they don’t, that’s a great thing about PowerShell – you can just create your own solution! And as you can probably guess, I will continue to create and share my own solutions right here.
