
Navy sails SAP ERP systems to AWS GovCloud

The U.S. Navy has moved several SAP and other ERP systems from on premises to AWS GovCloud, a public cloud service designed to meet the regulatory and compliance requirements of U.S. government agencies.

The project entailed migrating 26 ERPs across 15 landscapes set up for around 60,000 users across the globe. The Navy tapped SAP National Security Services Inc. (NS2) for the migration. NS2 was spun out of SAP specifically to sell SAP systems that adhere to the highly regulated conditions that U.S. government agencies operate under.

Approximately half of the systems that moved to AWS GovCloud were SAP ERP systems running on Oracle databases, according to Harish Luthra, president of NS2’s secure cloud business. The SAP systems were also migrated to the SAP HANA database, while non-SAP systems remained on their respective databases.

Architecture simplification and reducing TCO

The Navy wanted to move the ERP systems to take advantage of the new technologies that are more suited for cloud deployments, as well as to simplify the underlying ERP architecture and to reduce the total cost of ownership (TCO), Luthra said.

The migration enabled the Navy to reduce its data footprint from 80 TB to 28 TB.


“Part of it was done through archiving, part was disk compression, so the cost of even the data itself is reducing quite a bit,” Luthra said. “On the AWS GovCloud side, we’re using one of the largest instances — 12 terabytes — and will be moving to a 24 terabyte instance working with AWS.”

The Navy also added applications to consolidate financial systems and improve data management and analytics functionality.

“We added one application called the Universe of Transactions, based on SAP Analytics that allows the Navy to provide a consolidated financial statement between Navy ERP and their other ERPs,” Luthra said. “This is all new and didn’t exist before on-premises and was only possible to add because we now have HANA, which enables a very fast processing of analytics. It’s a giant amount of transactions that we are able to crunch and produce a consolidated ledger.”


Accelerated timeline

The project was done at an accelerated pace that had to be sped up even more when the Navy altered its requirements, according to Joe Gioffre, SAP NS2 project principal consultant. The original go-live date was scheduled for May 2020, almost two years to the day after the project began. However, when the Navy tried to move a command working capital fund onto the on-premises ERP system, it discovered the system could not handle the additional data volume and workload.

This moved the HANA cloud migration go-live date up to August 2019 so the fund could be included by the start of the fiscal new year on Oct. 1, 2019.

“We went into a re-planning effort, drew up a new milestone plan, set up Navy staffing and NS2 staffing to the new plan so that we could hit all of the dates one by one and get to August 2019,” Gioffre said. “That was a colossal effort in re-planning and re-resourcing for both us and the Navy, and then tracking it to make sure we stayed on target with each date in that plan.”


Governance keeps project on track

Tight governance over the project was the key to completing it in the accelerated timeframe.

“We had a very detailed project plan with a lot of moving parts and we tracked everything in that project plan. If something started to fall behind, we identified it early and created a mitigation for it,” Gioffre explained. “If you have a plan that tracks to this level of detail and you fall behind, unless you have the right level of governance, you can’t execute mitigation quickly enough.”

The consolidation of the various ERPs onto one SAP HANA system was a main goal of the initiative, and it now sets up the Navy to take advantage of next-generation technology.

“The next step is planning a move to SAP S/4HANA and gaining process improvements as we go to that system,” he said.

Proving confidence in the public cloud

It’s not a particular revelation that public cloud hyperscalers like AWS can handle huge government workloads, but it is notable that the Department of Defense is confident in going to the cloud, according to analyst Joshua Greenbaum, principal at Enterprise Applications Consulting, a firm based in Berkeley, Calif.

“The glitches that happened with Amazon recently and [the breach of customer data from Capital One] highlight the fact that we have a long way to go across the board in perfecting the cloud model,” Greenbaum said. “But I think that SAP and its competitors have really proven that stuff does work on AWS, Azure and, to a lesser extent, Google Cloud Platform. They have really settled in as legitimate strategic platforms and are now just getting the bugs out of the system.”

Greenbaum is skeptical that the project was “easy,” but it would be quite an accomplishment if it was done relatively painlessly.

“Every time you tell me it was easy and simple and painless, I think that you’re not telling me the whole story because it’s always going to be hard,” he said. “And these are government systems, so they’re not trivial and simple stuff. But this may show us that if the will is there and the technology is there, you can do it. It’s not as hard as landing on the moon, but you’re still entering orbital space when you are going to these cloud implementations, so it’s always going to be hard.”

Go to Original Article
Author:

Capital One breach suspect may have hit other companies

A new report looking into the attacker accused in the Capital One breach discovered references to other potential victims, but no corroborating evidence has been found yet.

The FBI accused Paige Thompson of the breach; she allegedly went by the name “Erratic” on various online platforms, including an invite-only Slack channel. That channel was first reported on by investigative cybersecurity journalist Brian Krebs, who noted that file names referenced in the channel pointed to other organizations potentially being victims of similar attacks.

A new report by cybersecurity firm CyberInt, based in London, regarding the Capital One breach built on the information discovered by Krebs. Jason Hill, lead cybersecurity researcher at CyberInt, said the company was able to gain access to the Slack channel via an open invitation link.

“This link was obtained from the now-offline ‘Seattle Warez Kiddies’ Meetup group (Listed as ‘Organized by Paige Thomson’),” Hill wrote via email. “Based on the publicly available information at the time of report completion, such as Capital One’s statement and the [FBI’s] Criminal Complaint, we were able to conduct open source intelligence gathering to fill in some of the missing detail and follow social media leads to gain an understanding of the alleged threat actor and their activity over the past months.”

According to Hill, CyberInt researchers followed the trail through a GitHub account, GitLab page and a screenshot of a file archival process shared in the Slack channel.

“The right-hand side of the screen appears to show the output of the Linux command ‘htop’ that lists current processes being executed. In this case, under the ‘Command’ heading, we can see a number of ‘tar –remove-files -cvf – ‘ processes, which are compressing data (and then removing the uncompressed source),” Hill wrote. “These files correlate with the directory listing, and potential other victims, as seen later within the Slack channel.”
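As an aside, the flag combination Hill describes is easy to reproduce with GNU tar. This illustrative snippet (the directory and file names are made up, not the attacker’s) archives a directory to stdout while deleting the uncompressed source files as they are added:

```shell
# Illustrative demo of "tar --remove-files -cvf -" (GNU tar):
# archive demo_data to stdout, deleting the source files as they go.
mkdir -p demo_data
echo "sample record" > demo_data/customers.txt

tar --remove-files -cvf - demo_data > demo.tar

# The source file is gone, but the archive still contains it
ls demo_data/customers.txt 2>/dev/null || echo "source removed"
tar -tf demo.tar
```

Note that without a compression flag such as `-z`, tar only archives; any actual compression in such a pipeline would come from whatever consumed the stream.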

Between the files named in the screenshot and the corresponding messages in the Slack channel, it appeared as though in addition to the Capital One breach, the threat actor may have stolen 485 GB of data from various other organizations. Some organizations were implied by only file names, such as Ford, but others were named directly by Erratic in messages, including the Ohio Department of Transportation, Michigan State University, Infoblox and Vodafone.

Hill acknowledged that CyberInt did not directly contact any of the organizations named, because the company policy is normally to “contact organizations when our research detects specific vulnerabilities that can be mitigated, or threats detected by our threat intelligence platform.

“However in this case, our research was focused on the Capital One breach to gain an understanding of the threat actor’s tactics, techniques and procedures (TTP) and resulted in the potential identification of additional victims rather than the identification of any specific vulnerability or ongoing threat,” Hill wrote. “Our report offered general advice for those concerned about the TTP based on these findings.”

We contacted some of the organizations either directly named or implied via file name in Erratic’s Slack channel. The Ohio Department of Transportation did not respond to a request for comment. Ford confirmed an investigation is underway to determine if the company was the victim of a data breach.

A spokesperson for Michigan State University also confirmed an investigation is underway and the university is cooperating with law enforcement authorities, but at this point there is “no evidence to suggest MSU was compromised.”

Similarly, an Infoblox spokesperson said the company was “continuing to investigate the matter, however, at this time, there is no indication that Infoblox was in any way involved with the Capital One breach. Additionally, there is no indication of an intrusion or data breach causing Infoblox customer data to be exposed.”

A Vodafone spokesperson claimed the company takes security seriously, but added, “Vodafone is not aware of any information that relates to the Capital One security breach.”


Data ethics issues create minefields for analytics teams

GRANTS PASS, Ore. — AI technologies and other advanced analytics tools make it easier for data analysts to uncover potentially valuable information on customers, patients and other people. But, too often, consultant Donald Farmer said, organizations don’t ask themselves a basic ethical question before launching an analytics project: Should we?

In the age of GDPR and like-minded privacy laws, though, ignoring data ethics isn’t a good business practice for companies, Farmer warned in a roundtable discussion he led at the 2019 Pacific Northwest BI & Analytics Summit. IT and analytics teams need to be guided by a framework of ethics rules and motivated by management to put those rules into practice, he said.

Otherwise, a company runs the risk of crossing the line in mining and using personal data — and, typically, not as the result of a nefarious plan to do so, according to Farmer, principal of analytics consultancy TreeHive Strategy in Woodinville, Wash. “It’s not that most people are devious — they’re just led blindly into things,” he said, adding that analytics applications often have “unforeseen consequences.”

For example, he noted that smart TVs connected to home networks can monitor whether people watch the ads in shows they’ve recorded and then go to an advertiser’s website. But acting on that information for marketing purposes might strike some prospective customers as creepy, he said.

Shawn Rogers, senior director of analytic strategy and communications-related functions at vendor Tibco Software Inc., pointed to a trial program that retailer Nordstrom launched in 2012 to track the movements of shoppers in its stores via the Wi-Fi signals from their cell phones. Customers complained about the practice after Nordstrom disclosed what it was doing, and the company stopped the tracking in 2013.

“I think transparency, permission and context are important in this area,” Rogers said during the session on data ethics at the summit, an annual event that brings together a small group of consultants and vendor executives to discuss BI, analytics and data management trends.

AI algorithms add new ethical questions

Being transparent about the use of analytics data is further complicated now by the growing adoption of AI tools and machine learning algorithms, Farmer and other participants said. Increasingly, companies are augmenting — or replacing — human involvement in the analytics process with “algorithmic engagement,” as Farmer put it. But automated algorithms are often a black box to users.

Mike Ferguson, managing director of U.K.-based consulting firm Intelligent Business Strategies Ltd., said the legal department at a financial services company he works with killed a project aimed at automating the loan approval process because the data scientists who developed the deep learning models to do the analytics couldn’t fully explain how the models worked.


And that isn’t an isolated incident in Ferguson’s experience. “There’s a loggerheads battle going on now in organizations between the legal and data science teams,” he said, adding that the specter of hefty fines for GDPR violations is spurring corporate lawyers to vet analytics applications more closely. As a result, data scientists are focusing more on explainable AI to try to justify the use of algorithms, he said.

The increased vetting is driven more by legal concerns than data ethics issues per se, Ferguson said in an interview after the session. But he thinks that the two are intertwined and that the ability of analytics teams to get unfettered access to data sets is increasingly in question for both legal and ethical reasons.

“It’s pretty clear that legal is throwing their weight around on data governance,” he said. “We’ve gone from a bottom-up approach of everybody grabbing data and doing something with it to more of a top-down approach.”

Jill Dyché, an independent consultant who’s based in Los Angeles, said she expects explainable AI to become “less of an option and more of a mandate” in organizations over the next 12 months.

Code of ethics not enough on data analytics

Staying on the right side of the data ethics line takes more than publishing a corporate code of ethics for employees to follow, Farmer said. He cited Enron’s 64-page ethics code, which didn’t stop the energy company from engaging in the infamous accounting fraud scheme that led to bankruptcy and the sale of its assets. Similarly, he sees such codes having little effect in preventing ethical missteps on analytics.

“Just having a code of ethics does absolutely nothing,” Farmer said. “It might even get in the way of good ethical practices, because people just point to it [and say], ‘We’ve got that covered.'”

Instead, he recommended that IT and analytics managers take a rules-based approach to data ethics that can be applied to all three phases of analytics projects: the upfront research process, design and development of analytics applications, and deployment and use of the applications.


How to Set Up Hyper-V VM Groups with PowerShell

The other day I was poking around the Hyper-V PowerShell module and came across a few commands that I must have initially ignored. I had no idea this feature existed until I found the cmdlets. I can’t find anything in the Hyper-V Manager console that exposes this feature either, so if you want to take advantage of it, PowerShell is the way to go. It turns out that you can organize your virtual machines into groups with these commands.

VM Group cmdlets in PowerShell

NOTE: You should take the time to read through full help and examples before using any of these commands.
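On a host with the Hyper-V module installed, you can list the group-related cmdlets yourself. A quick sketch (the exact set may vary by Windows version):

```powershell
# Discover the VM group cmdlets in the Hyper-V module
Get-Command -Module Hyper-V -Noun VMGroup* | Select-Object Name
```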

Creating a VM Group

The commands for working with VM groups support remote connections and credentials. The default is the local host; note that you can only specify credentials for remote connections. Creating a new group is a pretty simple matter. All you need is a group name and type. You can create a group that is a collection of virtual machines (VMCollectionType) or a group that is a collection of other groups (ManagementCollectionType). I’m going to create a group for virtual machines.

Creating a new VM group
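Here is a sketch of what that looks like; the group name Test is my placeholder, not necessarily the one used in the original screenshot:

```powershell
# Create a group that holds virtual machines
New-VMGroup -Name Test -GroupType VMCollectionType
```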

I’ve created the group and can retrieve it with Get-VMGroup.

Retrieving a VM Group with PowerShell
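Retrieval is a one-liner (again using my placeholder group name):

```powershell
# Get the group by name; the Name parameter is positional
Get-VMGroup -Name Test
```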

You can only create one group at a time, but you can create the same group on multiple servers.

This command created the management group Master on both Hyper-V hosts.
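New-VMGroup accepts multiple computer names, so the command was presumably something like this (the host names HV1 and HV2 are placeholders):

```powershell
# Create the Master management group on two Hyper-V hosts at once
New-VMGroup -Name Master -GroupType ManagementCollectionType -ComputerName HV1, HV2
```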

Adding a Group Member

Adding members to a group requires a separate step but isn’t especially difficult. To add members to a VMCollectionType group, you need references to the virtual machine objects.
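A sketch of that step, assuming a VMCollectionType group named Test and VM names of my own invention:

```powershell
# Get references to the virtual machines, then add them to the group
$vms = Get-VM -Name SRV1, SRV2
Add-VMGroupMember -VMGroup (Get-VMGroup Test) -VM $vms
```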

The command won’t write anything to the pipeline unless you use -Passthru. You can take advantage of nested expressions and create a group with members all in one line.

With this one-line command, I created another group called Win and added a few virtual machines to the group.

Creating a VM Group and Members in a PowerShell one-liner
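A sketch of that one-liner; the Win* wildcard for the VM names is my assumption:

```powershell
# Create the Win group and populate it in a single line; -Passthru
# writes the group object to the pipeline so you can see the result
Add-VMGroupMember -VMGroup (New-VMGroup -Name Win -GroupType VMCollectionType) -VM (Get-VM Win*) -Passthru
```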

Since I have two groups, let me add them to the management group.

Adding VM Management Groups with PowerShell
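Management groups take other groups as members via the -VMGroupMember parameter. A sketch (Test is my placeholder name for the first group):

```powershell
# Nest the two VM groups under the Master management group
Add-VMGroupMember -VMGroup (Get-VMGroup Master) -VMGroupMember (Get-VMGroup Test, Win)
```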

And yes, you can put a virtual machine in more than one group.

Retrieving Groups and Group Members

Using Get-VMGroup is pretty straightforward, and once you understand the object output, you can customize it.

Enumerating VM Groups

Depending on the group type you will have a nested collection of objects. You can easily enumerate them using dotted object notation.

Listing VM Group virtual machines
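For a VMCollectionType group, the virtual machines live in the VMMembers property:

```powershell
# Drill into a group's virtual machines with dotted notation
(Get-VMGroup Win).VMMembers | Select-Object Name, State
```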

You can do something similar with management groups.

Expanding VM management groups with PowerShell
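A management group nests its child groups under the VMGroupMembers property, so a sketch of expanding one might look like:

```powershell
# Expand a management group down to the VM names in each child group
(Get-VMGroup Master).VMGroupMembers |
    Select-Object Name, @{Name = 'VMs'; Expression = { $_.VMMembers.Name }}
```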

Be aware that it is possible to have nested management groups which might make unrolling things to get to the underlying virtual machines a bit tricky. I would suggest restraint until you fully understand how VM groups work and how you intend to take advantage of them.

The output of the VMMembers property is the same virtual machine object you would get using Get-VM so you can pipe this to other commands.

Incorporating VM Groups into a PowerShell expression
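For example (the snapshot name is mine, and -WhatIf keeps this a dry run):

```powershell
# VMMembers are regular VM objects, so they pipe anywhere Get-VM output does
(Get-VMGroup Win).VMMembers | Checkpoint-VM -SnapshotName 'Before patching' -WhatIf
```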

Group membership can also be discovered from the virtual machine.

Listing groups from the virtual machine
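A sketch, assuming a VM named SRV1:

```powershell
# A virtual machine exposes its memberships via the Groups property
(Get-VM SRV1).Groups | Select-Object Name, GroupType
```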

You cannot manage group membership from the virtual machine itself.

To remove groups and members, the corresponding cmdlets should be self-evident and follow the same patterns I demonstrated for creating VM groups and adding members.

Potential Obstacles

When you assign a virtual machine to a group, the membership is linked to the virtual machine. This may pose a challenge down the road. I haven’t set up replication with a virtual machine that belongs to a group, so I’m not sure what the effect, if any, might be. But I do know that if you export a virtual machine that belongs to a group and import it on a different Hyper-V host, you’ll encounter an error about the missing group. You can remove the group membership on import, so it isn’t that much of a showstopper. Still, you might want to remove the virtual machine from any groups prior to exporting.

Doing More

The VM group cmdlets are very useful, but not useful enough. At least for me. I have a set of expectations for using these groups and the cmdlets as written don’t meet those expectations. Fortunately, I can write a few PowerShell functions to get the job done. Next time, I’ll share the tools I’m building around these commands.

In the meantime, I hope you’ll share how you think you’ll take advantage of this feature!


Author: Jeffery Hicks

For Sale – Watercooled i7-4790K build

I’m moving house and need funds for other priorities. I have all my AV gear for sale on here too.

Built with my own hands.

All components were bought from either CCL Computers or Scan.

Asus Z97-P
Intel i7-4790K (delidded running without HS with CPU Delid Die Guard for LGA115X)
G.Skill TridentX 16GB (2x8GB) 2400MHz DDR3 RAM
Sapphire Radeon R9 380 NITRO 4GB
Samsung EVO 850 120GB Solid State Drive
EVGA 750W G2
Core P5 Thermaltake Tempered Glass Open Frame Glass PC Gaming Case
Thermaltake TT Gaming Riser Cable 200mm PCI-E 3.0 X16
EK DuraClear 3M Clear Water Cooling Tubing 9.5mm/15.9mm
EK-Supremacy MX CPU block
EK 140 Revo D5 PWM Pump
EK-RES X3 250 Reservoir
Thermaltake Pacific CL420 Copper Water Cooling Radiator
EK 120mm Vardar F3-120 Fan x 3
EK-XTOP Revo D5 Plexi Pump Top
Gelid Solutions 1-to-4 PWM Fan Splitter

I have boxes for the motherboard, CPU, RAM, GPU, PSU (including all unused cables), radiator, pump, fans, CPU block, reservoir and SSD. This was a project I built over time, so components vary in age – it was mostly used as an Emby server for my NAS. Completely overspecced for what I used it for, but I like quality components and something that looks nice.

All works beautifully. I will need to format the SSD and you will need to install O/S.

Price and currency: 525
Delivery: Goods must be exchanged in person
Payment method: BT, PPG
Location: Brighouse, West Yorkshire
Advertised elsewhere?: Advertised elsewhere
Prefer goods collected?: I prefer the goods to be collected

______________________________________________________
This message is automatically inserted in all classifieds forum threads.
By replying to this thread you agree to abide by the trading rules detailed here.
Please be advised, all buyers and sellers should satisfy themselves that the other party is genuine by providing the following via private conversation to each other after negotiations are complete and prior to dispatching goods and making payment:

  • Landline telephone number. Make a call to check out the area code and number are correct, too
  • Name and address including postcode
  • Valid e-mail address

DO NOT proceed with a deal until you are completely satisfied with all details being correct. It’s in your best interest to check out these details yourself.


Microsoft Dynamics 365 AI going hard after Salesforce

Microsoft and Salesforce are attacking each other again. Microsoft unveiled Dynamics 365 AI tools that will beef up sales, marketing and, most of all, service and support, the day after Salesforce announced Quip Slides, a PowerPoint competitor.

Salesforce appears to be annexing Microsoft’s business-productivity territory, while Microsoft is rolling its forces deeper into Salesforce’s CRM domain by more tightly connecting Teams collaboration with its CRM suite, freshened up with new AI capabilities.

“You’ve got Salesforce announcing Quip Slides, and you’ve got Microsoft doing a whole bunch of integration between Teams and Dynamics … who’s going after whose market?” said Alan Lepofsky, analyst at Constellation Research.

In a media briefing ahead of its Ignite user conference, the tech giant took some direct shots at rival Salesforce in introducing Microsoft Dynamics 365 AI tools that buttress CRM processes. Of particular note was Dynamics 365 AI for Customer Service, which adds out-of-the-box virtual agents.

Assistive AI for contact centers


Virtual agents can take several forms; two of them are chatbots that do the talking on behalf of humans and assistive bots that prompt human agents with suggested answers for engaging live with customers on either voice or text channels.

New Microsoft bots, built on Azure Cognitive Services, won’t require the code-intensive development or consultant services that other vendors’ CRM tools do, claimed Alysa Taylor, Microsoft corporate vice president of business applications and global industry. She singled out Salesforce as a CRM competitor in her comments.

“Many vendors offer [virtual agents] in a way that is very cumbersome for organizations to adopt,” Taylor said. “It requires a large services engagement; Salesforce partners with IBM Watson to be able to deliver this.”

Either way, the bots will require training. Microsoft Dynamics 365 AI-powered bots can be trained by call center managers, asserted Navrina Singh, Microsoft AI principal product lead, during a demo.

Microsoft CEO Satya Nadella’s taking on Salesforce with new CRM AI tools

The bots can tap into phone log transcriptions, email and other contact center data stores to shape answers to customer problems and take some of the workload off of overburdened contact center agents, Singh said.

The virtual agent introductions were significant enough that Microsoft brought out CEO Satya Nadella for a cameo with Singh during the briefing.

“The thing that’s most exciting to me,” Nadella said, “… is that [Microsoft] can make every company out there an AI-first company. They already have customers, they already have data. If you can democratize the use of AI tools, every company can harness the power of AI.”

Other Dynamics 365 AI tools for CRM

Sales and marketing staffs get their own Dynamics 365 AI infusion, too.

Microsoft brings Dynamics 365 AI for Sales in line with Salesforce Einstein tools that use AI to prioritize lead pipelines and sales-team performance management.

Microsoft Dynamics 365 AI for Market Insights plumbs marketing, social media and other customer engagement data to improve customer relations and “engage in relevant conversations and respond faster to trends,” Taylor wrote in a blog post announcing the new system.

While the Microsoft moves appear effective, industry observers questioned whether Microsoft can make an impression against Salesforce’s massive market footprint, even if its tools are easier to use, more economical and more intuitive than Salesforce’s.

Lepofsky said he isn’t sure, because of the sheer numbers. The 150,000-strong Dreamforce user conference is at the same time as Ignite, and the latter will likely draw only about a sixth of the Dreamforce crowd. And Salesforce likely won’t be resting on its AI credentials either.

“I think you can speculate that Salesforce will also be talking about AI improvements at Dreamforce, so perhaps it’s not that differentiating for Dynamics,” Lepofsky said.

While Microsoft announced no release date for its AI tools, a preview site will go online this fall, Singh said.

For Sale – Asrock Deskmini Mini PC, G4600, 2x4gb Ram, 120gb SSD

Decided to sell the HTPC as I want to go other routes. Mint condition, boxes, warranty.

This little PC is very powerful and can easily be used as a main PC, and is obviously perfect as an HTPC. All drivers/updates/BIOS are installed.
In case you don’t know, the G4600 is basically 99% of an i3-7100.
There are 2 x 2.5″ HDD/SSD slots and 1 x M.2 2280 slot, but it has to be PCIe.

Asrock Deskmini
G4600 with intel stock cooler
2x4gb 2133Mhz Samsung DDR4
120gb Kingston A400 SSD
Intel AC WIFI card with antennas
Windows 10 Pro

Price and currency: 210
Delivery: Delivery cost is not included within my country
Payment method: BT/PPG/Cash on collection
Location: London
Advertised elsewhere?: Advertised elsewhere
Prefer goods collected?: I prefer the goods to be collected
