
Epicor ERP system focuses on distribution

Many ERP systems try to be all things to all use cases, but that often comes at the expense of heavy customizations.

Some companies are discovering that a purpose-built ERP is a better and more cost-effective bet, particularly for small and midsize companies. One such product is the Epicor ERP system Prophet 21, which is primarily aimed at wholesale distributors.

The functionality in the Epicor ERP system is designed to help distributors run processes more efficiently and make better use of data flowing through the system.

In addition to distribution-focused functions, the Prophet 21 Epicor ERP system includes the ability to integrate value-added services, which could be valuable for distributors, said Mark Jensen, Epicor senior director of product management.

“A distributor can do manufacturing processes for their customers, or rentals, or field service and maintenance work. Those are three areas that we focused on with Prophet 21,” Jensen said.

Prophet 21’s functionality is particularly strong in managing inventory, including picking, packing and shipping goods, as well as receiving and put-away processes.

Specialized functions for distributors

Distribution companies that specialize in certain industries or products have different processes that Prophet 21 includes in its functions, Jensen said. For example, Prophet 21 has functionality designed specifically for tile and slab distributors.

“The ability to be able to work with the slab of granite or a slab of marble — what size it is, how much is left after it’s been cut, transporting that slab of granite or tile — is a very specific functionality, because you’re dealing with various sizes, colors, dimensions,” he said. “Being purpose-built gives [the Epicor ERP system] an advantage over competitors like Oracle, SAP, NetSuite, [which] either have to customize or rely on a third-party vendor to attach that kind of functionality.”

Jergens Industrial Supply, a wholesale supplies distributor based in Cleveland, has improved efficiency and is more responsive to shifting customer demands using Prophet 21, said Tony Filipovic, Jergens Industrial Supply (JIS) operations manager.


“We like Prophet 21 because it’s geared toward distribution and was the leading product for distribution,” Filipovic said. “We looked at other systems that say they do manufacturing and distribution, but I just don’t feel that that’s the case. Prophet 21 is something that’s been top of line for years for resources distribution needs.”

One of the key differentiators for JIS was Prophet 21’s inventory management functionality, which was useful because distributors manage inventory differently than manufacturers, Filipovic said.

“All that functionality within that was key, and everything is under one package,” he said. “So from the moment you are quoting or entering an order to purchasing the product, receiving it, billing it, shipping it and paying for it was all streamlined under one system.”

Another key new feature is an IoT-enabled button similar to Amazon Dash buttons that enables customers to resupply stocks remotely. This allows JIS to “stay ahead of the click” and offer customers lower cost and more efficient delivery, Filipovic said.

“Online platforms are becoming more and more prevalent in our industry,” he said. “The Dash button allows customers to find out where we can get into their process and make things easier. We’ve got the ordering at the point where customers realize that when they need to stock, all they do is press the button and it saves multiple hours and days.”

Epicor Prophet 21 a strong contender in purpose-built ERP

Epicor Prophet 21 is on solid ground with its purpose-built ERP focus, but companies have other options they can look at, said Cindy Jutras, president of Mint Jutras, an ERP research and advisory firm in Windham, NH.

“Epicor Prophet 21 is a strong contender from a feature and function standpoint. I’m a fan of solutions that go that last mile for industry-specific functionality, and there aren’t all that many for wholesale distribution,” Jutras said. “Infor is pretty strong, NetSuite plays here, and then there are a ton of little guys that aren’t as well-known.”

Prophet 21 may take advantage of new cloud capabilities to compete better in some global markets, said Predrag Jakovljevic, principal analyst at Technology Evaluation Centers, an enterprise computing analysis firm in Montreal.

“Of course a vertically-focused ERP is always advantageous, and Prophet 21 and Infor SX.e go head-to-head all the time in North America,” Jakovljevic said. “Prophet 21 is now getting cloud enabled and will be in Australia and the UK, where it might compete with NetSuite or Infor M3, which are global products.”

Go to Original Article
Author:

Microsoft’s history and future strategy beyond 2020

In 1975, a 20-year-old Bill Gates stated a bold ambition that many at the time thought naïve: his fledgling startup would put a computer on every desk and in every home.

Gates and his co-founder, Paul Allen, never quite realized that ambition. They did, however, grow “Micro-Soft” from a tiny, underfunded company selling a BASIC interpreter for the Altair 8800 microcomputer to a $22.9 billion company selling operating systems, applications and tools with a 90% share of the microcomputer software market by the year 2000 — 25 years into Microsoft’s history. Close enough.

That year, Microsoft was among the top five most valuable companies with a market cap of $258 billion. Fast forward to fiscal year 2020: The company’s market cap is slightly over $1 trillion and it now tops the list of the world’s most valuable companies.

One could surmise that Microsoft’s trip to the top of the heap was predictable given its position in the fast-growing computer industry over the past two decades. But the journey between the two mountain peaks saw dramatic changes to the company’s senior management and bold technology changes that strayed far from the products that made it rich and famous. It also helped that many of its archrivals made strategic missteps, opening doors of opportunity.

Microsoft’s history marked by leadership changes

There are the more obvious reasons Microsoft remained near or at the top of the most influential companies in high tech: the arrival of Satya Nadella, who took over from Steve Ballmer; the subsequent refocusing from proprietary products to open source; and the decision to make the cloud its first priority.

But as important to the company’s success as the right people rising to the top and the right set of priorities is an often-overlooked factor: Microsoft’s slow and steady progress convincing its mammoth user base to buy its core products through long-term, cloud-based subscriptions.

“If you want to point to one thing that’s kept the company’s revenues growing over the past 20 years it is subscription selling,” said Rob Helm, managing vice president of research at Directions On Microsoft, an independent analysis firm in Kirkland, Wash. “It’s not very exciting but the company has used all kinds of clever tactics to shift users to this model. And they aren’t done yet.”

The move to subscription selling, an initiative that originated before Steve Ballmer took over the day-to-day operations of the company, started with its Licensing 6.0 and Software Assurance program in 2002. The program got off to a slow start, mainly because users were unaccustomed to buying their products through long-term licensing contracts, said Al Gillen, group vice president in IDC’s software development and open source practice.

“That program wasn’t used much at all. Hardly anyone back then used subscriptions to buy software,” Gillen said.

But with the arrival of the cloud, Office 365 and Microsoft 365 in particular, longer-term cloud licensing skyrocketed.

“[Microsoft] became very enthusiastic about the cloud because for them, it was yet another way of moving its customers to long-term subscriptions,” Helm said.

The biggest obstacle to moving many of its customers to cloud-based subscriptions was Microsoft’s own success. For decades of Microsoft’s history, the company was wedded to its business model of selling a stack of products that included the operating system, applications and utilities, and signing up major hardware suppliers to sell the stack bundled with their systems. The company grew rich with this model, but by the mid-2000s, trouble was brewing.


Under Ballmer’s reign, the software stack model became outdated with the encroaching age of the cloud and open source software led by a new raft of competitors like AWS and Google. Nadella saw this and knew it was time to accelerate the company’s cloud-based subscription business.

So, although Ballmer famously monkey-danced his way across a stage shouting “developers, developers, developers” at a Microsoft 25th anniversary event in 2000, it was Nadella who knew what Windows and non-Windows developers wanted in the age of the cloud — and how to go about enlisting their cooperation.

“Satya decisively moved away from the full stack model and went to an IaaS model more resembling that of AWS,” Helm said. “This pivot allowed them to deliver cloud services much faster than it could have otherwise. But they could not have done this without cutting Windows adrift from the rest of the stack,” he said.

Open source fork in the road

Microsoft’s pivot to support Linux in 2018 and open source happened “just in time,” according to Gillen. While the company had open source projects underway as far back as the 2004 to 2005 timeframe, including a project to move SQL Server to Linux, it took a long while before open source was accepted across Microsoft’s development groups, according to Gillen.

“A developer [inside Microsoft] once told me he would show up for meetings and the Windows guys would look at him and say, ‘We don’t want you at this meeting, you’re the open source guy,” Gillen said. “Frankly, looking back they were lucky to make the transition at all.”

What made the transition to long-term cloud subscriptions easier for Microsoft and its users was the combination of cloud and Linux vendors who already had such licensing in play, according to Gillen.

“It became more palatable to users because they were getting more exposure to it from a variety of sources,” he said.

Microsoft has so fully embraced Linux, not just by delivering Linux-compatible products and cloud-based services but also through acquisitions such as GitHub, the world’s largest open source repository, that it is becoming hard to remember a time in Microsoft’s history when the company was despised by the open source community.

“If you are a 28-year-old programmer today, you aren’t aware of a time Microsoft was hated by the Linux world,” Gillen said. “But I’d argue now that Microsoft is as invested in open source software as any large cloud vendor.”

Microsoft cloud subscriptions start with desktops

One advantage Microsoft had over its more traditional competitors in moving to a SaaS-based subscription model was the fact that it started by moving desktop applications to the cloud instead of server-based applications that companies like IBM, Oracle and SAP were faced with.

“Microsoft’s desktop software is an easier conversion to the cloud and to move into a subscription model,” said Geoff Woollacott, senior strategy consultant and principal analyst at Technology Business Research Inc. “The degree of difficulty with desktops compared to an on-prem, server-based database that has to be changed to a microservices, subscription-based model is much harder.”

While Microsoft has downplayed the strategic importance of Windows in favor of Azure and cloud-based open source offerings, analysts believe the venerable operating system’s life will extend well into the future. IDC’s Gillen estimated that the Windows desktop and server franchise is likely still in excess of $20 billion a year, which is more than 15% of the company’s overall revenues of $125.8 billion for fiscal 2019 ended June 30.

“Large installed bases have longevity that goes far beyond what most people expect. Just look at mainframes as an example,” Gillen said. “I’d argue that 60% to 70% of Windows apps are going to be around 10 to 15 years from now, which means Windows doesn’t go away.”

And those Windows apps are all being infused with AI capabilities, in the Azure cloud, on servers and desktops. Microsoft is also making it easier for developers to create AI-infused applications, with tools like AI Builder for PowerApps.

Microsoft’s AI, quantum computing future in focus

While Gillen and other analysts are hesitant to predict the future of Windows and its applications 20 years from now, most believe the company will spend a generous amount of time focused on quantum computing.

At its recent Ignite conference, Nadella and other Microsoft executives made it clear they’ll have a deep commitment to quantum computing over the next couple of decades. The company delivered its first quantum software a couple of years ago and, surprisingly, is developing a processor capable of running quantum software with plans to deliver other quantum hardware components.

Over the coming years, Microsoft plans to build quantum systems to solve a wide range of complex problems faced by those doing advanced scientific and enterprise work, said Julie Love, senior director of Quantum Computing at Microsoft.

“To realize that promise we will need a full system that scales with hundreds and thousands of qubits. It’s why we have this long-term deep development effort in our labs across the globe,” she said.

Microsoft’s first steps toward introducing quantum technology for enterprise use will be to combine quantum algorithms with classical algorithms to improve speed and performance and introduce new capabilities into existing systems. The company is already beta testing this approach with Case Western Reserve University, where quantum algorithms have tripled the performance of MRI machines.

“Quantum is going to be a hybrid architecture working alongside classical [architectures],” Love said. “If you look at the larger opportunities down the road, the most promise [for quantum computing] is in chemistry, science, optimization and machine learning.”

Unlike Bill Gates, Satya Nadella isn’t promising a quantum computer on every desktop and in every home by 2040. But you can assume that as long as Gates has an association with the company, it will make an earnest effort to put Microsoft software on those desktops that do.


Kubernetes tools vendors vie for developer mindshare

SAN DIEGO — The notion that Kubernetes solves many problems as a container orchestration technology belies the complexity it adds in other areas, namely for developers who need Kubernetes tools.

Developers at the KubeCon + CloudNativeCon North America 2019 event here this week noted that although native tooling for development on Kubernetes continues to improve, there’s still room for more.

“I think the tooling thus far is impressive, but there is a long way to go,” said a software engineer and Kubernetes committer who works for a major electronics manufacturer and requested anonymity.

Moreover, “Kubernetes is extremely elegant, but there are multiple concepts for developers to consider,” he said. “For instance, I think the burden of the onboarding process for new developers and even users sometimes can be too high. I think we need to build more tooling as we flesh out the different use cases that communities bring out.”

Developer-oriented approach

Enter Red Hat, which introduced an update of its Kubernetes-native CodeReady Workspaces tool at the event.

Red Hat CodeReady Workspaces 2 enables developers to build applications and services on their laptops that mirror the environment they will run in production. And onboarding is but one of the target use cases for the technology, said Brad Micklea, vice president of developer tools, developer programs and advocacy at Red Hat.

The technology is especially useful in situations where security is an issue, such as bringing in new contracting teams or using offshore development teams where developers need to get up and running with the right tools quickly.


CodeReady Workspaces runs on the Red Hat OpenShift Kubernetes platform.

Initially, new enterprise-focused developer technologies are generally used in experimental, proof-of-concept projects, said Charles King, an analyst at Pund-IT in Hayward, Calif. Yet over time those that succeed, like Kubernetes, evolve from the proof-of-concept phase to being deployed in production environments.

“With CodeReady Workspaces 2, Red Hat has created a tool that mirrors production environments, thus enabling developers to create and build applications and services more effectively,” King said. “Overall, Red Hat’s CodeReady Workspaces 2 should make life easier for developers.”

In addition to popular features from the first version, such as an in-browser IDE, Lightweight Directory Access Protocol support, Active Directory and OpenAuth support as well as one-click developer workspaces, CodeReady Workspaces 2 adds support for Visual Studio Code extensions, a new user interface, air-gapped installs and a shareable workspace configuration known as Devfile.

“Workspaces is just generally kind of a way to package up a developer’s working workspace,” Red Hat’s Micklea said.

Overall, the Kubernetes community is primarily “ops-focused,” he said. However, tools like CodeReady Workspaces help to empower both developers and operations.

For instance, at KubeCon, Amr Abdelhalem, head of the cloud platform at Fidelity Investments, said the way he gets teams initiated with Kubernetes is to have them deliver on small projects and move on from there. CodeReady Workspaces is ideal for situations like that because it simplifies developer adoption of Kubernetes, Micklea said.

Such a tool could be important for enterprises that are banking on Kubernetes to move them into a DevOps model to achieve business transformation, said Charlotte Dunlap, an analyst with GlobalData.

“Vendors like Red Hat are enhancing Kubernetes tools and CLI [Command Line Interface] UIs to bring developers with more access and visibility into the ALM [Application Lifecycle Management] of their applications,” Dunlap said. “Red Hat CodeReady Workspaces is ultimately about providing enterprises with unified management across endpoints and environments.”

Competition for Kubernetes developer mindshare

Other companies that focus on the application development platform, such as IBM and Pivotal, have also joined the Kubernetes developer enablement game.

Earlier this week, IBM introduced a set of new open-source tools to help ease developers’ Kubernetes woes. Meanwhile, at KubeCon this week, Pivotal made its Pivotal Application Service (PAS) on Kubernetes generally available and also delivered a new release of the alpha version of its Pivotal Build Service. The PAS on Kubernetes tool enables developers to focus on coding while the platform automatically handles software deployment, networking, monitoring, and logging.

The Pivotal Build Service enables developers to build containers from source code for Kubernetes, said James Watters, senior vice president of strategy at Pivotal. The service automates container creation, management and governance at enterprise scale, he said.

The build service brings technologies such as Pivotal’s kpack and Cloud Native Buildpacks to the enterprise. Cloud Native Buildpacks address dependencies in the middleware layer, such as language-specific frameworks. Kpack is a set of resource controllers for Kubernetes. The Build Service defines the container image, its contents and where it should be kept, Watters said.

Indeed, Watters said he believes it just might be game over in the Kubernetes tools space because Pivotal owns the Spring Framework and Spring Boot, which appeal to a wide swath of Java developers; Java, he said, is “one of the most popular ways enterprises build applications today.”

“There is something to be said for the appeal of Java in that my team would not need to make wholesale changes to our build processes,” said a Java software developer for a financial services institution who requested anonymity because he was not cleared to speak for the organization.

Yet, in today’s polyglot programming world, programming language is less of an issue as teams have the capability to switch languages at will. For instance, Fidelity’s Abdelhalem said his teams find it easier to move beyond a focus strictly on tools and more on overall technology and strategy to determine what fits in their environment.


How PowerCLI automation brings PowerShell capabilities to VMware

VMware admins can use PowerCLI to automate many common tasks and operations in their data centers and perform them at scale. Windows PowerShell executes PowerCLI commands via cmdlets: lightweight, single-purpose commands that each perform one specific function.

Automation can help admins keep a large, virtualized environment running smoothly. It helps with resource and workload provisioning. It also adds speed and consistency to most operations, since an automated task should behave the same way every time. And because automation can guide daily repetitions of testing, configuration and deployment without introducing the same errors that a tired admin might, it aids in the development of modern software as well.

PowerShell provides easy automation for Windows environments. VMware admins can also use the capabilities of PowerShell, however, with the help of VMware’s PowerCLI, which uses PowerShell as a framework to execute automated tasks on VMware environments.

PowerShell and PowerCLI

In a VMware environment, PowerCLI provides automation and management at scale, far more quickly than working through a GUI, by using the PowerShell framework. PowerCLI functions as a command-line interface (CLI) tool that snaps into PowerShell, which executes its commands through cmdlets. PowerCLI cmdlets can manage infrastructure components, such as High Availability, Distributed Resource Scheduler and vMotion, and can perform tasks such as gathering information, powering VMs on and off, and altering workloads and files.


PowerShell cmdlet names consist of a verb, which defines an action to take, and a noun, which defines the object on which to perform that action. Parameters provide additional detail and specificity to PowerShell commands. In a single line of code, admins can enact mass changes to an entire VMware environment.
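The verb-noun anatomy is easiest to see in a short sketch. The server address and VM names below are hypothetical placeholders, and running the commands assumes the VMware.PowerCLI module is installed and you have access to a vCenter server:

```powershell
# Connect to a vCenter server (hypothetical address; prompts for credentials)
Connect-VIServer -Server vcenter.example.com

# Verb "Get" + noun "VM": retrieve objects. The -Name parameter narrows
# the result to VMs whose names start with "web".
Get-VM -Name "web*"

# Verb "Set" + noun "VM": act on an object. Parameters supply the detail.
Set-VM -VM "web01" -NumCpu 4 -MemoryGB 16 -Confirm:$false
```

The same verb-noun pattern holds across all of PowerShell, which is why PowerCLI cmdlets feel familiar to anyone who already uses PowerShell for Windows administration.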

Common PowerCLI cmdlets

You can automate vCenter and vSphere using a handful of simple cmdlets.

With just five cmdlets, you can execute most major vCenter tasks. To obtain information about a VM — such as a VM’s name, power state, guest OS and ESXi host — use the Get-VM cmdlet. To modify an existing vCenter VM, use Set-VM. An admin can use Start-VM to start a single VM or many VMs at once. To stop a VM, use Stop-VM, which simply powers off a VM immediately, or Stop-VMGuest, which performs a more graceful shutdown of the guest OS. You can use these cmdlets to perform any of these tasks at scale across an entire data center.
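As a rough sketch of what “at scale” looks like, these cmdlets can be chained through the PowerShell pipeline. The folder and VM names below are invented, and the commands assume an active Connect-VIServer session:

```powershell
# Report the name, power state and host of every VM in the inventory
Get-VM | Select-Object Name, PowerState, VMHost

# Start every powered-off VM in a (hypothetical) "Production" folder
Get-VM -Location "Production" |
    Where-Object { $_.PowerState -eq "PoweredOff" } |
    Start-VM

# Gracefully shut down the guest OS of every VM whose name starts with "test"
Get-VM -Name "test*" | Stop-VMGuest -Confirm:$false
```

Because each pipeline stage operates on whatever objects flow into it, the same one-liner works identically on five VMs or five thousand.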

You can also automate vSphere with PowerCLI. One of the most useful cmdlets for vSphere management is Copy-VMGuestFile, which enables an admin to copy files and folders from a local machine to a vSphere VM. Admins can add a number of parameters to this cmdlet to fine-tune its behavior. For example, there is -GuestCredential, which supplies credentials to authenticate inside the VM’s guest OS, and -GuestToLocal, which reverses the flow of information.
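A minimal sketch of both directions follows; the paths, VM name and credential are placeholders, and an active vCenter session is assumed:

```powershell
# Prompt once for the guest OS credentials used inside the VM
$cred = Get-Credential

# Push a config file from the local machine into the guest
# (-LocalToGuest makes the default direction explicit)
Copy-VMGuestFile -Source "C:\configs\app.conf" -Destination "C:\app\" `
    -VM "app01" -LocalToGuest -GuestCredential $cred

# Pull a log file back out of the guest with -GuestToLocal
Copy-VMGuestFile -Source "C:\app\logs\app.log" -Destination "C:\collected\" `
    -VM "app01" -GuestToLocal -GuestCredential $cred
```

Because the copy goes through VMware Tools rather than the network file system, it works even when the guest has no open file shares.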

Recent updates to PowerCLI and PowerShell

PowerCLI features over 500 separate commands, and the list is only growing. In June 2019, VMware released PowerCLI 11.3, which added 22 new cmdlets for HCX management and support for opaque networks, additional network adapter types and high-level promotion of instant clones.

PowerShell is more than simply PowerCLI, of course. In May 2019, Microsoft announced a preview of the next version of PowerShell, PowerShell 7, which includes several new APIs from the .NET Core 3.0 runtime. At the PowerShell summit in September 2019, Microsoft announced several other developments to PowerShell programming.

PowerShell now works with AWS serverless computing, which enables you to manage a Windows deployment without managing a Windows Server machine. So, you can run PowerShell in AWS Lambda and use it to handle serverless events, such as placing an image in an AWS Simple Storage Service bucket and converting that image to multiple resolutions.

PowerShell also offers a service called Simple Hierarchy in PowerShell (SHiPS). An admin can use SHiPS to build a hierarchical file system provider from scratch and bypass the normal complexity of such a task. SHiPS reduces the amount of code it takes to write a provider module from thousands of lines to around 20.
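The SHiPS module (published on the PowerShell Gallery) implements that provider model as a small class hierarchy. The sketch below is illustrative only; the class name, item names and drive name are invented, and the exact attribute syntax follows the SHiPS documentation as best understood here:

```powershell
using module SHiPS

# A SHiPS provider is just classes: directories extend SHiPSDirectory
# and return their children from GetChildItem().
[SHiPSProvider(UseCache = $true)]
class SiteRoot : SHiPSDirectory
{
    SiteRoot([string]$name) : base($name) {}

    [object[]] GetChildItem()
    {
        # Leaf items appear as files when the drive is browsed
        return @([SHiPSLeaf]::new('web01'), [SHiPSLeaf]::new('web02'))
    }
}
```

Once the module containing the class is loaded, the hierarchy is mounted as a drive, e.g. `New-PSDrive -Name Sites -PSProvider SHiPS -Root 'MyModule#SiteRoot'`, and browsed with the standard `Get-ChildItem Sites:` — no custom provider plumbing required.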


Forus Health uses AI to help eradicate preventable blindness – AI for Business

Big problems, shared solutions

Tackling global challenges has been the focus of many health data consortiums that Microsoft is enabling. The Microsoft Intelligent Network for Eyecare (MINE) – the initiative that Chandrasekhar read about – is now part of the Microsoft AI Network for Healthcare, which also includes consortiums focused on cardiology and pathology.

For all three, Microsoft’s aim is to play a supporting role to help doctors and researchers find ways to improve health care using AI and machine learning.

“The health care providers are the experts,” said Prashant Gupta, Program Director in Azure Global Engineering. “We are the enabler. We are empowering these health care consortiums to build new things that will help with the last mile.”

In the Forus Health project, that “last mile” started by ensuring image quality. When members of the consortium began doing research on what was needed in the eyecare space, Forus Health was already taking the 3nethra classic to villages to scan hundreds of villagers in a day. But because the images were being captured by minimally trained technicians in areas open to sunlight, close to 20% of the images were not of high enough quality to be used for diagnostic purposes.

“If you have bad images, the whole process is crude and wasteful,” Gupta said. “So we realized that before we start to understand disease markers, we have to solve the image quality problem.”

Now, an image quality algorithm immediately alerts the technician when an image needs to be retaken.

The same thought process applies to the cardiology and pathology consortiums. The goal is to see what problems exist, then find ways to use technology to help solve them.

“Once you have that larger shared goal, when you have partners coming together, it’s not just about your own efficiency and goals; it’s more about social impact,” Gupta said.

And the highest level of social impact comes through collaboration, both within the consortiums themselves and when working with organizations such as Forus Health that take that technology out into the world.

Chandrasekhar said he is eager to see what comes next.

“Even though it’s early, the impact in the next five to 10 years can be phenomenal,” he said. “I appreciated that we were seen as an equal partner by Microsoft, not just a small company. It gave us a lot of satisfaction that we are respected for what we are doing.”

Top image: Forus Health’s 3nethra classic is an eye-scanning device that can be attached to the back of a moped and transported to remote locations. Photo by Microsoft. 

Leah Culler edits Microsoft’s AI for Business and Technology blog.

Go to Original Article
Author: Microsoft News Center

Microsoft’s new approach to hybrid: Azure services when and where customers need them | Innovation Stories

As business computing needs have grown more complex and sophisticated, many enterprises have discovered they need multiple systems to meet various requirements – a mix of technology environments in multiple locations, known as hybrid IT or hybrid cloud.

Technology vendors have responded with an array of services and platforms – public clouds, private clouds and the growing edge computing model – but there hasn’t necessarily been a cohesive strategy to get them to work together.

“We got here in an ad hoc fashion,” said Erik Vogel, global vice president for customer experience for HPE GreenLake at Hewlett Packard Enterprise. “Customers didn’t have a strategic model to work from.”

Instead, he said, various business owners in the same company may have bought different software as a service (SaaS) applications, or developers may have independently started leveraging Amazon Web Services, Azure or Google Cloud Platform to develop a set of applications.

At its Ignite conference this week in Orlando, Florida, Microsoft announced its solution to such cloud sprawl. The company has launched a preview of Azure Arc, which offers Azure services and management to customers on other clouds or infrastructure, including those offered by Amazon and Google.

John JG Chirapurath, general manager for Azure data, blockchain and artificial intelligence at Microsoft, said the new service is both an acknowledgement of, and a response to, the reality that many companies face today. They are running various parts of their businesses on different cloud platforms, and they also have a lot of data stored on their own new or legacy systems.

In all those cases, he said, these customers are telling Microsoft they could use the benefits of Azure cloud innovation whether or not their data is stored in the cloud, and they could benefit from having the same Azure capabilities – including security safeguards – available to them across their entire portfolio.

“We are offering our customers the ability to take their services, untethered from Azure, and run them inside their own datacenter or in another cloud,” Chirapurath said.

Microsoft says Azure Arc builds on years of work the company has done to serve hybrid cloud needs. For example, Azure Resource Manager, released in 2014, was created with the vision that it would manage resources outside of Azure, including in companies’ internal servers and on other clouds.

That flexibility can help customers operate their services on a mix of clouds more efficiently, without purchasing new hardware or switching among cloud providers. Companies can use a public cloud to obtain computing power and data storage from an outside vendor, but they can also house critical applications and sensitive data on their own premises in a private cloud or server.

Then there’s edge computing, which stores data where the user is, between the company and the public cloud: for example, on customers’ mobile devices or on sensors in smart buildings like hospitals and factories.


That’s compelling for companies that need to run AI models on systems that aren’t reliably connected to the cloud, or to make computations more quickly than if they had to send large amounts of data to and from the cloud. But it also must work with companies’ cloud-based, internet-connected systems.

“A customer at the edge doesn’t want to use different app models for different environments,” said Mark Russinovich, Azure chief technology officer. “They need apps that span cloud and edge, leveraging the same code and same management constructs.”

Streamlining and standardizing a customer’s IT structure gives developers more time to build applications that produce value for the business instead of managing multiple operating models. And enabling Azure to integrate administrative and compliance needs across the enterprise, automating system updates and security enhancements, brings additional savings in time and money.

“You begin to free up people to go work on other projects, which means faster development time, faster time to market,” said HPE’s Vogel. HPE is working with Microsoft on offerings that will complement Azure Arc.

Arpan Shah, general manager of Azure infrastructure, said Azure Arc allows companies to use Azure’s governance tools for their virtual machines, Kubernetes clusters and data across different locations, helping ensure companywide compliance on things like regulations, security, spending policies and auditing tools.

Azure Arc is underpinned in part by Microsoft’s commitment to technologies that customers are using today, including virtual machines, containers and Kubernetes, an open source system for organizing and managing containers. That makes clusters of applications easily portable across a hybrid IT environment – to the cloud, the edge or an internal server.

“It’s easy for a customer to put that container anywhere,” Chirapurath said. “Today, you can keep it here. Tomorrow, you can move it somewhere else.”

Microsoft says these latest Azure updates reflect an ongoing effort to better understand the complex needs of customers trying to manage their Linux and Windows servers, Kubernetes clusters and data across environments.

“This is just the latest wave of this sort of innovation,” Chirapurath said. “We’re really thinking much more expansively about customer needs and meeting them according to how they’d like to run their applications and services.”

Top image: Erik Vogel, global vice president for customer experience for HPE GreenLake at Hewlett Packard Enterprise, with a prototype of memory-driven computing. HPE is working with Microsoft on offerings that will complement Azure Arc. Photo by John Brecher for Microsoft.


Go to Original Article
Author: Microsoft News Center

Employee activism, from composting to protests, is an HR issue

Like many software companies, CyberArk Software Ltd. has policies and practices that appeal to people with skills in high demand. They include a social responsibility policy and catered lunches. The information security software firm also has something else that appeals to younger employees — an employee activism effort that brought about some real change.

Lex Register, an associate in corporate development and strategy at CyberArk, was hired in 2018. Soon after, he saw gaps in the firm’s environmental sustainability practices. The firm wasn’t, for instance, collecting food scraps for composting.

“If you’ve never composted before, the idea of leaving food out in your office can be sort of a sticky subject,” said Register, who has a strong interest in environmental issues.

Register approached his managers at CyberArk’s U.S. headquarters in Newton, Mass., about improving its environmental sustainability. He had some specific ideas and wanted to put together an employee team to work on it. Management gave it approval and a budget.

Register helped organize a “green team,” which now makes up about 25% of its Newton office staff of 200. The firm’s global workforce is about 1,200.

CyberArk’s green team has four subgroups: transportation, energy, community and “green” habits in the office. It also has a management steering committee. Collectively, these groups undertake a variety of actions, such as volunteering on projects in the community, improving environmental practices in the office and working on bigger issues, such as installing electric vehicle charging stations for the office building.


“When I think about the companies I want to work for, I really want to have pride in everything they do,” Register said. 

Junior employees lead the effort

The green team subgroups are headed by junior employees, according to Register, who is 28.

“It’s a way for a lot of our junior employees who don’t necessarily have responsibility for managing people to sort of step up,” Register said. They “can run some of their own projects and show some leadership capabilities.”

Employee activism has become an increasingly public issue in the last 12 months. In May, for instance, thousands of Amazon employees signed a letter pressing the firm for action. In September, thousands walked out as part of the Global Climate Strike.

“This walkout is either a result of employees not feeling heard, or employees feeling heard but fundamentally disagreeing with their leaders,” said Henry Albrecht, CEO at Limeade Inc., which makes employee experience systems. “The first problem has a simple fix: listen to employees, regularly, intentionally and with empathy,” he said.

Some companies, such as Ford Motor Co., are using HR tools to listen to their employees and get more frequent feedback. In an interview with SearchHRSoftware, a Ford HR official said recently this kind of feedback encouraged the firm to join California in seeking emission standards that are stricter than those sought by President Trump’s administration.   

But employee activism that leads to public protest doesn’t tell the full employee activism story.

Interest in green teams rising

The Green Business Bureau provides education, assessment tools and processes that firms can use to measure their sustainability practices. In the past nine months, the bureau has been hearing more about the formation of sustainability committees at firms, said Bill Zujewski, its CMO. The employees leading the efforts are “almost always someone who’s a few years out of school,” he said.

HR managers, responding to “employee-driven” green initiatives, are often the ones Zujewski hears from.

Maggie Okponobi, funding coordination manager at School Specialty Inc., is one of the Green Business Bureau’s clients. Her employer is an educational services and products firm based in Greenville, Wisc. Her job is to help schools secure federal and state grants.

Okponobi is in an MBA program that has an emphasis on sustainability. As a final project, she proposed bringing a green certification to her company. The assessments evaluate a firm’s sustainability activities against best environmental practices.

Okponobi explained what she wanted to do to one of the executives. She got support and began her research, starting with an investigation of certification programs. She decided on Green Business Bureau assessments, as did CyberArk.

Company managers at School Specialty had been taking ad hoc steps all along to improve sustainability. Efforts included installing LED lighting and reducing paper usage by printing on both sides and recycling, Okponobi said.

Okponobi collected data about the environmental practices for certification. The firm discovered it was eligible for gold level certification, one step below the highest level, platinum. 

The results were brought to an executive group, which included members from HR as well as marketing. Executives saw value in the ranking, and Okponobi believes it will help with recruiting efforts, especially with younger candidates. The company plans to create a green team to coordinate the sustainability efforts.

HR benefits from sustainability

Sustainability may help with retention, especially with younger workers, Okponobi said. “It gives them something exciting, positive to do in their workplace, and a goal to work toward,” she said.

Some employees are coming to workplaces with training on sustainability issues. One group that provides that kind of training is Manomet Inc., a 50-year-old science-based non-profit in Plymouth, Mass.

“We can’t make the progress that we need on climate change and other issues without the for-profit sector,” said Lora Babb, program manager of sustainable economies at Manomet.


The nonprofit takes about 20 undergrad college students each year, usually enrolled in majors that often have a sustainability component, and gives them “real world skills” to meet with businesses and conduct assessments. The training enables future employees to “make changes from the inside,” and understand practical, applied sustainability, Babb said.

This is not strictly an environmental assessment. The students also ask businesses about economic and social issues, including a workforce assessment that considers employee benefits, engagement and talent development, Babb said. 

A business with a strong environmental mission is “going to be far less effective at carrying out that mission if you are having constant workforce challenges,” Babb said.

And the results of such efforts can have an effect on culture. CyberArk’s employees have embraced composting, Register said. The company hired a firm that picks up food scraps about twice a week, processes them and makes compost — what master gardeners often refer to as black gold — available for employees to use in their home gardens. 

The results make employee composting efforts “very tangible for them,” Register said. 

Go to Original Article
Author:

Data silos and culture lead to data transformation challenges

It’s not as easy as it should be for many users to make full use of data for data analytics and business intelligence use cases, due to a number of data transformation challenges.

Data challenges arise not only in the form of data transformation problems, but also with broader strategic concerns about how data is collected and used.

Culture and data strategy within organizations are key causal factors of data transformation challenges, said Gartner analyst Mike Rollings.

“Making data available in various forms and to the right people at the right time has always been a challenge,” Rollings said. “The bigger barrier to making data available is culture.”

The path to overcoming data challenges is to create a culture of data and fully embrace the idea of being a data-driven enterprise, according to Rollings.

Rollings has been talking frequently about the challenges of data analytics, including taking part in a session at the Gartner IT Symposium/Xpo, held Oct. 20-24 in Orlando, where he also detailed some of the findings from the Gartner Chief Data Officer (CDO) survey.

Among the key points in the study is that most organizations have not included data and analytics as part of documented corporate strategies.


“The primary challenge is that data and data insights are not a central part of business strategy,” Rollings said.

Often, data and data analytics are actually just byproducts of other activities, rather than being the core focus of a formal data-driven architecture, he said. In Rollings’ view, data and analytics should be considered assets that can be measured, managed and monetized.

“When we talk about measuring and monetizing, we’re really saying, do you have an intentional process to even understand what you have,” he said. “And do you have an intentional process to start to evaluate the opportunities that may exist with data, or with analysis that could fundamentally change the business model, customer experience and the way decisions are made.”

Data transformation challenges

The struggle to make the data useful is a key challenge, said Hoshang Chenoy, senior manager of marketing analytics at San Francisco-based LiveRamp, an identity resolution software vendor.

Among other data transformation challenges is that many organizations still have siloed deployments, where data is collected and remains in isolated segments.

“In addition to having siloed data within an organization, I think the biggest challenge for enterprises to make their data ready for analytics is the attempt at pulling in data that has previously never been accessed, whether it’s because the data exists in too many different formats or for privacy and security reasons,” Chenoy said. “It can be a daunting task to start on a data management project, but with the right tech, team and tools in place, enterprises should get started sooner rather than later.”

How to address the challenges

With the data warehouse and data lake technologies, the early promise was making it easier to use data.

But despite technology advances, there’s still a long way to go to solving data transformation challenges, said Ed Thompson, CTO of Matillion, a London-based data integration vendor that recently commissioned a survey on data integration problems.

The survey of 200 IT professionals found that 90% of organizations see making data available for insights as a barrier. The study also found a rapid rate of data growth of up to 100% a month at some organizations.
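To put that growth rate in perspective, 100% per month compounds quickly. A minimal sketch (the 100% monthly rate is the survey's cited upper bound; the 12-month horizon is an assumption chosen for illustration):

```python
# Compound growth: data that grows 100% per month doubles every month.
monthly_growth = 1.0   # 100% per month, the upper bound cited in the survey
months = 12            # illustrative horizon, not from the survey

factor = (1 + monthly_growth) ** months
print(f"Data volume after {months} months: {factor:.0f}x the starting size")
```

At that pace, a terabyte in January becomes roughly four petabytes a year later, which is why even organizations at a fraction of this rate feel the strain.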

When an executive team starts to get good quality data, what typically comes back is a lot of questions that require more data. The continuous need to ask and answer questions is the cycle that is driving data demand.

“The more data that organizations have, the more insight that they can gain from it, the more they want, and the more they need,” Thompson said.

Go to Original Article
Author:

For Sale – Surface Go (8GB RAM/128GB SSD) inc. Black Type Keyboard

Purchased this recently on another forum but decided I have too many devices. I tested it fully and used it for a week or so, battery life and everything else working as expected.

Both the Surface Go and Type Keyboard are boxed and in good condition. It has been upgraded to the full version of Windows 10 Home.

Screen is spotless, and the unit casing is fine bar a couple of hairline scratches where the keyboard has latched on, and a few minor ones on the back. Pics from previous sale are here, note this does not include the pen.

Looking for £350 including delivery (RMSD which will cost £20), this combo still retails for £610 new.

Go to Original Article
Author: