
Beyond our four walls: How Microsoft is accelerating sustainability progress – Microsoft on the Issues

Our planet is changing — sea levels are rising, weather is becoming more extreme and our natural resources are being depleted faster than the earth’s ecosystems can restore them. These changes pose serious threats to the future of all life on our tiny blue dot, and they challenge us to find new solutions, work together and leverage the diversity of human potential to help right the course.

The good news is that progress is being made across the globe, and non-state actors, from cities to companies to individual citizens, are setting bold commitments and accelerating their work on climate change. But it’s also clear that we all must raise our ambitions, couple that with action and work more swiftly than ever.

At Microsoft, we fully understand and embrace this challenge. That is why, this week, at the Global Climate Action Summit, Microsoft is sharing our vision for a sustainable future — one where everyone everywhere is experiencing and deploying the power of technology to help address climate change and build a more resilient future. We are optimistic about what progress can be made because we are already seeing results of this technology-first enablement approach.

Today, we are unveiling five new tools, partnerships and pilot-project results that are already reducing emissions in manufacturing, advancing environmental research, and showing immense potential to disrupt the building and energy sectors for a lower-emission future.

These include:

  • A new, open-source tool to find, use and incentivize lower-carbon building materials: To create low-carbon buildings, we need to choose low-carbon building materials. Right now, choosing these materials is challenging because the data is not readily available, and the data we do have lacks the transparency needed to verify its accuracy. We are the first large corporate user of the Embodied Carbon Calculator for Construction (EC3), a new tool to track the carbon emissions of raw building materials, introduced by Skanska and supported by the University of Washington Carbon Leadership Forum, Interface and C-Change Labs. We’ll use it in our new campus remodel. Our early estimates are that a low-carbon building in Seattle has approximately half the carbon emissions of an average building, so this could substantially reduce carbon emissions in our remodel and, eventually, across the entire built environment. We’re proud not only to be piloting this open-source tool, but also that it runs on Microsoft Azure.
  • The results of a “factory of the future” and solar-panel deployment at one of our largest suppliers in China: We partnered with our supplier’s management team to develop and install an energy-smart building solution, running on Microsoft Azure, that monitors and addresses issues as they emerge, saving energy and money. Additionally, Microsoft funded a solar panel installation, which generated more than 250,000 kilowatt-hours of electricity in the past fiscal year. This integrated solution is estimated to reduce emissions by approximately 3 million pounds a year.
  • The successful pilot of a grid-interactive energy storage battery: Solving storage is a critical piece of transforming the energy sector. That is why we’re excited to share the results of a new pilot in Virginia, in partnership with Eaton and PJM Interconnection. We took a battery that typically sits in one of our datacenters as a backup system and connected it to the grid, where it receives signals about when to take in power, when to store it and when to discharge to support the reliability of the system and the integration of renewable energy (a simplified dispatch loop is sketched after this list). With thousands of batteries in the backup power systems at our datacenters, this pilot has the potential to rapidly scale storage solutions, allowing datacenters to smooth out the unpredictability of wind and solar.
  • New grantees and results from our AI for Earth program: Since we first introduced this grant, training and innovation program last year, we’ve experienced 200 percent growth. We are now supporting 137 grantees in more than 40 countries around the world, and we have doubled the number of larger featured projects we support. We’ve seen early results, too, with work that lets many people outside the grant program benefit, such as the ability to process more than 10 trillion pixels in 10 minutes for less than $50.
  • New LinkedIn online training module for sustainability, the Sustainable Learning Path: LinkedIn is providing new training courses to enable people everywhere to learn and gain job skills to participate in the clean energy economy and low-carbon future. The Sustainable Learning Path offers six hours of expert-created content; initial courses include an overview of sustainability strategies and introductions to LEED credentials and sustainable design. All six courses are unlocked until the end of October, in celebration of the Global Climate Action Summit, and can be accessed here.
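The grid-interactive battery pilot in the list above comes down to a simple control loop: watch the grid operator’s dispatch signal, then charge or discharge without dipping into the reserve the datacenter needs for backup. The Python sketch below simulates that loop under invented numbers (capacity, reserve floor, signal range); it illustrates the concept only and is not the actual Eaton/PJM control logic, which has not been published.

```python
import random

BATTERY_KWH = 500          # hypothetical backup-battery capacity
RESERVE_KWH = 100          # floor kept aside for datacenter backup (assumed)
charge = BATTERY_KWH / 2   # start half full

def grid_signal():
    """Stand-in for a grid operator's dispatch signal:
    positive = absorb surplus power, negative = discharge to support the grid."""
    return random.uniform(-50, 50)  # requested kW for this interval

def step(minutes=5):
    """Apply one dispatch interval and return the new state of charge."""
    global charge
    energy_kwh = grid_signal() * minutes / 60
    # Charge on a positive signal, discharge on a negative one, but never
    # dip below the reserve that keeps the battery useful as backup power.
    charge = min(BATTERY_KWH, max(RESERVE_KWH, charge + energy_kwh))
    return charge

for _ in range(12):  # simulate one hour of 5-minute intervals
    print(f"state of charge: {step():.1f} kWh")
```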

While these are just the first proof points of the potential of technology to accelerate the pace of change beyond our four walls, they build on decades of sustainability progress within our operations.  These include operating 100 percent carbon neutral since 2012, purchasing more than 1 gigawatt of renewable energy on three continents, committing to reduce our operational carbon footprint by 75 percent by 2030, and a host of other initiatives. As meaningful as this operational progress is, we know it’s not enough. As a global technology company, we have a responsibility and a tremendous opportunity to help change the course of our planet.

As we look to the future, we’ll realize this opportunity in a few ways. We will use our operations as a test bed for innovation and share new insights about what works. We will work with our customers and suppliers to drive efficiencies that lead to tangible carbon reductions. We will continue to increase access to cloud and AI tools, especially among climate researchers and conservation groups, and work together to develop new tools that can be deployed by others in the field.

We are not naïve. Technology is not a panacea. Time and resources are short, and the task immense. But we refuse to believe that it is insurmountable or too late to build a better future, and we are convinced that technology can play a pivotal role in enabling that progress.

That optimism is born of our experience, lessons learned and the drive to create a better future that is core to Microsoft. At GCAS, I will be joined by 10 Microsoft and LinkedIn sustainability leaders, who will share more details about this approach and the news outlined above at panel sessions throughout the week. We will also showcase some of our technology solutions at events we are hosting, and more than 50 employees will volunteer their time at GCAS. We are also proud to be an official sponsor of GCAS.

You can find our Microsoft delegation at the following events during the summit, as well as many others throughout the week. We encourage you to follow us @Microsoft_Green for a full view of our conference activities and engagements, and to follow the official event hashtags #GCAS2018 and #StepUp2018 for news.

Find Microsoft at the Global Climate Action Summit — event highlights

September 11, 8:00 a.m. PT: Sustainable Food Services Panel (LinkedIn hosting)

September 12, 9:00 a.m. PT: We Are Still In Forum

September 12, 2:00 p.m. PT: “Energy, Transportation & Innovation – a Conversation with U.S. Climate Alliance Governors & Business Leaders” (Microsoft hosting)

  • Speaker: Shelley McKinley, General Manager for Technology and Civic Responsibility at Microsoft
  • Watch the livestream: https://aka.ms/CEO_Governors_Live and use #USCAxGCAS to submit questions on Twitter during the event

September 13, 9:00 a.m. PT: World Economic Forum: 4th IR for Earth

  • Speaker: Lucas Joppa, Chief Environmental Officer, Microsoft

September 13, 1:30 p.m. PT: GCAS Breakout Session – “What We Eat and How It’s Grown: Food Systems and Climate”

September 13, 3:00 p.m. PT: Meeting the Paris Goal: Strategies for Carbon Neutrality (Microsoft hosting)

  • Speaker: Elizabeth Willmott, Carbon Program Lead

September 13, 6 p.m. to 8 p.m. PT: We Are Still In Reception at Microsoft

September 14, 8:30 a.m. PT: Clean Energy in Emerging Markets (Microsoft hosting)

September 14, 11:00 a.m. PT: Climate Action Career Fair (LinkedIn hosting)

  • Speaker: Lucas Joppa, Chief Environmental Officer


Use the new Activity Plans to organize your Skype in the Classroom experiences

Skype in the Classroom has always been one of my favorite resources for engaging students in real-life, relevant learning experiences. I recently learned that using Skype in the Classroom has become even easier for teachers with the addition of FREE activity plans! These are activity plans written by educators for educators.

5 Reasons to get excited about the launch of these helpful new resources:

  • Save time

Teachers always have a long to-do list. Want to engage your students with a Skype in the Classroom experience but short on time? These activity plans will help make it happen! The plans are free and easy to download in just a few seconds. Each one includes objectives (GREAT to share with parents and administrators), activities to activate background knowledge, research, brainstorming, preparing students for the call, assessment, reflection and more.

  • Easy to adapt

Each activity plan can be followed step-by-step or easily adjusted to best fit your students’ needs. Once you download, each plan is fully editable, making any adjustments quick, seamless and easy. This is also a great way to save any changes after the Skype experience, so you will be able to remember how you adapted the lesson for the next time you use it with students.

  • Aligned to ISTE Standards

Not only are objectives listed for each activity, they are also aligned to ISTE standards. This is a nice way to keep the big picture in mind. It’s a valuable resource to share with students, parents, administrators and colleagues.

  • Help other teachers

Do you know other teachers who are curious about using Skype in the Classroom with their students but are unsure how or where to start? This handy resource will give them a step-by-step guide, including question prompts, research and assessment ideas, and more.

  • Deep(er) learning

Each activity plan includes ideas to launch the lesson, research, prepare students, reflect and assess. These ideas help teachers plan intentionally for each aspect of the learning experience and provide a framework of ideas for before, during and after the Skype experience.

Whether you are looking to launch a new school year with an exciting Skype call, planning for Skype experiences throughout the year, dreaming of Skype-a-Thon connections, or whatever your Skype in the Classroom goals may be, be sure to check out and download the free activity plans to support and enhance the experience for you and your students.

Hybrid cloud strategies should be reimagined as multi-cloud

Everyone is talking about hybrid cloud strategies, implying this framework is how all IT resources will be deployed in the future. It sounds great — on paper. Proponents tout hybrid cloud’s ability to tap into both public and private resources to build scalable applications that can be managed and secured as if they were being delivered privately — a way to have your cake and eat it, too.

Unfortunately, for the vast majority of workloads, this scenario will probably never play out. There may be a few isolated instances or use cases where this is workable, but, seriously, most workloads don’t need this level of flexibility, especially when stacked up against the considerable management and security challenges of a true hybrid cloud.

Analysts and pontificators — myself included — have talked about hybrid cloud strategies, but in reality, we are getting it wrong. Those who viewed hybrid cloud as a single entity — public and private — simply glossed over the business realities of creating, deploying and, most importantly, securing applications in a blended environment. Those who viewed hybrid cloud strategies as a mix of public clouds and private clouds — each with its own domain, where applications reside in one or the other — incorrectly labeled that architecture hybrid cloud. In reality, it’s multi-cloud.

Overlooking the challenges of hybrid cloud strategies

The first group, which believes a single cloud can span both public and private domains, tends to gloss over the different aspects of operating and managing that resource. How are administrators going to ensure security across multiple domains? What about the management tools that will vary between the two platforms?

Simply put, how many of us who switch between Mac and PC get frustrated by “delete” and “move to trash,” which are different ways to say the same thing? Now, imagine that at cloud scale. While many people like the idea of the elasticity of a hybrid cloud to scale out almost infinitely, how many applications truly need that much capacity on demand?

There are a few use cases where hybrid cloud strategies might make sense, primarily bursting applications out during peak load. The overly tired examples typically involve a retailer or tax firm with a highly seasonal business or a highly targeted business cycle. But is having to manage and drive consistencies all year across the two environments worth it for that short period?

So far, Microsoft is about the only true play here, with its public Azure cloud and private Azure Stack. This strategy gives IT the ability to create an application in a private environment and move to public, or vice versa, relying on the same security, development, libraries and management tools. This is about as close as many will actually come to a hybrid cloud. But, even here, most applications, while they can move from domain to domain, will probably live in one domain or the other 99% of the time.

[Figure: Four cloud options]

Time to change how cloud deployments are described

The second group, of which I was a staunch member, simply got the naming wrong. Most businesses will use a mixture of public and private cloud resources, depending on their applications. Those applications with regulatory constraints or data privacy considerations will probably remain in the private domain. This is also where many will see the more differentiated applications for their business — the tools that help define their business and how they deliver products and services to their customers.


Horizontal and nondifferentiated applications will probably move to a public cloud based on economics or a variety of other factors. In the future, most businesses will use multiple cloud resources. This is why hybrid cloud, as a description of this environment, is so wrong. Multi-cloud is a better description. The industry needs to get on board with this.

A great example comes from the auto world. If you own a Toyota Prius, it runs on both electricity and gasoline — a hybrid car strategy. But if you own an all-electric Tesla and a gas-chugging Ford F-150, you are using both gas and electric, yet each vehicle is discrete — a multi-car strategy.

A complicating factor for a public-private combined entity is the cost and complexity of data transmission. It is expensive and time-consuming to move data back and forth within a hybrid environment. Most businesses are choosing to locate compute closer to the data and send back only the results and exceptions, rather than shuttle data between locations. This dynamic is driving the growth of edge computing — moving compute closer to the network edge, closer to where the data is being collected.
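A rough calculation shows why. Every number below is an assumption made for illustration (a day’s worth of edge data, a $0.09/GB egress price, a 100 Mbps uplink), but the asymmetry they expose is what pushes compute toward the edge.

```python
# Back-of-the-envelope: ship raw data to the cloud, or send results only?
# All numbers are assumptions for illustration.
RAW_DATA_GB = 1_000          # a day of sensor data collected at the edge
RESULTS_GB = 0.001           # summarized results and exceptions
EGRESS_COST_PER_GB = 0.09    # assumed data-transfer price, $/GB
LINK_MBPS = 100              # assumed WAN uplink

def transfer(gb):
    """Return (cost in $, time in hours) to move `gb` gigabytes."""
    cost = gb * EGRESS_COST_PER_GB
    hours = gb * 8_000 / LINK_MBPS / 3600  # GB -> megabits -> hours
    return cost, hours

for label, gb in [("raw data", RAW_DATA_GB), ("results only", RESULTS_GB)]:
    cost, hours = transfer(gb)
    print(f"{label:12s}: ${cost:8.2f}, {hours:8.2f} h on the wire")
```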

It is time for the industry to change the way it defines hybrid cloud. There will be hybrid clouds, but they will be far fewer and far more limited. The phrase multi-cloud needs to take more prominence in IT’s lexicon, because the need to accurately define the environment will pay dividends in accelerating those plans by removing confusion. The more we rally around nebulous buzzwords and marketing terms, the more we doom our projects to constant challenges and questions. Clarity will help drive better outcomes.

Reduxio Systems’ storage wows human resources specialist

Reduxio Systems’ storage has gone from curiosity to mainstay at human resources software firm CPP Inc.

The maker of personality-assessment software initially installed Reduxio HX550 hybrid arrays to support standard systems for development, quality assurance and testing. Impressed by the performance, CPP has promoted the Reduxio SAN to handle mission-critical applications and a select number of primary workloads.

The plan is to eventually move most tier-one storage from existing SAN environments to Reduxio to take advantage of its capacity, native data protection and performance scaling, said Mike Johnson, director of global infrastructure and desktop support at CPP, based in Sunnyvale, Calif.

“I’ve always figured there isn’t one storage device that gives you all three of those things, but it’s looking like Reduxio Systems has the potential,” Johnson said.

CPP has two Reduxio HX550 hybrid arrays at its main data center in Sunnyvale and two others at a newly opened facility in the U.K.

Reduxio hybrid flash augments all-flash IBM V9000 primary SAN

The Reduxio HX550 Enterprise Flash Storage hybrid flagship is a dual-controller system housed in a 2U Seagate server chassis. The system accommodates 24 disk drives or SAS-connected SSDs, with enterprise multi-level cell NAND flash SSDs for 40 TB of raw block storage. Effective capacity scales to 150 TB of usable storage with Reduxio NoDup global inline data deduplication.

Reduxio Systems deduplicates data in 8 KB blocks in a pre-memory buffer. A unique timestamp is applied to each block in the database. A separate metadata database includes log data on which blocks received writes and when.
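In outline, inline dedup of this kind fingerprints each fixed-size block, stores unique blocks once, and keeps a timestamped write log so the volume can be reconstructed as of any moment. The Python sketch below is a hypothetical illustration of that idea; Reduxio’s TimeOS internals are not public, and every name here is invented.

```python
import hashlib
import time

BLOCK_SIZE = 8 * 1024  # 8 KB blocks, matching the article

class DedupStore:
    """Hypothetical inline dedup with per-block timestamps (illustrative
    only; not Reduxio's actual implementation)."""

    def __init__(self):
        self.blocks = {}     # fingerprint -> block data, stored once
        self.volume = {}     # logical offset -> current fingerprint
        self.write_log = []  # (timestamp, offset, fingerprint) metadata

    def write(self, offset, data, ts=None):
        """Deduplicate one block and log which block was written, and when."""
        assert len(data) <= BLOCK_SIZE
        fp = hashlib.sha256(data).hexdigest()  # content fingerprint
        if fp not in self.blocks:              # store unique blocks only
            self.blocks[fp] = data
        self.volume[offset] = fp
        self.write_log.append((ts if ts is not None else time.time(), offset, fp))

    def read_as_of(self, offset, ts):
        """Recover the block at `offset` as it existed at time `ts`,
        the idea behind BackDating-style recovery."""
        fp = None
        for t, off, f in self.write_log:
            if off == offset and t <= ts:
                fp = f  # latest write at or before ts wins
        return self.blocks.get(fp)

store = DedupStore()
store.write(0, b"A" * BLOCK_SIZE, ts=1.0)
store.write(0, b"B" * BLOCK_SIZE, ts=2.0)  # overwrite; old version stays recoverable
assert store.read_as_of(0, 1.5) == b"A" * BLOCK_SIZE
```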

Until 2002, CPP was known as Consulting Psychologists Press Inc. The firm sells human resources software to corporations and career-minded individuals, and it’s best known for its flagship Myers-Briggs Type Indicator assessment.

Over the years, CPP has used storage appliances from Dell EMC, NetApp, Hitachi Vantara and other vendors. CPP still uses an all-flash IBM V9000 SAN to support a Microsoft Dynamics AX enterprise resource planning system and related production systems, as well as a scale-out Coho Data DataStream SAN to increase capacity or performance on the fly.

Although the IBM V9000 is “one of the highest-performing SANs I’ve ever seen,” Johnson said it has limited capacity for all of CPP’s primary storage. The Coho Data storage is “plug-and-play,” but requires the upfront expense of customized Arista network switches.

Compounding the challenge, Coho Data went out of business in September.

Johnson credited a reseller with introducing him to Reduxio Systems. CPP had already purchased the IBM and Coho Data gear by that time, but Johnson was intrigued enough by Reduxio to give it a test run.

“I was willing to put it in as our tier-three storage device, but I didn’t know how it would perform,” he said. “Once we saw the performance was pretty good, we promoted it to our mission-critical workloads.”

Reduxio BackDating aids faster disaster recovery

Johnson’s IT team did further testing and research designed to answer a key question: Could Reduxio storage reliably support CPP’s moneymaking activities? Johnson said he was pleased at Reduxio’s ability to deliver primary storage performance without relying exclusively on flash.

Johnson said he also likes the native data protection in Reduxio’s TimeOS operating system, especially the BackDating feature that allows recovery to any point-in-time snapshot. Reduxio Systems recently added NoMigrate replication and NoRestore copy data management.

“We decided our revenue-generating systems could reside on the Reduxio storage device,” Johnson said. “Our plan going forward is to put all our revenue-generating systems on Reduxio and reduce our recovery point objectives and recovery time objectives from hours to days to seconds to minutes.”

Misconfigured Amazon S3 buckets expose sensitive data

The cloud has simplified accessing compute and storage resources, making life a lot easier for application developers, IT administrators and company employees. However, when end users fail to properly secure the cloud, it can put data at greater risk.

In the past year, cybersecurity firms have reported on a rash of misconfigured Amazon S3 buckets that have left terabytes of corporate and top-secret military data exposed on the internet. This misconfiguration allows anyone with an Amazon account to access the data simply by guessing the name of the Simple Storage Service (S3) bucket.

Storage and cybersecurity experts point to IT administrators and end users as the culprits. Users have the option of protecting each storage bucket with an access control list (ACL) to keep data private, share it for reading or share it for reading and writing. Experts claim data was left exposed because the ACLs were configured to allow any user with an Amazon Web Services (AWS) account to access the data, and the Amazon S3 buckets were never reconfigured to restrict access.
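Auditing for this misconfiguration is easy to script. The minimal boto3 sketch below flags buckets whose ACLs grant access to the AllUsers or AuthenticatedUsers groups, the two grants behind these exposures, and resets them to private. A blanket put_bucket_acl like this would also wipe any intentional custom grants, so treat it as an illustration rather than a production remediation.

```python
import boto3

# Grantee URIs that make a bucket readable beyond its owner.
PUBLIC_URIS = (
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",  # any AWS account
)

s3 = boto3.client("s3")

def audit_and_lock_down():
    """Flag buckets whose ACLs grant broad access, then reset them to private."""
    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        acl = s3.get_bucket_acl(Bucket=name)
        exposed = any(
            grant["Grantee"].get("URI") in PUBLIC_URIS
            for grant in acl["Grants"]
            if grant["Grantee"]["Type"] == "Group"
        )
        if exposed:
            print(f"{name}: ACL grants broad access; resetting to private")
            s3.put_bucket_acl(Bucket=name, ACL="private")

if __name__ == "__main__":
    audit_and_lock_down()
```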

“Maybe that is too much power for the end users,” said Chris Vickery, director of cyber-risk research at cybersecurity firm UpGuard, based in Mountain View, Calif. “You really can’t put the blame on Amazon. The buckets are secured by default. It’s madness by the end user.”

In November, UpGuard reported two incidents of sensitive data left exposed in Amazon S3 buckets belonging to the United States Army Intelligence and Security Command (INSCOM), as well as the U.S. Central Command (CENTCOM) and Pacific Command.

Nearly 100 GB of critical data belonging to INSCOM was found in unsecured cloud storage repositories, including information labeled “top secret” and “NOFORN,” which means no foreign nationals should be able to view the data. The largest unsecured file found was an Oracle Virtual Appliance that contained a virtual hard drive and Linux-based operating system likely used for receiving Defense Department data from a remote location. UpGuard found top-secret data was tied to the defunct defense contractor Invertix.

“Also exposed within [the S3 storage] are private keys used for accessing distributed intelligence systems belonging to Invertix,” according to an UpGuard report. “Plainly put, the digital tools needed to potentially access the networks relied upon by multiple Pentagon intelligence agencies to disseminate information should not be something available to anybody entering a URL into a web browser.

“Although the UpGuard cyber-risk team has found and helped to secure multiple data exposures involving sensitive defense intelligence data, this is the first time classified information has been among the exposed data,” the report stated.

The CENTCOM data exposure involved a Pentagon contractor that did intelligence work and left an archive of 1.8 million publicly accessible social media posts exposed in Amazon S3 buckets. The military characterized that data breach as “benign,” because the data had been scraped from public sources around the world to identify persons of interest.

These incidents are part of a series in which high-profile companies left data in Amazon S3 buckets exposed because the ACLs were configured to allow any user with an Amazon account to gain access to the data. The companies caught up in the problem include telco giant Verizon, U.S. government contractor Booz Allen Hamilton, consulting firm Accenture, World Wrestling Entertainment and Dow Jones.

Storage and cybersecurity experts agree this is not Amazon’s fault. The AWS S3 buckets are designed with top-level security by default when the storage instances are created. The user has control over what level of access to assign each bucket.

“Have we given too much power to the end user? Yeah, but we also gave them keyboards,” said George Crump, founder of Storage Switzerland. “People have to learn. I guess it’s like the seat belt law. Enough people have to go through a windshield before they do something about it. Organizations have to monitor cloud assets the same way they monitor their data center assets.


“There are more than a few tools out there that monitor open buckets,” Crump added. “I hate to have Amazon blameless in this, but they are. It would be like blaming the car manufacturer because people are not using their seat belts.”

Earlier this month, Amazon added new S3 encryption and security features to help address the data breaches. These features include default encryption, which mandates that all objects in a bucket be stored in encrypted form.

Amazon also added permission checks that display a prominent indicator next to each Amazon S3 bucket that is publicly accessible. Cross-region replication now supports objects encrypted with AWS Key Management Service keys, and a detailed inventory report includes the encryption status of each object.
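Turning on the new bucket-level default takes a single API call. Here is a minimal boto3 sketch, with a placeholder bucket name, that enables SSE-S3 default encryption and reads the setting back; specifying aws:kms plus a key ID instead of AES256 would select KMS-managed keys.

```python
import boto3

s3 = boto3.client("s3")

# Turn on default encryption: every new object written to the bucket is
# stored encrypted (SSE-S3 here). "my-example-bucket" is a placeholder.
s3.put_bucket_encryption(
    Bucket="my-example-bucket",
    ServerSideEncryptionConfiguration={
        "Rules": [
            {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}
        ]
    },
)

# Confirm the bucket-level default now applies.
config = s3.get_bucket_encryption(Bucket="my-example-bucket")
print(config["ServerSideEncryptionConfiguration"]["Rules"])
```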

David Monahan, managing director for security and risk management at Enterprise Management Associates in Boulder, Colo., said consumers who are using cloud services need to ask more questions about where their data is being stored and get more details on how it is being protected.

“This is a data-owner issue,” he said. “Some owners are relying on the names of the bucket being private. That is insufficient. Others are creating permissions and then not following the rule of least privilege and making the data too open. To them, I say, ‘Stop being lazy.’ Others may not understand how the system access controls work. They have to learn before putting real data out there.”

Software Reviews | Computer Software Review

MSRP: $7.00




Bottom Line: Human resources (HR) software and management system BambooHR is not the cheapest option but you get what you pay for, namely, well-organized, visually appealing tools that are simple to set up and run. An open API allows the software to be integrated with a company’s existing HR tech vendors, and the performance review function fits with the way more companies are working.


On-premises pricing pushes Atlanta to Oracle ERP Cloud

The city of Atlanta is moving to Oracle’s full cloud for human resources, procurement and finance. To help justify the deal, the city released eyebrow-raising cost estimates showing it has little choice but to move to Oracle’s ERP/HCM Cloud platform.

Atlanta is a longtime Oracle user. Its last big ERP upgrade was around 2007. This time, it was planning on a hybrid cloud adoption, keeping some systems on premises and others in Oracle ERP Cloud. The city didn’t believe all of the Oracle ERP Cloud offerings were on par with the on-premises systems, hence the hybrid approach. This view changed as the planning progressed.

For Oracle, getting customers to migrate to its cloud platform is a top priority. But the financial incentives behind these deals are rarely disclosed, at least until Atlanta offered a glimpse at some of the cost estimates.

The Atlanta City Council finance committee was shown a series of slides that sketched out the financial case for a full cloud approach. Officials were told that the 10-year total cost of ownership difference between Oracle’s E-Business Suite (EBS)/HCM Cloud and Oracle’s full ERP/HCM Cloud was $26 million. That’s how much more the city would spend over a 10-year period if it went with a hybrid approach.

Oracle’s licensing terms for the two platforms were starkly different. Under the hybrid approach, annual licensing would see a “4% increase per year for EBS/HCM Cloud hybrid (until year 10) vs. 0% increase until year 5” for the ERP Cloud. The full ERP Cloud saw a one-time 3% increase in year six over the 10-year agreement.
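To see how much those escalation schedules matter, here is a quick back-of-the-envelope comparison in Python. The $10 million base annual cost is purely hypothetical (the slides did not break out a base figure), so the sketch shows only the shape of the gap, not its actual size.

```python
# Hypothetical comparison of the two escalation schedules described above.
# BASE is an assumed annual licensing cost; Atlanta's real figure was not
# disclosed, so only the relative gap is illustrative.
BASE = 10_000_000

def hybrid_total(years=10, rate=0.04):
    """EBS/HCM Cloud hybrid: 4% licensing increase every year through year 10."""
    total, cost = 0.0, float(BASE)
    for _ in range(years):
        total += cost
        cost *= 1 + rate
    return total

def full_cloud_total(years=10, bump=0.03):
    """Full ERP/HCM Cloud: flat through year 5, one-time 3% bump in year 6."""
    total, cost = 0.0, float(BASE)
    for year in range(1, years + 1):
        if year == 6:
            cost *= 1 + bump
        total += cost
    return total

gap = hybrid_total() - full_cloud_total()
print(f"hybrid 10-year licensing:     ${hybrid_total():,.0f}")
print(f"full-cloud 10-year licensing: ${full_cloud_total():,.0f}")
print(f"gap from escalation alone:    ${gap:,.0f}")
```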

Analysts and consultants who have seen the slides say there’s not enough information to tell exactly how these estimates were calculated. However, the differences in licensing costs among hybrid, on-premises and full cloud ERP deliver a clear message.

“I assumed that this was Oracle’s way of financially motivating the decision they wanted,” said Marc Tanowitz, a managing director at Pace Harmon, a management consultancy that advises firms making similar decisions.

The Atlanta mayor’s office declined to make an official available for an interview or to answer written questions. A spokeswoman said the city would not comment. In addition, Oracle said it couldn’t discuss a specific customer agreement.

At the start of this project, Oracle’s HCM Cloud was described by Atlanta officials as mature and ready for full cloud deployment. The city initially concluded that there were functionality gaps in the finance system, and it intended to keep Oracle’s R12 financials on premises. That changed.

“Over the last six to eight months, Oracle has released new functionality to where we feel like those gaps will be addressed,” said John Gaffney, the city’s deputy chief financial officer, in a video of the Oct. 25 city council finance committee meeting (the discussion begins at about 2:17 in the video). That meant recommending a full cloud option.

Atlanta City Council members at the meeting didn’t probe the licensing difference. Gaffney, in presenting the savings, told them that “you’ve got lower costs that are primarily driven by your subscription cost being lower. You also don’t have to pay any hosting fees.”

Tanowitz said some things about Atlanta’s Oracle ERP Cloud project were clear, such as the apparent 10-year agreement. Vendors have generally been seeking longer terms.

“That piece of it didn’t surprise us,” he said.

The first-year implementation costs for hybrid and full Oracle ERP Cloud were roughly equal, at about $19 million. That figure also wasn’t surprising to Tanowitz, because there is a cost to migration. But Tanowitz said he struggles to understand why the on-premises deployment escalates in cost faster than the full Oracle ERP Cloud deployment.

“If you think about the cloud cost, what are you paying for in a cloud subscription? You’re paying for some intellectual property and you’re paying for some hosting,” said Tanowitz. “That’s what’s under that number, if you peel it apart.”

“Why would an environment that I’m hosting on my own — presumably with the EBS deal — be going up at this rate?” he asked.

There has been a long-standing debate in IT about whether on premises is less costly than full cloud approaches. Frank Scavo, president of Computer Economics, a research firm, said the decision on these approaches can go either way.

“If the data center is underutilized, adding another application may not add much cost,” he said. “But if I need to build a new data center or add significant capacity, it will be much more costly. There is no right answer.”

By aligning sales and marketing, Mizuho OSI could sell faster

SAN FRANCISCO — To take on larger competitors with more resources, medical equipment manufacturer Mizuho OSI had to create a faster track from lead generation to sales.

To work smarter and faster to identify leads and close sales, the Union City, Calif., company broke down its internal silos, deciding that aligning sales and marketing departments would be its best bet.

“We had a gap in collaboration,” said Greg Neukirch, vice president of sales and marketing at Mizuho OSI, during a session at Dreamforce 2017 this week. “We needed to be smarter and faster and improve our customer experience beyond what we did in the past.”

Neukirch added that the company did extensive research to see which software tools could aid in aligning sales and marketing. It ultimately chose Salesforce for CRM and Salesforce Pardot for marketing automation.

“We had a sales team wanting more and a marketing team trying to give more, and we looked at how we could leverage Salesforce and Pardot to close the gap between those two functions,” Neukirch said.

Mizuho OSI adopted Salesforce in February 2016 and Pardot a year later, working to ensure close collaboration between the sales and marketing departments.

Bringing sales, marketing together


Internal silos are a common problem for businesses, because sales and marketing departments have historically had different objectives. But as consumers have become more educated throughout the buying process, aligning sales and marketing is a strategy that can bring a company more customers. It’s not an easy process, however.

“There was skepticism in our sales department,” Neukirch said. “They didn’t know the products or understand why they needed to do something different. But it was up to us to help communicate that value.”

New Salesforce Sales Cloud features are designed to make it easier for customers to better align sales and marketing. With the Lightning Data feature, for example, companies can discover and import new potential customers, according to Brooke Lane, director of product management for Sales Cloud.

“In today’s setting, we want to quickly close deals and also better understand customers,” Lane said. “With [the new feature] Campaign Management, it can help you show the impact of marketing activities on the sales pipeline. We want to continue bridging Salesforce and Pardot so you’re not troubled with tasks.”

Addressing implementation challenges

Mizuho OSI’s transition to a more efficient, modern customer journey — one that shortened the time for a prospect to become a customer — hasn’t come without challenges.

“Sales can’t do things on its own,” said Chris Lisle, director of North American sales at Mizuho OSI. “But the biggest hurdle was getting sales to adopt a new tool.”

Mizuho OSI ran into some hurdles during the implementation — mainly the time it takes to successfully change how the organization is run.

“We took time to identify the problems we wanted to solve — mainly that our customer journey was outdated,” said Kevin McCallum, director of marketing at Mizuho OSI. “We needed an aggressive timeline for our deployment, but however long you think it’ll take, it takes longer than that.”

But by aligning sales and marketing departments at the start of the project, Mizuho OSI was able to start modernizing its customer journey.

“Sales had full visibility with what we were doing and what we were working on and helped through the journey,” McCallum said.

Neukirch agreed, calling the alignment essential.

“To get that collaboration and see the departments come together, we were able to move faster,” Neukirch said.

And while the company is still aligning sales and marketing, it has seen anecdotal benefits of the change.

“What we did in the last nine months exceeded our expectations,” Neukirch said. “We were following that vision and executing on the deliverables and making sure we kept focus with how the customer could interact with us better and faster, so we’d have the opportunity to outpace the folks we’re in market against.”