Tag Archives: News

News roundup: TriNet software targets professional services

The theme of this news roundup is specialization: HR software for professional services, an effort to pull the entire benefits industry onto one platform, and payroll offerings aimed at the gig economy.

The debut of TriNet software for professional services HR is an effort to provide a platform for small and medium-sized businesses that caters to specific HR needs. TriNet Professional Services “is a bundle more relevant to what a small consulting company owner or an ad agency owner or any type of business depending on people to deliver a service would need,” explained Jimmy Franzone, senior vice president of strategy at TriNet, based in San Leandro, Calif. The new product joins other vertical TriNet software aimed at technology, nonprofits, life sciences and financial services.

In thinking about the issues around professional services HR, Franzone said the company bundled in many applications that are ancillary in its other products but important to this demographic. Expense management, performance management, applicant tracking and a variety of payroll-related tasks are at the core of the professional services package, Franzone said, because they are areas busy consultants, certified public accountants or lawyers would want to access easily.

TriNet’s heavy investment in its mobile application should also work nicely for those looking for professional services HR, Franzone said. “Mobile is a huge driver from the client side, especially in professional services,” he said. “We’re finding, in some ways, the professional services HR [market] is more mobile-enabled than tech or financial services firm employees who are always at a desk. Consulting firms and ad agencies are working outside of their desks. The need to be able to access data is critical to what they do.”

The new TriNet software is supported by a client services team that specializes in professional services HR, Franzone said. “They understand how those businesses work, what questions to ask, and what the trials and tribulations are.”

BenefitsPlace: All employee options on one platform

BenefitsPlace, a new platform from Benefitfocus, has a lofty goal: “We want to unify the entire benefits industry on one platform,” said Tom Dugan, vice president of product management at Benefitfocus, based in Charleston, S.C.

“We want the platform to show carriers’ insurance, life products and critical illness plans, as well as the emergent benefits that are focused on noninsurance products, like ID theft protection and concierge healthcare,” he continued. “We want to onboard all types of sellers’ products to make it easy for brokers to evaluate those sellers’ products and for employers to evaluate and make choices.”

BenefitsPlace won’t just offer the choices, Dugan stressed, but it will also present information around the offerings, so employers and consumers can make informed decisions about their benefits.

The average Benefitfocus customer offers 15 different benefits, and 20% of its clients offer 20 or more, Dugan said. So, the choices can be overwhelming to both employers and employees.

“We want to remove friction from the process,” he said. “We want to help people really understand what’s available, and it’s only getting more difficult when new products come in. We want to help consumers navigate those choices.”

An easier small-business payday process?

In the gig economy, small businesses can struggle with the prospect of payday happening potentially daily. Intuit’s QuickBooks just announced new payroll software options to help small businesses more easily deal with short-term employees who expect to be paid the day they work.

Contractor Direct Deposit brings “drop in the bank account” payment options to small businesses and syncs up with QuickBooks, so everything is streamlined at tax time. Same Day Direct Deposit is a new option in QuickBooks Full Service Payroll, and it’s an alternate way to pay contractors or freelancers more quickly and stay on top of expenses.

All of the Xbox E3 2018 Briefing Videos – Xbox Wire

Yesterday’s Xbox E3 2018 Briefing featured a ton of big news, from our announcements of new studios joining the Microsoft Studios family to the reveal of the next chapter in the Halo saga. There was something for everyone in the briefing too, including first looks at hardcore shooters like Metro: Exodus and Battlefield V, family-friendly titles like Ori and the Will of the Wisps and Kingdom Hearts 3, and indie gems like Session and Tunic. Did you miss out on the action? If so, we’ve got you covered with trailers and demos galore. Take a look below or watch the entire briefing above!

Halo Infinite
Ori and the Will of the Wisps
Crackdown 3
Sea of Thieves – Cursed Sails and Forsaken Shores
Forza Horizon 4
Cyberpunk 2077 Trailer
PlayerUnknown’s Battlegrounds
Gears 5 – Cinematic Announce Trailer
Gears 5 – Announce Trailer
Fallout 76
Devil May Cry 5
Gears POP!
Xbox Game Pass Catalog Preview
ID@Xbox Games Montage
Battletoads Announce Trailer
Hyper Universe
Xbox One X Enhanced Games
Xbox One E3 2018 Montage
Cuphead DLC Announce
Captain Spirit Announce
Jump Force Announce
Just Cause 4 Announce
Shadow of the Tomb Raider
Tales of Vesperia: Remastered
We Happy Few Story Trailer
NieR: Automata Become As Gods Edition
Metro Exodus Gameplay Trailer
Kingdom Hearts III Frozen Trailer
The Division 2 Gameplay Demo
Dying Light 2 Gameplay Demo
Dying Light 2 Announce Trailer
Battlefield 5 Single Player Teaser

Federal cybersecurity report says nearly 75% of agencies at risk

The latest federal cybersecurity report holds little good news regarding the security posture of government agencies, and experts are not surprised by the findings.

The Office of Management and Budget (OMB) and the Department of Homeland Security (DHS) developed the report in accordance with President Donald Trump’s cybersecurity executive order issued last year. The report acknowledged the difficulties agencies face in terms of budgeting, maintaining legacy systems and hiring in the face of the cybersecurity skills gap, and it identified 71 of 96 agencies as being either “at risk or high risk.”

“OMB and DHS also found that federal agencies are not equipped to determine how threat actors seek to gain access to their information. The risk assessments show that the lack of threat information results in ineffective allocations of agencies’ limited cyber resources,” OMB and DHS wrote in the report. “This situation creates enterprise-wide gaps in network visibility, IT tool and capability standardization, and common operating procedures, all of which negatively impact federal cybersecurity.”

The federal cybersecurity report assessed the agencies involved against 76 metrics and identified four major areas for improvement: increasing threat awareness, standardizing IT capabilities, consolidating security operations centers (SOCs), and improving leadership and accountability.

Greg Touhill, president of Cyxtera Federal Group, based in Coral Gables, Fla., and former CISO for the United States, said the report was an “accurate characterization of the current state of cyber risk and a reflection of the improvements made over the last five years in treating cybersecurity as a risk management issue, rather than just a technology problem.”

“I am concerned that the deletions of and vacancies in key senior cyber leadership positions [are] sending the wrong message about how important cybersecurity is to the government workforce, commercial and international partners, and potential cyber adversaries,” Touhill wrote via email. “As national prosperity and national security are dependent on a strong cybersecurity program that delivers results that are effective, efficient and secure, I believe cybersecurity ought to be at the top of the agenda, and we need experienced cyber leaders sitting at the table to help guide the right decisions.”

Agencies at risk

The federal cybersecurity report said many agencies lack situational awareness and noted this has been a long-standing issue in the U.S. government.


“For the better part of the past decade, OMB, the Government Accountability Office, and agency [inspectors general] have found that agencies’ enterprise risk management programs do not effectively identify, assess, and prioritize actions to mitigate cybersecurity risks in the context of other enterprise risks,” OMB wrote. “In fact, situational awareness is so limited that federal agencies could not identify the method of attack, or attack vector, in 11,802 of the 30,899 cyber incidents (38%) that led to the compromise of information or system functionality in [fiscal year] 2016.”

Sherban Naum, senior vice president of corporate strategy and technology at Bromium, based in Cupertino, Calif., said improving information sharing might not “address the protection component.”

“Sharing information in real time of an active and fully identified attack is critical. However, more information alone won’t help if there is no contextual basis to understand what was attacked, what vulnerability was leveraged, the attacker’s intent and impact to the enterprise,” Naum said. “I wonder what systems are in place or are needed to process the real-time threat data to then automatically protect the rest of the federal space.”

Not all of the news was bad. OMB noted that 93% of users in the agencies studied use multifactor authentication in the form of personal identity verification cards. However, the report said this was only the beginning, as “agencies have not matured their access management capabilities” for modern mobile use.

“One of the most significant security concerns that results from the current decentralized and fragmented IT landscape is ineffective identity, credential, and access management processes,” OMB wrote. “Fundamentally, any organization must have a clear understanding of the people, assets, and data on its networks.”

The federal cybersecurity report acknowledged the number of high-profile data leaks and breaches across government systems in recent years and said the situation there is not improving.

“Federal agencies do not have the visibility into their networks to effectively detect data exfiltration attempts and respond to cybersecurity incidents. The risk assessment process revealed that 73 percent of agency programs are either at risk or high risk in this critical area,” OMB wrote. “Specific metrics related to data loss prevention and exfiltration demonstrate even greater problems, with only 40 percent of agencies reporting the ability to detect the encrypted exfiltration of information at government-wide target levels. Only 27 percent of agencies report that they have the ability to detect and investigate attempts to access large volumes of data, and even fewer agencies report testing these capabilities annually.”

Additionally, only 16% of agencies have properly implemented encryption on data at rest.

Suggested improvements

The federal cybersecurity report had suggestions for improving many of the poor security findings, including consolidating email systems, creating standard software configurations and a shared marketplace for software, and improving threat intelligence sharing across SOCs. However, many of the suggestions related directly to following National Institute of Standards and Technology (NIST) Cybersecurity Framework guidelines, the Cyber Threat Framework developed by the Office of the Director of National Intelligence, or DHS’ Continuous Diagnostics and Mitigation (CDM) program.

Katherine Gronberg, vice president of government affairs at ForeScout Technologies, based in San Jose, Calif., said the focus of CDM is on real-time visibility.

“For example, knowing you have 238 deployed surveillance cameras found to have a particular vulnerability is a good example of visibility. Knowing that one or more of those cameras is communicating with high-value IT assets outside of its segment is further visibility, and then seeing that a camera is communicating externally with a known, malicious command-and-control IP address is the type of visibility that helps decision-making,” Gronberg wrote via email. “CDM intends to give agencies this level of real-time domain awareness in addition to securing data. It’s worth noting that many agencies are now moving to Phase 3 of CDM, which is about taking action on the problems that are discovered.”
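The tiered visibility Gronberg describes, from inventorying devices to spotting one talking to a known command-and-control address, can be sketched as a simple indicator match against flow records. Everything below is hypothetical: the `Flow` shape, the device names and the blocklist entry (drawn from the 203.0.113.0/24 documentation range) are illustrative, not part of CDM or any ForeScout API.

```python
# Hypothetical sketch: flagging device network flows against a C2 blocklist.
from dataclasses import dataclass


@dataclass
class Flow:
    device: str   # asset identifier, e.g. a deployed camera
    dst_ip: str   # external address the device contacted


def flag_suspicious(flows, c2_blocklist):
    """Return the sorted set of devices seen contacting known-bad IPs."""
    return sorted({f.device for f in flows if f.dst_ip in c2_blocklist})


flows = [
    Flow("camera-017", "203.0.113.9"),   # matches the blocklist below
    Flow("camera-018", "198.51.100.4"),  # benign destination
]
print(flag_suspicious(flows, {"203.0.113.9"}))  # → ['camera-017']
```

In a real deployment, the flow records would come from network telemetry and the blocklist from a threat-intelligence feed; the matching step itself is this simple, which is why the hard part is the visibility, not the logic.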

Katie Lewin, federal director for the Cloud Security Alliance, said “standardization is an effective tool to get the best value from resources,” especially given that many risks faced by government agencies are due to the continued use of legacy systems.

“Standardized, professionally managed cloud systems will significantly help reduce risks and eliminate several threat vectors,” Lewin wrote via email. “If agencies adopt DHS’s Continuous Diagnostics and Mitigation process, they will not have to develop and reinvent custom programs. However, as with all standards, there needs to be some flexibility. Agencies should be able to modify a standard approach within defined limits. Failure to involve agencies in developing a common approach and in defining the boundaries of flexibility will result in limited acceptance and adoption of the common approach.”

Gary McGraw, vice president of security technology at Synopsys Inc., based in Mountain View, Calif., said focusing on standards may not hold much improvement.

“The NIST Framework has lots of very basic advice and is very useful. It would be a step in the right direction. However, it is important to keep in mind that standards generally reflect the bare minimum,” McGraw said. “Organizations that view security solely as a compliance requirement generally fall short, compared to others that treat it as a core or enabling component of their operations.”

Michael Magrath, director of global regulations and standards at OneSpan, said, “Improving resource allocation is crucial to improving our federal cyberdefenses.”

“With $5.7 billion in projected spending across federal civilian agencies, some agencies may cry poor. The report notes that email consolidation can save millions of dollars each year, and unless agencies have improved efficiencies like email consolidation, implemented electronic signatures and migrated to the cloud, there remains an opportunity to reallocate funds to better protect their systems,” Magrath said. “The report also notes that agencies are operating multiple versions of the same software. This adds unnecessary expense, and as more and more agencies migrate to the cloud, efficiencies and cost reductions should follow, enabling agencies to reallocate budget and IT resources to other areas.”

Equinix expands amid a flourishing data center services market

The massive surge in data volumes companies are producing, processing and storing is great news for Equinix. The Redwood City, Calif., company, which operates more than 175 data centers on five continents, is expanding its reach, investing $39 million to enlarge its London data center. It recently formed new partnerships with Microsoft Azure and Google Cloud Platform, allowing its customers to connect to more of the cloud providers’ services. And it is looking at yet more growth worldwide, eying South America, China and the Middle East for possible data center sites.

“We’re always watching the market and looking for expansion opportunities,” said Ryan Mallory, senior vice president for global solutions enablement at Equinix. In an interview at the MIT Sloan CIO Symposium in Cambridge, Mass., on May 23, Mallory spoke about Equinix’s endeavors and potential future plans amid a booming data center services market. His outlook for the company’s continued growth was optimistic.

The reasons: “exponential” increases in data translating into the need for more data center space; more companies experimenting with data-intensive technologies, like blockchain; and, in an economy where sellers of services often become buyers, the ever-increasing need for interconnectivity.

“Now, you have what used to be just an enterprise customer wanting to buy services from Microsoft or Azure in a very transactional manner,” Mallory said. “You actually now have companies like Microsoft or Google or SAP turning around and buying services back from the enterprise — more importantly, buying services from us together.”


Mallory also touched on potential dangers: overbuilding in the data center services market and whether Britain’s impending exit from the EU would force contingency plans. Following are edited excerpts from the conversation.

Equinix is expanding its London data center in response to demand from financial services companies. How much is the growth in the amounts of data companies are using contributing to the need for more data center space?

Ryan Mallory: It’s exponential, and it’s exponential in a positive manner. You just touched on the financial industry. The capabilities associated with financial and the way they do transactions and the way they’re starting to look at blockchain infrastructure — those footprints become pretty massive. The need to have power, cooling and space becomes big. And you start to see trends in financial markets — London, New York, Tokyo — that have these builds for specific verticals. Now, when you start to look at the access to smart devices and those type of integrations, all it’s going to do is incrementally increase the amount of capabilities that are needed for access close to the edge, therefore driving up more demand for data centers.

What we’re seeing is with that comes the need for some of those early adopters of this neutral or this interconnected model to also benefit. Our network partners 10 years ago, people were going, ‘Man, are these companies really going to make it and be viable long term?’ We’ve seen a dramatic resurrection of the companies with fiber capabilities, and we’re partnering with them very carefully and very effectively, because the network access — unless you size that access to the infrastructure accordingly, all you have is big buildings with space and cooling in there.

Is that also what’s driving Equinix’s recent partnerships with Azure and Google?

Mallory: It is. You have to have that connectivity pivot point. But there’s also the demand, because we’ve got the buy side and the sell side that sit inside our facilities. And what we’ve seen is those completely merge, because now you have what used to be just an enterprise customer wanting to buy services from Microsoft or Azure in a very transactional manner. You actually now have companies like Microsoft or Google or SAP turning around and buying services back from the enterprise — more importantly, buying services from us together.

SmartKey is a great example. It allows third-party key management in an encryption fashion. What you’re seeing is this buy-sell relationship really become essential, because it’s not just a retailer and a purchaser. Those sides of the table swap often, and that’s what’s really helping drive the need to have these hyper-interconnected capabilities. And then have more services closer to that, because latency is important to applications and so is the design of how the application is accessed by the end user.

Is there a danger that the data center services market could get overbuilt?

Mallory: It’s a question that hits us all the time because of the amount of development and growth that’s going on. I don’t see it, because the amount of demand and the growth associated with compute and storage that’s out there is so high that that horizon is at least five years out, if at all. The demand that we see from the hyperscalers, as well as our network providers, is pretty substantial.

And now what we’re starting to see is a massive pivot from the enterprise to look at this interconnected-oriented architecture that moves it to become highly distributed, because they know their workforces are highly distributed. Even if it’s just a company in the North American or the U.S. market, they’re still looking at edge deployments in a minimum of four locations, because they know that they’ve got the coverage from a latency and application perspective, either through their own bare-metal deployments or from the IaaS-SaaS providers.

There was talk early on after Brexit that some data center companies might relocate or build facilities in other parts of Europe. Is the expansion of the London data center evidence that those concerns were overblown?

Mallory: We didn’t have those concerns because of the distributed footprint that we’ve had, basically organically and inorganically, over the past five years. The London expansion had to do with demand. We keep a very, very close watch on Brexit and, more importantly, GDPR [General Data Protection Regulation] — what that meant for the markets in the U.K. and the European Union. And while there are concerns about what does that really mean for data flow, the cloud providers and access from the enterprise, we haven’t seen any detrimental effect.

Do you have a Brexit strategy?

Mallory: We watch it, but it’s nothing that we’re building to. We’re not looking at, hey, we’ve got to invest X amount more Capex or from an AFFO perspective and start allocating more capabilities to the U.K. specifically. We’ve got growth models based on land banks that we have, as well as just customer demand both near term and within the next 36 months that we watch and build to.

So, it’s a scenario we need to watch because of just the dynamics of not only information, but revenue flow. But we’ve seen very, very strong demand in all the markets in Europe.

What can you tell me about any future expansions?

Mallory: We’re always watching the market and looking for expansion opportunities. Right now, we’re taking a look at South America, with the acquisition that we made a couple years ago in Alog down in Brazil and then the most recent one with Verizon — that gave us nodes inside Colombia. What we’ve seen is a much larger than expected demand profile for companies wanting this repeatable, Equinized model in countries in South America. We’re looking at it and saying, ‘OK, what is it going to take from an entry point?’ Because a lot of the core markets that we’ve wanted to be in, we’ve done a good job entering or land banking.

China has always got expansion capabilities. Beijing’s on the list. India is always on the list; it’s just a tough nut to crack, from an infrastructure standpoint and from some of the monopolistic rules that apply there and how you have to form JVs [joint ventures], etc. The Middle East is becoming a lot more appealing, with some of the new leaders that are in place and some of the relaxing of rules and laws that have historically made it difficult to enter based on a content distribution standpoint.

Right. There are lots of reforms happening in Saudi Arabia.

Mallory: Absolutely. We’re seeing a ripple effect through the region.

Equinix’s Ryan Mallory discusses what makes an intelligent enterprise in part one of this two-part Q&A.

Pure Storage software tooled up for containers, OpenShift, VMware

Pure Storage software news received less attention than the company’s NVMe-enabled FlashArray last week at its Pure Accelerate user conference, but the new features cover a lot of ground.

The expanded Pure Storage software includes integrated cloud synchronization with VMware, Red Hat OpenShift and Linux-based Docker containers. Pure also introduced the Evergreen Storage Service (ES2), a Pure-managed private cloud with utility pricing for on-premises bare-metal consumption.

Software features tackle virtualization woes

A core tenet in Pure Storage software development is to add data services while keeping things simple for customers, said Sandeep Singh, Pure’s senior director of cloud and new stack solutions marketing.

“This is all part of our strategy to help you build a data-centric architecture to transition workloads faster and with less effort to multiple cloud environments,” Singh said.

Pure Storage wrote its own software-defined stack for VMware SDDC. The VMware-certified code integrates with VMware Virtual Volumes and the full vRealize suite. Authorized users provision data storage to individual virtual machines via policy-based service catalogs. The underlying storage is Pure Storage FlashBlade in place of VMware Virtual SAN.

Pure also launched software tools to address the challenges of persistent storage for Docker application containers. Customers can get a reference design for implementing platform as a service using Pure Storage FlashArray and Red Hat OpenShift.

Additional Pure Storage software can help optimize large-scale Docker deployments with Kubernetes. The Pure Service Orchestrator is intended to speed deployment of both stateless and stateful containers via the Pure stack.

“You have a big pool of storage across many FlashArray and FlashBlade [machines] and can allow the container environment to provision storage automatically,” said Matt Kixmoeller, Pure’s vice president of strategy.
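In Kubernetes terms, letting “the container environment provision storage automatically” typically means a workload submits a PersistentVolumeClaim against a StorageClass backed by the vendor’s driver, which then carves a volume out of the pooled arrays. A minimal sketch of such a claim follows; the StorageClass name `pure-block` and the sizes are purely illustrative assumptions, not Pure’s actual product API.

```python
# Hypothetical sketch: building the PersistentVolumeClaim manifest a container
# platform would submit so a vendor-backed StorageClass provisions the volume.
def make_pvc(name, storage_class, size_gi):
    """Build a Kubernetes PersistentVolumeClaim manifest as a plain dict."""
    return {
        "apiVersion": "v1",
        "kind": "PersistentVolumeClaim",
        "metadata": {"name": name},
        "spec": {
            "accessModes": ["ReadWriteOnce"],
            "storageClassName": storage_class,  # e.g. a class the orchestrator maps to pooled arrays
            "resources": {"requests": {"storage": f"{size_gi}Gi"}},
        },
    }


pvc = make_pvc("app-data", "pure-block", 100)
print(pvc["spec"]["resources"]["requests"]["storage"])  # → 100Gi
```

The point of the orchestration layer is that the application only states a size and a class; which FlashArray or FlashBlade actually backs the volume is decided behind the StorageClass.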

Pure multi-cloud approach includes on-premises managed service

The all-flash array vendor also added data protection software. PureSnap enables snapshot replication from Pure Storage flash to any NFS target. CloudSnap replication to Amazon Web Services’ S3 is on the product roadmap for this year, Singh said.

ES2 builds on the Pure Storage Evergreen Storage subscription that lets customers buy storage upfront and receive nondisruptive upgrades. With ES2, customers select capacity and performance requirements. Pure Storage selects the type and number of hardware arrays needed and deploys them at a customer’s site.

Kixmoeller said ES2 gives enterprises a chance to build scalable capacity while maintaining control of data.

Rob Green, CTO of desktop-as-a-service firm Dizzion, said he expects to deploy ES2 as a private cloud for backup and testing on an existing 42-node Pure Storage environment.

“I think ES2 will be huge for us. I don’t trust the big box clouds to protect my data. Having Pure on premises on a subscription? I’m geeked out about that,” Green said.

For DevOps organizations, the vendor released Pure Storage software development kits for full stack automation with open source configuration engines, including Ansible, Puppet, Python and SaltStack.