Microsoft PowerApps pricing proposal puts users on edge

BOSTON — Microsoft’s proposed licensing changes for PowerApps, the cloud-based development tools for Office 365 and Dynamics 365, have confused users and made them fearful the software will become prohibitively expensive.

Last week, at Microsoft’s SPTechCon user conference, some organizations said the pricing changes, scheduled to take effect Oct. 1, were convoluted. Others said the new pricing — if it remains as previewed by Microsoft earlier this summer — would force them to limit the use of the mobile app development tools.

“We were at the point where we were going to be expanding our usage, instead of using it for small things, using it for larger things,” Katherine Prouty, a developer at the nonprofit Greater Lynn Senior Services, based in Lynn, Mass., said. “This is what our IT folks are always petrified of; [the proposed pricing change] is confirmation of their worst nightmares.”

Planned apps the nonprofit group might have to scrap if the pricing changes take effect include those for managing health and safety risks for its employees and clients in a regulatory-compliant way, and protecting the privacy of employees as they post to social media on behalf of the organization, Prouty said.

Developers weigh in

The latest pricing proposal primarily affects organizations building PowerApps that tap data sources outside of Office 365 and Dynamics 365. People connecting to Salesforce, for example, would pay $10 per user, per month, unless they opt to pay $40 per user, per month for unlimited use of data connectors to third-party apps.

The new pricing would take effect even if customers were only connecting Office 365 to Dynamics 365 or vice versa. That additional cost for using apps they’re already paying for does not sit well with some customers, while others find the pricing scheme perplexing. 
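To see why customers found the tiers confusing, here is a rough back-of-the-envelope comparison. The figures are assumptions drawn from the preview (a $10 per user, per month, per-app plan versus $40 per user, per month for unlimited connectors); final terms were not settled at press time.

```python
# Assumed figures from Microsoft's previewed PowerApps pricing
# (not final): per-app plan vs. unlimited plan.
PER_APP_RATE = 10    # USD per user, per month, per app (assumed)
UNLIMITED_RATE = 40  # USD per user, per month, unlimited apps

def monthly_cost(users: int, apps: int) -> int:
    """Return the cheaper monthly cost for a given user and app count."""
    per_app_total = users * apps * PER_APP_RATE
    unlimited_total = users * UNLIMITED_RATE
    return min(per_app_total, unlimited_total)

# Under these assumptions, the unlimited plan breaks even at four apps:
print(monthly_cost(100, 3))  # per-app plan is cheaper: 3000
print(monthly_cost(100, 5))  # unlimited plan caps the cost: 4000
```

Under these assumed rates, an organization running more than four apps per user would be pushed toward the $40 unlimited tier.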

“It’s all very convoluted right now,” said David Drever, senior manager at IT consultancy Protiviti, based in Menlo Park, Calif.

Manufacturing and service companies that create apps using multiple data sources are among the businesses likely to pay a lot more in PowerApps licensing fees, said IT consultant Daniel Christian of PowerApps911, based in Maineville, Ohio.

Annual PowerApps pricing changes

However, pricing isn’t the only problem, Christian said. Microsoft’s yearly overhaul of PowerApps fees also contributes to customer handwringing over costs.

“Select [a pricing model] and stick with it,” he said. “I’m OK with change; we’ll manage it and figure it out. It’s the repetitive changes that bug me.”

Microsoft began restricting PowerApps access to outside data sources earlier this year, putting into effect changes announced last fall. The new policy required users to purchase a special PowerApps plan to connect to popular business applications such as Salesforce Chatter, GoToMeeting and Oracle Database. The changes previewed earlier this summer would go a step further, introducing per-app fees and closing loopholes in a plan that previously cost $7 per user, per month.

Matt Wade, VP of client services at H3 Solutions Inc., based in Manassas, Va., said customers should watch Microsoft’s official PowerApps blog for future information that might clarify costs and influence possible tweaks to the final pricing model. H3 Solutions is the maker of AtBot, a platform for developing bots for Microsoft’s cloud-based applications.

“People who are in charge of administering Office 365 and the Power Platform need to be hyper-aware of what’s going on,” Wade said. “Follow the blog, comment, provide feedback — and do it respectfully.”

IoT Cybersecurity Improvement Act calls for deployment standards

Proponents of a proposed federal bill are seeking the development of security standards for all government-purchased Internet-connected devices — a move that could spur improved security for IoT deployments across non-government entities as well.     

The IoT Cybersecurity Improvement Act of 2019, co-sponsored by Reps. Robin Kelly (D-Ill.) and Will Hurd (R-Texas), would require the National Institute of Standards and Technology (NIST) to issue guidelines for the secure development, configuration and management of IoT devices. It would also require the federal government to comply with these NIST standards. 

Perhaps more significantly, the bill would likely reach beyond the federal government if passed and made into law. Security experts predict that NIST standards would help elevate IoT security throughout private industry and during development of consumer products.

“Our bill establishes baseline cybersecurity standards for government purchased and operated IoT devices,” Rep. Kelly said in an emailed response to questions about the proposed legislation. “Right now, we are focused on securing government IoT devices. I think the most relevant piece to executives would be the ability to use NIST’s Considerations for Managing Internet of Things (IoT) Cybersecurity and Privacy Risks as a model for internal standards.”

She added, “Our goal remains securing government IoT devices. If these standards are helpful to the private sector then that’s an additional benefit.”

IoT: Speed to market offsets cybersecurity

Security leaders said there’s a need for improved IoT security: Vendors work fast to bring IoT products to market, while enterprise leaders have moved just as quickly to capitalize on IoT deployments. In both cases, the desire for speed typically trumps security concerns, they said.

Now these security concerns are gaining new attention.

“People have been saying for at least three years that there’s a problem and we need to fix it,” said David Alexander, digital trust expert at PA Consulting.

Others agreed, adding that they think NIST is the right entity to take the lead on establishing security standards.

“We need government intervention,” said Balakrishnan Dasarathy, collegiate professor and program chair for Information Assurance at the Graduate School at the University of Maryland University College.

Dasarathy said the ripple effect from federal action on IoT legislation would improve product security for consumers and private industry alike. It would also give appropriate IoT security guidance to chief information security officers (CISOs) and other organizational executives.

“Right now many CISOs struggle to determine adequate security,” Dasarathy said.

Weak IoT security has had significant consequences. The Mirai botnets, for example, exploited vulnerabilities in networked devices and led to a massive distributed denial of service attack in 2016.

The skyrocketing number of connected devices also increases the amount of infrastructure to protect. Gartner, the technology research and advisory firm, predicted that 14.2 billion connected things will be in use this year, a figure that will hit 25 billion by 2021. At that rate of growth, CISOs will be responsible for more than three times as many endpoints in 2023 as they were in 2018.

The emergence of IoT security standards

Despite often treating security as an afterthought, the IoT community — including vendors, executives engaged in IoT initiatives and regulatory bodies — has already started to address security and data privacy issues. This recognition helped create an emerging collection of standards, best practices and regulations, such as California’s IoT device law known as SB-327, the first such state law in the United States; the European Telecommunications Standards Institute has developed similar rules.

However, the IoT Cybersecurity Improvement Act could push IoT safety to the forefront for device makers and end users alike, given the clout NIST carries in setting standards and the purchasing power the federal government wields. The bill was advanced out of the House Oversight and Reform Committee in June.

“It will set a direction that will make it easy for others to follow,” said Gus Hunt, managing director and cyber strategist for Accenture Federal Services.

If the bill passes, IoT device makers that want to sell to the federal government would have to design and manufacture products according to NIST standards. To avoid designing a second-tier product for the nongovernment market, those makers would bring those same government devices to the broader market, Hunt explained.

Even if the IoT Cybersecurity Improvement Act doesn’t pass, Hunt said vendors now recognize that buyers want better security features in their products.

“Many manufacturers realize that they have to find a way [to make sure] that whatever they sell is safe, secure and doesn’t place people at higher risk simply by buying the device,” he added.

Security becoming an IoT priority

Meanwhile, private sector CISOs and CIOs could benefit if the bill passes and NIST develops security standards they can adopt as guidelines for their own IoT deployments.

“NIST standards could give them leverage in their discussions about budget, controls and selection of products,” Alexander said, as NIST protocols in other areas have often become the basis for best practices in private sector organizations seeking to strengthen their own programs.

However, the bill’s future is uncertain. A similar measure was introduced in 2017 and failed to move forward. On the other hand, the IoT Cybersecurity Improvement Act of 2019 does have bipartisan sponsors — which security experts said gives them some hope that Congress will take favorable action on this issue.

Yet that hope comes with a caveat: They said lawmakers — in Congress and elsewhere — must pay attention to each other’s IoT legislation to ensure they’re all moving in the same direction.

Also, they said NIST should work with industry to craft standards. This cooperative approach is one that NIST typically takes, and it would help ensure that all the various laws share common elements so that vendors understand what they must deliver to the market.

“These things cannot be contradictory. All these versions of [IoT] legislation need to be aligned because vendors want to make one version of their product. All the legislation has to be pointing in the same direction, otherwise it’s not going to work,” Alexander said.

Tintri acquisition proposal leaves customers in limbo

While DataDirect Networks awaits the outcome of its proposed acquisition of Tintri assets, Tintri customers wonder what it all means for them.

High-performance computing (HPC) storage specialist DataDirect Networks (DDN) believes the Tintri acquisition is a way to appeal to mainstream enterprise IT shops. DDN made a bid to acquire Tintri’s assets for an undisclosed sum after Tintri filed for bankruptcy in early July — barely a year after going public on the Nasdaq.

DDN sees a way to broaden its appeal with Tintri’s flash-based, analytics-driven storage arrays, but the acquisition isn’t yet certain. The two vendors have signed a letter of intent, but the proposed sale won’t be finalized until the completion of a court-ordered bidding process. That means bidders could emerge to challenge DDN.

Users react to planned Tintri acquisition

The uncertainty surrounding the Tintri acquisition affects its customers, who remain unsure of what it means for their maintenance contracts. DDN declined to estimate how long the bidding process might take.

That leaves Tintri customers bracing for what comes next. The city of Lewiston, Idaho, used Tintri arrays to replace legacy storage. Systems administrator Danny Santiago said the hybrid storage was a “magic cure-all” for labor-intensive management. Lewiston’s storage includes a Tintri T620 and two T820 hybrid arrays for primary storage, backup and replication.

“I spent about half my day fighting LUN management. When we got the Tintri, I got that time back,” Santiago said. “I can go six months and never have to touch the Tintri storage. The interface is beautiful. It gives you metrics to let you know if a problem is [tied to] storage, the network or on the Windows OS side.”

Now, Santiago said he doesn’t know what to expect. His agency is in year one of an extended three-year Tintri support contract for the T620.

“Financially, we’re not in a position to change our storage,” he said. “We put a lot of money into these Tintri boxes, and we need to get the life expectancy out of them.”

The Fay Jones School of Architecture and Design at the University of Arkansas installed Tintri several years ago to replace aging EMC Clariion storage — the forerunner to the Dell EMC VNX SAN array. Scott Zemke, the Fayetteville, Ark., school’s director of technology, said Tintri competitors have already started knocking on his door.

“Quite honestly, we rarely have issues with the Tintri arrays. But, of course, we’re looking at contingency plans if we have to do a refresh. One vendor is offering ridiculously stupid deals to trade in our Tintri storage, so it will be an interesting next couple of months,” Zemke said.

“I know Tintri really wanted the business to work, but it seems like they have just had management problem after management problem. Hopefully, DDN will continue to support the stuff. We have DDN arrays in our HPC data center, and they’re a great company to work with, too,” Zemke said.

Is predictive analytics key to Tintri acquisition?

According to Tintri’s securities filings, DDN’s bid would encompass most of Tintri’s assets, including all-flash and hybrid virtualization arrays. But the predictive Tintri Analytics platform may have a greater impact on DDN’s business. The SaaS-based data lake provides real-time analytics and preventive maintenance. Customers can automate capacity and performance requirements for each virtual machine.

Predictive analytics is considered a valuable feature for modern storage arrays. Hewlett Packard Enterprise considered Nimble Storage’s InfoSight analytics a key driver of its $1.2 billion acquisition of Nimble in 2017, and HPE has since integrated InfoSight into its flagship 3PAR arrays. DDN could follow the same playbook by incorporating Tintri Analytics into its other products.

Tintri’s technology would help DDN serve mainstream enterprises seeking to implement AI and machine learning, said Kurt Kuckein, senior director of marketing at DDN, based in Chatsworth, Calif.

“We have plenty of organizations where we work with data scientists or the analytics team, but we really haven’t had a product for enterprise IT shops. Adding Tintri gives us a well-baked technology and a large installed base,” Kuckein said.

In the near term, DDN plans to maintain the Tintri brand as a separate engineering division. Tintri’s real-time analytics eventually could wind up in DDN’s branded AI ExaScaler turnkey appliances, he said.

Tintri Analytics is part of the Tintri Global Center management portal. The intelligence can predict hardware failures and automate support tickets. Tintri typically shipped replacement parts to customers by the next business day.

According to George Crump, president of IT analyst firm Storage Switzerland, Tintri’s analytics are “as good, if not better,” than Nimble’s InfoSight.

“DDN is probably the perfect acquirer for Tintri,” Crump said. “It’s profitable. It has a massive amount of storage experience. And there’s almost no overlap between the DDN and Tintri product. All the Tintri stuff would be net-new business.”

Will DDN breathe new life into Tintri storage?

The proposed Tintri acquisition follows a rocky period for the vendor. Tintri filed for Chapter 11 protection this month — just weeks after the one-year anniversary of its initial public offering (IPO). Some experts saw going public as a desperation move after Tintri failed to secure additional private investment. Tintri also went through two CEOs between April and June.

Tintri initially hoped for a share price of about $11.50 to raise roughly $109 million in its June 2017 IPO, but the offering opened at $7 and raised only $60 million. Shares rose no higher than $7.75, and Nasdaq eventually delisted Tintri after its shares dropped below $1 for 30 consecutive trading sessions.

Aside from investors’ lukewarm reception, several strategic missteps conspired to doom Tintri. Crump said the company undercut its key differentiators of analytics and quality of service (QoS) when it launched an all-flash array in 2015.

“Tintri’s marketing message should have been, ‘Don’t buy an all-flash array, and here’s why,'” Crump said. “DDN should get rid of the all-flash model and just focus on selling the hybrid arrays. When your system is faster than all of your workloads combined, then you don’t really need QoS. That would get people’s attention.”

Jaguar Land Rover, BI Worldwide share GitLab migration pros and cons

Microsoft’s proposed acquisition of popular code repository vendor GitHub also thrust competitor GitLab into the spotlight. A quarter-million customers tried to move code repositories from GitHub to GitLab last week in the wake of the Microsoft news, a surge that crashed the SaaS version of GitLab.

Enterprises with larger, more complex code repositories will need more than a few days to weigh the risks of the Microsoft acquisition and evaluate alternatives to GitHub. But earlier enterprise GitLab converts have already weighed those tradeoffs, and they shared their experiences with GitLab migration pros and cons.

BI Worldwide, an employee engagement software company in Minneapolis, considered a GitLab migration after price changes to CloudBees Jenkins Enterprise software drove a sevenfold increase in its combined licensing costs for CloudBees Jenkins Enterprise and GitHub Enterprise.

GitLab offers built-in DevOps pipeline tools with its code repositories in both SaaS and self-hosted form. BI Worldwide found it could replace both GitHub Enterprise and CloudBees Jenkins Enterprise with GitLab for less cost, and made the switch in late 2017.

“GitLab offered better functionality over GitHub Enterprise because we don’t have to do the extra work to create web hooks between the code repository and CI/CD pipelines, and its CI/CD tools are comparable to CloudBees,” said Adam Dehnel, product architect at BI Worldwide.

GitLab’s tools include both code version control and app delivery pipelines.
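As a sketch of the built-in pipelines Dehnel describes, a minimal, hypothetical `.gitlab-ci.yml` (the stage and job names here are illustrative) lives in the same repository as the code, so no webhooks are needed between the repository and the CI/CD system:

```yaml
# Hypothetical minimal .gitlab-ci.yml: GitLab runs this pipeline on every
# push, with no webhook wiring between the repository and a CI server.
stages:
  - build
  - test

build-job:
  stage: build
  script:
    - echo "Compiling the application..."

test-job:
  stage: test
  script:
    - echo "Running the test suite..."
```

Because the pipeline definition is versioned alongside the code, pipeline changes are reviewed through the same merge-request workflow as any other change.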

Jaguar Land Rover-GitLab fans challenge Atlassian incumbents

Automobile manufacturer Jaguar Land Rover, based in London, also uses self-hosted GitLab among the engineering teams responsible for its in-vehicle infotainment systems. A small team of three developers in a company outpost in Portland, Ore., began with GitLab’s free SaaS tool in 2016, though the company at large uses Atlassian’s Bitbucket and Bamboo tools.

As of May 2018, about a thousand developers in Jaguar Land Rover’s infotainment division use GitLab, and one of the original Portland developers to champion GitLab now hopes to see it rolled out across the company.

“Atlassian’s software is very good for managing parent-child relationships [between objects] and collaboration with JIRA,” said Chris Hill, head of systems engineering for Jaguar Land Rover’s infotainment systems. “But sometimes vendors can start to get involved with other parts of the software development lifecycle that aren’t their core business, and customers get sold an entire package that they don’t necessarily want.”

A comparison between GitLab and tools such as Bitbucket and Bamboo largely comes down to personal preference rather than technical feature gaps, but Hill said he finds GitLab more accessible to both developers and product managers.

“We can give developers self-service capabilities so they don’t have to chew up another engineer’s time to make merge requests,” Hill said. “We can also use in-browser editing for people who don’t understand code, and run tutorials with pipelines and Rundeck-style automation jobs for marketing people.”

Jaguar Land Rover’s DevOps teams use GitLab’s collaborative comment-based workflow, where teams can discuss issues next to the exact line of code in question.

“That cuts down on noise and ‘fake news’ about what the software does and doesn’t do,” Hill said. “You can make a comment right where the truth exists in the code.”

GitLab offers automated continuous integration testing of its own and plugs in to third-party test automation tools; both are coordinated by the GitLab Runner daemon. Runner will be instrumental in delivering more frequent over-the-air software updates to in-car infotainment systems through a third-party service provider called Redbend, which means Jaguar Land Rover vehicle owners will get automatic infotainment updates without a trip to the dealership. This capability will debut with the new Jaguar I-Pace electric SUV in July 2018.

Balancing GitLab migration pros and cons

BI Worldwide and Jaguar Land Rover both use the self-hosted version of GitLab’s software, which means they escaped the issues SaaS customers suffered with crashes during the Microsoft GitHub exodus. They also avoided a disastrous outage that included data loss for GitLab SaaS customers in early 2017.

Still, their GitLab migrations have come with downsides. BI Worldwide jumped through hoops to get GitLab’s software to work with AWS Elastic File System (EFS), only to endure months of painful conversion from EFS to Elastic Block Store (EBS), which the company just completed.

GitLab never promised that its software would work well with EFS, and part of the issue stemmed from the way AWS handles EFS burst credits for performance. But about three times a day, response time from AWS EFS in the GitLab environment would shoot up from an average of five to eight milliseconds to spikes as high as 900 milliseconds, Dehnel said.

“EBS is quite a bit better, but we had to get an NFS server setup attached to EBS and work out redundancy for it, then do a gross rsync project to get 230 GB of data moved over, then change the mount points on our Rancher [Kubernetes] cluster,” Dehnel said. “The version control system is so critical, so things like that are not taken lightly, especially as we also rely on [GitLab] for CI/CD.”

GitLab is working with AWS to address the issues with its product on EFS, a company spokesperson said. For now, its documentation recommends against deployment with EFS, and the company suggests users consider deployments of GitLab to Kubernetes clusters instead.