
Parallels RAS pushes remote work flexibility

The sudden transition to remote work has created a demand for application and desktop virtualization products that, like Parallels Remote Application Server, will work with whatever device an employee has on hand.

Representatives from the application and desktop virtualization vendor said the COVID-19 outbreak has pushed both new and existing customers to seek flexibility as they strive to handle the unprecedented work-from-home situation.

The Parallels Remote Application Server (RAS) software can be deployed on multiple types of devices — from Macs to Chromebooks and from iPads to Android phones. The company released Parallels RAS 17.1 in December 2019, updating provisioning options and including new multi-tenant architecture possibilities.


John Uppendahl, vice president of communications at Parallels, said the product compares to offerings from Citrix and VMware.

“You can be up and running in less than an hour and deploying virtual Windows desktop applications to any device in less than a day,” Uppendahl said.


Shannon Kalvar, research manager at IDC, listed Parallels among the virtual client computing market’s major players in his 2019-2020 vendor assessment, noting that customers praised its ease of management and ability to work across a range of devices. He said the sudden interest in remote work technology is driving up demand for the companies that provide it.

“Everybody’s phone is ringing off the hook,” he said. “Everybody’s flat out.”

A need for flexibility

Victor Fiss, director of sales engineering at Parallels, said COVID-19 drove many of its customers to seek temporary licenses for hundreds of additional employees. Parallels RAS can run on premises, on the Azure and AWS public clouds and in a hybrid environment, he said, giving existing customers flexibility in expanding.


“A lot of our customers that are running on-prem are now adding 300, 400 users out of the blue because of COVID-19,” he said, adding that hybrid options have been enticing because they provide capacity without affecting the employee’s experience.

With Parallels RAS, he said, deployment is not only fast but also allows for more ways to get work done, such as support for native touch gestures in virtual desktop environments.

“If you’re using a mobile device — iOS or Android — you’re not getting a shrunken-down desktop that’s screaming for a keyboard and mouse you don’t have,” Uppendahl said. “Instead, you’re seeing application shortcuts — you can add or remove shortcuts to any application that runs on Windows — and, when you launch it, it will launch in full screen.”

Deploying Parallels

Wayne Hunter, CEO of managed service provider AvTek Solutions, Inc., said he had used Parallels RAS to enable remote work for a client of his. He said that client, a bank, went from zero remote users to 150 in two days.


“The main thing that makes it easy to use is that it’s easy to install, easy to configure, easy to set up,” he said. “You can go from having nothing installed to having a working system up in just a couple hours.”

Hunter said several factors make Parallels RAS advantageous for IT professionals. The product’s ease of deployment and management, he said, would be especially beneficial to small IT teams managing many users.

For end users, Hunter said, the ability of Parallels RAS to work on a variety of devices without hassle was a key selling point.

“It’s just like logging in at their office,” he said, noting that users would find their profiles, desktop backgrounds and files unaffected by remote access. “It’s all there, just like it looked at the office.”

It can be challenging, Hunter noted, to ensure users have a proper device and high-speed internet connection at home to enable remote work. Parallels RAS, he said, eased those concerns.

“The beautiful part of Parallels RAS is [that] it doesn’t take much resources,” he said. “The software is very lightweight, so even some folks who didn’t have very good internet didn’t have any problems.”

An evolution of the virtualization market

Kalvar has spoken of a split in the virtualization market between the hosting of a desktop or application and fuller-featured workspace products. The pandemic’s work-from-home orders have furthered that divide; companies that are just beginning their efforts to change workflows through technology, he said, are more apt to explore traditional virtualization.

“For those [not far along with their business continuity plans], this is going to be an 18-month business continuity disaster,” he said. “If you’re in a continuity situation, and you don’t already have a solution in play — because, if you did, the first thing you would do is try to expand it — I think you’re looking more at the vendors who went down the virtualization side of the road … just because their technology matches up with what you need.”

“What [those] people need is a really fast, really cheap way to get people working from home quickly,” he added.

Kalvar said businesses — especially those just looking to maintain continuity through the crisis — must seek products that are both easy to stand up and manage.

“You have to be flexible, particularly when you’re in that business continuity situation,” he said. “In operations, you’re always looking for good enough, not perfect.”

“You’re looking for, ‘This solution meets enough of my criteria … at the lowest cost,'” he added.


Top 5 ERP software for small businesses

As ERP software providers have created cloud-based versions of their products, they’ve opened up these capabilities to small businesses.

The per-user, per-month pricing model makes ERP software more accessible to small businesses, and running it in the cloud means that they don’t need to invest in servers or IT staff to deploy, manage and troubleshoot it.

ERP software is ideal for small businesses that have outgrown their spreadsheets, paper-based systems or general small business accounting software and are looking for something that can better handle accounting, customer relationship management and other business functions.

There’s no hard-and-fast rule as to when small businesses should switch to ERP software. But if they’re struggling with a lot of manual tasks, want to get a better picture of the financial health of their business and take advantage of analytics, it might be time to start evaluating different vendors. Some other indicators that it’s time to look at ERP software include spending too much time trying to integrate other software packages to get a full picture of inventory, supply chain and customers, as well as difficulty meeting customer demands.

Here are the top five ERP software choices for small businesses:


OnCloud ERP

Aimed squarely at small businesses in the distribution, wholesale, retail and services sectors, OnCloud ERP is a fully cloud-based ERP software product. The OnCloud ERP suite of applications includes the expected accounting modules for real-time information on cash flow, as well as sales tracking, purchase order and receipt tracking, inventory management and production planning. Add-on modules provide the ability to manage payroll, track and maintain assets, leverage CRM functions like lead tracking and manage projects.

One of the most attractive features for small businesses is that companies can implement OnCloud ERP without an IT department, and the product uses a single platform for all ERP functions. The software also offers mobile device and remote access capabilities.

OnCloud ERP offers a free trial for 14 days. Pricing starts at $10 per user per month for the “StartUp” plan, with a minimum of five users.
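As a rough illustration of how seat-based pricing works out, the sketch below computes a monthly bill from the StartUp plan figures quoted above ($10 per user per month, five-user minimum). The billing rule, that you always pay for at least the plan minimum, is an assumption about how such minimums typically work, not a statement of OnCloud ERP's actual terms.

```python
# Illustrative only: the dollar figures come from the article; the
# "billed for at least the plan minimum" rule is an assumption.
PRICE_PER_USER = 10  # USD per user per month ("StartUp" plan)
MIN_USERS = 5        # plan minimum number of users

def monthly_cost(users: int) -> int:
    """Monthly bill in USD, never below the five-seat minimum."""
    return max(users, MIN_USERS) * PRICE_PER_USER

print(monthly_cost(3))   # 3 users still pay for 5 seats: 50
print(monthly_cost(12))  # 120
```

The same arithmetic is what makes the per-user model attractive to small teams: costs start near the size of the team instead of at the size of a server room.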

Microsoft Dynamics Business Central

While Microsoft Dynamics 365 is geared toward larger businesses, Microsoft offers a Business Central application for small businesses. This product includes financials, supply chain management, customer service and project management in one product.

The analytics capabilities in Business Central include the ability to connect data across accounting, sales, purchasing, inventory and customer transactions, and then run reports in real time using business intelligence dashboards. The product also enables users to access data modeling and analysis to create financial forecasts.

Because it’s a Microsoft product, users can integrate Business Central with Excel, Word, Outlook and Azure. Microsoft also offers pre-built add-on products such as Continia Document Capture 365, for recognizing documents and approving invoices, and Jet Reports, for creating financial reports inside Excel.

The pricing model is a per-user, per-month fee, based on whether the company chooses a basic or premium version. Microsoft delivers Business Central entirely in the cloud, and the vendor also offers a mobile application for remote access.

Oracle NetSuite

While Oracle markets NetSuite as ideal for businesses of any size, where NetSuite really shines is with smaller businesses. It’s an all-in-one software suite that includes financials, customer service and e-commerce capabilities, so small business owners don’t have to figure out how to use APIs to connect different software packages. NetSuite also packages analytics in with its ERP software to provide insight into how the business is performing, using key performance indicators.

NetSuite is delivered entirely in the cloud, on the NetSuite Cloud Platform. This enables organizations to add other applications and modules — such as SuitePeople, its human capital management system — to the software. The product is billed as a good fit for the manufacturing, media and publishing, nonprofit, retail, services, advertising, distribution and wholesale, and software industries.

Potential users must contact NetSuite for pricing information.

Sage Intacct

The focus of Sage Intacct is finance and accounting, and Sage bills it as being “built for finance by finance.” Some of the features it offers include the ability to automate complex processes, analyze data, create structured transactions and approvals, and manage multiple currencies and locations. It also provides the ability to track multiple accounts in real time.

For companies that want to extend Sage Intacct beyond core financial functions, the software offers modules for fixed assets, inventory management, and time and expense management, among others. It also offers web services in the form of APIs to integrate with other software systems, as well as a built-in Salesforce integration.

Sage Intacct is priced on a quote basis and is cloud-based.

SAP Business One

As SAP’s ERP product for small businesses, SAP Business One is a single suite that includes financial management, sales and customer management, purchasing and inventory control, and analytics and reporting capabilities. It also includes a mobile access module so that users can check inventory, manage sales and service, and complete approvals from iOS or Android devices.

Companies can customize SAP Business One for their industries, including consumer products, manufacturing, retail, wholesale distribution and professional services. They can also customize the software using application extensions from SAP partners, create web applications that run on desktops or mobile devices, and use self-service options within SAP Business One to create additional fields, tables and forms.

Unlike a lot of other small business ERP products, companies can implement SAP Business One on premises. It’s also delivered in a cloud-based model, priced on a per-user, per-month basis. It’s sold exclusively through SAP partners.

ERP selection advice

Before beginning the ERP software evaluation process, small business leaders need to first identify the business problems they’re trying to solve. They will also want to audit their existing processes to see if the ERP system they’re considering has these processes built in or will let them create workflows.

As small businesses begin the evaluation process, it’s important to keep in mind what the company actually needs and what it can support. Most of these systems let companies add users as needed, as well as extend capabilities using APIs. The top five ERP products for small businesses listed here have features that go beyond basic accounting and let small businesses compete with larger companies, using tools that previously were not affordable.


Cloudian CEO: AI, IoT drive demand for edge storage

AI and IoT are driving demand for edge storage because data is being created faster than it can reasonably be moved across clouds, object storage vendor Cloudian’s CEO said.

Cloudian CEO Michael Tso said “Cloud 2.0” is giving rise to the growing importance of edge storage among other storage trends. He said customers are getting smarter about how they use the cloud, and that’s leading to growing demand for products that can support private and hybrid clouds. He also detects an increased demand for resiliency against ransomware attacks.

We spoke with Tso about these trends, including the Edgematrix subsidiary Cloudian launched in September 2019 that focuses on AI use cases at the edge. Tso said we can expect more demand for edge storage and spoke about an upcoming Cloudian product related to this. He also talked about how AI relates to object storage, and if Cloudian is preparing other Edgematrix-like spinoffs.

What do you think storage customers are most concerned with now?
Michael Tso: I think there is a lot, but I’ll just concentrate on two things here. One is that they continue to just need lower-cost, easier to manage and highly scalable solutions. That’s why people are shifting to cloud and looking at either public or hybrid/private.

Related to that point is I think we’re seeing a Cloud 2.0, where a lot of companies now realize the public cloud is not the be-all, end-all and it’s not going to solve all their problems. They look at a combination of cloud-native technologies and use the different tools available wisely.

I think there’s the broad brush of people needing scalable solutions and lower costs — and that will probably always be there — but the undertone is people getting smarter about private and hybrid.

Point number two is around data protection. We’re now seeing more and more customers worried about ransomware. They’re keeping backups for longer and longer and there is a strong need for write-once compliant storage. They want to be assured that any ransomware that is attacking the system cannot go back in time and mess up the data that was stored from before.

Cloudian actually invested very heavily in building write-once compliant technologies, primarily for financial and the military market because that was where we were seeing it first. Now it’s become a feature that almost everyone we talked to that is doing data protection is asking for.

People are getting smarter about hybrid and multi-cloud, but what’s the next big hurdle to implementing it?

Tso: I think as people are now thinking about a post-cloud world, one of the problems that large enterprises are coming up with is data migration. It’s not easy to add another cloud when you’re fully in one. I think if there’s any kind of innovation in being able to off-load a lot of data between clouds, that will really free up that marketplace and allow it to be more efficient and fluid.

Right now, cloud is a bunch of silos. Whatever data people have stored in cloud one is kind of what they’re stuck with, because it will take them a lot of money to move data out to cloud two, and it’s going to take them years. So, they’re kind of building strategies around that as opposed to really, truly being flexible in terms of where they keep data.

What are you seeing on the edge?

Tso: We’re continuing to see more and more data being created at the edge, and more and more use cases of the data needing to be stored close to the edge because it’s just too big to move. One classic use case is IoT. Sensors, cameras — that sort of stuff. We already have a number of large customers in the area and we’re continuing to grow in that area.

The edge can mean a lot of different things. Unfortunately, a lot of people are starting to hijack that word and make it mean whatever they want it to mean. But what we see is just more and more data popping up in all kinds of locations, with the need of having low-cost, scalable and hybrid-capable storage.

We’re working on getting a ruggedized, easy-to-deploy cloud storage solution. What we learned from Edgematrix was that there’s a lot of value to having a ruggedized edge AI device. But the unit we’re working on is going to be more like a shipping container or a truck as opposed to a little box like with Edgematrix.

What customers would need a mobile cloud storage device like you just described?

Tso: There are two distinct use cases here. One is that you want a cloud on the go, meaning it is self-contained. It means if the rest of the infrastructure around you has been destroyed, or your internet connectivity has been destroyed, you are still able to do everything you could do with the cloud. The intention is a completely isolatable cloud.

In the military application, it’s very straightforward. You always want to make sure that if the enemy is attacking your communication lines and shooting down satellites, wherever you are in the field, you need to have the same capability that you have during peak time.

But the civilian market, especially in global disaster, is another area that we are seeing demand. It’s state and local governments asking for it. In the event of a major disaster, oftentimes for a period, they don’t have any access to the internet. So the idea is to run in a cloud in a ruggedized unit that is completely stand-alone until connectivity is restored.

AI-focused Edgematrix started as a Cloudian idea. What does AI have to do with object storage?
Tso: AI is an infinite data consumer. Improvements on AI accuracy is a log scale — it’s an exponential scale in terms of the amount of data that you need for the additional improvements in accuracy. So, a lot of the reasons why people are accumulating all this data is to run their AI tools and run AI analysis. It’s part of the reason why people are keeping all their data.

Being S3 object store compatible is a really big deal because that allows us to plug into all of the modern AI workloads. They’re all built on top of cloud-native infrastructure, and what Cloudian provides is the ability to run those workloads wherever the data happens to be stored, and not have to move the data to another location.

Are you planning other Edgematrix-like spinoffs?
Tso: Not in the immediate future. We’re extremely pleased with the way Edgematrix worked out, and we certainly are open to do more of this kind of spin off.

We’re not a small company anymore, and one of the hardest things for startups in our growth stage is balancing creativity and innovation with growing the core business. We seem to have found a good sort of balance, but it’s not something that we want to do in volume because it’s a lot of work.


Google-Ascension deal reveals murky side of sharing health data

One of the largest nonprofit health systems in the U.S. made headlines when it was revealed that it was sharing patient data with Google under the codename Project Nightingale.

Ascension, a Catholic health system based in St. Louis, partnered with Google to transition the health system’s infrastructure to the Google Cloud Platform, to use the Google G Suite productivity and collaboration tools, and to explore the tech giant’s artificial intelligence and machine learning applications. By doing so, it is giving Google access to patient data, which the search giant can use to inform its own products.

The partnership appears to be technically and legally sound, according to experts. After news broke, Ascension released a statement saying the partnership is HIPAA-compliant and a business associate agreement, a contract required by the federal government that spells out each party’s responsibility for protected health information, is in place. Yet reports from The Wall Street Journal and The Guardian about the possible improper transfer of 50 million patients’ data has resulted in an Office for Civil Rights inquiry into the Google-Ascension partnership.

Legality aside, the resounding reaction to the partnership speaks to a lack of transparency in healthcare. Organizations should see the response both as an example of what not to do and as a call to make patients more aware of how their health data is being used, especially as consumer companies known for collecting and using data for profit become their partners.

Partnership breeds legal, ethical concerns

Forrester Research senior analyst Jeff Becker said Google entered into a similar strategic partnership with Mayo Clinic in September, and the coverage was largely positive.


According to a Mayo Clinic news release, the nonprofit academic medical center based in Rochester, Minn., selected Google Cloud to be “the cornerstone of its digital transformation,” and the clinic would use “advanced cloud computing, data analytics, machine learning and artificial intelligence” to improve healthcare delivery.

But Ascension wasn’t as forthcoming with its Google partnership. It was Google that announced its work with Ascension during a quarterly earnings call in July, and Ascension didn’t issue a news release about the partnership until after the news broke.

“There should have been a public-facing announcement of the partnership,” Becker said. “This was a PR failure. Secrecy creates distrust.”

Matthew Fisher, partner at Mirick O’Connell Attorneys at Law and chairman of its health law group, said the outcry over the Google-Ascension partnership was surprising. For years, tech companies have been trying to get access to patient data to help healthcare organizations and, at the same time, develop or refine their existing products, he said.

“I get the sense that just because it was Google that was announced to have been a partner, that’s what drove a lot of the attention,” he said. “Everyone knows Google mostly for purposes outside of healthcare, which leads to the concern of does Google understand the regulatory obligations and restrictions that come to bear by entering the healthcare space?”

Ascension’s statement in response to the situation said the partnership with Google is covered by a business associate agreement — a distinction Fisher said is “absolutely required” before any protected health information can be shared with Google. Parties in a business associate agreement are obligated by federal regulation to comply with the applicable portions of HIPAA, such as its security and privacy rules.

A business associate relationship allows identifiable patient information to be shared and used by Google only under specified circumstances. It is the legal basis for keeping patient data segregated and restricting Google from freely using that data. According to Ascension, the health system’s clinical data is housed within an Ascension-owned virtual private space in Google Cloud, and Google isn’t allowed to use the data for marketing or research.

“Our data will always be separate from Google’s consumer data, and it will never be used by Google for purposes such as targeting consumers for advertising,” the statement said.


But health IT and information security expert Kate Borten believes business associate agreements and the HIPAA privacy rule they adhere to don’t go far enough to ensure patient privacy rights, especially when companies like Google get involved. The HIPAA privacy rule doesn’t require healthcare organizations to disclose to patients who they’re sharing patient data with.

“The privacy rule says as long as you have this business associate contract — and business associates are defined by HIPAA very broadly — then the healthcare provider organization or insurer doesn’t have to tell the plan members or the patients about all these business associates who now have access to your data,” she said.

Chilmark Research senior analyst Jody Ranck said much of the alarm over the Google-Ascension partnership may be misplaced, but it speaks to a growing concern about companies like Google entering healthcare.

Since the Office for Civil Rights is looking into the partnership, Ranck said there is still a question of whether the partnership fully complies with the law. But the bigger question has to do with privacy and security concerns around collecting and using patient data, as well as companies like Google using patient data to train AI algorithms and the potential biases it could create.


Ranck believes consumer trust in tech companies is declining, especially as data privacy concerns get more play.

“Now that they know everything you purchase and they can listen in to that Alexa sitting beside your bed at night, and now they’re going to get access to health data … what’s a consumer to do? Where’s their power to control their destiny when algorithms are being used to assign you as a high-, medium-, or low-risk individual, as creditworthy?” Ranck said. “All of this starts to feel like a bit of an algorithmic iron cage.”

A call for more transparency

Healthcare organizations and big tech partnerships with the likes of Google, Amazon, Apple and Microsoft are growing. Like other industries, healthcare organizations are looking to modernize their infrastructure and take advantage of state-of-the-art storage, security and data analytics tools, as well as emerging tech like artificial intelligence.

But for healthcare organizations, partnerships like these have an added complexity — truly sensitive data. Forrester’s Becker said the mistake in the Google-Ascension partnership was the lack of transparency. There was no press release early on announcing the partnership, laying out what information is being shared, how the information will be used, and what outcome improvements the healthcare organization hopes to achieve.

“There should also be assurance that the partnership falls within HIPAA and that data will not be used for advertising or other commercial activities unrelated to the healthcare ambitions stated,” he said.

Fisher believes the Google-Ascension partnership raises questions about what the legal, moral and ethical aspects of these relationships are. While Ascension and Google may have been legally in the right, Fisher believes it’s important to recognize that privacy expectations are shifting, which calls for better consumer education, as well as more transparency around where and how data is being used.

Although he believes it would be “unduly burdensome” to require a healthcare organization to name every organization it shares data with, Fisher said better education on how HIPAA operates and what it allows when it comes to data sharing, as well as explaining how patient data will be protected when shared with a company like Google, could go a long way in helping patients understand what’s happening with their data.

“If you’re going to be contracting with one of these big-name companies that everyone has generalized concerns about with how they utilize data, you need to be ahead of the game,” Fisher said. “Even if you’re doing everything right from a legal standpoint, there’s still going to be a PR side to it. That’s really the practical reality of doing business. You want to be taking as many measures as you can to avoid the public backlash and having to be on the defensive by having the relationship found out and reported upon or discussed without trying to drive that discussion.”


Acquia, Drupal founder Dries Buytaert on open source, Vista, CDPs

NEW ORLEANS — In 2000, Acquia CTO Dries Buytaert created his own web platform, Drupal, which became an open source web content management system. And, in turn, he launched Acquia, which became the commercially supported instance of Drupal.

In an interview during the Acquia Engage user conference here this week, Buytaert discussed Vista Equity Partners’ billion-dollar majority stake in the company, the role of Acquia Drupal, marketing automation ambitions and a possible Acquia customer data platform.

Tell me how the Vista deal happened. Were you actively seeking a buyer?

Dries Buytaert: No. We were profitable, we really didn’t need more investment. But at the same time, we have an ambitious roadmap and our competitors are well-funded. We were starting to receive a lot of inbound requests from different firms, including Vista. When they come to you, you’ve got to look at it. It made sense.

How much of Acquia does Vista own; what is ‘a majority?’

Buytaert: I don’t know the exact number, but it’s a majority. Some of our investors got out, some stayed in. AWS, for example, stayed in.

Was CEO Mike Sullivan brought on two years ago to shop Acquia Drupal around for acquisition?

Buytaert: No. It’s funny how people read into these things as if it was all planned out. Mike was brought on because Tom Erickson left. I was in on the search for Mike.

Is your Acquia Drupal role changing now, or do you see it changing soon?

Buytaert: I can’t really speak for Vista, but in the conversation they were very excited about our momentum, our strategy and the team. My belief is that they will leave all of these things intact, including my role.

Last year at this time, you discussed IBM buying Red Hat and Adobe buying Magento and wondered about the future of their open source versions under new ownership. Now you’re in that position with Acquia Drupal.

Drupal creator and Acquia CTO Dries Buytaert addressing the open-source faithful at his company’s Engage user conference.

Buytaert: The way I think about it, we’ve always been majority-owned by investors, and we’ve swapped out investors. For me, nothing really changed, it’s the same strategy, the same beliefs about open source. Just different investors. We’ll just keep going, faster and faster.

We surround Drupal with tools and projects and supercharge it. Ultimately, the financials speak for themselves. We are growing the top line and the bottom line — because the model works. I’ve been balancing this for 12 years, giving back to open source. It’s fascinating to people how we do this but to me, it’s so natural it’s hard to explain. The reality is that 99% of our customers use Drupal, so our success is completely tied to the success of Drupal.

Why did Acquia buy Mautic and jump into the marketing automation space, where there’s much more competition than web content management?

Buytaert: Mautic is the only open source marketing automation platform. Open source disrupted the web content management space, and I don’t think there’s any commercial, standalone, proprietary CMS that is really successful. The ones that are successful do a lot more than that, like Sitecore and Adobe, that do e-commerce, et cetera.

Pure-play proprietary CMS vendors have to shift to something else, like digital experience, to stay relevant. No sane person would pay hundreds of thousands of dollars just for the CMS piece.

In your keynote today, you talked about unified customer profiles and consolidating data sources, and said to stay tuned for future announcements. Is Acquia coming out with a customer data platform (CDP)?

Buytaert: I can’t comment on that, specifically, but we are thinking a lot about the data layer. It seems like there are a lot of similarities with CDPs — it’s about integration and unifying the profile with [personalization engine] Acquia Lift and Mautic. Mautic has about a hundred integrations built by community members. It’s not a new challenge; we have been thinking and working on it for a long time.

The whole CDP space is in an emergent state. No one talked about CDPs two years ago, and now everyone’s a CDP. Maybe that’s why I’m avoiding the term. Some CDPs are glorified tag managers, others are integration platforms, yet others positioned themselves two years ago as a content personalization platform. They come in so many flavors today that ‘CDP’ means different things to different people.

What we’re obsessed with — whether you want to call it a CDP or not — is creating a unified user profile, and doing it in a way that’s complementary to what we have.
Dries Buytaert, CTO, Acquia

What we’re obsessed with — whether you want to call it a CDP or not — is creating a unified user profile, and doing it in a way that’s complementary to what we have.

You’re moving into a space controlled by much bigger competitors like Oracle, Salesforce, Microsoft, Adobe and SAP. Is there enough room for Acquia?

Buytaert: Acquia is an open platform, and open source. Developers want that. Literally, no developer wakes up and says, “Let me buy an Adobe product.” Developers love open source, and that’s why there’s enough space for Acquia.

But many marketers who buy these tools are not developers. They want to sign up for a SaaS subscription, turn it on and start marketing.

Buytaert: If you really want to build great experiences, you need developers. You can’t just have marketers build great digital experiences, because the number of integrations you need is so vast — as with custom legacy systems — it’s not just a little drag-and-drop. Really bringing it together is going to require developers, and empowering them to build these things through open source is going to be a big thing.


The 3 types of DNS servers and how they work

Not all DNS servers are created equal, and understanding how the three different types of DNS servers work together to resolve domain names can be helpful for any information security or IT professional.

DNS is a core internet technology that translates human-friendly domain names, such as www.example.com, into machine-usable IP addresses. The DNS operates as a distributed database, where different types of DNS servers are responsible for different parts of the DNS name space.

The three DNS server types are the following:

  1. DNS stub resolver server
  2. DNS recursive resolver server
  3. DNS authoritative server

Figure 1 below illustrates the three different types of DNS server.

A stub resolver is a software component normally found in endpoint hosts that generates DNS queries when application programs running on desktop computers or mobile devices need to resolve DNS domain names. DNS queries issued by stub resolvers are typically sent to a DNS recursive resolver; the resolver will perform as many queries as necessary to obtain the response to the original query and then send the response back to the stub resolver.
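The stub resolver's side of this exchange can be sketched by building the query it puts on the wire. The following Python fragment is illustrative only: it assembles a minimal A-record query per the standard DNS wire format, with the recursion desired flag set — which is precisely how a stub resolver asks a recursive resolver to do the legwork on its behalf.

```python
import struct

def build_query(name: str, qid: int = 0x1234) -> bytes:
    """Build a minimal DNS A-record query, as a stub resolver would."""
    # Header: ID, flags (RD=1, recursion desired), QDCOUNT=1,
    # ANCOUNT/NSCOUNT/ARCOUNT=0
    header = struct.pack(">HHHHHH", qid, 0x0100, 1, 0, 0, 0)
    # QNAME: each label is length-prefixed; a zero byte terminates the name
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in name.rstrip(".").split(".")
    ) + b"\x00"
    # QTYPE=1 (A record), QCLASS=1 (IN)
    question = qname + struct.pack(">HH", 1, 1)
    return header + question

query = build_query("www.example.com")
# On the wire, "www.example.com" travels as \x03www\x07example\x03com\x00
```

A real stub resolver would send these bytes over UDP port 53 to its configured recursive resolver and parse the response; that round trip is omitted here.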

Types of DNS servers
Figure 1. The three different types of DNS server interoperate to deliver correct and current mappings of IP addresses with domain names.

The recursive resolver may reside in a home router, be hosted by an internet service provider or be provided by a third party, such as Google’s Public DNS recursive resolver or the Cloudflare DNS service.

Since the DNS operates as a distributed database, different servers are responsible — authoritative in DNS-speak — for different parts of the DNS name space.

Figure 2 illustrates a hypothetical DNS resolution scenario in which an application uses all three types of DNS servers to resolve the domain name www.example.com into an IPv4 address — in other words, a DNS address resource record.

DNS servers interoperating
Figure 2. DNS servers cooperate to accurately resolve an IP address from a domain name.

In step 1, the stub resolver at the host sends a DNS query to the recursive resolver. In step 2, the recursive resolver resends the query to one of the DNS authoritative name servers for the root zone. This authoritative name server does not have the response to the query but is able to provide a reference to the authoritative name server for the .com zone. As a result, the recursive resolver resends the query to the authoritative name server for the .com zone.

This process continues until the query is finally resent to an authoritative name server for the www.example.com zone that can provide the answer to the original query — i.e., what are the IP addresses for www.example.com? Finally, in step 8, this response is sent back to the stub resolver.
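The walk down the name space in steps 2 through 8 can be sketched with a few lines of Python. This is a simplification — real zone cuts do not necessarily fall at every label boundary — but assuming one delegation per label, it lists the chain of zones a recursive resolver consults, from the root down:

```python
def delegation_chain(name: str) -> list:
    """Zones a recursive resolver consults for `name`, root first.

    Simplifying assumption: one delegation per label boundary.
    """
    labels = name.rstrip(".").split(".")
    zones = ["."]  # the root zone, served by the root name servers
    for i in range(len(labels) - 1, -1, -1):
        zones.append(".".join(labels[i:]) + ".")
    return zones

print(delegation_chain("www.example.com"))
# → ['.', 'com.', 'example.com.', 'www.example.com.']
```

Each hop in that list corresponds to one referral: the root servers point to the .com servers, which point to the example.com servers, which finally answer the query.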

One thing worth noting is that all these DNS messages are transmitted in the clear, and there is the potential for malicious actors to monitor users’ internet activities. Anyone administering DNS servers should be aware of DNS privacy issues and the ways in which those threats can be mitigated.


SAP Concur creates Slack bot for booking flights

SAP Concur has created a Slack bot that lets users book travel and submit expenses within the team collaboration app. It’s the type of advanced business integration that Slack has embraced as a way to differentiate its platform from Microsoft Teams and Cisco Webex Teams.

The SAP Concur Slack bot lets workers search and book flights within a Slack messaging channel. The integration makes it possible to share a flight itinerary with other Slack team members, who can then schedule the same travel arrangements with a couple of clicks. After travelers book a flight, they can direct the bot to create an expense report.

The travel bot is SAP Concur’s latest partnership with Slack. In March, the two vendors released a Slack bot that lets a user file, approve and manage expense reports. A worker could create an expense report, for example, by messaging the bot, “Expense $15 for lunch.”

SAP Concur has not released any bots that are compatible with Slack competitors Microsoft Teams or Cisco Webex Teams, although it does have integrations with Microsoft Outlook 365 and the AI voice assistant platform Alexa for Business.

SAP Concur’s Slack bot for travel uses technology from the consumer flight search tool Hello Hipmunk, which SAP Concur acquired in 2016. The business-grade application of the software syncs with the travel policies enterprises set within SAP Concur.

“I can see significant potential for this to cut down on email back and forth that typically occurs when a travel department sends a list of options to an employee, and then they respond, and then there’s some back and forth before everyone agrees on a travel plan,” said Irwin Lazar, analyst at Nemertes Research based in Mokena, Ill.

Slack builds on integration advantage

Slack has a better track record than its rivals for being able to accommodate and take advantage of deep integrations with third-party business software like SAP Concur, said Wayne Kurtzman, analyst at IDC. Slack has more than 1,500 apps in its directory, far exceeding the inventory of other leading team collaboration platforms.

“Microsoft and Cisco have to make pushes — and they will — to leverage AI in new and different ways that make work easier, but it really has to be easier,” Kurtzman said. “Slack appears ready for the competition, and the team collaboration market will retain double-digit growth rates for most of the next 10 years.”

Slack has recently taken steps to make integrations more useful, and easier to create. Last month, the vendor acquired Missions, a division of the startup Robots & Pencils, to incorporate its technology for helping nondevelopers build workflows and integrations within Slack. The company also introduced a new paradigm for using third-party apps, letting users export contextual information from Slack into those business tools.

Businesses are looking for ways to streamline workflows so that employees can get work done faster, and with easier access to the context they need to make decisions. As a result, the integration of business apps with team collaboration platforms like Slack has become one of the hottest new trends in collaboration.

But those integrations need to balance convenience, complexity and confidentiality, said Alan Lepofsky, analyst at Constellation Research. The industry is still in the early stages of determining which tasks are best done in separate apps rather than within a platform like Slack — as well as which types of apps to trust with confidential data, he said.

“That said, I think the creation of these ‘work-related’ bots is a step in the right direction, as our future will certainly be filled with digital assistants. The question is when and to what level of granularity,” Lepofsky said.

CMS creates chief health informatics officer position

The Centers for Medicare and Medicaid Services created a chief health informatics officer position geared toward driving health IT strategy development and technology innovation for the department.

According to the job description, the chief health informatics officer (CHIO) will be charged with developing “requirements and content for health-related information technology, with an initial focus on improving innovation and interoperability.”

The chief health informatics officer position will develop a health IT and information strategy for CMS and the U.S. Department of Health and Human Services, as well as provide subject-matter expertise for health IT information management and technology innovation policy.

Applying health informatics to IT

The position also entails working with providers and vendors to determine how CMS will apply health informatics methods to IT, as well as acting as a liaison between CMS and private industry to lead innovation, according to the job description.

A candidate must have at least one year of “qualifying specialized experience,” including experience using health informatics data to examine, analyze and develop policy and program operations in healthcare programs; offering guidance on program planning to senior management for an organization; and supervising subordinate staff.

Pamela Dixon, co-founder and managing partner of healthcare executive search firm SSi-SEARCH, based in Atlanta, said a chief health informatics officer must have all the skill sets of a chief medical information officer and more. Dixon said a CHIO must be a strategic systems thinker, with the ability to innovate, a strong communicator and a “true leader.”

“The role could and should unlock the key to moving technology initiatives through healthcare dramatically faster, dramatically more effective,” Dixon said.

Finding the right balance

The role could and should unlock the key to moving technology initiatives through healthcare dramatically faster, dramatically more effective.
Pamela Dixon, co-founder and managing partner, SSi-SEARCH

Eric Poon, who has served as Duke University Health System’s chief health information officer for the last three and a half years, said a successful informatics professional enables individuals within an organization to achieve quality improvement and patient safety goals with technology. Poon oversees clinical systems and analytics teams and ensures data that’s been gathered can be used to support quality initiatives and research.

One of the most significant challenges Poon said he faces is determining how to balance resources between the day-to-day and “what’s new,” along with making data accessible in a “high-quality way,” so faculty and researchers can easily access the data to support their work in quality improvement and clinical research. Being successful means creating a bridge between technology and individuals within the organization, Poon said.

“I would like them to say that we are making it possible for them to push the envelope with regards to data science and research and data exchange,” Poon said. “I also like to think we will have innovators who are coming up with new apps, new data science, machine learning algorithms that are realigning how we engage patients and how we are really becoming smart about how to use IT to move the needle in quality and safety … and patient health in a cost-effective way.”

Emerging roles important for change

Dixon said new and emerging leadership roles are important because they make organizations think about both what they need or want the individual to accomplish, as well as what the organization itself could accomplish with the right person.

“The actual title is less important,” she said. “There are CHIOs that might just as easily carry the title chief innovation officer or chief transformation officer or chief data officer, depending on their focus. The important thing is that we encourage and foster growth, value and innovation by creating roles that are aimed at doing just that.”

The creation of a chief health informatics officer position and the push to focus on health IT within CMS is part of a larger initiative started earlier this year after the Donald Trump administration announced MyHealthEData, which allows patients to take control of their healthcare data and allows CMS to follow them on their healthcare journey.

Johnathan Monroe, director of the CMS media relations group, said the organization will be accepting applications for the chief health informatics officer position until July 20.

Tape storage capacity plays important role as data increases

As the amount of new data created is set to hit the multiple-zettabyte level in the coming years, where will we store it all?

With the release of LTO-8 and recent reports that total tape storage capacity continues to increase dramatically, tape is a strong option for long-term retention. But even tape advocates say it’s going to take a comprehensive approach to storage that includes other forms of media to handle the data influx.

Tape making a comeback?

The annual tape media shipment report released earlier this year by the LTO Program showed that 108,000 petabytes (PB) of compressed tape storage capacity shipped in 2017, an increase of 12.9% over 2016. The total marks a fivefold increase over the capacity of just over 20,000 PB shipped in 2008.

LTO-8, which launched in late 2017, provides 30 TB compressed capacity and 12 TB native, doubling the capacities of LTO-7, which came out in 2015. The 12 TB of uncompressed capacity is equivalent to 8,000 movies, 2,880,000 songs or 7,140,000 photos, according to vendor Spectra Logic.

“We hope now [with] LTO-8 [to see] another increase in capacity [next year],” said Laura Loredo, worldwide marketing product manager at Hewlett Packard Enterprise, one of the LTO Program’s Technology Provider Companies along with IBM and Quantum.

The media, entertainment and science industries have been traditionally strong users of tape for long-term retention. Loredo pointed to more recent uses that have gained traction. Video surveillance is getting digitized more often and kept for longer, and there is more of it in general. The medical industry is a similar story, as records get digitized and kept for long periods of time.

The ability to create digital content at high volumes is becoming less expensive, and with higher resolutions, those capacities are increasing, Quantum product and solution marketing manager Kieran Maloney said. So tape becomes a cost-efficient play for retaining that data.

Tape also brings security benefits. Because it is naturally isolated from a network, tape provides a true “air gap” for protection against ransomware, said Carlos Sandoval Castro, LTO marketing representative at IBM. If ransomware is in a system, it can’t touch a tape that’s not connected, making tapes an avenue for disaster recovery in the event of a successful attack.

“We are seeing customers come back to tape,” Loredo said.

LTO roadmap
The LTO roadmap projects out to the 12th generation.

Tape sees clear runway ahead

“There’s a lot of runway ahead for tape … much more so than either flash or disk,” said analyst Jon Toigo, managing partner at Toigo Partners International and chairman of the Data Management Institute.

Even public cloud providers such as Microsoft Azure are big consumers of tape, Toigo said. Those cloud providers can use the large tape storage capacity for their data backup.

Tape is an absolute requirement for storing the massive amounts of data coming down the pike.
Jon Toigo, chairman, Data Management Institute

However, with IDC forecasting dozens of zettabytes in need of storage by 2025, flash and disk will remain important. One zettabyte is equal to approximately 1 billion TBs.

“You’re going to need all of the above,” Toigo said. “Tape is an absolute requirement for storing the massive amounts of data coming down the pike.”

It’s not necessarily about flash versus tape or other comparisons, it’s about how best to use flash, disk, tape and the cloud, said Rich Gadomski, vice president of marketing at Fujifilm and a member of the Tape Storage Council.

The cloud, for example, is helpful for certain aspects, such as offsite storage, but it shouldn’t be the medium for everything.

“A multifaceted data protection approach continues to thrive,” Gadomski said.

There’s still a lot of education needed around tape, vendors said. So often the conversation pits technologies against each other, Maloney said, but instead the question should be “Which technology works best for which use?” In the end, tape can fit into a tiered storage model that also includes flash, disk and the cloud.

In a similar way, the Tape Storage Council’s annual “State of the Tape Industry” report, released in March, acknowledged that organizations are often best served by using multiple media for storage.

“Tape shares the data center storage hierarchy with SSDs and HDDs and the ideal storage solution optimizes the strengths of each,” the report said. “However, the role tape serves in today’s modern data centers is quickly expanding into new markets because compelling technological advancements have made tape the most economical, highest capacity and most reliable storage medium available.”

LTO-8 uses tunnel magnetoresistance (TMR) for tape heads, a switch from the previous giant magnetoresistance (GMR). TMR provides a more defined electrical signal than GMR, allowing bits to be written to smaller areas of LTO media. LTO-8 also uses barium ferrite instead of metal particles for tape storage capacity improvement. With the inclusion of TMR technology and barium ferrite, LTO-8 is only backward compatible to one generation. Historically, LTO had been able to read back two generations and write back to one generation.

“Tape continues to evolve — the technology certainly isn’t standing still,” Gadomski said.

Tape also has a clearly defined roadmap, with LTO projected out to the 12th generation. Each successive generation after LTO-8 projects double the capacity of the previous version. As a result, LTO-12 would offer 480 TB compressed tape storage capacity and 192 TB native. It typically takes between two and three years for a new LTO generation to launch.
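Those roadmap figures follow directly from the doubling rule; a quick sketch starting from the LTO-8 capacities cited above reproduces the LTO-12 numbers:

```python
# Project LTO capacities by doubling each generation after LTO-8.
native_tb, compressed_tb = 12, 30  # LTO-8 baseline from the roadmap
projected = {}
for gen in range(9, 13):
    native_tb *= 2
    compressed_tb *= 2
    projected[f"LTO-{gen}"] = (native_tb, compressed_tb)

print(projected["LTO-12"])  # → (192, 480), in TB native/compressed
```

Four doublings of 12 TB native and 30 TB compressed yield the 192 TB and 480 TB figures the roadmap projects for LTO-12.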

In addition, IBM and Sony have said they developed technology for the highest recording areal density for tape storage media, resulting in approximately 330 TB uncompressed per cartridge.

On the lookout for advances in storage

Spectra Logic, in its “Digital Data Storage Outlook 2018” report released in June, said it projects much of the future zettabytes of data will “never be stored or will be retained for only a brief time.”

“Spectra’s projections show a small likelihood of a constrained supply of storage to meet the needs of the digital universe through 2026,” the report said. “Expected advances in storage technologies, however, need to occur during this timeframe. Lack of advances in a particular technology, such as magnetic disk, will necessitate greater use of other storage mediums such as flash and tape.”

While the report claims the use of tape for secondary storage has declined with backup moving to disk, the need for tape storage capacity in the long-term archive market is growing.

“Tape technology is well-suited for this space as it provides the benefits of low environmental footprint on both floor space and power; a high level of data integrity over a long period of time; and a much lower cost per gigabyte of storage than all other storage mediums,” the report said.