Tag Archives: Business

Canon breach exposes General Electric employee data

Canon Business Process Services, which processes current and former employees’ documents and beneficiary-related documents for General Electric, suffered a security incident, according to a data breach disclosure by GE.

GE systems were not impacted by the cyberattack, according to the company’s disclosure, but personally identifiable information for current and former employees as well as their beneficiaries was exposed in the Canon breach. The breach, which was first reported by BleepingComputer, took place between Feb. 3 and Feb. 14 of this year, and GE was notified of it on Feb. 28. According to the disclosure, “an unauthorized party gained access to an email account that contained documents of certain GE employees, former employees and beneficiaries entitled to benefits that were maintained on Canon’s systems.”

Said documents included “direct deposit forms, driver’s licenses, passports, birth certificates, marriage certificates, death certificates, medical child support orders, tax withholding forms, beneficiary designation forms and applications for benefits such as retirement, severance and death benefits with related forms and documents.” Personal information stolen “may have included names, addresses, Social Security numbers, driver’s license numbers, bank account numbers, passport numbers, dates of birth, and other information contained in the relevant forms.”

GE’s disclosure also said Canon retained “a data security expert” to conduct a forensic investigation. At GE’s request, Canon is offering two years of free identity protection and credit monitoring services.

GE shared the following statement with SearchSecurity regarding the Canon breach.

“We are aware of a data security incident experienced by one of GE’s suppliers, Canon Business Process Services, Inc. We understand certain personal information on Canon’s systems may have been accessed by an unauthorized individual. Protection of personal information is a top priority for GE, and we are taking steps to notify the affected employees and former employees,” the statement read.

Canon did not return SearchSecurity’s request for comment. At press time, Canon had not released a public statement.


Natural language query tools offer answers within limits

Natural language query has the potential to put analytics in the hands of ordinary business users with no training in data science, but the technology still has a ways to go before it develops into a truly transformational tool.

Natural language query (NLQ) is the capacity to query data by simply asking a question in ordinary language rather than code, either spoken or typed. Ideally, natural language query will empower a business user to do deep analytics without having to code.

That ideal, however, doesn’t exist.

In its current form, natural language query allows someone working at a Ford dealership to ask, “How many blue Mustangs were sold in 2019?” and follow up with, “How many red Mustangs were sold in 2019?” to compare the two.

It allows someone in a clothing store to ask, “What’s the November forecast for sales of winter coats?”

It is not, however, advanced enough to pull together unstructured data sitting in a warehouse, and it’s not advanced enough to do complicated queries and analysis.

Natural language query has the potential to democratize data throughout an organization.
Natural language query enables business users to explore data without having to know code.

“We’ve had voice search, albeit in a limited capacity, for years now,” said Mike Leone, senior analyst at Enterprise Strategy Group (ESG), an IT analysis and research firm in Milford, Mass. “We’re just hitting the point where natural language processing can be effectively used to query data, but we’re not even close to utilizing natural language query for complex querying that traditionally would require extensive back-end work and data science team involvement.”

Similarly, Tony Baer, founder and CEO of the database and analytics advisory firm DBInsight, said that natural language query is not at the point where it allows for deep analysis without the involvement of data scientists.

“You can’t go into a given tool or database and ask any random question,” he said. “It still has to be linked to some structure. We’re not at the point where it’s like talking to a human and the brain can process it. Where we are is that, given guardrails, given some structure to the data and syntax, it’s an alternative to structure a query in a specific way.”

NLQ benefits

At its most basic level, business intelligence improves the decision-making process. And the more people within an organization who are able to do data-driven analysis, the more informed the decision-making process becomes, not merely at the top of the organization but throughout its workforce.

We’re just hitting the point where natural language processing can be effectively used to query data, but we’re not even close to utilizing natural language query for complex querying.
Mike Leone, senior analyst, Enterprise Strategy Group

Meanwhile, natural language query doesn’t require significant expertise. It doesn’t force a user to write copious amounts of code to come up with an answer to what might be a relatively simple analytical question. It frees business users from having to request the help of data science teams — at least for basic queries. It opens analytics to more users within an organization.

“Good NLQ will help BI power users and untrained business users alike get to insights more quickly, but it’s the business users who need the most help and have the most to gain,” said Doug Henschen, principal analyst at Constellation Research. “These users don’t know how to code SQL and many aren’t even familiar with query constructs such as ‘show me X’ and ‘by Y time period’ and when to ask for pie charts versus bar charts versus line charts.”

“Think of all the people who want to run a report but aren’t able to do so,” echoed Jen Underwood, founder and principal consultant at Impact Analytix, an IT consulting firm in Tampa, Fla. “There’s some true beauty to the search. How many more people would be able to use it because they couldn’t do SQL? It’s simple, and it opens up the ability to do more things.”

In essence, natural language query and other low-code/no-code tools help improve data literacy, and increasing data literacy is a significant push for many organizations.

That said, in its current form it has limits.

“Extending that type of functionality to the business will enable a new demographic of folks to interact with data in a way that is comfortable to them,” Leone said. “But don’t expect a data revolution just because someone can use Alexa to see how many people bought socks on a Tuesday.”

The limitations

Perhaps the biggest hindrance to full-fledged natural language query is the nature of language itself.

Without even delving into the fact that there are more than 5,000 languages worldwide and an estimated 200 to 400 alphabets, individual languages are complicated. There are words that are spelled the same but have different meanings, others that are spelled differently but sound the same, and words that bear no visual or auditory relation to each other but are synonyms.

And within the business world, there are often terms that might mean one thing to one organization and be used differently by another.

Natural language query tools don’t actually understand the spoken or written word. They understand specific code and are programmed to translate a spoken or written query to SQL, and then translate the response from SQL back into the spoken or written word.
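
To make that translation step concrete, here is a minimal sketch of the kind of mapping an NLQ tool performs behind the scenes, using the Mustang question from earlier. The table and column names are hypothetical, and no vendor's engine works from a single hard-coded rule like this; production systems rely on trained language models and a semantic layer.

```python
# Minimal sketch of the NLQ-to-SQL translation step described above: a toy
# rule that maps one question pattern to SQL. The "sales" table and its
# columns are hypothetical.

import re

def question_to_sql(question: str) -> str:
    """Translate 'How many <color> <model>s were sold in <year>?' into SQL."""
    match = re.search(r"how many (\w+) (\w+?)s? were sold in (\d{4})", question.lower())
    if not match:
        raise ValueError("Question not understood; a human analyst is still needed.")
    color, model, year = match.groups()
    return (
        "SELECT COUNT(*) FROM sales "  # hypothetical sales table
        f"WHERE color = '{color}' AND model = '{model}' AND sale_year = {year};"
    )

print(question_to_sql("How many blue Mustangs were sold in 2019?"))
# SELECT COUNT(*) FROM sales WHERE color = 'blue' AND model = 'mustang' AND sale_year = 2019;
```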

“Natural language query has trouble with things like synonyms, and domain-specific terminology — the context is missing,” Underwood said. “You still need humans for synonyms and the terminology a company might have because different companies have different meanings for different words.”

When natural language queries are spoken, accents can cause problems. And whether spoken or written, the slightest misinterpretation by the tool can result in either a useless response or, much worse, something incorrect.

“Accuracy is king when it comes to querying,” ESG’s Leone said. “All it takes is a minor misinterpretation of a voice request to yield an incorrect result.”

Over the next few years, he said, people will come to rely on natural language query to quickly ask a basic question on their devices, but not much more.

“Don’t expect NLQ to replace data science teams,” Leone said. “If anything, NLQ will serve as a way to quickly return a result that could then be used as a launching pad for more complex queries and expert analysis.”

While the technology is held back now by the limitations of language, that won’t always be the case. The tools will get more sophisticated and, aided by machine learning, will come to understand a user’s patterns to better comprehend just what they’re asking.

“Most of what’s standing in the way is a lack of experience,” DBInsight’s Baer said. “It’s still early on. Natural language query today is far advanced from where it was two years ago, but there’s still a lot of improvement to be made. I think that improvement will be incremental; machine learning will help.”

Top NLQ tools

Though limited in capability, natural language query tools do save business users significant time when asking basic questions of structured data. And some vendors’ natural language query tools are better than others.

Though one of the top BI vendors following its acquisition of Hyperion in 2007, Oracle lost momentum when data visualizations changed the consumption of analytics. Now that augmented intelligence and machine learning are central tenets of BI, however, Oracle is again pushing the technological capabilities of BI platforms. Oracle Analytics Cloud and Day by Day support voice-based queries and its natural language query works in 28 languages, which Henschen said is the broadest language support available.

“Oracle raised the bar on natural language query a couple of years ago when it released its Day By Day app, which used device-native voice-to-text and introduced explicit thumbs-up/thumbs-down training,” Henschen said.

Another vendor Henschen noted is Qlik, which advanced the natural language capabilities of its platform through its January 2019 acquisition of Crunch Data.

“A key asset was the CrunchBot, since rebranded as the Qlik Insight Bot,” Henschen said.

He added that Qlik Insight Bot is a bot-building feature that works with existing Qlik applications, and the bots can subsequently be embedded in third-party applications, including Salesforce, Slack, Skype and Microsoft Teams.

“It brings NLQ outside of the confines of Qlik Sense and interaction with a BI system,” Henschen said.

Tableau is yet another vendor attempting to ease the analytics process with a natural language processing tool. It introduced Ask Data in February 2019, and Tableau’s September 2019 update included the capability to embed Ask Data in other applications.

“When I think about designing a system and taking it the next step forward, Tableau is doing something. [It remembers if someone ran a similar query] and it gives guidance,” Underwood said. “It has the information and knows what people are asking, and it can surface recommendations.”

Baer similarly mentioned Tableau’s Ask Data, while Leone said that the eventual prevalence of natural language query will ultimately be driven by Amazon Web Services, Google and Microsoft.


Alteryx 2020.1 highlighted by new data profiling tool

Holistic Data Profiling, a new tool designed to give business users a complete view of their data while in the process of developing workflows, highlighted the general availability of Alteryx 2020.1 on Thursday.

Alteryx, founded in 1997 and based in Irvine, Calif., is an analytics and data management specialist, and Alteryx 2020.1 is the vendor’s first platform update in 2020. It released its most recent update, Alteryx 2019.4, in December 2019, featuring a new integration with Tableau.

The vendor revealed the platform update in a blog post; in addition to Holistic Data Profiling, it includes 10 new features and upgrades. Among them is a new language toggling feature in Alteryx Designer, the vendor’s data preparation product.

“The other big highlights are more workflow efficiency features,” said Ashley Kramer, Alteryx’s senior vice president of product management. “And the fact that Designer now ships with eight languages that can quickly be toggled without a reinstall is huge for our international customers.”

Holistic Data Profiling is a low-code/no-code feature that gives business users an instantaneous view of their data to help them better understand their information during the data preparation process — without having to consult a data scientist.

After a user drags a Browse Tool — Alteryx’s means of displaying data from a connected tool as well as data profile information, maps, reporting snippets and behavior analysis information — onto Alteryx’s canvas, Holistic Data Profiling provides an immediate overview of the data.

Holistic Data Profiling is designed to help business users understand data quality and how various columns of data may be related to one another, spot trends, and compare one data profile to another as they curate their data.

A sample Holistic Data Profiling gif from Alteryx gives an overview of an organization’s data.

Users can zoom in on a certain column of data to gain deeper understanding, with Holistic Data Profiling providing profile charts and statistics about the data such as the type, quality, size and number of records.
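
For readers who want to see what such a profile contains, the sketch below computes a comparable set of per-column statistics (type, record count, missing values and distinct values) with pandas. It assumes a generic CSV file and is only an analogy; Alteryx's feature produces this overview visually, without any code.

```python
# Rough analogue of a column-level data profile, built with pandas.
# "orders.csv" is a hypothetical input file; Alteryx's Holistic Data
# Profiling surfaces this kind of overview visually, with no code required.

import pandas as pd

def profile(df: pd.DataFrame) -> pd.DataFrame:
    """Return basic per-column statistics: type, size, quality and cardinality."""
    return pd.DataFrame({
        "dtype": df.dtypes.astype(str),   # data type of each column
        "records": len(df),               # number of rows
        "missing": df.isna().sum(),       # simple data-quality signal
        "distinct": df.nunique(),         # cardinality of each column
    })

if __name__ == "__main__":
    df = pd.read_csv("orders.csv")        # hypothetical file
    print(profile(df))
    print(df.describe(include="all"))     # built-in summary statistics
```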

That knowledge subsequently informs the next step in the workflow, ultimately leading to a data-driven decision.

It’s easy to get tunnel vision when analyzing data. Holistic Data Profiling enables end users — via low-code/no-code tooling — to quickly gain a comprehensive understanding of the current data estate.
Mike Leone, analyst, Enterprise Strategy Group

“It’s easy to get tunnel vision when analyzing data,” said Mike Leone, analyst at Enterprise Strategy Group. “Holistic Data Profiling enables end users — via low-code/no-code tooling — to quickly gain a comprehensive understanding of the current data estate. The exciting part, in my opinion, is the speed at which end users can potentially ramp up an analytics project.”

Similarly, Kramer noted the importance of being able to more fully understand data before the final stage of analysis.

“It is really important for our customers to see and understand the landscape of their data and how it is changing every step of the way in the analytic process,” she said.

Alteryx customers were previously able to view their data at any point — on a column-by-column or defined multi-column basis — but not to get a complete view, Kramer added.

“Experiencing a 360-degree view of your data with Holistic Data Profiling is a brand-new feature,” she said.

In addition to Holistic Data Profiling, the new language toggle is perhaps the other signature feature of the Alteryx platform update.

Using Alteryx Designer, customers can now switch between eight languages to collaborate using their preferred language.

Alteryx previously supported multiple languages, but for users to work in their preferred language, each individual user had to install Designer in that language. With the updated version of Designer, they can click on a new globe icon in their menu bar and select the language of their choice to do analysis.

“To truly enable enterprise-wide collaboration, breaking down language barriers is essential,” Leone said. “And with Alteryx serving customers in 80 different countries, adding robust language support further cements Alteryx as a continued leader in the data management space.”

Among the other new features and upgrades included in Alteryx 2020.1 are a new Power BI on-premises loader that will give users information about Power BI reports and automatically load those details into their data catalog in Alteryx Connect; the ability to input selected rows and columns from an Excel spreadsheet; and a new Virtual Folder Connect to save custom queries.

Meanwhile, a streamlined loader of big data from Alteryx to the Snowflake cloud data warehouse is now in beta testing.

“This release and every 2020 release will have a balance of improving our platform … and fast-forwarding more innovation baked in to help propel their efforts to build a culture of analytics,” Kramer said.


Startup Uplevel targets software engineering efficiency

Featuring a business plan that aims to increase software engineering efficiency and armed with $7.5 million in venture capital funding, Uplevel emerged from stealth Wednesday.

Based in Seattle and founded in 2018, Uplevel uses machine learning and organizational science to compile data about the daily activity of engineers in order to ultimately help them become more effective.

One of the main issues engineers face is a lack of time to do their job. They may be assigned a handful of tasks to carry out, but instead of being allowed to focus their attention on those tasks, they’re bombarded by messages or mired in an overabundance of meetings.

Uplevel aims to improve software engineering efficiency by monitoring messaging platforms such as Slack, collaboration software like Jira, calendar tools, and code repository software such as GitHub. It then compiles the data and is able to show how engineers are truly spending their time — whether they’re being allowed to do their jobs or being kept from them through no fault of their own.
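
Uplevel has not published its internal data model, but the basic aggregation described here (merging activity records from several tools into a per-engineer view of time spent) can be sketched roughly as follows. The records and field names are invented for illustration only.

```python
# Illustrative only: combine activity records from chat, ticketing, calendar
# and code review tools into a rough picture of where an engineer's day goes.
# The records below are made up; a real system would pull them from the
# Slack, Jira, GitHub and calendar APIs that tools like Uplevel monitor.

from collections import defaultdict

# (engineer, activity, minutes) records exported from each tool
events = [
    ("alice", "slack_messages", 95),
    ("alice", "meetings", 180),
    ("alice", "code_review", 60),
    ("alice", "focused_coding", 145),
    ("bob", "slack_messages", 40),
    ("bob", "meetings", 90),
    ("bob", "focused_coding", 310),
]

def time_breakdown(records):
    """Sum minutes per engineer and per activity type."""
    totals = defaultdict(lambda: defaultdict(int))
    for engineer, activity, minutes in records:
        totals[engineer][activity] += minutes
    return totals

for engineer, activities in time_breakdown(events).items():
    tracked = sum(activities.values())
    focus = activities.get("focused_coding", 0)
    print(f"{engineer}: {focus / tracked:.0%} of tracked time in focused work")
```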

“I kept seeing pain around engineering effectiveness,” said Joe Levy, co-founder and CEO of Uplevel. “Engineers are often seen as artists, but what they’re trying to manage from a business perspective can be tough. If we can help engineers be more effective, organizations can be more effective without having to throw more bodies at the problem.”

Beyond arming the engineers themselves with data to show how they can be more effective, Uplevel attempts to provide the leaders of engineering teams the kind of information they previously lacked.

If we can help engineers be more effective, organizations can be more effective without having to throw more bodies at the problem.
Joe Levy, CEO and co-founder, Uplevel

While sales and marketing teams have reams of data to drive the decision-making process — and present when asked for reports — engineering teams haven’t had the same kind of solid information.

“Sales, marketing, they have super detailed data that leads to understanding, but the head of engineering doesn’t have that same level of data,” Levy said. “There are no metrics of the same caliber [for engineers], but they’re still asked to produce the same kind of detailed projections.”

As Uplevel emerges from stealth, one of its challenges, as with all startups, will be to demonstrate how it’s providing something different from what’s already on the market.

Without differentiation, its likelihood of success is diminished.

But according to Vanessa Larco, a partner at venture capital investment firm New Enterprise Associates with an extensive background in computer science, what Uplevel provides is something that indeed is unique.

“This is really interesting,” she said. “I haven’t seen anything doing this exact thing. The value proposition of Uplevel is compelling if it helps quantify some of the challenges faced by R&D teams to enable them to restructure their workload and processes to better enable them to reach their goals. I haven’t seen or used the product, but I can understand the need they are fulfilling.”

Similarly, Mike Leone, analyst at Enterprise Strategy Group, believes Uplevel is on to something new.

“There are numerous time-based tracking solutions for software engineering teams available today, but they lack a comprehensive view of the entire engineering ecosystem, including messaging apps, collaboration tools, code repository tools and calendars,” he said. “The level of intelligence Uplevel can provide based on analyzing all of the collected data will serve as a major differentiator for them.”

Uplevel developed from a combination of research done by organizational psychologist David Youssefnia and a winning hackathon concept from Dave Matthews, who previously worked at Microsoft and Hulu. The two began collaborating at Madrona Venture Labs in Seattle to hone their idea of how to improve software engineering efficiency before Levy, also formerly of Microsoft, and Ravs Kaur, whose previous experience includes time at Tableau and Microsoft, joined to help Uplevel go to market.

Youssefnia serves as chief strategy officer, Matthews as director of product management, and Kaur as CTO.

A sample chart from Uplevel displays the distractions faced by an organization’s software engineering team.

Uplevel officially formed in June 2018, attracted its first round of funding in September of that year and its second in April 2019. Leading investors include Norwest Venture Partners, Madrona Venture Group and Voyager Capital.

“Their fundamental philosophy was different from what we’d heard,” said Jonathan Parramore, senior data scientist at Avalara, a provider of automated tax compliance software and an Uplevel customer for about a year. “Engineering efficiency is difficult to measure, and they took a behavioral approach and looked holistically at multiple sources of data, then had the data science to meld it together. I’d say that everything they promised they would do, they have delivered.”

Still, Avalara would eventually like to see more capabilities as Uplevel matures.

“They have amazing reports they generate by looking at the data they have access to, but we’d like them to be able to create reports that are more in real time,” said Danny Fields, Avalara’s CTO and executive vice president of engineering. “That’s coming.”

Moving forward, while Uplevel doesn’t plan to branch out and offer a wide array of products, it is aiming to become an essential platform for all organizations looking to improve software engineering efficiency.

As it builds up its own cache of information about improving software engineering efficiency, it will be able to share that data — masking the identity of individual organizations — with customers so that they can compare the efficiency of their engineers against those of other organizations.

“The goal we’re focused on is to be the de facto platform that is helping engineers do their job,” Levy said. “We want to be a platform they can’t live without, that every big organization is reliant on.”

Go to Original Article
Author:

NLP, embedded BI and data literacy top 2020 analytics trends

While 2019 in business intelligence was marked by consolidation and the incremental improvement of augmented intelligence capabilities, 2020 analytics trends are expected to include increased adoption of embedded BI and a rising level of data literacy across organizations.

The continued improvement of AI features, of course, will also be one of the 2020 analytics trends to watch.

“There’s still no generalized AI, but we’re starting to see it,” said Donald Farmer, principal at TreeHive Strategy. “It was so overhyped, but now we’re seeing generally intelligent assistants.”

Natural language processing

If there’s one AI feature that may become pervasive, it’s natural language processing (NLP).

“There’s a lot of buzz around NLP in BI platforms,” said Mike Leone, an analyst at Enterprise Strategy Group. “While it exists today, it’s still in its infancy. I think as the younger workforce continues to fill data-centric roles in organizations, there will be a growing desire to use voice technology. And I say ‘younger’ simply because that demographic is arguably the most adept at relying on voice technology in their everyday lives and will look for ways to use it to boost productivity at work too.”

By translating a vocal query into an SQL query, NLP will allow users to simply speak a data query and receive a vocal response.

It has the potential to significantly simplify data queries, lessening the need for a background in data science and opening data exploration to a wider range of business users.

But barriers still stand between the ideal of NLP and its reality, limited by the level of its machine learning capabilities.

Computers understand highly specific, unambiguous commands — code. They don’t understand human speech, and even when programmed to do so they can’t adjust for variations such as accents or imperfect syntax. Nevertheless, NLP features are appearing, and are expected to become more prominent as 2020 progresses.

Tableau, for example, introduced Ask Data in early 2019 and updated the tool in its November release. ThoughtSpot, meanwhile, unveiled SearchIQ in the fall of 2018. And Qlik acquired CrunchBot in early 2019 to add conversational capabilities.

“Natural language processing has been a trend the last few years, but now it’s reaching critical mass,” Farmer said.

Embedded BI

Another significant 2020 analytics trend is expected to be the expansion of embedded BI.

Eventually, analytics won’t be conducted on a standalone platform — it will be part of other commonplace business applications.

Instead of running reports — asking a data-driven question, sifting through stored data to come up with the relevant information to address the question, cleaning and preparing the data and ultimately creating a visualization based on the relevant information on a BI platform — business users will have key information delivered without ever having to ask for it, and without having to go through an IT department.

Next-generation architecture will tap data in applications, which will make getting real-time information easier. That will change the analytical paradigm.
Dan Sommer, global market intelligence lead, Qlik

“Next-generation architecture will tap data in applications, which will make getting real-time information easier,” said Dan Sommer, global market intelligence lead at Qlik. “That will change the analytical paradigm. It was about reports once, and increasingly it will be about embedded — insights will come to you in your moment. It will make insights more consumerized — not from IT or developers. Now it will be everyone.”

Similarly, Doug Henschen, analyst at Constellation Research, pointed to embedded BI as a 2020 analytics trend to watch.

“There’s an increasingly popular saying that ‘every company is now a software company,’ but that’s an intentional overstatement pointing to what’s really a leading-edge trend,” he said.

Organizations — innovators and fast-followers — are making use of their data and finding ways to both enrich and monetize that data, he continued.

“A key enabler is embedded BI and analytics platforms that accelerate the development and delivery of data-driven and insight-driven software and services,” Henschen said. “These embedded platforms have been used primarily by independent software and services vendors to date. This more mainstream embrace of embedding is just getting started, and I think we’ll see more of it in 2020 and beyond.”

Data literacy

One more 2020 analytics trend to watch — beyond such known entities as continued migration to the cloud — is the effort to make more of the workforce data literate.

Data literacy — the ability to derive insight from data — has been expanding beyond the realm of data scientists, but it still remains the domain of a minority within most organizations rather than the majority of employees.

“Data literacy as a service [will be a 2020 analytics trend],” Sommer said. “Data literacy is the ability to read, argue and use data, and data literacy has to happen for us to move into the digital stage. Organizations will realize they need help with this.”

Now, in an attempt to increase data literacy, a group of BI vendors are doing more than merely selling their software platforms. They’re also training line-of-business workers in the language of data.

In May of 2019, Qlik, whose research showed only 24% of employees were confident in their ability to effectively utilize data, began offering a free data literacy certification program. The previous October, Tableau, which offers free data literacy training videos, introduced a certification exam for beginners looking to improve their BI skills. And in September, IBM revealed that it certified 140 new data scientists after developing a data scientist certification program in partnership with The Open Group.

Alteryx, meanwhile, operates the Alteryx for Good program, which partners with colleges and universities — Stanford, the University of California Berkeley, the University of Michigan and Harvard among them — to incorporate data science and analytics into their curriculum.

“I think we’ll see a continued emphasis on enabling the desired visibility [of data],” Leone said, “and enablement of more personas to access data and derive insights.”


Microsoft offers few upgrades for Skype server in 2019

Microsoft added no significant end-user features to on-premises Skype for Business in 2019, closing out the year with a December update that mostly fixes bugs.

Microsoft’s lack of investment in Skype server underscores how the company views the product as a placeholder for businesses not yet ready to move to the cloud.

In recent updates, Microsoft extended location-based routing to Skype for Business mobile clients. The feature, now a standard component of modern business phone systems, helps companies reduce PSTN costs by keeping audio traffic in-network when possible.

Microsoft also this year began a phased replacement of the Skype server’s IT control panel, which is based on outdated technology. Another update gave IT admins new tools for automating user settings on a large scale.

Otherwise, the vendor’s July and December updates contained mostly bug fixes and security tweaks for the Skype server. In years past, those updates would have included significant features for the Skype product. But more recently, the vendor has focused its research and development efforts on cloud-based Microsoft Teams.

Consequently, many organizations are not even bothering to purchase the latest iteration of Skype server, version 2019, released last October, said Tom Arbuthnot, principal solutions architect at Modality Systems, a Microsoft-focused systems integrator.

Instead, those customers are sticking with the previous iteration, version 2015. Microsoft has scheduled extended maintenance for the 2015 and 2019 versions of Skype to end simultaneously in 2025, giving businesses little incentive to make the costly switch.

“I don’t see loads and loads of people upgrading to 2019. They will string out 2015 until they are ready to go to Teams,” Arbuthnot said. “[Microsoft is] disincentivizing you from going to 2019.”

The 2019 server introduced new ways to integrate the on-premises product with cloud services, such as cloud voicemail and Azure Active Directory. It also uses more recent security protocols. But it offers virtually no new end-user features compared to what was added to the 2015 version.

Microsoft’s decision to stop investing in its on-premises unified communications product stands in contrast to Cisco. The rival vendor has continued to enhance the features of the messaging app Cisco Jabber even while building out a cloud portfolio based on the Webex suite.

Microsoft appears more focused on winning subscribers to Office 365, a cloud-based suite of productivity apps that includes Teams. In particular, the company has taken aim at the collaboration app Slack, a competitor to Teams.

Microsoft announced last month that Teams had gained 20 million daily active users, more than Slack’s 13 million. But those figures still represent only a fraction of Microsoft’s base of customers, which includes 200 million commercial users of Office 365.

Microsoft has not revealed how many organizations are still using Skype, but it likely remains one of the most-installed UC apps in the market. More than 100 million people used Microsoft Lync as of 2015 when the product was rebranded to Skype for Business.

Meanwhile, Microsoft has announced that it will shutter Skype for Business Online, a cloud-based product within Office 365, on July 31, 2021.


This group of businesses is the most often attacked on earth—here’s how we helped

There are 79 million businesses worldwide that meet the “small or medium business” (SMB) definition of having 300 or fewer employees, and those businesses represent 95 percent of all the companies on earth—which amounts to a staggering 63 percent of the world’s workforce. As gigantic as those figures might be, they’re belied by other numbers that cast a shadow across worldwide employment: Last year, 55 percent of SMBs weathered cyberattacks, 52 percent of these breaches were caused by human error, and, in a quarter of these cases, sensitive customer data was breached. The average cyberattack will cost an SMB U.S. $190,000 and, after a ransomware attack, only one-third of SMBs can remain profitable.

This year, these numbers will only increase because 90 percent of SMBs do not currently have any data protection.

In an era where nearly every company is, in some regard, a technology company, the upcoming end of support for Windows 7 on January 14, 2020 only adds to the pressure on these businesses.

We considered our responsibility to this community

During my keynote at Microsoft Ignite, I spoke at length about the challenges associated with app compatibility, and I shared how Microsoft has taken on a responsibility for compatibility. The reasoning behind this is simple: Among other reasons, when more organizations are operating modern infrastructures, it’s much simpler to keep attacks from spreading throughout the world. Similarly, as my team looked at the needs of the SMB community, we considered our responsibility to their security posture. After some analysis, we discovered a way to help them that didn’t exist within the enterprise offering of Microsoft 365 (a product we had fine-tuned to the needs of large companies).

The answer was Microsoft 365 Business, and I believe it offers SMBs the best possible opportunity to be secure and productive at the lowest possible cost. Microsoft 365 Business offers the same security tools used by many banks, governments, and multi-national corporations, as well as the very same productivity tools in Office 365.

Recently, we’ve undertaken an effort to think and talk about this topic differently.

While many SMBs don’t have the resources to hire a Chief Security Officer (CSO) of their own, I think this community can use Microsoft 365 Business like a CSO. I encourage you to spend a few minutes at YourNewCSO to learn how to use these resources right away. No matter where you are on your security journey, the site and these eight quick (and funny?) videos will show you steps to better secure your business.

Our data clearly demonstrates that combining security with a huge boost in productivity is the type of innovation that will set an SMB apart in a competitive environment. A recent study of two customers by Qualtrics found that employees using modern tools were 50 percent more likely to say they could better serve their customers, and they were 121 percent more likely to feel valued by their company—a sentiment that is directly tied to improved productivity, loyalty, and a positive organizational culture.

Fully use what you already have

Rather than simply try to sell something throughout this post, I’d like to point out some ways SMBs who already own Office 365 can improve their security without spending any additional money. Included below are seven steps to improve your security at no extra cost—you can also read how to do it or watch this quick overview.

  1. Check your Microsoft Secure Score (a scripted way to read the score is sketched after this list).
  2. Set up Multi-Factor Authentication (MFA). Setting up MFA will prevent 99 percent of identity attacks.
  3. Use the built-in mobile application management tools in Office 365.
  4. Set up a separate account for performing administrative tasks.
  5. Use an antivirus solution that leverages the cloud to protect from the latest threats. Microsoft Defender provides some great out-of-the-box capabilities in Windows 10 that more than 50 percent of enterprises are using.
  6. Store files in OneDrive for Business, and the cloud becomes your backup. No more manual PC backups, which saves you time and money. Even better, if you are hacked and are regularly saving your documents to OneDrive, you can simply revert your files back to before the hack occurred.
  7. Stop email auto-forwarding.
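
For step 1, the Secure Score is visible in the Microsoft 365 admin center, and it is also exposed programmatically through the Microsoft Graph security API. The sketch below is a minimal example, assuming an Azure AD app registration with the SecurityEvents.Read.All permission and an access token already acquired; treat the endpoint and field names as assumptions to verify against the current Graph documentation.

```python
# Minimal sketch: read the tenant's Microsoft Secure Score via Microsoft Graph.
# Assumes an app registration with SecurityEvents.Read.All and that
# ACCESS_TOKEN already holds a valid OAuth 2.0 bearer token for
# graph.microsoft.com (for example, acquired with the MSAL library).

import requests

ACCESS_TOKEN = "<token acquired via MSAL or another OAuth client>"

resp = requests.get(
    "https://graph.microsoft.com/v1.0/security/secureScores?$top=1",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

snapshot = resp.json()["value"][0]   # one daily Secure Score snapshot
print(f"Secure Score: {snapshot['currentScore']} of {snapshot['maxScore']}")
```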

As we found from talking with hundreds of SMBs, creating a culture of security is one of the biggest first steps you can take. Right now is the time to educate your employees about how to identify security threats (e.g., don’t click that suspicious link, and if you do, please let someone know), and with Windows 7 very quickly reaching end of support, use this as an opportunity to move to our best available, most secure platform. Microsoft 365 Business can help.


Why move from Office 365 to Microsoft 365 Business

Office 365 provides the suite of productivity tools you know and love, including capabilities like Exchange Online, SharePoint Online, and OneDrive for Business. But when you move to Microsoft 365 Business, you get that power of Office 365 as well as a comprehensive, cloud-based security solution that lets you defend your business against advanced threats. Microsoft 365 helps you to protect against cyberthreats with sophisticated phishing and ransomware protection; lets you control access to sensitive information, using encryption to keep data from being accidentally shared with someone not authorized to see it; and enables you to secure the devices that connect to your data, helping keep iOS, Android, Windows, and Mac devices secure and up-to-date. Microsoft 365 Business is fully integrated with Office 365, so you have one place for administration, billing, and 24×7 support.

Next steps

In addition to visiting YourNewCSO, consider the value of insuring yourself against a cyberattack. I’m excited to announce that, starting today, we’re piloting a new program in the U.S. in collaboration with AXA XL (a global insurer) and Slice Labs (on-demand insurance platform) to offer a free cybersecurity health check and support AXA XL’s provision of cyber insurance for qualified customers that use Microsoft 365 Business, Office 365 Business, and Office 365 Business Premium.

With your permission, AXA XL will assess your organization’s security and offer their services to qualifying customers, potentially with a discount. You can find more information about the collaboration in the AXA XL and Slice Labs press release, and you can read more about their offering and purchase insurance.


New capabilities added to Alfresco Governance Services

Alfresco Software introduced new information governance capabilities this week to its Digital Business Platform through updates to Alfresco Governance Services.

The updates include new desktop synchronization, federation services and AI-assisted legal holds features.

“In the coming year, we expect many organizations to be hit with large fines as a result of not meeting regulatory standards for data privacy, e.g., the European GDPR and California’s CCPA. We introduced these capabilities to help our customers guarantee their content security and circumvent those fines,” said Tara Combs, information governance specialist at Alfresco.

Federation Services enables cross-database search

Federation Services is a new addition to Alfresco Governance Services. Users can search, view and manage content from Alfresco and other repositories, such as network file shares, OpenText, Documentum, Microsoft SharePoint and Dropbox.

Users can also search across different databases with the application without having to migrate content. Federation Services provides one user interface for users to manage all the information resources in an organization, according to the company.

Organizations can also store content in locations outside of the Alfresco platform.
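
Alfresco has not detailed how Federation Services is implemented, but the federation pattern it describes (send one query to every connected repository, normalize the hits and merge them into a single result list, without moving the content) can be sketched generically. Everything below, including the connector functions, is hypothetical.

```python
# Generic sketch of the federation pattern: one query fanned out to several
# repositories, results normalized and merged, content left where it lives.
# The connector functions below are hypothetical stand-ins, not Alfresco APIs.

from dataclasses import dataclass

@dataclass
class Hit:
    repository: str
    title: str
    score: float  # relevance normalized to 0..1 by the federation layer

def search_sharepoint(query: str) -> list:   # hypothetical connector
    return [Hit("SharePoint", f"{query} policy.docx", 0.82)]

def search_fileshare(query: str) -> list:    # hypothetical connector
    return [Hit("Network file share", f"{query}_notes.txt", 0.41)]

CONNECTORS = [search_sharepoint, search_fileshare]

def federated_search(query: str) -> list:
    """Fan the query out to every connected repository and merge the hits."""
    hits = [hit for connector in CONNECTORS for hit in connector(query)]
    return sorted(hits, key=lambda h: h.score, reverse=True)

for hit in federated_search("retention"):
    print(f"[{hit.repository}] {hit.title} ({hit.score:.2f})")
```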

Legal holds feature provides AI-assisted search for legal teams

The legal holds feature provides document search and management capabilities that help legal teams identify relevant content for litigation purposes. Alfresco’s tool now uses AI to discover relevant content and metadata, according to the company.

“AI is offered in some legal discovery software systems, and over time all these specialized vendors will leverage AI and machine learning,” said Alan Pelz-Sharpe, founder and principal analyst at Deep Analysis. He added that the AI-powered feature of Alfresco Governance Services is one of the first such offerings from a more general information management vendor.

“It is positioned to augment the specialized vendors’ work, essentially curating and capturing relevant bodies of information for deeper analysis.”

Desktop synchronization maintains record management policies

Another new feature added to Alfresco Governance Services synchronizes content between a repository and a desktop, along with the records management policies associated with that content, according to the company.

With the desktop synchronization feature, users can expect the same record management policies to apply whether they access a document on their desktop computer or view it from the source repository, according to the company.

When evaluating a product like this in the market, Pelz-Sharpe said the most important feature a buyer should look for is usability. “AI is very powerful, but less than useless in the wrong hands. Many AI tools expect too much of the customer — usability and recognizable, preconfigured features that the customer can use with little to no training are essential.”

The new updates are available as of Dec. 3. There is no price difference between the updated version of Alfresco Governance Services and the previous version. Customers who already had a subscription can upgrade as part of their subscription, according to the company.

According to Pelz-Sharpe, Alfresco has traditionally competed against enterprise content management and business process management vendors. It has pivoted during recent years to compete more directly with PaaS competitors, offering a content- and process-centric platform upon which its customer can build their own applications. In the future, the company is likely to compete against the likes of Oracle and IBM, he said.


3 networking startups rearchitect routing

Startups played a pivotal role in disrupting the business of network switching. Today, they’re on track to do the same to routing.

Software under development by upstarts Arrcus, DriveNets and Volta Networks represents a new routing architecture, industry analysts agreed. Cloud service providers, SaaS providers, telcos and the largest financial institutions are the most likely candidates for deploying the networking startups’ technology in the data center and at the edge.

The vendors’ software could also be useful for peering among internet service providers and for data center interconnects (DCIs). Colocation companies like Equinix, Digital Realty Trust and Global Switch use DCIs to connect their facilities to customer data centers.

Market research firm IDC recently named the three companies 2019 innovators for their work in decoupling routing software from its underlying hardware. Separating the management, control and data planes from the device makes it possible to run the software on commodity products powered by merchant silicon from companies like Broadcom and Intel.

Severing software from hardware and running it on commodity gear — a process called disaggregation — reduces operational expenses. Companies can lower labor costs by managing multiple routers at once, instead of each one separately. The architecture also adds flexibility by making it possible to distribute and manage physical and virtual routers across data centers or at the network edge.

“Effectively, you’ve got a Lego that you can mix and match based on your requirements,” said Brad Casemore, an analyst at IDC. “It leads to a standardized environment where you can run the same software across all of it.”

Disaggregation from switching to routing

Disaggregation in network switching, a nearly 10-year trend, forced incumbents Cisco and Juniper Networks to acquire startups that had developed software capable of providing centralized network management. The transition led to an overhaul in the way the companies’ products manage switching fabrics.

New technologies developed by Arrcus, DriveNets and Volta show that there’s “an evolution in disaggregation to the routing layer,” Casemore said. Each of the vendors is initially targeting its products at communication and cloud service providers.

“It’s really compelling technology,” Casemore said.

Here is a brief description of each of the networking startups, including the key differentiators and market challenges listed in the 2019 IDC Innovators report on disaggregated routing platforms:

— Arrcus built a network operating system, called ArcOS, with extensive routing protocol support. This year, for example, the vendor incorporated the Link State Vector Routing (LSVR) protocol into ArcOS for organizations running hyperscale data centers and large cloud environments.

Arrcus has built its data plane adaptation layer to separate ArcOS from the underlying hardware. ArcOS is also the first independent NOS to support devices powered by Broadcom’s Trident 3, Tomahawk 3, Jericho+ or Jericho2 network silicon. The Jericho2 platform is for 100 Gb and 400 Gb routing.

Despite its innovative technology, Arrcus still has to prove it can deliver significant cost savings and ROI. The company also has to show a simple process for buying and supporting the underlying hardware.

Arrcus, based in San Jose, Calif., has more than 60 employees and has raised $45 million in funding.

— DriveNets developed a container-based router control plane for merchant silicon-based white boxes. Hardware manufacturers bundle the software with their products and sell it under a license that is free from capacity constraints.

The architecture provides carriers with a routing model that uses a cluster of low-cost white boxes capable of scaling to any size. DriveNets based the model on the one used in hyperscale data centers.

DriveNets’ hurdles include convincing communication service providers to change how they procure, deploy and manage router infrastructure. “The adoption of the DriveNets architecture might be slowed by the need for communication service providers to redesign internal processes and management systems,” IDC said.

DriveNets, based in Ra’anana, Israel, has more than 200 employees and has raised $117 million in funding.

— Volta built a cloud-native, cloud-hosted control plane that can spin up and manage as many as 255 instances of virtual routers on a single, on-premises commodity switch. The use of switching gear provides a “significant cost advantage,” while also making Volta technology useful for provider edge routing. Volta’s technology could be helpful to carriers overhauling cell sites to support next-generation 5G wireless technology.

Volta’s technology and its subscription model that covers support, maintenance and hardware warranty could provide significantly lower capital and operational expenses. However, as a startup, in a competitive industry, it faces a “significant challenge” in winning deals over better-known competitors with more money.

Volta, based in Cambridge, Mass., has 51 employees and has raised $3.3 million in funding.

Moving toward software-based routing

Companies with hyperscale data centers, like Amazon, Facebook, Google and Microsoft, have favored disaggregated networking software on standardized hardware for years. Today, major service providers and financial institutions use the same white box switches. Users include AT&T, Comcast, Verizon, JPMorgan Chase and Fidelity Investments.

As a result, in 2018, the share of the global Ethernet data center switching market held by Cisco and Juniper fell, while that of bare-metal switching manufacturers increased, according to IDC.

Analysts believe the same dynamics will likely play out in routing. “People are now noticing and realizing that white box approaches can work. They’re mature,” said Roy Chua, a principal analyst at AvidThink.

Potentially, these companies become M&A targets if they have traction in some high-value accounts.
Brad Casemore, analyst, IDC

Analysts expect carriers to seriously consider white box routers as they build out their network edge to deliver 5G services.

“They’re actually trying to move away from [physical] routers and toward software-based routing,” said Lee Doyle, principal analyst at Doyle Research. “None of this has been hugely deployed yet. But I think we’re going to see significant deployments in 2020 and 2021 in the 5G market.”

Routing sales for Cisco and Juniper have been declining. However, the decrease is primarily due to carriers cutting back on spending after they found they couldn’t wring any more revenue from consumers, Casemore said.

But with 5G deployments on the horizon, incumbents like Cisco and Juniper are probably watching networking startups closely to see which ones are winning deals for routing technology.

“Potentially, these companies become M&A targets if they have traction in some high-value accounts,” Casemore said.


Microsoft’s new approach to hybrid: Azure services when and where customers need them | Innovation Stories

As business computing needs have grown more complex and sophisticated, many enterprises have discovered they need multiple systems to meet various requirements – a mix of technology environments in multiple locations, known as hybrid IT or hybrid cloud.

Technology vendors have responded with an array of services and platforms – public clouds, private clouds and the growing edge computing model – but there hasn’t necessarily been a cohesive strategy to get them to work together.

“We got here in an ad hoc fashion,” said Erik Vogel, global vice president for customer experience for HPE GreenLake at Hewlett Packard Enterprise. Customers didn’t have a strategic model to work from.

Instead, he said, various business owners in the same company may have bought different software as a service (SaaS) applications, or developers may have independently started leveraging Amazon Web Services, Azure or Google Cloud Platform to develop a set of applications.

At its Ignite conference this week in Orlando, Florida, Microsoft announced its solution to such cloud sprawl. The company has launched a preview of Azure Arc, which offers Azure services and management to customers on other clouds or infrastructure, including those offered by Amazon and Google.

John JG Chirapurath, general manager for Azure data, blockchain and artificial intelligence at Microsoft, said the new service is both an acknowledgement of, and a response to, the reality that many companies face today. They are running various parts of their businesses on different cloud platforms, and they also have a lot of data stored on their own new or legacy systems.

In all those cases, he said, these customers are telling Microsoft they could use the benefits of Azure cloud innovation whether or not their data is stored in the cloud, and they could benefit from having the same Azure capabilities – including security safeguards – available to them across their entire portfolio.

“We are offering our customers the ability to take their services, untethered from Azure, and run them inside their own datacenter or in another cloud,” Chirapurath said.

Microsoft says Azure Arc builds on years of work the company has done to serve hybrid cloud needs. For example, Azure Resource Manager, released in 2014, was created with the vision that it would manage resources outside of Azure, including in companies’ internal servers and on other clouds.

That flexibility can help customers operate their services on a mix of clouds more efficiently, without purchasing new hardware or switching among cloud providers. Companies can use a public cloud to obtain computing power and data storage from an outside vendor, but they can also house critical applications and sensitive data on their own premises in a private cloud or server.

Then there’s edge computing, which stores data where the user is, in between the company and the public cloud: on their customers’ mobile devices, for example, or on sensors in smart buildings like hospitals and factories.


That’s compelling for companies that need to run AI models on systems that aren’t reliably connected to the cloud, or to make computations more quickly than if they had to send large amounts of data to and from the cloud. But it also must work with companies’ cloud-based, internet-connected systems.

“A customer at the edge doesn’t want to use different app models for different environments,” said Mark Russinovich, Azure chief technology officer. “They need apps that span cloud and edge, leveraging the same code and same management constructs.”

Streamlining and standardizing a customer’s IT structure gives developers more time to build applications that produce value for the business instead of managing multiple operating models. And enabling Azure to integrate administrative and compliance needs across the enterprise – automating system updates and security enhancements – brings additional savings in time and money.

“You begin to free up people to go work on other projects, which means faster development time, faster time to market,” said HPE’s Vogel. HPE is working with Microsoft on offerings that will complement Azure Arc.

Arpan Shah, general manager of Azure infrastructure, said Azure Arc allows companies to use Azure’s governance tools for their virtual machines, Kubernetes clusters and data across different locations, helping ensure companywide compliance on things like regulations, security, spending policies and auditing tools.

Azure Arc is underpinned in part by Microsoft’s commitment to technologies that customers are using today, including virtual machines, containers and Kubernetes, an open source system for organizing and managing containers. That makes clusters of applications easily portable across a hybrid IT environment – to the cloud, the edge or an internal server.

“It’s easy for a customer to put that container anywhere,” Chirapurath said. “Today, you can keep it here. Tomorrow, you can move it somewhere else.”

Microsoft says these latest Azure updates reflect an ongoing effort to better understand the complex needs of customers trying to manage their Linux and Windows servers, Kubernetes clusters and data across environments.

“This is just the latest wave of this sort of innovation,” Chirapurath said. “We’re really thinking much more expansively about customer needs and meeting them according to how they’d like to run their applications and services.”

Top image: Erik Vogel, global vice president for customer experience for HPE GreenLake at Hewlett Packard Enterprise, with a prototype of memory-driven computing. HPE is working with Microsoft on offerings that will complement Azure Arc. Photo by John Brecher for Microsoft.

