
IBM expands patent troll fight with its massive IP portfolio

After claiming more than a quarter century of patent leadership, IBM has expanded its fight against patent assertion entities, also known as patent trolls, by joining the LOT Network. As a founding member of the Open Invention Network in 2005, IBM has been in the patent troll fight for nearly 15 years.

The LOT Network (short for License on Transfer) is a nonprofit community of more than 600 companies that have banded together to protect themselves against patent trolls and their lawsuits. The group says companies lose up to $80 billion per year on patent troll litigation. Patent trolls are organizations that hoard patents and bring lawsuits against companies they accuse of infringing on those patents.

IBM joins the LOT Network after its $34 billion acquisition of Red Hat, which was a founding member of the organization.

“It made sense to align IBM’s and Red Hat’s view on how to manage our patent portfolio,” said Jason McGee, vice president and CTO of IBM Cloud Platform. “We want to make sure that patents are used for their traditional purposes, and that innovation proceeds and open source developers can work without the threat of patent litigation.”

To that end, IBM contributed more than 80,000 patents and patent applications to the LOT Network to shield those patents from patent assertion entities, or PAEs.


IBM joining the LOT Network is significant for a couple of reasons, said Charles King, principal analyst at Pund-IT in Hayward, Calif. First and foremost, with 27 years of patent leadership, IBM brings a load of patent experience and a sizable portfolio of intellectual property (IP) to the LOT Network, he said.

“IBM’s decision to join should also silence critics who decried how the company’s acquisition of Red Hat would erode and eventually end Red Hat’s long-standing leadership in open source and shared IP,” King said. “Instead, the opposite appears to have occurred, with IBM taking heed of its new business unit’s dedication to open innovation and patent stewardship.”


The LOT Network operates as a subscription service, charging members for the IP protection it provides. Subscription rates are based on company revenue: membership is free for companies making less than $25 million annually; companies with annual revenues between $25 million and $50 million pay $5,000; those between $50 million and $100 million pay $10,000; those between $100 million and $1 billion pay $15,000; and rates are capped at $20,000 for companies with revenues greater than $1 billion.
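The revenue-tiered fee schedule above maps cleanly onto a small lookup function. The sketch below is purely illustrative, using only the tier boundaries reported here; LOT's published schedule governs the actual rates.

```python
def lot_annual_fee(revenue_usd: float) -> int:
    """Annual LOT Network subscription fee for a company, per the
    revenue tiers described in the article (illustrative only)."""
    if revenue_usd < 25_000_000:
        return 0            # free below $25M in annual revenue
    if revenue_usd < 50_000_000:
        return 5_000
    if revenue_usd < 100_000_000:
        return 10_000
    if revenue_usd < 1_000_000_000:
        return 15_000
    return 20_000           # fee is capped above $1 billion in revenue
```

For example, a company with $75 million in annual revenue would fall in the $10,000 tier.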

Meanwhile, the Open Invention Network (OIN) has three levels of participation: members, associate members and licensees. Participation in OIN is free, the organization said.

“One of the most powerful characteristics of the OIN community and its cross-license agreement is that the board members sign the exact same licensing agreement as the other 3,100 business participants,” said Keith Bergelt, CEO of OIN. “The cross license is royalty-free, meaning it costs nothing to join the OIN community. All an organization or business must agree to do is promise not to sue other community participants based on the Linux System Definition.”

IFI Claims Patent Services confirms that 2019 marked the 27th consecutive year in which IBM has been the leader in the patent industry, earning 9,262 U.S. patents last year. The patents reach across key technology areas such as AI, blockchain, cloud computing, quantum computing and security, McGee said.

IBM achieved more than 1,800 AI patents, including a patent for a method for teaching AI systems how to understand implications behind certain text or phrases of speech by analyzing other related content. IBM also gained patents for improving the security of blockchain networks.

In addition, IBM inventors were awarded more than 2,500 patents in cloud technology and grew the number of patents the company has in the nascent quantum computing field.

“We’re talking about new patent issues each year, not the size of our patent portfolio, because we’re focused on innovation,” McGee said. “There are lots of ways to gain and use patents; we got the most for 27 years, and I think that’s a reflection of real innovation that’s happening.”

Since 1920, IBM has received more than 140,000 U.S. patents, he noted. In 2019, more than 8,500 IBM inventors, spanning 45 U.S. states and 54 countries, contributed to the patents awarded to IBM, McGee added.

In other patent-related news, Apple and Microsoft this week joined 35 companies that petitioned the European Union to strengthen its policy on patent trolls. The coalition of companies sent a letter to EU Commissioner for technology and industrial policy Thierry Breton seeking to make it harder for patent trolls to function in the EU.

Go to Original Article
Author:

For insider threat programs, HR should provide checks and balances

Insider threats are on the rise, and firms are doing more to stop them, according to a new report from Forrester Research. But the report warns that insider threat programs can hurt employee engagement and productivity.

One of the ways companies are trying to curtail insider threats is by analyzing employee personal data to better detect suspicious or risky behavior. But IT security may go overboard in its collection process, security may be too stringent, and practices such as social media monitoring might “lead to eroded employee trust,” Forrester warns.

An insider threat program can turn adversarial, impacting employees in negative ways. It’s up to HR to work with IT security to provide the checks and balances, said Joseph Blankenship, vice president and research director of security and risk at Forrester.

Blankenship further discussed insider threat programs in this Q&A. His responses were edited for clarity and length.

Insider threats are increasing. In 2015, malicious insiders accounted for about 26% of internal data breaches. And in 2019, it was 48%, according to Forrester’s survey data. Why this increase?


Joseph Blankenship: I think it’s twofold. You have the ability for users to monetize data and move data in large quantities like they’ve never had before. The ease of moving that data — and the portability of that data — is one factor. The other big factor is we’re looking for [threats] more often. The tools are better. Whenever we see a new capability for threat detection, that’s usually the period when we see this increase [in discovered incidents].

Nonetheless, this must be a stunning finding for a lot of firms. How do they respond to it?

Blankenship: Probably like the stages of grief. We see that pattern quite a bit in security. An event happens, and we realized we are at risk for that event happening again. So now we put effort behind it. We put budget behind it, we buy technology, we build a program and things improve.

Accidental release of internal data accounted for 43% of all insider incidents. What does that say about training?

Blankenship: It’s also culture. Do employees actually understand why the [security] policy is there? Some of that is people trying to get around policies. They find that the security policy is restrictive. You see some of that when people decide to work on their own laptop and their laptop gets stolen. It’s usually people that are somewhat well-meaning, but they find that the policy is getting in their way. Those are all mistakes. Those are all policy violations.

Types of insider threats

Who is responsible in a company for ensuring that the employees understand the rules?

Blankenship: Typically it’s the CISO’s responsibility to do this kind of security education.

Is this primarily the job of the IT security department?

Blankenship: Certainly, it’s in partnership with human resources.

IT manages the internal security program, but many of the risks from an insider threat program are HR-related such as increased turnover or hiring. The HR department’s metrics suffer if the program creates employee friction. Is that the case?

Blankenship: I don’t think that’s necessarily the case. You have to make the employee aware: ‘Hey, we’re doing this kind of monitoring because we have important customer data. We can’t afford a breach of customer trust. We’re doing this monitoring because we have intellectual property.’ Things become a lot less scary, a lot less onerous, when people understand the reasons why. If it’s too heavy-handed, if we’re doing things to either punish employees or make their jobs really difficult, it does create that adversarial relationship.

What is the best practice here? Should HR or IT spell out exactly what they do to protect company security?

Blankenship: I don’t know if you get into all the specifics of a security program, but make the employees aware. ‘We’re going to be monitoring things like email. We may be monitoring your computer usage.’  

What is HR’s role in helping the company implement these policies?


Blankenship: Because HR is the part of the company responsible for employee experience, it is very much incumbent on them to work with the security department and keep it a little bit honest. I’m sure there are a lot of security folks that would love to really turn up the dial on security policies. If you remember some years ago, the big debate was should we allow personal internet usage on company issued devices. There were lots of security reasons why we would say, ‘absolutely not.’ However, the employee experience dictated that we had to allow some of that activity, otherwise we wouldn’t be able to recruit any new employees. We really had to find the balance.

It sounds as if HR’s responsibility here is to provide some checks and balances.

Blankenship: There’s checks and balances as well as helping [IT security] to design the education program. There’s probably not a lot of security technologists that are amazing at building culture, but that is absolutely the job of good HR professionals.


SAP Data Hub opens predictive possibilities at Paul Hartmann

Organizations have access to more data than they’ve ever had, and the number of data sources and volume of data just keeps growing.

But how do companies deal with all the data and can they derive real business use from it? Paul Hartmann AG, a medical supply company, is trying to answer those questions by using SAP Data Hub to integrate data from different sources and use the data to improve supply chain operations. The technology is part of the company’s push toward a data-based digital transformation, where some existing processes are digitized and new analytics-based models are being developed.

The early results have been promising, said Sinanudin Omerhodzic, Paul Hartmann’s CIO and chief data officer.

Paul Hartmann is a 200-year-old firm in Heidenheim, Germany, that supplies medical and personal hygiene products to customers such as hospitals, nursing homes, pharmacies and retail outlets. The main product groups include wound management, incontinence management and infection management.

Paul Hartmann is active in 35 countries and turns over around $2.2 billion in sales a year. Omerhodzic described the company as a pioneer in digitizing its supply chain operations, running SAP ERP systems for 40 years. However, changes in the healthcare industry have led to questions about how to use technology to address new challenges.

For example, an aging population increases demand for certain medical products and services, as people live longer and consume more products than before.

One prime area for digitization was in Paul Hartmann’s supply chain, as hospitals demand lower costs to order and receive medical products. Around 60% of Paul Hartmann’s orders are still handled by email, phone calls or fax, which means that per-order costs are high, so the company wanted to begin to automate these processes to reduce costs, Omerhodzic said.

One method was to install boxes stocked with products and equipped with sensors in hospital warehouses that automatically re-order products when stock reaches certain levels. This process reduced costs by not requiring any human intervention on the customer side. Paul Hartmann installed 9,000 replenishment boxes in about 100 hospitals in Spain, which proved adept at replacing stock when needed. The company then began to consider the next step: how to predict with greater accuracy what products will be needed, when and where, to further reduce the wait time on restocking supplies.
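The sensor-driven boxes amount to a classic reorder-point check: when the sensed stock falls to a threshold, an order fires automatically. The sketch below is a minimal illustration of that idea; the class and field names are hypothetical, as the article does not describe Paul Hartmann's actual implementation.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ReplenishmentBox:
    """A stocked supply box whose sensor reports the current fill level."""
    product: str
    stock: int            # units currently sensed in the box
    reorder_point: int    # threshold at which a replenishment order fires
    order_quantity: int   # units requested per order


def check_and_reorder(box: ReplenishmentBox) -> Optional[dict]:
    """Trigger an order when sensed stock falls to the reorder point."""
    if box.stock <= box.reorder_point:
        return {"product": box.product, "quantity": box.order_quantity}
    return None
```

In practice each sensor reading would run through a check like this, so restocking happens with no human intervention on the customer side.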

Getting predictive needs new data sources

This new level of supply chain predictive analytics requires accessing and analyzing vast amounts of data from a variety of new sources, Omerhodzic said. For example, weather data could show that a storm may hit a particular area, which could result in more accidents, leading hospitals to stock more bandages in preparation. Data from social media sources that refer to health events such as flu epidemics could lead to calculations on the number of people who could get sick in particular regions and the number of products needed to fight the infections.

“All those external data sources — the population data, weather data, the epidemic data — combined with our sales history data, allow us to predict and forecast for the future how many products will be required in the hospitals and for all our customers,” Omerhodzic said.
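The combination Omerhodzic describes, a sales-history baseline adjusted by external signals such as weather and epidemic data, can be sketched in a few lines. The uplift factors below are invented for illustration; a real system would learn them from the combined data sources rather than hard-code them.

```python
def forecast_demand(avg_weekly_sales: float,
                    storm_expected: bool = False,
                    flu_outbreak: bool = False) -> float:
    """Naive demand forecast: a historical baseline scaled by
    external signals (illustrative factors only)."""
    demand = avg_weekly_sales
    if storm_expected:
        demand *= 1.2   # e.g. more accidents, so more bandages
    if flu_outbreak:
        demand *= 1.5   # e.g. more infection-management products
    return demand
```

Even this toy version shows why external feeds matter: the same sales history yields different stocking recommendations depending on what the outside data predicts.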

Paul Hartmann worked with SAP to implement a predictive system based on SAP Data Hub, a software service that enables organizations to orchestrate data from different sources without having to extract the data from the source. AI and machine learning are used to analyze the data, including the entire history of the company’s sales data, and after just a few months, the pilot project was making better predictions than the sales staff, Omerhodzic said.

“We have 200 years selling our product, so the sales force has a huge wealth of information and experience, but the new system could predict even better than they could,” he said. “This was a huge wake up for us and we said we need to learn more about our data, we need to pull more data inside and see how that could improve or maybe create new business models. So we are now in the process of implementing that.”

Innovation on the edge less disruptive

The use of SAP Data Hub as an innovation center is one example of how SAP can foster digital transformation without directly changing core ERP systems, said Joshua Greenbaum, principal analyst at Enterprise Applications Consulting. This can result in new processes that aren’t as costly or disruptive as a major ERP upgrade.


“Eventually this touches your ERP because you’re going to be making and distributing more bandages, but you can build the innovation layer without it being directly inside the ERP system,” Greenbaum said. “When I discuss digital transformation with companies, the easy wins don’t start with the statement, ‘Let’s replace our ERP system.’ That’s the road to complexity and high costs — although, ultimately, that may have to happen.”

For most organizations, Greenbaum said, change management — not technology — is still the biggest challenge of any digital transformation effort.

Change management challenges

At Paul Hartmann, change management has been a pain point. The company is addressing the technical issues of the SAP Data Hub initiative through education and training programs that enhance IT skills, Omerhodzic said, but getting the company to work with data is another matter.

“The biggest change in our organization is to think more from the data perspective side and the projects that we have today,” he said. “To have this mindset and understanding of what can be done with the data requires a completely different approach and different skills in the business and IT. We are still in the process of learning and establishing the appropriate organization.”

Although the sales organization at Paul Hartmann may feel threatened by the predictive abilities of the new system, change is inevitable and affects the entire organization, and the change must be managed from the top, according to Omerhodzic.

“Whenever you have a change there’s always fear from all people that are affected by it,” he said. “We will still need our sales force in the future — but maybe to sell customer solutions, not the products. You have to explain it to people and you have to explain to them where their future could be.”


On-premises server monitoring tools meet business needs, budget

Although the market has shifted and more vendors now provide cloud-based monitoring, there is still a wide range of feature-rich server monitoring tools for organizations that must keep their workloads on site for security and compliance reasons.

Here we examine open source and commercial on-premises server monitoring tools from eight vendors. Although these products broadly achieve the same IT goals, they differ in their approach, complexity of setup — including the ongoing aspects of maintenance and licensing — and cost. 

Cacti

Cacti is an open source network monitoring and graphing front-end application for RRDtool, an industry-standard open source data logging tool. RRDtool is the data collection portion of the product, while Cacti handles network graphing for the data that’s collected. Since both Cacti and RRDtool are open source, they may be practical options for organizations that are on a budget. Cacti support is community-driven.

Cacti can be ideal for organizations that already have RRDtool in place and want to expand on what it can display graphically. For organizations that don’t have RRDtool installed, or aren’t familiar with Linux commands or tools, both Cacti and RRDtool could be a bit of a challenge to install, as they don’t include a simple wizard or agents. This should be familiar territory for Linux administrators, but may require additional effort for Windows admins. Note that Cacti is a graphing product and isn’t really an alerting or remediation product. 

ManageEngine Applications Manager

The ManageEngine system is part of an extensive line of server monitoring tools that include application-specific tools as well as cloud and mobile device management. The application monitoring framework enables organizations to purchase agents from various vendors, such as Oracle and SAP, as well as customer application-specific tools. These server monitoring tools enable admins to perform cradle-to-grave monitoring, which can help them troubleshoot and resolve application server issues before they impact end-user performance. ManageEngine platform strengths include its licensing model and the large number of agents available. Although the monitoring license per device is all-inclusive for interfaces or sensors needed per device, the agents are sold individually.

Thirty-day trials are available for many of the more than 100 agents. Licensing costs range from less than $1,000 for 25 monitors and one user to more than $7,000 for 250 monitors with one user and an additional $245 per user. Support costs are often rolled into the cost of the monitors. This can be ideal for organizations that want to make a smaller initial investment and grow over time.

Microsoft System Center Operations Manager

The product monitors servers, enterprise infrastructure and applications, such as Exchange and SQL, and works with both Windows and Linux clients. Microsoft System Center features include configuration management, orchestration, VM management and data protection. System Center isn’t as expansive on third-party applications as it is with native Microsoft applications. System Center is based on core licensing to match Server 2016 and later licensing models.

The base price for Microsoft System Center Operations Manager starts at $3,600, assuming two CPUs and 16 cores total and can be expanded with core pack licenses. With Microsoft licensing, the larger the environment in terms of CPU cores, the more a customer site can expect to pay. While Microsoft offers a 180-day trial of System Center, this version is designed for the larger Hyper-V environments. Support is dependent on the contract the organization selects.  

Nagios Core

Nagios Core is free open source software that provides metrics to monitor server and network performance. Nagios can help organizations provide increased server, services, process and application availability. While Nagios Core comes with a graphical front end, the scope of what it can monitor is somewhat limited. But admins can deploy additional community-provided front ends that offer more views and additional functionality. Nagios Core natively installs and operates on Linux systems and Unix variants.

For additional features and functionality, the commercial Nagios XI product offers true dashboards, reporting, GUI configuration and enhanced notifications. Pricing for this commercial version ranges from less than $7,000 for 500 nodes and an additional $1,500 per enterprise for reporting and capacity planning tools. In addition to agents for OSes, users can also add network monitoring for a single point of service. Free 60-day trials and community support are available for the products that work with the free Nagios Core download.

Opsview

Opsview system monitoring software includes on-premises agents as well as agents from all the major cloud vendors. While the free version provides 25 hosts to monitor, the product’s main benefit is that it can support both SMBs and the enterprise. Pricing for a comprehensive offering that includes 300 hosts, reporting, multiple collectors and network analyzer is less than $20,000 a year, depending on the agents selected.  

Enterprise packages are available via custom quote. The vendor offers both on-premises and cloud variations. The list of agents Opsview can monitor is one of the most expansive of any of the products, bridging cloud, application, web and infrastructure. Opsview also offers a dedicated mobile application. Support for most packages is 24/7 and includes customer portals and a knowledgebase.

Paessler PRTG Network Monitor

PRTG can monitor everything from the infrastructure to the application stack. The licensing model for PRTG Network Monitor follows a sensor model rather than a node, core or host model, meaning a traditional host might have more than 20 sensors monitoring anything from CPU to bandwidth. Services range from networking and bandwidth monitoring to more application-specific checks, such as alerting on low Microsoft OneDrive or Dropbox drive space. A fully functional 30-day demo is available, and pricing ranges from less than $6,000 for 2,500 sensors to less than $15,000 for an unlimited number of sensors. Support is email-based.

SolarWinds Server and Application Monitor

SolarWinds offers more than 1,000 monitoring templates for various applications and systems, such as Active Directory, as well as several virtualization platforms and cloud-based applications. It also provides dedicated virtualization, networking, databases and security monitoring products. In addition to standard performance metrics, SolarWinds provides application response templates to help admins with troubleshooting. A free 30-day trial is available. Pricing for 500 nodes is $73,995 and includes a year of maintenance.  

Zabbix

This free, open source, enterprise-scale monitoring product includes an impressive number of agents that an admin can download. Although most features aren't point and click, the dashboards are similar to other open source platforms and are more than adequate. Given the zero cost of entry and the sheer number of agents, this could be an ideal product for organizations that have the time and Linux experience to bring it online. Support is community-based, and additional support can be purchased from a reseller.

The bottom line on server monitoring tools

The products examined here differ slightly in size, scope and licensing model. Outside of the open source products, many commercial server monitoring tools are licensed by node or agent type. It’s important that IT buyers understand all the possible options when getting quotes, as they can be difficult to understand.

Pricing varies widely, as do the features of the dashboards of the various server monitoring tools. Ensure the staff is comfortable with the dashboard and alerting functionality of each system as well as mobile ability and notifications. If an organization chooses an open source platform, keep in mind that the installation could require more effort if the staff isn’t Linux savvy.  

The dashboards for the open source monitors typically aren’t as graphical as the paid products, but that’s part of the tradeoff with open source. Many of the commercial products are cloud-ready or have that ability, so even if an organization doesn’t plan to monitor its servers in the cloud today, they can take advantage of this technology in the future. 


How technology intensity accelerates business value – Microsoft Industry Blogs

Organizations that embrace technology intensity are inherently more successful. What exactly is technology intensity, and why is it critical for today’s enterprises to build a cohesive digital strategy?

Technology intensity defined

Technology intensity has three components:

  1. Rapid adoption of hyper-scale cloud platforms
  2. A rational business decision to invest in digital capabilities
  3. A relentless focus on building technology that customers can trust, relying on credible suppliers and building security into new products

As Microsoft CEO Satya Nadella notes, “We must adopt technology in ways that are much faster than what we have done in the past. Each one of us, in our organizations, will have to build our own digital capability on top of the technology we have adopted. Tech intensity is one of the key considerations you have to get right.”

Technology intensity as a critical enabler

Simply put, technology intensity is a critical part of business strategy today. I meet regularly with leaders from companies around the world. In my experience, high-performance companies invest the most in digital capabilities and skillsets. In fact, there is a productivity gap between these top performers and their lesser performing counterparts that directly correlates with the scale of digital investments. 

Other research shows that technology spending into hiring developers and creating innovative software that is owned and used exclusively by a company is a key competitive advantage. These companies cultivate the ability to develop their own “digital IP,” building exclusive software and tools that only their customers have access to. Resources are always scarce, and these companies build differentiated IP on top of existing best-in-class technology platforms. 

In “Putting customers at the center of the OEM supply chain,” a report from the Economist Intelligence Unit (EIU) sponsored by Microsoft, Lorenzo Fornaroli, senior director of global logistics and supply chain at China-based ICT company Huawei Technologies, highlights an advantage of embracing technology intensity: “…as an ICT company, we have the internal resources needed to identify new technologies early and deploy them effectively. Skills and experience with these technologies are readily available in-house.”

New business models

Manufacturers are increasingly using technology intensity principles to extend their supply chain well past the point of delivery. They are creating smart, connected products and digitizing their businesses with Azure IoT, Microsoft AI, Azure Blockchain Service, Dynamics 365, and Microsoft 365.

Rolls-Royce, for example, charges customers of its jet engines a monthly fee based on flying hours. Sellers of industrial machinery such as Sandvik Coromant and Tetra Pak are exploring the concept of charging customers for parts machined and containers filled.

Using the data that connected products transmit back about their condition and usage, manufacturers build new digital services. For these forward-thinking manufacturers, the extended supply chain is an opportunity to move away from selling products to customers based on a one-off, upfront purchase price, and to charge for subscriptions based on performance guarantees. This is “technology intensity in action”, as manufacturers become digital software companies.

Technology intensity in action

Bühler is a global market leader in die casting technology and has adopted technology intensity for connected products. Looking to continue driving innovation in the die casting process, Bühler aggregated data from different die casting cell components under a single cell-management system. Bühler can now monitor, control, and manage a complete die casting cell in a unified, easy-to-use system. The company is also exploring additional avenues for digital transformation, including real-time fleet learning by fine-tuning its artificial intelligence (AI) models to generate new insights.

Lexmark, a manufacturer of laser printers and imaging products, now offers Lexmark Cloud Print Infrastructure as a Service (CPI). Customers no longer manage onsite print infrastructure. Instead, Lexmark installs its own IoT-enabled devices and activates smart services, creating an always-on print environment. To roll out CPI, Lexmark worked with Microsoft to quickly adopt new Internet of Things (IoT), Customer Relationship Management (CRM), AI and collaboration tools to successfully grow their internal digital capabilities.

Colfax is another great example of a manufacturer embracing technology intensity. The global industrial technology company recognized the need to adopt industrial IoT technologies and the importance of embracing a comprehensive digital transformation initiative to expand the offerings of two of its business platforms: ESAB, a welding and cutting solutions provider, and Howden, an air and gas handling engineering company. Working with Microsoft and PTC, the company adopted advanced cloud technologies while building up a digital skillset.

Investing in new digital skillsets

Savvy manufacturers harness data from connected products and combine this with data from many other sources, in unprecedented volumes. The copious amounts of data require mindset shifts, requiring organizations to build skills for a new digital era. Most manufacturers concur that they need to expand their internal capabilities in this manner.

“Senior supply-chain professionals have typically been accustomed to working with a fairly limited set of data to drive their decisions—but that’s all changed now,” says Daniel Helmig, group head of quality and operations at Swiss-Swedish industrial group ABB, in the EIU report “Putting customers at the center of the OEM supply chain.” He continues, “Being able to take advantage of the huge volumes of data available to us—and these are growing every day—demands a mind shift among supply-chain professionals, based on using new levels of visibility to respond to issues quickly and decisively.”

From the same report, Sheri Henck, vice-president of global supply chain, distribution, and logistics at Medtronic, a US-based manufacturer of medical equipment and devices, commented “In the past, a great deal of supply-chain decision-making was based on intuition, because data wasn’t available. Today, there is plenty of data available, but there’s also a recognition that skills and competencies for supply-chain leaders and their teams need to change if we are to make the most of data and use it to make data-driven recommendations and decisions.”

Explore how to optimize your digital operations with business solutions for intelligent factories with the e-book, “Factory of the future: Achieving digital excellence in manufacturing today.”

Go to Original Article
Author: Microsoft News Center

For Sale – MSI Z97 Gaming 5 Motherboard with Intel i7-4790K, 16GB Crucial Ballistix RAM and Cooler

Hey @Roan, thanks for the offer. To be honest the cooler has little value anymore, and even less so when sold alone; it was only £17 when I bought it new, so there’s not much point in me taking it out of the bundle, even if I gave it no value as part of the bundle price.

Looking at another sale on here a couple of months ago, a set with the same motherboard and CPU, with some slightly faster RAM and an aftermarket cooler, sold for £230 including delivery, so I’ll drop to £215 inc. delivery and leave the cooler in there?

Let me know what you think.

EDIT – Postage would be via Hermes at this price, as insured Royal Mail delivery comes to £26.


AI trends 2020 outlook: More automation, explainability

The AI trends 2020 landscape will be dominated by increasing automation, more explainable AI and natural language capabilities, better AI chips for AI on the edge, and more pairing of human workers with bots and other AI tools.

AI trends 2020 — increased automation

In 2020, more organizations across many vertical industries will start automating their back-end processes with robotic process automation (RPA), or, if they are already using automation, increase the number of processes to automate.

RPA is “one of the areas where we are seeing the greatest amount of growth,” said Mark Broome, chief data officer at Project Management Institute (PMI), a global nonprofit professional membership association for the project management profession.

Citing a PMI report from summer 2019 that compiled survey data from 551 project managers, Broome said some 21% of surveyed organizations have been affected by RPA, and about 62% of those organizations expect RPA to have a moderate or high impact over the next few years.

RPA is an older technology — organizations have used RPA for decades. It’s starting to take off now, Broome said, partially because many enterprises are becoming aware of the technology.

“It takes a long time for technologies to take hold, and it takes a while for people to even get trained on the technology,” he said.

Moreover, RPA is becoming more sophisticated, Broome said. Intelligent RPA, or intelligent process automation (IPA) — RPA infused with machine learning — is gaining popularity, with major vendors such as Automation Anywhere and UiPath touting their intelligent RPA products. With APIs and built-in capabilities such as optical character recognition (OCR) and natural language processing (NLP), IPA enables users to scale up their automation use cases more quickly and easily, and to carry out more sophisticated tasks, such as automatically detecting objects on a screen.
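To make the idea concrete, here is a minimal, hypothetical sketch of one pattern IPA platforms use: gating OCR output by confidence, so high-confidence fields flow into the automated workflow while ambiguous ones are escalated to a person. The field names, values and threshold are illustrative, not from any vendor's product.

```python
# Illustrative IPA step: triage OCR results by confidence.
# High-confidence fields feed the automated workflow; the rest
# go to human review. All data here is hypothetical.

def triage_ocr_fields(fields, threshold=0.90):
    """Split OCR results into auto-processable fields and review items.

    fields: list of (name, value, confidence) tuples, as an OCR
    engine might return for a scanned invoice.
    """
    auto, review = {}, []
    for name, value, confidence in fields:
        if confidence >= threshold:
            auto[name] = value  # safe to hand to the RPA workflow
        else:
            review.append((name, value, confidence))  # escalate
    return auto, review

ocr_output = [
    ("invoice_number", "INV-1042", 0.98),
    ("total", "1,250.00", 0.95),
    ("due_date", "2020-O1-15", 0.61),  # low confidence: 'O' vs '0'?
]
auto, review = triage_ocr_fields(ocr_output)
```

The design choice worth noting is that the threshold, not the bot, decides when a human gets involved — which is the human-machine pairing theme that recurs below.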

Sheldon Fernandez, CEO of DarwinAI, an AI vendor focused on explainable AI, agreed that RPA platforms are becoming more sophisticated. More enterprises will start using RPA and IPA over the next few years, he said, but it will happen slowly.

In 2020, enterprises will use more RPA and chatbots, and will get more explainable AI.

AI trends 2020 — push toward explainable AI

Even as AI and RPA become more sophisticated, there will be a bigger move toward more explainable AI.

“You will see quite a bit of attention and technical work being done in the area of explainability across a number of verticals,” Fernandez said.


Users can expect the push for explainable AI to come in two phases. First, vendors will make AI models more explainable for data scientists and other technical users; eventually, they will make models explainable to business users as well.

Technology vendors will also likely do more to address problems of data bias and to maintain ethical AI practices.

“As we head into 2020, we’re seeing a debate emerge around the ethics and morality of AI that will grow into a highly contested topic in the coming year, as organizations seek new ways to remove bias in AI and establish ethical protocols in AI-driven decision-making,” predicted Phani Nagarjuna, chief analytics officer at Sutherland, a process transformation vendor.

AI trends 2020 — natural language

Furthermore, BI, analytics and AI platforms will likely get more natural language querying capabilities in 2020.

NLP technology also will continue to evolve, predicted Sid Reddy, chief scientist and senior vice president at virtual assistant vendor Conversica.

“Human language is complex, with hundreds of thousands of words, as well as constantly changing syntax, semantics and pragmatics and significant ambiguity that make understanding a challenge,” Reddy said.

“As part of the evolution of AI, NLP and deep learning will become very effective partners in processing and understanding language, as well as more clearly understanding its nuance and intent,” he continued.

Among the tech giants involved in AI, AWS, for example, unveiled Amazon Kendra in November 2019, an AI-driven search tool that enables enterprise users to automatically index and search their business data. In 2020, enterprises can expect similar tools to be built into applications or sold as stand-alone products.
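The core idea behind such enterprise search tools can be sketched in a few lines: build an inverted index over documents, then rank documents by how many query terms they match. Production services like Kendra layer semantic ranking, connectors and access control on top; this toy example (with made-up documents) shows only the indexing core.

```python
# Toy inverted-index search, illustrating the mechanism behind
# enterprise search tools. Documents and IDs are hypothetical.
from collections import defaultdict

def build_index(docs):
    """docs: {doc_id: text}. Returns {term: set of doc_ids}."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def search(index, query):
    """Rank doc_ids by how many query terms each document contains."""
    scores = defaultdict(int)
    for term in query.lower().split():
        for doc_id in index.get(term, ()):
            scores[doc_id] += 1
    return sorted(scores, key=lambda d: -scores[d])

docs = {
    "hr-01": "vacation policy for full time employees",
    "it-07": "reset your vpn password",
}
index = build_index(docs)
```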

More enterprises will deploy chatbots and conversational agents in 2020 as well, as the technology becomes cheaper, easier to deploy and more advanced. Organizations won’t fully replace contact center employees with bots, however. Instead, they will pair human employees more effectively with bot workers, using bots to answer easy questions, while routing more difficult ones to their human counterparts.
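The bot-plus-human routing described above can be sketched minimally: the bot answers questions it has a canned answer for and queues everything else for an agent. The FAQ entries and matching rule are hypothetical; real contact-center bots use intent classification rather than keyword lookup.

```python
# Minimal sketch of bot/human pairing: answer easy questions,
# escalate hard ones. FAQ content is hypothetical.

FAQ = {
    "opening hours": "We are open 9am-5pm, Monday to Friday.",
    "reset password": "Use the 'Forgot password' link on the sign-in page.",
}

def route(question, human_queue):
    """Return a bot answer when one matches, else queue for a human."""
    q = question.lower()
    for key, answer in FAQ.items():
        if key in q:
            return answer
    human_queue.append(question)  # difficult question: escalate
    return "Let me connect you with an agent."

queue = []
```

For example, `route("How do I reset password?", queue)` returns the canned answer, while an unmatched question lands in `queue` for a human to pick up.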

“There will be an increased emphasis in 2020 on human-machine collaboration,” Fernandez said.

AI trends 2020 — better AI chips and AI at the edge

To power all the enhanced machine learning and deep learning applications, better hardware is required. In 2020, enterprises can expect hardware that’s specific to AI workloads, according to Fernandez.

In the last few years, a number of vendors, including Intel and Google, have released AI-specific chips, such as Google's tensor processing units (TPUs). That will continue in 2020, as startups enter the hardware space. Founded in 2016, the startup Cerebras, for example, unveiled a giant AI chip that made national news. The chip, which Cerebras claims is the largest ever made, is the size of a dinner plate and designed to power massive AI workloads. The vendor shipped some units last year, with more expected to ship this year.

While Cerebras may have created the largest chip in the world, 2020 will likely introduce smaller pieces of hardware as well, as more companies move to do AI at the edge.

Max Versace, CEO and co-founder of neural network vendor Neurala, which specializes in AI technology for manufacturers, predicted that in 2020, many manufacturers will move toward the edge, and away from the cloud.

“With AI and data becoming centralized, manufacturers are forced to pay massive fees to top cloud providers to access data that is keeping systems up and running,” he said. “As a result, new routes to training AI that can be deployed and refined at the edge will become more prevalent.”


For Sale – iMac 27″ (2017) 24GB RAM i5

I think it’s more to do with the fact there is no Apple keyboard or mouse included. A friend of mine is interested, but they would then need to shell out approx £100+ for new Apple peripherals. Appreciate it’s not that old, but it’s also out of warranty. Just weighing up some options…


Data silos hinder IoT in healthcare; tech giants could help

The Internet of Things in healthcare may not be a new idea, but it’s the key to creating a more connected world within healthcare, according to one analyst.

The Internet of Things, or IoT, is the connection of a group of digitized objects that can collect, send and receive data. Digital medical device use was born out of clinical need, often circumventing IT for approval or advice, said Gartner analyst Gregg Pessin. Now healthcare organizations are dealing with silos of IoT devices and data.

Gregg Pessin

“In the past, the CIO or the IT department has had little input into what happens in that acquisition process, so you end up with IoT solutions, many of them from many different companies, that all work in their own little world inside that clinical environment,” Pessin said.

That is changing. Healthcare organizations are beginning to see value in breaking down silos and bringing IoT data together to create a single view of a patient. Tech giants like AWS are pushing into the healthcare market providing platforms to gather and analyze IoT data while making it more accessible.

CIO’s perspective on IoT in healthcare

IoT data silos and the lack of interoperability in healthcare are major challenges, according to Craig Richardville, CIO of SCL Health, based in Broomfield, Colo. They must be overcome for a healthcare organization to make better use of the IoT data it’s collecting.

Craig Richardville

In healthcare, integrating vast amounts of IoT data into provider workflows is a complex, uphill battle, Richardville said. But as the healthcare industry matures, he said, there is growing opportunity to standardize and integrate IoT data back into provider workflows to create a more complete view of a patient.

“That’s really the ecosystem we all want to create,” he said. “The end game is [a system] that is fully connected all the way through, safely and securely, that allows us to consume or digest that information and get that back into someone’s professional workflow so they can take advantage of the information. The outcome of that is we make better decisions.” 

Richardville believes IoT is the future of healthcare, further enabling a healthcare organization’s connection to patients in their homes. IoT in healthcare can grow an organization’s capabilities when it comes to remote patient monitoring, social determinants of health and other areas of healthcare. IoT data can help providers and healthcare leaders “make more precise and intelligent decisions,” he said. 

Richardville said IoT could provide greater connection to patients but that privacy and security should remain top of mind for healthcare CIOs as that connection to patients and data collection grows. It’s also important that a healthcare system has the capability to analyze the data coming from connected devices — an area where tech giants could play a significant role.

Companies like Amazon, Apple, Google and Microsoft, all of which continue to push into healthcare, could provide healthcare organizations with IoT data gathering and analytics capabilities, Richardville said. SCL Health has a “strong relationship” with Google, which he sees as an “accelerator” to the digital healthcare work the organization is doing.  

“When you look at the companies, whether it’s Amazon or Google or Microsoft, all getting into this space … it actually allows us to be able to lift our game,” Richardville said. 

When it comes to IoT, Gartner’s Pessin said there is strong motivation in healthcare to move toward platform products, which offer tools to gather and analyze IoT data.  

Tech giants further enable IoT in healthcare

Healthcare organizations are buying more patient data-collecting and IoT-enabled devices, which is creating a “tidal wave of data” healthcare CIOs have to deal with, Pessin said.

The amount of computing and storage power required to process that much data is likely more than an on-premises data center can handle. That’s where external, third-party players like tech giants come in, according to Pessin.

“What are they great at? They’re great at scaling resources and they’re adding all of these great, specific kinds of platform solutions like IoT services that they can sell on the platform,” Pessin said.

AWS, for example, has AWS IoT services that health IT and medical device manufacturer Philips Healthcare is using. Philips created a customer-facing HealthSuite digital platform to provide customers with the capability to “connect devices, collect electronic health data, aggregate and store data securely, analyze data and create solutions on the cloud,” according to the Philips HealthSuite digital platform website.

Dale Wiggins, general manager of the HealthSuite digital platform, said Philips chose AWS to be its cloud provider to store large amounts of data and large X-ray and MRI image files from Philips medical devices. The next step for the Philips HealthSuite platform is to use AWS IoT services for remote support management of Philips devices, Wiggins said.

AWS IoT provides Philips with a more cost-effective way to offer remote support capabilities on Philips devices to healthcare customers, he said.

“We’re looking at using IoT to solve a lot of legacy issues with our existing remote support capabilities with new, cutting-edge, always on, always available services that AWS really supports through what they provide with IoT,” he said.

AWS IoT offers device software, control services and data services, depending on customer needs, according to Dirk Didascalou, vice president of AWS IoT. AWS provides the infrastructure for IoT services and is HIPAA-compliant, but it does not have access to customer data through AWS IoT, Didascalou said.

Partnerships with tech giants and healthcare organizations, medical device manufacturers and even EHRs are becoming the norm, according to Pessin. Healthcare organizations create the data and tech giants can provide tools to collect, analyze and store that data. Pessin said healthcare CIOs have to be ready to develop partnerships between the two.

“The advances in digital care delivery that are coming are going to require massive resources, and it’s those large digital giants that have that available,” Pessin said. 


The Week It Snowed Everywhere – New Zealand News Centre

NIWA and Microsoft Corp. are teaming up to make artificial intelligence handwriting recognition more accurate and efficient in a project that will support climate research.

The project aims to develop better training sets for handwriting recognition technology that will “read” old weather logs. The first step is to use weather information recorded during a week in July 1939 when it snowed all over New Zealand, including at Cape Reinga.

NIWA climate scientist Dr. Andrew Lorrey says the project has the potential to revolutionise how historic data can be used. Microsoft has awarded NIWA an AI for Earth grant for the artificial intelligence project, which will support advances in automating handwriting recognition. AI for Earth is a global programme that supports innovators using AI to support environmental initiatives related to water, climate change, sustainable agriculture and biodiversity.

Microsoft’s Chief Environmental Officer, Lucas Joppa, sees a project that could quite literally be world-changing. “This project will bring inanimate weather data to life in a way everyone can understand, something that’s more vital than ever in an age of such climate uncertainty.

“I believe technology has a huge role to play in shining a light on these types of issues, and grantees such as NIWA are providing the solutions that we get really excited about.”


Dr. Lorrey has been studying the weather in the last week of July 1939 when snow lay 5 cm deep on top of Auckland’s Mt. Eden, the hills of Northland turned white and snow flurries were seen at Cape Reinga. “Was 1939 the last gasp of conditions that were more common during the Little Ice Age, which ended in the 1800s? Or the first glimpse of the extremes of climate change thanks to the Industrial Revolution?”

Weather records at that time were meticulously kept in logbooks with entries made several times a day, recording information such as temperature, barometric pressure and wind direction. Comments often included cloud cover, snow drifts or rainfall.

“These logs are like time machines, and we’re now using their legacy to help ours,” Dr. Lorrey says.

“We’ve had snow in Northland in the recent past, but having more detail from further back in history helps us characterise these extreme weather events better within the long-term trends. Are they a one-in-80-year event, do they just occur at random, can we expect to see these happening with more frequency, and why, in a warming climate, did we get snow in Northland?”

Dr. Andrew Lorrey

Until now, however, computers haven’t caught up with humans when it comes to deciphering handwriting. More than a million photographed weather observations from old logbooks are currently being painstakingly entered by an army of volunteer “citizen scientists” and loaded by hand into the Southern Weather Discovery website. This is part of the global Atmospheric Circulation Reconstructions over the Earth (ACRE) initiative, which aims to produce better daily global weather animations and place historic weather events into a longer-term context.

“Automated handwriting recognition is not a solved problem,” says Dr. Lorrey. “The algorithms used to determine what a symbol is — is that a 7 or a 1? — need to be accurate, and of course for that there needs to be sufficient training data of a high standard.” The data captured through the AI for Earth grant will make it faster and easier to build deeper and more diverse training sets for AI handwriting recognition.
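One way transcriptions keyed by multiple volunteers can be turned into high-quality training labels is a simple consensus rule: accept a value only when a clear majority of transcribers agree, and flag ambiguous glyphs (a 7 read as a 1) for expert review rather than letting them pollute the training set. This is an illustrative sketch, not the project's actual pipeline, and the agreement threshold is arbitrary.

```python
# Illustrative consensus labelling for citizen-science transcriptions.
# Values without majority agreement are withheld for expert review.
from collections import Counter

def consensus_label(transcriptions, min_agreement=0.6):
    """transcriptions: values keyed by different volunteers for one cell."""
    value, count = Counter(transcriptions).most_common(1)[0]
    if count / len(transcriptions) >= min_agreement:
        return value   # confident label for the training set
    return None        # no consensus: route to an expert
```

For example, three volunteers reading a temperature as ["17.2", "17.2", "11.2"] yield the label "17.2", while a two-way split between "7" and "1" yields no label at all.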

“Old data is the new data,” says Patrick Quesnel, Senior Cloud and AI Business Group Lead at Microsoft New Zealand. “That’s what excites me about this. We’re finding better ways to preserve and digitise old data reaching back centuries, which in turn can help us with the future. This data is basically forgotten unless you can find a way to scan, store, sort and search it, which is exactly what Azure cloud technology enables us to do.”

Dr. Lorrey says the timing of the project is especially significant.
“This year is the 80th anniversary of The Week It Snowed Everywhere, so it’s especially fitting we’re doing this now. We’re hoping to have all the New Zealand climate data scanned by the end of the year, and quality control completed with usable data by the end of the next quarter.”

Ends.
About NIWA
The National Institute of Water and Atmospheric Research (NIWA) is New Zealand’s leading provider of climate, freshwater and ocean science. It delivers the science that supports economic growth, enhances human well-being and safety and enables good stewardship of our natural environment.
About Microsoft
Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

For more information contact:

Dr. Andrew Lorrey
NIWA climate scientist
Ph 09 375-2055
Mob 021 313-404
Andrea Jutson
On behalf of Microsoft New Zealand
Ph: (09) 354 0562 or 021 0843 0782
[email protected]

Author: Microsoft News Center