
U.S. spends more on AI as AI in China continues to grow

The Trump administration’s fiscal year 2020 budget includes nearly $1 billion in non-defense AI research and development as the adoption and funding of AI in China continues to grow rapidly.

The budget request would fund the Networking and Information Technology Research and Development (NITRD) Program, a federal program that coordinates R&D activities and budgets across several technology-focused agencies.

U.S. AI efforts

The fiscal year 2020 budget supplement, released in September, allocates $973.5 million for AI as a new spending category to support existing White House efforts to ramp up robotic process automation and AI innovation and adoption.

Millions of dollars will go toward funding research and development for a range of AI-related projects, including machine vision, cybersecurity challenges unique to AI, and chips optimized for neural nets.

While the funding should help the U.S. government's AI efforts, it's unclear how they will stack up against AI work in China.

Past efforts of the U.S. government, according to Rudina Seseri, founder and managing partner of venture-capital firm Glasswing Ventures, have lagged considerably behind those of other countries.

“For all that is being done, it’s not even 1% of the spending that other countries have made,” Seseri said during a talk at the AI World Government 2019 conference in Washington, D.C. in June.

The U.S. government’s efforts in the global AI race are “far, far behind China,” a country that, in addition to spending more, also collects more data from its citizens, she noted.

In China, “the human rights of data [is] not a notion that exists,” she said. “The government has free control of everyone’s data.”

AI in China

That may be evident in China’s development of smart cities. The government is constructing, or plans to construct, more than 500 cities with smart city capabilities that include automatically monitoring and routing buses to optimize traffic flows, and prioritizing mobile payment systems. The urban strategy also leans heavily on the internationally controversial goal of expanding surveillance of citizens by setting up networks of security cameras with gait and facial recognition software.

Traditional industries are still lagging behind [in AI adoption in China].
Danny Mu, analyst, Forrester

Given the Chinese government’s extensive involvement in city planning, and its propensity to spy on its own citizens, these kinds of smart cities have no real equivalent in the U.S. A number of U.S. cities, however, have begun using AI and analytics to improve their transportation systems, and some have started to install intelligent security cameras and smart streetlights.

Other Western countries have begun creating smart cities as well, including the U.K. and Canada. Calgary, Alberta, for example, has deployed sensors that collect data on noise levels to aid noise management and on soil conditions to better care for gardens, among other things.

Development of AI in China has skyrocketed in recent years, with the government spending heavily on research and development. In 2017, China's State Council released a plan laying out the country's AI goals for the coming decade.

The plan details China’s ambition to draw level with the world's leading players in AI theory and technology by 2020, and to become the world leader in AI by 2030, with a domestic industry worth almost $150 billion.

The plan includes efforts to promote and support domestic AI enterprises, and encourage partnerships between them and international AI schools and research teams.

Yet, at the enterprise level, AI adoption in China is comparable to that in the U.S.

Chinese and U.S. enterprises

A 2018 Forrester study commissioned by Chinese tech giant Huawei surveyed 200 large and medium-sized companies and found that the vast majority of respondents saw AI as a driver of innovation in their industry. About 65% of respondents said AI would play an extremely important role in their digital transformation.

Meanwhile, a little over half of the respondents lacked professional AI talent, and 70% said a lack of technology skills was slowing the adoption of AI in the enterprise intelligence market.

According to Danny Mu, a Forrester analyst based in Beijing, AI has been widely adopted by digital-native enterprises in the internet industry, even as “other traditional industries are still lagging behind.”

That’s about on par with enterprises in the U.S. Reports regularly highlight that most business leaders see the importance of AI, even as many traditional companies have yet to start using it.

The U.S. trade war with China has hampered the country's push to develop AI technologies, however, and has begun to force it to become less dependent on Western hardware and software.

Some of China’s largest technology companies, including Baidu, Alibaba and Huawei, have started developing their own hardware and AI platforms to rely less on Western-developed technology.

The “trade war is a good reminder to Chinese companies to evaluate the risk from dependence,” Mu said.

While the trade war may affect Chinese companies’ ability to purchase American-made technologies, Chinese scientists are still actively conducting AI research, something which “is hard to ban by trade wars,” Mu said.

“AI is changing industries,” he said. “Companies and governments can’t afford to lag behind when it comes to AI technology.”


The roots of Oracle’s cloud evolution: A 20-year review

Oracle’s strategy going into 2020 is to support users wherever they are, while not-so-subtly urging them to move onto Oracle cloud services – particularly databases.

In fact, some say it's Oracle’s legacy as a database vendor that may be the key to the company’s long-term success as a major cloud player.

To reconcile Oracle's cloud persona of today with its enduring identity as a database giant, it helps to look back at key milestones in the company's history over the past 20 years, beginning with the database releases at the turn of the century.

Oracle releases Database 8i, 9i

Two major versions of Oracle’s database arrived in 1998 and 2001. Oracle Database 8i was the first written with a heavy emphasis on web applications — the “i” stood for Internet.

Then Oracle 9i introduced Real Application Clusters (RAC) for high-availability scenarios. RAC remains a popular and lucrative database option for Oracle, and one the company has held close: to date, RAC is supported and certified only for use on Oracle's own cloud service.

With the 9i update, Oracle made a concerted effort to improve the database’s administrative tooling, said Curt Monash, founder of Monash Research in Acton, Mass.

“This was largely in reaction to growing competition from Microsoft, which used its consumer software UI expertise to have true ease-of-administration advantages versus Oracle,” Monash said. “Oracle narrowed the gap impressively quickly.”

Timeline: These 10 milestones marked Oracle's path to modern cloud computing

Oracle acquires PeopleSoft and Siebel

Silicon Valley is littered with the bones of once-prominent application software vendors that either shut down or got swallowed up by larger competitors. Against that backdrop, Oracle's acquisitions of PeopleSoft and Siebel still resonate today.

In 2003, the company launched what many considered to be a hostile takeover of PeopleSoft, then the second-largest enterprise applications vendor after SAP. It ultimately succeeded with a $10.3 billion bid the following year. Soon after the deal closed, Oracle laid off more than half of PeopleSoft's employees in a widely decried act.

Oracle also gained J.D. Edwards, known for its manufacturing ERP software, through the PeopleSoft purchase.

The PeopleSoft deal, along with Oracle’s $5.8 billion acquisition of Siebel in 2005, reinvented the company as a big player in enterprise applications and set up the path toward Fusion.

Oracle realized that to catch up to SAP in applications, it needed acquisitions, said Holger Mueller, an analyst with Constellation Research in Cupertino, Calif., who worked in business and product development roles at Oracle during much of the 2000s.

“To cement ownership within complex CRM, they needed Siebel,” Mueller said. Those Siebel customers largely remain in the fold today, he added. While rival HCM software vendor Workday has managed to poach some of Oracle’s PeopleSoft customers, Salesforce hasn’t had the same luck converting Siebel users over to its CRM, according to Mueller.

Oracle’s application deals were as much about acquiring customers as they were about technology, if not more so, said Frank Scavo, president of IT consulting firm Strativa in Irvine, Calif.

“Oracle had a knack for buying vendors when they were at or just past their peak,” he said. “PeopleSoft was an example of that.”

The PeopleSoft and Siebel deals also gave Oracle the foundation, along with its homegrown E-Business Suite, for a new generation of applications in the cloud era.

Oracle’s Fusion Applications saga

Oracle first invoked the word “Fusion” in 2005, promising it would deliver an integrated applications suite comprising a superset of functionality from its E-Business Suite, PeopleSoft and Siebel software, with both cloud and on-premises deployment options.

The company also pledged that Fusion apps would deliver a consumer-grade user experience and business intelligence embedded throughout processes.

Fusion Applications were supposed to become generally available in 2008, but Oracle didn't make them generally available to all customers until 2011.

It’s been suggested that Oracle wanted to take its time and had the luxury of doing so, since its installed base was still weathering a recession and had little appetite for a major application migration, no matter how useful the new software was.

Fusion Applications’ sheer scope was another factor. “It takes a long time to build software from scratch, especially if you have to replace things that were strong category leaders,” Mueller said.

Oracle’s main shortcoming with Fusion Applications was its inability to sell very much of them early on, Mueller added.

Oracle acquires Hyperion and BEA

After its applications shopping spree, Oracle eyed other areas of software. First, it bought enterprise performance management vendor Hyperion in 2007 for $3.3 billion to bolster its financials and BI business.

“Hyperion was a smart acquisition to get customers,” Mueller said. “It helped Oracle sell financials. But it didn’t help them in the move to cloud.”

In contrast, BEA and its well-respected application server did. The $8.5 billion deal, completed in 2008, also gave Oracle access to a large customer base and many developers, Mueller added.


BEA’s products also gave a boost to Oracle’s existing Fusion Middleware portfolio, said John Rymer, an analyst at Forrester. “At the time, Oracle’s big competitor in middleware was IBM,” he said. “[Oracle] didn’t have credibility.”

Oracle’s hardware play

Oracle made a major strategic shift in 2008 with the introduction of Exadata, its first foray into hardware.

Exadata packs servers, networking and storage, along with Oracle database and other software, into preconfigured racks. Oracle also created storage processing software for the machines, which its marketing arm initially dubbed “engineered systems.”

With the move, Oracle sought to gain a stronger foothold in the data warehousing market against the likes of Teradata and Netezza, which was subsequently acquired by IBM.

Exadata was a huge move for Oracle, Monash said.

“They really did architect hardware around software requirements,” he said. “And they attempted to change their business relationship with customers accordingly. … For context, recall that one of Oracle’s top features in its hypergrowth years in the 1980s was hardware portability.”

In fact, it would have been disastrous if Oracle didn’t come up with Exadata, according to Monash.

“Oracle was being pummeled by independent analytics DBMS vendors, appliance-based or others,” he said. “The competition was more cost-effective, naturally, but Exadata was good enough to stem much of the bleeding.”


Exadata and its relatives are foundational to Oracle’s IaaS, and the company also offers the systems on-premises through its Cloud at Customer program.

“We offer customers choice,” said Steve Daheb, senior vice president of Oracle Cloud. “If customers want to deploy [Oracle software] on IBM or HP [gear], you could do that. But we also continue to see this constant theme in tech, where things get complicated and then they get aggregated.”

Oracle buys Sun Microsystems

Few Oracle acquisitions were as controversial as its $7.4 billion move to buy Sun Microsystems. Critics of the deal bemoaned the potential fate of open-source technologies such as the MySQL database and the Java programming language under Oracle’s ownership, and the deal faced serious scrutiny from European regulators.

Oracle ultimately made a series of commitments about MySQL, which it promised to uphold for five years, and the deal won approval in early 2010.

Sun’s hardware became a platform for Exadata and other Oracle appliances. MySQL has chugged along with regular updates, contrary to some expectations that it would be killed off.

But many other Sun-related technologies faded into obscurity, such as Solaris and Sun's early version of an AWS-style IaaS. Oracle also moved Java EE to the Eclipse Foundation, although it maintains a tight hold on Java SE.

The Sun deal remains relevant today, given how it ties into Larry Ellison's long-term vision of making Oracle the IBM for the 21st century, Mueller said.

That aspiration realized would see Oracle become a “chip-to-click” technology provider, spanning silicon to end-user applications, he added. “The verdict is kind of still out over whether that is going to work.”

Oracle Database 12c

The company made a slight but telling change to its database naming convention with the 2013 release of 12c, swapping the "g" that had denoted grid computing for a "c" for cloud.

Oracle’s first iteration of 12c had multitenancy as a marquee feature. SaaS vendors at the time predominantly used multitenancy at the application level, with many customers sharing the same instance of an app. This approach makes it easier to apply updates across many customers’ apps, but is inherently weaker for security, Ellison contended.

Oracle 12c’s multi-tenant option provided an architecture where one container database held many “pluggable” databases.
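To make the container-and-pluggable-database idea concrete, here is a minimal sketch, assuming a 12c-or-later container database and the python-oracledb driver; the host, credentials and PDB name are placeholders for illustration only, not details from the article.

```python
# Minimal sketch, assuming a 12c-or-later container database (CDB) reachable at
# the placeholder DSN below and a user privileged to query V$PDBS.
# All names and credentials here are illustrative, not real settings.
import oracledb

conn = oracledb.connect(user="system", password="change_me",
                        dsn="dbhost.example.com/ORCLCDB")

with conn.cursor() as cur:
    # Each row is one pluggable database "plugged into" the single container.
    cur.execute("SELECT name, open_mode FROM v$pdbs")
    for name, open_mode in cur:
        print(f"{name}: {open_mode}")

# New tenants are typically added with DDL along the lines of:
#   CREATE PLUGGABLE DATABASE sales_pdb ADMIN USER pdb_admin IDENTIFIED BY ...;
# (exact clauses depend on the release and storage configuration).
```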

Oracle later rolled out an in-memory option to compete with SAP’s HANA in-memory database. SAP hoped its customers, many of which used Oracle’s database as an underlying store, would migrate onto HANA.

2016: Oracle acquires NetSuite

Oracle’s $9.3 billion purchase of cloud ERP vendor NetSuite came with controversy, given Ellison’s large personal financial stake in the vendor. But on a strategic level, the move made plenty of sense.

NetSuite at the time had more than 10,000 customers, predominantly in the small and medium-sized business range. Oracle, in contrast, had 1,000 or so customers for its cloud ERP aimed at large enterprises, and not much presence in SMB.

Thus, the move plugged a major gap for Oracle. It also came as Oracle and NetSuite began to compete with each other at the margins for customers of a certain size.

Oracle’s move also gave it a coherent two-tier ERP strategy, wherein a customer that opens new offices would use NetSuite in those locations while tying it back to a central Oracle ERP system. This is a practice rival SAP has used with Business ByDesign, its cloud ERP product for SMBs, as well as Business One.

The NetSuite acquisition was practically destined from the start, said Scavo of Strativa.

“I always thought Larry was smart not to do the NetSuite experiment internally. NetSuite was able to develop its product as a cloud ERP system long before anyone dreamed of doing that,” Scavo said.

NetSuite customers could benefit as the software moves onto Oracle's IaaS, provided they see the promised improvements in performance and elasticity, areas NetSuite has grappled with at times, Scavo added. "I'm looking forward to seeing some evidence of that."

Oracle launches its second-generation IaaS cloud

The market has largely coalesced around three hyperscale IaaS players: AWS, Microsoft and Google. Other large companies such as Cisco and HPE tried something similar, but conceded defeat and now position themselves as neutral middle players keen to help customers navigate and manage multi-cloud deployments.

Oracle, meanwhile, came to market with an initial public IaaS offering based in part on OpenStack, but it failed to gain much traction. It subsequently made major investments in a second-generation IaaS, called Oracle Cloud Infrastructure, which offers many advancements at the compute, network and storage layers over the original.

With [on-premises software] we had to make it work with everybody. Part of it is working together to bring that to the cloud.
Steve Daheb, senior vice president, Oracle Cloud

Oracle has again shifted gears, evidenced by its partnership with Microsoft to boost interoperability between Oracle Cloud Infrastructure and Azure. One expected use case is for IT pros to run their enterprise application logic and presentation tiers on Azure, while tying back to Oracle’s Autonomous Database on the Oracle cloud.

“We started this a while back and it’s something customers asked for,” Oracle’s Daheb said. There was significant development work involved and given the companies’ shared interests, the deal was natural, according to Daheb.

“If you think about this world we came from, with [on-premises software], we had to make it work with everybody,” Daheb said. “Part of it is working together to bring that to the cloud.”

Oracle Autonomous Database marks the path forward

Ellison will unveil updates to Oracle Database 19c, which runs both on-premises and in the cloud, in a talk at OpenWorld. While details remain under wraps, it is safe to assume the news will involve the autonomous management and maintenance capabilities Oracle first discussed in 2017.

Oracle database customers typically wait a couple of years before upgrading to a new version, preferring to let early adopters work through any remaining bugs and stability issues. Version 19c arrived in January, but is more mature than the name suggests. Oracle moved to a yearly naming convention and update path in 2018, and thus 19c is considered the final iteration of the 12c release cycle, which dates to 2013.

Oracle users should be mindful that autonomous database features have been a staple of database systems for decades, according to Monash.

But Oracle has indeed accomplished something special with its cloud-based Autonomous Database, according to Daheb. He referred to an Oracle marketing intern who was able to set up databases in just a couple of minutes on the Oracle Cloud version. “For us, cloud is the great democratizer,” Daheb said.


HR execs and politicians eye student debt relief

Student debt relief is not only an election issue in the 2020 race for president, but a problem for HR managers. Some firms, including a hospital in New York, are doing something about it.

Montefiore St. Luke’s Cornwall Hospital began offering a student loan relief program this year for its non-union employees. It employs 1,500 people and provides employees 32 vacation days a year.

Most employees don’t take all that time off, said Dan Bengyak, vice president of administrative services at the not-for-profit medical center with hospitals in Newburgh and Cornwall. He oversees HR, IT and other administrative operations.

In February, the hospital detailed its plan to let employees apply paid time off to student debt relief. Employees with Parent PLUS loans had the option as well. The hospital set two sign-up windows, the first in May. Forty employees signed up. The next window is in November.

The program “has been extremely well received and it definitely has offered us a real competitive advantage in the recruiting world,” Bengyak said. He believes it will help with retention as well.

The maximum employee contribution for student debt relief is $5,000. The hospital also provides tuition help. This combination "offers significant financial assistance" to employees seeking advanced degrees, Bengyak said.

A SaaS platform handles payments

The hospital uses Tuition.io, a startup founded in 2013 and based in Santa Monica, Calif. The platform manages all of the payments to the loan servicers. Employers pay a lump sum to cover the cost of the assistance, and the employer doesn't know the amount of the employee's debt. The platform notifies the employee when a payment is posted.

It definitely has offered us a real competitive advantage in the recruiting world.
Dan Bengyak, VP of administrative services, Montefiore St. Luke's Cornwall Hospital

Payments can be made as a monthly contribution, a lump sum on an employment anniversary or other methods, according to Scott Thompson, CEO at Tuition.io.

Tuition.io also analyzes repayment data, which can show the program’s retention impact, according to Thompson.

“Those individuals who are participating in this benefit stay longer with the employer — they just do,” he said. 

About one in five students has over $100,000 in debt and is, by definition, broke, Thompson said. They can't afford to contribute to an employer's 401(k) plan or buy a house. Employees with a burdensome loan "are always looking for a new job that pays you more money because you simply have to," he said.

Legislation in pipeline

Outstanding student loan debt exceeds $1.5 trillion, more than credit card and auto debt combined, said Robert Keach, a past president of the American Bankruptcy Institute, in testimony at a recent U.S. House Judiciary Committee hearing on bankruptcy. More than a quarter of borrowers are delinquent or in default, he said. Student loan debt is expected to exceed $2 trillion by 2022.

“High levels of post-secondary education debt correlate with lower earnings, lower rates of home ownership, fewer automobile purchases, higher household financial distress, and delayed marriage and family formation, among other ripple effects,” Keach said.

Congress is considering legislation that could make it easier for firms to help employees with debt. One example is the Employer Participation in Repayment Act, a bill with bipartisan support in both chambers. It would enable employers to give employees up to $5,250 annually, tax free, toward their student loans.


AI washing muddies the artificial intelligence products market

Analysts predict that by 2020, artificial intelligence technologies will be in almost every new software and service release. And if they're not actually in them, technology vendors will probably use smoke-and-mirrors marketing tactics to make users believe they are.

Many tech vendors already shoehorn the AI label into the marketing of every new piece of software they develop, and it’s causing confusion in the market. To muddle things further, major software vendors accuse their competitors of egregious mislabeling, even when the products in question truly do include artificial intelligence technologies.

AI mischaracterization is one of three major problems in the AI market that Gartner recently highlighted. More than 1,000 vendors with applications and platforms describe themselves as artificial intelligence vendors, or say they employ AI in their products, according to the research firm. It's a practice Gartner calls "AI washing" — similar to cloudwashing and greenwashing, which have become prevalent over the years as businesses exaggerate their association with cloud computing and environmentalism.

AI goes beyond machine learning

When a technology is labeled AI, the vendor must provide information that makes clear how AI is used as a differentiator and what problems it solves that can't be solved by other technologies, explained Jim Hare, a research VP at Gartner who focuses on analytics and data science.

You have to go in with the assumption that it isn’t AI, and the vendor has to prove otherwise.
Jim Hare, research VP, Gartner

“You have to go in with the assumption that it isn’t AI, and the vendor has to prove otherwise,” Hare said. “It’s like the big data era — where all the vendors say they have big data — but on steroids.”

“What I’m seeing is that anything typically called machine learning is now being labelled AI, when in reality it is weak or narrow AI, and it solves a specific problem,” he said.

IT buyers must hold the vendor accountable for its claims by asking how it defines AI and requesting information about what's under the hood, Hare said. Customers need to know what makes the product superior to what is already available, backed by customer case studies. Hare also urges IT buyers to demand a demonstration of artificial intelligence products on their own data, to see them solve a business problem they actually have.

Beyond that, a vendor must share with customers the AI techniques it uses or plans to use in the product and its strategy for keeping up with the quickly changing AI market, Hare said.

The second problem Gartner highlights is that machine learning can address many of the problems businesses need to solve, yet the buzz around more complicated types of AI, such as deep learning, leads businesses to overlook these simpler approaches.

“Many companies say to me, ‘I need an AI strategy’ and [after hearing their business problem] I say, ‘No you don’t,'” Hare said.

"Really, what you need to look for is a solution to a problem you have, and if machine learning does it, great," Hare said. "If you need deep learning because the problem is too gnarly for classic ML, and you need neural networks — that's what you look for."

Don’t use AI when BI works fine

When to use AI versus BI tools was the focus of a spring TDWI Accelerate presentation led by Jana Eggers, CEO of Nara Logics, a Cambridge, Mass., company that describes its "synaptic intelligence" approach to AI as the combination of neuroscience and computer science.

BI tools use data to provide insights through reporting, visualization and data analysis, and people use that information to answer their questions. Artificial intelligence differs in that it’s capable of essentially coming up with solutions to problems on its own, using data and calculations.

Companies that want to answer a specific question or solve a defined problem should use business analytics tools, Eggers said. Those that don't know what question to ask can use AI to explore the data openly and should be willing to consider the answers from many different directions. This may involve having outside and inside experts comb through the results, performing A/B testing, or even outsourcing evaluation via platforms such as Amazon's Mechanical Turk.
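As a rough illustration of that split, the sketch below uses invented sales data (none of it from the presentation): a BI-style aggregation answers a question you already know how to phrase, while an unsupervised clustering pass lets patterns emerge without a predefined question.

```python
# Illustrative only: contrast a fixed BI-style report with open-ended ML exploration.
# The data, columns and cluster count are hypothetical examples.
import pandas as pd
from sklearn.cluster import KMeans

df = pd.DataFrame({
    "region":  ["East", "East", "West", "West", "South", "South"],
    "revenue": [120.0, 95.0, 210.0, 180.0, 60.0, 75.0],
    "orders":  [30, 25, 50, 45, 15, 20],
})

# BI-style: answer a question you already know to ask ("What is revenue by region?").
print(df.groupby("region")["revenue"].sum())

# ML-style: let the algorithm group similar records and see what structure emerges.
df["cluster"] = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
    df[["revenue", "orders"]])
print(df[["region", "cluster"]])
```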

With an AI project, you know your objectives and what you are trying to do, but you are open to finding new ways to get there, Eggers said.

AI isn’t easy

A third issue plaguing AI is that companies don't have the skills on staff to evaluate, build and deploy it, according to Gartner. Over 50% of respondents to Gartner's 2017 AI development strategies survey said the lack of necessary staff skills was the top challenge to AI adoption. That statistic tracks with the well-documented imbalance between the supply of and demand for data scientists.

Companies surveyed said they are seeking artificial intelligence products that can improve decision-making and process automation, and most prefer to buy one of the many packaged AI tools rather than build one themselves. That brings IT buyers back to the first problem, AI washing: it's difficult to know which artificial intelligence products truly deliver AI capabilities and which ones are mislabeled.

After determining that a prepackaged AI tool provides enough differentiation to be worth the investment, IT buyers must be clear on what is required to manage it, Hare said: What human services are needed to change code and maintain models over the long term? Is it hosted in a cloud service and managed by the vendor, or does the company need knowledgeable staff to keep it running?

“It’s one thing to get it deployed, but who steps in to tweak and train models over time?” he said. “[IBM] Watson, for example, requires a lot of work to stand up and you need to focus the model to solve a specific problem and feed it a lot of data to solve that problem.”

Companies must also understand the data and compute requirements to run the AI tool, he added; GPUs may be required, and that could add significant costs to the project. And cutting-edge AI systems require vast amounts of data. Storing that data also adds to the project cost.