ServiceNow has seasoned its core platform with AI and machine learning technologies in the hope that the platform will automatically route technical problems to the IT professionals best equipped to handle them.
Agent Intelligence, a feature of the latest release, code-named ServiceNow Kingston, is a supervised machine learning application designed to reduce the manual processing of help desk requests and thus shrink the help desk backlogs that face many IT shops today. It marks the debut of technology from DxContinuum, which ServiceNow acquired in 2017; the software identifies and examines benchmarks within the IT infrastructure and predicts trends and events based on data collected through operational intelligence tools.
The development of ServiceNow Kingston’s AI and machine learning technologies was guided by research the company conducted among users. Above all else, the majority of them want assistance with everyday IT tasks, according to Allan Leinwand, ServiceNow’s CTO.
“We learned many users don’t have the in-house talent, such as data scientists or others fully indoctrinated in AI, to apply it,” he said. “They don’t want to have to learn machine learning in Amazon or Microsoft’s public clouds. They want to use it in a way that’s practical for them.”
However, to ensure the new technologies deliver accurate results, users must have at least 50,000 incidents or pieces of data, which means the product is best suited for midrange and larger IT shops.
“We found that figure to be the tipping point where they can get a high degree of accuracy, and get the most out of the system,” Leinwand said.
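For a concrete picture of the technique, here is a minimal sketch, in Python with scikit-learn, of supervised incident routing. It is an illustration of the general approach, not ServiceNow's implementation, and the ticket data and team names are invented.

```python
# Illustrative sketch of supervised incident routing -- not ServiceNow's
# actual implementation. Historical tickets labeled with the team that
# resolved them train a classifier that routes new tickets automatically.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical historical help desk data (in practice: 50,000+ incidents).
descriptions = [
    "VPN drops every few minutes",
    "Cannot print to the fourth-floor printer",
    "Outlook will not sync new mail",
    "Laptop battery swollen, needs replacement",
]
resolver_teams = ["network", "peripherals", "email", "hardware"]

router = make_pipeline(TfidfVectorizer(), LogisticRegression())
router.fit(descriptions, resolver_teams)

print(router.predict(["mail stuck in outbox since this morning"]))
# expected: routes toward the email team
```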
To illustrate how the incident categorization, prioritization and routing capabilities work, Leinwand described an aircraft engine manufacturer that developed a customized application containing thousands of pieces of data to track and measure various engine performance metrics.
“Through the ServiceNow platform, users can look at the table of information called aircraft engine performance, look at the column listing the number of hours in flight and generate back a prediction about the number of hours of flight time left before it needs to go into service,” Leinwand said.
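The table and column names in a sketch of that prediction would be hypothetical, but the technique itself is ordinary supervised regression, roughly like this:

```python
# A minimal sketch of the prediction Leinwand describes: fit a model on an
# "aircraft engine performance" table and estimate hours of flight time
# left before service. All figures and column names are hypothetical.
import numpy as np
from sklearn.linear_model import LinearRegression

hours_in_flight = np.array([[120], [340], [510], [700], [880]])
hours_left_before_service = np.array([880, 660, 490, 300, 120])

model = LinearRegression().fit(hours_in_flight, hours_left_before_service)
print(model.predict([[600]]))  # predicted hours remaining at 600 flight hours
```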
Another addition to ServiceNow Kingston is Flow Designer, drag-and-drop software that lets non-programmers assemble process flows for projects. The software works in concert with the company’s existing Workflow editor and across ServiceNow’s product portfolio, as well as users’ third-party applications. Kingston also introduces the Integration Hub, which orchestrates the interaction between Flow Designer and a range of third-party products.
Leinwand said users “can put together a workflow on our platform that will send a notification out to Slack or a group within Microsoft Teams, and the Integration Hub will integrate a workflow with those of third parties.”
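Flow Designer itself is drag-and-drop, but the kind of call an integration hub makes on a workflow's behalf can be sketched. The snippet below uses Slack's standard incoming-webhook API; the webhook URL and incident number are placeholders.

```python
# A sketch of the notification step a workflow delegates to an integration
# hub: an HTTP POST to Slack's incoming-webhook API. The URL and incident
# number below are placeholders, not real values.
import json
import urllib.request

WEBHOOK_URL = "https://hooks.slack.com/services/T000/B000/XXXX"  # placeholder

def notify_slack(text: str) -> None:
    payload = json.dumps({"text": text}).encode("utf-8")
    req = urllib.request.Request(
        WEBHOOK_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)

notify_slack("Incident INC0000123 routed to the network team")
```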
Through the testing process, ServiceNow discovered that the new technologies are applicable to more markets than originally estimated. For instance, consumer electronics companies with defective products in the field must route problems to appropriate technicians, and then notify potentially hundreds of thousands of users about how they should handle a recall or delivery of in-the-field fixes.
Governments and schools need to change the way children are taught as technology creates more learning opportunities outside the classroom, the Vice-President of Education at Microsoft has said.
Anthony Salcito (above), who oversees the worldwide execution of the company’s vision for education, added that the world will need “amazing teachers” who can guide students’ learning inside and outside schools, as more content and information becomes easier to access and share online.
Salcito was speaking on the first day of Bett, the London education conference that also featured speeches from Anne Milton MP, the Minister of State for Skills and Apprenticeships, and Ian Fordham, Director of Education at Microsoft, as well as chief executives of edtech companies and teachers.
“The way we think of students and the way they see themselves and their place in the world is fundamentally different,” Salcito said. “We often describe these students as ‘phygital’ – they don’t see the difference between the physical world and the digital world. They want to create, make and use digital tools in new ways.
“The way students learn, share ideas, get access to content, create and collaborate is fundamentally different. Their mindsets are different, and the workplaces we are preparing them for are different, so we have to recognise there has been a lot of change. What we’ve now got to do at a system level, the institution level, is not only embrace that change but use it in a purposeful way to drive a different dynamic in classrooms.”
Speaking about new ways of working, Salcito pointed to Microsoft’s recent announcement of a cutting-edge mixed-reality partnership with British education company Pearson, which will see pupils and nurses learn by interacting with holograms.
In her speech opening Bett at the ExCeL, Milton pointed out that while the UK is at the “forefront” of edtech, many of the “best and brightest” companies were struggling to recruit the digital talent they needed. Technology can be used to make education more accessible and inclusive, she said, including using cloud services to allow teachers and students to share work.
“We need to make sure the enthusiasm that students have for digital skills and learning continues into the workplace,” Milton added.
Last year Microsoft launched a UK-wide digital skills programme that aims to ensure the country remains one of the global leaders in cloud computing, artificial intelligence and other next-generation technologies.
Milton’s view was later echoed by Salcito, who said technology can “extend learning beyond the classroom” and will shake up the traditional educational model of a teacher standing in front of a class. Pupils will be able to work more closely together, take on more projects and, at times, take control of their own learning while at school.
“Technology is an amazing tool, and one of the things it can do, which we have to harness, is the extension of learning beyond the classroom,” Salcito said. “Teachers can spend less time going through content chapter by chapter – chapter one, chapter two, test, chapter three, chapter four, test – and leverage this world of digital content and learning from others, learning by connecting students to work on projects outside the classroom. What does that mean for how people work inside the classroom? It means they can connect students, who can work on problem solving and new projects. They can have flipped classrooms where students are in the driving seat.
“The size of the learning world for teachers has got bigger. They can influence a school student in the classroom but really guide their learning journey outside it, so we need amazing teachers now more than ever before.”
Learn more about Microsoft’s Digital Skills Programme
Artificial intelligence is the new electricity, said deep learning pioneer Andrew Ng. Just as electricity transformed every major industry a century ago, AI will give the world a major jolt. Eventually.
For now, 99% of the economic value created by AI comes from supervised learning systems, according to Ng. These algorithms require human teachers and tremendous amounts of data to learn. It’s a laborious, but proven process.
AI algorithms, for example, can now recognize images of cats, although they required thousands of labeled images of cats to do so; and they can understand what someone is saying, although leading speech recognition systems needed 50,000 hours of speech — and their transcripts — to do so.
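What "supervised" means in practice is that every training example carries a human-supplied answer. The sketch below makes that concrete, with scikit-learn's small digits dataset standing in for the thousands of labeled cat photos.

```python
# Supervised learning in miniature: each image comes paired with a label,
# and the model learns only from those labeled pairs. The digits dataset
# stands in for the thousands of labeled cat images Ng mentions.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)   # 1,797 images, each with its label
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = MLPClassifier(max_iter=500, random_state=0).fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.2f}")
```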
Ng’s point is that data is the competitive differentiator for what AI can do today — not algorithms, which, once trained, can be copied.
“There’s so much open source, word gets out quickly, and it’s not that hard for most organizations to figure out what algorithms organizations are using,” said Ng, an AI thought leader and an adjunct professor of computer science at Stanford University, at the recent EmTech conference in Cambridge, Mass.
His presentation gave attendees a look at the state of the AI era, as well as the four characteristics he believes will be part of every AI company, which include a revamp of job descriptions.
Positive feedback loop
So data is vital in today’s AI era, but companies don’t need to be a Google or a Facebook to reap the benefits of AI. All they need is enough data upfront to get a project off the ground, Ng said. That starter data will attract customers who, in turn, will create more data for the product.
“This results in a positive feedback loop. So, after a period of time, you might have enough data yourself to have a defensible business,” said Ng.
A couple of his students at Stanford did just that when they launched Blue River Technology, an ag-tech startup that combines computer vision, robotics and machine learning for field management. The co-founders started with lettuce, collecting images and putting together enough data to get lettuce farmers on board, according to Ng. Today, he speculated, they likely have the biggest data asset of lettuce in the world.
“And this actually makes their business, in my opinion, pretty defensible because even the global giant tech companies, as far as I know, do not have this particular data asset, which makes their business at least challenging for the very large tech companies to enter,” he said.
Turns out, that data asset is actually worth hundreds of millions: John Deere acquired Blue River for $300 million in September.
“Data accumulation is one example of how I think corporate strategy is changing in the AI era, and in the deep learning era,” he said.
Four characteristics of an AI company
While it’s too soon to tell what successful AI companies will look like, Ng suggested another corporate disruptor might provide some insight: the internet.
One of the lessons Ng learned with the rise of the internet was that companies need more than a website to be an internet company. The same, he argued, holds true for AI companies.
“If you take a traditional tech company and add a bunch of deep learning or machine learning or neural networks to it, that does not make it an AI company,” he said.
Internet companies are architected to take advantage of internet capabilities, such as A/B testing, short cycle times to ship products, and decision-making that’s pushed down to the engineer and product level, according to Ng.
AI companies will need to be architected to do the same in relation to AI. What A/B testing’s equivalent will be for AI companies is still unknown, but Ng shared four thoughts on characteristics he expects AI companies will share.
- Strategic data acquisition. This is a complex process, requiring companies to play what Ng called multiyear chess games, acquiring important data from one resource that’s monetized elsewhere. “When I decide to launch a product, one of the criteria I use is, can we plan a path for data acquisition that results in a defensible business?” Ng said.
- Unified data warehouse. This likely comes as no surprise to CIOs, who have been advocates of the centralized data warehouse for years. But for AI companies that need to combine data from multiple sources, data silos — and the bureaucracy that comes with them — can be an AI project killer. Companies should get to work on this now, as “this is often a multiyear exercise for companies to implement,” Ng said.
- New job descriptions. AI products like chatbots can’t be sketched out the way apps can, and so product managers will have to communicate differently with engineers. Ng, for one, is training product managers to give product specifications.
- Centralized AI team. AI talent is scarce, so companies should consider building a single AI team that can then support business units across the organization. “We’ve seen this pattern before with the rise of mobile,” Ng said. “Maybe around 2011, none of us could hire enough mobile engineers.” Once the talent numbers caught up with demand, companies embedded mobile talent into individual business units. The same will likely play out in the AI era, Ng said.
Artificial intelligence isn’t just for the law-abiding. Machine learning algorithms are as freely available to cybercriminals and state-sponsored actors as they are to financial institutions, retailers and insurance companies.
“When we look especially at terrorist groups who are exploiting social media, [and] when we look at state-sponsored efforts to influence and manipulate, they’re using really powerful algorithms that are at everyone’s disposal,” said Yasmin Green, director of research and development at Jigsaw, a technology incubator launched by Google to try to solve geopolitical problems.
Criminals need not develop new algorithms or new AI, Green said at the recent EmTech conference in Cambridge, Mass. They can and are exploiting what is already out there to manipulate public opinion.
The good news about weaponized AI? The tools to combat these nefarious efforts are also advancing. One promising lead, according to Green, is bad actors don’t exhibit the same kinds of online behavior that typical users do. And security experts are hoping to exploit the behavioral “tells” they’re seeing — with the help of machines, of course.
Variations on weaponized AI
Cybercriminals and internet trolls are adept at using AI to simulate human behavior and trick systems or peddle propaganda. The online test used to tell humans from machines, CAPTCHA, is continuously bombarded by bad guys trying to trick it.
In an effort to stay ahead of cybercriminals, CAPTCHA, which stands for Completely Automated Public Turing Test to Tell Computers and Humans Apart, has had to evolve, creating some unanticipated consequences, according to Shuman Ghosemajumder, CTO at Shape Security in Mountain View, Calif. Recent data from Google shows that humans solve CAPTCHAs just 33% of the time. That’s compared to state-of-the-art machine learning optical character recognition technology that has a solve rate of 99.8%.
“This is doing exactly the opposite of what CAPTCHA was originally intended to do,” Ghosemajumder said. “And that has now been weaponized.”
He said advances in computer vision technology have led to weaponized AI services such as Death By CAPTCHA, an API plug-in that promises to solve 1,000 CAPTCHAs for $1.39. “And there are, of course, discounts for gold members of the service.”
A more aggressive attack is credential stuffing, where cybercriminals use stolen usernames and passwords from third-party sources to gain access to accounts.
Sony was the victim of a credential-stuffing attack in 2011. Cybercriminals culled a list of 15 million credentials stolen from other sites and then used a botnet to test whether they worked on Sony’s login page. Today, an outfit with the good-guy-sounding name Sentry MBA — the MBA stands for Modded By Artists — provides cybercriminals with a user interface and automation technology, making it easy to test the validity of stolen usernames and passwords and even to bypass security features like CAPTCHAs.
“We see these types of attacks responsible for tremendous amounts of traffic on some of the world’s largest websites,” Ghosemajumder said. In the case of one Fortune 100 company, credential-stuffing attacks made up more than 90% of its login activity.
Behavioral tells in weaponized AI
Ghosemajumder’s firm Shape Security is now using AI to detect credential-stuffing efforts. One method is to use machine learning to identify behavioral characteristics that are typical of cybercriminal exploits.
When cybercriminals simulate human interactions, they will, for example, move the mouse from the username field to the password field quickly and efficiently — in an unhumanlike manner. “Human beings are not capable of doing things like moving a mouse in a straight line — no matter how hard they try,” Ghosemajumder said.
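That tell is straightforward to operationalize. A minimal sketch, with invented coordinates and an illustrative threshold rather than anything Shape Security ships, measures how far a recorded mouse path wanders from the straight line between its endpoints:

```python
# Sketch of the "straight line" tell: human mouse paths wobble, scripted
# ones barely deviate from the chord between start and end. The paths and
# the 2-pixel threshold are illustrative, not production values.
import numpy as np

def max_deviation_from_chord(points: np.ndarray) -> float:
    """Largest perpendicular distance of any point from the start-end line."""
    start, end = points[0], points[-1]
    chord = end - start
    norm = np.linalg.norm(chord)
    if norm == 0:
        return 0.0
    d = points - start
    cross = d[:, 0] * chord[1] - d[:, 1] * chord[0]  # 2-D cross product
    return float(np.max(np.abs(cross)) / norm)

human = np.array([[0, 0], [40, 12], [95, 18], [160, 9], [200, 0]])
bot = np.array([[0, 0], [50, 0], [100, 0], [150, 0], [200, 0]])

for name, path in [("human", human), ("bot", bot)]:
    dev = max_deviation_from_chord(path)
    print(name, "suspicious" if dev < 2.0 else "plausibly human", f"dev={dev:.1f}")
```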
Jigsaw’s Green said her team is also looking for “technical markers” that can distinguish truly organic campaigns from coordinated ones. She described state-sponsored actors who peddle propaganda and attempt to spread misinformation through what she called “seed-and-fertilizer campaigns.”
“The goal of these state-sponsored campaigns is to plant a seed in social conversations and to have the unwitting masses fertilize that seed for it to actually become an organic conversation,” she said.
“There are a few dimensions that we think are promising to look at. One is the temporal dimension,” she said.
Looking across the internet, Jigsaw began to understand that coordinated attacks tend to move together, last longer than organic campaigns and pause as state-sponsored actors waited for instructions on what to do. “You’ll see a little delay before they act,” she said.
Other dimensions include network shape and semantics. State-sponsored actors tend to be more tightly linked together than communities within organic campaigns, and they tend to use “irregularly similar” language in their messaging.
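Of those dimensions, the temporal one lends itself to a quick sketch. With invented posting histograms and an illustrative cutoff, accounts that move in lockstep stand out as highly correlated:

```python
# A toy probe of the temporal dimension: coordinated accounts post in
# lockstep, so their hourly activity histograms correlate strongly. The
# data and the 0.9 cutoff are illustrative assumptions.
import numpy as np

# posts per time bucket over a day, one row per account
activity = np.array([
    [0, 0, 9, 8, 0, 0, 7, 9],   # coordinated account A
    [0, 0, 8, 9, 0, 0, 8, 8],   # coordinated account B
    [2, 1, 3, 0, 4, 2, 1, 3],   # organic account
])

corr = np.corrcoef(activity)
pairs = [(i, j)
         for i in range(len(activity))
         for j in range(i + 1, len(activity))
         if corr[i, j] > 0.9]
print("suspiciously synchronized account pairs:", pairs)
```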
The big question is whether behavioral tells, identified by machines and combined with automated detection, can effectively expose state-sponsored campaigns. No doubt, time will tell.
NEW YORK — Machine learning and deep learning will be part of every data science organization, according to Edd Wilder-James, former vice president of technology strategy at Silicon Valley Data Science and now an open source strategist at Google’s TensorFlow.
Wilder-James, who spoke at the Strata Data Conference, pointed to recent advancements in image and speech recognition algorithms as examples of why machine learning and deep learning are going mainstream. He believes image and speech recognition software has evolved to the point where it can see and understand some things as well as — and in some use cases better than — humans. That makes it ripe to become part of the internal workings of applications and the driver of new and better services to internal and external customers, he said.
But what investments in AI should CIOs make to provide these capabilities to their companies? When building a machine learning strategy, choice abounds, Wilder-James said.
Machine learning vs. deep learning
Deep learning is a subset of machine learning, but it’s different enough to be discussed separately, according to Wilder-James. Examples of machine learning models include optimization, fraud detection and preventive maintenance. “We use machine learning to identify patterns,” Wilder-James said. “Here’s a pattern. Now, what do we know? What can we do as a result of identifying this pattern? Can we take action?”
Deep learning models perform tasks that more closely resemble human intelligence such as image processing and recognition. “With a massive amount of compute power, we’re able to look at a massively large number of input signals,” Wilder-James said. “And, so what a computer is able to do starts to look like human cognitive abilities.”
Some of the terrain for machine learning will look familiar to CIOs. Statistical programming languages such as SAS, SPSS and MATLAB are known territory for IT departments. Open source counterparts such as R, Python and Spark are also machine-learning ready. “Open source is probably a better guarantee of stability and a good choice to make in terms of avoiding lock-in and ensuring you have support,” Wilder-James said.
Unlike other tech rollouts
The rollout of machine learning and deep learning models, however, is a different process than most technology rollouts. After getting a handle on the problem, CIOs will need to investigate if machine learning is even an appropriate solution.
“It may not be true that you can solve it with machine learning,” Wilder-James said. “This is one important difference from other technical rollouts. You don’t know if you’ll be successful or not. You have to enter into this on the pilot, proof-of-concept ladder.”
The most time-consuming step in deploying a machine learning model is feature engineering: finding features in the data that will help the algorithms self-tune. Deep learning models skip the tedious feature engineering step and go straight to training. Tuning a deep learning model correctly requires immense data sets, graphics processing units or tensor processing units, and time. Wilder-James said it could take weeks or even months to train a deep learning model.
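To see why feature engineering eats so much time, consider raw transaction timestamps: most models can't use them until a human derives informative features. A small, purely illustrative example in pandas:

```python
# Feature engineering in miniature: raw timestamps become features a fraud
# model could actually learn from. The data and features are illustrative.
import pandas as pd

events = pd.DataFrame({
    "timestamp": pd.to_datetime(
        ["2017-03-01 02:14", "2017-03-01 14:30", "2017-03-04 23:55"]),
    "amount": [1200.0, 35.5, 980.0],
})

events["hour"] = events["timestamp"].dt.hour
events["is_night"] = events["hour"] < 6              # 2 a.m. purchases are odd
events["is_weekend"] = events["timestamp"].dt.dayofweek >= 5
events["amount_zscore"] = (
    (events["amount"] - events["amount"].mean()) / events["amount"].std())
print(events)
```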
One more thing to note: Building deep learning models is hard and won’t be a part of most companies’ machine learning strategy.
“You have to be aware that a lot of what’s coming out is the closest to research IT has ever been,” he said. “These things are being published in papers and deployed in production in very short cycles.”
CIOs whose companies are not inclined to invest heavily in AI research and development should instead rely on prebuilt, reusable machine and deep learning models rather than reinvent the wheel. Image recognition models, such as Inception, and natural language models, such as SyntaxNet and Parsey McParseface, are examples of models that are ready and available for use.
“You can stand on the shoulders of giants, I guess that’s what I’m trying to say,” Wilder-James said. “It doesn’t have to be from scratch.”
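Using one of those prebuilt models can be nearly a one-liner. The sketch below loads Keras's bundled InceptionV3 with pretrained ImageNet weights; the image path is a placeholder.

```python
# Standing on the shoulders of giants: Keras ships InceptionV3 with
# ImageNet weights, so image classification requires no training at all.
# "dog.jpg" is a placeholder path.
import numpy as np
from tensorflow.keras.applications.inception_v3 import (
    InceptionV3, decode_predictions, preprocess_input)
from tensorflow.keras.preprocessing import image

model = InceptionV3(weights="imagenet")   # downloads pretrained weights

img = image.load_img("dog.jpg", target_size=(299, 299))
x = preprocess_input(np.expand_dims(image.img_to_array(img), axis=0))
print(decode_predictions(model.predict(x), top=3)[0])
```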
Machine learning tech
The good news for CIOs is that vendors have set the stage to start building a machine learning strategy now. TensorFlow, a machine learning software library, is one of the best known toolkits out there. “It’s got the buzz because it’s an open source project out of Google,” Wilder-James said. “It runs fast and is ubiquitous.”
TensorFlow itself is not terribly developer-friendly, but a simplified interface called Keras eases the burden and can handle the majority of use cases. And TensorFlow isn’t the only deep learning library or framework option, either. Others include MXNet, PyTorch, CNTK and Deeplearning4j.
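A few lines of Keras show what easing the burden means in practice. The model and data below are throwaway, just enough to make the snippet self-contained:

```python
# A minimal taste of the Keras interface on top of TensorFlow: define,
# compile and train a small network in a handful of lines. The data is
# random, purely to keep the example self-contained.
import numpy as np
from tensorflow import keras

model = keras.Sequential([
    keras.layers.Dense(64, activation="relu", input_shape=(20,)),
    keras.layers.Dense(2, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

X = np.random.rand(256, 20)
y = np.random.randint(0, 2, size=256)
model.fit(X, y, epochs=3, batch_size=32)
```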
For CIOs who want AI to live on premises, technologies such as Nvidia’s DGX-1 box, which retails for $129,000, are available.
But CIOs can also use the cloud as a computing resource, which would cost anywhere between $5 and $15 an hour, according to Wilder-James. “I worked it out, and the cloud cost is roughly the same as running the physical machine continuously for about a year,” he said.
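His arithmetic checks out at the top of that price range, as a quick back-of-envelope calculation using the article's own figures shows:

```python
# Back-of-envelope check of Wilder-James's claim, using the figures above:
# a $129,000 DGX-1 versus cloud GPU time at $5 to $15 per hour.
dgx_price = 129_000          # USD
hours_per_year = 24 * 365    # 8,760

for rate in (5, 15):
    years = dgx_price / (rate * hours_per_year)
    print(f"${rate}/hr -> break-even after {years:.1f} years of 24/7 use")
# $15/hr -> about 1.0 year, matching the "about a year" estimate
```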
Or they can choose to go the hosted platform route, where a service provider will run trained models for a company. And other tools, such as domain-specific proprietary tools like the personalization platform from Nara Logics, can fill out the AI infrastructure.
“It’s the same kind of range we have with plenty of other services out there,” he said. “Do you rent an EC2 instance to run a database or do you subscribe to Amazon Redshift? You can pick the level of abstraction that you want for these services.”
Still, before investments in technology and talent are made, a machine learning strategy should start with the basics: “The single best thing you can do to prepare with AI in the future is to develop a competency with your own data, whether it’s getting access to data, integrating data out of silos, providing data results readily to employees,” Wilder-James said. “Understanding how to get at your data is going to be the thing to prepare you best.”
- New open source deep learning interface allows developers to more easily and quickly build machine learning models without compromising training performance.
- Jointly developed reference specification makes it possible for Gluon to work with any deep learning engine; support for Apache MXNet is available today, with support for Microsoft Cognitive Toolkit coming soon.
SEATTLE and REDMOND, Wash. — Oct. 12, 2017 — On Thursday, Amazon Web Services Inc. (AWS), an Amazon.com company (NASDAQ: AMZN), and Microsoft Corp. (NASDAQ: MSFT) announced a new deep learning library, called Gluon, that allows developers of all skill levels to prototype, build, train and deploy sophisticated machine learning models for the cloud, devices at the edge and mobile apps. The Gluon interface currently works with Apache MXNet and will support Microsoft Cognitive Toolkit (CNTK) in an upcoming release. With the Gluon interface, developers can build machine learning models using a simple Python API and a range of prebuilt, optimized neural network components. This makes it easier for developers of all skill levels to build neural networks using simple, concise code, without sacrificing performance. AWS and Microsoft published Gluon’s reference specification so other deep learning engines can be integrated with the interface. To get started with the Gluon interface, visit https://github.com/gluon-api/gluon-api/.
Developers build neural networks using three components: training data, a model and an algorithm. The algorithm trains the model to understand patterns in the data. Because the volume of data is large and the models and algorithms are complex, training a model often takes days or even weeks. Deep learning engines like Apache MXNet, Microsoft Cognitive Toolkit and TensorFlow have emerged to help optimize and speed the training process. However, these engines require developers to define the models and algorithms up front using lengthy, complex code that is difficult to change. Other deep learning tools make model-building easier, but this simplicity can come at the cost of slower training performance.
The Gluon interface gives developers the best of both worlds — a concise, easy-to-understand programming interface that enables developers to quickly prototype and experiment with neural network models, and a training method that has minimal impact on the speed of the underlying engine. Developers can use the Gluon interface to create neural networks on the fly, and to change their size and shape dynamically. In addition, because the Gluon interface brings together the training algorithm and the neural network model, developers can perform model training one step at a time. This means it is much easier to debug, update and reuse neural networks.
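The press release doesn't include code, but the Gluon API it describes is public. A minimal sketch on Apache MXNet, with arbitrary shapes and hyperparameters, shows the define-and-train-one-step-at-a-time style:

```python
# A minimal sketch of the Gluon interface on Apache MXNet: prebuilt network
# blocks, imperative training one step at a time. Shapes, data and
# hyperparameters are arbitrary illustrations.
from mxnet import autograd, gluon, nd

net = gluon.nn.Sequential()
net.add(gluon.nn.Dense(64, activation="relu"))
net.add(gluon.nn.Dense(10))
net.initialize()

loss_fn = gluon.loss.SoftmaxCrossEntropyLoss()
trainer = gluon.Trainer(net.collect_params(), "sgd", {"learning_rate": 0.1})

X = nd.random.uniform(shape=(32, 784))
y = nd.array([i % 10 for i in range(32)])

with autograd.record():      # record the forward pass...
    loss = loss_fn(net(X), y)
loss.backward()              # ...then differentiate through it
trainer.step(batch_size=32)  # one debuggable training step
print(loss.mean().asscalar())
```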
“The potential of machine learning can only be realized if it is accessible to all developers. Today’s reality is that building and training machine learning models require a great deal of heavy lifting and specialized expertise,” said Swami Sivasubramanian, VP of Amazon AI. “We created the Gluon interface so building neural networks and training models can be as easy as building an app. We look forward to our collaboration with Microsoft on continuing to evolve the Gluon interface for developers interested in making machine learning easier to use.”
“We believe it is important for the industry to work together and pool resources to build technology that benefits the broader community,” said Eric Boyd, corporate vice president of Microsoft AI and Research. “This is why Microsoft has collaborated with AWS to create the Gluon interface and enable an open AI ecosystem where developers have freedom of choice. Machine learning has the ability to transform the way we work, interact and communicate. To make this happen we need to put the right tools in the right hands, and the Gluon interface is a step in this direction.”
“FINRA is using deep learning tools to process the vast amount of data we collect in our data lake,” said Saman Michael Far, senior vice president and CTO, FINRA. “We are excited about the new Gluon interface, which makes it easier to leverage the capabilities of Apache MXNet, an open source framework that aligns with FINRA’s strategy of embracing open source and cloud for machine learning on big data.”
“I rarely see software engineering abstraction principles and numerical machine learning playing well together — and something that may look good in a tutorial could be hundreds of lines of code,” said Andrew Moore, dean of the School of Computer Science at Carnegie Mellon University. “I really appreciate how the Gluon interface is able to keep the code complexity at the same level as the concept; it’s a welcome addition to the machine learning community.”
“The Gluon interface solves the age-old problem of having to choose between ease of use and performance, and I know it will resonate with my students,” said Nikolaos Vasiloglou, adjunct professor of Electrical Engineering and Computer Science at Georgia Institute of Technology. “The Gluon interface dramatically accelerates the pace at which students can pick up, apply and innovate on new applications of machine learning. The documentation is great, and I’m looking forward to teaching it as part of my computer science course and in seminars that focus on teaching cutting-edge machine learning concepts across different cities in the U.S.”
“We think the Gluon interface will be an important addition to our machine learning toolkit because it makes it easy to prototype machine learning models,” said Takero Ibuki, senior research engineer at DOCOMO Innovations. “The efficiency and flexibility this interface provides will enable our teams to be more agile and experiment in ways that would have required a prohibitive time investment in the past.”
The Gluon interface is open source and available today in Apache MXNet 0.11, with support for CNTK in an upcoming release. Developers can learn how to get started using Gluon with MXNet by viewing tutorials for both beginners and experts available by visiting https://mxnet.incubator.apache.org/gluon/.
About Amazon Web Services
For 11 years, Amazon Web Services has been the world’s most comprehensive and broadly adopted cloud platform. AWS offers over 90 fully featured services for compute, storage, networking, database, analytics, application services, deployment, management, developer, mobile, Internet of Things (IoT), Artificial Intelligence (AI), security, hybrid and enterprise applications, from 44 Availability Zones (AZs) across 16 geographic regions in the U.S., Australia, Brazil, Canada, China, Germany, India, Ireland, Japan, Korea, Singapore, and the UK. AWS services are trusted by millions of active customers around the world — including the fastest-growing startups, largest enterprises, and leading government agencies — to power their infrastructure, make them more agile, and lower costs. To learn more about AWS, visit https://aws.amazon.com.
Amazon is guided by four principles: customer obsession rather than competitor focus, passion for invention, commitment to operational excellence, and long-term thinking. Customer reviews, 1-Click shopping, personalized recommendations, Prime, Fulfillment by Amazon, AWS, Kindle Direct Publishing, Kindle, Fire tablets, Fire TV, Amazon Echo, and Alexa are some of the products and services pioneered by Amazon. For more information, visit www.amazon.com/about and follow @AmazonNews.
Microsoft (Nasdaq “MSFT” @microsoft) is the leading platform and productivity company for the mobile-first, cloud-first world, and its mission is to empower every person and every organization on the planet to achieve more.
For more information, press only:
Microsoft Media Relations, WE Communications for Microsoft, (425) 638-7777, firstname.lastname@example.org
Note to editors: For more information, news and perspectives from Microsoft, please visit the Microsoft News Center at http://news.microsoft.com. Web links, telephone numbers and titles were correct at time of publication, but may have changed. For additional assistance, journalists and analysts may contact Microsoft’s Rapid Response Team or other appropriate contacts listed at http://news.microsoft.com/microsoft-public-relations-contacts.
SAN FRANCISCO — Last week saw an expansion of data handling and machine learning capabilities for Oracle cloud security and management product lines.
The rollout came along with some warnings about the dangers of unprotected data, and a few brickbats for upstart rival Splunk, which has made headway in the field of security information and event management (SIEM).
Oracle’s updates appear amid a whirl of headlines about a massive data breach at Equifax, the large credit reporting agency that this year put millions of Americans’ private data at risk. Some observers suggest the breach was the work of state-sponsored hackers.
Among those observers is Oracle founder and CTO Larry Ellison, who chose Oracle’s OpenWorld 2017 event to roll out updates to Oracle Management Cloud and the Oracle Security Monitoring and Analytics Cloud Service. State-sponsored hackers up the ante in cybersecurity, he said.
“Companies have to defend themselves against nation-states. And, some of these guys are very good at what they do,” Ellison warned. “This is really a very bad situation.”
Looking for bad patterns
Oracle database security has been a strong selling point for the company over many years, although its overall security came in for continual criticism after its 2010 purchase of Sun Microsystems, which included Java and the Java EE framework.
Now, Oracle cloud security is gaining special focus. Those efforts were buttressed in 2016 with acquisitions, including DNS services provider Dyn and cloud access security broker Palerra. In the new releases, the acquired services are further strengthened by data management and machine learning advances forged within Oracle.
As described by Ellison and others, the essence of the updates to Oracle Management Cloud and the Oracle Security Monitoring and Analytics Cloud Service relies on a well-curated, unified data store for massive amounts of log and other activity data.
Add to that a heaping helping of machine learning algorithms that look for good and bad patterns of activity. Finally, runbook-style automation will be employed to fix more and more security flaws without human intervention.
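None of Oracle's internals are public here, but the general pattern Ellison describes can be sketched generically: pool activity features in one store, let a model flag outliers and hand the flags to automation. The data and model choice below are illustrative, not Oracle's.

```python
# Generic sketch of the unified-store-plus-ML pattern, not Oracle's code:
# an anomaly detector flags unusual login behavior for automated
# remediation. All figures are invented.
from sklearn.ensemble import IsolationForest

# unified log store: (logins_per_hour, distinct_ips, failed_auth_attempts)
log_features = [
    [4, 1, 0], [6, 1, 1], [5, 2, 0], [7, 1, 0],  # typical users
    [300, 40, 250],                              # credential-stuffing burst
]

detector = IsolationForest(contamination=0.2, random_state=0).fit(log_features)
for row, verdict in zip(log_features, detector.predict(log_features)):
    if verdict == -1:                 # -1 marks an anomaly -> run the runbook
        print("flag for automated remediation:", row)
```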
Splunk-y rival attracts wrath of Larry
Oracle OpenWorld sometimes serves as a stage for Ellison’s zest for heated competition. Last year, with cloud database technology being showcased, he berated Amazon Web Services. This year, with Oracle’s enhanced data, cloud and security management software on the docket, Ellison’s targets expanded to include Splunk, a San Francisco-based software company that has made a mark in log analysis in addition to SIEM.
Ellison criticized Splunk for lacking an entity model for unified data handling, for machine learning that is difficult to use and for a lack of remediation capabilities. In his view, not surprisingly, the Oracle offering is better.
“It is not simply an analytical system, like Splunk. It is a security monitoring and management system designed to detect and remediate the problem,” he told the OpenWorld gathering.
Splunk — again, not surprisingly — responded. In a blog post entitled “Splunk Fires Back at Ludicrous Larry,” CEO Doug Merritt contended that there are drawbacks to single, unified repositories for threat and contextual data. Merritt dismissed Ellison’s assertion that Splunk is purely an analytical system, without remediation capabilities, citing hooks, for example, to ServiceNow operations automation. And, while Splunk does provide an SDK for data scientists, its capabilities are within reach of “anyone in IT, security or the business, no data science degree required,” he said.
“It was flattering that Oracle finally woke up to the power of machine data and the importance of security,” Merritt wrote. The blog post concludes with a photo of a capsized Oracle America’s Cup series catamaran.
Threats to Oracle cloud security
Oracle will find some favor with its security monitoring and analytics cloud services because they’re logical add-ons for its growing number of cloud-based offerings, according to Eric Parizo, a senior analyst at GlobalData Technology. The new services also have the potential to be a disruptive force among security offerings, Parizo said, if the company provides a cloud-based alternative that’s truly easier to use.
“Oracle sees Splunk succeeding with a security-centric approach that mirrors a lot of what Oracle does in the data management realm, so Oracle believes it is recapturing an opportunity it should have pursued earlier,” he said.
Still, Parizo continued, “it’s impossible to ignore Oracle’s poor track record on cybersecurity.” Over many years, Oracle has “released products rife with security flaws, and ignored those flaws for months or in some cases years after they’ve been widely known,” he said. “The company has a lot of work to do to prove its cybersecurity solutions are effective, and that its approach toward security has evolved enough to justify an investment.”
Meanwhile, Oracle may have found an out for at least some portion of its bad security press. The company recently ceded great portions of its Java software assets to the open source community, putting future revisions largely in the hands of the Eclipse Foundation.
The move could mean that Java flaws, many of which Oracle inherited along with its purchase of Java originator Sun, will become the responsibility of a wider group of software developers.
Predictive analytics software combines artificial intelligence, machine learning, data mining and modeling to parse big data resources and create highly accurate and insightful forecasts, but companies need flash technology to support it.
With sub-millisecond latency, flash technology accelerates predictive analytics software, letting business, engineering and other verticals perform more complex analyses in less time than conventional hard disk drive technology allows.
“Flash storage is a key technology that enables analysis at larger scales of data in faster time frames,” said Mike Matchett, an analyst with storage industry research firm Taneja Group Inc. in Hopkinton, Mass.
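Rough arithmetic shows why latency dominates at scale. The latency figures below are typical published ballpark numbers, not measurements from the article, and assume serialized random reads:

```python
# Why sub-millisecond latency matters: ballpark figures (assumed, not from
# the article) for serialized random reads in a large scoring job.
hdd_latency_s = 0.005     # ~5 ms per random read on a 7,200 rpm disk
flash_latency_s = 0.0001  # ~0.1 ms per flash read

reads = 10_000_000        # random reads in a hypothetical analytics job
print(f"HDD:   {reads * hdd_latency_s / 3600:.1f} hours spent waiting on I/O")
print(f"Flash: {reads * flash_latency_s / 3600:.1f} hours")
```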
According to Vincent Hsu, IBM’s CTO for storage, there are three basic requirements storage must meet to effectively support analytical workloads: compelling data economics, enterprise resiliency and easy infrastructure integration.
“Put simply, faster response times can yield more business agility and quicker time to value from analytics, and more data analyzed at once means more potential value streams,” Hsu said.
There’s a competitive race today to use predictive analytics software in many forms, including machine learning and deep learning applied to operational optimization.
“By becoming predictive at increasing operational speeds, organizations can not only find marked improvement in existing business processes, but exploit disruptive new approaches to their markets,” Matchett said. “We’ve seen predictive analytics evolve from offline, small data scoring into massive web-scale, big data, real-time decision-making.”
Predictive analytics software is not just about analysis, but gaining the ability to respond — rather than react — to rapidly changing market conditions.
“Since actions based on the results are the whole point, faster, smarter and more relevant results win the day and, as a result, flash wins out,” said Donna Taylor, head of consulting firm Taylor & Associates and former Gartner analyst.
Matchett noted that organizations can add flash technology to almost any modern array in the form of cache or as a fast storage tier.
“We also see some innovation in having storage architectures ‘link up’ server-side flash as a virtual local performance tier of persistence,” he said.
Turning data into insights
The key obstacle many predictive analytics software users face is limited file-access speed.
“While the raw storage capacity of legacy [or] traditional storage has increased dramatically in recent years, the rate at which the data can be accessed and served has remained relatively flat,” said Sam Ruchlewicz, director of digital strategy and data analytics at Warschawski, a marketing communications agency based in Baltimore that uses predictive analytics software to study consumer trends and behaviors.
“As the sheer volume of customer data continues to grow, more predictive analytics applications are moving to flash storage to efficiently and effectively access actionable information,” he said.
Ruchlewicz noted that one of the biggest challenges in his field is making sense of terabytes — or even petabytes — of customer data in real time, then using that insight to deliver a better customer experience at relevant touch points.
“To accomplish that goal, the predictive analytics application [or] algorithm must query the database for the requisite information, process it and provide the result to the next component of your marketing technology stack,” he said. Flash technology is the key to making this process fast and efficient.
As they look to accelerate their predictive analytics capabilities, organizations must carefully examine where a flash technology investment can make the most sense.
“Storage-side flash tends to be shared widely, but is probably the most expensive,” Matchett said. “Server-side flash, such as NVMe, can provide a huge boost to applications that can make use of it locally, but might be quite a large investment to make across a large big data cluster.”
Matchett noted that flash storage prices will continue to fall, even as capacities increase.
“What is interesting is that we also see some possible new tiers of faster persistence coming with ReRAM, MRAM and the like,” he said.
For now, many predictive analytics software users rely on a combination of storage media types, including HDDs, tape and flash technology.
“This is nothing new; however, companies looking to squeeze additional value from dense data sets will increasingly adopt flash technology in order to reap the benefits of faster seek and processing speeds,” Taylor said.
The essential attribute most flash customers are looking for, according to Hsu, is data agility — the automated, policy-driven reallocation of data to and from a storage medium without a lot of human intervention or time-consuming, expensive steps.
“It is in this state of data agility where flash really shines and paves the way for artificial intelligence and machine learning,” Hsu said.
Taylor urged predictive analytics software users who are planning a full or partial transition to flash technology to thoroughly research the market. “Otherwise, they risk being at the mercy of a salesman’s skewed sales pitch,” she said.
Ruchlewicz said he would advise any organization considering an infrastructure investment designed to support an analytics initiative to seriously think about using flash storage, noting that most predictive models request data faster than a legacy system can provide it.
“Even if the organization’s data set is within more reasonable bounds, flash is the superior alternative and the system of the future,” he noted.
Hsu concurred. “Data is the most valuable commodity organizations can lay claim to, and any organization that considers speed and insight as a competitive advantage can benefit from flash storage for predictive analytics,” he said.