
Secure remote access offering gains ground among MSPs

Todyl, a New York City company that sells a networking and security platform through MSPs, reported increasing interest in its product as organizations face secure remote access challenges.

“Things have been rapidly evolving over the last two weeks with the COVID-19 response,” Todyl CEO John Nellen said. “We have been really busy trying to help existing partners and new partners.”

The company offers MSPs — and their SMB customers — the ability to consolidate networking and security components into a cloud-based platform. Todyl MSP partners deploy the technology by installing agents on customers’ Windows, Mac, Linux, iOS or Android devices. A VPN tunnel then links customers to Todyl’s Secure Global Network offering, which incorporates web proxy, firewall, content filtering, intrusion detection/prevention (IDP), malware interception and security information and event management (SIEM) technologies.
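The stack described above applies several inspection layers to traffic in sequence, with verdicts fed to a SIEM. As a rough illustration only (this is not Todyl's implementation; the layer names, rules, and blocklists here are hypothetical), such a chain can be modeled as an ordered series of filters that each pass or block a request and log the outcome:

```python
# Illustrative sketch of a layered security chain -- hypothetical rules,
# not Todyl's actual firewall, content-filter, or SIEM logic.

BLOCKED_DOMAINS = {"malware.example"}   # hypothetical content-filter list
BLOCKED_PORTS = {23, 3389}              # hypothetical firewall policy

def firewall(request):
    return request["port"] not in BLOCKED_PORTS

def content_filter(request):
    return request["domain"] not in BLOCKED_DOMAINS

SECURITY_LAYERS = [firewall, content_filter]

def inspect(request, siem_log):
    """Run a request through each layer in order; record the verdict SIEM-style."""
    for layer in SECURITY_LAYERS:
        if not layer(request):
            siem_log.append({"request": request, "verdict": "blocked",
                             "layer": layer.__name__})
            return False
    siem_log.append({"request": request, "verdict": "allowed"})
    return True

log = []
inspect({"domain": "example.com", "port": 443}, log)   # allowed
inspect({"domain": "example.com", "port": 3389}, log)  # blocked by firewall
```

The point of the SIEM-style log is that every verdict, allowed or blocked, produces telemetry an MSP can review later.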

The Secure Global Network’s points of presence link end customers to multiple network providers. Todyl’s platform connects organizations’ remote workers, data centers, cloud providers, main offices and branch locations, according to the company.

Todyl is currently offering its platform to MSPs for free for 30 days “to help support the immediate need,” Nellen said. Once the offer expires, pricing is device-based with add-on features. Todyl offers pricing for two groups: mobile (Android/iOS) and desktop/laptop/server (Windows, Mac, Linux).
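Todyl hasn't published its rates here, but two-tier, per-device pricing is simple to model. A minimal sketch, with made-up placeholder rates rather than Todyl's actual prices:

```python
# Hypothetical per-device monthly rates -- placeholders, not Todyl's real pricing.
RATES = {
    "mobile": 2.00,    # Android/iOS devices
    "desktop": 5.00,   # Windows/Mac/Linux desktops, laptops, servers
}

def monthly_cost(device_counts):
    """Total monthly cost for a count dict like {'mobile': 40, 'desktop': 100}."""
    return sum(RATES[tier] * count for tier, count in device_counts.items())

print(monthly_cost({"mobile": 40, "desktop": 100}))  # 580.0
```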

MSP taps Todyl for remote enablement

Infinit Consulting Inc., an MSP based in Campbell, Calif., is selling Todyl as a white-labeled offering. The company has branded Todyl as Infinit Shield Total Defense, which it has paired with its own Infinit Shield security process management platform, according to Jerod Powell, president and founder of Infinit Consulting.

John Nellen, CEO at Todyl

Powell called Todyl “instrumental in helping our customers rapidly enable complete remote workforce capabilities.”

Infinit Consulting had previously enabled nearly all of its customers to use cloud services, but the company is currently tasked with helping them significantly expand remote workforces. The expansion sometimes includes moving customers from having 15% of employees working remotely to nearly 100%.

While assisting with remote workforce expansions, Infinit Consulting has run into issues such as licensing and hardware limitations around customers’ previous remote work applications, Powell said. He pointed to another issue: Properly securing devices to ensure data integrity, company policy adherence and security, while allowing employees to work remotely — often from their personal home PC or Mac.

Powell said Todyl lets Infinit Consulting enable remote access in a matter of a few hours in a full-scale deployment. The Todyl offering also lets the company “secure that remote connection 100%, end to end;” bring clients onto the Secure Global Network; and feed data back to the SIEM. The SIEM feature provides the MSP with “the telemetry needed to identify potential security risks [and] enforce corporate policy just as if [remote employees] were on the client’s LAN.”

Things have been rapidly evolving over the last two weeks with the COVID-19 response.
John Nellen, CEO, Todyl

He said Todyl also offers IDP and advanced threat protection scanning to flag potentially malicious applications and data before they reach customers.

The demand for supporting customers’ remote workforces is “extremely high,” Powell noted. He cited a case in which Infinit Consulting rolled out Todyl to a customer that needed to enable more than 500 users to work remotely. The customer’s previous remote work product only supported 100 users. Todyl also identified security issues on several remote workers’ home PCs. The MSP was able to resolve those issues before admitting the remote workers’ devices onto the network, he added.

Powell said his company has created deployment packages for Todyl that can implement the product in an automated manner.

Waves of demand for secure remote access

Citing conversations with Todyl MSP partners, Nellen said MSPs anticipate two waves of unfolding demand for remote work technology.

The first wave consists of early adopters trying to quickly set up their organizations for newly distributed workforces. The second wave will comprise SMBs that have yet to determine the best way to support remote workers. Those companies will start making decisions, based on guidance from government agencies, in the coming weeks, Nellen said.

“They are expecting this not to be just a single shot, but something that is taking place and evolving over time,” Nellen said.

Go to Original Article
Author:

Google Cloud retail strategy provides search, hosting, AI for stores

NEW YORK — Google made a pitch to chain-store brands this week, taking on Microsoft Azure and AWS with a bundle of fresh Google Cloud retail hosting and services, backing it up with blue-chip customers.

In sessions at NRF 2020 Vision: Retail’s Big Show, Google Cloud CEO Thomas Kurian and Carrie Tharp, retail and consumer vice president, wooed retailers with promises of AI, uptime and search expertise — including voice and video, in addition to traditional text content — as well as store employee collaboration tools.

Home improvement chain Lowe’s said it will embark on a multiyear plan to rebuild its customer experience, both in-store and online, with Google Cloud at its center. Lowe’s plans to spend $500 million per year through 2021 on the project.

Kohl’s, Ulta Beauty’s business drivers

“Customers expect retailers to be as good with their tech as they are with their physical stores,” said Paul Gaffney, CTO of Kohl’s. The 1,060-store chain launched a major overhaul of its digital customer experience and IT infrastructure in 2018 with Google Cloud retail services, and plans to migrate 70% of its apps into Google Anthos.

Retailers need cloud services that create value for their brands among their customers, Gaffney said, but uptime and scalability are also major considerations during peak selling times.

“The big rush of business used to be Black Friday, last year was the Cyber Five [Thanksgiving until the following Monday], and now seems like the months of November and December,” Gaffney said in a session with Kurian. “Folks who have been doing this a long time know that we all used to provision a lot of gear that lay idle other than during that period.”

Ulta Beauty, which operates 1,124 stores, chose the Google Cloud Platform for its Ulta Rewards loyalty program hosting and customer data handling, said Michelle Pacynski, vice president of digital innovation at Ulta. The program has 33.9 million members and drives 95% of Ulta’s sales, she added.

Ulta chose Google in part for its data, analytics and personalization platform, Pacynski said. But data ownership also weighed heavily in the decision.

“We looked at the usual subjects, who you would think we would look at,” Pacynski said. “Ultimately for us, we wanted to own our data, we wanted to have power over our data. We evaluated everybody and looked at how we could remain more autonomous with our data.”

Google Cloud retail taking on Azure, AWS

Google’s charge into the retail space started last year with the introduction of retail-specific services to manage customer loyalty, store operations, inventory and product lifecycle management. At NRF 2020, Google added search, AI and hosting services to that stack. It’s part of Google’s bigger push into verticals, Tharp said.

Really, where we see the future of cloud capabilities is in industry-specific solutions.
Carrie Tharp, retail and consumer vice president, Google Cloud

“[Google] Cloud started as an infrastructure-as-a-service play,” Tharp said. “Really, where we see the future of cloud capabilities is in industry-specific solutions — having a deep understanding of the industry and building products specific to that. We’re constructing our entire organization around these industry-specific solutions.”

Tharp and some industry experts at NRF said that some retailers harbor resentment toward Amazon as a competitor and are looking for cloud partners other than AWS for future projects. But that is changing, as stores realize that offering Amazon-like speed of delivery and customer service in general is a more important business priority than beating Amazon.

Still, there’s enough anti-Amazon sentiment among retailers that Google has an opportunity to expand its foothold, said Sheryl Kingstone, 451 Research analyst.

“We’re seeing Google Cloud Platform pop up as one of the strategic vendors retailers are looking for in their digital transformations,” Kingstone said. “Azure is up there, and AWS is the 800-pound gorilla. But in the retail space, there’s that opportunity of stealing away someone who is very concerned about being on AWS.”


Ex-SAP exec steers Episerver CMS toward digital experience market

NEW YORK — The Episerver CMS is morphing into a digital experience platform, led by CEO Alex Atzberger, the former SAP C/4HANA customer experience platform lead. He departed SAP in October and joined Episerver last month.

We sat down with Atzberger at NRF 2020 Vision: Retail’s Big Show to discuss recent Episerver acquisitions such as Insite Software, future acquisitions, how digital experience and customer experience differ, why he left SAP and his vision for Episerver’s acquisition and product roadmap.

How did you end up at Episerver? Things happened kind of fast after you left SAP.

Alex Atzberger: It happened very fast. When I desired to leave SAP, I looked for a cloud company with triple-digit [hundreds of millions of dollars in] cloud revenue. I was looking for something in CX, the most exciting and fastest-growing part of enterprise software. And I was looking for something that had the right strategic mindset.

[Episerver] had been acquired by Insight Partners, which had put money into the business, so they’re at an inflection point. They are the leader in what is still king, which is content. Even if you look at commerce-centric businesses, content matters a lot. And how do you marry content and commerce together? There are very few companies that have both of those embedded, and Epi is one of them. It worked out well, and the timing was perfect, very fast.

You’ve only been at Episerver for a month, but how would you describe your vision for the company and the product roadmap moving forward?

There are a couple of trends that continue to be very important: One, understanding everything about your customer, and two, serving up the next best action.
Alex Atzberger, CEO, Episerver

Atzberger: We have an untold story. People are really, really happy with this technology. One big part of the strategy, going forward, is expansion in North America, and telling the story of Epi.

Because of the size of the U.S. market, we have to decide on which specific verticals to focus on. There’s a large part of the economy that is not digital, that is somehow forgotten. These companies will not work with [platform vendors] that are too large; they need [vendors] that are large enough to serve, but small enough to care about the results. … Ultimately commerce and content are the face of so many brands, the heart of your business. … We’re going to focus on that market, and bringing automation to content, using AI and automation to scale [digital operations].

Do you feel like you’re competing with your old company SAP, since Episerver CMS is now on its way to being a full-featured digital experience platform with content and commerce clouds?

Atzberger: When I built the SAP CX platform, we built it under the notion of connecting supply chain and demand chain. It was really a relevant message for very large companies that were looking at one platform. Epi is much more focused on the digital experience, truly understanding the digital customer, and doing it in such a way that companies between, say, a million and a billion [dollars in revenue] are the sweet spot. It’s an 80% or 90% different [market].

What happened at SAP? Bill McDermott left [in October], and you weren’t far behind. It was all very quick.

Atzberger: If you look at the big picture, it was 15 years [at SAP]. We all want to be CEO of a company. At one point it becomes harder, and you basically end up being part of a company for life.

There’s too much innovation going on, too much excitement going on that I wanted to be part of as well. Ariba and CX are a massive part of SAP. I’m very proud of that and I’m proud of what SAP has done as a company. With the CEO change it was a natural point [to depart].

How do your past experiences at SAP and SAP Ariba color what you’ll be doing at Episerver?

Atzberger: Those involved transformation, and I think it’s going to be a bit of the same here, rallying people around a common cause and a common brand.

The acquisition of B2B e-commerce company Insite Software, which caters to manufacturers and distributors, happened within days after you joined Epi. The deal probably was in the works before you started, right? Did you have final sign-off on the acquisition, or was the deal finished before?

Atzberger: Yes, it was in the works. The strategic direction was important in speaking about it with [Epi’s private equity owners] Insight Partners. There’s a huge B2B commerce opportunity.

When the acquisitions of Insite and [product content tagging automation technology] Idio were discussed with me, not only was I supportive of them but also it attracted me to Episerver as a company. The acquisitions made it so much more compelling to be at this place, at this time. I went to Minneapolis and met [Insite Software CEO] Steve Shaffer, and saw how well they executed against their goals. It left me inspired. I left Minneapolis thinking, ‘This is part of the future of Epi.’

What can you tell me about how you’re thinking about future acquisitions? You’re growing, you’re flush with cash. You can’t be done. What’s next?

Atzberger: We’re not done. Our focus now is the integration of Idio and Insite. What’s interesting to me is that there are a couple of trends that continue to be very important: One, understanding everything about your customer, and two, serving up the next best action. Everything that we do in the foreseeable future will be focused on the digital experience, and helping our customers get better and more informed data about their customers so they can make better decisions.


Microsoft Store empowers students with free Computer Science Education Week workshops | Windows Experience Blog

Students at a Microsoft Store

One hundred and thirteen years ago in New York, a girl was born into a generation where the average woman was more likely to perfect a signature pie recipe than solve a pi-based equation. Pushing against expectations, this girl became one of the pre-eminent technologists of our times and made it possible to convert human language into machine code understood by computers. On Dec. 9, the anniversary of Admiral Grace Hopper’s birthday kicks off Computer Science Education Week, an annual program dedicated to inspiring K-12 students to take interest in computer science.

As technology such as AI and cloud computing rapidly transforms the future of work, it’s more important than ever for students and educators to develop STEM—science, technology, engineering and math—skills. It’s estimated that over 85 million jobs worldwide will go unfilled by 2030 if we don’t bridge the STEM skills gap—but schools often struggle to implement quality STEM curriculum and prepare students for career paths that are just starting to come into focus.

Inspire students to ignite a passion for Computer Science

Microsoft Stores are committed to empowering students and educators with computer science resources and will host over 400 events across their locations, in partnership with STEM influencers, throughout Computer Science Education Week. To ensure no one is left behind when it comes to developing increasingly crucial computer science skills, this year’s programming has an increased focus on inclusion for traditionally underrepresented students.

Microsoft Store workshops will offer hands-on learning centered around coding, game design, app development and more using technologies from Windows, Surface, Office 365, Minecraft and more. Participants will hear from a diverse group of mentors and organizations from across the STEM field, including Lynell Caldwell, NASA, Brandon Copeland, Black Girls Code, Al Smith, Curtis Baham, Lee Woodall, Dennis Brown and Titus O’Neil.


Check your local Microsoft Store to register for exciting workshops geared toward empowering every learner, including workshops like:

  • Latina Girls in Gaming with MakeCode Arcade: Learn basic block coding and create video games with Gabriela Ponce, producer with Turn 10 Studios and advocate for helping the Latinx community succeed in the gaming industry. Gaby will share more about her journey combining her passions for art, culture and technology, and empower Latina girls to embrace STEM skills.
  • All Kids Code with Tynker Space Quest: Solve coding puzzles to guide an astronaut in space with Nadmi Casiano, the first deaf woman to graduate with an aeronautical engineering degree. All students are welcome, and ASL interpreters will be available at participating Microsoft Store locations to empower students with hearing disabilities.
  • African American Girls Code with Tynker Space Quest: Joan Higginbotham, one of the first African American female astronauts to go into space, will share her experience at a workshop geared toward inspiring African American girls to pursue STEM. Participants will learn basic coding concepts in an engaging format as they navigate aliens in search of a spaceship.
  • MANCODE with Design and Code Apps: Brainstorm app ideas and bring ideas to life with MANCODE, an organization aimed at addressing the stagnant growth of African American males within the STEM industry, who currently represent only 2.2% of the field. This workshop is geared toward underrepresented male students aged 13 and older, who will meet a male minority mentor and learn about the importance of technology.
  • Harry Potter Kano Coding Kit Workshop: Explore the magic of STEM at a Harry Potter Kano Coding Kit Workshop that introduces foundational coding concepts, including drag-and-drop coding. This autism-friendly workshop features alternate activities to allow a broad level of participation, and parents are welcome to join with their child.

In addition to these workshops, Microsoft Stores will also host Minecraft Hour of Code workshops, teaching students of all ages to code with Minecraft. The new Minecraft Hour of Code lesson aligns with this year’s theme, Computer Science for Good. Students will explore coding and artificial intelligence as they protect a village from forest fires in an immersive Minecraft world. Anyone can learn how coding can help build a better world—in just one hour!

Check availability of workshops and RSVP at your local Microsoft Store. Programming will vary by location. And did you know students always get 10% off at Microsoft Store?* Make sure to take advantage of your discount when you shop at Microsoft Store.

* See full terms at https://www.microsoft.com/en-us/store/b/education

Author: Microsoft News Center

Minecraft Earth Early Access Off to an Exciting Start Following Mobs in the Park Kickoff – Xbox Wire

Summary

  • Last week, we unveiled the Mobs in the Park pop-up experience in New York City, Sydney and London
  • Since kicking off early access on Oct. 17, the global community has placed 240.4 million blocks, collected 76 million tappables and started 6.8 million crafting and smelting sessions
  • The Mobs in the Park will continue over the next two weekends from 9 a.m. – 6 p.m. local time, with a special appearance on Black Friday in New York City

Last week in celebration of Minecraft Earth’s early access rollout, we unveiled the Mobs in the Park pop-up experience in three locations around the world – Hudson Yards in New York City, Campbell’s Cove in Sydney and the Queen’s Walk in London – granting players exclusive in-game access to the holiday-themed Jolly Llama mob. The community’s response to Mobs in the Park has been humbling over the first weekend, and we can’t wait to see even more reactions leading into the next two weekends.

The fun doesn’t stop with Mobs in the Park as Minecraft Earth continues to gain momentum and roll out to more countries worldwide. Last week the game released in the U.S., earlier this week it became available to players in Western Europe and Japan, and the goal is for the game to be worldwide by the end of the year.

Since kicking off early access on Oct. 17, the global community has placed 240.4 million blocks, collected 76 million tappables and started 6.8 million crafting and smelting sessions! We’re so proud of how the community has embraced the game in early access rollout and look forward to bringing even more exciting experiences to players everywhere in the coming weeks.

Minecraft Earth’s Mobs in the Park will continue over the next two weekends, so players interested in receiving the Jolly Llama for themselves can visit the interactive pop-ups from 9 a.m. – 6 p.m. local time during the weekends of November 23-24 and November 30-December 1, or during a special appearance at Hudson Yards in New York City on Black Friday, November 29. Only at these locations will players be able to get first access to the holiday-inspired Jolly Llama before it’s available broadly in December.

For more information on Mobs in the Park and what’s next with early access rollout, visit Xbox Wire and Minecraft.net.

Author: Microsoft News Center

Cloud database services multiply to ease admin work by users

NEW YORK — Managed cloud database services are mushrooming, as more database and data warehouse vendors launch hosted versions of their software that offer elastic scalability and free users from the need to deploy, configure and administer systems.

MemSQL, TigerGraph and Yellowbrick Data all introduced cloud database services at the 2019 Strata Data Conference here. In addition, vendors such as Actian, DataStax and Hazelcast said they soon plan to roll out expanded versions of managed services they announced earlier this year.

Technologies like the Amazon Redshift and Snowflake cloud data warehouses have shown that there’s a viable market for scalable database services, said David Menninger, an analyst at Ventana Research. “These types of systems are complex to install and configure — there are many moving parts,” he said at the conference. With a managed service in the cloud, “you simply turn the service on.”

Menninger sees cloud database services — also known as database as a service (DBaaS) — as a natural progression from database appliances, an earlier effort to make databases easier to use. Like appliances, the cloud services give users a preinstalled and preconfigured set of data management features, he said. On top of that, the database vendors run the systems for users and handle performance tuning, patching and other administrative tasks.

Overall, the growing pool of DBaaS technologies provides good options “for data-driven companies needing high performance and a scalable, fully managed analytical database in the cloud at a reasonable cost,” said William McKnight, president of McKnight Consulting Group.

Database competition calls for cloud services

For database vendors, cloud database services are becoming a must-have offering to keep up with rivals and avoid being swept aside by cloud platform market leaders AWS, Microsoft and Google, according to Menninger. “If you don’t have a cloud offering, your competitors are likely to eat your lunch,” he said.

Strata Data Conference
The Strata Data Conference was held from Sept. 23 to 26 in New York City.

Todd Blaschka, TigerGraph’s chief operating officer, also pointed to the user adoption of the Atlas cloud service that NoSQL database vendor MongoDB launched in 2016 as a motivating factor for other vendors, including his company. “You can see how big of a revenue generator that has been,” Blaschka said. Services like Atlas “allow more people to get access [to databases] more quickly,” he noted.

Blaschka said more than 50% of TigerGraph’s customers already run its namesake graph database in the cloud, using a conventional version that they have to deploy and manage themselves. But with the company’s new TigerGraph Cloud service, users “don’t have to worry about knowing what a graph is or downloading it,” he said. “They can just build a prototype database and get started.”

TigerGraph Cloud is initially available in the AWS cloud; support will also be added for Microsoft Azure and then Google Cloud Platform (GCP) in the future, Blaschka said.

Yellowbrick Data made its Yellowbrick Cloud Data Warehouse service generally available on all three of the cloud platforms, giving users a DBaaS alternative to the on-premises data warehouse appliance it released in 2017. Later this year, Yellowbrick also plans to offer a companion disaster recovery service that provides cloud-based replicas of on-premises or cloud data warehouses.

More cloud database services on the way

MemSQL, one of the vendors in the NewSQL database category, detailed plans for a managed cloud service called Helios, which is currently available in a private preview release on AWS and GCP. Azure support will be added next year, said Peter Guagenti, MemSQL’s chief marketing officer.

About 60% of MemSQL’s customers run its database in the cloud on their own now, Guagenti said. But he added that the company, which primarily focuses on operational data, waited for a mature implementation of the Kubernetes StatefulSets API object for managing stateful applications in containers before launching the Helios service.

Actian, which introduced a cloud service version of its data warehouse platform on AWS last March, said it will make the Avalanche service available on Azure this fall and on GCP at a later date.

We ultimately are the caretaker of the system. We may not do the actual work, but we guide them on it.
Naghman Waheed, data platforms lead, Bayer Crop Science

DataStax, which offers a commercial version of the Cassandra open source NoSQL database, said it’s looking to make a cloud-native platform called Constellation and a managed version of Cassandra that runs on top of it generally available in November. The new technologies, which DataStax announced in May, will initially run on GCP, with support to follow on AWS and Azure.

Also, in-memory data grid vendor Hazelcast plans in December to launch a version of its Hazelcast Cloud service for production applications. The Hazelcast Cloud Dedicated edition will be deployed in a customer’s virtual private cloud instance, but Hazelcast will configure and maintain systems for users. The company released free and paid versions of the cloud service for test and development uses in March on AWS, and it also plans to add support for Azure and GCP in the future.

Managing managed database services vendors

Bayer AG’s Bayer Crop Science division, which includes the operations of Monsanto following Bayer’s 2018 acquisition of the agricultural company, uses managed database services on Teradata data warehouses and Oracle’s Exadata appliance. Naghman Waheed, data platforms lead at Bayer Crop Science, said the biggest benefit of both on-premises and cloud database services is offloading routine administrative tasks to a vendor.

“You don’t have to do work that has very little value,” Waheed said after speaking about a metadata management initiative at Bayer in a Strata session. “Why would you want to have high-value [employees] doing that work? I’d rather focus on having them solve creative problems.”

But he said there were some startup issues with the managed services, such as standard operating procedures not being followed properly. His team had to work with Teradata and Oracle to address those issues, and one of his employees continues to keep an eye on the vendors to make sure they live up to their contracts.

“We ultimately are the caretaker of the system,” Waheed said. “We do provide guidance — that’s still kind of our job. We may not do the actual work, but we guide them on it.”


AWS Summit widens net with services for containers, devs

NEW YORK — AWS pledges to maintain its torrid pace of product and services innovations and continue to expand the breadth of both to meet customer needs.

“You decide how to build software, not us,” said Werner Vogels, Amazon vice president and CTO, in a keynote at the AWS Summit NYC event. “So, we need to give you a really big toolbox so you can get the tools you need.”

But AWS, which holds a healthy lead over Microsoft and Google in the cloud market, also wants to serve as an automation engine for customers, Vogels added.

“I strongly believe that in the future … you will only write business logic,” he said. “Focus on building your application, drop it somewhere and we will make it secure and highly available for you.”

Parade of new AWS services continues

Vogels sprinkled a series of news announcements throughout his keynote, two of which centered on containers. First, Amazon CloudWatch Container Insights, a service that provides container-level monitoring, is now in preview for monitoring clusters in Amazon Elastic Container Service and Amazon Fargate, in addition to Amazon EKS and Kubernetes. In addition, AWS for Fluent Bit, which serves as a centralized environment for container logging, is now generally available, he said.
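Container Insights is switched on per cluster through an ECS cluster setting. The snippet below builds that settings payload; the boto3 call that would apply it is shown in a comment, and the cluster name there is a placeholder:

```python
def container_insights_settings(enabled=True):
    """Build the ECS cluster-settings payload that toggles Container Insights."""
    return [{"name": "containerInsights",
             "value": "enabled" if enabled else "disabled"}]

# With boto3, the payload would be applied roughly like this:
#   import boto3
#   ecs = boto3.client("ecs")
#   ecs.update_cluster_settings(cluster="my-cluster",   # placeholder name
#                               settings=container_insights_settings())
```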

Serverless compute also got some attention with the release of Amazon EventBridge, a serverless event bus to take in and process data across AWS’ own services and SaaS applications. AWS customers currently do this with a lot of custom code, so “the goal for us was to provide a much simpler programming model,” Vogels said. Initial SaaS partners for EventBridge include Zendesk, OneLogin and Symantec.
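EventBridge takes in events as structured entries with a source, a detail type and a JSON detail payload. A minimal sketch of shaping one such entry, with the source and detail values invented for illustration:

```python
import json

def make_event_entry(source, detail_type, detail, bus_name="default"):
    """Shape an event the way EventBridge's PutEvents API expects entries."""
    return {
        "Source": source,
        "DetailType": detail_type,
        "Detail": json.dumps(detail),   # Detail must be a JSON string
        "EventBusName": bus_name,
    }

entry = make_event_entry("myapp.orders", "OrderPlaced", {"orderId": "1234"})
# With boto3: boto3.client("events").put_events(Entries=[entry])
```

Replacing hand-rolled glue code with a shared event bus like this is the "simpler programming model" Vogels described.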

Focus on building your application, drop it somewhere and we will make it secure and highly available for you.
Werner Vogels, CTO, AWS

AWS minds the past, with eye on the future

Most customers are moving away from the concept of a monolithic application, “but there are still lots of monoliths out there,” such as SAP ERP implementations that won’t go away anytime soon, Vogels said.

But IT shops with a cloud-first mindset focus on newer architectural patterns, such as microservices. AWS wants to serve both types of applications with a full range of instance types, containers and serverless functionality, Vogels said.

He cited customers such as McDonald’s, which has built a home-delivery system with Amazon Elastic Container Service. It can take up to 20,000 orders per second and is integrated with partners such as Uber Eats, Vogels said.

Vogels ceded the stage for a time to Steve Randich, executive vice president and CIO of the Financial Industry Regulatory Authority (FINRA), a nonprofit group that seeks to keep brokerage firms fair and honest.

FINRA moved wholesale to AWS and its systems now ingest up to 155 billion market events in a single day — double what it was three years ago. “When we hit these peaks, we don’t even know them operationally because the infrastructure is so elastic,” Randich said.
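For a sense of scale, 155 billion events per day averages out to roughly 1.8 million events per second, which is why provisioning fixed hardware for the peak would leave most of it idle the rest of the time:

```python
# Back-of-the-envelope average throughput for FINRA's stated daily volume.
events_per_day = 155_000_000_000
seconds_per_day = 24 * 60 * 60          # 86,400

avg_per_second = events_per_day / seconds_per_day
print(f"{avg_per_second:,.0f} events/sec on average")  # ~1,793,981
```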

FINRA has designed the AWS-hosted apps to run across multiple availability zones. “Essentially, our disaster recovery is tested daily in this regard,” he said.

AWS’ ode to developers

Developers have long been a crucial component of AWS’ customer base, and the company has built out a string of tool sets covering a broad set of languages and integrated development environments (IDEs). These include AWS Cloud9, IntelliJ, Python, Visual Studio and Visual Studio Code.

VS Code is Microsoft's lightweight, open source code editor, which has seen strong initial uptake. AWS tooling for all the different languages in VS Code is now generally available, Vogels said to audience applause.

Additionally, the AWS Cloud Development Kit (CDK) is now generally available with support for TypeScript and Python. AWS CDK makes it easier for developers to use high-level constructs to define cloud infrastructure in code, said Martin Beeby, AWS principal developer evangelist, in a demo.

AWS seeks to keep the cloud secure

Vogels also used part of his AWS Summit talk to reiterate AWS’ views on security, as he did at the recent AWS re:Inforce conference dedicated to cloud security.

“There is no line in the sand that says, ‘This is good-enough security,'” he said, citing newer techniques such as automated reasoning as key advancements.

Werner Vogels, CTO of AWS, on stage at the AWS Summit in New York.

Classic security precautions have become practically obsolete, he added. “If firewalls were the way to protect our systems, then we’d still have moats [around buildings],” Vogels said. Most attack patterns AWS sees are not brute-force front-door efforts, but rather spear-phishing and other techniques: “There’s always an idiot that clicks that link,” he said.

The full spectrum of IT, from operations to engineering to compliance, must be mindful of security, Vogels said. Within DevOps practices such as CI/CD, that applies at both the external and internal levels, he said. The former involves matters such as identity and access management and hardened servers, while the latter brings in techniques including artifact validation and static code analysis.
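As one illustration of the artifact validation Vogels mentioned, a CI/CD step can refuse to deploy a build artifact whose digest does not match the one recorded at build time. This is a minimal sketch of that idea; the file name and contents are invented for the demo, not anything from AWS' pipeline.

```python
import hashlib
from pathlib import Path

def artifact_is_valid(path, expected_sha256):
    """Return True only if the file's SHA-256 digest matches the recorded one."""
    digest = hashlib.sha256(path.read_bytes()).hexdigest()
    return digest == expected_sha256

# Demo with a throwaway file standing in for a build artifact.
artifact = Path("app-build.bin")
artifact.write_bytes(b"release artifact contents")
recorded = hashlib.sha256(b"release artifact contents").hexdigest()

print(artifact_is_valid(artifact, recorded))   # untampered artifact passes
artifact.write_bytes(b"tampered contents")
print(artifact_is_valid(artifact, recorded))   # any modification fails the check
```

In a real pipeline the recorded digest would come from a trusted store populated at build time, so a tampered artifact is caught before it reaches production.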

AWS Summit draws veteran customers and newcomers

The event at the Jacob K. Javits Convention Center drew thousands of attendees with a wide range of cloud experience, from FINRA to fledgling startups.

“The analytics are very interesting to me, and how I can translate that into a set of services for the clients I’m starting to work with,” said Donald O’Toole, owner of CeltTools LLC, a two-person startup based in Brooklyn. He retired from IBM in 2018 after 35 years.

AWS customer Timehop offers a mobile application oriented around “digital nostalgia,” which pulls together users’ photographs from various sources such as Facebook and Google Photos, said CTO Dmitry Traytel.

A few years ago, Timehop found itself in a place familiar to many startups: low on venture capital and without a viable monetization strategy. Rather than rely on third-party products, the company created its own advertising server on top of AWS, dubbed Nimbus. Once a user session starts, the system conducts an auction across multiple prominent mobile ad networks, which yields the best possible price for its ad inventory.
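The heart of an auction like the one Traytel describes is simple: collect a bid from each network for the session's impression and serve the highest. This hypothetical sketch shows only that selection step; the network names and bid values are invented, and Timehop's actual Nimbus implementation is not public.

```python
def run_auction(bids):
    """Pick the winning network from a mapping of network name -> bid (e.g. CPM in dollars)."""
    if not bids:
        raise ValueError("no bids received")
    winner = max(bids, key=bids.get)
    return winner, bids[winner]

# Invented per-session bids from three hypothetical ad networks.
session_bids = {"network_a": 2.10, "network_b": 3.45, "network_c": 2.95}
winner, price = run_auction(session_bids)
print(winner, price)  # network_b 3.45
```

A production system would add timeouts for slow networks and a price floor, but the competitive-bidding core is what lets the inventory clear at the best available price.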

“Nimbus let us pivot to a different category,” Traytel said.


Containers key for Hortonworks alliance on big data hybrid

NEW YORK — Hortonworks forged a deal with IBM and Red Hat to produce the Open Hybrid Architecture Initiative. The goal of the Hortonworks alliance is to build a common architecture for big data workloads running both on the cloud and in on-premises data centers.

Central to the Hortonworks alliance initiative is the use of containers orchestrated by Kubernetes.

Hortonworks’ deal was discussed as part of the Strata Data Conference here, where computing heavyweight Dell EMC also disclosed an agreement with data container specialist BlueData Software to present users with reference architecture that brings cloud-style containers on premises.

Big data infrastructure shifts

Both deals indicate changes are afoot in infrastructure for big data: container-based designs developed for the cloud are starting to show how data will be handled inside organizations as well.

The Hortonworks alliance hybrid initiative — as well as Dell's and other reference architectures — reflects changes spurred by the multitude of analytics engines now available to handle data workloads and by the move of big data applications to the cloud, said Gartner analyst Arun Chandrasekaran in an interview.

“Historically, big data was about coupling compute and storage together. That worked pretty well when MapReduce was the sole engine. Now, there are multiple processing engines working on the same data lake,” Chandrasekaran said. “That means, in many cases, customers are thinking about decoupling compute and storage.”

De-linking computing and storage

Essentially, modern cloud deployments decouple compute and storage, Chandrasekaran said. That approach has spurred greater interest in containerizing big data workloads for portability, he noted.

We are decoupling storage and compute again.
Arun Murthy, chief product officer and co-founder, Hortonworks

The shifts in architecture toward container orchestration show people want to use their infrastructure more efficiently, Chandrasekaran said.

The Hortonworks alliance with Red Hat and IBM shows a basic change is underway for the Hadoop-style open source distributed data processing framework. Cloud and on-premises architectural schemes are blending.

“We are decoupling storage and compute again,” said Arun Murthy, chief product officer and co-founder of Hortonworks, based in Santa Clara, Calif., in an interview. “As a result, the architecture will be consistent whether processing is on premises or on cloud or on different clouds.”

The elastic cloud

This style of architecture pays heed to elastic cloud methods.

Strata Data Conference 2018
This week’s Strata Data Conference in New York included a focus on Hortonworks’ deal with IBM and Red Hat, an agreement between Dell EMC and BlueData, and more.

“In public cloud, you don’t keep the architecture up and running if you don’t have to,” Murthy said.

That’s compared with what Hadoop has done traditionally in the data center, where clusters were often configured and sitting ready for high-peak loads.

For Lars Herrmann, general manager of the integrated solutions business unit at Red Hat, based in Raleigh, N.C., the Hortonworks alliance project is a step toward bringing in a large class of data applications to run natively on the OpenShift container platform. It’s also about deploying applications more quickly.

“The idea of containerization of applications allows organizations to be more agile. It is part of the trend we see of people adopting DevOps methods,” Herrmann said.

Supercharging on-premises applications

For its part, Dell EMC sees spinning up data applications more quickly on premises as an important part of the reference architecture it has created with help from BlueData.

“With the container approach, you can deploy different software on demand to different infrastructure,” Kevin Gray, director of product marketing at Dell EMC, said in an interview at Strata Data.

The notion of multi-cloud support for containers is popular, and Hadoop management and deployment software providers are moving to support various clouds. At Strata, BlueData made its EPIC software available on Google Cloud Platform and Microsoft Azure; EPIC support for AWS was already available.

Big data evolves to singular architecture

Tangible benefits will accrue as big data architecture evolves toward a more singular design for data processing in the cloud and in the data center, said Mike Matchett, analyst and founder of Small World Big Data, in an interview at the conference.

“Platforms need to be built such that they can handle distributed work and deal with distributed data. They will be the same on premises as on the cloud. And, in most cases, they will be hybridized, so the data and the processing can flow back and forth,” Matchett said.

There still will be some special optimizations for performance, Matchett added. IT managers will make decisions based on different workloads, as to where particular analytics processing will occur.

Matt Wood talks AWS’ AI platform, ethical use

NEW YORK — AWS spotlighted its evolving AI offerings at AWS Summit this week, with a variety of features and upgrades.

The company incorporated one emerging technology, Amazon Rekognition, into the event’s registration process as it scanned consenting attendees’ faces and compared them against photos submitted previously during registration.
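A registration kiosk like the one described would compare a live capture against the photo submitted earlier, which maps to Rekognition's CompareFaces API. This sketch only builds the request payload that boto3's `compare_faces` call takes; the bucket, object keys and threshold are hypothetical, and the actual AWS call is shown commented out since it needs credentials.

```python
def compare_faces_request(source_key, target_key,
                          bucket="example-registration-photos",
                          threshold=90.0):
    """Build the request shape for Rekognition CompareFaces against two S3 objects."""
    return {
        "SourceImage": {"S3Object": {"Bucket": bucket, "Name": source_key}},
        "TargetImage": {"S3Object": {"Bucket": bucket, "Name": target_key}},
        "SimilarityThreshold": threshold,  # only return matches at or above this score
    }

req = compare_faces_request("preregistered/jane.jpg", "kiosk/capture-001.jpg")

# Calling the service would use boto3 (not run here):
# import boto3
# matches = boto3.client("rekognition").compare_faces(**req)["FaceMatches"]
print(req["SimilarityThreshold"])
```

The response's `FaceMatches` list carries a similarity score per match, which a kiosk could use to decide whether to print the badge or fall back to manual check-in.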

But despite usage guidelines for customers, the AWS AI platform is not immune to growing concerns over potentially unethical use of these advanced systems. Civil rights advocacy groups worry that technology providers' breakneck pace to provide AI capabilities, such as Rekognition, could lead to abuses of power in the public sector and law enforcement, among others.

Matt Wood, AWS general manager of deep learning and AI, discussed advancements to the AWS AI platform, adoption trends, customer demands and ethical concerns in this interview.

AWS has added a batch transform feature to its SageMaker machine learning platform to process data sets for non-real-time inferencing. How does that capability apply to customers trying to process larger data files?

Matt Wood: We support the two major ways you’d want to run predictions. You want to run predictions against fresh data as it arrives in real time; you can do that with SageMaker-hosted endpoints. But there are tons of cases in which you want to be able to apply predictions to large amounts of data, either that just arrives or gets exported from a data warehouse, or that is just too large in terms of the raw data size to process one by one. These two things are highly complementary.

We see a lot of customers that want to run billing reports or forecasting. They want to look at product sales at the end of a quarter or the end of a month [and] predict the demand going forward. Another really good example is [to] build a machine learning model and test it out on a data set you understand really well, which is really common in oil and gas, medicine and medical imaging.
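The batch path Wood describes maps to SageMaker's CreateTransformJob API. This sketch builds the request payload that boto3's `create_transform_job` takes; the job, model and bucket names are hypothetical, and the submitting call is commented out since it requires an existing SageMaker model and AWS credentials.

```python
def batch_transform_request(job_name, model_name, input_s3, output_s3):
    """Build the request shape for a SageMaker batch transform job over CSV data."""
    return {
        "TransformJobName": job_name,
        "ModelName": model_name,  # a model already created in SageMaker
        "TransformInput": {
            "DataSource": {"S3DataSource": {
                "S3DataType": "S3Prefix",
                "S3Uri": input_s3,  # prefix holding the batch of records
            }},
            "ContentType": "text/csv",
            "SplitType": "Line",    # treat each line as one record
        },
        "TransformOutput": {"S3OutputPath": output_s3},
        "TransformResources": {"InstanceType": "ml.m5.xlarge", "InstanceCount": 1},
    }

req = batch_transform_request("demand-forecast-batch", "demand-model",
                              "s3://example-bucket/input/", "s3://example-bucket/output/")

# Submitting would use boto3 (not run here):
# import boto3
# boto3.client("sagemaker").create_transform_job(**req)
print(req["TransformJobName"])
```

Unlike a hosted endpoint, the job spins up the instances, scores the whole S3 prefix, writes predictions to the output path and shuts down, which fits the end-of-quarter forecasting runs Wood mentions.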

In the keynote, you cited 100 new machine learning features or services [AWS has developed] since re:Invent last year. What feedback do you get from customers for your current slate [of AI services]?

Wood: What we heard very clearly was a couple things. No. 1, customers really value strong encryption and strong network isolation. A lot of that has to do with making sure customers have good encryption integrated with Key Management Service inside SageMaker. We also recently added PrivateLink support, which means you can connect up your notebooks and training environment directly to DynamoDB, Redshift or S3 without that data ever flowing out over the private internet. And you can put your endpoints over PrivateLink as well. [Another] big trend is around customers using multiple frameworks together. You’ll see a lot of focus on improving TensorFlow, improving Apache MXNet, adding Chainer support, adding PyTorch support and making sure ONNX [Open Neural Network Exchange] works really well across those engines so that customers can take models trained in one and run them in a different engine.

Matt Wood, AWS general manager of deep learning and AI, speaks during the AWS Summit keynote address (Source: AWS).

What do you hear from enterprises that are reluctant or slow to adopt AI technologies? And what do you feel that you have to prove to those customers?

Wood: It’s still early for a lot of enterprises, and particularly for regulated workloads, there’s a lot of due diligence to do — around HIPAA [Health Insurance Portability and Accountability Act], for example, getting HIPAA compliance in place. The question is: ‘How can I move more quickly?’ That’s what we hear all the time.

There’s two main pathways that we see [enterprises take] today. The first is: They try and look at the academic literature, [which] is very fast-moving, but also very abstract. It’s hard to apply it to real business problems. The other is: You look around on the web, find some tutorials and try to learn it that way. That often gives you something which is up and running that works, but again, it glosses over the fundamentals of how do you collect training data, how do you label that data, how do you build and define a neural network, how do you train that neural network.

To help developers learn, you want a very fast feedback loop. You want to be able to try something out, learn from it, what worked and what didn’t work, then make a change. It’s kick-starting that flywheel, which is very challenging with machine learning.

What are some usage patterns or trends you’ve seen from SageMaker adopters that are particularly interesting?

Wood: A really big one is sports analytics. Major League Baseball selected SageMaker and the AWS AI platform to power their production stats that they use in their broadcasts and on [their] app. They’ve got some amazing ideas about how to build more predictive and more engaging visuals and analytics for their users. [It’s the] same thing with Formula 1 [F1]. They’re taking 65 years’ worth of performance data from the cars — they have terabytes of the stuff — to model different performance of different cars but also to look at race prediction and build an entirely new category of visuals for F1 fans. The NFL [is] doing everything from computer vision to using player telemetry, using their position on the field to do route prediction and things like that. Sports analytics drives such an improvement in the experience for fans, and it’s a big area of investment for us.

Another is healthcare and medical imaging. We see a lot of medical use cases — things like disease prediction, such as how likely are you to have congestive heart failure in the next 12 months, do outpatient prediction, readmittance prediction, those sorts of things. We can actually look inside an X-ray and identify very early-stage lung cancer before the patient even knows that they’re sick. [And] you can run that test so cheaply. You can basically run it against any chest X-ray.

You partnered with Microsoft on Gluon, the deep learning library. What’s the status of that project? What other areas might you collaborate with Microsoft or another major vendor on an AI project?

Wood: Gluon is off to a great start. Celgene, a biotech that’s doing drug toxicity prediction, is trying to speed up clinical trials to get drugs to market more quickly. All of that runs in SageMaker, and they use Gluon to build models. That’s one example; we have more.

Other areas of collaboration we see is around other engines. For example, we were a launch partner for PyTorch 1.0 [a Python-based machine learning library, at Facebook’s F8 conference]. PyTorch has a ton of interest from research scientists, particularly in academia, [and we] bring that up to SageMaker and work with Facebook on the development.

Microsoft President Bradford Smith recently called on Congress to consider federal regulation for facial recognition services. What is Amazon’s stance on AI regulation? How much should customers determine ethical use of AI, facial recognition or other cloud services, and what is AWS’ responsibility?

Wood: Our approach is that Rekognition, like all of our services, falls under our Acceptable Use Policy, [which] is very clear with what it allows and what it does not allow. One of the things that it does not allow is anything unconstitutional; mass surveillance, for example, is ruled out. We're very clear that customers need to take that responsibility, and if they fall outside our Acceptable Use [Policy], just like anyone else on AWS, they will lose access to those services, because we won't support them. They need to be responsible with how they test, validate and communicate their use of these technologies because they can be hugely impactful.

Amazon Rekognition kiosks scan the faces of attendees and print identification badges (Source: David Carty).

The American Civil Liberties Union, among others, has asked AWS to stop selling Rekognition to law enforcement agencies. Will you comply with that request? If not, under what circumstances might that decision change?

Wood: Again, that’s covered under our Acceptable Use Policy. If any customer in any domain is using any of our services in a way which falls outside of acceptable use, then they will lose access to that service.

Certainly, the Acceptable Use Policy covers lawful use, but do you think that also covers ethical use? That’s a thornier question.

Wood: It is a thornier question. I think it’s part of a broader dialogue that we need to have, just as we’ve had with motor cars and any large-scale technology which provides a lot of opportunity, but which also needs a public and open discussion.