As International Financial Data Services (IFDS) started containerizing more and more environments, it needed better Kubernetes backup.
IFDS first dipped its toes into containers in 2017. The company writes its own software for the financial industry, and the first container deployment was for its application development environment. With the success of the large containerized quality assurance testing environment, the company started using containers in production as well.
Headquartered in Toronto, IFDS provides outsourcing and technology for financial companies, such as investment funds’ record keeping and back-office support. It has around 2,400 employees, a clientele of about 240 financial organizations and $3.6 trillion CAD ($2.65 trillion US) in assets under administration.
Kent Pollard, senior infrastructure architect at IFDS, has been with the company for 25 of its 33 years and said containerizing production opened up a need for backup. One of the use cases of containers is to quickly bring up applications or services with little resource overhead and without the need to store anything. However, Pollard said IFDS' container environment was no longer about simply spinning up and spinning down.
“We’re not a typical container deployment. We have a lot of persistent storage,” Pollard said.
Zerto recently unveiled its Zerto for Kubernetes backup product at ZertoCon 2020, but Pollard has been working with an alpha build of it for the past month. He said it is still in early stages, and he’s been giving feedback to Zerto, but he has a positive impression so far. Pollard said not having to turn to another vendor such as IBM, Asigra or Trilio for Kubernetes backup will be a huge benefit.
Pollard’s current container backup method uses Zerto to restore containers in a roundabout way. His container environment is built in Red Hat OpenShift and running in a virtualized environment. Zerto is built for replicating VMs, so Pollard can use it to restore the entire VM housing OpenShift. The drawback is this reverts the entire VM to an earlier state, when all he wanted was to restore a single container.
Pollard said that, at the very least, Zerto for Kubernetes allows him to restore at the container level instead. He understood the early nature of what he's been testing and said he is looking forward to when other Zerto capabilities get added, such as ordered recovery and automated workflows for failover and testing. From his limited experience, Pollard said he believes Zerto for Kubernetes has the potential to fill his container backup needs.
Pollard said Zerto for Kubernetes will give him incentive to containerize more of IFDS’ environment. The number of containers IFDS currently has in production is still relatively small, and part of the reason Pollard won’t put more critical workloads in containers is because he can’t protect them yet.
He said there were many reasons IFDS moved to containers three years ago. With containers, IFDS can use its underlying hardware resources more efficiently, enabling faster responses to application load changes. Pollard also said containers improved IFDS' security and support the company's future move to the cloud and its build-out of a hybrid infrastructure. Zerto provided Pollard with an AWS environment to test Zerto for Kubernetes, but IFDS currently has no cloud footprint whatsoever.
IFDS first deployed Zerto in late 2014. It started as a small production environment deployment on a couple of VMs but became the company's standard tool for disaster recovery. IFDS now uses Zerto to protect 190 VMs and 200 TB of storage. Pollard said he was sold after the first annual DR test, when Zerto completed the test in 30 minutes.
“We never had anything that fast. It was always hours and hours for a DR test,” he said.
Organizations are creating and consuming more data than ever before, spawning enterprise data management system challenges and opportunities.
A key challenge is volume. With enterprises creating more data, they need to manage and store more data. Organizations are also increasingly relying on the cloud for their storage needs because of the cloud's scalability and low cost.
IDC's Global DataSphere Forecast currently estimates that in 2020, enterprises will create and capture 6.4 zettabytes of new data. In terms of what types of new data are being created, productivity data — operational, customer, sales and embedded data — is the fastest-growing category, according to IDC.
“Productivity data encompasses most of the data we create on our PCs, in enterprise servers or on scientific computers,” said John Rydning, research vice president for IDC's Global DataSphere.
Productivity data also includes data captured by sensors embedded in industrial devices and endpoints, which can be leveraged by an organization to reduce costs or increase revenue.
Rydning also noted that IDC is seeing growth in productivity-related metadata, which provides additional data about the captured or created data that can be used to enable deeper analysis.
Enterprise data management system challenges in a world of data growth
Looking ahead, Rydning sees challenges for enterprise data management.
Perhaps the biggest is dealing with the growing volume of archived data. With archival data, organizations will need to decide whether that data is best kept on relatively accessible storage systems for artificial intelligence analysis, or if it is more economical to move the data to lower-cost media such as tape, which is less readily available for analysis.
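The tradeoff Rydning describes can be framed as a simple annual-cost comparison. The sketch below illustrates the structure of that decision; all prices and parameter names are hypothetical, not drawn from any IDC data:

```python
def cheaper_tier(tb, hot_per_tb_month, tape_per_tb_month,
                 tape_retrieval_per_tb, retrievals_per_year):
    """Compare the annual cost of keeping archived data on readily
    accessible ("hot") storage versus tape.  Tape is cheap to hold but
    charges a recall cost each time the archive is pulled for analysis."""
    hot_annual = tb * hot_per_tb_month * 12
    tape_annual = (tb * tape_per_tb_month * 12
                   + tb * tape_retrieval_per_tb * retrievals_per_year)
    return "hot" if hot_annual <= tape_annual else "tape"
```

With rarely recalled archives, tape wins easily; once the same data is pulled back for AI analysis often enough, the recall charges erode the advantage — which is exactly the judgment call Rydning says organizations will face.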
Another challenge is handling data from the edge of the network, which is expected to grow in the coming years. There too the question will be where organizations should store reference data for rapid analysis.
“Organizations will increasingly need to be prepared to keep up with the growth of data being generated across a wider variety of endpoint devices feeding workflows and business processes,” Rydning said.
The data management challenge in the cloud
In 2019, 34% of enterprise data was stored in the cloud. By 2024, IDC expects that 51% of enterprise data will be stored in the cloud.
While the cloud offers organizations a more scalable and often easier way to store data than on-premises approaches, not all that data has the same value.
“Companies are continuing to dump data into storage without thinking about the applications that need to consume it,” said Monte Zweben, co-founder and CEO of Splice Machine. “They just substituted cheap cloud storage, and they continue to not curate it or transform it to be useful. It is now a cloud data swamp.”
The San Francisco-based vendor develops a distributed SQL relational database management system with integrated machine learning capabilities. While simply dumping data into the cloud isn't a good idea, that doesn't mean Zweben is opposed to the idea of cloud storage.
Indeed, Zweben suggested that organizations use the cloud, since cloud storage is relatively cheap. The key is to make sure that instead of just dumping data, enterprises find ways to use that data effectively.
“You may later realize you need to train ML [machine learning] models on data that you previously did not think was useful,” Zweben said.
Enterprise data management system lessons from data innovators
“Without a doubt, some companies are storing a lot of low-value data in the cloud,” said Andi Mann, chief technology advocate at Splunk, an information security and event management vendor. “But it is tough to say any specific dataset is unnecessary for any given business.”
In his view, the problem isn’t necessarily storing data that isn’t needed, but rather storing data that isn’t being used effectively.
Splunk sponsored a March 2019 study conducted by Enterprise Strategy Group (ESG) about the value of data. The report, based on responses from 1,350 business and IT decision-makers, segments users by data maturity levels, with “data innovators” being the top category.
“While many organizations do have vast amounts of data — and that might put them in the data innovator category — the real difference between data innovators and the rest is not how much data they have, but how well they enable their business to access and use it,” Mann said.
Among the findings in the report is that 88% of data innovators employ highly skilled data investigators. However, even skilled people are not enough, so 85% of these innovative enterprises use best-of-breed analytics tools and make sure to provide easy access to them.
“Instead of considering any data unnecessary, look at how to store even low-value data in a way that is both cost-effective, while allowing you to surface important insights if or when you need to,” Mann suggested. “The key is to treat data according to its potential value, while always being ready to reevaluate that value.”
Virtual machines and containers are both types of virtualized workloads that have more similarities than you may think. Each serves a specific purpose and can significantly increase the performance of your infrastructure — as long as they are employed effectively.
Microsoft unveiled container support in Windows Server 2016, which might have seemed like a novelty feature for many Windows administrators. But now that containers and the surrounding technology — orchestration, networking and storage — have matured in the Windows Server 2019 release, is it time to give containers more thought?
How do you make the decision on when to use VMs vs. containers? Is there a tipping point when you should make a switch? To help steer your decision, let's compare containers and virtual machines across three key areas: reliability, scalability and manageability.
When it comes to weighing the options, reliability is one of the first questions any engineer will ask. Although uptime ultimately depends on the engineers and engineering behind the technology, you can infer a lot about each option's dependability by analyzing its security and maintenance costs.
VMs. VMs are big, heavyweight monoliths. This isn't a comment about speed, because VMs can be blazingly fast. They're considered monoliths because each contains a full stack of technology: virtualized hardware, an operating system and even more software, all layered on top of each other in one package.
The advantage of utilizing VMs becomes apparent when you drill down to the hypervisor. VMs have full isolation between themselves and any other VM running on the same hardware or in the same cluster. This is highly secure; you can’t directly attack one VM in a cluster from another.
The other reliability advantage is longevity. People have been using VMs in Windows production environments for about 20 years. There are a large number of engineers with vast amounts of experience managing, deploying and troubleshooting VMs. If an issue with a VM arises, there’s a good chance it’s not a unique occurrence.
Containers. Containers are lightweight and less hardware-intensive because they don't run a full software stack. A container can be thought of as a wrapper around a process or application that can run in a stand-alone fashion.
You can run many containers on the same VM; because of this, containers don't have full isolation. You do have process isolation, but it's not as absolute as it is with a VM. This can cause some difficulties in spinning up and maintaining containers when determining how to parcel out resources.
Additionally, because containers are still relatively new compared to VMs, you might have trouble finding engineers with a comparable amount of career experience managing them. There are additional technologies to bring in to help with container administration and orchestration, and the learning curve to get started is generally seen as steeper than with more traditional technologies, such as VMs.
Scalability is the capability of the technology to maximize utilization across your environment. When you’re ready for your application to be accessed by tens of thousands of people, scalability is your friend.
VMs. VMs take a long time to spin up and deploy. Cloud technology such as AWS Auto Scaling and Azure Virtual Machine Scale Sets build out clones of the same VM and load-balance across them. While this is one way to reach scale, it’s a little clunky because of the VM spin-up time.
For a one-off application, VMs can host it and work well, but when it comes to reaching the masses, they can fall short. This is particularly true when attempting to use non-cloud-native automation to scale VMs. The sheer time difference between a VM deployment and a container deployment can cause your automation to go haywire.
Containers. Containers were built for scale. You can spin up one or a hundred new containers in milliseconds, which makes automation and orchestration with native cloud tooling a breeze.
Scale is so innate to containers that the real question is, “How far do you want to go?” You can use IaaS on AWS or Azure with your own Kubernetes orchestration, or take this one step further with PaaS technologies such as AWS Fargate or Azure Container Instances.
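The proportional scale-out that orchestrators automate can be illustrated with the calculation the Kubernetes Horizontal Pod Autoscaler documents: grow or shrink the replica count by the ratio of the observed metric to its target. A minimal sketch of that formula:

```python
import math

def desired_replicas(current_replicas, current_metric, target_metric):
    """HPA-style proportional scaling: if the observed metric (for
    example, average CPU utilization) is 1.5x its target, run 1.5x as
    many replicas, rounded up.  Never scale below one replica."""
    return max(1, math.ceil(current_replicas * current_metric / target_metric))
```

So four replicas averaging 90% CPU against a 60% target would be scaled to six, and ten replicas idling at 30% would be trimmed to five — the same arithmetic whether the workload is one container or a hundred.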
Once you have your VMs or containers running in production, you need a way to manage them. Deploying changes, updating software and even rotating technologies all fall under this purview.
VMs. There are scores of third-party tools to manage VMs, such as Puppet, Chef, System Center Configuration Manager and IBM BigFix. Each does software deployment, runs queries on your environment, and even performs more complex desired state configuration tasks. There are also a host of vendor tools to manage your VMs inside VMware, Citrix and Hyper-V.
VMs require care and feeding. Usually, when you create a VM, it follows a lifecycle from spin-up to its sunset date. In between, it requires maintenance and monitoring. This runs contrary to newer methodologies such as DevOps, infrastructure as code and immutable infrastructure. In these paradigms, servers and services are treated like cattle, not pets.
Containers. Orchestration and immutability are the hallmarks of containers. If a container breaks, you kill it and deploy another one without a second thought. There is no backup and restore procedure. Instead of spending time modifying or maintaining your environment, you fix a container by destroying it and creating a new one. VMs, because of the associated time and maintenance costs, simply can’t keep up with containers in this respect.
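The kill-and-replace workflow can be sketched as a tiny reconcile step. The container records and the `spawn` factory below are illustrative stand-ins, not any real orchestrator's API:

```python
def reconcile(containers, spawn):
    """Immutable-infrastructure repair: keep healthy containers as-is,
    and replace each unhealthy one with a freshly created container
    instead of fixing it in place."""
    healthy = [c for c in containers if c["healthy"]]
    replacements = [spawn() for c in containers if not c["healthy"]]
    return healthy + replacements
```

Orchestrators like Kubernetes run a loop of exactly this shape continuously, which is why a broken container costs seconds rather than a troubleshooting session.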
Containers are tailored for DevOps; they are a component of the infrastructure that treats developers and infrastructure operators as first-class citizens. Layering this new methodology on new technology offers a faster way to get things done by reducing the complexities tied to workload management.
Which is the way to go?
In the contest of VMs vs. containers, which one wins? The answer depends on your IT team and your use case. There are instances where VMs will continue to have an advantage and others where containers are a better choice. This comparison has just scratched the surface of the technical differences, but there are financial advantages to consider as well.
In a real-world environment, you will likely need both technologies. Monolithic VMs make sense for more solid and stable services such as Active Directory or the Exchange Server platform. For your development team and your homegrown apps utilizing the latest in release pipeline technology, containers will help them get up to speed and scale to the needs of your organization.
More than ever, educators are relying on technology to create inclusive learning environments that support all learners. As we recognize Global Accessibility Awareness Day, we’re pleased to mark the occasion with a spotlight on an innovative school that is committed to digital access and success for all.
Seattle-based Hamlin Robinson School, an independent school serving students with dyslexia and other language-based learning differences, didn’t set a specific approach to delivering instruction immediately after transitioning to remote learning. “Our thought was to send home packets of schoolwork and support the students in learning, and we quickly realized that was not going to work,” Stacy Turner, Head of School, explained in a recent discussion with the Microsoft Education Team.
About a week into distance learning, the school moved to more robust online instruction. The school serves grades 1-8, and students in fourth grade and up utilize Office 365 Education tools, including Microsoft Teams. So, leveraging those same resources for distance learning was natural.
Built-in accessibility features
Stacy said the school was drawn to Microsoft resources for schoolwide use because of built-in accessibility features, such as dictation (speech-to-text), and the Immersive Reader, which relies on evidence-based techniques to help students improve at reading and writing.
“What first drew us to Office 365 and OneNote were some of the assistive technologies in the toolbar,” Stacy said. Learning and accessibility tools are embedded in Office 365 and can support students with visual impairments, hearing loss, cognitive disabilities, and more.
Josh Phillips, Head of Middle School, says for students at Hamlin Robinson, finding the right tools to support their learning is vital. “When we graduate our students, knowing that they have these specific language-processing needs, we want them to have fundamental skills within themselves and strategies that they know how to use. But we also want them to know what tools are available to them that they can bring in,” he said.
For example, for students who have trouble typing, a popular tool is the Dictate, or speech-to-text, function of Office 365. Josh said that a former student took advantage of this function to write a graduation speech at the end of eighth grade. “He dictated it through Teams, and then he was able to use the skills we were practicing in class to edit it,” Josh said. “You just see so many amazing ideas get unlocked and be able to be expressed when the right tools come along.”
Supporting teachers and students
Providing teachers with expertise around tech tools also is a focus at Hamlin Robinson. Charlotte Gjedsted, Technology Director, said the school introduced its teachers to Teams last year after searching for a platform that could serve as a digital hub for teaching and learning. “We started with a couple of teachers being the experts and helping out their teams, and then when we shifted into this remote learning scenario, we expanded that use,” Charlotte said.
“Teams seems to be the easiest platform for our students to use in terms of the way it's organized and its user interface,” added Josh.
He said it was clear in the first days of distance learning that using Teams would be far better than relying on packets of schoolwork and the use of email or other tools. “The fact that a student could have an assignment issued to them, could use the accessibility tools, complete the assignment, and then return the assignment all within Teams is what made it clear that this was going to be the right app for our students,” he said.
A student’s view
Will Lavine, a seventh-grade student at the school, says he appreciates the stepped-up emphasis on Teams and tech tools during remote learning and says those are helping meet his learning needs. “I don't have to write that much on paper. I can use technology, which I'm way faster at,” he said.
“Will has been using the ease of typing to his benefit,” added Will's tutor, Elisa Huntley. “Normally when he is faced with a handwritten assignment, he would spend quite a bit of time to refine his work using only a pencil and eraser. But when he interfaces with Microsoft Teams, Will doesn't feel the same pressure to do it right the first time. It's much easier for him to re-type something. His ideas are flowing in ways that I have never seen before.”
Will added that he misses in-person school, but likes the collaborative nature of Teams, particularly the ability to chat with teachers and friends.
With the technology sorted out, Josh said educators have been very focused on ensuring students are progressing as expected. He says that teachers are closely monitoring whether students are joining online classes, engaging in discussions, accessing and completing assignments, and communicating with their teachers.
Connect, explore our tools
We love hearing from our educator community and students and families. If you’re using accessibility tools to create more inclusive learning environments and help all learners thrive, we want to hear from you! One great way to stay in touch is through Twitter by tagging @MicrosoftEDU.
And if you want to check out some of the resources Hamlin Robinson uses, remember that students and educators at eligible institutions can sign up for Office 365 Education for free, including Word, Excel, PowerPoint, OneNote, and Microsoft Teams.
In honor of Global Accessibility Awareness Day, Microsoft is sharing some exciting updates from across the company. To learn more visit the links below:
As more of the workforce connects from home, there has been a spike in usage of remote productivity services. Many organizations are giving Microsoft Office 365 subscriptions to all of their staff and making greater use of collaboration tools such as Outlook, OneDrive, SharePoint, and Teams.
Unfortunately, this is creating new security vulnerabilities, with more untrained workers being targeted by malware or ransomware through attachments, links, or phishing attacks.
This article will provide you with an overview of how Microsoft Office 365 Advanced Threat Protection (ATP) can help protect your organization, along with links to help you enable each service.
Microsoft Office 365 now comes with the Advanced Threat Protection service which secures emails, attachments, and files by scanning them for threats. This cloud service uses the latest in machine learning from the millions of mailboxes it protects to proactively detect and resolve common attacks. This technology has also been extended beyond just email to protect many other components of the Microsoft Office suite. In addition to ATP leveraging Microsoft’s global knowledge base, your organization can use ATP to create your own policies, investigate unusual activity, simulate threats, automate responses, and view reports.
Microsoft Office 365 ATP helps your users determine if a link is safe when using Outlook, Teams, OneNote, Word, Excel, PowerPoint and Visio. Malicious or misleading links are a common method for hackers to direct unsuspecting users to a site that can steal their information. These emails are often disguised to look like they are coming from a manager or the IT staff within the company. ATP will automatically scan links in emails and cross-reference them to a public or customized list of dangerous URLs. If a user tries to click on the malicious link, it will give them a warning so that they understand the risk if they continue to visit the website.
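The core of such a check — comparing a link's host against a list of dangerous domains — can be sketched in a few lines. The blocklist entries below are hypothetical, and the real service does far more (reputation feeds, time-of-click verification):

```python
from urllib.parse import urlparse

# Hypothetical blocklist; in practice this would be a large,
# continuously updated reputation database.
BLOCKLIST = {"evil.example", "phish.example"}

def link_verdict(url, blocklist=BLOCKLIST):
    """Warn on a URL whose host is a blocked domain or any
    subdomain of one; allow everything else."""
    host = (urlparse(url).hostname or "").lower()
    if any(host == bad or host.endswith("." + bad) for bad in blocklist):
        return "warn"
    return "allow"
```

A "warn" verdict corresponds to the interstitial page ATP shows, which lets the user understand the risk before continuing to the site.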
One of the most common ways in which your users will get attacked is by opening an attachment that is infected with malware. When the file is opened, it could execute a script that steals passwords or locks up the computer unless a ransom is paid, in what is commonly known as a ransomware attack. ATP will automatically scan all attachments to determine if any known virus is detected. You and your users will be notified about anything suspicious to help you avoid any type of infection.
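One building block of that kind of scanning is a signature lookup: hash the attachment and compare it against a database of known-bad hashes. A minimal sketch (real scanners layer heuristics and detonation sandboxes on top of signatures):

```python
import hashlib

def attachment_verdict(data: bytes, known_bad_hashes: set) -> str:
    """Signature-style check: quarantine an attachment whose SHA-256
    digest appears in a database of known malware hashes."""
    digest = hashlib.sha256(data).hexdigest()
    return "quarantine" if digest in known_bad_hashes else "deliver"
```

The weakness of a pure signature check — any single-byte change to the payload produces a new hash — is exactly why services like ATP also detonate unknown attachments in a sandbox before delivery.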
When ATP anti-phishing is enabled, all incoming messages will be analyzed for possible phishing attacks. Microsoft Office 365 uses cloud-based AI to look for unusual or suspicious message elements, such as mismatched descriptions, links, or domains. Whenever an alert is triggered, the user is immediately warned, and the alert is logged so that it can be reviewed by an admin.
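One such "mismatched" element — visible link text that names a different domain than the actual href — can be sketched with a toy regex-based check. This is an illustration of the heuristic, not how ATP itself parses mail:

```python
import re
from urllib.parse import urlparse

def mismatched_link(html_fragment: str) -> bool:
    """Flag an <a> tag whose visible text looks like a URL or domain
    that differs from the href's real host -- a classic phishing tell."""
    m = re.search(r'<a\s+href="([^"]+)"[^>]*>([^<]+)</a>', html_fragment)
    if not m:
        return False
    href_host = (urlparse(m.group(1)).hostname or "").lower()
    text = m.group(2).strip()
    # Only compare when the visible text itself looks like a URL/domain.
    tm = re.match(r'(?:https?://)?([\w.-]+\.[a-zA-Z]{2,})', text)
    if not tm:
        return False
    return tm.group(1).lower() != href_host
```

So a link rendered as `https://bank.example` but actually pointing at `evil.example` would be flagged, while ordinary “Click here” text would not trigger this particular check.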
Approved users will have access to the ATP dashboard along with reports about recent threats. These reports contain detailed information about malware, phishing attacks, and submissions. A Malware Status Report will allow you to see malware detected by type, method, and the status of each message with a threat. The URL Protection Status Report will display the number of threats discovered for each hyperlink or application and the resulting action taken by the user. The ATP Message Disposition report shows the actions taken on messages containing different types of malicious file attachments. The Email Security Reports include details about the top senders, recipients, spoofed mail, and spam detection.
Another important component of ATP is the Threat Explorer which allows admins or authorized users to get real-time information about active threats in the environment through a GUI console. It allows you to preview an email header and download an email body, and for privacy reasons, this is only permitted if permission is granted through role-based access control (RBAC). You can then trace any copies of this email throughout your environment to see whether it has been routed, delivered, blocked, replaced, failed, dropped, or junked. You can even view a timeline of the email to see how it has been accessed over time by recipients in your organization. Some users can even report suspicious emails and you can use this dashboard to view these messages.
Microsoft Office 365 leverages its broad network of endpoints to identify and report on global attacks. Administrators can add any Threat Tracker widgets which they want to follow to their dashboard through the ATP interface. This allows you to track major threats attacking your region, industry, or service type.
Another great security feature from Microsoft Office 365 ATP is the ability to automatically investigate well-known threats. Once a threat is detected, the Automated Incident Response (AIR) feature will try to categorize it and start remediating the issue based on the industry-standard best practices. This could include providing recommendations, quarantining, or deleting the infected file or message.
One challenge that many organizations experience when developing a protection policy is their inability to test how their users would actually respond to an attempted attack. The ATP Attack Simulator is a utility that authorized administrators can use to create artificial phishing and password attacks. These fake email campaigns try to identify and then educate vulnerable users by convincing them to perform an action that could expose them to a hacker. This utility can run a Spear Phishing Campaign, Brute Force Attack, and a Password Spray Attack.
This diverse suite of tools, widgets, and simulators can help admins protect their remote workforce from the latest attacks. Microsoft has applied its artificial intelligence capabilities to learn how millions of mailboxes share information, and uses this to harden the security of its entire platform.
If you want to learn more about Microsoft Office 365 ATP and Microsoft Office 365 in general, attend the upcoming Altaro webinar on May 27. I will be presenting it along with Microsoft MVP Andy Syrewicze, so it's your chance to ask me any questions you might have about ATP or other Microsoft Office 365 security features live! It's a must-attend for all admins — save your seat now.
SAP customers already appeared more receptive to cloud-based software at the start of 2020, but the COVID-19 pandemic may spur momentum for SAP S/4HANA Cloud 2005, the latest release of the SaaS version of the ERP platform.
SAP reported increases in cloud-based revenue for the first quarter of 2020, and, although this was not broken out into specific product groups, SAP is seeing a shift in demand for the cloud, said Jan Gilg, president of SAP S/4HANA.
“Customers are coming in to ask how quickly they can be up and running, or maybe how quickly they can set up a subsidiary or specific business units,” Gilg said. “So we're seeing a lot of uptake and a lot more customers looking into the cloud model now than before.”
The cloud momentum is expected to continue even after the pandemic has passed, he said, as companies hit hard by the disruption will evaluate their IT capabilities and the status of ERP modernization and digital transformation projects.
One of the advantages of cloud-based software is that new functions can be introduced in each new version, Gilg said. SAP S/4HANA 2005 includes updates that could be valuable for companies dealing with the rapidly changing business environment brought on by COVID-19.
Supply chain, finance and integration with SAP SuccessFactors, an HCM platform, are the most prominent updates, he said.
Enabling a more flexible supply chain
Supply chain Situation Handling functionality now allows companies to monitor inventory more accurately. In the last few years, supply chains have been stretched around the globe and have focused on just-in-time delivery, keeping only as much stock in inventory as needed. The strategy has been exposed as a weakness by the pandemic, as companies have grappled with an abrupt disruption to production schedules.
This is leading companies to reassess supply chains by moving to more local suppliers and keeping more inventory in stock, Gilg said.
“S/4HANA Cloud 2005 puts more emphasis on inventory management and stock levels and gives companies the support to help them with intelligence that proactively alerts companies when inventory levels go down, go too low or run out,” he said. “In the current situation, it’s really critical to make sure that there’s enough flow of goods to the respective consumers; it’s about being flexible.”
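The alerting behavior Gilg describes — proactively warning when stock goes low or runs out — reduces to a per-SKU threshold check. A minimal sketch, with hypothetical field names rather than anything from the S/4HANA data model:

```python
def inventory_alerts(stock, thresholds):
    """Return an alert per SKU: 'out_of_stock' when on-hand inventory
    hits zero, 'low_stock' when it falls to or below the SKU's
    reorder threshold.  SKUs above threshold raise no alert."""
    alerts = {}
    for sku, on_hand in stock.items():
        if on_hand == 0:
            alerts[sku] = "out_of_stock"
        elif on_hand <= thresholds.get(sku, 0):
            alerts[sku] = "low_stock"
    return alerts
```

Situation Handling wraps this kind of rule in workflow: the alert is routed to the responsible buyer rather than sitting in a report waiting to be noticed.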
Flexibility is also the key to new financial functions, which allow companies to monitor and approve payments from SAP and non-SAP systems. This will help companies keep a closer eye on cash flow, which will be important as business interruption makes cash flow an issue, Gilg said.
The other significant S/4HANA Cloud update is a more seamless integration between S/4HANA and SAP SuccessFactors Employee Central, which standardizes the data model and cost centers for ERP and HCM systems.
“The ambition here is that this should really look and feel like one solution, and ideally customers should not even notice that there’s two solutions behind the scenes,” Gilg said. “The transition is seamless from a UI perspective, from process data integration, and also from some of the technical attributes like the provisioning.”
Giving the customers what they want
Although there’s no hard evidence of an increase in demand for SAP S/4HANA Cloud, it wouldn’t be a surprise given the overall increase in demand for cloud applications, said analyst Jon Reed, co-founder of Diginomica.com, an enterprise applications news and analysis site.
However, the most appropriate market for S/4HANA Cloud may not be able to invest, given the current environment.
“Keep in mind that S/4HANA Cloud's best vertical adoption, if we are talking the full cloud solution, not hosted S/4HANA, is in professional services, which, for the most part, is not a vertical that is thriving at the moment,” Reed said. “Modern ERP cloud is going to have to be very vertical in its appeal, a topic SAP has understood for some time but has not moved nearly fast enough on.”
S/4HANA Cloud 2005’s updates should be welcomed by customers, Reed said.
“These are the types of features customers have been asking for,” he said. “In particular, the SuccessFactors integration should help S/4HANA Cloud have some response to Workday’s complete finance and HR integrations, although SAP has a long way to go there.”
S/4HANA Cloud 2005 looks impressive, with the SuccessFactors Employee Central integration and more end-to-end industry focus, said Predrag “PJ” Jakovljevic, principal industry analyst at Technology Evaluation Centers, an enterprise computing analysis firm in Montreal.
The current COVID-19 environment may spur more cloud demand, Jakovljevic said.
“Both S/4HANA Cloud 2005 and cloud ERP, SCM [supply chain management] and CRM, in general, should benefit from COVID-19, since many customer success stories nowadays talk about using cloud and mobile digital collaborative tools,” he said. “On-premises will still not necessarily fully die, however, because some places still have regulatory requirements and poor internet connectivity, and on-premises solutions can now come with remote access.”
Companies collaborate to make video analytics solutions more accessible in order to drive better business outcomes
TOKYO — May 19, 2020 — Sony Semiconductor Solutions (Sony) and Microsoft Corp. (Microsoft) today announced they are partnering to create solutions that make AI-powered smart cameras and video analytics easier to access and deploy for their mutual customers.
As a result of the partnership, the companies will embed Microsoft Azure AI capabilities on Sony’s intelligent vision sensor IMX500, which extracts useful information out of images in smart cameras and other devices. Sony will also create a smart camera managed app powered by Azure IoT and Cognitive Services that complements the IMX500 sensor and expands the range and capability of video analytics opportunities for enterprise customers. The combination of these two solutions will bring together Sony’s cutting-edge imaging & sensing technologies, including the unique functionality of high-speed edge AI processing, with Microsoft’s cloud expertise and AI platform to uncover new video analytics opportunities for customers and partners across a variety of industries.
“By linking Sony’s innovative imaging and sensing technology with Microsoft’s excellent cloud AI services, we will deliver a powerful and convenient platform to the smart camera market. Through this platform, we hope to support the creativity of our partners and contribute to overcoming challenges in various industries,” said Terushi Shimizu, Representative Director and President, Sony Semiconductor Solutions Corporation.
“Video analytics and smart cameras can drive better business insights and outcomes across a wide range of scenarios for businesses,” said Takeshi Numoto, corporate vice president and commercial chief marketing officer at Microsoft. “Through this partnership, we’re combining Microsoft’s expertise in providing trusted, enterprise-grade AI and analytics solutions with Sony’s established leadership in the imaging sensors market to help uncover new opportunities for our mutual customers and partners.”
Video analytics has emerged as a way for enterprise customers across industries to uncover new revenue opportunities, streamline operations and solve challenges. For example, retailers can use smart cameras to detect when to refill products on a shelf or to better understand the optimal number of available open checkout counters according to the queue length. Additionally, a manufacturer might use a smart camera to identify hazards on its manufacturing floor in real time before injuries occur. Traditionally, however, such applications — which rely on gathering data distributed among many smart cameras across different sites like stores, warehouses and distribution centers — struggle to optimize the allocation of compute resources, resulting in cost or power consumption increases.
To address these challenges, Sony and Microsoft will partner to simplify access to computer vision solutions by embedding Azure AI technology from Microsoft into Sony’s intelligent vision sensor IMX500 as well as enabling partners to embed their own AI models. This integration will result in smarter, more advanced cameras for use in enterprise scenarios as well as a more efficient allocation of resources between the edge and the cloud to drive cost and power consumption efficiencies.
Sony’s smart camera managed app powered by Azure is targeted toward independent software vendors (ISVs) specializing in computer vision and video analytics solutions, as well as smart camera original equipment manufacturers (OEMs) aspiring to add value to their hardware offerings. The app will complement the IMX500 sensor and will serve as the foundation on which ISVs and OEMs can train AI models to create their own customer- and industry-specific video analytics and computer vision solutions that address enterprise customer demands. The app will simplify key workflows and take reasonable security measures designed to protect data privacy and security, allowing ISVs to spend less time on routine, low-value integration and provisioning work and more time on creating unique solutions to meet customers’ demands. It will also enable enterprise customers to more easily find, train and deploy AI models for video analytics scenarios.
As part of the partnership, Microsoft and Sony will also work together to facilitate hands-on co-innovation with partners and enterprise customers in the areas of computer vision and video analytics as part of Microsoft’s AI & IoT Insider Labs program. Microsoft’s AI & IoT Insider Labs offer access and facilities to help build, develop, prototype and test customer solutions, working in partnership with Microsoft experts and other solution providers like Sony. The companies will begin working with select customers within these co-innovation centers later this year.
About Microsoft
Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.
About Sony Semiconductor Solutions
Sony Semiconductor Solutions Corporation is the global leader in image sensors. We strive to provide advanced imaging technologies that bring greater convenience and joy to people’s lives. In addition, we also work to develop and bring to market new kinds of sensing technologies with the aim of offering various solutions that will take the visual and recognition capabilities of both human and machines to greater heights. For more information, please visit: https://www.sony-semicon.co.jp/e/
For more information, press only:
Microsoft Media Relations, WE Communications for Microsoft, (425) 638-7777, [email protected]
Cisco has promised to bring more advanced video conferencing features to Webex Teams eventually. But for now, users must rely on the vendor’s Webex Meetings product for full-featured video calling.
Cisco has been working for years to bring the two apps closer together. But despite relying on the same cloud infrastructure, Teams still lags behind its collaboration cousin.
Webex Teams lacks polls, in-meeting chat, screen-sharing with remote desktop control, 5×5 video displays and key host settings like the ability to automatically mute attendees upon entry.
What’s more, Webex Teams users cannot access essential video conferencing features without a license for Webex Meetings. Those capabilities include meeting recording, guest access and dial-in numbers.
Despite marketing Webex Teams as an all-in-one collaboration app, Cisco generally sells the product in a bundle with Webex Meetings.
“We are actively working to bring all the advanced video conferencing capabilities of Webex Meetings to Webex Teams,” Cisco said in an emailed statement.
Later this month, Cisco plans to address one significant shortcoming in Webex Teams by expanding the product’s video display. The app will soon support a 3×3 video grid. But it will still show fewer video panels than Webex Meetings, which has a 5×5 array.
Demand for large group video meetings has soared amid the coronavirus pandemic. People want to be able to see everyone on screen at the same time. Some customers have chosen a video platform based solely on this issue. Cisco did not say when it would enable a 5×5 grid view in Webex Teams.
Another feature missing from Webex Teams is a “health checker” button, like the one in Webex Meetings for troubleshooting connectivity issues. Furthermore, the video interfaces of Webex Teams and Webex Meetings are not identical, which could confuse users who host meetings in both.
Cisco launched Webex Teams as Cisco Spark in 2015. The app initially relied on a separate cloud engine from Webex Meetings. The company later rebranded the product as part of a broader strategy to streamline its portfolio of communications apps.
Unlike competitors Microsoft and Slack, Cisco has not disclosed how many people use its team collaboration app. However, the company said 324 million people attended a Webex meeting in March.
“Obviously, it’s been a work in progress from the Webex Teams side for a couple of years now,” said Josh Warcop, senior solutions engineer at Byteworks, a Cisco reseller. “We’re probably going to see a lot more feature parity here just this year.”
On the flip side, Cisco said it was also working to bring at least two Webex Teams video features to Webex Meetings. One is the ability for anyone to start a meeting, not just the host. The other is the integration of Meetings with video mesh nodes, which let businesses keep some video traffic on premises.
Enterprises can get a look at Dell EMC’s next-generation midrange storage, more than a year later than the array’s planned debut.
The Dell EMC PowerStore system that launched today marks the vendor’s first internally developed storage product since Dell bought EMC in 2015. Integration of Dell EMC-owned VMware is a key element, with an onboard ESXi hypervisor and capability to run applications on certain array models.
The base PowerStore is a 2U two-node enclosure for active-active failover and high availability. The chassis takes 25 NVMe SSDs, with support for Intel Optane persistent memory chips. Three 25-drive SAS expansion shelves can be added per chassis. Support for an NVMe-oF (NVMe over Fabrics) architecture is on Dell EMC’s roadmap.
The PowerStore midrange storage has been a strategic priority for several years. More than 1,000 engineers across Dell EMC storage and the wider Dell Technologies organization worked on the system, said Caitlin Gordon, senior vice president of Dell EMC storage marketing.
“Data has never been more diverse or more valuable, but customers have had to choose between prioritizing service levels for performance and simplifying their operations. We know not every application can be virtualized, and we engineered PowerStore so you can consolidate all workloads on a single platform,” Gordon said.
What’s next for Dell EMC midrange?
Dell EMC first scheduled the new midrange system to launch in 2019, but a series of delays pushed it to this year. The all-flash PowerStore adds to Dell EMC’s overlapping midrange lineup, although the vendor said the new system will help streamline the portfolio. Dell EMC is the market leader in storage, with midrange platforms that include the flagship Unity all-flash and hybrid arrays that EMC brought to market. Other midrange systems include the SC Series and PS Series, which date to Dell’s acquisitions of Compellent and EqualLogic years ago. The Compellent arrays, renamed SC Series, are still sold and supported by Dell; the EqualLogic arrays, renamed PS Series, are maintained but no longer sold. Dell EMC executives said the older systems will be phased out gradually with PowerStore’s arrival.
The PowerStoreOS operating system incorporates a Kubernetes framework to serve storage management from containers and includes a machine learning engine to automate rebalancing and other administrative tasks. Based on internal testing, Dell EMC claims PowerStore delivers seven times the performance and one-third the latency of the Unity XT array.
The ground-up PowerStore design eventually will emerge as the dominant Dell EMC midrange storage, said Scott Sinclair, a storage analyst with Enterprise Strategy Group.
“This is a completely new architecture that’s based on a container framework. It’s designed to address a bunch of different workload needs on one array. That’s not the type of hard work you put into a product just to add another midrange storage array,” Sinclair said.
A software capability called AppsOn allows data-intensive applications to access storage on PowerStore and use VMware vMotion to migrate it between core and cloud environments.
“The idea is that you can be within a VMware environment — let’s say VMware Cloud Foundation, or vSphere — and have different ways to move applications to various targets. AppsOn is a novel approach that gives you more flexibility to deploy apps, based on your resource needs,” Sinclair said.
Beta customer tried to ‘blow up’ PowerStore
Dell EMC guarantees data reduction of 4-to-1 with always-on inline deduplication. Dell claims the inline data reduction does not degrade performance. Based on the ratio, a single Dell EMC PowerStore with three expansion enclosures is rated to provide 2.8 PB of usable storage per appliance. Effective capacity scales to 11.3 PB in a maximum eight-node cluster.
Five capacity models are available: PowerStore 1000 (384 TB), PowerStore 3000 (768 TB), PowerStore 5000 (1,152 TB), PowerStore 7000 (1,536 TB) and PowerStore 9000 (2,560 TB). PowerStore X models come with the VMware hypervisor and AppsOn; the PowerStore T configuration does not include those features.
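The capacity figures above imply some simple back-of-envelope math: effective capacity is raw capacity multiplied by the guaranteed 4-to-1 reduction ratio, and cluster capacity is the per-appliance figure multiplied by the number of two-node appliances. A minimal sketch follows; the ~0.7 PB raw-per-appliance input is inferred from the article’s numbers, not an official Dell EMC specification:

```python
# Back-of-envelope sizing math implied by the article's figures.
# The raw-capacity input is an inferred, hypothetical value; this is
# not an official Dell EMC sizing tool.

DATA_REDUCTION_RATIO = 4.0   # guaranteed 4-to-1 inline data reduction
NODES_PER_APPLIANCE = 2      # each PowerStore appliance is a two-node enclosure

def effective_pb(raw_pb: float, ratio: float = DATA_REDUCTION_RATIO) -> float:
    """Effective (post-reduction) capacity for one appliance, in PB."""
    return raw_pb * ratio

def cluster_pb(per_appliance_pb: float, nodes: int) -> float:
    """Effective capacity of a cluster, counting whole two-node appliances."""
    return per_appliance_pb * (nodes // NODES_PER_APPLIANCE)

# ~0.7 PB raw per appliance (inferred) * 4 = 2.8 PB effective, and an
# eight-node (four-appliance) cluster reaches ~11.2 PB, in line with
# the ~11.3 PB figure the article quotes.
print(effective_pb(0.7))   # -> 2.8
print(cluster_pb(2.8, 8))  # -> 11.2
```

The small gap between the computed 11.2 PB and the quoted 11.3 PB suggests Dell’s published figure is rounded or based on a slightly different per-appliance raw capacity.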
“I actually like the PowerStore X a lot more than I ever thought I would,” said Alan Hunt, the director of network operations for Detroit-based law firm Dickinson Wright. Hunt is running a PowerStore X and PowerStore T in beta to simulate live production. He said PowerStore will help Dickinson Wright to incorporate new storage with existing SC Series and retire PS Series arrays.
“We did a lot of testing and migrating of live workloads with the AppsOn feature, and that was excellent. We’re running simulated workloads and don’t have anything in production [on PowerStore], but I want to jump on it immediately. I take systems and try to blow them up, and this was definitely the most stable beta test I’ve ever done,” Hunt said.
Dell EMC initially said it would converge features of its multiple midrange arrays in 2019. The product launch was slated for Dell Tech World in Las Vegas in May, but that event was cancelled due to the coronavirus. Dell said it will have a virtual show later this year but has not specified dates.
Gordon said PowerStore systems started shipping in April.