Google cloud network tools check links, firewalls, packet loss

Google has introduced several network monitoring tools to help companies pinpoint problems that could impact applications running on the Google Cloud Platform.

This week, Google launched the first four modules of an online console called the Network Intelligence Center. The components for monitoring a Google cloud network include a network topology map, connectivity tests, a performance dashboard, and firewall metrics and insights. The first two are in beta, and the rest are in alpha, which means they are still in the early stages of development.

Here’s a brief overview of each module, based on a Google blog post:

— Google is providing Google Cloud Platform (GCP) subscribers with a graphical view of their network topology. The visualization shows how traffic is flowing between private data centers, load balancers, and applications running on computing environments within GCP. Companies can drill down on each element of the topology map to verify policies or identify and troubleshoot problems. They can also review changes in the network over the last six weeks.

— The testing module lets companies diagnose problems with network connections within GCP or from GCP to an IP address in a private data center or another cloud provider. Along with checking links, companies can test the impact of network configuration changes to reduce the chance of an outage.

— The performance dashboard provides a current view of packet loss and latency between applications running on virtual machines. Google said the tool would help IT teams quickly determine whether a packet problem lies in the network or in an app.

— The firewall metrics component offers a view of the rules that govern the firewalls. The module is designed to help companies optimize the use of firewalls in a Google cloud network.

Getting access to the performance dashboard and firewall metrics requires a GCP subscriber to sign up as an alpha customer. Google will incorporate the tools into the Network Intelligence Center once they reach the beta level.

Slack services partners to help vendor target enterprises

Slack is partnering with IT services and consulting firms to help midsize or larger businesses adopt and use its team collaboration app.

It is Slack’s first significant step toward developing a partner channel that would help it compete with Microsoft and Cisco for large enterprises. Those vendors rely on an ecosystem of resellers and IT integrators to support businesses on a global scale.

But Slack’s initial partners are small and midsize organizations. The vendor has yet to recruit the world’s leading IT integrators. Until it does so, Slack will still be at a significant disadvantage against those larger rivals as it attempts to sell to businesses with tens of thousands of employees.

The move comes as financial analysts sour on Slack, worrying that the vendor will be unable to compete with Microsoft Teams in the enterprise market over the long term. Slack’s valuation has dropped from $19 billion to less than $12 billion amid a steady decline in its stock price over the past several months.

The Slack services partners will help businesses with more than 250 employees build integrations, train employees and figure out where Slack fits into their move to the cloud. Slack is launching the program with seven partners across the United States, the United Kingdom and Japan.

Slack could launch a reseller program in the future, said Rich Hasslacher, Slack’s head of global alliances and channels. But, for now, the company will pay its services partners a finder’s fee worth 8% of the first-year contract of any customer they refer to Slack.

The services partners are Robot & Pencils, Adaptavist, Abeam Consulting, Ricksoft, Rainmaker, Onix and Cprime. Slack plans to add additional partners to the program around February or March of 2020, targeting markets in continental Europe, Australia and Latin America.

Developing the right ecosystem of partners will be essential to Slack’s long-term viability, said Zeus Kerravala, principal analyst at ZK Research. Slack is more than just a messaging app. Yet, many businesses don’t understand how to take full advantage of the platform, he said.

“When you look at long-term viability, that’s always been around platforms, not products,” Kerravala said. “I think if Slack wants to go down that route, [the services partner program] is part of what they need to do.”

Developing a channel should also help Slack sell to IT departments, rather than to isolated business units and groups of end users. Slack has 12 million daily active users, but only 6 million paid seats. Winning more company-wide deployments would help Slack boost its paid user count.

New research reveals a surprising link between the workplace and business success

To help businesses stay a step ahead in the digital age, Microsoft has released new research in partnership with Dr. Michael Parke of the London Business School. Surveying 9,000 workers and business leaders across 15 European markets, the research delved into company growth, employee engagement, leadership styles and technology.

According to the findings, change is the new normal as businesses race to adapt and better compete: 92% of European leaders say their organization has recently undergone a major transformation.

And, the number-one transformation challenge in leaders’ minds is company culture.

Our customers and partners across Europe tell us that keeping up with the pace of digital transformation and innovation is among their chief concerns. But based on our own internal cultural transformation at Microsoft over the past few years, I always encourage business leaders to give as much consideration to company culture as they do to deploying new technology. After all, it’s not just about having the best technology; it’s about how you and your teams react to, and adapt to, change. – Vahé Torossian, Corporate Vice President, Microsoft, and President, Microsoft Western Europe.

The study revealed that getting the workplace culture component right can benefit businesses in a significant way.

Companies that were assessed as having ‘innovative cultures’ – generally defined as cultures where new ideas are embraced and supported – were twice as likely to expect double-digit growth. These businesses also seem positioned to win the war for talent: the majority of workers within these organizations (86%) plan to stay in their jobs, as opposed to 57% of those employees working in less innovative cultures.

There are three key attributes that set these innovative companies apart:

I. Tearing down silos and building bridges

Companies with the most innovative cultures have leaders who are not only tearing down silos, they’re replacing them with partnerships and transparency. These leaders are more likely to see effective collaboration as vital for business growth – whether it’s within teams, across teams, or with customers and partners.

Among leaders of highly innovative cultures:

  • 86 percent said collaboration within their teams is very important for future business growth, compared to 70 percent in less innovative cultures.
  • 86 percent said internal collaboration across teams is very important to growth, compared to 72 percent of leaders in less innovative businesses.
  • 79 percent said collaborating externally with their partners is vital for growing their business, compared to just 54 percent of their counterparts in lower-innovation companies.

II. Empowering teams and creating a learning culture

The research shows that in the most innovative companies, leaders are focused on mobilizing their teams and empowering them.

In the most innovative companies, 73 percent of workers say their teams can choose how they approach the work – with only 45 percent of workers in low-innovation workplaces feeling that way. Further, approximately twice as many people in high-innovation workplaces feel empowered to make decisions without a manager’s approval, compared to employees in low-innovation companies.

Finally, nearly three in four employees say their leaders create a culture where it’s OK to make mistakes, compared to just half of the employees in lower-innovation companies.

Profound growth requires innovation, and to foster innovation you need people to feel trusted and supported to experiment and learn. There can be real returns for leaders who learn to let go and coach teams to constantly improve. – Dr. Parke.

III. Protect attention and promote flow

Workers report feeling like they waste 52% of their time each week due to things like unproductive meetings and emails, unnecessary interruptions, and time taken to track down information.

The study suggests that a combination of having the right physical environment, tech tools and a manager who supports diverse ways of working can cut this sub-optimal time in half.

However, the data from the study highlights a greater opportunity than simply helping people be more productive: there is also a significant opportunity to bolster employee engagement. When people are able to devote all of their attention and energy to a particular task, they are able to work in a flow state – sometimes known as being ‘in the zone.’ Employees who can work in this way – at least some of the time – were three times more likely to say they were happy in their jobs.

A working culture that values empowerment and autonomy appears to have an advantage in terms of people being able to work in a flow state: 72 percent of employees who report that they are able to work in a flow state say their teams can choose how they approach work. In workplaces with low levels of flow, only half of workers feel similarly.

In short: the business leaders who will succeed tomorrow are not thinking about how they can make their workforce more productive – they are focused on helping their people be more innovative.

Any business leader knows that innovation is the key to growth or survival. The challenge, however, is how to establish a culture that consistently innovates, again and again, to avoid getting left behind. – Dr. Parke.

New small business ERP delivers SAP Business One alternative

Priority Software has launched Priority Zoom, a range of cloud ERP products to help small businesses streamline operational tasks such as financials, inventory, sales and customer relationships.

Priority Zoom has built-in business intelligence analytics, advanced reports and dashboards to track and manage each phase of the sales cycle. Users can create sales orders, manage invoices and billing, synchronize purchasing processes, and automatically generate ledger, transaction, financial and cash flow reports, customer lists, product and services catalogs and pricing.

Designed for smaller businesses, Priority Zoom is $50 per user for up to five users. That’s slightly lower than a limited user subscription to SAP’s Business One ERP software for small and midsize companies, which lists for $54 per user, per month. The Business One Professional user license lists at $94 per user, per month.

According to Priority Software, customers who migrate to the platform can convert their data from Priority Software’s on-premises accounting program AccountEdge, or from other accounting software such as QuickBooks.

Priority Zoom stands in contrast to Priority ERP, Priority Software’s full ERP offering for tens of thousands of users, which includes capabilities for finance, manufacturing, logistics, human resources, time and attendance, BI, project management, CRM and warehouse management. It also offers open APIs, a mobile application generator, a web software development kit and machine learning business process management.

There are many advantages to adopting an ERP platform, according to Cindy Jutras, the president of Mint Jutras, an advisory firm that specializes in enterprise applications. For small businesses specifically, Jutras highlights the advantage of having “a single source of data and truth.”

“Most companies will simply point to the visibility, transparency and efficiency gained,” she said. “If it just gets them out of spreadsheet hell — passing data back and forth, the risk of error, the lack of auditability — they view it as worth it.”

Jutras said that as ERP platforms continue to change in scale and price, it has become easier for smaller businesses to take on the migration.

“Every single company would benefit, but very often SMBs feel they can’t afford it, and therefore they try to make do with something less than ERP — perhaps desktop applications combined with the ubiquitous spreadsheets, or perhaps just spreadsheets,” she said.

That approach made sense in the past when early ERP systems were rigid and inflexible, limited in functionality, hard to install and implement and even harder to use, but that’s changed, she said.

“Solutions now are far more flexible and technology-enabled, provide many more features and functions, are easier to install, easier to implement and easier to use,” Jutras said. “And SaaS-based [ERP apps] allow even the smallest companies to invest without an upfront capital expenditure.”

Implement automated employee onboarding with PowerShell

One of the most common tasks that help desk technicians and system administrators handle is provisioning all the resources needed to onboard new employees.

Depending on the organization, these tasks may include creating Active Directory user accounts, creating home folders, provisioning new Office 365 mailboxes and setting up a VoIP extension in the phone system. With a little PowerShell coding, you can put together an automated employee onboarding script that does a majority of this work in little to no time.

To automate this process, it’s essential to define all the tasks. Many companies have a document that outlines the steps to onboard a new employee, such as the following:

  • create an Active Directory user account;
  • create a user folder on a file share; and
  • assign a mobile device.

When building a PowerShell script, start by researching whether the pieces required for automation are available. For example, does the system you’re using to assign a mobile device have an API? If not, then that step can’t be completely automated. Rather than bypass it, you can still add something to your script that sends an email to the appropriate person to complete the setup.

For other tasks that PowerShell can automate, you can start by scaffolding out some code in your script editor.

Build the framework for the script

Start by adding some essential functions to encapsulate each task. The following code stubs out placeholder functions for each of the tasks we want to automate.

param(
[Parameter()]
[ValidateNotNullOrEmpty()]
[string]$CsvFilePath
)

function New-CompanyAdUser {
[CmdletBinding()]
param
(

)

}

function New-CompanyUserFolder {
[CmdletBinding()]
param
(

)

}

function Register-CompanyMobileDevice {
[CmdletBinding()]
param
(

)

}

function Read-Employee {
[CmdletBinding()]
param
(

)

}

This isn’t our final code. This is just a brainstorming exercise.

Add the code to receive input

Notice the param block at the top and the Read-Employee function. This function is where the script receives its input, whether that comes from a CSV file or a database. Because the input handling is wrapped in a function, it’s easy to modify the code if the input method ever changes.

For now, we are using a CSV file, so the Read-Employee function below imports it. By default, this function takes the CSV file path that is passed to the script when it runs.

function Read-Employee {
[CmdletBinding()]
param
(
[Parameter()]
[ValidateNotNullOrEmpty()]
[string]$CsvFilePath = $CsvFilePath
)

Import-Csv -Path $CsvFilePath

}

Add a Read-Employee reference below this function.

We have a CSV file from human resources that looks like this:

FirstName,LastName,Department
Adam,Bertram,Accounting
Joe,Jones,HR

We’ll name the script New-Employee.ps1 and call it with the CsvFilePath parameter.

./New-Employee.ps1 -CsvFilePath './Employees.csv'
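Because all of the input handling lives in Read-Employee, swapping the data source later only means rewriting that one function. As a rough sketch of that idea (assuming a hypothetical HR SQL Server instance named HRSQL and an Employees table, neither of which comes from this article), a database-backed Read-Employee might look like this:

function Read-Employee {
    [CmdletBinding()]
    param
    (
        [Parameter()]
        [ValidateNotNullOrEmpty()]
        [string]$SqlServer = 'HRSQL',  ## hypothetical HR database server

        [Parameter()]
        [ValidateNotNullOrEmpty()]
        [string]$Database = 'HR'       ## hypothetical database name
    )

    ## Query the assumed Employees table and emit objects with the same
    ## FirstName, LastName and Department properties the CSV version returns,
    ## so the rest of the script does not need to change.
    $connection = New-Object System.Data.SqlClient.SqlConnection
    $connection.ConnectionString = "Server=$SqlServer;Database=$Database;Integrated Security=True"
    $connection.Open()

    $command = $connection.CreateCommand()
    $command.CommandText = 'SELECT FirstName, LastName, Department FROM dbo.Employees'
    $reader = $command.ExecuteReader()

    while ($reader.Read()) {
        [pscustomobject]@{
            FirstName  = $reader['FirstName']
            LastName   = $reader['LastName']
            Department = $reader['Department']
        }
    }

    $reader.Close()
    $connection.Close()
}

Because the output objects expose the same property names as the CSV rows, the functions that consume them would not need to change.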

Developing the functions

Next, fill in the other functions. This is just an example, but it should give you a good idea of how to build the code around the specifics of your own automated employee onboarding script. For more detail on the creation of the New-CompanyAdUser function, see this blog post.

param(
[Parameter()]
[ValidateNotNullOrEmpty()]
[string]$CsvFilePath
)

function New-CompanyAdUser {
[CmdletBinding()]
param
(
[Parameter(Mandatory)]
[ValidateNotNullOrEmpty()]
[pscustomobject]$EmployeeRecord
)

## Generate a random password (GeneratePassword lives in the System.Web assembly,
## which is not loaded by default in Windows PowerShell)
Add-Type -AssemblyName System.Web
$password = [System.Web.Security.Membership]::GeneratePassword((Get-Random -Minimum 20 -Maximum 32), 3)
$secPw = ConvertTo-SecureString -String $password -AsPlainText -Force

## Generate a first initial/last name username
$userName = "$($EmployeeRecord.FirstName.Substring(0,1))$($EmployeeRecord.LastName)"

## Create the user
$NewUserParameters = @{
GivenName = $EmployeeRecord.FirstName
Surname = $EmployeeRecord.LastName
Name = $userName
AccountPassword = $secPw
}
New-AdUser @NewUserParameters

## Add the user to the department group
Add-AdGroupMember -Identity $EmployeeRecord.Department -Members $userName
}

function New-CompanyUserFolder {
[CmdletBinding()]
param
(
[Parameter(Mandatory)]
[ValidateNotNullOrEmpty()]
[pscustomobject]$EmployeeRecord
)

$fileServer = 'FS1'

$null = New-Item -Path "\\$fileServer\Users\$($EmployeeRecord.FirstName)$($EmployeeRecord.LastName)" -ItemType Directory

}

function Register-CompanyMobileDevice {
[CmdletBinding()]
param
(
[Parameter(Mandatory)]
[ValidateNotNullOrEmpty()]
[pscustomobject]$EmployeeRecord
)

## Send an email for now. If we ever can automate this, we'll do it here.
$sendMailParams = @{
'From' = '[email protected]'
'To' = '[email protected]'
'Subject' = 'A new mobile device needs to be registered'
'Body' = "Employee: $($EmployeeRecord.FirstName) $($EmployeeRecord.LastName)"
'SMTPServer' = 'smtpserver.something.local'
'Port' = '587'
}

Send-MailMessage @sendMailParams

}

function Read-Employee {
[CmdletBinding()]
param
(
[Parameter()]
[ValidateNotNullOrEmpty()]
[string]$CsvFilePath = $CsvFilePath
)

Import-Csv -Path $CsvFilePath

}

Read-Employee

Calling the functions

Once you build the functions, pass each of the employee records returned from Read-Employee to each function, as shown below.

$functions = 'New-CompanyAdUser','New-CompanyUserFolder','Register-CompanyMobileDevice'
foreach ($employee in (Read-Employee)) {
foreach ($function in $functions) {
& $function -EmployeeRecord $employee
}
}

By standardizing the function parameters to a single EmployeeRecord parameter, which corresponds to a row in the CSV file, you can define the functions you want to call in an array and loop over each of them.
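If one step fails for one employee (say, the file server is unreachable), you may want the script to log the problem clearly and keep processing the remaining steps and employees. A slightly more defensive variation of the same loop, sketched here with only built-in cmdlets, wraps each call in try/catch and surfaces failures as warnings:

$functions = 'New-CompanyAdUser','New-CompanyUserFolder','Register-CompanyMobileDevice'
foreach ($employee in (Read-Employee)) {
    foreach ($function in $functions) {
        try {
            ## -ErrorAction Stop turns non-terminating errors inside the function
            ## into exceptions the catch block can handle
            & $function -EmployeeRecord $employee -ErrorAction Stop
        }
        catch {
            Write-Warning "[$function] failed for $($employee.FirstName) $($employee.LastName): $_"
        }
    }
}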

Click here to download the code used in this article on GitHub.

Swim DataFabric platform helps to understand edge streaming data

The new Swim DataFabric platform aims to help IT professionals categorize and make sense of large volumes of streaming data in real time.

The startup, based in San Jose, Calif., emerged from stealth in April 2018, with the promise of providing advanced machine learning and artificial intelligence capabilities to meet data processing and categorization challenges.

With the new Swim DataFabric, released Sept. 18, the vendor is looking to make it easier for more users to analyze data. The Swim DataFabric platform integrates with Microsoft Azure cloud services, including the IoT suite and Data Lake Storage, to classify and analyze data and make predictions in real time.

The Swim DataFabric platform helps users get the most out of their real-time data with any distributed application, including IoT and edge use cases, said Krishnan Subramanian, chief research advisor at Rishidot Research.

“Gone are those days where REST is a reasonable interface for real-time data because of latency and scalability issues,” Subramanian said. “This is where Swim’s WARP protocol makes more sense and I think it is going to change how the distributed applications are developed as well as the user experience for these applications.”

Why the Swim DataFabric is needed

A big IT challenge today is that users are getting streams of data from assets that are essentially boundless, said Simon Crosby, CTO at Swim. “A huge focus in the product is on really making it extraordinarily simple for customers to plug in their data streams and to build the model for them, taking all the pain out of understanding what’s in their data,” Crosby said.

Swim’s technology is being used by cities across the U.S. to help with road traffic management. The vendor has a partnership with Trafficware for a program that receives data from traffic sensors as part of a system that helps predict traffic flows.

The Swim DataFabric platform moves the vendor into a different space. The Swim DataFabric is focused on enabling customers that are Microsoft Azure cloud adopters to benefit from the Swim platform.

“It has an ability to translate any old data format from the edge into the CDM (Common Data Model) format which Microsoft uses for the ADLS (Azure Data Lake Storage) Gen2,” Crosby said. “So, a Microsoft user can now just click on the Swim DataFabric, which will figure out what is in the data, then labels the data and deposits it into ADLS.”

[Screenshot: Swim architecture]

With the labeled data in the data lake, Crosby explained that the user can then use whatever additional data analysis tool they want, such as Microsoft’s Power BI or Azure Databricks.

He noted that Swim also has a customer that has chosen to use Swim technology on Amazon Web Services, but he emphasized that the Swim DataFabric platform is mainly optimized for Azure, due to that platform’s strong tooling and lifecycle management capabilities.

Swim DataFabric digital twin

One of the key capabilities that the Swim DataFabric provides is what is known as a digital twin model. The basic idea is that a data model is created that is a twin or a duplicate of something that exists in the real world.

“What we want is independent, concurrent, parallel processing of things, each of which is a digital twin of a real-world data source,” Crosby explained.

The advantage of the digital twin approach is fast processing as well as the ability to correlate and understand the state of data. With the large volumes of data that can come from IoT and edge devices, Crosby emphasized that understanding the state of a device is increasingly valuable.

“Everything in Swim is about transforming data into streamed insights,” Crosby said.

What people are saying about the new book ‘Tools and Weapons’ | Microsoft On The Issues

“When your technology changes the world,” he writes, “you bear a responsibility to help address the world that you have helped create.” And governments, he writes, “need to move faster and start to catch up with the pace of technology.” 

In a lengthy interview, Mr. Smith talked about the lessons he had learned from Microsoft’s past battles and what he saw as the future of tech policymaking – arguing for closer cooperation between the tech sector and the government. It’s a theme echoed in the book, “Tools and Weapons: The Promise and the Peril of the Digital Age,” which he wrote with Carol Ann Browne, a member of Microsoft’s communications staff.

The New York Times, Sept. 8, 2019


In 2019, a book about tech’s present and future impact on humankind that was relentlessly upbeat would feel out of whack with reality. But Smith’s Microsoft experience allowed him to take a measured look at major issues and possible solutions, a task he says he relished.

“There are some people that are steeped in technology, but they may not be steeped in the world of politics or policy,” Smith told me in a recent conversation. “There are some people who are steeped in the world of politics and policy, but they may not be steeped in technology. And most people are not actually steeped in either. But these issues impact them. And increasingly they matter to them.”

Fast Company, Sept. 8, 2019


In ‘Tools & Weapons: The Promise and the Peril of the Digital Age,’ the longtime Microsoft executive and his co-author Carol Ann Browne tell the inside story of some of the biggest developments in tech and the world over the past decade – including Microsoft’s reaction to the Snowden revelations, its battle with Russian hackers in the lead up to the 2016 elections and its role in the ongoing debate over privacy and facial recognition technology.

The book goes behind-the-scenes at the Obama and Trump White Houses; explores the implications of the coming wave of artificial intelligence; and calls on tech giants and governments to step up and prepare for the ethical, legal and societal challenges of powerful new forms of technology yet to come.

GeekWire, Sept. 7, 2019


Tensions between the U.S. and China feature prominently in Smith’s new book, ‘Tools and Weapons: The Promise and the Peril of the Digital Age.’ While Huawei is its own case, Smith worries that broader and tighter strictures could soon follow. The Commerce Department is considering new restrictions on the export of emerging technologies on which Microsoft has placed big bets, including artificial intelligence and quantum computing. “You can’t be a global technology leader if you can’t bring your technology to the globe,” he says.

Bloomberg Businessweek, Sept. 7, 2019


Tell us what you think about the book @MSFTIssues. You can buy the book here or at bookstores around the world.

Eclipse launches Che 7 IDE for Kubernetes development

SAN FRANCISCO — The Eclipse Foundation has introduced Eclipse Che 7, a new developer workspace server and IDE to help developers build cloud-native, enterprise applications on Kubernetes.

The foundation debuted the new technology at the Oracle Code One conference here. Eclipse Che is essentially a cloud-based IDE built on technology Red Hat acquired from Codenvy, and Red Hat developers are still heavily involved with the Eclipse project. With a focus on Kubernetes, Eclipse Che 7 abstracts away some of the development complexities associated with Kubernetes and helps to close the gap between the development and operations environments, said Mike Milinkovich, executive director of the Eclipse Foundation.

“We think this is important because it’s the first cloud-based IDE that tends to be natively Kubernetes,” he said. “It provides all of the pieces that a cognitive developer needs to be able to build and deploy a Kubernetes application.”

Eclipse Che 7 helps developers who may not be so familiar with Kubernetes by providing not just the IDE, but also its plug-ins and their dependencies. In addition, Che 7 automatically adds all the build and debugging tools developers need for their applications.

“It helps reduce the learning curve that’s related to Kubernetes that a lot of developers struggle with, in terms of setting up Kubernetes and getting their first applications up and running on Kubernetes,” Milinkovich said.

The technology can be deployed on a public Kubernetes cluster or an on-premises data center, and it provides centrally hosted private developer workspaces. In addition, the Eclipse Che IDE is based on an extended version of Eclipse Theia that provides an in-browser experience like Microsoft’s Visual Studio Code, Milinkovich said.

Eclipse Che and Eclipse Theia are part of cloud-native offerings from vendors such as Google, IBM and Broadcom. Che also lies at the core of Red Hat CodeReady Workspaces, a development environment for Red Hat OpenShift.

Moreover, Broadcom’s CA Brightside product uses Eclipse Che to bring a modern, open approach to the mainframe platform. Che also integrates with IBM Codewind to provide a low barrier to entry for developing in a production container environment.

“It had to happen, and it happened sooner than later: The first IDE delivered inside Kubernetes,” said Holger Mueller, an analyst at Constellation Research.

There are benefits of having developers build software with the same mechanics and platforms on the IDE side as their target production environment, he explained, including similar experience and faster code deployments.

“And Kubernetes is hard to manage, so it will be helpful to have an out-of-the-box offering from an IDE vendor,” Mueller said. “But nothing beats the advantage of being able to standardize and quickly launch uniform and consistent developer environments. This gives development team scale to build their next-gen applications and helps their enterprise accelerate.”

Eclipse joins a group that includes major vendors that want to limit the complexity of Kubernetes. IBM and VMware recently introduced technology to reduce Kubernetes complexity for developers and operations staff.

For instance, IBM’s Kabanero open source project to simplify development and deployment of apps on Kubernetes uses Che as its hosted IDE.

The future of developer tools will be cloud-based, Milinkovich said. “Because of the complexity of the application scenarios today, developers are spending a lot of their time and energy building out development environments when they could just move developer workspaces into containers,” he said. “It’s far easier to update the entire development team to new runtime requirements. And you can push out new tools across the entire development team.”

The IDE is the last big piece of technology that developers use on a daily basis that has not moved into the cloud, so moving the IDE into the cloud is the next logical step, Milinkovich said.

Oracle Cloud Infrastructure updates home in on security

SAN FRANCISCO — Oracle hopes a focus on advanced security can help its market-lagging IaaS gain ground against the likes of AWS, Microsoft and Google.

A new feature called Maximum Security Zones lets customers denote enclaves within their Oracle Cloud Infrastructure (OCI) environments that have all security measures turned on by default. Resources within the zones are limited to configurations that are known to be secure. The system will also prevent alterations to configurations and provide continuous monitoring and defenses against anomalies, Oracle said on the opening day of its OpenWorld conference.

Through Maximum Security Zones, customers “will be better protected from the consequences of misconfigurations than they are in other cloud environments today,” Oracle said in an obvious allusion to recent data breaches, such as the Capital One-AWS hack, which have been blamed on misconfigured systems that gave intruders a way in.

“Ultimately, our goal is to deliver to you a fully autonomous cloud,” said Oracle executive chairman and CTO Larry Ellison, during a keynote. 

“If you spend the night drinking and get into your Ford F-150 and crash it, that’s not Ford’s problem,” he said. “If you get into an autonomous Tesla, it should get you home safely.”

Oracle wants to differentiate itself and OCI from AWS, which consistently promotes a shared responsibility model for security between itself and customers. “We’re trying to leapfrog that construct,” said Vinay Kumar, vice president of product management for Oracle Cloud Infrastructure.

“The cloud has always been about, you have to bring your own expertise and architecture to get this right,” said Leo Leung, senior director of products and strategy at OCI. “Think about this as a best-practice deployment automatically. … We’re going to turn all the security on and let the customer decide what is ultimately right for them.”

Oracle’s Autonomous Database, which is expected to be a big focal point at this year’s OpenWorld, will benefit from a new service called Oracle Data Safe. This provides a set of controls for securing the database beyond built-in features such as always-on encryption and will be included as part of the cost of Oracle Database Cloud services, according to a statement.

Finally, Oracle announced Cloud Guard, which it says can spot threats and misconfigurations and “hunt down and kill” them automatically. It wasn’t immediately clear whether Cloud Guard is a homegrown Oracle product or made by a third-party vendor. Security vendor Check Point offers an IaaS security product called CloudGuard for use with OCI.

Starting in 2017, Oracle began to talk up new autonomous management and security features for its database, and the OpenWorld announcements repeat that mantra, said Holger Mueller, an analyst at Constellation Research in Cupertino, Calif. “Security is too important to rely solely on human effort,” he said.

OCI expansions target disaster recovery, compliance

Oracle also said it will broadly expand OCI’s global cloud footprint, with the launch of 20 new regions by the end of next year. The rollout will bring Oracle’s region count to 36, spread across North America, Europe, South America, the Middle East, Asia-Pacific, India and Australia.

This expansion will add multiple regions in certain geographies, allowing for localized disaster recovery scenarios as well as improved regulatory compliance around data location. Oracle plans to add multi-region support in every country it offers OCI and claimed this approach is superior to the practice of including multiple availability zones in a single region.

Oracle’s recently announced cloud interoperability partnership with Microsoft is also getting a boost. The interconnect that ties together OCI and Azure, now available in Virginia and London, will also be offered in the Western U.S., Asia and Europe over the next nine months, according to a statement. In most cases, Oracle is leasing data center space from providers such as Equinix, according to Kumar.

SaaS vendors are another key customer target for Oracle with OCI. To that end, it announced new integrated third-party billing capabilities for the OCI software marketplace released earlier this year. Oracle also cited SaaS providers who are taking advantage of Oracle Cloud Infrastructure for their own underlying infrastructure, including McAfee and Cisco.

There’s something of value for enterprise customers in OCI attracting more independent software vendors, an area where Oracle also lags against the likes of AWS, Microsoft and Google, according to Mueller.

“In contrast to enterprises, they bring a lot of workloads, often to be transferred from on-premises or even other clouds to their preferred vendor,” he said. “For the IaaS vendor, that means a lot of scale, in a market that lives by economies of scale: More workloads means lower prices.”

F5 Networks updates NGINX Application Platform, other tools

F5 Networks has added and tweaked a handful of tools in an effort to help DevOps, NetOps and SecOps teams work together to deliver applications.

F5 Networks has been working on the updates since acquiring NGINX and its NGINX Application Platform in May. The suite includes NGINX Plus for load balancing and application delivery, NGINX Web Application Firewall for security, NGINX Unit to run application code and NGINX Controller to monitor and manage the platform.

The F5 Networks updates include the following changes to NGINX Application Platform and other tools:

Open source projects: F5 Networks hopes to accelerate the development of NGINX open source technologies, including upcoming HTTP/3 capabilities in NGINX Open Source. It has also worked on improving proxying and network capabilities in the NGINX Unit application server.

NGINX Application Platform: There are four new versions of products that build on NGINX Open Source, which were designed to consolidate 13 tools into one software platform, according to F5 Networks. The new versions of products include improved security and observability features in NGINX Plus, a new developer portal and API importing in the NGINX Controller API Management Module, improved analytics and configuration management in the NGINX Controller Load Balancing Module and the addition of custom resource definitions in NGINX Kubernetes Ingress Controller.

Arm and NS1: Arm and NGINX created Arm Neoverse-based tools for a range of applications that run on Amazon EC2 A1 instances in the AWS Cloud. NGINX also introduced a new certified module that integrates NS1 global server load balancing with NGINX Plus.
