GandCrab ransomware adds NSA tools for faster spreading

With version 4, GandCrab ransomware has undergone a major overhaul, adding an NSA exploit to spread faster and targeting a larger set of systems.

The updated GandCrab ransomware was first discovered earlier this month, but researchers are just now learning the extent of the changes. The code structure of the GandCrab ransomware was completely rewritten. And, according to Kevin Beaumont, a security architect based in the U.K., the malware now uses the EternalBlue National Security Agency (NSA) exploit to target SMB vulnerabilities and spread faster.

“It no longer needs a C2 server (it can operate in airgapped environments, for example) and it now spreads via an SMB exploit – including on XP and Windows Server 2003 (along with modern operating systems),” Beaumont wrote in a blog post. “As far as I’m aware, this is the first ransomware true worm which spreads to XP and 2003 – you may remember much press coverage and speculation about WannaCry and XP, but the reality was the NSA SMB exploit (EternalBlue.exe) never worked against XP targets out of the box.”

Joie Salvio, senior threat researcher at Fortinet, based in Sunnyvale, Calif., found the GandCrab ransomware was being spread to targets via spam email and malicious WordPress sites and noted another major change to the code.

“The biggest change, however, is the switch from using RSA-2048 to the much faster Salsa20 stream cipher to encrypt data, which had also been used by the Petya ransomware in the past,” Salvio wrote in the analysis. “Furthermore, it has done away with connecting to its C2 server before it can encrypt its victims’ files, which means it is now able to encrypt users that are not connected to the Internet.”

However, the GandCrab ransomware appears to specifically avoid users in Russian-speaking regions. Fortinet found the malware checks the system for use of a Russian keyboard layout before it continues with the infection.

Despite the overhaul of the GandCrab ransomware and the expanded set of targeted systems, Beaumont and Salvio both said basic cyber hygiene should be enough to protect users from attack. This includes installing Microsoft's patch for the SMB vulnerability exploited by EternalBlue (MS17-010), keeping antivirus up to date and disabling SMB version 1 altogether, advice that has been repeated by various outlets, including US-CERT, since the initial WannaCry attacks began.

Financial firms, vendors push self-service software delivery

The heavily regulated financial industry needs more help with software delivery than almost any other. In particular, self-service software delivery appeals to firms that frequently revise codebases to accommodate policy changes and other pressures.

“People don’t like writing [help desk] tickets. And, often, engineers don’t want to interact with other people at all,” said Niko Kurtti, a production engineer at Ottawa-based e-commerce platform vendor Shopify, who was half-joking at the recent QCon conference in New York City. “It’s just easier to have the machine take care of it.”

A handful of companies have stepped up to address this issue. Atomist has added self-service features to its Software Delivery Machine (SDM), which uses the company's API for Software to manage the different parts of the DevOps pipeline.

“It’s more like self-service with guardrails,” said Rod Johnson, CEO and co-founder of Atomist, based in San Francisco. “They want things to be easy and quick, but also regulated.”

Atomist adheres to the policies each company applies to its own system. So, for example, if a team wants to add a security scan for errant open source code, rather than update each microservice by hand, it makes the change once and Atomist replicates it across all the system's services. The self-service aspect of Atomist helps developers and DevOps teams consistently create projects and avoid IT help desk tickets — or tickets with other departments in the organization — to test or add new features.

Another entry into the self-service space is LaunchDarkly, based in Oakland, Calif., which sells a management platform for developers and operations teams to control the feature lifecycle from conception to delivery. The company’s software integrates release management into the development process and focuses on delivery. It puts all the potential features into the release and allows developers to flip a switch on features and functions for different end users. This lets a common code set deliver different functions and test different code simultaneously, rather than maintaining multiple releases and code branches.
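
The mechanics are simple to picture. The sketch below is illustrative TypeScript, not LaunchDarkly's actual SDK: a flag store decides per user which code path runs, so one release can serve both the current and the experimental behavior without a separate branch.

// Illustrative feature-flag gate (hypothetical, not a vendor API):
// one code path ships to everyone; the flag decides per user which
// implementation runs, so testing variants needs no separate release.

type User = { id: string; segment: "internal" | "beta" | "ga" };

// A hypothetical in-process flag store; in a real product the rules
// would be fetched from the vendor's management platform.
const flags: Record<string, (u: User) => boolean> = {
  "new-checkout-flow": (u) => u.segment !== "ga", // on for internal + beta
};

function isEnabled(flag: string, user: User): boolean {
  const rule = flags[flag];
  return rule ? rule(user) : false; // default off for unknown flags
}

function checkout(user: User): string {
  if (isEnabled("new-checkout-flow", user)) {
    return "render redesigned checkout"; // variant under test
  }
  return "render current checkout"; // stable path for GA users
}

console.log(checkout({ id: "u1", segment: "beta" })); // redesigned
console.log(checkout({ id: "u2", segment: "ga" }));   // current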

Other examples of companies that sell similar products include startups Netsil, which focuses on monitoring Kubernetes and Docker-based microservices apps; Mobincube, which primarily targets mobile app development; and Bonitasoft, which comes out of the business process management and workflow engine world.

Some enterprises, though, choose to skip this product class and roll out their own self-service software delivery options, with scripts and integration with native tools.

Pulumi doesn’t necessarily aim to compete directly in the automation space, but it does want to standardize cloud app development and shares the idea of defining things like configuration in code, rather than YAML. Also, CloudBees and the Jenkins community have a complementary service, Jenkins X, which integrates Kubernetes with Jenkins.

Atomist addresses software delivery as a per-organization or per-team concern, rather than per project, which enables customers to apply consistent policies and governance. It provides a consistent model to automate tasks that matter to software teams, such as project creation and dependency updates.

CI/CD evolves with code automation and containers

With SDM, Atomist is creating a programmable pipeline that bridges a gap between coding languages and delivery pipelines, which some view as the next big innovation to follow CI/CD.

“Atomist is applying programming language concepts to add a new kind of automation and predictability to software delivery,” said Mik Kersten, CEO of Tasktop Technologies, a DevOps toolmaker based in Vancouver, B.C.

To date, the worlds of application code and CI/CD have been disconnected and based on completely different technologies and paradigms. Atomist's programmable domain models span from application to deployment, so DevOps shops can write their own automations in code and interact directly with events in the pipeline through Slack, Kersten noted.
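
To make that concrete, here is a minimal sketch of the pattern Kersten describes, in TypeScript (Atomist's own SDM is TypeScript-based, but this is not its actual API): pipeline events are plain typed objects, and automations are handlers registered in code.

// Hypothetical event-driven delivery automation, written as ordinary code.

interface PushEvent {
  repo: string;
  branch: string;
  changedFiles: string[];
}

type Handler = (e: PushEvent) => Promise<void>;

const handlers: Handler[] = [];
const onPush = (h: Handler) => handlers.push(h);

// Policy expressed as code: every push touching a Dockerfile triggers
// a (stubbed) security scan and a Slack-style channel notification.
onPush(async (e) => {
  if (e.changedFiles.some((f) => f.endsWith("Dockerfile"))) {
    console.log(`[scan] scanning ${e.repo}@${e.branch}`);
    console.log(`[chat] #deploys: Dockerfile changed in ${e.repo}`);
  }
});

// Simulate the platform delivering an event to all registered automations.
async function dispatch(e: PushEvent): Promise<void> {
  for (const h of handlers) await h(e);
}

dispatch({ repo: "orders-service", branch: "main", changedFiles: ["Dockerfile"] });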

The ability to code automations is particularly attractive, said one software architect for a New York investment bank, who declined to be identified. “That would save our developers and DevOps [teams] lots of time and effort,” he said.

Atomist pledged SDM’s support for Docker and Kubernetes at the DockerCon 2018 conference in San Francisco last month. With this support, any Atomist user’s SDM would respond to code change events from the Atomist platform, automatically build new Docker containers as required and deploy them into the right Kubernetes environments based on that user’s unique software delivery needs established via their own policies.

“The actual management of containers within the software delivery process has been lacking in the market so far,” said Edwin Yuen, an analyst at Enterprise Strategy Group in Milford, Mass. “By integrating Dockerized apps and K8s into their SDM, as well as ChatOps and other tools, Atomist is looking to help operationalize container deployments, which is the next area of focus, as container applications go into broader adoption.”

RADWIN and Microsoft announce strategic partnership to deliver innovative TV White Space solutions

The partnership will help make broadband more affordable and accessible for underserved and unserved customers in the rural U.S. and around the world

REDMOND, Wash. — July 2, 2018 — On Monday, RADWIN and Microsoft Corp. announced a new strategic partnership to address the rural broadband gap. RADWIN, a world leader in delivering high-performance broadband wireless access solutions, will be developing and introducing to the market TV White Space solutions to deliver broadband internet to unserved communities. Focused on introducing innovative technologies into the TV White Space market, the partnership will expand the TV White Space ecosystem, making broadband more affordable and accessible for customers in the rural U.S. and around the world. This partnership is part of Microsoft’s Airband Initiative, which aims to expand broadband coverage using a mixture of technologies including TV White Space.

Broadband is a vital part of 21st century infrastructure. Yet, only about half of the world’s population is connected to the internet. New cloud services and other technologies make broadband connectivity a necessity to starting and growing small businesses and taking advantage of advances in agriculture, telemedicine and education. According to findings by the Boston Consulting Group, a connectivity model that uses a combination of technologies, including TV White Space, can reduce the cost of extending broadband coverage in rural communities. TV White Space is an important part of the solution, creating broadband connections in UHF bands and enabling communication in challenging rural terrains and highly vegetated areas, all while protecting broadcasters and other licensees from harmful interference.

“The TV White Space radio ecosystem is rapidly growing, and we are excited to work with RADWIN to bring innovative technologies to market at a global scale,” said Paul Garnett, senior director of the Microsoft Airband Initiative. “Our partnership with RADWIN, a recognized global leader in fixed wireless broadband access, will help address the rural broadband gap for residents and businesses, enabling farmers, healthcare professionals, educators, business leaders and others to fully participate in the digital economy.”

“RADWIN is a leading provider of broadband access solutions, enabling service providers globally to connect unserved and underserved homes and businesses,” said Sharon Sher, RADWIN’s president and CEO. “We are therefore very excited to be Microsoft’s partner in leading a global effort to connect rural communities and grow the TVWS ecosystem in the U.S. and around the world. The addition of innovative TV White Space solutions to RADWIN’s portfolio, which complements our sub-6GHz and mmWave fixed wireless offering, would enable our service provider customers and partners to extend their footprint by connecting more remote subscribers in challenging deployment use cases, penetrating through terrain obstructions and vegetation, and therefore helping to close the digital divide.”

In addition to the partnerships with companies like RADWIN, Microsoft’s Airband Initiative invests in partnerships with internet service providers (ISPs) and other telecommunications companies, introduces innovative solutions for rural connectivity, and provides digital skills training for people in newly connected communities. RADWIN and Microsoft will be introducing the innovative TV White Space solutions to these Airband Initiative partners, as well as to the global telecommunications industry, during the second half of 2019.

About RADWIN

RADWIN is a leading provider of broadband wireless solutions. Deployed in over 150 countries, RADWIN’s solutions power applications including fixed wireless access, backhaul, private network connectivity and video surveillance transmission, as well as broadband on the move for trains, vehicles and vessels. RADWIN’s solutions are adopted and deployed by tier 1 service providers globally as well as by large corporations.

About Microsoft

Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

For more information, press only:

Microsoft Media Relations, WE Communications for Microsoft, +1 (425) 638-7777, rrt@we-worldwide.com

RADWIN Media Contact, Tammy Levy, Marketing Communications Manager, +972-3-766-2916, pr@radwin.com

Note to editors: For more information, news and perspectives from Microsoft, please visit the Microsoft News Center at http://news.microsoft.com. Web links, telephone numbers and titles were correct at time of publication, but may have changed. For additional assistance, journalists and analysts may contact Microsoft’s Rapid Response Team or other appropriate contacts listed at http://news.microsoft.com/microsoft-public-relations-contacts.

Land O’Lakes CIO strives to optimize economics of digital farming

Powered by big data, digital farming aims to help farmers grow crops in smarter, more sustainable and more efficient ways than ever before. Today, Land O’Lakes Inc. is looking at how to use analytics and cutting-edge technology to further reduce the risk farmers take on every growing season.

In an interview filmed at the recent MIT Sloan CIO Symposium, Michael Macrie, senior vice president and CIO at Land O’Lakes, provided a glimpse into how AI and machine learning are transforming one of the oldest industries in the world and why the economics of digital farming are a critical part of the way forward.

Editor’s note: The following was edited for clarity and brevity.

Is it hard to sell AI and machine learning to the business?

Michael Macrie: Yes. At their fundamental levels, these concepts are tools. They’re tools that could be used for a number of different reasons or solutions. So we like to talk about that business value — what’s the value they’re going to bring to our business, what’s the value they’re going to bring to that specific individual and the company, what’s the value they’re going to bring to the customer or consumer. And if we can do that, we have a richer discussion than talking about the tool sets.

Yes, we may use big data, we may use AI behind the scenes, but the reality is that most businesspeople are looking for the result. They’re looking for the solution. And if we use magic behind the scenes, I’m not sure they care. And some of this is pretty close to magic these days. I think there’s a big opportunity for CIOs to talk differently to their stakeholders about the end result and what [companies] care about — not about the technology.

I talked to your predecessor in 2013 about digital farming. Where do things stand today?

Macrie: Back in 2013, we were just launching our first tool called the R7 tool. And now, with five years under our belt, we’ve become one of the market leaders and the largest distributor of ag-technology (agriculture technology) to America’s farmers. We have four proprietary tools that do very well. The tools help our farmers do more and grow more in the most sustainable way possible. And it’s been a great run for us in deploying these ag-technology resources. We bought a satellite company, which analyzes all of the [customer] imagery in real time. We project all those alerts down to the fields and down to people’s handhelds. We direct farmers on where to go every day.

It’s really changing American agriculture as we know it. It’s helping farmers grow more and spend less, and to do it in a way that’s more sustainable and environmentally friendly.

What kind of data do the tools rely on?

Macrie: For each and every field in the United States, we analyzed 40 years of history, and we can tell the farmer what the best seed is to plant in each piece of his field — different populations, different nutrient recommendations, all in a custom prescription. We beam that to the tractor. The tractor drives itself and plants these seeds. During the year, we monitor it with those same satellites, and we can detect anomalies before the naked eye can detect them and direct farmers and scouts out to those areas to do some diagnosis on what’s going on in the field: Does it need nutrients? Is there a weed? Is there a pest problem? We can then take action and preserve their yield during the course of the year.

Do you use AI and machine learning to detect anomalies?

Macrie: We use statistics and machine learning to detect the anomalies in the field. We run statistics across not only the field itself, but the field’s history and the other fields around it that were planted on similar dates to detect those anomalies.
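
A minimal sketch of that statistical approach, in TypeScript with made-up numbers (Land O'Lakes' actual models are not public): build a baseline from the field's history and comparable neighboring fields, then flag readings that deviate by several standard deviations.

// Illustrative anomaly check: compare a reading against the mean and
// standard deviation of a baseline built from the field's own history
// plus similar fields nearby planted on similar dates.

function mean(xs: number[]): number {
  return xs.reduce((a, b) => a + b, 0) / xs.length;
}

function stddev(xs: number[]): number {
  const m = mean(xs);
  return Math.sqrt(mean(xs.map((x) => (x - m) ** 2)));
}

// True when the reading sits more than `k` standard deviations
// away from the baseline.
function isAnomaly(reading: number, baseline: number[], k = 3): boolean {
  const sd = stddev(baseline);
  return sd > 0 && Math.abs(reading - mean(baseline)) > k * sd;
}

// Hypothetical vegetation-index values for this field and its peers.
const baseline = [0.71, 0.74, 0.73, 0.7, 0.72, 0.75, 0.73];
console.log(isAnomaly(0.41, baseline)); // true -> send a scout to the area
console.log(isAnomaly(0.72, baseline)); // false -> field looks normal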

Now that you’ve developed the tools and architecture for digital farming, what new problem are you looking to solve?

Macrie: What we’re working on right now is the economics and the economic variables for that farmer. We think we’ve cracked the code on how to grow more with less, and we’re bringing those solutions to market today and next year. After that though, we have to get the economics right.

A farmer is probably one of the largest gamblers in all of society. They take so much risk — on weather, on the crop yields and on the economic outcomes with the commodity pricing. We have to help them reduce that risk. We have to help them make it more manageable. And so that’s where we’re investing a lot of time and technology today to reduce the risk farmers take every year and provide them a more sustainable path to income.

Qumulo storage parts the clouds with its K-Series active archive

Scale-out NAS startup Qumulo has added a dense cloud archiving appliance to help companies find hidden value in idle data.

Known as the K-Series, the product is an entry-level complement to Qumulo's primary storage arrays, the hybrid C-Series and the all-flash NVMe P-Series. The K-144T active archive target embeds the Qumulo File Fabric (QF2) scalable file system on a standard 1U server.

Qumulo, based in Seattle, didn’t disclose the source of the K-Series’ underlying hardware, but it has an OEM deal with Hewlett Packard Enterprise to package the Qumulo Scalable File System on HPE Apollo servers. Qumulo storage customers need a separate software subscription to add the K-Series archive to an existing Qumulo primary storage configuration.

“It’s routine for our customers to be storing billions of files, either tiny files generated by machines or large files generated by video,” Qumulo chief marketing officer Peter Zaballos said. “We now have a complete product line, from archiving up to blazing high performance.”

Analytics and cloud bursting

Customers can build a physical K-Series cluster with a minimum of six nodes and scale by adding single nodes. They can replicate data from the K-Series target to an identical Qumulo storage cluster in AWS for analytics or cloud bursting. A cluster can scale to 1,000 nodes.

“There’s no need to pull data back from the cloud. You can do rendering against a tier of storage in the cloud and avoid [the expense] of data migration,” Qumulo product manager Jason Sturgeon said.

Each Qumulo storage K-Series node scales to 144 TB of raw storage. Each node accepts a dozen 12 TB HDDs for storage, plus three SSDs to capture read metadata. QumuloDB analytics collects the metadata information as the data gets written. A nine-node configuration provides 1 PB of usable storage.
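
The arithmetic follows directly from the drive count: 12 HDDs × 12 TB gives the 144 TB of raw capacity per node, and nine nodes hold 9 × 144 TB ≈ 1.3 PB raw, leaving roughly 1 PB usable once the file system's protection and metadata overhead is deducted (Qumulo did not break out the exact overhead).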

Qumulo said it designed the K-Series arrays with an Intel Xeon D system-on-a-chip processor to reduce power consumption.

Exploding market for NFS, object archiving

Adding a nearline option to Qumulo storage addresses the rapid growth of unstructured data that requires file-based and object storage, said Scott Sinclair, a storage analyst at Enterprise Strategy Group.

“Qumulo is positioning the K-Series as a lower-cost, higher-density option for large-capacity environments,” Sinclair said. “There is a tremendous need for cheap and deep storage. Many cheap-and-deep workloads are using NFS protocols. This isn’t a file gateway that you retrofit on top of an object storage box. You can use normal file protocols.”

Those file protocols include NFS, SMB and REST-based APIs.

Sturgeon said the K-Series handles reads at 6 Gbps and writes at 3 Gbps per 1 PB of usable capacity.

To eliminate tree walks, QF2 updates the metadata of all files associated with a folder. Checks run every 15 seconds to provide visibility into the amount of data stored within the directory structure, allowing storage to be accessed and queried in near real time.
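
A sketch of that aggregate-metadata technique in TypeScript (an illustration of the general idea, not QF2's implementation): every directory carries a running subtree total that is updated on each write, so a capacity query becomes a single lookup rather than a walk of the whole tree.

// Each directory keeps an aggregate byte count for its entire subtree.

interface Dir {
  name: string;
  parent?: Dir;
  totalBytes: number; // aggregate over the entire subtree
}

function mkdir(name: string, parent?: Dir): Dir {
  return { name, parent, totalBytes: 0 };
}

// On write, bubble the size delta up through every ancestor directory.
function recordWrite(dir: Dir, bytes: number): void {
  for (let d: Dir | undefined = dir; d; d = d.parent) {
    d.totalBytes += bytes;
  }
}

const root = mkdir("/");
const projects = mkdir("projects", root);
const videos = mkdir("videos", projects);

recordWrite(videos, 4_000_000_000); // 4 GB render lands in /projects/videos
recordWrite(projects, 1_000_000);   // 1 MB file directly in /projects

console.log(projects.totalBytes); // 4001000000 -- answered without a tree walk
console.log(root.totalBytes);     // 4001000000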

Qumulo has garnered more than $220 million in funding, including a $93 million Series D round earlier this month. Qumulo founders Peter Godman, Aaron Passey and Neal Fachan helped develop the Isilon OneFS clustered file system, leading the company to an IPO in 2006. EMC paid $2.25 billion to acquire the Isilon technology in 2010.

Godman is Qumulo CTO, and Fachan is chief scientist. Passey left in 2016 to take over as principal engineer at cloud hosting provider Dropbox.

How to tackle an email archive migration for Exchange Online

A move from on-premises Exchange to Office 365 also entails determining the best way to transfer legacy archives. This tutorial can help ease migration complications.


A move to Office 365 seems straightforward enough until project planners broach the topic of the email archive migration.

Not all organizations keep all their email inside their messaging platform. Many organizations that archive messages also keep a copy in a journal that is archived away from user reach for legal reasons.

The vast majority of legacy archive migrations to Office 365 require third-party tools and must follow a fairly standardized process to complete the job quickly and with minimal expense. Administrators should migrate mailboxes to Office 365 first and then the archives; that way, users gain the benefits of Office 365 before the archive reingestion completes.

An archive product typically scans mailboxes for older items and moves them to longer-term, cheaper storage that is indexed and deduplicated. The original item typically gets replaced with a small part of the message, known as a stub or shortcut. The user can still find the email in their inbox and, when they open the message, an add-in retrieves the full content from the archive.
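
As an illustration, a stub might carry little more than a pointer and a preview; the field names in this TypeScript sketch are hypothetical, not any vendor's schema.

// A stub keeps the mailbox small: a pointer plus a short preview,
// while the full body lives in the archive tier.

interface ArchiveStub {
  messageId: string;          // ID of the original mailbox item
  archiveUrl: string;         // where the add-in fetches the full content
  preview: string;            // first few lines shown in the mailbox
  originalSizeBytes: number;  // size of the archived original
}

// Archiving replaces a 2 MB message with a stub of a few hundred bytes.
const stub: ArchiveStub = {
  messageId: "AAMkAD-example",
  archiveUrl: "https://archive.example.com/items/AAMkAD-example",
  preview: "Hi team, attached is the Q3 report...",
  originalSizeBytes: 2_097_152,
};

console.log(stub.preview);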

Options for archived email migration to Office 365

The native tools to migrate mailboxes to Office 365 cannot handle an email archive migration. When admins transfer legacy archive data for mailboxes, they usually consider the following three approaches:

  1. Export the data to PST archives and import it into user mailboxes in Office 365.
  2. Reingest the archive data into the on-premises Exchange mailbox and then migrate the mailbox to Office 365.
  3. Migrate the Exchange mailbox to Office 365 first, then perform the email archive migration to put the data into the Office 365 mailbox.

Option 1 is not usually practical because it takes a lot of manual effort to export data to PST files. The stubs remain in the user’s mailbox and add clutter.

Option 2 also requires a lot of labor-intensive work and uses a lot of space on the Exchange Server infrastructure to support reingestion.

That leaves the third option as the most practical approach, which we’ll explore in a little more detail.

Migrate the mailbox to Exchange Online

When you move a mailbox to Office 365, it migrates along with the stubs that relate to the data in the legacy archive. The legacy archive will no longer archive the mailbox, but users can access their archived items. Because the stubs usually contain a URL path to the legacy archive item, there is no dependency on Exchange to view the archived message.

Some products that add buttons to restore the individual message into the mailbox will not work; the legacy archive product won't know where Office 365 is without further configuration. That configuration is not usually necessary, because the next stage migrates the archived data into Office 365 anyway.

Transfer archived data

Legacy archive solutions usually have a variety of policies for what happens with the archived data. You might configure the system to keep the stubs for a year but make archive data accessible via a web portal for much longer.

There are instances when you might want to replace the stub with the real message. There might be data that is not in the user’s mailbox as a stub but that users want on occasion.

We need tools that not only automate the data migration, but also understand these differences and can act accordingly. The legacy archive migration software should examine the data within the archive and then run batch jobs to replace stubs with the full messages. In this case, you can use the Exchange Online archive as a destination for archived data that no longer has a stub.

Email archive migration software connects via the vendor API. The software assesses the items and then exports them into a common temporary format — such as an EML file — on a staging server before connecting to Office 365 over a protocol such as Exchange Web Services. The migration software then examines the mailbox and replaces the stub with the full message.
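
A minimal sketch of that loop in TypeScript, with in-memory stand-ins for the archive vendor's API, the staging store and an EWS-style mailbox client (all of the helper names here are hypothetical):

interface StubRef { messageId: string; archiveUrl: string; }

async function listStubs(mailbox: string): Promise<StubRef[]> {
  // Stand-in for the archive vendor's API enumerating a mailbox's stubs.
  return [{ messageId: "msg-001", archiveUrl: "https://archive.example/msg-001" }];
}

async function exportToEml(stub: StubRef): Promise<string> {
  // Stand-in: pull the full item into a temporary EML file on staging.
  return `/staging/${stub.messageId}.eml`;
}

async function replaceStubWithEml(mailbox: string, id: string, eml: string) {
  // Stand-in for an EWS-style call that swaps the stub for the full item.
  console.log(`replacing ${id} in ${mailbox} with ${eml}`);
}

async function migrateMailbox(mailbox: string): Promise<void> {
  for (const stub of await listStubs(mailbox)) {
    const emlPath = await exportToEml(stub);                     // archive -> staging
    await replaceStubWithEml(mailbox, stub.messageId, emlPath);  // staging -> Office 365
  }
}

migrateMailbox("user@contoso.example");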

An example of a third-party product's dashboard detailing the migration progress of a legacy archive into Office 365.

Migrate journal data

With journal data, the most accepted approach is to migrate the data into the hidden recoverable items folder of each mailbox related to the journaled item. The end result is similar to using Office 365 from the day the journal began, and eDiscovery works as expected when following Microsoft guidance.

For this migration, the software scans the journal and creates a database of the journal messages. The application then maps each journal message to its mailbox. This process can be quite extensive; for example, an email sent to 1,000 people will map to 1,000 mailboxes.
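
A sketch of that mapping stage in TypeScript (illustrative names only): each journal message fans out to one entry per recipient mailbox, which is why a 1,000-recipient message produces 1,000 rows in the plan.

interface JournalMessage { id: string; recipients: string[]; }

// Build a mailbox -> message-IDs map; a message journaled to N recipients
// produces N entries, which is what makes this stage grow so large.
function mapToMailboxes(journal: JournalMessage[]): Map<string, string[]> {
  const byMailbox = new Map<string, string[]>();
  for (const msg of journal) {
    for (const rcpt of msg.recipients) {
      const ids = byMailbox.get(rcpt) ?? [];
      ids.push(msg.id);
      byMailbox.set(rcpt, ids);
    }
  }
  return byMailbox;
}

const plan = mapToMailboxes([
  { id: "j-1", recipients: ["alice@contoso.example", "bob@contoso.example"] },
]);
console.log(plan); // Map { alice -> [j-1], bob -> [j-1] }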

After this stage, the software copies each message to the recoverable items folder of each mailbox. While this is a complicated procedure, it’s alleviated by software that automates the job.

Legacy archive migration offerings

There are many products tailored for an email archive migration. Each has its own benefits and drawbacks. I won’t recommend a specific offering, but I will mention two that can migrate more than 1 TB a day, which is a good benchmark for large-scale migrations. They also support chain of custody, which audits the transfer of all data.

TransVault has the most connectors to legacy archive products. Almost all the migration offerings support Enterprise Vault, but if you use a product that is less common, then it is likely that TransVault can move it. The TransVault product accesses source data either via an archive product’s APIs or directly to the stored data. TransVault’s service installs within Azure or on premises.

Quadrotech Archive Shuttle fits in alongside a number of other products suited to Office 365 migrations and management. Its workflow-based process automates the migration. Archive Shuttle handles fewer archive sources, but it does support Enterprise Vault. Archive Shuttle accesses source data via API and agent machines with control from either an on-premises Archive Shuttle instance or, as is more typical, the cloud version of the product.

SAP and Accenture collaborate on entitlement management platform

SAP and Accenture are teaming to deliver an intelligent entitlement management application intended to help companies build and deploy new business models.

Entitlement management applications help companies grant, enforce and administer customer access entitlements (usually referred to as authorizations, privileges, access rights or permissions) to data, devices and services — including embedded software applications — from a single platform.

The new SAP Entitlement Management allows organizations to dynamically change individual customer access rights and install renewal automation capabilities in applications, according to SAP. This means they can create new offerings that use flexible pricing structures.

The new platform’s entitlement management and embedded analytics integrate with SAP S/4HANA’s commerce and order management functions, which, according to SAP, can help organizations create new revenue streams and get new products and services to market faster.

Accenture will provide consulting, system development and integration, application implementation, and analytics capabilities to the initiative.

“As high-tech companies rapidly transition from stand-alone products to highly connected platforms, they are under mounting pressure to create and scale new intelligent and digital business models,” said David Sovie, senior managing director of Accenture’s high-tech practice, in a press release. “The solution Accenture is developing with SAP will help enable our clients to pivot to as-a-service business models that are more flexible and can be easily customized.”

SAP and Accenture go on the defense

SAP and Accenture also unveiled a new platform that provides digital transformation technology and services for defense and security organizations.

The digital defense platform is based on S/4HANA, contains advanced analytics capabilities and expands military personnel's use of digital applications. It includes simulations and analytics applications intended to help defense and security organizations plan and run operations efficiently and respond quickly to changing operating environments, according to SAP and Accenture.

“This solution gives defense agencies the capabilities to operate in challenging and fast-changing geo-political environments that require an intelligent platform with deployment agility, increased situational awareness and industry-specific capabilities,” said Antti Kolehmainen, Accenture’s managing director of defense business, in a press release.

The platform provides data-driven insights intended to help leaders make better decisions, and it enables cross-enterprise data integration in areas like personnel, military supply chain, equipment maintenance, finances and real estate.

IoT integration will enable defense agencies to connect devices that can collect and exchange data. The digital defense platform technology is available to be deployed on premises or in the cloud, according to the companies.

“The next-generation defense solution will take advantage of the technology capabilities of SAP S/4HANA and Accenture’s deep defense industry knowledge to help defense agencies build and deploy solutions more easily and cost-effectively and at the same time enable the digital transformation in defense,” said Isabella Groegor-Cechowicz, SAP’s global general manager of public services, in a press release.

New application and customer experience tool for SAP environments

AppDynamics (a Cisco company) has unveiled a new application and customer experience monitoring software product for SAP environments.

AppDynamics for SAP provides visibility into SAP applications and customer experiences via code-level insights into customer taps, swipes and clicks, according to AppDynamics. This helps companies understand the performance of SAP applications and databases, as well as the code impact on customers and business applications.

“The modern enterprise is in a challenging position,” said Thomas Wyatt, AppDynamics’ chief strategy officer, in a press release. “To satisfy customer expectations, it needs to meet the demands of an agile, digital business, while also maintaining and operating essential core systems.”

AppDynamics for SAP allows companies to collaborate around business transactions, using a unit of measurement that automatically reveals customers’ interactions with applications. They can then identify and map transactions flowing between each customer-facing application and systems of record — SAP ERP or CRM systems that include complex integration layers, such as SAP Process Integration and SAP Process Orchestration.

AppDynamics for SAP includes ABAP code-level diagnostics and native ABAP agent monitoring that provides insights into SAP environments with code and database performance monitoring, dynamic baselines, and transaction snapshots when performance deviates from the norm. It also includes intelligent alerting to IT based on health rules and baselines that are automatically set for key performance metrics on every business transaction. Intelligent alerting policies integrate with existing enterprise workflow tools, including ServiceNow, PagerDuty and JIRA.

This means that companies can understand dependencies across the entire digital business and baseline, identify, and isolate the root causes of problems before they affect customers. AppDynamics for SAP also helps companies to plan SAP application migrations to the cloud and monitor user experiences post-migration, according to AppDynamics.

Microsoft Build highlights new opportunity for developers, at the edge and in the cloud

Announcing new innovations that help developers build AI and multidevice, multisense experiences, and new $25M AI for Accessibility program

REDMOND, Wash. — May 7, 2018 — Monday at Microsoft Build 2018, Microsoft Corp.’s annual developer conference, Microsoft leaders showcased new technologies to help every developer be an AI developer, on Microsoft Azure, Microsoft 365 and across any platform. Building for AI is more important to developers than ever, as technology continues to change the way people live and work every day, across the cloud and across edge devices.

“The era of the intelligent cloud and intelligent edge is upon us,” said Satya Nadella, CEO, Microsoft. “These advancements create incredible developer opportunity and also come with a responsibility to ensure the technology we build is trusted and benefits all.”

As part of Microsoft’s commitment to trusted, responsible AI products and practices, the company also today announced AI for Accessibility, a new $25 million, five-year program aimed at harnessing the power of AI to amplify human capabilities for more than 1 billion people around the world with disabilities. The program comprises grants, technology investments and expertise, and will also incorporate AI for Accessibility innovations into Microsoft Cloud services. It builds on the success of the similar AI for Earth initiative.

Advancements in the intelligent edge and intelligent cloud

Smart devices are proliferating in homes and businesses across the globe, with more than 20 billion expected by 2020. These devices are so smart, in fact, they are powering advanced ways to see, listen, reason and predict, without constant connectivity to the cloud. That is the intelligent edge, and it is opening opportunities for consumers, businesses and entire industries, from the operating room to the factory floor. Today Microsoft is announcing new capabilities for developers to extend to the edge:

  • Microsoft is open sourcing the Azure IoT Edge Runtime, allowing customers to modify, debug and have more transparency and control for edge applications.
  • Custom Vision will now run on Azure IoT Edge, enabling devices such as drones and industrial equipment to take critical action quickly without requiring cloud connectivity. This is the first Azure Cognitive Service to support edge deployment, with more coming to Azure IoT Edge over the next several months.
  • DJI, the world’s biggest drone company, is partnering with Microsoft to create a new SDK for Windows 10 PCs, and it has also selected Azure as its preferred cloud provider to further its commercial drone and SaaS solutions. The SDK will bring full flight control and real-time data transfer capabilities to nearly 700 million connected Windows 10 devices globally. As part of the commercial partnership, DJI and Microsoft will co-develop solutions leveraging Azure IoT Edge and Microsoft’s AI services to enable new scenarios across agriculture, construction, public safety and more.
  • Microsoft announced a joint effort with Qualcomm Technologies, Inc. to create a vision AI developer kit running Azure IoT Edge. This solution makes available the key hardware and software required to develop camera-based IoT solutions. Developers can create solutions that use Azure Machine Learning services and take advantage of the hardware acceleration available via the Qualcomm® Vision Intelligence Platform and Qualcomm® AI Engine. The camera can also power advanced Azure services, such as machine learning, stream analytics and cognitive services, that can be downloaded from the cloud to run locally on the edge.

Data and AI development for a new era

Using data, machine learning and cognitive intelligence, developers can build and manage AI-rich solutions that transform the ways people work, collaborate and live:

  • Microsoft announced Project Kinect for Azure, a package of sensors, including our next-generation depth camera, with onboard compute designed for AI on the Edge. Building on Kinect’s legacy that has lived on through HoloLens, Project Kinect for Azure empowers new scenarios for developers working with ambient intelligence. Combining Microsoft’s industry-defining Time of Flight sensor with additional sensors all in a small, power-efficient form factor, Project Kinect for Azure will leverage the richness of Azure AI to dramatically improve insights and operations. It supports fully articulated hand tracking and high-fidelity spatial mapping, enabling a new level of precision in solutions.
  • A Speech Devices SDK announced today delivers superior audio processing from multichannel sources for more accurate speech recognition, including noise cancellation, far-field voice and more. With this, developers can build a variety of voice-enabled scenarios like drive-thru ordering systems, in-car or in-home assistants, smart speakers, and other digital assistants.
  • Azure Cosmos DB updates include new and differentiated multimaster at global scale capabilities, designed to support both the cloud and the edge, along with the general availability of VNET support for increased security. With these new updates, Cosmos DB delivers even greater cost-effectiveness and global scale, further cementing it as the fastest-growing database service in the world.
  • A preview of Project Brainwave, an architecture for deep neural net processing, is now available on Azure and on the edge. Project Brainwave makes Azure the fastest cloud to run real-time AI and is now fully integrated with Azure Machine Learning. It also supports Intel FPGA hardware and ResNet50-based neural networks.
  • New Azure Cognitive Services updates include a unified Speech service with improved speech recognition and text-to-speech, which support customized voice models and translation. Along with Custom Vision, these updates make it easier for any developer to add intelligence to their applications.
  • Microsoft is making Azure the best place to develop conversational AI experiences integrated with any agent. New updates to Bot Framework and Cognitive Services will power the next generation of conversational bots enabling richer dialogs, and full personality and voice customization to match the company’s brand identity.
  • A preview of Azure Search with Cognitive Services integration. This new feature combines AI with indexing technologies so it’s possible to quickly find information and insights, whether via text or images.

Multisense and multidevice experiences

Microsoft also demonstrated mixed-reality capabilities to enable richer experiences that understand the context surrounding people, the things they use, their activities and relationships:

  • A new initiative, Project Kinect for Azure — a package of sensors from Microsoft that contains our unmatched time of flight depth camera, with onboard compute, in a small, power-efficient form factor — designed for AI on the Edge. Project Kinect for Azure brings together this leading hardware technology with Azure AI to empower developers with new scenarios for working with ambient intelligence.
  • With Microsoft Remote Assist, customers can collaborate remotely with heads-up, hands-free video calling, image sharing, and mixed-reality annotations. Firstline Workers can share what they see with any expert on Microsoft Teams, while staying hands on to solve problems and complete tasks together, faster.
  • With Microsoft Layout, customers can design spaces in context with mixed reality. Import 3-D models to create room layouts in real-world scale, experience designs as high-quality holograms in physical space or in virtual reality, and share and edit with stakeholders in real time.

Modern tooling and experiences for any platform in any language

Microsoft is empowering developers to build for the new era of the intelligent edge, across Azure, Microsoft 365 and other platforms, using the languages and frameworks of their choice:

  • With Azure Kubernetes Service (AKS), developers can drastically simplify how they build and run container-based solutions without deep Kubernetes experience. Generally available in the coming weeks, AKS integrates with developer tools and workspaces, DevOps capabilities, networking, monitoring tools, and more in the Azure portal, so developers can write code, not stitch services together. In addition, Microsoft is now offering Kubernetes support for Azure IoT Edge devices.
  • Visual Studio IntelliCode is a new capability that enhances everyday software development with the power of AI. IntelliCode provides intelligent suggestions to improve code quality and productivity and is available in preview today in Visual Studio.
  • Visual Studio Live Share, now in preview, lets developers easily and securely collaborate in real time with team members who can edit and debug directly from their existing tools like Visual Studio 2017 and VS Code. Developers can use Live Share with any language for any scenario, including serverless, cloud-native and IoT development.
  • Building on our shared commitment to developers and open source, Microsoft announced a new partnership with GitHub that brings the power of Azure DevOps services to GitHub customers. Today, we released the integration of Visual Studio App Center and GitHub, which lets GitHub developers building apps for iOS and Android devices seamlessly automate DevOps processes right from within the GitHub experience.
  • Available today, the new Microsoft Azure Blockchain Workbench makes it easier to develop blockchain applications by stitching together an Azure-supported blockchain network with cloud services like Azure Active Directory, Key Vault and SQL Database, reducing proof-of-concept development time dramatically.

Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

For more information, press only:

Microsoft Media Relations, WE Communications, (425) 638-7777, rrt@we-worldwide.com

Note to editors: For more information, news and perspectives from Microsoft, please visit the Microsoft News Center at http://news.microsoft.com. Web links, telephone numbers and titles were correct at time of publication, but may have changed. For additional assistance, journalists and analysts may contact Microsoft’s Rapid Response Team or other appropriate contacts listed at http://news.microsoft.com/microsoft-public-relations-contacts.

STEAM CONTROLLER

Hello, I hope one of you can help me source a Steam Controller. I already have the Steam Link, so I'm just looking for the controller.
Thank you in advance.

Location: Middlesbrough
