Tag Archives: Week

For Sale – Microsoft Surface Laptop 2 i5 8gb 256gb 13.5”, cobalt blue colour

As title says, I have this laptop for sale. It was bought only last week and isn't registered with Microsoft yet, so the full warranty is still available.

Really sleek-looking and light laptop with a great screen and keyboard.

I've used it but will be sticking to Mac OS!

excellent condition fully boxed

will get photos up in the next day or two

Looking for £850 delivered
NOW £750 DELIVERED

pics attached

Go to Original Article

New capabilities added to Alfresco Governance Services

Alfresco Software introduced new information governance capabilities this week to its Digital Business Platform through updates to Alfresco Governance Services.

The updates include new desktop synchronization, federation services and AI-assisted legal holds features.

“In the coming year, we expect many organizations to be hit with large fines as a result of not meeting regulatory standards for data privacy, e.g., the European GDPR and California’s CCPA. We introduced these capabilities to help our customers guarantee their content security and circumvent those fines,” said Tara Combs, information governance specialist at Alfresco.

Federation Services enables cross-database search

Federation Services is a new addition to Alfresco Governance Services. Users can search, view and manage content from Alfresco and other repositories, such as network file shares, OpenText, Documentum, Microsoft SharePoint and Dropbox.

Users can also search across different databases with the application without having to migrate content. Federation Services provides one user interface for users to manage all the information resources in an organization, according to the company.

Organizations can also store content in locations outside of the Alfresco platform.

Legal holds feature provides AI-assisted search for legal teams

The legal holds feature provides document search and management capabilities that help legal teams identify relevant content for litigation purposes. Alfresco’s tool now uses AI to discover relevant content and metadata, according to the company.

“AI is offered in some legal discovery software systems, and over time all these specialized vendors will leverage AI and machine learning,” said Alan Pelz-Sharpe, founder and principal analyst at Deep Analysis. He added that the AI-powered feature of Alfresco Governance Services is one of the first such offerings from a more general information management vendor.

“It is positioned to augment the specialized vendors’ work, essentially curating and capturing relevant bodies of information for deeper analysis.”

Desktop synchronization maintains records management policies

Another new feature added to Alfresco Governance Services synchronizes content between a repository and a desktop, along with the records management policies associated with that content, according to the company.

With the desktop synchronization feature, users can expect the same records management policies to apply whether they access a document on their desktop computer or view it in the source repository, according to the company.

When evaluating a product like this in the market, Pelz-Sharpe said the most important feature a buyer should look for is usability. “AI is very powerful, but less than useless in the wrong hands. Many AI tools expect too much of the customer — usability and recognizable, preconfigured features that the customer can use with little to no training are essential.”

The new updates are available as of Dec. 3. There is no price difference between the updated version of Alfresco Governance Services and the previous version. Customers who already had a subscription can upgrade as part of their subscription, according to the company.

According to Pelz-Sharpe, Alfresco has traditionally competed against enterprise content management and business process management vendors. It has pivoted in recent years to compete more directly with PaaS competitors, offering a content- and process-centric platform on which its customers can build their own applications. In the future, the company is likely to compete against the likes of Oracle and IBM, he said.

Go to Original Article

AWS moves into quantum computing services with Braket

Amazon debuted a preview version of its quantum computing services this week, along with a new quantum computing research center and lab where AWS cloud users can work with quantum experts to identify practical, short-term applications.

The new AWS quantum computing managed service, called Amazon Braket, is aimed initially at scientists, researchers and developers, giving them access to quantum systems provided by IonQ, D-Wave and Rigetti.

Amazon’s quantum computing services news comes less than a month after Microsoft disclosed it is developing a chip capable of running quantum software. Microsoft also previewed a version of its Azure Quantum Service and struck partnerships with IonQ and Honeywell to help deliver the Azure Quantum Service.

In November, IBM said its Qiskit quantum computing development framework supports ion trap technology, which is used by IonQ and Alpine Quantum Technologies.

Google recently claimed it was the first quantum vendor to achieve quantum supremacy — the ability to solve complex problems that classical systems either can't solve or would take an extremely long time to solve. Company officials said it represented an important milestone.

In that particular instance, Google’s Sycamore processor solved a difficult problem in just 200 seconds — a problem that would take a classical computer 10,000 years to solve. The claim was met with a healthy amount of skepticism by some competitors and other more objective sources as well. Most said they would reserve judgement on the results until they could take a closer look at the methodology involved.

Cloud services move quantum computing forward

Peter Chapman, CEO and president of IonQ, doesn't foresee any conflicts between IonQ's agreements with rivals Microsoft and AWS. AWS jumping into the fray with Microsoft and IBM will help push quantum computing closer to the limelight and make users more aware of the technology's possibilities, he said.

“There’s no question AWS’s announcements give greater visibility to what’s going on with quantum computing,” Chapman said. “Over the near term they are looking at hybrid solutions, meaning they will mix quantum and classical algorithms making [quantum development software] easier to work with,” he said.

Microsoft and AWS are at different stages of development, making it difficult to gauge which company has advantages over the other. But what Chapman does like about AWS right now is the set of APIs that allows a developer’s application to run across the different quantum architectures of IonQ (ion trap), D-Wave (quantum annealing) and Rigetti (superconducting chips).

“At the end of the day it’s not how many qubits your system has,” Chapman said. “If your application doesn’t run on everyone’s hardware, users will be disappointed. That’s what is most important.”

Another analyst agreed that the sooner quantum algorithms can be melded with classical algorithms to produce something useful in an existing corporate IT environment, the faster quantum computing will be accepted.

“If you have to be a quantum expert to produce anything meaningful, then whatever you do produce stays in the labs,” said Frank Dzubeck, president of Communications Network Architects, Inc. “Once you integrate it with the classical world and can use it as an adjunct for what you are doing right now, that’s when [quantum technology] grows like crazy.”

Microsoft’s Quantum Development Kit, which the company open sourced earlier this year, also allows developers to create applications that operate across a range of different quantum architectures. Like AWS, Microsoft plans to combine quantum and classical algorithms to produce applications and services aimed at the scientific markets and ones that work on existing servers.

One advantage AWS and Microsoft provide for smaller quantum computing companies like IonQ, according to Chapman, is not just access to their mammoth user bases, but support for things like billing.

“If customers want to run something on our computers, they can just go to their dashboard and charge it to their AWS account,” Chapman said. “They don’t need to set up an account with us. We also don’t have to spend tons of time on the sales side convincing Fortune 1000 users to make us an approved vendor. Between the two of them [Microsoft and AWS], they have the whole world signed up as approved vendors,” he said.

The mission of the AWS Center for Quantum Computing will be to solve longer-term technical problems using quantum computers. Company officials said they have users ready to begin experimenting with the newly minted Amazon Braket but did not identify any users by name.

The closest they came was a prepared statement by Charles Toups, vice president and general manager of Boeing’s Disruptive Computing and Networks group. The company is investigating how quantum computing, sensing and networking technologies can enhance Boeing products and services for its customers, according to the statement.

“Quantum engineering is starting to make more meaningful progress and users are now asking for ways to experiment and explore the technology’s potential,” said Charlie Bell, senior vice president with AWS’s Utility Computing Services group.

AWS's assumption going forward is that quantum computing will be a cloud-first technology, and Amazon Braket and the Quantum Solutions Lab will be how AWS gives its users their first quantum experience.

Corporate and third-party developers can create their own customized algorithms with Braket, which gives them the option of executing either low-level quantum circuits or fully managed hybrid algorithms, and makes it easier to switch between software simulators and whichever quantum hardware they select.
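As a concrete illustration of what working against Braket looks like, here is a minimal sketch using the Amazon Braket Python SDK: it builds a two-qubit Bell-state circuit and runs it on the SDK's bundled local simulator. Swapping the LocalSimulator for an AwsDevice pointed at a managed simulator or QPU ARN is how the same circuit would be sent to hardware; account setup, device ARNs and costs are left out, and the snippet reflects the SDK as publicly documented rather than the preview described here.

```python
# Minimal sketch with the Amazon Braket Python SDK (pip install amazon-braket-sdk).
# Builds a Bell pair and samples it on the local simulator; an AwsDevice with a
# device ARN would target a managed simulator or QPU instead.
from braket.circuits import Circuit
from braket.devices import LocalSimulator

# Two-qubit Bell state: Hadamard on qubit 0, then CNOT with qubit 0 controlling qubit 1.
bell = Circuit().h(0).cnot(0, 1)

device = LocalSimulator()
task = device.run(bell, shots=1000)
counts = task.result().measurement_counts
print(counts)  # expect roughly half '00' and half '11'
```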

The AWS Center for Quantum Computing is based at Caltech, which has long invested in both experimental and theoretical quantum science and technology.

Go to Original Article

Aviso introduces version 2.0 of AI-guided sales platform

Aviso announced version 2.0 of its artificial intelligence guided sales platform last week. The new version is aimed at lowering costs and reducing the time that sales reps spend working on CRM databases by providing them with AI tools that predict deal close probabilities and guide next best actions.

Algorithmic-guided selling, which uses AI technology and existing sales data to guide sellers through deals, is a new but increasingly popular technology. About 51% of sales organizations have already deployed or plan to deploy algorithmic-guided selling in the next five years, according to a 2019 Gartner survey.

Aviso's 2.0 sales platform uses AI tools to prioritize sales opportunities and analyze data from sources including CRM systems, emails, user calendars, chat transcripts and support and success tools to deliver real-time insights and suggest the next best action for sales teams. The support and success tools are external offerings that Aviso's platform can connect with, including customer support tools like Zendesk or Salesforce Service Cloud, and customer success tools like Gainsight or Totango, according to Amit Pande, vice president of marketing at Aviso.
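Aviso has not published the internals of its models, but the general pattern of scoring a deal's close probability from CRM activity signals can be sketched as follows. The feature names, synthetic data and logistic-regression model below are hypothetical illustrations of the technique, not Aviso's implementation.

```python
# Hypothetical sketch: score deal-close probability from CRM activity signals
# and surface the top contributing factors. Features and data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy training set: one row per historical deal. Pretend the columns are
# emails_exchanged, meetings_held, days_stalled_in_stage, log_deal_size.
X_train = rng.normal(size=(500, 4))
y_train = (X_train[:, 0] + 0.5 * X_train[:, 1] - 0.8 * X_train[:, 2]
           + rng.normal(scale=0.5, size=500) > 0).astype(int)  # 1 = closed-won

model = LogisticRegression().fit(X_train, y_train)

# Score one open deal and list the features that push the prediction hardest,
# loosely analogous to showing the "top factors that influence the prediction."
open_deal = np.array([[1.2, 0.4, -0.3, 0.8]])
probability = model.predict_proba(open_deal)[0, 1]
contributions = model.coef_[0] * open_deal[0]
top_factors = np.argsort(-np.abs(contributions))[:2].tolist()
print(f"close probability: {probability:.2f}, top factor columns: {top_factors}")
```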

The forecasting and sales guidance vendor claims the new version will help sales teams close 20% more deals and reduce spending on non-core CRM licenses by 30% compared with conventional CRM systems. The cost reduction calculation is based on “the number of non-core licenses that can be eliminated, as well as additional costs such as storage and add-ons that can be eliminated when underutilized or unused licenses are eliminated,” Pande said.

According to Aviso, new AI-based features in version 2.0 of its sales platform include:

  • Deal Execution Tools, a trio of tools meant to assist in finalizing deals. Bookings Timeline uses machine learning to calculate when deals will close based on an organization's unique history; each booking timeline also includes the top factors that influence the prediction. Opportunity Acceleration helps sales teams determine which opportunities carry the highest probability of closing early if they are pulled into the current quarter. Informed Editing is intended to limit typos and unsaved changes during data entry; the tool gives users contextual help before they commit to edits, telling them which quarter or whose forecast they are updating, and changes and edits are saved automatically by the software.
  • Deal and Forecast Rooms enable users to do what-if analysis, use scenario modeling and automatically summarize forecast calls and deal review transcripts.
  • Coaching Rooms help sales managers improve sales rep performance with data from past and current deals and from team activity in Deal and Forecast Rooms. 
  • Nudges provide reminders for sales reps through an app on mobile devices. Nudges also offer recommendations for course corrections and potential next steps based on insights from the specific deal.

Aviso’s 2.0 sales platform is currently in beta with select customers.

Cybersecurity company FireEye has been using the Aviso platform for several years and is among those select customers. Andy Pan, director of Americas and public sector sales operations at FireEye, said the Aviso platform has helped FireEye operate in a more predictive manner through some of its new AI-driven features. "The predictive features help us review the macro business as a whole, and the deal-specific features provide guided pathways toward the inspection of deals."

Other vendors of sales forecasting tools include Salesforce and Clari. Salesforce's sales forecasting feature enables organizations to make forecasts specific to their needs and lets managers track their team's performance. Clari's product includes features such as predictive forecasting, which uses AI-based projections to estimate where the team will land at the end of the quarter, and history tracking to see who last made changes to the forecast.

Go to Original Article

IBM Cloud Pak for Security aims to unify hybrid environments

IBM this week launched Cloud Pak for Security, which experts say represents a major strategy shift for Big Blue's security business.

The aim of IBM’s Cloud Pak for Security is to create a platform built on open-source technology that can connect security tools from multiple vendors and cloud platforms in order to help reduce vendor lock-in. IBM Cloud Paks are pre-integrated and containerized software running on Red Hat OpenShift, and previously IBM had five options for Cloud Paks — Applications, Data, Integration, Automation and Multicloud Management — which could be mixed and matched to meet enterprise needs.

Chris Meenan, director of offering management and strategy at IBM Security, told SearchSecurity that Cloud Pak for Security was designed to tackle two “big rock problems” for infosec teams. The first aim was to help customers get data insights through federated search of their existing data without having to move it to one place. Second was to help “orchestrate and take action across all of those systems” via built-in case management and automation. 

Meenan said IT staff will be able to take actions across a multi-cloud environment, including “quarantining users, blocking IP addresses, reimaging machines, restarting containers and forcing password resets.”

“Cloud Pak for Security is the first platform to take advantage of STIX-Shifter, an open-source technology pioneered by IBM that allows for unified search for threat data within and across various types of security tools, datasets and environments,” Meenan said. “Rather than running separate, manual searches for the same security data within each tool and environment you’re using, you can run a single query with Cloud Pak for Security to search across all security tools and data sources that are connected to the platform.” 
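The exact STIX-Shifter interfaces aren't reproduced here, but the federated-search pattern Meenan describes, in which a single query is translated into each connected tool's native form, executed in parallel, and the results normalized into a common shape, can be sketched as follows. The connector classes, field names and fake tool responses below are hypothetical stand-ins, not IBM's or STIX-Shifter's actual API.

```python
# Minimal sketch of federated search: one query, many connected tools, merged results.
# Connector objects and the "tools" they talk to are invented for illustration.
from concurrent.futures import ThreadPoolExecutor
from dataclasses import dataclass


@dataclass
class Finding:
    source: str       # which connected tool returned the result
    ip: str
    severity: str


class Connector:
    """One connector per connected data source (SIEM, EDR, cloud platform, ...)."""

    def __init__(self, name, translate, transmit):
        self.name = name
        self.translate = translate   # common query -> tool-native query
        self.transmit = transmit     # run the native query, return raw results

    def search(self, query):
        native = self.translate(query)
        return [Finding(self.name, r["ip"], r["severity"]) for r in self.transmit(native)]


def federated_search(query, connectors):
    # Fan the single query out to every connected source and merge the results,
    # without copying any source's data into a central store.
    with ThreadPoolExecutor() as pool:
        return [f for results in pool.map(lambda c: c.search(query), connectors) for f in results]


# Usage with two fake in-memory "tools":
siem = Connector("siem", lambda q: f"index=fw {q}",
                 lambda nq: [{"ip": "10.0.0.5", "severity": "high"}])
edr = Connector("edr", lambda q: {"filter": q},
                lambda nq: [{"ip": "10.0.0.9", "severity": "low"}])
print(federated_search("ip:10.0.0.0/24", [siem, edr]))
```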

Meenan added that Cloud Pak for Security represented a shift in IBM Security strategy because of its focus on delivering “security solutions and outcomes without needing to own the data.”

“That’s probably the biggest shift — being able to deliver that to any cloud or on-premise the customer needs,” Meenan said. “Being able to deliver that without owning the data means organizations can deploy any different technology and it’s not a headwind. Now they don’t need to duplicate the data. That’s just additional overhead and introduces friction.”

One platform to connect them all

Meenan said IBM was "very deliberate" about keeping data transfers minimal, so at first Cloud Pak for Security will only take in alerts and search results from connected vendor tools.

“As our Cloud Pak develops, we plan to introduce some capability to create alerts and potentially store data as well, but as with other Cloud Paks, the features will be optional,” Meenan said. “What’s really fundamental is we’ve designed a Cloud Pak to deliver applications and outcomes but you don’t have to bring the data and you don’t have to generate the alerts. Organizations have a SIEM in place, they’ve got an EDR in place, they’ve got all the right alerts and insights, what they’re really struggling with is connecting all that in a way that’s easily consumable.”

In order to create the connections to popular tools and platforms, IBM worked with clients and service providers. Meenan said some connectors were built by IBM and some vendors built their own connectors. At launch, Cloud Pak for Security will include integration for security tools from IBM, Carbon Black, Tenable, Elastic, McAfee, BigFix and Splunk, with integration for Amazon Web Services and Microsoft Azure clouds coming later in Q4 2019, according to IBM’s press release.

Ray Komar, vice president of technical alliances at Tenable, said that from an integration standpoint, Cloud Pak for Security “eliminates the need to build a unique connector to various tools, which means we can build a connector once and reuse it everywhere.”

“Organizations everywhere are reaping the benefits of cloud-first strategies but often struggle to ensure their dynamic environments are secure,” Komar told SearchSecurity. “With our IBM Cloud Pak integration, joint customers can now leverage vulnerability data from Tenable.io for holistic visibility into their cloud security posture.”

Jon Oltsik, senior principal analyst and fellow at Enterprise Strategy Group, based in Milford, Mass., told SearchSecurity that he likes this new strategy for IBM and called it “the right move.”

“IBM has a few strong products but other vendors have much greater market share in many areas. Just about every large security vendor offers something similar, but IBM can pivot off QRadar and Resilient and extend its footprint in its base. IBM gets this and wants to establish Cloud Pak for Security as the ‘brains’ behind security. To do so, it has to be able to fit nicely in a heterogeneous security architecture,” Oltsik said. “IBM can also access on-premises data, which is a bit of unique implementation. I think IBM had to do this as the industry is going this way.”

Martin Kuppinger, founder and principal analyst at KuppingerCole Analysts AG, based in Wiesbaden, Germany, said Cloud Pak for Security should be valuable for customers, specifically “larger organizations and MSSPs that have a variety of different security tools from different vendors in place.”

“This allows for better incident response processes and better analytics. Complex attacks today might span many systems, and analysis requires access to various types of security information. This is simplified, without adding yet another big data lake,” Kuppinger told SearchSecurity. “Obviously, Security Cloud Pak might be perceived competitive by incident response management vendors, but it is open to them and provides opportunities by building on the federated data. Furthermore, a challenge with federation is that the data sources must be up and running for accessing the data — but that can be handled well, specifically when it is only about analysis; it is not about real-time transactions here.”

The current and future IBM Security products

Meenan told SearchSecurity that Cloud Pak for Security would not have any special integration with IBM Security products, which would “have to stand on their own merits” in order to be chosen by customers. However, Meenan said new products in the future will leverage the connections enabled by the Cloud Pak.

“Now what this platform allows us to do is to deliver new security solutions that are naturally cross-cutting, that require solutions that can sit across an EDR, a SIEM, multiple clouds, and enable those,” Meenan said. “When we think about solutions for insider threat, business risk, fraud, they’re very cross-cutting use cases so anything that we create that cuts across and provides that end-to-end security, absolutely the Cloud Pak is laying the foundation for us — and our partners and our customers — to deliver that.”

Oltsik said IBM’s Security Cloud Pak has a “somewhat unique hybrid cloud architecture” but noted that it is “a bit late to market and early versions won’t have full functionality.”

“I believe that IBM delayed its release to align it with what it’s doing with Red Hat,” Oltsik said. “All that said, IBM has not missed the market, but it does need to be more aggressive to compete with the likes of Cisco, Check Point, FireEye, Fortinet, McAfee, Palo Alto, Symantec, Trend Micro and others with similar offerings.”

Kuppinger said that from an overall IBM Security perspective, this platform “is rather consequent.”

“IBM, with its combination of software, software services, and implementation/consultancy services, is targeted on such a strategy of integration,” Kuppinger wrote via email. “Not owning data definitely is a smart move. Good architecture should segregate data, identity, and applications/apps/services. This allows for reuse in modern, service-oriented architectures. Locking-in data always limits that reusability.”

Go to Original Article

Minecraft Earth Early Access Off to an Exciting Start Following Mobs in the Park Kickoff – Xbox Wire

Summary

  • Last week, we unveiled the Mobs in the Park pop-up experience in New York City, Sydney and London
  • Since kicking off early access on Oct. 17, the global community has placed 240.4 million blocks, collected 76 million tappables and started 6.8 million crafting and smelting sessions
  • Mobs in the Park will continue over the next two weekends from 9 a.m. – 6 p.m. local time, with a special appearance on Black Friday in New York City

Last week in celebration of Minecraft Earth’s early access rollout, we unveiled the Mobs in the Park pop-up experience in three locations around the world – Hudson Yards in New York City, Campbell’s Cove in Sydney and the Queen’s Walk in London – granting players exclusive in-game access to the holiday-themed Jolly Llama mob. The community’s response to Mobs in the Park has been humbling over the first weekend, and we can’t wait to see even more reactions leading into the next two weekends.

The fun doesn’t stop with Mobs in the Park as Minecraft Earth continues to gain momentum and roll out to more countries worldwide. Last week the game released in the U.S., earlier this week it became available to players in Western Europe and Japan, and the goal is for the game to be worldwide by the end of the year.

Since kicking off early access on Oct. 17, the global community has placed 240.4 million blocks, collected 76 million tappables and started 6.8 million crafting and smelting sessions! We’re so proud of how the community has embraced the game in early access rollout and look forward to bringing even more exciting experiences to players everywhere in the coming weeks.

Minecraft Earth’s Mobs in the Park will continue over the next two weekends, so players interested in receiving the Jolly Llama for themselves can visit the interactive pop-ups from 9 a.m. – 6 p.m. local time during the weekends of November 23-24 and November 30-December 1, or during a special appearance at Hudson Yards in New York City on Black Friday, November 29. Only at these locations will players be able to get first access to the holiday-inspired Jolly Llama before it’s available broadly in December.

For more information on Mobs in the Park and what’s next with early access rollout, visit Xbox Wire and Minecraft.net.

Go to Original Article
Author: Microsoft News Center

New Salesforce Customer 360 aims to unify data sources

SAN FRANCISCO — Salesforce’s new customer data and identity platform released this week connects data across sales, service, marketing and other departments within an organization to provide users with a single ID for each customer.

The platform, Salesforce Customer 360 Truth (also called Single Source of Truth and SSOT by Salesforce executives) is considered by many to be a customer data platform (CDP) — though that is only one component of the new platform.

Whatever one calls it, Salesforce Customer 360 Truth resembles competing CDPs from Oracle, Adobe and Microsoft in functionality. Customers, analysts and partners said the new feature bundle solves a problem endemic to many CX platforms: reconciling customer IDs and data across sales, marketing, e-commerce and customer service platforms to a golden record.

Features in Customer 360 Truth — which come at no extra charge to Salesforce subscribers — include customer data management, identity management and privacy tools that are available now. Salesforce plans to release a unified customer profile, the most significant feature, in 2020.

The capabilities, taken together, will not only aggregate updated customer data into one base like a CDP, but will be able to go further than CDPs, which are typically used by marketing and advertising teams. Customer 360 Truth features can route that data and push actions and personalizations to sales, service and support teams dealing with an individual customer, the company said.

Customer data will be structured on the Cloud Information Model open standard modeled by MuleSoft and developed jointly by AWS, Genesys and Salesforce for The Linux Foundation.

It’s all long overdue, IDC analyst Neil Ward-Dutton said.

“Salesforce should have done this five years ago. If anything’s surprising, it’s that they have managed to not have it until now, because many customers have more than one cloud and there’s been no easy way to get a view of individual, unique IDs,” Ward-Dutton said. “It’s instructive that it’s free. I think that’s the only thing they could do. If they would have charged for this, it would have been a really bad mistake.”

Salesforce Customer 360 Truth includes a Data Manager component. Customer 360 Truth connects customer data across Salesforce clouds and non-Salesforce systems and matches that data to individual customers to create a common profile and issue a single Salesforce ID.
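The matching step is the heart of any such profile unification. As a rough, hypothetical illustration of the idea (not Salesforce's actual matching logic), the sketch below merges records from different systems on normalized email and phone keys and assigns each resulting "golden" profile a single ID; the field names and the matching rule are invented for the example.

```python
# Illustrative identity resolution: merge records from different systems into one
# "golden" profile per customer and issue a single ID. The matching rule is simplistic.
import uuid

def match_key(record):
    # Normalize the fields used for matching (real platforms use richer rules).
    email = record.get("email", "").strip().lower()
    phone = "".join(ch for ch in record.get("phone", "") if ch.isdigit())
    return (email, phone)

def build_golden_profiles(records):
    profiles = {}  # match key -> merged profile
    for rec in records:
        profile = profiles.setdefault(match_key(rec),
                                      {"customer_id": str(uuid.uuid4()), "sources": []})
        profile["sources"].append(rec["system"])
        for field in ("name", "email", "phone"):   # last-write-wins survivorship
            if rec.get(field):
                profile[field] = rec[field]
    return list(profiles.values())

records = [
    {"system": "service",   "name": "A. Smith",    "email": "a.smith@example.com",  "phone": "555-0100"},
    {"system": "marketing", "name": "Alice Smith", "email": "A.Smith@example.com ", "phone": "(555) 0100"},
]
print(build_golden_profiles(records))  # one profile, two sources, one customer_id
```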

Customers, partners take stock

Customers generally reacted positively to the news of Salesforce Customer 360 Truth at Dreamforce here this week, with some wondering what data and process housecleaning outside of Salesforce will be required to use the tools.

That's the case for e.l.f. Cosmetics, said CIO and CTO Ekta Chopra. Her company runs Salesforce marketing, sales, service and e-commerce clouds and is also an early adopter of Salesforce's order management system, processing about a million transactions a year. While Customer 360 Truth features look promising, her company will have to evaluate how to manage different types of profiles, such as customers versus wholesalers.

“We have to make sure we’re gathering all that data in the best possible way,” Chopra said. “We’re not just a direct-to-consumer business.”

Hyland Software is both a Salesforce customer and a partner, with its OnBase enterprise content management system integration available on Salesforce’s AppExchange. Salesforce Customer 360 Truth is a move in the right direction to reconcile conflicting customer data, but the process will always require a mix of different vendors’ tools to nail it all down, said Ed McQuiston, Hyland executive vice president and chief commercial officer.

“There is no one, ubiquitous platform that gives you 360,” McQuiston said. “Salesforce plays a critical part for us in terms of understanding the customer, our interactions, et cetera. But we use our own product with it, because I want to see the contracts we have, the support information. I want that complete view.”

Patrick Morrissey, general manager of Salesforce partner Upland Software, said he thinks Customer 360 features will help Salesforce customers use Upland’s Altify revenue management tools more effectively.

“Customer revenue optimization intersects quite nicely with Customer 360,” Morrissey said. “The problem is that the data and processes don’t connect. The vision that Salesforce has around Customer 360 is fantastic, because it brings the data together for the customer and reduces friction.”

CDPs can only go so far

Salesforce might not call Customer 360 Truth a CDP because its capabilities extend beyond what competing CDPs do, said Joe Stanhope, analyst at Forrester Research, who watches the technology closely.

“Salesforce was talking quite a bit about CDPs in the early iterations of Customer 360,” Stanhope said. “But I think, over time, the scope evolved and expanded. Ultimately, Customer 360 is about more than a CDP, and even more than just marketing. Customer 360 is the key to enabling the Salesforce ecosystem with data.”

Arizona State University's CTO of EdPlus online learning, Donna Kidwell, sees the Salesforce tools as a good start for wrangling sprawling data. Her team is building a blockchain ledger to track the accomplishments of the university's learners, who include students pursuing degrees, professionals earning certifications, high schoolers attending camps and others who interact in some way with the university.

The ambitious project involves Salesforce CRM data and Salesforce Blockchain as a spoke of a much larger wheel that ultimately will enable data sharing across educational institutions and employers.

CDPs in general — and Salesforce Customer 360 Truth in particular — may help consolidate data that can be fed into the ledger at some point in the future. But ultimately, managing customer data across learning systems, HR applications, Salesforce and other contributing systems is a much larger problem than a CDP can solve.

“I’m definitely tracking the CDPs,” Kidwell said. “I’m hopeful that Salesforce will ease some of those concerns, but I can’t imagine they’ll be the single source. There’s not going to be a single source of truth. We’re actually going to need data strategies, and our technologies will help implement those strategies.”

Go to Original Article

Microsoft scientist accepts Hamburg Prize for Theoretical Physics for quantum contributions – Microsoft Quantum

This week, Dr. Matthias Troyer, a Distinguished Scientist at Microsoft, accepted the 2019 Hamburg Prize for Theoretical Physics – one of the most valuable German prizes in the field – for his groundbreaking contributions to the development of quantum Monte Carlo algorithms.

“In Professor Troyer, we are honoring a scientist whose work connects myriad areas of physics and computer science. On account of his current research in the field of quantum computing, he partners with universities and companies in the US and around the world. He has also set up an open-source platform in order to share his knowledge. By awarding the prize to Professor Troyer, we also wish to recognize this contribution to collaborative research,” explained Dr. Nina Lemmens, Member of the Executive Board of the Joachim Herz Stiftung.

Dr. Troyer works at the interface between computer science and theoretical physics and is one of just a handful of leading international researchers in this field. Monte Carlo algorithms can predict how tiny particles will interact within quantum mechanical many-body systems such as atoms and molecules, and Dr. Troyer’s work in this area is playing a key role in the research and ongoing development of quantum computers and superconducting materials.
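As a toy illustration of the sampling idea behind Monte Carlo methods, the sketch below uses the Metropolis algorithm to estimate the energy per spin of a small classical 1D Ising chain. Quantum Monte Carlo methods such as Dr. Troyer's extend this sampling idea to quantum many-body systems, where the sampling weights can become negative; that is the "sign problem" he describes below. The model, parameters and code are illustrative only.

```python
# Toy Metropolis Monte Carlo for a classical 1D Ising chain: sample spin
# configurations instead of summing over all 2**N of them.
import math
import random

N, J, T, steps = 32, 1.0, 2.0, 200_000
spins = [random.choice((-1, 1)) for _ in range(N)]

def site_energy(i):
    # Coupling of spin i to its two neighbors (periodic boundary conditions).
    return -J * spins[i] * (spins[(i - 1) % N] + spins[(i + 1) % N])

energy_sum, samples = 0.0, 0
for step in range(steps):
    i = random.randrange(N)
    dE = -2 * site_energy(i)                      # energy change if spin i is flipped
    if dE <= 0 or random.random() < math.exp(-dE / T):
        spins[i] *= -1                            # Metropolis acceptance
    if step > steps // 2:                         # measure after equilibration
        energy_sum += sum(site_energy(j) for j in range(N)) / 2  # halve to avoid double-counting bonds
        samples += 1

# Exact value for J=1, T=2 is about -tanh(0.5) ~= -0.46 per spin.
print("estimated energy per spin:", energy_sum / samples / N)
```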

When asked what this honor means to him, Dr. Troyer said, "One reason I came to Microsoft and why I want to build a quantum computer is that when inventing these Monte Carlo methods, we made big breakthroughs, but we also encountered a fundamental problem of Monte Carlo simulations of quantum systems, the so-called 'sign problem.' The workaround becomes exponentially difficult; a quantum computer will help us move past these barriers."

With the recent Microsoft announcement of Azure Quantum, teams will soon be able to experiment with running algorithms like Monte Carlo against both classical hardware in Azure and quantum hardware from partners, knowing these solutions will scale to future quantum systems as well.

The prize not only comes with a grant, but also entails research visits to Hamburg that will see Dr. Troyer give talks and work closely with doctoral candidates, postdocs, and other colleagues.

Dr. Troyer continued, “I’m looking forward to engaging the academic community in discussing and further advancing what we can do with quantum computing. As we think of quantum algorithms for material science, what problems can we solve now with quantum simulations? And how do we develop quantum algorithms to run once we have a fully scalable quantum computer?”

“The connection to Hamburg means that we can engage with the academic and scientific communities, and with that, I look forward to talking to the people in Hamburg – and around the world – about applying quantum systems and quantum computing to make an impact on material science problems.”

Microsoft and the Azure Quantum team congratulate Dr. Troyer on this significant recognition, and we look forward to supporting his important work in making an impact in solving some of the world’s toughest challenges with quantum computing.

Go to Original Article
Author: Microsoft News Center

AWS, Azure and Google peppered with outages in same week

AWS, Microsoft Azure and Google Cloud all experienced service degradations or outages this week, an outcome that suggests customers should accept that cloud outages are a matter of when, not if.

In AWS’s Frankfurt region, EC2, Relational Database Service, CloudFormation and Auto Scaling were all affected Nov. 11, with the issues now resolved, according to AWS’s status page.

Azure DevOps services for Boards, Repos, Pipelines and Test Plans were affected for a few hours in the early hours of Nov. 11, according to its status page. Engineers determined that the problem had to do with identity calls and rebooted access tokens to fix the system, the page states.

Google Cloud said some of its APIs in several U.S. regions were affected, and others experienced problems globally on Nov. 11, according to its status dashboard. Affected APIs included those for Compute Engine, Cloud Storage, BigQuery, Dataflow, Dataproc and Pub/Sub. Those issues were resolved later in the day.

Google Kubernetes Engine also went through some hiccups over the past week, with nodes in some recently upgraded container clusters experiencing high rates of kernel panics. Roughly the Linux equivalent of Windows' "blue screen of death," a kernel panic is a condition in which a system's OS can't recover from an error quickly or easily.

The company rolled out a series of fixes, but as of Nov. 13, the status page for GKE remained in orange status, which indicates a small number of projects are still affected.

AWS, Microsoft and Google have yet to provide the customary post-mortem reports on why the cloud outages occurred, although more information could emerge soon.

Move to cloud means ceding some control

The cloud outages at AWS, Azure and Google this week were far from the worst experienced by customers in recent years. In September 2018, severe weather in Texas caused a power surge that shut down dozens of Azure services for days.

Cloud providers have aggressively pursued region and zone expansions to help with disaster recovery and high-availability scenarios. But customers must still architect their systems to take advantage of the expanded footprint.

Still, customers have much less control when it comes to public cloud usage, according to Stephen Elliot, an analyst at IDC. That reality requires some operational sophistication.

“Networks are so interconnected and distributed, lots of partners are involved in making a service perform and available,” he said. “[Enterprises] need a risk mitigation strategy that covers people, process, technologies, SLAs, etc. It’s a myth that outages won’t happen. It could be from weather, a black swan event, security or a technology glitch.”

This fact underscores why more companies are experimenting with and deploying workloads across hybrid and multi-cloud infrastructures, said Jay Lyman, an analyst at 451 Research. “They either control the infrastructure and downtime with on-premises deployments or spread their bets across multiple public clouds,” he said.

Ultimately, enterprise IT shops can weigh the challenges and costs of running their own infrastructure against public cloud providers and find it difficult to match, said Holger Mueller, an analyst at Constellation Research.

“That said, performance and uptime are validated every day, and should a major and longer public cloud outage happen, it could give pause among less technical board members,” he added.

Go to Original Article

InfoTrax settles FTC complaint, will implement infosec program

InfoTrax Systems this week settled with the Federal Trade Commission regarding allegations that the company failed to protect consumer data after a nearly two-year-long data breach.

The FTC filed a complaint against the Utah-based multi-level marketing software company in the wake of attackers stealing sensitive information on approximately one million customers over the course of more than 20 malicious infiltrations between May 2014 and March 2016.

InfoTrax only became aware of the attacks in March 2016 because a data archive file created by the malicious actors grew so large that servers reached maximum storage capacity. “Only then did Respondents begin to take steps to remove the intruder from InfoTrax’s network,” the FTC wrote in the complaint.

The FTC asserted that InfoTrax — in part — failed to implement a process to inventory and delete unnecessary customer information, failed to detect malicious file uploads and stored consumers’ personal information, including Social Security numbers, bank account information, payment card information and more, “in clear, readable text on InfoTrax’s network.”

The FTC added, “Respondents could have addressed each of the failures … by implementing readily available and relatively low-cost security measures.”

As a result of the settlement, InfoTrax is “prohibited from collecting, selling, sharing or storing personal information unless they implement an information security program that would address the security failures identified in the complaint. This includes assessing and documenting internal and external security risks; implementing safeguards to protect personal information from cybersecurity risks; and testing and monitoring the effectiveness of those safeguards,” the FTC wrote in a statement published Tuesday.

The company will also have to obtain assessments of its information security program from a third party, approved by the FTC, every two years.

InfoTrax responds

On Nov. 12, Scott Smith, president and newly appointed CEO of InfoTrax, released a statement that is no longer hosted at its original link on PR Newswire. A copy was published by The Herald Journal.

Smith claimed the company “took immediate action” to secure data and prevent further unauthorized access after discovering the breach. The company then contacted affected clients, law enforcement agencies, including the FBI, as well as “top forensic security experts to help us identify where our system was vulnerable and to take steps to improve our security and prevent further incidents like this.”

“Without agreeing with the FTC’s findings from their investigation, we have signed a consent order that outlines the security measures that we will maintain going forward, many of which were implemented before we received the FTC’s order,” Smith said. “We deeply regret that this security incident happened. Information security is critical and integral to our operations, and our clients’ and customers’ security and privacy is our top priority.”

In response to SearchSecurity’s request for the original statement, InfoTrax offered a slightly modified one from the CEO, which notably removed the part about not agreeing to the FTC’s findings:

“This incident happened nearly four years ago, at which time we took immediate steps to identify and remediate the issue. We notified our clients and worked closely with security experts and law enforcement. We deeply regret that the incident happened,” Smith said in the statement. “Even though the FTC has just now released their documents, this is an issue we responded to immediately and aggressively as soon as we became aware of it in 2016, and we have not experienced additional incidents since then. The privacy and security of our clients’ information continues to be our top priority today.”

Richard Newman, an FTC defense attorney at Hinch Newman LLP in New York, told SearchSecurity that his overall take on the case was that “The FTC’s enforcement of data security matters based upon alleged unreasonable data security practices is becoming an increasingly common occurrence. The Commission does so under various theories, including that such acts and practices are ‘unfair’ in violation of the FTC Act.”

He added that the stipulation that InfoTrax is prohibited from collecting, sharing, or selling user data until they fix their security issues is “not uncommon,” and that “stipulated settlement agreements in this area have recently undergone an overhaul based upon judicial developments and enforceability-related challenges. Terms such as mandated information security programs, security assessments, etc. are now commonplace in such settlements.”

Regarding whether or not the settlement is adequate, Adam Solander, a partner at King & Spalding LLP in Atlanta, told SearchSecurity, “It’s hard to judge without being involved intimately with the facts, but the FTC is an aggressive organization. They take privacy and security very seriously, and I think this is evidence of how aggressive they are in their enforcement of it.”

Go to Original Article