For Sale – Draytek and i7 6700


Go to Original Article

For Sale – LG Ultragear 32GK850F 32-inch LCD 144Hz Gaming Monitor – Black.


Price reduced as a quick sale is wanted. No dead pixels, no marks or scratches; immaculate, like new. Comes with box, power supply and DisplayPort cable. Selling as I’ve chosen a different size monitor. Collection only; you’re welcome to come and view.

Go to Original Article

For Sale – Elitebook x360 1030


Go to Original Article

For Sale – Lenovo 300 Compact PC – Windows 10

You have a PM.

Go to Original Article

International Women’s Day 2020: Creating opportunity for all – Microsoft Partner Network

Sunday, March 8 will mark a day that’s close to my heart: International Women’s Day.

The day shines a light on the progress we’ve made in recognizing the potential of a diverse and inclusive economy, and the power that comes from developing strong female role models. Yet while we can reflect on that progress, we must also acknowledge the work that still must be done. It is critical to me that we address the challenges that still exist for women in today’s business landscape: for a variety of reasons, many women around the world are locked out of opportunities that others take for granted.

I’m proud of the work we’ve done at Microsoft so far to increase access and opportunities for women through our workplace culture, policies and technologies. I believe we have a responsibility to highlight other organizations that have also prioritized diversity and inclusion and encourage others to do the same.

Building opportunity and access for all through technology

Technology helps organizations empower their employees, optimize their operations, connect with their customers and transform their products. It’s also a key factor in building an inclusive economy; an economy that harnesses the power of diversity to create opportunities and positive business outcomes for all. At Microsoft, we understand that a diverse work force inspires diverse solutions, which ultimately helps drive innovations that benefit everyone.

That’s why I am excited to share that Microsoft is supporting the United Nations’ Sustainable Development Goals through our #BuildFor2030 campaign. Through October, we will be highlighting Microsoft partners with solutions that align to the UN’s goals. And in celebration of Women’s History Month and International Women’s Day on March 8, we will be focusing on solutions by women-led organizations within our Microsoft partner community. I encourage you to read more about these incredible innovations here.

These solutions showcase the entrepreneurial spirit of women in technology, a community that is grossly underrepresented in the marketplace today. Recent studies suggest that if women and men participated equally as entrepreneurs, global GDP could rise by approximately 3% to 6%, boosting the global economy by as much as $5 trillion. If we work together, we can start that shift and create more opportunities for everyone.


Did you know?

According to the McKinsey Global Institute:

  • Companies in the top quartile for gender diversity are 21 percent more profitable than companies in the bottom quartile
  • Companies in the top quartile for ethnic and cultural diversity are 33 percent more likely to outperform companies in the bottom quartile
  • Closing the gender gap in the workforce could add $28 trillion to the global GDP

Women in Cloud

In January, Microsoft hosted the Women in Cloud Summit in Redmond, and I had the privilege of discussing how we can all work to create more opportunities for women in technology. Women in Cloud is a community-led organization that brings together female entrepreneurs, global leaders, corporations, and policy makers to support economic development for women in tech. They have vowed to help create $1 billion in economic access and opportunity by 2030.

As an executive sponsor of this initiative, I have sat down with many female business owners and have heard their struggles, triumphs and breakthroughs. Everyone I’ve met has emphasized the importance of access to technology, customers, partners, and investments. My team and I are focused on creating access for their growth through co-marketing and co-selling opportunities as we strive to create an inclusive marketplace for all partners to deploy cloud solutions and services.

Building for the future

While we are focused on creating equal access and opportunity for women business owners today, we must also prepare the next generation of entrepreneurs and female tech leaders. In the global economy and businesses of the future, understanding and innovating with technology will be a core skill in any job. Young women need to embrace technology and develop skills and passions that will be key success factors in a world where technology is part of every business in every industry.

I’d like to invite all Microsoft partners to join other impact-oriented technology solution leaders in the #BuildFor2030 campaign to highlight their innovative solutions. And in honor of International Women’s Day, I encourage you to take action and drive momentum towards creating a gender-equal society by supporting this campaign.

Go to Original Article
Author: Steve Clarke

Data Science Central co-founder talks AI, data science trends

The data science field has changed greatly with the advent of AI. Artificial intelligence has enabled the rise of citizen data scientists, driven the automation of data scientists’ workloads and increased the need for more skilled data scientists.

Vincent Granville, co-founder of Data Science Central, a community and resource site for data specialists, expects to see an increase in AI and IoT in data science over the next few years, even as AI continues to change the data science field.

In this Q&A, Granville discusses data science trends, the impact of AI and IoT on data scientists, how organizations and data scientists will have to adapt to increased data privacy regulations, and the evolution of AI.

Data Science Central was acquired by TechTarget on March 4.

Will an increase in citizen data scientists due to AI, as well as an increase of more formal data science education programs, help fix the so-called data scientist shortage?

Vincent Granville: I believe that we will see an increase on two fronts. We will see more data science programs offered by universities, perhaps even doctorates in addition to master’s degrees, as well as more bootcamps and online training aimed at practitioners working with data but lacking some skills, such as statistical programming or modern techniques like deep learning, an older technique that became popular recently due to the computational power now available to train and optimize these models.


There is also a parallel trend that will grow: hiring professionals not traditionally thought of as data scientists, such as physicists, who have significant experience working with data. This is already the case in fintech, where these professionals learn the new skills required on the job. Along with corporations training staff internally by sending selected employees to tech and data bootcamps, this will help increase the pipeline of potential recruits for the needed positions.

Also, AI itself will help build more tools to automate some of the grunt work, like data exploration, that many data scientists do today and that currently eats up to 80% of their time. Think of it as AI to automate AI.
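
As a minimal sketch of that kind of automation, the snippet below runs a generic profiling pass over a pandas DataFrame, the sort of first-look exploration that tooling can take off a data scientist’s plate. The sample data and column names are hypothetical.

```python
# Minimal sketch of automated data exploration: a generic profiling pass
# over any pandas DataFrame. Sample data and column names are hypothetical.
import pandas as pd

def profile(df: pd.DataFrame) -> pd.DataFrame:
    """Summarize each column: dtype, missing values and distinct counts."""
    return pd.DataFrame({
        "dtype": df.dtypes.astype(str),
        "missing": df.isna().sum(),
        "missing_pct": (df.isna().mean() * 100).round(1),
        "distinct": df.nunique(),
    })

claims = pd.DataFrame({"age": [34, None, 51], "city": ["Lyon", "Paris", "Lyon"]})
print(profile(claims))
```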

Similarly, how will, or how has, data science changed with the advent of AI that can automate various parts of the data science workflow?

Granville: We will see more automation of data science tasks. In my day-to-day activities, I have automated as much as I can, or outsourced or used platforms to do a number of tasks — even automating pure mathematical work such as computing integrals or finding patterns in number sequences.
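
As one concrete example of automating “pure mathematical work such as computing integrals,” a symbolic engine like SymPy can do this in a few lines; the integrand below is an arbitrary illustration.

```python
# Computing integrals automatically with SymPy's symbolic engine.
# The integrand is an arbitrary example chosen for illustration.
import sympy as sp

x = sp.symbols("x")
expr = x * sp.sin(x)

print(sp.integrate(expr, x))              # antiderivative: -x*cos(x) + sin(x)
print(sp.integrate(expr, (x, 0, sp.pi)))  # definite integral over [0, pi]: pi
```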

The issue is resistance from employees to using such techniques, as they may perceive them as a way to replace them. But the contrary is true: Anything you do [manually] that can be automated actually lowers your job security. A change in mentality must occur for further adoption of automated data science for specific tasks, simple or not so simple, such as the creation of taxonomies, or programs that write programs.

The trend probably started at least 15 years ago with the advent of machine-to-machine communications, using APIs and the internet at large for machines, aka robots, to communicate among themselves and even make decisions. Now, with a huge amount of unexploited sensor data available, it even has a term of its own: IoT.

An example is this: eBay purchases millions of keywords on Google; the process, including predicting the value and ROI and set[ting] the pricing for keywords, is fully automated. There is a program at eBay that exchanges info with one running at Google to make this transparent, including keyword purchasing, via programmed APIs. Yet eBay employs a team of data scientists and engineers to make sure things run smoothly and are properly maintained, as does Google.
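
A back-of-the-envelope sketch of the pricing step in such a loop might look like the following. The keywords, predicted conversion rates and target ROI are all invented for illustration; this is not eBay’s or Google’s actual system.

```python
# Hypothetical sketch of automated keyword pricing: given a predicted
# conversion rate and revenue per conversion, derive the highest
# cost-per-click bid that still meets a target ROI. All figures invented.

def max_cpc_bid(conv_rate: float, revenue_per_conversion: float,
                target_roi: float) -> float:
    """Highest cost per click such that revenue/cost - 1 >= target_roi."""
    return conv_rate * revenue_per_conversion / (1.0 + target_roi)

predicted = {"vintage camera": 0.04, "usb c hub": 0.02}  # conversion rates
for keyword, conv_rate in predicted.items():
    bid = max_cpc_bid(conv_rate, revenue_per_conversion=12.0, target_roi=0.25)
    print(f"{keyword}: bid up to ${bid:.2f} per click")
    # a real system would now submit the bid through the ad platform's API
```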

How will increased data privacy regulations and a larger focus on cybersecurity change data science?

Granville: It will always be a challenge to find the right balance. People are getting concerned that their data is worth something, more than just $20, and don’t like to see this data sold and resold time and again by third parties, or worse, hijacked or sold for nefarious purposes such as surveillance. Anything you post on Facebook can be analyzed by third parties and end up in the hands of government agencies from various countries, for profiling purposes or detection of undesirable individuals.

Some expectations are unrealistic: You cannot expect corporations to reveal what is hidden in the deep layers of their deep learning algorithms. This is protected intellectual property. When Google shows you search results, nobody, not even Google, knows exactly how the results you see, sometimes personalized to you, came up that way. But Google publishes patents about these algorithms, and everyone can check them.

The same is true with credit scoring and the refusal to offer a loan. I think in the future we will see more and more auditing of these automated decisions. Sources of bias will have to be found and handled. Sources of error due to ID theft, for example, will have to be found and addressed. The algorithms are written by human beings, so they are no less biased than the human beings who designed them in the first place. Some seemingly innocuous decisions, such as deciding which features, or variables, to introduce in your algorithm, potentially carry a bias.
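
As a minimal sketch of what such an audit could check, the snippet below compares the rate of favorable automated decisions across two groups. The data and the 0.8 flag threshold (the common “four-fifths rule”) are illustrative only; real fairness audits use richer metrics.

```python
# Hedged sketch of a simple bias audit: compare favorable-decision rates
# across groups. Data is invented for illustration.
from collections import defaultdict

decisions = [  # (group, loan_approved)
    ("A", True), ("A", True), ("A", False),
    ("B", True), ("B", False), ("B", False),
]

totals, approved = defaultdict(int), defaultdict(int)
for group, ok in decisions:
    totals[group] += 1
    approved[group] += ok  # True counts as 1

rates = {g: approved[g] / totals[g] for g in totals}
ratio = min(rates.values()) / max(rates.values())
print(rates, f"disparate-impact ratio: {ratio:.2f}")  # below 0.8 raises a flag
```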

I could imagine some companies [may] relocate … or even stop doing business altogether in some countries that cause too many challenges. This is more likely to happen to small companies, as they don’t have the resources to comply with a large array of regulations. Yet we might see in the future AI tools that do just that: help your business comply transparently with all local laws. We have that already for tax compliance.

What other data science trends can we expect to see in 2020 and beyond?

Granville: We live in a world with so many problems arising all the time — some caused by new technologies. So, the use of AI and IoT will increase.


Some problems will find solutions in the next few years, such as fake-news detection or robocalls, just as it took over 10 years to fix email spam. But it is not just a data science issue: if companies benefit financially in the short term from the bad stuff, like more revenue to publishers because of fake news or clickbait, or more revenue to mobile providers due to robocalls, it needs to be addressed with more than just AI.

Some industries evolve more slowly and will see benefits in using AI in the future: Think about automated medical diagnostics or personalized dosing of drugs, small lawsuits handled by robots, or even kids at school being taught, at least in part, by robots. And one of the problems I face all the time with my spell-checker is its inability to detect whether I am writing in French or English, so it creates new typos rather than fixing them.

Chatbots will get better too, eventually, for tasks such as customer support, or purchasing your groceries via Alexa without even setting foot in a grocery store or typing your shopping list. In the very long term, I could imagine the disappearance of written language, replaced by humans communicating orally with machines.

Go to Original Article

For Sale – Type C to USB3.0 Hub & Card Reader


Go to Original Article

Splice Machine 3.0 integrates machine learning capabilities into its database

Databases have long been used for transactional and analytics use cases, but they also have practical utility to help enable machine learning capabilities. After all, machine learning is all about deriving insights from data, which is often stored inside a database.

San Francisco-based database vendor Splice Machine is taking an integrated approach to enabling machine learning with its eponymous database. Splice Machine is a distributed SQL relational database management system that includes machine learning capabilities as part of the overall platform.

Splice Machine 3.0 became generally available on March 3, bringing with it updated machine learning capabilities. It also has a new Kubernetes-based, cloud-native model for cloud deployment and enhanced replication features.

In this Q&A, Monte Zweben, co-founder and CEO of Splice Machine, discusses the intersection of machine learning and databases and provides insight into the big changes that have occurred in the data landscape in recent years.

How do you integrate machine learning capabilities with a database?


Monte Zweben: The data platform itself has tables, rows and schema. The machine learning manager that we have native to the database has notebooks for developing models, Python for manipulating the data, algorithms that allow you to model, and model workflow management that allows you to track the metadata on models as they go through their experimentation process. And finally, we have in-database deployment.

So as an example, imagine a data scientist in the insurance industry working in Splice Machine. They have an application for claims processing, and they are building out models inside Splice Machine to predict claims fraud. There’s a function in Splice Machine called deploy, and what it will do is take a table and a model to generate database code. The deploy function builds a trigger on the database table that tells the table to call a stored procedure containing the model for every new record that comes into the table.

So what does this mean in plain English? Every time new claims come into the claims table, the system automatically triggers, grabs those claims, runs the model that predicts claim fraud and outputs those predictions into another table. And now, all of a sudden, you have real-time, in-the-moment machine learning that is detecting claim fraud on first notice of loss.
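
To make the pattern concrete, here is a minimal, hypothetical sketch of the trigger-plus-model idea using SQLite’s user-defined functions as a stand-in for a stored procedure. This is not Splice Machine’s actual generated code; its deploy function produces the equivalent plumbing for you, with the trained model inside a stored procedure.

```python
# Hypothetical illustration of the trigger + in-database scoring pattern,
# using SQLite as a stand-in. Not Splice Machine's actual generated code.
import sqlite3

def score_claim(amount: float, days_open: int) -> float:
    """Toy stand-in for a trained fraud model."""
    return min(1.0, amount / 10_000 + 0.01 * days_open)

conn = sqlite3.connect(":memory:")
conn.create_function("score_claim", 2, score_claim)
conn.executescript("""
CREATE TABLE claims (id INTEGER PRIMARY KEY, amount REAL, days_open INTEGER);
CREATE TABLE claim_predictions (claim_id INTEGER, fraud_score REAL);
CREATE TRIGGER score_new_claim AFTER INSERT ON claims
BEGIN
    INSERT INTO claim_predictions
    VALUES (NEW.id, score_claim(NEW.amount, NEW.days_open));
END;
""")
conn.execute("INSERT INTO claims VALUES (1, 4500.0, 10)")
print(conn.execute("SELECT * FROM claim_predictions").fetchall())
# prints roughly [(1, 0.55)]: each new claim is scored as it lands in the table
```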

What does distributed SQL mean to you?

Zweben: So at its heart, it’s about sharing data across multiple nodes. That provides you the ability to parallelize computation and gain elastic scalability. That is the most important distributed attribute of Splice Machine.

In our new 3.0 release, we just added distributed replication. It’s another element of distribution where you have secondary Splice Machine instances in geo-replicated areas, to handle failover for disaster recovery.

What’s new in Splice Machine 3.0?

Zweben: We moved our cloud stack for Splice Machine from an old Mesos architecture to Kubernetes. Now our container-based architecture is all Kubernetes, and that has given us the opportunity to enable the separation of storage and compute. You can literally pause Splice Machine clusters and turn them back on. This is a great utility for consumption-based usage of databases.

Along with our upgrade to Kubernetes, we also upgraded our machine learning manager from an older notebook technology called Zeppelin to a newer notebook technology that has really gained momentum in the marketplace, as much as Kubernetes has in the DevOps world. Jupyter notebooks have taken off in the data science space.

We’ve also enhanced our workflow management tool, MLflow, an open source tool that originated with Databricks; we’re part of that community. MLflow allows data scientists to track their experiments and keeps that record of metadata available for governance.
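
For readers unfamiliar with it, a minimal MLflow tracking call looks like the following. The run name, parameters and metric values here are hypothetical; the snippet assumes MLflow is installed (pip install mlflow) and logs to the local ./mlruns directory by default.

```python
# Minimal MLflow experiment-tracking sketch. Names and values are
# hypothetical; runs are recorded to ./mlruns unless configured otherwise.
import mlflow

with mlflow.start_run(run_name="claims-fraud-v1"):
    mlflow.log_param("model_type", "gradient_boosting")
    mlflow.log_param("max_depth", 6)
    mlflow.log_metric("auc", 0.91)
    mlflow.set_tag("owner", "data-science-team")
# `mlflow ui` then serves this metadata for browsing and governance review.
```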

What’s your view on open source and the risk of a big cloud vendor cannibalizing open source database technology?

Zweben: We do compose many different open source projects into a seamless and highly performant integration. Our secret sauce is how we put these things together at a very low level, with transactional integrity, to enable a single integrated system. This composition that we put together is open source, so that all of the pieces of our data platform are available in our open source repository, and people can see the source code right now.

I’m intensely worried about cloud cannibalization. I switched to an AGPL license specifically to protect against cannibalization by cloud vendors.

On the other hand, we believe we’re moving up the stack. If you look at our machine learning package, and how it’s so inextricably linked with the database, and the reference applications that we have in different segments, we’re going to be delivering more and more higher-level application functionality.

What are some of the biggest changes you’ve seen in the data landscape over the seven years you’ve been running Splice Machine?

Zweben: With the first generation of big data, it was all about data lakes: let’s just get all the data the company has into one repository. Unfortunately, that has proven time and time again, at company after company, to just produce data swamps.

The repositories work and they’re scalable, but no one actually uses the data, and this was a mistake for several reasons.


Instead of thinking about storing the data, companies should think about how to use the data. Start with the application and how you are going to make the application leverage new data sources.

The second reason why this was a mistake was organizational: the data scientists who know AI were all centralized in one data science group, away from the application. They are not the subject matter experts for the application.

When you focus on the application and retrofit the application to make it smart and inject AI, you can get a multidisciplinary team. You have app developers, architects, subject-matter experts, data engineers and data scientists, all working together on one purpose. That is a radically more effective and productive organizational structure for modernizing applications with AI.

Go to Original Article

Microsoft for Healthcare: Empowering our customers and partners to provide better experiences, insights and care – The Official Microsoft Blog

At Microsoft, our goal within healthcare is to empower people and organizations to address the complex challenges facing the healthcare industry today. We help do this by co-innovating and collaborating with our customers and partners as a trusted technology provider. Today, we’re excited to share progress on the latest innovations from Microsoft aimed at helping address the most prevalent and persistent health and business challenges:

  • Empower care teams with Microsoft 365: Available in the coming weeks, the new Bookings app in Microsoft Teams will empower care teams to schedule, manage and conduct virtual visits with remote patients via video conference. Also coming soon, clinicians will be able to target Teams messages to recipients based on the shift they are working. Finally, healthcare customers can support their security and compliance requirements with the HIPAA/HITECH assessment in Microsoft Compliance Score.
  • Protect health information with Azure Sphere: Azure Sphere, Microsoft’s integrated security solution for IoT (Internet of Things) devices and equipment, is now widely available for the development and deployment of secure, connected devices. Azure Sphere helps securely personalize patient experiences with connected devices and solutions. And, to make it easier for healthcare leaders to develop their own IoT strategies, today we’re launching a new IoT Signals report focused on the healthcare industry that provides an industry pulse on the state of IoT adoption and helpful insights for IoT strategies. Learn more about Microsoft’s IoT offerings for healthcare here.
  • Enable personalized virtual care with Microsoft Healthcare Bot: Today, we’re pleased to announce that Microsoft Healthcare Bot, our HITRUST-certified platform for creating virtual health assistants, is enriching its healthcare intelligence with new built-in templates for healthcare-specific use cases and expanding its integrated medical content options. With the addition of Infermedica, a cutting-edge triage engine based on advanced artificial intelligence (AI) that enables symptom checking in 17 languages, Healthcare Bot is empowering providers to offer global access to care.
  • Reimagine healthcare using new data platform innovations: With the 2019 release of Azure API for FHIR, Microsoft became the first cloud provider with a fully managed, enterprise-grade service for health data in the Fast Healthcare Interoperability Resources (FHIR) format. We’re excited to expand those offerings with several new innovations around connecting, converting and transforming data. The first is the Power BI FHIR Connector, which makes it simple and easy to bring FHIR data into Power BI for analytics and insights. The second, the IoMT (Internet of Medical Things) FHIR Connector, is now available as open source software (OSS) and allows for seamless ingestion, normalization and transformation of Protected Health Information data from health devices into FHIR. Another new open source project, FHIR Converter, provides an easy way to convert healthcare data from legacy formats (i.e., HL7v2) into FHIR. And lastly, FHIR Tools for Anonymization is now offered via OSS and enables anonymization and pseudonymization of data in the FHIR format, including capabilities for redaction and date shifting in accordance with the HIPAA privacy rule. (A minimal sketch of reading FHIR data over REST follows this list.)
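
To ground the FHIR items above, here is a minimal sketch of reading a resource over FHIR’s standard REST interface. The base URL and patient ID are hypothetical, and a real Azure API for FHIR endpoint would also require an OAuth2 bearer token, omitted here.

```python
# Minimal sketch of a FHIR R4 REST read. The base URL and ID are
# hypothetical; Azure API for FHIR would also require an
# "Authorization: Bearer <token>" header.
import requests

FHIR_BASE = "https://example.azurehealthcareapis.com"  # hypothetical endpoint

resp = requests.get(
    f"{FHIR_BASE}/Patient/12345",
    headers={"Accept": "application/fhir+json"},
)
resp.raise_for_status()
patient = resp.json()
print(patient["resourceType"], patient.get("name"))
```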

Frictionless exchange of health information in FHIR makes it easier for researchers and clinicians to collaborate, innovate and improve patient care. As we move forward, working with our customers, partners and others across the health ecosystem, Microsoft is committed to enabling and improving interoperability and the required standards to make it easier for patients to manage their healthcare and control their information. At the same time, trust, privacy and compliance are a top priority: making sure Protected Health Information (PHI) remains under the control and custodianship of healthcare providers and their patients.

We’ve seen a growing number of healthcare organizations not only deploy new technologies, but also begin to develop their own digital capabilities and solutions that use data and AI to transform and innovate healthcare and life sciences in profoundly positive ways. Over the past year, together with our customers and partners, we’ve announced new strategic partnerships aimed at empowering this transformation.

For example, to enable caregivers to focus more on patients by dramatically reducing the burden of documenting doctor-patient visits, Nuance has released Nuance Dragon Ambient eXperience (DAX). This ambient clinical intelligence (ACI) technology is enriched by AI and cloud capabilities from Microsoft, including the ambient intelligence technology EmpowerMD, which is coming to market as part of Nuance’s DAX solution. The solution aims to transform the exam room by deploying ACI to capture, with patient consent, interactions between clinicians and patients so that clinical documentation writes itself.

Among health systems, Providence St. Joseph Health is using Microsoft’s cloud, AI, productivity and collaboration technologies to deploy next-generation healthcare solutions while empowering their employees. NHS Calderdale is enabling patients and their providers to hold appointments virtually via Microsoft Teams for routine and follow-up visits, which helps lower costs while maintaining the quality of care. The U.S. Veterans Affairs Department is embracing mixed reality by working with technology providers Medivis, Microsoft and Verizon to roll out its first 5G-enabled hospital. And specifically for health consumers, Walgreens Boots Alliance will harness the power of our cloud, AI and productivity technologies to empower care teams and deliver new retail solutions to make healthcare delivery more personal, affordable and accessible.

Major payor, pharmaceutical and health technology platform companies are also transforming healthcare in collaboration with us. Humana will develop predictive solutions for personalized and secure patient support, and by using Azure, Azure AI and Microsoft 365, they’ll also equip home healthcare workers with real-time access to information and voice technology to better understand key factors that influence patient health. In pharmaceuticals, Novartis will bring Microsoft AI capabilities together with its deep expertise in life sciences to address specific challenges that make the process of discovering, developing and delivering new medicines so costly and time-consuming.

We’re pleased to showcase how together with our customers and partners, we’re working to bring healthcare solutions to life and positively impact the health ecosystem.

To keep up to date with the latest announcements visit the Microsoft Health News Room.

About the authors:
As Corporate Vice President of Health Technology and Alliances, Dr. Greg Moore leads the dedicated research and development collaborations with our strategic partners, to deliver next-generation technologies and experiences for healthcare.

Vice President and Chief Medical Officer Dr. David Rhew recently joined Microsoft’s Worldwide Commercial Business Healthcare leadership team and provides executive-level support, engaging in business opportunities with our customers and partners.

As Corporate Vice President of Healthcare, Peter Lee leads the Microsoft organization that works on technologies for better and more efficient healthcare, with a special focus on AI and cloud computing.


Go to Original Article
Author: Microsoft News Center

Intel CSME flaw deemed ‘unfixable’ by Positive Technologies

An underlying flaw in Intel chipsets, originally disclosed in May 2019, was recently discovered by Positive Technologies to be far worse than previously reported.

Researchers from the vulnerability management vendor discovered that a bug in the read-only memory of the Intel Converged Security and Management Engine (CSME) could allow threat actors to compromise platform encryption keys and steal sensitive information. The Intel CSME vulnerability, known as CVE-2019-0090, is present in both the hardware and the firmware of the boot ROM and affects all chips other than Intel’s 10th-generation “Ice Point” processors.

“We started researching the Intel CSME IOMMU [input-output memory management unit] in 2018,” Mark Ermolov, lead specialist of OS and hardware security at Positive Technologies, said via email. “We’ve been especially interested in that topic because we’ve known that Intel CSME shares its static operative memory with the host (main CPU) on some platforms. Studying the IOMMU mechanisms, we were very surprised that the two main mechanisms of CSME and IOMMU are turned off by default. Next, we started researching Intel CSME boot ROM’s firmware to ascertain when CSME turns on the IOMMU mechanisms, and we found that there is a very big bug: the IOMMU is activated too late, after the x86 paging structures were created and initialized, a problem we found in October.”

The bug, which Positive Technologies calls “unfixable,” is present in most Intel chipsets released in the last five years.

“Intel CSME is responsible for initial authentication of Intel-based systems by loading and verifying all other firmware for modern platforms,” Ermolov said. “It is the cryptographic basis for hardware security technologies developed by Intel and used everywhere, such as DRM, fTPM [firmware Trusted Platform Module] and Intel Identity Protection. The main concern is that, because this vulnerability allows a compromise at the hardware level, it destroys the chain of trust for the platform as a whole.”

Although Intel has issued patches and mitigations that complicate the attack, Positive Technologies said fully patching the flaw is impossible because firmware updates can’t fully address all of the vectors.

“In the CVE-2019-0090 patch, Intel blocked ISH [Integrated Sensors Hub], so now it can’t issue DMA transactions to CSME. But we’re convinced there are other exploitation vectors, and they will be found soon. To exploit a system that has not been patched for CVE-2019-0090, an attacker doesn’t need to be very sophisticated,” Ermolov said.

In addition, Positive Technologies said extracting the chipset key is impossible to detect.

“The chipset key being leaked can’t be detected by CSME or by the main OS,” Ermolov said. “You’re already in danger, but you don’t know it. The attack (by DMA) also doesn’t leave any footprint. When an attacker uses the key to compromise the machine’s identity, this might be detected by you and you only, but only after it has happened, when it is too late.”

Once they’ve breached the system, threat actors can exploit this vulnerability in several ways, according to Positive Technologies.

“With the chipset key, attackers can pass off an attacker computer as the victim’s computer. They can gain remote certification into companies to access digital content usually under license (such as videos or films from companies like Netflix),” the company said via email. “They can steal temporary passwords to embezzle money. They can pose as a legitimate point-of-sale payment terminal to charge funds to their own accounts. Abusing this vulnerability, criminals can even spy on companies for industrial espionage or steal sensitive data from customers.”

Positive Technologies recommended disabling Intel CSME-based encryption or completely replacing CPUs with the latest generation of Intel chips.

This is the second vulnerability disclosed regarding Intel chips since January, when computer science researchers discovered a speculative execution attack that leaks data from an assortment of Intel processors released before the fourth quarter of 2018.

Go to Original Article