As title, for sale is one Intel NUC6CAYH with one stick of 8GB RAM and an Integral V series 240GB SSD, which has 198GB free after Windows & programs. It has a licensed Win 10 Pro installed, together with the 1909 update, as well as: VLC; BBC iPlayer; ITV Hub; All 4; My 5; Amazon; Netflix; YouTube; Firefox; and Glary Utilities. It has had very little use (see below) and is in excellent condition, with the factory protective film still attached to the top surface and the front. It all works just as it should with zero problems noted; it comes in the original box with backplate, booklets etc., plus an extra spare power supply.
The plan was that following a major operation I would be bed-ridden for around three months, and this was going to be the bedroom PC “link” to the outside world while I recuperated. The spare power supply was to allow easy removal from behind my bedroom TV to wherever I needed it, and the PSU has been tested and works just fine. So that’s why it was set up as almost a Windows HTPC, so that I could use it for both media and business purposes. BUT thankfully my recovery surprised my consultant as much as it surprised me, and I never needed to use it. And that’s why it sat in my bedroom for over a year doing nothing other than me updating it when I remembered to.
Payment by BT only please (no PP) and delivery will be by Royal Mail Special Delivery with me posting it as soon as I can following receipt of payment. If you want to collect from the Lincolnshire coast you’re welcome. Price: £170 including delivery
I may also sell my NUC6i5SYH with 16GB RAM / 500GB Samsung 950 Pro M.2 PCIe / 480GB Kingston SSD if anybody is interested at a sensible price. This one is fast, with the Samsung software benchmarking the 950 Pro at 2508MB/s read and 1501MB/s write in this system. Again with Win 10 Pro installed, in good condition and with the original box. £270 delivered.
GumGum developed computer vision and NLP technology to help clients better advertise to their users.
The Santa Monica, Calif.-based vendor, founded in 2008, automatically scans video, audio, images and text on webpages, identifying and extracting key elements. It then uses that data to help advertisers place relevant ads on the webpages.
To power its machine learning and computer vision technology, GumGum needs a lot of training data. To meet its data needs, about two years ago the company turned to Figure Eight, a crowdsourcing machine learning annotation vendor.
Acquired by Appen, another crowdsourcing machine learning annotation company, in April 2019, Figure Eight provides training data to a variety of similar vendors. Figure Eight relies on a network of contributors to annotate huge amounts of data.
The contributors are trained, although most are not data scientists, and are screened for security purposes. The large contributor network enables Figure Eight to annotate data at scale, as well as to continue reviewing annotated data while a job is running.
Getting training data
Before using Figure Eight, GumGum employed full-time staff for machine learning annotation, said Erica Nishimura, data curator at GumGum. That worked, but it was costly and, at times, slow. With large amounts of data, it could take months to get usable training data. Besides, the staff could only work in English, but GumGum has clients internationally.
Figure Eight, meanwhile, works in a number of languages. At the time, Nishimura said, it was one of the only companies that worked in Japanese. As GumGum has a thriving Japanese division, the language support was one of the main reasons it chose Figure Eight.
Scalability, said Lane Schechter, product manager at GumGum, was the other reason GumGum chose Figure Eight.
Working with Figure Eight has increased GumGum’s data capacity tenfold, Schechter said. Also, instead of taking months to get completed machine learning annotation, it now happens in about a week.
Still, that’s not to say that working with Figure Eight has been without its share of problems.
One of the biggest challenges has been communicating directly with Figure Eight’s crowdsource contributors, Nishimura said.
At times, the contributors have had trouble understanding exactly what GumGum wants, but, because there is no way to directly interact with the contributors, Nishimura said it is hard to know if the contributors are having problems, or what they might be.
The best GumGum can do is put in a message, Nishimura said, but there is no way to alert each contributor to the message. Besides, a single message isn’t the same as having a conversation, she added.
While she was unsure if other similar crowdsourcing machine learning annotation companies have a better way to communicate with contributors, Nishimura said some other companies have their own checkers, who do spot-checks on completed annotations.
“It’s one more step to ensure quality,” Nishimura said. But, she added, the prices of those services are generally higher than Figure Eight’s.
Citrix introduced an analytics service to help IT professionals better identify the cause of slow application performance within its Virtual Apps and Desktops platform.
The company announced the general availability of the service, called Citrix Analytics for Performance, at its Citrix Summit, an event for the company’s business partners, in Orlando on Monday. The service carries an additional cost.
Steve Wilson, the company’s vice president of product for workspace ecosystem and analytics, said many IT admins must deal with performance problems as part of the nature of distributed applications. When they receive a call from workers complaining about performance, he said, it’s hard to determine the root cause — be it a capacity issue, a network problem or an issue with the employee’s device.
Performance, he said, is a frequent pain point for employees, especially remote and international workers.
“There are huge challenges that, from a performance perspective, are really hard to understand,” he said, adding that the tools available to IT professionals have not been ideal in identifying issues. “It’s all been very technical, very down in the weeds … it’s been hard to understand what [users] are seeing and how to make that actionable.”
Part of the problem, according to Wilson, is that traditional performance-measuring tools focus on server infrastructure. Keeping track of such metrics is important, he said, but they do not tell the whole story.
“Often, what [IT professionals] got was the aggregate view; it wasn’t personalized,” he said.
When the aggregate performance of the IT infrastructure is “good,” Wilson said, that could mean that half an organization’s users are seeing good performance, a quarter are seeing great performance, but a quarter are experiencing poor performance.
With its performance analytics service, Citrix is offering a more granular picture of performance by providing metrics on individual employees, beyond those of the company as a whole. That measurement, which Citrix calls a user experience (UX) score, evaluates such factors as an employee’s machine performance, user logon time, network latency and network stability.
“With this tool, as a system administrator, you can come in and see the entire population,” Wilson said. “It starts with the top-level experience score, but you can very quickly break that down [to personal performance].”
Wilson said IT admins who had tested the product said this information helped them address performance issues more expeditiously.
“The feedback we’ve gotten is that they’ve been able to very quickly get to root causes,” he said. “They’ve been able to drill down in a way that’s easy to understand.”
A proactive approach
Eric Klein, analyst at VDC Research Group Inc., said the service represents a more proactive approach to performance problems, as opposed to identifying issues through remote access of an employee’s computer.
“If something starts to degrade from a performance perspective — like an app not behaving or slowing down — you can identify problems before users become frustrated,” he said.
Klein said IT admins would likely welcome any tool that, like this one, could “give time back” to them.
“IT is always being asked to do more with less, though budgets have slowly been growing over the past few years,” he said. “[Administrators] are always looking for tools that will not only automate processes but save time.”
Enterprise Strategy Group senior analyst Mark Bowker said in a press release from Citrix announcing the news that companies must examine user experience to ensure they provide employees with secure and consistent access to needed applications.
“Key to providing this seamless experience is having continuous visibility into network systems and applications to quickly spot and mitigate issues before they affect productivity,” he said in the release.
Wilson said the performance analytics service was the product of Citrix’s push to the cloud during the past few years. One of the early benefits of that process, he said, has been in the analytics field; the company has been able to apply machine learning to the data it has garnered and derive insights from it.
“We do see a broad opportunity around analytics,” he said. “That’s something you’ll see more and more of from us.”
Lamicall Silver Metal Laptop Stand. The silver doesn’t match my new Space Grey set-up, so I’m selling to buy a new matching set! About 6 months old and still in excellent condition. RRP £29.99 on Amazon, where all 89 of its reviews are 5 star!
Also a Lamicall Silver Adjustable Tablet Stand/Holder. About 12 months old and still in excellent condition. RRP £14.99 on Amazon, where all of its 1,000-plus reviews are 5 star!
Samsung PM981 512GB SSD – one of the fastest on the market. Intel 9260 WiFi – the Killer WiFi card that comes with this line of laptops has many issues, so it has been replaced with an Intel 9260, one of the best and most reliable on the market.
The NFL will use AWS’ AI and machine learning products and services to better simulate and predict player injuries, with the goal of ultimately improving player health and safety.
The new NFL machine learning and AWS partnership, announced during a press event Thursday with AWS CEO Andy Jassy and NFL Commissioner Roger Goodell at AWS re:Invent 2019, will change the game of football, Goodell said.
“It will be changing the way it’s played, it will [change] the way it’s coached, the way we prepare athletes for the game,” he said.
The NFL machine learning journey
The partnership builds off Next Gen Stats, an existing NFL and AWS agreement that has helped the NFL capture and process data on its players. That partnership, revealed back in 2017, introduced new sensors on player equipment and the football to capture real-time location, speed and acceleration data.
That data is then fed into AWS data analytics and machine learning tools to provide fans, broadcasters and NFL Clubs with live and on-screen stats and predictions, including expected catch rates and pass completion probabilities.
Taking data from that, as well as from other sources, including video feeds, equipment choice, playing surfaces, player injury information, play type, impact type and environmental factors, the new NFL machine learning and AWS partnership will create a digital twin of players.
The NFL began the project with a collection of different data sets from which to gather information, said Jeff Crandall, chairman of the NFL Engineering Committee, during the press event.
It wasn’t just passing data, but also “the equipment that players were wearing, the frequency of those impacts, the speeds the players were traveling, the angles that they hit one another,” he continued.
Typically used in manufacturing to predict machine outputs and potential breakdowns, a digital twin is essentially a complex virtual replica of a machine or person formed out of a host of real-time and historical data. Using machine learning and predictive analytics, a digital twin can be fed into countless virtual scenarios, enabling engineers and data scientists to see how its real-life counterpart would react.
The new AWS and NFL partnership will create digital athletes, or digital twins of a scalable sampling of players, that can be fed into infinite scenarios without risking the health and safety of real players. Data collected from these scenarios is expected to provide insights into changes to game rules, player equipment and other factors that could make football a safer game.
“For us, what we see the power here is to be able to take the data that we’ve created over the last decade or so” and use it, Goodell said. “I think the possibilities are enormous.”
Partnership’s latest move to enhance safety
New research in recent years has highlighted the extreme health risks of playing football. In 2017, researchers from the VA Boston Healthcare System and the Boston University School of Medicine published a study in the Journal of the American Medical Association that indicated football players are at a high risk for developing long-term neurological conditions.
The study, which did not include a control group, looked at the brains of high school, college and professional-level football players. Of the 111 NFL-level football players the researchers looked at, 110 of them had some form of degenerative brain disease.
The new partnership is just one of the changes the NFL has made over the last few years in an attempt to make football safer for its players. Other recent efforts include new helmet rules, and a recent $3 million challenge to create safer helmets.
The AWS and NFL partnership “really has a chance to transform player health and safety,” Jassy said.
AWS re:Invent, the annual flagship conference of AWS, was held this week in Las Vegas.
SAP is focused on better understanding what’s on the minds of its customers with the latest release of S/4HANA Cloud.
SAP S/4HANA Cloud 1911, which is now available, has SAP Qualtrics experience management (XM) embedded into the user interface, creating a feedback loop for the product management team about the application. This is one of the first integrations of Qualtrics XM into SAP products since SAP acquired the company a year ago for $8 billion.
“Users can give direct feedback on the application,” said Oliver Betz, global head of product management for S/4HANA Cloud at SAP. “It’s context-sensitive, so if you’re on a homescreen, it asks you, ‘How do you like the homescreen on a scale of one to five?’ And then the user can provide more detailed feedback from there.”
The customer data is consolidated and anonymized and sent to the S/4HANA Cloud product management team, Betz said.
“We’ll regularly screen the feedback to find hot spots,” he said. “In particular we’re interested in the outliers to the good and the bad, areas where obviously there’s something we specifically need to take care of, or also some areas where users are happy about the new features.”
Because S/4HANA Cloud is a cloud product that sends out new releases every quarter, the customer feedback loop that Qualtrics provides will inform developers on how to continually improve the product, Betz said.
“This is the first phase in the next iteration [of S/4HANA Cloud], which will add more granular features,” he said. “From a product management perspective, you can potentially have a new application and have some questions around the application to better understand the usage, what customers like and what they don’t like, and then to take it in a feedback loop to iterate over the next quarterly shipments so we can always provide new enhancements.”
Qualtrics integration may take time to provide value
It has taken a while, but it’s a good thing that SAP has now begun a real Qualtrics integration story, said Jon Reed, analyst and co-founder of Diginomica.com, an analysis and news site that focuses on enterprise applications. Still, SAP faces a few obstacles before the integration into S/4HANA Cloud can be a real differentiator.
“This isn’t a plug-and-play thing where customers are immediately able to use this the way you would a new app on your phone, like a new GPS app. This is useful experiential data which you must then analyze, manage and apply,” Reed said. “Eventually, you could build useful apps and dashboards with it, but you still have to apply the insights to get the value. However, if SAP has made those strides already on integrating Qualtrics with S/4HANA Cloud 1911, that’s a positive for them and we’ll see if it’s an advantage they can use to win sales.”
The Qualtrics products are impressive, but it’s still too early in the game to judge how the SAP S/4HANA integration will work out, said Vinnie Mirchandani, analyst and founder of Deal Architect, an enterprise applications focused blog.
“SAP will see more traction with Qualtrics in the employee and customer experience feedback area,” Mirchandani said. “Experiential tools have more impact where there are more human touchpoints — employees, customer service, customer feedback on product features — so I think the blend with SuccessFactors and C/4HANA is more obvious. This doesn’t mean that S/4 won’t see benefits, but the traction may be higher in other parts of the SAP portfolio.”
SAP SuccessFactors is also beginning to integrate Qualtrics into its employee experience management functions.
It’s a good thing that SAP is attempting to become a more customer-centric company, but it will need to follow through on the promise and make it a part of the company culture, said Faith Adams, senior analyst who focuses on customer experience at Forrester Research.
Many companies are making efforts to appear to be customer-centric, but aren’t following through with the best practices that are required to become truly customer-centric, like taking actions on the feedback they get, Adams said.
“It’s sometimes more of a ‘check the box’ activity rather than something that is embedded into the DNA or a way of life,” Adams said. “I hope that SAP does follow through on the best practices, but that’s to be determined.”
Bringing analytics to business users
SAP S/4HANA Cloud 1911 also now has SAP Analytics Cloud directly embedded. This will enable business users to take advantage of analytics capabilities without going to separate applications, according to SAP’s Betz.
It comes fully integrated out of the box and doesn’t require configuration, Betz said. Users can take advantage of included dashboards or create their own.
“The majority usage at the moment is in the finance application where you can directly access your [key performance indicators] there and have it all visualized, but also create and run your own dashboards,” he said. “This is about making data more available to business users instead of waiting for a report or something to be sent; everybody can have this information on hand already without having some business analyst putting [it] together.”
The embedded analytics capability could be an important differentiator for SAP in making data analytics more democratic across organizations, said Dana Gardner, president of IT consultancy Interarbor Solutions LLC. He believes companies need to break data out of “ivory towers” now as machine learning and AI grow in popularity and sophistication.
“The more people that use more analytics in your organization, the better off the company is,” Gardner said. “It’s really important that SAP gets aggressive on this, because it’s big and we’re going to see much more with machine learning and AI, so you’re going to need to have interfaces with the means to bring the more advanced types of analytics to more people as well.”
Students will be able to better engage with school staff and track their college coursework with the help of some new features in Salesforce.org Education Cloud.
These features can help students and staff get a better view of the student journey throughout the college lifecycle without having to use external systems, and can help better connect K-12 schools in the Salesforce ecosystem.
Higher education is an industry that lags in terms of digital transformation, said Joyce Kim, a higher education analyst at Ovum. But Salesforce’s foundation in the enterprise has a lot of applicability to the higher-education model.
“Student retention and completion are really important targets for institutions but having the right data and insights that will help a school achieve those goals is a challenge,” Kim said.
Enriching the student journey
A feature that could have a widespread effect is Salesforce Advisor Link Pathways, which assists in degree planning and helps keep students on track to graduate.
Currently, staff and students in the San Mateo County Community College District (SMCCCD) use a third-party system called DegreeWorks to aid with degree planning. Students currently have access to Salesforce, but they can only see members of their success team, alerts from faculty — such as notice that a student has failed a test — and tasks such as applying for a summer internship, reviewing a resume and updating a LinkedIn profile. The Pathways feature brings degree planning right into the Salesforce system, eliminating the need for a third-party app.
“The big thing for staff and students is being able to have everyone integrated onto one system, and be able to take action in real time,” said Karrie Mitchell, vice president of planning for the SMCCCD. “There are so many different systems that are siloed, and Education Cloud brings it all together.”
Many students take on extra credits and extra student loan debt that don’t add up to a degree, and this feature will help advisers proactively manage and support students so they reach their graduation goal, said Nathalie Mainland, senior vice president and general manager of Education Cloud at Salesforce.org. Salesforce.com acquired Salesforce.org in April 2019 for $300 million.
Another feature that could be beneficial to SMCCCD is the Einstein Analytics template for recruitment and admissions, using the Education Data Architecture as a foundation. These templates may help admission staff find trends in each year’s class, including demographics, areas of study and who’s taking what classes — and prevent the need to manually input information into the system, Mitchell said.
“This gives you a 360-degree view of the student, and that’s critical for us,” said Daman Grewal, CTO at SMCCCD.
Other new features
Also, in the Education Data Architecture, Salesforce is adding application and test score objects, making it easier to bring in data for the recruiting and admission process, such as application data and test scores. Previously, schools were doing custom builds, and now there will be a standardized way to bring this data into the system, Mainland said.
Other new Salesforce Advisor Link features include queue management — the No. 1 most requested feature from customers — and Salesforce Advisor Link for onboarding and pre-advising. Queue management will enable students to proactively make appointments with their advisers, and the onboarding and pre-advising feature helps ensure that students who have received offer letters accept them and show up on campus.
And while K-12 institutions already have been using Salesforce.org Education Cloud, there is now a K-12 architecture kit. This will accelerate schools’ ability to use Education Cloud and Salesforce technology, and users will no longer have to customize it themselves, Mainland said.
While Oracle, Ellucian, Jenzabar and Campus Management are all competing vendors with end-to-end CRM suites, Salesforce is positioned competitively in the higher-education CRM market, Kim said.
“Because of its user-friendly interface and ability to use emerging technologies for things like predictive analytics and automating processes, end users find their products are intuitive and effective,” she said.
Salesforce plans to dig into this Education Cloud news during Dreamforce, which takes place Nov. 19 to 22 in San Francisco, Mainland said.
The Education Data Architecture and K-12 architecture kits are both free, open source and available to both Salesforce and non-Salesforce users. They will be available on AppExchange and GitHub.
The Education Data Architecture features will be available in January. Everything else will be available by Nov. 18.
Data storage containers have become a popular way to create and package applications for better portability and simplicity. Seen by some analysts as the technology to unseat virtual machines, containers have steadily gained more attention as of late, from customers and vendors alike.
Why choose containers and containerization over the alternatives? Containers work on bare-metal systems, cloud instances and VMs, and across Linux and select Windows and Mac OSes. Containers typically use fewer resources than VMs and can bind together application libraries and dependencies into one convenient, deployable unit.
Below, you’ll find key terms about containers, from technical details to specific products on the market. If you’re looking to invest in containerization, you’ll need to know these terms and concepts.
Containerization. With its roots in partitioning, containerization is an efficient virtualization strategy that isolates applications, enabling multiple containers to run on one machine while sharing the same OS. Containers run independent processes in a shared user space and can run across different environments, which makes them a flexible alternative to virtual machines.
The benefits of containerization include portability and reduced hardware overhead, while the main concern is the security of data stored in containers. Because all of the containers run under one OS, if one container is compromised, the others are vulnerable as well.
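As a rough sketch of that packaging model, a Dockerfile bundles an application and its dependencies into a single deployable image; the container then shares the host's kernel at runtime rather than shipping its own OS. (The base image and application names below are illustrative, for a hypothetical Python app.)

```dockerfile
# Each instruction adds a layer; the result is one deployable image
# that bundles the app with its libraries -- but no OS kernel, since
# a running container shares the host's kernel.
FROM python:3.11-slim                  # illustrative base image
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt    # dependencies baked into the image
COPY . .
CMD ["python", "app.py"]               # the isolated process the container runs
```

The same image can then be run unchanged on bare metal, a VM or a cloud instance, which is the portability benefit described above.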
Container management software. As the name indicates, container management software is used to simplify, organize and manage containers. Container management software automates container creation, destruction, deployment and scaling and is particularly helpful in situations with large numbers of containers on one OS. However, the orchestration aspect of management software is complex and setup can be difficult.
Products in this area include Kubernetes, an open source container orchestration software; Apache Mesos, an open source project that manages compute clusters; and Docker Swarm, a container cluster management tool.
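To make the automation concrete, a minimal Kubernetes Deployment manifest (names, image and counts here are illustrative) shows the declarative style these tools use: you state a desired number of container replicas, and the orchestrator handles creation, destruction and scaling to match.

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web                 # illustrative name
spec:
  replicas: 3               # Kubernetes keeps 3 containers running,
                            # recreating any that are destroyed
  selector:
    matchLabels:
      app: web
  template:
    metadata:
      labels:
        app: web
    spec:
      containers:
      - name: web
        image: nginx:1.25   # any container image
        ports:
        - containerPort: 80
```

Applying this with `kubectl apply -f` is all that's needed; the complexity mentioned above lives mostly in setting up and operating the cluster itself.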
Persistent storage. In order to be persistent, a storage device must retain data after being shut off. While persistence is essentially a given when it comes to modern storage, the rise of containerization has brought persistent storage back to the forefront.
Containers did not always support persistent storage, which meant that data created with a containerized app would disappear when the container was destroyed. Luckily, storage vendors have made enough advances in container technology to solve this issue and retain data created on containers.
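In Kubernetes terms, for example, persistence is typically requested through a PersistentVolumeClaim, which has a lifecycle independent of any one container; a sketch with illustrative names, sizes and paths:

```yaml
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: app-data
spec:
  accessModes: ["ReadWriteOnce"]
  resources:
    requests:
      storage: 1Gi          # illustrative size
---
apiVersion: v1
kind: Pod
metadata:
  name: db
spec:
  containers:
  - name: db
    image: postgres:16
    volumeMounts:
    - name: data
      mountPath: /var/lib/postgresql/data   # survives container restarts
  volumes:
  - name: data
    persistentVolumeClaim:
      claimName: app-data   # data lives in the claim, not the container
```

Because the data lives in the claim rather than in the container's writable layer, destroying and recreating the container no longer destroys what the app wrote.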
Stateful app. A stateful app saves client data from the activities of one session for use in the next session. Most applications and OSes are stateful, but because stateful apps didn’t scale well in early cloud architectures, developers began to build more stateless apps.
With a stateless app, each session is carried out as if it were the first, and responses aren’t dependent on data from a previous session. Stateless apps are better suited to cloud computing, in that they can be more easily redeployed after a failure and scaled out to accommodate changes.
However, containerization allows files to be pulled into the container during startup and persist somewhere else when containers stop and start. This negates the issue of stateful apps becoming unstable when introduced to a stateless cloud environment.
Container vendors and products
While there is one vendor undoubtedly ahead of the pack when it comes to modern data storage containers, the field has opened up to include some big names. Below, we cover just a few of the vendors and products in the container space.
Docker. Probably the name most synonymous with data storage containers, Docker is even credited with bringing about the container renaissance in the IT space. Docker’s platform is open source, which enables users to register and share containers over various hosts in both private and public environments. In recent years, Docker has made containers accessible and offers various editions of its containerization technology.
When you refer to Docker, you likely mean either the company itself, Docker Inc., or the Docker Engine. Initially developed for Linux systems, the Docker Engine was later extended to operate natively on both Windows and Apple OSes. The Docker Engine supports the tasks and workflows involved in building, shipping and running container-based applications.
Container Linux. Originally called CoreOS Linux, Container Linux by CoreOS is an open source OS that deploys and manages the applications within containers. Container Linux is based on the Linux kernel and is designed for massive scale and minimal overhead. Although Container Linux is open source, CoreOS sells support for the OS. Acquired by Red Hat in 2018, CoreOS develops open source tools and components.
Azure Container Instances (ACI). With ACI, developers can deploy data storage containers on the Microsoft Azure cloud. Organizations can spin up a new container via the Azure portal or command-line interface, and Microsoft automatically provisions and scales the underlying compute resources. ACI also supports standard Docker images and Linux and Windows containers.
Microsoft Windows containers. Windows containers are abstracted and portable operating environments supported by the Microsoft Windows Server 2016 OS. They can be managed with Docker and PowerShell and support established Windows technologies. Along with Windows Containers, Windows Server 2016 also supports Hyper-V containers.
VMware vSphere Integrated Containers (VIC). While VIC can refer to individual container instances, it is also a platform that deploys and manages containers within VMs from within VMware’s vSphere VM management software. Previewed under the name Project Bonneville, VMware’s play on containers comes with the virtual container host, which represents tools and hardware resources that create and control container services.