Tag Archives: used

For Sale – Billion BiPAC 8800AXL Dual Band Wireless Router

For sale is a used Billion BiPAC 8800AXL dual band wireless AC 1300Mbps VDSL2/ADSL2+ 3G/4G LTE router in mint condition. Comes with all original accessories and still has the protective film on the front. Comes with the original box but no sleeve.

Has been used in a smoke-free home and is in full working order.

Billion BiPAC 8800AXL Dual Band Wireless Router


AI in mining takes root in the industry

The mining industry has long used technologies such as advanced machinery, satellite imagery and hypersensitive measurement tools. However, the industry is just beginning to use AI, which has the potential to save workers time and companies money.

Geospatial analysis and data science vendor Descartes Labs has many customers in the mining sector, with a few packaged products aimed specifically at them. Based in Santa Fe, N.M., the startup was founded in 2014 as a spinout of Los Alamos National Laboratory, a U.S. Department of Energy weapons research center.

The mining sector is in the early stages of using AI technologies, said James Orsulak, senior director of enterprise sales at Descartes Labs. Still, he said, almost all of the company’s clients have small data science teams made up of highly skilled experts.

“We’re seeing a transition where there are more former geologists who went back to school to get a data science degree,” Orsulak said.

Astral imagery

The Descartes Labs platforms for mining companies combine data sets from NASA’s Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), an advanced imaging instrument on the Terra satellite, with AI and analytics.

Vendors like Descartes Labs sell AI in mining technology.

Descartes Labs ingested the entire dataset from ASTER, a process that took many CPU hours, Orsulak said. Using machine learning, Descartes Labs then removed all the structures, clouds and snow from the satellite images, leaving only a bare earth model.
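
Descartes Labs has not published the details of that pipeline, but the masking step can be illustrated with a minimal sketch: assume a classifier has already produced a per-pixel cover mask for each scene, then composite only the clear observations. All arrays below are stand-ins.

```python
import numpy as np

# Hypothetical inputs: a time stack of co-registered satellite scenes and
# per-pixel masks from some classifier (True = cloud/snow/structure).
# Shapes: (n_scenes, height, width) for a single band.
scenes = np.random.rand(12, 256, 256).astype(np.float32)  # stand-in imagery
masks = np.random.rand(12, 256, 256) > 0.7                # stand-in masks

# Hide contaminated pixels, then take the median of each pixel's clear
# observations along the time axis: a simple "bare earth" composite.
clear = np.ma.masked_array(scenes, mask=masks)
bare_earth = np.ma.median(clear, axis=0).filled(np.nan)   # NaN = never clear

print(bare_earth.shape)  # (256, 256)
```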

We’re seeing a transition where there are more former geologists who went back to school to get a data science degree.
James Orsulak, senior director of enterprise sales, Descartes Labs

Mining clients then combine their data with the platform and layer in other types of data on the model, including mineral classification, geochemistry and geophysics data.

The platform, among other things, can be used to find new mineral deposits with machine learning, as it can use data on known deposits to find similar, previously unknown deposits.

Manually, that can take months or years, said Lori Wickert, a geologist and principal remote sensing consultant at Newmont Corporation, a gold mining company. 

“Working with the Descartes platform is providing an opportunity to shortcut that process in a major way,” Wickert said, adding that the software has saved her countless hours of manual work.
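
Neither Descartes Labs nor Newmont has published model details, but a common pattern for this kind of prospectivity mapping is a supervised classifier trained on pixels at known deposits and then scored across a survey area. A rough sketch with scikit-learn, in which every array, feature count and label is hypothetical:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical training data: per-pixel feature vectors (spectral bands,
# geochemistry, geophysics) labeled as deposit or background.
X_train = np.random.rand(500, 16)        # 500 labeled pixels, 16 features
y_train = np.random.randint(0, 2, 500)   # 1 = known deposit, 0 = background

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Score every pixel in a survey area; high probabilities flag candidate
# locations that resemble the known deposits.
X_grid = np.random.rand(256 * 256, 16)   # flattened raster of features
prospectivity = model.predict_proba(X_grid)[:, 1].reshape(256, 256)
```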

Another style

Meanwhile, Kespry, an industrial drone software and analytics vendor, also focuses on the mining sector, but with a slightly different approach.

The 2013 startup, based in Menlo Park, Calif., uses industrial drone imagery to fly over mining sites for mine planning and inventory management, said George Mathew, CEO and chairman of Kespry.

Using drone imagery collected either by its own drones or supplied by mining industry customers, along with its data science platform, Kespry maps daily topography changes in active areas, identifies slope stability, identifies drainage patterns and more.

The company can also use the imagery and platform to automatically measure stockpile volumes of mined materials.
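
Kespry's implementation is proprietary, but the volume step reduces to integrating the height of a drone-derived surface model above an estimated base plane. A minimal sketch, with all inputs as stand-in arrays:

```python
import numpy as np

# Hypothetical drone-derived digital surface model (metres) and the
# estimated base elevation under the stockpile, on a regular grid.
dsm = np.random.rand(400, 400) * 10.0    # surface heights, stand-in data
base = np.full_like(dsm, 2.0)            # assumed base plane elevation
cell_area = 0.05 * 0.05                  # 5 cm ground sampling -> m^2 per cell

# Volume = sum of (height above base) * cell area across the footprint.
height = np.clip(dsm - base, 0.0, None)  # ignore cells below the base
volume_m3 = float(height.sum() * cell_area)
print(f"stockpile volume = {volume_m3:.1f} m^3")
```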

For mining companies and other industrial businesses that aren’t yet using AI and machine learning, the time to start is now, Mathew said.  

“The companies that end up making those investments now, they end up with a head start,” he said.


Las Vegas shores up SecOps with multi-factor authentication

The city of Las Vegas used AI-driven infrastructure security tools to stop an attacker in January before sensitive IT systems were accessed, but the city’s leadership bets future attempts won’t even get that far.

“Between CrowdStrike [endpoint security] and Darktrace [threat detection], both tools did exactly what they were supposed to do,” said Michael Sherwood, chief innovation officer for Las Vegas. “We had [a user] account compromised, and that allowed someone to gain short-term access to our systems.”

The city’s IT staff thwarted that attacker almost immediately in the early morning of Jan. 7. IT pros took measures to keep the attacker from accessing any of the city’s data once security monitoring tools alerted them to the intrusion.

The city has also used Okta access management tools for the last two years to consolidate user identity and authentication for its internal employees and automate access to applications through a self-service portal. Next, it will reinforce that process with multi-factor authentication using the same set of tools, in the hopes further cyberattacks will be stopped well outside its IT infrastructure.

Multi-factor security will couple a physical device — such as an employee badge or a USB key issued by the city — with usernames and passwords. This will reduce the likelihood that such an account compromise will happen again, Sherwood said. Having access management and user-level SecOps centralized within Okta has been key for the city to expand its security measures quickly based on what it learned from this breach. By mid-February, its IT team was able to test different types of multi-factor authentication systems and planned to roll one out within 60 days of the security incident.

Michael Sherwood

“With dual-factor authentication, you can’t just have a user ID and password — something you know,” Sherwood said. “A bad actor might know a user ID and password, but now they have to [physically] have something as well.”
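
The article does not say which second factor the city settled on, but one widely used “something you have” scheme is the time-based one-time password (TOTP, RFC 6238), in which a device the user carries and the server derive the same short-lived code from a shared secret. A standard-library-only sketch of the code derivation:

```python
import base64
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Derive the current RFC 6238 code from a shared base32 secret."""
    key = base64.b32decode(secret_b32)
    counter = int(time.time()) // interval           # time step number
    digest = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Demo secret (base32). The device and the server compute the same code,
# so a stolen password alone is not enough to log in.
SECRET = "JBSWY3DPEHPK3PXP"
print(totp(SECRET))
```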

SecOps automation a shrewd gamble for Las Vegas

Las Vegas initially rolled out Okta in 2018 to improve the efficiency of its IT help desk. Sherwood estimated the access management system cut down on help desk calls relating to forgotten passwords and password resets by 25%. The help desk also no longer had to manually install new applications for users because of an internal web portal connected to Okta that automatically manages authorization and permissions for self-service downloads. That freed up help desk employees for more strategic SecOps work, which now includes the multi-factor authentication rollout.

Another SecOps update slated for this year will add city employees’ mobile devices to the Okta identity management system and introduce an Okta single sign-on service for Las Vegas citizens who use the city’s web portal.

Residents will get one login for all services under this plan, Sherwood said. “If they get a parking citation and they’re used to paying their sewer bill, it’s the same login, and they can pay them both through a shopping cart.”

With dual-factor authentication, you can’t just have a user ID and password — something you know. A bad actor might know a user ID and password, but now they have to [physically] have something as well.
Michael Sherwood, chief innovation officer, city of Las Vegas

Okta replaced a hodgepodge of different access management systems the city used previously, usually built into individual IT systems. When Las Vegas evaluated centralized access management tools two years ago, Okta was the only vendor in the group that was completely cloud-hosted, Sherwood said. This was a selling point for the city, since it minimized the operational overhead to set up and run the system.

Okta’s service competes with the likes of Microsoft Active Directory, OneLogin and Auth0. Las Vegas also uses Active Directory for access management in its back-end IT infrastructure, while Okta serves the customer and employee side of the organization.

“There is still separation between certain things, even though one product may well be capable of [handling] both,” he said.

Ultimately, the city would like to institute a centralized online payment system for citizens to go along with website single sign-on, and Sherwood said he’d like to see Okta offer that feature and electronic signatures as well.

“They’d have a lot of opportunity there,” he said. “We can do payments and electronic signatures with different providers, but it would be great to have that more integrated into the authentication process.”

An Okta representative said the company doesn’t have plans to support payment credentials at this time but that the company welcomes customer feedback.


For Sale – 8GB HyperX Fury DDR4 RAM

Only used for one week, as I upgraded my RAM shortly after buying my new computer.
2x 4GB sticks, 2400MHz
Comes in box

Looking for £30 including Royal Mail first class recorded delivery.

Accept PayPal and bank transfer

Location
West Midlands, UK
Price and currency
£30
Delivery cost included
Delivery Is Included
Prefer goods collected?
I have no preference
Advertised elsewhere?
Not advertised elsewhere
Payment method
Bank transfer or PayPal



For Sale – Asus PG279Q & 7x 1TB SSDs (Crucial and Samsung)

6x Crucial MX500 1TB & 1x Samsung 860 EVO 1TB SSDs

These drives were bought to be used in my home dev server & as cache for my NAS.

One of the Crucials is brand new in box; it was left as a cold spare.
All others have seen very light use… sub-2TB written to each disk, so not far off brand new.

Crucial – £60 posted each
Samsung – £70 posted each


Splice Machine 3.0 integrates machine learning capabilities, database

Databases have long been used for transactional and analytics use cases, but they also have practical utility to help enable machine learning capabilities. After all, machine learning is all about deriving insights from data, which is often stored inside a database.

San Francisco-based database vendor Splice Machine is taking an integrated approach to enabling machine learning with its eponymous database. Splice Machine is a distributed SQL relational database management system that includes machine learning capabilities as part of the overall platform.

Splice Machine 3.0 became generally available on March 3, bringing with it updated machine learning capabilities. It also has new Kubernetes cloud native-based model for cloud deployment and enhanced replication features.

In this Q&A, Monte Zweben, co-founder and CEO of Splice Machine, discusses the intersection of machine learning and databases and provides insight into the big changes that have occurred in the data landscape in recent years.

How do you integrate machine learning capabilities with a database?

Monte Zweben

Monte Zweben: The data platform itself has tables, rows and schema. The machine learning manager that we have native to the database has notebooks for developing models, Python for manipulating the data, algorithms that allow you to model and model workflow management that allows you to track the metadata on models as they go through their experimentation process. And finally we have in-database deployment.

So as an example, imagine a data scientist working in Splice Machine working in the insurance industry. They have an application for claims processing and they are building out models inside Splice Machine to predict claims fraud. There’s a function in Splice Machine called deploy, and what it will do is take a table and a model to generate database code. The deploy function builds a trigger on the database table that tells the table to call a stored procedure that has the model in it for every new record that comes in the table.

So what does this mean in plain English? Let’s say in the claims table, every time new claims would come in, the system would automatically trigger, grab those claims, run the model that predicts claim cause and outputs those predictions in another table. And now all of a sudden, you have real-time, in-the-moment machine learning that is detecting claim fraud on first notice of loss.
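
Splice Machine generates that trigger and stored procedure as native database code, and the exact syntax is not shown in the interview; outside the database, the same pattern can be sketched in a few lines. Every name below is illustrative, not Splice Machine’s API:

```python
from dataclasses import dataclass, field
from typing import Callable, List, Tuple

@dataclass
class Claim:
    claim_id: int
    amount: float
    days_since_policy_start: int

# Stand-in for the deployed model's stored procedure: any callable that
# scores a claim. A real deployment would call a trained model.
def fraud_model(claim: Claim) -> float:
    risky = claim.amount > 10_000 and claim.days_since_policy_start < 30
    return 0.9 if risky else 0.1

@dataclass
class ClaimsTable:
    """Toy table whose insert 'trigger' scores each new row."""
    on_insert: Callable[[Claim], float]
    rows: List[Claim] = field(default_factory=list)
    predictions: List[Tuple[int, float]] = field(default_factory=list)

    def insert(self, claim: Claim) -> None:
        self.rows.append(claim)
        # The trigger fires on every new record, mirroring the
        # first-notice-of-loss scoring described above.
        self.predictions.append((claim.claim_id, self.on_insert(claim)))

table = ClaimsTable(on_insert=fraud_model)
table.insert(Claim(claim_id=1, amount=12_500.0, days_since_policy_start=14))
print(table.predictions)  # [(1, 0.9)]
```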

What does distributed SQL mean to you?

Zweben: So at its heart, it’s about sharing data across multiple nodes. That provides you the ability to parallelize computation and gain elastic scalability. That is the most important distributed attribute of Splice Machine.
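
That shared-nothing pattern is generic rather than specific to Splice Machine: partition rows by key, then run the same computation on every partition at once. A toy sketch, with processes standing in for nodes:

```python
import multiprocessing as mp

N_NODES = 4

def partial_sum(values):
    """The work each 'node' does on its own shard."""
    return sum(values)

if __name__ == "__main__":
    rows = [(i, i * 1.5) for i in range(1_000_000)]    # (key, value) pairs
    partitions = [[] for _ in range(N_NODES)]
    for key, value in rows:
        partitions[hash(key) % N_NODES].append(value)  # shard by key

    # Each partition is aggregated in parallel, then partials are combined.
    with mp.Pool(N_NODES) as pool:
        total = sum(pool.map(partial_sum, partitions))
    print(total)
```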

In our new 3.0 release, we just added distributed replication. It’s another element of distribution where you have secondary Splice Machine instances in geo-replicated areas, to handle failover for disaster recovery.

What’s new in Splice Machine 3.0?

Zweben: We moved our cloud stack for Splice Machine from an old Mesos architecture to Kubernetes. Now our container-based architecture is all Kubernetes, and that has given us the opportunity to enable the separation of storage and compute. You literally can pause Splice Machine clusters and turn them back on. This is a great utility for consumption-based usage of databases.

Along with our upgrade to Kubernetes, we also upgraded our machine learning manager from an older notebook technology called Zeppelin to a newer notebook technology that has really gained momentum in the marketplace, as much as Kubernetes has in the DevOps world. Jupyter notebooks have taken off in the data science space.

We’ve also enhanced our workflow management tool, MLflow, which is an open source tool that originated with Databricks, and we’re part of that community. MLflow allows data scientists to track their experiments and keeps that record of metadata available for governance.
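
MLflow itself is open source, so the kind of metadata record Zweben describes can be shown with a generic tracking snippet. This is plain MLflow rather than Splice Machine’s integrated manager, and all parameter and metric values are stand-ins:

```python
import mlflow

# Record one experiment run: parameters, a metric and a tag, so the run
# can be compared with others and audited later for governance.
mlflow.set_experiment("claims-fraud")
with mlflow.start_run(run_name="rf-baseline"):
    mlflow.log_param("n_estimators", 200)
    mlflow.log_param("max_depth", 8)
    mlflow.log_metric("val_auc", 0.91)    # stand-in metric value
    mlflow.set_tag("dataset", "claims_2019q4")
```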

What’s your view on open source and the risk of a big cloud vendor cannibalizing open source database technology?

Zweben: We do compose many different open source projects into a seamless and highly performant integration. Our secret sauce is how we put these things together at a very low level, with transactional integrity, to enable a single integrated system. This composition that we put together is open source, so that all of the pieces of our data platform are available in our open source repository, and people can see the source code right now.

I’m intensely worried about cloud cannibalization. I switched to an AGPL license specifically to protect against cannibalization by cloud vendors.

On the other hand, we believe we’re moving up the stack. If you look at our machine learning package, and how it’s so inextricably linked with the database, and the reference applications that we have in different segments, we’re going to be delivering more and more higher-level application functionality.

What are some of the biggest changes you’ve seen in the data landscape over the seven years you’ve been running Splice Machine?

Zweben: With the first generation of big data, it was all about data lakes, and let’s just get all the data the company has into one repository. Unfortunately, that has proven time and time again, at company after company, to just be data swamps.

Data repositories work and they’re scalable, but no one is using the data, and this was a mistake for several reasons.

Instead of thinking about storing the data, companies should think about how to use the data.
Monte Zweben, co-founder and CEO, Splice Machine

Instead of thinking about storing the data, companies should think about how to use the data. Start with the application and how you are going to make the application leverage new data sources.

The second reason why this was a mistake was organizationally, because the data scientists who know AI were all centralized in one data science group, away from the application. They are not the subject matter experts for the application.

When you focus on the application and retrofit the application to make it smart and inject AI, you can get a multidisciplinary team. You have app developers, architects, subject-matter experts, data engineers and data scientists, all working together on one purpose. That is a radically more effective and productive organizational structure for modernizing applications with AI.


For Sale – or Trade (eGPU enclosure) : Dell 3020M USFF (4150T, 8GB, SSD, WiFi)

Very compact PC that has been used mainly as an HTPC. Intel 4150T, 8GB RAM, Wi-Fi, 128GB SSD, Win10.
Will consider trade with a graphics card enclosure as long as it’s TB3 compatible. Cash your way depending on model.

Location
Bristol
Price and currency
£120
Delivery cost included
Delivery Is Included
Prefer goods collected?
I have no preference
Advertised elsewhere?
Advertised elsewhere
Payment method
BT



For Sale – Oculus Rift VR headset

Hi

Selling my Oculus Rift as it’s just not getting used that much. I have only used it to play the occasional game of Beat Saber and Project Cars VR, so the condition is very good; it comes boxed with everything apart from the lens cleaning cloth.

If anybody is looking to get into PC VR then this is still a great starting point.

Pictures below.

Thanks for looking.

Location
St.Helens
Price and currency
£190
Delivery cost included
Delivery Is Included
Prefer goods collected?
I have no preference
Advertised elsewhere?
Not advertised elsewhere
Payment method
PPG


For Sale – Samsung 34 Inch 21:9 WQHD (Reduced £200)

This monitor is too big for me, so I’m looking to sell after downsizing. I’ve only used it for work, so I can’t comment on gaming.

Samsung LS34J550WQUXEN 34 inch LED Monitor
Blue/grey colour
3440 x 1440 VA panel
VESA Compatible

Collection only please. Open to offers.

The only things that have bothered me are the loose power cable (not an issue if it’s left static on a desk) and the fact that the stand is not height adjustable.

Location
York England
Price and currency
£200
Delivery cost included
Delivery is NOT included
Prefer goods collected?
I prefer the goods to be collected
Advertised elsewhere?
Advertised elsewhere
Payment method
Cash or bank transfer



For Sale – Lenovo Yoga 2 11″

A super-handy little notebook/convertible (Yoga meaning the 360° hinge and touchscreen, for the uninitiated). I used this as my lightweight work machine for a while (at a time when I had a “desktop replacement” that was entirely inappropriate for air travel), and it was subsequently deployed on homework duty for my daughter, but these days it gets very little use. It has a rubberised edge, is fanless and has no moving parts, so it’s pretty ideal for kids. Still enough power for browsing and light tasks, though you probably wouldn’t want to be running Photoshop on it.

This is a much higher spec than the ones I’ve spied on auction sites (which mainly seem to have lower power CPUs and HDDs):
Core i3-4012Y
4GB RAM
120GB SSD
11.6” 1366×768 10-point multi-touch screen
Windows 10 Pro
US Keyboard Layout
Lenovo Original Charger (UK plug)

Battery life is still serviceable, but you won’t get a whole day out of it.

A couple of small scratches and one small crack on the edge of the keyboard, but generally good condition. Screen is excellent, no marks that I can see.
